Show HN: BBC “In Our Time”, categorised by Dewey Decimal, heavy lifting by GPT
661 points by genmon | 167 comments on Hacker News.
I'm a big fan of the BBC podcast In Our Time -- and (like most people) I've been playing with the OpenAI APIs. In Our Time has almost 1,000 episodes on everything from Cleopatra to the evolution of teeth to plasma physics, all still available, so it's my starting point to learn about most topics. But it's not well organised.

So here are the episodes sorted by library code. It's fun to explore.

Web scraping is usually pretty tedious, but I found that I could send the minimised HTML to GPT-3 and get (almost) perfect JSON back: the prompt includes the TypeScript definition. At the same time I asked for a Dewey classification... and it worked. So I replaced a few days of fiddly work with 3 cents per inference and an overnight data run.

My takeaway is that I'll be using LLMs as function calls way more in the future. This isn't "generative" AI; more "programmatic" AI, perhaps? So I'm interested in what temperature=0 LLM usage looks like at scale (you want it to be pretty deterministic), and what a language that treats that as a first-class concept might look like.
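A minimal sketch of the "LLM as function call" pattern described above, not the author's actual code: the Episode shape, prompt wording, and field names are assumptions; the pattern (embed a TypeScript type in the prompt, request a Dewey class, run at temperature 0, parse the JSON reply) follows the post.

```typescript
// Sketch: extract structured episode data from a scraped page via GPT-3,
// treated as a deterministic "function call" rather than a generative one.
// The Episode interface and prompt text below are illustrative assumptions.

interface Episode {
  title: string;
  description: string;
  firstBroadcast: string; // ISO date, e.g. "2023-03-09"
  deweyCode: string;      // three-digit Dewey Decimal class, e.g. "932"
}

const EPISODE_TYPE = `
interface Episode {
  title: string;
  description: string;
  firstBroadcast: string; // ISO date
  deweyCode: string;      // three-digit Dewey Decimal class
}`;

async function extractEpisode(minifiedHtml: string): Promise<Episode> {
  const prompt = [
    "Extract the podcast episode described in the HTML below.",
    "Reply with a single JSON object matching this TypeScript type, and nothing else:",
    EPISODE_TYPE,
    "Also choose the most appropriate Dewey Decimal class for the topic.",
    "HTML:",
    minifiedHtml,
  ].join("\n\n");

  const res = await fetch("https://api.openai.com/v1/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "text-davinci-003",
      prompt,
      temperature: 0, // keep the "function call" as deterministic as possible
      max_tokens: 512,
    }),
  });

  const data = await res.json();
  // The model occasionally returns stray text around the JSON;
  // in a real data run you would validate and retry here.
  return JSON.parse(data.choices[0].text.trim()) as Episode;
}
```

Run once per episode page overnight and the per-call cost stays in the low cents, which is the trade the post describes: a few days of fiddly parsing replaced by a batch of cheap, near-deterministic inferences.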
March 11, 2023