AI isn’t a term reserved for human-level general intelligence. The computer-controlled avatars in some videogames are AI. My phone’s text-to-speech is AI. And yes, LLMs, like the smaller Markov-chain models before them, are AI.
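For anyone who hasn’t played with one: a Markov-chain text model is tiny compared to an LLM, but the idea is the same, predict the next word from the words before it. A rough Python sketch, just to illustrate (the corpus string and function names are made up for the example):

```python
import random
from collections import defaultdict

def build_model(text, order=1):
    """Map each run of `order` words to the words that followed it in the text."""
    words = text.split()
    model = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        model[key].append(words[i + order])
    return model

def generate(model, length=10):
    """Start from a random key and walk the chain, sampling a follower each step."""
    key = random.choice(list(model.keys()))
    out = list(key)
    for _ in range(length):
        followers = model.get(tuple(out[-len(key):]))
        if not followers:
            break
        out.append(random.choice(followers))
    return " ".join(out)

corpus = "the cat sat on the mat and the dog sat on the cat"
model = build_model(corpus, order=1)
print(generate(model, length=10))
```

Swap the toy corpus for a few megabytes of text and you get the old-school “autocomplete gone wild” generators people called AI long before transformers.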
In my first AI lecture at uni, my lecturer started off by asking us to spend 5 minutes in groups defining “intelligence”. No two groups came up with the same definition. His point: “So if you can’t agree on what intelligence is, how can we possibly define artificial intelligence?”
AI has historically just described the cutting-edge computer science of the time, and I imagine it will continue to do so.
When I was at university (2002 or so), we had an “AI” lecture, and it was mostly “if” statements and pathfinding algorithms like A*.
So I would argue that we engineers were using the term for a wider range of use cases long before the LLM, CEO and marketing people did. And I think that’s fine, as categorising algorithms/solutions as AI helps us understand what they will be used for, and we (at least the engineers) don’t tend to assume an actual self-aware machine when we hear that name.
Nowadays they call that AGI, but it wasn’t always like that; back in my time it was called science fiction 😉
AFAIA, the “pudding” part is because “pudding” referred to meat dishes long before it was used for sweet dishes, and Yorkshire pudding used to be served exclusively with meat, which is likely tightly linked to the original meaning of “toad in the hole”!
Yes, the term “AI” is used for marketing, though that didn’t start with LLMs; a couple of years earlier, any ML algorithm was called AI, along with the trendy “data scientist” job title.
However, I do think LLMs are very useful; just try them for your daily tasks and you’ll see. I’m pretty sure they will become as common as a web search in the future.
Also, how can you tell that the human brain is not mostly a very powerful LLM-hosting machine?