We need a better name for AI, or we risk talking past each other until actually intelligent AGI comes home mooing

A-I can't believe it's not actually intelligent.

When I say ‘AI’, what do you think of? Large Language Models saying something outrageous? Vaguely melted or otherwise bland-looking images fashioned using a generative model? Perhaps even videogame NPC behaviour? While all of the above work on the same principle (to risk oversimplification, a computer stitching together human-created fragments to make something that looks new-ish), they’re all distinct technologies. This is just one reason I think the term ‘AI’ is a catch-all-too-much term in desperate need of workshopping.

Calling something ‘AI’ is attention-grabbing for all the splashy sci-fi reasons you’d expect, while also offering a deeply unhelpful layer of obfuscation. For instance, describing ChatGPT as displaying intelligence is a stretch at best, with that qualifying ‘artificial’ doing a lot of heavy lifting.

Chatbots such as ChatGPT employ Large Language Models, a type of machine learning that sucks up vast swathes of data to better predict the word most likely to come next in a sentence, producing text that looks human-like… at least at first glance.
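To make that next-word-prediction idea concrete, here’s a deliberately tiny Python sketch. It’s an illustrative toy only: real LLMs use neural networks trained on billions of tokens rather than raw word counts, but the underlying job is the same, i.e. given what came before, guess what’s most likely to come next.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for the "vast swathes of data" an LLM trains on.
corpus = "the cat sat on the mat the cat ate the cheese".split()

# Count how often each word follows each other word (a simple bigram model).
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def predict_next(word):
    """Return the word most likely to come next, according to the counts."""
    candidates = following.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

# Generate text by repeatedly predicting the most likely next word.
word, output = "the", ["the"]
for _ in range(5):
    word = predict_next(word)
    if word is None:
        break
    output.append(word)
print(" ".join(output))  # e.g. "the cat sat on the cat"
```

Run it and the toy happily produces grammatical-ish nonsense like ‘the cat sat on the cat’: a miniature preview of why fluent-looking output and actual understanding aren’t the same thing.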

To put it another way, I personally think of it as a black box attempting to play the theatre kid favourite of ‘Yes, and,’ but it’s stolen all of the jokes and doesn’t particularly understand why any of it is funny. Neither ChatGPT nor DeepSeek truly understands things like context, as evidenced by both bots’ abysmal PC build recommendations.

An AI face looks down on a human.

(Image credit: Colin Anderson via Getty Images)

And if it’s not a chatbot making a bad go of doing my job, it’s making stuff up. LLM-based AI is known to ‘hallucinate’, smashing words together without fully appreciating their context and, in effect, offering up false information to users. For just one example, let’s talk about my mortal enemy as a lactose intolerant person: delicious cheese. In a recent official Google ad for Gemini, the AI stated Gouda accounts for “50 to 60 percent of the world’s cheese consumption,” which even I know doesn’t pass the sniff test.

Now, some may think this is small, cheesy potatoes, but AI ‘hallucinations’ have been known to be, um, particularly vivid. For a few more examples, Google’s AI Overviews, now overshadowing many search results pages, have memorably recommended users add glue to their pizza sauce, in addition to issuing some truly stomach-turning medical advice.

So yeah, not really intelligent, artificially or otherwise. The AI branding conjures sci-fi images of tech capable of not just human-like output but thought, and that vastly oversells the capabilities of today’s tech; the lights are on, but no one’s home. ‘Artificial General Intelligence’ is the similarly vague term more precisely reserved for the concept of a system that could one day perform cognitive tasks just as well as any human with their fleshy meat brain. With that in mind, confusion hardly seems surprising.

Then there’s everything else lumped in under the ‘AI’ banner that’s really narrower machine learning, such as the genuinely helpful applications of predicting which keystroke is most likely to come next, or diagnosing sick dogs. The term looms so large it sometimes completely overshadows the actually beneficial examples of machine learning, like AI tools being used thoughtfully by creatives.

For example, 2024 film The Brutalist courted some controversy when various AI technologies were revealed to have been used during production. Though no AI-generated images made it on screen, voice generation tech Respeecher was deployed to ensure the accuracy of lines spoken in Hungarian, while also preserving the performances of its non-native speaker cast.

Furthermore, this voice generation tech was trained on data contributed by the production team themselves, as opposed to the AI scraping whatever it could get its hands on from the internet. Along similar lines, an unrelated GPU organ installation created by artists Holly Herndon and Mat Dryhurst also sought to assemble its own training dataset without scraping others’ copyrighted material. Not only that but, according to Herndon, “This dataset is owned by the choirs [that contributed to the project] through a new IP structure [that allows] for common ownership of AI data.”

Shanghai, China - August 18th 2023: ByteDance's AI chatbot 'Doubao' app on screen.

(Image credit: Robert Way via Getty Images)

That is to say, creative, thoughtful use of AI tools is possible, but calling it AI evidently comes with a lot of baggage. So, no, ‘AI’ doesn’t just broadly refer to instances of machine learning I personally find questionable or ethically dubious; rather, shuffling these tools into such a broad category ensures we’re destined to keep talking past one another. This isn’t helped by the buzzword-y nature of ‘AI’ either, with events like CES 2024 (and 2025) plagued by so-called AI products where it wasn’t always clear just how certain features harnessed machine learning (or whether they did at all).

This ultimately makes it even more difficult to pin down AI as a concept, or ensure any of us are on the same page when we wade into discussions with our two cents.

For all of the above reasons, I think we desperately need a better term than ‘AI’, though I’ll admit the current misnomer is a real zinger. A return to plain old ‘machine learning’ may manage expectations, but it doesn’t quite have the same panache. ‘Venture capital black box that’s slowly cooking our planet’ hardly rolls off the tongue either.

I’ve been toying with the out-of-pocket term ‘electric nightmares’, not just as yet another science fiction nod, but also as a way of broadly gesturing at the outsized power demand of AI as we know it, as well as its nonsensical ‘hallucinations’ (you know, like terrifying dream logic). Unfortunately, this proposed term runs the risk of making something as unimpressive as a chatbot sound rad as hell, and I just don’t think it’s worth it.

Who knew coining a brand-new name would be such a noodle twister? And, no, I’m definitely not going to ask ChatGPT for any suggestions.
