Occasionally, my annoyance with something exceeds my inherent laziness to the point where I sit down and spontaneously write about it. Not as an expert, obviously; at best with an opinion that wasn't entirely plucked out of thin air, but one somehow relevant to the thing that keeps making more headlines than it should. What bugged me this time was AI, artificial intelligence, and the absurd hype surrounding it.
If we were listing the things AI is good at versus the things it's bad at, the first list would be the shorter of the two, so let's start there. Generative AI, where you tell the computer what you want and it creates something out of thin air, is probably the most spectacular manifestation of AI, at least when it gets it right. Whether it's a language-based result from an LLM (large language model) or a visual one like an image or video, it's pretty cool to be able to ask the AI for something and have it just make it. Not to mention useful in a professional setting.
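If you want to poke at the "ask and it just makes it" experience yourself, here's a minimal sketch using the Hugging Face transformers library, with the small, free GPT-2 model standing in for the big commercial systems. The prompt and generation parameters are just illustrative choices on my part:

```python
# A toy taste of generative AI: give a text model a prompt,
# get a continuation back. Requires: pip install transformers torch
from transformers import pipeline

# GPT-2 is ancient by today's standards, but it's small, free,
# and demonstrates the same ask-and-receive principle.
generator = pipeline("text-generation", model="gpt2")

result = generator(
    "The absurd hype surrounding AI",
    max_new_tokens=40,       # how much new text to generate
    num_return_sequences=1,  # just one completion
)
print(result[0]["generated_text"])
```

A model this small mostly produces rambling nonsense, which, conveniently, foreshadows the next point.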

It's a bit more problematic if you start asking yourself whether the result you got is derived from intelligence or from imitation. You see, these AI models essentially work by “training”: feeding them countless gigabytes of human-made content and letting them figure out the way we usually do things. Texts, pictures, artwork, you name it. If you ever wrote something online, from socials to an online forum dedicated to the mating rituals of geese, odds are your words have been ingested at some point by an AI model without your express permission. You may be fine with that, but people who were creating content to make an actual living are probably less inclined to look favorably on their work being borrowed to “educate” the kind of program that aims to replace them.
Heck, this very article might land me in hot water with our future AI-driven overlords.
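To make the imitation point concrete, here's a deliberately dumb sketch of the idea behind that training step: count which words tend to follow which in some human-written text, then generate "new" text by sampling from those counts. Real models are vastly more sophisticated (neural networks, billions of parameters), but the spirit, statistically imitating the source material, is the same. The training text here is obviously a placeholder:

```python
import random
from collections import defaultdict

# "Training" data: in reality, countless gigabytes scraped from
# the web; here, a placeholder sentence about geese.
corpus = (
    "the goose honks at the pond and the goose struts by the pond "
    "while the other goose honks back"
).split()

# Learn which words follow which: a crude statistical "model".
follows = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    follows[current_word].append(next_word)

# "Generate": start somewhere and keep sampling a plausible next word.
word = "the"
output = [word]
for _ in range(12):
    choices = follows.get(word)
    if not choices:  # dead end: this word never had a successor
        break
    word = random.choice(choices)  # imitate the training data
    output.append(word)

print(" ".join(output))
```

Everything it produces is recombined from what it was fed, which is exactly the crux of the permission question above.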
This kind of stuff is possible now, and not 50 years ago, mostly because of the sheer amount of data that needs to be processed. There was simply no way to store and compute all of it back then. Your smartwatch now has 100,000 times the computational power of the guidance computer that landed people on the moon, and in terms of memory capacity it's more like a million times. Large data centers can now pack enough processing power and storage to run programs that comb through pretty much everything that's ever been written down and use that to “learn” how we write, or talk, or paint. And then imitate it. But I'm guessing that Artificial Imitation just isn't as catchy a marketing term. And a lot of the AI we hear about truly is just marketing. There's no clearly defined border where a software program suddenly becomes AI. But it sounds good, and it helps sell devices and boost stock prices, so here we are.
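For the curious, here's the rough arithmetic behind that moon-computer comparison, using the Apollo Guidance Computer's published specs against ballpark smartwatch figures. The smartwatch numbers are assumptions on my part, so treat the ratios as order-of-magnitude estimates:

```python
# Apollo Guidance Computer (1969): roughly 85,000 instructions per
# second, ~4 KB of erasable (RAM) memory plus ~72 KB of fixed ROM.
agc_instructions_per_sec = 85_000
agc_ram_bytes = 4 * 1024

# Modern smartwatch (assumed ballpark): a GHz-class CPU executing on
# the order of 10 billion instructions per second, with ~1 GB of RAM.
watch_instructions_per_sec = 10_000_000_000
watch_ram_bytes = 1 * 1024**3

print(f"compute: ~{watch_instructions_per_sec // agc_instructions_per_sec:,}x")  # ~117,000x
print(f"memory:  ~{watch_ram_bytes // agc_ram_bytes:,}x")                        # ~262,000x
```

The exact multipliers depend on what you choose to compare (throw the watch's tens of gigabytes of flash storage against the AGC's 72 KB of ROM and you get close to the million mark), but the point stands either way: the gap is staggering.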