We’ll never be free from AI paranoia, not as long as the burden of detective work keeps falling to the masses

I want to put my pipe and deerstalker hat away for five minutes.

You know what I miss? I miss being ‘wowed’ by a piece of art on the internet. In the good ol’ days (less than five years ago, but tech comes at you fast) you’d be scrolling through your feed and stumble into something beautiful, and you’d just be able to look at it. Soak it in. 

Nowadays, it’s harder to tell whether genuine effort has gone into something. Art that seems textured and detailed falls apart upon closer inspection—and even when you find a genuinely great piece of work, you’ve gotta spend a few minutes staring at it first. Like checking a mattress for bedbugs in a sketchy hotel.

AI has turned us all into private investigators. If you care about not getting suckered, you have to be sceptical and critical, and even when you avoid being fooled, it’s only because you’ve been flinching at shadows. I wrote about this a little last year, though that was through the lens of analysing games. Still, that feeling of “an internet flooded with algorithmic doppelgangers” is wearing me down something fierce.

The internet was already tough to take at face value. Rumour mills have always had their hooks dug into everything. But I feel myself growing a thick shell of scepticism that muffles everything I see, listen to, or read—a barrier to entry in my brain that reads: ‘does this look AI-generated?’

AI causes problems for just about everybody

(Image credit: Wizards of the Coast; artwork retracted from Bigby Presents: Glory of the Giants)

In order to keep the internet usable—and there’s no guarantee that we’ll reach that point before the grey goo tanks it all anyway—a lot of work needs to be done. Proponents of AI will often tell you that it’ll speed up the process, unlock creativity, ‘democratise art’, and so on. I certainly do think there are ways in which this tech can be used to help people. As it stands, though, AI is creating more problems than solutions—even for the people who you’d think would stand to gain.

Take AI-generated art, for example. Broadly speaking, it’s unpopular. Sure, people like to churn out meme images and busty anime girls, but most audiences feel like they’ve been short-changed when a company dumps something computer-generated in front of them. While it hasn’t quite attracted the level of bullying NFTs did, it still leaves a sour taste in the mouth.

Wizards of the Coast has stubbed its toe on this problem a couple of times recently. In August of last year, the company had to remove a contracted artist from one of its sourcebooks after it was discovered they’d been using AI without informing their employer. Likewise in January, when a promotional image for Magic: The Gathering was, Wizards insisted, “created by humans and not AI.” It turned out to have been made with AI tools.

(Image credit: Wizards of the Coast)

Fan uproar caused Wizards to change tack and issue corrections—all because internal quality checks clearly weren’t designed to stop the slop from slipping through the cracks. In both cases, spotting the AI generation wasn’t hard—uneasy anatomy, melting pressure meters, nonsensical light bulbs. But it’s easy to say that in hindsight, after hundreds of discerning eyes have scanned every inch and drawn big red circles around the offending imagery.

I get the feeling whoever gave the go-ahead just didn’t have the skills or the time to work through it with a fine-tooth comb. Wizards of the Coast needs to check for more than plagiarism—it needs to check for artistic integrity. It needs a discerning eye to look at the anatomy, construction, and composition of a piece. In other words, it might need to pay two artists, now—one to draw the thing, and another to make sure the first one didn’t cheat.

Which raises the question: is AI actually making things less costly for the moneybags in the long run? Assuming we don’t all just become numb to it someday, having your AI art discovered is a risk paid in reputation and goodwill.

If you want to avoid that cost, you need to spend a lot of effort either covering it up—or hiring people to filter it out with their expertise. At which point, why not just actually pay one artist a little extra to make the damn thing? Still, I thought, at least AI is limited to fairly obvious bad art right now. Then OpenAI’s new Sora model happened.

I had to rewrite half of this article because of something that happened overnight

(Image credit: OpenAI)

I don’t usually pull back the curtain like this, but I feel it’s warranted here since it’s a great example of how this tech can blindside us. The first draft of this article only talked about 2D AI art and the malaise of having to zoom in on eyes and hands. I went to sleep, woke up with a mind to edit it, and then this happened.

View post on imgur.com

This is where we’re at after, what, a few years of development? While Sora’s generations are typically dreamlike, with plenty of surreal, acid-trip inconsistencies, assuming it’ll stay that way for long feels like jumping into a pot of water and shouting ‘boil me up, Scotty’. 

Again, there are uses for AI tech. But this doubt-exhaustion will only get worse unless something is actually done—a major election in the United States is just around a dreaded corner, and if this technology continues to enjoy an unregulated gold-rush era of tech bro fantasies, we’re all going to suffer for it.

But don’t worry! OpenAI’s website has a section on safety, which reads: “We are working with red teamers — domain experts in areas like misinformation, hateful content, and bias — who will be adversarially testing the model.” Which is sort of like reassuring everybody that you’re consulting with radiation poisoning experts after developing the nuclear bomb.

Pandora’s box is open, sure—but that doesn’t mean we all have to lie down and let its evils steamroll us out into a nice flat pancake. Just because we have the ability to do something doesn’t mean its use is inevitable. As my fellow PC Gamer writer Joshua Wolens rightfully pointed out last year, there’s all sorts of social action that can be taken on a personal level—and on a governmental one, we put safety checks in place all the time.

(Image credit: OpenAI)

Regulations exist. They’re why our drinking water is (typically) safe, why our food is (typically) edible. Technology isn’t used just because it’s there—for instance, we’ve had the ability to wipe out most, if not all, life on planet Earth since 1945. Sure, things got scary there for a while, but we’ve still managed to avoid atomising ourselves for the better part of a century. Give the human race a little credit.

At least, that’s how I want to feel. But my inner cynic isn’t surprised by our current state of affairs. Of course AI is flooding the internet, of course it’s making our search engines unusable—it feeds our current cultural hunger for quantity over quality. We see that in some of its biggest advocates. ‘Imagine if your favourite TV series never ended’ or ‘imagine a game with infinite sidequests!’

If anything, AI is acting like a canary in a coalmine for a cultural problem we’ve had for a while, a problem that used to only cause issues for creatives. It’s easy to ignore where your processed food came from if it tastes fine—it’s harder to do that when you chip your teeth on the rocks they started sneaking in.

Our pre-AI state of affairs wasn’t acceptable, but we’re fast pushing the needle from ‘this sucks and it shouldn’t be this way’ to ‘everybody loses’. Artists, writers, and actors are getting screwed over just as thoroughly as before—more so, with even poorly paid work vanishing before their eyes. Right now, companies stand to gain the most—but even then, plenty are hitting roadblocks that’ll hurt the bottom line.

Sora only really promises to improve things for dodgy advertisers who want to make stuff on the cheap—that’s its major pro. The cons far outweigh that. ‘Yes, the internet may be destroyed. But for a beautiful moment in time we created a lot of value for corporations?’ We can do better, surely.

Until that point, though, we all get to enjoy an internet that’s gradually getting worse in a hundred little ways. Soon, none of us will get to watch a video without squinting to catch shapeshifting chairs, multiplying cat paws, or shifting geography—the paranoia will continue until something explodes.
