AI-generated story and dialogue is not going to work, says Dragon Age creator David Gaider: ‘A lot of effort is going to be wasted on this’

The RPG veteran reveals that during his time at BioWare, the studio experimented with procedurally-generated quests—but they were ultimately a failure.

Procedural generation is a powerful tool for games. From Minecraft’s endless worlds, to Borderlands’ “87 bazillion” guns, to Shadow of Mordor’s countless orc captains, a system that can create new, semi-randomised content on the fly can enormously extend replayability. 

Inevitably, in this era of renewed interest in AI-driven generation, companies are looking at whether that idea can be pushed forward into new frontiers. Does the increased sophistication of chatbot ChatGPT, for example, suggest that dialogue and story in games could be procedurally generated too? According to David Gaider—the man who created the world and story of Dragon Age—the answer is a firm “no”. And he knows because BioWare tried it.

Reacting to a Guardian article that asks the question “Could AI write super-intelligent video game characters?”, Gaider tweeted: “Ah, yes. The dream of procedural content generation. Even BioWare went through several iterations of this: ‘what if we didn’t need every conversation to be bespoke?’ Unlimited playtime with dialogue being procedurally created alongside procedural quests!” Based on when Gaider left the company, this work likely happened in or before 2016. 

“Each time, the team collectively believed—believed down at their CORE—that this was possible. Just within reach. And each time we discovered that, even when the procedural lines were written by human hands, the end result once they were assembled was… lackluster. Soulless.” 

He goes on to suggest that the core problem of any such system is that it can only create superficial content—it can make “something *shaped* like a quest… but the end result is no better than your typical ‘bring me 20 beetle heads’ MMO quest.” In other words, without a human mind behind it, generated story and dialogue will only ever feel like the filler it is.

Gaider’s new project, Stray Gods, is about as far from procedurally-generated writing as you can get—it’s a musical narrative RPG where the songs have lyrical dialogue options. (Image credit: Summerfall Studios)

“The fact these dev teams will fail doesn’t mean they won’t TRY. Expect to see it.”

David Gaider

What BioWare was attempting was a system that drew on a database of pre-written lines, rather than generating the text itself through a language model as ChatGPT does—but Gaider is sceptical that the distinction makes any difference, even as the technology improves. Indeed, his conclusion that such a system will still only spit out flat, uninteresting writing lines up with the broader experience of using AI to generate text right now. By its nature, at least in its current form, it produces generic content, because it is essentially an aggregate of all the text that has ever been fed into it.

“The fact these dev teams will fail doesn’t mean they won’t TRY,” concludes Gaider. “Expect to see it. It’s too enticing for them not to, especially in MMOs and similar where they feel players aren’t there for deep narrative anyhow. A lot of effort is going to be wasted on this.”

It’s worth noting, however, that procedural generation of quests has already been implemented in games with some success. Though I’d certainly argue they fall into Gaider’s category of “bring me 20 beetle heads”, procedural sidequests have been used for years in Bethesda’s open world RPGs to extend playtime and give you reasons to keep exploring. Modders are also already producing pretty interesting results by adding ChatGPT-voiced NPCs to Skyrim that can carry out full conversations with the player.

Bethesda has used a system called “Radiant AI” since Oblivion, both to guide the behaviour of NPCs and to generate new quests. (Image credit: Bethesda)

I’d suggest the greater problem Gaider doesn’t note here is that procedurally generated content can’t easily be vetted or moderated by the developer. In most existing uses, that’s fine—if a Minecraft map features an oddly phallic-looking rocky outcropping, it’s not likely to have much impact beyond causing some giggles. But if AI dialogue or story inadvertently makes some kind of offensive or inappropriate statement, it could cause serious PR problems and even legal issues for its developer. I’m reminded of the grief Fantasy Flight Games ran into with its physical card game Keyforge, whose procedurally generated deck names ended up including such gems as “The Emperor that Pays for Boys”, “Titanflyer, the Farmer of Racism”, and “Wang The Suddenly Bruised”.

All of that is not even to mention that it’s also currently just prone to getting stuff wrong—one of those AI Skyrim NPCs repeatedly sent Chris into a death trap because it couldn’t solve a basic puzzle, and it’s easy to imagine a system along these lines struggling to keep all of a fictional world’s details straight. 

Gaider doesn’t say what project BioWare hoped to use this idea on, but I’d speculate it was part of Mass Effect Andromeda’s reportedly tortured development, which Kotaku claimed included a long and ultimately damaging diversion into a No Man’s Sky-like attempt to procedurally generate entire planets for players to explore. It’s bizarre to think that a studio so celebrated for its hand-crafted writing and characters spent any time trying to automate away parts of that process—but if BioWare was doing it seven years ago or earlier, you can bet it’s a conversation that’s been had or is ongoing at many other studios today. Whether it works well or not, you’ll probably find yourself doing quests for a robot sooner rather than later. 
