Voice Actors Speak Out on AI-Generated NSFW Skyrim Mods: 'It Should be Seen as the Violation It Is'

Skyrim’s prolific modding scene is renowned in the video game industry for extending the life of one of Bethesda’s greatest games. But right now, that landscape is embroiled in controversy over the rise of AI-generated voice clips in mods that clone real Skyrim voice actors. As if that didn’t raise enough ethical questions already, some of the mods feature pornographic AI voice performances, prompting backlash and outcry in the voice acting community.

This issue was first reported by GamesRadar, which called attention to Nexus Mods, a mod distributor that hosts a number of the pornographic Skyrim mods in question. When IGN reached out to Nexus Mods for comment on this story, a spokesperson simply linked to the site’s current policy on AI content, which says in part, “AI-generated mod content is not against our rules, but may be removed if we receive a credible complaint from an affected creator/rights holder.”

Twitter user @Robbie92_ pointed out the practice of AI voice cloning in the Skyrim modding scene late last month, tagging a number of impacted voice actors. Robbie said the actors’ voice performances were mostly fed into an AI cloning tool called ElevenLabs, which generated the AI voices that were used in the explicit content.

With the rise of AI voice cloning, voice actors are being abused by the modding communities. As a member of the Skyrim modding scene, I am deeply concerned at the practice of using AI voice cloning to create and distribute non-consensual deepfake pornographic content. pic.twitter.com/ySUFqrtjH0

— Robbie (@Robbie92_) July 1, 2023

“I am a long-standing and deeply concerned member of the mod community for The Elder Scrolls series of games,” Robbie wrote. “I believe that the creation and distribution of deepfake pornography is unabashedly evil, and that we as a community have a responsibility to act.”

The replies to Robbie’s original tweet are filled with impacted voice actors reacting to the news that their voices have been used for pornographic AI content without their consent. Many were shocked, learning about the issue for the first time, with voice actor Richard Epcar writing, “What the s**t?!”

What the shit?!

— Richard Epcar (@RichardEpcar) July 4, 2023

Robbie also shared a Google Doc listing the names of, and links to, various mods that use AI voice cloning. Robbie noted that the links are only accessible with an active Nexus Mods account since they are tagged as NSFW.

This list — while not exhaustive — contains around 100 links to NSFW mods listed on Nexus Mods. As of this writing, many of the mods have already been hidden by their creators, with the message: “The mod has (possible) permission issues that the author is working to address.” However, many of the links are still fully accessible.

The “possible permission issues” become obvious when digging into ElevenLabs’ terms of service: anyone who used the voice generator to create NSFW Skyrim mods violated part of the user agreement.

“By Uploading Files to our Services, you confirm that you are either the owner of these Files or that you have the necessary rights and permissions to use these Files. You can only Upload files if you tick a checkbox confirming this,” ElevenLabs’ website reads in part. “For example, if you upload somebody’s voice recording to our Services, you confirm that you have permission from the voice owner to clone and synthesize their voice. Only you are responsible for securing these rights and permissions.”

IGN has reached out to ElevenLabs for comment.

How Skyrim’s Deepfake Porn Mods Impact Voice Actors

When a modder uses AI-generated lines based on a real voice actor’s performance, it can significantly affect that actor’s career security and mental health. IGN spoke to two voice actors who have worked on The Elder Scrolls Online about how deepfakes, especially those of an explicit nature, are affecting their colleagues. While neither actor we spoke to was specifically listed in the document of impacted performers, both offered valuable insight into what this situation means for the industry.

Abbey Veffer did voice work for ESO’s recent Necrom expansion, and she’s been very outspoken on social media regarding the Skyrim modding controversy. In an email to IGN, Veffer shared her stance on the mods.

“I believe the use of AI synthesis for non-consensual voice cloning and NSFW mods/deepfakes should be seen as the violation that it is,” Veffer wrote. “In my opinion, this should be treated as similar to revenge porn. It’s a weighted issue rooted in perpetrators playing with power dynamics and wanting a semblance of control at someone else’s expense — namely, actors they may claim to be fans of.”

Veffer expressed frustration that the people making the mods don’t see them as harmful and don’t understand the implications. She was quick to point out that fans might not see that creating content like this comes at a real person’s expense, and that voice actors aren’t as affluent or untouchable as fans may think.

I believe the use of AI synthesis for non-consensual voice cloning and NSFW mods/deepfakes should be seen as the violation that it is

“Due to common misconceptions around actors’ wages, many uneducated observers may think we’re all equipped with expensive lawyers who can save us at the drop of a hat. They also assume that these AI impersonations won’t jeopardize our jobs — but that’s entirely untrue. Most voice actors are not rich, and job stability doesn’t really exist in this industry. We’re all in the same boat, and we all deserve protection from AI abuse.”

IGN also spoke with Kyle McCarley, another ESO voice actor who stands against the misuse of actors’ voices in AI-generated projects. In a statement to IGN, McCarley said “the implications of this technology being left unchecked are frankly horrifying,” and he went on to explain the harm to voice actors.

“If fans are creating memes where our voices are saying things we didn’t say, that can absolutely hurt our employment opportunities if those fakes are good enough to convince our employers,” McCarley said. “We don’t own the rights to the characters we portray, our employers do. So many of them, quite understandably, don’t want us ‘using’ those characters to say things without their clearance, any more than we want people using our voices to say those things… Our public image could be tarnished by one troll forcing us to say heinous things, and that kind of thing can be nearly impossible to recover from, not just financially, but mentally and emotionally, as well.”

As for what can be done, McCarley and Veffer agree that legislation is the only true path forward to protect actors from AI deepfakes, and that independent actors won’t be able to stop this on their own in the long run.

McCarley also pointed to the ongoing negotiations between SAG-AFTRA and the Hollywood studios, where the entertainment union is pushing for protections against employers using AI technology without actors’ permission and/or additional payment.

“This is why we desperately need legislation at a federal level, to establish oversight of the companies developing this technology and rules on how it can be used by both the private companies in question and the public at large. And then enforcing those rules with strong punitive measures for those who violate them… All an individual actor can do right now is try to contact the person using the model and ask them politely to stop, which is obviously going to net mixed results.”

Logan Plant is a freelance writer at IGN.
