Activision has delivered a report on the work it’s done to combat toxicity in Call of Duty, insisting it’s already made a huge impact ahead of the launch of Black Ops 6.
Call of Duty was for years associated with toxic player behavior, both in voice chat and in text chat. But Activision has worked to reverse the franchise’s thorny reputation, launching 2023’s Modern Warfare 3 with in-game voice chat moderation powered by AI.
Activision is using ToxMod from Modulate, which uses AI to identify toxic speech in real time, including hate speech, discriminatory language, and harassment, and enforce against it.
Addressing privacy concerns from the Call of Duty community, Activision has insisted voice chat is only monitored and recorded “for the express purpose of moderation,” and “is focused on detecting harm within voice chat versus specific keywords.”
“The Disruptive Behavior team knows that hype and passion is part of Call of Duty’s DNA,” Activision said in a fresh progress update. “Voice and text-based moderation tools in Call of Duty don’t target our competitive spirit – it enforces against behavior identified in the Call of Duty franchise Code of Conduct, targeting harassment and derogatory language.
“Similar to Modern Warfare 3, the Call of Duty Code of Conduct will be visible during the initial in-game flow when players first launch core multiplayer modes in Black Ops 6, asking players to acknowledge the Code of Conduct pillars.”
Since rolling out improved voice chat enforcement in June, Call of Duty has seen a combined 67% reduction in repeat offenders of voice chat-based offenses in Modern Warfare 3 and Warzone, Activision added. In July 2024, 80% of players who were issued a voice chat enforcement since launch did not re-offend. Overall exposure to disruptive voice chat continues to fall, Activision said, dropping by 43% since January. At launch, Black Ops 6 will expand its voice moderation to French and German, in addition to English, Spanish, and Portuguese.
This is well-timed, given one new feature for Black Ops 6 that’s sure to test Call of Duty’s AI voice chat moderation to its limit. Black Ops 6 has a new Body Shield feature in multiplayer that lets you grab an enemy and hold them in front of you to soak up bullets while firing off a few rounds of your own. But that’s not all: it also enables voice chat between the attacker and the victim, which players certainly had fun with during the game’s beta weekends.
Call of Duty’s text moderation tech, meanwhile, which analyzes text chat traffic in near real time, has blocked over 45 million text-based messages in violation of the Call of Duty Code of Conduct since November 2023. Activision said Call of Duty has also implemented a new analysis system for username reports “to enhance efficiency and accuracy, surfacing critical reports to our moderation team for investigation and action.”
Activision has used research from the California Institute of Technology (Caltech) to improve its approach here, and worked with researchers from the University of Chicago Booth School of Business on better ways to identify and combat disruptive behavior. Activision said it’s “actively engaged” in research surrounding disruptive behavior and prosocial activities in gaming.
Wesley is the UK News Editor for IGN. Find him on Twitter at @wyp100. You can reach Wesley at [email protected] or confidentially at [email protected].