
Call of Duty to Begin Using AI to Assess Voice Chat for Toxicity

The tool will look for hate speech, harassment, and other “disruptive behavior” before forwarding conversations to the moderation team.
By Adrianna Nine

Virtually any online multiplayer game can become a vehicle for harassment, but the Call of Duty franchise has more of an unfortunate reputation for that than most. In a blog post published Wednesday, Activision Blizzard announced that it has begun leveraging artificial intelligence to tackle “disruptive behavior” over voice chat. Conversations conducted between North American players via Call of Duty: Modern Warfare II and Call of Duty: Warzone are already being moderated, with other regions and titles to follow in the coming months. 

The tool is called ToxMod, and it’s produced by Modulate, the self-described “leader in the fight against toxic online behavior.” According to Modulate’s website, ToxMod flags potentially harmful conversations and then triages them for toxicity review. This process sometimes involves removing background noise for more accurate analysis. From there, ToxMod assesses the “tone, timbre, emotion, and context” of a phrase or conversation to determine whether a player is behaving in a harmful way. While the tool does perform keyword analysis, this more nuanced approach ensures the entirety of a conversation is considered before taking action against a player.
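Modulate hasn’t published ToxMod’s internals, but the flag-then-triage flow described above can be illustrated with a purely hypothetical sketch. The thresholds, weights, and field names below are invented for illustration; the only idea taken from the article is that a cheap keyword pass is combined with a contextual score, and only sufficiently severe clips reach human moderators.

```python
from dataclasses import dataclass

# Hypothetical thresholds; ToxMod's actual scoring is not public.
FLAG_THRESHOLD = 0.5      # clip merits a closer look
ESCALATE_THRESHOLD = 0.8  # forward to the human moderation team

@dataclass
class VoiceClip:
    player_id: str
    transcript: str
    keyword_score: float   # output of a simple keyword check, 0-1
    context_score: float   # assumed tone/emotion/context model output, 0-1

def triage(clip: VoiceClip) -> str:
    """Combine a keyword pass with a contextual score, mirroring the
    flag-then-review flow the article describes."""
    if clip.keyword_score < FLAG_THRESHOLD and clip.context_score < FLAG_THRESHOLD:
        return "ignore"
    # Weight context more heavily than raw keywords, since the same
    # phrase can be banter between friends or targeted harassment.
    combined = 0.3 * clip.keyword_score + 0.7 * clip.context_score
    return "escalate_to_moderators" if combined >= ESCALATE_THRESHOLD else "monitor"

print(triage(VoiceClip("p1", "...", keyword_score=0.9, context_score=0.9)))
# -> escalate_to_moderators
```

The design choice worth noting is that the keyword score alone can flag a clip but never escalate it on its own; escalation requires the contextual score to agree, which matches the article’s point that the entirety of a conversation is weighed before action is taken.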


ToxMod forwards conversations flagged for a certain type or severity of toxicity to the game’s moderation team, which ultimately decides whether to take action against a particular player. This hopefully means there will be a human behind each decision to warn or ban a player from Call of Duty’s in-game voice chat. According to Activision, first warnings are only somewhat effective at deterring repeat offenses: in Wednesday’s blog post, the company said 20% of players don’t reoffend after receiving one. Those who do reoffend are hit with chat bans and temporary account restrictions, though Activision hasn’t said how effective those penalties are.

ToxMod represents Activision’s second major stab at curbing in-game toxicity over the past year. In November, the company introduced an overhauled reporting system that let players report those using offensive text or speech. The new interface reflected the game’s code of conduct, which players were required to agree to the next time they played, even if they had read or agreed to it before.

Activision’s ToxMod beta is now reviewing Modern Warfare II and Warzone conversations across North America. When Call of Duty: Modern Warfare III releases on Nov. 10, ToxMod will roll out to all players worldwide, excluding Asia.
