Call of Duty: Modern Warfare III Will Feature AI-Powered Chat Moderation

How Call of Duty Is Making Voice Chat Less Toxic

Activision announced on Wednesday that it is collaborating with Modulate on a new voice chat moderation system debuting in Call of Duty: Modern Warfare III.

How will Modern Warfare III’s chat moderation work?

When Modern Warfare III releases on November 10, 2023, Modulate, a company focused on combating toxic online behavior, will deliver "global real-time voice chat moderation, at-scale" using an AI-powered system known as ToxMod. The technology is designed to identify toxic voice chat in real time and enforce against it.

According to Modulate, the types of speech subject to enforcement include hate speech, discriminatory language, and harassment. On Call of Duty's FAQ page, Activision notes that it manages the system, with voice chat monitored and recorded for moderation purposes. Enforcement of violations of the Call of Duty Code of Conduct will remain in line with Activision's standard policies.

“There’s no place for disruptive behavior or harassment in games ever. Tackling disruptive voice chat particularly has long been an extraordinary challenge across gaming. With this collaboration, we are now bringing Modulate’s state of the art machine learning technology that can scale in real-time for a global level of enforcement,” said Activision’s Chief Technology Officer, Michael Vance. “This is a critical step forward to creating and maintaining a fun, fair and welcoming experience for all players.”

To test the system ahead of Modern Warfare III's release, Activision will roll out a beta version of the chat moderation technology beginning on August 30, 2023. The beta will be available in Call of Duty: Modern Warfare II and Call of Duty: Warzone, supporting English at launch, with additional languages to follow.
