The new Call of Duty AI voice moderation is welcome and sorely needed

A new Call of Duty AI voice chat moderation initiative began this week, which raises the question: When will other companies do the same?

It's hard to get through the weekly news cycle without hearing about artificial intelligence. From newsrooms to medical journals and everything in between, AI is the new hot-button topic. In the case of games, we've mostly heard hushed rumors about how AI will usher in the end of human-generated content. And yet, the announcement this week from Activision about the new Call of Duty AI voice chat moderation is the thing that has me most intrigued.

In a blog post yesterday, Activision revealed a new partnership with the AI company Modulate to bring its proprietary voice moderation tool, ToxMod, to Modern Warfare 2, Warzone 2, and the upcoming Modern Warfare 3. While ToxMod has already been used in smaller, VR-based games such as Rec Room, the Call of Duty rollout will be its biggest test to date.

And you know what? It cannot come soon enough for CoD and, hopefully, more online titles. If companies rarely deign to enforce their own rules about toxicity in online games, then the next best thing is letting an AI listen in on our lobbies.

Call of Duty AI enlisted for chat moderation

My wife and I play Overwatch on the daily, but we never touch Competitive Mode or turn on team voice chat. There's a pretty simple reason for that: Even in the seemingly kinder, gentler era of Overwatch 2, there's still something about the mere presence of a woman in the lobby that makes the game's most toxic players go nutty. This isn't a unique experience in online gaming.

Go searching and you'll find myriad examples of women, LGBTQ+ people, and everyone in between experiencing vitriol for the mere act of existing online. While Activision reports success with its current anti-toxicity measures and practices, upping its game with ToxMod feels like a deliberate next step.

"In examining the data focused on previously announced enforcement, 20 percent of players did not reoffend after receiving a first warning," the company reports. "Those who did reoffend were met with account penalties, which include but are not limited to feature restrictions (such as voice and text chat bans) and temporary account restrictions."

In the grand scheme of things, 20 percent isn't a lot, especially when you consider the hundreds of thousands of players active on any given day. To further combat toxicity, the Call of Duty AI initiative built on ToxMod looks to fill in the gaps. Although Activision provides a few examples on its FAQ page, exactly what ToxMod flags remains somewhat nebulous.

"Call of Duty’s Voice Chat Moderation system is focused on detecting harm within voice chat versus specific keywords," says the document. "Violations of the Call of Duty Code of Conduct are subject to account enforcement."

All fun and games

The biggest concern when it comes to auto-moderation is whether or not it kills the friendly competition aspects of online gaming. Admit it, we've all had that rush of adrenaline that comes from ruining an enemy player's moment to shine. According to the Activision FAQ, ToxMod is looking for specific cases.

"The system helps enforce the existing Code of Conduct, which allows for “trash-talk” and friendly banter. Hate speech, discrimination, sexism, and other types of harmful language, as outlined in the Code of Conduct, will not be tolerated."

You know what? Great. Good. Given how reporting blatant and obvious infractions of any given game's online code of conduct often feels like talking to a brick wall, perhaps a wider hammer is needed. Obviously, not every infraction gets caught, and even fewer receive punishment.

But after hearing blatant slurs in voice chat so many times, I can only wonder if anyone is doing anything about it. Given the small but noticeable downturn of the Call of Duty player base in recent years, it's no surprise the publisher is making moves. Along with the addition of mainstream crossovers like Nicki Minaj, 21 Savage, and Lara Croft comes a need to protect new players coming into the fold.

And while you cannot call the biggest shooter franchise in the world 'niche,' its online community does have a reputation: notoriously the saltiest sailors this side of League of Legends. A Call of Duty lobby using AI to babysit its worst potty mouths can only help the game's image.

Expanding the net

Will the Call of Duty AI experiment succeed? Only time and data will tell. It depends entirely on whether the system fosters better behavior or simply turns people away from talking. Regardless, the entire industry may be keeping watch on how ToxMod does or doesn't moderate some of the most toxic players in online gaming.

You may see further applications of artificial intelligence in game lobbies, one way or another.

Stay tuned to esports.gg for esports news and Call of Duty information.