Call of Duty AI Moderator?


The voice chat feature in the popular online shooter series Call of Duty is getting a makeover. To combat toxic behavior and make its games more enjoyable for everyone, Activision has partnered with Modulate, the developer of an artificial intelligence-powered voice chat monitor.

The Importance of Using an AI Moderator

Public voice chats in today’s online shooters have a reputation for being hostile places to play. A player’s experience can be ruined by offensive or threatening language, hate speech, discriminatory comments, and other forms of harassment. Activision knows this is a serious problem and is using cutting-edge technology to address it head-on.

Activision has enlisted Modulate and its AI-powered voice chat monitor, ToxMod, to address the growing concern of toxic behavior in voice chats. This artificially intelligent moderator is much more than a simple word filter; it can understand tone, intent, and context. It can detect not only overtly offensive language and conduct but also more covert forms of harm, such as attempts to recruit minors into online extremist groups or to engage in sexually explicit conduct with them.

Functioning of ToxMod

ToxMod is already used in smaller titles such as the virtual reality game Among Us VR, but Call of Duty is Modulate’s biggest client to date. During multiplayer matches in Call of Duty: Modern Warfare 2 and Warzone, the AI moderator listens in on voice chats in real time, identifies potentially offensive content, and records the conversations so that Activision’s human moderators can look them over.

Flagged content is then reviewed by human moderators against the official Call of Duty Code of Conduct. Disparaging remarks about people’s race, gender, sexual orientation, age, culture, religion, mental or physical abilities, or place of origin are strictly forbidden. If a player is found to have broken the rules, the moderators may take action against them, ranging from a warning to a permanent ban depending on the severity of the infraction.

Human Moderators and Their Importance

While ToxMod is instrumental in spotting potential infractions, it cannot impose sanctions on its own. The AI moderator serves as a resource for human reviewers, who are ultimately responsible for deciding whether a violation has occurred and, if so, what action should be taken. Keeping real people in the decision-making loop improves transparency and fairness.
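In other words, the system is a two-stage pipeline: the AI flags suspect clips, and humans make the final call. The Python sketch below is a rough, hypothetical illustration of that flag-then-review flow only; the class names, scores, and thresholds are assumptions for clarity and do not come from Modulate’s or Activision’s actual tooling.

```python
from dataclasses import dataclass, field
from enum import Enum

# Hypothetical sketch of a flag-then-review moderation pipeline.
# All names and thresholds are illustrative, not Modulate's or Activision's API.

class Action(Enum):
    NONE = "none"
    WARNING = "warning"
    TEMP_BAN = "temporary_ban"
    PERM_BAN = "permanent_ban"

@dataclass
class VoiceClip:
    player_id: str
    transcript: str          # text of the recorded voice chat
    toxicity_score: float    # 0.0 (benign) to 1.0 (severe), assigned by the AI model

@dataclass
class ModerationQueue:
    flag_threshold: float = 0.7                    # clips above this score go to human review
    pending_review: list = field(default_factory=list)

    def ai_triage(self, clip: VoiceClip) -> None:
        """Stage 1: the AI flags suspect clips but never punishes on its own."""
        if clip.toxicity_score >= self.flag_threshold:
            self.pending_review.append(clip)

    def human_review(self, clip: VoiceClip, violates_code_of_conduct: bool) -> Action:
        """Stage 2: a human moderator decides whether the Code of Conduct was broken
        and, if so, picks a penalty proportional to severity."""
        if not violates_code_of_conduct:
            return Action.NONE
        if clip.toxicity_score > 0.95:
            return Action.PERM_BAN
        if clip.toxicity_score > 0.85:
            return Action.TEMP_BAN
        return Action.WARNING

# Example: one clip is flagged by the AI, then reviewed by a human.
queue = ModerationQueue()
clip = VoiceClip("player_123", "<offensive remark>", toxicity_score=0.9)
queue.ai_triage(clip)
for flagged in queue.pending_review:
    print(queue.human_review(flagged, violates_code_of_conduct=True))
```

The key design point the sketch captures is that the AI stage only enqueues clips; every penalty is issued from the human-review stage.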

Future Goals and Beta Results

ToxMod is currently in a beta testing phase covering multiplayer matches in Call of Duty: Modern Warfare 2 and Warzone. The beta gives Activision and Modulate a chance to fine-tune and enhance the AI moderator’s capabilities. According to Activision, ToxMod will roll out fully alongside Call of Duty: Modern Warfare 3 in November.

While in beta, ToxMod focuses on English-language voice chats. Activision intends to expand the AI moderator’s language support over time, making the voice chat monitoring system accessible to more communities and more effective at fighting toxic behavior across all of them.

Source: PC World

FAQ

1. Why is Call of Duty updating its voice chat feature?

To combat toxic behavior and make the game more enjoyable for all players, Activision is introducing AI-powered voice chat monitoring.

2. What is the role of Modulate in this update?

Activision has partnered with Modulate to utilize its AI-powered voice chat monitor called ToxMod, which aims to detect and combat toxic behavior in voice chats.

3. How does ToxMod work?

ToxMod listens in on voice chats in real-time during multiplayer matches, monitors the conversations, identifies potentially offensive content, and records these chats for human moderators to review.

4. Is ToxMod solely responsible for moderating voice chats?

No. While ToxMod identifies potential infractions, human moderators are responsible for reviewing the flagged content and deciding on the appropriate action.

5. What kind of offenses does ToxMod monitor?

By analyzing tone, intent, and context, ToxMod can detect overtly offensive language as well as more covert forms of harm, such as attempts at online extremist recruitment or sexually explicit conduct with minors.

6. What actions can be taken against players found violating the Call of Duty Code of Conduct?

Actions range from a warning to a permanent ban, depending on the severity of the violation.

7. In which games is ToxMod currently being tested?

ToxMod is in its beta testing phase for Call of Duty: Modern Warfare 2 and Warzone.

8. When will ToxMod be fully implemented?

ToxMod is expected to be fully released alongside Call of Duty: Modern Warfare 3 in November, as stated by Activision.

9. Does ToxMod work for chats in languages other than English?

Currently, during its beta phase, ToxMod focuses on English-speaking voice chats. However, Activision plans to expand the AI moderator’s language support in the future.

10. Why is human moderation still necessary?

Human moderators ensure that decisions about violations and consequent actions are transparent, fair, and consistent, providing a level of judgment that AI alone cannot achieve.

Featured Image Credit: Dominik Sostmann; Unsplash – Thank you!