As gamers know, online FPS titles such as Call of Duty let players voice chat with one another. However, these chats can turn hostile, with players saying unwelcome things to each other. Research has also found that CoD is the series with the most toxic players.
This is exactly why Activision has been taking steps to curb this toxic behavior. The company has now partnered with a firm called Modulate to find a solution to the problem.
Voice chats in Call of Duty will be moderated by artificial intelligence
The gaming giant's move will allow in-game voice chats to be monitored by artificial intelligence. The system is built on Modulate's AI technology, called ToxMod, which will be used to detect hate speech, discrimination, and harassment in voice chats.
The beta version of ToxMod launches in North America starting today and is currently active in Call of Duty: Modern Warfare II and Call of Duty: Warzone. Modulate says that with the release of MW3 on November 10, the system will roll out worldwide, excluding Asia.
There is not much information about how ToxMod works at the moment. According to the company's website, the system can distinguish harmful behavior in voice chat, carefully analyze each conversation, and notify moderators.
These statements indicate that, for now, the system will only be used to inform Activision moderators; it will not impose sanctions on its own. Whether it actually works remains to be seen. A bigger question mark is that voice recognition systems may be biased against people of certain races or accents. That is why keeping humans involved in moderating voice chats is absolutely essential.
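The workflow described above, where an AI flags suspect chat and routes it to human moderators instead of auto-sanctioning, can be sketched in a few lines. This is a toy illustration only: the keyword scorer, the `Flag` type, and the threshold are all hypothetical stand-ins, not ToxMod's actual implementation, which uses trained speech models that Modulate has not made public.

```python
from dataclasses import dataclass

# Hypothetical stand-in for a trained toxicity model. Real systems like
# ToxMod analyze audio with machine-learning models, not keyword lists.
TOXIC_TERMS = {"insult", "slur", "threat"}

@dataclass
class Flag:
    player: str
    transcript: str
    score: float

def score_transcript(text: str) -> float:
    """Toy scorer: fraction of words that match the toxic-term list."""
    words = text.lower().split()
    if not words:
        return 0.0
    return sum(w in TOXIC_TERMS for w in words) / len(words)

def moderate(player: str, transcript: str, queue: list,
             threshold: float = 0.2) -> None:
    """Flag suspect chat for human review; never sanction automatically."""
    score = score_transcript(transcript)
    if score >= threshold:
        queue.append(Flag(player, transcript, score))

review_queue: list = []
moderate("player1", "nice shot team", review_queue)
moderate("player2", "that was an insult and a threat", review_queue)
print([f.player for f in review_queue])  # only flagged chats reach moderators
```

The key design point mirrored here is the human in the loop: the AI only fills a review queue, and any sanction is left to the moderators who read it.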