Players who jump into a match in Call of Duty: Warzone, Black Ops Cold War, or Call of Duty: Mobile are currently quite likely to come across toxic behavior from others. Whether it’s racial slurs, unsportsmanlike conduct, cheating, or harassment, Call of Duty is unfortunately no stranger to toxicity. That’s why publisher Activision has issued a statement detailing its plans to make the series a more positive experience for players.
https://twitter.com/CallofDuty/status/1397598417929609225
In that statement, the company says it has banned more than 350,000 accounts over the past 12 months for racist names or reported toxic behavior. Activision has also implemented a new filter designed to catch potentially offensive names, as well as “new technology” to filter inappropriate text.
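Activision hasn’t detailed how its name filter actually works, but a basic version of the idea might normalize common character substitutions before matching against a blocklist. The sketch below is a hypothetical illustration in that spirit, with placeholder blocklist entries and substitution rules, not the publisher’s real system.

```python
# Illustrative sketch only: a naive username filter of the kind described,
# not Activision's actual technology. SUBSTITUTIONS and BLOCKLIST are
# hypothetical placeholders.

# Map common "leet"-style character swaps back to letters before matching,
# since replacing a letter with a number is a standard filter-evasion trick.
SUBSTITUTIONS = str.maketrans({
    "0": "o", "1": "i", "3": "e", "4": "a",
    "5": "s", "7": "t", "8": "b", "@": "a", "$": "s",
})

BLOCKLIST = {"slurexample"}  # placeholder entries, not a real list


def is_name_allowed(username: str) -> bool:
    """Return False if the normalized username contains a blocked term."""
    normalized = username.lower().translate(SUBSTITUTIONS)
    return not any(term in normalized for term in BLOCKLIST)
```

Even with normalization like this, determined players keep finding new workarounds, which is part of why automated filtering is only one piece of the picture.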
Despite these efforts, the publisher admits there’s still much to be done to create a more positive experience. To get there, Activision plans to step up its detection and monitoring of toxic behavior, enforce its policies consistently and fairly, and communicate with the community more frequently.
As it stands, many toxic players have found ways around the systems in place, whether by swapping a number for a letter in a username to slip past name filters or by using software that circumvents anti-cheat protections. But just as there’s room for improvement on the community side, there’s also room for improvement on the developer side.
For instance, there is still no option to report a player for using offensive language in-game, whether over text chat or voice. The only available report reasons are exploiting, cheating, boosting, offensive username, and offensive clan tag, none of which addresses a potentially toxic player communicating in-game.