
Despite Advancements, Games Still Aren't Doing Enough To Stop Toxic Voice Chat

Though games like Overwatch and Call of Duty have automated systems for flagging toxicity, women gamers are still being harassed
Illustration: Vicky Leta / Kotaku

I started regularly playing competitive online games in 2007, with the launch of Halo 3. Back then, participating in in-game voice chat was harrowing for a 17-year-old girl whose voice betrayed her gender and her youth. I was subjected to such frequent and horrific hostility (rape threats, misogynistic remarks, sexually inappropriate comments, you name it) that I eventually started screaming back, a behavior my parents still bring up today. And yet, voice chat is essential in competitive online games, especially modern ones like Call of Duty: Warzone, Apex Legends, Fortnite, Valorant, and Overwatch.

All of these popular games require extensive amounts of teamwork to succeed, which is bolstered by being able to chat with your teammates. But in-game voice chat remains a scary, toxic place—especially for women.

Unfortunately, despite efforts from developers to crack down on toxicity in voice and text chat, it still feels, at times, like I’m stuck in the same world as that 17-year-old girl just trying to compete in peace. And I’m not alone in that feeling. I spoke to several women about their voice chat experiences, as well as reps from some of today’s biggest online games, to get a better understanding of the current landscape.

A 17-year-old Alyssa Mercante sitting on a couch wearing an Xbox headset and using a pink controller.
A 17-year-old me playing Halo 3 circa 2007.
Photo: Alyssa Mercante / Kotaku

Voice-chatting as a woman

Competitive online games are intense, but doubly so if you’re identifiable as outside the industry’s so-called core playerbase for the last 35 years: white, straight, and male. “Marginalized users, especially women, non-binary people, and trans folks, are more likely to experience harassment in voice and video chats,” game researcher PS Berge told Kotaku’s Ashley Bardhan last year.

The moment a woman or woman-presenting person speaks in voice chat, they run the risk of being identified as an “other” and thus deserving of ridicule, ire, or sexual harassment. For many, that fear of being othered and how it could (and often does) lead to harassment directly affects their willingness to speak in competitive game settings.

“I usually wait for someone else to speak first so I know what the vibe will be,” video game level designer Nat Clayton, who regularly plays Apex Legends, told Kotaku via email. “Though I feel more comfortable chatting in Apex than I do going back to older PC games like Team Fortress 2 or Counter-Strike—games where the expectation of bigotry seems absolutely set in stone, where you feel like you cannot turn on voice chat without immediately experiencing a flood of slurs.” Counter-Strike debuted in 2000 and Team Fortress 2 in 2007, and both still attract an older, male-leaning playerbase, many of whom can be hostile to women.

This problem has been long-standing, but companies are doing more to dissuade people from being toxic or abusive in in-game voice and text chat now than they were 10 years ago—though it often doesn’t feel like it.

Microsoft recently announced a new voice reporting feature that will let players save and submit a clip of someone violating the Xbox Community Standards, which a team will then review to determine the next course of action. “Reactive voice reporting on Xbox is designed to be quick and easy to use with minimal impact to gameplay,” reads the press release announcing the new feature. This means that Xbox players can report toxic voice chat no matter what game they’re playing, which adds another layer of protection on top of the ones set up by individual developers.

Those protections include ones laid out in the uber-popular battle royale game Fortnite. If a player is found in violation of Epic’s community rules (which have guidelines against hate speech, inappropriate content, harassment, and discrimination), they could lose access to in-game voice chat—a newer approach to punishment that the company introduced in 2022—or have their account permanently banned. Epic wouldn’t share specific numbers on bans, but did tell Kotaku that its team is “planning to introduce a new feature for voice chat soon.”

But Fortnite “[relies] on player reports to address violations of our voice and text chats,” which places the onus squarely on those who are on the receiving end of such violations. And for games that don’t record or store voice and text chat, reports can feel especially useless. When asked if she has reported people in Apex Legends, Clayton replied, “Many, and often, but unfortunately the current Apex reporting system doesn’t monitor/record voice interactions and so doesn’t take action based on voice chat.”

An Xbox graphic detailing its new voice reporting feature for a "safer community for all Xbox players." It includes images of three people wearing headsets and playing video games.
Image: Microsoft

New ways games are combatting toxicity

Companies don’t always rely on players, though. Activision, Blizzard, and Riot Games all use a mix of automation and human moderation for multiplayer modes in Call of Duty, Overwatch 2, and Valorant.

As detailed in an official Call of Duty blog post from last year, an automated filtering system flags inappropriate gamertags, while human moderation of text chat helps identify bad actors. The aforementioned post (which is from September 13, 2022) boasts 500,000 accounts banned and 300,000 renamed thanks to enforcement and anti-toxicity teams. We don’t have more recent data from the Call of Duty publisher.

After the launch of Overwatch 2, Blizzard announced its Defense Matrix Initiative, which includes “machine-learning algorithms to transcribe and identify disruptive voice chat in-game.” Though Blizzard didn’t say what it considers “disruptive voice chat” or how the algorithms work, the company did say the team is “happy with the results of this new tech” and has plans to deploy it to more regions and in more languages.

But women still often find themselves deploying strategies to deal with the toxicity that isn’t caught by these systems. Anna, a UI/UX researcher who regularly plays competitive games like Overwatch 2 and CS:GO, told Kotaku over email that she also waits to see what the vibe of the chat is before diving in. She’s “more inclined to speak up if I hear another woman too because there’s potentially more safety in numbers then,” she explained. Others, myself included, play solely with friends or offer to group up with women they meet in matches to avoid encountering agitated players.

Toxicity persists, which is likely why companies continue to try new methods and approaches. When Kotaku reached out to Riot Games for details on its efforts combating disruptive behavior and toxicity in Valorant, executive producer Anna Donlon said via email that:

In addition to the player reporting tools, automatic detection system, and our Muted Words List, we’re currently beta testing our voice moderation system in North America, enabling Riot to record and evaluate in-game voice comms. Riot’s fully-dedicated Central Player Dynamics team is leveraging brand new moderation technology, training multi-language models to collect and record evidence-based violations of our behavioral policies.

While companies struggle to find a solution to an admittedly complicated problem, some women have been discouraged from trying altogether. Felicia, a PhD candidate at the University of Montana and full-time content creator, told Kotaku that she used to say hello at the start of every game (she mainly plays Fortnite and Apex Legends) but that willingness eventually “turned into waiting to speak, then not speaking at all.” The shift came as a direct result of her experience using Overwatch’s in-game voice chat function. “It got so bad I’d only talk in Xbox parties,” she said of the feature which allows you to group up and voice chat with friends.

Jessica Wells, group editor at Network N Media, speaks up in her CS:GO matches despite the threat of toxicity. “I say hello, give information, and see how it goes. If my team is toxic to me, I’ll either mute individuals or mute all using the command,” she said via email. “I used to fight it—and I mean really fight the toxicity online—but I find toxicity breeds more toxicity and the game goes to shit as a result.”

Overwatch's D.Va stands out of her fighting mech with her arms crossed next to the words "Defense Matrix Initiative"
Image: Blizzard

Toxicity persists and worsens in highly competitive games

If you’ve played ranked matches in games like Overwatch or Valorant, you’ve experienced this direct correlation: Verbal harassment increases when competition levels increase. And no one experiences this phenomenon more acutely than women.

Alice, a former Grandmaster Overwatch 1 player, told Kotaku over email that her experience with the original game “changed how [she] interacted with online multiplayer.” She was ranked higher than her friends, so would have to queue for competitive matches alone, and said she’d get “the usual ‘go make me a sandwich’” remarks or requests to “let your boyfriend back on” in more than half of her games.

Overwatch is a curious case when it comes to harassment and toxicity. Despite a cartoonish visual design that suggests a more approachable game and a diverse cast of characters, competition is at the heart of the team shooter’s identity. Over time, patches and updates have focused on balancing competitive play, and its popular esports league encourages highly competitive gameplay. Overwatch players who regularly watch Overwatch League may be more prone to “backseating” (telling other players what to do) or be more judgmental of the way people play certain characters. And the more extreme ire is often directed towards women—especially those who play support or the few playing Overwatch at a professional level.

“Sometimes someone else on the team would stick up for me, but most of the time the other players would stay silent or join in.” Alice’s experience may not be surprising when you consider a study that tracked over 20,000 players and found that men played more aggressively when their opponents or their characters were women. “Through our research, we found that women did perform better when they actively concealed their gender identities in online video games,” the study said.

Alyssa Mercante in a photo from around 2011, sitting on a bed with an Xbox 360 controller and headset.
Me, likely playing Call of Duty: Black Ops or Modern Warfare 3 circa 2011.
Photo: Alyssa Mercante / Kotaku

Because of her consistently negative experiences in Overwatch voice chat, Alice plays Valorant now—just not ranked. She chooses not to play at a higher level because competitive Valorant (which also has its own uber-popular esports league) is a cesspool of toxic masculinity.

Anna, who regularly plays Riot Games’ 5v5 hero shooter, told Kotaku over email that she’s “encountered increasing amounts of toxicity in Valorant…which can include anything from sexual assault threats, threats of general violence or death threats, to social media stalking.” Male players have told her to “get on [her] knees and beg for gun drops, and proceed to use their character to teabag or simulate a blowjob.”

Anna says she changed her Riot ID to a “common household object” to try and prevent harassment from male players.

Stacy, a full-time streamer, told Kotaku via email that the harassment has bled into the real world, too. “Threats of DDOS, stalking, assault, murder and other crimes - a lot of which ended up on my live stream...I’ve had people ask me for my personal connections and accounts like Snapchat...as well as my phone number, and have even had people use my PSN account name to find me on social media like Instagram for non-gaming related reasons. [They even found] my email address to try to either harass me, send me unsolicited photos or attempt to bully and berate me beyond the console.”

The future of competitive games for women

It’s clear that even with automated moderation systems, extensive reporting options, and loud declarations against toxicity from publishers and developers, women who play competitive online shooters still regularly experience harassment.

“I have reported people in the past and it was an easy report button but with all the toxicity I encountered it made it feel like reporting them wouldn’t make a difference,” Felicia said. “I stopped reporting for the most part unless they come into my stream or in my comment section being toxic.”

Overwatch has a feature that will show you a pop-up upon login if the team has taken action against someone you’ve reported, but many players rarely (if ever) see that notification. I’ve only ever seen it once.

Jessica finds that reporting players in CS:GO is virtually useless. “I can’t think of a single case where it felt like Valve directly took action,” she said.

An image Apex Legends news site Alpha Intel shared on International Women's Day featuring all the women characters in the game.
Image: Alpha Intel / Respawn

The same can be said for Valorant, which has a similar reporting feature as Overwatch. “I think I’ve only seen [the report was actioned on] screen three or four times since it was implemented,” Anna said.

And though the process of reporting is simple, it requires women to retread traumatic territory. “With the particularly nasty people, it always feels gross having to recount the words someone used to explain how they’d like to assault me, or typing (partly censored) slurs that I’d never dream of using myself, but it feels like if my report is not water-tight, it won’t get dealt with,” said Anna.

Unfortunately, eliminating toxic game chat, like so many other problematic things in the gaming industry, requires changing the perspectives of people perpetuating the problem. We need a holistic approach, not one that’s centered solely on automated monitoring or the reports of victims.

“I think more than anything it is a cultural problem,” said Alice. “FPS games are ‘for boys’ and until we change that perception, I think people will continue to be rude in them, especially when there are minimal consequences.”

Game studios can and should center more women and marginalized creators, players, and developers in marketing materials, streams, and esports events—and they should make it explicitly clear that a toxic culture has no place in their games. Instead of shying away from providing details on banned or otherwise penalized players as a result of toxic behavior, studios should wear them like a badge of honor, presenting them proudly as a way of saying “you have no place here.”

Shooters like Splatoon 3 are a great example of how competitive games can be less toxic. Nintendo’s ink-based shooter has minimal communication tools and a diverse character creator that allows for more gender fluidity, making it feel less like a “boys’ game.” The perceived casualness of Switch players stands in stark contrast to console warriors and PC try-hards, which raises the question: Can competitive games exist without toxicity?

Nat Clayton has some suggestions: “You need to visibly and publicly create a culture where this kind of behavior isn’t tolerated, to make your community aware that being a hateful wee shit to other players has consequences.”

Update 07/24/23 at 12:00 p.m. EST: The original story included a Jessica Wells quote about Overwatch, but Wells was referring to CS:GO’s reporting system, which is called Overwatch. The quote has been adjusted to reflect that.