voice tech

The Dirty Dozen - The Impact of 12 Types of Toxic Behavior in Online Game Voice Chat

Collin Borns

Mar 09, 2023

3 min read

Speechly surveyed over 1,000 online gamers about toxic behavior in voice and text chat. The results show that offensive names, trolling, bullying, and annoying behavior top the list with the broadest impact, and that these behaviors are 50% to 200% more frequent in voice chat than in text chat.


Speechly just published the results of a national consumer survey of online gamers about their experiences with toxic behavior in voice and text chat. Offensive names, trolling, bullying, and other annoying behavior topped the list for the broadest impact in both spoken and written communications in games. 

However, gamers have experienced each of these bad behaviors 50% to 200% more often in voice chat than in text chat. And how players react to these incidents differs by the type of offense.

Download the Voice Chat Toxicity Report for Online Games

Toxic Behavior Incidents by Offense Category

This variance in the frequency of toxic behavior clearly influenced other results as well. Gamers rated voice chat toxicity as significantly worse than text chat, and they experienced an average of 35% more incidents per victim.

Player Impact

Over two-thirds of gamers changed their behavior immediately after experiencing a toxic incident in voice chat: about 40% turned off voice chat, and 28% stopped playing for the day. Only 29% said their gameplay was unaffected by the incident.

Player Behavior After Toxic Incident

The longer-term impact is even more troubling for game makers. Almost 39% of players say they reduced play or quit the game entirely after experiencing toxic behavior in voice chat. These incidents clearly have a significant impact on victims' behavior, and on their perception of the game as well.

Player Usage after Toxic Incident

Different Behavior, Different Impacts

Some game makers have told Speechly that they do not differentiate between forms of toxic behavior, preferring to treat every incident as equally bad. However, players clearly don't react the same way to these incidents. The type of toxicity matters.

For example, gamers revealed that reduced play or game abandonment is far more common after stalking and sexual harassment incidents than for name-calling, trolling, and the use of explicit language. While 38.7% of victims of any type of toxic behavior in voice chat will reduce play or abandon the game after the incident, the figure is 52.5% for stalking and 50.6% for sexual harassment.
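The per-category breakout above is just a grouped tally: for each offense type, the share of victims who reduced play or quit. A minimal sketch of that aggregation, using entirely hypothetical survey rows (the real report's dataset and category labels are not reproduced here):

```python
from collections import defaultdict

# Hypothetical survey responses: (offense_category, reduced_play_or_quit)
responses = [
    ("stalking", True), ("stalking", True), ("stalking", False),
    ("sexual harassment", True), ("sexual harassment", False),
    ("trolling", True), ("trolling", False), ("name-calling", False),
]

# Tally, per offense category, how many victims reduced play or quit.
totals = defaultdict(int)
reduced = defaultdict(int)
for category, quit_or_reduced in responses:
    totals[category] += 1
    if quit_or_reduced:
        reduced[category] += 1

# Per-category abandonment rate, mirroring how the report contrasts
# figures like 52.5% for stalking against the 38.7% overall rate.
rates = {c: reduced[c] / totals[c] for c in totals}
overall = sum(reduced.values()) / sum(totals.values())
```

With real data, comparing `rates` against `overall` is what surfaces the gap between, say, stalking victims and the average victim.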

Top 5 Incident Categories

We also see differences in player perception of games depending on the type of toxic incident. In general, most victims of toxic behavior differentiate between the bad actors and the game. The optimistic way to interpret the data is that only 17.3% of players are “less likely” or “much less likely” to recommend a game after experiencing toxic behavior in voice chat. However, given how important word-of-mouth promotion can be for game success, even this figure is surely troubling. And it is far worse for sexual harassment.

Negative Perception Incident Categories

Nearly 28% of victims of sexual harassment in voice chat say they are less likely to recommend a game. The figure is 23.8% for offensive names and 23.1% for bullying. Voice chat toxicity is clearly bad for games regardless of the category of offense. But, it is also worth noting that the behaviors and attitudes of the victims differ depending on the type of toxic incident. 

This doesn’t suggest that game makers should combat some forms of toxic behavior and ignore others. Instead, the findings indicate that game makers should become more proactive in identifying both the incidence of toxic behavior and the frequency of offense types. Victims of the bad behaviors have different reactions and also different expectations about how the game maker should respond depending on what happened. 

Data-Driven Understanding

The data referenced above are included in a new 60-page report developed by Speechly and Voicebot Research. The report includes over 40 charts and diagrams and is free to download.

Speechly commissioned the research after learning that game makers had very little information about their players’ voice chat experience beyond the complaints submitted and some social media posts and game reviews. We hope that the new report can be helpful for game developers working to improve player experience by reducing the incidence of toxic behavior in voice chat.

Download the Voice Chat Toxicity Report for Online Games
