Gamers Say They Love Voice Chat, But 49% Have Been Victims of Toxic Behavior — New Report

Mar 08, 2023 - Helsinki, Finland

A survey of a nationally representative sample of more than 1,000 U.S. gamers found that nearly 70% have used voice chat while gaming. Nearly half of all gamers and 72% of voice chat users reported experiencing toxic behavior while using voice chat. Most victims have experienced multiple incidents, but only about one-third have reported them. The survey was commissioned by Speechly Ltd., a leading speech-recognition company that began providing artificial intelligence-based voice chat moderation tools to the game industry in 2022.

Speechly, a leading provider of fast and cost-efficient speech recognition tools for voice chat moderation deployed on-device, on-premises, or in the cloud, announced today the release of the first Voice Chat Toxicity Report for Online Games. The report is based on a survey of a representative sample of over 1,000 U.S. adults who play online games and offers new insight into the problem of toxic behavior in online game voice chat.

“A plurality of gamers say voice chat improves the playing experience, and game makers like that voice chat users play longer and more frequently. However, gamers also say voice chat is the worst channel for toxic behavior in gaming, significantly worse than in-game play and text chat. Many of these gamers want game makers to implement proactive monitoring to create a better experience for everyone. The victims want help containing the problem,” said Otto Söderlund, CEO of Speechly.

Report research director Bret Kinsella, founder of Voicebot Research, commented, “The most striking findings to me were that half of all gamers have experienced toxic behavior in voice chat, most of those were subjected to three or more incidents, and only about 36% say they have ever reported a toxic event to a game maker. We know from other research that most game makers are only aware of toxic incidents originating in voice chat after a report is filed. This means it is likely that 60-90% of incidents are never reported and the game makers have zero visibility into the scale and scope of this problem.”

According to the report, nearly two-thirds of toxic behavior victims in online game voice chat have never reported an incident, and those that have didn’t report every incident. The problem is much larger than previously reported. Voice chat is widely used today and liked by gamers, but they also say it is the worst channel for toxicity in terms of reach, frequency, and severity.

In addition, false accusations of toxic behavior in voice chat are a known problem in games, and few game companies have a way to verify who is telling the truth. Gamers favor solutions ranging from proactive monitoring and recording audio to verify incidents, to simple one-click reporting.

Speechly understands that video games are social experiences and voice chat toxicity is a major challenge. The company hopes that the Voice Chat Toxicity Report for Online Games will draw attention to the issue and encourage game companies to take action to address it.

Charts and Diagrams included in the report:

  1. Gamer Voice Chat Use 2023
  2. Toxic Behavior by Game Engagement Channel
  3. Gamer Experience with Toxic Behavior in Voice Chat
  4. Impact of Toxic Behavior Incident
  5. What Gamers Want for Voice Chat Moderation
  6. Voice Chat Moderation Maturity Model
  7. Gamer Sentiment About Voice Chat
  8. Percent of Games Players Say They Use That Include Voice Chat
  9. Gamer Text Chat Use
  10. Gamer Voice Chat Use
  11. Discord Users of Voice Chat Outside of Game Platforms While Playing
  12. Gamer Use of Voice Chat on Another Platform While Playing
  13. Discord Users Frequency of Voice Chat Use While Gaming
  14. Who Gamers Connect with Via Voice Chat
  15. How Gamers Use Voice Chat
  16. Encountered Toxicity by Platform
  17. Frequency of Toxicity Encounters by Platform
  18. U.S. Adult Gamers That Experience Voice and Text Chat Toxic Incidents
  19. Voice and Text Chat Users That Face Toxic Incidents While Gaming
  20. Frequency of Voice Chat Toxicity
  21. Estimated Number of Toxic Behavior Incidents Per Victim in Voice and Text Chat
  22. Significant Toxic Behavior Problems by Game Engagement Channel
  23. Voice Chat Toxic Behavior Incidents by Offense Category
  24. Player Behavior Immediately After a Toxic Incident
  25. Player Usage After a Toxic Incident
  26. Top 5 Incident Categories Most Likely to Lead to Reduced Play or User Churn
  27. How Toxic Incidents Affect Player Perceptions of Games
  28. Incident Categories That Generate The Most Negative Perception of Games
  29. Actions Taken by Victims of Toxic Behavior
  30. Post-Incident Action Taken by Game Moderators
  31. Incident Categories Most Likely to Result in a Permanent Ban
  32. Incident Categories Most Likely to Result in a Temporary Ban
  33. Incident Categories Most Likely to Result in a Warning
  34. Victim Perception of Moderation Outcome
  35. Incident Categories Where Victims Are Most Likely to Rate Moderation Negatively
  36. Incident Categories Where Victims Are Most Likely to Rate Moderation Positively
  37. How User Moderation Expectations Compare to Outcomes
  38. Interest in Real-Time Voice Chat Monitoring by Gamer Frequency of Play
  39. Frequency of Reported False Accusations of Toxic Behavior in Voice Chat
  40. Voice Chat Users Reporting False Accusations of Toxic Behavior
  41. Gamer Text Chat Experience
  42. Gamer Experience With Toxic Behavior in Text Chat
  43. Online Gamer Experience With Text Chat Toxicity
  44. Voice and Text Chat Toxic Behavior Online Gamers Experience

Otto Söderlund added, “As a company of engineers, we wanted to understand why game companies were asking for help with this problem. When you learn that about half of all gamers and over 70% of voice chat users have experienced toxic behavior, and that it immediately changes their playing behavior and sentiment toward the game, it becomes clear that a solution is needed. Deploying proactive monitoring and auditing for voice chat is a tough technical challenge to address cost-efficiently, but advances in AI for other use cases have made this practical. We are happy to be addressing a problem that has such a far-reaching impact.”

You can read more or download the report on the Speechly website. For more information, please contact Speechly at

About Speechly:

Speechly is a leading provider of speech recognition tools for voice chat moderation. The company is a team of engineers with previous experience building popular voice assistants such as Siri and Amazon Alexa. Speechly entered the video game industry in 2022 during its Y Combinator batch, when leading AAA game studios challenged the Speechly team to help build cost-effective speech recognition tools to overcome the challenges of voice chat moderation.