
What does toxicity look like in an online match?

Alexander Gee
January 1, 2021

Imagine playing an online game for 50 minutes in which you continuously receive abuse from another player, and one in every five messages is abusive. This does happen; have a look at the example below.

We built a machine learning model tailored to a specific online game and applied it to the historical chat data from one of its matches. Most of the toxic messages (shown in red) come from a single player, who berates the other players continuously throughout the 50-minute match, even after being reported. Several players try to confront or de-escalate the behaviour, without success; by the end of the match, one in five messages has been deemed toxic.
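To make the analysis concrete, here is a minimal sketch of how a match's chat log could be scored. Everything in it is invented for illustration: the `is_toxic` word-list check is only a placeholder for a trained, game-specific classifier, and the player names and messages are made up.

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class ChatMessage:
    minute: float    # time offset within the match, in minutes
    player_id: str
    text: str


# Toy stand-in for a trained, game-specific toxicity classifier.
# A real model would be trained on labelled chat from that game;
# this word list exists only to make the example self-contained.
TOXIC_MARKERS = {"noob", "trash", "uninstall"}


def is_toxic(message: ChatMessage) -> bool:
    return any(marker in message.text.lower() for marker in TOXIC_MARKERS)


def toxic_share(messages: list[ChatMessage]) -> float:
    """Fraction of a match's messages flagged as toxic."""
    if not messages:
        return 0.0
    return sum(is_toxic(m) for m in messages) / len(messages)


def toxic_counts_by_player(messages: list[ChatMessage]) -> Counter:
    """Flagged messages per player, to spot a single repeat offender."""
    return Counter(m.player_id for m in messages if is_toxic(m))


# Invented chat log for a single match.
match_chat = [
    ChatMessage(2.0, "player_1", "gl hf"),
    ChatMessage(5.5, "player_4", "you are trash, uninstall"),
    ChatMessage(6.1, "player_2", "ignore him, focus on the game"),
    ChatMessage(12.3, "player_4", "never seen such a noob"),
    ChatMessage(48.9, "player_3", "gg"),
]

print(f"{toxic_share(match_chat):.0%} of messages flagged as toxic")
print(toxic_counts_by_player(match_chat))  # Counter({'player_4': 2})
```

Scoring messages per player, rather than only per match, is what surfaces a single repeat offender like the one described above.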

Our machine learning models and community tools could have prevented this situation through automation and real-time insights. We are on a mission to empower community managers and game developers to take action before situations like this arise in their games.
