
Article and photos by Marcus Siu
As a regular “Call of Duty” player over the years on the Sony PlayStation platform, I’ve always noticed that the number of online “undesirables” tends to increase in the wee hours of the night. These are the diehard gamers who play 24/7 with a few cans of Red Bull next to them to keep them jolted. For them, winning is a life-and-death matter, so they do absolutely everything in their favor for that desired outcome, even if it means cheating or yelling obscenities and insults at their teammates to make them do things differently.
For certain, these aren’t the friends I want to hang out with.
Even though “Call of Duty” games are rated PEGI 18, suitable only for players 18 or over, there is always a mix of children, teenagers and adults playing online during the peak hours of the evening. However, once it’s past bedtime for those kids, the lobbies turn super aggressive with toxic adult gamers. At times, I can’t believe how much swearing and how many threats I hear online. In those moments, I wish I could recruit an online military SWAT team to take that player out.
Back in the day, when everyone wore headsets, it was exciting to chat with your teammates and plan strategies together. However, because of how toxic online gaming has become over the years, especially since esports turned gaming into a professional sport, typical players nowadays don’t even want to use their headsets, preferring to enjoy the game at their own pace and skill level.
I’ve always wondered if there was any hope of battling this type of online conduct, and was very happy to discover some answers during my visit to GDC 2022, the Game Developers Conference held in San Francisco last month.

UNITY
Last August, Unity, the world’s leading platform for creating and operating real-time 3D content, acquired OTO, an AI-driven acoustic intelligence platform that can be leveraged to build and foster safer voice and text chat environments in games. OTO will be integrated into Unity’s industry-leading Vivox platform as a cornerstone for solving one of gaming’s global challenges: the rise of toxic behavior that leads to poor player experience and, ultimately, lost revenue for game creators.
Unity also released findings from a new survey conducted by The Harris Poll on its behalf that illustrates the growing problem around toxicity that impacts not just players, but profits for developers and creators. This survey was conducted online within the United States from June 21-23, 2021. Key findings of the survey include:
- Nearly seven in 10 players (68%) – defined as those who played multiplayer games in the past year – said they’ve experienced toxic behavior while playing multiplayer games (e.g., sexual harassment, hate speech, threats of violence, doxing).
- Nearly half of players (46%) say that they at least sometimes experience toxic behavior while playing multiplayer video games, with 21% reporting it every time/often.
- 67% of players were very/somewhat likely to stop playing a multiplayer video game if another player were exhibiting toxic behavior.
- 92% of players think solutions should be implemented and enforced to reduce toxic behavior in multiplayer games.

Q&A with CLEANSPEAK
CleanSpeak is a content filtering and moderation platform that protects online communities from inappropriate content. Available self-hosted or cloud-based, its products serve customers worldwide. The company helps the world’s largest companies manage their user-generated content and protect their brands from PR disasters and the user attrition caused by offensive and inappropriate content.
Founded in Broomfield, Colorado, the company works with many of the world’s largest game companies, including EA, Ubisoft, Activision, Big Fish, Amazon Games, PerBlue, Pokémon, and even Nintendo.
I spoke with Brian Pontarelli, the founder and co-CTO of CleanSpeak, as well as FusionAuth, which deals with identity and access management. Both companies are under the umbrella of parent company Inversoft.
He gave me some insight into how the company deals with online profanity, as well as its origins.
“I started the company about 15 years ago. I was working on a social network at the time, and we wanted to do filtering around users’ profiles, their usernames, any content that they were posting, because we wanted to keep the community clean. As we grew, we just started to realize it was a great fit for game companies.”
“Game companies started coming to us and they said, ‘We’re trying to filter usernames, we’re trying to filter our chats, we really want toxicity to go down, and we really want users to enjoy playing and get rid of trolls and issues.’ So then we started building moderation tools where you could actually discipline and kick users out in real time as events were happening.”
Q: Could you give me an example of what gets a player kicked out?
A: “Let’s say a player is being really inappropriate; they’re using racial slurs, they’re telling people that they’re going to kill them, or they’re just using a lot of profanity in a really obscene way. A moderator might see that activity, or CleanSpeak might analyze their profile and realize that they’re being really offensive…and then what we’ll do is what’s called a ‘discipline,’ so we’ll activate an ‘event’ on their account that causes them to be kicked out of the game and not be able to log in for a certain period of time.”
Q: How can they get back in?
A: “In most cases, they can open a support ticket and ask the moderation team to let them back in and basically take off that ‘discipline,’ but most of the time they just have to wait. That reduces the overall toxicity of the environment, and a lot of times, when gamers realize that the account they worked really hard on might be taken away, they tend not to come back and keep doing it.”
Q: Do they get a warning?
A: “You can do warnings. We have lots and lots of layers. Everything is customizable, so you can start with a warning, go to a ‘mute’ so that they can still play the game but can’t chat, and then go all the way to a ban. It’s all customizable. Our customers get to define what is best for their company.”
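The escalation ladder Pontarelli describes – warning, then mute, then ban – can be sketched as a simple, configurable policy. The snippet below is an illustration only, with hypothetical names; it is not CleanSpeak’s actual API:

```python
from dataclasses import dataclass

# Hypothetical escalation ladder: each offense moves a player one step
# up. The actions and their order are customizable, mirroring the
# warning -> mute -> ban flow described in the interview.
ACTIONS = ["warning", "mute", "ban"]

@dataclass
class Player:
    name: str
    offenses: int = 0

def discipline(player: Player) -> str:
    """Record an offense and return the next action on the ladder."""
    player.offenses += 1
    # Once the ladder is exhausted, stay at the most severe action.
    step = min(player.offenses - 1, len(ACTIONS) - 1)
    return ACTIONS[step]

p = Player("troll42")
print(discipline(p))  # warning
print(discipline(p))  # mute
print(discipline(p))  # ban
print(discipline(p))  # ban (remains at the most severe step)
```

A real moderation platform would layer detection (filters, moderator reports) on top of a policy like this and persist the discipline state per account.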
Q: Is Sony PlayStation going to enlist?
A: “Potentially. You know, we’ve been talking to Sony now for a number of years, and we hope that they join the team… join the family.”
Playing video games should be for everyone, no matter what their skill level. No one needs to play in a toxic environment where it’s no fun, alongside gamers who treat their gaming like “boot camp,” as if their lives depend on it. It’s nice to know that there are companies out there working to make gaming more enjoyable for all of us as they continue to reduce the problems that so many game companies and gamers face.
After all, gamers just wanna have fun.