Modulate’s FPA Pledge

Cross-posted from our blog.

Solving toxicity and disruptive behavior in gaming isn’t a problem that any single organization can overcome on its own. That’s why Modulate is proud to be a member of the Fair Play Alliance (FPA), a group of gaming-related companies committed to improving the experience of online gaming for all. As the culmination of deep discussions and genuine research, the FPA recently published a framework for approaching these challenges, and issued a concurrent call to action: for all interested companies to create a list of actionable goals for the next six months covering what we can do to improve our communities.

Modulate is in a bit of an unusual position for such a pledge — as a software vendor, more often than not we’re partnering with community leaders on their specific games, rather than deploying our technology independently into any given community. On the flip side, though, we are positioned extremely well to have a broad impact, since any solutions we devise can flow into a wide range of games quite quickly through our network of partnerships. So we’re excited to outline a few of the challenges described in the FPA’s framework, and some of the work we’ll be committing to in the coming months to tackle them.

Gaining the benefits of voice chat without the risks

Chris Priebe at Two Hat Security noted a few years ago that users participating in some form of social chat are more than 300% more likely to continue playing. Another study emphasizes that voice chat is uniquely empowering for many communities — nearly 80% of female players felt at least partially empowered by the opportunity to interact with others in games differently from how they would in the real world, even as 75% of those same players reported experiencing active harassment while playing. Additional research has found that voice chat creates closer bonds than text chat, builds stronger and more accurate feelings of empathy, and overall simply enhances the social experience online.

Most importantly, though, many players will find a way to use voice chat regardless. Those who avoid voice are left at an unfair disadvantage, and if players are pushed onto third-party tools without moderation, things are likely to become even more dangerous and toxic.

Solving this problem is a core part of Modulate’s mission, and we’re already hard at work building the tools to empower more players to speak up and to detect and stop bad behavior. Over the coming months, we’ll be working with a variety of studios to deploy these tools into real communities and evaluate how player experiences improve.

Reshaping social dynamics

Reshaping a community’s social dynamics might sound like a fanciful dream, but we believe it’s possible, and we’re hard at work building the tools to make it happen. The key insight, as the framework notes, is that most harmful behavior isn’t committed by someone fundamentally malicious. Offenders might be misinformed, led astray by a bad mentor, frustrated and coping poorly, or simply making a mistake, but the vast majority would prefer to get along with others. The issue is that a combination of cultural elements, emotions, and social pressures pushes them off course, and that the standard approach of punishing these bad actors tends to reinforce bad behavior more often than not.

Modulate’s ToxMod system provides the tools necessary to understand this sort of contextual nuance. Rather than trying to make predictions about people based only on their text chat, ToxMod gains insight into the way each player’s emotions evolve over the course of the game, providing a much more fine-tuned and actionable understanding of what sorts of experiences tend to trigger players to become disruptive.
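To make that idea concrete, here is a purely illustrative sketch of what tracking a player’s emotional trajectory over a match might look like. This is not Modulate’s actual implementation; the emotion scores, class names, and thresholds below are all hypothetical, standing in for whatever an upstream speech-analysis model would produce.

```python
from collections import defaultdict, deque
from dataclasses import dataclass


@dataclass
class VoiceSegment:
    player_id: str
    timestamp: float   # seconds since match start
    agitation: float   # hypothetical model output in [0.0, 1.0]; 1.0 = highly agitated


class EscalationTracker:
    """Tracks each player's recent agitation and flags sustained spikes,
    rather than reacting to any single heated moment."""

    def __init__(self, window: int = 5, threshold: float = 0.7):
        self.window = window        # number of recent segments to consider
        self.threshold = threshold  # sustained average that triggers review
        self.history = defaultdict(lambda: deque(maxlen=window))

    def observe(self, seg: VoiceSegment) -> bool:
        """Record a segment; return True if this player's recent trajectory
        suggests escalation worth surfacing to a human moderator."""
        scores = self.history[seg.player_id]
        scores.append(seg.agitation)
        if len(scores) < self.window:
            return False  # not enough context yet to judge a trend
        average = sum(scores) / len(scores)
        rising = scores[-1] > scores[0]  # the trend matters, not just the level
        return average > self.threshold and rising


# Usage: feed segments in timestamp order and surface flags to a human team.
tracker = EscalationTracker()
match_audio = [
    VoiceSegment("p1", t, a)
    for t, a in [(10, 0.3), (25, 0.6), (40, 0.8), (55, 0.9), (70, 0.95)]
]
for seg in match_audio:
    if tracker.observe(seg):
        print(f"Flag {seg.player_id} at t={seg.timestamp}s for human review")
```

The design choice worth noting is that the tracker only fires on a sustained, rising pattern, which mirrors the point above: a single angry outburst is far less informative than the shape of a player’s experience over time.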

Of course, we need to be immensely careful here, as we are all too aware of the ways AI systems tend to build biases when asked to make these kinds of predictions — especially given that a player’s voice conveys information about their demographics too. Given this risk, we’re approaching this work extremely slowly, and likely will never trust a black-box AI system to actually reshape player interactions entirely on its own. But over the next few months, we’re aiming to at least begin surfacing initial insights to individual community teams, who will then be able to use them as jumping-off points for further, human-guided exploration of their community dynamics. We hope that even this spark alone may lead to some powerful new ideas about how to create more inclusive spaces by design.

Defining what “good” looks like

Modulate is thrilled to be a part of such a major industry shift, and we’re deeply excited and humbled by the substantial opportunity here to improve the lives of so many. If you have any feedback on how we’re thinking about enabling safer and more inclusive online communities, please don’t hesitate to reach out to us at ethics@modulate.ai.

