As a society, we generally agree that children should be protected from certain extreme dangers. For instance, children aren't allowed to gamble in casinos or to drink alcohol. If they sneak into a casino or a bar, even with an excellent disguise or a fake ID, we have all agreed to penalize the institution they fooled. Even if a doorman made a good-faith effort to check the ID, we still penalize the bar. Pragmatically, this is because we want to incentivize these institutions to always be on the lookout for underage patrons, but there is also a deeper reason: we, as a society, have agreed that anyone hosting potentially dangerous activities has a moral obligation to ensure that those who are underage are protected.
Unlike gambling and drinking, gaming has many well-documented benefits for people of all ages, and especially for children. These range from forming strong new friendships and gaining familiarity with people outside one's bubble to learning strategic thinking and technical skills, and in many cases even emotional development. But at the same time there are potential harms as well, and none so severe as the risk of child predators targeting minors on these platforms.
Let me be clear: the number of child predators in these online games is tiny, a fraction of a percent of a percent. The overwhelming majority of kids who play an online game will do so completely safely.
But the unfortunate truth is that the predators are out there. And just as we have agreed as a society that underage drinking cannot be condoned, I am even more certain we all agree that sexual predation of children absolutely must not be enabled, no matter what.
Most game studios would agree with this as well. The studios Modulate works with typically share our passion for solving this problem, readily acknowledging their moral obligation to protect the youngest players on their platforms. Yet there are also studios out there that close their eyes to the problem. More than a few, in fact, make the conscious decision not to collect data about voice conversations, specifically because they believe they cannot be held liable for offenses they never heard take place.
But if a child wanders into a casino, grabs a cocktail from a tray, and starts playing blackjack, I doubt many of us would accept the casino's excuse that it had fired all its security guards and so couldn't possibly have been aware of the transgression. It's their job to notice; if they didn't, they don't get a pass — it just means they were that much worse at their job.
I'm pleased to report that the gaming industry, on the whole, has largely woken up to its obligations here. And I readily acknowledge that real questions remain about how to implement this sort of oversight, especially while respecting users' privacy as much as possible and without depriving younger players of the many benefits that online gaming can provide.
But the bottom line is that games create a space for social interaction in a way not that different from casinos or bars. So if we, as a society, agree that the latter two must be held accountable for what happens on their premises, the same should be true for games. If a studio knows minors will try to use its platform, it doesn't matter that it advertised itself as adults-only, that minors had to lie on an age form to get in, or even that the studio tried its best. The only thing that matters is whether it can provide a real guarantee of safety — and the first step toward that is to actually look at what's happening in the first place.