The trauma can be overwhelming for the young victims. An F.B.I. study reviewing a sample of sextortion cases found that more than a quarter of them led to suicide or attempted suicide. In 2016, a Justice Department report identified sextortion as “by far the most significantly growing threat to children.”
A few protections against online predators seem simple, but logistics, gaming culture, and companies’ fear of losing customers all stand in the way.
Companies could require identification and parental approval to verify ages, so that children play only with people their own age. But even as some platforms have experimented with programs like Real ID, Blizzard’s identity-verification effort, gamers have resisted giving up anonymity.
“There’s been community-layer rejection of those systems because people like to be able to be anybody,” says Todd Harris, who co-founded Hi-Rez Studios, a game development company.
While Facebook has algorithms that can detect some red-flag behaviors in written messages, many gamers use audio and video chat. And eliminating audio and video would be a death sentence for a gaming company fighting for customers, because players depend on talking with their teammates. “You can’t seriously compete without talking,” Harris says. “The team with the best communication will win.”
Separately, some gaming companies deploy automated systems that they say can detect common warning signs, including attempts to move a chat off the platform. Microsoft, which owns Xbox and the popular game Minecraft, says it plans to release software this year that can recognize behaviors often associated with sextortion. The company says it would offer the software to other tech businesses free of charge.
Sony, the maker of PlayStation, says it takes sextortion seriously, pointing to its tutorials on parental controls and tools that let users report abusive behavior. And indeed, there has been some success in catching perpetrators.
But the solution many game developers and online safety experts return to is that parents need to know what their children are playing, and that young people need to know what tools are available to them. Sometimes that means blocking users and shutting off chat functions, and sometimes it means monitoring the games as they are being played (see “How to Protect Yourself,” below).