Still, when the biggest game during the summer break involves kids cultivating their own gardens, it’s easy to see why parents let their children continue to use the platform. “Maybe it’s because of the graphics, or maybe it’s because of the way it’s marketed, [but] there’s an assumption that Roblox is made for children, and parents assume that there are copious amounts of safeguards,” says Rachel Kowert, a research psychologist and author whose work focuses on the social impact of video games.

To filter out problematic content, Roblox uses a combination of human moderators and automated systems driven by artificial intelligence and machine learning. But researchers have repeatedly shown that these measures can be circumvented by replacing banned words with coded language that the automated systems struggle to track. In many of the lawsuits filed to date, the victims claim the alleged perpetrator moved conversations from Roblox to platforms like Discord or Snapchat, where there is even less oversight.

When contacted by WIRED, a Snap spokesperson said the company is committed to combatting child sexual exploitation and that it is working to improve its technology to detect people who violate Snapchat’s guidelines.

One of the ways Roblox aims to prevent people from moving communications to other platforms is through Trusted Connections, a new feature that allows users over the age of 13 to have unfiltered chats with other users they know. Users who have verified their age—by sharing a video of themselves that is then scanned by AI—can add others between the ages of 13 and 17 without any additional steps.

Users over the age of 18 can only become Trusted Connections if they know a younger user in real life, according to Roblox. This is confirmed either by scanning a QR code in person or by the adult's phone number appearing in the user's contacts.

But critics worry that these measures may already be doomed to fail. On Discord, which began rolling out similar features in recent months, it took just a few days for gamers to figure out how to fool age verification. Using the photo mode in Death Stranding, they showed the platform an image of the game's protagonist, whose likeness is modeled on actor Norman Reedus.

In addition to the lawsuits being filed against Roblox, law enforcement agencies across the US have been tracking grooming on the platform. In 2024, Bloomberg Businessweek reported that US authorities had arrested at least two dozen people whom law enforcement accused of abducting or abusing victims they'd met or groomed using Roblox since 2018. The arrests have continued since.

In addition to individual predators on the platform, researchers have shown how members of nihilistic groups like No Lives Matter, CVLT, and 764—all of which operate as part of the broader Com network—have repeatedly used Roblox as a place to recruit children, encouraging them to share sexually explicit images and videos, to self-harm, or, in extreme cases, to take their own lives. Members often do this to gain clout within the groups.

The new safeguards, claims extremism researcher Ry Terran, can only do so much to prevent this. In her view, they give “young teens more opportunities to chat, not less.” The “parental controls for teens are up to the teens, not the parents,” Terran says. “But they’re calling these ‘extra safety’ features to shift the burden of safety onto kids and parents and away from themselves.”
