Roblox has introduced a suite of safety enhancements aimed at teenagers aged 13-17, featuring an innovative AI-driven age verification system that analyzes video selfies to estimate a user's age.
Today's update outlines several new tools designed to bolster safety for teens and children on the Roblox platform. Central to these updates are features tailored for teens aged 13-17, giving them more platform freedom than younger users but stricter oversight than adults. Teens can now designate "trusted contacts" for unfiltered in-platform chats, a measure intended to keep conversations on Roblox rather than pushing teens to unmonitored third-party platforms where inappropriate interactions could occur.
Trusted contacts are meant for users who know each other well. For a teen to add someone 18 or older as a trusted contact, the connection must be verified through a QR code scan or a contact importer.
Previously, Roblox required a government-issued ID to confirm users as 13+ or 18+ for certain chat features. Now, as an alternative, users can submit a video selfie, which an AI system evaluates against a large, diverse dataset to estimate whether the user is 13 or older. Google tested similar technology earlier this year, and Meta did so the year before.
Beyond these updates, Roblox is introducing online status controls, a do-not-disturb mode, and expanded controls for parents linked to their teen's account.
Roblox has faced ongoing scrutiny over its child safety measures. In 2018, reports surfaced of a seven-year-old's character being sexually assaulted in-game and a six-year-old being invited to a "sex room." In 2021, a People Make Games report criticized Roblox's business model for allegedly exploiting child labor. In 2022, a San Francisco lawsuit accused Roblox of enabling the financial and sexual exploitation of a 10-year-old. In 2023, further lawsuits accused Roblox of supporting an illegal gambling ecosystem and of maintaining inadequate child safety protocols that led to financial losses and exposure to adult content. A Bloomberg report last year also exposed the presence of child predators on the platform. Roblox reported over 13,000 child exploitation incidents to the National Center for Missing and Exploited Children in 2023, leading to 24 arrests.
"Safety is at the core of everything we do at Roblox," said Matt Kaufman, Roblox’s chief safety officer, in a statement accompanying the new feature announcement. "We aim to set the global standard for safety and civility in online gaming, fostering engaging and empowering experiences for players of all ages while continually refining how users connect."
