Roblox, one of the world’s most popular gaming platforms, is rolling out mandatory age verification for users who want access to its chat features, part of a significant expansion of its safety measures. Starting in December, accounts in Australia, New Zealand, and the Netherlands will undergo age checks, with a global rollout set for January. The move comes amid growing criticism and legal challenges in the U.S., where Roblox faces lawsuits in Texas, Kentucky, and Louisiana over child safety concerns. The platform, which averaged more than 80 million daily players in 2024, 40% of whom were under 13, has been accused of exposing young users to inappropriate content and interactions with adults.

The new system relies on facial age estimation: a user’s age is estimated from an image captured by the device’s camera, and the images are processed by an external provider and deleted immediately after verification. Users will be sorted into age groups, and chat access will be restricted to peers within similar age ranges, with an exception for trusted connections. Children under 13 will still need parental permission for private messages. Roblox’s Chief Safety Officer, Matt Kaufman, says the technology is accurate to within one to two years for users aged 5 to 25.

Child safety advocates have welcomed the changes, though groups such as ParentsTogether Action and UltraViolet are staging a virtual protest inside Roblox, demanding stronger measures to protect children from online predators. The changes align with global regulatory trends, including the UK’s Online Safety Act, which requires tech firms to prioritize child safety. Roblox’s initiative marks a significant step toward a safer digital environment for young users, and the company is urging other platforms to adopt similar measures.
