Roblox has tens of millions of daily users under the age of 16, and because of this, the platform has faced growing scrutiny over its child safety measures, particularly following the introduction of new age verification tools in 2025, which have been deemed ineffective.
That assessment comes from Ron Kerbs, CEO of Kidas, a cybersecurity company focused on protecting young gamers. In an interview with Insider Gaming, Kerbs stated:
“Roblox’s new age verification effort is a step in the right direction, but it’s not the full answer. While technology like facial analysis and ID scans can help restrict access to mature content, these tools are still easily bypassed and don’t address the most pressing safety issues kids face online every day.”
Concerned parents and advocates have been vocal in recent months about child endangerment on multiplayer social games, whose voice and text chat features make them a sprawling space for predators. This has led games like Roblox and GTA Online to implement age verification as global demand grows.
Roblox’s age verification system rolled out in July 2025 as part of its “Trusted Connections” feature, which uses facial age estimation technology, ID verification, and verified parental consent to determine a user’s age more accurately than previous attempts. These measures aim to create a wall between adults and minors unless they are verified as real-world connections, allowing teens aged 13 to 17 to access chat features while maintaining moderation for community standards.
Despite these measures, Kerbs believes they fall short, pointing out that “no verification method, AI or otherwise, can guarantee that the person behind a screen is who they claim to be.”
Much of this is borne out by Kidas’ work, which has protected over 400,000 gamers and shows that predators can easily bypass verification systems using fake IDs or AI-generated images. The problem has been prevalent on platforms like Discord, where UK age checks were worked around using characters from Garry's Mod and Death Stranding.
Kerbs wrapped up the interview, stating:
“Roblox and platforms must invest in real-time behavioral monitoring. Protecting kids isn’t about proving you’re 13, it’s about detecting when a conversation is going dangerously off track and stepping in fast.”