Can We Make Roblox Safer? Age Verification and the Future of Online Play

The Problem: Online Solicitation on Gaming Platforms

A recent BBC article highlighted a disturbing issue: the solicitation of children on gaming platforms like Roblox. James was targeted at just eight years old and pressured into taking naked photographs. Solicitation often starts on platforms like Roblox and then migrates to messaging apps like Kik. Children may first receive unsolicited gifts (such as Roblox’s virtual Robux currency), which predators use to establish coercive control.

Global Rise of Online Solicitation

James’s account shows that these predators do not let up, relentlessly pressuring a victim to send nude pictures. Research indicates that once a child has produced sexually explicit images, predators often move on to financial extortion: ‘send me money, or I will share these pictures with everyone in your contacts list.’

In the US, between October 2021 and March 2023, the FBI received over 13,000 reports of online financial extortion of minors. These cases involved at least 12,600 victims and were linked to at least 20 suicides.

In the UK, the NSPCC recorded over 34,000 online crimes against children across 150 platforms between 2017 and 2023, highlighting how widespread the problem is.

In 2023, INHOPE, a global network of hotlines combatting child sexual abuse material (CSAM), processed a staggering 1 million reports from the public, a 25% increase on the previous year. 83% of the victims depicted were between 3 and 13 years old.

This massive increase in reports paints a grim picture: the online risk posed to children through abuse and exploitation continues to grow rapidly.

It Doesn’t Have to Be This Way

Children have the right to a safe online environment. James, now 20, is part of a growing movement of young people and parents calling on Big Tech companies to design platforms that respect children’s rights. Roblox and other platforms need to do more to protect children.

This growing body of findings underscores the critical need for platforms to reliably verify user ages, especially for younger children.

Creating a Safer Roblox

Age verification is no longer optional; it is a legal obligation. A growing body of regulation, including the EU’s General Data Protection Regulation (GDPR), the EU Digital Services Act (DSA), national online safety acts, and the EU AI Act, requires platforms to know the ages of their users and to create safer spaces for children.

Companies like Roblox must step up and prioritise child protection measures. Using robust age verification measures, such as those offered by TrustElevate, could significantly enhance child safety on Roblox and similar platforms. By reliably verifying users’ ages, Roblox can:


  • Implement stricter safeguards and content filtering for younger users, minimising their exposure to potential predators and inappropriate content (a simplified sketch of this follows the list below).

  • Detect and address suspicious behaviour more effectively.

  • Facilitate parental data rights management for young children, allowing parents to ensure platforms respect their children’s rights.

  • Ensure compliance with emerging data protection and online safety regulations mandating age verification for children.


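To make the first bullet point concrete, here is a minimal sketch, in plain Python, of how a reliably verified age could drive default safety settings. It is illustrative only: the SafetySettings fields, the settings_for_verified_age function, and the age thresholds of 13 and 18 are assumptions made for this example, not Roblox’s or TrustElevate’s actual implementation.

    from dataclasses import dataclass

    @dataclass
    class SafetySettings:
        """Default protections applied once an account's age is verified."""
        chat_with_strangers: bool
        content_filter: str        # "strict", "moderate", or "standard"
        parental_controls: bool

    def settings_for_verified_age(age: int) -> SafetySettings:
        # The thresholds of 13 and 18 are illustrative assumptions, not
        # actual Roblox policy; real cut-offs vary by jurisdiction.
        if age < 13:
            # Youngest users: no stranger contact, strictest content
            # filtering, and parental controls enabled by default.
            return SafetySettings(False, "strict", True)
        if age < 18:
            return SafetySettings(False, "moderate", True)
        return SafetySettings(True, "standard", False)

    # An eight-year-old like James would land in the strictest tier.
    print(settings_for_verified_age(8))

The design point is that protections default to the strictest tier based on a verified age, rather than relying on a self-declared date of birth that a child, or a predator, can falsify.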
The ability to create age-appropriate spaces and experiences is crucial to curbing the alarming rise in online child exploitation documented by INHOPE and law enforcement agencies. Robust age checks can also deter predators seeking to groom children, preventing harrowing experiences like the one James endured.
