Dan Milmo and Keza MacDonald 

Ofcom urged to act after US firm claims Roblox is ‘paedophile hellscape’

Campaigners say watchdog must ensure Online Safety Act is rigorous enough, after allegations about gaming platform
  
  

Roblox, which has 80 million daily users, was accused of lax safety controls exposing children to grooming, pornography, violent content and abusive speech. Photograph: Phil Noble/Reuters

Child safety campaigners have urged the UK communications watchdog to make a “step-change” in its implementation of new online laws after a video game firm was accused of making its platform an “X-rated paedophile hellscape”.

Roblox, a gaming platform with 80 million daily users, was accused last week by a US investment firm of having lax safety controls.

Hindenburg Research claimed Roblox’s games exposed children to grooming, pornography, violent content and abusive speech. The firm, which has stated that it stands to profit from a fall in Roblox’s share price through a so-called “short” position on the company’s stock, added that it had found multiple accounts with usernames based on variations of Jeffrey Epstein, the disgraced financier and child sexual abuser, and that it had been able to set up an account under the name of a notorious American paedophile.

“We found Roblox to be an X-rated paedophile hellscape, replete with users attempting to groom our avatars, groups openly trading child pornography, widely accessible sex games, violent content and extremely abusive speech – all of which is open to young children,” said Hindenburg.

Roblox rejected Hindenburg’s allegations, saying safety and civility were “foundational” to the company.

“Every day, tens of millions of users of all ages have safe and positive experiences on Roblox and abide by the company’s community standards. However, any safety incident is horrible. We take any content or behaviour on the platform that doesn’t abide by our standards extremely seriously,” said the company.

The company said it had reviewed the references to child safety in the report and found that in “many cases” the highlighted content had already been taken down, while all other content referred to in the report was either being reviewed or had been removed.

“We continuously evolve and enhance our safety approach to catch and prevent malicious or harmful activity. This includes text chat filters to block inappropriate words and phrases, and not allowing user-to-user image sharing on Roblox,” said the company.

One in five Roblox users is under nine years old, and the majority are under 16. The platform offers a catalogue of games and allows players to socialise with each other, including in chatrooms. There is no age limit, although the platform does advertise age recommendations for certain “experiences” and offers parental controls.

Roblox content is not authored by its developers, but instead by players. It provides the tools for children and teens to create their own simple gaming scenarios, and then play through them with friends. One popular Roblox “experience” has players working in a pizza parlour, and another involves a game of cops and robbers.

Child safety campaigners said the report underlined the need for Ofcom, the UK’s communications regulator, to implement the Online Safety Act as rigorously as possible and introduce strict codes of practice for tech companies.

The act requires platforms to protect children from harmful content, with these provisions underpinned by codes of practice being drawn up by Ofcom, which is charged with enforcing the legislation. The codes are voluntary, but companies that adhere to them will be deemed by Ofcom to be in compliance with the act.

The Molly Rose Foundation, established by the parents of Molly Russell, the British teenager who took her own life after viewing harmful online content, said the watchdog would be judged on how quickly it addressed risks posed by platforms such as Roblox.

Andy Burrows, the foundation’s chief executive, said: “This report underscores the growing evidence that child safety shortcomings aren’t a glitch but rather a systemic failure in how online platforms are designed and run.

“The Online Safety Act remains the most effective route to keep children safe, but such preventable safety lapses will only be addressed if Ofcom delivers a step-change in its ambition and determination to act.”

Beeban Kidron, a child internet safety campaigner, said implementation of the act needed to “significantly up the game” on ensuring that tech platforms have in-built safety measures.

“Roblox is a consumer-facing product and in order to trade it has to be safe for children, and it has to have by-design mechanisms that mean it does not enable predators to convene or search for children,” she said.

Lady Kidron added: “We need political will and leadership to strengthen the provisions of the OSA and a regulator willing to implement them.”

An Ofcom spokesperson said the act would have a significant impact on online safety in the UK and the regulator would have a broad range of enforcement powers to protect users.

“Platforms – such as Roblox – will be required to protect children from pornography and violence, take action to prevent grooming, remove child abuse images, and introduce robust age-checks. We have set out clear recommended measures for how they can comply with these requirements in our draft codes.”

A Roblox spokesperson added that the company “fully” intended to comply with the OSA.

“Our internal teams have been assessing the obligations and have been engaging in the various consultations and calls for evidence Ofcom have published. We look forward to seeing Ofcom’s final codes of practice,” said the spokesperson.
