Dan Milmo Global technology editor 

Parents ‘don’t use’ parental controls on Facebook and Instagram, says Nick Clegg

Meta’s global affairs chief points to ‘behavioural issue’ around child safety tools on the social media platforms

Nick Clegg said Meta’s apps provided a positive experience for the ‘overwhelming majority of young people’. Photograph: Kin Cheung/AP

Parents do not use parental controls on Facebook and Instagram, according to Meta’s Nick Clegg, with adults failing to embrace the 50 child safety tools the company has introduced in recent years.

Meta’s global affairs chief said there was a “behavioural issue” around using the tools, after admitting they were being ignored by parents. Regulatory pressure is building on tech companies to protect children from harmful content, with the Australian government announcing plans this week to ban younger teenagers from accessing social media.

Speaking at an event hosted by Chatham House in London, Clegg said parents were not using controls that allowed them to set time limits and schedule viewing breaks.

“One of the things we do find … is that even when we build these controls, parents don’t use them,” he said. “So we have a bit of a behavioural issue, which is: we as an engineering company might build these things, and then we say at events like this: ‘Oh, we’ve given parents choices to restrict the amount of time kids are [online]’ – parents don’t use it.”

Clegg, a former UK deputy prime minister, said evidence suggested Meta’s apps provided a positive experience for the “overwhelming majority of young people”. However, testimony in 2021 from a company whistleblower, Frances Haugen, accused the Facebook and Instagram owner of putting profit before safety, while safeguarding problems on Instagram were highlighted by the 2022 inquest into the death of the UK teenager Molly Russell, who took her own life after viewing harmful content.

In the UK, the Online Safety Act has been introduced and imposes specific requirements on social media companies to shield children from harmful content. The issue remains high on the agenda for many governments including in Australia, where the prime minister, Anthony Albanese, has announced plans to block children from social media and other digital platforms unless they are over a certain age – likely to be between 14 and 16.

Asked if Meta would enforce such a ban, Clegg said the company would “of course” abide by it, but warned it would be difficult to implement and would require the cooperation of the Google and Apple app stores.

Andy Burrows, the chief executive of the Molly Rose Foundation, a charity set up by Russell’s family, said: “Nick Clegg would do a service to children’s safety by stopping passing the buck and starting to take responsibility for the preventable harm caused by Meta’s choices.”

Clegg also addressed the controversy over Elon Musk’s X platform, which he said had been turned, under the Tesla CEO’s ownership, into a “one-man sort of hyper-partisan, ideological hobby-horse”.

Clegg said X and the messaging app Telegram had allowed far-right figures such as Tommy Robinson, whose real name is Stephen Yaxley-Lennon, and the misogynist influencer Andrew Tate to “run amok” online in the wake of the UK riots related to the Southport murders, after they had been banned from Meta platforms.

He added that X was a small platform “for elites”. “I think it’s a tiny, elite, news-obsessed, politics-obsessed app. The vast, vast, vast majority of people join Facebook and Instagram for much more playful reasons,” Clegg said.
