Robert Booth UK technology editor 

Meta’s content moderation changes ‘hugely concerning’, says Molly Rose Foundation

Charity set up after 14-year-old’s death concerned as Zuckerberg realigns company with Trump administration

Andy Burrows, the foundation’s chief executive, said: ‘Mark Zuckerberg’s increasingly cavalier choices are taking us back to what social media looked like at the time that Molly died.’ Photograph: Dado Ruvić/Reuters

Mark Zuckerberg’s move to change Meta’s content moderation policies risks pushing social media platforms back to the days before the teenager Molly Russell took her own life after viewing thousands of Instagram posts about suicide and self-harm, campaigners have claimed.

The Molly Rose Foundation, set up after the 14-year-old’s death in November 2017, is now calling on the UK regulator, Ofcom, to “urgently strengthen” its approach to the platforms. Earlier this month, Meta announced changes to the way it vets content on platforms used by billions of people as Zuckerberg realigned the company with the Trump administration.

In the US, factcheckers are being replaced by a system of “community notes” whereby users will determine whether content is true. Policies on “hateful conduct” have been rewritten, with injunctions against calling non-binary people “it” removed and allegations of mental illness or abnormality based on gender or sexual orientation now allowed.

Meta insists content about suicide, self-injury and eating disorders will still be considered “high-severity violations” and it “will continue to use [its] automated systems to scan for that high-severity content”.

But the Molly Rose Foundation is concerned about the impact of content that references extreme depression and normalises suicide and self-harm behaviours, which, when served up in large volumes, can have a devastating effect on children.

It is calling on the communications watchdog to fast-track measures to “prevent teens from being exposed to a tsunami of harmful content” on Meta’s platforms, which also include Facebook.

Meta’s own data shows that less than 1% of the suicide and self-injury content it took action on between July and September last year came from user reports.

Andy Burrows, the Molly Rose Foundation’s chief executive, said: “Meta’s bonfire of safety measures is hugely concerning and Mark Zuckerberg’s increasingly cavalier choices are taking us back to what social media looked like at the time that Molly died. Ofcom must send a clear signal it is willing to act in the interests of children and urgently strengthen its requirements on tech platforms. If Ofcom fails to keep pace with the irresponsible actions of tech companies the prime minister must intervene.”

In May, Ofcom issued a draft safety code of practice ordering tech firms to “act to stop their algorithms recommending harmful content to children and put in place robust age-checks to keep them safer”. The final codes are due to be published in April and to come into force in July, after parliamentary approval.

A Meta spokesperson said: “There is no change to how we define and treat content that encourages suicide, self-injury, and eating disorders. We don’t allow it and we’ll continue to use our automated systems to proactively identify and remove it. We continue to have community standards, around 40,000 people working on safety and security to help enforce them, and Teen Accounts in the UK, which automatically limit who can contact teens and the types of content they see.”

An Ofcom spokesperson said the Online Safety Act means tech firms must take significant steps to protect children from risks, including the swift removal of illegal suicide and self-harm material.

“We are in contact with social media companies, including Meta, about the safety measures they have in place now, and what more they will have to do to comply once the duties are fully in force,” they said. “No one should be in any doubt about Ofcom’s resolve to hold tech firms to account, using the full force of our enforcement powers where necessary.”
