Rachel Hall and Sally Weale 

Banning us from social media is ‘neither practical nor effective’, UK teenagers say

UK youth parliament concludes tech firms should do more to protect users from violent and inappropriate content
  
  

The parliament’s committee of 14- to 19-year-olds examined the links between social media and youth violence. Photograph: Dean Lewins/AAP/AP

Banning teenagers from social media is neither a “practical nor effective” solution to the growing problem of youth violence, young people from across the UK have told MPs.

Although Australia has implemented a social media ban for under-16s, a better solution would be to strengthen regulation to deter social media companies from promoting violent and age-inappropriate content, the youth select committee of the UK youth parliament concluded in its report on the links between social media and youth violence.

The report stated that this was because there were benefits to being online, such as learning about the world and forming connections, and because an age ban would be too easy to circumvent.

Wania Eshaal Ahmad, the chair of the youth select committee, said: “The inquiry has made one thing clear: that social media companies should do more to protect young people from violent and harmful content.

“The committee believes that a social media ban, like in Australia, is neither practical nor effective. Instead, tech companies must be held accountable.”

The committee is made up of 14- to 19-year-olds, who examined written evidence from teenagers across the UK as well as from experts. They urged the government to involve young people, particularly those from marginalised and under-represented groups, at every stage of developing the policies that affect them.

Calling for the introduction of a youth advisory panel at Ofcom, they noted: “We have heard little evidence that Ofcom has engaged with young people on online safety.”

They recommended that the government create a consumer-facing online safety standards rating, which would evaluate platforms on their safety measures, responsiveness to harmful content, and efforts to educate users. This would serve as a scorecard giving users accessible information about how safe an online space is.

The committee members added: “We are not persuaded that the Online Safety Act is robust enough to enforce minimum age limits on social media platforms and ensure children and young people will be protected from harmful content.”

They suggested that Ofcom should report annually to parliament on whether the act was proving effective in holding tech companies accountable for providing safe online spaces.

The report cited a 2024 survey from the Youth Endowment Fund (YEF) of 10,000 young people aged between 13 and 17, which found that 70% had encountered some form of violence on social media in the past 12 months, though only 6% had been actively searching for it. The most common form of violence viewed online was fights involving young people, reported by over half of 13- to 17-year-olds.

The same survey found that one in five children had been a victim of some sort of violence in the past year, while 16% had perpetrated violence themselves.

The report also cited research from the children’s commissioner for England noting that exposure to online violence “can desensitise them to violence, normalise aggressive behaviour and, in some instances, lead to retaliatory violence”, and suggested that the government should commission research to establish whether there is a causal link, which would feed into the Online Safety Act.

The role of toxic influencers was also cited, including comments from the YEF that “influencers’ presentation of crime as a lucrative career option can seem enticing” to young people in “challenging socioeconomic conditions with concerns about their future opportunities”. The report asked the government to work with social media companies to address the harmful content spread by influencers, and to ensure it is not rewarded financially.

On Wednesday, the head of Ofsted added his voice to calls for headteachers to ban smartphones in schools in England. Martyn Oliver, the chief inspector of the schools watchdog, said heads already had the necessary powers and Ofsted would back those who made the tough decision to ban phones.

In a Q&A with parents, he said children’s developing brains should not be “bombarded by non-human algorithms that might be preying upon them”, adding: “It’s harmful and it’s damaging, so I do believe they should be banned. Ofsted will support schools in banning phones.”

Oliver, who led a large multi-academy trust before taking on the top job at Ofsted, said he had walked into schools in special measures that were in utter chaos, with mobile phone use rife. “And within those schools, within days of banning phones, and as hard as that is initially, you get an immediate sense of calmness across the school.”

The education secretary, Bridget Phillipson, has asked officials to look into how to monitor more effectively the use of smartphones in schools in England. She told a gathering of headteachers: “The government’s position is clear: you have our full backing in ridding our classrooms of the disruption of phones.”

A government spokesperson said: “We are making our streets and online spaces safer for children through delivering our plan for change. Last week, the key provisions of the Online Safety Act came into effect so that online services are required to take action to protect children from illegal content and criminal activity occurring on their platforms.

“This is just the beginning, and in summer additional protection will prevent children from encountering harmful material like pornography and violent and abusive content.”
