Emine Sinmaz 

UK experts warn of dangers of violent content being readily available online

Ofcom figures show number of people seeing material depicting or encouraging violence or injury has risen
Axel Rudakubana’s internet browsing history showed his ‘obsession with extreme violence’, officials have said. Photograph: AP

Six minutes before Axel Rudakubana left home to murder three girls at a Southport dance class, he searched for a video of the Sydney church attack in which a bishop was stabbed while livestreaming a sermon.

That video from last April, which is still available online, was among Rudakubana’s internet history, which officials said showed his “obsession with extreme violence”.

The 18-year-old had reportedly spent hours in his bedroom researching genocides and watching graphic videos of murder, and had looked at material about school massacres in the US. Documents about Nazi Germany, “clan cleansing” in Somalia, electronic detonators and car bombs were also found on Rudakubana’s devices during police searches of his home.

The details of the shocking case have led Keir Starmer to warn that Britain faces a new threat of terrorism from “extreme violence perpetrated by loners, misfits, young men in their bedrooms accessing all manner of material online”.

“To face up to this new threat, there are also bigger questions,” the prime minister said on Tuesday.

“Questions such as how we protect our children from the tidal wave of violence freely available online, because you can’t tell me that the material this individual viewed before committing these murders should be accessible on mainstream social media platforms, but with just a few clicks, people can watch video after horrific video – videos that, in some cases, are never taken down. No. That cannot be right.”

The number of people seeing content online depicting or encouraging violence or injury has increased, according to Ofcom, the communications regulator. Its latest research, covering May and June 2024, shows that 11% of users aged 18 and over had seen such material on social media and elsewhere online, up from 9% a year earlier.

Meanwhile 9% of internet users aged 13 to 17 had also seen content depicting or encouraging violence or injury. More broadly, as of June 2024, 68% of users aged 13 and over said they had encountered at least one potential harm in the past four weeks, the same proportion as reported in June 2023 and in January 2024.

Prof Sonia Livingstone, from the London School of Economics Department of Media and Communications, said violent content was easily available and that there had been an increasing amount of research on boys and young men accessing misogynist and hateful material.

“That’s not to say everyone is looking at it and in my research I talk to lots of teenagers who avoid it, or see it and deplore it, or see it and are intrigued but wouldn’t dream of taking any action,” she said.

“So however much we can see that there’s a problem with online, in this particular case, it’s never going to be the whole explanation. We also have to look at the question of who was this young man.”

Rudakubana was obsessed with violence but was deemed to have no coherent ideology. Starmer promised legal reforms on Tuesday to allow attackers to be charged under terror laws despite lacking such an ideology.

Dr Julia Ebner, a researcher specialising in radicalisation, extremism and terrorism at the University of Oxford and the Institute for Strategic Dialogue, said there had been an increase in radicalisation cases involving people with “fluid ideologies”.

“It’s a phenomenon of our time,” she said. “We all have highly tailored content and individualised feeds on Instagram or TikTok or Telegram. Because people can be part of several different groups or subscribe to different channels, they will see the content that is radicalising them, and at the same time, misogyny and potentially white supremacy or Satanism.”

Rudakubana had downloaded an academic study on al-Qaida that is banned under terror laws and which police believe he may have used to make ricin, which was found during searches of his home. He also used security software to hide his identity when he bought knives from Amazon in the days before his attack on the Taylor Swift-themed dance club in Merseyside on 29 July last year.

The home secretary, Yvette Cooper, told MPs on Tuesday that the government would consider "the wider challenge of rising youth violence" and that requests would be made to tech companies to remove online material accessed by Rudakubana. She said companies should take action before codes of practice on illegal content come into force in March under the Online Safety Act.

Rani Govender, the NSPCC’s policy manager for child safety online, said the government was right to be concerned about how easy it was to access violent content online.

“It is particularly concerning to think about how easy it is for children and young people to stumble upon this kind of content in public spaces and private messaging sites,” she said.

“We know that harmful algorithms can repeatedly serve this content and further harm the mental and emotional wellbeing of children.

“It’s vital that Ofcom are robust in challenging any part of the online world which serves harmful material to children. They must compel tech companies to identify and remove it at the earliest opportunity, and government should hold them accountable for doing so.”

Andy Burrows, the chief executive of the Molly Rose Foundation, said online safety laws needed to be reviewed to protect children and society from “a growing melting pot of extreme and violent threats”.

He referred to the case of Cameron Finnigan, a 19-year-old from West Sussex who was jailed last week for six years after police discovered “deeply concerning” online material linked to his membership of an online pseudo-satanic group.

“We are deeply concerned about the growing threat of violent motives and ideas fermenting online, including those which are fuelling a wave of sadistic grooming to coerce children into abuse and self-harm acts. Regrettably, our calls for Ofcom to respond to this urgent threat have so far fallen on deaf ears,” he said.

Margaret Mulholland, a Send and inclusion specialist for the Association of School and College Leaders, said algorithms helped to spread “violent and misogynistic material among young people” and helped to normalise harmful views.

“Schools have a role to play in spotting the early signs of inappropriate behaviour and helping pupils to stay safe online, but they also need support from the government,” she said.

“It is vital that social media companies are held accountable and safeguards are strengthened to prevent children and young people from being exposed to online content that harms them and their peers.”
