Richard Adams Education editor 

Three in four girls have been sent sexual images via apps, report finds

Just over half of teenagers who had been sent non-consensual sexual images via social media apps did not report it, the report finds
  
  

A third of young people polled said they did not think reporting worked. Photograph: martin-dm/Getty Images

Schools and parents should do more to support students who are being sexually harassed through platforms such as Snapchat and Instagram, while the tech companies need to clamp down on non-consensual sexual images being sent to young people, according to new research released on Monday.

The study by academics at University College London and the University of Kent found that just over half of teenagers who had been sent unsolicited sexually explicit images via social media apps had not reported the offences to their parents, the authorities or the companies involved.

The report highlights the technological features, and the lack of accountability and identity-checking, on platforms such as Instagram, and criticises app reporting functions as “useless”, meaning young people are more likely simply to block offenders than to report the abuse.


Asked why they didn’t report incidents involving sexual images, about a third of the young people surveyed by the researchers answered: “I don’t think reporting works.” Just 17% of those who received unwanted sexual content reported it to the platforms involved.

Prof Jessica Ringrose of the UCL Institute of Education, one of the report’s authors, said: “Young people in the UK are facing a crisis of online sexual violence. Despite these young people, in particular girls, saying they felt disgusted, embarrassed and confused about the sending and receiving of non-consensual images, they rarely want to talk about their online experiences for fear of victim-blaming and worry that reporting will make matters worse.

“We hope this report allows all of us to better identify when and how image-sharing becomes digital sexual harassment and abuse and spread the message that, although the non-consensual sending and sharing of sexual images may be common and feel normal, it is extremely harmful.”

The study surveyed 480 young people aged 12 to 18 from across the UK, including 144 who took part in focus groups. More than half of those who had received unwanted sexual content, or had their image shared without their consent, said they did nothing in response. Just 25% told a friend, while only 5% told their parents and 2% told their schools.

Of the 88 girls who took part in the focus groups, three-quarters said they had received images of male genitals. They said that close to half of the harassment had come from what appeared to be adult men, including adults who had created false identities. They also received online harassment and abuse from boys in their age range and peer groups.

A spokesperson for Meta, the company formerly known as Facebook and the owner of Instagram, said the safety of young people using its apps was its “top priority”. “If anyone is sent an unsolicited explicit image, we strongly encourage them to report it to us and the police,” the spokesperson said.

A spokesperson for Snapchat said: “There will always be people who try to evade our systems, but we provide easy in-app reporting tools and have teams dedicated to building more features, including new parental tools, to keep our community safe.”

 
