Shanti Das 

TikTok users in UK to be left with ‘more toxic’ version of app, say campaigners

Activists call on tech giant to extend a change, introduced to comply with rules for EEA countries, that makes its personalised algorithm optional
  
  

From the end of the month TikTok users in 30 EEA countries will be able to opt out of receiving personalised content recommendations. Photograph: Jonathan Raa/NurPhoto/Shutterstock

TikTok users in the UK face being left with a “less safe” and “more toxic” version of the app than others in Europe after the tech giant was forced to make its personalised algorithm optional in the European Economic Area to comply with EU rules.

From the end of August, TikTok users in 30 countries in the European Economic Area will be able to opt out of receiving personalised content recommendations, which are largely generated based on past activity. Instead, their “For You” and live feeds will include popular videos from the places they live and around the world.

This weekend, TikTok said it had no plans to make the same option available to users in the UK. A spokesperson said the company was making the changes for EEA users to comply with requirements under the EU’s new Digital Services Act, which does not apply in the UK, but that it was keeping the situation under review.

The decision has been criticised by campaigners, who called on TikTok to make the feature more widely available. Imran Ahmed, chief executive of the Center for Countering Digital Hate, said research showed TikTok’s algorithm surfaced harmful content to users, including videos related to self-harm and eating disorders, and was designed to keep people hooked.

He called on the firm to extend the option to switch off the feed to users around the world. “Given they have already built the functionality, it would be frankly ridiculous if they did not,” he said.

Currently, TikTok’s personalised algorithm relies heavily on tracking and user profiling to recommend content to people. Its sensitivity has helped turn TikTok into one of the world’s most popular social platforms, used by 53% of those aged three to 17 in the UK, according to Ofcom – more than Instagram, Snapchat, and Facebook.

It has also been blamed for fuelling the promotion of harmful content. Last year, TikTok was criticised over Andrew Tate’s rise to notoriety after hundreds of videos of him were promoted on the For You feed. Many were later found to have broken TikTok’s rules on hate speech, but were viewed millions of times before they were removed.

Under the EU Digital Services Act, which came into force in November, “very large” platforms with more than 45m users have to provide an option to switch off personalised recommendations by the end of this month. The EU says the rule is being introduced due to the “systemic impact” that these firms have on “facilitating public debate, economic transactions and the dissemination of information, opinions and ideas”.

The UK has been planning for years to impose stricter regulations on tech companies, including introducing a threat of fines for failing to remove banned content. Under the online safety bill – expected to become law at some point this year – companies will be required to “empower adult internet users with tools so that they can tailor the type of content they see and can avoid potentially harmful content if they do not want to see it on their feeds”.

Andy Burrows, an expert in digital child safety and an adviser to the Molly Rose Foundation, which campaigns on suicide prevention, said the regulatory situation in Britain was “still lagging”, but called on TikTok to make its non-personalised feed available here, regardless of the law.

The feature’s introduction in the EU was a “welcome step” that would allow young people to “opt in to a safer version of the platform” which reduced the chance of them being exposed to “harmful content and toxicity”, he said. “It’s disappointing if TikTok is only going as far as regulation requires it to do, and as a result is giving people in the UK less agency to shield themselves from harmful content than those in the EU,” he said.

Other companies, including Instagram and Facebook, will also be affected by the EU legislation, and are expected to announce their own changes. Facebook said its platforms already gave users the chance to switch off personalised feeds. Twitter lets users toggle between a personalised feed and a feed of the people they follow.

TikTok said other tools were available to help people shape their feed, including keyword filtering, the “not interested” tool, and “refresh”, which allows people to view content as if they had just signed up to TikTok.

A government spokesperson said: “Our groundbreaking online safety bill will go further than any previous legislation in protecting users on a wide range of online sites including social media, video sharing platforms, and search engines. Under the new legislation, there will be a zero-tolerance stance on companies who allow targeted algorithms to lead their users to illegal content. If companies do not take proactive steps to prevent users from being exposed to illegal content, they will face huge fines.”

• This article was amended on 14 August 2023. An earlier version implied that the UK was not part of Europe.

 
