More than a dozen states and the District of Columbia filed lawsuits against TikTok on Tuesday, alleging the popular short-form video app is damaging children’s mental health with a product designed to be used compulsively and excessively.
The lawsuits stem from a national investigation into TikTok, which was launched in March 2022 by a bipartisan coalition of attorneys general from several states, including California, Kentucky and New Jersey. All of the complaints were filed in state courts and claim that TikTok’s algorithm is especially dangerous given the platform’s widespread use among young people and its ability to deliver quick hits of dopamine. Design choices such as infinite scrolling, push notifications and in-app purchases prey on youth and create addictive habits among users, prosecutors allege. There are over 170m monthly active TikTok users in the US, and over a billion worldwide.
At the heart of each lawsuit is the TikTok algorithm, which powers what users see on the platform by populating the app’s main “For You” feed with content tailored to people’s interests.
In its filings, the District of Columbia called the algorithm “dopamine-inducing”, and said it was created to be intentionally addictive so the company could trap many young users into excessive use and keep them on its app for hours on end. TikTok does this despite knowing that these behaviors will lead to “profound psychological and physiological harms”, such as anxiety, depression, body dysmorphia and other long-lasting problems, the complaint said.
“TikTok’s design choices exploit the neurotransmitter dopamine, which helps humans feel pleasure as part of the brain’s reward system to encourage reinforcement,” California’s filing in the state’s superior court read. “Dopamine ‘rewards’ can lead to addictive behavior, particularly when rewards are unpredictable.”
Citing internal documents and presentations, the states claim TikTok considers users under 13 a “critical demographic” and knowingly targets them and collects their data without parental consent.
Michael Hughes, a TikTok spokesperson, said in a statement: “We strongly disagree with these claims, many of which we believe to be inaccurate and misleading.” Hughes cited the app’s “robust safeguards” such as proactive removal of underage users, default screentime limits and default privacy settings for users under 16.
“We’ve endeavored to work with the attorneys general for over two years, and it is incredibly disappointing they have taken this step rather than work with us on constructive solutions to industrywide challenges,” Hughes said.
TikTok has faced intense pressure from US lawmakers and prosecutors over the past year. Congress passed a law in April that will impose a nationwide ban on TikTok in January unless its parent company, the China-based ByteDance, sells the app to a new owner. The US Department of Justice and the Federal Trade Commission also sued TikTok and ByteDance in August for allegedly breaking child privacy laws.
“It is profiting off the fact that it’s addicting young people to its platform,” said Brian Schwalb, District of Columbia attorney general.
Keeping people on the platform is “how they generate massive ad revenue”, Schwalb said. “But unfortunately, that’s also how they generate adverse mental health impacts on the users.”
TikTok does not allow children under 13 to sign up for its main service and restricts some content for everyone under 18. But the District of Columbia and several other states said in their filing that children can easily bypass those restrictions, allowing them to access the service adults use despite the company’s claims that its platform is safe for children.
The lawsuit from California also alleges TikTok’s beauty filters can instill “self-hatred of their appearance” in young users. The complaint claims the platform knows the filters can harm users but continues to design them to increase engagement. The filters perpetuate beauty stereotypes, the complaint reads, including by favoring certain Caucasian or European features.
“Indeed, plastic surgeons have reported an increase in patients seeking procedures to look better onscreen and have remarked that TikTok’s advanced ‘effects’ blur the line between fantasy and reality,” the complaint reads.
The District of Columbia’s lawsuit takes aim at other parts of the company’s business.
It alleges TikTok is operating an “unlicensed virtual economy” by allowing people to purchase TikTok Coins – a virtual currency within the platform – and gift them to streamers on TikTok Live, who can then cash them out for real money. TikTok takes a 50% commission on these transactions but hasn’t registered as a money transmitter with the US treasury department or authorities in the district, according to the complaint.
Officials say teens are frequently exploited for sexually explicit content through TikTok’s livestreaming feature, which has allowed the app to operate essentially as a “virtual strip club” without any age restrictions. The company’s cut of these transactions, they say, allows it to profit from that exploitation.
“TikTok profits from the sexual exploitation of children – operating like a virtual strip club,” Schwalb said on X.
TikTok’s CEO, Shou Zi Chew, has repeatedly pushed back against claims that the platform is harmful for young people and in January pledged to spend $2bn to protect US users.