Robert Booth UK technology editor 

More than 140 Kenya Facebook moderators diagnosed with severe PTSD

Exclusive: Diagnoses part of lawsuit being brought against parent company Meta and outsourcer Samasource Kenya
  
  

Facebook content moderators demonstrating at Samasource Kenya’s offices in Nairobi last year. Photograph: Daniel Irungu/EPA

More than 140 Facebook content moderators have been diagnosed with severe post-traumatic stress disorder caused by exposure to graphic social media content including murders, suicides, child sexual abuse and terrorism.

The moderators worked eight- to 10-hour days at a facility in Kenya for a company contracted by the social media firm and were found to have PTSD, generalised anxiety disorder (GAD) and major depressive disorder (MDD) by Dr Ian Kanyanya, the head of mental health services at Kenyatta National hospital in Nairobi.

The mass diagnoses have been made as part of a lawsuit being brought against Facebook’s parent company, Meta, and Samasource Kenya, an outsourcing company that carried out content moderation for Meta using workers from across Africa.

The images and videos, which included necrophilia, bestiality and self-harm, caused some moderators to faint, vomit, scream and run away from their desks, the filings allege.

The case is shedding light on the human cost of the boom in social media use in recent years, which has required ever more moderation, often carried out in some of the poorest parts of the world, to protect users from the worst material that some people post.

At least 40 of the moderators in the case were misusing alcohol, drugs including cannabis, cocaine and amphetamines, and medication such as sleeping pills. Some reported marriage breakdown, the collapse of desire for sexual intimacy and a loss of connection with their families. Some whose job was to remove videos uploaded by terrorist and rebel groups feared they were being watched and targeted, and that if they returned home they would be hunted and killed.

Facebook and other large social media and artificial intelligence companies rely on armies of content moderators to remove posts that breach their community standards and to train AI systems to do the same.

The moderators from Kenya and other African countries were tasked from 2019 to 2023 with checking posts emanating from Africa and in their own languages, but were paid an eighth of what their counterparts in the US earned, according to the claim documents.

Medical reports filed with the employment and labour relations court in Nairobi and seen by the Guardian paint a horrific picture of working life inside the Meta-contracted facility, where workers were fed a constant stream of images to check in a cold warehouse-like space, under bright lights and with their working activity monitored to the minute.

Almost 190 moderators are bringing a multi-pronged claim that includes allegations of intentional infliction of mental harm, unfair employment practices, human trafficking and modern slavery, and unlawful redundancy. All 144 examined by Kanyanya were found to have PTSD, GAD and MDD, with severe or extremely severe PTSD symptoms in 81% of cases, mostly at least a year after they had left the job.

Meta and Samasource declined to comment on the claims because of the litigation.

Martha Dark, the founder and co-executive director of Foxglove, a UK-based non-profit organisation that has backed the court case, said: “The evidence is indisputable: moderating Facebook is dangerous work that inflicts lifelong PTSD on almost everyone who moderates it.

“In Kenya, it traumatised 100% of hundreds of former moderators tested for PTSD … In any other industry, if we discovered 100% of safety workers were being diagnosed with an illness caused by their work, the people responsible would be forced to resign and face the legal consequences for mass violations of people’s rights. That is why Foxglove is supporting these brave workers to seek justice from the courts.”

According to the filings in the Nairobi case, Kanyanya concluded that the primary cause of the mental health conditions among the 144 people was their work as Facebook content moderators as they “encountered extremely graphic content on a daily basis, which included videos of gruesome murders, self-harm, suicides, attempted suicides, sexual violence, explicit sexual content, child physical and sexual abuse, horrific violent actions just to name a few”.

Four of the moderators suffered trypophobia, an aversion to or fear of repetitive patterns of small holes or bumps that can cause intense anxiety. For some, the condition developed from seeing holes on decomposing bodies while working on Facebook content.

Moderation and the related task of tagging content are often hidden parts of the tech boom. Similar, but less traumatising, arrangements are made for outsourced workers to tag masses of images of mundane things such as street furniture, living rooms and road scenes so AI systems designed in California know what they are looking at.

Meta said it took the support of content reviewers seriously. Its contracts with the third-party firms that moderate content on Facebook and Instagram detailed expectations about counselling, training, round-the-clock onsite support and access to private healthcare. Meta said pay was above industry standards in the markets where the firms operated, and that it used techniques such as blurring, muting sounds and rendering in monochrome to limit reviewers’ exposure to graphic material on the two platforms.

 
