Caroline Kimeu 

High-profile lawsuit against Meta can be heard in Kenya, Nairobi court rules

Decision on case of ex-Facebook moderator, who claims the work left him with PTSD, hailed as win for accountability of big tech in Africa
  
  

Daniel Motaung, the former content moderator who is suing Meta in Kenya. Photograph: Foxglove

A Kenyan court has ruled that a case brought against Facebook by a former content moderator can go ahead.

Daniel Motaung, who was hired as a Facebook content moderator by the tech firm’s subcontractor Sama in 2019, filed a suit against the two companies last year, alleging that he was exposed to graphic and traumatic content at work without being adequately warned beforehand or given proper psychosocial support, which he says left him with post-traumatic stress disorder (PTSD).

He also claimed he was unfairly dismissed after trying to unionise his co-workers to fight for better conditions.

Facebook’s parent company, Meta, contested its involvement in the case, arguing that Sama was Motaung’s employer and that Meta could not be subjected to a hearing in Kenyan courts because it was neither registered nor operating in the country.

However, on Monday the judge found that the tech giant was a “proper party” to the case.

Kenya’s employment and labour relations court has yet to release its full ruling on Motaung’s case, but the decision – the first of its kind in Africa – is already being hailed as a win for the accountability of big tech on the continent and across the global south.

“If the attempt by [Meta] to avoid Kenyan justice had succeeded, it would have undermined the fundamental tenets of access to justice and equality under the law in favour of foreign privilege,” said Irũngũ Houghton, executive director of Amnesty International Kenya.

“We finally have an avenue for accountability,” said Odanga Madung, senior researcher for platform integrity at Mozilla. “It calls for tech giants to make serious changes within their companies that take into consideration their workers and users outside the US and Europe.”

Cori Crider, director of Foxglove, a UK tech justice non-profit, which supported the Motaung case, said social media platforms should not outsource critical online safety functions like content moderation. “It is the core function of the business. Without the work of these moderators, social media is unusable. When they are not able to do their jobs safely or well, social media’s safety brutally falters.”

Meta is facing a second court case in Kenya, which was due to be heard this week but has been postponed. It was filed by two Ethiopian petitioners and a Kenyan rights advocacy group, Katiba Institute, who claim that the company failed to take online safety measures to manage hate speech on the platform during northern Ethiopia’s civil war – which they say fanned the conflict, with serious offline consequences.

The father of one of the petitioners was killed after a violent Facebook post that was reported but not acted on in time. The petitioners also claim that Facebook failed to recruit enough moderation staff for its regional hub in Nairobi.

“There are problems with Facebook’s woeful failure to value or to staff content moderation outside of the English-speaking United States,” said Crider, adding that Monday’s ruling could have regional and global implications for how tech firms think about and manage content moderation.

Leah Kimathi, a convenor for the Council for Responsible Social Media, agreed. “Big tech should not just look at Kenyans as a market, but should be accountable and alive to the nuances, needs and peculiarities of Kenya, especially when it comes to content moderation.”

Facebook has more than 13 million users in Kenya, where it and Meta-owned WhatsApp are the most commonly used social media platforms.

A nationwide poll conducted in 2022 by the Council for Responsible Social Media found that 68% of Kenyans with internet access get their news from social media, and that most of them feel the platforms could do more to remove harmful content.

 
