Lucy Mangan 

The Internet’s Dirtiest Secrets review – the human toll of detoxifying social media

A masterful edition of Storyville exposed the awful plight of the moderators tasked with purging tech platforms of violent and sexually abusive images
  
  

A social media moderator in Manila
Moderators are tasked with deleting 25,000 posts a day. Photograph: BBC/Gebrueder Beetz Filmproduktion

One woman wanted to quit her job as a moderator for an unnamed tech company during training, after hearing descriptions of the content and images she was likely to see. Once she had started, she came across pictures of a six-year-old girl having terrible things done to her and asked to leave. Her manager told her this was what she had signed up for and sent her back to work. Her story was preceded by footage of testimony by Nicole Wong, then a legal adviser at Google, before a committee on child abuse images and exploitation. “We’re doing the best we can,” she said.

We don’t know the name of the woman haunted by images that still make her voice shake when she speaks of them. She is one of tens of thousands of moderators employed by companies in the Philippines, themselves hired by big tech firms, to purge social media platforms of the worst that humanity offers when you give it the chance. Like the rest of her colleagues, she could speak without risk only anonymously.

The Internet’s Dirtiest Secrets: The Cleaners (BBC Four) was an episode of the BBC’s Storyville strand that masterfully wove together two indictments – one personal, one political – of the failure of firms such as Google, Twitter and Facebook to protect both the people who work for them and the socio-political structures that remain the best means we have yet found of keeping us safe.

Let us turn first to the human misery unleashed at the individual level. The moderators who work for these third parties – neither their names nor the companies they work for are revealed – are based in Manila, 7,000 miles from Silicon Valley. Here, there are two main options for employment. You can sift through garbage, scrambling over colossal heaps of the stuff and selling whatever you have scavenged at the end of each day until your body sickens, or you can be paid to sift through the online equivalent every day until your soul dies.

Some moderators have a target of deleting 25,000 posts a day from the thousands identified as questionable and sent to them for adjudication. “It damages your brain,” said one. “It’s like a virus in me, slowly penetrating my brain and changing the reactions of my body,” said another. One moderator, who specialised in judging live videos of self-harm, hanged himself after asking three times to be transferred. You can try to skip or avert your eyes, but managers review samples of everyone’s decisions to make sure they are correct. If your scorecard suffers, so will your chances of continued employment.

On a macro level, the implications of a system where broad-brush guidelines composed by the companies must be applied after only seconds of consideration, by people usually unfamiliar with the language or context of a video or an image, are almost limitless. A painting of a naked, looming Donald Trump with a tiny penis was taken down. There is no allowance in the backrooms for art, satire or nuance; a guideline outlawing naked genitals is a guideline outlawing naked genitals, and people need their jobs.

Syrian citizens’ footage of civilian bombings, which helps charities track the violence and try to hold parties to account, is routinely taken down. Other decisions make firms more overtly complicit in upholding repressive regimes. Google, for example, blocks any YouTube video the Turkish government doesn’t like from being shown in that country, rather than have the government block YouTube entirely. “I didn’t love that resolution,” said Wong. “But that’s where we got to.”

Polarisation, fragmentation, abuse and escalation are not byproducts of this new technology, as the former Google design ethicist Tristan Harris pointed out. They are the fuel of anything that depends on human engagement for survival and revenue. Outrage generates clicks. “It used to be that everyone was entitled to their own opinion,” said Antonio García Martínez, a former product manager at Facebook. “Now, everyone is entitled to their own set of facts.” Facebook’s filters will provide.

This was an extended instalment of Storyville – 85 minutes instead of the usual hour. The strand is always dense without sacrificing accessibility, but the extra time allowed a deeper dive and room to introduce ideas about the ramifications of new technology that is still relatively unfamiliar to most of us. At its heart, though, lay an old, old question: who guards the guardians? Perhaps we ought to add a new one: who, in an era when companies big enough to operate as independent states abound, has the balls to do so?

• This article was amended on 1 April 2019. An earlier version implied that Facebook owns YouTube. Rather it is owned by Google.

 
