Emily Bell 

Facebook and Twitter are growing into the mainstream

The row over a fake Nancy Pelosi video shows platforms are no longer disruptors, but incumbents, gatekeepers and publishers
  
  

Facebook chose to leave up a doctored video of Nancy Pelosi in which she appeared to slur her words. Photograph: Matt Slocum/AP

In the first three months of 2019, Facebook removed about 2bn fake accounts from its servers. This is roughly equivalent to all the legitimate accounts the company has on its platform. The figures are in one sense incomparable: legitimate accounts on social media are (at least supposedly) real people.

Fake accounts proliferate at such volume and speed, and are removed just as quickly, because they are both auto-generated and auto-deleted. In other words, robots arguing with each other, and the Facebook robots prevailing.

Contrast the efficiency of this type of content decision with another, where robots are not so skilled: nuanced political and cultural decisions, such as: should we remove a video which is “not real”? Or, an even more wide-open question: should neo-Nazis be allowed to advertise their views? Both are the latest detonations in an ongoing battle over how public debate and discourse should be regulated.

Twitter announced last week that it was talking to a range of academic experts about what to do about the extreme rightwing presence on its platform. On one level this is an admirable approach: asking experts about the mechanics of online radicalisation, harassment and extremism as an input into policy. On another, it is an absurd overthinking of a simple business question – do you want your platform to be plagued by neo-Nazis? Twitter is no doubt hoping the academic consultation will yield an answer which allows it to act. But it is not the only platform continually grappling with the same problems, none of which will solve themselves.

For the past week, media headlines in the US have been dominated by a short doctored video of the Democratic House speaker, Nancy Pelosi, slurring her words, which circulated largely on social platforms. YouTube took it down, but Facebook left it up.

Just as the embers of the debate around Facebook’s responsibility, and even capacity, to deal with it were dying, Pelosi reignited them by saying Facebook’s refusal to remove the video showed that it was, in fact, a conscious participant in the Russian efforts to derail the 2016 presidential election. “We have said all along: ‘Poor Facebook, they were unwittingly exploited by the Russians’ … I think wittingly, because right now they are putting up something they know is false.”

This is a very big claim and one which ought to worry Facebook, coming, as it does, from the heart of the Democratic leadership. Hillary Clinton followed this up in a commencement speech to Hunter College graduates, describing the video as “sexist trash”.

She laid into Facebook: “The big social media platforms know their systems are being manipulated by foreign and domestic actors to sow division, promote extremism and spread misinformation – but they won’t get serious about cleaning up their platforms unless consumers demand it.”

Facebook had fielded Monika Bickert to answer questions from Anderson Cooper on CNN about the video. Bickert, who heads its product policy and counter-terrorism work, is Facebook’s most thoughtful and knowledgeable executive. Her talking points were very familiar: Facebook’s approach is to flag when something is false and to reduce its reach through algorithmic throttling, but not to remove material altogether unless immediate harm flows from it. Hence terrorist content and scientifically inaccurate claims about vaccinations are gone, but Holocaust denial and a clearly misleading clip that affects the democratic process are not. “We are not in the news business,” Bickert told Cooper. “We are in the social media business.”

Engineers and lawyers in Silicon Valley are often mocked for their lack of facility in communications, but over the years they have been remarkably effective in constructing a whole language which casts a gauzy blanket of euphemism over a heap of ugly truths. They have eradicated all meaning and distinction between dramatically different social, cultural and political interactions by labelling them “content”; they have made the diseased metaphor of “virality” a virtue; and they have rebranded salespeople as “influencers”. Even the phrase “social media” is a benign label which hides a multitude of sins.

To be clear, the business Facebook is in is advertising. And the “social media business” is, increasingly, not only the news business; it is also the politics business, the public health business, the terrorism business, the education business – the everything-in-bytes business.

In fact, the approach Bickert described – flagging material as false, reducing its reach and telling people who have seen it that it is wrong – is not a bad strategy, and it exceeds the measures that many media outlets who would readily criticise Facebook take with their own material. But it is a necessary measure, not a sufficient one. On other matters, Facebook is on shakier ground. The stated policy of its founder, Mark Zuckerberg, is that people should decide for themselves what to believe, up to and including Holocaust denial – a position arguably incompatible with the company’s removal of material that creates the potential of immediate harm.

These are wide philosophical questions that cannot be used as the basis of rules for robots to execute. There is not yet a computational way to decide correctly that one slowed-down video is an attack on democracy and another is a piece of satire. What Facebook proposes, on one front, is the assembly of an independent content oversight body to advise on such matters. I am pretty sure any diverse council would split on Pelosi: editors, whose job is essentially to publish sparingly and stop incorrect information reaching the public domain, would say take it down; those who think in a longer-term or more philosophical dimension about free speech would say keep it up.

Facebook, Twitter, YouTube and whoever else outsources decision-making on these issues will eventually have to internalise them too, and have a recognisable culture, set of rules, appeals process and communications system capable of responding quickly to information crises. The disruption that minor calls about malicious material create is too much for any corporate system to withstand unless it has been designed to deal with them. The inevitable outcome will be a narrowing of the type of material promoted and circulated.

The platforms are no longer innovators in terms of speech, publication and the public sphere; they are incumbents, gatekeepers, publishers – however you want to describe them. In other words, they are done with disruption, and will therefore inevitably become perpetuators of incumbency themselves. As the experiment of levelling access to tools and speech comes to a shuddering halt, the platforms will increasingly favour the establishment, whether that is among a narrow band of mainstream media suppliers, a smaller number of high-follower influencers, a tiered system of political actors or a particular class of advertisers.

The Pelosi video and the rolling debate are not a sign of how far away we are from mainstream “normality” but how close to its return.

 
