John Naughton 

How can Facebook change when it exists to exploit personal data?

The tech giant’s astonishing growth is entirely based on drawing on what it knows of its users, whatever its CEO might sorrowfully tell us

Illustration: Dom McKenzie

Watching Alexander Nix and his Cambridge Analytica henchmen bragging on Channel 4 News about their impressive repertoire of dirty tricks, the character who came irresistibly to mind was G Gordon Liddy. Readers with long memories will recall him as the guy who ran the “White House Plumbers” during the presidency of Richard Nixon. Liddy directed the Watergate burglary in June 1972, the detection of which started the long chain of events that eventually led to Nixon’s resignation two years later. For his pains, Liddy spent more than four years in jail, but went on to build a second career as a talk-show host and D-list celebrity. Reflecting on this, one wonders what job opportunities – other than those of pantomime villain and Savile Row mannequin – will now be available to Mr Nix.

The investigations into the company by Carole Cadwalladr, in the Observer, reveal that in every respect save one important one, CA looks like a standard-issue psychological warfare outfit of the kind retained by political parties – and sometimes national security services – since time immemorial. It did, however, have one unique selling proposition, namely its ability to offer “psychographic” services: voter-targeting strategies allegedly derived by analysing the personal data of more than 50 million US users of Facebook.

The story of how those data made the journey from Facebook’s servers to Cambridge Analytica’s is now widely known. But it is also widely misunderstood. (Many people were puzzled, for example, by Facebook’s vehement insistence that the exfiltration of a huge trove of users’ data was not a “breach”.) The shorthand version of what happened – that “a slug of Facebook data on 50 million Americans was sucked down by a UK academic named Aleksandr Kogan, and wrongly sold to Cambridge Analytica” – misses an important point, which is that in acquiring the data in the first place Kogan was acting with Facebook’s full knowledge and approval.

In 2013, he wrote an app called “Thisisyourdigitallife” which offered users an online personality test, describing itself as “a research app used by psychologists”. Approximately 270,000 people downloaded it and in doing so gave their consent for Kogan to access information such as the city they set on their profile, or content they had liked, as well as more limited information about friends who had their privacy settings set to allow it. This drew more than 50 million unsuspecting Facebook users into Kogan’s net.

Name: Alexander James Ashburner Nix

Age: 42

Education: Eton, then Manchester University, where he studied history of art

Career: Nix worked as a financial analyst in Mexico and the UK before joining SCL, a strategic communications firm, in 2003. From 2007 he took over the company’s elections division, and claims to have worked on more than 40 campaigns globally. Many of SCL’s projects are secret, so that may be a low estimate. He set up Cambridge Analytica to work in America, with investment from US hedge fund billionaire Robert Mercer. He has been both hailed as a visionary – featuring on Wired’s list of “25 Geniuses who are creating the future of business” – and derided as a snake oil salesman.

Controversies: Cambridge Analytica has come under scrutiny for its role in elections on both sides of the Atlantic, working on Brexit and Donald Trump’s election team. It is a key subject in two inquiries in the UK – by the Electoral Commission, into the firm’s possible role in the EU referendum, and the Information Commissioner’s Office, into data analytics for political purposes – and one in the US, as part of special counsel Robert Mueller’s probe into Trump-Russia collusion. The Observer revealed this week that the company had harvested millions of Facebook profiles of US voters, in one of the tech giant’s biggest ever data breaches, and used them to build a powerful software program to predict and influence choices at the ballot box. Emma Graham-Harrison

The key point is that all of this was allowed by the terms and conditions under which he was operating. Thousands of other Facebook apps were also operating under similar T&Cs – and had been since 2007, when the company turned its social networking service into an application platform.

So Kogan was only a bit player in the data-hoovering game: apps such as the insanely popular Candy Crush, for example, were also able to collect players’ public profiles, friends lists and email addresses. And Facebook seemed blissfully indifferent to this open door because it was central to its commercial strategy: the more apps there were on its platform the more powerful the network effects would be and the more personal data there would be to monetise.

That’s why the bigger story behind the current controversy is the fact that what Cambridge Analytica claimed to have accomplished would not have been possible without Facebook. Which means that, in the end, Facebook poses the problem that democracies will have to solve.

In that context, the firm’s response to the crisis has been instructive. The first stage was denial: there was no “data breach”; then there were legal threats and the heavy artillery of expensive law firms; after that we were treated to ludicrous attempts to portray a giant global corporation as an innocent victim of wicked people who did terrible things behind its back.

And then, finally – after an extended bout of corporate panic at its Californian HQ as the share price tanked – there was a return to euphemistic form as the boy wonder CEO came out from under the duvet wearing his signature grey T-shirt. He was in his sorrowful-but-resolute mode: “This was a breach of trust between Kogan, Cambridge Analytica and Facebook,” he wrote. “But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it. We need to fix that. In this case, we already took the most important steps a few years ago in 2014 to prevent bad actors from accessing people’s information in this way. But there’s more we need to do … ”

If that sounds familiar, then that’s because it is. TechCrunch listed 11 separate controversies that resulted from Facebook being caught taking liberties with users’ data or trust. In most of these cases, the Zuckerberg response has been the same: sorrowful contrition followed by requests for forgiveness, topped off with resolutions to do better in future. The lad is beginning to sound like an alcoholic who, after his latest bout of drink-related excess, says sorry and expresses his determination to reform.

The trouble is that, as the veteran tech investor Om Malik pointed out this week, Facebook can’t reform without changing its very nature. “Facebook’s DNA,” he writes, “is that of a social platform addicted to growth and engagement. At its very core, every policy, every decision, every strategy is based on growth (at any cost) and engagement (at any cost). More growth and more engagement means more data – which means the company can make more advertising dollars, which gives it a nosebleed valuation on the stock market, which in turn allows it to remain competitive and stay ahead of its rivals.”

No amount of corporate spin can disguise that central truth. Facebook’s core business is exploiting the personal data of its users. That is its essence. So expecting it to wean itself off that exploitation is like trying to persuade ExxonMobil that it should get out of the oil and gas business.

That’s why the central question about the Cambridge Analytica controversy is whether it’s a scandal or a crisis. Scandals create a lot of noise and heat, but nothing much changes as a result. Crises are what lead to fundamental change. Facebook’s response – and its corporate DNA – suggest that it’s just a scandal. So stay tuned for the next data breach.

• John Naughton is an Observer columnist
