Julia Carrie Wong in San Francisco 

Mark Zuckerberg apologises for Facebook’s ‘mistakes’ over Cambridge Analytica

Following days of silence, CEO announces Facebook will change how it shares data with third-party apps and admits ‘we made mistakes’
  
  

Mark Zuckerberg: ‘Facebook made mistakes.’ Photograph: Justin Sullivan/Getty Images

Facebook is changing the way it shares data with third-party applications, Mark Zuckerberg announced on Wednesday in his first public statement since the Observer reported that the personal data of about 50 million Americans had been harvested and improperly shared with a political consultancy.

The Facebook CEO broke his five-day silence on the scandal that has enveloped his company this week in a Facebook post acknowledging that the policies that allowed the misuse of data were “a breach of trust between Facebook and the people who share their data with us and expect us to protect it”.

“We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you,” Zuckerberg wrote. He noted that the company has already changed some of the rules that enabled the breach, but added: “We also made mistakes, there’s more to do, and we need to step up and do it.”

Facebook’s chief operating officer, Sheryl Sandberg, shared Zuckerberg’s post and added her own comment: “We know that this was a major violation of people’s trust, and I deeply regret that we didn’t do enough to deal with it.”

Zuckerberg also spoke to a handful of media outlets on Wednesday, including a televised interview with CNN in which he apologized for the “breach of trust”, saying: “I’m really sorry that this happened.” In similar conversations with the New York Times, Wired and the tech website Recode, Zuckerberg expressed qualified openness to testifying before Congress and said that he was not entirely opposed to Facebook being subject to more regulations.

The crisis stems from Facebook policies that allowed third-party app developers to extract personal data about users and their friends from 2007 to 2014. Facebook greatly reduced the amount of data that was available to third parties in 2014, but not before a Cambridge University researcher named Aleksandr Kogan had used an app to extract the information of more than 50 million people, and then transferred it to Cambridge Analytica for commercial and political use.

On Saturday, Facebook’s deputy general counsel, Paul Grewal, appeared to defend the lax policies that allowed data harvesting from unwitting friends, writing in a statement: “Aleksandr Kogan requested and gained access to information from users who chose to sign up to his app, and everyone involved gave their consent.”

But after five days of outrage from the public, and calls for investigations and regulation from lawmakers in the US and UK, the company appears to be acknowledging that blaming users for not understanding its byzantine terms of service will not suffice.

In December 2016, while researching the US presidential election, Carole Cadwalladr came across data analytics company Cambridge Analytica, whose secretive manner and chequered track record belied its bland, academic-sounding name.

Her initial investigations uncovered the role of US billionaire Robert Mercer in the US election campaign: his strategic “war” on mainstream media and his political campaign funding, some apparently linked to Brexit.

She found the first indications that Cambridge Analytica might have used data processing methods that breached the Data Protection Act. Her resulting article prompted Britain’s Electoral Commission and the Information Commissioner’s Office to launch investigations whose remits include Cambridge Analytica’s use of data and its possible links to the EU referendum. These investigations are continuing, as is a wider ICO inquiry into the use of data in politics.

While chasing the details and ramifications of this complex manipulation of both data and funding law, Cadwalladr came under increasing attack, both online and professionally, from key players.

The Leave.EU campaign tweeted a doctored video that showed her being violently assaulted, and the Russian embassy wrote to the Observer to complain that her reporting was a “textbook example of bad journalism”.

But the growing profile of her reports also gave whistleblowers confidence that they could trust her to not only understand their stories, but retell them clearly for a wide audience.

Her network of sources and contacts grew to include not only former employees who regretted their work but academics, lawyers and others concerned about the impact on democracy of tactics employed by Cambridge Analytica and associates.

Cambridge Analytica is now the subject of special counsel Robert Mueller’s investigation into the company’s role in Donald Trump’s presidential election campaign. Investigations in the UK remain live.

The company will investigate apps that had access to “large amounts of information” before the 2014 changes, Zuckerberg said, and audit any apps that show “suspicious activity”. A Facebook representative declined to share how Facebook was defining “large amounts of information” or how many apps would be scrutinized. Zuckerberg said in his interviews that the number of apps was in the “thousands”. The company will also inform those whose data was “misused”, including people who were directly affected by the Kogan data operation.

An online petition calling for just such disclosure for people included in Kogan’s data set has garnered more than 15,000 signatures since the weekend.

Facebook also promised to further restrict the amount of data third-party developers can access when users log in to their sites with their Facebook profile, to turn off data sharing for apps that have not been used for three months, and to move the tool that allows users to restrict the data they share from the settings menu to the news feed.

David Carroll, a US design professor who is challenging Cambridge Analytica through the UK courts to access his data profile harvested from Facebook, called the reforms “inadequate”. “Users should be notified, and not have to know to go and find out,” he told the Guardian by email.

Zuckerberg’s statement notably did not offer any explanation for why Facebook did not make any effort to inform affected users when Guardian reporters first told the company of the data misuse in December 2015. He did address the question in his press interviews, acknowledging to CNN that it was “a mistake” to rely on Kogan and Cambridge Analytica’s certifications that they had destroyed the data.

“I don’t know about you, but I’m used to when people legally certify that they’re going to do something, that they do it,” he said. “We need to make sure that we don’t make that mistake ever again.”

“With Mark Zuckerberg’s response, they are trying to convey that they are taking this seriously, but they are reacting to furore rather than facts,” said Jeff Hauser of the Center for Economic and Policy Research. “The facts are not new to them.”

Jonathan Albright, a research director at the Tow Center for Digital Journalism, said that while he welcomed Zuckerberg’s explanation of how Cambridge Analytica gained access to the data in question, he was disappointed that the CEO did not address why Facebook enabled so much third-party access to its users’ personal information for so many years.

“This problem is part of Facebook and cannot be split off as an unfortunate instance of misuse,” Albright said. “It was standard practice and encouraged. Facebook was literally racing towards building tools that opened their users’ data to marketing partners and new business verticals. So this is something that’s inherent to the culture and design of the company.”

Olivia Solon and Edward Helmore contributed reporting.

 
