Summary
Thank you for following the blog. I’m going to leave you with a summary of the salient points from Sandy Parakilas’s evidence on Facebook to the Digital, Culture, Media and Sport committee:
- Parakilas said Facebook executives who are still at the company were made aware of his concerns about data vulnerabilities. Asked to identify the executives by name, Parakilas offered to do so after the meeting.
- He said that Facebook failed to do enough to identify data breaches and to act when informed about them. “I do not remember a single physical audit of a developer’s storage,” he told MPs.
- Facebook gave the impression that it feared it would be liable if it investigated a suspected breach and found that policies or laws had been broken, said Parakilas. Instead it chose not to investigate, he suggested, because if it did not know of breaches it could claim it was just a platform and so not liable.
- Parakilas said he would provide the committee with questions to ask Facebook founder Mark Zuckerberg if he appears before its members.
Labour MP Julie Elliott asks Parakilas if he thinks Facebook understands the enormity of the problem.
He says they do not; otherwise they would have acted more quickly.
Does this pose a challenge to democracy?
It presents a “huge challenge to democracy”, says Parakilas.
The final question is from the committee chair, Damian Collins, who asks if Parakilas would provide some recommendations on social media governance.
The former Facebook operations manager says he would love to but will do so in writing. And that concludes the hearing.
The committee chair Damian Collins asks if you can target advertising at individual users, rather than just at, say, people within a certain age group in a certain area.
Parakilas says you can. He confirms you can target people by their interests. Asked if you can target someone by what they’ve said, Parakilas says Facebook does hold that info but to his knowledge you can’t target people by that information.
If I was interested in immigration as a political issue, would Facebook be able to identify people interested in immigration, Collins asks?
Parakilas says you could if immigration was a keyword.
If someone is a racist, could they create a dataset around racists and advertise to those people? Collins asks. Could Britain First (the far right group recently banned from the social network) theoretically use Facebook to reach out to people who share similar opinions to them?
That’s possible, replies Parakilas.
Labour MP Jo Stevens asks Parakilas what the committee should ask Mark Zuckerberg if he appears before them.
Parakilas says he would love to provide some potential questions for Zuckerberg but would want to do so offline.
Stevens asks if Facebook should be able to identify the victims of the breach.
Parakilas says they should.
“What did you think when you first heard Facebook auditors were knocking on the door of Cambridge Analytica and demanding access?” Tory MP Simon Hart asks.
Parakilas answers:
I thought it was two-and-a-half-years late.
The amount of data that passed out of Facebook was “vast”, says Parakilas. You are likely talking about tens of thousands of apps and some of those apps had hundreds of millions of users, he says.
Hart asks whether players in the EU referendum campaign were accessing this data.
Parakilas says he cannot answer that.
Labour MP Ian Lucas asks about the difference between Facebook and apps, given that Facebook sells advertising to political campaigns and uses micro-targeting.
Parakilas is blasé about the fact that Facebook does this. He says:
That is a feature of digital advertising.
Parakilas says:
The users had no idea that this had happened ... Facebook was aware that this had happened and did not notify anyone.
The committee chair, Damian Collins, asks if there are likely to have been many other data breaches involving other companies.
Facebook rarely investigated sufficiently, Parakilas says.
He cites a company called Klout, which was allegedly using third-party permissions to create profiles for people who had not created them themselves; some were children’s profiles built using data from their parents’ Facebook profiles.
He says he contacted Klout and asked if they were violating Facebook’s rules but Klout denied it. The company stopped doing “ghost profiles” but remained as an app on Facebook, says Parakilas.
Facebook did not investigate deeply enough, says Parakilas.
He says he got the impression that Facebook feared that if it did an investigation and received information that policies or laws were being broken, it would be liable, but that if it did not know, it could claim it was just a platform and so not liable.
Under questioning from SNP MP Brendan O’Hara, Parakilas says the data which leaves Facebook servers is going to an “unvetted” group of people.
Anyone can make a Facebook app.
Explaining how this happened, he suggests “it was a risk they were willing to take”.
The goal was to grow the platform as quickly as possible and data was one of the ways to do that.
O’Hara asks when Facebook knew that data had been passed on to Cambridge Analytica.
Parakilas says he was not there at the time but he understands they knew when there was a Guardian article in December 2015 and there were multiple subsequent articles in the press.
As far as I can tell they took no action at all ... during that period.
He says options that would have been open to Facebook would be to file a lawsuit or involve law enforcement on the basis that it may have breached data protection laws.
Parakilas says the unofficial motto of Facebook was:
Move fast and break things.
Tory MP Julian Knight asks whether Facebook has approached data like it was “the Wild West frontier”.
Yes, he replies.
Knight asks: “In an ideal world ... what should Facebook have done to better safeguard data?”
Parakilas says the first thing they should have done is turn off friend permissions, which they did in 2014. He says that between 2011 and 2014, Facebook’s practices were “far outside the boundaries of what should have been allowed”.
He says the social network should be much more pro-active about suing developers and auditing them.
Parakilas explains how the app developer can get data from friends, rather than just the person using the app.
If I use a Facebook app and agree to give it access to my friends’ data, and you’re my friend, then the developer gets your data too.
He says if you spoke to the 50 million people affected by the Cambridge Analytica breach, pretty much none would be aware that they had given access to their data to Aleksandr Kogan, the Cambridge University researcher who passed on the data, let alone to Cambridge Analytica.
Labour MP Chris Matheson asks if Facebook has “very strong security provisions” to prevent hacking.
Parakilas says the technical security team is very good but Facebook built a platform that allowed personal data of users to leave Facebook servers, when it had not been explicitly authorised by users.
Matheson suggests Facebook protects its own servers very seriously but not the privacy of its users.
Parakilas responds: “I think that’s fair.”
Parakilas continues:
The real challenge here is Facebook was allowing developers to access data of people who hadn’t explicitly granted that.
He says even a physical audit could not prevent a developer from hiding data elsewhere, so it might not be effective.
Parakilas made a map of the various vulnerabilities and a list of potential “bad actors” such as foreign enemies and data brokers, he says.
Asked by Collins if executives he shared this with are still at the company, he answers in the affirmative. Asked if he can name them, Parakilas says he can speak with Collins after the meeting.
Collins asks him if Facebook founder Mark Zuckerberg would have been aware of these concerns. Parakilas says Zuckerberg would not have been aware of the specific concerns he raised, but would have been aware of the risks of handling data because of stories in the media.
Parakilas is explaining how apps ask people for information (access to their likes, photographs etc), and how, once the data passes from the Facebook server to the developer, Facebook loses control over the data and what is done with it.
As a result rules were brought in but Facebook had little scope to identify abuses or act on them, he says.
I do not remember a single physical audit of a developer’s storage.
He says Facebook had “relatively little information” about violations. Information about breaches usually came from the media or competitors, says Parakilas.
The meeting has finally begun. The committee chair, Damian Collins, says the members are “delighted” to welcome Sandy Parakilas and asks him for a brief description of his role at Facebook and its duration.
Parakilas says “policy compliance” and “data protection” were his primary concerns.
We are still waiting for the committee meeting to begin.
Here is a video interview Sandy Parakilas gave to Frontline last month, before the Cambridge Analytica scandal broke:
He said that, during his employment with the social media company:
I became more and more concerned about the broader data infrastructure of Facebook and the amount of data that Facebook had about its users and the vulnerabilities that the system had. And so I started thinking through what are the worst case scenarios of what people could do with this data?
Parakilas claimed he shared his concerns with people “among the top five executives in the company.”
The Digital, Culture, Media and Sport Committee meeting, featuring evidence from Sandy Parakilas, should begin any minute.
In the meantime, this is what the chair of the committee, Conservative MP Damian Collins, had to say about Facebook on Sunday.
Data has been taken from Facebook users without their consent, and was then processed by a third party and used to support their campaigns. Facebook knew about this, and the involvement of Cambridge Analytica with it, and deliberately avoided answering straight questions from the committee about it. They have also failed to supply the committee with evidence relating to the relationship between Facebook and Cambridge Analytica, that we were promised at our evidence hearing on 8th February in Washington DC.
Revealing that he was asking Facebook founder Mark Zuckerberg to give evidence, he added:
We need to hear from people who can speak about Facebook from a position of authority that requires them to know the truth. The reputation of this company is being damaged by stealth, because of their constant failure to respond with clarity and authority to the questions of genuine public interest that are being directed to them.
Sandy Parakilas, former platform operations manager at Facebook, is appearing before the Digital, Culture, Media and Sport Committee this afternoon, after the Guardian published an interview with him yesterday in which he said the covert harvesting of data was routine at the social network.
Parakilas, who is due to give evidence via videolink from 3pm, was responsible at Facebook for policing data breaches by third-party software developers between 2011 and 2012.
He told the Guardian that hundreds of millions of Facebook users are likely to have had their private information harvested by companies that exploited the same terms as Global Science Research, which collected data and passed it on to Cambridge Analytica.
Parakilas, 38, who now works as a product manager for Uber, said he had warned senior executives at his former employer that its lax approach to data protection risked a major breach, but was left frustrated by their lack of action.
Stay tuned for live updates of his testimony.