And that’s it. The session is over, a little under five hours after it started.
Facebook became a "beast that spreads hatred against Muslims"
Stevens moves on to Myanmar, where Facebook has been implicated in violence against the Rohingya minority.
“I went to the refugee camps last November. The UN has accused Facebook of playing a role in the violence, saying social media has been exploited to spread hate speech. Just last month the UN said your platform had morphed into a ‘beast’ that spreads hatred against Rohingya Muslims.”
Schroepfer says that in regions like this, “where there is hate and dehumanisation in the region, the answer is to get more people on the ground who know the region, we need more policy people on the ground. We need to, and are, trying to get a lot more to get hate speech and all of this vile content off the platform.
“We’ve been working in Myanmar a long time, I don’t know when exactly we decided to amp up the product teams on the ground.
“There are some challenges on the technical level. There’s a language there that we don’t have a lot of tools in, that’s not Burmese. In this region, the goal is to have people who can respond, not just quickly, but adequately. On a technical level, the work we’ve done in English, we’re trying to work out how to translate that into Burmese.”
Jo Stevens returns to Alex Kogan, Joseph Chancellor and GSR. When did Facebook know of the links?
“It is possible someone in Facebook knew about this and didn’t tell the rest of the organisation. 2017 is when legal, in the investigation into this, knew about it, but a recruiter may have known sooner.”
Stevens again asks how Facebook can take such a “direct and critical” line with Aleksandr Kogan but continue to ignore Joseph Chancellor.
“We are investigating his involvement in this now,” Schroepfer says. “I believe he works on virtual reality and some other things like that.
“I learned about much of this very recently, so that’s the full situation.”
Collins asks what the political ads transparency entails.
“Among the things you’ll see is spend, who they were trying to advertise to, and some basic information about how much it was viewed,” says Schroepfer. “Basically all the information you would want to know about who was advertising in an election and how.”
Collins turns to the FTC’s consent decree with Facebook. Have the required audits been undertaken, he asks.
“They were.”
Why did they not catch the issues with Aleksandr Kogan’s apps?
“I do not know… we were looking at these issues all the time. Not just with the FTC, the Irish DPC had gone through all these issues as well.”
Collins: How much data does Facebook share with governments or get from governments about citizens?
“I don’t know if we do. We’ll do research collaborations with aggregate data.”
Collins:“If I downloaded my data, would any government data be included?”
Schroepfer seems baffled by the question, uncertain what government data Collins might be referring to, but notes that Facebook’s transparency reports share information about government requests for data.
Collins notes that Facebook’s enforcement of what developers can do seems lax. “You don’t have a system of randomised checks or anything?”
Schroepfer: “I think the starting point is people understanding, is this a random game they’ve never heard of or a big brand like Spotify.
“I think we need to do everything we can to inform the consumer, to police bad behaviour, ideally to review the apps before they go on to ensure there’s no bad behaviour on the platforms.
“Part of their terms is that they’re required to honour these requests we send them to delete the data.”
Collins asks about Facebook’s tool to let users download their information, in compliance with GDPR. It doesn’t let users download information that the company has gathered about them off the site, he notes.
“You can get that, I know that it’s in the ads tools, it might be part of the DYI tools, I think. We’re trying to show it as we use it.” In other words, it won’t contain a list of sites you’ve visited, but might contain a list of things Facebook has worked out from your browsing.
Collins asks if you can get that information if you don’t have a Facebook account, and Schroepfer says no, because the company doesn’t know who you are. Collins asks if you could ask for information gathered from a specific device, which Facebook does know, and Schroepfer says you still can’t, because “I have no way to verify with you that that’s actually your device.
“There are challenging issues here because we’re not storing information about you, just about your device.”
Collins says he is pretty confident that refusal will lead to some legal test cases.
Jo Stevens points out that Facebook promoted Simon Milner to Vice President in March, following his testimony to the committee. Schroepfer says that sort of promotion isn’t unusual.
A final question from Matheson: “is it possible that Facebook could attach a ‘little rider’ to every video uploaded to Facebook to say who the original uploader was? If I uploaded a video onto Facebook you could upload something saying that it was first uploaded by Chris Matheson?”
Schroepfer notes it would be possible, but may not be the most important aspect of advertising transparency. “Your general idea of providing much more information about the source is right.”
Matheson: It just strikes me that if someone’s name has to be attached to a new advert or story, it’s likely to be more legal, honest, open and truthful.
Matheson begins to ask about Vote Leave’s ad campaign on Facebook, but Schroepfer cuts him off, saying it’s likely to be easier to answer detailed questions in writing.
“My understanding is that Aggregate IQ was the advertiser of record, so they were the ones paying for the advertising.”
Matheson says he would also like to know what the adverts were, how much was spent, and whether it’s possible to know to whom they were sent, and how that was decided.
Schroepfer thinks he can provide some of that information. Matheson notes that Facebook has already made that promise, in February in Washington, and never followed up. Schroepfer apologises.
Matheson asks about the fraudulent adverts in Martin Lewis’ name that Facebook shut down. Schroepfer doesn’t know what happened to the money those advertisers paid, whether it was returned or kept.
“I understand what you’re getting at,” Schroepfer says, “but trust me, we are motivated to get these ads off the platform. Our ability to get all of it down immediately is technically challenging.”
Chris Matheson asks why Facebook didn’t write to the committee to correct the record from Simon Milner’s statement. Schroepfer says he hopes his appearance helps fix some of those issues.
Collins breaks the news to Schroepfer – and me – that Mark Zuckerberg has agreed to give evidence to the European Parliament, while this committee session has been ongoing. “We still do need the opportunity to put some of these questions to him,” said Collins.
“What has frustrated us has been a pattern of behaviour, an unwillingness to engage. When we asked Facebook if it would investigate on the same terms that it had in America, it refused to do so and then changed its mind. When we had our session in Washington we asked… expressly about data breaches. We since learned that the company knew an awful lot more than it told us, and we wouldn’t have learned any of that if it hadn’t been for investigative journalists.
“I hope you understand that Mr Lucas’ line of questioning reflects a frustration of many of us. I don’t think anyone looking at those transcripts from February would feel we were given straight answers.”
“I’m doing my best to get the answers to you,” Schroepfer says.
Lucas replies: “You are, but the buck doesn’t stop with you does it?”
“No, it stops with Mark.”
“Why won’t Mr Zuckerberg come to us and answer the questions?”
“He is trying to dedicate his time, in the office, to solving these problems. I’m trying to do my best to answer the questions. We thought, given you wanted to go into fake news, about our plans.”
“What we want is the truth. We didn’t get the truth back in February. There are millions of our constituents who are concerned about this. Don’t you think the right thing to do would be for him to come to us and explain why someone representing Facebook didn’t tell us the whole truth back in February?”
“I don’t know what he knew or didn’t know. I don’t know the specifics of it. I will do my best to tell you about these topics. That’s the best I can do.”
“Mr Schroepfer, you have a head of integrity? I remain unconvinced that your company has integrity.”
“I think Facebook concealed the truth from us in February.”
Ian Lucas hands Schroepfer some evidence Facebook gave the DCMS committee in February, at a session in Washington DC. It’s the transcript of a conversation between Chris Matheson and Simon Milner, a Facebook VP, in which Milner denies that Cambridge Analytica had any Facebook data.
“The reason we have re-opened this, as we all learned this month, is that there’s allegations Cambridge Analytica kept this data.
“At the time, this was accurate.”
Lucas disputes this, noting that in December 2015, Facebook was well aware that there had been a “data breach” that precipitated action. “Why did you not tell this committee about this when you gave us evidence in February? You had very specific knowledge as a company, in February, of what happened in this case. Why did you not tell us? Do you think that Mr Milner should have told us about the agreement between Cambridge Analytica and Facebook?
“I think Facebook concealed the truth from us in February.”
Schroepfer disagrees, arguing that Milner is an “honest man” who likely told the committee what he believed to be true at the time. “I’m guessing he didn’t know.”
A question from Ian Lucas, Labour, about who owns information on Facebook, veers into interesting territory quickly.
Schroepfer begins by trotting out rubric about how you own everything you post to Facebook, but when asked about information Facebook gathers about non-users browsing off the site, he concedes that that user has no control over that data. “Even if you wanted to delete the data, we wouldn’t know it was you, so we couldn’t.”
Lucas asks “Is that information of value to Facebook?”
“For security purposes, absolutely,” says Schroepfer. “If you made us turn it off, it would be devastating to our ability to detect fake accounts.”
O’Hara asks whether data will be transferred out of the EU following GDPR. Schroepfer says yes, it will, that’s how Facebook works, and promises to be compliant with GDPR. “We are excited to be in compliance with every EU citizen, no matter where the compute capacity is.
“And in fact many of the basics of GDPR are also being rolled out for the rest of the world. The series of screens we’ve developed for GDPR are being rolled out to other regions.”
O’Hara asks why, if that’s the case, Facebook transferred 1.5bn accounts from Ireland to the US.
“We’ve got clear feedback from people in other regions that they would prefer to work with a local regulator to work with issues in those regions. By moving to the US, the US doesn’t have a lead regulator, which opens up the possibility for local regulators to lead on their own country.”
O’Hara asks how Facebook finds out information from browsing off site.
“If you’re not a Facebook user, we don’t find out things like name and so on, just basic log information,” Schroepfer says.
“There’s a use for third parties. For instance, the like button is useful as a share button. One of the important uses is understanding fake accounts – have we seen this browser before, does it have a real pattern of behaviour?”
O’Hara asks if the app review team is large enough to catch future issues.
“We have challenges getting enough resources we need to deal with all the threats on the platform,” says Schroepfer. “My core job is to get the best AI researchers across the world. If you picked any area and asked do I want to hire more, I would say yes.”
O’Hara: how will you ensure that the audit of other apps will be complete? How will you audit apps that have since shut down?
“We may not always have perfect information on all of these things,” says Schroepfer “but the issue gets to the heart of what you’re raising: let’s not wait for somebody else to bring this to light.
“We’re going as fast as we can.”
“You’re asking us to believe that nobody in the entire organisation saw this crisis coming, between 2014 and now,” says O’Hara incredulously.
Schroepfer says Facebook has had a “change in security posture” since then, which should leave the company more able to catch future crises.
When asked if Facebook was slow to act because it was profitable not to, Schroepfer points out that “we don’t charge money for the platform.
“What I can do is spend our time… doing all the work we can to dramatically reduce the probability that this happens again in the future.”
O’Hara argues that Facebook “could not have been unaware of this looming crisis. You tinkered around the edges in 2011, you tinkered again in 2014… did nobody say there is a crisis unless we fundamentally change what we do?”
Schroepfer highlights the fact that, just recently, when Facebook tightened up its platform last month, it broke a number of apps (such as Tinder), arguing that that is why the company was wary of acting. Nonetheless, he says, if they had known then what they know now – that he would be testifying to Parliament about election interference – he agrees they would have done differently.
Now, he says, Facebook is more actively trying to work out what problems lie in the future. “Let’s make it as conservative and locked-down as we can, to avoid future issues.”
O’Hara points out that Parakilas warned Facebook management in 2012 that the company’s platform should be locked down.
SNP MP Brendan O’Hara picks up on Schroepfer’s admission that Facebook didn’t check the terms and conditions developers were offering to users. “The terms and conditions are an important part,” Schroepfer says, “but there’s much broader questions to ask about what data that app is asking for, to make sure there isn’t an over reach.
“The shift that happened [in 2014] was one that moved from user choice to more proactive review.”
“Knowing what we know now, I wish we had done more there at the time,” Schroepfer adds.
Collins asks Schroepfer to elaborate on comments made in his written statement about the investigation into Aggregate IQ and Cambridge Analytica. He declines to elaborate, but here is the relevant section from the written statement:
We are grateful to the UK’s Information Commissioner’s Office for their diligent work in investigating the Cambridge Analytica issue. We fully support their work and are providing help and assistance in order that they can complete their work as quickly as possible. This includes technical support where helpful, as well as providing them with information that will be of assistance.
They have asked us a number of questions about Cambridge Analytica, as well as AggregateIQ (“AIQ”), and their connection to the 2016 Referendum, and I know this Committee has also investigated these matters. My team and I met with the Information Commissioner’s Office this week to update them on our internal investigations, and I wanted to share with you a short summary of the information we have provided them and also the Electoral Commission.
• Cambridge Analytica - We did not find any referendum related ads or pages on Facebook directly managed by Cambridge Analytica or SCL Group.
• Aggregate IQ - Our records show that AIQ spent approximately $2M USD on ads from pages that appear to be associated with the 2016 Referendum. We have provided details on the specific campaigns and related spending to the ICO and Electoral Commission. In the course of our ongoing review, we also found certain billing and administration connections between SCL/Cambridge Analytica and AIQ. We have shared that information with ICO for the purposes of their investigation.
(A party political note: all the Conservative MPs save the chair, Collins, have left the hearing during the break. Four Labour MPs and one SNP MP remain in attendance.)
Collins notes that in 2014, when Facebook tightened up its privacy settings, it must have had some mis-uses that prompted that decision. Schroepfer says he doesn’t know about any specific examples.
Collins asks about the investigation that Facebook is running into other apps which may have been harvesting information at the same time as CA was. Schroepfer says he has no updates, but commits to informing “all the relevant parties” if anything is uncovered.
Collins follows up by asking, again, about former Facebook employee Sandy Parakilas, who says he expressed significant concern about Facebook’s data giveaways at the time.
“There was a lot of focus on giving people clear controls at the time,” Schroepfer says. “I think a lot of the idealism was that with good developers and informed consumers, it would work, but as we have seen, that wasn’t enough.”
Collins asks if there are any preliminary results from the investigation, and Schroepfer declines to answer. “It takes time to understand what’s happened here, with other developers,” the exec says.
Collins follows up asking if Facebook has sent letters asking developers to destroy data, similar to those it sent to CA, to any other developers. Schroepfer says he doesn’t know about that fact.
“A core problem with consumer data, in general, is that once it gets in someone’s hands, you can’t easily delete it. This is why we have moved, and the future here is in proactive enforcement. Figuring out how to provide consumers with control over their data, how to make the platform much more restrictive about the data it provides.”
GSR co-founder Joseph Chancellor still employed by Facebook
Collins returns to the questions about Joseph Chancellor, Aleksandr Kogan’s erstwhile colleague. Chancellor was asked about his work at GSR during his interview with Facebook, Collins says, suggesting that maybe Facebook did know about that.
Collins also asked how Facebook can claim to be outraged by Kogan’s actions, but still employ Chancellor.
“It is possible he discussed his employment at GSR,” Schroepfer concedes. But he notes that Chancellor wasn’t involved in the 2015 Guardian reporting.
We’re back for round two, with Collins promising to “see what we can do” about finishing in time for Schroepfer’s 3:30pm meeting with Matt Hancock.
Collins also says that Alex Kogan’s lawyers have got in touch to say that Kogan wasn’t aware he’d been freed from his NDA.
Schroepfer says Facebook’s lawyers will reconfirm that, and provides more detail about Facebook’s communications with Kogan and Cambridge Analytica.
Schroepfer also returns to the questioning from Collins back at the start, about political affiliation: you can’t target adverts on political affiliation, Schroepfer clarifies, and you have to specifically opt in to share that information.
That’s all for part one – the committee is taking a short break for lunch, and will be back at 1:20pm.
If you want a summary of this morning’s events, my colleague Jim Waterson has what you need.
Watling asks about the initial intent behind building the data access that GSR used to harvest data for Cambridge Analytica.
“The intent was to allow developers to build experiences” such as multiplayer games, communal music players, and the like, “and allow people to take their data to these experiences,” Schroepfer says. “We said, people are smart… our job is to make sure that they understand what is happening.”
Watling brings up former Facebook manager Sandy Parakilas’s claim that Facebook is an “addictive product”. Schroepfer says people like Facebook, to which Watling notes that people like cigarettes.
“If you develop an addictive product, you can sell it to advertisers, because people will continue coming back,” the MP says. “People can’t put it down, that’s part of the system. You place a post, and 15 minutes later you need to see if they’ve reacted.”
“Our goal is to build products that are good,” Schroepfer says. He brings up Facebook’s desire to encourage active use of the site, rather than passive scrolling.
Giles Watling, Conservative, describes the social media industry as being dragged blinking into the light, and asks Schroepfer if they’re now playing catch-up to the real issues.
“Would you agree that you’re behind the curve, and playing catchup? And that you’re not willing to go there, you’re having to be pulled there?”
Schroepfer says “we were slow, because we didn’t understand the threat at the time. I can’t fix that back then, but I can devote my time and energy to two things: one is, with the specific acts, we build every defence we can… and then two, we do a deeper more rigorous forward looking… proactive defence.”
Watling asks how Facebook divides what people can and can’t put on the platform.
Schroepfer notes that on Tuesday, Facebook published its internal guidelines (a year after the Guardian published leaked versions of the same documents). “One of the things we’ve said is a challenge here is understanding exactly where the lines are on these things… the right approach is to be a lot more engaging on where that line should be.
“People who are using the products should have a much bigger say… I’ve had people tell me we’re too prudish, too restrictive, and others tell me it’s offensive what we allow. It’s a product for free expression… but there have to be some basic rules.”
Chris Matheson, Labour, asks: “I get friend requests from attractive young women, with no friends in common. One explanation might be that they admire my work in Westminster and Chester. Another might be that despite the fact that I’m an overweight middle age bloke they like what they see.
“Let’s assume that they are fake accounts. If I allowed my vanity to get the better of my judgement, what advantage would accrue to the person who controls the fake account?”
Schroepfer notes the accounts are frequently just “straight up financial scams”. “This is again, finding and removing this sort of account, is one of the most important things we can do. You can report the account as a likely fake account.”
Matheson asks how many fake accounts there are. “I know that we report this on a regular basis, I think it’s on the order of a few percent.”
Matheson asks about a specific “purge” of fake accounts in April 2017, and whether that purge was related to the US election. Schroepfer can’t answer detailed questions on that issue, and warns that the company may not have detailed logs for that period. “We purge accounts on a regular basis, but I don’t know the specific details of this.”
Collins asks how Facebook would be able to identify adverts sent for a page which was taken down after the election.
“We may retain some information about the ads,” Schroepfer says.
Collins notes that the technique could be used to hide electoral spend, which is “a major threat to democracy”.
Farrelly quotes journalist Matt Taibbi’s famous “vampire squid” line about Goldman Sachs, and asks if it concerns Schroepfer that the line could be applied to Facebook just as well.
“These are all really hard problems,” Schroepfer says, “but I can tell you the really hard problems aren’t about money. When you say things like ‘don’t have divisive advertising’, finding exactly what that means is challenging to get right at scale, and we want to make sure we’re very clearly not censoring.
“I don’t want to paper over that these are real issues. I understand the skepticism, I don’t fault you for it. If you’re asking me a question of intent, I can only tell you what’s in my heart, which is that we do really care about these things.”
Farrelly asks about “embeds” – Facebook staffers who sit with a campaign providing regular advice on how to use the platform. Schroepfer again claims to not know about the term, but agrees that the practice is common.
“We don’t want divisive ads on the platform,” he says. “We do want people to run effective ads, because it is effective for them, small businesses and others.”
Farrelly repeats his request for the specific adverts Cambridge Analytica and AIQ ran on the platform, before asking whether the embeds make Facebook active political players, rather than just a neutral platform.
“We are trying to offer our services to any customer of ours… we have never turned away a political party because we don’t want them to win an election.
“I believe in strong, open political discourse. I don’t think it’s for us to decide who the right candidate is, it’s for the people. So in the sense that we will equally support all legitimate candidates in a democratic election, yes, we do.”
Paul Farrelly asks when Facebook will bring its ad registry to India. “I don’t know the timescale,” says Schroepfer.
Can Facebook make available a searchable archive of the materials Cambridge Analytica and Aggregate IQ have produced for previous elections? “It’s difficult to do this retrospectively, we don’t have all this data.”
Farrelly moves on to dark advertising – the term for Facebook ads that can only be seen by the recipient. “It’s of great interest for those of us who are not the target of these adverts to see how subtly they may have been changed, the facts might have been changed, and what they were saying to different audiences,” the MP says.
“It was reported that dark advertising was used in the US elections to try and dissuade African Americans from voting,” Farrelly says, asking to hear more about that.
Schroepfer says he doesn’t know what Farrelly means by dark advertising, so Farrelly defines the term (by googling it on his iPad). Schroepfer says the tool the company is rolling out, that lets a user look up historical logs of adverts, will help.
Hart asks about Facebook’s market share of political advertising.
“We’re about 6% of the global advertising market,” Schroepfer says, but “it is not something we track, we have no goals on the political advertising market. The only numbers we do know are the relative percentage of political ad revenue, but even at the peak during elections are a low single digit percentage.”
Simon Hart, Conservative, brings up Mark Zuckerberg’s quip that it was “crazy” to view Facebook as swinging the US election.
“I think he has said publicly that he regrets having said that at the time,” Schroepfer says.
Elliott points out that it takes “14 clicks and swipes” to opt-out from one ad category.
“If you want fine grained control, there’s no easy way to do that, to let you opt out of one category but not another,” Schroepfer says.
“It hasn’t been my experience that it’s 14 clicks, but it’s a very fair question. Our teams are doing extensive user testing, to try to build them so they’re easy to use and powerful.”
Elliott asks how many UK Facebook and Instagram users were contacted in the Brexit referendum by non-UK entities. Schroepfer says that would be a big question to answer, with a lot of data required. “I don’t have that information on me.”
Elliott asks if Facebook will contact those users who were contacted by non-UK entities. Schroepfer seems confused by the question, and repeats his promise to follow up.
Julie Elliott, Labour, asks about misinformation in the Brexit referendum, and Schroepfer says the IRA spent only $1.
“Is it possible we missed something, yes. I do think we looked pretty hard.”
Elliott turns to spending from AggregateIQ, a Canadian company that worked on the Brexit campaign.
“We looked at that company and found that they had spent $2m on the Brexit referendum in 2016,” Schroepfer says. “I believe it was [declared] but I haven’t looked at that detail.”
“Is it possible, because there are alleged links between Cambridge Analytica and AggregateIQ, we wanted to follow the data to see if the data was used in other elections.
“The campaigns they [AIQ] run were based on email lists… Mr Kogan’s app did not get email addresses from either the installer of the app or their friends. And the people they advertise to, the overlap between that and the Digital Life app was 3-4%. Basically this is a random overlap, so they are clearly not using that data.”
Knight points out that Facebook brings up “self” regulation a lot. “Do you think the time has come for robust regulation and empowerment of consumers over their own information? Do you need regulation right now to ensure there is a level playing field and consumers have real ownership of their own information?”
Schroepfer: “I agree 100% that consumers need control… part of the issue of the platform is that we made it easy for people to take their data to third party developers and then abuses ensued.
“In terms of regulation, there are multiple laws that we are under the guise of right now. GDPR comes into effect this month.”
Knight: “I put it to you that Facebook is a morality-free zone. You aren’t an innocent party maligned by the likes of Cambridge Analytica: you are the problem.”
Schroepfer says “I respectfully disagree with that assessment. You want us to say we’re responsible, which we have on multiple occasions, and you want transparency on ads and other things. The core of our job is to build a service which helps millions of people connect with each other around the world every day”.
Knight moves on to Cambridge Analytica, which gave Facebook’s lawyers access before the ICO could get access itself: “would you say that this is an attempt to pervert the course of justice?”
“What we were trying to do was get to the answers as quickly as we could,” Schroepfer says. “Once we understood that the ICO wanted to take precedence we immediately stood down.”
Facebook apologises for legal threats to the Guardian
Conservative Julian Knight returns to Facebook’s legal threats against the Guardian. “Will you apologise today for this bullying tactic?”
Schroepfer says: “We wanted to make sure the facts are straight, we do not want to… we believe deeply in the need of journalists to be free. It’s the same principle we have in our key product.”
Knight: “Why’s your first instinct to send legal letters?”
“My understanding is that this is common practice in the UK,” Schroepfer says to laughs from the committee. “I may be wrong…
“You are wrong,” says Knight. “I’m going to ask you again: will you apologise?”
“I am sorry that journalists feel we are attempting to prevent the truth coming out,” Schroepfer says. “I am sorry.”
Labour’s Ian Lucas up next.
“Following the breach in 2015, which you found out from the Guardian, can you confirm that Cambridge Analytica deleted the information?” Schroepfer says the company received a certification from CA that said as much, but won’t immediately promise to share that certification with the committee.
Farrelly points out the year’s grace that existing Facebook developers had when the company tightened up its rules.
“The reason we did this is that these changes were significant enough that if we had flipped the switch immediately, we would have broken a large portion of the internet,” Schroepfer says.
That grace period was a key competitive advantage for Kogan, whose GSR app was grandfathered into the old rules at the time he was dealing with Cambridge Analytica – even if Cambridge Analytica had wanted that access, it wouldn’t have been able to gain it itself.
Labour’s Paul Farrelly brings up Kogan’s belief that “he didn’t consider [Facebook] has got a policy”, since the company was so lax in enforcing its data policy. “How much does Facebook care?”
Schroepfer says “we care a great deal.” The company has, he says, become “a lot more proactive in enforcing these policies”.
The primary thing Facebook does, he says, is let people share data with a small set of friends and family. “If you want something public, you do it on a blog or Twitter. If you want it private, you do it on Facebook. That trust is important to us.”
Collins jumps in to clarify whether Facebook did know about Joseph Chancellor before this year’s stories about Cambridge Analytica – it did, Schroepfer says.
Jo Stevens is digging into how the auditors were picked, and Schroepfer is refusing to name the individual responsible. “I’m happy to provide this information to you… these proceedings are public, and I don’t want to name individuals in a public proceeding.
“I’m happy to provide this information, I’m not sure why it needs to be broadcast on public TV,” Schroepfer says.
Stevens asks when Facebook hired GSR’s co-founder Joseph Chancellor – it was in 2015, shortly before the Guardian’s first story about the Cambridge Analytica data.
“When the Guardian was working on the stories in 2015 and in 2018, were you aware that Joseph Chancellor was a co-founder of GSR? Have you investigated the work he was doing?”
Schroepfer says Facebook is now investigating the work Chancellor did, and was only aware of that work after the 2015 story.
Stevens asks why it took Facebook two and a half years to tell the affected users.
“In retrospect it was a mistake,” Schroepfer says. “I don’t know who made that decision… I don’t know what happened. The key thing at the time was making sure people’s data was safe.”
“When you realised about this data compromise… did you inform the ICO in the UK?” asks Stevens.
No, says Schroepfer. “The issue at hand was a developer had used our platform, collected some data, and resold the data. We believed the matter was resolved,” he says, because the data had been deleted.
“Do you intend to take any legal action against the University of Cambridge?” Schroepfer again says no legal actions are being prepared, as Facebook waits for the ICO investigation.
Stevens: “Were you aware there were other Cambridge University employees building apps similar to the ones developed by Aleksandr Kogan?” Schroepfer doesn’t think so.
Stevens asks if Facebook will sue Cambridge Analytica or Aleksandr Kogan, and Schroepfer says Facebook will wait until the ICO investigation is concluded.
Stevens brings up the NDA that Aleksandr Kogan was made to sign, and Schroepfer says Facebook contacted Kogan the day after his hearing on Tuesday to say that he should be free to talk.
Stevens then asks Facebook to provide a lot more information about that NDA, and to find it out while the hearing is ongoing. “You’re going to be here a while,” she says.
Stevens: When you found out via the Guardian in 2015 that GSR had misused data, what did you do?
Schroepfer: “We contacted Kogan, Cambridge Analytica and others to find out if there was any other information.”
Stevens asks who the “others” are and Schroepfer refuses to answer in the public session, citing an ICO investigation.
Collins intervenes, pointing out that there’s no good legal argument not to answer, and Schroepfer names them: “My understanding is there was a researcher at the University of Toronto, and Mr Wylie and his company.”
Stevens: Why did Facebook threaten to sue the Guardian, if the Guardian had revealed this issue?
Schroepfer: “It’s my understanding we sent a letter asking them to correct some facts… it’s my understanding it was a standard legal letter… We do in many cases send a letter to ask for facts to be corrected.
“Bringing these issues to light is critical. The implication that we’re trying to make these things be hidden, is a fair question, it’s not something we’re trying to do.”
Stevens: We’d quite like to protect freedom of speech here… I’m sure you understand why we think it’s bullying to threaten a newspaper with litigation when they’re pointing out problems with your own governance.
Schroepfer: “They and others have raised critical issues… we think transparency on these issues is the most important tool. I agree with you that we want an independent and vocal journalism.”
Stevens continues, asking when Facebook found out about GSR handing data over to Cambridge Analytica.
“We found out from the Guardian report.”
Do you rely on the press to find out about these issues?
“We do have an enforcement team, we have a way users can report issues… I wish we had found this ourselves, but it is not the only – we do find things through other means.”
How many developers did you take enforcement action against?
“I don’t have that data in front of me.”
Labour’s Jo Stevens asks about Cambridge Analytica contractor GSR, the company set up by Aleksandr Kogan.
“Did Facebook read GSR’s apps terms and conditions?”
Schroepfer: No.
“Does Facebook read any terms and conditions?”
Schroepfer: We made a number of changes in 2015… one of which was a proactive review process. That involves Facebook making sure that apps are asking for the minimum set of data needed to provide functionality for the user.
“So prior to that you didn’t read any terms and conditions for apps you were putting on Facebook?”
Schroepfer: Not never, but rarely.
Collins hands over to Conservative MP Simon Hart, who asks how much Facebook is spending on its political advertising efforts.
Schroepfer doesn’t know the dollar value, but says, “I can’t remember a meeting with our advertising team that wasn’t about… these issues, for quite some while.”
Collins brings up Martin Lewis’ lawsuit against Facebook, over his image being used in adverts promoting bitcoin scams. “How can someone complain about an ad that they can’t see?” he asks.
Schroepfer explains that bad actors “keep running ads, and they get blocked and blocked, until they get round the blocks… Mr Lewis reported about 50 ads, and using those we found thousands of adverts, and took them down.”
“The second issue you raised, about transparency, is an important one. In June, you’ll basically be able to see every running ad on the platform. We don’t want people to have to be the ones looking out for bad adverts, we want the vast majority of this stuff to be caught by automated systems.
“Ninety percent of the nudity on the platform is flagged by automated systems… that’s the future we want to get to.”
Collins asks why Facebook doesn’t use the facial recognition technology it already has to block all adverts with Lewis’ face. “We are investigating ways to do that,” says Schroepfer. “It is challenging to do at scale.”
“This is why, for instance, we looked at the hype around cryptocurrencies… we banned the whole category, because we thought the likelihood of harm to consumers was high.”
Collins pushes on facial recognition: “You’ve got facial recognition on users and can automatically tag them in pictures.” Schroepfer argues that adverts are different from photos, apparently because of how each is affected by false positives.
Collins: “A lot of the tools seem to work for the advertiser, not for the consumer.”
Collins discusses the “Damascene conversion” that Mark Zuckerberg and others have undergone over fake news in the last few months, but argues that their words haven’t been met with actions.
“The combination of the news feed and targeted advertising… is the pipe through which the fake news comes, and there doesn’t seem to be that much you can do to control it,” he says.
“I think the evolution that’s happened over time is we’ve realised the platform can be abused by bad actors,” Schroepfer replies. “My primary job back at home is to build our technological systems, build AI and ML technologies… that have been immensely successful at reducing the bad actors across Facebook.”
Collins moves on to the News Feed. “Do you think there’s a case that political advertising shouldn’t be in the News Feed at all?”
“This is not an issue of revenue for us,” Schroepfer replies. “What we know is that advertising is a critical way to reach audiences. Those who don’t have an established name, who are running for new offices, the ability to reach people via pages and via advertising is a powerful tool for free speech.
“Transparency is important… and saying we would just cut people off from access would do harm.”
Collins suggests that Facebook could put paid adverts in a different feed, like the “Explore feed” experiment the company launched late last year. (When Facebook launched that experiment, it also moved journalism into the secondary feed, prompting reporters in the six affected countries to declare the test “downright orwellian”.)
Schroepfer notes that adverts are easier to ignore, and less disruptive, if they’re integrated into the News Feed and similar features.
“The biggest weapon we’ve found in this defence is to find the actor, find the account, and take it down. The root cause is identifying that the actor is someone like the IRA, and take it down,” Schroepfer says, “get them off the platform as quickly as possible.”
In case this is getting confusing, the IRA in this context is the Internet Research Agency, the Russian “troll army” that intervened in the US election in an attempt to sow discord and misinformation.
Collins: “The reason the IRA case is interesting to us is that it’s the most exposed, and we want to know how to recognise similar activity in the UK.”
Schroepfer: “We did look for connections between the Russian IRA and the Brexit election, we found $1 of spend… almost nothing.”
Collins: “We don’t know if it happened, but if it did happen, it must have happened in another way, that’s what we know.”
Schroepfer: “I can’t prove a negative.”
“Transparency is good, and helpful, but all this would mean is that the Russians in St Petersburg would just have to find a different office,” Collins points out.
“I 100% agree with you, that political ads are different,” Schroepfer replies. “What happened in 2016, with the Russian IRA, was awful… the key weapon in that is to find out the actors. The problem here is that they were masquerading as citizens of the USA on the site.
“What we are doing now is proactively looking for actors on the site… that’s an issue that is of great importance.”
Collins points out that the first Russians who were found were even paying in roubles for political ads. “Why wasn’t that found, that was against US political law?”
Schroepfer says “look, I am way more disappointed in this than you are”, eliciting a mumbled retort from another MP on the committee and an apology from the executive. “I’m sorry, I shouldn’t have said that.”
Schroepfer now bringing up his promises for change in time for the 2019 election. You can read more about them here, from when Facebook announced them earlier this month. A snippet:
Facebook is stepping up its efforts to fight fake news and political misinformation, with new controls intended to ensure authenticity and transparency among advertisers and publishers on the site.
CEO Mark Zuckerberg wrote in a post shortly after the moves were announced: “These steps by themselves won’t stop all people trying to game the system. But they will make it a lot harder for anyone to do what the Russians did during the 2016 election and use fake accounts and pages to run ads.”
The measures build on a plan, announced last October, to require American political advertisers to undergo an authentication process and reveal their affiliation alongside their adverts.
Schroepfer: “Advertisers have limited budgets, and want to spend those budgets well. When I as a user take all these [interests] out, the only way they’ll reach me is with a broad campaign, more like a TV campaign, which is quite expensive per person.
“If there’s a broad campaign running, and something more local, like a coffee shop near to me, you’re likely to get the local one”, because it’s more relevant to you than the broad one.
The pair have now been at loggerheads over this for fifteen minutes: Collins wants to know why a user who explicitly says they don’t want to be targeted over a certain political issue can still receive adverts on that issue, and Schroepfer has nothing to offer beyond noting that users can opt out of certain types of ad targeting, but not of ad content.
Collins asks if Facebook is complying with privacy laws. Schroepfer, unsurprisingly, thinks it is.
Collins: “If a political advertiser wishes to upload a custom audience to Facebook” – a feature that lets advertisers upload mailing lists and the like to advertise to those users on Facebook – “does Facebook have any way of checking those users gave consent?”
Schroepfer says that the advertiser has to affirm that they have consent. “We don’t actually see the data … we don’t get the emails in raw form. We can’t reverse it, to see the email in raw form, and we don’t store the data. We couldn’t validate it if we wanted to, and so it is a requirement that the person who acquired that mailing list have consent.”
Collins again asks whether a person who had opted out of political advertising might receive those adverts through the custom audience feature, and Schroepfer again notes that you can’t exactly opt out of political advertising.
“I want to be clear,” the Facebook executive says, “there isn’t an opt out of political advertising.” Instead, you can only opt out of the information that is used to target ads – and if an advertiser targets broadly enough, then it may hit you in general.
“There’s no specific category by category opt outs,” Schroepfer says. “When I mute an ad from an advertiser, I don’t see an ad from them again.”
Collins points out “there are thousands of pages, with thousands of adverts… that’s a very weak tool for the user.”
The Guardian’s Media Editor Jim Waterson notes that Twitter, the company, is loving this.
“Political advertising can be based partly on people’s metadata trail on Facebook itself, is that correct,” asks Collins. “My metadata from Facebook can be used to put me in a category where I might receive political advertising, correct?”
Schroepfer replies that things like Facebook pages may be used, but not Facebook postings. “If I liked a page about immigration, for instance, that may be used to target you with adverts… we don’t scan posts for advertising data.”
Collins brings up offsite tracking, and Schroepfer agrees that when users visit pages with a Facebook like button, Facebook can see that – and points out that Collins’ own website has such a button.
(Schroepfer doesn’t address the Facebook Pixel, another form of off-site tracking which is far less visible to web surfers.)
Collins asks how many websites have Facebook tracking information, but Schroepfer doesn’t know the answer.
Collins picks up on the claim that users can control what ads they receive: “If my ad preferences stated that I didn’t want to receive political advertising, can you guarantee I wouldn’t?”
Schroepfer clarifies that no, you can’t, you can simply opt out of the reasons why that advertising might be targeted to you.
“I was referring to the ads preferences… but if you see any advertisement, you can declare that you don’t want to see anything from that advertiser again, which is a feature you absolutely don’t have on a newspaper or a television.”
Collins disputes that, saying that adverts come from so many entities that it would be impractical to block them all.
Schroepfer replies: “We’re actually going to mark all political ads prior to the 2019 local elections, and explain who paid for them, and provide a lot more transparency. It’s an important issue.”
Collins begins with an odd pair of questions: how much Schroepfer spent on his car, and how large his home is.
Schroepfer doesn’t know, but Collins notes that Facebook does know the answers to those questions, and others.
“Facebook gathers that information, those are categories of information that Facebook gathers about users,” Collins says. “Why does Facebook gather that data about its users?”
Schroepfer answers that he’s checked his advertising preferences recently – which anyone can do – and didn’t see anything about his car or home there.
“The basis of our service, the thing we provide, is a way to provide personalised experiences. When I log in to Facebook, the thing I get is a personalised news feed. The feedback we get from people, over and over again, is that when we give them things that aren’t relevant to them, that’s the worst form of experience.”
While we wait for the hearing to begin, the committee has released Schroepfer’s written evidence.
“I want to start by echoing our CEO, Mark Zuckerberg,” he writes. “What happened with Cambridge Analytica represents a breach of trust, and we are deeply sorry. We made mistakes and we are taking steps to make sure it doesn’t happen again.”
You’ll be able to read the whole statement here when it’s published.
Schroepfer won’t have an easy ride from MPs, who are smarting at the fact that they’ve been fobbed off with one of Mark Zuckerberg’s deputies rather than the Facebook chief executive himself.
“It is absolutely astonishing that Mark Zuckerberg is not prepared to submit himself to questioning in front of a parliamentary… hearing,” Damian Collins said last month.
We already know some of what Schroepfer hopes to offer the committee, thanks to a leak to the BBC: he will say that Facebook will commit to enforcing transparency for political advertising in time for the local elections. Not this year’s, though – the ones held in May 2019.
That offering may not have the desired effect, if for no other reason than it’s not actually a new deal: Facebook had already committed to rolling out its regulations worldwide earlier this month, alongside another set of tight rules for people who can run large Facebook pages. The only new aspect to Schroepfer’s expected offer today is the (fairly unambitious) timescale that’s attached.
Facebook's chief technology officer gives evidence to DCMS committee
Mike Schroepfer, Facebook’s CTO, is giving evidence this morning to the digital, culture, media and sport select committee, as part of its inquiry into fake news.
The terms of the inquiry, led by committee chair Damian Collins, have grown wildly since it was constituted in January 2017, and you can expect the questions to be less focused on fake news specifically, and more on Facebook’s wider effect on British political culture – including, in particular, the Cambridge Analytica scandal and Facebook’s response.
Questions? Comments? You can get in touch with me on Twitter at @alexhern.