Poppy Noor 

There are plenty more like Cambridge Analytica. I know – I’ve used the data

In a world where companies have a monopoly over our eyeballs, the Facebook scandal is the tip of the iceberg, says the Guardian columnist Poppy Noor
  
  

Alexander Nix, suspended CEO of Cambridge Analytica. Photograph: Henry Nicholls/Reuters

In 2007, a Facebook application popped up that allowed users to take a quiz that would tell them exactly what kind of person they were: how emotionally stable they were compared with their friends, or how friendly they were. They were invited to tick a little box to share their information – including photographs, likes and political interests from their Facebook pages. The information would go to the researchers David Stillwell and Michal Kosinski at the University of Cambridge to help with their research. Users were told this when they agreed to take part. No big deal.

Until Aleksandr Kogan, the man now caught up in a string of investigations into the private company Cambridge Analytica, came along. Kogan built a similar personality quiz that gave the company, now accused of using big data to influence elections, access to 50m Facebook profiles. He says he had approached Stillwell and Kosinski back in 2014, wanting to collaborate. It didn’t happen.

What was so special about this type of data? As a student in the psychology department at the University of Cambridge in 2011, I was one of the people using it for academic research. I used it for things such as examining the correlation between users’ personalities and their political leanings. I went through the data using statistical modelling tools that told me, for instance, what the personality traits of the average liberal or conservative were. I found out new things, such as the fact that younger people on Facebook were more conservative than older people. Simple enough – there’s nothing new in using data in this way; people have researched how personality and politics are related for centuries.

What was new, however, was the sheer volume of data I had access to. In 2011, the database gave me access to 4.5m Facebook profiles and personality scores. When the study closed in the summer of 2012, it claimed to have information from 7.5m profiles. This not only meant that our research could be even more accurate (what psychological study have you looked at that surveyed 4.5 million people?), it also gave us unprecedented access to information on which to base new studies. There was no need to seek permission to use the data in a different way – people had already agreed to that. This was always intended to be an open data source, shared among many academics.

What could possibly go wrong?

My second piece of work involved looking through people’s Facebook profile pictures to see whether how they portrayed themselves was related to their personality. We learned interesting things, such as that having alcohol in your profile picture doesn’t mean you’re any less conscientious; in fact, we found it correlated significantly with people being more hard-working.

But the potential for harm if used for non-academic purposes was, and still is, scary. The research we did allowed us to predict someone’s personality based on what they liked on Facebook; and to predict people’s personalities better than their own friends could. We knew the personality profile of people who tended to like Smirnoff vodka (quite conscientious, by the way, and similar to people who like BMWs). It logically followed that with this data, you might also be able to accurately predict what political messaging to send out to different people; what ads or memes to circulate on their Facebook page during an election in order to secure a win for your client.
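For the curious, the underlying technique is ordinary statistics. Below is a minimal sketch in Python, built entirely on synthetic data and standard open-source libraries; nothing in it comes from the myPersonality dataset or anyone’s real code, and the page counts and effect sizes are invented for illustration:

```python
# Toy illustration (synthetic data): predicting a personality score
# from Facebook-style "likes". This is NOT the myPersonality data or
# Cambridge Analytica's method; it just shows the general idea of a
# sparse user-by-page matrix fed into a simple regression model.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

n_users, n_pages = 5000, 200                              # hypothetical sizes
likes = rng.binomial(1, 0.05, size=(n_users, n_pages))    # 1 = user liked page

# Pretend a handful of pages genuinely correlate with conscientiousness,
# the way liking Smirnoff or BMWs did in the research described above.
true_weights = np.zeros(n_pages)
true_weights[:10] = rng.normal(0.8, 0.1, 10)
conscientiousness = likes @ true_weights + rng.normal(0, 1, n_users)

# Fit on most users, then check predictions on users the model never saw.
X_train, X_test, y_train, y_test = train_test_split(
    likes, conscientiousness, test_size=0.2, random_state=0)

model = Ridge(alpha=1.0).fit(X_train, y_train)
print(f"R^2 on held-out users: {r2_score(y_test, model.predict(X_test)):.2f}")
```

With millions of profiles rather than 5,000 synthetic ones, even weak correlations like these become measurable, which is why the scale of the dataset mattered as much as the method.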

Kosinski and I had healthy debates about the threat this sort of data posed, and how it could be used to manipulate people. Kosinski pointed out that such research provides the opportunity to speak to people more effectively, based on their own beliefs and dispositions. As he put it when I interviewed him after Trump’s election: “In the past, if you were not interested in what the message was – say you went to school and you were given novels that you weren’t interested in – you would basically just disengage. The hope behind tailoring is that those who were previously excluded will now stay in. They will read, they will become involved in political processes, and so on.”

That is true. This kind of information isn’t used to make people believe things they don’t already believe. No algorithm can do that: Spotify’s algorithm (which makes educated guesses based on users’ information) can suggest new songs based on what it knows you already like, but it can’t make you like music that you hate. The same is true of elections – data won’t make Democrats vote for Trump. But it could possibly be used to discourage African American Democratic voters from going to the ballot box, by targeting them with memes that remind them of the Clinton administration’s awful record on race.

Cambridge Analytica didn’t win the 2016 election; Trump supporters did. Trump’s campaign boosted his popularity using demographic data to tailor messages to people – a method that has been used for centuries, albeit in a more basic way, by targeting women, for instance, or young people. Sure, they used big data to get this vote out, but so did Obama.

The problem occurs when this data is taken without people’s consent, and is used to target them without their knowledge.

Millions completed the myPersonality study, with transparency around who would use their information, and what for. They had no idea it might be used by Cambridge Analytica. And this developing story may stop Facebook users from agreeing to give away their data. People may even stop using Facebook full stop. But will they stop Googling, watching TV, or sending emails? These companies have a monopoly over people’s leisure and private time. Until they are adequately challenged on the way they use people’s information for financial gain, and held to account when they behave in illegal or immoral ways, the rest of us can make little difference.

As Kosinski points out: “Many, many companies already do what Cambridge Analytica are doing with people’s personal data. They just didn’t boast about it, and weren’t hired by Trump.” Come the next election, we may be asking all the same questions again.

  • Poppy Noor is a Guardian columnist and commissioning editor
 
