Charles Arthur and Jon Swaine 

Facebook faces criticism amid claims it breached ethical guidelines with study

Social media network criticised over lack of informed consent, while legal expert claims participants were harmed by study
  
  

Facebook is facing criticism over the study's lack of informed consent, while one legal expert claimed participants were harmed by the research. Photograph: Rainier Ehrhardt/Getty Images

Facebook's experiment in which it tweaked the words used in nearly 700,000 users' news feeds to see if it affected their emotions breached ethical guidelines, say independent researchers.

"This is bad, even for Facebook," said James Grimmelmann, professor of law at the University of Maryland, who says that the lack of informed consent in the experiment carried out for a week during January 2012 on Facebook users is a "real scandal".

In an extensive blogpost entitled "Facebook didn't give users informed consent", Grimmelmann said people did not have the information to allow them to decide whether to take part in the study. The study harmed participants because it changed their mood, he added.

The blogpost came after it emerged that the social network had manipulated news feeds to see if reading about people having a good or bad time made others feel the same, or the opposite.

The conclusion of the research, which was conducted in conjunction with two academic authors from Cornell University, was that emotions were contagious, and that seeing posts by people who were feeling down may have the same effect on readers.

Cornell said that it had decided the research did not require it to check if participants had given their consent, as it would normally, because Facebook itself had been responsible for the collection of its users' data.

John Carberry, a spokesman, said in a statement: "Because the research was conducted independently by Facebook and Professor [Jeffrey] Hancock had access only to results – and not to any data at any time – Cornell University's institutional review board concluded that he was not directly engaged in human research and that no review by the Cornell human research protection program was required."

Max Masnick, a researcher studying for a doctorate in epidemiology who says of his work that "I do human subjects research every day", said the structure of the experiment meant that there was no informed consent, which is a key requirement of studies on humans. "As a researcher, you don't get an ethical free pass because a user checked a box next to a link to a website's terms of use. The researcher is responsible for making sure all participants are properly consented. In many cases, study staff will verbally go through lengthy consent forms with potential participants, point by point. Researchers will even quiz participants after presenting the informed consent information to make sure they really understand.

"Based on the information in the PNAS paper, I don't think these researchers met this ethical obligation."

But one of the Facebook researchers, Adam Kramer, posted a lengthy defence of the Facebook research, saying it was carried out "because we care about the emotional impact of Facebook and the people that use our product". He said that he and his colleagues "felt it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out."

Over the course of one week, the experiment hid a "small percentage" of posts containing emotional words from the news feeds of 689,003 English-speaking users – about 0.04% of total users, or 1 in 2,500 – without their knowledge, to test what effect this had on the statuses they then posted or the "likes" they gave.

The results showed that, contrary to expectation, people's emotions were reinforced by what they saw – what the researchers called "emotional contagion".

Kramer does not address the topic of informed consent in his blogpost. But he says that "my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety".

The study is the latest in a number of "controlled experiments" Facebook has carried out on its users without their knowledge, though none has been as radical as this one.

The firm argues that its terms of service allow it to do so, and that users give their consent to being used for such experiments when they sign up and agree to the terms and conditions.

In 2010, during the US congressional elections, Facebook showed maps of polling stations and photos of friends who had already voted to encourage one randomly chosen set of people to vote, while showing nothing to others. It claimed afterwards that this lifted the number of votes across the US by around 340,000, based on a "ripple effect" from 60,000 voters who were mobilised by its subtle encouragement.

An earlier study, also carried out by Facebook and published in PLOS One, an online journal, had found that rainy weather could affect people's postings, and that those postings could in turn affect the postings of others who read them. But unlike the recent study, it simply observed the content of posts rather than altering them.

Facebook could also separately face an investigation by the Federal Trade Commission, which oversees consumer protection in the US. However, a spokesman for the FTC declined to comment: "Any kind of investigations or compliance monitoring we do is nonpublic," he said. The FTC previously rapped Facebook in November 2011 over charges that the network deceived consumers by failing to keep promises that it would keep postings private, and then making them public.

Adi Kamdar, of the Electronic Frontier Foundation, a pressure group that works to defend people's digital civil liberties, said the study should teach users "to be wary of how much data you want to give Facebook and how much you rely on [it]".

"What a mess," said Kamdar. "People need to understand that Facebook as a service is not a neutral platform. It is not the internet. It is a for-profit company with its own needs, and its own agenda, and will affect what users see at its own whim. It could be manipulating data all the time, but this is the first time we saw the results."

Another Cornell academic, who works alongside Hancock at the university's Social Media Lab, was awarded almost $1.2m in US government funding earlier this month for a new study on the disclosure of personal information by social media users – "from mundane details of daily life to tragic warnings of planned suicide" – to help assist in "well-being interventions".

A summary proposal for the study, which is being led by Natalie Bazarova, an assistant professor, said it would "collect examples of actual disclosure" from social media postings. Bazarova said in an email to the Guardian: "We will be collecting data from consenting social media users on sites such as Twitter or Facebook."

 
