The outrage sparked by a recently revealed Facebook experiment on almost 700,000 users' emotions continues to burn, causing the research team to mount a public defence.
For a week in January 2012, researchers tweaked users' News Feeds to expose them to more positive or negative posts, and found a corresponding shift in what the affected users shared. In the paper, they present this as evidence of online emotional contagion.
The publication of the report prompted thousands of protest posts on social media, as well as comments from leading academics and psychologists who have raised concerns about the ethics involved.
Facebook data scientist Adam Kramer, who coordinated the study, has now admitted the study may not be worth the uproar it caused.
"Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone," Kramer wrote in a public Facebook post. "I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."
Kramer claims Facebook and the two academics involved in the study, UCLA's Jamie Guillory and Cornell University's Jeffrey Hancock, conducted the emotion-altering experiment because they care about their users.
Kramer says he realises the team wasn't clear enough about their motivations in the report. The report states that they set out to test popular theories, such as the idea that seeing positive posts from friends makes users feel worse and withdraw.
"The fact that people were more emotionally positive in response to positive emotion updates from their friends, stands in contrast to theories that suggest viewing positive posts by friends on Facebook may somehow affect us negatively, for example, via social comparison," wrote Kramer and co in the report.
In his Facebook post, Kramer celebrates the debunking of these theories. He claims they discovered the "exact opposite to what was then the conventional wisdom: Seeing a certain kind of emotion (positive) encourages it rather than suppresses it".
Despite this good news, users, psychologists and academics aren't celebrating. Many are raising concerns about the issue of consent, as users weren't informed about the study before, during or after the experiment.
Under US federal law, research using human subjects requires those tested to give informed consent. However, this rule applies only to federally funded research, and Facebook is a private company.
But James Grimmelmann, a professor of law at the University of Maryland, says the fact the paper was co-authored by two academics at universities that receive significant federal funding means these rules should apply, particularly because participants weren't all treated in an identical way.
"We wouldn’t tell patients in a drug trial that the study was harmless because only a computer would ever know whether they received the placebo," wrote Grimmelmann in a blog post. "The unwitting participants in the Facebook study were told (seemingly by their friends) for a week either that the world was a dark and cheerless place or that it was a saccharine paradise. That’s psychological manipulation, even when it’s carried out automatically. This is bad, even for Facebook."
The universities are distancing themselves from the research. Cornell released a statement on Monday explaining the academics were not involved in collecting the data, only analysing it.
The research paper was edited by Princeton University's Susan Fiske. The psychology professor told The Atlantic the study was technically acceptable, but that she is "a little creeped out too".
“It's ethically okay from the regulations perspective, but ethics are kind of social decisions. There's not an absolute answer. And so the level of outrage that appears to be happening suggests that maybe it shouldn't have been done...I'm still thinking about it and I'm a little creeped out, too.”
Kramer responded directly to this concern in his post. He explains the team made minimal changes, simply "deprioritising" a small amount of content if it included an emotional word, for a small group of people, for only a week.
"At the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it - the result was that people produced an average of one fewer emotional word, per thousand words, over the following week," Kramer wrote.
Kramer finishes the post by explaining that researchers at Facebook are working to improve their internal review practices, adding that the study was designed several years ago and arguing they've "come a long way since then".