Internet outraged by Facebook's 'creepy' mood experiment

Facebook's controversial mood experiment

Everyone has a bad day on occasion. But what if Facebook made it worse -- on purpose, and without telling you?

Internet users have reacted angrily to news that Facebook researchers manipulated the content some users were shown in an attempt to gauge their emotional response.

For one week in early 2012, Facebook (FB) changed the content mix in the News Feeds of almost 690,000 users. Some people were shown a higher number of positive posts, while others were shown more negative posts.

The results of the experiment, conducted by researchers from Cornell University, the University of California, San Francisco, and Facebook, were published this month in the prestigious academic journal Proceedings of the National Academy of Sciences.

The study found that users who were shown more negative content were slightly more likely to produce negative posts themselves. Users in the positive group responded with more upbeat posts.

So it worked! Facebook successfully changed the emotional state of its users. While the mood changes were small, the researchers argued that the findings have major implications given the scale of the social network.
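
For the technically curious, the effect was measured from the words users wrote. Below is a minimal sketch of that kind of word-count sentiment scoring; the word lists, sample posts and function names are hypothetical stand-ins for illustration, not the researchers' actual method (the published study relied on LIWC word-counting software).

```python
# Toy word-count sentiment scoring, loosely in the spirit of the
# study's approach. Word lists and sample posts are made up.

POSITIVE_WORDS = {"happy", "great", "love", "awesome", "fun"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "awful"}

def tone_score(post: str) -> float:
    """Return the (positive - negative) word share of a post."""
    words = [w.strip(".,!?").lower() for w in post.split()]
    if not words:
        return 0.0
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    return (pos - neg) / len(words)

def average_tone(posts: list[str]) -> float:
    """Average tone across one experimental group's posts."""
    return sum(tone_score(p) for p in posts) / len(posts)

# Hypothetical posts from the two experimental groups.
shown_more_positive = ["What a great day, love it!", "This is awesome fun"]
shown_more_negative = ["Feeling sad today", "This is terrible, I hate it"]

print(average_tone(shown_more_positive))  # higher (more upbeat) score
print(average_tone(shown_more_negative))  # lower (more negative) score
```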

Facebook's terms of service give the company permission to conduct this kind of research, but many users have condemned what they describe as a dangerous social experiment. There is no indication that the roughly 690,000 subjects were asked whether they wanted to take part in the study.

Facebook uses an algorithm to determine which of the roughly 1,500 posts available at any given time will show up in a user's News Feed. The company frequently tweaks this algorithm to adjust the mix of news, personal stories and advertisements that users see.
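
Here is a simplified sketch of what that kind of feed selection might look like. It is an illustrative model only: the Post fields, relevance scores and the idea of probabilistically dropping posts by emotional tone are assumptions made for this sketch, not Facebook's actual code.

```python
import random
from dataclasses import dataclass

@dataclass
class Post:
    text: str         # post content
    relevance: float  # how interesting the ranker scores it
    tone: str         # "positive", "negative", or "neutral"

def build_feed(candidates: list[Post], demote_tone: str | None = None,
               drop_prob: float = 0.5, size: int = 50) -> list[Post]:
    """Rank candidate posts and keep the top `size`.

    If demote_tone is set, each matching post is dropped with
    probability drop_prob -- loosely analogous to how the experiment
    reduced positive or negative content in some users' feeds.
    """
    kept = [p for p in candidates
            if not (p.tone == demote_tone and random.random() < drop_prob)]
    return sorted(kept, key=lambda p: p.relevance, reverse=True)[:size]

# A normal feed vs. one with positive posts demoted, built from the
# same ~1,500 hypothetical candidate posts.
candidates = [Post(f"post {i}", random.random(),
                   random.choice(["positive", "negative", "neutral"]))
              for i in range(1500)]
normal_feed = build_feed(candidates)
experiment_feed = build_feed(candidates, demote_tone="positive")
```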

The Facebook researcher who designed the experiment, Adam D. I. Kramer, said in a post Sunday that the research was part of an effort to improve the service -- not upset users.

"I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused," Kramer wrote. "In hindsight, the research benefits of the paper may not have justified all of this anxiety."

A Facebook spokesman said the company frequently does research to "improve our services and to make the content people see on Facebook as relevant and engaging as possible."

"We carefully consider what research we do and have a strong internal review process," the spokesman said in a statement. "There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely."

Given the company's terms of service, it does not appear that Facebook faces any legal liability. But the guinea-pig nature of the experiment -- and the decision to carry it out without participants' explicit consent -- raises ethical questions.

Susan Fiske, the Princeton professor who edited the research, said that while the research was "inventive and useful," the outcry suggests that maybe it shouldn't have been carried out.

"I was concerned," she told The Atlantic, "until I queried the authors and they said their local institutional review board had approved it -- and apparently on the grounds that Facebook apparently manipulates people's News Feeds all the time... I understand why people have concerns. I think their beef is with Facebook, really, not the research."
