
Facebook manipulated user emotions in 2012 experiment

Written by Rabail Majeed

In yet another piece of controversial news, Facebook has revealed that it conducted a research experiment on its users to study how emotions can spread across and affect social networks. The study, published in the Proceedings of the National Academy of Sciences, was carried out in 2012 with a team of social scientists from Cornell and the University of California, San Francisco, to determine whether the concept of “emotional contagion” holds for online interaction. Emotional contagion is the idea that human beings mimic the emotions of those around them, whether consciously or unconsciously. For example, if you hang out with someone who is sad or mopey all the time, chances are you will end up in the same mood. Before this experiment, there had been little to no research demonstrating whether emotional contagion could spread without face-to-face interaction.

So what makes Facebook’s experiment so controversial? It’s the research methodology. The social media giant has admitted to tweaking the news feeds of nearly 700,000 randomly selected users for a week in 2012, either increasing or reducing the amount of positive and negative content they saw. Those who were exposed to more positive news used more positive words and posted more positive updates themselves, whereas those who saw mostly negative content were more likely to post unhappy and negative updates. The researchers, however, did not view the actual content of the filtered posts, in keeping with Facebook’s data use policy. Instead, they only counted the number of positive and negative words across more than 3 million posts totalling 122 million words. According to the report, “4 million words were positive and 1.8 million were negative.”
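For readers curious about what that kind of automated word counting looks like in practice, here is a minimal, purely illustrative sketch in Python. The tiny word lists below are hypothetical stand-ins; the actual study relied on word-counting software (the Linguistic Inquiry and Word Count tool) with far larger dictionaries, and its real pipeline is not public.

# Illustrative sketch of sentiment word counting, loosely in the spirit of
# the study's approach. The word lists are made-up stand-ins, not the
# dictionaries the researchers actually used.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "lonely"}

def classify_post(text):
    """Return (positive, negative) word counts for a single post."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(1 for w in words if w in POSITIVE_WORDS)
    neg = sum(1 for w in words if w in NEGATIVE_WORDS)
    return pos, neg

# Two toy posts standing in for the millions analysed in the study
posts = [
    "Had a wonderful day, so happy!",
    "Feeling sad and lonely tonight.",
]

total_pos = total_neg = 0
for post in posts:
    pos, neg = classify_post(post)
    total_pos += pos
    total_neg += neg

print(f"positive words: {total_pos}, negative words: {total_neg}")

The point of such an approach is that no human ever reads an individual post: only aggregate counts of positive and negative words leave the pipeline.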

“This observation, and the fact that people were more emotionally positive in response to positive emotion updates from their friends, stands in contrast to theories that suggest viewing positive posts by friends on Facebook may somehow affect us negatively,” said Jeff Hancock, a professor at Cornell’s College of Agriculture and Life Sciences and co-director of its Social Media Lab. “In fact, this is the result when people are exposed to less positive content, rather than more.”

If this research has helped us gain better insight into human psychology, what’s up with all the backlash? Well, it has a lot to do with Facebook intentionally making thousands of people sad and manipulating their feeds without their prior consent. While Facebook maintains that such research is covered by its data use policy (the one we all thoroughly studied… right?), the concept of “informed consent” is deeper and broader than simply ticking an “Agree” box on a website. Social experimenters and psychologists go to great lengths to ensure subjects fully and freely consent to an experiment, using language that they understand.
The relevant section of Facebook’s data use policy reads: “… in addition to helping people see and find things that you do and share, we may use the information we receive about you … for internal operations, including troubleshooting, data analysis, testing, research and service improvement.” So Facebook mentions research, albeit very vaguely, without defining its scope. Nor did it inform the unsuspecting subjects of the study that they were, in fact, the subjects!

One of the researchers in the study, Adam Kramer, has come to Facebook’s very public defense. He reportedly said, “The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product… We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”
While Facebook may not have manipulated all of its users’ emotional states, it has certainly increased the level of scrutiny and mistrust it faces from users and non-users alike. How do the experiment and Facebook’s apparent breach of ethical conduct make you feel? Feel free to give your two cents in the comments below.

Written by Rabail Majeed
Rabail is a writer at TechJuice who, when not rushing to meet deadlines, can be found planning her next big trip up north or adding to her collection of questionably cheesy ’80s music.