Facebook May Have Been Deliberately Messing With Your Emotions


The news: For one week in January 2012, Facebook conducted an experiment in which it altered the newsfeeds of nearly 700,000 users to see how the changes would affect their emotions. Effectively, Facebook turned a bunch of its users into unwitting guinea pigs. 

Many users were not happy. But the experiment was within Facebook's rights: it's covered by the company's data use policy, which you agree to when you set up a profile. In the key section, Facebook says it can use your information "for internal operations, including troubleshooting, data analysis, testing, research and service improvement."

But the study was published in the Proceedings of the National Academy of Sciences. Doesn't Facebook need your permission before sharing your information with outside researchers? Yes and no. The data use policy says the social network will inform you before sharing your info, unless it has "removed your name and any other personally identifying information from it."

The experiment: So what did this experiment actually discover? It was centered on the idea of "emotional contagion," trying to figure out how your emotional state is affected by the posts you see from friends and family.

In order to measure this, Facebook skewed the algorithm behind some randomly selected users' newsfeeds so that more positive or more negative terms appeared. Once the week was up, the researchers checked how positive or negative those users' own posts had become.
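To get a rough sense of how "how positive or negative" can be scored across millions of posts, here is a minimal sketch of a word-count classifier in Python. The word lists and function names are hypothetical stand-ins, not Facebook's actual code; the published study relied on a standard word-count dictionary (LIWC) to flag posts containing positive or negative words.

```python
# Minimal sketch of word-count sentiment scoring, not the study's actual code.
# The word lists below are hypothetical placeholders.
POSITIVE_WORDS = {"happy", "great", "love", "excited", "wonderful"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "awful"}

def classify_post(text: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' based on word hits."""
    words = set(text.lower().split())
    has_pos = bool(words & POSITIVE_WORDS)
    has_neg = bool(words & NEGATIVE_WORDS)
    if has_pos and not has_neg:
        return "positive"
    if has_neg and not has_pos:
        return "negative"
    return "neutral"

def emotional_rate(posts: list[str], label: str) -> float:
    """Fraction of a user's posts that carry the given emotional label."""
    if not posts:
        return 0.0
    return sum(classify_post(p) == label for p in posts) / len(posts)

# Example: how often did a user post positive words after the test week?
posts = ["Feeling great about today", "Traffic was terrible", "Lunch was fine"]
print(emotional_rate(posts, "positive"))  # ~0.33
```

Comparing rates like these between users whose feeds were skewed positive and users whose feeds were skewed negative is, in broad strokes, how an "emotional contagion" effect can be detected.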

It turns out the emotional state of your Facebook feed affects your own. People with happier newsfeeds were more likely to post happy things, and vice versa. It's the kind of result Facebook data scientist Adam Kramer may have had in mind when he said in an interview, "Facebook data constitutes the largest field study in the history of the world. Being able to ask — and answer — questions about the world in general is very, very exciting to me."

The response: While it's clear users have no legal recourse, many were not at all pleased by the idea of Facebook altering their newsfeeds and then measuring their emotional states.

Given the relative success of this study and the wiggle room its data use policy provides, Facebook certainly seems likely to take on more studies of this type (if it hasn't done some already). After this response, though, things may be more transparent next time.