Facebook May Have Been Deliberately Messing With Your Emotions

The news: For one week in January 2012, Facebook conducted an experiment in which it altered the newsfeeds of nearly 700,000 users to see how the changes affected their emotions. Effectively, Facebook turned a bunch of its users into unwitting guinea pigs.

Many users were not happy. But the experiment was within Facebook's rights under its data use policy, which you agree to when you set up a profile. In the key section, Facebook says it can use your information "for internal operations, including troubleshooting, data analysis, testing, research and service improvement."

But the study was published in the Proceedings of the National Academy of Sciences — doesn't Facebook need your permission before sharing information it received from users? Yes and no. The data use policy says the social media site will inform you before sharing your info, unless it has "removed your name and any other personally identifying information from it."

The experiment: So what did this experiment actually discover? It was centered on the idea of "emotional contagion," trying to figure out how your emotional state is affected by the posts you see from friends and family.

To measure this, Facebook skewed the algorithms behind some randomly selected users' newsfeeds so that posts with more positive or more negative terms appeared. After the week was up, researchers checked how positive or negative those users' own posts had become.

It turns out the emotional tone of your Facebook feed affects your own. People with happier newsfeeds were more likely to post happy things, and vice versa. It's the kind of result Facebook data scientist Adam Kramer may have had in mind when he said in an interview, "Facebook data constitutes the largest field study in the history of the world. Being able to ask — and answer — questions about the world in general is very, very exciting to me."

The response: While it's clear users have no legal recourse, many were not at all pleased with the possibility of Facebook altering their newsfeeds and then measuring their emotional states.

Given the relative success of this study and the wiggle room its data use policy provides, Facebook seems likely to take on more studies of this type, if it hasn't done so already. Given the response, though, things may be more transparent next time.

Matt Connolly

Matt has written for Mother Jones, the Washington Examiner and Chicago Public Radio among many others. He's a resident of Washington, D.C., but much like Bruce Springsteen and pork roll he is a product of New Jersey.
