Facebook Doesn't Think Manipulating Users' Emotions Is A Big Deal


If you missed this outrageous study, published earlier this month in an academic journal, here's the nutshell version: in January 2012, a Facebook data scientist, along with two university researchers, tweaked the News Feeds of almost 690,000 users to display more "positive" or "negative" stories, in order to figure out whether "emotions are contagious on social networks."

That means exactly what you think it does: Facebook played a psychological mind game with its users, and it used a tiny clause in its 9,000-word Terms of Service to justify doing so.

What's troubling is that Facebook doesn't seem to think this is a big deal. When Forbes asked the social network about its review process for the study, Facebook responded with this tone-deaf statement:

"This research was conducted for a single week in 2012 and none of the data used was associated with a specific person's Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely."
Forbes reporter Kashmir Hill argues that even though "research" appears in Facebook's Terms of Service, most users don't expect their Facebook experience to be actively manipulated in order to gauge their emotions.

Are you horrified? Or do you think this isn't really a big deal? [Forbes]