
Facebook Experimented on Random Users to Study Newsfeed Emotions

We may earn a commission from links on this page.

Spend time with a Debbie Downer, and you'll likely end up feeling blue. Turns out, the same is true digitally: Facebook's new study says this "emotional contagion" works just as strongly through your News Feed—which they discovered after tinkering with the emotional content of nearly 700,000 random users' feeds.

Yes, as the research paper published in the Proceedings of the National Academy of Sciences describes, Facebook's researchers wanted to figure out whether the transfer of emotions that we've all experienced face-to-face can also occur digitally. So the team of three created an algorithm that analyzed the words in News Feed posts to categorize them as emotionally positive or negative.
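The paper's classifier worked by counting emotional words in each post (it relied on an off-the-shelf word-counting tool). Here's a toy sketch of that idea; the word lists below are made up for illustration and are far smaller than anything the researchers actually used.

```python
# Illustrative sketch only: tiny placeholder word lists, not the real ones.
POSITIVE_WORDS = {"happy", "great", "love", "awesome", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "awful"}

def classify_post(text):
    """Label a post 'positive', 'negative', or 'neutral' by counting
    how many of its words appear in each emotional word list."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"
```

Crude as it looks, this kind of bag-of-words counting is roughly how large-scale sentiment tagging worked at the time: no understanding of sarcasm or context, just tallies.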


They then used this information to tinker with a whole mess of people's News Feeds. The team took 689,003 users' feeds and tweaked the emotional content for a week. Some users received feeds with some of the more negative status updates hidden; others got feeds tuned toward doom and gloom.
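In pseudocode terms, the manipulation amounts to a filter over the feed: for users in one experimental condition, each post of the targeted emotional flavor had some chance of being hidden. Everything below (the data structures, function name, and probability parameter) is a hypothetical sketch, not Facebook's actual code.

```python
import random

def build_feed(posts, condition, omit_prob, rng=None):
    """Return the posts actually shown to a user.

    posts: list of (text, valence) pairs, valence in {'positive', 'negative'}.
    condition: 'reduce_positive' or 'reduce_negative'.
    omit_prob: chance that each post of the targeted valence is hidden.
    """
    rng = rng or random.Random()
    target = "positive" if condition == "reduce_positive" else "negative"
    shown = []
    for text, valence in posts:
        if valence == target and rng.random() < omit_prob:
            continue  # hide this post from the viewed feed
        shown.append(text)
    return shown
```

Note that nothing is deleted; a hidden post still exists and could appear on a later viewing. The experiment only changed what a given user saw at a given moment.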

Turns out, there was a direct effect: despite no actual human interaction, subjects who saw artificially chipper feeds posted happier status updates, and the opposite was true for folks given a sourpuss feed.


The findings are, perhaps, kind of interesting: for the first time, we have large-scale statistical evidence that social media affects our emotions in a way that's very similar to IRL human interaction. As the authors put it, "in contrast to prevailing assumptions, in-person interaction and nonverbal cues are not strictly necessary for emotional contagion, and . . . the observation of others' positive experiences constitutes a positive experience for people."

But there's something a bit creepy about Facebook using nearly three quarters of a million regular users as psychological test subjects without their ever knowing it. To be fair, the emotional content rating and the News Feed manipulation were done by machine, rather than by some lab-coated scientist combing through all your friends' status updates. But the notion that Facebook can actually manipulate your emotions via the News Feed items you see just feels a bit bleak.

Of course, it's all covered under Facebook's privacy policy, the one you thoroughly studied and deeply considered (read: blindly agreed to) when you signed up for the social media service. Facebook didn't need to ask you for consent; it already had it.

So don't be surprised if you start feeling happy, sad, or angry in response to a friend's happy, sad, or angry status update. Turns out, it's just the way we're wired. Although your friends' humblebragging updates probably aren't 100 percent truthful anyway. [PNAS; New Scientist via A.V. Club]