Facebook Says Whoops, Updates Rules for News Feed Manipulation (Updated)

A few months ago, the world found out that Facebook made its users into test subjects, showing some folks artificially happy (or artificially sad) News Feeds to see whether it affected the emotional tone of those users' own posts. The internet was not happy about that. Now, Facebook says it's updated its research guidelines in response.


In a blog post published today, Facebook CTO Mike Schroepfer says the research, which tweaked nearly 700,000 users' News Feeds for one week, was necessary to help Facebook see if any changes were needed. "Although this subject matter was important to research, we were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism," he says. "It is clear now that there are things we should have done differently."

What's different in the new guidelines? Clearer rules for researchers, a review panel to ensure the guidelines are met, more training folded into Facebook's "bootcamp" for new employees, and a new, regularly updated website publishing Facebook's research projects.

The blog post doesn't mention any changes to Facebook's Data Use Policy, the legal agreement we've all agreed to but none of us have read. It's important to point out that this agreement has always indicated that Facebook can "use the information we receive about you [. . .] for internal operations, including troubleshooting, data analysis, testing, research and service improvement."

Back when the uproar over Facebook's emotional research first ignited, people complained that while Facebook did in fact provide "informed consent," it did so weakly, in a document most average users don't bother to read.

Our stance then, and now, is that this is misplaced outrage: We want Facebook to make more of its data publicly known, and when the social media juggernaut publishes what its scientists are poking at (which the company does not have to do), we should applaud it, not recoil from it.

We've reached out to Facebook for official comment on whether users will see any new kind of alerts regarding potential research manipulation, and will update this post if we receive a response. [Facebook via Washington Post]


Update: A Facebook representative provided the following comment:

"Our Terms and Data Use Policy disclose to users how we collect and use data. We explain that we may run tests and conduct research to understand how people use the service and make improvements."
