Facebook Killed News Feed Fixes Over Fear of Conservative Backlash

Facebook said it did not “build and withhold any News Feed changes based on potential impact on any one political party.” Internal documents say otherwise.
Image: Mandel Ngan/AFP/Gizmodo Illustration/Artem Golub (Getty Images)

This piece is part of Gizmodo’s ongoing effort to make the Facebook Papers available to the public. See the full directory of documents here.

[Editor’s note: We’ll be releasing a dozen or more additional “News Feed” documents this week. Reload this page for updates or use our directory page to view new documents as they’re uploaded.]

After Donald Trump’s 2016 victory, Facebook faced a tsunami of allegations blaming its core product, News Feed, for filling the brains of its users with false information. Scrutiny sharply intensified among lawmakers long convinced of the company’s duty to stamp out state-affiliated trolls and other malicious groups set on eroding electoral trust. Mark Zuckerberg, whose platform received $81 million for ads on behalf of the two White House contenders, defensively belittled his own creation: Within days of the results, he told a tech conference crowd it would be “pretty crazy” to assume Facebook—the world’s largest communications platform—was capable of influencing voters “in any way.” Ten months later, Zuckerberg would claim to “regret” those words.

Video: Making the Facebook Papers Public

Soon after the off-the-cuff dismissal, however, sources inside Facebook told Gizmodo that political meddling had been a major concern at the company for the better part of a year. High-level discussions over its approach to false news, disinformation, and other activities aimed at manipulating voters had been held routinely, the current and former employees said, speaking anonymously for fear of retaliation. One source with direct knowledge of the discussions recalled a potential update that employees believed would reduce the flow of “fake or hoax news stories.” A slew of right-wing pages had been flagged by Facebook’s algorithmic moderation system for habitually spreading falsehoods, they said. Afraid of upsetting conservatives, however, Facebook had shelved the update, and many decisions around the election were “caught up in that,” the source said.

Facebook officials refused to confirm or deny the existence of the update, or to even acknowledge that a disproportionate amount of the misinformation arose from one side of the political spectrum. Instead, a spokesperson said, “We did not build and withhold any News Feed changes based on their potential impact on any one political party.”

Facebook asserted at the time that accusations of political bias had no influence whatsoever over its decision making. But the leaked statements of its own employees offer evidence to the contrary.

Today, Gizmodo is publishing our third batch of the Facebook Papers—documents that, among other things, shine a light on the company’s reluctance to take action against known sources of misinformation. Employees whose work appears in the papers repeatedly attribute decisions like the shelved News Feed update to fears that the company would be portrayed as favoring certain publications that, in some cases, its own users judged more informative. In particular, accusations of liberal bias from Republican leaders weighed heavily in debates over whether to improve News Feed or correct flaws in the way Facebook prioritized journalism and other political content. Two papers describe accusations of a liberal slant within the company as playing a crucial role in decisions reached during employee-led efforts to minimize the frequency of propaganda and misinformation in people’s feeds.

An internal post dated August 2019 briefly describes the decision by Facebook to kill a News Feed update purportedly designed to prioritize “high quality” news. In this case, Facebook obtained the underlying data responsible for gauging the trustworthiness of news sources by polling users. The company came to the decision not to reduce the flow of “low quality” news to stave off charges from “some quarters” about “perceived anti-conservative bias,” according to the post.

Asked about the discrepancy between the company’s prior claims and the once-confidential testimony of its own employees, a Facebook spokesperson declined to comment.

In the same 2019 document, Facebook employees estimated that the company had only taken action against “approximately 2% of the hate speech on the platform,” while concluding that misinformation, when noticed at all, often goes unidentified “until after it has gotten a lot of distribution.” The most “impactful abusive accounts,” the post says, persistently evade moderation. While employees generally have “considerable leeway” when it comes to making decisions that affect “a wide range of content,” the author writes, “policy concerns become significantly higher” when politics enter the frame.

The documents take on new relevance in the political climate of 2022. Attempts by social media companies to minimize the spread of election-related hoaxes and false news have spurred Republican leaders in several states to pursue new laws around content moderation. Laws authored in Texas and Florida have been framed as attempts by legislatures to protect users from being punished for holding unpopular political opinions. Last month, researchers at MIT and Yale debuted a paper finding Republicans on Twitter had, in fact, been “much more likely” to face suspension. (The paper is a preprint and has yet to be published.) At the same time, the researchers’ analysis showed Republican users had posted misinformation ahead of the election at a rate “substantially” higher than their Democratic counterparts. “Thus,” the researchers wrote, “policies aimed at fighting misinformation in a nonpartisan way could have easily” explained the discrepancy.

Another Facebook document, dated August 2020, speaks to the influence of its public relations department over a policy already in effect, aimed at limiting political content more broadly across News Feed. The author reveals that “internal employees” had started “expressing discomfort” over the way in which this policy was imposed—specifically, that Facebook was continuing to inject content into people’s feeds stemming from pages pushing “highly partisan content.”

A fix proposed by Facebook’s news team appears to have been shot down by both its PR department and the News Feed’s own policy team. The concerns about fixing the problem stemmed, the document’s author wrote, from anticipated accusations of bias brought by “certain political entities.”

Months prior to the 2020 election, while awash in censorship allegations from a concerted Republican campaign against the company, Facebook began throttling the level of political content surfacing across the platform. Not all of these changes were announced. A press release from Sept. 2020 described such measures as banning political ads the week before Election Day and adding labels to posts trying to “delegitimize” the results. But no reference was made to restricting political content across the platform more generally. That wasn’t disclosed until a week before the election, seemingly a consequence of Mark Zuckerberg being grilled under oath.

These quieter efforts at restricting political chatter appear to be tied to concerns about users quitting the platform over constant partisan bickering. A confidential report—published by Gizmodo this year—had warned in January 2020 that many users had begun associating Facebook with “feelings of exhaustion, discouragement, stress, and anger.” Many blamed the platform for “strained relationships” and for costing them lifelong friends. The company quoted one user as saying Facebook had “severely harmed” many of their friendships, a consequence, they said, of “how viscous, hateful, and biased” people’s posts had become.

The Facebook Papers comprise tens of thousands of leaked pages describing, often in granular detail, how Facebook’s moderation systems really operate. Some of these documents, in fact, are so specific that security experts have warned against making them public, afraid bad actors will learn secrets to evading detection while breaking rules in place for good reason. The records were first provided to Congress last year by Frances Haugen, a Facebook product manager-turned-whistleblower, and later obtained by hundreds of journalists, including those at Gizmodo. Haugen offered blistering testimony to Congress about Facebook’s harms in October 2021. In our first drop, we shared 28 files related to the 2020 election and the Jan. 6 attack on the U.S. Capitol. In our second, 37 files. Gizmodo has partnered with a group of independent experts to review, redact, and publish the documents responsibly.

Not one of the documents, however, points to any calculated conspiracy aimed at censoring conservative voices. Far from forcing Facebook to shun partisanship, the company’s biggest accusers on Capitol Hill seem only to have achieved the opposite.


May 16, 2022: Ranking and News Feed documents

In an effort to boost original broadcast posts (OBPs) on Facebook, one researcher tried turning off autoplay videos in News Feed. The researcher notes that the numbers suggest a “significant” amount of video watch time on users’ feeds is due to autoplay, and that while turning off autoplay on News Feed globally might be good for user wellbeing in the abstract, it would be “difficult” to implement on a wide scale because of the immediate blows to metrics like meaningful social interactions (MSI) and the loss of longer sessions.

A post from early 2021 announcing “goal changes” for metrics in the News Feed.

A collection of various internal studies and reports about News Feed that were conducted during Q1 2021.

An experiment in which researchers switched some users’ feeds from ranked to chronological. As it turns out, a chronological feed resulted in users seeing more content, as measured by VPVs, and ad revenue jumped in turn. The writer hypothesizes this is because people are scrolling through their feeds in frustration and seeing more ads as a result. Another theory: less compelling content on your News Feed means fewer reasons to click away from it to profiles and pages.

A retrospective on all of the News Feed related research that happened in the second half of 2020.

Two documents detailing a proposal for how the company can filter out semi-political content (the example given is a non-political post from Ben Shapiro’s page).

A list of some basic product updates to In-Feed Recommendations that went through in the first half of 2021, including some tweaks meant to boost sessions and screen time among teens and young adults.

Click here to read all the Facebook Papers we’ve released so far.