
Facebook Knows That Labeling Trump's Election Lies Hasn't Stopped His Posts From Going Viral

Photo: Chip Somodevilla / Staff (Getty Images)

Facebook’s attempt to slow the spread of President Trump’s misinformation and outright lies by affixing warning labels to the content has done little to stop the posts from going viral—and the platform is apparently well aware.

According to internal conversations reviewed by BuzzFeed News, Facebook’s own data scientists freely admit that the labels being attached to misleading or false posts as part of a broader strategy to stop the spread of election-related misinformation (referred to internally as “informs”) have had little to no impact on how posts are shared, whether they come from Trump or anyone else.


“We have evidence that applying these informs to posts decreases their reshares by ~8%,” the data scientists said, according to BuzzFeed News. “However given that Trump has SO many shares on any given post, the decrease is not going to change shares by orders of magnitude.”

That Facebook has been unable to meaningfully reduce the spread of Trump’s lies isn’t exactly shocking, particularly given how feeble the platform’s attempts at stemming the tide of misinformation have been in the lead-up to the 2020 election. But the tacit acknowledgement of the failure is illuminating, if only in the sense that it confirms that at least some employees at Facebook are alarmed at, and asking questions about, the company’s ineptitude.


Under particular internal scrutiny is Facebook’s failure to address two posts in which Trump falsely wrote, “I WON THE ELECTION,” posts which, despite bearing labels fact-checking the claim, have amassed a combined 1.7 million reactions, 350,000 comments, and 90,000 shares to date.

“Is there any indication that the ‘this post might not be true’ flags have continued to be effective at all in slowing misinformation spread?” asked one Facebook employee on one of the company’s internal message boards. “I have a feeling people have quickly learned to ignore these flags at this point. Are we limiting reach of these posts at all or just hoping that people will do it organically?”

“The fact that we refuse to hold accounts with millions of followers to higher standards than everyone else (and often they get lower standards) is one of the most upsetting things about working here,” added another employee.

In response, one researcher working on civic integrity at the company helpfully pointed out that Facebook’s policy is not to formally fact-check politicians, which leaves little room for solutions.


“Will also flag that given company policy around not fact-checking politicians the alternative is nothing currently,” they said, according to BuzzFeed News.

Even after unveiling its much-criticized election guidelines, Facebook has continued to come under fire for high-profile missteps related to speech on the platform. After a video in which former White House chief strategist Steve Bannon called for Dr. Anthony Fauci and FBI Director Christopher Wray to be beheaded remained live on his Facebook page for more than 10 hours on November 12, Facebook CEO Mark Zuckerberg reportedly told staff at a company meeting that the comment was not enough to merit a suspension of Bannon’s account.