Facebook has been on a mission to stamp out the spread of misinformation and propaganda since last November, after reports first began to shed light on the role that social media played in the US election. But the fact-checkers the company has relied on don’t seem to have much faith in the tech giant’s plan to fix the mess it created.
Facebook CEO Mark Zuckerberg initially claimed that his company had no impact on the presidential election, saying last November that “the idea that fake news on Facebook... influenced the election in any way is a pretty crazy idea.” But the company still began taking action to slow the spread of misinformation, beginning with a partnership with several news and fact-checking organizations—including the Associated Press, PolitiFact, and Snopes—to flag false and highly disputed news stories.
Since then, Zuckerberg has added some crow to his diet of plain toast. As Facebook has opened up about coordinated efforts involving Russian government-backed Facebook trolls and Russian-bought Facebook ads—including its belief that 126 million Americans viewed Russian propaganda during the election—Zuckerberg has said he regrets his dismissive comments.
And just as we’ve been learning the true scope of Facebook disinformation campaigns, it’s also becoming apparent that the company’s efforts to impede such campaigns, both now and during future elections, reportedly aren’t going so well.
Last month, fact-checkers started speaking out about their disappointment with the lack of provided data on the impact of their work. Now they’re also saying the effort is essentially a sham, according to a new report from The Guardian.
“I don’t feel like it’s working at all. The fake information is still going viral and spreading rapidly,” one reporter told The Guardian. “It’s really difficult to hold [Facebook] accountable. They think of us as doing their work for them. They have a big problem, and they are leaning on other organizations to clean up after them.”
The Guardian did not reveal the names of some of the sources because those third-party fact-checkers were not authorized by Facebook to speak publicly about the matter. But multiple fact-checkers told the news outlet that the efforts have mostly been a failure. And they apparently believe the social media company has exploited them as a part of a publicity campaign.
What’s more, the reporters interviewed for the article said their professional partnerships with Facebook have hindered their ability to report stories pertaining to the social media company’s part in misinformation campaigns that affected the election. “They are basically buying good PR by paying us,” one source said.
The sources for the report cast doubt on the true impact that third-party fact-checkers such as themselves can actually have. “They should be hiring armies of moderators and their own fact-checkers,” one fact-checker said. “The relationship they have with fact-checking organizations is way too little and way too late.”
PolitiFact executive director Aaron Sharockman told The Guardian that some fact-checkers were disappointed because once articles have been debunked, misinformation propagators can easily re-publish the same story or information on other sites or URLs.
Fact-checkers who spoke to The Guardian added to previous complaints about Facebook’s lack of transparency about the impact of their work. “We’re sort of in the dark. We don’t know what is actually happening,” Alexios Mantzarlis, director of Poynter’s International Fact-Checking Network, told The Guardian. Mantzarlis, whose group helps Facebook choose third-party fact-checkers, added: “the level of information that is being handed out is entirely insufficient … This is potentially the largest real-life experiment in countering misinformation in history. We could have been having an enormous amount of information and data.”
A Facebook spokesperson told The Guardian that article impressions drop by 80 percent after articles have been flagged. “Our work with third-party fact-checkers is not just meant to educate people about what has been disputed – it also helps us better understand what might be false and show it lower in News Feed.” Facebook did not respond to Gizmodo’s request for comment on the report.