In the days since the election, Facebook CEO Mark Zuckerberg has finally started publicly confronting Facebook’s fake news problem. In doing so, he’s attempted to absolve the company of any blame but has offered no proof to support his claim that hoaxes and fake news aren’t running rampant on Facebook. The company holds all of the internal data that could support Zuckerberg’s claims, but is keeping it under wraps. It’s time for Facebook to stop playing games.
Zuckerberg spoke out after several outlets criticized the proliferation of fake news and hoaxes spreading on Facebook. These critics suggest that fake stories like “FBI Agent Suspected in Hillary Email Leaks Found Dead in Apparent Murder-Suicide” or “Pope Francis Shocks World, Endorses Donald Trump for President, Releases Statement” may have been a factor in getting Donald Trump elected. The murder-suicide story, posted by a hoax site called The Denver Guardian, has been shared 560,000 times, according to a publicly available Facebook tool.
On Saturday night, Zuckerberg released a statement saying “more than 99% of what people see [on Facebook] is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics.”
The problem with that statistic is that Facebook does not provide any data or proof to back it up. Zuckerberg clouded the issue further when he responded to a comment from a user who said much more than 1% of the news he saw on his page was fake.
“The stat I mentioned is across the whole system,” Zuckerberg responded. “Depending on which pages you personally follow and who your friends are, you may see more or less. The power of Facebook is that you control what you see by who you choose to connect with.” Zuckerberg could have offered some meaningful data that might have bolstered his case, like how often fake news is clicked on or shared, but he didn’t.
And this is a pattern. In another response, Zuckerberg said that “the research shows that people are actually exposed to more diverse content on Facebook and social media than on traditional media like newspapers and TV.”
As Robyn Caplan, lead researcher for the Algorithms and Publics project at Data & Society, told Gizmodo, Zuckerberg is citing his own researchers—Bakshy, Messing and Adamic, who work for Facebook. These researchers wrote that for Facebook users, “who they friend and what content they click on are more consequential than the News Feed ranking in terms of how much diverse content they encounter,” and emphasized, again, that individual choices matter more than algorithms.
“This underplays the role that algorithms play in personalizing content based on these individual choices,” Caplan told Gizmodo via email. She added that the research bolsters these claims with nonspecific language.
“Their research shows that social media exposes people to more cross-cutting discourse in social media ‘than they would be under the digital reality envisioned by some.’ Unfortunately these researchers do not clarify what they mean by this very broad statement... Facebook is asking us to trust them, and yet are only giving us our own anecdotal and personalized experience of their platform, as evidence.”
Facebook is the sole keeper of the data, and it doesn’t freely offer it to independent researchers either. “Though the researchers claim to give access to the data they use, this data still needs to be requested by researchers, and as of yet, no researchers have independently verified these claims,” Caplan wrote. “In fact, other researchers, with access to an extremely limited dataset (which they got through Bing, which skews older), have shown that ‘filter bubbles’ are greater when individuals are accessing information over search engines or social media than from traditional media sources.”
In addition to its reluctance to back up its statements with evidence, Facebook has a long history of dodging hard questions and issuing non-denials that are later proved to be misleading. Zuckerberg’s PR-sanitized statement on Saturday came eight hours after The New York Times’ Mike Isaac reported that Facebook executives were asking themselves what role the social network played in the election. Facebook had previously declined to comment on Isaac’s story. After Gizmodo’s Michael Nuñez reported that Facebook’s trending news section was run by human “curators,” and that a former curator saw colleagues suppress conservative news, Facebook issued a string of shifting responses from spokespeople until Zuckerberg eventually had to address the controversy himself. Facebook ultimately fired its trending news team. And after a June report from Fusion saying Facebook used location data to suggest friends, the company initially confirmed the report, then walked back its confirmation after backlash.
Facebook’s public dismissal of fake news concerns on Saturday also runs counter to a Gizmodo report published today that a high-level internal debate over fake news has been going on at the company since May. One source told Gizmodo that “high-ranking officials were briefed on a planned News Feed update that would have identified fake or hoax news stories, but disproportionately impacted right-wing news sites by downgrading or removing that content from people’s feeds.”
It’s still not clear exactly what role Facebook played in the election, or whether the company has done enough to prevent the spread of fake news. But we do know this: The only way to know for sure is for Facebook to release its internal metrics and stop fucking with us.