Surprise! Despite repeated assurances from Facebook and its CEO, Mark Zuckerberg, that the company would stop recommending all “political and social issue groups” in order to slow down misinformation and coordinated political violence, it didn’t really do that.
The Markup reported on Tuesday that although Zuckerberg testified to Congress in October 2020 that halting politically charged recommendations was part of the company’s “emergency” measures, and the company reiterated in a Jan. 11 blog post that it was “not recommending civic groups for people to join,” it has still been prompting users to join such groups. According to the Markup, 12 of the top 100 recommended groups in its sample of 1,900 users were political, and Facebook remained disproportionately likely to recommend political groups to supporters of our banned president:
We found 12 political groups among the top 100 groups recommended to the more than 1,900 Facebook users in our Citizen Browser project, which tracks links and group recommendations served to a nationwide panel of Facebook users... Facebook pushed political groups most often to the Trump voters on our panel. Almost one fourth of the top 100 groups suggested to Trump voters were political—and political groups accounted for half of the top 10 groups recommended to Trump voters.
Moreover, the Markup found that a number of the groups recommended to Trump supporters included posts calling for political violence, spreading conspiracy theories, or discussing the logistics of attending a Jan. 6 extremist rally in Washington, DC, that became a riot at the Capitol, resulting in five deaths. Eight percent of the Trump supporters received recommendations to join a “Rudy Giuliani [Common Sense]” group, where one post called for Georgia’s governor and secretary of state to be executed by hanging, while 19 percent were recommended to join a group that advertised a “#StormTheCapitol” event on Dec. 30, 2020. Roughly one in five of the Trump supporters also received recommendations to join “Tucker Carlson Fox News” as well as “Kayleigh McEnany Fan Club.”
In all, the 1,900 panelists received recommendations to join 97,443 different groups, the majority of which had no obvious political bent or association. Democratic-leaning voters were also invited to politically minded groups, many of them full of anti-Trump memes, though at a lower rate. However, the Markup cautioned that its data set captures only a vanishingly small share of the total group recommendations served to U.S. Facebook users, and “skews older, more educated, and whiter than the general population, and [is] particularly short on Trump voters [508 in the sample] and Latinos.” The Markup’s data set is available on GitHub.
Facebook hasn’t clarified how it determines whether a particular group is devoted to “political and social” issues, though only one of the groups in the sample voluntarily tagged itself as political. (In an example of why self-tagging is not necessarily the best system, the administrator of one pro-Trump group told the Markup the group was not politically minded but instead “for a man who I know can change the future and do good for this country.”)
Facebook itself knows that its growth-oriented recommendation tools, which aim to keep users engaged with others on the site for long periods of time, can be used to fuel groups with ill intent. A Wall Street Journal report in May 2020 relayed that internal Facebook research had found “64% of all extremist group joins are due to our recommendation tools.”
The company has claimed to be taking major steps to lock down certain features of its platform since the election season. Yet it allowed ads for military and survival gear to run alongside content boosting election conspiracy theories and news about the Capitol riot. Reporting by the Washington Post and New York Times showed that far-right content continued to proliferate on Facebook before the Capitol riot, including the #StopTheSteal hashtag, at least a dozen Republican Party-affiliated groups and individuals that coordinated transport for participants, and fake viral claims of election fraud.
Facebook is also largely responsible for the explosive growth of QAnon, a conspiracy theory holding that Democratic politicians and celebrities are part of a child-raping Satanic cabal that controls the government. The company didn’t ban QAnon groups until October, and then only those that discussed violence, despite reports that QAnon groups had racked up millions of followers on the site. Facebook has continued to play host to viral QAnon content (some of it simply more subtle) even after the theory’s adherents showed up en masse for the Capitol riot.
Throughout 2020, anti-lockdown protests, many of them attended by far-right gunmen, were largely planned via public Facebook groups. In August 2020, Facebook got a preview of what might happen if it continued failing to limit far-right extremists’ ability to organize on the platform, when a self-declared “Kenosha Guard” group rallied a militia to confront Black Lives Matter protesters in Kenosha, Wisconsin, after which an armed vigilante shot three people.
“We have a clear policy against recommending civic or political groups on our platforms and are investigating why these were recommended in the first place,” Facebook spokesperson Kevin McAlister told the Markup in a statement.