
Internal Facebook Documents Show How Badly It Fumbled the Fight Against Anti-Vaxxers: Report

Facebook researchers warned that anti-vaxxers were deluging comment sections with propaganda, but the company was slow to take action.

A patient receiving a coronavirus vaccine from the Oakland County Health Department clinic in Southfield, Michigan, in August 2021.
Photo: Emily Elconin (Getty Images)

The Wall Street Journal has had something of a banner week tearing down Facebook. Its series on a trove of internal company documents obtained by the paper has unveiled Facebook’s secret system for treating certain users as above the rules, company research showing how harmful Instagram is for young girls, evidence that the site’s algorithmic solutions to toxic content have backfired, and reports that Facebook executives are slow to respond to organized criminal activity. On Friday, it published another article detailing how badly Facebook has fumbled the fight against anti-vax content and CEO Mark Zuckerberg’s campaign to get users vaccinated.

The Journal details how encouraging vaccinations became a key initiative under Zuckerberg, but the effort was continually undermined by the inherent nature of Facebook and a sluggish approach to mobilizing against identified problems. As happened with everything from armed vigilantism during Black Lives Matter protests to QAnon conspiracy theories and the Jan. 6 Capitol insurrection, the Journal report shows how Facebook failed to act effectively on widespread warnings that it was becoming a hive of toxic content, leaving it stuck playing catch-up.

One big problem was that Facebook users were brigading any content addressing vaccination with anti-vax comments. Company researchers, according to the Journal, warned executives that comments on vaccine-related content were flooded with anti-vax propaganda, pseudo-scientific claims, and other falsehoods about the virus and the vaccines.

Global health institutions such as the World Health Organization (WHO) and UNICEF had registered their concern with Facebook, with one internal company memo warning of “anti-vaccine commenters that swarm their Pages,” while another internal report in early 2021 made an initial estimate that up to 41% of comments on vaccine-related posts appeared to risk discouraging people from getting vaccinated (referred to within the company as “barrier to vaccination” content). That’s out of a pool of around 775 million vaccine-related comments seen by users daily.

In the memo, the Journal wrote, the researchers warned that even authoritative sources of information on Facebook were becoming “cesspools of anti-vaccine comments,” adding that it was a “huge problem and we need to fix it.”

Facebook had promised in 2019 to crack down on anti-vax content and summoned WHO reps to meet with tech leaders in February 2020. Zuckerberg personally got in contact with National Institute of Allergy and Infectious Diseases director Dr. Anthony Fauci to discuss funding vaccine trials, offer ad space and user data for government-run vaccination campaigns, and arrange a live Q&A between the two on the site. Facebook had also made adjustments to its content-ranking algorithm that a June 2020 memo claimed reduced health misinformation by 6.7% to 9.9%, the Journal wrote.

But by summer 2020, BS claims about the coronavirus and vaccines were going viral on the site, including the viral “Plandemic” video, a press conference staged by a group of right-wing weirdos calling themselves “America’s Frontline Doctors,” and a handful of anti-vax accounts such as Robert F. Kennedy Jr.’s that advocacy group Avaaz later identified as responsible for a wildly disproportionate share of the offending content. According to the Journal, Facebook was well aware that the phenomenon was being driven by a relatively small but determined and prolific segment of posters and group admins:

As the rollout of the vaccine began early this year, antivaccine activists took advantage of that stance. A later analysis found that a small number of “big whales” were behind many antivaccine posts and groups on the platform. Out of nearly 150,000 posters in Facebook Groups disabled for Covid misinformation, 5% were producing half of all posts, and around 1,400 users were responsible for inviting half the groups’ new members, according to one document.

“We found, like many problems at FB, this is a head-heavy problem with a relatively few number of actors creating a large percentage of the content and growth,” Facebook researchers would write in May, likening the movement to QAnon and efforts to undermine elections.

Zuckerberg waffled, suggesting in a September 2020 interview with Axios that Facebook shouldn’t be in the business of censoring anti-vax posts: “If someone is pointing out a case where a vaccine caused harm or that they’re worried about it, you know, that’s a difficult thing to say from my perspective that you shouldn’t be allowed to express at all.” This was a deeply flawed assessment of the problem, as Facebook was well aware that a small group of bad actors was actively and intentionally pushing the anti-vax content.

Another internal assessment conducted earlier this year by a Facebook employee, the Journal wrote, found that two-thirds of randomly sampled comments “were anti-vax” (though the sample size was just 110 comments). In their analysis, the staffer noted one poll that showed actual anti-vaccine sentiment in the general population was 40% lower.

In February 2021, the month before Zuckerberg announced Facebook would mount a major effort to encourage vaccinations, the company said it would remove a much wider swath of false claims about vaccines. Unfortunately, the system Facebook had in place for identifying those posts often didn’t work. The Journal reported that one integrity worker flagged a post with 53,000 shares and three million views that asserted vaccines are “all experimental & you are in the experiment.” Facebook’s automated moderation tools had ignored it after somehow concluding it was written in Romanian. By late February, researchers had come up with a hasty method to scan for “vaccine hesitant” comments, but according to the Journal their report noted that the anti-vax comment problem was “rampant” and Facebook’s ability to fight it was “bad in English, and basically non-existent elsewhere.”

Other responses included limiting the number of comments a person could make on an authoritative coronavirus information source to 13 per hour in April, and an internal memo dated May obtained by the Journal shows it activated a “break the glass” approach that downranked vaccine-related content that was sensationalist or that indirectly encouraged users to forgo vaccination. Researchers also discussed building new tools such as classifiers (predictive models) to identify anti-vax spam. In August, it finally announced it had taken action against the disproportionate offenders identified in the Avaaz report. Still, it remains trivial to find anti-vax content all over the site.

This isn’t exactly all Facebook’s fault, or at the very least, it’s not the sole reason why the nation is going through anti-vax hell. The anti-vax movement is far larger than any one site and is adept at changing tactics, and one of the U.S.’s two major political parties is riddled to the rotten core with vaccine skepticism and actively promotes it. BuzzFeed reporter Joe Bernstein has also highlighted that the idea that disinformation on Facebook is somehow inherently more persuasive than other forms of media conveniently matches the company’s own claims about its targeted advertising prowess. But by the same token, Facebook’s need to juice engagement inherently conflicts with everything socially beneficial it claims to be doing, and it allowed itself to become one of the prime vehicles for anti-vax propaganda throughout the pandemic.

“We’re focused on outcomes, and the data shows that for people in the U.S. on Facebook, vaccine hesitancy has declined by about 50% since January, and acceptance is high,” Aaron Simpson, a spokesperson for Facebook, told the Journal in a statement. He added the Journal’s documents show Facebook’s “routine process for dealing with difficult challenges. Narrowly characterizing leaked documents doesn’t accurately represent the problem, and it also ignores the work that’s been underway to make comments on posts about COVID-19 and vaccines safer and more reliable.”

Simpson also told the paper the internal research wasn’t final and “over-states the amount of misleading vaccine content.”