Activists at a protest in Olympia, Washington, on February 8, 2019.
Photo: Ted S. Warren (AP)

Facebook is considering ways it could prevent the spread of dangerous anti-vaccination content on the platform.

Last month, the World Health Organization included “vaccine hesitancy” on its annual list of health threats—alongside Ebola, HIV, and climate change. Less than a month later, anti-vaccination activists rallied at the Washington state capitol as a measles outbreak spread in the state. Public health officials determined the outbreak involved a strain that is preventable with a vaccine.

On Thursday, Representative Adam Schiff, a California Democrat, sent letters to Facebook CEO Mark Zuckerberg and Google CEO Sundar Pichai, expressing concern about declining vaccination rates in the U.S. and the recent measles outbreak in Washington. His letters pressed them to say what their platforms are doing, and can do, to stop the spread of anti-vaccination content.

“There is strong evidence to suggest that at least part of the source of this trend is the degree to which medically inaccurate information about vaccines surface on the websites where many Americans get their information,” Schiff’s letter reads. “The algorithms which power these services are not designed to distinguish quality information from misinformation or misleading information, and the consequences of that are particularly troubling for public health issues.”

Bloomberg reports that Facebook responded to the letter. According to a statement Facebook sent Bloomberg, the company is “exploring additional measures to best combat the problem,” which could include “reducing or removing this type of content from recommendations, including Groups You Should Join, and demoting it in search results, while also ensuring that higher quality and more authoritative information is available.”

Reached for comment, a Facebook spokesperson confirmed the company is exploring ways to stop the spread of bad medical information. “We’ve taken steps to reduce the distribution of health-related misinformation on Facebook, but we know we have more to do,” the spokesperson told Gizmodo in a statement. “We’re currently working on additional changes that we’ll be announcing soon.”

Google declined to comment on Schiff’s letter, but a spokesperson pointed to its ongoing efforts to reduce the recommendation of content that includes medical misinformation.

Late last month, Google-owned YouTube announced it would stop recommending “borderline content and content that could misinform users in harmful ways,” but declined to tell Gizmodo exactly what that meant.

As Bloomberg points out and Gizmodo confirmed, a YouTube search for “vaccine” shows several videos that include anti-vaccine misinformation in the top results.

Perhaps Schiff will continue to press both companies to ensure their platforms aren’t spreading information that harms society. Or, more likely, everything will stay a giant mess.

[Bloomberg]