The anti-vaccination movement has become so widespread that the World Health Organization added it to its list of the top threats to global health in 2019. Social media companies are drawing criticism for not doing more to prevent the spread of misinformation about vaccines, and Pinterest has come up with a novel solution: just don’t show vaccine-related search results.
While companies like Facebook and YouTube try their hardest to be as hands-off as legally possible when it comes to controversial content, lawmakers have started to notice that anti-vaxxers represent a public health threat, and social media platforms may have a responsibility to contain the spread of bunk science. The image-sharing social network Pinterest is probably not the first company you think of when it comes to the front lines of the information wars, but it’s apparently been proactive about vaccine misinformation.
On Tuesday, the Wall Street Journal reported Pinterest’s first public comments on a program it quietly started last year to identify what it calls “polluted” content and suppress related search results. A Pinterest spokesperson told the Journal that after conducting a review of vaccine-related content on its platform, it found that the majority of posts were warning users that vaccines are dangerous despite the scientific consensus that they’re not only safe but necessary. In the absence of a better solution, Pinterest decided to just stop returning results for the vaccination category altogether.
“It’s better not to serve those results than to lead people down what is like a recommendation rabbit hole,” Ifeoma Ozoma, Pinterest’s manager of public policy and social impact, told the paper.
In our tests, we found that searches for “vaccination” and “anti-vaccination” on Pinterest returned no results. But it was easy for us to find anti-vaxxer content when we tried other keywords: a search for “vax,” for example, surfaced anti-vaccination material as the very first result.
Thanks to vaccines, measles was officially considered to be eliminated in the U.S. by the year 2000, but we’ve seen recent outbreaks of the infectious disease across the country in places like Brooklyn, Washington, and Oregon. The general consensus is that anti-vaccination conspiracy theories spread by people like Andrew Wakefield and Jenny McCarthy are causing preventable illnesses to come back with deadly consequences. And a recent investigation by The Guardian found that Facebook and YouTube’s algorithms are largely pushing users toward false information when they look for content about vaccines. Pinterest says that its approach is only temporary while it develops better moderation strategies.
Pinterest told the Journal that it has designated other categories as “polluted” as well, including “certain cancer-related searches, content, and accounts related to dubious cancer therapies.” When we asked Pinterest what other topics it has decided to block from its search, a spokesperson referred us to the company’s community guidelines. “We don’t allow advice when it has immediate and detrimental effects on a Pinner’s health or on public safety. This includes promotion of false cures for terminal or chronic illnesses and anti-vaccination advice,” the spokesperson said.
Practically any solution for preventing the spread of false information raises other problems. For instance, if YouTube banned anti-vaxxer videos, its automated systems would probably end up blocking unrelated and perfectly innocuous videos. In Pinterest’s case, the strategy seems aimed at giving the company plausible deniability: at the very least, it isn’t actively pushing users in the wrong direction. YouTube doesn’t have to ban this kind of content, but it would certainly help if its algorithm stopped recommending it.
But there’s also a larger, more existential issue at play. As web users are increasingly funneled into doing the bulk of their online browsing on a handful of large platforms, do we want these private companies to simply block controversial topics? If Facebook decided to just say the hell with it and stop allowing all vaccine-related content, that would also prevent accurate information from spreading. In 2017, Pew Research found that about 67 percent of adults in the U.S. get at least some of their news from social media, and the number of young people looking to online video as a news source is rapidly growing.
At the same time, you have to ask yourself if you really want someone who is maybe anti-vax-curious to turn to the search bar on Pinterest to educate themselves. And, honestly, is anyone doing that? Pinterest has mostly operated as a place for wedding ceremony mood boards and vacation destination aspirations. It’s a poor destination to hunt for scientifically sound medical advice, and maybe YouTube shouldn’t be, either. Pinterest’s solution to the vaccination problem doesn’t seem very effective, but at least it understands its network doesn’t have to be a go-to destination for crackpots. There’s a whole internet out there just waiting to give those people a place to call their own.