We may earn a commission from links on this page

YouTube Cracks Down on Cancer Quacks

YouTube says it’s 'not a platform for distributing information that could harm people,' with a new crackdown on lies about cancer cures.


YouTube announced it’s taking on the miracle cures, bogus disease prevention tips, and other health myths that litter the video-sharing site with a new assault on medical misinformation. The effort starts with a mass take-down of videos about “harmful or ineffective” cancer cures, according to a Tuesday press release from the company. It’s a move that’s sure to mire YouTube and its parent company Google deeper in the culture war surrounding Big Tech’s work to fight lies about vaccines and COVID-19.

“In the years since we began our efforts to make YouTube a destination for high-quality health content, we’ve learned critical lessons about developing Community Guidelines in line with local and global health authority guidance on topics that pose serious real-world risks,” YouTube said in a blog post. “Our goal is to ensure that when it comes to areas of well-studied scientific consensus, YouTube is not a platform for distributing information that could harm people.”


The policy starts with a crackdown on content that promotes unproven or harmful cancer treatments or discourages people from seeking professional medical attention. YouTube used videos that say “garlic cures cancer” or “take vitamin C instead of radiation therapy” as examples of prohibited content. The company says that enforcement and video take-downs start today and will ramp up in the coming weeks.

Bans on medical misinformation aren’t new on YouTube, but the company has spent the last few years trying to strike a delicate balance between promoting public health and fending off accusations of censorship from believers in conspiracy theories. YouTube’s new framework centers on material that contradicts “health authority guidance” in three areas: the prevention and treatment of specific medical conditions, and denial that those conditions exist. Videos that claim people haven’t died of COVID-19, for example, will be taken down, the company says.


The company has faced growing criticism in recent years over the proliferation of medical misinformation. Some blame YouTube’s “rabbit hole” effect on growing vaccine hesitancy, coronavirus denialism and other problems. For a long time, however, social media platforms were reluctant to take content down, likely over fears of a backlash.

But during the pandemic, YouTube joined other platforms, including Facebook, Instagram, Twitter, and Google Search, in deploying a variety of tactics against COVID-19 and vaccine misinformation: labeling it, taking it down altogether, and pointing users to dedicated health information hubs that promote medical consensus.

These efforts were celebrated by public health experts and were often carried out in partnership with groups such as the World Health Organization and the Centers for Disease Control and Prevention. At the same time, they sparked criticism from zealots across the political spectrum. Widespread belief in health misinformation is one of the rare issues that unites factions across party lines.

Robert F. Kennedy Jr., fringe presidential candidate and vaccine conspiracy influencer, is currently suing Google and YouTube for “silencing” his political speech. Kennedy, once labeled a member of the internet’s “disinformation dozen” and “America’s worst anti-vaxxer,” has been the subject of numerous content moderation take-downs on social media in response to his anti-vaccine crusade. The lawsuit argues Kennedy’s videos deserve special treatment because he’s running for president.


Redoubling efforts to fight misinformation will ramp up such criticism, at least in the short term. Apparently, that’s a risk YouTube and Google are willing to take.