
YouTube’s Plan to Stop Recommending Conspiracy Theory Videos Is Actually Working

Photo: Getty Images

The YouTube algorithm is responsible for many a video-viewing rabbit hole—one that critics have pointed to as an effective digital megaphone for spreading wonky conspiracy theories. In January 2019, however, YouTube announced it would begin cracking down on “borderline content” in its recommendations. Now, a new study suggests that YouTube’s efforts have actually borne some fruit: according to the study, conspiracy theories are 40 percent less likely to pop up in your recommendations than they were before the crackdown.


For the study, Berkeley researchers examined 8 million recommendations over a 15-month period. To judge how effective YouTube’s efforts were, they trained an algorithm to determine whether a video contained conspiracy theories based on its description, comments, and transcript. The results were mixed.
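To give a rough sense of the approach, the sketch below trains a toy naive Bayes text classifier to flag “conspiratorial” text. Everything here—the training snippets, labels, and function names—is a hypothetical illustration, not the researchers’ actual model, features, or data:

```python
# Toy naive Bayes classifier: scores text as "conspiracy" vs. "benign".
# Hypothetical stand-in for classifying videos by description/transcript text.
import math
from collections import Counter

def train_naive_bayes(docs):
    """docs: list of (text, label) pairs; returns a simple model dict."""
    word_counts = {}            # label -> Counter of word occurrences
    class_counts = Counter()    # label -> number of documents
    vocab = set()
    for text, label in docs:
        words = text.lower().split()
        word_counts.setdefault(label, Counter()).update(words)
        class_counts[label] += 1
        vocab.update(words)
    model = {"vocab": vocab, "priors": {}, "logprob": {}}
    total_docs = sum(class_counts.values())
    for label, counter in word_counts.items():
        model["priors"][label] = math.log(class_counts[label] / total_docs)
        total = sum(counter.values()) + len(vocab)  # Laplace smoothing
        model["logprob"][label] = {
            w: math.log((counter[w] + 1) / total) for w in vocab
        }
    return model

def classify(model, text):
    """Return the label with the highest posterior log-probability."""
    words = [w for w in text.lower().split() if w in model["vocab"]]
    scores = {
        label: model["priors"][label]
        + sum(model["logprob"][label][w] for w in words)
        for label in model["priors"]
    }
    return max(scores, key=scores.get)

# Hypothetical training snippets standing in for video metadata.
training = [
    ("the government staged the attacks cover up truth", "conspiracy"),
    ("flat earth proof nasa lies hidden truth", "conspiracy"),
    ("review of the new phone camera battery life", "benign"),
    ("how to bake sourdough bread at home", "benign"),
]
model = train_naive_bayes(training)
print(classify(model, "nasa hidden cover up"))  # prints: conspiracy
```

A real system would of course need far more data and a stronger model, but the core idea—predicting a label from a video’s text metadata—is the same.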


On the plus side, researchers found that YouTube had been successful at axing videos peddling theories that the U.S. government perpetrated the 9/11 terrorist attacks and that the Earth is flat—two topics that YouTube identified as targets when it initially announced its plans to tackle conspiracy theories. The study also found that over the period from June to December 2019, the share of conspiracy theory recommendations dropped first by 50 percent, and then by as much as 70 percent.

However, the researchers also found that those numbers, while consistent with their own data, didn’t necessarily account for the popularity of the source video. When adjusting for that, they found conspiratorial recommendations have risen from their lowest point in May 2019 and are now only 40 percent less common than before. Also, while YouTube was successful at curbing some conspiracy theories, others are still quite rampant—including those about aliens building pyramids and, more concerningly, climate change denial. The researchers told the New York Times that this indicates YouTube has made a choice as to what types of misinformation it will shut down—ones that get a lot of negative media attention, like Sandy Hook conspiracies—versus the ones it will allow.

Another problem here is that while the study does show a marked decrease in conspiratorial recommendations, it doesn’t shed much light on how that impacts radicalization. Furthermore, the study was limited in that it only examined recommendations served without logging into a YouTube account—which doesn’t reflect how most people browse the platform. Without cooperation from YouTube, it’s hard to accurately replicate personalized recommendations at scale, meaning any study claiming to definitively judge whether YouTube has an impact on radicalizing people is inherently flawed.

YouTube has nearly 2 billion monthly active users, an increasing number of whom use the platform as their primary news source. Measures like curbing recommended conspiracy videos and giving users more direct control over what the algorithm shows them are steps in the right direction, but there’s still work left to be done.


