The most popular thing on the internet on Wednesday was the first-ever image of a black hole. It was everywhere, and for good reason: This is a historic moment, it reinforces Einstein’s theories, and it’s something that’s difficult to wrap your head around.
Millions of people read articles and watched videos explaining the black hole news. But a viewer who watched one of those videos on YouTube stood a good chance of being pushed by the platform’s recommendation algorithm toward a video detailing an absurd conspiracy theory, a harsh irony given the hard science that has fascinated millions of people this week.
A third-party researcher tracking over 1,000 of the most popular channels on YouTube found that “They Found Something In Outer Space And It’s On Its Way Here” is the video recommended by the most channels out of the 5,546 videos the researcher collected on Wednesday. It’s a gratuitously monetized conspiracy theory video that delivers 13 minutes of apocalyptic babble and the idea that human beings are really just genetically engineered alien slaves. Sounds like something one of the world’s leading artificial intelligence companies probably could have caught!
The YouTube recommendation numbers are according to research by Guillaume Chaslot, a former Google software engineer who operates AlgoTransparency.org. The conspiracy video has been viewed over 130,000 times in three days.
“The AI recommended this conspiracy millions of times from more than 169 different channels, including European Space Agency and Northrop Grumman,” Chaslot tweeted on Wednesday. “Today was an historic high for astronomy, and an historic low for AI.”
Chaslot’s research examines YouTube’s powerful but barely understood recommendation algorithm. By monitoring more than 1,000 popular channels, AlgoTransparency gathers and logs their recommendations in order to better understand how YouTube’s algorithms work.
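The core tally behind a metric like “recommended by the most channels” is simple to sketch. Here is a minimal illustration of that idea in Python; the data, function name, and structure are hypothetical, since AlgoTransparency’s actual pipeline is not public:

```python
from collections import Counter

def most_recommended(channel_recs):
    """Given {channel: [recommended video IDs]}, count how many
    distinct channels recommend each video, mirroring a
    'recommended by the most channels' style metric."""
    counts = Counter()
    for channel, videos in channel_recs.items():
        for video in set(videos):  # count each channel at most once per video
            counts[video] += 1
    return counts.most_common()

# Hypothetical sample data for illustration only
sample = {
    "channel_a": ["vid1", "vid2"],
    "channel_b": ["vid1"],
    "channel_c": ["vid1", "vid3"],
}
print(most_recommended(sample))  # "vid1" is recommended by the most channels
```

The point of ranking by distinct recommending channels, rather than raw recommendation counts, is to capture how widely the algorithm spreads a single video across unrelated parts of the platform.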
It’s important research because YouTube’s algorithms, which drive an unfathomable 700 million-plus hours of video watch time every day but remain a veritable black box into which almost no one outside of Google has visibility, have been responsible for popularizing misinformation and conspiracy theories, along with a broader range of abusive content.
Here is Chaslot (on YouTube, of course) explaining his research and how YouTube works to promote conspiracy theories:
Earlier this year, YouTube said it would change its recommendation algorithm to surface fewer conspiracy and misinformation videos. The announcement attracted a lot of attention, but it crucially lacked details about how the company would actually accomplish this.
“We’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways — like videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11,” the company said in a statement.
This bizarre conspiracy theory seems to fit that definition exactly, yet it was still recommended “millions of times,” according to Chaslot.
YouTube didn’t respond to a request for comment.
“I was very enthusiast about YouTube’s announcement,” Chaslot tweeted on Wednesday, “but let’s be honest: after two months, things barely changed.”
Correction: A previous version of this article incorrectly stated that “They Found Something In Outer Space And It’s On Its Way Here” was the most recommended video on YouTube. In fact, a third-party researcher tracking over 1,000 of the most popular channels on YouTube found that the video was recommended by the most channels out of the 5,546 videos the researcher was monitoring at the time. We regret the error.