Earlier this year, YouTube said it would change its recommendation algorithm to surface fewer conspiracy and misinformation videos. The announcement drew considerable attention, but it crucially lacked details about how the company would actually accomplish this.


“We’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways — like videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11,” the company said in a statement.

This bizarre conspiracy theory appears to fit that definition exactly, yet the video was still recommended "millions of times," according to Chaslot.


YouTube didn’t respond to a request for comment.

“I was very enthusiast [sic] about YouTube’s announcement,” Chaslot tweeted on Wednesday, “but let’s be honest: after two months, things barely changed.”


Correction: A previous version of this article incorrectly stated that “They Found Something In Outer Space And It’s On Its Way Here” was the most recommended video on YouTube. In fact, a third-party researcher tracking more than 1,000 of the most popular channels on YouTube found that the video was recommended by the most channels out of the 5,546 videos the researcher was monitoring at the time. We regret the error.