YouTube's Nightmare Algorithm Exploited Children by Recommending Home Videos of Kids to Pedophiles

YouTube’s recommendation algorithm has been encouraging pedophiles to watch home videos that families upload showing their children playing.

A New York Times report details how YouTube has been exploiting minors through this automated system. According to the Times, researchers at Harvard’s Berkman Klein Center for Internet & Society were studying YouTube’s influence in Brazil when they noticed the alarming effect. The team’s experiment involved a server that followed YouTube recommendations thousands of times, building a map of how YouTube guides users from video to video.
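The Times doesn’t publish the researchers’ code, but the setup it describes, a server repeatedly following recommendations and mapping where they lead, amounts to a crawl of a directed graph. Below is a minimal sketch of that idea in Python. The fetch_recommendations() helper and its canned data are hypothetical stand-ins, not YouTube’s actual API; a real crawler would scrape or query the live site.

```python
import collections

# Hypothetical stand-in for the study's data source. A real crawler
# would pull the "Up next" recommendations for each video from the
# live site; canned data here keeps the sketch runnable as-is.
FAKE_RECOMMENDATIONS = {
    "seed": ["a", "b"],
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}

def fetch_recommendations(video_id: str) -> list[str]:
    return FAKE_RECOMMENDATIONS.get(video_id, [])

def crawl(seed: str, max_videos: int = 1000) -> dict[str, list[str]]:
    """Breadth-first walk of the recommendation graph.

    Returns an adjacency map (video -> videos recommended next);
    charting that map is what exposes the "paths" the Times describes.
    """
    graph: dict[str, list[str]] = {}
    queue = collections.deque([seed])
    while queue and len(graph) < max_videos:
        video = queue.popleft()
        if video in graph:
            continue  # already visited
        graph[video] = fetch_recommendations(video)
        queue.extend(graph[video])
    return graph

print(crawl("seed"))
```

Following thousands of such hops from many different seed videos is what would let a team see which content the system funnels viewers toward.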

When the experiment followed recommendation paths that began with sexually themed videos, the researchers noticed the system served up videos that were “more bizarre or extreme, and placed greater emphasis on youth,” according to the Times. “Videos of women discussing sex, for example, sometimes led to videos of women in underwear or breast-feeding, sometimes mentioning their age: 19, 18, even 16.”

Deeper into these recommendation chains, YouTube reportedly began recommending videos of adults wearing children’s clothing and soliciting payments from “sugar daddies.”

After such softcore fetish recommendations, YouTube then showed actual videos of “partially clothed children,” many of them from Eastern Europe and Latin America, according to the Times.

These videos often seemed to be home videos uploaded by parents, who possibly wanted to easily share footage of their children with friends and family. But, as the Times suggests, YouTube’s algorithm might have learned from people who look at children in sexually exploitative ways and steered those viewers to the family videos.

The Times interviewed the mother of a 10-year-old who had uploaded a video of herself and a friend playing in a pool. Within several days, the video had been viewed 400,000 times.

The mother, Christine C. (last name withheld for privacy), told the Times that when her daughter boasted about the view count, she “got scared by the number of views.”

According to the Times, this incident unfolded after YouTube had to publicly confront its pedophile issues earlier this year. In February, YouTube disabled comments on many videos of minors following reports that pedophiles were commenting on videos of children as a signal to other predators.

While studies have shown that YouTube’s recommendation system can create a “rabbit hole effect,” through which the algorithm recommends increasingly extreme content, the company has skirted the topic or denied that the effect is real. In April, YouTube responded to a Bloomberg investigation by claiming that “generally extreme content does not perform well on the platform.” The company doubled down the following month, when chief product officer Neal Mohan told the Times, “It is not the case that ‘extreme’ content drives a higher version of engagement or watch time than content of other types.”

YouTube did not answer Gizmodo’s request for comment on whether it maintains that the recommendation system doesn’t create a rabbit hole effect. Instead, the company referred Gizmodo to a blog post published today about the company’s “efforts to protect minors” following the Times report on videos that “do not violate our policies and are innocently posted.”

The announcement highlighted YouTube’s recent steps to disable comments on videos featuring minors; restrict minors from livestreaming unless an adult is clearly present; and reduce recommendations of videos that show “minors in risky situations.”

According to YouTube, the company recently improved its machine learning to “better identify videos that may put minors at risk.”

According to the Times, researchers have said that blocking videos of children from the recommendation system entirely would be the best way to protect them. But YouTube told the Times it has no plans to do that anytime soon, since the automated system is the platform’s largest driver of traffic and the move would harm creators.

DISCUSSION

What can be learned from this is: don’t use YouTube to share videos of your children with other family members. If you do, you have the option to unlist the video (the link still works for anyone you send it to, but it won’t show up in search or recommendations) or to make it private (only accounts you explicitly invite can watch it).

There’s nothing inherently wrong with the algorithm itself: it’s designed to learn from users across the entire platform and, much like music streaming apps, to push you toward new content in order to break repetition.
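To make that concrete, here’s a toy version of the “learns from users” idea: an item-to-item co-occurrence recommender. YouTube’s real system is a far more elaborate machine-learned model, and the session data below is invented, so treat this purely as an illustration of the mechanism.

```python
from collections import defaultdict
from itertools import combinations

# Invented watch histories; each inner list is one user's viewing session.
SESSIONS = [
    ["video_a", "video_b", "video_c"],
    ["video_a", "video_c"],
    ["video_d", "video_c"],
]

def build_cooccurrence(sessions):
    """Count how often each pair of videos is watched by the same user."""
    counts = defaultdict(lambda: defaultdict(int))
    for session in sessions:
        for a, b in combinations(set(session), 2):
            counts[a][b] += 1
            counts[b][a] += 1
    return counts

def recommend(video, counts, k=3):
    """Return the k videos most often co-watched with `video`."""
    neighbors = counts[video]
    return sorted(neighbors, key=neighbors.get, reverse=True)[:k]

counts = build_cooccurrence(SESSIONS)
print(recommend("video_a", counts))  # videos that video_a's viewers also watched
```

The flip side, and the problem the Times describes, is that a system like this has no notion of why a cohort co-watches certain videos: if one group of viewers chains together otherwise innocent home videos, those links get learned and served to everyone.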

Another thing to remember is that algorithms still aren’t able to comprehend every single video online. Aside from automated copyright takedowns, which rely on a “fingerprint,” and possibly nudity detectors, all the work has to be done by humans, which is also nearly impossible considering the video-time ratio is about 18,000:1, meaning roughly 18,000 minutes of new footage are uploaded for every minute of real time. In other words, you would need at least 18,000 moderators watching new videos back-to-back at any given point in time just to keep up with the expansion of content.
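For the curious, the staffing math behind that figure is straightforward. This is a back-of-the-envelope sketch; the 300-hours-per-minute upload rate is inferred from the commenter’s 18,000:1 ratio, not an official YouTube number.

```python
# Back-of-the-envelope moderation math implied by the 18,000:1 ratio.
UPLOAD_RATE_HOURS_PER_MINUTE = 300  # assumed hours of video uploaded per real-time minute

minutes_uploaded_per_minute = UPLOAD_RATE_HOURS_PER_MINUTE * 60
print(minutes_uploaded_per_minute)  # 18000, i.e. the 18,000:1 ratio

# A moderator can review one minute of footage per minute, so keeping
# pace with uploads in real time takes one moderator per ratio unit:
print(f"{minutes_uploaded_per_minute:,} moderators watching nonstop")
```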

Almost a fifth of the world’s population is on YouTube, consuming around 5 billion videos daily.