Companies are pulling their advertising campaigns from YouTube amid reports that a network of pedophiles is openly operating in the comments sections of videos of young children, Bloomberg reported Wednesday. Disney and Nestlé are among those who have reportedly yanked spending after a YouTube video brought renewed attention to the ongoing problem.
A video shared by YouTuber Matt Watson on Sunday outlined what he described as a “soft-core pedophile ring” enabled by commenters on videos of children, particularly of young girls. These videos, which are monetized by the company, are flooded with comments by apparent pedophiles who trade contact information and links to child pornography. They also timestamp what Watson said are “points in the video where little girls are in compromising positions, sexually implicit positions.”
Watson called YouTube’s algorithm for surfacing these videos a “wormhole” of exploitative content. Once a YouTube user clicks through several of these videos, their suggested content column becomes flooded primarily with videos of children.
Wired was able to replicate Watson’s claims and said the videos it encountered often included little girls playing, swimming, or eating popsicles, and in some cases more graphic content. Once some of these videos are viewed, Wired said YouTube’s algorithm surfaces videos that appear to be popular with other pedophiles. In many cases, the site reported, videos of young children to which pre-roll ads are attached have racked up hundreds of thousands and even millions of views.
Companies are now opting to distance themselves from the controversy by either contacting YouTube about the problem or pulling the plug on ad campaigns entirely.
“I can confirm that all Nestlé companies in the US have paused advertising on YouTube,” a Nestlé spokesperson told Gizmodo in a statement by email. Bloomberg cited sources who claimed Disney has followed suit, though the company did not immediately return a request for comment.
A spokesperson for Epic Games, the developer behind Fortnite, told Wired that by way of its ad agency, the company had “reached out to YouTube to determine actions they’ll take to eliminate this type of content from their service.” Grammarly told Wired it also contacted YouTube about the issue.
Disturbing and predatory comments on YouTube videos of children resulted in a similar response from advertisers in 2017. The company said at the time that it was working to fix the issue, but it appears to remain a pervasive problem on the site.
A YouTube spokesperson said that the company is working to tackle the issue and has disabled comments on millions of videos of children. The company has also removed more than 400 accounts belonging to commenters on these videos, as well as some videos it believed may have put young subjects at risk. The spokesperson added that YouTube is reporting any illegal comments to the National Center for Missing and Exploited Children.
“Any content—including comments—that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube,” a YouTube spokesperson said in a statement by email. “We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling comments on tens of millions of videos that include minors. There’s more to be done, and we continue to work to improve and catch abuse more quickly.”