YouTube has a genuine crisis on its hands after it was revealed that comments on its site were being used to organize a child exploitation network. Major advertisers are dropping like flies. In response, YouTube on Thursday announced that it is disabling comments on almost all videos that feature minors.
The controversy began earlier this month when YouTuber Matt Watson shared a video that exposed the use of YouTube comments by what he called a “soft-core pedophilia ring.” He showed that sexual predators were able to use comments to identify non-sexual video content that featured minors and would appeal to pedophiles. YouTube’s recommendation algorithm helped the predators find each other, Watson claimed, and they would trade contact information as well as links to actual child pornography. YouTube previously announced that it had banned 400 users’ accounts and deleted millions of comments in response to the news. But today it’s going further with a sweeping new policy.
In a blog post, YouTube said that it’s launching several initiatives to help protect young people. It said that it has already disabled comments on “tens of millions of videos” and over the next few months it will “suspend comments on videos featuring young minors and videos featuring older minors that could be at risk of attracting predatory behavior.”
It’s unclear how YouTube will decide if an older minor should be considered “at risk of attracting predatory behavior.” We’ve reached out to YouTube to ask if it can provide any further details about that process and will update this post when we receive a reply. For sensitive policies like this one, tech firms often withhold information out of fear that people will attempt to game the system.
It appears that YouTube hopes this doesn’t have to be a permanent policy. It said it will allow “a small number of creators” to enable comments on “these types of videos.” Those creators will be required to moderate their comments and “demonstrate a low risk of predatory behavior.” The goal is to expand the number of creators who are granted an exception to the comment ban over time as YouTube’s algorithms get better at catching offending comments.
Speaking of algorithms, YouTube’s got a new one. It said that it “accelerated” the launch of a new comment classifier that is twice as effective at identifying and removing “predatory comments.” You might ask why it didn’t accelerate this launch earlier, if accelerating it was an option all along. The obvious answer is that Disney, AT&T, and the makers of Fortnite weren’t loudly pulling their ads from the platform before. The other obvious answer is that the accelerated product probably isn’t ready for primetime.
YouTube also appeared to vaguely address concerns over the Momo Challenge, a recurring viral hoax that has spread online this week with warnings to parents that children’s videos on YouTube contain a creepy figure who encourages kids to kill themselves and their parents. The hoax reached peak virality on Wednesday when Kim Kardashian warned her followers about it and pleaded with YouTube to do something. In today’s blog post, YouTube reiterated that “videos encouraging harmful and dangerous challenges targeting any audience are also clearly against our policies.”
At a time when you can set your watch to Facebook’s daily privacy scandal and tech companies seem to be stuck in an endless loop of apologizing, it’s easy to miss that substantial changes are happening all the time. We need more companies to take big action the way Pinterest did when it simply killed vaccine-related search results after realizing it had an anti-vaxxer problem. YouTube’s comment sections are widely understood to be the worst place on the internet that isn’t 4chan. Disabling a specific set of comments is a great start and hopefully the beginning of a movement. We may live to see a day when all comments vanish and people are forced to keep their thoughts to themselves. Until then, we’ll take what we can get.