
YouTube Is Probably, Maybe Hiring Some People to Make It Less Welcoming to Child Predators


As you may have heard, YouTube is growing its content moderation team to 10,000 staffers. Sounds like progress! Of course, the move comes as a response to the ever-expanding gallery of horrors the site has unwittingly played host to over the years—most recently, various forms of child exploitation and predation—but let’s review the actual announcement from CEO Susan Wojcicki:

We will continue the significant growth of our teams into next year, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018.

Do these words mean much of anything? Not really. People “working to address content” does not necessarily mean moderators or even paid staff, a “goal” is not a promise, and 10,000 is only an increase if we know how many people YouTube currently tasks with reviewing troubling or potentially illegal videos—a figure the company has never been keen to share. A source who spoke to BuzzFeed claims Wojcicki’s projection would amount to a 25 percent increase, but again, it’s just a projection.


The figures cited in the announcement post make it seem like whatever YouTube has currently invested in human moderation, it’s nowhere near 10,000—or even 7,500 people:

Since June, our trust and safety teams have manually reviewed nearly 2 million videos for violent extremist content, helping train our machine-learning technology to identify similar videos in the future.


Two million videos! Again, that sounds like real work is being accomplished, except 2 million videos over six months amounts to about three videos a day for a workforce of even 5,000. Granted, the trust and safety teams might have non-extremist videos to watch, but it’s worrying that the most impressive figure Wojcicki could pull doesn’t really impress. This, of course, is with the help of the vague machine learning program YouTube is now employing to review and flag untoward videos, which Wojcicki claims is “helping our human reviewers remove nearly five times as many videos than they were previously.”
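The arithmetic behind that is easy to check. A quick back-of-envelope sketch, where the 5,000-reviewer headcount is the article's hypothetical and the working-day count is an assumption added here:

```python
# Back-of-envelope check on "nearly 2 million videos since June."
TOTAL_VIDEOS = 2_000_000   # videos manually reviewed, per Wojcicki's post
REVIEWERS = 5_000          # hypothetical workforce size used in the article
CALENDAR_DAYS = 182        # roughly June through early December
WORKING_DAYS = 130         # assumed ~5-day weeks over the same span

per_reviewer = TOTAL_VIDEOS / REVIEWERS          # 400 videos per reviewer
per_calendar_day = per_reviewer / CALENDAR_DAYS  # ~2.2 videos a day
per_working_day = per_reviewer / WORKING_DAYS    # ~3.1 videos a day

print(f"{per_reviewer:.0f} videos per reviewer")
print(f"{per_calendar_day:.1f} per calendar day, {per_working_day:.1f} per working day")
```

Either way you slice it, a few videos per reviewer per day is not a pace that suggests a massive existing moderation force.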

The numbers being bandied about by the internet’s largest video platform (itself a subsidiary of Alphabet, one of the largest companies, period) don’t stand up to scrutiny, and its solutions have consistently been underwhelming. Earlier this year, YouTube was projected to close in on TV—yes, all of TV—in terms of viewership by the end of 2018. At this rate, it will be lucky just to hold onto all of its advertisers.

We’ve asked YouTube to clarify the number of people working as moderators presently and will update when we hear back.