YouTube is attempting to bridge the gap between its dedicated Kids app and regular YouTube for parents with tweens and teens.
YouTube announced Wednesday that it will launch a new “supervised” experience in beta that will introduce additional features and settings for regulating the types of content that older children can access on the platform. Content will be restricted based on the selection of one of three categories: “Explore” will introduce videos suitable for kids 9 and older, “Explore More” will bump them into a category with videos for kids 13 and older, and “Most of YouTube” will show them nearly everything except age-restricted videos and topics that might be sensitive for minors.
YouTube says it will use a blend of machine learning, human review, and user input to vet content—a system that has worked spectacularly for YouTube in the past. Seemingly trying to get out ahead of whatever issues arise from its busted moderation system, the announcement blog post states that YouTube knows “that our systems will make mistakes and will continue to evolve over time.”
Clearly, any tool that attempts to filter inappropriate content on YouTube is welcome and necessary. But guardians cannot rely on YouTube alone to take the wheel and guide the experience of their kids. We’ve seen how well that’s worked in the past over on YouTube’s dedicated Kids app—which is to say, not great.
Part of the problem is that YouTube’s platform, like those of other social media giants, is simply too big to moderate adequately. One wrong turn can send your kid down a rabbit hole of conspiracies, whether they were looking for them or not. Plus, if we’re being honest, teens and tweens are probably going to find a way to watch whatever content they want regardless of how kid-proofed the home computer is.
All that said, creating a middle ground between YouTube Kids and the chaos of normal YouTube is something. Just don’t bank on a perfect moderation system. Even YouTube says so.