YouTube Changes Policies for Kids Videos
posted Saturday Nov 25, 2017 by Scott Ertz
YouTube has once again come under scrutiny for its automation processes. Just a few months ago, the company's advertising automation caused a massive advertiser boycott, including large companies such as AT&T, Enterprise Rent-A-Car, Johnson & Johnson and Verizon. Now the company is in trouble with parents, who have noticed some incredibly inappropriate content appearing on the YouTube Kids platform.
The company's algorithms began allowing adult-oriented content to appear on the family-friendly Kids platform. Examples include Mickey Mouse lying in a pool of blood and a Claymation short of Spider-Man urinating on Frozen's Elsa. There has also been an influx of sexual comments on videos of children, particularly on newer "challenge" videos, such as the yoga challenge or the ice pants challenge. BuzzFeed recently revealed a number of videos on the platform showing children being abused or placed in vulnerable situations.
Because of these issues, YouTube has put new policies in place to try to eliminate, or at least slow, the problem. The highlights of the new policies are:
- Tougher application of community guidelines and faster enforcement through technology
- Removing ads from inappropriate videos targeting families
- Blocking inappropriate comments on videos featuring minors
- Providing guidance for creators who make family-friendly content
- Engaging and learning from experts
On the surface, these measures sound reasonable, but there are a few issues. Chief among them, the definitions of "inappropriate" and "targeting families" are incredibly vague. The company's only example is a 5-month-old guidance on using family-oriented characters in violent or sexual situations. However, the terminology could also be applied to content, such as videogame videos, in which a player swears. For example, if you have ever watched a Super Mario Maker speed-runner, you know that they can get verbally abusive toward the game. Even though those videos are not aimed at children or families, this definition could demonetize them for no real reason.
If YouTube really wants to deal with the content on YouTube Kids, it should implement an opt-in on video uploads. Then, the algorithms could scan only that content to ensure it meets the regulations, rather than assuming that every video could be a potential match for the service. For example, we are aware that not all of our shows would be entirely appropriate for kids, so we would not opt-in on those episodes. This would, of course, limit the number of videos in the service, so conversely, YouTube could also allow an opt-out for videos that might otherwise get caught in the filter.
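To make the suggestion concrete, here is a minimal sketch of how such an opt-in/opt-out gate could work. All of the names here (Upload, passes_kids_scan, eligible_for_kids) and the keyword-based scan are hypothetical stand-ins, not anything from YouTube's actual systems:

```python
# Hypothetical sketch of the opt-in/opt-out gate described above.
from dataclasses import dataclass

# Stand-in terms the automated scan might flag; a real scanner would be
# far more sophisticated than a keyword match.
FLAGGED_TERMS = {"blood", "violence", "swearing"}

@dataclass
class Upload:
    title: str
    description: str
    kids_opt_in: bool = False   # creator marks the video as family-friendly
    kids_opt_out: bool = False  # creator explicitly excludes it from Kids

def passes_kids_scan(upload: Upload) -> bool:
    """Crude placeholder scan: reject if any flagged term appears."""
    text = f"{upload.title} {upload.description}".lower()
    return not any(term in text for term in FLAGGED_TERMS)

def eligible_for_kids(upload: Upload) -> bool:
    if upload.kids_opt_out:          # an explicit opt-out always wins
        return False
    if not upload.kids_opt_in:       # never assume a video is meant for kids
        return False
    return passes_kids_scan(upload)  # scan only the opted-in videos

# A speedrun video that was never opted in stays out of the Kids catalog,
# so its salty commentary never needs to be demonetized or filtered.
speedrun = Upload(title="Super Mario Maker speedrun", description="lots of swearing")
print(eligible_for_kids(speedrun))  # False
```

The point of the sketch is the ordering of the checks: the creator's intent (opt-in or opt-out) is consulted before any automated judgment about what counts as "inappropriate" or "targeting families."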