Published: Fri, November 10, 2017
Research | By Raquel Erickson

YouTube Vows To Crack Down On Inappropriate Videos Aimed At Children

It takes a few days for content to transition from the main YouTube app to YouTube Kids, and the company is hoping that the work of human moderators, its Contributors program, and the new policy will prevent any more of this content from getting into its safe place for children.

YouTube also made sure to clarify that the number of inappropriate videos within the YouTube Kids app is very small, and that most of those videos draw the majority of their viewership on the all-ages version of YouTube.

Earlier this week, a report in The New York Times and a blog post on Medium drew a lot of attention to a world of odd and sometimes disturbing videos on YouTube aimed at young children. Numerous videos have been criticized for containing disturbing, sometimes violent or inappropriate content that is targeted at the platform's youngest viewers. The move comes after YouTube this summer said it would pull advertising from content that portrays family-entertainment characters (such as, say, Mickey Mouse or Ronald McDonald) engaging in "violent, sexual, vile, or otherwise inappropriate behavior".

YouTube is finally taking bigger steps to combat inappropriate videos targeted at children. A YouTube representative clarified that the change has been in the works for a while and is not a response to recent media coverage of the unusual genre of videos. If a review finds that a video violates the new policy, it will be age-restricted, automatically blocking it from showing up in the Kids app: "Age-restricted content is automatically not allowed in YouTube Kids".

"We're in the process of implementing a new policy that age restricts this content in the YouTube main app when flagged", said Juniper Downs, YouTube's director of policy.

Age-restricted videos can't be seen by users who aren't logged in, or by those who have entered their age as under 18, on either the site or the app.

YouTube said it has been aware of these issues for the past year, and has "thousands" of staff working around the clock to eliminate unsuitable content from YouTube Kids. It's also removing ads from this content. Examples include Nickelodeon characters dying in a fiery vehicle crash and pole-dancing in a strip club, and popular British kids' character Peppa Pig drinking bleach or enduring a horrific visit to the dentist. The new policy could put a squeeze on the booming business of crafting odd kids' content, but YouTube is acknowledging that YouTube Kids requires even more moderation.

In October, Mashable first reported that weird, creepy, and downright inappropriate videos were slipping through filters on YouTube Kids, an app geared toward children that allows virtually anyone with a YouTube account to create content that could be seen by millions of children.
