YouTube is finally taking bigger steps to combat inappropriate videos targeted toward children.
In October, Mashable first reported that weird, creepy, and downright inappropriate videos were slipping through the filters on YouTube Kids, an app geared toward children. Because virtually anyone with a YouTube account can create content, such videos could end up being seen by millions of children. Those findings gained renewed attention this week after the New York Times reported on the story.
Back in August, the company rolled out a new policy making videos that feature the inappropriate use of family-friendly characters, such as Elsa and Spider-Man, ineligible for advertising dollars. Now YouTube is taking an additional measure: flagged content of this type will be age-restricted in the main YouTube app, which automatically blocks it from slipping into the kids app, as first reported by The Verge.
"Earlier this year, we updated our policies to make content featuring inappropriate use of family entertainment characters ineligible for monetization," Juniper Downs, YouTube director of policy, said in a statement from the company. "We’re in the process of implementing a new policy that age restricts this content in the YouTube main app when flagged. Age-restricted content is automatically not allowed in YouTube Kids. The YouTube team is made up of parents who are committed to improving our apps and getting this right."
That means that if a kid-friendly character like Elsa from Frozen is shown doing something inappropriate, like shooting a machine gun, YouTube hopes users will flag the video; flagging age-restricts it, which in turn blocks it from reaching the kids app. Content from the main YouTube app may take several days to filter into the kids app, and content flagged within the kids app is handled by its own reviewers, who monitor flagged content 24/7.
YouTube stressed to Mashable that this is an added layer of protection and not the only process that can keep a video from migrating into the kids app from YouTube main. The company says it uses machine learning and algorithms to select content appropriate for children. The system is constantly evolving to block inappropriate content.
YouTube will be using its team of moderators to help sift through content and take action on any videos that may be inappropriate. This new practice should be rolling out in the coming weeks.
YouTube says it has been working on the policy for a while, and that its practices were not revised because of scrutiny in the media. However, YouTube did not mention any new policy to Mashable during the reporting of our original piece in October.
While the policy is a welcome change for parents worried about the content their kids may see on a user-generated platform such as YouTube, it appears that the new policy will still rely heavily on algorithms, and on someone spotting the problem content first. So it's not necessarily a sure fix: some of these bizarre clips can run 30 minutes or longer, and they often start out completely normal, only to take sudden, dark turns.
And, as we all know, algorithms are far from perfect.
UPDATE: Nov. 9, 2017, 4:45 p.m. PST Added comments from YouTube.