YouTube Will Remove Mature And Violent Videos Targeting Kids

Chitanis - Sep 04, 2019


YouTube announced that it will remove all mature or violent content that targets kids, whether in the video’s title, description, or tags.

A couple of days ago, YouTube announced some changes to its policies. As part of an effort to make the platform safer for kids, the company will now remove all mature or violent content targeted toward young children and minors, whether in the video’s title, description, or tags. YouTube said it would no longer allow this kind of content on its platform.

YouTube will remove all mature or violent content targeted toward young children and minors, whether in the video’s title, description, or tags

YouTube announced the policy change quietly, in a post on the YouTube Help community forum. The post has received little attention: it has only about 20 replies, and few outlets have reported on the change.

Before this change, mature or violent videos were merely age-restricted; now the company is taking a further step to clean up its platform and protect children and the family experience.

According to YouTube, violating videos will be removed, but channels won’t receive strikes for the first 30 days. The company wants to give creators time to get used to the new rules. In addition, videos uploaded before the announcement won’t receive strikes, though YouTube can still remove them.

YouTube gave some examples of content that will be removed, such as videos of nursery rhymes featuring mature themes like death, sex, and violence. Videos with titles, descriptions, or tags like “for children,” “for kids,” “family fun”, … that feature family-friendly cartoons but contain inappropriate acts, such as injecting needles, will be removed as well.

In addition, YouTube will age-restrict more content to protect young viewers. Videos like adult cartoons that could be mistaken for kid-friendly content won’t be removed, but they will be age-restricted. The company advises creators who focus on family entertainment to check the guide for creating content for YouTube Kids. Creators also need to review their videos’ titles, descriptions, and tags and make sure they target the right viewers if they don’t want their content age-restricted or removed.

Videos like adult cartoons that could be mistaken for kid-friendly content won’t be removed, but they will be age-restricted

In recent years, YouTube has been struggling to moderate the platform, especially videos aimed at children, and these issues have grown significantly. This year, the company has been under investigation by the Federal Trade Commission (FTC) over videos designed to harm, manipulate, or exploit young children, and over potential privacy violations in how it handles underage viewers.

At the center of the controversy is critics’ claim that YouTube’s recommendation algorithm is fundamentally flawed: it recommends videos to users without taking the nature of the content into account, which can lead many people, including kids, to violent, extremist, or exploitative content. At the same time, videos featuring young children and minors tend to perform very well on YouTube, with its algorithms rewarding creators who use kid-friendly descriptions and tags and put kids on-screen, generating huge viewing metrics and more advertising dollars.

Previously, YouTube took a half-hearted approach to moderation, removing batches of violating videos when it found them, often because news organizations alerted the company’s communications team while asking for comment. Even though there were rules around videos featuring or targeting children, the video giant failed to enforce those rules uniformly, and it became overrun with complicated edge cases that its guidelines didn’t account for. One example is Elsagate, in which anonymous, hard-to-track creators around the world produced disturbing, copyright-infringing videos featuring distorted versions of Marvel and Disney characters.

Elsagate is a disturbing trend on YouTube

In June, YouTube said it would not stop recommending content featuring kids, even after it noticed that pedophiles had left lewd comments under such content and engaged in other exploitative behaviors. That decision has become a focus of the FTC’s investigation.

In response to intense criticism from the public and lawmakers, Susan Wojcicki, the company’s CEO, has been more proactive over the past year. In February, YouTube said it was approaching the child-exploitation issue aggressively. Google, YouTube’s parent company, is also planning to move all videos featuring or targeted toward children to a separate app, YouTube Kids. But although the company wants YouTube Kids to be a kid-friendly app where parents can let their kids go online without worry, that app has problems of its own.

YouTube has prevented channels from live streaming when children are involved, and it is now disabling comment sections on some content featuring them. According to a report from Bloomberg, YouTube is also planning to remove targeted ads on content involving children. However, the company hasn’t decided whether to turn off its recommendation algorithm for that type of content, because it fears engagement could drop.
