YouTube Prioritizes And Goes Easier On Famous Creators

Cooky - Oct 04, 2019



YouTube moderators say the company lets its top video creators off more easily on content moderation than everyone else.

If you think YouTube lets popular YouTubers get away with more than everyone else when it comes to content moderation, you're right. According to a new report by The Washington Post, YouTube plays favorites, handing out more lenient punishments to the popular creators who bring in the most money for the company. The Post interviewed 11 current and former YouTube moderators who worked on content decision-making teams, and they said popular creators often receive special treatment, such as looser interpretations of the YouTube guidelines that prohibit bullying, demeaning speech, and other types of graphic content.

YouTube moderators admit the site goes easier on popular YouTubers

According to these moderators, YouTube has made exceptions for popular YouTubers including PewDiePie, Steven Crowder, and Logan Paul. YouTube denied the claims, saying its rules are the same for everyone and that it always tries to draw the line in the right places.

YouTube shares the money it earns from running ads on high-traffic videos with the creators who make them. If creators break the rules, YouTube may remove ads from their videos or channels, or worse, remove the videos entirely. But unlike at Twitter and Facebook, not all YouTube moderators can delete content themselves. Instead, they recommend whether a channel or video is safe for ads and flag it to higher-ups, who make the final decision.

YouTube higher-ups are the people who make the final decision on whether to delete content

The interviewed moderators said that when it came to videos from popular creators, their recommendations to remove advertising from rule-breaking videos were often refused by higher-ups.

A good example is Logan Paul, who filmed and mocked a dead body in one of his videos. Nearly two weeks passed before YouTube removed him from the Google Preferred advertising program, and just over a month later, YouTube brought ads back to his channel.


YouTube let Logan Paul off easily despite his content

The moderators also said that many of the rules are contradictory and ineffective. When it comes to offensive content, enforcement amounts to ad hoc decisions, arbitrary standards, and constantly shifting policies.

YouTube admits that it has two sets of standards for conduct. It applies stricter rules to creators who can earn money from running ads on their videos, since they are effectively in business with the company, while the general community guidelines are looser. Moderators are split into separate teams to police these two groups, to make their work more efficient and specialized.

Moderators are split into separate teams to police these two groups

Lately, YouTube has been criticized for failing to restrict bad content on the platform. In response, the company has adjusted its recommendation algorithms, banned dangerous challenges and pranks, manually reviewed a million videos for violations of its terrorism policy, and more. In the third quarter of last year alone, YouTube deleted up to 58 million videos that violated its policies.

However, YouTube's efforts are not enough. In an effort to push the video giant to be more transparent about what it has been doing, especially when it comes to the monetization and demonetization of videos, YouTubers in Europe have moved to unionize. Earlier this year, Google's employees petitioned to ban YouTube from SF Pride, arguing that there was a lot of abusive content targeting the LGBTQ community and that YouTube wasn't doing its best to protect these people.

YouTube has tried to deal with problematic content, but it’s not enough

According to Alex Joseph, a YouTube spokesperson, the company has made significant investments over the past few years to live up to its responsibility: machine learning to detect problematic content at scale, a systematic review of its policies to ensure it draws the line in the right places, and an expansion of the teams that deal with bad content to more than 10,000 people.

YouTube has made significant investments over the past few years

Whether or not the accounts from the interviewed moderators paint an accurate and fair picture of how the company operates, it is clearly time for YouTube to take responsibility and be completely transparent about its rules and decisions.
