Conspiracy Videos Will No Longer Be Recommended on YouTube
Jyotis
A series of YouTube videos “claiming the earth is flat or making blatantly false claims about historic events like 9/11” won’t be recommended as often as before.
According to the latest announcement from YouTube, users will no longer be recommended “harmful” videos that are said to violate the community guidelines of the video-sharing site, such as medically inaccurate or conspiracy videos. It can’t be denied that such videos hold a strong attraction for many users of video-sharing sites and social networks alike.
On February 11, NBC News revealed that a series of YouTube videos won’t be recommended as often as before.
On January 25, the platform published a post on its official blog saying that from now on, after a user watches one video, it will stop pushing a chain of videos on the same topics.
However, the company further points out that this change won’t affect the availability of any videos. In other words, if you follow a channel that produces all kinds of content about conspiracy theories, or you go looking for something like that, YouTube will still provide the most relevant recommendations.
On February 09, former Google engineer Guillaume Chaslot called YouTube’s latest action a “historic victory.”
For those who don’t know, YouTube is a subsidiary of Google, and Chaslot helped develop the AI tech that the video-sharing platform uses to recommend videos related to the ones users have already watched. The tech was designed to “entice” users to spend more time on YouTube.