YouTube Is Giving Pedophiles The Best Way To Find The Videos They Want
Aadhya Khatri - Feb 20, 2019
YouTube stands accused of providing a platform where pedophiles connect with one another and channel owners make money from sexualized videos of children.
YouTube is a massive video-sharing platform, so it comes as no surprise that users can find almost anything there, even material that is against the law. What is troubling is that the platform itself appears to be making such toxic content easier to find.
YouTube uses a recommendation algorithm to suggest videos related to what users are watching. The downsides of this system are well documented: it has been blamed for amplifying conspiracy theories and fueling radicalization. Now it faces another accusation, namely that it is helping exploitative videos of children gain popularity.

YouTube is fueling child porn with its algorithm
This matter was brought to light by Matt Watson, who documented it in a video that had roughly one million views at the time of writing. He pointed to numerous videos across the platform showing young girls in inappropriate dress or situations. Most disturbingly, these videos often have more than a million views each.
Many viewers leave timestamps in the comments pointing to moments when the girls are in compromising positions, alongside sexualized remarks about them. Some channel owners even encourage these comments while monetizing the videos.
According to Watson, YouTube's recommendation algorithm inadvertently helps pedophiles find one another and share this material. Even a brand-new account can surface these videos within about ten minutes of searching, or in fewer than five clicks.

It is not doing enough to address this matter
YouTube currently relies on another algorithm to detect and remove suspicious videos based on their tags and titles, which is far from sufficient to solve the problem. This approach sometimes bans legitimate accounts by mistake, forcing their owners to appeal to the platform's human moderation team to get them reinstated.
At the same time, countless channels hosting exploitative content slip through the system's fingers. On the rare occasions the right channels are caught, it is because human moderators intervened, since the users who seek out these videos are the least likely to report them.
So far, YouTube's weak efforts have led nowhere. The channel owners go unpunished, and pedophiles are conveniently handed a list of exactly what they are looking for.