China: Woman on Bus Ad Banner Mistakenly Flagged as Jaywalker by Face Recognition System
Harin - Nov 26, 2018
A woman in China was mistakenly flagged as a jaywalker by a face recognition system after her photo appeared on an advertisement banner on a passing bus.
Ever since China introduced its new facial recognition and AI surveillance systems, technology experts and human rights activists worldwide have been voicing criticism.
Recently, an embarrassing incident justified those criticisms. Ningbo's police force had to take action after a woman was mistakenly identified as a jaywalker by a road surveillance camera. The automated system flagged the businesswoman for crossing a road while the pedestrian light was not green. The problem was not only that the system made a mistake, but that the woman in question was nowhere near the area.
Cameras have been installed at major intersections and along roadways in the country for years. They were later upgraded with facial recognition: the cameras can detect and capture a human face from a distance, and the captured images are then compared against government records.
Authorities now also use this feature at crosswalks to publicly shame pedestrians who cross illegally. The AI pinpoints the pedestrian's identity, and once they finish crossing, their name, photo, and government ID are displayed on a digital notice board for other people to see.
This is where the system went wrong, in the case of Dong Mingzhu. When the camera captured her image and put it on the notice board, she wasn't actually there crossing the road. Because she is quite famous in the area, her photo was on an advertisement on a bus passing through the crosswalk. The AI mistook the image for the real Dong Mingzhu and automatically flagged her. The local police later had to correct the record. An upgrade is said to have been added to the facial recognition system in order to "lower the incorrect recognition rate", though how soon that can be achieved remains unknown.
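To see how such a false positive can happen, here is a deliberately simplified, hypothetical sketch of the matching step described above: each enrolled face is reduced to an embedding vector, and a capture is flagged when its similarity to some record exceeds a fixed threshold. The names, vectors, and threshold below are invented for illustration and have nothing to do with the actual Ningbo system.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(capture, gallery, threshold=0.9):
    """Return the best-matching identity above the threshold, else None."""
    best_name, best_score = None, threshold
    for name, embedding in gallery.items():
        score = cosine_similarity(capture, embedding)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Toy "government records" gallery (invented embeddings).
gallery = {
    "person_a": [0.9, 0.1, 0.3],
    "person_b": [0.2, 0.8, 0.5],
}

# A photo of person_a printed on a bus ad yields an embedding very close to
# the enrolled record, so a pure similarity threshold cannot distinguish the
# flat printed image from the live pedestrian -- the false positive in the story.
ad_photo = [0.88, 0.12, 0.31]
print(identify(ad_photo, gallery))  # flags "person_a"
```

The point of the sketch is that a plain similarity threshold has no notion of liveness; systems that need to reject printed photos or screens typically add a separate anti-spoofing (liveness detection) stage before the match is accepted.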
It might sound funny when you first read about it, but there are real problems here. When AI-driven systems are used to automate law enforcement, a single false positive can seriously harm the person involved. And that sits on top of the underlying problem that constant government surveillance intrudes on people's privacy.