China: A Woman on an Ad Banner Was Mistaken for a Criminal by Face Recognition System

Harin - Nov 26, 2018


A woman in China was mistakenly flagged as a criminal by a face recognition system when her photo appeared on an advertisement banner on a bus.

Ever since China introduced its new facial recognition and AI surveillance systems, technology experts and human rights activists worldwide have been voicing criticism.

Recently, an embarrassing incident caused by the system has lent weight to those criticisms. Police in the city of Ningbo had to step in after a woman was mistakenly identified as a jaywalker by a road surveillance camera. The automated system flagged the businesswoman for crossing the road while the pedestrian light was not green. The problem was not only that the system made a mistake, but that the woman in question was nowhere near the area.

Cameras have long been installed at major intersections and along roadways across the country. They were later upgraded with facial recognition capabilities: the cameras can detect and capture human faces from a distance, and the captured images are then compared against government records.
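As a rough illustration of how such a capture-and-compare pipeline might work (not the actual Ningbo system, whose internals are not public), the minimal sketch below assumes that face detection and embedding extraction are already handled by some upstream model. It simply compares a captured face embedding against a database of record embeddings using cosine similarity and a match threshold; all names and values are hypothetical.

```python
import numpy as np

MATCH_THRESHOLD = 0.6  # hypothetical cutoff; a real system would tune this carefully


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def identify_face(captured_embedding: np.ndarray,
                  records: dict[str, np.ndarray]) -> str | None:
    """Return the ID of the best-matching record, or None if nothing clears the threshold.

    `records` maps a person ID to a pre-computed face embedding
    (standing in for the government records mentioned in the article).
    """
    best_id, best_score = None, -1.0
    for person_id, record_embedding in records.items():
        score = cosine_similarity(captured_embedding, record_embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= MATCH_THRESHOLD else None


# Toy usage with random vectors standing in for real embeddings.
rng = np.random.default_rng(0)
records = {"ID-001": rng.normal(size=128), "ID-002": rng.normal(size=128)}
captured = records["ID-001"] + rng.normal(scale=0.05, size=128)  # noisy camera capture
print(identify_face(captured, records))  # -> "ID-001"
```

The key design point is the threshold: anything that scores above it, including a printed photo on the side of a bus, is treated as the real person.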

Authorities now also use this feature at crosswalks to publicly shame pedestrians who cross illegally. The AI pinpoints a jaywalker's identity, and once they have finished crossing, their name, photo, and government ID are displayed on a digital notice board for everyone to see.

This is where the system went wrong for Dong Mingzhu. When the camera captured her image and put it on the notice board, she wasn't actually there crossing the road. Because she is a well-known figure in the region, her photo appeared in an advertisement on a bus passing through the crosswalk. The AI mistook the image for the real Dong Mingzhu and automatically flagged her. The local police later had to correct the error, and an upgrade is said to have been added to the facial recognition system to "lower the incorrect recognition rate". How soon that can be achieved, however, remains unknown.
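To see why "lowering the incorrect recognition rate" is not a simple fix, consider the trade-off a stricter match threshold creates. The sketch below is purely illustrative, using made-up similarity score distributions for genuine pairs (the same person) and impostor pairs (a different face, such as one on a bus ad), in the same embedding-comparison setup assumed above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated similarity scores: genuine pairs score higher on average than impostor pairs.
genuine = rng.normal(loc=0.80, scale=0.08, size=10_000)
impostor = rng.normal(loc=0.45, scale=0.12, size=10_000)

for threshold in (0.55, 0.65, 0.75):
    false_accepts = float(np.mean(impostor >= threshold))  # wrongly flagged (the bus-ad case)
    false_rejects = float(np.mean(genuine < threshold))    # real matches missed
    print(f"threshold={threshold:.2f}  "
          f"false-accept rate={false_accepts:.3f}  "
          f"false-reject rate={false_rejects:.3f}")
```

Raising the threshold reduces false accepts like the billboard mix-up, but it also causes the system to miss genuine matches, which is why such upgrades take time to tune.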

It might sound funny at first, but there are real problems here. When AI-driven systems are used to automate law enforcement, a single false positive can seriously affect the person wrongly accused. That is on top of the underlying concern that constant government surveillance intrudes on people's privacy.
