Amazon's Facial Recognition Tech Matched Politicians To Criminals

Viswamitra Jayavant - Sep 11, 2019


Amazon's Rekognition is a problematic piece of software: in a recent ACLU test, it misidentified 26 Californian politicians as criminals.

Facial recognition is the latest technology the government is enthusiastic about, and it has been pushing for wide deployment for a while. But even if you don't know the technical or legal details behind that push, it's not hard to see why it may be a bad thing.

The ACLU orchestrated a test to see just how effective the facial recognition software currently used by U.S. law enforcement agencies - Amazon's Rekognition - really is. The results couldn't be any more depressing (and morbidly humorous): it falsely identified 26 Californian lawmakers as criminals.

[Image: The ACLU tested Amazon's Rekognition.]

In a sense, the system’s not wrong - but we’ll leave the political jokes alone for now.

ACLU's Experiments

It's not the first time the ACLU has created and run this type of experiment. The first test, done last year, showed Rekognition's accuracy to be disastrous: when the program was tasked with matching members of Congress against mugshots, the results were not only incorrect but also racially biased.

In the second test, done recently, the ACLU set up a data set of 120 Californian politicians and pitted them against a database of 12,000 criminal mugshots. False positives popped up for 26 of the 120 - roughly 20% of the time.
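The ACLU hasn't published its exact test harness, but a comparable experiment can be sketched with Amazon's boto3 SDK: index the mugshots into a Rekognition face collection, then search that collection with each politician's photo. The collection name and file path below are hypothetical, not the ACLU's.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

# Hypothetical collection assumed to be pre-populated with the 12,000
# mugshot images via index_faces(); the name is illustrative.
COLLECTION_ID = "mugshot-collection"

def find_matches(photo_path, threshold=80):
    """Search the mugshot collection for faces matching one photo.

    If FaceMatchThreshold is omitted, Rekognition defaults to 80 -
    the setting at the center of the dispute described below.
    """
    with open(photo_path, "rb") as f:
        response = rekognition.search_faces_by_image(
            CollectionId=COLLECTION_ID,
            Image={"Bytes": f.read()},
            FaceMatchThreshold=threshold,
        )
    return response["FaceMatches"]

# Any match returned for a lawmaker's photo is a false positive,
# since none of the 120 politicians appear in the mugshot database.
for match in find_matches("lawmaker_headshot.jpg"):
    print(match["Face"]["FaceId"], match["Similarity"])
```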


Phil Ting, a San Francisco Assembly Member, is one of the politicians who were mismatched. He's now using the result to push for a ban on facial recognition tech in police body cameras. During a press conference addressing the matter, he said the experiment proved that facial recognition tech is still not ready for use, and cited how its inaccuracy could become a problem for people trying to interview for a job or buy a home.

Amazon's Response

In response to the experiment, an Amazon spokesperson defended the program, telling the press that the ACLU was "misusing and misrepresenting" Rekognition "to make headlines." The company said that when the program is used with a 99% confidence threshold and combined with human-driven decisions, the technology can bring a vast array of benefits to the table.


Still A Lot of Questions

Numerous points of confusion have popped up around Rekognition. Amazon said the program should be used with a 99% confidence threshold, but Matt Cagle - an attorney for the ACLU who collaborated with UC Berkeley to verify the results - said the organization did not use a 99% confidence threshold for a simple reason: it's not the default value in the program's settings, which is set at 80%.

Amazon even went as far as pointing to a blog post that states explicitly that Rekognition should not be used with any confidence threshold other than 99%. Yet it's strange that this is not the program's default value.
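For context, the disputed threshold is just a parameter on the API call. Here's a minimal sketch using boto3's compare_faces, with hypothetical image paths: leaving SimilarityThreshold unset falls back to the 80% default Cagle describes, while following Amazon's guidance means passing 99 explicitly.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

def compare(source_path, target_path, threshold=None):
    """Compare two face images; the file paths are illustrative."""
    with open(source_path, "rb") as src, open(target_path, "rb") as tgt:
        kwargs = {
            "SourceImage": {"Bytes": src.read()},
            "TargetImage": {"Bytes": tgt.read()},
        }
        if threshold is not None:
            kwargs["SimilarityThreshold"] = threshold
        return rekognition.compare_faces(**kwargs)["FaceMatches"]

# Default behavior: SimilarityThreshold is 80 when not specified.
loose_matches = compare("politician.jpg", "mugshot.jpg")

# Amazon's recommended setting for law enforcement use: 99.
strict_matches = compare("politician.jpg", "mugshot.jpg", threshold=99)
```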
