Autonomous Vehicles Find It More Difficult To Detect People With Darker Skin

Viswamitra Jayavant - Mar 09, 2019



If left unfixed, crossing the street could be more dangerous for people with darker skin than for those with fairer complexions.

Facial recognition is still a relatively new technology, despite its rising popularity in products ranging from smartphones to autonomous vehicles. Smartphones like the newer iPhones are making it feel as routine as fingerprint sensors. But like any new technology, it comes with problems.

Self-driving car

Few people expected that the problems with facial recognition systems would unintentionally touch on a sensitive topic: racism.

We have known for quite some time that these systems struggle to recognise people with darker skin. In software like Amazon's Rekognition, the issue is not especially pressing: it is an inconvenience for those with darker skin tones, and it can be improved over time.

That is not something we can confidently say about the recognition systems in charge of autonomous vehicles.

These artificially intelligent systems are tasked with recognising pedestrians and minimising the risk to both the vehicle and the people around it. Typically, when the car spots a pedestrian ahead and the two are on a collision course, the system brakes the vehicle before the collision.
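To make that decision concrete, here is a minimal sketch of the detect-then-brake logic in Python. The detector output, the time-to-collision estimate, and the two-second braking threshold are simplified assumptions for illustration, not any manufacturer's actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    confidence: float   # detector's confidence that this is a pedestrian
    distance_m: float   # estimated distance to the pedestrian, in metres

def should_brake(detections, speed_mps,
                 confidence_threshold=0.5,
                 min_time_to_collision_s=2.0):
    """Brake if any sufficiently confident detection is on a collision course."""
    for det in detections:
        if det.confidence < confidence_threshold:
            # A missed or low-confidence detection never triggers braking,
            # which is why lower accuracy on darker skin tones matters.
            continue
        time_to_collision_s = det.distance_m / max(speed_mps, 0.1)
        if time_to_collision_s < min_time_to_collision_s:
            return True
    return False

# Example: one pedestrian 15 m ahead, car travelling at 10 m/s -> brake.
print(should_brake([Detection(confidence=0.8, distance_m=15.0)], speed_mps=10.0))
```

The point of the sketch is that a missed or low-confidence detection never triggers braking, which is why lower detection accuracy for darker skin tones translates directly into higher risk.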


The problem is that these systems fail to recognise people with darker skin tones more often. AI has no capacity for thought and cannot hold prejudices of its own, but the pattern is still worrying.

Uniformly Poorer Performance

Researchers from the Georgia Institute of Technology published a paper on arXiv detailing their analysis of eight AI models from modern, top-of-the-line object detection systems. These are the systems responsible for detecting objects, pedestrians, and road signs in autonomous vehicles.

The baseline for testing these systems was the Fitzpatrick scale, which is commonly used to classify human skin tones. The researchers divided images of pedestrians into two categories: those on the fairer end of the scale and those on the darker end.

The results are worrying. All models exhibited “uniformly poorer performance” when tested against images of pedestrians on the darker end of the scale.

Biased self-driving systems

On average, the systems were about 5% less accurate at detecting pedestrians in the darker-skinned category, even after the researchers accounted for environmental variables such as whether the photos were taken during the day or at night.
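As a rough illustration of the kind of comparison being described, the sketch below computes detection accuracy per skin-tone group while splitting by time of day. The data layout and field names are assumptions made for the example, not the paper's actual evaluation code.

```python
from collections import defaultdict

def accuracy_by_group(samples):
    """samples: dicts with 'skin_group' ('lighter'/'darker'),
    'time_of_day' ('day'/'night') and 'detected' (bool)."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for s in samples:
        key = (s["skin_group"], s["time_of_day"])
        totals[key] += 1
        hits[key] += int(s["detected"])
    return {key: hits[key] / totals[key] for key in totals}

# Toy data: in a real study each group would contain many labelled images.
samples = [
    {"skin_group": "lighter", "time_of_day": "day", "detected": True},
    {"skin_group": "lighter", "time_of_day": "day", "detected": True},
    {"skin_group": "darker",  "time_of_day": "day", "detected": True},
    {"skin_group": "darker",  "time_of_day": "day", "detected": False},
]
for (group, time_of_day), acc in sorted(accuracy_by_group(samples).items()):
    print(f"{group:7s} / {time_of_day}: {acc:.0%}")
```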

A Simple Solution

It is fortunate that the problem was spotted early, as the world moves slowly but surely towards a time when autonomous vehicles take over the roads. If left unfixed, crossing the street could be more dangerous for people with darker skin than for those with fairer complexions.

Thankfully, the solution is not that complex. Manufacturers ought to include more images of darker-skinned pedestrians in the training data, and then prioritise those harder-to-detect examples during training, as in the sketch below.
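One simple way to express that idea is to oversample the under-represented group when drawing training batches. The group labels, file names, and boost factor in this snippet are illustrative assumptions, not a specific manufacturer's training pipeline.

```python
import random

def build_sample_weights(dataset, boost_group="darker", boost_factor=3.0):
    """Give examples from the harder-to-detect group a larger chance
    of being drawn during training."""
    return [boost_factor if example["skin_group"] == boost_group else 1.0
            for example in dataset]

# Hypothetical labelled training set.
dataset = [
    {"skin_group": "lighter", "image": "img_001.png"},
    {"skin_group": "lighter", "image": "img_002.png"},
    {"skin_group": "darker",  "image": "img_003.png"},
]
weights = build_sample_weights(dataset)

# Draw a training batch using the boosted sampling weights.
batch = random.choices(dataset, weights=weights, k=4)
print([example["skin_group"] for example in batch])
```

The same effect can be achieved by weighting the loss on those examples more heavily instead of oversampling them; either way, the model is pushed to pay more attention to the cases it currently gets wrong.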
