This Website Will Show You What AI Thinks Of You With Just One Photo
Aadhya Khatri - Sep 19, 2019
If you are foolish or curious enough to give ImageNet Roulette one of your photos, it will tell you exactly what AI thinks of you
According to ImageNet Roulette, you are no more than a machine filled with problems and embarrassments. If you are foolish or curious enough to give it one of your photos, it will tell you exactly what AI thinks of you. Its creators' aim is to let people see how artificial intelligence views and classifies them.
It may come as a surprise to many of us that AI can be strikingly misogynistic and racist. In other words, the dataset that ImageNet Roulette uses to train its artificial intelligence is filled with prejudice and bias. This is exactly the project's point:
"[We] want to shed light on what happens when technical systems are trained on problematic training data. AI classifications of people are rarely made visible to the people being classified. ImageNet Roulette provides a glimpse into that process – and to show the ways things can go wrong," explained ImageNet Roulette’s website.
This project is part of Kate Crawford and Trevor Paglen's exhibition Training Humans at the Fondazione Prada museum in Milan. The AI identifies what it thinks are human faces and then labels them in whatever way it sees fit.
To most of us, the labeling makes absolutely no sense. For example, in a photo of Barack Obama, the former president of the U.S., and Prince Harry, the two are labeled "card player" and "sphinx" respectively.
"[Training Humans] is the first major photography exhibition devoted to training images: the collections of photos used by scientists to train artificial intelligence (AI) systems in how to 'see' and categorize the world," explains the exhibit page.
The problem here is that the way AI thinks of people reflects, or worse, exacerbates the prejudice and biases already present in our society. That is something to keep in mind the next time you hear someone blame machine learning when something goes wrong.