Your Photos Might Be Used Without Your Consent For Facial Recognition

Aadhya Khatri


IBM has built a dataset of people's photos to train facial recognition systems, but when people ask for their images to be removed, the company does not fully comply and cannot give a satisfactory explanation for it

Facial recognition is evolving quickly and is now better at tasks that used to challenge older systems. However, to train AI to recognize faces, a huge number of photos have been used without people’s consent.

The Development Of Facial Recognition

Ever since the boom of the Internet, collecting photos has become much easier

In the past, to gather photos for training facial recognition systems, scientists paid people to come in and be photographed. The subjects knew what their photos were being used for, but this method was both expensive and time-consuming, so the resulting databases were quite small.

When the Internet started to bloom, researchers suddenly had access to a vast number of pictures online. The process became as simple as typing in some celebrities’ names and taking all the photos that showed up.

Developing facial recognition systems was not a for-profit activity at the beginning, so researchers argued that they did not need to acquire copyright permission. For this reason, Flickr became an ideal source.
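
To give a rough idea of how this kind of scraping works, here is a minimal, purely illustrative Python sketch of the general technique: downloading photos from a list of URLs gathered via name searches and cropping out the faces for a training set. The file names and output folder are hypothetical, and this is in no way IBM’s or any specific researcher’s actual pipeline; it only assumes the widely available requests, NumPy, and OpenCV libraries.

    import os

    import cv2        # OpenCV, which ships with a pre-trained face detector
    import numpy as np
    import requests

    # Hypothetical input: a text file of image URLs collected by searching names online
    URL_LIST = "scraped_photo_urls.txt"
    OUTPUT_DIR = "face_crops"
    os.makedirs(OUTPUT_DIR, exist_ok=True)

    # Load OpenCV's bundled frontal-face Haar cascade
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    with open(URL_LIST) as f:
        urls = [line.strip() for line in f if line.strip()]

    for i, url in enumerate(urls):
        resp = requests.get(url, timeout=10)
        if resp.status_code != 200:
            continue
        # Decode the downloaded bytes into an image array
        img = cv2.imdecode(np.frombuffer(resp.content, np.uint8), cv2.IMREAD_COLOR)
        if img is None:
            continue
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        # Save every detected face as a separate cropped training image
        for j, (x, y, w, h) in enumerate(detector.detectMultiScale(gray, 1.1, 5)):
            cv2.imwrite(os.path.join(OUTPUT_DIR, f"{i}_{j}.jpg"), img[y:y + h, x:x + w])

The point of the sketch is simply that, once photos are freely searchable online, assembling a face dataset takes little more than a short script, which is exactly what makes the question of consent so easy to skip.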

Thanks to the massive databases experts could get their hands on, facial recognition has made a prodigious leap. However, despite the significant role of these online photos, you would be hard-pressed to find a researcher who elaborates on the source of their data. Some ignore the matter of consent entirely and use whatever they find on the Internet.

When asked about the source of the photos, few experts are willing to comment on the matter.

IBM’s Controversial Dataset

IBM has a large dataset to train facial recognition systems

IBM has its own library of photos that it lets researchers use freely. The pictures are not associated with any names, which is supposed to protect the people whose images appear in IBM’s collection. Despite this effort, IBM’s motives still raise civil liberties questions, as the company is not exactly a shining example on that front.

In response to the public’s questions about IBM Watson Visual Recognition, especially since the company has announced that the system is “transparent”, IBM did not cite any specific sources from which it took the data, explaining the omission as a way to keep its intellectual property safe.

IBM announced that the data would not be used for any commercial purpose, which conflicts with how the company responded to a study by Joy Buolamwini of MIT, which found that IBM’s facial recognition system works better on men with lighter skin than on women with darker skin.

On this matter, experts said it is hard to separate commercial purposes from research activities at companies like Facebook or IBM. They may say the data is used for research only, but they own the patents on anything their teams of experts come up with anyway.

What Do Photographers Think?

IBM works for profit after all

Photographers whose pictures appear in IBM’s data library have differing views on the matter.

Georg Holzer from Australia, who has 700 photos in the dataset, said that IBM’s ultimate goal is to maximize its profit, so this amounts to commercial use.

Sharing this view is Dolan Hapbrook in Portland, whose 452 pictures were used. He agreed that IBM should have asked him before taking the photos, and expressed annoyance at being deprived of the right to know in advance and to decide which ones would be included in the dataset.

However, some other photographers felt that it was a good thing their photos were used for technological advancement.

Neil Moralee from the UK said that facial recognition cannot be undone, so he would rather have an accurate system than a flawed one prone to mistakes.

Swiss photographer Guillaume Boppe did not think IBM’s actions were wrong or a violation of his rights, since his photos were helping the system improve.

Sebastian Gambolati focused more on the advantages of facial recognition as a tool for tracking down criminals, but said he would have been happier if IBM had asked for his permission.

Can People Have Their Photos Removed?

Europeans can count on the General Data Protection Regulation to protect their privacy

While IBM does let people whose pictures are featured in its dataset request their removal, the problem is that checking whether you appear in it is not as straightforward as looking something up online. Even when you know for sure that you are in the dataset, you cannot simply provide your Flickr ID and be done with it. You have to provide links to every photo you want removed.

IBM is not exactly transparent in this offer either. One photographer who was told that over 1,000 of his photos appeared in the dataset contacted the company and sent his Flickr ID. IBM told him that none of his pictures had been found in the library. When he showed links to four of his works in IBM’s dataset, the company blamed a bug for its initial statement.

Privacy is not a new subject, so fortunately there are laws protecting people’s rights. In Europe, under the General Data Protection Regulation, photos used for identity confirmation count as sensitive personal information. Europeans who want their photos removed can demand that IBM do so, and if the company does not observe the law, it can be fined.

In Illinois, pictures of people’s faces can be considered biometric information, and it is against the law to use them without the person’s consent.

Are Better Facial Recognition Systems A Good Thing?

Facial recognition systems might be used to target minority groups

Facial recognition raises privacy issues, but is it worth going through all this trouble to enhance it, knowing that the system might not be used fairly after all?

Such systems bring tremendous benefits when they are accurate, but at the same time they can be turned on minority groups like immigrants and black people, who have already been victims of bias even without facial recognition.

A system like that in the hands of the authorities raises a lot of questions and concerns. The problem is serious enough that over 85 civil rights groups have asked tech firms to stop providing the government with facial recognition systems, for fear of discrimination and prejudice.
