Face-Reading AI Helps Detect Whether Suspects Are Telling Lies

Jyotis - Jul 01, 2019


The company developed this AI system to detect human emotions, from fear and surprise to anger. These emotions manifest in micro-expressions that are invisible to the naked eye.

For over 40 years, US psychologist Paul Ekman, a specialist in facial expressions, has researched how to detect human lies. However, artificial intelligence (AI) may soon become a viable alternative.


Although the United States was among the first nations to deploy automated technologies that try to read the hidden reactions and emotions of suspects, the technique is still in its infancy.

Many startups are investing in its development to improve the technology's accuracy and efficiency, as well as to reduce the rate of false signals.

According to a report from The Times, a startup company from the United Kingdom named Facesoft says it has built a database of 300 million facial images, many of them generated by an AI system based on models of the human brain.


Allan Ponniah, the Chief Executive Officer and co-founder of Facesoft, described the system to The Times.

Ponniah also works as a reconstructive and plastic surgeon in London.


His company has worked with the Mumbai police, using the AI system to monitor crowds and identify emerging mob dynamics. Facesoft has also pitched its initiative to UK police forces.

In fact, the use of AI technologies in policing has sparked considerable controversy. In April of this year, a group of researchers affiliated with Apple, Amazon, Alphabet, Microsoft, and Facebook reported that many of the current algorithms used to help police decide who should get probation, parole, or bail, and to help judges decide what sentence to give an offender, may be opaque, biased, or simply ineffective.

According to the Partnership on AI (PAI), such AI systems are widely used in the United States and are gaining ground in many other countries. The technology industry consortium said it does not support their use.
