New AI Hides Your Emotions From Other AI Systems
Kumari Shrivastav - Sep 13, 2019, 3:00 pm IST
Smart speakers can infer our emotions from our voices, raising privacy concerns. Researchers are now working on a way to protect users.
When people communicate, they convey a range of nonverbal feelings, from exasperated groans to terrified shrieks. From arousal to disgust, agony to ecstasy, many subtle emotions can be read from vocal cues and inflections. Even a simple utterance can carry layers of meaning, and this personal data would be highly valuable to any company that collects it.
When a person speaks to an internet-connected voice assistant, their speech is likely recorded. In response, researchers from Imperial College London have created a layer that sits between users and the cloud to which their personal information is uploaded. This AI helps users hide their emotions by automatically translating emotional speech into unemotional speech. The research paper, titled “Emotionless: Privacy-Preserving Speech Analysis for Voice Assistants,” has been published on the arXiv preprint server.
Amazon and other companies have long been working to improve AI’s emotion-detection capabilities. Our voices can reveal personal information such as age, physical condition, gender, stress levels, and confidence, and smart speaker makers have not missed this point.
According to Ranya Aloufi, who led the research, an accurate emotion reader that pins down a person’s emotional state and personal preferences could significantly infringe on our privacy.
In an experiment, the researchers reported that their method reduced emotion detection accuracy by 96 percent, though it also degraded speech recognition, raising the word error rate by 35 percent. Their emotion-masking method works by collecting speech, analyzing it, and extracting the emotional features from the core signal. An AI program then learns from these indicators and flattens any emotional signals in the speech. Finally, a voice synthesizer regenerates the normalized speech from the AI’s output, and this normalized data is what the cloud stores.
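The extract-flatten-resynthesize idea can be illustrated with a minimal sketch. The code below is not the paper's actual method; it merely flattens frame-wise loudness, one crude prosodic cue, in a synthetic signal using NumPy, and every function name and parameter here is an illustrative assumption.

```python
import numpy as np

def frame_rms(x, frame_len):
    """Per-frame RMS energy (a simple loudness proxy)."""
    n = len(x) // frame_len
    frames = x[:n * frame_len].reshape(n, frame_len)
    return np.sqrt((frames ** 2).mean(axis=1))

def flatten_energy(x, frame_len=160, target_rms=0.1, eps=1e-8):
    """Scale each frame to a constant RMS, wiping out loudness-based
    prosody swings (a toy stand-in for the normalization step)."""
    n = len(x) // frame_len
    frames = x[:n * frame_len].reshape(n, frame_len).copy()
    rms = np.sqrt((frames ** 2).mean(axis=1, keepdims=True))
    frames *= target_rms / (rms + eps)
    return frames.reshape(-1)

# Synthetic "speech": a tone whose loudness swells and fades,
# mimicking emotional emphasis.
sr = 16000
t = np.arange(sr) / sr
envelope = 0.2 + 0.8 * np.abs(np.sin(2 * np.pi * 1.5 * t))
signal = envelope * np.sin(2 * np.pi * 220 * t)

flat = flatten_energy(signal)
# The original signal's frame energies vary widely; the flattened
# signal's are nearly constant, hiding the loudness cue.
print(frame_rms(signal, 160).std(), frame_rms(flat, 160).std())
```

A real system would normalize many more cues (pitch contour, speaking rate, spectral tilt) and use a learned model plus a neural vocoder for resynthesis rather than simple per-frame scaling.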
Understanding different emotions is what makes a machine human-like, and that has long been a goal of many futurists and companies. In recent years, AI speech engineers have treated emotionality as a critical goal. Speech Emotion Recognition (SER), which first appeared in the late 1990s, is an older field than any of the well-known speech recognition systems behind Google Home, Siri, or Alexa.
In 2018, Huawei stated that it was developing emotion-detecting AI to deploy in its voice assistant software. “We think that, in the future, all our end users wish [that] they can interact with the system in the emotional mode. This is the direction we see in the long run,” Huawei’s vice president of software engineering told CNBC.
Back in May, Alexa speech engineers published research in which adversarial networks were used to detect emotion in the digital assistant. They wrote that emotion "can aid in health monitoring; it can make conversational-AI systems more engaging, and it can provide implicit customer feedback that could help voice agents like Alexa learn from their mistakes."
Those firms also hope this highly sensitive data can help them sell more precisely targeted ads.
In a 2017 patent on detecting emotion in speech, Amazon uses illness as an example: "physical conditions such as sore throats and coughs may be determined based at least in part on a voice input from the user, and emotional conditions such as an excited emotional state or a sad emotional state may be determined based at least in part on voice input from a user. A cough or sniffle, or crying, may indicate that the user has a specific physical or emotional abnormality."
After picking up these cues, a virtual assistant such as Alexa could combine them with your purchase and browsing history to serve highly specific advertisements for medications and other products. Aloufi said consumers can do little to prevent their personal data from being used this way unless smart home devices such as Google Home and Alexa offer stronger privacy protections.
"Users should be informed and given awareness on the ways in which these systems will use and analyze their recording so they are able to make informed choices about the placement of these devices in their households, or their interaction with these devices," she shared. "There is a dire need to apply these [privacy-protecting] technologies to protect the users’ and their emotional state."