The pace of artificial intelligence advancement is unmatched. The technology is continuously evolving to benefit a growing set of people, organizations, and even nations. The name 'Artificial Intelligence' itself states its true objective: to be an alternative to natural intelligence. The technology is extending its reach to match, or in some cases exceed, human intelligence. However, human intelligence is not limited to cognitive and physical ability; emotional intelligence is an equally essential part of being human. Hence the arrival of Emotion AI, or emotional artificial intelligence.
Emotion AI is a subset of artificial intelligence (the broad term for machines replicating the way humans think) that measures, understands, simulates, and reacts to human emotions. It's also known as affective computing or artificial emotional intelligence.
While humans might currently have the upper hand in reading emotions, machines are gaining ground using their own strengths. Machines are very good at analyzing large amounts of data, explained MIT Sloan professor Erik Brynjolfsson. They can listen to voice inflections and start to recognize when those inflections correlate with stress or anger. Machines can analyze images and pick up subtleties in micro-expressions on humans' faces that may happen too fast for a person to recognize.
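To make that idea concrete, here is a minimal, illustrative sketch of the kind of pipeline Brynjolfsson describes: summarize each voice clip with pitch and timbre features (via librosa) and fit a simple classifier (via scikit-learn). The variables `wav_paths` and `labels` are hypothetical placeholders for a labeled dataset of short clips; real emotion AI systems are far more sophisticated than this toy outline.

```python
# Illustrative sketch only: a toy classifier for voice inflection.
# `wav_paths` and `labels` ("stressed"/"calm") are hypothetical
# placeholders for a labeled dataset of short audio clips.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def voice_features(path):
    """Summarize a clip as mean MFCCs (timbre) plus pitch statistics."""
    y, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # timbre profile
    f0 = librosa.yin(y, fmin=65, fmax=400, sr=sr)        # pitch contour
    # Mean and spread of pitch serve as crude proxies for "inflection".
    return np.concatenate([mfcc.mean(axis=1), [f0.mean(), f0.std()]])

X = np.array([voice_features(p) for p in wav_paths])
X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.2)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```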
"We have a lot of neurons in our brain for social interactions. We're born with some of those skills, and then we learn more. It makes sense to use technology to connect to our social brains, not just our analytical brains." Brynjolfsson said. "Just like we can understand speech and machines can communicate in speech, we also understand and communicate with humor and other kinds of emotions. And machines that can speak that language — the language of emotions — are going to have better, more effective interactions with us. It's great that we've made some progress; it's just something that wasn't an option 20 or 30 years ago, and now it's on the table."
However, one of the world's leading experts on the psychology of emotions has warned that AI systems that companies claim can "read" facial expressions are based on outdated science and risk being unreliable and discriminatory.
Lisa Feldman Barrett, professor of psychology at Northeastern University, said that such technologies appear to disregard a growing body of evidence undermining the notion that basic facial expressions are universal across cultures. As a result, such technologies – some of which are already being deployed in real-world settings – run the risk of being unreliable or discriminatory, she said.
"I don't know how companies can continue to justify what they're doing when it's really clear what the evidence is," she said. "There are some companies that just continue to claim things that can't possibly be true."
Her warning comes as such systems are being rolled out for a growing number of applications. In October, Unilever claimed that it had saved 100,000 hours of human recruitment time last year by deploying such software to analyze video interviews.
The AI system, developed by the company HireVue, scans candidates' facial expressions, body language, and word choice and cross-references them with traits that are considered to be correlated with job success.
Amazon claims its own facial recognition system, Rekognition, can detect seven basic emotions – happiness, sadness, anger, surprise, disgust, calmness, and confusion. The EU is reported to be trialing software that can purportedly detect deception through an analysis of micro-expressions, in an attempt to bolster border security.
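For a sense of what such systems expose to developers, here is a minimal sketch of querying Rekognition's emotion labels through the standard AWS SDK for Python (boto3). It assumes configured AWS credentials, and the file "face.jpg" is a placeholder; note that the scores it prints are the model's per-label confidences, not verified inner states – which is precisely the gap critics point to.

```python
# Minimal sketch of querying Amazon Rekognition's emotion labels via
# boto3. Assumes configured AWS credentials; "face.jpg" is a placeholder.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("face.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # "ALL" includes the Emotions attribute
    )

for face in response["FaceDetails"]:
    # Each detected face carries emotion labels with confidence scores.
    for emotion in sorted(face["Emotions"],
                          key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type']}: {emotion['Confidence']:.1f}%")
```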
"Based on the published scientific evidence, our judgment is that [these technologies] shouldn't be rolled out and used to make consequential decisions about people's lives," said Feldman Barrett.
However, a growing body of evidence has shown that, beyond these basic stereotypes, there is a huge range in how people express emotion, both across and within cultures.
In western cultures, for instance, people have been found to scowl only about 30 percent of the time when they're angry, she said, meaning they move their faces in other ways about 70 percent of the time.
"There is low reliability," Feldman Barrett said. "And people often scowl when they're not angry. That's what we'd call low specificity. People scowl when they're concentrating really hard when you tell a bad joke when they have gas."
The expression that is supposed to be universal for fear is the stereotype for a threat or anger face in Malaysia, she said. There are also wide variations within cultures in terms of how people express emotions, while context, such as body language and who a person is talking to, is critical.
"AI is largely being trained on the assumption that everyone expresses emotion in the same way," she said. "There's very powerful technology being used to answer very simplistic questions."