Artificial Intelligence

Is Virtuous Implementation of Emotion AI Possible Among Rising Concerns?

Smriti Srivastava

The regular advancements in AI are thrilling and chilling at the same time. While most people in the industry are thrilled by the technology's growth propelling it to new heights, some experts are concerned about the adverse situations it could bring in the near future, or may have already brought.

AI algorithms are growing more human in the space of emotions. Yes! AI has become capable of reading people's emotions. Advances in computer vision and facial recognition have led researchers to actively work on developing algorithms that can determine the emotions and intent of humans, along with making other inferences.

What is Emotion AI?

As noted by the MIT Sloan School of Management, Emotion AI is a subset of artificial intelligence (the broad term for machines replicating the way humans think) that measures, understands, simulates, and reacts to human emotions. It's also known as affective computing, or artificial emotional intelligence. The field dates back to at least 1995, when MIT Media Lab professor Rosalind Picard published "Affective Computing."

Javier Hernandez, a research scientist with the Affective Computing Group at the MIT Media Lab, explains emotion AI as a tool that allows for a much more natural interaction between humans and machines. "Think of the way you interact with other human beings; you look at their faces, you look at their body, and you change your interaction accordingly," Hernandez said. "How can a machine effectively communicate information if it doesn't know your emotional state, if it doesn't know how you're feeling, it doesn't know how you're going to respond to specific content?"

As explained by MIT Sloan professor Erik Brynjolfsson, "while humans might currently have the upper hand on reading emotions, machines are gaining ground using their own strengths. Machines are very good at analyzing large amounts of data – they can listen to voice inflections and start to recognize when those inflections correlate with stress or anger. Machines can analyze images and pick up subtleties in micro-expressions on humans' faces that might happen even too fast for a person to recognize."
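
To make Brynjolfsson's point concrete, here is a minimal, hypothetical sketch of the kind of correlation he describes: extracting crude voice-inflection features (pitch and loudness statistics) from a handful of labeled audio clips and fitting a simple classifier. The file names, labels, and choice of features are illustrative assumptions, not a description of any production system.

```python
# Hypothetical sketch: correlating voice-inflection features with stress labels.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

def inflection_features(path):
    """Crude 'inflection' features: pitch and loudness statistics for one clip."""
    y, sr = librosa.load(path, sr=16000)
    f0 = librosa.yin(y, fmin=60, fmax=400, sr=sr)  # per-frame pitch estimate (Hz)
    rms = librosa.feature.rms(y=y)[0]              # per-frame loudness
    return np.array([f0.mean(), f0.std(), rms.mean(), rms.std()])

# Placeholder clips with human-annotated labels (0 = calm, 1 = stressed).
clips = ["calm_01.wav", "calm_02.wav", "stressed_01.wav", "stressed_02.wav"]
labels = [0, 0, 1, 1]

X = np.vstack([inflection_features(c) for c in clips])
model = LogisticRegression().fit(X, labels)

# Score a new clip; a real system would need far more data and careful validation.
print(model.predict(inflection_features("new_clip.wav").reshape(1, -1)))
```

Even this toy version shows the core move: the machine never "understands" stress; it only learns which measurable acoustic statistics happen to correlate with human-supplied labels.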

What are the Horrifying Concerns of Affective AI?

The annual report released by the AI Now Institute, an interdisciplinary research center studying the societal implications of artificial intelligence, examines the flaws in AI-based emotion detection and calls for a ban on technology designed to recognize people's emotions in certain cases. According to the researchers, the technology should not be used in decisions that "impact people's lives and access to opportunities," such as hiring decisions or pain assessments, because it is not sufficiently accurate and can lead to biased decisions.

Here are some significant highlights from the report:

The report acknowledges that, in the past year, researchers uncovered systems in wide deployment that purport to operationalize proven scientific theories but are, in the end, little more than speculation. This trend in AI development is a growing area of concern, especially as applied to facial- and affect-recognition technology. Despite the technology's broad application, research shows affect recognition is built on markedly shaky foundations.

The affect-recognition industry is undergoing a period of significant growth: some reports indicate that the emotion-detection and -recognition market was worth US$12 billion in 2018, and by one enthusiastic estimate, the industry is projected to grow to over US$90 billion by 2024. These technologies are often layered on top of facial-recognition systems as a "value add."

The report notes that Boston-based company BrainCo is creating headbands that purport to detect and quantify students' attention levels through brain-activity detection, despite studies that outline significant risks associated with the deployment of emotional AI in the classroom.

Moreover, according to the researchers, affect-recognition software has also joined risk assessment as a tool in criminal justice. For example, police in the US and UK are using eye-detection software from Converus, which examines eye movements and changes in pupil size to flag potential deception. Oxygen Forensics, which sells data-extraction tools to clients including the FBI, Interpol, London Metropolitan Police, and Hong Kong Customs, announced in July 2019 that it had also added facial recognition, including emotion detection, to its software, which includes "analysis of videos and images captured by drones used to identify possible known terrorists."

But often the software doesn't work. For example, ProPublica reported that schools, prisons, banks, and hospitals have installed microphones running software developed by the company Sound Intelligence, which purports to detect stress and aggression before violence erupts. But the "aggression detector" was not very reliable, flagging rough, higher-pitched sounds such as coughing as aggression.
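
The failure mode is easy to picture with a toy version of such a detector. The sketch below is purely an illustration, not Sound Intelligence's actual method: it flags any audio frame that is simultaneously loud and spectrally bright ("rough"), and a cough, which is loud and broadband, crosses the same thresholds as a shout.

```python
# Toy threshold-based "aggression" flag (an illustration, not any vendor's method).
import librosa

def naive_aggression_flag(path, loud_thresh=0.1, bright_thresh=2000.0):
    y, sr = librosa.load(path, sr=16000)
    rms = librosa.feature.rms(y=y)[0]                            # loudness per frame
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)[0]  # brightness per frame (Hz)
    # Flag if any frame is both loud and spectrally bright ("rough, higher-pitched").
    return bool(((rms > loud_thresh) & (centroid > bright_thresh)).any())

# A shout and a cough are both loud and broadband, so both can trip the flag:
# naive_aggression_flag("shout.wav")  -> likely True
# naive_aggression_flag("cough.wav")  -> likely True (the false positive described above)
```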

Another study by researcher Dr. Lauren Rhue found systematic racial biases in two well-known emotion-recognition programs: when she ran Face++ and Microsoft's Face API on a dataset of 400 NBA player photos, she found that both systems assigned black players more negative emotional scores on average, no matter how much they smiled.
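
Rhue's audit methodology, comparing average emotion scores across groups while holding smiling constant, is straightforward to outline in code. The sketch below assumes a hypothetical CSV of per-photo API outputs; the file name and column names are illustrative, not her actual dataset.

```python
# Hypothetical sketch of a Rhue-style bias audit of emotion-API scores.
import pandas as pd

# Placeholder file; assumed columns: race, smile, anger, contempt (per photo).
scores = pd.read_csv("emotion_api_scores.csv")

# Average negative-emotion score by group.
scores["negative"] = scores["anger"] + scores["contempt"]
print(scores.groupby("race")["negative"].mean())

# Check whether the gap persists at comparable smile intensities.
scores["smile_band"] = pd.cut(scores["smile"], bins=4)
print(scores.groupby(["smile_band", "race"], observed=True)["negative"].mean())
```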

There remains little to no evidence that these new affect-recognition products have any scientific validity. In February, researchers at Berkeley found that detecting emotions with accuracy and high agreement requires context beyond the face and body.

Others at the University of Southern California called for a pause in the use of some emotion analytics techniques at the 8th International Conference on Affective Computing and Intelligent Interaction in 2019. "This facial expression recognition technology is picking up on something — it's just not very well correlated with what people want to use it for. So they're just going to be making errors, and in some cases, those errors cause harm," said Professor Jonathan Gratch.

A major review released this summer found that efforts to "read out" people's internal states from an analysis of facial movements alone, without considering context, are at best incomplete and at worst entirely lacking in validity. After reviewing over a thousand studies on emotion expression, the authors found that, although these technologies claim to detect emotional state, they actually achieve a much more modest outcome: detecting facial movements.

As the study shows, there is a substantial amount of variance in how people communicate their emotional state across cultures, situations, and even across people within a single situation. Moreover, the same combination of facial movements—a smile or a scowl, for instance—can express more than a single emotion. The authors conclude that "no matter how sophisticated the computational algorithms . . . it is premature to use this technology to reach conclusions about what people feel on the basis of their facial movements."
