Emotion Detection AI

What Is Emotion Detection AI?

Artificial intelligence is the simulation of human intelligence by machines. As the field has progressed, it has branched into many forms of technology, and a prime example is emotion detection AI: the use of artificial intelligence to infer the mental state of an individual. More specifically, these systems attempt to pinpoint and identify the emotions a person is feeling by observing their facial expressions, whether from a video, a live feed, or a still image. With video, they can step through the footage frame by frame, tracking changes in facial expression and the events leading up to each change in order to “read” emotions more accurately. The systems do this by observing and measuring a person’s facial movements and expressions and matching them against an idea developed decades ago.
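As a rough sketch of that frame-by-frame process, the Python example below steps through a video with OpenCV, detects a face in each frame, and hands the cropped face to a classifier. The classify_emotion function is a hypothetical placeholder, not a real model; the OpenCV calls themselves are standard.

```python
import cv2

# Haar cascade face detector that ships with OpenCV.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_emotion(face_image):
    # Hypothetical stand-in: a real system would run the cropped face
    # through a model trained on Ekman-style categories (anger, disgust,
    # fear, happiness, sadness, surprise).
    return "placeholder"

def read_emotions(video_path):
    """Step through a video frame by frame and label each detected face."""
    capture = cv2.VideoCapture(video_path)
    labels = []
    while True:
        ok, frame = capture.read()
        if not ok:  # end of video
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
            face = gray[y:y + h, x:x + w]  # crop the detected face region
            labels.append(classify_emotion(face))
    capture.release()
    return labels
```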

The idea that facial expressions map directly to emotions was developed in the 1960s by Paul Ekman. He proposed that facial expressions are universal and defined six basic emotions: anger, disgust, fear, happiness, sadness, and surprise. He also believed that microexpressions alone can tell you a lot about a person. Building on this concept, later emotion-based technologies, such as emotion detection AI, were designed to detect these specific emotions simply by analyzing a face. However, this method is not guaranteed to work.

Accuracy of Emotion Detection AI

The idea that emotions can be predicted solely from someone’s facial expressions is false. A person’s facial expression is not always what they feel on the inside. Correctly inferring a person’s emotion takes more than just a face, especially since people display their emotions differently. As stated in a study by Andrew McStay, a key reason this method of emotion recognition is inaccurate is that similar configurations of facial movements are expressed in more than one emotion category. Movements like raised eyebrows and squinting eyes can therefore be matched to more than one emotion. For example, labeling someone with squinting eyes and low, arched eyebrows as “angry” can be entirely wrong: even though the expression matches what we associate with anger, outside observers cannot actually know what the person is truly feeling.
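To make this overlap concrete, the toy Python lookup below (invented for illustration, not taken from any real system) lists the emotion categories that the same facial configuration can plausibly signal. The overlap between entries is exactly what forces a face-only system to guess.

```python
# Toy, made-up mapping: each facial configuration lists every emotion
# category it can plausibly accompany. The overlap is the point.
CONFIGURATIONS = {
    ("lowered brows", "squinted eyes"): ["anger", "concentration", "confusion"],
    ("raised brows", "widened eyes"): ["surprise", "fear"],
    ("raised brows", "squinted eyes"): ["skepticism", "amusement"],
}

def possible_emotions(movements):
    """Return every emotion category consistent with the observed movements."""
    return CONFIGURATIONS.get(tuple(movements), ["unknown"])

# A face-only system must commit to one label even though several fit.
print(possible_emotions(["lowered brows", "squinted eyes"]))
# -> ['anger', 'concentration', 'confusion']
```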

This idea that certain facial expressions or movements belong to certain emotions is called an emotional stereotype, and it is currently the main problem with these technologies. Accurately predicting emotions requires context about the specific situation. Gathering that context may force these methods to become more invasive, since the system needs background information about what is happening during the time span in which an emotion is being analyzed.
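A minimal sketch of what adding context could look like: the invented context cues below narrow the set of face-based candidate emotions. Both the candidate set and the cue labels are hypothetical; a real system would need far richer, and more invasive, situational data.

```python
# Invented example: the same facial configuration, disambiguated by context.
FACE_CANDIDATES = {"anger", "concentration", "confusion"}  # from face alone

# Hypothetical situational cues and the emotions they make likely.
CONTEXT_HINTS = {
    "reading fine print": {"concentration"},
    "argument in progress": {"anger"},
    "unclear instructions": {"confusion"},
}

def infer_with_context(candidates, situation):
    """Keep only the face-based candidates the situation makes plausible."""
    likely = CONTEXT_HINTS.get(situation)
    if likely is None:  # no context -> stuck with every candidate
        return sorted(candidates)
    narrowed = candidates & likely
    return sorted(narrowed) if narrowed else sorted(candidates)

print(infer_with_context(FACE_CANDIDATES, "reading fine print"))
# -> ['concentration']
print(infer_with_context(FACE_CANDIDATES, "argument in progress"))
# -> ['anger']
```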

“How people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation.” 

– a team at Northeastern University and Massachusetts General Hospital