Different Applications

This artificial intelligence has spread from in-house research facilities to wearables, human-robot interaction, education, retail, employee behavior, border control, and other online and physical contexts (McStay, 2020). It is believed that these technologies make it possible to use emotional states to enhance our interactions with different devices and even places.

Improving Shopping Experiences

Emotion detection AI has been widely used to give marketers feedback on customers' emotional reactions so that they know what to advertise and how to advertise it. With emotion detecting AI, marketers can analyze and store data about where in a store customers show the most signs of happiness and use that information to make shopping a better experience. One successful example is a study by Gonchigiav (2020), who argued that emotion reading AI can improve supermarket sales by observing how shoppers appear to feel depending on which aisle they are in or what they are looking at.

Using emotion reading AI, Gonchigiav was able to identify which techniques positively stimulate customers' interest and emotions. In this case, emotion reading AI proved a helpful tool for verifying the effectiveness of marketing strategies within individual store departments, improving the grocery shopping experience and increasing sales.
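
To make the idea concrete, here is a minimal sketch of how per-aisle emotion data might be aggregated, assuming a camera system that logs a happiness score for each customer sighting. The aisle names, scores, and record format are all invented for illustration and are not from Gonchigiav's study.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical log entries an in-store camera system might produce:
# (aisle, happiness_score) for each customer sighting. The aisles and
# scores are invented for illustration, not data from the study.
observations = [
    ("produce", 0.82), ("produce", 0.75),
    ("frozen", 0.40), ("frozen", 0.55),
    ("bakery", 0.91),
]

# Group the happiness scores by aisle, then average them.
scores_by_aisle = defaultdict(list)
for aisle, score in observations:
    scores_by_aisle[aisle].append(score)

# Rank aisles from most to least positive average reaction.
averages = {aisle: mean(s) for aisle, s in scores_by_aisle.items()}
for aisle, avg in sorted(averages.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{aisle}: {avg:.2f}")
```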

Assistive Technology for Autism 

Emotion detection AI has been used as an assistive technology for people, especially children, with disabilities, specifically autism spectrum disorder (ASD). ASD is a common disorder that impairs the ability to interact and communicate with others, making it difficult to connect facial features to emotions and to respond appropriately to other people's emotions. Reading and responding to others' feelings is a skill that many people with ASD struggle with, and they may display emotions that do not match the situation. As Akansha Singh and Surbhi Dewan state in their study "AutisMitr: Emotion Recognition Assistive Tool for Autistic Children," "They may show signs of joy when someone is hurt, or they respond with no emotions whatsoever. Thus, the inability to respond appropriately to others' emotions may create appearances that autistic people don't feel emotions." The idea that people with autism don't feel emotions is entirely wrong and can give rise to harmful beliefs; they do feel emotions but have difficulty expressing them in expected ways.

This is where a combination of face detection and emotion detection AI comes into play. The system not only identifies faces in a picture but also estimates the emotions displayed based on facial features. As shown in the study, this has been used in videos and games that teach children with autism about different emotions while simultaneously tracking their progress: after the videos and games, the system checks whether the children are better able to match feelings to the emotions appropriate for specific situations than they were before. This assistive technology has helped people with ASD perform daily life activities and made understanding emotions a little less difficult.
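
As an illustration of that detect-face-then-score-emotions step, the short sketch below uses the open-source `fer` Python package, which bundles a face detector with a pretrained expression classifier. This is not the tool from Singh and Dewan's study, just one readily available way to reproduce the basic pipeline; the image path is a placeholder.

```python
import cv2
from fer import FER

# "photo.jpg" is a placeholder path, not an image from the study.
image = cv2.imread("photo.jpg")

# FER bundles a face detector with a pretrained expression classifier.
detector = FER()

# Each detection contains a face bounding box and a score for each of
# seven emotions (angry, disgust, fear, happy, sad, surprise, neutral).
for face in detector.detect_emotions(image):
    emotions = face["emotions"]
    dominant = max(emotions, key=emotions.get)
    print(f"face at {face['box']}: {dominant} ({emotions[dominant]:.2f})")
```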

Providing Video Game Feedback

Emotion detection AI has also been used to detect the emotions that video games spark in players. With this AI, developers can observe which emotions a person is experiencing in real time as they play. This matters because every video game tries to provoke a specific set of emotions and behaviors from players, so developers use these technologies to test whether users actually show the emotions the game is aiming for. During the testing phase, playtesters are asked to play the game for a set amount of time while their emotional expressions are monitored, and the resulting feedback lets developers adjust the game toward its intended emotional effect.
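
A minimal sketch of what such a playtest monitor might look like, assuming a webcam pointed at the tester and the same `fer` package as above. The one-second sampling rate and 60-second session length are arbitrary choices for illustration, not anything a studio has published.

```python
import time

import cv2
from fer import FER

# Sample the playtester's webcam once per second and log the dominant
# emotion with a timestamp. The 60-second session and 1 Hz sampling
# rate are arbitrary choices for this sketch.
detector = FER()
camera = cv2.VideoCapture(0)  # default webcam
log = []

start = time.time()
while time.time() - start < 60:
    ok, frame = camera.read()
    if not ok:
        break
    emotion, score = detector.top_emotion(frame)  # (None, None) if no face found
    if emotion is not None and score is not None:
        log.append((round(time.time() - start, 1), emotion, score))
    time.sleep(1)

camera.release()
for timestamp, emotion, score in log:
    print(f"t={timestamp}s  {emotion}  ({score:.2f})")
```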

Pain Assessment

It is also believed that this technology can be used in hospitals as a way to assess pain. The idea is to help patients who are unable to show when they are in pain, or who are too reluctant to say anything. With this technology, clinicians could identify the exact moments when someone feels discomfort during a medical procedure and modify their practice in the patient's favor; in other words, it could lead professionals to switch to a method of treatment that suits the patient better. If a clinician sees a patient displaying high levels of uneasiness during treatment, they will know the patient is experiencing distress.
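
To illustrate the flagging step, here is a minimal sketch that assumes a hypothetical model has already produced a per-frame "discomfort" score between 0 and 1. The scores and the 0.6 cutoff are invented; a real clinical threshold would need validation.

```python
# Invented (timestamp_seconds, discomfort_score) pairs; in a real system
# these would come from a model scoring each video frame of the patient.
frame_scores = [
    (0.0, 0.10), (1.0, 0.15), (2.0, 0.72),
    (3.0, 0.80), (4.0, 0.20), (5.0, 0.65),
]

THRESHOLD = 0.6  # arbitrary cutoff; a clinical threshold would need validation

# Flag the moments where the score crosses the threshold so a clinician
# could review them and adjust the procedure.
for timestamp, score in frame_scores:
    if score >= THRESHOLD:
        print(f"possible discomfort at t={timestamp}s (score {score:.2f})")
```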

On the other hand, a major problem is the possibility of this AI being completely wrong and inaccurate. That would cause serious misunderstandings between patient and doctor, and it could lead to unnecessary interventions that waste both of their time.

School Applications

This AI has also been used in schools to try to support learning and to facilitate and regulate students' behaviors and moods. In practice, the AI was used to flag suspicious behavior on video surveillance. The belief was that with this technology, adults in the school would be more aware of how their students feel, check in on them, and make sure they are emotionally stable. The AI was also deployed as a way to make sure students stayed focused and engaged in lessons.

Jordan Harrod, in a YouTube video called "Can AI Detect Your Emotion?", explains why using emotion detection AI in schools is a bad idea. She describes how these systems are used to flag students for supposedly suspicious behavior. ProctorU recently stopped using this kind of AI out of concern that teachers were not actually reviewing the flagged footage and were instead using it to unfairly penalize students. In other words, the use of this AI in classrooms made it harder for students to learn, added stress to their daily school lives, and gave teachers and administrators a tool they could misuse to target students.

Workplace Usage

It is believed that having this technology in work settings makes it possible to judge risks, facilitate and regulate behaviors and moods, and even improve performance (McStay, 2020). For employees, these systems check that workers maintain positive attitudes throughout the day and work efficiently. What employers do with this data is entirely under their control, and most of the time workers have no idea their emotions are being monitored. Employers use it as a way to track productivity, and the data has even factored into decisions about whether employees deserve raises or promotions.

This technology is used even before someone is hired. It is applied to virtual interviews, where employers analyze the footage to evaluate potential job candidates and decide whether someone gets the job. According to the Atlantic article "Artificial Intelligence Is Misreading Human Emotion," "In 2014, the company launched its AI system to extract micro-expressions, tone of voice, and other variables from video job interviews, which is used to compare job applicants against a company's top performers." In other words, applicants are set side by side with current employees and evaluated by the technology to see whether they show similar attitudes. Job applicants are judged on the emotions they convey through their facial expressions or voice and compared against higher-performing employees to check for similar potential; if they fail to display similar attitudes, they are automatically deemed unqualified for the job.
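
The Atlantic's description suggests a similarity comparison. Below is a minimal sketch of that idea, assuming each interview has already been reduced to a numeric feature vector (expression, tone, and so on). The vectors, the averaging scheme, and the 0.8 cutoff are all invented for illustration; the article does not describe the company's actual scoring method.

```python
import numpy as np

# Invented feature vectors: each row summarizes one top performer's
# interview (e.g., smile frequency, vocal energy, speech pace).
top_performers = np.array([
    [0.9, 0.2, 0.7],
    [0.8, 0.3, 0.6],
])
applicant = np.array([0.4, 0.9, 0.1])

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Score the applicant by average similarity to the top-performer rows.
score = np.mean([cosine_similarity(applicant, tp) for tp in top_performers])
print(f"similarity to top performers: {score:.2f}")
print("passes screen" if score >= 0.8 else "flagged as dissimilar")  # 0.8 is arbitrary
```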

Using this AI to judge someone's ability to work is unethical and unfair. It causes people to lose job opportunities based on unrealistic and inaccurate data, which is deeply harmful to people who are simply trying to get a job. Basing a hiring decision on how someone's face looks at every moment makes no sense, since displaying emotion is something that happens naturally.

This AI is being used to monitor not only workers but customers as well. It is trusted with the task of detecting potential shoplifters by tracking and analyzing facial cues, and customers have been stopped in stores and questioned for suspicious behavior based on what the technology detects.

“These are the people who will bear the costs of systems that are not just technically imperfect, but based on questionable methodologies.”

– Kate Crawford, The Atlantic

“There is no good evidence that facial expressions reveal a person’s feelings. But big tech companies want you to believe otherwise.”

– Kate Crawford, The Atlantic
