Role of Artificial Intelligence in Emotion Recognition


International Journal for Research in Applied Science & Engineering Technology (IJRASET)
ISSN: 2321-9653; IC Value: 45.98; SJ Impact Factor: 7.538
Volume 11, Issue V, May 2023 - Available at www.ijraset.com
https://doi.org/10.22214/ijraset.2023.52314

Role of Artificial Intelligence in Emotion Recognition
Anjana C M
Assistant Professor, Department of Software Development, St. Albert’s College (Autonomous), Ernakulam, India

Abstract: Emotion recognition is the task of machines analyzing, interpreting, and classifying human emotion through the
analysis of facial features. AI emotion recognition is an active field of computer vision research that involves facial emotion
detection and the automatic assessment of sentiment from visual data. Human-machine interaction is an important area of
research in which artificially intelligent systems with visual perception aim to understand human interaction. AI emotion
recognition leverages machine learning, deep learning, computer vision, and other technologies to recognize emotions based on
object and motion detection. This paper presents the role of AI in emotion recognition. In this context, the machine treats the
human face as an object. Through computer vision, the machine can observe facial features such as the mouth, eyes, and
eyebrows, note their positions, and track their movements over time. It then compares the captured movement data with
already learned emotions.
Keywords: Artificial Intelligence, Action Unit, Affective computing, Emotion Recognition, Sentiment Analysis

I. INTRODUCTION
Human emotion recognition has attracted the interest of many researchers in the field of artificial intelligence. The emotions on
a human face say much about our thought process and give a glimpse of what is going on inside the mind. Real-time emotion
recognition aims to give machines a human-like ability to recognize and analyse human emotions. Emotion AI, also called
affective computing, is a rapidly growing branch of artificial intelligence that allows computers to analyze and understand human
nonverbal signals such as facial expressions, body language, gestures, and voice tones in order to assess a person's emotional
state. Visual Emotion AI, in particular, analyses facial appearance in images and videos using computer vision technology to
assess an individual's emotional status. At its core, this technology aims to make human-machine interactions more natural and
authentic. It is based on the universal emotion theory, which claims that all humans, regardless of demographics or nationality,
display six internal emotional states using the same facial movements as a result of their shared evolutionary and biological origins.
These basic emotional states are happiness, fear, anger, surprise, disgust, and sadness.
Affective computing can detect people's feelings through their tone of voice, text, gestures, and facial expressions and adjust the
system's behaviour accordingly. Its algorithms achieve this level of emotion interpretation by employing various technologies and
techniques such as speech science, computer vision, and deep learning. Affective computing systems identify each facial
movement as an Action Unit (AU) and then link combinations of AUs to specific emotions. For instance, if the machine observes
both the AU for an 'upside-down smile' and the AU for a 'wrinkled forehead,' it can conclude that the person is sad. By combining
these basic classifications, an advanced emotion detector can identify more complex feelings, adding to the system's dependability.
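To make this combination step concrete, the following is a minimal Python sketch of mapping detected AUs to a basic emotion label. The AU names and the mapping table are simplified assumptions for illustration, not an authoritative FACS coding.

```python
# Illustrative sketch: combining detected Action Units (AUs) into a basic
# emotion label. AU names and the mapping table are simplified assumptions.

AU_TO_EMOTION = {
    frozenset({"lip_corner_depressor", "brow_lowerer"}): "sad",   # 'upside-down smile' + 'wrinkled forehead'
    frozenset({"lip_corner_puller", "cheek_raiser"}): "happy",
    frozenset({"brow_raiser", "jaw_drop"}): "surprise",
}

def classify_emotion(detected_aus: set) -> str:
    """Return the first emotion whose full AU combination is present."""
    for au_combo, emotion in AU_TO_EMOTION.items():
        if au_combo <= detected_aus:   # all AUs in the combination were detected
            return emotion
    return "neutral"                   # fall back when no combination matches

print(classify_emotion({"lip_corner_depressor", "brow_lowerer", "blink"}))  # -> sad
```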

II. HOW AI EMOTION ANALYSIS WORKS


Emotion identification works by examining body and facial movements. Vision systems typically identify facial movements and
infer the emotional state from them. For example, raised eyebrows and an arched mouth, in short, a smile, represent the emotion
of happiness. Emotion recognition technologies are being introduced into many areas of life, and it seems likely that machines,
home appliances, and cars will respond to our mood in the near future. Emotion recognition systems usually learn the connection
between emotion and external signals from large sets of labelled data. This data can cover many situations from everyday life; for
example, it can consist of TV productions, radio and podcast recordings, interviews, human experiments, theatre performances or
movie scenes, and dialogues performed by professional actors. Scientists have worked on visual AI and emotion recognition
methods for decades, developing and evaluating many different approaches. Visual AI is a technology that captures, analyzes, and
compares patterns based on an individual's facial details.


At a high level, an AI emotion application or vision system involves several steps. First, video from a camera is used to detect and
localize the human face, with bounding-box coordinates indicating the exact face location in real time. Face detection remains
challenging, and it is not guaranteed that all faces will be detected in a given input image, especially in uncontrolled environments
with difficult lighting conditions, varied head poses, large distances, or occlusion.
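As a minimal sketch of this first step, the snippet below uses OpenCV's bundled Haar-cascade detector; the paper does not prescribe a specific detector, so this choice is an assumption.

```python
# Minimal face-detection sketch using OpenCV's bundled Haar cascade.
import cv2

cap = cv2.VideoCapture(0)  # default camera; a video file path also works
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

ret, frame = cap.read()
if ret:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Each detection is a bounding box (x, y, width, height) locating a face.
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cap.release()
```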
Once faces are detected, the image data is optimized before being fed into the emotion classifier; this step greatly improves
detection accuracy. Pre-processing usually includes multiple sub-steps that normalize the image for illumination changes, reduce
noise, smooth the image, correct rotation, and resize and crop the image.
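The following sketch applies several of these sub-steps to one detected face crop (rotation correction is omitted for brevity). The 48x48 target size is an assumption, a size commonly used by facial-expression datasets such as FER2013, not something mandated by the text.

```python
# Sketch of the pre-processing sub-steps applied to one detected face crop.
import cv2
import numpy as np

def preprocess_face(frame, box, size=(48, 48)):
    x, y, w, h = box
    face = frame[y:y + h, x:x + w]                 # image cropping
    face = cv2.cvtColor(face, cv2.COLOR_BGR2GRAY)  # drop colour channels
    face = cv2.equalizeHist(face)                  # normalize for illumination changes
    face = cv2.GaussianBlur(face, (3, 3), 0)       # noise reduction / smoothing
    face = cv2.resize(face, size)                  # image resizing
    return face.astype(np.float32) / 255.0         # scale pixel values to [0, 1]
```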
After pre-processing, the relevant features are extracted from the data containing the detected faces. There are different methods
for detecting facial features, for example Action Units (AUs), the motion of facial landmarks, distances between facial landmarks,
gradient features, and facial texture. The classifiers used for AI emotion recognition are generally based on Support Vector
Machines (SVMs) or Convolutional Neural Networks (CNNs). Finally, the recognized human face is classified by facial expression,
that is, assigned a pre-defined class (label) such as "happy" or "neutral."
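A minimal CNN emotion classifier in this spirit can be sketched with tf.keras as below. The architecture and the seven labels are illustrative assumptions, not the specific model evaluated in any of the systems discussed here.

```python
# Minimal CNN emotion-classifier sketch (architecture is an assumption).
import tensorflow as tf

LABELS = ["happy", "sad", "angry", "surprise", "disgust", "fear", "neutral"]

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(48, 48, 1)),   # pre-processed grayscale face
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(len(LABELS), activation="softmax"),  # one score per class
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# After training, the highest-scoring class gives the predicted label:
# probs = model.predict(face_batch); label = LABELS[probs.argmax(axis=-1)[0]]
```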

III. TECHNOLOGIES FACILITATING AI EMOTION RECOGNITION


There is no universal solution when it comes to affective computing. Developers have to choose the most suitable technology for the
task at hand or create a new approach. The technologies are primarily based on either machine learning algorithms or deep learning
networks.
Machine learning algorithms analyze data in a controlled environment, learn its characteristics, and then recognize similar data
in real-world situations. The nature of these algorithms means they rely on human intervention and a lot of structured data to
produce accurate AI emotion analysis.
Deep learning models, on the other hand, try to mimic the way the human brain works. These systems typically consist of stacked
layers, each of which interprets the data differently in order to learn from it.
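The sketch below illustrates the classical side of this contrast: an SVM trained on hand-crafted features (for example, distances between facial landmarks), as opposed to the CNN above, which learns features directly from pixels. The placeholder feature array and labels are assumptions for demonstration only.

```python
# Classical ML sketch: SVM on hand-crafted (structured) features.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))     # placeholder landmark-distance features
y = rng.integers(0, 6, size=200)   # placeholder labels for 6 basic emotions

clf = SVC(kernel="rbf")            # the SVM needs structured features up front
clf.fit(X, y)
print(clf.predict(X[:3]))          # predicted emotion indices
```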

IV. ARTIFICIAL INTELLIGENCE IN EMOTION RECOGNITION


Over the past few years, AI emotion recognition vendors have ventured into most major industries, and many large organizations
now leverage these technologies to enhance customer experience and to revolutionize their data-collection strategies. Facial
emotion detection systems have a wide range of uses and are expected to become widespread in the near future in areas such as
banking, online shopping, computer and video games, and the service sector. The market for emotion recognition is growing
accordingly: emotion recognition technology is now a multi-billion-dollar industry that aims to use artificial intelligence to detect
emotions from facial expressions. The global emotion detection and recognition market was valued at USD 32.95 billion in 2021
and is expected to expand at a compound annual growth rate (CAGR) of 16.7% from 2022 to 2030.
One notable project in visual artificial intelligence is Cameralyze, which enables the identification of emotions from people's
facial expressions; its no-code visual intelligence system offers fast and flexible operation. Emotion recognition algorithms can
help marketers determine which ads resonate best with their target audience. With this information, they can better decide which
features to include in their advertisements to promote engagement and boost conversions.
Organizations can deploy AI emotion recognition technologies in their call centers to enhance customer service. AI-powered
affective computing algorithms can pick the best-fitting customer care agent for a specific client, give real-time feedback on the
customer's emotional state, and respond appropriately to a frustrated customer. Chatbots equipped with affective computing
technologies can also streamline service flows by taking customers' emotions into account. For instance, if the system determines
that the customer is angry, it can switch to a different escalation flow or direct the conversation to a human customer care agent.
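As a purely hypothetical sketch, the escalation logic just described might look like the following; the function, emotion names, and session interface are all assumptions for illustration, not a real chatbot API.

```python
# Hypothetical routing sketch for emotion-aware chatbot escalation.
def route_conversation(detected_emotion: str, session):
    if detected_emotion == "angry":
        session.transfer_to_human_agent()          # hand off to a human agent
    elif detected_emotion in ("sad", "frustrated"):
        session.use_flow("empathetic_support")     # de-escalation flow
    else:
        session.use_flow("standard_support")       # default service flow
```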
Numerous players in the health sector deploy AI emotion recognition technologies to help both patients and doctors. Emotion AI
technologies can monitor patients' emotions during surgical procedures and in examination rooms. Likewise, doctors can pair the
technology with voice assistants to detect stress levels in patients and respond accordingly. Companies dealing with mental health
issues can also deploy emotion recognition technologies to detect suicidal ideation and alert emergency responders. For instance,
Facebook has deployed emotion recognition software that monitors users' posts for signs of suicidal ideation and alerts local
authorities, helping to prevent potential suicides.
Specially designed emotion recognition software can gauge and adjust to learners' emotions. For instance, if a learner displays
signs of frustration because a task is too difficult or too easy, the learning software adjusts the task accordingly, making it more or
less challenging. Some learning software can also help autistic children recognize other people's emotions.


Several insurers and automotive manufacturers use computer vision technology paired with AI emotion recognition software to
assess the emotional state of the driver. If the driver displays signs of extreme emotion or drowsiness, the software notifies the
company, which then acts accordingly.
Ever-increasing traffic problems and the high speed of cars make artificial intelligence increasingly important in the automotive
sector, where it is used in driver monitoring, driving assistance, and roadside assistance systems. Drivers' attention decreases over
a long journey, and this brings many problems with it. Automotive artificial intelligence has therefore been developed for driver
monitoring systems (DMS) that prevent accidents caused by fatigue, insomnia, and similar conditions in drivers who drive for long
hours. Many automotive brands have been working on autonomous vehicle technologies for years, and this work facilitates the
integration of artificial intelligence systems into the automotive sector today. Driver monitoring systems developed with the help
of artificial intelligence prevent not only accidents but also possible theft, physical violence, and extortion: these systems identify
the drivers of vehicles and immediately notify the owner, or the company to which the vehicle belongs, when an unknown driver
takes the driver's seat. Facial emotion detection, an application that automotive companies did not foresee when developing
autonomous vehicle systems, is now used in driver monitoring. AI-driven DMS solutions can analyze drivers' behaviour and take
immediate measures; for example, driver monitoring can detect anxiety or anger and suggest alternative solutions.
There is a growing demand for AI emotion analysis in the AI and computer vision market. While it is not yet in large-scale use,
solving visual emotion analysis tasks is expected to greatly impact real-world applications. Sentiment analysis and emotion
recognition are key tasks in building empathetic systems and human-computer interaction based on user emotion. Since deep
learning solutions were originally designed for servers with effectively unlimited resources, deploying them to edge devices
(Edge AI) remains a challenge. However, real-time inference in emotion recognition systems is what allows large-scale solutions
to be implemented.

V. CONCLUSION
Emotion recognition technologies are vital to building empathetic computer systems and improving human-computer interactions
based on the users' emotions. With visual artificial intelligence and emotion identification systems, many problems in areas such
as security, shopping, and socialization can be prevented. Visual artificial intelligence and facial emotion detection make life
easier in many domains, meeting users' needs in areas ranging from security to human resources and from healthcare to banking.
However, despite its numerous benefits in real-world applications, the technology faces several hurdles in terms of bias and
privacy concerns. As ML algorithms become smarter, bias may become a thing of the past, but privacy remains a major concern.
Interest in facial emotion recognition is growing steadily, and new algorithms and methods are being introduced. Recent advances
in supervised and unsupervised machine learning have brought breakthroughs to the research field, and more and more accurate
systems emerge every year. However, even though progress is considerable, emotion detection remains a very big challenge.


