Role of Artificial Intelligence in Emotion Recognition
https://doi.org/10.22214/ijraset.2023.52314
International Journal for Research in Applied Science & Engineering Technology (IJRASET)
ISSN: 2321-9653; IC Value: 45.98; SJ Impact Factor: 7.538
Volume 11 Issue V May 2023- Available at www.ijraset.com
Abstract: Emotion recognition is the task of a machine analyzing, interpreting, and classifying human emotion through the
analysis of facial features. AI emotion recognition is a very active field of computer vision research that involves facial
emotion detection and the automatic assessment of sentiment from visual data. Human-machine interaction is an important area
of research in which artificially intelligent systems with visual perception aim to understand human behaviour. AI
emotion recognition leverages machine learning, deep learning, computer vision, and related technologies to recognize emotions
based on object and motion detection. This paper presents the role of AI in emotion recognition. In this setting, the machine treats
the human face as an object: through computer vision, it observes facial features such as the mouth, eyes, and
eyebrows, notes their positions, and tracks their movements over time. It then compares the captured movement data
with emotions it has already learned.
Keywords: Artificial Intelligence, Action Unit, Affective computing, Emotion Recognition, Sentiment Analysis
I. INTRODUCTION
Human emotion recognition has attracted the interest of many researchers in the field of artificial intelligence. The emotions on
a human face say a great deal about a person's thought process and give a glimpse of what is going on inside the mind. Real-time emotion
recognition aims to give machines a human-like ability to recognize and analyse human emotions. Emotion AI, also called
affective computing, is a rapidly growing branch of artificial intelligence that allows computers to analyze and understand human
nonverbal signals such as facial expressions, body language, gestures, and voice tones in order to assess a person's emotional state. Visual
emotion AI, in particular, analyses facial appearance in images and videos using computer vision technology to infer an individual's emotional
status. At its core, this technology aims to make human-machine interactions more natural and authentic. It
is based on the universal emotion theory, which claims that all humans, regardless of demographics or nationality,
display six internal emotional states using the same facial movements as a result of their shared evolutionary and biological origins.
These basic emotional states are happiness, fear, anger, surprise, disgust, and sadness.
Affective computing systems can detect people's feelings through their voice tone, text, gestures, and facial expressions, and adjust their
behaviour accordingly. They achieve this level of emotion interpretation by combining technologies and
techniques such as speech science, computer vision, and deep learning. Affective computing technologies identify each relevant
facial movement as an Action Unit (AU) and then link combinations of AUs to specific emotions. For instance, if the machine observes both the AU 'upside-
down smile' and the AU 'wrinkled forehead,' it can conclude that the person is sad. By combining these basic classifications, an
advanced emotion detector can identify more complex feelings, thus adding to the system's dependability.
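The AU-to-emotion mapping described above can be sketched as a simple rule table. The AU names and combinations below are illustrative assumptions for the sake of the example, not a complete FACS coding:

```python
# Illustrative rule table mapping combinations of Action Units (AUs) to
# basic emotions. The AU names are simplified placeholders, not real
# FACS codes; a production system would use trained classifiers instead.
AU_RULES = {
    frozenset({"upside-down smile", "wrinkled forehead"}): "sadness",
    frozenset({"raised eyebrows", "open mouth"}): "surprise",
    frozenset({"smile", "raised cheeks"}): "happiness",
}

def classify_from_aus(observed_aus):
    """Return the first emotion whose full AU combination was observed."""
    observed = set(observed_aus)
    for combo, emotion in AU_RULES.items():
        if combo <= observed:  # every AU in the rule was detected
            return emotion
    return "neutral"  # no rule matched

print(classify_from_aus(["upside-down smile", "wrinkled forehead"]))  # sadness
```

A rule table like this also shows why combining AUs improves dependability: a single AU (e.g. "smile" alone) matches no rule and falls back to "neutral" rather than forcing a guess.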
At a high level, an AI emotion recognition application or vision system includes several steps. In the first step, video from a camera is used to
detect and localize the human face, with bounding-box coordinates indicating the exact face location in real time. Face
detection is still a challenging task, and it is not guaranteed that every face in a given input image will be detected, especially in
uncontrolled environments with difficult lighting conditions, varied head poses, large distances, or occlusion.
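Because detection is not guaranteed, downstream code must handle zero, one, or many bounding boxes per frame. A minimal sketch of this step, with the detector itself (in practice an OpenCV cascade or a DNN model) abstracted as a callable and the confidence threshold an assumed parameter:

```python
def localize_faces(frame, detector, min_confidence=0.5):
    """Run a face detector on one frame and return usable bounding boxes.

    `detector` is any callable returning (x, y, w, h, confidence) tuples;
    in a real system it would wrap e.g. an OpenCV or DNN face detector.
    """
    boxes = []
    for (x, y, w, h, conf) in detector(frame):
        if conf >= min_confidence and w > 0 and h > 0:
            boxes.append((x, y, w, h))
    # May be empty: occlusion, pose, or lighting can defeat detection.
    return boxes

# Usage with a stubbed detector standing in for a real model:
fake_detector = lambda frame: [(10, 20, 64, 64, 0.9), (0, 0, 32, 32, 0.2)]
print(localize_faces(None, fake_detector))  # [(10, 20, 64, 64)]
```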
Once faces are detected, the image data is optimized before being fed into the emotion classifier; this step greatly improves
detection accuracy. Image pre-processing usually includes multiple sub-steps: normalizing the image for illumination changes,
reducing noise, smoothing, rotation correction, resizing, and cropping.
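Two of these sub-steps, resizing and illumination normalization, can be sketched on a raw grayscale crop. The nearest-neighbour resize and mean-based normalization below are illustrative choices; real pipelines typically use library routines such as OpenCV's:

```python
def preprocess_face(pixels, out_size=4):
    """Sketch of two pre-processing sub-steps on a square grayscale image
    given as a list of rows of ints (0-255): resize to out_size x out_size
    (nearest neighbour), then normalize for illumination (zero mean,
    values scaled into [-1, 1])."""
    n = len(pixels)
    # Nearest-neighbour resize.
    resized = [[pixels[i * n // out_size][j * n // out_size]
                for j in range(out_size)] for i in range(out_size)]
    # Illumination normalization: subtract mean, divide by max deviation.
    flat = [p for row in resized for p in row]
    mean = sum(flat) / len(flat)
    scale = max(abs(p - mean) for p in flat) or 1.0
    return [[(p - mean) / scale for p in row] for row in resized]
```

After this step, a dark and a bright image of the same face produce similar inputs to the classifier, which is the point of illumination normalization.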
After pre-processing, the relevant features are extracted from the data containing the detected faces. Many kinds of
facial features can be used: Action Units (AUs), the motion of facial landmarks, distances
between facial landmarks, gradient features, facial texture, and more. Generally, the classifiers used for AI emotion recognition are
based on Support Vector Machines (SVM) or Convolutional Neural Networks (CNN). Finally, the recognized human face is
classified by facial expression, assigning a pre-defined class (label) such as "happy" or "neutral."
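The final classification step can be sketched as a nearest-centroid classifier over a feature vector (standing in for, e.g., landmark-distance features). A real system would use an SVM or CNN trained on labelled data; the two-dimensional centroids below are invented purely for illustration:

```python
import math

# Invented per-class centroids in a toy 2-D feature space (a stand-in
# for real landmark-distance or texture features).
CENTROIDS = {
    "happy":   [0.9, 0.2],
    "neutral": [0.5, 0.5],
    "sad":     [0.1, 0.8],
}

def classify_expression(features):
    """Assign the pre-defined label whose centroid is closest in feature space."""
    return min(CENTROIDS, key=lambda label: math.dist(features, CENTROIDS[label]))

print(classify_expression([0.85, 0.25]))  # happy
```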
Several insurers and automotive manufacturers use computer vision paired with AI emotion recognition software to
assess the emotional state of the driver. If the driver shows signs of extreme emotion or drowsiness, the software notifies the
company, which then acts accordingly.
Ever-increasing traffic and high vehicle speeds make artificial intelligence increasingly important in the automotive
sector, where it is used in driver monitoring, driving assistance, and roadside assistance systems. Drivers' attention decreases
during a long journey, which brings many problems with it. Automotive artificial intelligence is developed for driver monitoring
systems to prevent accidents caused by fatigue, sleeplessness, and similar conditions in drivers who drive for long hours. Many
automotive brands have been working on autonomous vehicle technologies for many years, and this work
facilitates the integration of artificial intelligence systems into the automotive sector today. Driver monitoring systems developed
with the help of artificial intelligence help prevent not only accidents but also theft, physical violence, and extortion: they identify
the drivers of a vehicle and immediately notify the owner or the company to which the vehicle belongs when an unrecognized driver
takes the driver's seat. Facial emotion detection, a capability automotive companies did not foresee when developing autonomous
vehicle systems, is now used in driver monitoring systems (DMS). AI-driven DMS solutions can analyze driver behaviour and take
immediate measures; for example, driver monitoring can detect anxiety or anger and suggest alternative actions.
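The driver-monitoring behaviour described above can be sketched as a rule over a sliding window of per-frame emotion labels. The window length, threshold, and state names below are assumed values, not taken from any real DMS product:

```python
from collections import deque, Counter

class DriverMonitor:
    """Hypothetical DMS rule: alert when an extreme state dominates
    the most recent frames of per-frame emotion predictions."""
    ALERT_STATES = {"drowsy", "anger", "anxiety"}

    def __init__(self, window=10, threshold=0.6):
        self.frames = deque(maxlen=window)  # sliding window of labels
        self.threshold = threshold          # required fraction of frames

    def update(self, emotion_label):
        """Feed one per-frame label; return an alert string or None."""
        self.frames.append(emotion_label)
        state, count = Counter(self.frames).most_common(1)[0]
        if state in self.ALERT_STATES and count / len(self.frames) >= self.threshold:
            return f"alert: driver appears {state}"
        return None
```

Requiring the state to dominate a window, rather than reacting to a single frame, is the usual way to avoid false alarms from momentary misclassifications.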
There is growing demand for AI emotion analysis in the AI and computer vision market. While not yet widespread at large
scale, solving visual emotion analysis tasks is expected to greatly impact real-world applications. Sentiment analysis and
emotion recognition are key tasks for building empathetic systems and human-computer interaction driven by user emotion. Because deep
learning solutions were originally designed for servers with abundant resources, deploying them to edge devices (Edge AI) remains a
challenge; however, real-time on-device inference is what makes large-scale emotion recognition
deployments practical.
V. CONCLUSION
Emotion recognition technologies are vital to building empathetic computer systems and improving human-computer interaction
based on the user's emotions. With visual artificial intelligence and emotion identification systems, many problems in areas such as security,
shopping, and socialization can be mitigated. Visual artificial intelligence and facial emotion detection make life easier in
many areas, meeting users' needs in domains from security to human resources and from healthcare to
banking. But despite its numerous benefits in real-world applications, the technology faces several hurdles in terms of bias and
privacy. As ML algorithms improve, bias may diminish, but privacy remains a major concern. Interest
in facial emotion recognition continues to grow, and new algorithms and methods are regularly introduced. Recent
advances in supervised and unsupervised machine learning have brought breakthroughs to the research field, and increasingly accurate
systems emerge every year. Even so, despite considerable progress, emotion detection remains a very difficult challenge.