Automatic Emotion Detection Model From Facial Expression
Abstract—The human face plays a prodigious role in the automatic recognition of emotion, both for identifying human emotion and for human-computer interaction in real applications such as driver state surveillance, personalized learning, and health monitoring. Most reported facial emotion recognition systems, however, do not fully consider subject-independent dynamic features, so they are not robust enough for real-life recognition tasks involving subject (human face) variation, head movement, and illumination change. In this article we design an automated framework for emotion detection using facial expression. In human-computer interaction, facial expression provides a platform for non-verbal communication. Emotions are rapidly changing states evoked by an impelling force, so detecting emotion in real-life applications is a very challenging task. A facial expression recognition system must overcome the many sources of variability in the human face, such as color, orientation, expression, posture, and texture. In our framework we take frames from a live stream and process them using Gabor feature extraction and a neural network. To detect the emotion, facial attributes are extracted by principal component analysis and the different facial expressions are clustered by their respective emotions. Finally, to determine each facial expression separately, the processed feature vector is channeled through the learned pattern classifiers.

Index Terms—Face Detection, Gabor Feature Extraction, Neural Network, Facial Expressions, Emotion Recognition, Facial Attribute Extraction, Principal Component Analysis (PCA), Pattern Classification, K-means Clustering.

I. INTRODUCTION

The article introduced here concentrates on the creation of a smart framework with the inherent capability of inferring emotion from facial expressions. Recently, the notion of emotion recognition has been attracting researchers' attention in the areas of smart systems and human-computer interaction. Based on facial attributes, a facial expression can be classified into one of the six well-known fundamental emotions: sadness, disgust, happiness, fear, anger, and surprise [1]. Coren and Russel [1] stated that each emotion has the property of stereoscopic perceptual conflict, so establishing an effective automatic emotion recognition framework is a very challenging task.

Emotion recognition [2][3][5] is useful for smooth communication in human-computer interaction. The recognition of human emotion has wide applications in heterogeneous fields, mainly man-machine interaction, patient surveillance, and inspecting for antisocial motives. We can even recognize the emotion of customers by analyzing their response on seeing a certain commodity or advertisement, or immediately after receiving a message; based on that response, the resource hub can improve its strategies [1].

The first aim of this work is to take an anatomical grip on emotion recognition. Facial behavior is represented using the Facial Action Coding System (FACS). FACS couples transient appearance changes with the action of muscles from an anatomical perspective. FACS employs Action Units (AU), which represent the muscular activities that describe facial expressions. Generally, a single AU invokes a single muscle; however, in some scenarios two or more AUs [13] are used to express the relatively autonomous activity of several segments of one specific muscle. FACS has identified 46 Action Units in all, which deliver a multifaceted procedure for expressing a large variety of facial behavior [13].

The rest of the paper is organized as follows. Section II reports some of the influential work in the domain of emotional intelligence. Section III discusses emotion taxonomy. Section IV describes the dataset. Section V presents the methodology: frame extraction from a live stream and well-known neural-network face detection are touched on lightly, then PCA is discussed in detail and K-means is applied with a small modification for clustering, including a flow chart and the output of each individual step. Section VI presents the results and experimental analysis. Finally, Sections VII and VIII present the extension of our work and the conclusion.

II. RELATED WORK

In the emotional recognition of faces, notable advancement has been observed in the fields of neuroscience, cognitive science, and computational intelligence [1][5][6]. It has also been argued by Kharat and Dudul that about 55% of the overall effect of emotion expression is conveyed as facial expression during social interactions.
ISBN No. 978-1-4673-9545-8
2016 International Conference on Advanced Communication Control and Computing Technologies (ICACCCT)
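The AU-based coding described in the Introduction can be made concrete with a small lookup table. The AU combinations below are common EMFACS-style associations from the FACS literature, not values given in this paper, and the scoring helper is a hypothetical sketch.

```python
# Illustrative (not from the paper): common EMFACS-style mappings from
# FACS Action Unit combinations to the six basic emotions.
AU_TO_EMOTION = {
    frozenset({6, 12}): "happiness",         # cheek raiser + lip corner puller
    frozenset({1, 4, 15}): "sadness",        # inner brow raiser + brow lowerer + lip corner depressor
    frozenset({1, 2, 5, 26}): "surprise",    # brow raisers + upper lid raiser + jaw drop
    frozenset({4, 5, 7, 23}): "anger",       # brow lowerer + lid tighteners + lip tightener
    frozenset({9, 15}): "disgust",           # nose wrinkler + lip corner depressor
    frozenset({1, 2, 4, 5, 20, 26}): "fear", # brow actions + lid raiser + lip stretcher + jaw drop
}

def emotion_from_aus(active_aus):
    """Return the emotion whose AU set best overlaps the detected AUs."""
    best, best_score = None, 0.0
    for aus, emotion in AU_TO_EMOTION.items():
        score = len(aus & active_aus) / len(aus)
        if score > best_score:
            best, best_score = emotion, score
    return best

print(emotion_from_aus({6, 12}))  # happiness
```

A real FACS-based recognizer would of course detect the active AUs from imagery first; this table only illustrates the AU-to-emotion association step.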
Actually, the facial muscles generate momentary adaptations in facial appearance which can be recapitulated by Action Units. The six common emotions are considered globally recognizable because the muscle movements of these emotional expressions are very similar across people from various regions and societies. Therefore, we have concentrated mainly on the automatic recognition of these six fundamental emotions.

In general, emotion recognition is a two-step procedure involving extraction of significant features and classification [1]. Feature extraction determines a set of independent attributes which together can portray an expression of facial emotion. For classification, the features are mapped into one of the various emotion classes such as anger, happy, sad, disgust, surprise, etc. [1]. For the effectiveness of a facial expression identification model, both the group of feature attributes taken for feature extraction and the classifier responsible for classification are equally significant. Given a badly chosen collection of feature attributes, even a smart classification mechanism may fail to produce an ideal outcome. Thus, to obtain high classification accuracy and a qualitative outcome, picking superior features plays a major role.

The circumflex model by Russell and the recognition of six basic emotions have made remarkable contributions to the field of emotion recognition. Beyond these, Kudiri M. Krishna, Said Abas Md, and Nayan M Yunus [2] tried to detect emotion using sub-image based features of facial expression. Silva C. DE Liyanage, Miyasato Tsutomu, and Nakatsu Ryohei [4] formed a model for emotion recognition with the help of multimodal information. Maja Pantic and Ioannis Patras [5] implemented an approach for recognizing facial actions and their temporal segments from face profile image sequences by considering the dynamic properties of facial actions. Li Zhang, Ming Jiang, Dewan Farid, and M. A. Hassain [13] modelled an intelligent system for automatic emotion recognition. Happy S L and Routray Aurobinda [7] created an automatic emotion recognition system using salient features. This article is greatly influenced by all of these contributions to the field of emotion recognition.

III. EMOTION TAXONOMY

According to emotion theorists and psychologists, emotions can be categorized from the six globally exhibited fundamental emotions up to complicated emotions that originate in particular cultures. Among the frameworks reported in the field of emotion recognition, two models have held command of this research domain: Ekman's fundamental set of emotions [1] and Russell's circumflex representation of affect [1]. Ekman and Friesen in 1971 [1] put forward six quintessential basic emotions, namely disgust, joy, sadness, fear, anger, and surprise, which are globally presented and identified from facial expressions.

Fig 1. The Circumflex Representation of Russell.

Over the last five decades, this model with six basic emotions has become the most popular and usual model for estimating emotions and detecting them from their respective facial expressions. Some time later, a different model of emotion was presented by Russell, in which emotional states are depicted on a two-pole ring in a two-dimensional space instead of categorizing each emotion distinctly.

IV. DATASET

To simulate our proposed model, the JAFFE dataset has been used. JAFFE has 213 sample images and 213 lines in the corresponding key-point file. Each line contains the positions of 77 key points, making a 154-dimensional vector. In addition, all_labels.txt contains all sample labels in numerical form, where the label mapping is as follows: NEU = 0; HAP = 1; SAD = 2; SUR = 3; ANG = 4; DIS = 5; FEA = 6. Out of the 213 samples, 20 samples are used for training for each respective emotion.

V. IMPLEMENTATION

A. Frame Extraction and Face Detection

Initially we take a live video stream and extract frames from the video. Then we detect which frames contain a face. To detect faces in a frame we use an existing well-known technique in which the Gabor method is used for feature extraction and a neural network is used for learning.
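The JAFFE key-point layout described in Section IV can be parsed as sketched below. The parsing is a hypothetical illustration run on synthetic stand-in lines (the real file has 213 lines of 77 (x, y) key points each); the label mapping is the one given above.

```python
import numpy as np

# Label mapping from all_labels.txt, as described in Section IV.
LABEL_NAMES = {0: "NEU", 1: "HAP", 2: "SAD", 3: "SUR", 4: "ANG", 5: "DIS", 6: "FEA"}

def parse_keypoint_line(line):
    """One line of the key-point file: 77 (x, y) pairs -> 154-dim vector."""
    vals = np.array([float(v) for v in line.split()])
    assert vals.size == 154, "expected 77 key points (154 coordinates)"
    return vals

# Synthetic stand-in for two lines of the file (the real file has 213 lines):
lines = [" ".join(str(i * 0.5) for i in range(154)) for _ in range(2)]
X = np.vstack([parse_keypoint_line(l) for l in lines])

# all_labels.txt holds one numeric label per sample, e.g.:
labels = [1, 4]
print(X.shape, [LABEL_NAMES[y] for y in labels])  # (2, 154) ['HAP', 'ANG']
```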
[Flow chart of the face detection stage: Input → Extracted frames as input image → Gabor feature extraction from the image → Face detection using the filter (threshold t) → Create database to train the neural network → Training the ANN, i.e., learning the weights of the neurons.]

Fig 2. Architecture of the Artificial Neural Network (input layer, hidden layer, and output layer with activation function g(S_I)).
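The Gabor feature extraction step in the flow chart can be sketched in pure numpy. The kernel parameters (size, sigma, lambd, gamma) and the mean-absolute-response pooling below are illustrative choices, not values from the paper.

```python
import numpy as np

def gabor_kernel(ksize=21, sigma=4.0, theta=0.0, lambd=10.0, gamma=0.5, psi=0.0):
    """Real part of a 2-D Gabor filter with orientation theta and wavelength lambd."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + (gamma * yr)**2) / (2.0 * sigma**2))
    return envelope * np.cos(2.0 * np.pi * xr / lambd + psi)

def filter_image(image, kernel):
    """Circular convolution via FFT -- adequate for a feature-pooling sketch."""
    fk = np.fft.fft2(kernel, s=image.shape)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * fk))

def gabor_features(image, thetas=(0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Mean absolute response of the image to a 4-orientation Gabor bank."""
    return [np.abs(filter_image(image, gabor_kernel(theta=t))).mean() for t in thetas]

image = np.random.default_rng(0).normal(size=(32, 32))
feats = gabor_features(image)
print(len(feats))  # 4
```

In practice a larger bank (several scales per orientation) would be used, and the pooled responses fed to the classifier.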
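The network of Fig. 2 amounts to a standard feedforward pass. The layer sizes, sigmoid activation g(·), and decision threshold t below are assumptions for illustration, not the paper's trained configuration.

```python
import numpy as np

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

def forward(x, W_hidden, W_out, t=0.5):
    """One forward pass of a Fig. 2-style network (input -> hidden -> output),
    followed by a face/non-face decision against threshold t."""
    h = sigmoid(W_hidden @ x)   # hidden-layer activations g(S_I)
    o = sigmoid(W_out @ h)      # output-layer activation
    return o, bool(o >= t)

rng = np.random.default_rng(0)
x = rng.normal(size=8)               # hypothetical Gabor feature vector
W_hidden = rng.normal(size=(4, 8))   # untrained example weights
W_out = rng.normal(size=(1, 4))
score, is_face = forward(x, W_hidden, W_out)
```

Training ("learning the weights of neurons" in the flow chart) would fit W_hidden and W_out by backpropagation on the face/non-face database.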
where S_i(k, j) denotes the similarity of the j-th feature vector of the test face, v_{i,j}, to the k-th feature vector of the i-th reference face, v_{i,k}. The overall similarity OS_i of the two faces is calculated by

    OS_i = ( Σ_j Sim_{i,j} ) / N        (5)

where

    Sim_{i,j} = max_{l = 1..N_i} [ S_i(l, j) ]        (6)

[Figure panel b: real parts of the Gabor filters.]
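A minimal numpy sketch of the overall-similarity computation in Eqs. (5) and (6), assuming S[l, j] holds the pairwise similarities S_i(l, j) between reference and test feature vectors:

```python
import numpy as np

def overall_similarity(S):
    """Given S[l, j] = similarity of the l-th reference feature vector to the
    j-th test feature vector, compute OS = (1/N) * sum_j max_l S[l, j]."""
    sim_per_test = S.max(axis=0)   # Eq. (6): best-matching reference feature per j
    return sim_per_test.mean()     # Eq. (5): average over the N test features

S = np.array([[0.9, 0.2],
              [0.4, 0.7]])
print(overall_similarity(S))  # (0.9 + 0.7) / 2 = 0.8
```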
    x̄ = (1/N) Σ_{n=1}^{N} x_n        (sample mean)

    (1/N) Σ_{n=1}^{N} { u_1ᵀ x_n − u_1ᵀ x̄ }² = u_1ᵀ S u_1        (variance of the data projected onto u_1)
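The PCA identity above, that the variance of the data projected onto u_1 equals u_1ᵀ S u_1 (maximized by the top eigenvector of S), can be checked numerically. The data below is a synthetic stand-in for the 154-dimensional key-point vectors.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) @ np.diag([3.0, 2.0, 1.0, 0.5, 0.1])  # synthetic data

x_bar = X.mean(axis=0)                        # sample mean
S = (X - x_bar).T @ (X - x_bar) / len(X)      # covariance matrix S

eigvals, eigvecs = np.linalg.eigh(S)          # ascending eigenvalues
u1 = eigvecs[:, -1]                           # eigenvector of the largest eigenvalue

# Variance of the projections u1^T x_n equals u1^T S u1 (and the top eigenvalue):
proj_var = np.mean((X @ u1 - x_bar @ u1) ** 2)
print(np.isclose(proj_var, u1 @ S @ u1))      # True
```

Projecting each face vector onto the leading eigenvectors gives the reduced feature representation used for clustering.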
[Flow chart terminates: Stop — emotion detected successfully.]
[Scatter-plot annotation: two emotion clusters have overlapped.]
[Fig 4 panels, continued: (l) scatter output of the same experimental image — the PCA of the new image instance lies closer to "angry"; (m) output showing the minimum-distance measure.]

Fig 4. Output of each individual step of our proposed approach.

VII. FUTURE WORK

We can make our automated framework for emotion detection more efficient by improving the pattern classifiers, so that we can more accurately determine the emotion cluster to which a new face belongs. It would also be fascinating to consider the auditory and visual information together, along with further attributes such as EEG signals and facial color, with the expectation that this kind of multi-modal information processing will become a datum of information processing in the future multimedia era. We can even improve the accuracy by taking the principal components of each individual portion of the face, such as the eyes, nose, lips, forehead, and cheeks, and then comparing them with the experimental image.

VIII. CONCLUSION

To date, all of the existing vision systems for facial muscle action detection deal only with frontal-view face images and cannot handle the temporal dynamics of facial actions. Also, some people do not show their emotion and mental state through facial expression; in such situations our proposed model fails to recognize the emotion and produces FALSE POSITIVE results. Even with this shortcoming, we have shown, based on experimental confirmation, that

REFERENCES

[1] D. Anurag, S. Ashim, 'A Comparative Study on Different Approaches of Real Time Human Emotion Recognition Based on Facial Expression Detection', International Conference on Advances in Computer Engineering and Applications, IEEE, 2015.
[2] Kudiri M. Krishna, Said Abas Md, Nayan M Yunus, 'Emotion Detection Using Sub-image Based Features Through Human Facial Expressions', International Conference on Computer & Information Science (ICCIS), IEEE, 2012.
[3] Pal Pritam, Iyer Ananth N. and Yantorno Robert E., 'Emotion Detection From Infant Facial Expression', International Conference on Acoustics, Speech and Signal Processing, IEEE, 2006.
[4] Silva C. DE Liyanage, Miyasato Tsutomu, Nakatsu Ryohei, 'Facial Emotion Recognition Using Multi-modal Information', International Conference on Information, Communications and Signal Processing, IEEE, 1997.
[5] Maja Pantic, Ioannis Patras, 'Dynamics of Facial Expression: Recognition of Facial Actions and Their Temporal Segments from Face Profile Image Sequences', IEEE Transactions on Systems, Man and Cybernetics, 2006.
[6] Songfan Yang, Bir Bhanu, 'Understanding Discrete Facial Expressions in Video Using an Emotion Avatar Image', IEEE Transactions on Systems, Man and Cybernetics, 2012.
[7] Happy S L, Routray Aurobinda, 'Automatic Facial Expression Recognition Using Features of Salient Facial Patterns', IEEE Transactions on Affective Computing, 2014.
[8] Mohammad Soleymani, Sadjad Asghari-Esfeden, Yun Fu, Maja Pantic, 'Analysis of EEG Signals and Facial Expressions for Continuous Emotion Detection', IEEE Transactions on Affective Computing, 2015.
[9] Leh Luoh, Chih-Chang Huang, Hsueh-Yenhiu, 'Image Processing Based Emotion Recognition', International Conference on System Science and Engineering, IEEE, 2010.
[10] F. Abdat, C. Maaoui and A. Pruski, UKSIM 5th European Symposium on Computer Modeling and Simulation, IEEE, 2011.
[11] Kenny Hong, Stephan K. Chalup, Robert A. R. King, 'A Component Based Approach for Classifying the Seven Universal Facial Expressions of Emotion', IEEE, 2013.
[12] A. Ghahari, Y. Rakhshani Fatmehsari and R. A. Zoroofi, 'A Novel Clustering-Based Feature Extraction Method for an Automatic Facial Expression Analysis System', 5th International Conference on Intelligent Information Hiding and Multimedia Signal Processing, IEEE, 2009.
[13] Li Zhang, Ming Jiang, Dewan Farid, M. A. Hassain, 'Intelligent Facial Emotion Recognition and Semantic-Based Topic Detection for a Humanoid Robot', Elsevier Ltd., 2013.
[14] Lin-Lin Huang, Akinobu Shimizu, and Hidefumi Kobatake, 'Classification-Based Face Detection Using Gabor Filter Features', Sixth IEEE International Conference on Automatic Face and Gesture Recognition, 2004.
[15] Wang Chuan-xu, Li Xue, 'Face Detection Using BP Network Combined with Gabor Wavelet Transform', Ninth ACIS International Conference on Software Engineering, Artificial Intelligence, Networking, and Parallel/Distributed Computing, IEEE, 2008.
[16] Burcu Kepenekci, F. Boray, 'Occluded Face Recognition Based on Gabor Wavelets', IEEE, 2002.
[17] http://www.kasrl.org/jaffe_info.htm