Facial Emotion Detection in Low Light Conditions Using CNN
Chinmayee Mhaskar
Dept. of Information Technology
Thakur College of Science and Commerce
Mumbai University
Mumbai-400101, India
[email protected]

Nidhi Parikh
Dept. of Information Technology
Thakur College of Science and Commerce
Mumbai University
Mumbai, India
[email protected]

Sherilyn Kevin
Dept. of Information Technology
Thakur College of Science and Commerce
Mumbai University
Mumbai, India
[email protected]
The accuracy of binary classification can also be determined in terms of positives and negatives, as in the following formula:

Accuracy = (TP + TN) / (TP + TN + FP + FN)

where TP stands for True Positives, TN for True Negatives, FP for False Positives, and FN for False Negatives.

TABLE I. Experimental Analysis and Results

BATCH   EPOCHS   TRAINING ACCURACY   TRAINING LOSS   VALIDATION ACCURACY   VALIDATION LOSS
64      10       57%                 1.13            53%                   1.27
64      50       65%                 0.92            62%                   0.12
64      75       70%                 0.80            65%                   0.95
64      100      85%                 0.70            69%                   0.95

IV. OUTPUTS
The following figures show detected facial emotions in low-light conditions.

Fig. 6. Disgust Facial Emotion Detected

V. CONCLUSION
In this paper, a CNN model is developed to extract facial features and recognize emotions. The FER 2013 dataset consists of seven emotions; the emotions considered were happy, sad, angry, fearful, surprised, disgusted, and neutral. These images were converted into NumPy arrays, and landmark features were identified and extracted. The CNN model was developed in four phases, where the first three phases had convolution, pooling, batch-normalization, and dropout layers, and the final phase consists of the output layers. The CNN model has 35,888 parameters, of which 28,710 are trainable. The best parameter values were determined for the model using the accuracy and loss metrics. The CNN model had an average accuracy of 85.0% and an average loss of 0.93.

ACKNOWLEDGMENT
This paper and the research behind it would not have been possible without the exceptional support of my supervisor, Sherilyn Kevin. Her enthusiasm, knowledge, and exacting attention to detail have been an inspiration and kept our work on track, from our first encounter with the logbooks on Artificial Neural Networks and Deep Learning to the final draft of this paper. Further, we are very thankful to our Head of Department, Dr. Santosh Kumar Singh, for giving us this opportunity. Last but not least, we would like to thank all our friends, family members, non-teaching staff, and colleagues for their support and individual help.
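For clarity, the accuracy formula used above can be computed directly from the four confusion-matrix counts. The sketch below is illustrative only; the counts shown are invented for the example and are not the paper's reported results:

```python
def accuracy(tp: int, tn: int, fp: int, fn: int) -> float:
    """Accuracy = (TP + TN) / (TP + TN + FP + FN)."""
    total = tp + tn + fp + fn
    if total == 0:
        raise ValueError("confusion-matrix counts are all zero")
    return (tp + tn) / total

# Illustrative counts (hypothetical, not taken from Table I):
print(accuracy(tp=50, tn=35, fp=10, fn=5))  # 0.85
```

The same four counts also yield precision and recall, but only accuracy is used as the headline metric in this paper.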
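The four-phase pipeline summarized in the conclusion (three phases of convolution, pooling, batch normalization, and dropout, followed by an output phase over the seven emotion classes) can be sketched as a toy single-channel, single-filter forward pass. All kernel sizes, pool sizes, and weights below are illustrative assumptions, not the paper's actual layer configuration, and dropout is omitted because it is inactive at inference time:

```python
import numpy as np

def conv2d(x, w):
    """Valid 2-D convolution (cross-correlation, as in CNN libraries).
    x: (H, W), w: (kh, kw) -> (H-kh+1, W-kw+1)."""
    kh, kw = w.shape
    H, W = x.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w)
    return out

def maxpool(x, size=2):
    """Non-overlapping max pooling; trailing rows/cols are dropped."""
    H2, W2 = x.shape[0] // size, x.shape[1] // size
    return x[:H2 * size, :W2 * size].reshape(H2, size, W2, size).max(axis=(1, 3))

def batchnorm(x, eps=1e-5):
    """Inference-style normalization to zero mean, unit variance."""
    return (x - x.mean()) / np.sqrt(x.var() + eps)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
x = rng.standard_normal((48, 48))        # FER 2013 images are 48x48 grayscale

# Three phases: conv -> ReLU -> maxpool -> batchnorm (dropout skipped at inference)
for _ in range(3):
    w = rng.standard_normal((3, 3)) * 0.1
    x = np.maximum(conv2d(x, w), 0.0)    # convolution + ReLU
    x = batchnorm(maxpool(x))

# Output phase: flatten -> dense -> softmax over the seven emotions
flat = x.reshape(-1)
W_out = rng.standard_normal((7, flat.size)) * 0.01
probs = softmax(W_out @ flat)
print(probs.shape)  # -> (7,)
```

A real implementation would use multi-channel filter banks and learned weights (e.g. via a deep-learning framework); this sketch only shows how the phases compose and how a 48x48 input shrinks to a 7-way probability vector.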
Fig. 3. Happy Facial Emotion Detected
Fig. 4. Neutral Facial Emotion Detected

REFERENCES
[1] Y. LeCun, Y. Bengio, and G. Hinton, "Deep learning," Nature, vol. 521, pp. 436–444, May 28, 2015.
[2] D. Duncan, G. Shine, and C. English, "Facial emotion recognition in real-time," [Accessed: 2019-05-03].
[3] P. Lucey, J. F. Cohn, T. Kanade, J. Saragih, Z. Ambadar, and I. Matthews, "A complete expression dataset for action unit and emotion-specified expression," 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, pp. 94–101, June 12, 2010.
[4] M. Lyons, S. Akamatsu, M. Kamachi, and J. Gyoba, "A brief review of facial emotion recognition based on visual information," Proceedings of the Third IEEE International Conference on Automatic Face and Gesture Recognition, 1998, pp. 200–205, April 14, 1998.
[5] M. Lyons, S. Akamatsu, M. Kamachi, and J. Gyoba, "Coding facial expressions with Gabor wavelets," Proceedings of the Third IEEE International Conference on Automatic Face and Gesture Recognition, 1998, pp. 200–205, April 14, 1998.
[6] J. P. Skelley, "Experiments in Expression Recognition," M.S. thesis, Massachusetts Institute of Technology, 2005.
[7] Z. Ying, M. Huang, Z. Wang, and Z. Wang, "A New Method of Facial Expression Recognition Based on SPE Plus SVM," Springer-Verlag Berlin Heidelberg, Part II, CCIS 135, pp. 399–404, 2011.
[8] M. Pantic and I. Patras, "Dynamics of Facial Expression: Recognition of Facial Actions and their Temporal Segments from Face Profile Image Sequences," IEEE Trans. Systems, Man, and Cybernetics, Part B, vol. 36, no. 2, pp. 433–449, 2006.
[9] I. Kotsia, I. Buciu, and I. Pitas, "An Analysis of Facial Expression Recognition under Partial Facial Image Occlusion," Image and Vision Computing, vol. 26, no. 7, pp. 1052–1067, 2008.
[10] I. Kotsia, I. Buciu, and I. Pitas, "An Analysis of Facial Expression Recognition under Partial Facial Image Occlusion," Image and Vision Computing, vol. 26, no. 7, pp. 1052–1067, 2008.
[11] S. Kumar, Keerthana, Sharmila, and Dhivya Shree, "Face Emotion Recognition and Detection," vol. 08, issue 06, June 2021.
[12] J. Freeda, Lavanya, Lekhaa Shree, Nivedhitha, and Kavitha, "Emotion Detection using Convolutional Neural Network," vol. 8, issue 3, March 2019.
[13] P. Petkar, K. A. Pujari, S. Salgare, and S. Shetty, "Emotion Recognition Using CNN," vol. 09, issue 4, June 2020.
[14] N. G. Hong, "Facial Emotional Recognition Experiment by applying R-CNN," October 5, 2020.
[15] H. Zhang, A. Jolfaei, and M. Alazab, "A Face Emotion Recognition Method Using Convolutional Neural Network and Image Edge Computing," vol. xx, 2017.