Authors: Min Seop Lee 1; Ye Ri Cho 1; Yun Kyu Lee 1; Dong Sung Pae 1; Myo Taeg Lim 1 and Tae Koo Kang 2
Affiliations: 1 School of Electrical Engineering, Korea University, Seoul, Republic of Korea; 2 Department of Human Intelligence and Robot Engineering, Sangmyung University, Cheonan, Republic of Korea
Keyword(s):
Valence, Arousal, Convolutional Neural Network, Physiological Signal, PPG, EMG.
Related Ontology Subjects/Areas/Topics: Biological Inspired Sensors; Informatics in Control, Automation and Robotics; Sensors Fusion; Signal Processing, Sensors, Systems Modeling and Control
Abstract:
Emotion recognition is an essential part of human-computer interaction, and there are many sources from which emotion can be recognized. In this study, physiological signals, specifically the electromyogram (EMG) and photoplethysmogram (PPG), are used to detect emotion. To classify emotions in more detail, the existing valence-arousal model of emotion is subdivided into four levels along each dimension. A convolutional neural network (CNN) is adopted for feature extraction and emotion classification. We measure EMG and PPG signals from 30 subjects watching 32 selected videos, and our method is evaluated on the data acquired from these participants.
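To illustrate the kind of model the abstract describes, below is a minimal sketch (not the authors' implementation) of a 1D CNN that maps a windowed two-channel EMG + PPG recording to one of four valence (or arousal) levels. The channel count, window length, and layer sizes are assumptions chosen for illustration only.

# Hypothetical sketch of a 1D CNN classifier for EMG/PPG windows.
# Assumptions: 2 input channels (EMG, PPG), 4 output levels, 1024-sample windows.
import torch
import torch.nn as nn

class PhysioCNN(nn.Module):
    def __init__(self, in_channels: int = 2, num_levels: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 16, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.AdaptiveAvgPool1d(1),   # collapse the time axis to one feature vector
        )
        self.classifier = nn.Linear(32, num_levels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels=2 [EMG, PPG], samples per window)
        z = self.features(x).squeeze(-1)
        return self.classifier(z)      # logits over the four levels

if __name__ == "__main__":
    model = PhysioCNN()
    window = torch.randn(8, 2, 1024)   # 8 hypothetical 1024-sample windows
    print(model(window).shape)         # torch.Size([8, 4])

Training such a network with a standard cross-entropy loss over the four-level labels would correspond to the classification setting described in the abstract; the actual architecture and preprocessing used by the authors are given in the paper itself.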