EEG–fNIRS-Based Emotion Recognition Using Graph Convolution and Capsule Attention Network
Abstract
1. Introduction
| Method | Data | Features | Protocol | Accuracy (%) |
|---|---|---|---|---|
| SVM [8] | EEG from DEAP | Time- and frequency-domain features | 10-fold CV | 89.94 |
| CNN [9] | EEG from DEAP | Time- and frequency-domain features | LOSO CV | 80.52 valence, 75.22 arousal |
| MD-GCN [10] | EEG from SEED/SEED-IV | Temporal and spatial features | LOSO CV | 92.15/90.83 |
| GC-F-GCN [11] | EEG from DEAP | Differential entropy | 5-fold CV | 98.46 valence, 97.91 arousal |
| ACTNN [12] | EEG from SEED/SEED-IV | Differential entropy | 10-fold CV | 98.47/91.90 |
| TC-Net [13] | EEG from DEAP | Continuous wavelet transform | 10-fold CV | 98.76 valence, 98.81 arousal |
| MLF-CapsNet [14] | EEG from DEAP | Preprocessed EEG | 10-fold CV | 97.97 valence, 98.31 arousal |
| SVM [15] | fNIRS | Slope, min, max, etc. from HbO and HbR | 4-fold CV | - |
| SVM [16] | fNIRS | HbO/HbR | 6-fold CV | 83.69/79.06 |
| DBJNet [17] | fNIRS | HbO | LOSO CV | 74.80 |
| TC-ResNet [20] | EEG, fNIRS | Spectral features from EEG-HbO | - | 99.81 |
| SVM [21] | EEG, fNIRS | PSD from EEG; mean, median, etc. from HbT, HbO, HbR | Leave-one-out CV | 80.00 |
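Several of the EEG pipelines in the table [11,12] operate on differential entropy (DE) features rather than raw signals. As a point of reference, the sketch below computes band-wise DE for one channel under the usual Gaussian assumption, DE = 0.5 ln(2πeσ²) [24]; the band boundaries and the 250 Hz sampling rate are illustrative conventions, not values taken from the papers cited.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def differential_entropy(x, fs, band, order=4):
    """Band-pass filter one channel and return its differential entropy,
    DE = 0.5 * ln(2 * pi * e * sigma^2), assuming the band-limited
    signal is approximately Gaussian [24]."""
    low, high = band
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, x)
    return 0.5 * np.log(2 * np.pi * np.e * np.var(filtered))

# Illustrative use on synthetic data; band names/limits and sampling
# rate are common EEG conventions, not values from the paper.
fs = 250
bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}
x = np.random.randn(fs * 4)  # 4 s of synthetic single-channel EEG
de = {name: differential_entropy(x, fs, band) for name, band in bands.items()}
```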
- (1) A novel EEG–fNIRS-based emotion recognition framework using a graph convolution and capsule attention network, termed GCN-CA-CapsNet, is introduced;
- (2) The CNN layer is replaced with a GCN layer, which extracts and fuses graph-structure features and spatial correlations from the EEG and fNIRS signals;
- (3) A capsule attention mechanism is proposed that assigns different attention weights to different primary capsules, so that higher-quality primary capsules are selected to generate better classification capsules (a minimal sketch of modules (2) and (3) follows this list).
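Since the paper's exact layer sizes and adjacency construction are not reproduced here, the following is only a minimal PyTorch sketch of contributions (2) and (3): a graph convolution over channel nodes and an attention layer that reweights primary capsules. The capsule dimensions and the identity adjacency are placeholder assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def squash(s, dim=-1, eps=1e-8):
    """Capsule non-linearity from Sabour et al. [26]: shrinks short
    vectors toward zero and long vectors toward unit length."""
    n2 = (s ** 2).sum(dim=dim, keepdim=True)
    return (n2 / (1.0 + n2)) * s / torch.sqrt(n2 + eps)

class GraphConv(nn.Module):
    """One GCN layer, H' = ReLU(A_hat H W). A_hat is a normalized
    adjacency over the EEG/fNIRS channel nodes; how the paper builds
    it (e.g., from inter-channel connectivity) is not reproduced here."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, a_hat):          # x: (B, N, in_dim), a_hat: (N, N)
        return F.relu(torch.matmul(a_hat, self.lin(x)))

class CapsuleAttention(nn.Module):
    """Scores each primary capsule and reweights it, so higher-quality
    capsules contribute more to the classification capsules."""
    def __init__(self, caps_dim):
        super().__init__()
        self.score = nn.Linear(caps_dim, 1)

    def forward(self, caps):              # caps: (B, num_caps, caps_dim)
        w = torch.softmax(self.score(caps), dim=1)   # attention weights
        return caps * w

# Shape-level walk-through with placeholder sizes (32 channels, 8-D
# primary capsules); the paper's actual dimensions may differ.
x = torch.randn(4, 32, 20)               # batch of node feature matrices
a_hat = torch.eye(32)                    # stand-in normalized adjacency
h = GraphConv(20, 64)(x, a_hat)          # (4, 32, 64)
caps = squash(h.reshape(4, -1, 8))       # (4, 256, 8) primary capsules
caps = CapsuleAttention(8)(caps)         # attention-reweighted capsules
```

In the full model, these attention-weighted capsules would feed the dynamic-routing classification capsule module described in Section 2.3.3.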
2. Materials and Methods
2.1. EEG–fNIRS Dataset and Preprocessing
2.2. Feature Extraction
2.3. GCN-CA-CapsNet Model
2.3.1. GCN Module
2.3.2. Capsule Attention Module
2.3.3. Dynamic Routing-Based Classification Capsule Module
3. Experimental Results and Analysis
3.1. Experimental Settings and Evaluation
3.2. Ablation Study
- (1) CapsNet: the baseline, consisting of a two-layer CNN, a primary capsule module, and a classification capsule module;
- (2) GCN-CapsNet: replaces the CNN in CapsNet with a GCN, which extracts graph-structure features from the EEG and fNIRS;
- (3) GCN-CA-CapsNet: additionally introduces the capsule attention mechanism, which assigns different attention weights to different primary capsules for feature fusion (all three variants share the dynamic-routing classification capsule module; a sketch follows this list).
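All three ablation variants share the classification capsule module based on the routing-by-agreement procedure of Sabour et al. [26]. The sketch below implements that routing step; the three-iteration default and the capsule counts/dimensions are common choices assumed for illustration, not values taken from the paper.

```python
import torch

def squash(s, dim=-1, eps=1e-8):
    """Capsule non-linearity from Sabour et al. [26]."""
    n2 = (s ** 2).sum(dim=dim, keepdim=True)
    return (n2 / (1.0 + n2)) * s / torch.sqrt(n2 + eps)

def dynamic_routing(u_hat, num_iters=3):
    """Routing-by-agreement [26]. u_hat holds the prediction vectors
    produced by the (attention-weighted) primary capsules:
    shape (batch, num_primary, num_classes, out_dim)."""
    B, n_in, n_out, d = u_hat.shape
    logits = torch.zeros(B, n_in, n_out, device=u_hat.device)
    for _ in range(num_iters):
        c = torch.softmax(logits, dim=2)              # coupling coefficients
        s = (c.unsqueeze(-1) * u_hat).sum(dim=1)      # weighted sum: (B, n_out, d)
        v = squash(s)                                 # class capsules
        logits = logits + (u_hat * v.unsqueeze(1)).sum(dim=-1)  # agreement update
    return v

# The length of each class capsule acts as the class score; with four
# emotion classes (sad, happy, calm, fear) this yields a (B, 4) tensor.
u_hat = torch.randn(2, 256, 4, 16)             # placeholder prediction vectors
scores = dynamic_routing(u_hat).norm(dim=-1)   # (2, 4)
```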
3.3. Performance Comparison with Single EEG and Single fNIRS
3.4. Comparison with the State-of-the-Art Methods
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
1. Izard, C.E. The many meanings/aspects of emotion: Definitions, functions, activation, and regulation. Emot. Rev. 2010, 2, 363–370.
2. He, Z.; Li, Z.; Yang, F.; Wang, L.; Li, J.; Zhou, C.; Pan, J. Advances in multimodal emotion recognition based on brain–computer interfaces. Brain Sci. 2020, 10, 687.
3. Damasio, A.R.; Grabowski, T.J.; Bechara, A.; Damasio, H.; Ponto, L.L.; Parvizi, J.; Hichwa, R.D. Subcortical and cortical brain activity during the feeling of self-generated emotions. Nat. Neurosci. 2000, 3, 1049–1056.
4. Liu, Z.; Shore, J.; Wang, M.; Yuan, F.; Buss, A.; Zhao, X. A systematic review on hybrid EEG/fNIRS in brain–computer interface. Biomed. Signal Process. Control 2021, 68, 102595.
5. Qiu, L.; Zhong, Y.; Xie, Q.; He, Z.; Wang, X.; Chen, Y.; Zhan, C.A.A.; Pan, J. Multi-modal integration of EEG-fNIRS for characterization of brain activity evoked by preferred music. Front. Neurorobotics 2022, 16, 823435.
6. Li, X.; Zhang, Y.; Tiwari, P.; Song, D.; Hu, B.; Yang, M.; Zhao, Z.; Kumar, N.; Marttinen, P. EEG based emotion recognition: A tutorial and review. ACM Comput. Surv. 2022, 55, 1–57.
7. Zhang, X.; Huang, D.; Li, H.; Zhang, Y.; Xia, Y.; Liu, J. Self-training maximum classifier discrepancy for EEG emotion recognition. CAAI Trans. Intell. Technol. 2023, 8, 1480–1491.
8. Chen, G.; Zhang, X.; Sun, Y.; Zhang, J. Emotion feature analysis and recognition based on reconstructed EEG sources. IEEE Access 2020, 8, 11907–11916.
9. Gao, Q.; Yang, Y.; Kang, Q.; Tian, Z.; Song, Y. EEG-based emotion recognition with feature fusion networks. Int. J. Mach. Learn. Cybern. 2022, 13, 421–429.
10. Du, G.; Su, J.; Zhang, L.; Su, K.; Wang, X.; Teng, S.; Liu, P.X. A multi-dimensional graph convolution network for EEG emotion recognition. IEEE Trans. Instrum. Meas. 2022, 71, 2518311.
11. Zhang, J.; Zhang, X.; Chen, G.; Zhao, Q. Granger-causality-based multi-frequency band EEG graph feature extraction and fusion for emotion recognition. Brain Sci. 2022, 12, 1649.
12. Gong, L.; Li, M.; Zhang, T.; Chen, W. EEG emotion recognition using attention-based convolutional transformer neural network. Biomed. Signal Process. Control 2023, 84, 104835.
13. Wei, Y.; Liu, Y.; Li, C.; Cheng, J.; Song, R.; Chen, X. TC-Net: A Transformer Capsule Network for EEG-based emotion recognition. Comput. Biol. Med. 2023, 152, 106463.
14. Liu, Y.; Ding, Y.; Li, C.; Cheng, J.; Song, R.; Wan, F.; Chen, X. Multi-channel EEG-based emotion recognition via a multi-level features guided capsule network. Comput. Biol. Med. 2020, 123, 103927.
15. Bandara, D.; Velipasalar, S.; Bratt, S.; Hirshfield, L. Building predictive models of emotion with functional near-infrared spectroscopy. Int. J. Hum.-Comput. Stud. 2018, 110, 75–85.
16. Hu, X.; Zhuang, C.; Wang, F.; Liu, Y.J.; Im, C.H.; Zhang, D. fNIRS evidence for recognizably different positive emotions. Front. Hum. Neurosci. 2019, 13, 120.
17. Si, X.; He, H.; Yu, J.; Ming, D. Cross-subject emotion recognition brain–computer interface based on fNIRS and DBJNet. Cyborg Bionic Syst. 2023, 4, 0045.
18. Balconi, M.; Grippa, E.; Vanutelli, M.E. What hemodynamic (fNIRS), electrophysiological (EEG) and autonomic integrated measures can tell us about emotional processing. Brain Cogn. 2015, 95, 67–76.
19. Rahman, L.; Oyama, K. Long-term monitoring of NIRS and EEG signals for assessment of daily changes in emotional valence. In Proceedings of the 2018 IEEE International Conference on Cognitive Computing (ICCC), San Francisco, CA, USA, 2–7 July 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 118–121.
20. Chen, J.; Yu, K.; Wang, F.; Zhou, Z.; Bi, Y.; Zhuang, S.; Zhang, D. Temporal convolutional network-enhanced real-time implicit emotion recognition with an innovative wearable fNIRS-EEG dual-modal system. Electronics 2024, 13, 1310.
21. Sun, Y.; Ayaz, H.; Akansu, A.N. Multimodal affective state assessment using fNIRS + EEG and spontaneous facial expression. Brain Sci. 2020, 10, 85.
22. Mognon, A.; Jovicich, J.; Bruzzone, L.; Buiatti, M. ADJUST: An automatic EEG artifact detector based on the joint use of spatial and temporal features. Psychophysiology 2011, 48, 229–240.
23. Strangman, G.; Franceschini, M.A.; Boas, D.A. Factors affecting the accuracy of near-infrared spectroscopy concentration calculations for focal changes in oxygenation parameters. Neuroimage 2003, 18, 865–879.
24. Shi, L.C.; Jiao, Y.Y.; Lu, B.L. Differential entropy feature for EEG-based vigilance estimation. In Proceedings of the 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Osaka, Japan, 3–7 July 2013; pp. 6627–6630.
25. Hamilton, W.; Ying, Z.; Leskovec, J. Inductive representation learning on large graphs. In Proceedings of the Advances in Neural Information Processing Systems, Long Beach, CA, USA, 4–9 December 2017; Volume 30.
26. Sabour, S.; Frosst, N.; Hinton, G.E. Dynamic routing between capsules. In Proceedings of the Advances in Neural Information Processing Systems, Long Beach, CA, USA, 4–9 December 2017; Volume 30, pp. 1–11.
27. Bao, G.; Yang, K.; Tong, L.; Shu, J.; Zhang, R.; Wang, L.; Yan, B.; Zeng, Y. Linking multi-layer dynamical GCN with style-based recalibration CNN for EEG-based emotion recognition. Front. Neurorobotics 2022, 16, 834952.
28. Wang, Z.; Chen, C.; Li, J.; Wan, F.; Sun, Y.; Wang, H. ST-CapsNet: Linking spatial and temporal attention with capsule network for P300 detection improvement. IEEE Trans. Neural Syst. Rehabil. Eng. 2023, 31, 991–1000.
29. Liberati, G.; Federici, S.; Pasqualotto, E. Extracting neurophysiological signals reflecting users’ emotional and affective responses to BCI use: A systematic literature review. NeuroRehabilitation 2015, 37, 341–358.
30. Wu, D.; Lu, B.L.; Hu, B.; Zeng, Z. Affective brain–computer interfaces (aBCIs): A tutorial. Proc. IEEE 2023, 111, 1314–1332.
31. Uchitel, J.; Vidal-Rosas, E.E.; Cooper, R.J.; Zhao, H. Wearable, integrated EEG–fNIRS technologies: A review. Sensors 2021, 21, 6106.
32. Kwak, Y.; Song, W.J.; Kim, S.E. FGANet: fNIRS-guided attention network for hybrid EEG-fNIRS brain–computer interfaces. IEEE Trans. Neural Syst. Rehabil. Eng. 2022, 30, 329–339.
33. Eastmond, C.; Subedi, A.; De, S.; Intes, X. Deep learning in fNIRS: A review. Neurophotonics 2022, 9, 041411.
34. Balconi, M.; Vanutelli, M.E. Hemodynamic (fNIRS) and EEG (N200) correlates of emotional inter-species interactions modulated by visual and auditory stimulation. Sci. Rep. 2016, 6, 23083.
35. Zhang, Y.; Zhu, C. Assessing brain networks by resting-state dynamic functional connectivity: An fNIRS-EEG study. Front. Neurosci. 2020, 13, 1430.
36. Lu, C.M.; Zhang, Y.J.; Biswal, B.B.; Zang, Y.F.; Peng, D.L.; Zhu, C.Z. Use of fNIRS to assess resting state functional connectivity. J. Neurosci. Methods 2010, 186, 242–249.
37. Qiu, X.; Wang, S.; Wang, R.; Zhang, Y.; Huang, L. A multi-head residual connection GCN for EEG emotion recognition. Comput. Biol. Med. 2023, 163, 107126.
38. Li, C.; Wang, B.; Zhang, S.; Liu, Y.; Song, R.; Cheng, J.; Chen, X. Emotion recognition from EEG based on multi-task learning with capsule network and attention mechanism. Comput. Biol. Med. 2022, 143, 105303.
39. Kuppens, P.; Tong, E.M. An appraisal account of individual differences in emotional experience. Soc. Personal. Psychol. Compass 2010, 4, 1138–1150.
| Method | Sad (%) | Happy (%) | Calm (%) | Fear (%) | Avg Acc (Std) (%) | Number of Parameters | Time (s) |
|---|---|---|---|---|---|---|---|
| CapsNet | 91.27 | 92.71 | 90.34 | 92.42 | 91.69 (5.45) | 2,897,155 | 1814 |
| GCN-CapsNet | 96.61 | 96.49 | 94.91 | 97.00 | 96.25 (3.04) | 1,071,271 | 391 |
| GCN-CA-CapsNet | 97.76 | 98.19 | 97.06 | 98.62 | 97.91 (2.20) | 1,102,585 | 429 |
| Method | Sad (%) | Happy (%) | Calm (%) | Fear (%) | Avg Acc (Std) (%) |
|---|---|---|---|---|---|
| GCN-CA-CapsNet (EEG) | 96.31 | 96.47 | 95.03 | 96.88 | 96.17 (2.63) |
| GCN-CA-CapsNet (fNIRS) | 83.10 | 88.19 | 81.41 | 85.95 | 84.66 (4.38) |
| GCN-CA-CapsNet (EEG–fNIRS) | 97.76 | 98.19 | 97.06 | 98.62 | 97.91 (2.20) |