Identifying Gender Differences in Multimodal Emotion Recognition Using Bimodal Deep AutoEncoder
This paper investigates the differences between males and females in emotion recognition based on electroencephalography (EEG) and eye movement data. Four basic emotions are considered: happy, sad, fearful, and neutral. The Bimodal Deep AutoEncoder (BDAE) and a fuzzy-integral-based method are applied to fuse the EEG and eye movement data. Our experimental results indicate that gender differences do exist in the neural patterns of emotion recognition; that eye movement data is less effective than EEG data for examining these differences; and that brain activation in females is generally lower than in males across most frequency bands and brain areas, especially for the fearful emotion. From the confusion matrices, we observe that the fearful emotion is more diverse among women than among men, whereas men behave more diversely than women on the sad emotion. Additionally, for females, individual differences in fear are more pronounced than in the other three emotions.
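To illustrate the multimodal fusion idea behind the BDAE, the following is a minimal forward-pass sketch, not the authors' implementation: each modality is encoded separately, the two hidden codes are concatenated into a shared representation, and modality-specific decoders reconstruct each input. The feature dimensions (310 EEG features, 33 eye movement features), layer sizes, and all weights are illustrative assumptions; a real BDAE would be trained layer-wise as stacked autoencoders.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class BimodalDeepAutoEncoder:
    """Sketch of a BDAE: per-modality encoders, a shared hidden layer
    over the concatenated codes, and per-modality decoders.
    All dimensions and weights here are illustrative, not trained."""

    def __init__(self, eeg_dim, eye_dim, hidden=64, shared=32):
        # Randomly initialized weights stand in for layer-wise pretraining.
        self.W_eeg = rng.normal(0, 0.1, (eeg_dim, hidden))
        self.W_eye = rng.normal(0, 0.1, (eye_dim, hidden))
        self.W_shared = rng.normal(0, 0.1, (2 * hidden, shared))
        self.W_dec_eeg = rng.normal(0, 0.1, (shared, eeg_dim))
        self.W_dec_eye = rng.normal(0, 0.1, (shared, eye_dim))

    def encode(self, eeg, eye):
        # Encode each modality, then fuse by concatenation into a shared code.
        h_eeg = sigmoid(eeg @ self.W_eeg)
        h_eye = sigmoid(eye @ self.W_eye)
        return sigmoid(np.concatenate([h_eeg, h_eye], axis=1) @ self.W_shared)

    def decode(self, z):
        # Reconstruct both modalities from the shared representation.
        return sigmoid(z @ self.W_dec_eeg), sigmoid(z @ self.W_dec_eye)

# Hypothetical batch: 8 samples with assumed feature dimensions.
eeg = rng.normal(size=(8, 310))
eye = rng.normal(size=(8, 33))
model = BimodalDeepAutoEncoder(eeg_dim=310, eye_dim=33)
z = model.encode(eeg, eye)
eeg_rec, eye_rec = model.decode(z)
print(z.shape, eeg_rec.shape, eye_rec.shape)
```

The shared code `z` is what a downstream classifier (e.g. an SVM) would consume as the fused multimodal feature for emotion recognition.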
Keywords: EEG · Eye movement data · Emotion · BDAE · Gender differences
This work was supported in part by grants from the National Key Research and Development Program of China (Grant No. 2017YFB1002501), the National Natural Science Foundation of China (Grant No. 61673266), the Major Basic Research Program of Shanghai Science and Technology Committee (Grant No. 15JC1400103), ZBYY-MOE Joint Funding (Grant No. 6141A02022604), and the Technology Research and Development Program of China Railway Corporation (Grant No. 2016Z003-B).