Multimodal Paradigm for Emotion Recognition Based on EEG Signals
A large number of existing real-time emotion recognition systems use stimuli with low ecological validity to elicit emotions. Furthermore, most emotion-based studies have used a single stimulus type to evoke emotions. To address these issues, this paper proposes a multimodal emotion elicitation paradigm to investigate whether the same signatures of emotion appear when it is induced by different elicitation methods. The proposed method analyses Electroencephalography (EEG) data from healthy human subjects to recognize emotions elicited in three different ways: emotional imagery, audio-video clips, and immersive videos presented on virtual reality headsets. Initially, experiments based on emotional imagery were conducted with eight participants. The recorded EEG data was segmented into 1 s time windows and bandpass filtered into the Delta, Theta, Alpha, Beta, Low Gamma, and High Gamma frequency bands. Linear Discriminant Analysis (LDA) classified the resulting features into two classes, fear vs. neutral. Findings revealed that the gamma band achieved the highest classification performance among all bands, which is consistent with other studies on emotion recognition. The outcome of the proposed work is a prototype EEG-based BCI system for emotion recognition using multimodal elicitation methods. In this first phase, EEG data recorded during emotional imagery was analysed; in the next phase of this study, experiments using audio-video clips and immersive videos will be conducted.
Keywords: Electroencephalography (EEG) · Emotions · Virtual reality · Emotional imagery · Classification
This work is funded by Higher Education Commission (HEC), Pakistan and is being conducted and supervised under the ‘Intelligent Systems and Robotics’ research group at Computer Science (CS) Department, Bahria University, Karachi, Pakistan.
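The classification pipeline outlined in the abstract (1 s windows, per-band bandpass filtering, band-power features, LDA on fear vs. neutral) can be sketched as below. This is a minimal illustration on synthetic data, not the authors' implementation: the sampling rate, band edges, filter order, trial counts, and the synthetic signal itself are all assumptions chosen for the example.

```python
# Hedged sketch of a band-power + LDA pipeline, loosely following the
# abstract's description. All parameters here are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 128  # assumed sampling rate (Hz); 1 s window = FS samples
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "low_gamma": (30, 45), "high_gamma": (45, 60)}

def band_power(window, low, high, fs=FS):
    """Bandpass-filter a 1 s window and return its mean power."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return np.mean(filtfilt(b, a, window) ** 2)

rng = np.random.default_rng(0)
# Synthetic stand-in for single-channel 1 s EEG windows: 200 trials.
X_raw = rng.standard_normal((200, FS))
y = rng.integers(0, 2, 200)  # 0 = neutral, 1 = fear (toy labels)
# Inject a 40 Hz (gamma) oscillation into "fear" trials so the classes
# are separable in the gamma band, mimicking the reported gamma effect.
t = np.arange(FS) / FS
X_raw[y == 1] += 0.8 * np.sin(2 * np.pi * 40 * t)

# One band-power feature per frequency band, per window.
X = np.array([[band_power(w, lo, hi) for lo, hi in BANDS.values()]
              for w in X_raw])

# Train LDA on the first 150 windows, evaluate on the held-out 50.
clf = LinearDiscriminantAnalysis().fit(X[:150], y[:150])
print(f"held-out accuracy: {clf.score(X[150:], y[150:]):.2f}")
```

In this toy setup the gamma-band features carry the injected class difference, so LDA separates the classes well; with real EEG, performance would depend on the recorded data and preprocessing.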