
Investigation of human state classification via EEG signals elicited by emotional audio-visual stimulation

  • 1232: Human-centric Multimedia Analysis
  • Published:
Multimedia Tools and Applications

Abstract

EEG-based classification of human states remains challenging in human-computer interaction (HCI). Because it reflects brain activity directly, electroencephalography (EEG) offers significant advantages for emotion classification research. In this study, we recorded EEG signals from 12 participants while they watched four-minute emotional audio-visual movie clips. The six states studied were anger, excitement, fear, happiness, sadness, and a neutral state. We preprocessed the raw data to obtain clean signals and extracted the power spectrum using the Fast Fourier Transform (FFT) to generate feature vectors. We then conducted extensive experiments to validate human-state classification using subject-independent machine-learning techniques. The LSTM network achieved the highest classification accuracy of 81.46% across the six emotional states, while the SVM classifier reached only 68.64%. When varying the number of layers in the deep learning models, a two-layer Bi-LSTM network achieved 82.89% accuracy. In conclusion, extensive experiments on our collected dataset indicate that time-sequence models such as the LSTM outperform the other methods, reaching up to 82.89% accuracy. The results also show that long-duration EEG signals are crucial for detecting emotional state changes across subject types, both within individual subjects and across subjects. In future work, we will investigate additional evaluation experiments with deep learning models and propose novel EEG-based features to improve emotion classification accuracy.
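The feature-extraction step described above (FFT power spectrum reduced to a feature vector) might be sketched roughly as follows. This is a minimal illustration, not the authors' exact procedure: the band boundaries, channel count, and the choice to sum FFT power within each canonical band are all assumptions for the example.

```python
import numpy as np

def band_power_features(eeg, fs, bands=((4, 8), (8, 13), (13, 30), (30, 45))):
    """Per-channel spectral power in canonical EEG bands via FFT.

    eeg: array of shape (channels, samples); fs: sampling rate in Hz.
    Returns a feature vector of shape (channels * len(bands),).
    """
    n = eeg.shape[-1]
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg, axis=-1)) ** 2 / n  # power spectrum per channel
    feats = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].sum(axis=-1))       # total band power per channel
    return np.concatenate(feats)

# Example on synthetic data: 2 channels, 4 s at 128 Hz
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 512))
fv = band_power_features(x, fs=128)
print(fv.shape)  # (8,) -> 2 channels x 4 bands
```

Feature vectors of this form can then be fed to a sequence model (e.g. an LSTM over consecutive windows) or to a conventional classifier such as an SVM.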


Figs. 1-6 (omitted)


Data availability

The data cannot be made publicly available because they contain personal information. Interested parties may request the data from the corresponding author upon reasonable request.



Acknowledgements

This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (grant number NRF-2022R1I1A1A01053144).

Author information


Corresponding author

Correspondence to Guiyoung Son.

Ethics declarations

Conflict of interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Lee, W., Son, G. Investigation of human state classification via EEG signals elicited by emotional audio-visual stimulation. Multimed Tools Appl (2023). https://doi.org/10.1007/s11042-023-16294-w


  • Received:

  • Revised:

  • Accepted:

  • Published:

  • DOI: https://doi.org/10.1007/s11042-023-16294-w
