Convolutional Neural Networks on EEG-Based Emotion Recognition

  • Chunbin Li
  • Xiao Sun
  • Yindong Dong
  • Fuji Ren
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1120)


Human-Computer Interaction (HCI) enables people to transfer and exchange information with computers. To make this interaction more natural, integrating emotional factors into HCI has been intensively investigated. In this paper, an effective method is proposed to recognize human emotions from electroencephalogram (EEG) signals, which record the electrical activity of the brain. First, the EEG signals are converted into a multispectral image that preserves the local distance between any two nearby electrodes. Notably, this representation retains the features of the EEG signals in both the frequency and spatial dimensions, unlike standard EEG analysis techniques that discard the spatial locations of the electrodes. A Convolutional Neural Network (CNN) is then applied to the resulting feature images to identify human emotions, exploiting the strong performance of CNNs in image recognition. The publicly available DEAP dataset is used to validate the algorithm. The results show a mean classification accuracy of 81.64% for valence (low vs. high) and 80.25% for arousal (low vs. high) across 32 subjects.
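The image-construction step described above can be sketched as follows. This is a minimal illustration, not the paper's exact pipeline: the electrode coordinates, grid size, and single band-power channel are all assumptions (the paper builds a multispectral image from several frequency bands, and real montages project 3D scalp positions into 2D). Scattered per-electrode values are interpolated onto a regular grid so that nearby electrodes stay nearby in the image.

```python
# Hedged sketch: turn per-electrode band-power values into a 2D image
# whose pixel layout preserves the relative positions of the electrodes.
import numpy as np
from scipy.interpolate import griddata

def eeg_to_image(coords, band_power, size=32):
    """Interpolate scattered electrode values onto a size x size grid.

    coords:     (n_electrodes, 2) 2D positions in [-1, 1], e.g. an
                azimuthal projection of 3D scalp locations (assumption).
    band_power: (n_electrodes,) one spectral-power value per electrode.
    """
    xs = np.linspace(-1.0, 1.0, size)
    grid_x, grid_y = np.meshgrid(xs, xs)
    # Cubic interpolation over the scattered points; pixels outside the
    # electrodes' convex hull are filled with zeros.
    img = griddata(coords, band_power, (grid_x, grid_y),
                   method="cubic", fill_value=0.0)
    return img

# Toy example: four hypothetical electrodes around the scalp center.
coords = np.array([[-0.5, 0.5], [0.5, 0.5], [-0.5, -0.5], [0.5, -0.5]])
power = np.array([1.0, 2.0, 3.0, 4.0])
img = eeg_to_image(coords, power)
print(img.shape)  # one single-band image; stack several bands as channels
```

Stacking one such image per frequency band (e.g. theta, alpha, beta) along the channel axis yields the multichannel input that a standard CNN can then classify.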


Keywords: Emotion recognition · EEG signal · Image · CNN



Copyright information

© Springer Nature Singapore Pte Ltd. 2019

Authors and Affiliations

  1. School of Computer Science and Information Engineering, Hefei University of Technology, Hefei, China
  2. Faculty of Engineering, The University of Tokushima, Tokushima, Japan