
Cognitive Computation, Volume 10, Issue 2, pp 368–380

Hierarchical Convolutional Neural Networks for EEG-Based Emotion Recognition

  • Jinpeng Li
  • Zhaoxiang Zhang
  • Huiguang He

Abstract

Traditional machine learning methods suffer from severe overfitting in EEG-based emotion recognition. In this paper, we use a hierarchical convolutional neural network (HCNN) to classify positive, neutral, and negative emotional states. We organize differential entropy features from different channels into two-dimensional maps to train the HCNN, which preserves the information carried by the spatial topology of the electrodes. We use a stacked autoencoder (SAE), SVM, and KNN as competing methods. HCNN yields the highest accuracy, with SAE slightly inferior; both show a clear advantage over the traditional shallow models, SVM and KNN. We confirm that the high-frequency Beta and Gamma bands are the most suitable for emotion recognition. We visualize the hidden layers of the HCNN to investigate how features are transformed along the hierarchical structure. Benefiting from its strong representation-learning capacity in the two-dimensional space, HCNN is efficient for emotion recognition, especially on the Beta and Gamma waves.
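
To make the feature organization concrete, the sketch below builds a two-dimensional differential entropy (DE) map from per-channel EEG segments so that neighbouring electrodes remain neighbours in the input given to a 2-D convolutional network. This is a minimal illustration rather than the authors' implementation: the 9 × 9 grid, the electrode subset, and the random toy segments are assumptions made for the example; the Gaussian DE formula 0.5·log(2πeσ²) is the standard definition for band-filtered EEG.

```python
import numpy as np

# Hypothetical electrode -> (row, col) positions on a coarse 9x9 scalp grid
# (illustrative subset only; the SEED recordings use a denser montage).
ELECTRODE_GRID = {
    "FP1": (0, 3), "FP2": (0, 5),
    "F7": (2, 0), "F3": (2, 2), "FZ": (2, 4), "F4": (2, 6), "F8": (2, 8),
    "T7": (4, 0), "C3": (4, 2), "CZ": (4, 4), "C4": (4, 6), "T8": (4, 8),
    "P7": (6, 0), "P3": (6, 2), "PZ": (6, 4), "P4": (6, 6), "P8": (6, 8),
    "O1": (8, 3), "O2": (8, 5),
}

def differential_entropy(segment):
    """DE of a band-filtered EEG segment under a Gaussian assumption:
    0.5 * log(2 * pi * e * variance)."""
    return 0.5 * np.log(2.0 * np.pi * np.e * np.var(segment))

def de_feature_map(segments, grid=ELECTRODE_GRID, shape=(9, 9)):
    """Place per-channel DE values onto a 2-D map that mirrors the
    spatial layout of the electrodes; unused grid cells stay zero."""
    fmap = np.zeros(shape, dtype=np.float32)
    for name, seg in segments.items():
        row, col = grid[name]
        fmap[row, col] = differential_entropy(np.asarray(seg, dtype=np.float64))
    return fmap

# Toy usage: one random "band-filtered" segment per electrode.
rng = np.random.default_rng(0)
segments = {name: rng.standard_normal(200) for name in ELECTRODE_GRID}
print(de_feature_map(segments).shape)  # (9, 9): a 2-D input for the CNN
```

One such map could be built per frequency band (e.g., Beta, Gamma) and the maps stacked as channels, giving an image-like tensor that a conventional convolutional network can consume.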

Keywords

Affective brain-computer interface · Emotion recognition · Brain wave · Deep learning · EEG

Notes

Funding

This work was supported by the National Natural Science Foundation of China (91520202, 81671651), the CAS Scientific Equipment Development Project (YJKYYQ20170050), and the Youth Innovation Promotion Association CAS. The authors would also like to thank Prof. Baoliang Lu for providing the SEED dataset.

Compliance with Ethical Standards

Conflict of Interest

The authors declare that they have no conflict of interest.

Ethical Approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. Informed consent was obtained from all individual participants included in the study.

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2017

Authors and Affiliations

  1. Research Center for Brain-inspired Intelligence, Institute of Automation, Chinese Academy of Sciences, Beijing, China
  2. University of Chinese Academy of Sciences (UCAS), Beijing, China
  3. Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Beijing, China
