
Emotion recognition based on physiological signals using brain asymmetry index and echo state network

  • S.I.: Emergence in Human-like Intelligence towards Cyber-Physical Systems

Neural Computing and Applications

A Correction to this article was published on 15 November 2018

Abstract

This paper proposes a method, based on an asymmetry index (AsI), to evaluate the degree of emotion elicited by continuous music videos. Two groups of electroencephalogram (EEG) signals are collected from six channels (Fp1, Fp2, Fz and AF3, AF4, Fz) over the left and right hemispheres, multidimensional directed information is used to measure the mutual information shared between the two frontal lobes, and the AsI is then computed to estimate the degree of emotional induction. To evaluate the effect of AsI processing on physiological emotion recognition, 32-channel EEG signals, 2-channel EEG signals and 2-channel EMG signals are selected for each subject from the DEAP dataset, and different sub-bands are extracted using the wavelet packet transform. The k-means algorithm is used to cluster the wavelet packet coefficients of each sub-band, and the probability distribution of the coefficients over the clusters is calculated. Finally, the probability distribution values of each sample are fed as the original features into an echo state network (ESN) for unsupervised intrinsic plasticity training; the reservoir state nodes are selected as the final feature vector and fed into a support vector machine. The experimental results show that the proposed algorithm achieves an average recognition rate of 70.5% in the subject-independent setting, an increase of 8.73% over the case without AsI. In addition, using the ESN to refine the original physiological features significantly reduces the feature dimensionality and benefits emotion classification. This study can therefore effectively improve the performance of human–machine interface systems based on emotion recognition.
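To make the pipeline described above concrete, the minimal Python sketch below (using PyWavelets and scikit-learn) illustrates the sub-band clustering features and the reservoir-based feature refinement. It is a sketch under stated assumptions rather than the authors' implementation: the wavelet family ('db4'), decomposition level, number of clusters, reservoir size and the way the reservoir is driven are illustrative choices, and the MDI-based AsI weighting and the unsupervised intrinsic plasticity adaptation of the reservoir are omitted.

# Minimal sketch of the described pipeline (not the authors' code).
# Assumed parameters: 'db4' wavelet, level-4 decomposition, 8 clusters per
# sub-band, 100 reservoir neurons; intrinsic plasticity and the AsI weighting
# are omitted for brevity.
import numpy as np
import pywt
from sklearn.cluster import KMeans
from sklearn.svm import SVC


def subband_probability_features(signal, wavelet="db4", level=4, n_clusters=8):
    """Wavelet-packet decompose one channel, cluster each sub-band's
    coefficients with k-means, and use the relative size of each cluster
    (its probability) as the feature for that sub-band."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet,
                            mode="symmetric", maxlevel=level)
    feats = []
    for node in wp.get_level(level, order="freq"):
        coeffs = np.asarray(node.data).reshape(-1, 1)
        labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(coeffs)
        counts = np.bincount(labels, minlength=n_clusters).astype(float)
        feats.append(counts / counts.sum())
    return np.concatenate(feats)


class ReservoirExpander:
    """Fixed echo state reservoir used only as a nonlinear feature expander:
    the state reached after driving the reservoir with a (static) feature
    vector is taken as the refined feature representation."""

    def __init__(self, n_inputs, n_reservoir=100, spectral_radius=0.9, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
        W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
        # Rescale the recurrent weights so the spectral radius is < 1
        # (echo state property).
        self.W = W * (spectral_radius / np.max(np.abs(np.linalg.eigvals(W))))

    def state(self, u, steps=20):
        x = np.zeros(self.W.shape[0])
        for _ in range(steps):
            x = np.tanh(self.W_in @ u + self.W @ x)
        return x


# Usage sketch: `trials` is an array of shape (n_trials, n_samples) holding
# one channel of band-filtered EEG, `y` the binary valence/arousal labels.
# features = np.stack([subband_probability_features(t) for t in trials])
# esn = ReservoirExpander(n_inputs=features.shape[1])
# states = np.stack([esn.state(f) for f in features])
# clf = SVC(kernel="rbf").fit(states, y)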


Change history

  • 15 November 2018

    The authors agree that the AsI model was first developed by Panagiotis Petrantonakis, as presented in references [11] and [18] and in his thesis.

References

  1. Vilar P (2014) Designing the user interface: strategies for effective human–computer interaction (5th edition). Inf Process Manage 61(5):1073–1074

  2. Andreasson R, Alenljung B, Billing E et al (2018) Affective touch in human–robot interaction: conveying emotion to the nao robot. Int J Social Robot 3:1–19

  3. Zhang Z, Tanaka E (2017) Affective computing using clustering method for mapping human’s emotion. In: IEEE international conference on advanced intelligent mechatronics. IEEE, pp 235–240

  4. Fragopanagos N, Taylor JG (2005) Emotion recognition in human–computer interaction. Neural Netw 18(4):389

  5. Hu M, Zheng Y, Ren F et al (2015) Age estimation and gender classification of facial images based on Local Directional Pattern. In: IEEE international conference on cloud computing and intelligence systems. IEEE, pp 103–107

  6. Ren F, Huang Z (2016) Automatic facial expression learning method based on humanoid robot XIN-REN. IEEE Trans Hum Mach Syst 46(6):810–821

  7. Wang K, An N, Li BN et al (2017) Speech emotion recognition using Fourier parameters. IEEE Trans Affect Comput 6(1):69–75

  8. Piana S, Staglianò A, Odone F, Camurri A (2016) Adaptive body gesture representation for automatic emotion recognition. ACM Trans Interact Intell Syst 6(1):6

  9. Ren F (2009) Affective information processing and recognizing human emotion. Elsevier Science Publishers B.V., Amsterdam

  10. Ren F, Wang L (2017) Sentiment analysis of text based on three-way decisions. J Intell Fuzzy Syst 33(1):245–254

  11. Petrantonakis PC, Hadjileontiadis LJ (2012) Adaptive emotional information retrieval from EEG signals in the time–frequency domain. IEEE Trans Signal Process 60(5):2604–2616

  12. Yoon HJ, Chung SY (2013) EEG-based emotion estimation using Bayesian weighted-log-posterior function and perceptron convergence algorithm. Comput Biol Med 43(12):2230–2237

  13. Davidson RJ, Ekman P, Saron CD et al (1990) Approach-withdrawal and cerebral asymmetry: emotional expression and brain physiology: I. J Pers Soc Psychol 58(2):330

  14. Davidson RJ, Schwartz GE, Saron C, Bennett J, Goleman DJ (1979) Frontal versus parietal EEG asymmetry during positive and negative affect. Psychophysiology 16:202–203

  15. Zheng WL, Lu BL (2015) Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks. IEEE Trans Auton Ment Dev 7(3):162–175

  16. Daimi SN, Saha G (2014) Classification of emotions induced by music videos and correlation with participants’ rating. Expert Syst Appl 41(13):6057–6065

  17. Sakata O, Shiina T, Saito Y (2002) Multidimensional directed information and its application. Electron Commun Jpn 85(4):45–55

  18. Petrantonakis PC, Hadjileontiadis LJ (2011) A novel emotion elicitation index using frontal brain asymmetry for enhanced EEG-based emotion recognition. IEEE Trans Inf Technol Biomed 15(5):737–746

  19. Sakata O, Shiina T, Satake T et al (2006) Short-time multidimensional directed coherence for EEG analysis. IEEJ Trans Electr Electron Eng 1(4):408–416

  20. Deshpande G, Laconte S, Peltier S et al (2006) Directed transfer function analysis of fMRI data to investigate network dynamics. In: International conference of the IEEE engineering in medicine and biology society, p 671

  21. Xu X, Ye Z, Peng J (2007) Method of direction-of-arrival estimation for uncorrelated, partially correlated and coherent sources. Microw Antennas Propag IET 1(4):949–954

  22. Roebroeck A, Formisano E, Goebel R (2005) Mapping directed influence over the brain using Granger causality and fMRI. Neuroimage 25(1):230–242

  23. Jaeger H (2001) The “echo state” approach to analysing and training recurrent neural networks. Technical report GMD Report 148. German National Research Center for Information Technology

  24. Jaeger H (2002) Tutorial on training recurrent neural networks, covering BPPT, RTRL, EKF and the echo state network approach. GMD-Forschungszentrum Informationstechnik, Bonn

  25. Han M, Xu M (2018) Subspace echo state network for multivariate time series prediction. IEEE Trans Neural Netw Learn Syst 29(1):238–244

  26. Koprinkova-Hristova P, Tontchev N (2012) Echo state networks for multi-dimensional data clustering. In: International conference on artificial neural networks and machine learning. Springer-Verlag, pp 571–578

  27. Fourati R, Ammar B, Aouiti C et al (2017) Optimized echo state network with intrinsic plasticity for EEG-based emotion recognition. In: International conference on neural information processing. Springer, Cham, pp 718–727

  28. Koelstra S, Muhl C, Soleymani M et al (2012) DEAP: a database for emotion analysis; using physiological signals. IEEE Trans Affect Comput 3(1):18–31

  29. Kanungo T, Mount DM, Netanyahu NS et al (2002) An efficient k-means clustering algorithm: analysis and implementation. IEEE Trans Pattern Anal Mach Intell 24(7):881–892

  30. Hartigan JA (1979) A k-means clustering algorithm. Appl Stat 28(1):100–108

  31. Huang Z (1998) Extensions to the k-means algorithm for clustering large data sets with categorical values. Data Min Knowl Disc 2(3):283–304

  32. Zheng WL, Zhu JY, Lu BL et al (2016) Identifying stable patterns over time for emotion recognition from EEG. IEEE Trans Affect Comput. https://doi.org/10.1109/TAFFC.2017.2712143

  33. Lin YP, Wang CH, Jung TP et al (2010) EEG-based emotion recognition in music listening. IEEE Trans Biomed Eng 57(7):1798–1806

  34. Jenke R, Peer A, Buss M (2017) Feature extraction and selection for emotion recognition from EEG. IEEE Trans Affect Comput 5(3):327–339

  35. Yin Z, Wang Y, Liu L et al (2017) Cross-subject EEG feature selection for emotion recognition using transfer recursive feature elimination. Front Neurorobot 11:19

  36. Schrauwen B, Wardermann M, Verstraeten D et al (2008) Improving reservoirs using intrinsic plasticity. Neurocomputing 71(7–9):1159–1171

  37. Skowronski MD, Harris JG (2007) Automatic speech recognition using a predictive echo state network classifier. Neural Netw 20(3):414–423

  38. Chen J, Hu B, Wang Y et al (2017) A three-stage decision framework for multi-subject emotion recognition using physiological signals. In: IEEE international conference on bioinformatics and biomedicine. IEEE, pp 470–474

Acknowledgements

This research was partially supported by the National Natural Science Foundation of China (Grant No. 61432004) and the NSFC-Shenzhen Joint Foundation (Key Project) (Grant No. U1613217).

Author information

Corresponding author

Correspondence to Yindong Dong.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

About this article

Cite this article

Ren, F., Dong, Y. & Wang, W. Emotion recognition based on physiological signals using brain asymmetry index and echo state network. Neural Comput & Applic 31, 4491–4501 (2019). https://doi.org/10.1007/s00521-018-3664-1
