Affective Recognition Using EEG Signal in Human-Robot Interaction

  • Chen Qian
  • Tingting Hou
  • Yanyu Lu
  • Shan Fu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10906)


Human-robot interaction is a crucial topic in the human factors field, and mechanical arm operation is a widely used form of human-robot interaction. However, mistaken operations caused by the affective fluctuations of operators remain one of the dominant causes of accidents. Because of the close link between affective state and human error, in this paper we analyzed the EEG signals of five subjects operating a mechanical arm, together with the track record of the mechanical arm's movement. A combination label model, comprising a subjective part and an objective part, is proposed to reflect real-time affective fluctuations. In the subsequent recognition experiment, the results indicate that affect is a state of mind that requires a relatively long period of time to be effectively represented, and that frequency domain features are significantly more important than time domain features in affective recognition from EEG signals.


Keywords: Affective recognition · Mechanical arm · Time domain features · Frequency domain features · Multi-scale sliding window
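The paper does not publish its feature-extraction code. As a purely illustrative sketch of the two feature families the abstract compares, the snippet below extracts time-domain features (mean, variance) and frequency-domain features (band power via Welch's method) from a single EEG channel over multi-scale sliding windows. The sampling rate, band boundaries, window scales, and step size are assumptions for the example, not the authors' settings.

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

FS = 250  # assumed sampling rate in Hz (not stated in the abstract)
# Conventional EEG band boundaries; the paper's exact bands may differ.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def window_features(x, fs=FS):
    """Features for one window of a single EEG channel:
    time domain (mean, variance) and frequency domain (band power)."""
    feats = {"mean": float(np.mean(x)), "var": float(np.var(x))}
    # Welch power spectral density; nperseg capped at the window length.
    freqs, psd = welch(x, fs=fs, nperseg=min(len(x), fs))
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        # Integrate the PSD over the band to get band power.
        feats[name] = float(trapezoid(psd[mask], freqs[mask]))
    return feats

def sliding_features(x, win_sizes=(1.0, 2.0, 4.0), step=0.5, fs=FS):
    """Multi-scale sliding windows: one (scale, start_time, features)
    tuple per window position at each scale."""
    out = []
    for w in win_sizes:
        n, s = int(w * fs), int(step * fs)
        for start in range(0, len(x) - n + 1, s):
            out.append((w, start / fs, window_features(x[start:start + n], fs)))
    return out

if __name__ == "__main__":
    # Synthetic 10 Hz oscillation standing in for alpha-band EEG activity.
    t = np.arange(0, 10, 1 / FS)
    eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(len(t))
    feats = sliding_features(eeg)
    print(len(feats), feats[0][2])
```

A longer window gives finer spectral resolution at the cost of temporal precision, which is one plausible reading of the abstract's observation that affect needs a relatively long period of time to be represented effectively.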



Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University, Shanghai, People's Republic of China