Active Feedback Framework with Scan-Path Clustering for Deep Affective Models

  • Li-Ming Zhao
  • Xin-Wei Li
  • Wei-Long Zheng
  • Bao-Liang Lu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11302)


The attention that subjects pay during EEG-based emotion recognition experiments can seriously affect both the level of emotion induction and the annotation quality of the recorded EEG data. It is therefore important to evaluate the raw EEG data before training a classification model. In this paper, we propose a framework that uses eye-tracking data to filter out low-quality EEG data from participants with low attention, and thereby boosts the performance of deep affective models built with CNNs and LSTMs. We introduce a novel attention-deprived experiment with dual tasks, in which the dominant task is an auditory continuous performance test, identical pairs version (CPT-IP), and the subtask is an emotion-eliciting experiment. Motivated by the observation that attentive subjects share similar scan-path patterns when watching the same clips, we adopt a cosine-distance-based spatial-temporal scan-path analysis of the eye-tracking data to cluster these similar scan-paths. The average emotion recognition accuracy using the selected attentive EEG data is about 3% higher than that of the original training set without filtering. We also find that as the scan-path distance between an outlier and the cluster center increases, the performance of the corresponding EEG data tends to decrease.
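The filtering idea described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes each subject's scan-path has been encoded as a fixed-length numeric vector (e.g., concatenated fixation coordinates and durations over a shared time grid), takes the element-wise mean of all vectors as the cluster center, and flags as outliers those scan-paths whose cosine distance to the center exceeds a hypothetical threshold.

```python
import math

def cosine_distance(u, v):
    """Cosine distance: 1 - cos(angle between u and v)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return 1.0 - dot / (norm_u * norm_v)

def filter_by_scanpath(scanpaths, threshold=0.2):
    """Split subject indices into attentive (keep) and outlier (drop) sets.

    scanpaths : list of equal-length scan-path feature vectors, one per subject
    threshold : hypothetical cosine-distance cutoff from the cluster center
    """
    n, dim = len(scanpaths), len(scanpaths[0])
    # Cluster center = element-wise mean of all scan-path vectors.
    center = [sum(sp[i] for sp in scanpaths) / n for i in range(dim)]
    keep, drop = [], []
    for idx, sp in enumerate(scanpaths):
        d = cosine_distance(sp, center)
        (keep if d <= threshold else drop).append(idx)
    return keep, drop

# Three similar scan-paths and one dissimilar one: the last is flagged.
keep, drop = filter_by_scanpath(
    [[1, 0, 0], [0.9, 0.1, 0], [1, 0.05, 0], [0, 0, 1]]
)
```

In the paper's setting, the EEG segments belonging to the `drop` indices would be excluded before training the deep affective model; the threshold and the scan-path encoding are free parameters of the framework.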


Keywords: Eye tracking · Scan-path · Attention evaluation · EEG data filtering



This work was supported in part by the grants from the National Key Research and Development Program of China (Grant No. 2017YFB1002501), the National Natural Science Foundation of China (Grant No. 61673266), and the Fundamental Research Funds for the Central Universities.



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Li-Ming Zhao (1)
  • Xin-Wei Li (1)
  • Wei-Long Zheng (1)
  • Bao-Liang Lu (1, 2, 3)

  1. Center for Brain-like Computing and Machine Intelligence, Department of Computer Science and Engineering, Shanghai Jiao Tong University, Shanghai, China
  2. Key Laboratory of Shanghai Education Commission for Intelligent Interaction and Cognitive Engineering, Shanghai Jiao Tong University, Shanghai, China
  3. Brain Science and Technology Research Center, Shanghai Jiao Tong University, Shanghai, China
