
A deep multi-source adaptation transfer network for cross-subject electroencephalogram emotion recognition

  • Original Article
  • Published in: Neural Computing and Applications

Abstract

In real-world applications of affective brain–computer interfaces (aBCI), individual differences across subjects and the non-stationary characteristics of electroencephalogram (EEG) signals cause data bias. Moreover, for a new subject the amount of sample data is very small compared with that of existing subjects, which easily leads to overfitting during deep neural network training and reduces the generalization performance of the network. In this paper, a deep multi-source adaptation transfer network (DMATN) is proposed for new subjects in aBCI. In DMATN, multi-source selection is employed to obtain the portion of existing EEG data most correlated with the new subject, reducing the source data by two-fifths. To explore domain-invariant structures, a deep adaptation network maps the correlated source domains and the target domain (the new subject) into a reproducing kernel Hilbert space (RKHS) optimized by the multiple-kernel variant of maximum mean discrepancy (MK-MMD). To predict the emotional state of the new subject more precisely, a domain discriminator is applied in DMATN to make the data distributions of the two domains more similar. Finally, cross-subject experiments on the SEED dataset are conducted to evaluate the proposed method. The experimental results show that the DMATN model achieves state-of-the-art accuracies of 84.46%, 83.32% and 84.90% in three sessions, respectively, and also shows high time efficiency in aBCI applications.
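The MK-MMD criterion mentioned in the abstract measures the distance between source and target feature distributions in an RKHS, using a combination of several kernels rather than a single one. As a minimal illustrative sketch (not the paper's implementation), the biased estimator of squared MK-MMD with a sum of RBF kernels can be written as follows; the bandwidths, sample sizes, and feature dimension here are hypothetical choices for the example:

```python
import numpy as np

def mk_mmd2(X, Y, gammas=(0.5, 1.0, 2.0)):
    """Biased estimate of squared MMD under a multi-kernel:
    a sum of RBF kernels with the given (hypothetical) bandwidths."""
    def k(A, B):
        # pairwise squared Euclidean distances between rows of A and B
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        # multi-kernel: sum of RBF kernels over all bandwidths
        return sum(np.exp(-g * d2) for g in gammas)
    # E[k(x,x')] + E[k(y,y')] - 2 E[k(x,y)]
    return k(X, X).mean() + k(Y, Y).mean() - 2.0 * k(X, Y).mean()

rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(64, 4))   # source-domain features
tgt = rng.normal(2.0, 1.0, size=(64, 4))   # mean-shifted target-domain features
same = rng.normal(0.0, 1.0, size=(64, 4))  # drawn from the same distribution as src

# the discrepancy is larger when the two distributions actually differ
assert mk_mmd2(src, tgt) > mk_mmd2(src, same)
```

In a deep adaptation network, a term like `mk_mmd2` is evaluated on the hidden-layer features of source and target batches and added to the classification loss, so that minimizing the total loss pulls the two feature distributions together.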


Availability of data and material

The datasets used or analyzed during the current study are available from http://bcmi.sjtu.edu.cn/~seed/index.html.


Funding

We wish to acknowledge the support of the Natural Science Foundation of China under Grant 61973065 and the Fundamental Research Funds for the Central Universities of China under Grants N182612002 and N2026002.

Author information

Authors and Affiliations

Authors

Contributions

Fei Wang and Weiwei Zhang contributed equally to the work.

Corresponding author

Correspondence to Fei Wang.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Code availability

The code used during the current study is available from the corresponding author on reasonable request.

Consent to publication

The authors declare that they consent to publication.

Ethics approval

This article does not contain any studies with human participants performed by any of the authors.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Wang, F., Zhang, W., Xu, Z. et al. A deep multi-source adaptation transfer network for cross-subject electroencephalogram emotion recognition. Neural Comput & Applic 33, 9061–9073 (2021). https://doi.org/10.1007/s00521-020-05670-4

