
Emotion Recognition Using Multimodal Deep Learning

Part of the Lecture Notes in Computer Science book series (LNTCS, volume 9948)

Abstract

To enhance the performance of affective models and to reduce the cost of acquiring physiological signals in real-world applications, we adopt a multimodal deep learning approach to construct affective models on the SEED and DEAP datasets for recognizing different kinds of emotions. We demonstrate that the high-level representation features extracted by the Bimodal Deep AutoEncoder (BDAE) are effective for emotion recognition. With the BDAE network, we achieve mean accuracies of 91.01% and 83.25% on the SEED and DEAP datasets, respectively, which are much superior to those of the state-of-the-art approaches. By analysing the confusion matrices, we found that EEG features and eye features contain complementary information, and that the BDAE network can fully take advantage of this complementarity to enhance emotion recognition.
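The core idea in the abstract is fusing two modalities (EEG features and eye-movement features) into a single shared high-level representation before classification. The sketch below illustrates only that fusion pattern with a plain feed-forward encoder in NumPy; the real BDAE is trained differently (e.g., with unsupervised pretraining), and every dimension, weight scale, and name here is an illustrative assumption, not the authors' configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature dimensions for illustration only; the paper's actual
# EEG/eye feature sizes and layer widths are not given in this abstract.
N_EEG, N_EYE, N_HID_EEG, N_HID_EYE, N_SHARED = 310, 33, 100, 20, 50

def init_layer(n_in, n_out):
    """Small random weights and zero biases for one dense layer."""
    return rng.normal(0.0, 0.1, (n_in, n_out)), np.zeros(n_out)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# One encoder per modality, plus a shared layer that fuses both codes.
W_eeg, b_eeg = init_layer(N_EEG, N_HID_EEG)
W_eye, b_eye = init_layer(N_EYE, N_HID_EYE)
W_sh, b_sh = init_layer(N_HID_EEG + N_HID_EYE, N_SHARED)

def encode(eeg, eye):
    """Map (batch, N_EEG) and (batch, N_EYE) inputs to one shared code."""
    h_eeg = sigmoid(eeg @ W_eeg + b_eeg)          # EEG-specific code
    h_eye = sigmoid(eye @ W_eye + b_eye)          # eye-specific code
    fused = np.concatenate([h_eeg, h_eye], axis=1)
    return sigmoid(fused @ W_sh + b_sh)           # shared representation

# The shared code is what a downstream emotion classifier would consume.
z = encode(rng.normal(size=(4, N_EEG)), rng.normal(size=(4, N_EYE)))
print(z.shape)  # (4, 50)
```

In the paper's setting, such shared features would then be fed to a classifier; the point of the fused representation is that it can exploit the complementary information in the two modalities that the abstract describes.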

Keywords

  • EEG
  • Emotion recognition
  • Multimodal deep learning
  • Auto-encoder


Notes

  1. http://bcmi.sjtu.edu.cn/~seed/index.html.


Acknowledgments

This work was supported in part by grants from the National Natural Science Foundation of China (Grant No. 61272248), the National Basic Research Program of China (Grant No. 2013CB329401), and the Major Basic Research Program of Shanghai Science and Technology Committee (15JC1400103).

Author information

Correspondence to Bao-Liang Lu.


Copyright information

© 2016 Springer International Publishing AG

About this paper

Cite this paper

Liu, W., Zheng, W.-L., Lu, B.-L. (2016). Emotion Recognition Using Multimodal Deep Learning. In: Hirose, A., Ozawa, S., Doya, K., Ikeda, K., Lee, M., Liu, D. (eds) Neural Information Processing. ICONIP 2016. Lecture Notes in Computer Science, vol 9948. Springer, Cham. https://doi.org/10.1007/978-3-319-46672-9_58

  • DOI: https://doi.org/10.1007/978-3-319-46672-9_58

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-46671-2

  • Online ISBN: 978-3-319-46672-9

  • eBook Packages: Computer Science, Computer Science (R0)