Emotion Recognition Based on Graph Neural Networks

  • Conference paper
  • First Online:
Cognitive Systems and Signal Processing (ICCSIP 2020)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1397)

Abstract

Emotion recognition is widely used in many areas, such as medicine and education. Because micro- and macro-expressions differ markedly in duration and intensity, a single model cannot classify both kinds of emotion precisely. In this paper, an algorithm for emotion recognition based on graph neural networks is proposed. The method involves four key steps. First, data augmentation is used to increase the diversity of the original data. Second, a graph is built from facial feature points, with the Euclidean distances between feature points serving as the initial values of the adjacency matrix. Third, the Laplacian matrix is derived from this adjacency matrix. Finally, a graph neural network is used to map the resulting feature vectors to emotions. In addition, a new dataset named FEC-13 is provided by subdividing the traditional six emotion classes into thirteen categories according to emotion intensity. The experimental results show that high accuracy is reached with a small amount of training data, especially on the CASME II dataset, where an accuracy of 95.49% is achieved. A cross-database study indicates that the proposed method generalizes well, reaching an accuracy of 74.99% on the FEC-13 dataset.
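
To make the graph-construction steps in the abstract concrete, the following Python sketch assumes facial landmarks are already available as 2-D coordinates (for example, 68 points from a standard landmark detector). It builds a distance-based adjacency matrix, derives the symmetric normalized Laplacian, and applies one graph-convolution step in the style of Kipf and Welling's GCN. It is a minimal illustration rather than the authors' implementation; all function names and parameter choices are assumptions.

```python
# Minimal sketch of the pipeline described above: facial feature points become graph
# nodes, pairwise Euclidean distances initialize the adjacency matrix, a normalized
# Laplacian is derived from it, and one graph-convolution step propagates node
# features. Not the authors' code; the 68-landmark count, the use of raw distances
# as edge weights, and every name below are illustrative assumptions.
import numpy as np


def build_adjacency(landmarks: np.ndarray) -> np.ndarray:
    """landmarks: (N, 2) feature-point coordinates -> (N, N) distance matrix."""
    diff = landmarks[:, None, :] - landmarks[None, :, :]
    adj = np.linalg.norm(diff, axis=-1)   # pairwise Euclidean distances
    np.fill_diagonal(adj, 0.0)            # no self-edges in the initial matrix
    return adj


def normalized_laplacian(adj: np.ndarray) -> np.ndarray:
    """Symmetric normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
    D = np.diag(d_inv_sqrt)
    return np.eye(adj.shape[0]) - D @ adj @ D


def gcn_layer(adj: np.ndarray, feats: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """One propagation step in the style of Kipf & Welling's GCN:
    H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W)."""
    a_hat = adj + np.eye(adj.shape[0])                # add self-loops
    d_inv_sqrt = np.diag(a_hat.sum(axis=1) ** -0.5)   # degrees are positive here
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt
    return np.maximum(a_norm @ feats @ weights, 0.0)  # ReLU


# Toy usage: random points stand in for detected facial landmarks.
rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 1.0, size=(68, 2))             # 68 landmarks (assumed)
A = build_adjacency(pts)
L = normalized_laplacian(A)
H = gcn_layer(A, feats=pts, weights=rng.normal(size=(2, 16)))
print(L.shape, H.shape)                               # (68, 68) (68, 16)
```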

Acknowledgment

This work was supported in part by the Planning Subject for the 13th Five-Year Plan of Beijing Education Sciences (CADA18069).

Author information

Corresponding author

Correspondence to Guangmin Sun.

Copyright information

© 2021 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Zhang, J., Sun, G., Zheng, K., Mazhar, S., Fu, X., Yang, D. (2021). Emotion Recognition Based on Graph Neural Networks. In: Sun, F., Liu, H., Fang, B. (eds) Cognitive Systems and Signal Processing. ICCSIP 2020. Communications in Computer and Information Science, vol 1397. Springer, Singapore. https://doi.org/10.1007/978-981-16-2336-3_45

  • DOI: https://doi.org/10.1007/978-981-16-2336-3_45

  • Published:

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-16-2335-6

  • Online ISBN: 978-981-16-2336-3

  • eBook Packages: Computer Science, Computer Science (R0)
