Multi-view Face Expression Recognition—A Hybrid Method

Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 325)

Abstract

Facial expressions play a significant role in human communication, and their automatic recognition has several applications, especially in human–computer interaction. Recognizing facial actions is challenging due to non-rigid facial deformation, varying head pose, and ambiguous, uncertain measurements of facial components. This paper proposes a hybrid approach for face pose detection and facial expression recognition. To speed up the expression evaluation process, pose estimation is carried out prior to feature extraction so that the appropriate shape model can be selected. The major contribution of this paper is a hybrid classification method that uses AdaBoost for Action Unit classification and temporal rule-based classification to correct Action Unit errors. The experimental results show that this hybrid classification method performs better than the other classifiers considered, which improves the overall performance of the system.
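
The sketch below illustrates the two-stage hybrid classification idea described in the abstract: per-Action-Unit AdaBoost classifiers on tracked geometric features, followed by a temporal rule that corrects isolated, single-frame Action Unit flips. It is a minimal illustration, not the authors' implementation; the feature layout, the two Action Units, the synthetic data, and the specific correction rule are all assumptions, and scikit-learn's AdaBoostClassifier stands in for whatever boosting variant the paper uses.

    # Illustrative sketch of "AdaBoost per AU + temporal rule-based correction".
    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier

    rng = np.random.default_rng(0)

    # Synthetic stand-in for tracked feature-point displacements (frames x features).
    n_frames, n_features = 200, 20
    X = rng.normal(size=(n_frames, n_features))
    # Two hypothetical Action Units (e.g., AU12 "lip corner puller", AU4 "brow lowerer").
    y = {"AU12": (X[:, 0] > 0).astype(int), "AU4": (X[:, 1] > 0).astype(int)}

    # Stage 1: one AdaBoost classifier per Action Unit, giving frame-wise decisions.
    classifiers = {
        au: AdaBoostClassifier(n_estimators=50, random_state=0).fit(X, labels)
        for au, labels in y.items()
    }
    raw = {au: clf.predict(X) for au, clf in classifiers.items()}

    def temporal_correction(labels, window=1):
        """Flip AU decisions that disagree with both temporal neighbours
        (a crude stand-in for the paper's temporal rule-based correction)."""
        corrected = labels.copy()
        for t in range(window, len(labels) - window):
            prev_, next_ = labels[t - window], labels[t + window]
            if labels[t] != prev_ and prev_ == next_:
                corrected[t] = prev_
        return corrected

    # Stage 2: rule-based temporal correction of the frame-wise AU predictions.
    final = {au: temporal_correction(pred) for au, pred in raw.items()}
    for au in final:
        print(au, "corrected frames:", int((final[au] != raw[au]).sum()))

The design choice mirrored here is that boosting handles the per-frame decision while cheap temporal rules absorb sporadic misclassifications, so errors from the statistical classifier can be corrected without retraining it.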

Keywords

Expression recognition · Pose estimation · Action unit classification · Feature point tracking · Active shape model

Copyright information

© Springer India 2015

Authors and Affiliations

  1. Department of Information Science and Technology, Anna University, Chennai, India