Intensity Estimation of the Real-World Facial Expression

  • Yan Gao
  • Shan Li
  • Weihong Deng
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 662)

Abstract

Affective computing, or automatic affect sensing, has attracted extensive interest from researchers in machine learning and pattern recognition. Most previous work has focused on face detection and emotion recognition, whereas this research explores facial expression intensity estimation, which concerns the dynamic changes of a face. The CK+ database and the Real-world Affective Face Database (RAF-DB) are used to implement and evaluate the algorithms in this paper. To address intensity estimation, both classification and ranking algorithms are trained and tested on intensity levels. The performance of five different feature representations is first evaluated using the classification accuracy; the best-performing representation is then used as the input to the ranking models. Learning-to-rank techniques from information retrieval are applied to the intensity ranking problem: RankSVM and RankBoost serve as frameworks to estimate ranking scores over image sequences, and the resulting scores are evaluated with standard information-retrieval metrics. The algorithms used in this research are systematically organized and compared to obtain an optimal model for the ranking task.
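The abstract does not give implementation details, but the pairwise RankSVM idea it refers to can be illustrated with a minimal sketch: differences of feature vectors between frames with different intensity labels are cast as a binary classification problem, and the learned linear score is used to order the frames of a sequence. The feature dimensionality, the pairwise_transform helper, and the use of scikit-learn's LinearSVC are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.svm import LinearSVC

def pairwise_transform(X, y):
    """Build difference vectors for all frame pairs with different
    intensity labels; the sign of the label difference is the target."""
    X_pairs, y_pairs = [], []
    for i in range(len(y)):
        for j in range(len(y)):
            if y[i] == y[j]:
                continue  # ties carry no ordering information
            X_pairs.append(X[i] - X[j])
            y_pairs.append(1 if y[i] > y[j] else -1)
    return np.asarray(X_pairs), np.asarray(y_pairs)

# Toy data standing in for per-frame appearance features (e.g. descriptors
# extracted from a CK+ sequence) with ordinal intensity labels 0..4.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 64))        # 30 frames, 64-dim features (assumed)
y = np.repeat(np.arange(5), 6)       # intensity level of each frame

X_diff, y_diff = pairwise_transform(X, y)
ranker = LinearSVC(C=1.0).fit(X_diff, y_diff)

# The learned weight vector induces a scalar intensity score per frame;
# sorting frames by this score yields the estimated intensity ranking.
scores = X @ ranker.coef_.ravel()
ranking = np.argsort(-scores)
```

In this sketch the ranking quality could then be scored with information-retrieval style metrics on the predicted ordering, in the spirit of the evaluation described above.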

Keywords

Intensity level · Learning to rank · RankSVM

Notes

Acknowledgement

This work was partially supported by the National Natural Science Foundation of China (NSFC) under Grants No. 61375031, No. 61573068, No. 61471048, and No. 61273217, and by the Fundamental Research Funds for the Central Universities under Grant No. 2014ZD03-01. This work was also supported by the Beijing Nova Program, the CCF-Tencent Open Research Fund, and the Program for New Century Excellent Talents in University.

Copyright information

© Springer Nature Singapore Pte Ltd. 2016

Authors and Affiliations

  1. Beijing University of Posts and Telecommunications, Beijing, China