Personalized Modeling of Facial Action Unit Intensity

  • Shuang Yang
  • Ognjen Rudovic
  • Vladimir Pavlovic
  • Maja Pantic
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8888)


Facial expressions depend greatly on the facial morphology and expressiveness of the observed person. Recent studies have shown that personalized models substantially outperform non-personalized ones in a variety of facial-expression-related tasks, such as face and emotion recognition. However, in the context of facial action unit (AU) intensity estimation, personalized modeling has scarcely been investigated. In this paper, we propose a two-step approach for personalized modeling of facial AU intensity from spontaneously displayed facial expressions. In the first step, we perform facial feature decomposition using the proposed matrix decomposition algorithm, which separates the person's identity from the facial expression. These two components are then jointly modeled within the framework of Conditional Ordinal Random Fields, resulting in a personalized model for AU intensity estimation. Our experimental results show that the proposed personalized model largely outperforms non-personalized models for AU intensity estimation.
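The first step described above (separating a person's identity from the expression component of facial features) can be illustrated with a minimal sketch. This is not the paper's actual decomposition algorithm; it is a toy stand-in, assuming synthetic feature vectors and approximating the identity component as the per-subject mean, with the time-varying residual playing the role of the expression part:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for facial features: rows are video frames of one
# subject, columns are feature dimensions (e.g., landmark coordinates).
# The data are generated as identity offset plus expression variation.
n_frames, n_feats = 50, 10
identity = rng.normal(size=n_feats)                        # person-specific offset
expression = rng.normal(scale=0.3, size=(n_frames, n_feats))
X = identity + expression

# Toy decomposition in the spirit of the paper's first step: estimate the
# identity component as the per-subject mean (constant over time) and take
# the residual as the expression component.
identity_hat = X.mean(axis=0)
expression_hat = X - identity_hat

# The residual carries the expression dynamics that an intensity estimator
# would consume, while identity_hat personalizes the model to the subject.
print(np.allclose(identity_hat + expression_hat, X))  # → True
```

In the paper itself, the two components are then modeled jointly by a Conditional Ordinal Random Field rather than used independently; the sketch only shows the additive identity-plus-expression view of the features.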


Keywords: Facial Expression · Facial Feature · Emotion Recognition · Identity Component · Transfer Learning





Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Shuang Yang (1)
  • Ognjen Rudovic (1)
  • Vladimir Pavlovic (2)
  • Maja Pantic (1, 3)

  1. Department of Computing, Imperial College London, UK
  2. Department of Computer Science, Rutgers University, USA
  3. EEMCS, University of Twente, The Netherlands
