Performance Driven Facial Animation by Appearance Based Tracking

  • José Miguel Buenaposada
  • Enrique Muñoz
  • Luis Baumela
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3522)

Abstract

We present a method that estimates high-level animation parameters (muscle contractions, eye movements, eyelid opening, jaw motion and lip contraction) from a marker-less face image sequence. We use an efficient appearance-based tracker to stabilise images of the upper (eyes and eyebrows) and lower (mouth) face. From a set of stabilised images with known animation parameters, we learn a re-animation matrix that allows us to estimate the parameters of a new image. The system is able to re-animate a 32-DOF 3D face model in real time.
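The re-animation step described above amounts to a linear mapping from stabilised appearance vectors to animation parameters, fit from training pairs. The following is a minimal illustrative sketch of that idea using ordinary least squares; all names, dimensions, and the synthetic data are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

# Synthetic stand-in for training data: columns of X are stabilised face
# images (flattened pixels), columns of P are the corresponding known
# animation parameters of a 32-DOF face model.
rng = np.random.default_rng(0)
n_pixels, n_params, n_train = 200, 32, 500

A_true = rng.normal(size=(n_params, n_pixels))  # hidden generating map
X = rng.normal(size=(n_pixels, n_train))        # stabilised training images
P = A_true @ X                                  # known animation parameters

# Learn the re-animation matrix R by least squares: minimise ||R X - P||_F.
# Solving X^T R^T = P^T gives R^T column-by-column.
Rt, *_ = np.linalg.lstsq(X.T, P.T, rcond=None)
R = Rt.T

# Estimate the parameters of a new stabilised image.
x_new = rng.normal(size=n_pixels)
p_est = R @ x_new
print(p_est.shape)  # 32 animation parameters
```

With noiseless data and more training images than pixels, the recovered matrix matches the generating map; in practice the stabilised images would come from the appearance-based tracker and the training parameters from a labelled sequence.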

Keywords

Facial Expression, Training Sequence, Face Animation Parameter, Motion Template

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • José Miguel Buenaposada 1
  • Enrique Muñoz 2
  • Luis Baumela 2
  1. Dpto. de Informática, Estadística y Telemática, ESCET, Univ. Rey Juan Carlos, Móstoles, Madrid, Spain
  2. Fac. de Informática, Univ. Politécnica de Madrid, Boadilla del Monte, Madrid, Spain