Performance-Driven Facial Expression Real-Time Animation Generation

  • Zhang Mandun
  • Huo Jianglei
  • Na Shenruoyang
  • Huang Chunmeng
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9219)

Abstract

To achieve realistic facial expression animation with efficient expression reconstruction, we propose a novel method for real-time facial expression reconstruction. Our pipeline begins by capturing feature points on an actor's face with a Kinect device. A simple face model is constructed, and 38 feature points are chosen manually as controls. We then track the actor's face in real time and reconstruct the target model using two different deformation algorithms. Experimental results show that our method reconstructs facial expressions efficiently and at low cost; the expressions on the target model are realistic and synchronized with the actor's.
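The abstract describes driving a target model from tracked feature points via deformation algorithms, without naming them. As a hypothetical illustration only (not the authors' method), one standard choice for this kind of control-point-driven reconstruction is Laplacian deformation: solve a least-squares system that preserves the rest pose's differential coordinates while pulling handle vertices (the tracked feature points) toward their captured positions. A minimal sketch, assuming a uniform graph Laplacian and soft positional constraints:

```python
import numpy as np

def laplacian_deform(verts, edges, handles, w=10.0):
    """Deform a mesh by least-squares Laplacian editing (illustrative sketch).

    verts:   (n, 3) rest-pose vertex positions
    edges:   list of undirected edges (i, j)
    handles: dict {vertex_index: target_position}, e.g. the tracked
             feature points driving the target model
    w:       constraint weight (assumed value, not from the paper)
    """
    n = len(verts)
    # Uniform graph Laplacian L = D - A
    L = np.zeros((n, n))
    for i, j in edges:
        L[i, i] += 1.0
        L[j, j] += 1.0
        L[i, j] -= 1.0
        L[j, i] -= 1.0
    delta = L @ verts  # differential coordinates of the rest pose

    # Soft positional constraints for the handle (feature-point) vertices
    C = np.zeros((len(handles), n))
    d = np.zeros((len(handles), 3))
    for row, (idx, target) in enumerate(handles.items()):
        C[row, idx] = w
        d[row] = w * np.asarray(target, dtype=float)

    # Minimize ||L x' - delta||^2 + w^2 * sum ||x'_c - target_c||^2
    A = np.vstack([L, C])
    b = np.vstack([delta, d])
    new_verts, *_ = np.linalg.lstsq(A, b, rcond=None)
    return new_verts
```

In a real-time pipeline, the per-frame work reduces to updating the right-hand side with the newly tracked handle positions; the system matrix can be pre-factored once, which is what makes this family of methods fast enough for live puppetry.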

Keywords

Kinect · Expression animation · Deformation algorithm · Facial reconstruction


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Zhang Mandun (1)
  • Huo Jianglei (1)
  • Na Shenruoyang (1)
  • Huang Chunmeng (1)

  1. School of Computer Science and Engineering, Hebei University of Technology, Tianjin, China
