An interactive facial expression generation system

Abstract

Generating vivid facial expressions by computer has long been an interesting and challenging problem. Some research adopts an anatomical approach, studying the relationships between expressions and the underlying bones and muscles; MPEG-4's SNHC (synthetic/natural hybrid coding), on the other hand, provides mechanisms for describing facial expressions and animations in detail. Unlike most existing approaches, which require a 3D head model, a set of reference images, detailed facial feature markers, numerous associated parameters, or even non-trivial user assistance, our proposed approach is simple, intuitive and interactive, yet still capable of generating vivid 2D facial expressions. With our system, a user only needs to supply a single photo and spend a couple of seconds roughly marking the positions of the eyes, eyebrows and mouth; the system then traces the contours of these facial features more accurately using the active contour technique. Different expressions are subsequently generated and morphed via the mesh warping algorithm. Another contribution of this paper is a simple music emotion analysis algorithm, which is coupled with our system to further demonstrate the effectiveness of the facial expression generation: the system identifies the emotions of a music piece and displays them through the synthesized facial expressions. Experimental results show that the end-to-end generation time, from the moment an input photo is given to the moment the final facial expressions are produced, is about one minute.
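The contour-refinement step mentioned above can be illustrated with a short sketch. The following Python fragment is not the implementation described in this paper; it is a minimal example of the same idea, under stated assumptions: a rough, user-marked ellipse around a facial feature is refined into a tighter contour by an active contour ("snake"), here via scikit-image's active_contour. The photo file name and the ellipse centre and radii are hypothetical placeholders standing in for the user's rough marking.

```python
# Minimal sketch (not the authors' system): refine a rough, user-marked
# ellipse around a facial feature (e.g. the mouth) into a tighter contour
# with an active contour ("snake"), using scikit-image.
import numpy as np
from skimage import color, io
from skimage.filters import gaussian
from skimage.segmentation import active_contour

# Hypothetical input photo; in the paper's setting this is the single
# face photo supplied by the user.
img = color.rgb2gray(io.imread("face_photo.jpg"))

# Rough user marking: an ellipse around the mouth, given as (row, col)
# points.  Centre (320, 250) and radii (40, 70) are placeholder values.
t = np.linspace(0, 2 * np.pi, 200)
init = np.column_stack([320 + 40 * np.sin(t),
                        250 + 70 * np.cos(t)])

# Smooth the image so the snake settles onto feature edges rather than
# pixel noise; alpha, beta and gamma control the contour's elasticity,
# rigidity and step size.
refined = active_contour(
    gaussian(img, sigma=3, preserve_range=False),
    init,
    alpha=0.015, beta=10.0, gamma=0.001,
)

print(refined.shape)  # (200, 2) array of refined (row, col) contour points
```

The refined contour points could then serve as control points for a mesh-warping step that morphs the face between expressions, in the spirit of the pipeline sketched in the abstract.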


Author information

Corresponding author

Correspondence to Chuan-Kai Yang.

About this article

Cite this article

Yang, C.-K., Chiang, W.-T. An interactive facial expression generation system. Multimed Tools Appl 40, 41–60 (2008). https://doi.org/10.1007/s11042-007-0184-x
