Abstract
Human interfaces for computer graphics systems are evolving towards a multi-modal approach. Information gathered with visual, audio, and motion-capture systems is becoming increasingly important within user-controlled virtual environments. This paper discusses real-time interaction with a virtual world through the visual analysis of human facial features from video. The underlying approach to recognizing and analyzing the facial movements of a real performance is described in detail. The output of the system is compatible with the MPEG-4 standard, so the resulting data can be used in any other MPEG-4 compatible application. Because MPEG-4 focuses largely on networking capabilities, the system also offers interesting possibilities for teleconferencing, as its network bandwidth requirements are quite low. The real-time facial analysis system lets the user control facial animation; it is used primarily with real-time facial animation systems, where a synthetic actor reproduces the animator's expressions.
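For illustration, MPEG-4 facial animation expresses tracked feature displacements as Facial Animation Parameters (FAPs) normalized by Facial Animation Parameter Units (FAPUs), which are derived from neutral-face distances; this normalization is what keeps the transmitted data compact and model-independent. The following is a minimal sketch of that conversion, not the authors' implementation; the pixel values and the feature chosen (mouth width) are hypothetical.

```python
# Sketch of MPEG-4-style FAP encoding (not the paper's actual code).
# MPEG-4 defines a FAPU as a neutral-face reference distance divided by 1024,
# and a FAP value as a feature displacement expressed in those units.

def fapu(neutral_distance: float) -> float:
    """One FAPU: the corresponding neutral-face distance divided by 1024."""
    return neutral_distance / 1024.0

def to_fap(displacement: float, unit: float) -> int:
    """Express a tracked feature displacement as an integer count of FAPUs."""
    return round(displacement / unit)

# Hypothetical numbers: a neutral mouth width of 60 px defines the unit,
# and a lip corner tracked 3 px outward from its neutral position.
mw_fapu = fapu(60.0)
fap_value = to_fap(3.0, mw_fapu)
print(fap_value)  # 3 / (60/1024) = 51.2, rounded to 51
```

Because only small integers per active FAP need to be transmitted each frame, the bandwidth of such a stream stays far below that of video, which is the property the abstract highlights for teleconferencing.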
Copyright information
© 1999 Springer-Verlag Wien
About this paper
Cite this paper
Goto, T., Escher, M., Zanardi, C., Magnenat-Thalmann, N. (1999). MPEG-4 based animation with face feature tracking. In: Magnenat-Thalmann, N., Thalmann, D. (eds) Computer Animation and Simulation ’99. Eurographics. Springer, Vienna. https://doi.org/10.1007/978-3-7091-6423-5_9
DOI: https://doi.org/10.1007/978-3-7091-6423-5_9
Publisher Name: Springer, Vienna
Print ISBN: 978-3-211-83392-6
Online ISBN: 978-3-7091-6423-5