Abstract
In recent years, networked virtual environments have grown into a frontier of social computing. These shared spaces are typically accessed by multiple users through their 3D avatars. Recent scientific activity has produced hardware and software components that let users at home drive their virtual persona through natural body and facial motion. Built on 3D computer graphics methods and vision-based motion tracking algorithms, these techniques aim to reinforce the sense of autonomy and telepresence within the virtual world. In this paper we present two distinct frameworks for animating avatars from natural user motion input: full-body avatar control using a Kinect sensor via a simple, networked skeletal joint retargeting pipeline, and an intuitive facial animation 3D reconstruction pipeline for rendering highly realistic user facial puppets. We further present a common networked architecture that enables multiple remote clients to capture and render any number of 3D animated characters within a shared virtual environment.
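The skeletal joint retargeting mentioned above can be sketched, in highly simplified form, as a per-joint name mapping from the sensor's tracked skeleton onto the avatar's bones. This is an illustrative sketch only, not the authors' implementation: the `Joint` class, the `retarget` function, and bone names such as `"upperarm_l"` are assumptions introduced here for the example.

```python
import math
from dataclasses import dataclass

@dataclass
class Joint:
    name: str
    rotation: float  # local rotation in radians (single axis, for simplicity)

def retarget(source_joints, name_map):
    """Copy captured per-joint rotations onto avatar bones via a name mapping.

    Avatar bones without a captured counterpart fall back to a neutral
    rotation (0.0), so the character stays in a valid rest pose when the
    tracker drops a joint.
    """
    captured = {j.name: j.rotation for j in source_joints}
    return [Joint(avatar_bone, captured.get(sensor_joint, 0.0))
            for sensor_joint, avatar_bone in name_map.items()]

# Example: one tracked Kinect-style joint mapped to a hypothetical avatar bone.
tracked = [Joint("ShoulderLeft", math.pi / 2)]
mapping = {"ShoulderLeft": "upperarm_l", "ElbowLeft": "lowerarm_l"}
avatar_pose = retarget(tracked, mapping)
```

A real pipeline would carry full 3D joint orientations (e.g. quaternions) and stream them over the network each frame; the mapping-with-fallback structure, however, is the essence of name-based retargeting.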
Notes
- 1. PrimeSense, a founding member of OpenNI, shut down the original OpenNI project, on which our modules depend, when it was acquired by Apple on November 24, 2013. The module remains operational through the latest legacy version of the library (1.5.4.0, released May 7, 2012).
- 2. Blender, free and open-source 3D computer graphics software: http://www.blender.org/.
Acknowledgement
The research leading to this work has received funding from the European Community’s Horizon 2020 Framework Programme under grant agreement no. 644204 (ProsocialLearn project).
© 2015 Springer International Publishing Switzerland
Cite this paper
Apostolakis, K.C., Daras, P. (2015). Natural User Interfaces for Virtual Character Full Body and Facial Animation in Immersive Virtual Worlds. In: De Paolis, L., Mongelli, A. (eds) Augmented and Virtual Reality. AVR 2015. Lecture Notes in Computer Science(), vol 9254. Springer, Cham. https://doi.org/10.1007/978-3-319-22888-4_27
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-22887-7
Online ISBN: 978-3-319-22888-4