Natural User Interfaces for Virtual Character Full Body and Facial Animation in Immersive Virtual Worlds

  • Konstantinos Cornelis Apostolakis
  • Petros Daras
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9254)


In recent years, networked virtual environments have steadily grown to become a frontier in social computing. Such virtual cyberspaces are usually accessed by multiple users through their 3D avatars. Recent scientific activity has resulted in the release of both hardware and software components that enable users at home to interact with their virtual persona through natural body and facial activity performance. Based on 3D computer graphics methods and vision-based motion tracking algorithms, these techniques aspire to reinforce the sense of autonomy and telepresence within the virtual world. In this paper we present two distinct frameworks for avatar animation through user natural motion input. We specifically target the full body avatar control case using a Kinect sensor via a simple, networked skeletal joint retargeting pipeline, as well as an intuitive user facial animation 3D reconstruction pipeline for rendering highly realistic user facial puppets. Furthermore, we present a common networked architecture to enable multiple remote clients to capture and render any number of 3D animated characters within a shared virtual environment.
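The full-body control case described above maps skeletal joints captured by the Kinect sensor onto an avatar rig. As a rough illustration only (not the authors' implementation), a minimal retargeting step can be sketched as computing per-bone direction vectors from the captured joint positions and rebuilding the avatar pose with the avatar's own bone lengths, so that a user and an avatar of different proportions share the same posture. The joint names, hierarchy, and helper functions below are hypothetical:

```python
import math

# Hypothetical subset of a Kinect-style skeleton hierarchy: child -> parent,
# listed parent-first so each parent is resolved before its children.
HIERARCHY = {
    "spine": "hips",
    "neck": "spine",
    "head": "neck",
}

def bone_directions(joints):
    """Unit direction of each bone, from parent joint to child joint."""
    dirs = {}
    for child, parent in HIERARCHY.items():
        cx, cy, cz = joints[child]
        px, py, pz = joints[parent]
        v = (cx - px, cy - py, cz - pz)
        n = math.sqrt(sum(c * c for c in v)) or 1.0
        dirs[child] = tuple(c / n for c in v)
    return dirs

def retarget(captured_joints, avatar_bone_lengths, avatar_root):
    """Rebuild avatar joint positions: captured bone directions scaled by
    the avatar's bone lengths, starting from the avatar's root position."""
    dirs = bone_directions(captured_joints)
    pose = {"hips": avatar_root}
    for child, parent in HIERARCHY.items():
        px, py, pz = pose[parent]
        dx, dy, dz = dirs[child]
        length = avatar_bone_lengths[child]
        pose[child] = (px + dx * length, py + dy * length, pz + dz * length)
    return pose
```

In a networked setting such as the one the paper targets, only the captured joint data would travel over the wire; each remote client could then apply a step like this against its locally loaded avatar skeletons.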


Keywords: Virtual character animation · Markerless performance capture · Face animation · Kinect-based interfaces



The research leading to this work has received funding from the European Community’s Horizon 2020 Framework Programme under grant agreement no. 644204 (ProsocialLearn project).



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Konstantinos Cornelis Apostolakis (1)
  • Petros Daras (1)
  1. Information Technologies Institute, Centre for Research and Technology Hellas, Thessaloniki, Greece
