Informed Use of Motion Synthesis Methods

  • Herwin van Welbergen
  • Zsófia Ruttkay
  • Balázs Varga
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5277)


In virtual human (VH) applications, and in games in particular, motions with different functions must be synthesized: communicative and manipulative hand gestures, locomotion, and the expression of emotions or of the character's identity. In bodily behavior, primary motions define the function, while more subtle secondary motions contribute to realism and variability. From a technological point of view, several methods are at our disposal for motion synthesis: motion capture and retargeting, procedural kinematic animation, force-driven dynamic simulation, and the application of Perlin noise. Which method should be used to generate primary and secondary motions, and how can the information needed to define them be gathered? In this paper we elaborate on informed usage, in both senses of the term. First, drawing on our own ongoing work, we discuss how motion capture data can be used to identify the joints involved in primary and secondary motions, and to provide a basis for specifying the essential parameters of the synthesis methods used for each. We then explore the possibility of applying different methods to primary and secondary motion in parallel, in such a way that one method informs the other. We introduce our mixed use of kinematic and dynamic control of different body parts to animate a character in real time. Finally, we discuss the motion Turing test as a methodology for evaluating mixed motion paradigms.
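To make the idea of kinematics informing dynamics concrete, the sketch below shows a minimal, hypothetical version of the principle: a single dynamically simulated joint is driven by torques from a PD (proportional-derivative) controller toward a kinematically specified target pose. This is an illustration of the general technique only, not the authors' implementation; the gains `kp`/`kd`, the inertia, and the time step are made-up example values.

```python
# Illustrative sketch (not the paper's actual system): a kinematic target
# pose "informs" a dynamic simulation via a PD torque controller.
import math


def simulate_pd_tracking(theta_ref, theta0=0.0, kp=50.0, kd=10.0,
                         inertia=1.0, dt=0.001, steps=5000):
    """Drive a 1-DOF joint from theta0 toward theta_ref with PD torques.

    All parameter values are arbitrary example numbers.
    """
    theta, omega = theta0, 0.0
    for _ in range(steps):
        # PD control law: spring toward the kinematic target, damp velocity.
        torque = kp * (theta_ref - theta) - kd * omega
        alpha = torque / inertia        # angular acceleration
        omega += alpha * dt             # semi-implicit Euler integration
        theta += omega * dt
    return theta


final = simulate_pd_tracking(theta_ref=math.pi / 4)
```

With these gains the damping ratio is about 0.7, so the joint settles close to the target without large oscillation; in a full character, only some body parts would be simulated this way while others are animated purely kinematically.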


Keywords: Motion Capture · Primary Motion · Computer Animation · Virtual Human · Secondary Motion





Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Herwin van Welbergen (1)
  • Zsófia Ruttkay (1)
  • Balázs Varga (2)
  1. HMI, Dept. of Computer Science, University of Twente, Enschede, The Netherlands
  2. Pázmány Péter Catholic University, Budapest, Hungary
