Methodologies for the User Evaluation of the Motion of Virtual Humans

  • Sander E. M. Jansen
  • Herwin van Welbergen
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5773)


Virtual humans are employed in many interactive applications, including (serious) games. Their motion should be natural and should allow real-time interaction with their surroundings and with other (virtual) humans. Physical controllers offer physical realism and (physical) interaction with the environment. Because they typically act on a selected set of joints, their naturalness is hard to evaluate in isolation. We propose to augment the motion steered by such a controller with motion capture, using a mixed-paradigm animation that creates coherent full-body motion. A user evaluation of the resulting motion assesses the naturalness of the controller. Methods from Signal Detection Theory provide evaluation metrics that can be compared across different test setups, observers, and motions. We demonstrate our approach by evaluating the naturalness of a balance controller, and we compare different test paradigms, assessing their efficiency and sensitivity.
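As an illustration of the kind of Signal Detection Theory metric the abstract refers to, the sensitivity index d′ separates an observer's ability to detect edited motion from their response bias. The sketch below is a generic d′ computation, not the paper's implementation; the hit and false-alarm rates are made-up numbers for illustration.

```python
# Sketch: sensitivity index d' from Signal Detection Theory.
# d' = z(hit rate) - z(false-alarm rate), where z is the inverse of the
# standard normal CDF. Higher d' means observers distinguish
# controller-driven motion from pure motion capture more reliably.
from statistics import NormalDist

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Compute d' from a hit rate and a false-alarm rate (both in (0, 1))."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

# Illustrative numbers (not from the paper): observers call edited clips
# "unnatural" 80% of the time, but also call pure motion capture
# "unnatural" 20% of the time.
print(round(d_prime(0.80, 0.20), 3))  # ≈ 1.683
```

Because d′ is expressed in standard-deviation units of the assumed internal noise distribution, values obtained with different test setups, observers, and motions can be compared on a common scale, which is what makes it useful for the comparisons the abstract describes.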


Keywords: Evaluation of Virtual Agents, Naturalness of Animation





Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Sander E. M. Jansen¹ ²
  • Herwin van Welbergen³
  1. Department of Computer Science, Utrecht University, The Netherlands
  2. TNO Human Factors, The Netherlands
  3. Human Media Interaction, University of Twente, Enschede, The Netherlands
