
The Visual Computer, Volume 22, Issue 3, pp 194–209

Robust on-line adaptive footplant detection and enforcement for locomotion

  • Pascal Glardon
  • Ronan Boulic
  • Daniel Thalmann
Special issue paper

Abstract

A common problem in virtual character animation is the preservation of the basic foot-floor constraint (or footplant), which must first be detected and then enforced. This paper describes a system that generates motion while continuously preserving footplants in a real-time, dynamically evolving context. The system introduces a constraint detection method that improves on classical techniques by adaptively selecting threshold values according to motion type and quality. The detected footplants are then enforced with a numerical inverse kinematics solver. Unlike previous approaches, we define the footplant by attaching two effectors to it, whose positions at the beginning of the constraint can be modified, for example to place the foot on the ground. However, the corrected posture at the start of the constraint must be available before the constraint begins, to ensure a smooth transition between the unconstrained and constrained states. We therefore present a new approach based on motion anticipation, which computes animation postures in advance according to time-evolving motion parameters such as locomotion speed and type. We illustrate our on-line approach with continuously modified locomotion patterns and demonstrate its ability to correct motion artifacts such as foot sliding, to change the constraint position, and to turn a straight walk into a curved one.
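To make the detection stage concrete, the following minimal Python sketch flags planted frames by thresholding foot height and foot speed, with the velocity threshold scaled by locomotion speed as a stand-in for the adaptive threshold selection described above. The function name detect_footplants, the default threshold values, and the scaling rule are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def detect_footplants(foot_pos, dt, speed,
                      base_vel_thresh=0.15, base_height_thresh=0.03):
    """Flag frames where a foot is considered planted.

    foot_pos : (N, 3) array of foot joint positions in metres, one row per frame
    dt       : time step between frames, in seconds
    speed    : current locomotion speed (m/s), used to adapt the velocity threshold
    Returns a boolean array of length N (True = footplant).

    The speed-based scaling below is a simple illustrative rule; the paper
    adapts thresholds according to motion type and quality.
    """
    # Foot speed per frame from finite differences of position.
    vel = np.linalg.norm(np.diff(foot_pos, axis=0), axis=1) / dt
    vel = np.append(vel, vel[-1])  # pad to N frames

    # Adapt the velocity threshold: faster motions tolerate larger foot velocities.
    vel_thresh = base_vel_thresh * max(1.0, speed)

    # A frame is a footplant candidate if the foot is both slow and near the ground.
    planted = (vel < vel_thresh) & (foot_pos[:, 1] < base_height_thresh)
    return planted
```

In an on-line setting the same test would be evaluated per frame over a short sliding window, typically with some hysteresis, so that the constraint does not toggle on and off when the measurements hover around the thresholds.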

Keywords

Motion anticipation · Animation with constraints · Human body simulation

Copyright information

© Springer-Verlag 2006

Authors and Affiliations

  1. Virtual Reality Lab, Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
