
Hybrid inverse motion control for virtual characters interacting with sound synthesis

Application to percussion motion

Original Article, published in The Visual Computer

Abstract

The ever-growing use of virtual environments calls for increasingly engaging elements to enhance user experiences. For sounding virtual environments in particular, one promising way to meet these realism and interactivity requirements is the use of virtual characters interacting with sounding objects. In this paper, we focus on virtual characters playing virtual music instruments as a case study. More specifically, we address the real-time motion control of virtual characters and their interaction with a sounding environment, with the aim of producing engaging and compelling virtual music performances. Combining physics-based simulation with motion data is a recent approach that finely represents and modulates this motion-sound interaction while preserving the realism and expressivity of the original captured motion. We propose a physically-enabled environment in which a virtual percussionist interacts with a physics-based sound synthesis algorithm. We introduce and extensively evaluate the Hybrid Inverse Motion Control (HIMC) scheme, a motion-driven hybrid control approach dedicated to the synthesis of upper-body percussion movements. We also propose a physics-based sound synthesis model with which the virtual character can interact. Finally, we present an architecture offering an effective way to manage the heterogeneous data (motion and sound parameters) and feedback (visual and sound) that shape the resulting virtual percussion performances.
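To make the pipeline described above concrete, the following is a minimal, illustrative sketch of the two coupled stages: a PD (proportional-derivative) tracking controller that converts kinematic pose targets (such as those produced by inverse kinematics over captured percussion motion) into joint torques for a physics simulation, and a damped-modal sound model excited by the impact velocity at contact. All function names, gains, and modal parameters here are assumptions chosen for illustration; they are not the paper's actual implementation.

```python
import math

def pd_torque(q_target, q, qdot, kp=120.0, kd=12.0):
    """PD tracking: torque driving joint angle q toward the kinematic target."""
    return kp * (q_target - q) - kd * qdot

def modal_impact(velocity, modes, duration=0.05, sr=44100):
    """Sum of exponentially damped sinusoids, scaled by the impact velocity.

    Each mode is an illustrative (frequency_hz, damping, gain) triple.
    """
    n = int(duration * sr)
    out = [0.0] * n
    for freq, damping, gain in modes:
        for i in range(n):
            t = i / sr
            out[i] += velocity * gain * math.exp(-damping * t) \
                      * math.sin(2.0 * math.pi * freq * t)
    return out

# Toy dynamics for a single "wrist" joint (unit inertia, no gravity),
# stepped with semi-implicit Euler toward a motion-capture pose target.
q, qdot, dt = 0.0, 0.0, 1.0 / 1000.0
for step in range(400):
    tau = pd_torque(q_target=0.8, q=q, qdot=qdot)
    qdot += tau * dt
    q += qdot * dt

# On stick-membrane contact, the stick's velocity excites the drum modes
# (modal values are placeholders, not measured timpani data).
samples = modal_impact(velocity=2.0,
                       modes=[(180.0, 30.0, 0.4), (420.0, 60.0, 0.2)])
```

The design point this sketch illustrates is the one the abstract emphasizes: the captured motion only provides targets, while the torques and the resulting contact velocities come from the simulation, so the same recorded gesture can yield physically varied strikes and correspondingly varied sound.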



Author information

Correspondence to A. Bouënard.

Cite this article

Bouënard, A., Gibet, S. & Wanderley, M.M. Hybrid inverse motion control for virtual characters interacting with sound synthesis. Vis Comput 28, 357–370 (2012). https://doi.org/10.1007/s00371-011-0620-9