Musical Sonification of Avatar Physiologies, Virtual Flight and Gesture

  • Robert Hamilton
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8905)

Abstract

Virtual actors moving through interactive game-space environments create rich streams of data that serve as drivers for real-time musical sonification. The paradigms of avian flight, biologically inspired kinesthetic motion, and the manual control of avatar skeletal-mesh components through inverse kinematics are used in the musical performance work ECHO::Canyon to control real-time synthesis-based instruments within a multi-channel sound engine. This paper discusses gestural and control methodologies as well as specific mapping schemata used to link virtual actors with musical characteristics.
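To make the data-to-sound pipeline concrete, the following minimal sketch (in Python, using the python-osc package) shows how per-frame avatar flight data could be scaled and forwarded as OSC messages to a synthesis engine listening on a local port. The OSC address, parameter ranges, and mapping functions here are illustrative assumptions for one possible mapping schema, not the mappings defined in the paper.

    # Illustrative sketch: stream avatar flight data to a sound engine over OSC.
    # Assumes a synthesis server (e.g. SuperCollider) listening on port 57120;
    # the OSC address and mapping ranges below are hypothetical.
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 57120)

    def scale(value, in_lo, in_hi, out_lo, out_hi):
        """Linearly map a game-space value into a synthesis parameter range."""
        value = max(in_lo, min(in_hi, value))
        return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

    def sonify_flight(altitude, airspeed, wing_angle):
        """One possible mapping: altitude -> pitch, airspeed -> amplitude,
        wing angle -> stereo/spatial position."""
        freq = scale(altitude, 0.0, 5000.0, 80.0, 1200.0)   # Hz
        amp = scale(airspeed, 0.0, 120.0, 0.0, 0.8)         # linear gain
        pan = scale(wing_angle, -45.0, 45.0, -1.0, 1.0)     # left .. right
        client.send_message("/avatar/flight", [freq, amp, pan])

    # Example: called once per game tick with the avatar's current state.
    sonify_flight(altitude=1250.0, airspeed=64.0, wing_angle=12.5)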

Keywords

Musical sonification · Procedural music · Video games · Virtual gesture

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  1. Center for Computer Research in Music and Acoustics, Stanford University, Stanford, USA
