Abstract
Virtual actors moving through interactive game-space environments create rich streams of data that can drive real-time musical sonification. In the musical performance work ECHO::Canyon, three control paradigms (avian flight, biologically inspired kinesthetic motion, and manual control of avatar skeletal-mesh components through inverse kinematics) are used to control real-time synthesis-based instruments within a multi-channel sound engine. This paper discusses gestural and control methodologies, as well as the specific mapping schemata used to link virtual actors to musical characteristics.
All environment and character modeling, custom animations and art direction for ECHO::Canyon were created by artist Chris Platz.
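As a concrete illustration of the kind of mapping schema the abstract describes, the sketch below maps a stream of avatar motion data to synthesis parameters. The function name, scaling constants, and the particular speed-to-amplitude and altitude-to-pitch assignments are hypothetical illustrations, not taken from the paper:

```python
def map_motion_to_synth(speed, altitude, max_speed=1200.0, max_altitude=5000.0):
    """Map a virtual actor's speed and altitude to synthesis parameters.

    Hypothetical mappings for illustration only: speed drives amplitude,
    altitude drives pitch; the ranges and curves are invented constants.
    """
    # Clamp incoming game-space values and normalize them to [0, 1].
    s = min(max(speed, 0.0), max_speed) / max_speed
    a = min(max(altitude, 0.0), max_altitude) / max_altitude

    amplitude = s ** 2               # quadratic taper keeps slow motion quiet
    freq = 110.0 * 2 ** (a * 4)      # up to four octaves above A2 at full altitude
    return {"freq": freq, "amp": amplitude}

# Example: a gliding actor at half speed and mid altitude.
params = map_motion_to_synth(speed=600.0, altitude=2500.0)
```

In a performance system of this kind, the resulting parameter dictionary would typically be serialized into OSC messages and sent to a synthesis engine each frame; the mapping function itself stays a pure, easily testable transformation.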
Copyright information
© 2014 Springer International Publishing Switzerland
Cite this paper
Hamilton, R. (2014). Musical Sonification of Avatar Physiologies, Virtual Flight and Gesture. In: Aramaki, M., Derrien, O., Kronland-Martinet, R., Ystad, S. (eds.) Sound, Music, and Motion. CMMR 2013. Lecture Notes in Computer Science, vol. 8905. Springer, Cham. https://doi.org/10.1007/978-3-319-12976-1_31
Print ISBN: 978-3-319-12975-4
Online ISBN: 978-3-319-12976-1