User-Centric Context-Aware Mobile Applications for Embodied Music Listening
This paper surveys a collection of sample applications for networked, user-centric, context-aware embodied music listening. The applications were designed and developed in the framework of the EU-ICT Project SAME (www.sameproject.eu) and were presented at the Agora Festival (IRCAM, Paris, France) in June 2009. All of them address, in different ways, the concept of embodied, active listening to music, i.e., enabling listeners to interactively operate in real time on the music content by means of their movements and gestures as captured by mobile devices. On the occasion of the Agora Festival, the applications were also evaluated by both expert and non-expert users.
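The internals of the Agora applications are not described here; as a purely illustrative sketch of the general idea of active listening (device motion mapped in real time to a playback parameter), one could estimate motion energy from a mobile device's accelerometer and use it to drive, say, playback volume. All function names and the specific mapping below are hypothetical, not taken from the SAME project:

```python
import math

def motion_energy(samples):
    """Crude motion-energy estimate from 3-axis accelerometer readings.

    `samples` is a list of (x, y, z) tuples in units of g; 1 g is
    subtracted from each magnitude as a rough gravity compensation,
    so a device at rest yields an energy near zero.
    """
    if not samples:
        return 0.0
    total = 0.0
    for x, y, z in samples:
        total += abs(math.sqrt(x * x + y * y + z * z) - 1.0)
    return total / len(samples)

def energy_to_volume(energy, floor=0.2, ceiling=1.0, sensitivity=2.0):
    """Map motion energy to a playback volume in [floor, ceiling].

    More vigorous movement raises the volume; a still device stays
    at the floor level. `sensitivity` scales how quickly the volume
    saturates at the ceiling.
    """
    return floor + (ceiling - floor) * min(1.0, energy * sensitivity)

# A still device (only gravity on the z axis) keeps the volume at the floor;
# shaking it drives the volume up toward the ceiling.
still = [(0.0, 0.0, 1.0)] * 10
shaken = [(0.5, 0.3, 1.2), (-0.6, 0.4, 0.8), (0.7, -0.5, 1.1)]
print(energy_to_volume(motion_energy(still)))
print(energy_to_volume(motion_energy(shaken)))
```

In a real mobile application, the accelerometer samples would arrive as a sensor stream and the resulting parameter would be sent over the network to the audio engine; this sketch only shows the gesture-to-sound mapping step.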