User-Centric Context-Aware Mobile Applications for Embodied Music Listening

  • Antonio Camurri
  • Gualtiero Volpe
  • Hugues Vinet
  • Roberto Bresin
  • Marco Fabiani
  • Gaël Dubus
  • Esteban Maestre
  • Jordi Llop
  • Jari Kleimola
  • Sami Oksanen
  • Vesa Välimäki
  • Jarno Seppänen
Part of the Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering book series (LNICST, volume 40)

Abstract

This paper surveys a collection of sample applications for networked user-centric context-aware embodied music listening. The applications were designed and developed in the framework of the EU-ICT Project SAME (www.sameproject.eu) and were presented at the Agora Festival (IRCAM, Paris, France) in June 2009. All of them address, in different ways, the concept of embodied, active listening to music, i.e., enabling listeners to operate interactively and in real time on the music content by means of their movements and gestures as captured by mobile devices. On the occasion of the Agora Festival, the applications were also evaluated by both expert and non-expert users.


Copyright information

© ICST Institute for Computer Science, Social Informatics and Telecommunications Engineering 2010

Authors and Affiliations

  • Antonio Camurri (1)
  • Gualtiero Volpe (1)
  • Hugues Vinet (2)
  • Roberto Bresin (3)
  • Marco Fabiani (3)
  • Gaël Dubus (3)
  • Esteban Maestre (4)
  • Jordi Llop (4)
  • Jari Kleimola (5)
  • Sami Oksanen (5)
  • Vesa Välimäki (5)
  • Jarno Seppänen (6)
  1. Casa Paganini - InfoMus Lab, DIST - University of Genova, Genova, Italy
  2. IRCAM, Paris, France
  3. KTH School of Computer Science and Communication, Stockholm, Sweden
  4. Music Technology Group, UPF - Universitat Pompeu Fabra, Barcelona, Spain
  5. Department of Signal Processing and Acoustics, TKK, Espoo, Finland
  6. Nokia Research Center, Helsinki, Finland
