Journal on Multimodal User Interfaces, Volume 4, Issue 1, pp 27–35

Human movement expressivity for mobile active music listening

  • Maurizio Mancini
  • Giovanna Varni
  • Jari Kleimola
  • Gualtiero Volpe
  • Antonio Camurri
Original Paper

DOI: 10.1007/s12193-010-0047-z

Cite this article as:
Mancini, M., Varni, G., Kleimola, J. et al. J Multimodal User Interfaces (2010) 4: 27. doi:10.1007/s12193-010-0047-z

Abstract

In this paper we describe the SAME networked platform for context-aware, experience-centric mobile music applications, and we present an implementation of the SAME active music listening paradigm: the Mobile Conductor. It allows the user to express herself in conducting a virtual ensemble playing a MIDI piece of music by means of her mobile phone. The mobile phone detects the user’s hand movement and molds the music performance style by modulating its speed, volume, and intonation.
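The mapping described above — hand-movement activity steering the speed and volume of a MIDI performance — can be sketched in a few lines. The feature (mean accelerometer magnitude as a proxy for movement energy), the baseline values, and the linear mapping with clamping are illustrative assumptions, not the paper's actual implementation:

```python
import math

def gesture_energy(accel_samples):
    """Mean magnitude of 3-axis accelerometer samples (x, y, z) in m/s^2.
    A simple proxy for how vigorously the user is moving the phone."""
    return sum(math.sqrt(x * x + y * y + z * z)
               for x, y, z in accel_samples) / len(accel_samples)

def modulate_performance(energy, base_tempo_bpm=120, base_velocity=80,
                         rest_energy=9.81, sensitivity=0.05):
    """Map movement energy to a tempo (BPM) and a MIDI note velocity.

    rest_energy approximates the gravity-only magnitude of a phone at rest;
    deviations from it drive the modulation. The factor is clamped to a
    musically plausible range so extreme gestures cannot stall or race
    the performance."""
    factor = 1.0 + sensitivity * (energy - rest_energy)
    factor = max(0.5, min(2.0, factor))
    tempo = base_tempo_bpm * factor
    velocity = max(1, min(127, round(base_velocity * factor)))
    return tempo, velocity
```

For example, a phone held still yields the baseline performance, while energetic conducting gestures raise both tempo and velocity; the actual system would feed such parameters to a MIDI synthesizer in real time.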

Keywords

Human-computer interaction · Active music listening · Movement expressivity

Copyright information

© OpenInterface Association 2010

Authors and Affiliations

  • Maurizio Mancini (1)
  • Giovanna Varni (1)
  • Jari Kleimola (2)
  • Gualtiero Volpe (1)
  • Antonio Camurri (1)

  1. InfoMus Lab, Laboratorio di Informatica Musicale, DIST, University of Genova, Genova, Italy
  2. Department of Signal Processing and Acoustics, Aalto University School of Science and Technology, Aalto, Finland