Sync’n’Move: Social Interaction Based on Music and Gesture

  • Giovanna Varni
  • Maurizio Mancini
  • Gualtiero Volpe
  • Antonio Camurri
Conference paper

DOI: 10.1007/978-3-642-12630-7_4

Part of the Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering book series (LNICST, volume 40)
Cite this paper as:
Varni G., Mancini M., Volpe G., Camurri A. (2010) Sync’n’Move: Social Interaction Based on Music and Gesture. In: Daras P., Ibarra O.M. (eds) User Centric Media. UCMEDIA 2009. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 40. Springer, Berlin, Heidelberg

Abstract

In future User Centric Media, the importance of the social dimension is likely to increase. As social networks and Internet games show, the social dimension plays a key role in users' active participation in the overall media chain. This paper presents a first sample application for social active listening to music: Sync’n’Move enables two users to explore a multi-channel pre-recorded music piece as the result of their social interaction. The application was developed in the framework of the EU-ICT Project SAME (www.sameproject.eu) and was presented for the first time at the Agora Festival (IRCAM, Paris, June 2009). On that occasion, Sync’n’Move was also evaluated by both expert and non-expert users.
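The abstract does not describe how the two users' movement synchrony is measured or how it drives the rendering of the multi-channel recording. The Python sketch below only illustrates the general idea under assumptions of the editor's own (accelerometer-magnitude signals, an absolute-correlation synchrony index, and a simple per-channel gain mapping); it should not be read as the SAME project's or Sync’n’Move's actual method.

```python
# Hypothetical sketch: illustrates mapping two users' movement synchrony
# onto the mix of a multi-channel recording. The signals, the correlation
# based synchrony index, and the gain mapping are assumptions, not the
# algorithm used in Sync'n'Move.
import numpy as np

def synchrony(acc_a: np.ndarray, acc_b: np.ndarray) -> float:
    """Crude synchrony index in [0, 1]: absolute Pearson correlation of the
    two users' acceleration-magnitude signals over the latest window."""
    a = acc_a - acc_a.mean()
    b = acc_b - acc_b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:
        return 0.0
    return float(abs(np.dot(a, b) / denom))

def mix_gains(sync: float, n_channels: int) -> np.ndarray:
    """Map synchrony to per-channel gains: the first channel stays audible,
    the remaining channels fade in as synchrony grows."""
    gains = np.full(n_channels, sync)
    gains[0] = 1.0
    return gains

# Example: two 3-second windows of 50 Hz accelerometer-magnitude data,
# where user B partially follows user A.
rng = np.random.default_rng(0)
acc_a = rng.standard_normal(150)
acc_b = 0.8 * acc_a + 0.2 * rng.standard_normal(150)
print(mix_gains(synchrony(acc_a, acc_b), n_channels=4))
```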

Keywords

Active music listening · Social interaction · Synchronization


Copyright information

© ICST Institute for Computer Science, Social Informatics and Telecommunications Engineering 2010

Authors and Affiliations

  • Giovanna Varni (1)
  • Maurizio Mancini (1)
  • Gualtiero Volpe (1)
  • Antonio Camurri (1)

  1. InfoMus Lab, DIST - Università degli Studi di Genova, Italy
