
Interactive sonification of synchronisation of motoric behaviour in social active listening to music with mobile devices

  • Giovanna Varni
  • Gaël Dubus
  • Sami Oksanen
  • Gualtiero Volpe
  • Marco Fabiani
  • Roberto Bresin
  • Jari Kleimola
  • Vesa Välimäki
  • Antonio Camurri

Abstract

This paper evaluates three interactive sonifications of dyadic coordinated human rhythmic activity. An index of phase synchronisation of gestures was chosen as the coordination metric. The sonifications are implemented as three prototype applications exploiting mobile devices: Sync’n’Moog, Sync’n’Move, and Sync’n’Mood. Sync’n’Moog sonifies the phase synchronisation index by acting directly on the audio signal with a nonlinear time-varying filtering technique. Sync’n’Move operates on the multi-track music content, making individual instruments emerge or fade out. Sync’n’Mood manipulates the affective features of the music performance. The three sonifications were also tested against a condition without sonification.
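
To make the coordination metric concrete, the sketch below computes a phase synchronisation index as the mean phase coherence of Hilbert instantaneous phases, and then maps it to a low-pass cutoff frequency in the spirit of Sync’n’Moog. This is a minimal illustration, not the authors’ implementation: the abstract does not specify the exact estimator, and the sync_to_cutoff mapping and its frequency range are assumptions introduced for the example.

    # Minimal sketch (Python); not the method from the paper.
    # Assumes the synchronisation index is the mean phase coherence
    # of Hilbert instantaneous phases, a common estimator in [0, 1].
    import numpy as np
    from scipy.signal import hilbert

    def phase_sync_index(x, y):
        """Mean phase coherence of two 1-D gesture signals.

        Returns 1.0 for perfectly phase-locked motion and values
        near 0.0 when the two phases drift independently.
        """
        # Instantaneous phase of each zero-mean signal.
        phi_x = np.angle(hilbert(x - np.mean(x)))
        phi_y = np.angle(hilbert(y - np.mean(y)))
        # Project the phase difference onto the unit circle and
        # measure how tightly the differences cluster.
        return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

    def sync_to_cutoff(s, f_min=200.0, f_max=8000.0):
        """Hypothetical Sync'n'Moog-style mapping: higher
        synchronisation opens a low-pass filter. The frequency
        range is an assumption; logarithmic interpolation keeps
        the sweep perceptually even.
        """
        return f_min * (f_max / f_min) ** s

    # Example: two noisy oscillations with a constant phase lag are
    # strongly synchronised; independent noise is not.
    t = np.linspace(0.0, 10.0, 1000)
    a = np.sin(2 * np.pi * 1.5 * t) + 0.1 * np.random.randn(t.size)
    b = np.sin(2 * np.pi * 1.5 * t + 0.8) + 0.1 * np.random.randn(t.size)
    print(sync_to_cutoff(phase_sync_index(a, b)))                        # near f_max
    print(sync_to_cutoff(phase_sync_index(a, np.random.randn(t.size))))  # near f_min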

Keywords

Interactive sonification · Interactive systems · Audio systems · Sound and music computing · Active music listening · Synchronisation

Copyright information

© OpenInterface Association 2011

Authors and Affiliations

  • Giovanna Varni (1)
  • Gaël Dubus (3)
  • Sami Oksanen (2)
  • Gualtiero Volpe (1)
  • Marco Fabiani (3)
  • Roberto Bresin (3)
  • Jari Kleimola (2)
  • Vesa Välimäki (2)
  • Antonio Camurri (1)

  1. InfoMus, DIST, University of Genova, Genova, Italy
  2. Department of Signal Processing and Acoustics, School of Electrical Engineering, Aalto University, Aalto, Finland
  3. Department of Speech, Music and Hearing, School of Computer Science and Communication, KTH Royal Institute of Technology, Stockholm, Sweden
