
Interactive sonification of synchronisation of motoric behaviour in social active listening to music with mobile devices

  • Original Paper
  • Published in: Journal on Multimodal User Interfaces

Abstract

This paper evaluates three interactive sonifications of dyadic coordinated human rhythmic activity. An index of phase synchronisation of gestures was chosen as the coordination metric. The sonifications are implemented as three prototype applications for mobile devices: Sync’n’Moog, Sync’n’Move, and Sync’n’Mood. Sync’n’Moog sonifies the phase synchronisation index by acting directly on the audio signal with a nonlinear time-varying filtering technique. Sync’n’Move intervenes on the multi-track music content by making individual instruments emerge or fade out. Sync’n’Mood manipulates the affective features of the music performance. The three sonifications were also tested against a condition without sonification.
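The coordination metric named above, an index of phase synchronisation between two participants' gestures, can be illustrated with a minimal sketch: a phase-locking value computed from the instantaneous phases of two movement signals. The function name `phase_sync_index` and the synthetic phase series are illustrative, not taken from the paper; extracting the instantaneous phases from the raw gesture signals (e.g. via a Hilbert transform) is assumed and not shown here.

```python
import cmath
import math

def phase_sync_index(phases_a, phases_b):
    """Phase-locking value between two instantaneous-phase series.

    Returns a value in [0, 1]: 1.0 when the two gestures keep a fixed
    phase relation, near 0 when they move independently. The phase
    series are assumed to come from the participants' gesture signals
    (extraction step not shown).
    """
    # Average the unit phasors of the phase differences; the magnitude
    # of that average is the synchronisation index.
    diffs = [cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b)]
    return abs(sum(diffs) / len(diffs))

# Two gestures at the same tempo with a constant phase lag: fully locked.
t = [k * 0.01 for k in range(500)]
a = [2 * math.pi * 2.0 * ti for ti in t]
b = [2 * math.pi * 2.0 * ti + 0.8 for ti in t]
print(round(phase_sync_index(a, b), 3))  # prints 1.0
```

A value near 1 means the two participants' gestures maintain a stable phase relation; values near 0 indicate uncoordinated movement, which is what each prototype maps onto its respective audio manipulation.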



Author information

Corresponding author

Correspondence to Gualtiero Volpe.


About this article

Cite this article

Varni, G., Dubus, G., Oksanen, S. et al. Interactive sonification of synchronisation of motoric behaviour in social active listening to music with mobile devices. J Multimodal User Interfaces 5, 157–173 (2012). https://doi.org/10.1007/s12193-011-0079-z
