
Interactive sonification strategies for the motion and emotion of dance performances

  • Original Paper
  • Journal on Multimodal User Interfaces

Abstract

Sonification has the potential to communicate a variety of data types to listeners, including not just cognitive information but also emotions and aesthetics. The goal of our dancer sonification project is to “sonify emotions as well as motions” of a dance performance via musical sonification. To this end, we developed and evaluated sonification strategies for adding a layer of emotional mappings to data sonification. Experiment 1 developed and evaluated four musical sonification strategies (i.e., sin-ification, MIDI-fication, melody module, and melody and arrangement module) to assess their emotional effects. Videos were recorded of a professional dancer interacting with each of the four sonification strategies. Forty-eight participants rated musicality, emotional expressivity, and sound-motion/emotion compatibility via an online survey. Results suggest that increasing the musical mappings led to higher ratings on each dimension for dance-type gestures. Experiment 2 used the musical sonification framework to develop four sonification scenarios, each aiming to communicate a target emotion (happy, sad, angry, or tender). Thirty participants compared four interactive sonification scenarios with four pre-composed dance choreographies featuring the same musical and gestural palettes. Both forced-choice and multi-dimensional emotional evaluations were collected, as well as motion/emotion compatibility ratings. Results show that presenting both music and dance led to higher accuracy scores for most target emotions than music-only or dance-only conditions. These findings can contribute to the fields of movement sonification, algorithmic music composition, and affective computing in general by describing strategies for conveying emotion through sound.
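To make the lower levels of the framework concrete, the sketch below contrasts a sin-ification-style continuous mapping (a tracked position drives a sine-oscillator frequency) with a MIDI-fication-style quantization of the same position to discrete pitches. It is an illustrative sketch only, not the authors' Pure Data implementation; the position, frequency, and note ranges are assumptions chosen for the example.

```python
# Illustrative sketch only (not the authors' Pure Data implementation).
# "sinify" mimics sin-ification: a continuous map from a tracked position to a sine frequency.
# "midify" mimics MIDI-fication: the same position quantized to a discrete MIDI note.
# Position, frequency, and note ranges are assumptions chosen for the example.

def _normalize(position, pos_min, pos_max):
    """Clamp a position (e.g., wrist height in meters) and scale it to 0..1."""
    clamped = min(max(position, pos_min), pos_max)
    return (clamped - pos_min) / (pos_max - pos_min)

def sinify(position, pos_min=0.0, pos_max=2.0, f_min=220.0, f_max=880.0):
    """Continuous mapping: position -> sine oscillator frequency in Hz."""
    return f_min + _normalize(position, pos_min, pos_max) * (f_max - f_min)

def midify(position, pos_min=0.0, pos_max=2.0, note_min=48, note_max=72):
    """Quantized mapping: position -> MIDI note number (C3..C5 assumed)."""
    return round(note_min + _normalize(position, pos_min, pos_max) * (note_max - note_min))

if __name__ == "__main__":
    for wrist_height in (0.3, 1.0, 1.8):
        print(f"{wrist_height} m -> {sinify(wrist_height):.1f} Hz, MIDI {midify(wrist_height)}")
```

The higher levels described in the paper (melody and melody-and-arrangement modules) would additionally constrain such pitches to a musical scale and add accompaniment, which is beyond this sketch.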




Acknowledgment

This paper is partly based on the first author's Ph.D. dissertation. The first author would like to thank his committee members, Dr. Stephen Barrass, Dr. Shane Mueller, Dr. Scott Kuhl, and Dr. Myounghoon Jeon, for their invaluable feedback on this paper.

Author information

Corresponding author: Myounghoon Jeon.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

Link to playlist of all stimuli used in Experiments 1 and 2: https://www.youtube.com/playlist?list=PLEGYnxgyNt1A210xEjhVqkGKLecyynqkC

1.1 Video label codes

Video label—Condition. (A small decoding sketch for these label codes follows the lists below.)


Experiment 1:

  • Reg1demo—Level 1, demonstrative gesture

  • Reg1dance—Level 1, dance gesture

  • Reg2demo—Level 2, demonstrative gesture

  • Reg2dance—Level 2, dance gesture

  • Reg3demo—Level 3, demonstrative gesture

  • Reg3dance—Level 3, dance gesture

  • Reg4demo—Level 4, demonstrative gesture

  • Reg4dance—Level 4, dance gesture

Experiment 2:

  • PCBD—Precomposed, both music/dance, Tender

  • PCBC—Precomposed, both music/dance, Sad

  • PCBB—Precomposed, both music/dance, Happy

  • PCBA—Precomposed, both music/dance, Anger

  • PCDD—Precomposed, dance only, Tender

  • PCDC—Precomposed, dance only, Sad

  • PCDB—Precomposed, dance only, Happy

  • PCDA—Precomposed, dance only, Anger

  • PCMD—Precomposed, music only, Tender

  • PCMC—Precomposed, music only, Sad

  • PCMB—Precomposed, music only, Happy

  • PCMA—Precomposed, music only, Anger

  • ISBD—Interactive Sonification, both music/dance, Tender

  • ISBC—Interactive Sonification, both music/dance, Sad

  • ISBB—Interactive Sonification, both music/dance, Happy

  • ISBA—Interactive Sonification, both music/dance, Anger

  • ISDD—Interactive Sonification, dance only, Tender

  • ISDC—Interactive Sonification, dance only, Sad

  • ISDB—Interactive Sonification, dance only, Happy

  • ISDA—Interactive Sonification, dance only, Anger

  • ISMD—Interactive Sonification, music only, Tender

  • ISMC—Interactive Sonification, music only, Sad

  • ISMB—Interactive Sonification, music only, Happy

  • ISMA—Interactive Sonification, music only, Anger
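The labeling scheme above is regular: the first two characters give the condition (PC = precomposed, IS = interactive sonification), the third the presentation modality (B, D, or M), and the fourth the target emotion (A, B, C, or D). The decoder below is a hypothetical convenience sketch inferred from the list itself, not part of the authors' tooling.

```python
# Hypothetical decoder for the Experiment 2 video label codes listed above.
# The scheme is inferred from the list itself; it is not the authors' tooling.

CONDITION = {"PC": "Precomposed", "IS": "Interactive Sonification"}
MODALITY = {"B": "both music/dance", "D": "dance only", "M": "music only"}
EMOTION = {"A": "Anger", "B": "Happy", "C": "Sad", "D": "Tender"}

def decode_label(label: str) -> str:
    """Expand a four-character label such as 'ISMC' into a readable condition description."""
    condition, modality, emotion = label[:2], label[2], label[3]
    return f"{CONDITION[condition]}, {MODALITY[modality]}, {EMOTION[emotion]}"

print(decode_label("ISMC"))  # -> Interactive Sonification, music only, Sad
```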

1.2 Screenshots of the Pure Data sonification patches

See Figs. 11, 12.

Fig. 11 Screenshots of the Pure Data patches for musical levels 1 and 2

Fig. 12 Screenshot of the Pure Data sonification patch for musical levels 3 and 4

1.3 The structure and dimensions of the tracking space

See Fig. 13.

Fig. 13 Documentation of the X, Y, and Z distance dimensions tracked by the Vicon motion-capture cameras. Markers are worn on the dancer's wrists and ankles
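As a rough illustration of how such tracked dimensions could be turned into control values before applying the mappings described above, the sketch below scales a marker's X, Y, and Z coordinates into the 0 to 1 range. The capture-volume extents are assumptions for the example, not the measured dimensions documented in Fig. 13.

```python
# Sketch: normalize a marker's X/Y/Z coordinates (meters) into 0..1 control values
# before mapping them to sound parameters. The capture-volume extents are assumed,
# not the measured dimensions documented in Fig. 13.

CAPTURE_EXTENTS = {"x": (-2.0, 2.0), "y": (-2.0, 2.0), "z": (0.0, 2.5)}  # assumed, in meters

def normalize_marker(position_m: dict) -> dict:
    """Return each axis of a wrist/ankle marker position scaled and clamped to 0..1."""
    normalized = {}
    for axis, value in position_m.items():
        low, high = CAPTURE_EXTENTS[axis]
        normalized[axis] = min(max((value - low) / (high - low), 0.0), 1.0)
    return normalized

print(normalize_marker({"x": 0.5, "y": -1.0, "z": 1.2}))
```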

About this article

Cite this article

Landry, S., Jeon, M. Interactive sonification strategies for the motion and emotion of dance performances. J Multimodal User Interfaces 14, 167–186 (2020). https://doi.org/10.1007/s12193-020-00321-3

