Novel Methods in Facilitating Audience and Performer Interaction Using the Mood Conductor Framework

  • György Fazekas
  • Mathieu Barthet
  • Mark B. Sandler
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8905)

Abstract

While listeners’ emotional response to music is the subject of numerous studies, less attention has been paid to the dynamic emotion variations that arise from the interaction between artists and audiences in live improvised music performances. By opening a direct communication channel from audience members to performers, the Mood Conductor system provides an experimental framework to study this phenomenon. Mood Conductor facilitates interactive performances and thus also has an inherent entertainment value. The framework allows audience members to send emotional directions from their mobile devices in order to “conduct” improvised performances. Emotion coordinates indicated by the audience in the arousal-valence space are aggregated and clustered to create a video projection, which the musicians use as guidance and which provides visual feedback to the audience. Three different systems have so far been developed and tested within our framework, and they were trialled in several public performances with different ensembles. Qualitative and quantitative evaluations demonstrated that musicians and audiences were highly engaged with the system, and yielded new insights that will inform future improvements of the framework.
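As a concrete illustration of the aggregation step described above, the sketch below groups audience (valence, arousal) votes into mood clusters and reports each cluster’s centre and size, which could then drive the projected visualisation. This is a minimal sketch only: the abstract does not specify the clustering algorithm or its parameters, so the choice of scikit-learn’s mean-shift implementation and the bandwidth value are illustrative assumptions, not the published implementation.

```python
# Illustrative sketch (assumptions noted in the text): aggregate audience
# (valence, arousal) votes in [-1, 1]^2 and group them into mood clusters.
import numpy as np
from sklearn.cluster import MeanShift

def cluster_votes(votes, bandwidth=0.3):
    """Cluster (valence, arousal) votes.

    Returns the cluster centres and the vote count per cluster, which
    could be used to position and size blobs in a projected display.
    """
    points = np.asarray(votes, dtype=float)
    ms = MeanShift(bandwidth=bandwidth).fit(points)
    return ms.cluster_centers_, np.bincount(ms.labels_)

# Hypothetical burst of votes: most of the audience asks for an
# energetic, positive mood; a minority asks for something calmer.
votes = [(0.8, 0.7), (0.75, 0.8), (0.7, 0.65), (-0.4, -0.5), (-0.35, -0.45)]
centres, counts = cluster_votes(votes)
for (v, a), n in zip(centres, counts):
    print(f"cluster at valence={v:+.2f}, arousal={a:+.2f}: {n} votes")
```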

Keywords

Audience-performer interaction · Music · Performance · Emotion · Mood · Arousal · Valence · Improvisation · Live concert · Mobile technology · Smartphone app · Real time · Visualisation · Human computer interaction

Acknowledgments

The authors acknowledge the kind contribution of the vocal quartet VoXP, who performed during some of the events detailed in this paper, and Matthias Gregori from SoundCloud Ltd., who implemented the client interface of MC System 1. This work was partly funded by EPSRC Grant EP/K009559/1, the TSB-funded “Making Musical Mood Metadata” project (TS/J002283/1), the EPSRC and AHRC Centre for Doctoral Training in Media and Arts Technology (EP/L01632X/1), and the EPSRC-funded “Fusing Semantic and Audio Technologies for Intelligent Music Production and Consumption” (FAST-IMPACt) project (EP/L019981/1).

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • György Fazekas¹
  • Mathieu Barthet¹
  • Mark B. Sandler¹

  1. School of Electronic Engineering and Computer Science, Center for Digital Music, Queen Mary University of London, London, UK