A Possible Model for Predicting Listeners’ Emotional Engagement

  • Roberto Dillon
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3902)

Abstract

This paper introduces a possible approach for evaluating and predicting listeners’ emotional engagement during particular musical performances. A set of audio parameters (cues) is extracted from recorded audio files of two contrasting movements from Bach’s Solo Violin Sonatas and Partitas and compared to listeners’ responses, obtained by moving a slider while listening to the music. The cues showing the highest correlations are then used to generate decision trees and a set of rules useful for predicting the emotional engagement (EM) experienced by potential listeners of similar pieces. The model is tested on two further movements of the Solos, showing very promising results.
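The first stage of the pipeline described above — ranking extracted audio cues by how strongly they correlate with the continuous slider responses — can be sketched as follows. This is a minimal illustration, not the paper’s implementation: the cue names and the synthetic data are hypothetical, and the actual study derived its rules with a decision-tree learner (C4.5) applied to the best-correlated cues.

```python
import numpy as np

def rank_cues_by_correlation(cues, engagement):
    """Rank audio cues by |Pearson r| against the engagement signal.

    cues: dict mapping a cue name to a 1-D array sampled at the same
    rate as the listeners' slider responses (hypothetical data).
    engagement: 1-D array of averaged slider positions over time.
    """
    scores = {}
    for name, series in cues.items():
        # Pearson correlation between this cue and the slider signal
        r = np.corrcoef(series, engagement)[0, 1]
        scores[name] = r
    # Strongest correlations (positive or negative) first
    return sorted(scores.items(), key=lambda kv: -abs(kv[1]))

# Hypothetical example: one cue that tracks engagement closely,
# one that is pure noise.
rng = np.random.default_rng(0)
engagement = np.sin(np.linspace(0, 3, 200))
cues = {
    "loudness": engagement + 0.1 * rng.standard_normal(200),
    "noise": rng.standard_normal(200),
}
ranking = rank_cues_by_correlation(cues, engagement)
print(ranking[0][0])  # cue most correlated with engagement
```

In the study itself, the top-ranked cues would then be fed to a rule-induction step (decision trees) rather than used directly.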



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Roberto Dillon
    School of Computer Engineering – gameLAB, Nanyang Technological University, Singapore
