Towards Affective-Psychophysiological Foundations for Music Production

  • António Pedro Oliveira
  • Amílcar Cardoso
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4738)

Abstract

This paper describes affective and psychophysiological foundations used to help control affective content in music production. Our work includes the proposal of a knowledge base grounded in the state of the art in Music Psychology. This knowledge base encodes relations between affective states (happiness, sadness, etc.) and high-level musical features (rhythm, melody, etc.) to assist in the production of affective music. A computer system uses this knowledge base to select and transform chunks of music. The methodology underlying this system is essentially founded on topics from Affective Computing. Psychophysiological measures will be used to detect the listener's affective state.
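
As a rough illustration of the kind of mapping such a knowledge base can encode, the sketch below pairs two affective states with plausible feature ranges and uses them to select matching music chunks. It is a minimal sketch under assumed names and values (MusicChunk, KNOWLEDGE_BASE, select_chunks, and the tempo ranges are all illustrative), not the authors' actual system.

    # Minimal sketch: a knowledge base relating affective states to
    # high-level musical features, used to select music chunks.
    # All names, feature ranges, and selection logic are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class MusicChunk:
        name: str
        tempo_bpm: float  # rhythm-related feature
        mode: str         # "major" or "minor"

    # Hypothetical relations of the kind the paper grounds in the
    # Music Psychology literature (e.g., fast tempo and major mode
    # tend to be associated with happiness).
    KNOWLEDGE_BASE = {
        "happiness": {"tempo_range": (110, 160), "mode": "major"},
        "sadness":   {"tempo_range": (50, 80),   "mode": "minor"},
    }

    def select_chunks(target_state, library):
        """Return chunks whose features match the target affective state."""
        spec = KNOWLEDGE_BASE[target_state]
        lo, hi = spec["tempo_range"]
        return [c for c in library
                if lo <= c.tempo_bpm <= hi and c.mode == spec["mode"]]

    if __name__ == "__main__":
        library = [MusicChunk("upbeat_theme", 128, "major"),
                   MusicChunk("slow_lament", 64, "minor")]
        print([c.name for c in select_chunks("happiness", library)])
        # -> ['upbeat_theme']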

Keywords

Emotional Expression, Emotion Recognition, Galvanic Skin Response, Music Therapy, Music Performance

Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • António Pedro Oliveira (1)
  • Amílcar Cardoso (1)
  1. Coimbra University, Department of Informatics Engineering, Portugal