In the Mood: Tagging Music with Affects

  • Jörn Loviscach
  • David Oswald
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4868)

Abstract

Music and mood are strongly related, a connection exploited to great effect in classical music, Hollywood soundtracks, and pop songs. Affective computing can support the selection of music that fits a given mood. We describe a system that covers the full range of functionality: it lets the user semi-automatically tag music with mood descriptions, determines the current mood from sensors or from the state of a computer game, and plays appropriate music.
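To make the described pipeline concrete, the following minimal sketch illustrates its final step, mood-based track selection. It assumes, purely for illustration, that each track's mood tag is stored as a point in a two-dimensional valence-arousal space and that the sensed or game-derived mood arrives as coordinates in the same space; the `Track` class, the library contents, and the numeric tags are hypothetical, not the authors' implementation.

```python
from dataclasses import dataclass
import math

@dataclass
class Track:
    title: str
    valence: float  # -1 (negative) .. +1 (positive)
    arousal: float  # -1 (calm) .. +1 (excited)

def pick_track(library: list[Track], valence: float, arousal: float) -> Track:
    """Return the track whose mood tag lies closest (Euclidean
    distance) to the target mood in valence-arousal space."""
    return min(library, key=lambda t: math.hypot(t.valence - valence,
                                                 t.arousal - arousal))

# Hypothetical usage: a game reports a tense scene
# (negative valence, high arousal).
library = [
    Track("Calm Piano",   valence=0.4,  arousal=-0.6),
    Track("Battle Drums", valence=-0.3, arousal=0.9),
    Track("Upbeat Pop",   valence=0.8,  arousal=0.5),
]
print(pick_track(library, valence=-0.4, arousal=0.8).title)  # Battle Drums
```

A nearest-neighbor lookup like this is only one plausible matching rule; smoothing the sensed mood over time or constraining consecutive picks would avoid abrupt changes of musical character.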

Keywords

Music information retrieval · Playlists · MP3 tags



Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Jörn Loviscach
  • David Oswald

  1. Hochschule Bremen, Fachbereich Elektrotechnik und Informatik, Bremen, Germany
