
Towards User-Aware Music Information Retrieval: Emotional and Color Perception of Music

Part of the Human–Computer Interaction Series (HCIS)

Abstract

This chapter presents our findings on emotional and color perception of music. It emphasizes the importance of user-aware music information retrieval (MIR) and the advantages that research on emotional processing and interaction between multiple modalities brings to the understanding of music and its users. Analyses of results show that correlations between emotions, colors and music are largely determined by context. There are differences between emotion-color associations and valence-arousal ratings in non-music and music contexts, with the effects of genre preferences evident for the latter. Participants were able to differentiate between perceived and induced musical emotions. Results also show how associations between individual musical emotions affect their valence-arousal ratings. We believe these findings contribute to the development of user-aware MIR systems and open further possibilities for innovative applications in MIR and affective computing in general.
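The valence-arousal ratings discussed above follow the circumplex model of affect [71], in which each emotion label is a point in a two-dimensional plane (valence: pleasant/unpleasant; arousal: activated/deactivated). As a rough illustration of how context-dependent associations can be compared in this space, the sketch below places a few emotion labels in non-music and music contexts and correlates the two sets of ratings. All coordinate values are invented for illustration; they are not drawn from the Moodo dataset described in the chapter.

```python
# Hypothetical sketch: emotion labels as points in the valence-arousal
# (circumplex) plane, compared across a non-music and a music context.
# All numbers are invented illustrations, not Moodo data.
from math import sqrt

# (valence, arousal) coordinates in [-1, 1], one pair per emotion label
non_music = {"joy": (0.8, 0.6), "sadness": (-0.7, -0.4),
             "anger": (-0.6, 0.7), "relaxation": (0.5, -0.5)}
music     = {"joy": (0.7, 0.5), "sadness": (-0.5, -0.2),
             "anger": (-0.6, 0.6), "relaxation": (0.6, -0.6)}

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

labels = sorted(non_music)
for dim, name in ((0, "valence"), (1, "arousal")):
    a = [non_music[l][dim] for l in labels]
    b = [music[l][dim] for l in labels]
    print(f"{name} correlation across contexts: {pearson(a, b):.2f}")
```

A high correlation on one dimension with a lower one on the other would mirror the chapter's point that context shifts some, but not all, aspects of emotion ratings.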

Keywords

  • Music Piece
  • Color Response
  • Arousal Dimension
  • Emotion Label
  • Music Information Retrieval

These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

Figures 16.1–16.13 (see the electronic version of the chapter for color figures)

Notes

  1. http://www.music-ir.org/mirex.

  2. http://www.music-ir.org/mirex.

  3. For colored figures (Fig. 16.1 (left) and Figs. 16.5, 16.6, 16.7, 16.8, 16.9, 16.10, 16.11, 16.12 and 16.13), refer to the electronic version of this chapter.

  4. Visualization tools giving a general overview of the Moodo dataset are available at http://www.moodo.musiclab.si/#/razplozenjeinglasba.

  5. Note that emotions D: Energetic and H: Liveliness in Fig. 16.6 do not correspond to emotions D: Anticipation and H: Calmness in Fig. 16.7, as the latter are more appropriate in a music context (for a discussion of musical and non-musical emotions, see [37]).

References

  1. Albert, W., Tullis, T.: Measuring the User Experience: Collecting, Analyzing, and Presenting Usability Metrics. Newnes (2013)
  2. Aljanaki, A., Bountouridis, D., Burgoyne, J.A., van Balen, J., Wiering, F., Honing, H., Veltkamp, R.C.: Designing games with a purpose for data collection in music research. Emotify and Hooked: two case studies. Lect. Notes Comput. Sci. (2014)
  3. Aljanaki, A., Wiering, F., Veltkamp, R.C.: Computational modeling of induced emotion using GEMS. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR), pp. 373–378. Taipei (2014)
  4. Barthet, M., Fazekas, G., Sandler, M.: Multidisciplinary perspectives on music emotion recognition: implications for content and context-based models. In: CMMR, pp. 492–507. London (2012)
  5. Barthet, M., Marston, D., Baume, C., Fazekas, G., Sandler, M.: Design and evaluation of semantic mood models for music recommendation using editorial tags. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR). Curitiba (2013)
  6. Bergstrom, T., Karahalios, K., Hart, J.C.: Isochords: visualizing structure in music. In: Proceedings of Graphics Interface, pp. 297–304 (2007)
  7. Bigand, E., Vieillard, S., Madurell, F., Marozeau, J., Dacquet, A.: Multidimensional scaling of emotional responses to music: the effect of musical expertise and of the duration of the excerpts. Cogn. Emot. 19(8), 1113–1139 (2005)
  8. Bulkin, D.A., Groh, J.M.: Seeing sounds: visual and auditory interactions in the brain. Curr. Opin. Neurobiol. 16(4), 415–419 (2006)
  9. Calvert, G.A.: Crossmodal processing in the human brain: insights from functional neuroimaging studies. Cereb. Cortex 11(12), 1110–1123 (2001)
  10. Canazza, S., De Poli, G., Rodà, A., Vidolin, A., Zanon, P.: Kinematics-energy space for expressive interaction in music performance. In: Proceedings of MOSART Workshop on Current Research Directions in Computer Music, pp. 35–40 (2001)
  11. Collignon, O., Girard, S., Gosselin, F., Roy, S., Saint-Amour, D., Lassonde, M., Lepore, F.: Audio-visual integration of emotion expression. Brain Res. 1242, 126–135 (2008)
  12. De Gelder, B., Bertelson, P.: Multisensory integration, perception and ecological validity. Trends Cogn. Sci. 7(10), 460–467 (2003)
  13. Dibben, N.: Emotion and music: a view from the cultural psychology of music. In: Proceedings of the 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops, ACII 2009 (2009)
  14. Doehrmann, O., Naumer, M.J.: Semantics and the multisensory brain: how meaning modulates processes of audio-visual integration. Brain Res. 1242, 136–150 (2008)
  15. Donaldson, J., Lamere, P.: Using visualizations for music discovery. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR). Tutorial (2009)
  16. Eerola, T.: Are the emotions expressed in music genre-specific? An audio-based evaluation of datasets spanning classical, film, pop and mixed genres. J. New Music Res. 40(4), 349–366 (2011)
  17. Eerola, T.: Modeling listeners' emotional response to music. Top. Cogn. Sci. 4, 607–624 (2012)
  18. Eerola, T.: Modelling emotional effects of music: key areas of improvement. In: Proceedings of the Sound and Music Computing Conference 2013, SMC 2013. Stockholm, Sweden (2013)
  19. Eerola, T., Vuoskoski, J.K.: A comparison of the discrete and dimensional models of emotion in music. Psychol. Music 39(1), 18–49 (2010)
  20. Eerola, T., Vuoskoski, J.K.: A review of music and emotion studies: approaches, emotion models, and stimuli. Music Percept. 30(3), 307–340 (2013)
  21. Eerola, T., Lartillot, O., Toiviainen, P.: Prediction of multidimensional emotional ratings in music from audio using multivariate regression models. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR), pp. 621–626 (2009)
  22. Ekman, P.: An argument for basic emotions. Cogn. Emot. 6, 169–200 (1992)
  23. Ernst, M.O., Bülthoff, H.H.: Merging the senses into a robust percept. Trends Cogn. Sci. 8(4), 162–169 (2004)
  24. Evans, P., Schubert, E.: Relationships between expressed and felt emotions in music. Musicae Sci. 12, 75–99 (2008)
  25. Evans, K.K., Treisman, A.: Natural cross-modal mappings between visual and auditory features. J. Vis. 10(1), 6 (2010)
  26. Gabrielsson, A.: Emotion perceived and emotion felt: same or different? Musicae Sci. 5(1 Suppl.), 123–147 (2002)
  27. Gingras, B., Marin, M.M., Fitch, W.T.: Beyond intensity: spectral features effectively predict music-induced subjective arousal. Q. J. Exp. Psychol. 1–19 (2013, ahead of print)
  28. Griscom, W.S., Palmer, S.E.: The color of musical sounds: color associates of harmony and timbre in non-synesthetes. J. Vis. 12(9), 74 (2012)
  29. Grohganz, H., Clausen, M., Jiang, N., Mueller, M.: Converting path structures into block structures using eigenvalue decompositions of self-similarity matrices. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR). Curitiba (2013)
  30. Hart, S.G.: NASA-Task Load Index (NASA-TLX); 20 years later. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 50(9), 904–908 (2006)
  31. Herrera-Boyer, P., Gouyon, F.: MIRrors: music information research reflects on its future: special issue foreword. J. Intell. Inf. Syst. 41, 339–343 (2013)
  32. Hu, X., Downie, J.S.: Exploring mood metadata: relationships with genre, artist and usage metadata. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR). Vienna (2007)
  33. Isaacson, E.: What you see is what you get: on visualizing music. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR), pp. 389–395. London (2005)
  34. Jaimovich, J., Coghlan, N., Knapp, R.B.: Emotion in motion: a study of music and affective response. In: From Sounds to Music and Emotions, pp. 19–43. Springer (2013)
  35. Jiang, N., Mueller, M.: Automated methods for analyzing music recordings in sonata form. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR). Curitiba (2013)
  36. Julia, C.F., Jorda, S.: SongExplorer: a tabletop application for exploring large collections of songs. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR), pp. 675–680. Kobe (2009)
  37. Juslin, P.N., Sloboda, J.A.: Music and Emotion: Theory and Research. Oxford University Press (2001)
  38. Juslin, P.N., Laukka, P.: Expression, perception, and induction of musical emotions: a review and a questionnaire study of everyday listening. J. New Music Res. 33(3), 217–238 (2004)
  39. Juslin, P.N., Västfjäll, D.: Emotional responses to music: the need to consider underlying mechanisms. Behav. Brain Sci. 31(5), 559–575 (2008)
  40. Kim, Y.E., Schmidt, E.M., Migneco, R., Morton, B.G., Richardson, P., Scott, J., Speck, J.A., Turnbull, D.: Music emotion recognition: a state of the art review. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR), pp. 255–266. Utrecht (2010)
  41. Koelsch, S.: Towards a neural basis of music-evoked emotions. Trends Cogn. Sci. 14(3), 131–137 (2010)
  42. Kohonen, T.: The self-organizing map. Proc. IEEE 78(9), 1464–1480 (1990)
  43. Kreutz, G., Ott, U., Teichmann, D., Osawa, P., Vaitl, D.: Using music to induce emotions: influences of musical preference and absorption. Psychol. Music 36, 101–126 (2007)
  44. Kurabayashi, S., Imai, T.: Chord-cube: music visualization and navigation system with an emotion-aware metric space for temporal chord progression. Int. J. Adv. Internet Technol. 7(1), 52–62 (2014)
  45. Lamere, P., Eck, D.: Using 3D visualizations to explore and discover music. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR), pp. 173–174 (2007)
  46. Laurier, C., Sordo, M., Serrà, J., Herrera, P.: Music mood representations from social tags. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR), pp. 381–386 (2009)
  47. Lee, J.H., Cunningham, S.J.: The impact (or non-impact) of user studies in music information retrieval. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR), pp. 391–396 (2012)
  48. Lee, J.H., Cunningham, S.J.: Toward an understanding of the history and impact of user studies in music information retrieval. J. Intell. Inf. Syst. (2013)
  49. Levitin, D.J., Tirovolas, A.K.: Current advances in the cognitive neuroscience of music. Ann. N.Y. Acad. Sci. 1156, 211–231 (2009)
  50. Lykartsis, A., Pysiewicz, A., Coler, H., Lepa, S.: The emotionality of sonic events: testing the Geneva Emotional Music Scale (GEMS) for popular and electroacoustic music. In: Proceedings of the 3rd International Conference on Music and Emotion (ICME3), pp. 1–15. Jyväskylä (2013)
  51. Mardirossian, A., Chew, E.: Visualizing music: tonal progressions and distributions. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR), pp. 189–194. Vienna (2007)
  52. Marin, M.M., Gingras, B., Bhattacharya, J.: Crossmodal transfer of arousal, but not pleasantness, from the musical to the visual domain. Emotion 12(3), 618 (2012)
  53. Marks, L.E., Ben-Artzi, E., Lakatos, S.: Cross-modal interactions in auditory and visual discrimination. Int. J. Psychophysiol. 1, 125–145 (2003)
  54. McGurk, H., MacDonald, J.: Hearing lips and seeing voices. Nature 264, 746–748 (1976)
  55. McVicar, M., Freeman, T., De Bie, T.: Mining the correlation between lyrical and audio features and the emergence of mood. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR), pp. 783–788. Miami (2011)
  56. Meyer, L.B.: Emotion and Meaning in Music. University of Chicago Press, Chicago (1956)
  57. Müllensiefen, D., Gingras, B., Musil, J., Stewart, L.: The musicality of non-musicians: an index for assessing musical sophistication in the general population. PLoS ONE 9(2) (2014)
  58. Gomez, P., Danuser, B.: Relationships between musical structure and psychophysiological measures of emotion. Emotion 7(2), 377 (2007)
  59. Ou, L.-C., Luo, M.R., Woodcock, A., Wright, A.: A study of colour emotion and colour preference. Part I: colour emotions for single colours. Color Res. Appl. 29(3) (2004)
  60. Palmer, S.E., Schloss, K.B., Xu, Z., Prado-León, L.R.: Music-color associations are mediated by emotion. Proc. Natl. Acad. Sci. 110(22), 8836–8841 (2013)
  61. Pampalk, E., Dixon, S., Widmer, G.: Exploring music collections by browsing different views. Comput. Music J. 28(2), 49–62 (2004)
  62. Pampalk, E.: Islands of music: analysis, organization, and visualization of music archives. OGAI J. (Oesterreichische Ges. Artif. Intell.) 22(4), 20–23 (2003)
  63. Panda, R., Malheiro, R., Rocha, B., Oliveira, A., Paiva, R.P.: Multi-modal music emotion recognition: a new dataset, methodology and comparative analysis. In: CMMR (2013)
  64. Parise, C.V., Spence, C.: 'When birds of a feather flock together': synesthetic correspondences modulate audiovisual integration in non-synesthetes. PLoS ONE 4(5), e5664 (2009)
  65. Pearce, M., Rohrmeier, M.: Music cognition and the cognitive sciences. Top. Cogn. Sci. 4(4), 468–484 (2012)
  66. Peretz, I., Coltheart, M.: Modularity of music processing. Nat. Neurosci. 6(7), 688–691 (2003)
  67. Pesek, M., Godec, P., Poredoš, M., Strle, G., Guna, J., Stojmenova, E., Pogačnik, M., Marolt, M.: Capturing the mood: evaluation of the MoodStripe and MoodGraph interfaces. In: Management Information Systems in Multimedia Art, Education, Entertainment, and Culture (MIS-MEDIA), IEEE International Conference on Multimedia and Expo (ICME), pp. 1–4 (2014)
  68. Pesek, M., Godec, P., Poredoš, M., Strle, G., Guna, J., Stojmenova, E., Pogačnik, M., Marolt, M.: Introducing a dataset of emotional and color responses to music. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR), pp. 355–360. Taipei (2014)
  69. Pressing, J.: Cognitive complexity and the structure of musical patterns. Noetica 3, 1–8 (1998)
  70. Remmington, N.A., Fabrigar, L.R., Visser, P.S.: Reexamining the circumplex model of affect. J. Pers. Soc. Psychol. 79(2), 286–300 (2000)
  71. Russell, J.A.: A circumplex model of affect. J. Pers. Soc. Psychol. 39(6), 1161–1178 (1980)
  72. Saari, P., Eerola, T.: Semantic computing of moods based on tags in social media of music. IEEE Trans. Knowl. Data Eng. 26(10), 2548–2560 (2014)
  73. Schedl, M., Flexer, A.: Putting the user in the center of music information retrieval. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR), pp. 416–421 (2012)
  74. Schedl, M., Knees, P.: Personalization in multimodal music retrieval. Lect. Notes Comput. Sci., vol. 7836, pp. 58–71 (2013)
  75. Schedl, M., Flexer, A., Urbano, J.: The neglected user in music information retrieval research. J. Intell. Inf. Syst. 41(3), 523–539 (2013)
  76. Scherer, K.R.: Which emotions can be induced by music? What are the underlying mechanisms? And how can we measure them? J. New Music Res. 33(3), 239–251 (2004)
  77. Scherer, K.R., Zentner, M.R.: Emotional effects of music: production rules. In: Juslin, P.N., Sloboda, J.A. (eds.) Music and Emotion. Oxford University Press, New York (2001)
  78. Schimmack, U., Reisenzein, R.: Experiencing activation: energetic arousal and tense arousal are not mixtures of valence and activation. Emotion 2(4), 412–417 (2002)
  79. Schmidt, E.M., Kim, Y.E.: Modeling musical emotion dynamics with conditional random fields. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR), pp. 777–782 (2011)
  80. Schubert, E.: Emotion felt by listener and expressed by music: a literature review and theoretical investigation. Front. Psychol. 4, 837 (2013)
  81. Schuller, B., Hage, C., Schuller, D., Rigoll, G.: 'Mister D.J., cheer me up!': musical and textual features for automatic mood classification. J. New Music Res. 39(1), 13–34 (2010)
  82. Serra, X., Magas, M., Benetos, E., Chudy, M., Dixon, S., Flexer, A., Gómez, E., Gouyon, F., Herrera, P., Jordà, S., Paytuvi, O., Peeters, G., Vinet, H., Widmer, G.: Roadmap for Music Information Research (2013)
  83. Soleymani, M., Caro, M.N., Schmidt, E.M., Sha, C.-Y., Yang, Y.-H.: 1000 songs for emotional analysis of music. In: Proceedings of the 2nd ACM International Workshop on Crowdsourcing for Multimedia (CrowdMM '13), pp. 1–6. ACM Press, New York (2013)
  84. Song, Y., Dixon, S., Pearce, M.: A survey of music recommendation systems and future perspectives. In: Proceedings of the 9th International Symposium on Computer Music Modelling and Retrieval (CMMR), pp. 395–410. London (2012)
  85. Speck, J.A., Schmidt, E.M., Morton, B.G., Kim, Y.E.: A comparative study of collaborative vs. traditional musical mood annotation. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR), pp. 549–554. Miami (2011)
  86. Spence, C.: Audiovisual multisensory integration. Acoust. Sci. Technol. 28(2), 61–70 (2007)
  87. Spence, C.: Crossmodal correspondences: a tutorial review. Atten. Percept. Psychophys. 73(4), 971–995 (2011)
  88. Spence, C., Senkowski, D., Röder, B.: Crossmodal processing. Exp. Brain Res. 198(2), 107–111 (2009)
  89. Stalinski, S.M., Schellenberg, E.G.: Music cognition: a developmental perspective. Top. Cogn. Sci. 4(4), 485–497 (2012)
  90. Stevens, C.J.: Music perception and cognition: a review of recent cross-cultural research. Top. Cogn. Sci. 4, 653–667 (2012)
  91. Tingle, D., Kim, Y.E., Turnbull, D.: Exploring automatic music annotation with "acoustically-objective" tags. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR), pp. 55–62. New York (2010)
  92. Torrens, M., Hertzog, P., Arcos, J.L.: Visualizing and exploring personal music libraries. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR). Barcelona (2004)
  93. Torres-Eliard, K., Labbé, C., Grandjean, D.: Towards a dynamic approach to the study of emotions expressed by music. Lect. Notes Inst. Comput. Sci. Soc. Inform. Telecommun. Eng. 78, 252–259 (2011)
  94. Turnbull, D., Barrington, L., Torres, D., Lanckriet, G.: Semantic annotation and retrieval of music and sound effects. IEEE Trans. Audio Speech Lang. Process. 16(2), 467–476 (2008)
  95. Typke, R., Wiering, F., Veltkamp, R.C.: A survey of music information retrieval systems. In: Proceedings of the International Symposium on Music Information Retrieval (ISMIR), pp. 153–160 (2005)
  96. Van Gulik, R., Vignoli, F.: Visual playlist generation on the artist map. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR). London (2005)
  97. Van Gulik, R., Vignoli, F., Van de Wetering, H.: Mapping music in the palm of your hand: explore and discover your collection. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR). Barcelona (2004)
  98. Vroomen, J., de Gelder, B.: Sound enhances visual perception: cross-modal effects of auditory organization on vision. J. Exp. Psychol. Hum. Percept. Perform. 26(5), 1583–1588 (2000)
  99. Vuoskoski, J.K., Eerola, T.: Measuring music-induced emotion: a comparison of emotion models, personality biases, and intensity of experiences. Musicae Sci. 15(2), 159–173 (2011)
  100. Vuoskoski, J.K., Eerola, T.: The role of mood and personality in the perception of emotions represented by music. Cortex 47(9), 1099–1106 (2011)
  101. Wang, J.C., Yang, Y.H., Chang, K., Wang, H.M., Jeng, S.-K.: Exploring the relationship between categorical and dimensional emotion semantics of music. In: Proceedings of the Second International ACM Workshop on Music Information Retrieval with User-Centered and Multimodal Strategies (MIRUM '12), p. 63. ACM Press, New York (2012)
  102. Wang, J.-C., Wang, H.-M., Lanckriet, G.: A histogram density modeling approach to music emotion recognition. In: 2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE (2015)
  103. Watson, D., Clark, L.A., Tellegen, A.: Development and validation of brief measures of positive and negative affect: the PANAS scales. J. Pers. Soc. Psychol. 54(6), 1063–1070 (1988)
  104. Weigl, D., Guastavino, C.: User studies in the music information retrieval literature. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR), pp. 335–340 (2011)
  105. Witten, I.B., Knudsen, E.I.: Why seeing is believing: merging auditory and visual worlds. Neuron 48(3), 489–496 (2005)
  106. Yang, Y.H., Chen, H.H.: Machine recognition of music emotion: a review. ACM Trans. Intell. Syst. Technol. 3(3) (2012)
  107. Yoshii, K., Goto, M.: Music Thumbnailer: visualizing musical pieces in thumbnail images based on acoustic features. In: Proceedings of the International Conference on Music Information Retrieval (ISMIR), pp. 211–216. Philadelphia (2008)
  108. Zentner, M., Grandjean, D., Scherer, K.R.: Emotions evoked by the sound of music: characterization, classification, and measurement. Emotion 8(4), 494 (2008)


Author information

Correspondence to Gregor Strle.


Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Strle, G., Pesek, M., Marolt, M. (2016). Towards User-Aware Music Information Retrieval: Emotional and Color Perception of Music. In: Tkalčič, M., De Carolis, B., de Gemmis, M., Odić, A., Košir, A. (eds) Emotions and Personality in Personalized Services. Human–Computer Interaction Series. Springer, Cham. https://doi.org/10.1007/978-3-319-31413-6_16


  • DOI: https://doi.org/10.1007/978-3-319-31413-6_16


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-31411-2

  • Online ISBN: 978-3-319-31413-6

  • eBook Packages: Computer Science (R0)