Emotion Recognition from Music Enhanced by Domain Knowledge

  • Conference paper
PRICAI 2019: Trends in Artificial Intelligence (PRICAI 2019)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11670)

Abstract

Musical elements have long been used to shape audiences’ emotional experience according to the grammar of music. However, this domain knowledge has not been thoroughly explored for music emotion analysis in previous work. In this paper, we propose a novel method that analyzes music emotion by exploiting the domain knowledge of musical elements. Specifically, we first summarize this domain knowledge and, from the summarized music theory, infer probabilistic dependencies between the main musical elements and emotions. We then translate the domain knowledge into constraints and formulate affective music analysis as a constrained optimization problem. Experimental results on the MediaEval 2015 Emotion in Music database and the AMG1608 database demonstrate that the proposed music content analysis method outperforms state-of-the-art emotion prediction methods.
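The constrained-optimization idea in the abstract can be illustrated with a small sketch. The toy example below is not the authors' implementation; the feature set (tempo, mode, loudness), the synthetic data, and the two rules it encodes (faster tempo tends to raise arousal; major mode tends to raise valence) are illustrative assumptions. Each rule becomes a sign constraint on the weights of a linear valence/arousal regressor, and the model is fit as a constrained least-squares problem with SciPy.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic clip-level features: columns are [tempo, mode, loudness], standardised.
X = rng.normal(size=(200, 3))
X[:, 1] = rng.integers(0, 2, size=200)   # mode as a binary flag: 1 = major, 0 = minor

# Synthetic targets that follow the assumed rules, just to make the sketch runnable.
arousal = 0.8 * X[:, 0] + 0.3 * X[:, 2] + 0.1 * rng.normal(size=200)
valence = 0.6 * X[:, 1] - 0.2 * X[:, 2] + 0.1 * rng.normal(size=200)

def fit_constrained(X, y, constraints):
    """Least-squares fit of y ~ X @ w subject to SciPy-style constraints."""
    loss = lambda w: np.mean((X @ w - y) ** 2)
    result = minimize(loss, x0=np.zeros(X.shape[1]),
                      method="SLSQP", constraints=constraints)
    return result.x

# Domain knowledge expressed as inequality constraints on the regression weights:
# the tempo weight for arousal and the mode weight for valence must be non-negative.
w_arousal = fit_constrained(X, arousal, [{"type": "ineq", "fun": lambda w: w[0]}])
w_valence = fit_constrained(X, valence, [{"type": "ineq", "fun": lambda w: w[1]}])

print("arousal weights (tempo, mode, loudness):", np.round(w_arousal, 3))
print("valence weights (tempo, mode, loudness):", np.round(w_valence, 3))
```

The paper's actual formulation operates on richer musical elements and probabilistic dependencies, but the structure is the same: domain knowledge enters the objective as constraints rather than as extra training data.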


References

  1. Aljanaki, A., Yang, Y.H., Soleymani, M.: Emotion in Music task at MediaEval 2015. In: Working Notes Proceedings of the MediaEval 2015 Workshop (2015)

  2. Bittner, R.M., Salamon, J., Tierney, M., Mauch, M., Cannam, C., Bello, J.P.: MedleyDB: a multitrack dataset for annotation-intensive MIR research. In: ISMIR, pp. 155–160 (2014)

  3. Chen, Y.A., Yang, Y.H., Wang, J.C., Chen, H.: The AMG1608 dataset for music emotion recognition. In: 2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 693–697. IEEE (2015)

  4. Chin, Y.H., Wang, J.C.: MediaEval 2015: recurrent neural network approach to Emotion in Music task. In: Working Notes Proceedings of the MediaEval 2015 Workshop (2015)

  5. Fernández-Sotos, A., Fernández-Caballero, A., Latorre, J.M.: Influence of tempo and rhythmic unit in musical emotion regulation. Front. Comput. Neurosci. 10, 80 (2016)

  6. Gabrielsson, A., Lindström, E.: The role of structure in the musical expression of emotions. In: Handbook of Music and Emotion: Theory, Research, Applications, pp. 367–400 (2010)

  7. Gomez, P., Danuser, B.: Relationships between musical structure and psychophysiological measures of emotion. Emotion 7(2), 377–387 (2007)

  8. Husain, G., Thompson, W.F., Schellenberg, E.G.: Effects of musical tempo and mode on arousal, mood, and spatial abilities. Music Percept.: Interdisc. J. 20(2), 151–171 (2002)

  9. Lartillot, O.: MIRtoolbox 1.3.4 user’s manual. Finnish Centre of Excellence in Interdisciplinary Music Research, University of Jyväskylä, Finland (2011)

  10. Lartillot, O., Toiviainen, P.: A Matlab toolbox for musical feature extraction from audio. In: International Conference on Digital Audio Effects, pp. 237–244 (2007)

  11. Liu, Y., Liu, Y., Gu, Z.: Affective feature extraction for music emotion prediction (2015)

  12. Markov, K., Matsui, T.: Dynamic music emotion recognition using kernel Bayes’ filter (2015)

  13. Miller, M.: The Complete Idiot’s Guide to Music Theory. Penguin, New York (2005)

  14. Patra, B.G., Maitra, P., Das, D., Bandyopadhyay, S.: MediaEval 2015: music emotion recognition based on feed-forward neural network. In: MediaEval (2015)

  15. Sloboda, J.: Handbook of Music and Emotion: Theory, Research, Applications. Oxford University Press, Oxford (2011)

  16. Trochidis, K., Lui, S.: Modeling affective responses to music using audio signal analysis and physiology. In: Kronland-Martinet, R., Aramaki, M., Ystad, S. (eds.) CMMR 2015. LNCS, vol. 9617, pp. 346–357. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46282-0_22

  17. Wessel, D.L.: Timbre space as a musical control structure. Comput. Music J. 3, 45–52 (1979)

  18. Yang, Y.H., Chen, H.H.: Machine recognition of music emotion: a review. ACM Trans. Intell. Syst. Technol. (TIST) 3(3), 40 (2012)

Author information

Corresponding author

Correspondence to Guandong Xu.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Shu, Y., Xu, G. (2019). Emotion Recognition from Music Enhanced by Domain Knowledge. In: Nayak, A., Sharma, A. (eds.) PRICAI 2019: Trends in Artificial Intelligence. PRICAI 2019. Lecture Notes in Computer Science (LNAI), vol. 11670. Springer, Cham. https://doi.org/10.1007/978-3-030-29908-8_10

  • DOI: https://doi.org/10.1007/978-3-030-29908-8_10

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-29907-1

  • Online ISBN: 978-3-030-29908-8

  • eBook Packages: Computer Science, Computer Science (R0)
