
A Study on Query-by-Any-Word Based Music Retrieval System

  • Shinji Sako
  • Ai Zukawa
  • Tadashi Kitamura
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 434)

Abstract

Recently, commercial interest in the field of music information retrieval (MIR) has been growing rapidly. This paper describes an MIR system that accepts any Japanese word as a query. Previous studies of emotion-based MIR systems generally use a limited vocabulary, such as major adjectives or kansei words. In practice, however, the emotion of music is expressed by a wide variety of words; music reviews are a good example. A single word can also express a complicated emotion in which several emotions are mixed. Starting from this point of view, we propose an MIR method that can find appropriate music directly from any word given as a query. There are three main issues in this study. The first is how to relate music and emotion. We introduce a two-dimensional space that represents emotions and music in a unified way; this space is obtained automatically from emotion evaluation data for words and music. The second issue is how to extract musical features that map a given piece of music into the emotion space. In our approach, the optimal feature parameters are selected automatically for each axis of the emotion space. The third issue is how to cope with an arbitrary query word. Our method finds a music piece corresponding to the emotion of any word by measuring the relationship between the query word and each basic word. A key point of this approach is the use of word co-occurrence probabilities obtained from a large-scale web text corpus. We performed a subjective evaluation experiment using 100 classical music pieces and 50 Japanese words that frequently appear in music reviews. The experimental results show that the proposed system can find music pieces that largely match the given query word.
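
The retrieval step can be illustrated with a small sketch. The following Python code is not the authors' implementation; the basic words, their emotion-space coordinates, the music-piece coordinates, and the co-occurrence values are hypothetical placeholders. It only shows the general idea: an arbitrary query word is projected into the two-dimensional emotion space as a co-occurrence-weighted combination of basic words, and music pieces are ranked by their distance to that point.

    import numpy as np

    # Hypothetical data: each basic emotion word and each music piece has a
    # position in a two-dimensional emotion space (values invented here).
    basic_word_coords = {
        "bright":  np.array([ 0.8,  0.6]),
        "sad":     np.array([-0.7, -0.5]),
        "calm":    np.array([ 0.2, -0.6]),
        "intense": np.array([-0.1,  0.9]),
    }
    music_coords = {
        "piece_A": np.array([ 0.7,  0.5]),
        "piece_B": np.array([-0.6, -0.4]),
        "piece_C": np.array([ 0.1, -0.5]),
    }

    def query_to_emotion_point(query_word, cooccurrence_prob):
        """Map an arbitrary query word into the emotion space as a
        co-occurrence-weighted average of the basic words' coordinates."""
        weights, points = [], []
        for word, coord in basic_word_coords.items():
            # Co-occurrence of the query with each basic word, in the paper
            # obtained from a large-scale web text corpus.
            weights.append(cooccurrence_prob(query_word, word))
            points.append(coord)
        weights = np.array(weights)
        return (weights[:, None] * np.array(points)).sum(axis=0) / weights.sum()

    def retrieve(query_word, cooccurrence_prob):
        """Return music pieces ranked by distance to the query's emotion point."""
        q = query_to_emotion_point(query_word, cooccurrence_prob)
        return sorted(music_coords, key=lambda m: np.linalg.norm(music_coords[m] - q))

    # Toy co-occurrence function standing in for corpus statistics.
    def toy_cooccurrence(query, word):
        table = {("cheerful", "bright"): 0.6, ("cheerful", "sad"): 0.05,
                 ("cheerful", "calm"): 0.2, ("cheerful", "intense"): 0.15}
        return table.get((query, word), 0.01)

    print(retrieve("cheerful", toy_cooccurrence))  # piece_A is ranked first

In the actual system the music coordinates come from automatically selected audio features rather than being given directly, but the final matching between query word and music follows this nearest-point idea.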

Keywords

emotion-based music retrieval system; co-occurrence probability of words

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Shinji Sako (1)
  • Ai Zukawa (1)
  • Tadashi Kitamura (1)

  1. Nagoya Institute of Technology, Nagoya, Japan