Abstract
With the advent of the ubiquitous era, many studies have been devoted to various situation-aware services in the semantic web environment. One of the most challenging involves implementing a situation-aware, personalized music recommendation service that considers the user’s situation and preferences. Situation-aware music recommendation requires multidisciplinary efforts, including low-level feature extraction and analysis, music mood classification, and human emotion prediction. In this paper, we propose a new scheme for a situation-aware, user-adaptive music recommendation service in the semantic web environment. To do this, we first discuss how knowledge can be utilized to analyze and retrieve music content semantically, and present a user-adaptive music recommendation scheme based on semantic web technologies that facilitates the development of domain knowledge and a rule set. Based on this discussion, we describe our Context-based Music Recommendation (COMUS) ontology, which models the user’s musical preferences and contexts and supports reasoning about the user’s desired emotions and preferences. COMUS defines an upper music ontology that captures concepts concerning general properties of music, such as title, artist, and genre. In addition, for extensibility, it supports adding domain-specific ontologies, such as music features, moods, and situations, in a hierarchical manner. Using this context ontology, logical reasoning rules can infer high-level (implicit) knowledge, such as situations, from low-level (explicit) knowledge. As a key contribution, our ontology can express detailed and complex relations among music clips, moods, and situations, which enables users to find appropriate music. We present experiments performed as a case study in music recommendation.
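To make the reasoning step concrete, the following is a minimal, illustrative sketch of the kind of situation-to-mood inference the abstract describes: low-level (explicit) context facts are mapped by rules to a high-level (implicit) desired mood, which is then matched against mood-annotated tracks. The rule table, mood labels, and track annotations below are hypothetical examples for illustration only, not part of the actual COMUS ontology or its rule set.

```python
# Toy stand-in for ontology-based situation reasoning (illustrative only).

# Low-level (explicit) context facts about the user.
user_context = {"weather": "rainy", "time_of_day": "evening", "activity": "studying"}

# Hand-written rules mapping low-level context to a high-level desired mood.
# In the paper, this role is played by reasoning rules over the ontology.
RULES = [
    (lambda c: c["weather"] == "rainy" and c["activity"] == "studying", "calm"),
    (lambda c: c["time_of_day"] == "morning" and c["activity"] == "exercising", "energetic"),
]

# A tiny catalog annotated with upper-ontology properties (title, artist,
# genre) plus the domain-specific mood extension.
TRACKS = [
    {"title": "Nocturne Op. 9 No. 2", "artist": "Chopin", "genre": "classical", "mood": "calm"},
    {"title": "Thunderstruck", "artist": "AC/DC", "genre": "rock", "mood": "energetic"},
]

def infer_mood(context):
    """Return the first desired mood whose rule condition matches the context."""
    for condition, mood in RULES:
        if condition(context):
            return mood
    return None

def recommend(context):
    """Recommend tracks whose annotated mood matches the inferred desired mood."""
    mood = infer_mood(context)
    return [t["title"] for t in TRACKS if t["mood"] == mood]

print(recommend(user_context))  # → ['Nocturne Op. 9 No. 2']
```

In the actual system, the dictionaries above would be replaced by OWL classes and properties, and the lambda rules by logical reasoning rules evaluated over the ontology.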
Acknowledgment
This research was supported by the MKE (Ministry of Knowledge Economy), Korea, under the ITRC (Information Technology Research Center) support program supervised by the NIPA (National IT Industry Promotion Agency) (NIPA-2011-C1090-1101-0008).
About this article
Cite this article
Rho, S., Song, S., Nam, Y. et al. Implementing situation-aware and user-adaptive music recommendation service in semantic web and real-time multimedia computing environment. Multimed Tools Appl 65, 259–282 (2013). https://doi.org/10.1007/s11042-011-0803-4