Multimedia content recommendation in social networks using mood tags and synonyms

  • Chang Bae Moon
  • Jong Yeol Lee
  • Dong-Seong Kim
  • Byeong Man Kim
Regular Paper

Abstract

The preferences of Web information purchasers are changing: cost-effectiveness (an emphasis on performance relative to price) is giving way to cost-satisfaction, which emphasizes the purchaser's psychological satisfaction. One way to improve a user's cost-satisfaction when recommending multimedia content is to use the mood inherent in multimedia items. An example of applications using this approach is social network services (SNS) based on mood folksonomy; however, such applications suffer from the problem of synonyms. This paper proposes a cost-satisfaction-oriented method of multimedia content recommendation that addresses the synonym problem. It utilizes arousal and valence (AV) values, which express the mood of multimedia content, as internal tags. A method that defines the relationship between the AV values of multimedia content and the AV values of mood tags is proposed, taking synonyms and the correlations between them into account. Furthermore, a multimedia content recommendation method based on this relationship is proposed and tested. The analysis shows that the AV values of multimedia content bearing a mood tag and its synonyms lie within that tag's region of the Thayer model, and the proposed method outperforms a keyword-based recommendation method.
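The core idea described above — collapsing a mood tag and its synonyms onto a single point in arousal–valence (AV) space, estimating each content item's AV value from its tags, and ranking items by proximity to the query mood — can be sketched as follows. The tag coordinates, item names, and functions here are illustrative assumptions for exposition, not the paper's actual data or algorithm.

```python
import math

# Hypothetical (valence, arousal) coordinates for a few mood tags.
# Synonyms are mapped to the same point, so "joyful" and "happy" are
# not treated as distinct moods. Values are made up for illustration.
TAG_AV = {
    "happy": (0.8, 0.6), "joyful": (0.8, 0.6),
    "calm": (0.6, -0.5), "peaceful": (0.6, -0.5),
    "sad": (-0.7, -0.4), "gloomy": (-0.7, -0.4),
}

def content_av(tags):
    """Estimate an item's AV value as the centroid of its known tags' AV points."""
    points = [TAG_AV[t] for t in tags if t in TAG_AV]
    if not points:
        return None
    return (sum(p[0] for p in points) / len(points),
            sum(p[1] for p in points) / len(points))

def recommend(query_tag, catalog, k=2):
    """Rank catalog items by Euclidean distance in AV space to the query tag."""
    q = TAG_AV[query_tag]
    scored = []
    for item, tags in catalog.items():
        av = content_av(tags)
        if av is not None:
            scored.append((math.dist(q, av), item))
    return [item for _, item in sorted(scored)[:k]]

catalog = {
    "song_a": ["happy", "joyful"],
    "song_b": ["calm"],
    "song_c": ["sad", "gloomy"],
}
print(recommend("peaceful", catalog))  # ['song_b', 'song_a']
```

Because "peaceful" shares its AV point with "calm", a query for "peaceful" retrieves the item tagged only "calm" first — the synonym problem a purely keyword-based matcher would miss.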

Keywords

Multimedia content · Cost-satisfaction · Multimedia content mood · Multimedia content recommendation · Mood tag · Social network

Notes

Acknowledgements

This research was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (2017R1D1A1B03033733, 2018R1C1B6001042).

References

  1. Moon, C.B., Yi, J.Y., Kim, D.-S., Kim, B.M.: Analysis of overlapping mood tags based on synonyms. In: Korea Computer Congress 2018 (KCC 2018), KIISE, June 20–22, pp. 667–669. ICC JEJU, Korea (2018)
  2. Russell, J.A.: A circumplex model of affect. J. Personal. Soc. Psychol. 39(6), 1161–1178 (1980)
  3. Hevner, K.: Experimental studies of the elements of expression in music. Am. J. Psychol. 48(2), 246–268 (1936)
  4. Thayer, R.E.: The Biopsychology of Mood and Arousal. Oxford University Press, Oxford (1990)
  5. Moon, C.B., Kim, H.S., Kim, B.M.: Music retrieval method using mood tag and music AV tag based on folksonomy. J. KIISE 40(9), 526–543 (2013)
  6. Moon, C.B., Kim, H.S., Lee, H.A., Kim, B.M.: Analysis of relationships between mood and color for different musical preferences. Color Res. Appl. 39(4), 413–423 (2014)
  7. Moon, C.B., Kim, H.S., Lee, D.W., Kim, B.M.: Mood lighting system reflecting music mood. Color Res. Appl. 40(2), 201–212 (2015)
  8. Ness, S.R., Theocharis, A., Tzanetakis, G., Martins, L.G.: Improving automatic music tag annotation using stacked generalization of probabilistic SVM outputs. In: Proc. of the 17th ACM International Conference on Multimedia, pp. 705–708 (2009)
  9. Laurier, C., Sordo, M., Serra, J., Herrera, P.: Music mood representations from social tags. In: Proc. of the 10th International Society for Music Information Retrieval Conference (ISMIR), pp. 381–386. Kobe, Japan (2009)
  10. Kim, J., Lee, S., Kim, S., Yoo, W.Y.: Music mood classification model based on arousal-valence values. In: Proc. of the 13th International Conference on Advanced Communication Technology (ICACT), pp. 292–295 (2011)
  11. Ji, A.T., et al.: Collaborative tagging in recommender systems. Presented at Advances in Artificial Intelligence (2007)
  12. Tso-Sutter, K.H.L., et al.: Tag-aware recommender systems by fusion of collaborative filtering algorithms. Presented at the ACM Symposium on Applied Computing (2008)
  13. Vojnovic, M., et al.: Ranking and suggesting popular items. IEEE Trans. Knowl. Data Eng. 21, 1133–1146 (2009)
  14. Yang, S., Kim, S., Ro, Y.M.: Semantic home photo categorization. IEEE Trans. Circ. Syst. Video Technol. 17(3), 324–335 (2007)
  15. Yang, S., Kim, S.K., Seo, K.S., Ro, Y.M., Kim, J., Seo, Y.S.: Semantic categorization of digital home photo using photographic region templates. Int. J. Inf. Process. Manag. 43(2), 503–514 (2007)
  16. Yang, S., Ro, Y.M.: Photo indexing using person-based multi-feature fusion with temporal context. In: Proc. of the International Conference on Mobile Ubiquitous Computing, Systems, Services and Technologies, pp. 257–262 (2007)
  17. Chang, E., Kingshy, G., Sychay, G., Wu, G.: CBSA: content-based soft annotation for multimodal image retrieval using Bayes point machines. IEEE Trans. Circ. Syst. Video Technol. 13(1), 26–38 (2003)
  18. Li, J., Wang, J.Z.: Real-time computerized annotation of pictures. IEEE Trans. Pattern Anal. Mach. Intell. 30(6), 985–1002 (2008)
  19. Mörzinger, R., Sorschag, R., Thallinger, G., Lindstaedt, S.: Automatic image annotation using visual content and folksonomies. Multimed. Tools Appl. 42(1), 97–113 (2009)
  20. Powers, D.: Evaluation: from precision, recall and F-measure to ROC, informedness, markedness and correlation. J. Mach. Learn. Technol. 2(1), 37–63 (2011)

Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2019

Authors and Affiliations

  1. ICT-Convergence Research Center, Kumoh National Institute of Technology, Gumi, South Korea
  2. Computer and Software Engineering, Kumoh National Institute of Technology, Gumi, South Korea
