User Modeling and User-Adapted Interaction, Volume 20, Issue 4, pp. 279–311

Using affective parameters in a content-based recommender system for images

  • Marko Tkalčič
  • Urban Burnik
  • Andrej Košir
Original Paper

Abstract

There is an increasing amount of multimedia content available to end users. Recommender systems help these users by selecting a small but relevant subset of items for each user based on their preferences. This paper investigates the influence of affective metadata (metadata that describe the user’s emotions) on the performance of a content-based recommender (CBR) system for images. The underlying assumption is that affective parameters are more closely related to the user’s experience than generic metadata (e.g. genre) and are thus better suited to separating relevant items from non-relevant ones. We propose a novel affective modeling approach based on users’ emotive responses. We conducted a user-interaction session and compared the performance of the recommender system with affective versus generic metadata. The statistical analysis showed that the proposed affective parameters yield a significant improvement in the performance of the recommender system.
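
To make the approach concrete, the following is a minimal sketch of a content-based recommender that represents each image by a valence-arousal-dominance (VAD) vector and trains a per-user classifier to separate relevant from non-relevant items. It is an illustration under stated assumptions, not the authors' implementation: the VAD values, the relevance labels, and the choice of a scikit-learn decision tree are all hypothetical.

```python
# Sketch: content-based recommendation from affective (VAD) item metadata.
# Data values and classifier choice are illustrative assumptions only.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Item profiles: one valence-arousal-dominance vector per image,
# e.g. aggregated emotive responses of users who viewed the image.
item_profiles = np.array([
    [0.8, 0.3, 0.6],   # image 1: pleasant, calm
    [0.2, 0.9, 0.4],   # image 2: unpleasant, arousing
    [0.7, 0.7, 0.5],   # image 3: pleasant, arousing
    [0.3, 0.2, 0.3],   # image 4: unpleasant, calm
])

# User profile: per-user relevance labels (1 = relevant, 0 = non-relevant)
# collected during an interaction session.
labels = np.array([1, 0, 1, 0])

# Train a classifier for this user on the affective metadata.
model = DecisionTreeClassifier(max_depth=2, random_state=0)
model.fit(item_profiles, labels)

# Score unseen images and recommend those predicted as relevant.
unseen = np.array([[0.75, 0.5, 0.55], [0.25, 0.8, 0.35]])
scores = model.predict_proba(unseen)[:, 1]
recommended = unseen[scores > 0.5]
print(scores, recommended)
```

The same pipeline works with generic metadata (e.g. a genre indicator vector) in place of the VAD features, which is the comparison the evaluation in the paper rests on.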

Keywords

Affective modeling · Content-based recommender system · Emotion induction · IAPS · Item profile · Machine learning · Metadata · User profile · Valence-arousal-dominance


Copyright information

© Springer Science+Business Media B.V. 2010

Authors and Affiliations

  1. Faculty of Electrical Engineering, University of Ljubljana, Ljubljana, Slovenia
