Journal on Multimodal User Interfaces, Volume 7, Issue 1–2, pp 143–155

The LDOS-PerAff-1 corpus of facial-expression video clips with affective, personality and user-interaction metadata

  • Marko Tkalčič
  • Andrej Košir
  • Jurij Tasič
Original Paper

Abstract

We present the LDOS-PerAff-1 Corpus, which uniquely bridges the affective computing and recommender systems research areas. The corpus is composed of video clips of subjects’ affective responses to visual stimuli. These affective responses are annotated in the continuous valence-arousal-dominance space. In addition, each subject is annotated with personality information based on the five-factor personality model, and we provide the explicit ratings that the subjects gave to the images used as visual stimuli. In the paper we present the results of four experiments conducted with the corpus: an affective content-based recommender system, a personality-based collaborative-filtering recommender system, an emotion-detection algorithm, and a qualitative study of the latent factors.

Keywords

Affective facial expressions · Affect detection · Recommender systems

Acknowledgments

The authors would like to thank the teachers and students from the Gimnazija Poljane school in Ljubljana for their participation. We are also thankful to Matevž Kunaver, Tomaž Požrl and other members of the LDOS group who helped with the implementation of the acquisition procedure. This work has been partially funded by the European Commission within the 6th Framework Programme of the IST under grant number FP6-27312 and by the Slovenian research agency ARRS under contract P2-0246. All statements in this work reflect the personal ideas and opinions of the authors and not necessarily the opinions of the funders.


Copyright information

© OpenInterface Association 2012

Authors and Affiliations

  1. Faculty of Electrical Engineering, University of Ljubljana, Ljubljana, Slovenia