Journal on Multimodal User Interfaces, Volume 4, Issue 2, pp 81–95

Design and implementation of an affect-responsive interactive photo frame

  • Hamdi Dibeklioğlu
  • Marcos Ortega Hortas
  • Ilkka Kosunen
  • Petr Zuzánek
  • Albert Ali Salah
  • Theo Gevers
Open Access
Original Paper


Abstract

This paper describes an affect-responsive interactive photo-frame application that offers its user a different experience with every use. It relies on visual analysis of the activity levels and facial expressions of its users to select responses from a database of short video segments. This ever-growing database is automatically prepared by offline analysis of user-uploaded videos. The resulting system matches its user's affect along the dimensions of valence and arousal, and gradually adapts its responses to each specific user. In an extended mode, two such systems are coupled and feed each other with visual content. The strengths and weaknesses of the system are assessed through a usability study, in which a Wizard-of-Oz response logic is contrasted with the fully automatic system that uses affective and activity-based features, either alone or in tandem.
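The selection mechanism sketched in the abstract can be illustrated as a nearest-neighbor lookup in valence-arousal space. The sketch below is purely illustrative and is not the paper's implementation: the clip names, annotation values, and the `select_response` helper are all hypothetical, and it assumes each stored segment has been tagged offline with estimated (valence, arousal) coordinates.

```python
import math

# Hypothetical database of short video segments, each tagged offline with
# estimated (valence, arousal) values in [-1, 1]. Names and values are
# illustrative only.
clip_db = [
    {"clip": "smile_01.mp4",   "valence": 0.8, "arousal": 0.4},
    {"clip": "calm_02.mp4",    "valence": 0.2, "arousal": -0.5},
    {"clip": "excited_03.mp4", "valence": 0.6, "arousal": 0.9},
]

def select_response(user_valence, user_arousal, db):
    """Return the clip whose (valence, arousal) lies closest to the
    user's current affect estimate (Euclidean distance)."""
    return min(
        db,
        key=lambda c: math.hypot(c["valence"] - user_valence,
                                 c["arousal"] - user_arousal),
    )

# A user estimated as mildly positive and moderately active would be
# answered with the nearest clip in affect space.
print(select_response(0.7, 0.5, clip_db)["clip"])
```

In the actual system the user-side estimates would come from facial-expression analysis and optical-flow-based activity levels, and the per-user adaptation described in the abstract would adjust which clips are favored over time.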


Keywords: Affective computing · Facial expression · Optical flow · Interactive photograph · Human behavior understanding · Automatic video segmentation


Copyright information

© The Author(s) 2011

Authors and Affiliations

  • Hamdi Dibeklioğlu (1)
  • Marcos Ortega Hortas (2)
  • Ilkka Kosunen (3)
  • Petr Zuzánek (4)
  • Albert Ali Salah (1)
  • Theo Gevers (1)

  1. Intelligent Systems Lab Amsterdam, Informatics Institute, University of Amsterdam, Amsterdam, The Netherlands
  2. University of A Coruna, Corunna, Spain
  3. Helsinki Institute for Information Technology HIIT, University of Helsinki, Helsinki, Finland
  4. Faculty of Information Technology, Czech Technical University in Prague, Prague 6, Czech Republic
