Real-time motion attention and expressive gesture interfaces

  • Matei Mancas
  • Donald Glowinski
  • Gualtiero Volpe
  • Antonio Camurri
  • Pierre Bretéché
  • Jonathan Demeyer
  • Thierry Ravet
  • Paolo Coletta
Original Paper


This paper investigates the relationship between the expressivity of gestures and the amount of attention they attract. We present a technique for quantifying behavior saliency, understood here as the capacity to capture an observer's attention, through the rarity of selected motion and gestural expressive features. This rarity index is based on the real-time computation of the occurrence probability of the numerical values of expressive motion features. Time instants corresponding to rare, unusual dynamic patterns of an expressive feature are thus singled out. In a multi-user scenario, the rarity index highlights the person in a group whose behavior differs most from the others'. In a mono-user scenario, it highlights when the expressive content of a gesture changes. These methods can be considered preliminary steps toward context-aware expressive gesture analysis. This work was partly carried out in the framework of the eNTERFACE 2008 workshop (Paris, France, August 2008) and is partially supported by the EU ICT SAME Project and by the NUMEDIART Project.
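The rarity index described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it estimates the occurrence probability of an expressive feature's values with a histogram and scores each time instant by the self-information (−log p) of its bin, so rare values receive high scores. The function name, the bin count, and the synthetic signal are all assumptions made for the example.

```python
import numpy as np

def rarity_index(feature_values, n_bins=16):
    """Score each sample by the self-information (-log p) of its histogram bin.

    Rare (low-probability) feature values get high rarity scores.
    Hypothetical sketch of a rarity-based saliency measure.
    """
    values = np.asarray(feature_values, dtype=float)
    counts, edges = np.histogram(values, bins=n_bins)
    probs = counts / counts.sum()
    # Map each sample to its bin index (interior edges -> indices 0..n_bins-1).
    bins = np.digitize(values, edges[1:-1])
    # Small epsilon guards against log(0) for empty bins.
    return -np.log(probs[bins] + 1e-12)

# Usage: a mostly steady feature stream with one unusual value at the end.
rng = np.random.default_rng(0)
signal = np.concatenate([rng.normal(0.0, 0.1, 200), [5.0]])
scores = rarity_index(signal)
# The injected outlier receives the highest rarity score.
assert scores[-1] == scores.max()
```

In a multi-user setting the same idea applies across people rather than across time: the feature values of each person are compared against the distribution over the whole group, so the person whose values fall in the least-populated bins stands out.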


Keywords: Computational attention · Saliency · Rarity · Expressive gesture



Copyright information

© OpenInterface Association 2009

Authors and Affiliations

  • Matei Mancas (1)
  • Donald Glowinski (2)
  • Gualtiero Volpe (2)
  • Antonio Camurri (2)
  • Pierre Bretéché (3)
  • Jonathan Demeyer (1)
  • Thierry Ravet (1)
  • Paolo Coletta (2)
  1. IT Research Center, FPMs, University of Mons, Mons, Belgium
  2. Casa Paganini/InfoMus Lab, University of Genova, Genova, Italy
  3. Laseldi Lab, University of Franche-Comté, Montbéliard, France
