Real-time motion attention and expressive gesture interfaces

  • Matei Mancas
  • Donald Glowinski
  • Gualtiero Volpe
  • Antonio Camurri
  • Pierre Bretéché
  • Jonathan Demeyer
  • Thierry Ravet
  • Paolo Coletta
Original Paper

DOI: 10.1007/s12193-009-0017-5

Cite this article as:
Mancas, M., Glowinski, D., Volpe, G. et al. J Multimodal User Interfaces (2008) 2: 187. doi:10.1007/s12193-009-0017-5

Abstract

This paper investigates the relationship between the expressivity of gestures and the amount of attention they attract. We present a technique for quantifying behavior saliency, understood here as the capacity to capture an observer's attention, through the rarity of selected motion and gestural expressive features. This rarity index is based on the real-time computation of the occurrence probability of the numerical values of expressive motion features. The time instants that correspond to rare, unusual dynamic patterns of an expressive feature are thus singled out. In a multi-user scenario, the rarity index highlights the person in a group who shows the most different behavior with respect to the others. In a mono-user scenario, the rarity index highlights when the expressive content of a gesture changes. These methods can be considered preliminary steps toward context-aware expressive gesture analysis. This work was partly carried out in the framework of the eNTERFACE 2008 workshop (Paris, France, August 2008) and is partially supported by the EU ICT SAME Project (www.sameproject.eu) and by the NUMEDIART Project (www.numediart.org).
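As a rough illustration of the abstract's core idea (not the authors' implementation), a rarity index can be sketched as the self-information, −log₂ p, of each feature value, where p is estimated from a histogram over a sliding window of recent values; rare values get high scores. The parameters `n_bins` and `window` are hypothetical choices for this sketch.

```python
import numpy as np

def rarity_index(values, n_bins=16, window=100):
    """Sketch of a rarity index: score each sample of an expressive
    feature by -log2(p), where p is the occurrence probability of the
    sample's histogram bin, estimated over a sliding window of recent
    values. High scores mark rare, unusual feature values.
    (n_bins and window are illustrative parameters, not from the paper.)
    """
    values = np.asarray(values, dtype=float)
    # Fixed bin edges over the observed range (a real-time system
    # would instead fix or adapt the range online).
    edges = np.linspace(values.min(), values.max(), n_bins + 1)
    out = np.zeros(len(values))
    for t in range(len(values)):
        # Histogram of the recent past, including the current sample.
        past = values[max(0, t - window):t + 1]
        hist, _ = np.histogram(past, bins=edges)
        p = hist / hist.sum()
        # Locate the current sample's bin (clamp to the last bin).
        b = min(np.searchsorted(edges, values[t], side='right') - 1,
                n_bins - 1)
        out[t] = -np.log2(max(p[b], 1e-12))  # self-information
    return out
```

For example, a signal that is constant except for one brief deviation yields a rarity score near zero everywhere except at the deviation, where the score peaks; in the paper's multi-user scenario the same principle is applied across people rather than across time.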

Keywords

Computational attention · Saliency · Rarity · Expressive gesture

Copyright information

© OpenInterface Association 2009

Authors and Affiliations

  • Matei Mancas
    • 1
  • Donald Glowinski
    • 2
  • Gualtiero Volpe
    • 2
  • Antonio Camurri
    • 2
  • Pierre Bretéché
    • 3
  • Jonathan Demeyer
    • 1
  • Thierry Ravet
    • 1
  • Paolo Coletta
    • 2
  1. IT Research Center, FPMs, University of Mons, Mons, Belgium
  2. Casa Paganini/InfoMus Lab, University of Genova, Genova, Italy
  3. Laseldi Lab, University of Franche-Comté, Montbéliard, France