Journal on Multimodal User Interfaces, Volume 3, Issue 1, pp 119–130

Investigating shared attention with a virtual agent using a gaze-based interface

  • Christopher Peters
  • Stylianos Asteriadis
  • Kostas Karpouzis
Original Paper

DOI: 10.1007/s12193-009-0029-1

Cite this article as:
Peters, C., Asteriadis, S. & Karpouzis, K. J Multimodal User Interfaces (2010) 3: 119. doi:10.1007/s12193-009-0029-1


This paper investigates the use of a gaze-based interface for testing simple shared attention behaviours during an interaction scenario with a virtual agent. The interface is non-intrusive, using a standard web-camera for input, monitoring users’ head directions and resolving them to screen coordinates in real time. We use the interface to investigate user perception of the agent’s behaviour during a shared attention scenario. Our aim is to identify important factors to consider when constructing engagement models, which must account not only for behaviour in isolation, but also for the context of the interaction, as is the case during shared attention situations.
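The abstract describes resolving the user's head direction to screen coordinates in real time. The paper's actual calibration procedure is not given here, but a minimal sketch of one common simplification (assumed for illustration: a linear mapping from head yaw/pitch angles, clamped to a working range, onto screen pixels) could look like this:

```python
def head_pose_to_screen(yaw_deg, pitch_deg,
                        screen_w=1920, screen_h=1080,
                        max_yaw=30.0, max_pitch=20.0):
    """Map head yaw/pitch (degrees) to screen pixel coordinates.

    Illustrative assumption only: the user faces the screen centre at
    pose (0, 0), and head rotation maps linearly to gaze position on
    screen. The angle ranges and resolution are hypothetical defaults,
    not values from the paper.
    """
    # Clamp angles to the assumed working range of head rotation.
    yaw = max(-max_yaw, min(max_yaw, yaw_deg))
    pitch = max(-max_pitch, min(max_pitch, pitch_deg))
    # Linear interpolation: -max_yaw maps to the left edge,
    # +max_yaw to the right edge.
    x = (yaw / max_yaw + 1.0) / 2.0 * (screen_w - 1)
    # Positive pitch (looking up) maps to the top of the screen.
    y = (1.0 - (pitch / max_pitch + 1.0) / 2.0) * (screen_h - 1)
    return round(x), round(y)
```

For example, a neutral pose `(0, 0)` resolves to the screen centre, and poses beyond the clamped range saturate at the screen edges; a real system would replace the linear assumption with per-user calibration.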

Shared attention · Gaze detection · Embodied agents · Social behaviour

Supplementary material


Below is the link to the electronic supplementary material. (AVI 3.25 MB)

Copyright information

© OpenInterface Association 2009

Authors and Affiliations

  • Christopher Peters 1
  • Stylianos Asteriadis 2
  • Kostas Karpouzis 2

  1. Department of Engineering and Computing, Coventry University, Coventry, UK
  2. National Technical University of Athens, Athens, Greece