Journal on Multimodal User Interfaces

Volume 3, Issue 1, pp 99–108

When my robot smiles at me: Enabling human-robot rapport via real-time head gesture mimicry

Original Paper

DOI: 10.1007/s12193-009-0028-2

Cite this article as:
Riek, L.D., Paul, P.C. & Robinson, P. J Multimodal User Interfaces (2010) 3: 99. doi:10.1007/s12193-009-0028-2


People use imitation to encourage each other during conversation. We conducted an experiment to investigate how imitation by a robot affects people's perceptions of their conversation with it. The robot operated in one of three ways: full head gesture mimicking, partial head gesture mimicking (nodding only), and non-mimicking (blinking only). Participants rated how satisfied they were with the interaction. We hypothesized that participants in the full head gesture condition would rate their interaction most positively, followed by those in the partial and non-mimicking conditions. We also performed gesture analysis to see whether any differences existed between groups, and found that men made significantly more gestures than women while interacting with the robot. Finally, we interviewed participants about their feelings of rapport with the robot, which yielded a number of valuable insights.

Keywords: Affective computing · Empathy · Facial expressions · Human-robot interaction · Social robotics

Copyright information

© OpenInterface Association 2009

Authors and Affiliations

  • Laurel D. Riek (1)
  • Philip C. Paul (2)
  • Peter Robinson (1)

  1. Computer Laboratory, University of Cambridge, Cambridge, UK
  2. Department of Engineering, University of Cambridge, Cambridge, UK
