Original Paper

Journal on Multimodal User Interfaces, Volume 3, Issue 1, pp. 99–108
When my robot smiles at me: Enabling human-robot rapport via real-time head gesture mimicry

  • Laurel D. Riek, Computer Laboratory, University of Cambridge (corresponding author)
  • Philip C. Paul, Department of Engineering, University of Cambridge
  • Peter Robinson, Computer Laboratory, University of Cambridge



People use imitation to encourage each other during conversation. We conducted an experiment to investigate how imitation by a robot affects people's perceptions of their conversation with it. The robot operated in one of three ways: full head gesture mimicry, partial head gesture mimicry (nodding only), and non-mimicry (blinking). Participants rated how satisfied they were with the interaction. We hypothesized that participants in the full head gesture condition would rate their interaction most positively, followed by those in the partial and non-mimicry conditions. We also performed gesture analysis to see whether any differences existed between groups, and found that men made significantly more gestures than women while interacting with the robot. Finally, we interviewed participants about their feelings of rapport with the robot, which revealed a number of valuable insights.

Keywords: Affective computing · Empathy · Facial expressions · Human-robot interaction · Social robotics