Journal on Multimodal User Interfaces, Volume 3, Issue 1, pp 99–108

When my robot smiles at me: Enabling human-robot rapport via real-time head gesture mimicry

Authors

  • L. D. Riek
    • Computer Laboratory, University of Cambridge
  • Philip C. Paul
    • Department of Engineering, University of Cambridge
  • Peter Robinson
    • Computer Laboratory, University of Cambridge
Original Paper

DOI: 10.1007/s12193-009-0028-2

Cite this article as:
Riek, L.D., Paul, P.C. & Robinson, P. J Multimodal User Interfaces (2010) 3: 99. doi:10.1007/s12193-009-0028-2

Abstract

People use imitation to encourage each other during conversation. We conducted an experiment to investigate how imitation by a robot affects people’s perceptions of their conversation with it. The robot operated in one of three modes: full head gesture mimicry, partial head gesture mimicry (nodding only), and non-mimicry (blinking). Participants rated how satisfied they were with the interaction. We hypothesized that participants in the full head gesture condition would rate their interaction most positively, followed by those in the partial and non-mimicry conditions. We also performed gesture analysis to see whether any differences existed between groups, and found that men made significantly more gestures than women while interacting with the robot. Finally, we interviewed participants to gain additional insight into their feelings of rapport with the robot, which yielded a number of valuable findings.

Keywords: Affective computing · Empathy · Facial expressions · Human-robot interaction · Social robotics

Copyright information

© OpenInterface Association 2009