Touch versus In-Air Hand Gestures: Evaluating the Acceptance by Seniors of Human-Robot Interaction

  • Anouar Znagui Hassani
  • Betsy van Dijk
  • Geke Ludden
  • Henk Eertink
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7040)

Abstract

Do elderly people prefer performing in-air gestures or pressing screen buttons to interact with an assistive robot? This study addresses this question by measuring acceptance, performance, and knowledge of both interaction modalities in a scenario in which elderly participants interacted with an assistive robot. Two interaction modalities were compared: in-air gestures and touch. A scenario was chosen in which the elderly participants perform exercises intended to improve lifestyle behavior. The seniors in this scenario stand in front of the assistive robot, which displays a series of exercises on its screen. After each successfully performed exercise, the senior navigates to the next or previous exercise. No significant differences were found between the interaction modalities on the technology acceptance measures of effort, ease, anxiety, performance, and attitude. Scores on these measures were very high for both interaction modalities, indicating that both were accepted by the elderly participants. In a final interview, participants reacted more positively to the use of in-air gestures.
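The paper does not describe how the two input modalities were implemented; the following is only a minimal sketch, with hypothetical names, of how touch events and recognized in-air gestures could be mapped onto the same next/previous navigation commands so that the exercise sequence logic is shared by both modalities.

```python
# Hypothetical sketch (not the authors' implementation): both input
# modalities are translated into the same two navigation commands, so
# the exercise sequence logic is identical for touch and in-air gestures.

from enum import Enum, auto


class Command(Enum):
    NEXT = auto()
    PREVIOUS = auto()


class ExerciseNavigator:
    """Steps through a fixed list of exercises shown on the robot screen."""

    def __init__(self, exercises):
        self.exercises = list(exercises)
        self.index = 0

    def handle(self, command: Command) -> str:
        # Clamp at the ends of the list instead of wrapping around.
        if command is Command.NEXT:
            self.index = min(self.index + 1, len(self.exercises) - 1)
        elif command is Command.PREVIOUS:
            self.index = max(self.index - 1, 0)
        return self.exercises[self.index]


# Both front ends translate raw events into the shared Command type.
def touch_event_to_command(button_id: str) -> Command:
    return Command.NEXT if button_id == "next" else Command.PREVIOUS


def gesture_to_command(gesture_label: str) -> Command:
    # e.g. a recognized "swipe_left" gesture advances to the next exercise
    return Command.NEXT if gesture_label == "swipe_left" else Command.PREVIOUS


if __name__ == "__main__":
    nav = ExerciseNavigator(["arm raise", "knee lift", "side stretch"])
    print(nav.handle(touch_event_to_command("next")))    # knee lift
    print(nav.handle(gesture_to_command("swipe_left")))  # side stretch
    print(nav.handle(gesture_to_command("swipe_right"))) # knee lift
```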

Keywords

Robot acceptance · Assistive technologies · Activities of Daily Living (ADLs) · Human-robot interaction



Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Anouar Znagui Hassani (1)
  • Betsy van Dijk (1)
  • Geke Ludden (2)
  • Henk Eertink (2)
  1. Human Media Interaction, Twente University, Enschede, The Netherlands
  2. Novay, Enschede, The Netherlands
