International Journal of Social Robotics, Volume 6, Issue 3, pp 383–396

A Framework for User-Defined Body Gestures to Control a Humanoid Robot

  • Mohammad Obaid
  • Felix Kistler
  • Markus Häring
  • René Bühling
  • Elisabeth André

Abstract

This paper presents a framework that allows users to interact with and navigate a humanoid robot using body gestures. The first part of the paper describes a study to define intuitive gestures for eleven navigational commands, based on an analysis of 385 gestures performed by 35 participants. From the study results, we present a taxonomy of the user-defined gesture sets, agreement scores for the gesture sets, and the timing performance of the gesture motions. The second part of the paper presents a full-body interaction system for recognizing the user-defined gestures. We evaluated the system with 22 participants to assess its recognition performance. The results show that most of the defined gestures can be recognized successfully, with a precision of 86–100 % and an accuracy of 73–96 %. We discuss the limitations of the system and outline future improvements.
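For reference, the agreement score named above has a standard formulation in user-defined gesture studies, introduced by Wobbrock et al.; the notation below is ours, and the paper's exact variant may differ. For a command (referent) \(r\) with proposal set \(P_r\), where the subsets \(P_i\) group identical gesture proposals,

\[ A_r = \sum_{P_i \subseteq P_r} \left( \frac{|P_i|}{|P_r|} \right)^{2}, \]

so \(A_r = 1\) when all participants propose the same gesture for \(r\), and \(A_r\) approaches \(1/|P_r|\) when every proposal differs. Assuming the usual confusion-matrix definitions for the reported recognition results (TP/FP/TN/FN denoting true/false positives and negatives),

\[ \mathrm{precision} = \frac{TP}{TP+FP}, \qquad \mathrm{accuracy} = \frac{TP+TN}{TP+TN+FP+FN}. \]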

Keywords

Humanoid robot · Robot Nao · Gesture · User-defined gestures · Robot navigation · Gesture recognition

Acknowledgments

This work was partially funded by the European Commission within the 7th Framework Programme under the grant agreement eCute (FP7-ICT-257666).


Copyright information

© Springer Science+Business Media Dordrecht 2014

Authors and Affiliations

  • Mohammad Obaid (1)
  • Felix Kistler (2)
  • Markus Häring (2)
  • René Bühling (2)
  • Elisabeth André (2)

  1. t2i Lab, Chalmers University of Technology, Gothenburg, Sweden
  2. Human Centered Multimedia, Augsburg University, Augsburg, Germany
