Abstract
Helping hand robots have been the focus of numerous studies and hold high potential for modern manufacturing processes and daily living. Because helping hand robots interact closely with users, it is important to find natural and intuitive user interfaces for interacting with them in various situations. This study describes a set of gestures for interacting with and controlling helping hand robots in situations where users need to control the robot manually but have one or both hands occupied, for example, when holding tools or objects. The gestures are derived from an experimental study that asked participants to propose gestures suitable for controlling primitive robot motions. The selected gestures can be used to control the translation and orientation of a helping hand robot's end effector while one or both hands are engaged with a task. To validate the proposed gestures, we implemented a helping hand robot system that assists with a soldering task.
Cite this article
Wongphati, M., Osawa, H. & Imai, M. Gestures for Manually Controlling a Helping Hand Robot. Int J of Soc Robotics 7, 731–742 (2015). https://doi.org/10.1007/s12369-015-0302-2