
Physical Human Interactive Guidance: Identifying Grasping Principles from Human-Planned Grasps

  • Ravi Balasubramanian
  • Ling Xu
  • Peter D. Brook
  • Joshua R. Smith
  • Yoky Matsuoka
Chapter
Part of the Springer Tracts in Advanced Robotics book series (STAR, volume 95)

Abstract

We present a novel and simple experimental method, called Physical Human Interactive Guidance, for studying human-planned grasping. Rather than studying how a human uses his or her own biological hand or teleoperates a robot hand in a grasping task, the method has a human physically interact with a robot arm and hand, carefully moving and guiding the robot into the grasping pose while the robot's configuration is recorded. Analysis of the grasps collected with this simple method produced two interesting results. First, the grasps produced by this method perform better than grasps generated by a state-of-the-art automated grasp planner. Second, when combined with a detailed statistical analysis using a variety of grasp measures (physics-based heuristics considered critical for a good grasp), the method offered insight into how human grasping is similar to, and different from, automated grasp-synthesis techniques. Specifically, the data showed that the human-planned grasps are similar to those from a state-of-the-art automated grasp planner but differ in one key aspect: in the human-planned grasps the robot wrist was aligned with the object's principal axes (termed low skewness in this work), whereas the automated grasps used arbitrary wrist orientations. Preliminary tests show that grasps with low skewness were significantly more robust than grasps with high skewness (77–93 %). We conclude with a detailed discussion of how the Physical Human Interactive Guidance method relates to existing methods for extracting human principles of physical interaction.
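
To make the notion of skewness concrete, the sketch below computes one plausible proxy for it: the smallest angle between the robot's wrist axis and any principal axis of the object, with the principal axes obtained from a PCA of an object point cloud. This is an illustrative assumption for exposition only; the function names, the point-cloud representation, and the exact angular measure are ours, not the chapter's definition. Low values indicate a wrist aligned with a principal axis (low skewness), high values an arbitrary orientation.

    import numpy as np

    def principal_axes(points):
        """Return the object's three principal axes (as rows) via PCA of its point cloud."""
        centered = points - points.mean(axis=0)
        # Eigen-decomposition of the 3x3 covariance matrix; eigenvectors are its columns.
        _, eigvecs = np.linalg.eigh(np.cov(centered.T))
        return eigvecs.T  # each row is a unit-length principal axis

    def wrist_skewness(wrist_axis, points):
        """Smallest angle (radians) between the wrist axis and any principal axis of the object."""
        w = np.asarray(wrist_axis, dtype=float)
        w = w / np.linalg.norm(w)
        axes = principal_axes(np.asarray(points, dtype=float))
        # abs() treats an axis and its negation as the same direction.
        cosines = np.abs(axes @ w)
        return float(np.arccos(np.clip(cosines.max(), -1.0, 1.0)))

    if __name__ == "__main__":
        # Synthetic box-shaped point cloud; a wrist axis nearly parallel to its long
        # edge should yield a small skewness angle.
        rng = np.random.default_rng(0)
        box = rng.uniform(low=[-0.10, -0.03, -0.02], high=[0.10, 0.03, 0.02], size=(500, 3))
        angle = wrist_skewness([1.0, 0.1, 0.0], box)
        print(f"skewness: {np.degrees(angle):.1f} degrees")

Under a proxy of this kind, candidate grasps could, for instance, be filtered by a skewness threshold before being ranked with the usual physics-based grasp measures; this is a usage illustration, not a procedure from the chapter.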

Keywords

Grasping · Haptic interface · Human-robot interaction · Manipulators · Telerobotics

Acknowledgments

The authors thank Brian Mayton for help with the robot experiment set-up and Louis LeGrand for interesting discussions on grasp metrics. Gratitude is also due to Matei Ciocarlie and Peter Allen of the GraspIt! team for helping the authors use the GraspIt! code.


Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Ravi Balasubramanian (1)
  • Ling Xu (2)
  • Peter D. Brook (3)
  • Joshua R. Smith (3)
  • Yoky Matsuoka (3)
  1. School of Mechanical, Industrial, and Manufacturing Engineering, Oregon State University, Corvallis, USA
  2. The Robotics Institute, Carnegie Mellon University, Pittsburgh, USA
  3. Department of Computer Science, The University of Washington, Seattle, USA