Using Near-Field Stereo Vision for Robotic Grasping in Cluttered Environments

  • Adam Leeper
  • Kaijen Hsiao
  • Eric Chu
  • J. Kenneth Salisbury
Part of the Springer Tracts in Advanced Robotics book series (STAR, volume 79)

Abstract

Robotic grasping in unstructured environments requires the ability to adjust and recover when a pre-planned grasp faces imminent failure. Even for a single object, modeling uncertainties due to occluded surfaces, sensor noise, and calibration errors can cause grasp failure; cluttered environments exacerbate the problem. In this work, we propose a simple but robust approach to both pre-touch grasp adjustment and grasp planning for unknown objects in clutter, using a small-baseline stereo camera attached to the gripper of the robot. By employing a 3D sensor from the perspective of the gripper, we gain information about the object and nearby obstacles immediately prior to grasping that is not available during head-sensor-based grasp planning. We use a feature-based cost function on local 3D data to evaluate the feasibility of a proposed grasp. In cases where only minor adjustments are needed, our algorithm performs gradient descent on this cost function to find optimal grasps near the original grasp. In cases where no suitable grasp is found, the robot can search for a significantly different grasp pose rather than blindly attempting a doomed grasp. We present experimental results that validate our approach by grasping a wide range of unknown objects in cluttered scenes. Our results show that reactive pre-touch adjustment can correct for a fair amount of uncertainty in the measured position and shape of the objects, as well as for the presence of nearby obstacles.
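
The core mechanism described above is a local search: a cost is computed from 3D features around a candidate grasp pose, and gradient descent nudges the pose toward a nearby, lower-cost grasp. The sketch below is a rough illustration only, built on invented placeholders (a 2D approach-plane point cloud, a toy collision/capture cost, and central-difference gradients); it is not the authors' actual feature set, weights, or optimizer.

```python
import numpy as np

# Illustrative sketch only: the cost terms, weights, step sizes, and the 2D
# simplification are assumptions for exposition, not the paper's features.

def grasp_cost(pose, cloud, half_width=0.04, sigma=0.01):
    """Toy cost for a parallel-jaw grasp pose (x, y, yaw) against a point
    cloud given as an (N, 2) array in the gripper's approach plane.
    Points near a finger sweep raise the cost; points captured between
    the jaws lower it."""
    x, y, yaw = pose
    c, s = np.cos(yaw), np.sin(yaw)
    pts = (cloud - [x, y]) @ np.array([[c, -s], [s, c]])      # into gripper frame
    lateral, depth = np.abs(pts[:, 0]), np.abs(pts[:, 1])
    in_reach = np.exp(-(depth / 0.03) ** 2)                   # within finger span
    collide = np.exp(-((lateral - half_width) / sigma) ** 2)  # on a finger path
    capture = 1.0 / (1.0 + np.exp((lateral - half_width + 0.01) / sigma))
    return float(np.mean(in_reach * (10.0 * collide - capture)))

def adjust_grasp(pose0, cloud, step=1e-3, iters=100, eps=1e-3):
    """Nudge the head-sensor-planned pose downhill using central-difference
    gradients of the cost; return the adjusted (x, y, yaw)."""
    pose = np.array(pose0, dtype=float)
    for _ in range(iters):
        grad = np.zeros(3)
        for i in range(3):
            d = np.zeros(3)
            d[i] = eps
            grad[i] = (grasp_cost(pose + d, cloud) - grasp_cost(pose - d, cloud)) / (2 * eps)
        pose -= step * grad
    return pose

# Example: object surface points offset ~2 cm from the planned grasp center;
# the adjusted pose shifts so the points fall between the jaws.
rng = np.random.default_rng(0)
cloud = rng.normal([0.02, 0.0], [0.01, 0.01], size=(500, 2))
print(adjust_grasp((0.0, 0.0, 0.0), cloud))
```

A full implementation would presumably operate on the 3D point cloud from the gripper-mounted stereo camera and a richer grasp-pose parameterization; the planar 3-DOF version above is only to keep the sketch short.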

Keywords

Point Cloud · Gradient Descent · Stereo Camera · Visual Servoing · Unknown Object

Copyright information

© Springer-Verlag GmbH Berlin Heidelberg 2014

Authors and Affiliations

  • Adam Leeper (1)
  • Kaijen Hsiao (2)
  • Eric Chu (1)
  • J. Kenneth Salisbury (1)
  1. Stanford University, Stanford, USA
  2. Willow Garage, Menlo Park, USA
