Autonomous Robots, Volume 23, Issue 2, pp 83–96

Hybrid image plane/stereo (HIPS) manipulation for robotic space applications

Authors

  • M. L. Robinson (Jet Propulsion Laboratory)
  • Eric T. Baumgartner (Ohio Northern University)
  • Kevin M. Nickels (Trinity University)
  • Todd E. Litwin (Jet Propulsion Laboratory)

DOI: 10.1007/s10514-007-9032-0

Cite this article as:
Robinson, M.L., Baumgartner, E.T., Nickels, K.M. et al. Auton Robot (2007) 23: 83. doi:10.1007/s10514-007-9032-0

Abstract

Manipulation systems for planetary exploration operate under severe limitations due to power and weight restrictions and extreme environmental conditions. Typically such systems employ carefully calibrated stereo cameras and carefully calibrated manipulators to achieve precision on the order of ten millimeters with respect to instrument placement activities. The environmental and functional restrictions under which these systems are used limit the operational accuracy of these approaches. This paper presents a novel approach to stereo-based manipulation designed to robustly achieve high precision levels despite the aforementioned limitations. The basic principle of the approach, known as Hybrid Image Plane/Stereo (HIPS) Manipulation, is the generation of camera models through direct visual sensing of the manipulator's end-effector. The HIPS method estimates and subsequently uses these models to position the manipulator at a target location specified in the image planes of a stereo camera pair using stereo correlation and triangulation. In-situ estimation and adaptation of the manipulator/camera models in this method accounts for changes in the system configuration, thus ensuring consistent precision for the life of the mission. The end result is an increase in positioning precision by a factor of approximately two for a limited version of HIPS, and an order-of-magnitude increase in positioning precision for the full on-line version of HIPS.
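The stereo triangulation step the abstract refers to can be illustrated with the standard linear (DLT) method: a matched feature in the left and right image planes, together with each camera's projection matrix, constrains the 3D target point. The sketch below is a generic illustration of that principle, not the paper's implementation; the function name and camera matrices are assumptions.

```python
import numpy as np

def triangulate(P_left, P_right, uv_left, uv_right):
    """Linear (DLT) triangulation of one point from a stereo pair.

    P_left, P_right : 3x4 camera projection matrices (illustrative; HIPS
        estimates its camera models in situ from views of the end-effector).
    uv_left, uv_right : (u, v) pixel coordinates of the matched feature.
    Returns the 3D point in the cameras' common reference frame.
    """
    # Each image observation contributes two linear constraints on the
    # homogeneous 3D point X: u * (P[2] @ X) = P[0] @ X, and similarly for v.
    A = np.vstack([
        uv_left[0]  * P_left[2]  - P_left[0],
        uv_left[1]  * P_left[2]  - P_left[1],
        uv_right[0] * P_right[2] - P_right[0],
        uv_right[1] * P_right[2] - P_right[1],
    ])
    # Least-squares solution: right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize
```

For a target designated in both image planes, the returned 3D point would then serve as the goal for positioning the end-effector; HIPS closes the loop by building the camera models from direct visual sensing of the arm itself, so the same models map both the target and the end-effector consistently.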

Keywords

Vision-based manipulation · Stereo triangulation · Stereo vision · Space robotics

Copyright information

© Springer Science+Business Media, LLC 2007