Abstract
During the Mars Exploration Rover (MER) landings, the Descent Image Motion Estimation System (DIMES) was used for horizontal velocity estimation. The DIMES algorithm combined measurements from a descent camera, a radar altimeter, and an inertial measurement unit. To deal with large changes in scale and orientation between descent images, the algorithm used altitude and attitude measurements to rectify the images to a level ground plane. Features were selected and tracked in the rectified images to compute the horizontal motion between images, and differences of consecutive motion estimates were then compared to inertial measurements to verify correct feature tracking. By combining sensor data from multiple sources in a novel way, DIMES provided a low-cost, robust, and computationally efficient velocity estimation solution, and it was the first robotic vision system used to control a spacecraft during planetary landing. This paper presents the design and implementation of the DIMES algorithm, the assessment of its performance with a high-fidelity Monte Carlo simulation, the validation of that performance with field test data, and detailed results from the two landings on Mars.
DIMES was used successfully during both MER landings. For Spirit, had DIMES not been used onboard, the total velocity at impact would have been at the limits of the airbag capability; instead, DIMES computed the actual steady-state horizontal velocity, which the thruster firing logic used to reduce the total velocity prior to landing. For Opportunity, DIMES computed the correct velocity, and it was small enough that the lander took no action to remove it.
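As a rough illustration of the rectify-track-difference pipeline described above, the Python/OpenCV sketch below builds rotation homographies from the measured attitude, rectifies two descent images to a level ground plane, tracks a few features between them, and converts the median pixel shift into a horizontal velocity. This is a minimal sketch under stated assumptions, not the flight algorithm: the Lucas-Kanade tracker, the feature count, and all parameter values are illustrative substitutes (the flight system tracked features by template correlation and also used the radar altitude to scale each rectified image).

# Minimal sketch (illustrative only) of image-based horizontal velocity
# estimation between two descent images.
import numpy as np
import cv2

def rectification_homography(K, R_cam_to_level):
    """Pure-rotation homography that re-projects an image as if the camera
    looked straight down at a level ground plane. K is the 3x3 camera
    intrinsic matrix; R_cam_to_level rotates camera-frame rays into a
    local level frame (from the measured attitude)."""
    return K @ R_cam_to_level @ np.linalg.inv(K)

def horizontal_velocity(img1, img2, H1, H2, altitude, fx, dt):
    """Estimate horizontal velocity (m/s) from two 8-bit grayscale descent
    images taken dt seconds apart. Assumes roughly constant altitude, so
    one rectified pixel spans (altitude / fx) meters in both images."""
    size = (img1.shape[1], img1.shape[0])
    rect1 = cv2.warpPerspective(img1, H1, size)   # level-ground view 1
    rect2 = cv2.warpPerspective(img2, H2, size)   # level-ground view 2

    # Select a handful of strong features in the first rectified image
    # and track them into the second.
    pts1 = cv2.goodFeaturesToTrack(rect1, maxCorners=4,
                                   qualityLevel=0.01, minDistance=40)
    pts2, status, _ = cv2.calcOpticalFlowPyrLK(rect1, rect2, pts1, None)

    # Median of the per-feature shifts discounts an occasional mistrack.
    flow = (pts2 - pts1).reshape(-1, 2)[status.ravel() == 1]
    shift_px = np.median(flow, axis=0)
    return shift_px * (altitude / fx) / dt        # (vx, vy) in m/s

Taking a robust average of per-feature shifts reflects the same concern the abstract raises: individual feature tracks can be wrong, which is why DIMES cross-checked differenced motion estimates against the inertial measurements.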
Cite this article
Johnson, A., Willson, R., Cheng, Y. et al. Design Through Operation of an Image-Based Velocity Estimation System for Mars Landing. Int J Comput Vision 74, 319–341 (2007). https://doi.org/10.1007/s11263-006-0022-z