Abstract
The upcoming Hubble Space Telescope (HST) Servicing Mission 4 (SM4) includes a Relative Navigation Sensor (RNS) experiment, which uses three cameras and an avionics package to record images and to estimate, in real time, the relative position and attitude ("pose") during the Shuttle's capture and deployment of the telescope. RNS recently completed its third and final phase of testing at the Marshall Space Flight Center Flight Robotics Laboratory (FRL). This testing used flight-spare cameras, engineering development unit avionics, and the flight pose algorithms to estimate the pose of a Hubble mockup mounted on the FRL Dynamic Overhead Target Simulator (DOTS). The mockup was moved through a variety of flight-like lighting conditions and trajectories. In this paper we present pose estimation results from the third phase of RNS FRL testing.
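The flight pose algorithms themselves are described in the body of the paper; as a purely illustrative sketch of the general problem class (recovering relative position and attitude from 2D image observations of known 3D feature points on a target model), the following assumes a calibrated camera with normalized image coordinates and solves for the pose with a Direct Linear Transform. All numerical values and the DLT approach are assumptions for the example, not the RNS flight algorithm.

```python
import numpy as np

# Illustrative sketch (not the RNS flight algorithm): estimate relative pose
# from 2D-3D point correspondences via the Direct Linear Transform (DLT),
# assuming a calibrated camera so that K = I (normalized image coordinates).

rng = np.random.default_rng(0)

# Six (generically non-coplanar) 3D feature points on a target model, in
# meters; the values are made up for the example.
X = rng.uniform(-1.0, 1.0, size=(6, 3))

# A "true" relative pose: small rotation about the boresight plus a
# translation, standing in for the chaser-to-target pose being estimated.
theta = 0.1
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.1, -0.2, 5.0])

# Synthesize noiseless image observations: x_cam = R X + t, then divide
# by depth to get normalized image coordinates (u, v).
Xc = X @ R_true.T + t_true
uv = Xc[:, :2] / Xc[:, 2:3]

# Build the homogeneous DLT system A p = 0 for the 3x4 matrix P = [R | t];
# each correspondence contributes two rows.
A = []
for (x, y, z), (u, v) in zip(X, uv):
    A.append([x, y, z, 1, 0, 0, 0, 0, -u * x, -u * y, -u * z, -u])
    A.append([0, 0, 0, 0, x, y, z, 1, -v * x, -v * y, -v * z, -v])
A = np.asarray(A)

# The least-squares solution is the right singular vector associated with
# the smallest singular value.
_, _, Vt = np.linalg.svd(A)
P = Vt[-1].reshape(3, 4)

# Fix the unknown scale and sign so the rotation block has determinant +1.
P /= np.cbrt(np.linalg.det(P[:, :3]))
R_est, t_est = P[:, :3], P[:, 3]
print(np.round(t_est, 3))
```

With noiseless, exactly consistent observations the recovered `R_est` and `t_est` match the true pose to numerical precision; with real imagery the correspondences are noisy and a nonlinear refinement (e.g. iterative least squares on the reprojection error) would normally follow the DLT initialization.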
Additional information
Presented at the F. Landis Markley Astronautics Symposium, Cambridge, Maryland, June 29–July 2, 2008.
Cite this article
Naasz, B.J., Burns, R.D., Queen, S.Z. et al. The HST SM4 Relative Navigation Sensor System: Overview and Preliminary Testing Results from the Flight Robotics Lab. J of Astronaut Sci 57, 457–483 (2009). https://doi.org/10.1007/BF03321512