Abstract
A new autonomous navigation scheme for planetary landing is presented. The navigation system comprises an inertial measurement unit (IMU) and a stereo camera that measures the unit direction vector and range from the camera to each detected landmark. The lander's motion is estimated by an algorithm known as vision-aided inertial navigation (VAIN). The algorithm uses the unit direction vectors and range measurements of features tracked across two sequential images, together with the lander's corresponding poses derived from the IMU, and it requires no a priori terrain information. An augmented implicit extended Kalman filter (IEKF) tightly integrates the stereo-camera and IMU measurements to produce an accurate estimate of the lander's pose and velocity and to correct the constant IMU biases. Numerical simulation results show that the proposed VAIN method substantially improves the navigation accuracy of the inertial navigation system (INS) and satisfies the requirements of future planetary exploration missions.
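To make the two-frame measurement model concrete, the minimal sketch below illustrates the kind of implicit constraint such a filter can exploit: a landmark reconstructed from the unit direction vector and range observed in one frame must coincide with the same landmark reconstructed from the next frame, which eliminates the landmark position from the state and removes the need for a terrain map. This is an illustrative sketch under those assumptions, not the authors' implementation; the function and variable names (landmark_in_world, implicit_residual, p1, R1, u1, r1, ...) are hypothetical.

import numpy as np

def landmark_in_world(p, R, u, r):
    """Reconstruct a landmark position in the planet-fixed frame from the
    camera position p, camera-to-world rotation R, unit direction vector u
    (camera frame), and measured range r."""
    return p + R @ (r * u)

def implicit_residual(p1, R1, u1, r1, p2, R2, u2, r2):
    """Implicit measurement h(x, z) = 0: the same landmark reconstructed from
    two sequential stereo observations must coincide, so the filter update
    drives the difference of the two reconstructions to zero. Hypothetical
    sketch of the constraint form; not the paper's actual filter equations."""
    return landmark_in_world(p1, R1, u1, r1) - landmark_in_world(p2, R2, u2, r2)

# Example: a perfectly consistent observation pair yields a (near-)zero residual.
if __name__ == "__main__":
    l = np.array([100.0, -50.0, 0.0])            # true landmark (planet frame)
    p1, R1 = np.array([0.0, 0.0, 500.0]), np.eye(3)
    p2, R2 = np.array([10.0, 0.0, 480.0]), np.eye(3)
    d1, d2 = l - p1, l - p2
    r1, r2 = np.linalg.norm(d1), np.linalg.norm(d2)
    u1, u2 = R1.T @ d1 / r1, R2.T @ d2 / r2      # unit direction vectors
    print(implicit_residual(p1, R1, u1, r1, p2, R2, u2, r2))  # ~[0, 0, 0]

In an augmented filter of the kind described, the previous pose (p1, R1) is retained in the state vector alongside the current pose, so the residual above can be linearized with respect to both poses during the IEKF update.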
Cite this paper
Xu, C., Wang, D., Huang, X. (2016). Autonomous Navigation Based on Sequential Images for Planetary Landing. In: Jia, Y., Du, J., Li, H., Zhang, W. (eds) Proceedings of the 2015 Chinese Intelligent Systems Conference. Lecture Notes in Electrical Engineering. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-48365-7_41
Print ISBN: 978-3-662-48363-3
Online ISBN: 978-3-662-48365-7