Abstract
This paper presents an algorithm that computes the three-dimensional position and orientation (3D pose) of an exploration robot by processing two multidimensional signals: a monocular near-infrared (NIR) video signal and a time-of-flight (ToF) depth signal, both provided by a monocular NIR ToF camera rigidly attached to the side of the robot and facing the terrain. It is shown that when the depth signal is also taken into account during processing, the 3D pose of the robot can be determined accurately even on irregular terrain. The 3D pose is computed by integrating the frame-to-frame 3D motion of the robot over time using composition rules, where the frame-to-frame 3D motion is estimated by minimizing a linear photometric error with an iterative maximum-likelihood estimator. Hundreds of experiments were conducted over rough terrain, yielding absolute position and orientation errors below 1% of the distance and angle traveled, respectively. This performance is due mainly to the accurate depth knowledge that the monocular NIR ToF camera provides to the algorithm. The algorithm runs in real time and can process up to 50 fps at VGA resolution on a conventional laptop computer.
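As a concrete illustration of the two steps the abstract names, frame-to-frame motion estimation by iterative photometric minimization and pose integration via composition rules, the sketch below (Python, using NumPy and SciPy) shows the Gauss-Newton structure such an estimator typically has. It is a minimal stand-in under strong simplifying assumptions, not the paper's method: the motion is reduced from a 6-DoF rigid transform to a 2-DoF image translation, the ToF depth signal is omitted, and the function names estimate_translation and compose are hypothetical.

    # Illustrative sketch only: the paper estimates full 6-DoF robot motion from
    # NIR intensities and ToF depth; here the photometric step is reduced to a
    # 2-DoF image translation so the iterative estimator's structure stays visible.
    import numpy as np
    from scipy.ndimage import shift as warp_shift

    def estimate_translation(I_ref, I_cur, iters=20, tol=1e-6):
        # Frame-to-frame motion by minimizing the photometric error
        # r(p) = I_cur(x + p) - I_ref(x) with Gauss-Newton iterations.
        p = np.zeros(2)                      # (row, col) displacement in pixels
        for _ in range(iters):
            I_w = warp_shift(I_cur.astype(float), -p, order=1, mode="nearest")
            r = (I_w - np.asarray(I_ref, float)).ravel()  # photometric residuals
            g_row, g_col = np.gradient(I_w)  # image gradients = Jacobian columns
            J = np.stack([g_row.ravel(), g_col.ravel()], axis=1)
            dp, *_ = np.linalg.lstsq(J, -r, rcond=None)   # Gauss-Newton step
            p += dp
            if np.linalg.norm(dp) < tol:
                break
        return p

    def compose(T_abs_prev, T_prev_cur):
        # Composition rule: chain the previous absolute pose with the latest
        # frame-to-frame motion (both as 4x4 homogeneous SE(3) matrices).
        return T_abs_prev @ T_prev_cur

Under the usual assumption of i.i.d. Gaussian image noise, this nonlinear least-squares minimization coincides with maximum-likelihood estimation, which is why a Gauss-Newton loop can stand in for the iterative maximum-likelihood estimator mentioned in the abstract; in the paper's setting it is the ToF depth that makes the full 6-DoF photometric warp metrically well-posed.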
References
Bayard, D.S., et al.: Vision-based navigation for the NASA Mars helicopter. In: AIAA Scitech 2019 Forum (2019)
Bertsekas, D.P.: Nonlinear Programming. 2nd edn. Athena Scientific, Nashua (1999)
Bierling, M.: A differential displacement estimation algorithm with improved stability. In: 2nd International Technical Symposium on Optical and Electro-Optic Applied Sciences and Engineering, pp. 170–174. Cannes, France, 25–27 November 1985
Chiuso, A., Favaro, P., Jin, H., Soatto, S.: Structure from motion causally integrated over time. IEEE Trans. Pattern Anal. Mach. Intell. 24(4), 523–535 (2002)
Davison, A.J., Reid, I.D., Molton, N.D., Stasse, O.: MonoSLAM: real-time single camera SLAM. IEEE Trans. Pattern Anal. Mach. Intell. 29(6), 1052–1067 (2007)
Delaune, J., Bayard, D.S., Brockers, R.: Range-visual-inertial odometry: scale observability without excitation. IEEE Robot. Autom. Lett. 6(2), 2421–2428 (2021). https://doi.org/10.1109/LRA.2021.3058918
Delaune, J., et al.: Extended navigation capabilities for a future Mars science helicopter concept. In: 2020 IEEE Aerospace Conference, pp. 1–10 (2020). https://doi.org/10.1109/AERO47225.2020.9172289
Eade, E., Drummond, T.: Scalable monocular SLAM. In: IEEE Computer Society Conference on Computer Vision and Pattern Recognition, vol. 1, pp. 469–476. New York, NY, USA, June 2006
Engel, J., Schöps, T., Cremers, D.: LSD-SLAM: large-scale direct monocular SLAM. In: European Conference on Computer Vision, pp. 834–849. Zurich, Switzerland, September 2014
Forster, C., Pizzoli, M., Scaramuzza, D.: SVO: fast semi-direct monocular visual odometry. In: IEEE International Conference on Robotics and Automation, pp. 15–22. Hong Kong, June 2014
Helmick, D., Cheng, Y., Clouse, D., Bajracharya, M., Matthies, L., Roumeliotis, S.: Slip compensation for a Mars rover. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2806–2813. Edmonton, Alberta, Canada, 2–6 August 2005
Howard, T., et al.: Enabling continuous planetary rover navigation through FPGA stereo and visual odometry. In: IEEE Aerospace Conference, pp. 1–9. Big Sky, MT, USA, 3–10 March 2012
Klein, G., Murray, D.: Parallel tracking and mapping for small AR workspaces. In: IEEE and ACM International Symposium on Mixed and Augmented Reality, pp. 225–234. Nara, Japan, November 2007
Ma, Y., Soatto, S., Kosecka, J., Sastry, S.S.: An Invitation to 3-D Vision: From Images to Geometric Models. Springer, New York (2003). https://doi.org/10.1007/978-0-387-21779-6
Maimone, M., Cheng, Y., Matthies, L.: Two years of visual odometry on the Mars Exploration Rovers. J. Field Robot. 24(3), 169–186 (2007)
Martinez, G.: Monocular visual odometry from frame to frame intensity differences for planetary exploration mobile robots. In: IEEE Workshop on Robot Vision (IEEE WoRV), pp. 54–59. Tampa Bay, Florida, USA, 15–17 January 2013
Martinez, G.: Field tests on flat ground of an intensity-difference based monocular visual odometry algorithm for planetary rovers. In: 15th IAPR International Conference on Machine Vision Applications (IAPR MVA-2017), pp. 161–164. Nagoya, Japan, 08–12 May 2017
Martinez, G.: Experimental results of testing a direct monocular visual odometry algorithm outdoors on flat terrain under severe global illumination changes for planetary exploration rovers. Computación y Sistemas 22(4), 1581–1593 (2018)
Mendel, J.M.: Lessons in Estimation Theory for Signal Processing, Communications, and Control. Prentice-Hall signal processing series, Prentice Hall PTR (1995). https://books.google.co.cr/books?id=pK51QgAACAAJ
Mouragnon, E., Lhuillier, M., Dhome, M., Dekeyser, F., Sayd, P.: Real-time localization and 3D reconstruction. In: IEEE Computer Society Conference on Computer Vision and Pattern Recognition, vol. 1, pp. 363–370 (2006)
Mur-Artal, R., Montiel, J.M.M., Tardos, J.D.: ORB-SLAM: a versatile and accurate monocular SLAM system. IEEE Trans. Robot. 31(5), 1147–1163 (2015)
Strasdat, H., Montiel, J., Davison, A.: Scale drift-aware large scale monocular SLAM. In: Robotics: Science and Systems, Zaragoza, Spain, 27–30 June 2010
Tsai, R.: A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses. IEEE J. Robot. Autom. 3(4), 323–344 (1987)
Tzanetos, T., et al.: Future of Mars rotorcraft – Mars science helicopter. In: 2022 IEEE Aerospace Conference (AERO), pp. 1–16 (2022). https://doi.org/10.1109/AERO53065.2022.9843501
Acknowledgment
This work was supported by the Vice-Rectorate for Research of the University of Costa Rica. The author thanks Reg Willson of the NASA Jet Propulsion Laboratory for kindly providing the implementation of Tsai's coplanar calibration algorithm used in the experiments.
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Martinez, G. (2024). Real-Time Robot 3D Pose Computation from NIR Imagery and ToF Depth Maps for Space Applications. In: Hernández Ponce, A.M., Marcos Escobar, K., Canales Hernández, L.D., Zea Ortiz, M., Sánchez Alonso, R.E. (eds) Trends and Challenges in Multidisciplinary Research for Global Sustainable Development. ICASAT 2023. Lecture Notes in Networks and Systems, vol 965. Springer, Cham. https://doi.org/10.1007/978-3-031-57620-1_2
DOI: https://doi.org/10.1007/978-3-031-57620-1_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-57619-5
Online ISBN: 978-3-031-57620-1
eBook Packages: Intelligent Technologies and Robotics (R0)