
Real-Time Robot 3D Pose Computation from NIR Imagery and ToF Depth Maps for Space Applications

  • Conference paper
  • In: Trends and Challenges in Multidisciplinary Research for Global Sustainable Development (ICASAT 2023)
  • Part of the book series: Lecture Notes in Networks and Systems (LNNS, volume 965)

Abstract

This paper presents an algorithm that determines the three-dimensional position and orientation (3D pose) of an exploration robot by processing two multidimensional signals: a monocular near-infrared (NIR) video signal and a time-of-flight (ToF) depth signal, both provided by a monocular NIR ToF camera rigidly attached to the side of the robot and facing the terrain. It is shown that when the depth signal is also considered during processing, the 3D pose of the robot can be determined accurately even on irregular terrain. The 3D pose is computed by integrating the frame-to-frame robot 3D motion over time using composition rules, where the frame-to-frame motion is estimated by minimizing a linear photometric error with an iterative maximum likelihood estimator. Hundreds of experiments conducted over rough terrain yielded excellent absolute position and orientation errors of less than 1% of the distance and angle traveled, respectively. This performance is mainly due to the algorithm's more accurate knowledge of depth, provided by the monocular NIR ToF camera. The algorithm runs in real time and can process up to 50 fps at VGA resolution on a conventional laptop computer.
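The pose-integration step described above, accumulating the estimated frame-to-frame 3D motion over time with composition rules, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: `rot_z` and `compose` are hypothetical helpers, and the standard SE(3) composition rule is assumed for chaining rotation/translation increments.

```python
import numpy as np

def rot_z(theta):
    """Rotation matrix about the z-axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def compose(R_acc, t_acc, R_inc, t_inc):
    """Compose the accumulated pose (R_acc, t_acc) with a
    frame-to-frame increment (R_inc, t_inc) expressed in the
    current robot frame:
        R' = R_acc @ R_inc,   t' = R_acc @ t_inc + t_acc
    """
    return R_acc @ R_inc, R_acc @ t_inc + t_acc

# Integrate a sequence of increments: four steps of "move 1 m
# forward, then turn 90 degrees left" trace a closed unit square.
R, t = np.eye(3), np.zeros(3)
for _ in range(4):
    R, t = compose(R, t, np.eye(3), np.array([1.0, 0.0, 0.0]))  # 1 m forward
    R, t = compose(R, t, rot_z(np.pi / 2), np.zeros(3))          # 90 deg left
# After the closed loop the integrated pose returns to the origin
# with the original orientation (R ~ identity, t ~ zero).
```

In the paper each increment comes from the photometric maximum likelihood estimator rather than being given; the sketch only shows how increments, once estimated, are chained into an absolute pose.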

Acknowledgment

This work was supported by the Vice-Rectorate for Research of the University of Costa Rica. Thanks to Reg Willson from the NASA Jet Propulsion Laboratory for kindly providing the implementation of Tsai's coplanar calibration algorithm used in the experiments.

Author information

Correspondence to Geovanni Martinez.

Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

Cite this paper

Martinez, G. (2024). Real-Time Robot 3D Pose Computation from NIR Imagery and ToF Depth Maps for Space Applications. In: Hernández Ponce, A.M., Marcos Escobar, K., Canales Hernández, L.D., Zea Ortiz, M., Sánchez Alonso, R.E. (eds) Trends and Challenges in Multidisciplinary Research for Global Sustainable Development. ICASAT 2023. Lecture Notes in Networks and Systems, vol 965. Springer, Cham. https://doi.org/10.1007/978-3-031-57620-1_2
