Incremental Learning of Traversability Cost for Aerial Reconnaissance Support to Ground Units

  • Miloš Prágr
  • Petr Čížek
  • Jan Faigl
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11472)


In this paper, we address traversability cost estimation using exteroceptive and proprioceptive data collected by a team of aerial and ground vehicles. The main idea of the proposed approach is to estimate the terrain traversability cost from the real experience of a multi-legged walking robot traversing different terrain types. We propose to combine visual features with the traversability cost measured from the proprioceptive signals of the utilized hexapod walking robot, which acts as the ground unit. The measured traversability cost is associated with visual features extracted from the onboard robot camera, and these features are used to extrapolate the learned traversability model to aerial scans of new environments to assess their traversability cost. The extrapolated traversability cost can be utilized in high-level mission planning to avoid areas that are difficult to traverse but have not been visited by the ground units. The proposed approach has been experimentally verified with a real hexapod walking robot in indoor and outdoor scenarios.
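The described pipeline, learning a mapping from visual terrain features to measured traversability cost and then querying it for unvisited cells seen only in an aerial scan, can be illustrated with a minimal incremental model. The sketch below is a simplified, hypothetical stand-in (a nearest-prototype regressor with running-mean costs), not the incremental Gaussian mixture model actually used by the authors; the class name, the `radius` merge threshold, and the toy two-dimensional feature vectors are all assumptions for illustration.

```python
import math


class IncrementalTraversabilityModel:
    """Toy incremental regressor: visual feature vector -> mean traversability cost.

    Each observed (feature, cost) pair either updates the nearest stored
    prototype (if within `radius`) or spawns a new prototype. Prediction
    returns the cost of the nearest prototype, which mimics extrapolating
    the learned model to cells observed only from the air.
    """

    def __init__(self, radius=0.5):
        self.radius = radius
        self.prototypes = []  # each entry: [feature_vector, mean_cost, sample_count]

    @staticmethod
    def _dist(a, b):
        # Euclidean distance between two feature vectors of equal length
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def _nearest(self, feature):
        best, best_d = None, float("inf")
        for p in self.prototypes:
            d = self._dist(feature, p[0])
            if d < best_d:
                best, best_d = p, d
        return best, best_d

    def update(self, feature, cost):
        """Incorporate one proprioceptive cost measurement from the ground robot."""
        p, d = self._nearest(feature)
        if p is not None and d <= self.radius:
            p[2] += 1
            p[1] += (cost - p[1]) / p[2]  # incremental running mean
        else:
            self.prototypes.append([list(feature), float(cost), 1])

    def predict(self, feature):
        """Extrapolate cost for an unvisited cell (e.g., from an aerial scan)."""
        p, _ = self._nearest(feature)
        return None if p is None else p[1]


model = IncrementalTraversabilityModel(radius=0.3)
model.update([0.1, 0.9], 1.0)    # e.g., flat terrain, low measured cost
model.update([0.15, 0.85], 1.2)  # similar features: merged into the same prototype
model.update([0.9, 0.1], 4.0)    # e.g., rough terrain, high measured cost
print(model.predict([0.12, 0.88]))  # near the low-cost prototype -> 1.1
print(model.predict([0.9, 0.1]))    # near the high-cost prototype -> 4.0
```

In the paper's setting, the feature vectors would come from the onboard camera or the aerial imagery, and the cost labels from the hexapod's proprioceptive signals while traversing the terrain.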



This work has been supported by the Czech Science Foundation (GAČR) under research Project No. 18-18858S.



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Faculty of Electrical Engineering, Czech Technical University in Prague, Prague, Czech Republic
