TNES: terrain traversability mapping, navigation and excavation system for autonomous excavators on worksite

Published in: Autonomous Robots

Abstract

We present a terrain traversability mapping and navigation system (TNS) for autonomous excavator applications in unstructured environments. We use an efficient approach to extract terrain features from RGB images and 3D point clouds and incorporate them into a global map for planning and navigation. Our system adapts to changing environments and updates the terrain information in real time. Moreover, we present a novel dataset, the Complex Worksite Terrain dataset, which consists of RGB images from construction sites annotated with seven categories based on navigability. Our novel algorithms improve the mapping accuracy over previous methods by 4.17–30.48% and reduce the mean squared error (MSE) on the traversability map by 13.8–71.4%. We have combined our mapping approach with planning and control modules in an autonomous excavator navigation system and observe a 49.3% improvement in the overall success rate. Based on TNS, we demonstrate the first autonomous excavator that can navigate through unstructured environments consisting of deep pits, steep hills, rock piles, and other complex terrain features. In addition, we combine the proposed TNS with the autonomous excavation system (AES) and deploy the new pipeline, TNES, on a more complex construction site. With minimal human intervention, we demonstrate autonomous navigation combined with excavation tasks.
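The fusion described above, combining semantic terrain cues from RGB images with geometric cues from 3D point clouds into a global cost map, can be illustrated with a minimal sketch. This is not the authors' implementation: the `traversability_map` helper, the class names, the cost values, and the weighting scheme are all illustrative assumptions about how a per-cell semantic cost and a slope cost might be blended.

```python
import numpy as np

# Hypothetical per-cell semantic costs (0 = fully traversable, 1 = forbidden)
# that an RGB segmentation model might assign to each terrain class.
SEMANTIC_COST = {"flat_ground": 0.0, "gravel": 0.2, "rock_pile": 0.7, "deep_pit": 1.0}

def traversability_map(sem_labels, elevation, cell_size=0.5, w_sem=0.5, w_slope=0.5):
    """Fuse semantic and geometric cues into a per-cell traversability cost.

    sem_labels : 2D array of terrain-class strings (from an RGB segmenter).
    elevation  : 2D array of cell heights in meters (from a point cloud).
    Returns a 2D array of costs in [0, 1]; higher means harder to traverse.
    """
    sem_cost = np.vectorize(SEMANTIC_COST.get)(sem_labels).astype(float)
    # Slope magnitude from finite differences of the elevation grid.
    gy, gx = np.gradient(elevation, cell_size)
    slope = np.hypot(gx, gy)                      # rise over run
    slope_cost = np.clip(slope, 0.0, 1.0)         # saturate at 45 degrees (slope = 1)
    return np.clip(w_sem * sem_cost + w_slope * slope_cost, 0.0, 1.0)

labels = np.array([["flat_ground", "gravel"], ["rock_pile", "deep_pit"]])
heights = np.array([[0.0, 0.1], [0.2, 1.5]])
cost = traversability_map(labels, heights)
```

A planner would then search this grid for minimum-cost paths, re-running the fusion as new sensor data updates the map.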


Notes

  1. Up direction in the real world.

References

  • Ahtiainen, J., Stoyanov, T., & Saarinen, J. (2017). Normal distributions transform traversability maps: Lidar-only approach for traversability mapping in outdoor environments. Journal of Field Robotics, 34(3), 600–621. https://doi.org/10.1002/rob.21657

  • Bellone, M., Messina, A., & Reina, G. (2013). A new approach for terrain analysis in mobile robot applications. In 2013 IEEE international conference on mechatronics (ICM) (pp. 225–230). https://doi.org/10.1109/ICMECH.2013.6518540

  • Bellone, M., Reina, G., Giannoccaro, N., & Spedicato, L. (2014). 3D traversability awareness for rough terrain mobile robots. Sensor Review. https://doi.org/10.1108/SR-03-2013-644

  • Braun, T., Bitsch, H., & Berns, K. (2008). Visual terrain traversability estimation using a combined slope/elevation model. In A. R. Dengel, K. Berns, T. M. Breuel, F. Bomarius, & T. R. Roth-Berghofer (Eds.), KI 2008: Advances in Artificial Intelligence (pp. 177–184). Springer.

  • Chavez-Garcia, R. O., Guzzi, J., Gambardella, L. M., & Giusti, A. (2018). Learning ground traversability from simulations. IEEE Robotics and Automation Letters, 3(3), 1695–1702. https://doi.org/10.1109/LRA.2018.2801794

  • Chilian, A., & Hirschmüller, H. (2009). Stereo camera based navigation of mobile robots on rough terrain. In 2009 IEEE/RSJ international conference on intelligent robots and systems (pp. 4571–4576). https://doi.org/10.1109/IROS.2009.5354535

  • Cortinhal, T., Tzelepis, G., & Aksoy, E. E. (2020). SalsaNext: Fast, uncertainty-aware semantic segmentation of LiDAR point clouds for autonomous driving.

  • Dahlkamp, H., Kaehler, A., Stavens, D., Thrun, S., & Bradski, G. R. (2006). Self-supervised monocular road detection in desert terrain. In Robotics: Science and systems.

  • Deng, F., Zhu, X., & He, C. (2017). Vision-based real-time traversable region detection for mobile robot in the outdoors. Sensors. https://doi.org/10.3390/s17092101

  • Ewen, P., Li, A., Chen, Y., Hong, S., & Vasudevan, R. (2022). These maps are made for walking: Real-time terrain property estimation for mobile robots. IEEE Robotics and Automation Letters, 7(3), 7083–7090.

  • Fankhauser, P., & Hutter, M. (2016). A universal grid map library: Implementation and use case for rough terrain navigation. In Robot operating system (ROS) (pp. 99–120). Springer

  • Frey, J., Hoeller, D., Khattak, S., & Hutter, M. (2022). Locomotion policy guided traversability learning using volumetric representations of complex environments. In 2022 IEEE/RSJ international conference on intelligent robots and systems (IROS), pp. 5722–5729 . IEEE

  • Geiger, A., Lenz, P., Stiller, C., & Urtasun, R. (2013). Vision meets robotics: The KITTI dataset. The International Journal of Robotics Research, 32(11), 1231–1237. https://doi.org/10.1177/0278364913491297

  • Guan, T., He, Z., Song, R., Manocha, D., & Zhang, L. (2021). TNS: Terrain traversability mapping and navigation system for autonomous excavators. In Robotics: Science and Systems XVIII.

  • Guan, T., Song, R., Ye, Z., & Zhang, L. (2023). Vinet: Visual and inertial-based terrain classification and adaptive navigation over unknown terrain. In 2023 IEEE international conference on robotics and automation (ICRA)

  • Guan, T., Kothandaraman, D., Chandra, R., Sathyamoorthy, A. J., Weerakoon, K., & Manocha, D. (2022). GA-Nav: Efficient terrain segmentation for robot navigation in unstructured outdoor environments. IEEE Robotics and Automation Letters, 7(3), 8138–8145. https://doi.org/10.1109/LRA.2022.3187278

  • He, D., Xu, W., Zhang, F. (2021). Embedding manifold structures into Kalman filters. arXiv preprint arXiv:2102.03804

  • Hewitt, R. A., Ellery, A., & de Ruiter, A. (2017). Training a terrain traversability classifier for a planetary rover through simulation. International Journal of Advanced Robotic Systems, 14(5), 1729881417735401. https://doi.org/10.1177/1729881417735401

  • Hirose, N., Sadeghian, A., Vázquez, M., Goebel, P., & Savarese, S. (2018). Gonet: A semi-supervised deep learning approach for traversability estimation. In 2018 IEEE/RSJ international conference on intelligent robots and systems (IROS) (pp. 3044–3051). https://doi.org/10.1109/IROS.2018.8594031

  • Hoffmann, G. M., Tomlin, C. J., Montemerlo, M., & Thrun, S. (2007). Autonomous automobile trajectory tracking for off-road driving: Controller design, experimental validation and racing. In 2007 American control conference (pp. 2296–2301). IEEE

  • Holder, C. J., & Breckon, T. P. (2018). Learning to drive: Using visual odometry to bootstrap deep learning for off-road path prediction. In 2018 IEEE intelligent vehicles symposium (IV) (pp. 2104–2110). https://doi.org/10.1109/IVS.2018.8500526

  • Jiang, P., Osteen, P. R., Wigness, M., & Saripalli, S. (2021). RELLIS-3D dataset: Data, benchmarks and analysis. In 2021 IEEE international conference on robotics and automation (ICRA) (pp. 1110–1116).

  • Julier, S. J., & Uhlmann, J. K. (1997). New extension of the Kalman filter to nonlinear systems. In Signal processing, sensor fusion, and target recognition VI (Vol. 3068, pp. 182–193). Spie

  • Kahn, G., Abbeel, P., & Levine, S. (2021). Badgr: An autonomous self-supervised learning-based navigation system. IEEE Robotics and Automation Letters, 6(2), 1312–1319.

  • Khan, M. M., Ali, H., Berns, K., & Muhammad, A. (2016). Road traversability analysis using network properties of roadmaps. In 2016 IEEE/RSJ international conference on intelligent robots and systems (IROS) (pp. 2960–2965). https://doi.org/10.1109/IROS.2016.7759458

  • Khan, M., Berns, K., & Muhammad, A. (2020). Vehicle specific robust traversability indices using roadmaps on 3d pointclouds. International Journal of Intelligent Robotics and Applications, 4, 1–17. https://doi.org/10.1007/s41315-020-00148-x

  • Kim, S.-K., & Russell, J. (2003). Framework for an intelligent earthwork system: Part I. System architecture. Automation in Construction, 12, 1–13.

  • Kingry, N., Jung, M., Derse, E., Dai, R. (2018). Vision-based terrain classification and solar irradiance mapping for solar-powered robotics. In 2018 IEEE/RSJ international conference on intelligent robots and systems (IROS) (pp. 5834–5840). https://doi.org/10.1109/IROS.2018.8593635

  • Kumar, A., Fu, Z., Pathak, D., & Malik, J. (2021). RMA: Rapid motor adaptation for legged robots.

  • Kurzer, K. (2016). Path planning in unstructured environments: A real-time hybrid A* implementation for fast and deterministic path generation for the kth research concept vehicle. Master’s thesis

  • Lin, J., Zheng, C., Xu, W., & Zhang, F. (2021). R2LIVE: A robust, real-time, lidar-inertial-visual tightly-coupled state estimator and mapping. IEEE Robotics and Automation Letters, 6(4), 7469–7476.

  • Manduchi, R., Castano, A., Talukder, A., & Matthies, L. (2005). Obstacle detection and terrain classification for autonomous off-road navigation. Autonomous Robots, 18, 81–102. https://doi.org/10.1023/B:AURO.0000047286.62481.1d

  • Matsuzaki, S., Yamazaki, K., Hara, Y., & Tsubouchi, T. (2018). Traversable region estimation for mobile robots in an outdoor image. Journal of Intelligent & Robotic Systems, 92(3–4), 453–463. https://doi.org/10.1007/s10846-017-0760-x

  • Maturana, D., Chou, P.-W., Uenoyama, M., & Scherer, S. (2018). Real-time semantic mapping for autonomous off-road navigation. In M. Hutter & R. Siegwart (Eds.), Field and service robotics (pp. 335–350). Springer.

  • Nath, N. D., & Behzadan, A. H. (2020). Deep convolutional networks for construction object detection under different visual conditions. Frontiers in Built Environment, 6, 97. https://doi.org/10.3389/fbuil.2020.00097

  • Papadakis, P. (2013). Terrain traversability analysis methods for unmanned ground vehicles: A survey. Engineering Applications of Artificial Intelligence, 26(4), 1373–1385. https://doi.org/10.1016/j.engappai.2013.01.006

  • Paz, D., Zhang, H., Li, Q., Xiang, H., & Christensen, H. I. (2020). Probabilistic semantic mapping for urban autonomous driving applications. In 2020 IEEE/RSJ international conference on intelligent robots and systems (IROS) (pp. 2059–2064). https://doi.org/10.1109/IROS45743.2020.9341738

  • Poudel, R. P. K., Liwicki, S., & Cipolla, R. (2019). Fast-SCNN: Fast semantic segmentation network. In BMVC.

  • Procopio, M. J., Mulligan, J., & Grudic, G. (2009). Learning terrain segmentation with classifier ensembles for autonomous robot navigation in unstructured environments. Journal of Field Robotics, 26(2), 145–175. https://doi.org/10.1002/rob.20279

  • Shamshiri, R., Weltzien, C., Hameed, I. A., Yule, I. J., Grift, T. E., Balasundram, S. K., Pitonakova, L., Ahmad, D., & Chowdhary, G. (2018). Research and development in agricultural robotics: A perspective of digital farming.

  • Ranftl, R., Bochkovskiy, A., Koltun, V. (2021). Vision transformers for dense prediction. In ICCV

  • Roberts, D., & Golparvar-Fard, M. (2019). End-to-end vision-based detection, tracking and activity analysis of earthmoving equipment filmed at ground level. Automation in Construction. https://doi.org/10.1016/j.autcon.2019.04.006

  • Rosenfeld, R. D., Restrepo, M. G., Gerard, W. H., Bruce, W. E., Branch, A. A., Lewin, G. C., & Bezzo, N. (2018). Unsupervised surface classification to enhance the control performance of a UGV. In 2018 systems and information engineering design symposium (SIEDS) (pp. 225–230). https://doi.org/10.1109/SIEDS.2018.8374741

  • Rothrock, B., Kennedy, R., Cunningham, C. T., Papon, J., Heverly, M., & Ono, M. (2016). SPOC: Deep learning-based terrain classification for Mars rover missions.

  • Schilling, F., Chen, X., Folkesson, J., & Jensfelt, P. (2017). Geometric and visual terrain classification for autonomous mobile navigation. In 2017 IEEE/RSJ international conference on intelligent robots and systems (IROS) (pp. 2678–2684). https://doi.org/10.1109/IROS.2017.8206092

  • Seo, J., Lee, S., Kim, J., & Kim, S.-K. (2011). Task planner design for an automated excavation system. Automation in Construction, 20(7), 954–966. https://doi.org/10.1016/j.autcon.2011.03.013

  • Shan, T., Englot, B., Meyers, D., Wang, W., Ratti, C., & Rus, D. (2020). LIO-SAM: Tightly-coupled lidar inertial odometry via smoothing and mapping. In 2020 IEEE/RSJ international conference on intelligent robots and systems (IROS) (pp. 5135–5142). IEEE

  • Shariati, H., Yeraliyev, A., Terai, B., Tafazoli, S., & Ramezani, M. (2019). Towards autonomous mining via intelligent excavators. In CVPR Workshops.

  • Singh, A., Singh, K., & Sujit, P. B. (2021). OffRoadTranSeg: Semi-supervised segmentation using transformers on OffRoad environments.

  • Sock, J., Kim, J., Min, J., & Kwak, K. (2016). Probabilistic traversability map generation using 3d-lidar and camera. In 2016 IEEE international conference on robotics and automation (ICRA) (pp. 5631–5637). https://doi.org/10.1109/ICRA.2016.7487782

  • Sun, P., Kretzschmar, H., Dotiwalla, X., Chouard, A., Patnaik, V., Tsui, P., Guo, J., Zhou, Y., Chai, Y., Caine, B. & Vasudevan, V. (2020). Scalability in perception for autonomous driving: Waymo open dataset. In Proceedings of the IEEE/CVF conference on computer vision and pattern recognition (pp. 2446–2454).

  • Suryamurthy, V., Raghavan, V. S., Laurenzi, A., Tsagarakis, N. G., & Kanoulas, D. (2019). Terrain segmentation and roughness estimation using RGB data: Path planning application on the Centauro robot. In 2019 IEEE-RAS 19th international conference on humanoid robots (Humanoids) (pp. 1–8). https://doi.org/10.1109/Humanoids43949.2019.9035009

  • Thomas, H., Qi, C. R., Deschaud, J.-E., Marcotegui, B., Goulette, F., Guibas, L. J. (2019). KPConv: Flexible and deformable convolution for point clouds. In Proceedings of the IEEE international conference on computer vision.

  • Viswanath, K., Singh, K., Jiang, P., Sujit, P. B., & Saripalli, S. (2021). Offseg: A semantic segmentation framework for off-road driving. In 2021 IEEE 17th international conference on automation science and engineering (CASE) (pp. 354–359). https://doi.org/10.1109/CASE49439.2021.9551643

  • Wermelinger, M., Fankhauser, P., Diethelm, R., Krüsi, P., Siegwart, R., & Hutter, M. (2016). Navigation planning for legged robots in challenging terrain. In 2016 IEEE/RSJ international conference on intelligent robots and systems (IROS) (pp. 1184–1189). https://doi.org/10.1109/IROS.2016.7759199

  • Wigness, M., Eum, S., Rogers, J. G., Han, D., & Kwon, H. (2019). A RUGD dataset for autonomous navigation and visual perception in unstructured outdoor environments. In International conference on intelligent robots and systems (IROS).

  • Wu, H., Zhang, J., Huang, K., Liang, K., & Yu, Y. (2019). FastFCN: Rethinking dilated convolution in the backbone for semantic segmentation.

  • Wu, T., Tang, S., Zhang, R., & Zhang, Y. (2021). Cgnet: A light-weight context guided network for semantic segmentation. IEEE Transactions on Image Processing, 30, 1169–1179.

  • Xie, E., Wang, W., Yu, Z., Anandkumar, A., Alvarez, J. M., & Luo, P. (2021). Segformer: Simple and efficient design for semantic segmentation with transformers. In Thirty-Fifth conference on neural information processing systems. https://openreview.net/forum?id=OG18MI5TRL

  • Xu, W., Cai, Y., He, D., Lin, J., & Zhang, F. (2022). Fast-lio2: Fast direct lidar-inertial odometry. IEEE Transactions on Robotics, 38(4), 2053–2073.

  • Xue, J., Zhang, H., Dana, K., & Nishino, K. (2017). Differential angular imaging for material recognition. In 2017 IEEE conference on computer vision and pattern recognition (CVPR) (pp. 6940–6949).

  • Yu, C., Gao, C., Wang, J., Yu, G., Shen, C., & Sang, N. (2021). Bisenet v2: Bilateral network with guided aggregation for real-time semantic segmentation. International Journal of Computer Vision, 129, 1–18. https://doi.org/10.1007/s11263-021-01515-2

  • Zhang, L., Zhao, J., Long, P., Wang, L., Qian, L., Lu, F., Song, X., & Manocha, D. (2021). An autonomous excavator system for material loading tasks. Science Robotics. https://doi.org/10.1126/scirobotics.abc3164

  • Zhao, Y., Liu, P., Xue, W., Miao, R., Gong, Z., & Ying, R. (2019). Semantic probabilistic traversable map generation for robot path planning. In 2019 IEEE international conference on robotics and biomimetics (ROBIO) (pp. 2576–2582). https://doi.org/10.1109/ROBIO49542.2019.8961533

  • Zheng, S., Lu, J., Zhao, H., Zhu, X., Luo, Z., Wang, Y., Fu, Y., Feng, J., Xiang, T., Torr, P. H. S., & Zhang, L. (2021). Rethinking semantic segmentation from a sequence-to-sequence perspective with transformers. In CVPR

  • Zhou, Y., Huang, Y., & Xiong, Z. (2021). 3D traversability map generation for mobile robots based on point cloud. In 2021 IEEE/ASME international conference on advanced intelligent mechatronics (AIM) (pp. 836–841). https://doi.org/10.1109/AIM46487.2021.9517463

Acknowledgements

This work was carried out during Tianrui's summer internship at Baidu RAL. We appreciate the discussion and support from the Baidu RAL team.

Funding

No funds, grants, or other support was received.

Author information

Authors and Affiliations

Authors

Contributions

TG is the lead author of this paper; the work was conducted during his internship, and he contributed to every aspect of it. ZH helped with the hardware setup and obtained the demo on the worksite. RS helped with the planning and controller section. LZ is the corresponding author and the manager of this project and of the RAL team. All authors contributed to each section and assisted the other authors. All authors wrote and carefully proofread the main manuscript text.

Corresponding author

Correspondence to Liangjun Zhang.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Human and animal rights

No human participants or animals were involved in this work.

Informed consent

Informed consent was given by all authors of this manuscript.

RSS special issue

This paper is an extension of the RSS 2022 paper TNS (Guan et al., 2021) and was invited for submission to Autonomous Robots.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (pdf 4661 KB)

Supplementary file 2 (mp4 878037 KB)

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Guan, T., He, Z., Song, R. et al. TNES: terrain traversability mapping, navigation and excavation system for autonomous excavators on worksite. Auton Robot 47, 695–714 (2023). https://doi.org/10.1007/s10514-023-10113-9


Keywords

Navigation