Autonomous Robots, Volume 43, Issue 8, pp 2131–2161

Localization uncertainty-aware autonomous exploration and mapping with aerial robots using receding horizon path-planning

  • Christos Papachristos
  • Frank Mascarich
  • Shehryar Khattak
  • Tung Dang
  • Kostas Alexis

Abstract

This work presents an uncertainty-aware path-planning strategy that achieves autonomous aerial robotic exploration of unknown environments while ensuring mapping consistency on the go. The planner follows a paradigm of hierarchically optimized objectives, executed in a receding horizon fashion. First, a random tree grown over the known feasible configurations is used to derive a maximal-exploration path, and its first viewpoint is selected as the next waypoint. Subsequently, an uncertainty-optimization step takes place: within a local volume, a second tree of admissible alternative trajectories that all arrive at this reference viewpoint is constructed. The belief over the robot state and the tracked landmarks in the environment is propagated along the branches of this tree, and the path that minimizes the expected localization and mapping uncertainty is selected. The robot follows this path, and the entire process is repeated iteratively. The algorithm’s computational complexity is analyzed, and experimental results are used to evaluate its real-time execution efficiency onboard a micro aerial vehicle. The architecture of the complete pipeline is detailed, and an open-source implementation is provided. The complete aerial robot synthesis that enables high-fidelity autonomous reconstruction supported by the proposed planner is also elaborated. Comprehensive experimental evaluation studies are presented, including mockup environments under ambient illumination and in challenging conditions such as clutter and darkness, as well as a field deployment in a degraded visual environment, namely a railroad tunnel, with all data made openly available.
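
For intuition on the second, uncertainty-optimization step, the following self-contained Python sketch propagates a simplified two-dimensional Gaussian belief along candidate paths and selects the one whose terminal covariance has the smallest determinant (a D-optimality criterion). All models and parameters here (the additive motion noise Q, measurement noise R, sensor range, and identity observation Jacobian) are illustrative assumptions and do not reproduce the authors' onboard estimation pipeline or the open-source planner.

import numpy as np

def propagate_belief(P0, path, landmarks, Q=0.01, R=0.05, sensor_range=3.0):
    # EKF-style covariance propagation along a 2D path (list of (x, y) waypoints):
    # each motion step inflates the covariance, each visible landmark update shrinks it.
    P = P0.copy()
    for x, y in path:
        P = P + Q * np.eye(2)                      # prediction: additive motion noise
        for lx, ly in landmarks:
            if np.hypot(lx - x, ly - y) < sensor_range:
                H = np.eye(2)                      # simplified observation Jacobian
                S = H @ P @ H.T + R * np.eye(2)    # innovation covariance
                K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
                P = (np.eye(2) - K @ H) @ P        # posterior covariance
    return P

def select_min_uncertainty_path(P0, candidate_paths, landmarks):
    # Choose the branch whose terminal covariance has minimal determinant (D-optimality).
    return min(candidate_paths,
               key=lambda path: np.linalg.det(propagate_belief(P0, path, landmarks)))

if __name__ == "__main__":
    P0 = 0.1 * np.eye(2)
    landmarks = [(1.0, 2.0), (3.0, 1.0)]
    candidates = [
        [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)],   # direct route
        [(0.0, 0.0), (1.0, 1.0), (2.0, 1.5), (3.0, 0.0)],   # detour passing closer to landmarks
    ]
    print("selected path:", select_min_uncertainty_path(P0, candidates, landmarks))

In the planner described above, the belief instead comprises the full robot state together with the tracked landmarks, and propagation is performed over the branches of the local tree of admissible trajectories that all terminate at the selected exploration viewpoint.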

Keywords

Uncertainty-aware planning · Belief propagation · Exploration · Belief-space planning · Path-planning · Aerial robots · Degraded visual environments

Notes

Supplementary material

Supplementary material 1 (MP4, 372,036 KB)

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. University of Nevada, Reno, USA
