
Heterogeneous Sensor Fusion via Confidence-Rich 3D Grid Mapping: Application to Physical Robots

  • Conference paper

In: Proceedings of the 2018 International Symposium on Experimental Robotics (ISER 2018)

Part of the book series: Springer Proceedings in Advanced Robotics (SPAR, volume 11)


Abstract

Autonomous navigation of intelligent physical systems largely depends on the system's ability to generate an accurate map of its environment. The confidence-rich grid mapping algorithm provides a novel map representation based on range data by storing richer information at each voxel, including an estimate of the variance of occupancy. Every sensor has its own capabilities and limitations, so a single sensor may not be effective in providing a detailed assessment of dynamic terrains. Incorporating multiple sensory modalities in a robot and fusing their measurements leads to higher certainty, reduced noise, and improved failure tolerance when mapping in real-world scenarios. In this work, we investigate and evaluate sensor fusion techniques using confidence-rich grid mapping through a series of experiments on physical robotic systems with measurements from heterogeneous ranging sensors.

E. Heiden and D. Pastor—Equal contribution.
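
To make the per-voxel representation and the fusion idea from the abstract concrete, here is a minimal, illustrative Python sketch. It is not the confidence-rich mapping algorithm described in the paper: the ConfidenceGrid class, the inverse-variance (Kalman-style) update rule, and all numeric values are assumptions chosen only to show how an occupancy estimate and its variance could be stored per voxel and updated from sensors with different noise levels.

```python
import numpy as np

# Illustrative sketch only (not the paper's algorithm): a 3D grid where each
# voxel keeps an occupancy estimate and its variance, and range-derived
# occupancy measurements from heterogeneous sensors are fused with a simple
# inverse-variance (Kalman-style) weighting rule.

class ConfidenceGrid:
    def __init__(self, shape, prior_occ=0.5, prior_var=0.25):
        # Per-voxel occupancy mean and variance, initialized to an uninformative prior.
        self.occ = np.full(shape, prior_occ)
        self.var = np.full(shape, prior_var)

    def fuse(self, voxel, meas_occ, meas_var):
        """Fuse one sensor-derived occupancy estimate into a single voxel.

        meas_occ / meas_var would come from a sensor-specific noise model
        (e.g. a depth camera is noisier at long range than a lidar).
        """
        m, v = self.occ[voxel], self.var[voxel]
        k = v / (v + meas_var)               # more weight to the less noisy source
        self.occ[voxel] = m + k * (meas_occ - m)
        self.var[voxel] = (1.0 - k) * v      # variance shrinks as evidence accumulates

# Example: two sensors with different noise levels observe the same voxel.
grid = ConfidenceGrid((64, 64, 32))
grid.fuse((10, 12, 3), meas_occ=0.9, meas_var=0.10)   # e.g. a lidar return
grid.fuse((10, 12, 3), meas_occ=0.7, meas_var=0.40)   # e.g. a noisier depth camera
print(grid.occ[10, 12, 3], grid.var[10, 12, 3])
```

The point of the sketch is the data structure: each voxel carries both a belief and a confidence in that belief, so measurements from a noisy sensor move the estimate less than measurements from an accurate one.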


Notes

  1. http://cast.caltech.edu.
  2. http://optitrack.com.
  3. https://software.intel.com/en-us/aero/drone-kit.


Acknowledgements

This research was carried out at the Jet Propulsion Laboratory under a contract with the National Aeronautics and Space Administration. U.S. Government sponsorship acknowledged.

Author information


Corresponding author

Correspondence to Eric Heiden.


Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (mp4 14787 KB)


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Heiden, E., Pastor, D., Vyshnav, P., Agha-Mohammadi, AA. (2020). Heterogeneous Sensor Fusion via Confidence-Rich 3D Grid Mapping: Application to Physical Robots. In: Xiao, J., Kröger, T., Khatib, O. (eds) Proceedings of the 2018 International Symposium on Experimental Robotics. ISER 2018. Springer Proceedings in Advanced Robotics, vol 11. Springer, Cham. https://doi.org/10.1007/978-3-030-33950-0_62
