Abstract
Simultaneous Localization and Mapping (SLAM) is a fundamental problem in robotics. Over the past three decades, researchers have made significant progress on the probabilistic SLAM problem, contributing theoretical frameworks, efficient solvers, and complete systems. As autonomous robots (e.g., self-driving cars, legged robots) continue to mature, SLAM systems have become increasingly popular in large-scale real-world applications. The evolution of SLAM has also often been propelled by the emergence of new sensors and sensor combinations. This chapter introduces the sensors commonly used on mobile robots, followed by a review of several classic SLAM systems from a modern perspective. It then presents a real-world case study on constructing a multi-sensor system and a challenging SLAM dataset, offering a practical tutorial for researchers developing their own platforms. Overall, this chapter aims to provide readers with a comprehensive guide to sensor fusion, from theory to practice.
This work was supported by the Guangdong Basic and Applied Basic Research Foundation under Project 2021B1515120032, the Foshan-HKUST Project no. FSUST20-SHCIRI06C, and the Project of Hetao Shenzhen-Hong Kong Science and Technology Innovation Cooperation Zone (HZQB-KCZYB-2020083), awarded to Prof. Ming Liu.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this chapter
Cite this chapter
Jiao, J. et al. (2023). Enabling Robust SLAM for Mobile Robots with Sensor Fusion. In: Fan, R., Guo, S., Bocus, M.J. (eds) Autonomous Driving Perception. Advances in Computer Vision and Pattern Recognition. Springer, Singapore. https://doi.org/10.1007/978-981-99-4287-9_7
DOI: https://doi.org/10.1007/978-981-99-4287-9_7
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-4286-2
Online ISBN: 978-981-99-4287-9
eBook Packages: Computer Science (R0)