
An efficient loop closure detection method based on spatially constrained feature matching

  • Original Research Paper
  • Published in Intelligent Service Robotics

Abstract

Loop closure detection is an essential component of simultaneous localization and mapping (SLAM) systems, eliminating the pose drift that robots accumulate during long-term movement. To address the three main challenges of appearance-based methods, namely viewpoint changes, repeated textures, and high computational cost, this paper proposes an unsupervised loop closure detection method that takes into account both the texture pattern and the position of feature points and avoids any pre-training step. Because the relative rotation and translation between the two frames forming a loop closure are both very small, the proposed method constrains the matching range with an overlapped-block strategy, which not only improves matching precision but also reduces matching cost. Furthermore, the method introduces Gaussian functions to weight and fuse the matching score of each block. The proposed method is evaluated in detail on two public datasets covering various scenarios, and the results show that it performs more accurately and efficiently than existing state-of-the-art methods.
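The Gaussian-weighted fusion of per-block matching scores described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the block layout, the choice of centring the Gaussian weight on the image centre, and the `sigma` parameter are all assumptions, since the abstract does not specify the exact weighting scheme.

```python
import numpy as np

def gaussian_weight(block_center, image_center, sigma):
    """2-D Gaussian weight based on a block's distance from the image centre.

    Note: centring on the image centre is an illustrative assumption; the
    paper only states that Gaussian functions weight each block's score.
    """
    d2 = np.sum((np.asarray(block_center, float) - np.asarray(image_center, float)) ** 2)
    return float(np.exp(-d2 / (2.0 * sigma ** 2)))

def fused_similarity(block_scores, block_centers, image_shape, sigma=100.0):
    """Fuse per-block matching scores into one image-pair similarity value.

    block_scores  : per-block match scores (e.g. inlier ratios in [0, 1])
    block_centers : (x, y) centre of each overlapped block
    image_shape   : (height, width) of the image
    """
    image_center = (image_shape[1] / 2.0, image_shape[0] / 2.0)
    weights = np.array([gaussian_weight(c, image_center, sigma) for c in block_centers])
    scores = np.asarray(block_scores, dtype=float)
    # Normalised weighted sum: blocks nearer the Gaussian centre count more.
    return float(np.sum(weights * scores) / np.sum(weights))
```

With uniform block scores the fused value reduces to that score, which is a quick sanity check that the normalisation is correct.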


Data availability

The data that support the findings of this study are available from the corresponding author upon reasonable request.


Funding

This work is supported by the National Key Research and Development Program of China (2018YFB1307402).

Author information

Authors and Affiliations

Authors

Contributions

YZ contributed to the conception of the study and to manuscript preparation. HZ contributed significantly to the analysis and provided most of the experimental data; HZ and TZ performed the data analyses and wrote the manuscript. YY and HY provided some comparative experimental results, and SD helped revise the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Yuzhong Zhong.

Ethics declarations

Conflict of interest

The authors declare that they have no commercial or associative interest that represents a conflict of interest in connection with the submitted work.

Ethics approval

Not applicable.

Consent to participate

Not applicable.

Consent for publication

Not applicable.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Zhang, H., Zhao, T., Zhong, Y. et al. An efficient loop closure detection method based on spatially constrained feature matching. Intel Serv Robotics 15, 363–379 (2022). https://doi.org/10.1007/s11370-022-00423-9


Keywords

Navigation