Abstract
This work proposes an extrinsic calibration approach for aligning a monocular camera with a prism-spinning solid-state LiDAR. Solid-state LiDARs lack the adjacent laser rings that conventional line- and plane-feature detection relies on. Additionally, a distinct type of outlier, which we call ‘vacant points’, complicates feature extraction, particularly for methods that rely on depth variation. In contrast to existing methods, which leverage reflectivity variation in depth-continuous measurements to circumvent this issue, we use depth-discontinuous measurements and retain more valid features by efficiently removing the vacant points. The detected 3D corners thus carry more robust a priori information than usual, which, together with the 2D corners detected by the camera and constrained by our proposed rules, produces accurate extrinsic estimates. We evaluate the algorithm thoroughly in real-world field experiments, with both qualitative and quantitative assessments; the results show its superiority over existing algorithms. Robustness tests further demonstrate its resilience, particularly in feature-barren outdoor environments. The code is available on GitHub.
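To make the calibration objective concrete: once 3D corners from the LiDAR are matched to 2D corners from the camera, the extrinsics (R, t) are typically judged by how closely the projected 3D corners land on their 2D counterparts. The sketch below is an illustrative assumption using a standard pinhole model with numpy, not the paper's actual implementation; the function names and the intrinsic matrix K are hypothetical.

```python
import numpy as np

def project(points_3d, K, R, t):
    """Project Nx3 LiDAR-frame points into the image using extrinsics (R, t)
    and the camera intrinsic matrix K (standard pinhole model)."""
    cam = points_3d @ R.T + t        # LiDAR frame -> camera frame
    uv = cam @ K.T                   # apply intrinsics
    return uv[:, :2] / uv[:, 2:3]    # perspective divide -> pixel coordinates

def reprojection_error(points_3d, corners_2d, K, R, t):
    """Mean Euclidean distance (pixels) between projected 3D corners
    and their matched detected 2D corners."""
    proj = project(points_3d, K, R, t)
    return float(np.mean(np.linalg.norm(proj - corners_2d, axis=1)))
```

In a calibration pipeline, this error would be minimized over (R, t); with perfectly consistent correspondences and the true extrinsics, the error is zero.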
Data Availability
Data sharing is not applicable to this article, as no datasets were generated or analyzed during the current study. Our code is available on GitHub: https://github.com/GAfieldCN/automatic-camera-pointcloud-calibration.
Acknowledgements
This work was supported by the National Key R&D Program of China (2022YFB3904401).
Funding
This work was supported by the National Key R&D Program of China (2022YFB3904401), and by Projects 62173227 and 62103274 of the National Natural Science Foundation of China (NSFC).
Ethics declarations
Ethics Approval
Not applicable.
Consent to Participate
Not applicable.
Consent for Publication
Not applicable.
Conflict of Interest
The authors have no relevant financial or non-financial interests to disclose.
About this article
Cite this article
Liu, J., Zhan, X., Chi, C. et al. Robust Extrinsic Self-Calibration of Camera and Solid State LiDAR. J Intell Robot Syst 109, 81 (2023). https://doi.org/10.1007/s10846-023-02015-w