Depth error correction for projector-camera based consumer depth cameras
This paper proposes a depth measurement error model for consumer depth cameras such as the Microsoft Kinect, together with a corresponding calibration method. These devices were originally designed as video game interfaces, and their output depth maps usually lack sufficient accuracy for 3D measurement. Models have been proposed to reduce these depth errors, but they consider only camera-related causes. Since the depth sensors are based on projector-camera systems, projector-related causes should also be considered. Moreover, previous models require disparity observations, which such sensors usually do not output, so those models cannot be employed in practice. We give an alternative error model for projector-camera based consumer depth cameras, derived from their depth measurement algorithm and the intrinsic parameters of the camera and the projector; it does not need disparity values. We also give a corresponding new parameter estimation method which requires only observation of a planar board. Our calibrated error model allows a consumer depth sensor to be used as a 3D measuring device. Experimental results show the validity and effectiveness of the error model and calibration procedure.
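To make the projector-camera setting concrete, the sketch below shows the standard triangulation relation such sensors rely on (depth is inversely proportional to disparity), and a hypothetical correction applied in inverse-depth space, where disparity-based errors are linear. The function names, the correction form, and the parameters `a` and `b` (obtained, e.g., from planar-board observations) are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

def disparity_to_depth(d, baseline, focal):
    """Triangulation in a projector-camera system: Z = b * f / d.

    d        : disparity in pixels
    baseline : projector-camera baseline in metres
    focal    : focal length in pixels
    """
    return baseline * focal / d

def correct_depth(z_raw, a, b):
    """Hypothetical correction in inverse-depth (disparity-like) space:
    1/z_corr = a * (1/z_raw) + b, with a, b fitted during calibration.
    (Illustrative only; the paper's model also accounts for projector
    distortion, which is omitted here.)
    """
    z_raw = np.asarray(z_raw, dtype=float)
    return 1.0 / (a / z_raw + b)

# Example: a 58-pixel disparity with a 7.5 cm baseline and 580 px focal length
z = disparity_to_depth(58.0, 0.075, 580.0)   # 0.75 m
z_corr = correct_depth(z, 1.0, 0.0)          # identity correction leaves z unchanged
```

Working in inverse depth is natural here because raw sensor output is quantized in disparity, so a linear fit in 1/Z captures the dominant error trend more directly than a fit in Z.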
Keywords: consumer depth camera; intrinsic calibration; projector distortion
This work was supported by the JST CREST “Behavior Understanding based on Intention-Gait Model” project.
Open Access The articles published in this journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.