
Machine Vision and Applications, Volume 27, Issue 7, pp 1005–1020

An overview of depth cameras and range scanners based on time-of-flight technologies

  • Radu Horaud
  • Miles Hansard
  • Georgios Evangelidis
  • Clément Ménier
Original Paper

Abstract

Time-of-flight (TOF) cameras are sensors that can measure the depths of scene points, by illuminating the scene with a controlled laser or LED source and then analyzing the reflected light. In this paper, we first describe the underlying measurement principles of time-of-flight cameras, including (1) pulsed-light cameras, which directly measure the time taken for a light pulse to travel from the device to the object and back again, and (2) continuous-wave-modulated light cameras, which measure the phase difference between the emitted and received signals and hence obtain the travel time indirectly. We review the main existing designs, including prototypes as well as commercially available devices. We also review the relevant camera calibration principles and how they are applied to TOF devices. Finally, we discuss the benefits and challenges of combined TOF and color camera systems.
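As a rough illustration of the two measurement principles summarized above (a sketch of the standard TOF depth relations, not code taken from the paper), the snippet below converts a measured round-trip time (pulsed-light case) and a measured phase shift (continuous-wave case) into depth; the 20 MHz modulation frequency used in the example is an arbitrary assumed value.

```python
# Illustrative sketch of the two TOF depth relations; not from the paper.
# The 20 MHz modulation frequency is an arbitrary example value.

import math

C = 299_792_458.0  # speed of light (m/s)

def depth_from_pulse(round_trip_time_s: float) -> float:
    """Pulsed-light TOF: the pulse travels to the object and back,
    so depth is half the round-trip distance."""
    return C * round_trip_time_s / 2.0

def depth_from_phase(phase_rad: float, mod_freq_hz: float) -> float:
    """Continuous-wave TOF: the phase shift between emitted and received
    signals encodes the travel time, modulo one modulation period."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

def unambiguous_range(mod_freq_hz: float) -> float:
    """Depths beyond c / (2 * f_mod) wrap around (phase ambiguity)."""
    return C / (2.0 * mod_freq_hz)

if __name__ == "__main__":
    print(depth_from_pulse(20e-9))          # ~3.0 m for a 20 ns round trip
    print(depth_from_phase(math.pi, 20e6))  # ~3.75 m at 20 MHz, phase = pi
    print(unambiguous_range(20e6))          # ~7.5 m non-ambiguous range
```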

Keywords

LIDAR · Range scanners · Single-photon avalanche diode · Time-of-flight cameras · 3D computer vision · Active light sensors


Copyright information

© Springer-Verlag Berlin Heidelberg 2016

Authors and Affiliations

  • Radu Horaud (1)
  • Miles Hansard (2)
  • Georgios Evangelidis (1)
  • Clément Ménier (3)

  1. INRIA Grenoble Rhône-Alpes, Montbonnot Saint-Martin, France
  2. School of Electronic Engineering and Computer Science, Queen Mary University of London, London, UK
  3. 4D View Solutions, Grenoble, France