Technical Foundation and Calibration Methods for Time-of-Flight Cameras

Part of the Lecture Notes in Computer Science book series (LNCS, volume 8200)


Current Time-of-Flight approaches mainly employ a continuous-wave intensity modulation approach. Phase reconstruction is performed using multiple phase images with different phase shifts, which is equivalent to sampling the inherent correlation function at different locations. This active imaging approach introduces a very specific set of influences, on the signal processing side as well as on the optical side, all of which affect the resulting depth quality. Applying ToF information in real applications therefore requires tackling these effects with specific calibration approaches. This survey gives an overview of the current state of the art in ToF sensor calibration.
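As a minimal sketch of the sampling idea described above: in the common four-bucket scheme, the correlation function is sampled at phase offsets of 0°, 90°, 180°, and 270°, the modulation phase is recovered with a quadrant-aware arctangent, and depth follows from the round-trip travel of light. The function name and sample convention below are illustrative assumptions, not taken from the survey itself.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def depth_from_phase_images(a0, a1, a2, a3, f_mod):
    """Recover depth from four correlation samples taken at phase
    offsets 0, 90, 180, 270 degrees (standard four-bucket AMCW scheme).

    Assumes samples of the form a_k = B + A*cos(phi + k*pi/2); this
    sign convention is an assumption for illustration.
    """
    # Quadrant-aware phase of the sampled correlation function
    phi = math.atan2(a3 - a1, a0 - a2)
    if phi < 0:
        phi += 2 * math.pi  # map phase into [0, 2*pi)
    # Light travels to the scene and back, hence the factor of 2
    # in the range: d = c * phi / (4 * pi * f_mod)
    return C * phi / (4 * math.pi * f_mod)
```

Note that the unambiguous range of such a sensor is c / (2 f_mod), e.g. about 7.5 m at a 20 MHz modulation frequency; phases beyond 2π wrap around, which is one of the systematic effects calibration must account for.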


Keywords: Range Imaging, Depth Camera, Multiple Return, Calibration Pattern, Technical Foundation





Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  1. Computer Graphics Group, University of Siegen, Germany
  2. Heidelberg Collaboratory for Image Processing, University of Heidelberg, Germany
  3. Intel Visual Computing Institute, Saarland University, Germany
  4. University of Waikato, New Zealand
  5. Multimedia Information Processing, University of Kiel, Germany
