
Disambiguation of Time-of-Flight Data

  • Miles Hansard
  • Seungkyu Lee
  • Ouk Choi
  • Radu Horaud
Chapter
Part of the SpringerBriefs in Computer Science book series (BRIEFSCOMPUTER)

Abstract

The maximum range of a time-of-flight camera is limited by the periodicity of the measured signal. Beyond a certain range, which is determined by the signal frequency, the measurements are confounded by phase wrapping. This effect is demonstrated in real examples. Several phase-unwrapping methods, which can be used to extend the range of time-of-flight cameras, are discussed. Simple methods can be based on the measured amplitude of the reflected signal, which is itself related to the depth of objects in the scene. More sophisticated unwrapping methods are based on zero-curl constraints, which enforce spatial consistency on the phase measurements. Alternatively, if more than one depth camera is used, then the data can be unwrapped by enforcing consistency among different views of the same scene point. The relative merits and shortcomings of these methods are evaluated, and the prospects for hardware-based approaches, involving frequency modulation, are discussed.
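The range ambiguity described above can be sketched numerically. The following is an illustrative example, not taken from the chapter: for a modulation frequency f, the unambiguous range is c/(2f), so any true depth aliases back into the interval [0, c/(2f)), and a single wrapped measurement is consistent with a whole family of candidate depths. The frequency and depths below are hypothetical values chosen for illustration.

```python
# Illustrative sketch of time-of-flight phase wrapping (not from the chapter).

C = 299_792_458.0  # speed of light, m/s


def unambiguous_range(freq_hz: float) -> float:
    """Maximum depth measurable without phase wrapping: c / (2 f)."""
    return C / (2.0 * freq_hz)


def wrapped_depth(true_depth_m: float, freq_hz: float) -> float:
    """Depth the camera reports after phase wrapping (modulo the range)."""
    return true_depth_m % unambiguous_range(freq_hz)


def candidate_depths(measured_m: float, freq_hz: float, max_depth_m: float):
    """All depths consistent with a wrapped measurement, up to max_depth_m."""
    r = unambiguous_range(freq_hz)
    depth, candidates = measured_m, []
    while depth <= max_depth_m:
        candidates.append(depth)
        depth += r
    return candidates


# A hypothetical 30 MHz camera has an unambiguous range of about 5 m,
# so a surface at 7.2 m is reported at about 2.2 m; an unwrapping method
# must then choose among the candidate depths 2.2 m, 7.2 m, 12.2 m, ...
f = 30e6
measured = wrapped_depth(7.2, f)
candidates = candidate_depths(measured, f, max_depth_m=15.0)
```

The amplitude-based, zero-curl, and multi-camera methods surveyed in the chapter are different strategies for selecting the correct member of `candidates`.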

Keywords

Time-of-Flight principle · Depth ambiguity · Phase unwrapping · Multiple depth cameras


Copyright information

© Miles Hansard 2013

Authors and Affiliations

  • Miles Hansard ¹
  • Seungkyu Lee ²
  • Ouk Choi ²
  • Radu Horaud ³

  1. Electronic Engineering and Computer Science, Queen Mary, University of London, London, UK
  2. Samsung Advanced Institute of Technology, Yongin-si, Republic of Korea
  3. INRIA Grenoble Rhône-Alpes, Montbonnot-Saint-Martin, France
