Depth Imaging by Combining Time-of-Flight and On-Demand Stereo

  • Uwe Hahne
  • Marc Alexa
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5742)


In this paper we present a framework for computing depth images at interactive rates. Our approach is based on combining time-of-flight (TOF) range data with stereo vision. We use a per-frame confidence map extracted from the TOF sensor data in two ways for improving the disparity estimation in the stereo part: first, together with the TOF range data for initializing and constraining the disparity range; and, second, together with the color image information for segmenting the data into depth-continuous areas, enabling the use of adaptive windows for the disparity search. The resulting depth images are more accurate than those from either sensor alone. In an example application we use the depth map to initialize the z-buffer so that virtual objects can be occluded by real objects in an augmented reality scenario.
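The first use of the confidence map, constraining the per-pixel disparity search with the TOF measurement, can be sketched as follows. This is a minimal illustrative block-matching implementation, not the authors' code: the function name, parameter values (focal length, baseline, confidence threshold, search margin), and the SAD cost are all assumptions chosen for the example. A TOF depth Z predicts a stereo disparity d = f * B / Z; where the TOF confidence is high, the search is narrowed to a small window around that prediction, otherwise the full range is scanned.

```python
import numpy as np

def tof_constrained_disparity(left, right, tof_depth, confidence,
                              f=500.0, baseline=0.1,
                              full_range=(0, 32), margin=2, tau=0.7,
                              win=3):
    """Block-matching stereo whose per-pixel disparity search is narrowed
    around the disparity predicted by TOF depth wherever the TOF
    confidence exceeds tau. Hypothetical sketch; parameter names and
    values are illustrative, not taken from the paper."""
    h, w = left.shape
    # Disparity predicted by the TOF sensor: d = f * B / Z
    d_tof = np.where(tof_depth > 0,
                     f * baseline / np.maximum(tof_depth, 1e-6), 0.0)
    disp = np.zeros((h, w), dtype=np.int32)
    r = win // 2
    for y in range(r, h - r):
        for x in range(r, w - r):
            if confidence[y, x] >= tau:
                # Trustworthy TOF pixel: search only near its prediction.
                lo = max(full_range[0], int(d_tof[y, x]) - margin)
                hi = min(full_range[1], int(d_tof[y, x]) + margin + 1)
            else:
                # Unreliable TOF pixel: fall back to the full range.
                lo, hi = full_range[0], full_range[1] + 1
            best, best_d = np.inf, 0
            patch = left[y - r:y + r + 1, x - r:x + r + 1]
            for d in range(lo, min(hi, x - r + 1)):
                cand = right[y - r:y + r + 1, x - d - r:x - d + r + 1]
                cost = np.abs(patch - cand).sum()  # SAD matching cost
                if cost < best:
                    best, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

Restricting the range this way both speeds up the search and suppresses spurious matches outside the depth interval the TOF sensor already rules out, which is the intuition behind the first use of the confidence map described above.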


Keywords: Augmented Reality, Depth Image, Virtual Object, Stereo Vision, Stereo Camera





Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Uwe Hahne, TU Berlin, Germany
  • Marc Alexa, TU Berlin, Germany
