6D-Vision: Fusion of Stereo and Motion for Robust Environment Perception

  • Uwe Franke
  • Clemens Rabe
  • Hernán Badino
  • Stefan Gehrig
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3663)

Abstract

Obstacle avoidance is one of the most important challenges for mobile robots as well as for future vision-based driver assistance systems. This task requires precise depth extraction and the robust, fast detection of moving objects. To reach these goals, this paper considers vision as a process in space and time and presents a powerful fusion of depth and motion information for image sequences taken from a moving observer. 3D position and 3D motion are estimated simultaneously for a large number of image points by means of Kalman filters, with no need for prior, error-prone segmentation. The result is a rich 6D representation that allows moving obstacles to be detected even when the foreground or background is partially occluded.
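As a rough illustration of the idea summarized above, the sketch below runs one Kalman filter per tracked image point with a six-dimensional state (3D position and 3D velocity), a constant-velocity prediction step, and the stereo-triangulated 3D point of the tracked feature as the measurement. This is not the authors' implementation: the class name, frame interval, noise parameters, and the linear measurement model are assumptions for illustration, and the paper's actual filter may instead operate directly on image coordinates and disparity after ego-motion compensation.

```python
# Minimal per-point 6D Kalman filter sketch (illustrative assumption, not the
# paper's code): state x = [X, Y, Z, Vx, Vy, Vz] per tracked image point.
import numpy as np

class PointFilter6D:
    def __init__(self, p0, dt=0.04):
        # Initialize with the first triangulated 3D position and zero velocity.
        self.x = np.hstack([p0, np.zeros(3)])
        self.P = np.diag([0.5] * 3 + [10.0] * 3)       # initial uncertainty (assumed)
        # Constant-velocity motion model over one frame interval dt.
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)
        self.Q = np.diag([0.01] * 3 + [0.5] * 3)        # process noise (assumed)
        # Measurement: the 3D point triangulated from the stereo pair in the
        # current, ego-motion-compensated frame; H simply selects the position.
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])
        self.R = np.diag([0.05, 0.05, 0.5])             # stereo noise, larger in depth

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z):
        # z: 3D position of the tracked point triangulated from stereo.
        y = z - self.H @ self.x                         # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)        # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P

    @property
    def velocity(self):
        return self.x[3:]                               # estimated 3D motion

# Usage: one filter per tracked feature; points whose estimated velocity
# exceeds a threshold are candidates for moving obstacles.
f = PointFilter6D(p0=np.array([1.0, 0.0, 20.0]))
for z in [np.array([1.0, 0.0, 19.6]), np.array([1.0, 0.0, 19.2])]:
    f.predict()
    f.update(z)
print(f.velocity)
```

Running many such filters in parallel, one per image point, yields the dense 6D representation described in the abstract without any prior segmentation of the scene.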

Keywords

Kalman filter · Image point · Obstacle avoidance · Stereo vision · Inertial sensor

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Uwe Franke (1)
  • Clemens Rabe (1)
  • Hernán Badino (1)
  • Stefan Gehrig (1)

  1. DaimlerChrysler AG, Stuttgart, Germany