Experiments in Vision-Laser Fusion Using the Bayesian Occupancy Filter

  • John-David Yoder
  • Mathias Perrollaz
  • Igor E. Paromtchik
  • Yong Mao
  • Christian Laugier
Part of the Springer Tracts in Advanced Robotics book series (STAR, volume 79)

Abstract

Occupancy grids have been used to represent the environment for some time. More recently, the Bayesian Occupancy Filter (BOF), which provides both an estimate of the likelihood of occupancy of each cell and a probabilistic estimate of each cell's velocity, has been introduced and patented. This work presents the first experiments using the BOF to fuse data from stereo vision and multiple laser sensors on an intelligent vehicle platform. The paper describes the experimental platform and the approach to sensor fusion, and shows results from data captured in real traffic situations.
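The fusion described in the abstract builds on the occupancy-grid idea: each sensor contributes a per-cell occupancy probability, and the grids are combined probabilistically. As an illustrative sketch only (not the authors' BOF implementation, which also estimates per-cell velocity), the standard log-odds fusion of two grids from independent sensors can be written as follows; all grid values and names here are hypothetical.

```python
import numpy as np

def logodds(p):
    """Convert an occupancy probability to log-odds."""
    return np.log(p / (1.0 - p))

def fuse_grids(p_stereo, p_laser, p_prior=0.5):
    """Fuse two per-cell occupancy grids under an independent-sensor
    assumption: sum log-odds, subtract the prior's log-odds once."""
    l = logodds(p_stereo) + logodds(p_laser) - logodds(p_prior)
    return 1.0 / (1.0 + np.exp(-l))  # back to probability

# Hypothetical 2x2 grids: each sensor reports per-cell occupancy,
# 0.5 meaning "no information" relative to a uniform prior.
stereo = np.array([[0.9, 0.5], [0.2, 0.5]])
laser  = np.array([[0.8, 0.5], [0.3, 0.5]])
fused = fuse_grids(stereo, laser)
```

Cells where both sensors agree on occupancy are reinforced (fused probability above either input), cells where both report free space are suppressed, and cells at 0.5 remain uninformative; the BOF additionally propagates these estimates over time together with a velocity distribution per cell.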

Keywords

Visual Sensor, Stereo Vision, Sensor Fusion, Stereo Camera, Laser Sensor


Copyright information

© Springer-Verlag GmbH Berlin Heidelberg 2014

Authors and Affiliations

  • John-David Yoder (1)
  • Mathias Perrollaz (2)
  • Igor E. Paromtchik (2)
  • Yong Mao (2)
  • Christian Laugier (2)
  1. Ohio Northern University, Ada, USA
  2. INRIA Grenoble Rhône-Alpes, Saint Ismier, France
