Autonomous Robots, Volume 26, Issue 2–3, pp 123–139

Model based vehicle detection and tracking for autonomous urban driving

  • Anna Petrovskaya
  • Sebastian Thrun

Abstract

Situational awareness is crucial for autonomous driving in urban environments. This paper describes the moving vehicle detection and tracking module that we developed for our autonomous driving robot Junior. The robot won second place in the Urban Grand Challenge, an autonomous driving race organized by the U.S. Government in 2007. The module provides reliable detection and tracking of moving vehicles from a high-speed moving platform using laser range finders. Our approach models both dynamic and geometric properties of the tracked vehicles and estimates them using a single Bayes filter per vehicle. We present the notion of motion evidence, which allows us to overcome the low signal-to-noise ratio that arises during rapid detection of moving vehicles in noisy urban environments. Furthermore, we show how to build consistent and efficient 2D representations out of 3D range data and how to detect poorly visible black vehicles. Experimental validation includes the most challenging conditions presented at the Urban Grand Challenge as well as other urban settings.
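
As a rough illustration of the per-vehicle Bayes filter described above (realized as a particle filter, per the keywords), the sketch below tracks a single vehicle's planar pose and speed from noisy position measurements. Everything in it, including the constant-velocity state, the noise parameters, the point-position measurement model, and all function names, is an assumption made for illustration; the paper's actual filter also estimates vehicle geometry and works directly from laser range data rather than pre-extracted positions.

    import numpy as np

    # Minimal sketch of one Bayes filter per tracked vehicle, implemented as a
    # particle filter. State per particle: [x, y, heading, speed]. All models
    # and parameters here are illustrative assumptions, not the paper's.

    rng = np.random.default_rng(0)
    NUM_PARTICLES = 500

    def init_particles(x, y, heading, spread=1.0):
        """Initialize particles around an initial vehicle detection."""
        particles = np.zeros((NUM_PARTICLES, 4))
        particles[:, 0] = x + rng.normal(0.0, spread, NUM_PARTICLES)
        particles[:, 1] = y + rng.normal(0.0, spread, NUM_PARTICLES)
        particles[:, 2] = heading + rng.normal(0.0, 0.1, NUM_PARTICLES)
        particles[:, 3] = rng.normal(0.0, 5.0, NUM_PARTICLES)  # speed prior (m/s)
        return particles

    def predict(particles, dt):
        """Propagate particles with a noisy constant-velocity motion model."""
        particles[:, 0] += particles[:, 3] * np.cos(particles[:, 2]) * dt
        particles[:, 1] += particles[:, 3] * np.sin(particles[:, 2]) * dt
        particles += rng.normal(0.0, [0.3, 0.3, 0.05, 0.5], (NUM_PARTICLES, 4))
        return particles

    def update(particles, measured_xy, sigma=0.5):
        """Weight particles by agreement with a measured position, then resample."""
        d2 = np.sum((particles[:, :2] - measured_xy) ** 2, axis=1)
        weights = np.exp(-0.5 * d2 / sigma**2) + 1e-12  # avoid degenerate weights
        weights /= weights.sum()
        idx = rng.choice(NUM_PARTICLES, NUM_PARTICLES, p=weights)
        return particles[idx]

    # Example: track one vehicle over a short sequence of noisy measurements.
    particles = init_particles(x=10.0, y=0.0, heading=0.0)
    for t, z in enumerate([(10.5, 0.1), (11.6, 0.0), (12.4, 0.2)]):
        particles = predict(particles, dt=0.1)
        particles = update(particles, np.array(z))
        print(f"step {t}: estimated position = {particles.mean(axis=0)[:2]}")

Resampling after every measurement, as done here, is the simplest scheme; it stands in for the motion-evidence and geometric-shape scoring that the paper uses to weight hypotheses from raw laser data.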

Keywords

Vehicle tracking · Autonomous driving · Urban driving · Bayesian model · Particle filter · Laser range finders

Copyright information

© Springer Science+Business Media, LLC 2009

Authors and Affiliations

  1. Computer Science Department, Stanford University, Palo Alto, USA
