
Stereo Road Detection Based on Ground Plane

  • C. H. Rodríguez-Garavito
  • J. Carmona-Fernández
  • A. de la Escalera
  • J. M. Armingol
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9520)

Abstract

This paper presents a robust road perception algorithm aimed at detecting multiple lanes with temporal integration, one of the most important tasks in Advanced Driver Assistance Systems (ADAS). A new vision-based system is proposed, consisting of three parts: a line-marker detection algorithm, a road-line classifier, and a lane-tracking stage with temporal integration. The goal is to detect the position, type, and number of road lanes. The developed approach is characterized by the use of the bird's-eye view, road-marking filtering based on gradient-space algorithms, a robust feature descriptor for line classification, and road tracking based on the time of life of each detected lane. Road detection is performed according to the Spanish standard IC 8.2. The system was tested on the IvvI 2.0 test platform.
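
As a rough illustration of the bird's-eye-view step mentioned in the abstract, the Python/OpenCV sketch below warps a forward-facing camera frame into a top-down view through a planar homography. The function name birds_eye_view and the sample source points are hypothetical placeholders; the paper itself derives the mapping from the stereo ground-plane estimate rather than from hand-picked image points.

    # Minimal sketch of a bird's-eye-view (inverse perspective mapping) transform.
    # The source coordinates are illustrative only, not taken from the paper.
    import cv2
    import numpy as np

    def birds_eye_view(frame, src_pts, dst_size=(400, 600)):
        """Warp a forward-facing road image into a top-down view.

        src_pts: four image points (shape 4x2) outlining a trapezoid on the
                 road surface, ordered bottom-left, bottom-right, top-right,
                 top-left to match the destination rectangle below.
        """
        w, h = dst_size
        # The road trapezoid is mapped onto a rectangular, roughly metric grid.
        dst_pts = np.float32([[0, h], [w, h], [w, 0], [0, 0]])
        H = cv2.getPerspectiveTransform(np.float32(src_pts), dst_pts)
        return cv2.warpPerspective(frame, H, dst_size)

    # Example usage with made-up calibration points on a 1280x720 frame:
    # top_down = birds_eye_view(img, [[200, 720], [1080, 720], [780, 450], [500, 450]])

In the top-down image, lane markings become approximately parallel and equally spaced, which simplifies the gradient-based filtering and line classification described above.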

Keywords

Road line detection · Bird's eye view

Notes

Acknowledgments

This work was supported by the Automation Engineering Department of Universidad de La Salle, Bogotá, Colombia; the Administrative Department of Science, Technology and Innovation (COLCIENCIAS), Bogotá, Colombia; and the Spanish Government through the CICYT projects TRA2013-48314-C3-1-R and TRA2011-29454-C03-02, and the Comunidad de Madrid through SEGVAUTO-TRIES (S2013/MIT-2713).

References

  1. Bertozzi, M., Broggi, A.: GOLD: a parallel real-time stereo vision system for generic obstacle and lane detection. IEEE Trans. Image Process. 7(1), 62–81 (1998)
  2. Instrucción de Carreteras: Norma 8.2-IC Marcas Viales (1987)
  3. Crisman, J.D., Thorpe, C.E.: SCARF: a color vision system that tracks roads and intersections. IEEE Trans. Robot. Autom. 9(1), 49–58 (1993)
  4. Felisa, M., Zani, P.: Robust monocular lane detection in urban environments. In: 2010 IEEE Intelligent Vehicles Symposium (IV), pp. 591–596. IEEE (2010)
  5. Liu, G., Wörgötter, F., Markelić, I.: Combining statistical Hough transform and particle filter for robust lane detection and tracking. In: 2010 IEEE Intelligent Vehicles Symposium (IV), pp. 993–997. IEEE (2010)
  6. Martín, D., García, F., Musleh, B., Olmeda, D., Peláez, G., Marín, P., Ponz, A., Rodríguez, C., Al-Kaff, A., de la Escalera, A., et al.: IvvI 2.0: an intelligent vehicle based on computational perception. Expert Syst. Appl. 41(17), 7927–7944 (2014)
  7. Muad, A.M., Hussain, A., Samad, S.A., Mustaffa, M.M., Majlis, B.Y.: Implementation of inverse perspective mapping algorithm for the development of an automatic lane tracking system. In: 2004 IEEE Region 10 Conference (TENCON 2004), pp. 207–210. IEEE (2004)
  8. Rodríguez-Garavito, C., Ponz, A., García, F., Martín, D., de la Escalera, A., Armingol, J.: Automatic laser and camera extrinsic calibration for data fusion using road plane. In: 2014 17th International Conference on Information Fusion (FUSION), pp. 1–6. IEEE (2014)
  9. Torr, P.H., Zisserman, A.: MLESAC: a new robust estimator with application to estimating image geometry. Comput. Vis. Image Underst. 78(1), 138–156 (2000)

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • C. H. Rodríguez-Garavito (1, 2)
  • J. Carmona-Fernández (1)
  • A. de la Escalera (1)
  • J. M. Armingol (1)
  1. Intelligent Systems Laboratory, Universidad Carlos III de Madrid, Madrid, Spain
  2. Automation Engineering Department, Universidad de La Salle, Bogotá, Colombia
