Evaluation of Video-Based Driver Assistance Systems with Sensor Data Fusion by Using Virtual Test Driving

  • Conference paper

Part of the book series: Lecture Notes in Electrical Engineering (LNEE, volume 196)

Abstract

Research and/or Engineering Questions/Objective: The vehicle of the future will support its driver by warning of potential hazards. An essential prerequisite for this is sensor-based perception of the traffic situation. Camera-based sensors, depth cameras, vehicle sensors as well as radar and lidar sensors are used to recognize traffic-related objects. For the future development of ADAS, the fusion of data from multiple sensors into a consistent picture of the environment will play a key role. The established evaluation approach of real-world driving tests will no longer be sufficient due to the complexity of the system interactions. New simulation methods are needed to evaluate ADAS in virtual test driving with realistic vehicle behavior and a complex traffic environment.

Methodology: It is therefore important to integrate camera-based components into a closed-loop simulation platform so that sensor data fusion technologies can be tested under realistic conditions. Today, driver assistance systems are commonly tested in a simulation environment by filming animation data and subsequently feeding it to an image processing or fusion algorithm. This method cannot be applied, however, when wide-angle cameras such as cameras with fisheye lenses are used. Within a research framework for autonomous driving functions, a new simulation technology was therefore developed to integrate virtual cameras alongside the established environment sensor models in the vehicle dynamics simulation CarMaker. For this purpose, the real-time animation was extended with a sophisticated virtual camera model, called "VideoDataStream", which generates synchronous video data (including PMD data for 3D images). The camera positions as well as the camera properties can be configured individually. In addition, the type of camera lens (e.g. fisheye) can be freely defined, with lens settings such as the opening angle and the typical lens imperfections (e.g. distortion and vignetting). With this new technology, camera and radar data, for example, can be provided synchronized in time and space to the fusion algorithm under test.

Results: The video data can be used to evaluate image processing and sensor data fusion in Model-, Software- and Hardware-in-the-Loop applications under virtual test driving conditions. The paper presents the developed method together with examples of image-based perception of the vehicle environment and of sensor data fusion algorithms. Among other things, this covers the recognition of traffic lanes, traffic signs and other road users, as well as the fusion of the individual pieces of information into a comprehensive picture of the environment. A further field of application is the combination with navigation systems and digital maps: the virtual vehicle supplies the navigation system with its GPS position and receives in return the "Most Probable Path" (MPP) via the "electronic horizon", a kind of predictive sensor that delivers all the preview information ahead of the vehicle defined in the ADASIS protocol.

Conclusion: With the introduced method, the capability and efficiency of function development and testing in the area of Advanced Driver Assistance Systems are significantly improved. Thanks to a powerful simulation environment, a broad range of validation tests can be shifted into simulation, because even complex test scenarios can be replicated and the tests are reproducible. The simulation data can be provided synchronized in time and space, which is essential, for example, for a fusion algorithm under test.
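
The abstract describes, but does not specify, the "VideoDataStream" camera model with its freely configurable lens type, opening angle, distortion and vignetting. As a rough illustration only, the following Python sketch shows one common way such lens effects can be modelled: an equidistant fisheye projection (r = f·θ) with a polynomial distortion term, plus a simple radial vignetting falloff. All function names and parameters are hypothetical and are not taken from the paper or from CarMaker.

    import numpy as np

    def project_fisheye(points_cam, f_px, cx, cy, k1=0.0, k2=0.0):
        """Project 3D points given in camera coordinates (X right, Y down,
        Z forward) with an equidistant fisheye model r = f * theta and a
        polynomial radial distortion term."""
        X, Y, Z = points_cam[:, 0], points_cam[:, 1], points_cam[:, 2]
        theta = np.arctan2(np.hypot(X, Y), Z)                     # angle to the optical axis
        theta_d = theta * (1.0 + k1 * theta**2 + k2 * theta**4)   # radial distortion
        phi = np.arctan2(Y, X)                                    # azimuth around the axis
        r = f_px * theta_d                                        # image radius in pixels
        return np.stack([cx + r * np.cos(phi), cy + r * np.sin(phi)], axis=1)

    def apply_vignetting(image_rgb, strength=0.4):
        """Darken an H x W x 3 image towards the corners (quadratic falloff)."""
        h, w = image_rgb.shape[:2]
        yy, xx = np.mgrid[0:h, 0:w]
        r_norm = np.hypot(xx - w / 2.0, yy - h / 2.0) / np.hypot(w / 2.0, h / 2.0)
        falloff = 1.0 - strength * r_norm**2
        return (image_rgb * falloff[..., None]).astype(image_rgb.dtype)

In such a model the effective opening angle follows from the focal length in pixels and the image size (for the equidistant projection, theta_max ≈ r_max / f_px), which is consistent with the abstract's statement that the lens type and opening angle can be chosen freely.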
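
The coupling with navigation systems works in both directions: the simulated vehicle supplies its GPS position and receives the "Most Probable Path" with the "electronic horizon" as defined in the ADASIS protocol. ADASIS specifies standardized messages between a horizon provider and ADAS applications; the sketch below is not that protocol, but a simplified, hypothetical Python data structure illustrating the kind of preview information (offsets along the MPP, curvature, speed limits) such a predictive sensor delivers. All class and attribute names are assumptions for illustration.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class HorizonPoint:
        offset_m: float                   # distance ahead along the most probable path [m]
        curvature_1pm: float              # road curvature at that offset [1/m]
        speed_limit_kph: Optional[float]  # posted speed limit, if known

    @dataclass
    class ElectronicHorizon:
        """Preview attributes along the MPP, as a horizon provider might deliver them."""
        points: List[HorizonPoint]

        def speed_limit_at(self, offset_m: float) -> Optional[float]:
            """Speed limit valid at a given preview distance (last change at or
            before that offset)."""
            valid = [p for p in self.points if p.offset_m <= offset_m]
            return valid[-1].speed_limit_kph if valid else None

    # Example: a speed-limit change and a curve beginning 250 m ahead of the vehicle.
    horizon = ElectronicHorizon(points=[
        HorizonPoint(offset_m=0.0, curvature_1pm=0.000, speed_limit_kph=100.0),
        HorizonPoint(offset_m=250.0, curvature_1pm=0.004, speed_limit_kph=70.0),
    ])
    assert horizon.speed_limit_at(300.0) == 70.0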

F2012-E12-028

Author information

Corresponding author

Correspondence to Bernhard Schick.

Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Schick, B., Schmidt, S. (2013). Evaluation of Video-Based Driver Assistance Systems with Sensor Data Fusion by Using Virtual Test Driving. In: Proceedings of the FISITA 2012 World Automotive Congress. Lecture Notes in Electrical Engineering, vol 196. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33738-3_36

  • DOI: https://doi.org/10.1007/978-3-642-33738-3_36

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-33737-6

  • Online ISBN: 978-3-642-33738-3

  • eBook Packages: Engineering, Engineering (R0)
