Abstract
To achieve the goal of an autonomous driving vehicle, more and more sensors are being integrated into the vehicle. For previous generations it was sufficient for these sensors to send object data, which was then fused into an environment model. However, to build a more accurate model and to navigate autonomously, the raw sensor data is needed.
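As a rough illustration of the distinction the abstract draws, the minimal sketch below fuses raw measurements (lidar points and radar returns) directly into a shared occupancy grid, rather than merging per-sensor object lists. This is not the method of the paper; the grid parameters, coordinate convention, and function names are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's method): raw-data-level
# fusion of lidar points and radar returns into one 2D occupancy grid.
import numpy as np

GRID_SIZE = 200          # cells per side (assumed)
CELL_M = 0.5             # cell edge length in metres (assumed)
ORIGIN = GRID_SIZE // 2  # vehicle sits at the grid centre

def to_cell(x_m, y_m):
    """Map a point in vehicle coordinates (metres) to a grid cell index."""
    return (int(round(x_m / CELL_M)) + ORIGIN,
            int(round(y_m / CELL_M)) + ORIGIN)

def fuse(lidar_points, radar_returns):
    """Accumulate raw measurements from both sensors in a single grid.

    lidar_points, radar_returns: iterables of (x, y) in metres.
    Returns a hit-count grid; downstream perception would operate on
    this fused raw representation instead of per-sensor object lists.
    """
    grid = np.zeros((GRID_SIZE, GRID_SIZE), dtype=np.uint16)
    for x, y in list(lidar_points) + list(radar_returns):
        i, j = to_cell(x, y)
        if 0 <= i < GRID_SIZE and 0 <= j < GRID_SIZE:
            grid[i, j] += 1
    return grid

# Example: three synthetic measurements of the same obstacle
lidar = [(10.2, 3.1), (10.4, 3.0)]
radar = [(10.0, 3.2)]
print(fuse(lidar, radar).sum())  # -> 3 hits accumulated in the grid
```

In an object-level architecture, by contrast, each sensor would first produce its own tracked-object list and only those lists would be merged, discarding detail that raw-level fusion preserves.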