Quantifying the Effects of Environmental Conditions on Autonomy Algorithms for Unmanned Ground Vehicles
Autonomy for commercial applications is developing at a rapid pace; however, autonomous navigation of unmanned ground vehicles (UGVs) for military applications has been deployed only to a limited extent. One factor delaying the adoption of autonomy for military applications is the environment in which military UGVs must operate: military operations take place in unstructured environments under adverse environmental conditions. Military UGVs are infrequently tested in harsh conditions; therefore, there is a lack of understanding of how autonomy reacts to challenging environmental conditions. Using high-fidelity modeling and simulation (M&S), autonomy algorithms can be exercised quickly and inexpensively under realistic operational conditions. The presented research introduces the M&S tools available for simulating adverse environmental conditions. Simulated camera images generated with these M&S tools are run through two typical autonomy algorithms, road lane detection and object classification, to assess the impact environmental conditions have on autonomous operations. Furthermore, the presented research proposes a methodology for quantifying these environmental effects.
Keywords: Autonomy · Autonomous ground vehicles · Environment effects · Perception
Permission to publish was granted by Director, Geotechnical and Structures Laboratory.
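The quantification methodology described above, running a perception algorithm on simulated camera images of the same scene under different environmental conditions and measuring degradation relative to a clear-weather baseline, can be illustrated with a minimal sketch. The `degradation` metric, the condition names, and the per-image confidence values below are all hypothetical placeholders, not the paper's actual tools or data; a real evaluation would use scores produced by the lane-detection or object-classification algorithms on VANE-generated imagery.

```python
# Hedged sketch: score one perception algorithm under several simulated
# environmental conditions and report degradation vs. a clear baseline.
# All names and numbers are illustrative assumptions.

def degradation(baseline_scores, condition_scores):
    """Mean relative drop in per-image detection confidence vs. baseline."""
    assert len(baseline_scores) == len(condition_scores)
    drops = [
        max(0.0, (b - c) / b)  # clamp: a condition cannot "improve" the metric
        for b, c in zip(baseline_scores, condition_scores)
        if b > 0
    ]
    return sum(drops) / len(drops)

# Hypothetical per-image classifier confidences for the same four scenes
# rendered under each simulated condition.
scores = {
    "clear": [0.95, 0.92, 0.97, 0.90],
    "rain":  [0.80, 0.71, 0.85, 0.66],
    "fog":   [0.40, 0.35, 0.52, 0.30],
}

baseline = scores["clear"]
for condition in ("rain", "fog"):
    d = degradation(baseline, scores[condition])
    print(f"{condition}: {d:.1%} mean confidence drop")
```

Because each condition is scored against the same scenes, the resulting percentages isolate the environmental effect itself, which is the core of the proposed quantification approach.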