
User Performance for Vehicle Recognition with Visual and Infrared Sensors from an Unmanned Aerial Vehicle

  • Patrik Lif (Email author)
  • Fredrik Näsström
  • Fredrik Bissmarck
  • Jonas Allvar
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10901)

Abstract

In many situations it is important to detect and recognize people and vehicles. The purpose of this study was to examine human performance in detecting and recognizing vehicles on the ground from synthetic video sequences captured from a simulated unmanned aerial vehicle. A visual and an infrared sensor were used on the unmanned aerial vehicle, with the camera's field of view scanning across the ground at either 8 m/s or 12 m/s. The results demonstrated that performance was affected by type of sensor, camera scan rate, and type of vehicle. Subjects performed worse with the infrared sensor than with the visual sensor, and an increased camera scan rate caused more errors. Recognition performance varied between 67% and 100% depending on type of vehicle. Recognition of specific vehicles was also negatively affected by interference from vehicles of similar appearance; consequently, a vehicle with a unique appearance within the set was easier to recognize.

Keywords

Vehicle recognition · Visual sensor · IR sensor · UAV · Human factors


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Patrik Lif¹ (Email author)
  • Fredrik Näsström¹
  • Fredrik Bissmarck¹
  • Jonas Allvar¹
  1. Swedish Defence Research Agency, Linköping, Sweden
