Transparency of Automated Combat Classification

  • Tove Helldin
  • Ulrika Ohlander
  • Göran Falkman
  • Maria Riveiro
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8532)

Abstract

We present an empirical study evaluating the effects of three levels of system transparency of an automated target classification aid on fighter pilots' performance and initial trust in the system. The levels of transparency consisted of (1) presenting only text-based information regarding the specific object (without any automated support), (2) accompanying the text-based information with an automatically generated object class suggestion, and (3) additionally presenting the incorporated sensor values with associated (uncertain) historic values in graphical form. The results show that the pilots needed more time to make a classification decision with display conditions 2 and 3 than with display condition 1. However, the number of correct classifications and the operators' trust ratings were highest with display condition 3. No difference in the pilots' decision confidence was found, yet a slightly higher workload was reported with display condition 3. The questionnaire results reflect the pilots' general opinion that, after longer training with the system, an automatic classification aid would help them make better and more confident decisions faster.

Keywords

Classification support · Automation transparency · Uncertainty visualization · Fighter pilots

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Tove Helldin (1)
  • Ulrika Ohlander (2)
  • Göran Falkman (1)
  • Maria Riveiro (1)
  1. University of Skövde, Sweden
  2. Saab Aeronautics, Sweden