Why Are People’s Decisions Sometimes Worse with Computer Support?

  • Eugenio Alberdi
  • Lorenzo Strigini
  • Andrey A. Povyakalo
  • Peter Ayton
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5775)

Abstract

In many applications of computerised decision support, a recognised source of undesired outcomes is operators’ apparent over-reliance on automation. For instance, an operator may fail to react to a potentially dangerous situation because a computer fails to generate an alarm. However, the very use of terms like “over-reliance” betrays possible misunderstandings of these phenomena and their causes, which may lead to ineffective corrective action (e.g. training or procedures that do not counteract all the causes of the apparently “over-reliant” behaviour). We review relevant literature in the area of “automation bias” and describe the diverse mechanisms that may be involved in human errors when using computer support. We discuss these mechanisms, with reference to errors of omission when using “alerting systems”, with the help of examples of novel counterintuitive findings we obtained from a case study in a health care application, as well as other examples from the literature.

Keywords

decision support · computer aided decision making · alerting systems · human-machine diversity · omission errors

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Eugenio Alberdi (1)
  • Lorenzo Strigini (1)
  • Andrey A. Povyakalo (1)
  • Peter Ayton (2)

  1. Centre for Software Reliability, City University London, London, UK
  2. Psychology Department, City University London, London, UK
