
Experimentally Evaluating Bias-Reducing Visual Analytics Techniques in Intelligence Analysis

  • Donald R. Kretz
Chapter

Abstract

Intelligence analysis is a complex process that not only requires substantial training and deep expertise but is also heavily influenced by human cognitive factors. Studies have shown that even experienced, highly trained personnel sometimes commit serious errors in judgment as a result of heuristic thinking, and in matters of national security the consequences of judgment bias can be catastrophic. Developing effective debiasing techniques requires addressing a number of daunting challenges. Although testing behaviour under actual work conditions is intuitively appealing, the ability to construct suitable methods for doing so is limited, and the generalisability of findings from laboratory settings to work settings is a serious concern. To date, researchers have investigated only a small number of debiasing techniques in the workplace. There remains a strong need for experimentally validated debiasing techniques that can be incorporated into analytic tradecraft so that foreseeable thinking errors can be avoided. Drawing on the useful features of prior studies, a reference framework has been developed for the experimental evaluation of bias mitigations applied to problems of an intelligence nature.
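Experimental evaluation of a bias mitigation ultimately comes down to comparing a measured bias or judgment-error score between a treatment condition and a control condition and reporting the size of the observed effect. The chapter presents its framework conceptually; the sketch below is not drawn from it, but is a minimal, hypothetical Python illustration of computing a standardised effect size (Cohen's d) for such a between-groups comparison. All group names and scores are invented for illustration.

```python
# Illustrative only: a minimal sketch of analysing a debiasing experiment's outcome,
# assuming a between-groups design with one bias score per participant
# (lower score = less biased). All data below are hypothetical.
from statistics import mean, stdev
from math import sqrt

def cohens_d(treatment, control):
    """Standardised mean difference (Cohen's d) using a pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = stdev(treatment), stdev(control)
    pooled_sd = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    # Positive d means the mitigation group showed less bias than the control group.
    return (mean(control) - mean(treatment)) / pooled_sd

# Hypothetical confirmation-bias scores for a mitigation group and a control group.
mitigation = [0.42, 0.35, 0.50, 0.38, 0.44, 0.31, 0.47, 0.40]
control    = [0.61, 0.55, 0.66, 0.58, 0.63, 0.52, 0.70, 0.59]

print(f"Cohen's d = {cohens_d(mitigation, control):.2f}")
```

Reporting an effect size alongside any significance test is what allows individual debiasing studies to be compared and aggregated, which is the kind of cumulative evidence a reference evaluation framework is intended to support.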


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Security Engineering and Applied Sciences, Applied Research Associates, Inc., Frisco, USA
  2. School of Behavioral and Brain Sciences, The University of Texas at Dallas, Richardson, USA
