
Improving Criminal Investigations with Structured Analytic Techniques

Part of the Advanced Sciences and Technologies for Security Applications book series (ASTSA)


Abstract

For years, the intelligence community has used structured analytic techniques, methods designed to reduce bias and increase the transparency of the analytic process. These techniques force analysts out of routine thinking and away from heuristic habits in order to increase creativity, evaluate questions more comprehensively, and create a document trail that reveals the reasoning behind the intelligence product. The methods can be adapted for use in criminal investigations to help reduce bias, improve accuracy, and avoid both wrongful convictions (over 2,200 to date) and the resulting reparations (more than $2.2 billion), while optimizing resources. They shift the investigator from intuitive, everyday thinking (System 1, in Kahneman's terminology) to a more analytical approach (System 2) that leaves a transparent record of the process, regardless of the outcome. Structured analytic techniques are simple to use, inexpensive, and largely visual; they promote transparency, creativity, and group discussion, leading to better-supported results.


Keywords

  • Criminal investigations
  • Structured analytic techniques
  • Intelligence



Notes

  1.

  2.

    The tendency to mistakenly perceive connections and meaning between unrelated things. A classic example of apophany is Skinner's box, where a hungry pigeon was placed in a box and given food pellets at random intervals. Because the pigeon received a pellet while performing some action, it associated the food with the action; the more often it repeated that action, the more it strengthened the false correlation between the two unrelated events (the feeding was random) (Skinner 1948). In humans, one example is the gambler's fallacy, where a perceived "hot streak" is in fact due to chance (Burns and Corpus 2004).

  3.

    If your answer was $0.10, that's System 1 talking. If the bat costs $1.00 more than the ball, the ball has to cost $0.05 ($1.05 + $0.05 = $1.10). If the ball cost $0.10, the bat and ball together would cost $1.20.

  4.

    Carbon-based compounds in the juice are absorbed into the paper's fibers. Because lemon juice is a weak acid, it softens the fibers in the paper. When the paper is warmed, the heat breaks down some of the chemical bonds in the dried juice and carbon is released. When the carbon comes into contact with air, it oxidizes, turning the juice pattern brown and, thus, visible.

  5.

    Named for the English Franciscan friar William of Ockham (c. 1287–1347), a scholastic philosopher and theologian.

  6.

  7.

    Brooks, L. n.d. "Brainstorm Questions, Not Solutions", at:

  8.

    The model was developed at Gordon Training International in the 1970s with different terminology than is commonly used now. It is frequently but incorrectly attributed to Abraham Maslow. It is conceptually similar to the Johari Window (Luft and Ingham 1961), which lists knowns and unknowns for an individual in relation to others in a heuristic exercise.

  9.

    A slang military phrase for "Scientific Wild-Ass Guess" (Safire 2004).
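The bat-and-ball arithmetic in note 3 can be verified mechanically: the System 2 answer falls out of the two stated constraints, while the intuitive System 1 answer violates one of them. A minimal sketch in Python (the variable names are illustrative, not from the chapter):

```python
# Bat-and-ball problem (note 3): together they cost $1.10,
# and the bat costs $1.00 more than the ball.
# Let ball = x; then bat = x + 1.00, so 2x + 1.00 = 1.10 and x = 0.05.
total = 1.10
difference = 1.00

ball = (total - difference) / 2  # the System 2 answer: $0.05
bat = ball + difference          # $1.05

# The intuitive System 1 answer ($0.10) fails the total-cost constraint:
intuitive_ball = 0.10
assert abs(ball + bat - total) < 1e-9                                   # $1.10: consistent
assert abs(intuitive_ball + (intuitive_ball + difference) - total) > 0.05  # $1.20, not $1.10

print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")
```

Solving the constraint rather than pattern-matching on the numbers is exactly the System 1 to System 2 shift the chapter describes.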


Both women are pregnant.

References

  • Aitken, C., Roberts, P., & Jackson, G. (2010). Fundamentals of probability and statistical evidence in criminal proceedings: Guidance for judges, lawyers, forensic scientists and expert witnesses. London: Royal Statistical Society.

  • Ariely, D., Loewenstein, G., & Prelec, D. (2003). "Coherent arbitrariness": Stable demand curves without stable preferences. The Quarterly Journal of Economics, 118(1), 73–106.

  • Artner, S. J., Girven, R., & Bruce, J. B. (2016). Assessing the value of structured analytic techniques in the US intelligence community. Santa Monica: RAND Corporation.

  • Bennett, G. E. (1933). Assumptions. The Accounting Review, 8(2), 157–159.

  • Burns, B., & Corpus, B. (2004). Randomness and inductions from streaks: "Gambler's fallacy" versus "hot hand". Psychonomic Bulletin & Review, 11(1), 179.

  • Central Intelligence Agency. (2009). A tradecraft primer: Structured analytic techniques for improving intelligence analysis. Langley: Central Intelligence Agency.

  • Dewey, J. (1933). How we think. Buffalo: Prometheus Books. (Original work published 1910).

  • Dror, I. E., & Hampikian, G. (2011). Subjectivity and bias in forensic DNA mixture interpretation. Science & Justice, 51(4), 204–208.

  • Dror, I. E., Charlton, D., & Péron, A. E. (2006). Contextual information renders experts vulnerable to making erroneous identifications. Forensic Science International, 156(1), 74–78.

  • Einhorn, H. J., & Hogarth, R. M. (1978). Confidence in judgment: Persistence of the illusion of validity. Psychological Review, 85(5), 395.

  • Ericsson, K. A., Hoffman, R. R., Kozbelt, A., & Williams, A. M. (Eds.). (2018). The Cambridge handbook of expertise and expert performance. Cambridge: Cambridge University Press.

  • Gawande, A. (2010). The checklist manifesto. London: Profile Books.

  • George, R. Z., & Bruce, J. B. (Eds.). (2014). Analyzing intelligence: National security practitioners' perspectives (2nd ed.). Washington, DC: Georgetown University Press.

  • Goldstein, D. G., & Gigerenzer, G. (2002). Models of ecological rationality: The recognition heuristic. Psychological Review, 109(1), 75.

  • Gross, S., O'Brien, B., Hu, C., & Kennedy, E. (2014). Rate of false conviction in capital cases. Proceedings of the National Academy of Sciences, 111(20), 7230–7235.

  • Hall, C. C., Ariss, L., & Todorov, A. (2007). The illusion of knowledge: When more information reduces accuracy and increases confidence. Organizational Behavior and Human Decision Processes, 103(2), 277–290.

  • Herzog, M., & Fahle, M. (1997). The role of feedback in learning a Vernier discrimination task. Vision Research, 37(15), 2133–2141.

  • Heuer, R. J. (1999). Psychology of intelligence analysis (2nd ed.). Washington, DC: Center for the Study of Intelligence, Central Intelligence Agency. Available at:

  • Heuer, R. J. (2015). Psychology of intelligence analysis. Langley: CIA Center for the Study of Intelligence.

  • Heuer, R. J., & Pherson, R. H. (2015). Structured analytic techniques for intelligence analysis (2nd ed.). Washington, DC: CQ Press.

  • Kahneman, D. (2011). Thinking, fast and slow. New York: Macmillan.

  • Kahneman, D., Krueger, A. B., Schkade, D., Schwarz, N., & Stone, A. A. (2006). Would you be happier if you were richer? A focusing illusion. Science, 312(5782), 1908–1910.

  • Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121.

  • Loftus, E. F., & Palmer, J. C. (1974). Reconstruction of automobile destruction: An example of the interaction between language and memory. Journal of Verbal Learning and Verbal Behavior, 13(5), 585–589.

  • Loftus, E. F., & Zanni, G. (1975). Eyewitness testimony: The influence of the wording of a question. Bulletin of the Psychonomic Society, 5(1), 86–88.

  • Luft, J., & Ingham, H. (1961). The Johari Window. Human Relations Training News, 5(1), 6–7.

  • Major, J. (2008). Communicating with intelligence. Lanham: Scarecrow Press.

  • Mercier, H. (2017). Confirmation bias—myside bias. In R. Pohl (Ed.), Cognitive illusions (2nd ed.). Abingdon: Routledge.

  • Miller, G. (1956). The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological Review, 63, 81–97.

  • Nisbett, R. E., Zukier, H., & Lemley, R. E. (1981). The dilution effect: Nondiagnostic information weakens the implications of diagnostic information. Cognitive Psychology, 13(2), 248–277.

  • Nkwake, A. M. (2013). Why are assumptions important? In Working with assumptions in international development program evaluation (pp. 93–111). New York: Springer.

  • Pugh, S. (1991). Total design: Integrated models for successful product engineering. New Jersey: Addison-Wesley.

  • Ross, L., Lepper, M. R., & Hubbard, M. (1975). Perseverance in self-perception and social perception: Biased attributional processes in the debriefing paradigm. Journal of Personality and Social Psychology, 32(5), 880–892.

  • Rossmo, D. K. (2008). Criminal investigative failures. London: CRC Press.

  • Safire, W. (2004). No uncertain terms. New York: Simon and Schuster.

  • Simon, H. A. (1956). Rational choice and the structure of the environment. Psychological Review, 63(2), 129–138.

  • Skinner, B. F. (1948). 'Superstition' in the pigeon. Journal of Experimental Psychology, 38(2), 168.

  • Slovic, P. (2018). Behavioral problems of adhering to a decision policy. Unpublished manuscript, available at:

  • Smith, M. R., & Alpert, G. P. (2007). Explaining police bias: A theory of social conditioning and illusory correlation. Criminal Justice and Behavior, 34(10), 1262–1283.

  • Spencer, K. B., Charbonneau, A. K., & Glaser, J. (2016). Implicit bias and policing. Social and Personality Psychology Compass, 10(1), 50–63.

  • United Nations Office on Drugs and Crime. (2011). Criminal intelligence manual for analysts. New York: United Nations. Available at:

  • United States Department of Defense. (2017). FM 2-22.3 (FM 34-52): Human intelligence collector operations. Washington, DC.

  • White, D., Kemp, R. I., Jenkins, R., Matheson, M., & Burton, A. M. (2014). Passport officers' errors in face matching. PLoS One, 9(8), e103510.


Author information

Correspondence to Max M. Houck.


Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Houck, M.M. (2020). Improving Criminal Investigations with Structured Analytic Techniques. In: Fox, B., Reid, J., Masys, A. (eds) Science Informed Policing. Advanced Sciences and Technologies for Security Applications. Springer, Cham.


  • DOI: 10.1007/978-3-030-41287-6_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-41286-9

  • Online ISBN: 978-3-030-41287-6

  • eBook Packages: Law and Criminology (R0)