KEY WORDS: clinical reasoning; decision making; intuition; pattern recognition
In the evaluation of a patient, the clinician’s mind employs two parallel systems of problem solving. The intuitive system recognizes patterns quickly (e.g., acute swollen painful big toe = gout). This mode of reasoning is fast and frugal (in its consumption of mental resources) and liberally employs mental shortcuts (heuristics) to get the answer. Rather than systematically exploring a wide range of options, this system detects cues based upon past experience and knowledge, filters information, and conducts a quick fit test on its first impressions. Malcolm Gladwell’s best seller Blink celebrated this approach1.
The analytical system takes a more conscious and deliberate approach to problem solving. This can take the form of extensive data collection and analysis, algorithms, pathophysiologic reasoning, or statistical analysis (Bayes' theorem). This system consumes more cognitive resources, involves the consideration of multiple hypotheses, and tries to avoid shortcuts.
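To make concrete the kind of sequential calculation that Bayesian reasoning entails, the sketch below updates a pretest probability with a single test result using the odds form of Bayes' theorem. The numbers (a 10% pretest probability and a test with 90% sensitivity and 80% specificity) are hypothetical, chosen only for illustration and not drawn from the article.

```python
def posterior_probability(pretest_prob, sensitivity, specificity):
    """Post-test probability of disease after a positive test result,
    computed via the odds form of Bayes' theorem."""
    prior_odds = pretest_prob / (1 - pretest_prob)
    lr_positive = sensitivity / (1 - specificity)  # likelihood ratio of a positive test
    post_odds = prior_odds * lr_positive
    return post_odds / (1 + post_odds)

# Hypothetical example: 10% pretest probability, 90% sensitivity, 80% specificity.
# A positive result raises the probability of disease from 10% to about 33%.
print(round(posterior_probability(0.10, 0.90, 0.80), 2))  # 0.33
```

Even this single update hints at why explicit sequential calculation of this kind is so rarely performed at the bedside: each new finding requires another pass through the same arithmetic.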
These two diagnostic approaches exist on a continuum and are both engaged to some degree in every patient encounter with bidirectional synergies, checks, and overrides2. Traditionally in psychology and in medicine the analytical system has been treated as the superior, error-free form of thinking, but neuroscience reveals a more nuanced picture that has implications for how we understand and teach clinical reasoning.
In this issue, Stolper et al. introduce gut feelings as a sense of alarm or reassurance that a clinician experiences early in a clinical encounter3. The authors distinguish gut feelings as a particular subset of intuition by emphasizing an association with prognostic rather than diagnostic results, the influence of emotions in their formation, and the somatic sensations a physician may experience when gut feelings arise. They emphasize that these specific elements have received relatively little attention in the clinical reasoning literature, which focuses on diagnostic rather than prognostic accuracy and infrequently mentions the affective or somatic state of the physician decision maker. The authors’ call for medicine to expand its concept of intuition is welcome, as the essential roles of expected outcomes and emotions in decision making are already well defined in the cognitive and decision psychology literature4.
However, the case that Stolper et al. make for establishing a separate track of clinical reasoning is less compelling. Intuition resides in the unconscious mind and unfolds rapidly, which makes it very challenging for neuroscientists to deconstruct. Although elegant neuroimaging and psychological experiments attest to intuition’s remarkable computational power, we are only just beginning to discover the inner workings of this “black box.” There are no clearly defined subprocesses of intuition that can readily serve the clinician or teacher in the classroom or at the bedside.
Both intuition and gut feelings (as defined by Stolper et al.) rely on early environmental cues, require experience with similar situations (which includes knowledge of outcomes), are rapid and unconscious in their formation, and triage the mind toward early action or more deliberate analysis (the expert concept of “knowing when to slow down”5). Given this high degree of overlap, it is hard to make a case for medicine to claim its own brand of intuition where the emotive and somatic components are separately delineated with the name gut feelings.
The authors affirm that clinical problem solving is a mix of analytical and non-analytical (intuitive) reasoning, which is the leading way that clinical reasoning (and much of human judgment) is currently understood. However, their model (Figure 1 of the article) creates unnecessary complexity by labeling three tracks within the intuition-analysis continuum: medical decision making (mathematical analysis), medical problem solving (non-mathematical cognitive processing), and gut feelings. First, as noted above, gut feelings are not sufficiently differentiated from intuition to warrant a track. Second, the terms “medical decision making” and “medical problem solving” are currently used interchangeably with little reference to the historically based but linguistically ambiguous distinction. Finally, mathematical analysis (i.e., doing sequential calculations with Bayes' theorem) is so vanishingly rare in everyday practice that it does not warrant the “track” label; rather, it exemplifies the most rarefied form of analytical reasoning.
In summary, given what is (and is not) known about how doctors reason, teachers and clinicians are currently best served by understanding the interplay between intuition and analysis, rather than math versus non-math, or gut feelings versus non-gut feelings intuition6.
These issues notwithstanding, Stolper et al. take an important stance in elevating the stature of intuition in clinical decision making. Supporting intuition in medicine and medical education is not easy. The phrase “intuitive decision making” brings to mind heuristics, emotions, and experiential rather than evidence-driven reasoning—a combination that makes some clinicians uncomfortable7. These concerns, oftentimes buttressed by post-hoc analysis of medical errors, lead to the assertion that intuitive reasoning should be replaced or at least verified by the supposedly more error-free analytical reasoning. Research shows, however, that analysis does not always have the upper hand8.
Studies within and outside of medicine demonstrate that intuition—the earliest impressions we form when confronted with a problem—can be more accurate than analytical reasoning. The Iowa gambling task famously demonstrated that when subjects are asked to choose cards among multiple decks with the goal of maximizing profit, the intuitive system (as reflected in skin conductance measurements of hand perspiration) discerns optimal choices after sorting through only 10 cards, while the rational mind takes nearly 50 cards to detect a similar pattern9. Studies of decision-making in nursing, firefighting, engineering, and the military reveal that experts’ solutions to complex and challenging situations often arise immediately without conscious deliberation among options10,11. Studies of physicians have repeatedly demonstrated that the single best predictor of diagnostic accuracy is the early (within minutes) consideration of the correct diagnosis, a process that is arguably governed by intuition given the relative paucity of data early in the encounter12,13.
Multiple experimental studies have shown that instructing trainees to use intuition can lead to equal or greater accuracy than analytical reasoning. Regehr et al. demonstrated that asking first-year residents to diagnose skin conditions using first impressions yielded the same accuracy as asking them to take an analytical approach (compare and contrast against competing diagnoses)14. Ark, Eva, and colleagues showed that instructing students to use pattern recognition in EKG diagnosis works as well as asking students to list all the features of the tracing and that performance is optimized when trainees are instructed to do both (“trust, but verify”)15,16,17. de Vries et al. demonstrated that psychology students who used unconscious reasoning outperformed classmates who consciously tried to reach a DSM-IV diagnosis for a presented case18.
Arguing the superiority of analysis or intuition is hopelessly quixotic. Experimental manipulations of task difficulty, task familiarity, and decision-maker experience can easily tip the balance in favor of one or the other mode19,20. The emerging message from the psychology literature is that intuition stands as an equal partner with analysis in human decision making. The early lesson from the medical education literature is that we can reasonably empower learners to consider similarity and previous patterns—just like practitioners do in real life—even when experience is limited.
What should teachers do with this understanding of how we reason? Students will intuit and analyze without any instruction. That is a product of the neural hardwiring we are all born with. Teachers add value by helping trainees refine their own interplay and coordination between these two systems. Teachers can explicitly point out scenarios where it is appropriate to trust intuitions and patterns, when to trust but verify, and when to abandon intuition and employ pathophysiology, algorithms, statistics, or any other rigorous form of analytical thought. When appropriate, teachers can share their own intuition—or any of its more palatable synonyms (non-analytical reasoning, pattern recognition, rapid cognition, thin slicing, gut feelings)—without creating an artificial analysis, and they can invite their trainees to do the same. Both parties can sharpen their intuition through reflection and feedback21. Much work remains to determine what this form of instruction looks like in the clinical environment.
One last point merits attention, lest it trip up teachers or learners: intuition is not the enemy of evidence-based medicine22. Because of its rapidity, intuition by definition does not encompass a search for new scientific information in the moment, but intuitive thinking does not imply any resistance to doing so. Every trip to the medical literature (analytical reasoning by definition) contributes to the clinician’s unique combination of experience and medical knowledge that forms the basis of future intuitive decisions. Expert clinical judgment is characterized by an adroit self-regulatory sense of when intuition is insufficient and analysis is necessary5. When intuition is viewed in this way—as a powerful decision-making process and not a fondness for disregarding evidence—a smoother incorporation into conversation and modern teaching can be achieved. Intuition is far from perfect, but it works.
This article is distributed under the terms of the Creative Commons Attribution Noncommercial License which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.
- 1. Gladwell M. Blink: The Power of Thinking Without Thinking. New York, NY: Little, Brown and Co.; 2005.
- 3. Stolper E, Van de Wiel M, Van Royen P, Van Bokhoven M, Van der Weijden T, Dinant GJ. Gut Feelings as a Third Track in General Practitioners’ Diagnostic Reasoning. J Gen Intern Med. 2010 Oct 22.
- 4. Lehrer J. How We Decide. New York, NY: Houghton Mifflin Harcourt; 2009.
- 7. Klein G. The Power of Intuition. New York: Currency; 2004.
- 8. Norman G. Dual processing and diagnostic errors. Adv Health Sci Educ Theory Pract. 2009 Sep;14 Suppl 1:37-49. Epub 2009 Aug 11.
- 10. Klein G. Sources of Power: How People Make Decisions. Cambridge, MA: MIT Press; 1998.
- 11. Dreyfus HL, Dreyfus SE. Mind Over Machine: The Power of Human Intuition and Expertise in the Era of the Computer. Oxford, England: Blackwell; 1986.
- 18. de Vries M, Witteman CL, Holland RW, Dijksterhuis A. The unconscious thought effect in clinical decision making: an example in diagnosis. Med Decis Making. 2010 Sep-Oct;30(5):578-81. Epub 2010 Mar 12.
- 19. Mamede S, Schmidt HG, Rikers RM, Custers EJ, Splinter TA, van Saase JL. Conscious thought beats deliberation without attention in diagnostic decision-making: at least when you are an expert. Psychol Res. 2010 Nov;74(6):586-92. Epub 2010 Mar 31.
- 21. Quirk M. Intuition and Metacognition in Medical Education. New York: Springer Publishing Company; 2006.