Stephanie Brown Clark, MD, PhD
Diagnosis is a dynamic cognitive process of gathering information, interpreting the data and making a decision based on clinical judgment. Data is collected through a purposeful dialogue that includes the patient’s verbal description of symptoms (“history”) and the clinician’s targeted questions, and through a physical examination (“physical”). The clinician applies prior knowledge of similar cases and compares these to the current patient to develop one or more possible hypotheses (“differential diagnoses”). Interpretations depend on the data collected and can be influenced by assumptions, biases and cognitive shortcuts (“heuristics”) that lead to diagnostic errors.
These kinds of cognitive errors have been well studied since the 1970s by the psychologists Tversky and Kahneman.1 Such errors are predictable and can be avoided. Heuristics are necessary tools for many kinds of decision-making, including diagnosis; they are useful strategies, but they can and do lead to errors. Biases and assumptions cannot be eliminated, but self-awareness and mindfulness about their existence can mitigate their negative effects.
In medicine, some common biases include “confirmatory bias,” the tendency to seek and remember information that fits pre-existing expectations; “illusory correlation,” the tendency to consider two facts causally related when they are coincidental or unrelated; and “overconfidence” about one’s level of knowledge, with a lack of awareness of its gaps and limits. All have been identified as causes of diagnostic error.
“Premature closure,” the most common error in the diagnostic process, occurs when the clinician settles on a diagnosis and does not consider reasonable alternatives. Other significant heuristic errors include “representativeness,” which occurs when the clinician assumes that because something resembles the members of a certain category, it must itself be a member of that category. “Availability” refers to the clinician’s bias in favour of what comes to mind easily or has been recently encountered, assigning it a higher probability as a diagnosis than a thorough probabilistic assessment of the likelihood of that disease would warrant.2
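The gap between an availability-driven impression and a thorough probabilistic assessment can be made concrete with Bayes’ theorem. The sketch below is purely illustrative; the prevalence, sensitivity and specificity figures are hypothetical numbers, not drawn from any case in this article.

```python
# Hypothetical illustration of base-rate reasoning via Bayes' theorem.
# All numbers below are made up for the sake of the example.

def posterior_probability(prevalence, sensitivity, specificity):
    """P(disease | positive test), computed with Bayes' theorem."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# A rare disease recently encountered on the ward (prevalence 1 in 1,000),
# tested with a seemingly accurate test (95% sensitive, 90% specific).
p = posterior_probability(prevalence=0.001, sensitivity=0.95, specificity=0.90)
print(f"P(disease | positive test) = {p:.1%}")  # under 1%
```

Even with a positive result, the posterior probability remains below one percent, far lower than the estimate a recently memorable case might intuitively suggest; the availability heuristic substitutes ease of recall for this base-rate arithmetic.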
As the clinicians and students in our initial workshops reflected on the connections between “working up” an art object in the museum and “working up” a patient, they first made multiple observations about the process itself. They identified the challenges of collecting information without immediately interpreting the data, the urgency of getting the “right answer,” the resistance to “letting go” of a hypothesis when contradictory visual evidence was presented by a team member, and the discomfort of being in an art museum and “not knowing” about art. As the discussion developed, the students and clinicians reflected on parallels with the “workup” process with patients and identified propensities for biases, assumptions and cognitive shortcuts in their clinical decision-making.
These early discussions informed our work. We focused on the diagnostic process, deconstructed it into its essential steps, and articulated a series of premises that reflect sound clinical practice. These premises are intended to serve as a concise checklist to help students counter common heuristic errors.
- Observation before interpretation.
- Communication depends on accurate verbal descriptions.
- Associations influence interpretations.
- Inquiry leads to questions. Questions lead to information.
- Teams are wiser together.
| Premise | Potential heuristic errors |
| --- | --- |
| Observation before interpretation. | Premature closure. |
| Communication depends on accurate verbal descriptions. | Inadequate translation of visual data into verbal/textual form leads to error. |
| Associations influence interpretations. | Assumptions from previous and recent cases can be liabilities. |
| Inquiry leads to questions. Questions lead to information. | Questions influence choice of tests and follow-up. |
| Teams are smarter and wiser together. | Team effort may mitigate errors. |
1. Tversky A, Kahneman D. “Judgment under Uncertainty: Heuristics and Biases.” Science. 1974 Sep 27;185(4157):1124-1131; Tversky A, Kahneman D. “The Framing of Decisions and the Psychology of Choice.” Science. 1981 Jan 30;211(4481):453-458.
2. Tversky A, Kahneman D. “Judgment under Uncertainty: Heuristics and Biases.” Science. 1974 Sep 27;185(4157):1124-1131; Kahneman D, Tversky A. “On the psychology of prediction.” Psychol Rev. 1973;80:237-251; Klein JG. “Five pitfalls in decisions about diagnosis and prescribing.” BMJ. 2005 Apr 2;330(7974):781-783.