The Institute of Medicine has reported that as many as 98,000 deaths each year result from iatrogenic injury and error.1 Autopsy series have suggested a 15% error rate in the practice of medicine. These numbers are surprising and concerning, and they raise important questions about how we practice medicine. What kinds of errors do we make as providers? How can we reduce their incidence?
Graber et al created a taxonomy that helps in understanding medical error.2 The investigators evaluated 100 internal-medicine diagnostic errors identified through quality assurance and autopsy discrepancies. Approximately half of the errors were cognitive in nature, related to poor data gathering and faulty synthesis of information. The other half were systems issues, defined as technical failures as well as organizational and policy problems, such as abnormal test results not being communicated to the patient. On average, nearly six errors contributed to each case reviewed; often, both cognitive and systems issues were involved.
Rheumatology is considered the quintessential diagnostic specialty. Let’s look at what’s known about the cognitive part of this problem. How do clinicians diagnose illness, and what contributes to missed opportunities in this area of patient care?
Cognition
The diagnosis of medical conditions by providers requires a combination of experience, knowledge, and acumen. Critical thinking is at the core of this process and requires us to collect information, work through a problem, and reach a reasonable conclusion. Recent advances in cognitive psychology have provided a theoretical framework for understanding decision making. As far back as Plato, the mind was thought to be divided into two components: a rational, reasoning part and an emotional part, with reason in charge. More recently, the cerebral cortex was viewed as the computer, and the deeper limbic system, shared with animals, as the instinctive, emotional area. Cognitive theory now shows that these brain areas evolved in unison and that two systems really do work together in decision making.3
System I
The first diagnostic system, System I, involves pattern recognition and originates in the deeper part of the brain in dopamine-rich neurons. This part of the brain is an instinctual area that integrates life experiences, senses, emotions, and feelings and comes to an intuitive conclusion. Everyday life is full of such responses. Your choices regarding the color of your car or the person you marry are often gut-level decisions. Research has shown that this brain area is much more sophisticated than previously appreciated.
In our professional lives, many of our day-to-day diagnoses or therapeutic decisions are instinctual. As our experience as providers grows, diagnoses are based on the rich tapestry of patients we’ve seen, articles we’ve read, and stories we’ve heard. Who among us has not made the diagnosis of fibromyalgia in a new patient within the first minute of the history? This diagnosis is made with heuristics. Heuristics are cognitive shortcuts, or rules of thumb, that allow rapid decision making without formal analysis.
I’m reminded of the time my partner diagnosed a patient with Ehlers-Danlos syndrome. I asked him how he did it, and he said simply, “He looked like the last Ehlers-Danlos patient I saw.” This is also called thin slicing, a process in which we take a wide variety of clues and narrow them to the very few factors that matter. Thin slicing and heuristics allow us to move through a busy day of 20 patients and still practice competent medicine. “Hand synovitis plus a high anti-CCP means rheumatoid arthritis,” or, “That looks like a shingles rash to me.” However, the instinctual approach can be flawed and lead to error because it is by nature quick and does not take in all of the data.
TABLE 1: How Rheumatologists Think
- There is a dual system of diagnostic reasoning.
- Clinicians rely extensively on instincts and heuristics for a speedy diagnosis.
- When uncertainty exists, revisit the problem list and create a complete differential diagnosis.
- Anchoring bias is the most common reasoning error. When new information appears, reconsider the diagnosis. Avoid favoring the original diagnosis.
- Analyze your errors. Why did they occur?
System II
The other system of diagnostic reasoning, System II, is the systematic and analytical style that we associate with great diagnosticians.4 This system is knowledge based and allows us to come up with the complete differential diagnosis. The scientific description of this style is “hypothetico-deductive,” and it emanates from the cerebral cortex and the memory area of the brain. It is rational decision making at its best, thorough and detailed. System II is the hallmark of our subspecialty, as in, “Please see this patient in the unit. I can’t figure it out. They must have some kind of vasculitis.” In this context, this type of decision making involves a thorough problem list of all organs affected and a robust differential diagnosis. This style leads to more diagnostic testing but, when uncertainty exists, the analytic approach is the most reliable.
In our day-to-day professional life, clinicians move between the two styles of decision making (see Table 1). Efficient doctoring requires frequent use of heuristics and occasional reliance on System II reasoning. Often, one needs to invoke a combination of these styles: “Let’s do the vasculitis workup on that ICU patient, but let’s hold steroids for now.”

Cognitive biases, arising mainly in System I, can lead to error; at least 30 such biases have been identified that affect our clinical thinking (see Table 2). Examples include availability bias, as when you diagnose a new patient with parvovirus arthritis because you have seen two such patients recently. Framing bias could occur when a consult arrives preceded by, “Can you tap the knee of the drunk who’s on the third floor?” Such framing might lead to a less-than-thorough evaluation. Finally, there is the most common and troublesome bias: anchoring. Anchoring occurs when we lock in on our early diagnosis and ignore new information. Recently, a resident presented a case to me and diagnosed the patient with classic rheumatic fever. I agreed it was a good story, but the negative strep culture moved rheumatic fever down my differential. The resident had passed over that result because it did not fit with her instinctual diagnosis.
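Bayes’ rule makes the arithmetic behind this kind of update explicit. As a purely illustrative sketch, with numbers chosen for arithmetic rather than taken from the case or the literature: suppose the history gives rheumatic fever a pretest probability of 30%, and suppose the culture detects the underlying streptococcal infection with 90% sensitivity and 95% specificity. A negative result then revises the probability of disease to

\[
P(D \mid T^{-}) \;=\; \frac{P(T^{-}\mid D)\,P(D)}{P(T^{-}\mid D)\,P(D) + P(T^{-}\mid \neg D)\,P(\neg D)} \;=\; \frac{0.10 \times 0.30}{0.10 \times 0.30 + 0.95 \times 0.70} \;\approx\; 0.04.
\]

Under these assumed numbers, a 30% hunch falls to roughly 4% after the negative test. Anchoring is, in effect, a refusal to perform this update: the prior is kept and the new data are ignored.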
As providers, we need to be cognizant of the dual-process model of cognition that weaves throughout our day. We need to push ourselves toward rigorous System II thinking when the answer is not clear. Critical thinking occurs when we are aware of our biases, especially anchoring bias, and force ourselves to reconsider the diagnosis when new information surfaces. Understanding cognition will improve our work as providers and, hopefully, lead to safer and more effective care.
Dr. Boyle is associate professor of medicine at the University of Colorado Denver and Denver Health Medical Center.
References
1. Kohn LT, Corrigan JM, Donaldson MS, eds. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press; 1999.
2. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med. 2005;165:1493-1499.
3. Lehrer J. How We Decide. Houghton Mifflin Harcourt; 2009.
4. Croskerry P. Clinical cognition and diagnostic error: Applications of a dual process model of reasoning. Adv Health Sci Educ Theory Pract. 2009;14(Suppl 1):27-35.
TABLE 2: 30 Cognitive Errors
Aggregate bias: The tendency for physicians to believe that aggregated data, such as those used to develop clinical practice guidelines, do not apply to their own individual patients.
Anchoring: The tendency to rely too heavily on one trait or piece of information when making decisions.
Ascertainment bias: Occurs when a physician’s thinking is shaped by prior expectations, stereotypes, and biases.
Availability: The tendency to assign a probability to a disease according to vividness of memory.
Base rate neglect: The tendency to base judgments on specifics while ignoring general statistical information; a worked example follows this table.
Commission bias: The tendency toward action rather than inaction stemming from either overconfidence or perceived pressure and desperation to “do something.”
Diagnostic creep: Through the involvement of successive medical intermediaries, what might have started as a diagnostic possibility gathers momentum until it becomes definite and all other possibilities are excluded.
Attribution error: The tendency to be judgmental and blame patients for their illnesses (dispositional causes) rather than examine the responsible circumstances (situational factors).
Confirmation bias: The tendency to search for or interpret information in a way that confirms one’s preconceptions.
Omission bias: The tendency to judge harmful actions as worse than equally harmful omissions (inactions).
Order effects: The tendency to remember information presented at the beginning (primacy effect) or the end (recency effect) of an encounter better than the information in between.
Outcome bias: The tendency to judge a decision by its eventual outcome instead of on the quality of the decision at the time it was made.
Gambler’s fallacy: The tendency to think that future probabilities are altered by past events, when in reality they are unchanged.
Gender bias: The prejudice in action or treatment against a person on the basis of their sex.
Hindsight bias: The tendency to see past events as being predictable.
Multiple alternatives: A multiplicity of options on a differential diagnosis can create conflict and uncertainty. Reverting to a smaller subset with which the physician is familiar simplifies the process but may result in inadequate consideration of other possibilities.
Overconfidence effect: Excessive confidence in one’s own ability to answer questions, reflecting a tendency to act on incomplete information, intuitions, or hunches.
Psych-out error: The tendency to attribute serious medical conditions to psychiatric conditions.
Representativeness: Thoughts that are guided by a prototype so that possibilities that contradict the prototype are not considered, resulting in attribution of symptoms to the wrong cause.
Search satisficing: The tendency to stop searching for a diagnosis once something satisfactory is found.
Sutton’s slip: The diagnostic strategy of going for the obvious instead of sufficiently considering alternative possibilities.
Triage-cueing: Occurs when triage decisions made throughout the health care system, from patients’ self-triage to the referring physician’s selection of a specialist, ultimately shape patient diagnosis and care.
Unpacking principle: Providing a more detailed description of an event increases its judged probability.
Vertical line failure: Commonly known as “thinking inside the box,” an inflexible diagnostic approach that emphasizes economy, efficacy, and utility.
Zebra retreat: The tendency to back away from a rare diagnosis, even when it belongs on the differential, because of its unfamiliarity or perceived unlikelihood.
Playing the odds: The tendency in ambiguous presentations to opt for a benign diagnosis on the basis that it is significantly more likely than a serious one.
Posterior probability: Occurs when a physician’s estimate of the likelihood of disease is unduly influenced by what has gone on before for a particular patient.
Visceral bias: The influence of affective sources of error on decision making.
Yin-Yang out: The tendency to believe that nothing further can be done to reach a definitive diagnosis in a patient who has already been worked up exhaustively, so that the physician feels excused from further diagnostic effort.
Premature closure: The tendency to accept a diagnosis before it is fully verified, often on the basis of a single characteristic finding.
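Base rate neglect, flagged in the table above, is easiest to appreciate with arithmetic. As a purely illustrative sketch with assumed numbers, not drawn from the article: suppose a disease has a prevalence of 1%, and a test for it has 90% sensitivity and 95% specificity. The probability of disease after a positive test is

\[
P(D \mid T^{+}) \;=\; \frac{P(T^{+}\mid D)\,P(D)}{P(T^{+}\mid D)\,P(D) + P(T^{+}\mid \neg D)\,P(\neg D)} \;=\; \frac{0.90 \times 0.01}{0.90 \times 0.01 + 0.05 \times 0.99} \;\approx\; 0.15,
\]

so even a good test leaves roughly an 85% chance that the positive is false when the base rate is this low. Reasoning from the specifics of the positive result alone, while ignoring the 1% base rate, is precisely the error the table describes.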