In our day-to-day professional lives, clinicians move between the two styles of decision making (see Table 1). Efficient doctoring requires frequent use of heuristics and occasional reliance on System II reasoning. Often, we need to invoke a combination of these styles: “Let’s do the vasculitis workup on that ICU patient, but let’s hold steroids for now.”

Cognitive biases, arising mainly in System I, can lead to error. At least 30 such biases have been identified that affect our clinical thinking (see Table 2). Availability bias occurs, for example, when you diagnose a new patient with parvovirus arthritis because you’ve seen two such patients recently. Framing bias can occur when a consult arrives preceded by, “Can you tap the knee of the drunk who’s on the third floor?” That framing can lead us to a less than thorough evaluation. Finally, there is the most common and troublesome bias: anchoring. Anchoring occurs when we lock in on an early diagnosis and ignore new information. Recently, a resident presented a case to me and diagnosed the patient with classic rheumatic fever. I agreed it was a good story, but the negative strep culture moved rheumatic fever down my differential. The resident had passed over that result because it did not fit with her instinctual diagnosis.
As providers, we need to be cognizant of the dual process model of cognition that weaves throughout our day. We need to push ourselves toward rigorous, System II thinking when the answer is not clear. Critical thinking occurs when one is aware of one’s biases, especially anchoring, and forces reconsideration of the diagnosis when new information surfaces. Understanding cognition will improve our work as providers and, one hopes, lead to safer and more effective care.
Dr. Boyle is associate professor of medicine at the University of Colorado Denver and Denver Health Medical Center.
TABLE 2: 30 Cognitive Errors
Aggregate bias: The tendency for physicians to believe that aggregated data, such as those used to develop clinical practice guidelines, do not apply to their own individual patients.