The 1999 Institute of Medicine report To Err Is Human gave a sobering depiction of the magnitude and consequences of medical error.1 The report concluded that approximately 98,000 people die in hospitals annually due to preventable medical errors. Of all the errors detailed in this report, diagnostic errors have since been determined to be the most common and may occur in up to 15% of cases, based on autopsy data.2 It is therefore imperative that medical providers understand their role in diagnostic error.
Diagnostic Errors
Diagnostic errors are classified as system-based or cognitive errors. System-based errors are rooted in the complex structure of healthcare delivery. For example, a diagnosis may be delayed due to a protracted wait time for a magnetic resonance imaging (MRI) scan because the number of machines available may be insufficient to meet clinical demand.
In contrast to system-based errors, cognitive errors are related to medical decision making. Cognitive errors can be due to 1) lack of knowledge, 2) faulty data gathering or 3) incorrect synthesis of information. Cognitive errors contribute to the majority of missed and delayed diagnoses.2
Medical school education focuses primarily on improving knowledge and data gathering, but most cognitive errors are the result of the incorrect synthesis of information rather than the result of knowledge deficits.2 Consequently, rheumatic diseases are fraught with potential for cognitive errors because our diseases are rare, complex, lack definitive diagnostics and require the careful synthesizing of data from many sources.
Medical information is synthesized through two distinct methods: heuristic thinking (i.e., type 1 thinking) and analytic thinking (i.e., type 2 thinking).3 Heuristic thinking is the process by which an expert quickly attributes a set of medical information to a diagnosis via pattern recognition. Heuristic thinking is faster than analytic thinking, but also more error prone.
Analytic thinking is a consciously controlled effort to reason through a problem.3 This synthesis strategy is used when a medical provider methodically processes all diagnostic possibilities that could be associated with a patient’s medical history, exam findings, lab results and imaging studies. Analytic thinking entails developing a lengthy differential diagnosis and methodically considering each possibility. Although thorough and less likely to miss something important, type 2 thinking is time intensive and not feasible at all junctures.
It has been proposed that even though early learners need to rely primarily on analytic (i.e., type 2) thinking, more experienced clinicians can safely default to heuristic (i.e., type 1) thinking—provided they learn to recognize those moments when it may be critical to intentionally switch to analytic thinking to minimize the chance of a serious negative outcome due to misdiagnosis.4 Analytic thinking is most important when cognitive errors are most likely to occur (described in the next section) and when the stakes are greatest, such as with life-threatening illness or high-risk therapies.
No human can avoid cognitive errors entirely. However, reducing the number of errors we make in a given clinical scenario can prevent errors from aligning in a way that results in a misdiagnosis, as demonstrated by James Reason’s Swiss cheese model.5 In other words, if we have to synthesize medical information in five different ways in a clinic visit, we may be able to reach the correct diagnosis if we make only one or two cognitive errors, whereas we are more likely to make a misdiagnosis if we make additional cognitive errors in that same encounter.
Although cognitive errors account for only some of the holes in the Swiss cheese model, they are the holes over which providers have the greatest influence—the aspect of care that individual providers can mitigate through debiasing strategies (described in the final section).5 Indeed, studies into the impact of teaching trainees about cognitive errors have shown that when medical providers learn about the different types of cognitive errors, they are less likely to make these errors and more likely to reach the correct diagnosis.6,7 Toward that end, in Table 1, we describe some of the most common cognitive errors, providing rheumatology-based clinical examples.
Table 1: Examples of Common Cognitive Biases in Rheumatology
Cognitive Bias | Description | Example |
---|---|---|
Search Satisficing | Calling off the search once a diagnosis is found | A patient comes to see you for polyarthritis. Due to a strong family history of rheumatoid arthritis (RA), you request a test for rheumatoid factor, which is positive, and correctly diagnose RA. However, satisfied that you have a diagnosis, you miss the tophi on his ears indicating he also has gout, which accounts for the majority of his symptoms. |
Framing Effect | When the initial description of the case strongly shapes your thinking | You are consulted for management of a gout flare in a patient with a history of gout. Treatment for gout is initiated but the patient fails to improve. Joint aspiration later reveals septic arthritis. |
Diagnosis Momentum | When a diagnosis insidiously becomes a label | A patient with ocular pseudotumor from “suspected IgG4-related disease” is admitted for respiratory failure, attributed to community-acquired pneumonia. The history and physical exams state he “has pseudotumor secondary to IgG4-RD” (the word suspected being omitted) and no further workup is sent. Granulomatosis with polyangiitis is missed because the pseudotumor was labeled as IgG4-RD without further investigation. |
Anchoring Bias | Locking on to certain features while ignoring others | A patient admitted to the hospital for pneumonia is found to have palpable purpura and a creatinine of 2.5 mg/dL. The consulting rheumatologist recalls palpable purpura as a classic sign of IgA vasculitis, elicits a history of knee pain, and diagnoses the 75-year-old patient with IgA vasculitis, as opposed to the actual senile purpura combined with chronic kidney disease (CKD) and osteoarthritis. |
Confirmation Bias | Looking for evidence to support rather than refute a diagnosis | A patient with CKD and a history of gout reports episodic swelling in the small joints of the hands. His serum uric acid is 7 mg/dL, and his joint symptoms are attributed to gout. Parathyroid hormone is not checked, and hyperparathyroidism is missed. |
Psych-Out Bias | Attributing all symptoms to a known psychiatric diagnosis | A patient with panic attacks is diagnosed with generalized anxiety disorder after reporting persistent dyspnea and palpitations. A year later she is brought to the emergency department via emergency medical services after a witnessed syncopal episode. Computed tomography angiography reveals an enlarged pulmonary artery. Right heart catheterization confirms severe, long-standing pulmonary hypertension. |
Premature Closure | When a diagnosis is made, the thinking stops and new information is neglected | A patient with a history of giant cell arteritis develops monocular vision loss and left-sided weakness. Pulse-dose steroids are initiated for presumed temporal arteritis, but the patient develops encephalopathy and clinically deteriorates. MRI shows embolic stroke and transthoracic echocardiogram confirms infective endocarditis. |
Hassle Bias | Justifying inaction due to the hassle of action | A patient with a history of CKD from uncontrolled diabetes and gout comes to the clinic for right knee pain and swelling. Joint aspiration is deferred during a busy clinic day, and symptoms are attributed to a gout flare. However, the pain and swelling subsequently don’t improve with steroids. The patient returns to clinic and mentions he has been falling since developing peripheral neuropathy. Aspiration is performed and reveals a bloody effusion. Diagnosis of an intra-articular fracture is delayed. |
Posterior Probability | Overestimating the likelihood of a diagnosis due to the patient’s past medical history | A patient with a diagnosis of IgA vasculitis reports multiple flares of joint pain and effusions despite resolution of his purpura with conservative therapies. He undergoes several courses of steroid tapers, which temporarily resolve musculoskeletal symptoms. A year later, joint aspiration is performed during a flare and reveals gout. |
Availability Bias | The prevalence of a diagnosis is inflated by the ease with which the diagnosis was made | A patient with a history of tobacco use is diagnosed with thromboangiitis obliterans after presenting with ischemic digits. ANCA is not sent and granulomatosis with polyangiitis is missed. |
Commission Bias | Medical errors made due to over-action, inspired by a sense of beneficence on the part of the medical provider | A lupus patient presents with a creatinine of 6.3 and is found on biopsy to have lupus nephritis Class VI. The doctor starts to explain the plan to get the patient set up for dialysis, but she is devastated by the news. Influenced by the patient’s reaction, the provider reconsiders and decides it might be worth trying six months of intravenous cyclophosphamide. |
Omission Bias | Medical errors made due to inaction in an attempt to follow the principle of nonmaleficence, doing no harm | A patient with anti-MDA5 dermatomyositis is intubated in the ICU, declining despite maximal oxygen and pressure support. Bronchoscopy is deemed unsafe on current ventilator requirements. Pulse-dose steroids are deferred as infection cannot be definitively ruled out with bronchoalveolar lavage. |
“Anything-Could-Be-Rheumatic” Bias | When a provider habitually overestimates the likelihood of patients’ symptoms being explained by a rheumatic diagnosis | A patient with a history of psoriasis and chronic musculoskeletal pain but no evidence of inflammatory arthritis on exam continues to follow with rheumatology for years due to “possible psoriatic arthritis.” |
“Not-It” Bias | When a provider habitually underestimates patients’ symptoms being due to rheumatic disease | A patient with two months of joint pain and frank synovitis on exam is diagnosed with a viral arthritis and follow-up rheum care not arranged, justified based on the patient endorsing occasional rhinorrhea amid allergy season. |
When Cognitive Errors Occur
One trick for minimizing cognitive errors is recognizing when specific errors are most likely to occur. For example, availability bias is most likely to occur soon after seeing an unusual case or learning new information about a diagnosis. If ever you find yourself saying, “What a coincidence. I just learned about this disease last week, and here I am seeing it for the first time,” that’s a time to pause and consider the possible influence of availability bias.
Similarly, when working with a patient with a known psychiatric history, considering the possibility of psych-out bias is important. Virtually any reported symptom can be attributed to a psychiatric diagnosis. For example, fatigue can be attributed to depression, and shortness of breath can be attributed to anxiety. Indeed, although depression is a common cause of fatigue, depressed patients also develop lupus and vasculitis, for which fatigue may be a predominant presenting symptom.
Tragically, psych-out bias carries considerable consequences: Patients with severe mental health disorders die, on average, 10–20 years earlier than the general population.8 It is now recognized that the majority of these deaths are secondary to physical illness, with estimates suggesting that only 25% of these patients received the correct diagnosis prior to death.8-10
In addition to availability and psych-out biases, providers need to be cautious when considering rare or exciting diagnoses. When a zebra diagnosis is on the differential, overattachment to an interesting diagnosis can lead a clinician to commit anchoring bias (see Table 2). The zebra diagnosis may be less likely than more banal diagnoses on the differential, but the desire to “find a Whipple’s disease” can result in failure to consider pieces of data that don’t fit that diagnosis, anchoring instead on the aspects that do.
Table 2: Conditions Leading to Bias
Condition | Associated Biases |
---|---|
Over-attachment to a diagnosis | • Anchoring • Confirmation Bias • Premature Closure |
Inheriting others’ thinking | • Diagnostic Momentum • Framing Effect |
Failure to consider an alternative or secondary diagnosis | • Search Satisficing |
Provider attributes | • Hassle Bias • Commission/Omission Bias • Rheum/Not-It Bias |
Likelihood misperception | • Availability Bias • Posterior Probability Error |
Affective biases | • Psych-out Error • Like/Dislike a Patient |
Similarly, when a rheumatologist makes a diagnosis due to a pathognomonic finding, anchoring bias may play a role, and the provider would do well to remember that no finding has 100% specificity, making the term pathognomonic something of a misnomer. For example, although palpable purpura is a common finding in IgA vasculitis, it can also occur in other conditions, and rheumatologists should never make the error of believing that palpable purpura equals IgA vasculitis—or that any other finding equals the associated diagnosis.
Clinicians should also be aware of environmental situations that increase the risk of making medical errors. Often, in these scenarios, providers have limited time with a high volume of patients. In such instances, providers may rely too heavily on type 1 thinking and lack the time or energy to toggle into and out of type 2 thinking as needed. This leaves providers especially vulnerable to bias on the most demanding days. On these days, it is all too easy to place undue emphasis on a patient’s previous medical history and assume current symptoms must be related to a previous diagnosis (i.e., posterior probability error).
In busy scenarios, providers may only have time to thoroughly focus on certain clinical features (i.e., anchoring bias), use diagnostic testing to support rather than refute initial reasoning (i.e., confirmation bias), and then cease reasoning once testing seemingly confirms suspicions (i.e., premature closure). Importantly, time constraints may also dissuade providers from performing additional testing or procedures that might be labor intensive, such as coordinating a muscle biopsy (i.e., hassle bias). Unfortunately, biases are additive, and the correct diagnosis is easily obscured by layers of bias.
How Cognitive Errors Unfold
Awareness of the many described cognitive biases—of which only a small set are highlighted in this article—is an important first step, but providers also need to learn to recognize bias in clinical scenarios. With practice, it becomes possible to recognize biases at play in real time.
Case 1
A 50-year-old homeless man with paranoid schizophrenia and gout is admitted with generalized anasarca from newly diagnosed heart failure. He undergoes aggressive diuresis, but the swelling of his left wrist and hand does not improve. The patient is also noted to have significant pain and limited range of motion of the left wrist. He reports to multiple providers that the pain and swelling of his left wrist started several months prior to admission, while he was making a 50-mile trip using his motorized scooter. This history is attributed to delusional thinking in the setting of undertreated schizophrenia.
A rheumatologist is consulted for management of gout and, after an X-ray to rule out calcium pyrophosphate dihydrate crystal deposition disease, prescribes colchicine without performing arthrocentesis. The patient’s pain, however, persists for weeks.
Ultimately, an MRI of the wrist is performed, revealing diffuse rice bodies in the flexor tendons. Eventually, the patient agrees to debridement of the wrist, and cultures reveal Mycobacterium tuberculosis.
Diagnosis: M. tuberculosis tenosynovitis
Cognitive errors:
- Psych-out bias: attributing the chronic nature of his symptoms to delusion;
- Posterior probability: neglecting to consider a full differential diagnosis due to a history of gout; and
- Framing bias: the rheumatologist’s thinking is influenced by being consulted for “gout management.”
Case 2
A 56-year-old woman with tophaceous gout and years of stable low-back pain develops an acute right-sided foot drop and alerts her rheumatology providers to this change in symptoms because she is concerned this is a complication of her gout. The patient lives four hours away from her rheumatologist and requests that testing be minimized due to transportation challenges. Her providers recommend MRI, which reveals a right-sided disc bulge at L4–L5.
Her foot drop is attributed to L5 nerve root involvement of her sciatica and a neurosurgeon operates to relieve the compression. Within one week of surgery, however, she develops bilateral ischemic toes, diffuse palpable purpura and acute renal failure. Renal biopsy reveals crescentic necrotizing glomerulonephritis with granulomatous inflammation, consistent with granulomatosis with polyangiitis.
Diagnosis: Foot drop due to mononeuritis from granulomatosis with polyangiitis
Cognitive errors:
- Search satisficing: no other evaluation was performed once the MRI found evidence of disc bulge, even though most disc bulges do not cause nerve compression;
- Hassle bias: due to the difficulty of arranging an urgent outpatient nerve conduction study, this test was not ordered; and
- Confirmation bias: providers looked only for MRI verification of mechanical lumbago, rather than ordering additional studies to look for other possible causes of new neurologic symptoms.
Case 3
A 40-year-old woman presents with daily, early-morning fevers, arthralgias, macular rash, pharyngitis and leukocytosis, despite four weeks of broad antimicrobial therapy. A rheumatologist is consulted to evaluate a fever of unknown origin, but signs off after the initial evaluation, suggesting an infectious etiology. A broad infectious and malignant evaluation is unrevealing, prompting a second consultation with a rheumatologist.
A different team immediately begins evaluation for—and ultimately makes a diagnosis of—adult-onset Still’s disease, without the patient developing any symptoms beyond those with which she initially presented.
Diagnosis: Adult-onset Still’s disease
Cognitive errors:
- Availability bias: the probability of an infectious cause was inflated by a recent fever of unknown origin caused by infection;
- Hassle bias: on a busy service, providers face an insidious pressure to sign off on cases, sometimes prematurely; and
- Framing bias: if a busy, tired trainee on the consulting service suspects a condition is not rheumatic, due to the biases listed above, the manner in which the case is presented to the attending physician is likely to influence the attending’s clinical reasoning.
Case 4
A 21-year-old woman is diagnosed with biopsy-confirmed, toxin-induced myopathy caused by an over-the-counter herbal supplement. Her creatine kinase (CK) levels and strength improve with cessation of the supplement. However, her strength does not return to baseline after four months; her CK improves from 50,000 U/L to 1,000 U/L, but never normalizes.
Her providers recognize the potential for posterior probability bias leading to ongoing attribution of persistent CK elevation to the toxin-induced myopathy. Therefore, they take a step back and reinitiate a broad evaluation for myopathy, inclusive of thyroid function studies. The patient’s thyroid-stimulating hormone (TSH) is found to be 128 µIU/L, and her thyroxine (T4) level is undetectable. Anti-thyroid peroxidase antibody is strongly positive. She is diagnosed with autoimmune thyroiditis, and with initiation of levothyroxine, her labs normalize and her strength returns.
Diagnosis: Hypothyroid-induced myopathy
Debiasing Strategies (see Table 3):
- Metacognition: appreciation of the potential for posterior probability;
- Recognition of unexplained or unresolved symptoms: ongoing weakness despite treatment (toxin avoidance); and
- Application of a specific forcing strategy: evaluating thyroid function in all myopathy patients.
As shown in this last case, multiple strategies can be employed to recognize and prevent cognitive errors, a process referred to as metacognition. At its core, metacognition is an awareness of one’s own clinical reasoning and the potential ways error can be introduced in the process of reaching a diagnosis.11 Acknowledging fallibility, considering multiple perspectives and allowing for self-critique are some of the strategies providers can use to increase metacognition. Metacognition can enable a provider to recognize that a hectic clinical scenario or a patient with a known psychiatric diagnosis increases the chance of cognitive errors, prompting the provider to use debiasing strategies.
Table 3: Debiasing Strategies
Situation | Debiasing Strategy |
---|---|
Transitions in care • New consults • Receiving handoff • Giving handoff or writing notes | Re-examine the workup to date and search for any possible gaps or assumptions. Question if the diagnosis is in fact a “working diagnosis” or if the diagnosis has been confirmed. Determine what ambiguity needs to be emphasized. Choose language that accurately indicates what is known versus what is suspected. |
Hectic situations | Ask what the next most likely diagnosis would be if the initial diagnosis is ruled out. |
Frustrating patients | Take an extra thorough history/review of systems, be thorough in formulating a differential, and take precautions to avoid confirmation bias, premature closure and hassle bias. |
Debiasing Strategies
Forcing strategies enable a provider to insert predetermined best practices into a scenario in which biases may lead the provider away from best practices. Generic forcing strategies are generalized questions providers can ask themselves while building a differential diagnosis, such as: What is the second-most-likely diagnosis? What diagnosis cannot afford to be missed? What sort of cognitive error would be most likely to occur in this setting?12
By contrast, specific forcing strategies address a common diagnostic crossroads, such as using a predefined set of diagnostic tests for a specific chief complaint. For example, a rapid response team may always order troponins, electrocardiogram and chest X-ray for all decompensating adults with chest pain. This is a cost-effective evaluation that can alert the provider to ischemic abnormalities when the provider may not have myocardial infarction high on their differential.
Along these lines, a rheumatologist may always check thyroid labs in patients being evaluated for possible myositis; although most patients won’t have a thyroid-related myopathy, uniformly ordering this inexpensive test nearly eliminates the chance of missing a patient who does have myopathy due to thyroid disease.
Metacognition and cognitive forcing strategies can be taught to trainees through a cognitive apprenticeship, in which experienced clinicians guide learners through their own cognitive processes.13 For example, when working with trainees, medical educators should consider taking time to point out clinical situations in which cognitive errors may occur or have occurred in the past.
Next, it’s helpful to share cognitive forcing strategies, such as when an attending physician explains, “Gout is a great mimicker of septic arthritis, and vice versa, so whenever I consider one diagnosis, I make sure to consider the other as well.”
In addition, questions from a learner or from a patient/family member can be recognized as wonderful opportunities to insert a moment of type 2—analytic—thinking into a clinical scenario, which is itself a debiasing strategy. Embrace the question, “Why do I think this isn’t anti-neutrophil cytoplasmic antibody (ANCA) associated vasculitis?” and in laying out your explanation, you may find yourself concluding that although it is not a typical ANCA presentation, it would be reasonable to order an ANCA test to rule out an atypical presentation.
Finally, although good teachers push trainees to make a commitment and state their favored diagnosis, educators should also teach trainees when and how ambiguity can be helpful. Along these lines, we should encourage trainees to ask themselves not just “What diagnosis do I think is most likely?” but also “What is the second most likely diagnosis?” and “What other diagnosis can we not afford to miss in this patient?”
A method for framing these important clinical considerations for trainees is the mnemonic “Whenever U RACE, tie your LACES” proposed by Gordon and Kemnitz (see Table 4).14
Table 4: Mnemonic for Medical Error Prevention
Whenever U RACE, Tie Your LACES14 | Considerations |
---|---|
Unexplained symptoms | Are there symptoms or lab values that do not fit the current working diagnosis? |
Return visit | Arrange for close follow-up to re-assess the patient’s clinical status and diagnosis |
At-risk population | Is this patient part of a demographic at risk for cognitive bias? (mental illness)
Critical Conditions | If the patient is in critical condition, what time-sensitive diagnosis must be considered? (aortic dissection, subarachnoid hemorrhage) |
End of shift | Recognize provider fatigue. Recognize potential errors in sign-out |
Life-threatening conditions | What conditions cannot be missed? (myocardial infarction, sepsis) |
Anything else? | What are other possible explanations for the patient’s presentation? |
Coherency? | Does the patient’s history and objective data fit with the current diagnosis? |
Everything explained? | Does the diagnosis explain ALL symptoms, lab results and imaging findings? |
Second problem present? | Could two diagnoses better explain the patient’s presentation than one diagnosis? |
Practice Makes Perfect
Much like any other skill in medicine, learning to recognize and minimize cognitive errors takes practice. Invariably, the first step entails acknowledging one’s vulnerability to cognitive biases. Next steps involve being aware of 1) the situations in which cognitive errors are most likely to occur and 2) the types of biases to which you, as an individual, are most prone. Then we must develop and practice debiasing strategies, which should be shared with trainees as we role model mitigating our own biases.
In Sum
By admitting our susceptibility to bias, practicing cognitive forcing techniques, being introspective when we are especially predisposed to making cognitive errors and teaching our trainees to engage us daily in conversations about biases, we can learn to identify biases in real time, providing the opportunity to compensate for those biases and avoid a misdiagnosis. Mastering cognitive errors provides us the opportunity to become the master diagnosticians we all strive to be.
Megan Milne, MD, is a second-year rheumatology fellow at Duke University, Durham, N.C. She completed her medical education at the University of Pittsburgh School of Medicine and her internal medicine residency training at University of Texas Southwestern Medical Center, Dallas.
Rebecca E. Sadun, MD, PhD, is an adult and pediatric rheumatologist at Duke University School of Medicine, Durham, N.C., with an interest in how cognitive biases impact diagnostic reasoning in rheumatology.
References
- Institute of Medicine (US) Committee on Quality of Health Care in America. To err is human: Building a safer health system. Kohn LT, Corrigan JM, Donaldson MS, editors. Washington, D.C.: National Academies Press; 2000.
- Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med. 2005 Jul;165(13):1493–1499.
- Klaczynski PA. Analytic and heuristic processing influences on adolescent reasoning and decision-making. Child Dev. 2001 May–Jun;72(3):844–861.
- Croskerry P. Adaptive expertise in medical decision making. Med Teach. 2018 Aug;40(8):803–808.
- Reason J. Human error: Models and management. BMJ. 2000 Mar 18;320(7237):768–770.
- Reilly JB, Ogdie AR, Von Feldt JM, Myers JS. Teaching about how doctors think: A longitudinal curriculum in cognitive bias and diagnostic error for residents. BMJ Qual Saf. 2013 Dec;22(12):1044–1050.
- Logan AA, Rao M, Cornia PB, et al. Virtual interactive case-based education (VICE): A conference for deliberate practice of diagnostic reasoning. MedEdPORTAL. 2021 May 19;17:11159.
- Liu NH, Daumit GL, Dua T, et al. Excess mortality in persons with severe mental disorders: A multilevel intervention framework and priorities for clinical practice, policy and research agendas. World Psychiatry. 2017 Feb;16(1):30–40.
- Walker ER, McGee RE, Druss BG. Mortality in mental disorders and global disease burden implications: A systematic review and meta-analysis. JAMA Psychiatry. 2015 Apr;72(4):334–341.
- Correll CU, Solmi M, Veronese N, et al. Prevalence, incidence and mortality from cardiovascular disease in patients with pooled and specific severe mental illness: A large-scale meta-analysis of 3,211,768 patients and 113,383,368 controls. World Psychiatry. 2017 Jun;16(2):163–180.
- Croskerry P. Cognitive forcing strategies in clinical decisionmaking. Ann Emerg Med. 2003 Jan;41(1):110–120.
- Croskerry P. Achieving quality in clinical decision making: Cognitive strategies and detection of bias. Acad Emerg Med. 2002 Nov;9(11):1184–1204.
- Medina MS, Castleberry AN, Persky AM. Strategies for improving learner metacognition in health professional education. Am J Pharm Educ. 2017 May;81(4):78.
- Gordon DC, Kemnitz M. Cognitive errors in emergency medicine. Crit Dec Emerg Med. 2013;27(12):11–18.