How Cognitive Biases Shape Decisions in Emergency Medicine

Slow down • Be aware of base rates for your differentials • Consider what data is truly relevant • Actively seek alternative diagnoses • Ask questions to disprove your hypothesis [1] 

In a high-pressure environment where multiple high-stakes clinical decisions are made on limited information, the influence of cognitive biases and heuristic thinking on emergency medical practice is self-evident. Is the clinician requesting a chest X-ray on a young patient with reproducible chest pain because they recently missed a pneumothorax? Should a patient with an inconsistent neurological examination, suggestive of a possible functional disorder, still be exposed to a significant dose of radiation for a CT head simply because they meet the criteria for scanning? Does the last patient of our shift always get the same standard of care as the first?

The mechanics of how we think

Cognitive biases arise from heuristics: the mental shortcuts, or ‘rules of thumb’, used by the subconscious in decision making. Their role in guiding clinical practice is significant and aspects of their mechanisms are well described. Cognitive biases are thought to contribute to approximately 75% of diagnostic errors in internal medicine [2], and were cited in 30% of reports written by emergency clinicians reflecting on medical errors they had been involved in [3].

An extensive number of cognitive biases are at play in the emergency department, many of which may be familiar to readers (Table 1). Our attempts at objective information acquisition, processing and decision making in emergency medicine can be derailed without our even realising it.

| Bias | Description | Example |
| --- | --- | --- |
| Availability bias | Preferentially favouring more recent and readily available answers or solutions because of their ease of recall and incorrectly perceived importance. | A recent missed pulmonary embolism prompts excessive CT pulmonary angiography in low-risk patients. |
| Base rate neglect | Ignoring the underlying incidence rates of conditions, or population-based knowledge, as if they do not apply to the individual patient. | A positive exercise stress test in a young woman prompting an angiogram. The ‘base rate’ in this population is so low that the result is more likely to be a false positive than a true positive. |
| Confirmation bias | Interpreting information gained during a consultation to fit a preconceived diagnosis, rather than the converse. | Suspecting the patient has an infection and taking the raised white cells as proof, rather than asking ‘I wonder why the white cells are raised; what other findings are there?’ |
| Conjunction rule | The incorrect belief that the probability of multiple events all being true is greater than that of a single event (relates to Occam’s razor). | A confused patient with hypoxia and deranged renal function is far more likely simply to have pneumonia than to have a subdural haematoma, a pulmonary embolism and obstruction simultaneously. |
| Overconfidence | An inflated opinion of one’s diagnostic ability leading to subsequent error; confidence in judgements does not align with accuracy. | A doctor trusting their assessment more than they should; particularly problematic with inaccurate examinations, such as auscultation for pneumonia. |
| Representativeness | Judging the likelihood of a diagnosis by how closely the patient resembles a typical case, allowing irrelevant individual details to distract. | A man with classic symptoms of a heart attack who is also anxious and whose breath smells of alcohol. The latter details have no bearing on the likelihood of a heart attack, nor do they alter the degree to which he is a member of his risk demographic, but they distract and reduce diagnostic pick-up. |
| Search satisficing | Ceasing to look for further information or alternative answers once the first plausible solution is found. | Treating an acutely dyspnoeic patient’s obvious pneumonia and stopping investigations at that point, failing to search for and recognise a secondary myocardial infarction. |
| Diagnostic momentum | Continuing a clinical course of action instigated by previous clinicians without considering the available information and changing the plan if required. | Fixating on a previously assigned label of ‘possible pulmonary embolism’ and organising CT imaging despite subsequent results that suggest otherwise (e.g. positive blood cultures the following day). |
| The framing effect | Reacting to a particular choice differently depending on how the information is presented. | A pharmaceutical company presents new drug A as having a 95% cure rate and suggests this is superior to drug B, which has a ‘significant’ 2.5% failure rate; reframed as cure rates, drug B (97.5%) is in fact the better drug. |
| Commission bias | A tendency towards action rather than inaction (the opposite of omission bias). | Historical transfusion targets in gastrointestinal bleeds: the traditional approach was to aim for higher targets rather than do nothing, a ‘better safe than sorry’ mentality of artificially raising the haemoglobin ‘just in case’. |

Table 1. Summary of biases in clinical medicine. Adapted from O’Sullivan ED, Schofield SJ. Cognitive bias in clinical medicine. Journal of the Royal College of Physicians of Edinburgh. 2018 Sep;48(3):225-32 [4]. 

Type 1 vs Type 2 thinking in the emergency department

Daniel Kahneman proposed a dual-process theory of cognition in which we rely heavily on Type 1 thinking, a fast, intuitive, pattern-based system that requires little cognitive effort but is vulnerable to bias, and engage less frequently in Type 2 thinking, a slower, more methodical and analytical system with a higher cognitive burden [5]. Both forms of cognition have been demonstrated in a clinical setting using functional MRI of experienced neurologists asked to diagnose both routine and complex cases [6]. Non-analytical (Type 1) reasoning has an important role in emergency medicine: diagnosing routine cases and acting on a clinical ‘gut feeling’ of how unwell a patient is. Type 2 thinking allows a clinician to move out of a pattern-recognition mindset, critically appraise a situation and engage in complex problem solving. However, the emergency department does not always offer the time or the environmental conditions for it to take effect.

Bayesian Inference

Thomas Bayes was an 18th-century Presbyterian minister who took a keen interest in probability towards the end of his life. His description of how probabilities are modified in the light of new information, known as Bayes’ theorem [7], maps onto clinician behaviour and demonstrates the opportunities for cognitive bias to affect our decision making. An emergency clinician weighs up a ‘pre-test’ probability of a patient having a condition, shaped by their immediate clinical assessment, prior education and experience, and awareness of the underlying incidence rate (base rate) of the condition in the wider population. The clinician then modifies this pre-test probability according to the outcomes of their investigations, an often informal and iterative process, to arrive at a ‘post-test’ probability that the patient actually has the condition. This approach of adjusting probabilities is known as Bayesian inference. A classic example in the emergency setting is arranging a D-dimer.
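
To make the arithmetic concrete, here is a minimal sketch of Bayesian updating using likelihood ratios. The numbers (a 15% pre-test probability and a test with 95% sensitivity and 40% specificity) are purely illustrative and not drawn from any guideline:

```python
def post_test_probability(pre_test_prob: float, sensitivity: float,
                          specificity: float, test_positive: bool) -> float:
    """Update a pre-test probability with a test result (Bayes' theorem).

    Works in odds space: posterior odds = prior odds x likelihood ratio.
    """
    # Likelihood ratios follow from the test's operating characteristics.
    lr_positive = sensitivity / (1 - specificity)
    lr_negative = (1 - sensitivity) / specificity

    pre_test_odds = pre_test_prob / (1 - pre_test_prob)
    post_test_odds = pre_test_odds * (lr_positive if test_positive else lr_negative)
    return post_test_odds / (1 + post_test_odds)

# Illustrative numbers only: 15% pre-test probability of VTE, and a
# hypothetical assay with 95% sensitivity and 40% specificity.
print(post_test_probability(0.15, 0.95, 0.40, test_positive=True))   # ~0.22
print(post_test_probability(0.15, 0.95, 0.40, test_positive=False))  # ~0.02
```

Note the asymmetry: a positive result only nudges the probability upwards, while a negative result drives it towards exclusion. This is exactly what makes a sensitive but non-specific test useful as a rule-out rather than a rule-in, and it is the behaviour of the D-dimer discussed below.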

D-dimer is the name of a family of fibrin fragments that circulate following a thrombotic event or abnormal activation of the clotting process. In emergency medicine it is used principally in the assessment of suspected venous thromboembolism (VTE), as well as in identifying aortic dissection and disseminated intravascular coagulation (DIC). D-dimer has a low specificity, meaning many patients who do not have a venous thromboembolic event will have a raised D-dimer due to infection, older age, pregnancy, chronic inflammatory disease, reduced renal clearance, malignancy, or co-morbid hepatic and cardiac disease [8].

Assessment of patients with suspected VTE involves combining symptoms and clinical findings with known predisposing factors to assign patients to categories of pre-test clinical probability using a Wells score. Low-risk patients can be excluded from further investigation using the pulmonary embolism rule-out criteria (PERC). Moderate- and higher-risk patients are investigated further using a combination of D-dimer and/or imaging (CTPA or V/Q scanning for suspected pulmonary embolism, Doppler ultrasound for suspected deep vein thrombosis). Despite guidelines designed to risk-stratify patients, a significant number of PERC-negative patients still undergo investigation in the emergency department despite the low diagnostic yield of doing so. In one study of 3,024 patients presenting to the emergency department with chest pain and/or dyspnoea, 17.5% were PERC negative; 25.5% of these PERC-negative patients underwent further testing, yet only two were ultimately diagnosed with an acute pulmonary embolism [9].

Overinvestigation of VTE, especially with CTPA, can prolong admission, contributes to cumulative lifetime radiation exposure and puts patients at increased risk of cancer, renal impairment and contrast-induced allergic reactions. The issue clinicians face is that once a D-dimer is positive, confirmatory imaging is usually required despite the known high false-positive rate of the initial test. Some of the latest strategies designed to reduce overinvestigation include adjusting the cut-off for a significant D-dimer result according to the clinical probability of VTE, renal function and age [10-12]. The D-dimer dilemma shows how the nature of Bayesian inference in the emergency department can dramatically affect a patient’s trajectory from acute presentation.
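
As a concrete illustration of one such adjustment, the widely described age-adjusted rule replaces the conventional fixed 500 µg/L (FEU) cut-off with age × 10 µg/L for patients over 50. A minimal sketch, assuming FEU units; local assays, units and thresholds vary, so this is illustrative only:

```python
def d_dimer_cutoff_ug_per_l(age_years: int) -> int:
    """Age-adjusted D-dimer cut-off in ug/L FEU.

    Widely described rule: patients over 50 use age x 10 ug/L in place of
    the conventional fixed 500 ug/L threshold. Assumes FEU units; local
    laboratory conventions must be checked before any real use.
    """
    return age_years * 10 if age_years > 50 else 500

assert d_dimer_cutoff_ug_per_l(40) == 500   # fixed threshold at 50 and below
assert d_dimer_cutoff_ug_per_l(78) == 780   # 78 x 10 ug/L for a 78-year-old
```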

How the emergency department environment makes clinicians vulnerable to bias

It is worth considering the factors that perpetuate biased clinical behaviours, as well as the limitations of strategies designed to combat them. Many systemic factors in the emergency department leave clinicians exposed to cognitive biases by keeping them in a Type 1 mode of thinking. Emergency departments are full of interruptions to workflow, which erode productivity and focus and force the use of mental shortcuts to recover lost time [13]. Reviewing triage notes and previous clinical assessments before seeing the patient can result in anchoring bias. Clinicians in the emergency department are often pressured to make decisions by factors beyond the patient in front of them: awareness of the number of patients still waiting to be seen, pressure to meet the four-hour target for a patient to be admitted, transferred or discharged, and knowledge of bed-space issues in the wider hospital. Unless clinicians intentionally follow up a patient they have seen, there is often no formal feedback on the consequences of, and role of cognitive bias in, their decision making. Poorly designed rotas with periods of intense shift work can lead to decision fatigue, a tendency to default to the mode of thinking with the lighter cognitive burden [14]. It is common practice to take on the easiest and most straightforward case as the final patient of a shift, in recognition of the increasing difficulty of complex clinical decision making. Feeling unable to question busy and unapproachable senior decision makers is well documented as part of the ‘human factors’ approach to medical errors [15].

Addressing cognitive biases

Many of our cognitive biases may be unavoidable products of evolution. We tend to recognise biases in others while having limited awareness of our own (blind-spot bias); doctors who rate themselves as excellent, unbiased decision makers often, ironically, score poorly in bias testing [16, 17]. Numerous negative studies show that teaching critical thinking has minimal effect on behaviour [18, 19]. Intentionally slowing down cognition can enable a shift to Type 2 thinking and reduce clinical errors, and metacognition (awareness of one’s own thought processes) can be beneficial, especially around confidence in a particular decision [20]; both of these moderately successful strategies, however, are difficult to implement in a chaotic clinical environment. There is an old adage that the smartest person in the room is the room. On this principle, bootstrapping, a technique that combines multiple estimates, can improve accuracy, especially in quantitative assessments such as estimating a patient’s weight [21], as sketched below. Checklists, now commonplace in many aspects of clinical practice, can help redirect the clinician to important considerations such as negative findings [22]. However, checklists can also be clunky and excessive, obstruct the flow of clinical reasoning, and leave the clinician feeling like a decision-making robot in service of a pervading culture of defensive medicine.
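
As a minimal sketch of the bootstrapping idea in this quantitative sense (the weight figures are hypothetical), the point is simply that averaging several independent estimates lets individual errors partially cancel:

```python
from statistics import mean

# Hypothetical independent estimates of a patient's weight in kg, e.g. from
# different team members (or from one clinician deliberately re-estimating
# under different assumptions, as in dialectical bootstrapping [21]).
estimates_kg = [82.0, 75.0, 90.0, 78.0]

# The combined estimate: errors in either direction partially cancel, so the
# average is typically closer to the truth than any single guess.
combined = mean(estimates_kg)
print(f"Combined weight estimate: {combined:.1f} kg")  # 81.2 kg
```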

Ultimately, we are only as good as the systems we work in. Several novel approaches to combating cognitive bias in emergency medicine are possible, though they would require a fundamental shift in both healthcare culture and technology. Consider an AI-integrated electronic health record system (EHRS) that generates direct feedback on the patients a clinician has seen in the department, with a personalised, detailed cognitive bias summary. Could existing proformas, guidelines and checklists exist as options in an EHRS alongside AI ‘nudging’ prompts that help address cognitive biases in real time by expanding differentials, slowing clinical thinking down and providing insights into the base rates of possible diagnoses? Smart rota design and self-rostering may enable clinicians to work bespoke shift patterns that minimise the psychological and emotional drivers of cognitive bias in the workplace. The approachability of senior decision makers could be improved through both teaching and formal design. The majority of medical errors stem from cognitive biases, yet this is often underrecognised in the investigation of, and subsequent reflection on, medical errors [23]. Other safety-critical industries, such as aviation, tend to emphasise the hidden vulnerabilities in systems that make an error more likely to occur (latent risk factors) over the role of the individual [24, 25].

Realistic medicine is a movement that encourages clinicians towards a more pragmatic use of healthcare resources, in which a proposed intervention or investigation is weighed against the effectiveness of non-intervention and non-investigation within a patient-centred framework [26]. Perhaps a cultural shift towards greater acceptance of individual risk by clinicians, with honesty towards our patients about all possible outcomes of care and the limitations of the healthcare resources at our disposal, could dissuade reflexive thinking. Such a strategy would also reduce overinvestigation and overtreatment across the wider healthcare service.

Cognitive biases in emergency medicine cannot be countered by awareness alone. The drivers of clinical bias in this setting are complex and multifactorial, and some biases may ultimately prove unmodifiable, shackled too tightly to hardwired human evolution. Yet the systematic healthcare redesigns needed to address cognitive bias in emergency medicine are, in theory, both technologically and practically possible. It’s time for clinicians to become cognisant of our own cognition.

References

1. Klein JG. Five pitfalls in decisions about diagnosis and prescribing. BMJ. 2005 Mar 31;330(7494):781-3.

2. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Archives of Internal Medicine. 2005 Jul 11;165(13):1493-9.

3. Okafor N, Payne VL, Chathampally Y, Miller S, Doshi P, Singh H. Using voluntary reports from physicians to learn from diagnostic errors in emergency medicine. Emergency Medicine Journal. 2016 Apr 1;33(4):245-52.

4. O’Sullivan ED, Schofield SJ. Cognitive bias in clinical medicine. Journal of the Royal College of Physicians of Edinburgh. 2018 Sep;48(3):225-32.

5. Kahneman D. Thinking, fast and slow. Macmillan; 2011 Oct 25.

6. Van Den Berg B, De Bruin AB, Marsman JB, Lorist MM, Schmidt HG, Aleman A, et al. Thinking fast or slow? Functional magnetic resonance imaging reveals stronger connectivity when experienced neurologists diagnose ambiguous cases. Brain Communications. 2020;2(1):fcaa023. 

7. Bayes T. An essay towards solving a problem in the doctrine of chances. Biometrika. 1958 Dec 1;45(3-4):296-315.

8. Innocenti F, Lazzari C, Ricci F, Paolucci E, Agishev I, Pini R. D-dimer tests in the emergency department: current insights. Open Access Emergency Medicine. 2021 Nov 11:465-79.

9. Buchanan I, Teeples T, Carlson M, Steenblik J, Bledsoe J, Madsen T. Pulmonary embolism testing among emergency department patients who are pulmonary embolism rule-out criteria negative. Academic Emergency Medicine. 2017 Nov;24(11):1369-76.

10. Roy PM, Friou E, Germeau B, Douillet D, Kline JA, Righini M, et al. Derivation and validation of a 4-level clinical pretest probability score for suspected pulmonary embolism to safely decrease imaging testing. JAMA Cardiology. 2021 Jun 1;6(6):669-77.

11. Pfortmueller CA, Lindner G, Funk GC, et al. Role of D-dimer testing in venous thromboembolism with concomitant renal insufficiency in critical care. Intensive Care Medicine. 2017;43(3):470-1.

12. Righini M, Goehring C, Bounameaux H, Perrier A. Effects of age on the performance of common diagnostic tests for pulmonary embolism. American Journal of Medicine. 2000;109(5):357-61.

13. Mark G, Gudith D, Klocke U. The cost of interrupted work: more speed and stress. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems; 2008 Apr 6. p. 107-10.

14. Pignatiello GA, Martin RJ, Hickman Jr RL. Decision fatigue: a conceptual analysis. Journal of Health Psychology. 2020 Jan;25(1):123-35.

15. Bleetman A, Sanusi S, Dale T, Brace S. Human factors and error prevention in emergency medicine. Emergency Medicine Journal. 2012 May 1;29(5):389-93.

16. Borak J, Veilleux S. Errors of intuitive logic among physicians. Social Science & Medicine. 1982 Jan 1;16(22):1939-43.

17. Hershberger PJ, Part HM, Markert RJ, Cohen SM, Finger WW. Development of a test of cognitive bias in medical decision making. Academic Medicine. 1994 Oct 1;69(10):839-42.

18. Niu L, Behar-Horenstein LS, Garvan CW. Do instructional interventions influence college students’ critical thinking skills? A meta-analysis. Educational Research Review. 2013 Jun 1;9:114-28.

19. Willingham DT. Critical thinking: why is it so hard to teach? Arts Education Policy Review. 2008 Mar 1;109(4):21-32.

20. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Academic Medicine. 2003 Aug 1;78(8):775-80.

21. Herzog SM, Hertwig R. The wisdom of many in one mind: Improving individual judgments with dialectical bootstrapping. Psychological Science. 2009 Feb;20(2):231-7.

22. Hales BM, Pronovost PJ. The checklist—a tool for error management and performance improvement. Journal of Critical Care. 2006 Sep 1;21(3):231-5.

23. Saposnik G, Redelmeier D, Ruff CC, Tobler PN. Cognitive biases associated with medical decisions: a systematic review. BMC Medical Informatics and Decision Making. 2016 Nov 3;16(1):138.

24. Kapur N, Parand A, Soukup T, Reader T, Sevdalis N. Aviation and healthcare: a comparative review with implications for patient safety. JRSM Open. 2015 Dec 2;7(1):2054270415616548.

25. Syed M. Black box thinking: why most people never learn from their mistakes–but some do. Penguin; 2015 Nov 3.

26. Fenning SJ, Smith G, Calderwood C. Realistic Medicine: changing culture and practice in the delivery of health and social care. Patient Education and Counseling. 2019 Oct 1;102(10):1751-5.
