
J R Coll Physicians Edinb 2018; 48: 225–232 | doi: 10.4997/JRCPE.2018.306

Cognitive bias in clinical medicine

ED O’Sullivan1, SJ Schofield2

Cognitive bias is increasingly recognised as an important source of medical error, and is both ubiquitous across clinical practice yet incompletely understood. This increasing awareness of bias has resulted in a surge in clinical and psychological research in the area and development of various ‘debiasing strategies’. This paper describes the potential origins of bias based on ‘dual process thinking’, discusses and illustrates a number of the important biases that occur in clinical practice, and considers potential strategies that might be used to mitigate their effect.

Keywords: cognitive bias, diagnostic error, heuristics, interventions

Financial and Competing Interests: No financial or competing interests declared

Introduction

The human brain is a complex organ with the wonderful power of enabling man to find reasons for continuing to believe whatever it is that he wants to believe.

– Voltaire

Cognitive error is pervasive in clinical practice. Up to 75% of errors in internal medicine practice are thought to be cognitive in origin, and errors in cognition have been identified in all steps of the diagnostic process, including information gathering, association triggering, context formulation, processing and verification.1,2 Further evidence can be gleaned from analysis of errors at a Veterans Affairs facility, suggesting at least 13% of diagnostic errors relate to interpretation of test results and 78.9% involve cognitive error during the patient encounter.3 Reflecting on personal errors, doctors identify cognitive factors in 30% of errors in the emergency department and 42% in internal medicine wards.4,5 As a result, in 2015 the National Academies of Sciences, Engineering and Medicine formally explored the overlooked role of clinical reasoning and cognition in diagnostic errors in their publication Improving Diagnosis in Health Care.6 This report bemoans the nationwide lack of formal training in clinical decision-making and recognises that research into the causes of diagnostic error, and education of diagnosticians, should be a key priority in efforts to minimise error and improve patient outcomes. Despite this growing awareness, cognitive error has proven a challenging area to research for a variety of reasons, including a lack of high-quality data on prevalence, a lack of granularity in the available data, and the difficulty of studying the somewhat invisible and mysterious process of a clinician’s decisions.7–9

Cognitive bias can lead to medical error

An important concept in understanding error is that of cognitive bias, and the influence this can have on our decision-making.10–12 Cognitive biases, also known as ‘heuristics’, are cognitive short cuts used to aid our decision-making. A heuristic can be thought of as a cognitive ‘rule of thumb’ or guideline that one subconsciously applies to a complex situation to make decision-making easier and more efficient. The role of heuristics has been recognised within the medical community since the 1970s, but research has been sporadic and largely in fields outside of medicine, such as the military, economics and business.13 It is now becoming increasingly apparent that significant diagnostic error can result from cognitive bias.14 It is likely that most, if not all, clinical decision-makers are at risk of error due to bias – it appears to be a ubiquitous phenomenon and does not correlate with intelligence or any other measure of cognitive ability.15 Ironically, a lack of insight into one’s own bias is common: doctors who described themselves as ‘excellent’ decision-makers and ‘free from bias’ subsequently scored poorly in formal test batteries.16,17 The causes of bias are varied, and include learned or innate biases, social and cultural biases, a lack of appreciation for statistics and mathematical rationality, and even simply environmental stimuli competing for our attention.18

The goal of current research in the field is thus to recognise, understand and potentially to modify or mitigate bias in some way. As clinicians we are tasked with trying to minimise bias in both our own practice and in that of our juniors and students. Accordingly, this paper is a summary of current understanding in the field, and practical tips for educators and clinicians.

An illustration of cognitive bias

Consider a young, fit patient presenting with chest pain. If their attending clinician has recently missed a diagnosis of aortic dissection they will have been understandably upset by such an event, and aortic dissection will now be at the forefront of their mind when encountering similar symptoms. Our young patient may have no clinical signs to support such a diagnosis, and may objectively be at very low risk of dissection, but our theoretical clinician is concerned regardless. This is an example of ‘availability bias’ and a familiar scenario for those of us in clinical practice. The temporally recent events, and their emotional component, have resulted in a brain that is now ‘primed’ for such a diagnosis. This priming may ultimately lead the doctor astray: because of their bias, they will likely expose this young patient unnecessarily to ionising radiation as they request a CT scan ‘just in case’. This effect may be compounded by the presence of an additional bias such as ‘base rate neglect’. Here, the doctor may appreciate that aortic dissection is exceedingly rare in such patients (i.e. the ‘base rate’ is low), but base rate neglect overrides this knowledge and the doctor may order the scan regardless of the very low probability of a positive result. The base rate, and the phenomenon of base rate neglect, are hugely important concepts when considering the sensitivity and specificity of diagnostic tests. Furthermore, the base rate is the cornerstone of the Bayesian approach to inference, wherein the clinician begins with a ‘pre-test likelihood’ of a patient having a condition and modifies this likelihood repeatedly and iteratively (and often quite loosely) as test results return and new information is encountered.

To better appreciate the potential for base rate neglect to impede accurate diagnosis, consider Figure 1. Grid A represents a condition with a prevalence of 3% (blue squares). Grid B introduces a diagnostic test. Even if this test can detect 100% of cases, a false-positive rate of 5% (green squares) results in eight positive patients for every 100 tested (three true positive, five false positive). Thus, any given patient with a positive result has only a 37.5% chance of having the condition. Consider then grid C. The clinician’s decision is even more complex in the case of a rarer condition (a lower base rate) and a test that may not detect 100% of cases (i.e. lower sensitivity) and may have a higher false-positive rate (i.e. lower specificity). When interpreting such a test result, neglecting to appreciate the true base rate of a condition in your specific population is to fall at the first hurdle.
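
To make the arithmetic behind Figure 1 explicit, the short sketch below applies Bayes’ theorem to the grid example. It is a minimal illustration only: the function name is ours, and the prevalence, sensitivity and false-positive figures are simply those quoted above.

    def positive_predictive_value(prevalence, sensitivity, false_positive_rate):
        """Probability of disease given a positive test, via Bayes' theorem."""
        true_positives = sensitivity * prevalence
        false_positives = false_positive_rate * (1 - prevalence)
        return true_positives / (true_positives + false_positives)

    # Grid B: prevalence 3%, a test that detects every case (sensitivity 100%)
    # but returns a false positive in 5% of unaffected patients.
    ppv = positive_predictive_value(prevalence=0.03, sensitivity=1.0,
                                    false_positive_rate=0.05)
    print(f"P(disease | positive test) = {ppv:.0%}")
    # Prints 38%; counting whole squares on the 100-square grid (three true
    # positives, five false positives) gives the 37.5% quoted in the text.

Lowering the sensitivity or raising the false-positive rate, as in grid C, reduces this figure further, which is precisely the trap that base rate neglect sets.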

Cognitive bias in clinical medicine

While the above scenario describes two specific biases in clinical practice, there are many more. Unfortunately, data are lacking as to the true incidence of specific biases in medicine, partly due to the absence of primary data and partly due to the difficulty of extracting such a proximal cause of an error in retrospective analysis. This challenge is sometimes compounded by the blind spot bias, whereby people ironically demonstrate a tendency to appreciate a bias in others, but not in themselves.19 This can hinder reflection and recognition of the role of cognitive bias in an adverse event analysis. Despite these difficulties a body of knowledge is emerging with at least some preliminary data pertaining to important biases.20 Table 1 describes some important cognitive biases, including those that have been formally documented in the literature in experimental settings. Clearly there are other important biases we encounter daily (e.g. the authors battle ‘search satisfying’ daily – stopping investigation of a problem once the first plausible explanation is found – and ‘diagnostic momentum’ – continuing the treatment plan started by others without stepping back and independently evaluating the situation). Expert opinion suggests that many other biases beyond those listed have an important impact on medical practice.21 In the broader context of patient safety, cognitive bias is an important basis of the ‘human factors’ approach to patient safety – the relationship between clinicians and the systems with which they interact. The failure of information acquisition, processing and decision-making relates in part to our cognitive biases, and all of the examples and interventions discussed could also be understood within the human factors paradigm. Indeed, the speculated number of potential biases is vast, and their nature varied. We recommend consulting one of the exhaustive lists compiled by Croskerry should the reader wish to study such biases further.22

Origins of bias: dual process thinking

An increasingly established framework for understanding the decision-making process is dual process theory. This theory divides our thinking into type 1 and type 2 processes, each characterised by its own important attributes.23,24 Type 1 thinking is a fast, intuitive, pattern-recognition-driven method of problem solving that places a low cognitive burden on the user and allows accurate decisions to be made rapidly. In contrast, type 2 thinking is a slower, more methodical and thoughtful process. Type 2 thinking places a higher cognitive strain on the user but allows data to be appraised more critically, looking beyond patterns, and may be more suitable for complex problem solving. Current opinion among psychologists is that we spend about 95% of our time in type 1 thinking.25 Although very efficient and time effective, type 1 processing is where cognitive bias, and the resulting error, is thought most likely to occur.26,27

It is probable that optimal diagnostic approaches use both type 1 and type 2 thinking at appropriate times. Non-analytical (type 1) reasoning has been shown to be as effective as reflective reasoning for diagnosing routine clinical cases.28 There is additional evidence in emergency medicine that a type 1 ‘gut feeling’ assessment of a patient’s illness has a role in clinical practice, with a reported sensitivity of 74–87% for assessing whether a patient is ‘sick’ – a reasonable output for a quick and essentially cost-free test. However, this rapid type 1 judgement was poor at predicting diagnosis or aiding further prognostication.29–31 Furthermore, not all biases originate in type 1 processing, but when bias does occur it is thought it can only be dealt with by activating type 2 processing. Thus, an appropriate balance of type 1 and type 2 processes is required for optimal clinical performance.22 Situations of stress, fatigue, sleep deprivation and cognitive overload may predispose to error and allow cognitive bias to emerge.32

Some fascinating objective data are also emerging that lend support to dual process theory. Functional MRI data now point to the existence of distinct cognitive patterns. Activation of the right lateral prefrontal cortex is noted when a logical task is correctly performed and when subjects inhibit a cognitive bias (type 2 thinking), a finding that supports this area’s potential role in cognitive monitoring. In contrast, when logical reasoning was overcome by belief bias, activity was noted in the ventral medial prefrontal cortex, a region associated with affective processing (type 1).33 Finally, there is some evidence that type 2 processing requires more blood glucose, and that alterations in blood glucose can modulate the type of processing predominantly used.34

How can we ‘debias’ ourselves?

Given the importance and prevalence of cognitive bias, how then can we mitigate its effect on our practice? In reviewing strategies to improve student and doctor decision-making, the authors found a dearth of high-quality interventional trials that formally attempt to cognitively debias doctors. What follows is a short appraisal of the evidence for current strategies, with reference to some key studies.

1. Bias-specific teaching sessions

Bias-specific teaching seems the most immediately sensible approach to the problem. However, teaching critical thinking is a challenge, and while it may improve learners’ ‘awareness’ of bias,35 interventional studies of teaching sessions have demonstrated effect sizes that were often small or non-significant.36,37 For example, research suggests that teaching diagnosticians about self-serving bias (i.e. falsely attributing positive outcomes to one’s own skill or intervention) does not have any measurable impact on their clinical decision-making.38

The highest quality evidence comes from a single positive randomised controlled trial in the paediatric literature, in which the authors taught clinicians corrective strategies (e.g. mnemonics, Bayesian tools) that successfully targeted base rate neglect and search satisfying.39 In contrast, a similarly themed experiment used a teaching session as an intervention in 57 medical students in a Canadian emergency department. The intervention was a standardised, case-based, 90 min teaching session focused on understanding bias, how to identify biases and how to counteract them.40 Results were disappointing: the researchers failed to demonstrate retention or any improvement in decision-making with cognitive forcing strategies. Undeterred, the same research group subsequently performed a larger trial of 145 medical students, enhanced this time with a control group. Again, the researchers failed to demonstrate any difference between the two groups; however, the intervention was a rather opportunistic additional teaching session during a 4-week rotational block, which does not guarantee that students were self-implementing the forcing strategies taught. Furthermore, the only biases addressed were, again, availability bias and search satisfying.41

In conclusion, while focused educational sessions seem an intuitive and practical approach to mitigating bias, the evidence to support this is mixed and there are certainly enough negative studies to suggest it would be a low-yield intervention at best.

2. Slowing down

Popularised by Kahneman, slowing down during cognition could allow the diagnostician to transition into type 2 thinking, reflect more critically on data and ultimately make fewer errors. Encouragingly, trials attempting to force decision-makers to slow down have produced broadly positive results.24 Diagnostic error due to experimentally induced availability bias was mitigated in medical students by forced slow deliberation, and diagnostic accuracy was shown to be improved by simply slowing cognition in two subsequent trials.42,43 Studies of slowing down and consciously deliberating on problems, regardless of any specific underlying bias being introduced, also suggest improvement in diagnostic accuracy in non-trial settings.44,45 Useful insight into the nature of this intervention can be garnered from a trial of antibiotic prescribing, which demonstrated that workflow changes forcing physicians to slow down and consider ‘why’ and ‘how’ antibiotics were prescribed had a positive impact. Two important themes emerging from the qualitative analysis were that the intervention forced respondents’ attention to important questions (‘it reminds us to think about it’) and induced slow and deliberative reasoning (‘it makes you think twice’).46

The data are not unanimously in favour of slowing down, however. Canadian residents were randomised to work through 20 diagnostic cases at speed, or instead asked to slow down. The slower group took 20 s longer per question but failed to show any increase in diagnostic accuracy.47 The reasons behind the conflicting findings are unclear, but we speculate that because most positive studies used experimental settings to induce bias, the higher baseline level of bias resulted in a larger effect size, allowing interventions to reach ‘significance’. This has not curbed the use of ‘slowing down’ within the field of surgery, where enforced slowing is sometimes implemented in an attempt to encourage cognitive refocusing and minimisation of error, although we could find no experimental evidence supporting this.48,49

On balance, ‘slowing down’ as an intervention is supported by a growing body of evidence and is a simple intervention that we could feasibly consider in our own practice.

3. Metacognition and ‘considering alternatives’

Metacognition is the awareness of, and insight into, one’s own thought processes. Forcing clinicians to ask themselves ‘what else could this be?’ is a form of metacognition that may force one to consider why one is pursuing certain diagnoses and to consider important alternative scenarios.50 There are a number of positive studies supporting the role of metacognition in improving decision-making. For example, ‘considering the opposite’ has been shown experimentally to help mitigate the anchoring effect.51–53 Similarly, overconfidence bias has been tackled rather elegantly in a classroom setting by simply asking students to give an estimate of their confidence. This was sufficient to improve diagnostic accuracy, as students reassessed their position and often changed their minds – effectively debiasing themselves.54 Asking neuropsychologists to explain their reasoning when answering clinical questions had a similar impact and minimised the effect of hindsight bias.55

Perhaps it is the specific bias, or even the specific approach to reflection, that determines whether metacognition has any benefit. Regardless, metacognition in the broadest sense has demonstrated real potential, although the nuances surrounding the timing and nature of its use remain unclear.

4. Checklists

Checklists have been a simple and popular debiasing strategy, used clinically and in many industries. They are ideal for deployment in a controlled environment with predictable patients and procedures, hence their popularity in the surgical world. Checklists can be thought of as cognitive forcing tools that demand the user think in a more ordered fashion.56–58 As a debiasing strategy, checklists challenge the ‘structure’ of thought, attempting to force our cognition onto certain topics even if they were not previously considered.59,60 Variations on checklists include computerised clinical decision support systems, which may have a role in reducing cognitive load by providing decision aids, guidance and differential diagnostic lists.61
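
As an illustration only, a generic ‘debiasing checklist’ of the kind discussed here can be as simple as a fixed list of forcing questions worked through before committing to a diagnosis. The sketch below is ours and entirely hypothetical: the prompts merely echo the debiasing themes in this paper and are not drawn from any validated instrument.

    # A minimal, hypothetical cognitive forcing checklist; the prompts are
    # illustrative only and not taken from a validated tool.
    DEBIASING_CHECKLIST = [
        "Have I slowed down and deliberately reconsidered this case?",
        "What is the base rate of my working diagnosis in this population?",
        "What else could this be? Have I actively sought alternatives?",
        "Which findings, if present, would disprove my current hypothesis?",
        "Am I continuing a plan started by someone else without re-evaluating it?",
    ]

    def run_checklist(items=DEBIASING_CHECKLIST):
        # Force the user to pause on each prompt before proceeding.
        for question in items:
            input(f"{question} (press Enter once considered) ")

    if __name__ == "__main__":
        run_checklist()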

Experimentally, checklists were found to increase the accuracy of cardiopulmonary examination, but only when the user was able to return and re-examine as needed.62 This suggests to the authors that pre-encounter checklists to prime the user may be a valuable approach. The nature and content of the checklist are likely important factors in its efficacy. Shimizu and colleagues compared the efficacy of a ‘differential diagnosis’ checklist with a general ‘debiasing’ checklist and found the former superior in improving diagnostic accuracy.63 Their conclusion was that focusing on specific negative differentials may be more useful than generic approaches – here there is some overlap with metacognitive methods. Further research has found checklists to enhance ECG interpretation, adding to the time taken to reach a diagnosis but not to the perceived cognitive load.64 The potential advantages of checklists are many (e.g. ease of creation and use, and low cost) and the experimental evidence supporting their deployment in medicine is slowly accumulating.65–67

5. Teaching statistical principles

A lack of formal education in statistics and logic is often bemoaned by clinicians and researchers alike as an explanation for poor insight into underlying principles and the resulting errors. While this might seem an obvious remedy, experimental evidence only partially supports statistical teaching as an effective intervention. On one hand, teaching students statistical principles has helped them transfer this knowledge to abstract statistical problem solving and overcome cognitive bias.68,69 In addition, using ‘analogical’ training (teaching students about statistical biases), researchers were able to create a lasting effect on students’ ability to avoid bias in specifically designed questionnaires up to 4 weeks later.70,71 However, the effect varied: specifically statistical biases were the most affected by the teaching intervention, whereas other biases were not. Given the range of potential biases in medicine, this limited effect is an important drawback. Crucially, these studies only involved students in non-clinical scenarios. In contrast, the single relevant study within clinical medicine found that, despite training in statistics, physicians performed very poorly and tended to ignore important prevalence data and statistical concepts when problem solving.72 In conclusion, much like generic bias-related teaching, a singular focus on statistical principles has not been demonstrated to be of much clinical utility.

6. Novel methods

Several novel methods exist to mitigate bias. Working on the principle that a group of experts is likely to produce a more accurate answer than an individual,73 dialectical bootstrapping is the act of forcing yourself to assume your first estimate of a quantitative answer was incorrect and attempting to answer again. The average of the two answers is demonstrably more accurate.74 This method of increasing accuracy obviously has rather specific clinical uses, for arithmetic and quantitative problems. Examples might include estimating a patient’s weight at the bedside, or their baseline renal function when no historical measurements are available.
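
A minimal sketch of dialectical bootstrapping, as applied to the bedside weight example, is given below; the figures and function are ours, purely for illustration.

    def dialectical_bootstrap(first_estimate, second_estimate):
        """Average two estimates from the same person, the second made after
        deliberately assuming the first was wrong."""
        return (first_estimate + second_estimate) / 2

    # Hypothetical example: a first guess of a patient's weight is 82 kg;
    # assuming that guess is wrong and re-estimating from scratch gives 74 kg.
    print(dialectical_bootstrap(82, 74))  # 78.0 kg; on average this blended
    # estimate is closer to the truth than either single guess (reference 74).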

Games have been used outside of medical practice to improve hypothesis generation. A novel videogame teaching method, with repetition of a challenge, was found to be superior to explicit training in mitigating confirmation bias and fundamental attribution error in a group of 703 decision-makers. While playing through a fictionalised scenario in which they interviewed terror suspects, participants were given both implicit and explicit feedback on multiple cognitive biases as these were encountered. This proved superior to traditional teaching and an instructional video in helping participants avoid new cognitive biases.75 Such a fictionalised interview seems highly transferable to clinical medicine, and there is certainly an appetite among medical students for the use of games in education.76 The development of ‘serious games’ of this nature is thus becoming an active field of research, and they may have future roles in medical training and bias modification.77–79

Conclusion

Undoubtedly cognitive bias is a major contributor to medical error, yet it is underrepresented in education and neglected in clinical practice. The current literature is limited in its ability to accurately describe the prevalence and significance of specific biases, which makes subsequent experimental work difficult. Nevertheless, encouraging data are emerging to suggest there is potential in several interventions, such as formally ‘slowing down’, checklists and metacognition. The modest effect sizes of positive trials, together with multiple negative trials, are helping to build a more complete understanding of how to tackle bias. They suggest that formal teaching and a focus on statistics may be of limited use, and future researchers will need to understand the role of such interventions and how they can complement emerging novel and metacognitive methods.

The current challenge is extrapolating these experimental findings to educational and clinical settings. The literature lacks longer-term studies with follow-up data demonstrating that any intervention has a lasting effect. Despite the promising results described, some commentators remain sceptical about the role of debiasing.55,80,81 They argue that some of these cognitive biases are hardwired and unavoidable, and that any suggestion they can easily be fixed is unrealistic. Certainly there are enough negative studies to merit such concerns. Thus, the debate continues as to whether our biases are inescapable and shackle us to our savannah-dwelling ancestors, or rather are elegant optimisation protocols, refined over millennia, that we will simply need to adapt to modern decision-making.82

Improving our understanding and awareness of our own biases seems a sensible first step towards enhancing our understanding of clinical decision-making, improving patient care, informing future research and equipping clinicians for the cognitive rigours of clinical medicine. To take a pragmatic approach, we suggest it is worth remembering a few important points in our medical practice, and following some suggested rules for good decision-making, adapted from the BMJ.17

  • Slow down.
  • Be aware of base rates for your differentials.
  • Consider what data are truly relevant.
  • Actively seek alternative diagnoses.
  • Ask questions to disprove your hypothesis.
  • Remember you are often wrong. Consider the immediate implications of this.   

References

1 Kassirer JP, Kopelman RI. Cognitive errors in diagnosis: instantiation, classification, and consequences. Am J Med 1989; 86: 433–41.

2 Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med 2005; 165: 1493–9.

3 Singh H, Giardina TD, Meyer AND et al. Types and origins of diagnostic errors in primary care settings. JAMA Intern Med 2013; 173: 418–25.

4 Okafor N, Payne VL, Chathampally Y et al. Using voluntary reports from physicians to learn from diagnostic errors in emergency medicine. Emerg Med J 2016; 33: 245–52.

5 Schiff GD, Hasan O, Kim S et al. Diagnostic error in medicine: analysis of 583 physician-reported errors. Arch Intern Med 2009; 169: 1881–7.

6 Balogh EP, Miller BT, Ball JR, editors. Improving Diagnosis in Health Care. Washington, DC: The National Academies Press; 2015.

7 Zwaan L, Singh H. The challenges in defining and measuring diagnostic error. Diagnosis (Berl) 2015; 2: 97–103.

8 Singh H. Editorial: Helping health care organizations to define diagnostic errors as missed opportunities in diagnosis. J Comm J Qual Patient Saf 2014; 40: 99–101.

9 Graber M. The incidence of diagnostic error in medicine. BMJ Qual Saf 2013; 22: ii21–7.

10 Croskerry P. Our better angels and black boxes. Emerg Med J 2016; 33: 242–4.

11 Croskerry P. A universal model of diagnostic reasoning. Acad Med 2009; 84: 1022–8.

12 Croskerry P. From mindless to mindful practice — cognitive bias and clinical decision making. N Engl J Med 2013; 368: 2445–8.

13 Detmer DE, Fryback DG, Gassner K. Heuristics and biases in medical decision-making. J Med Educ 1978; 53: 682–3.

14 Redelmeier DA. The cognitive psychology of missed diagnoses. Ann Intern Med 2005; 142: 115.

15 Stanovich KE, West RF. On the relative independence of thinking biases and cognitive ability. J Pers Soc Psychol 2008; 94: 672–95.

16 Hershberger PJ, Part HM, Markert RJ et al. Development of a test of cognitive bias in medical decision making. Acad Med 1994; 69: 839–42.

17 Klein JG, Kahneman D, Slovic P et al. Five pitfalls in decisions about diagnosis and prescribing. BMJ 2005; 330: 781–3.

18 Stiegler MP, Tung A. Cognitive processes in anesthesiology decision making. Anesthesiology 2014; 120: 204–17.

19 Pronin E, Lin DY, Ross L. The bias blind spot: perceptions of bias in self versus others. Personal Soc Psychol Bull 2002; 28: 369–81.

20 van den Berge K, Mamede S. Cognitive diagnostic error in internal medicine. Eur J Intern Med 2013; 24: 525–9.

21 Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf 2013; 22: ii58–64.

22 Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med 2002; 9: 1184–204.

23 Kahneman D. Maps of bounded rationality: psychology for behavioral economics. Am Econ Rev 2003; 93: 1449–75.

24 Kahneman D. Thinking, Fast and Slow. 1st ed. New York: Farrar, Straus and Giroux; 2011.

25 Lakoff G, Johnson M. Philosophy In The Flesh: The Embodied Mind and Its Challenge To Western Thought. New York: Basic Books; 1999.

26 Evans JSBT, Curtis-Holmes J. Rapid responding increases belief bias: evidence for the dual-process theory of reasoning. Think Reason 2005; 11: 382–9.

27 Evans JSBT. Dual-processing accounts of reasoning, judgment, and social cognition. Ann Rev Psychol 2008; 59: 255–78.

28 Mamede S, Schmidt HG, Penaforte JC. Effects of reflective practice on the accuracy of medical diagnoses. Med Educ 2008; 42: 468–75.

29 Cabrera D, Thomas JF, Wiswell JL et al. Accuracy of “my gut feeling:” comparing system 1 to system 2 decision-making for acuity prediction, disposition and diagnosis in an academic emergency department. West J Emerg Med 2015; 16: 653–7.

30 Walston JM, Thomas JF, Wiswell JL et al. How accurate is “my gut feeling?”: comparing the accuracy of system 1 versus system 2 decision making for the acuity prediction of patients presenting to an emergency department. Ann Emerg Med 2014; 64: S48–9.

31 Wiswell J, Tsao K, Bellolio M et al. “Sick” or “not-sick”: accuracy of System 1 diagnostic reasoning for the prediction of disposition and acuity in patients presenting to an academic ED. Am J Emerg Med 2013; 31: 1448–52.

32 Croskerry P. ED cognition: any decision by anyone at any time. CJEM 2014; 16: 13–9.

33 Goel V, Dolan RJ. Explaining modulation of reasoning by belief. Cognition 2003; 87: B11–22.

34 Masicampo EJ, Baumeister RF. Toward a physiology of dual-process reasoning and judgment: lemonade, willpower, and expensive rule-based analysis. Psychol Sci 2008; 19: 255–60.

35 Reilly JB, Ogdie AR, Feldt JM et al. Teaching about how doctors think: a longitudinal curriculum in cognitive bias and diagnostic error for residents. BMJ Qual Saf 2013; 22: 1044–50.

36 Niu L, Behar-Horenstein LS, Garvan CW. Do instructional interventions influence college students’ critical thinking skills? A meta-analysis. Educ Res Rev 2013; 9: 114–28.

37 Willingham DT. Critical thinking: why is it so hard to teach? Arts Educ Policy Rev 2008; 109: 21–32.

38 Babcock L, Loewenstein G. Explaining bargaining impasse: the role of self-serving biases. J Econ Perspect 1997; 11: 109–26.

39 Jenkins MM, Youngstrom EA. A randomized controlled trial of cognitive debiasing improves assessment and treatment selection for pediatric bipolar disorder. J Consult Clin Psychol 2016; 84: 323–33.

40 Sherbino J, Yip S, Dore KL et al. The effectiveness of cognitive forcing strategies to decrease diagnostic error: an exploratory study. Teach Learn Med 2011; 23: 78–84.

41 Sherbino J, Kulasegaram K, Howey E et al. Ineffectiveness of cognitive forcing strategies to reduce biases in diagnostic reasoning: a controlled trial. CJEM 2014; 16: 34–40.

42 Mamede S, van Gog T, van den Berge K et al. Effect of availability bias and reflective reasoning on diagnostic accuracy among internal medicine residents. JAMA 2010; 304: 1198–203.

43 Schmidt HG, Mamede S, van den Berge K et al. Exposure to media information about a disease can cause doctors to misdiagnose similar-looking clinical cases. Acad Med 2014; 89: 285–91.

44 Hess BJ, Lipner RS, Thompson V et al. Blink or think: can further reflection improve initial diagnostic impressions? Acad Med 2015; 90: 112–8.

45 Mamede S, Schmidt HG, Rikers RM et al. Conscious thought beats deliberation without attention in diagnostic decision-making: at least when you are an expert. Psychol Res 2010; 74: 586–92.

46 Jones M, Butler J, Graber CJ et al. Think twice: a cognitive perspective of an antibiotic timeout intervention to improve antibiotic use. J Biomed Inform 2017; 71S: S22–31.

47 Norman G, Sherbino J, Dore K et al. The etiology of diagnostic errors: a controlled trial of system 1 versus system 2 reasoning. Acad Med 2014; 89: 277–84.

48 Moulton C, Regehr G, Lingard L et al. Slowing down to stay out of trouble in the operating room: remaining attentive in automaticity. Acad Med 2010; 85: 1571–7.

49 Moulton CE, Regehr G, Mylopoulos M et al. Slowing down when you should: a new model of expert judgment. Acad Med 2007; 82: S109–16.

50 Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med 2003; 78: 775–80.

51 Gilovich T. How we know what isn’t so: the fallibility of human reason in everyday life. New York: Free Press; 1991.

52 Mussweiler T, Strack F, Pfeiffer T. Overcoming the inevitable anchoring effect: considering the opposite compensates for selective accessibility. Personal Soc Psychol Bull 2000; 26: 1142–50.

53 Lowe DJ, Reckers PMJ. The effects of hindsight bias on jurors’ evaluations of auditor decisions. Decis Sci 1994; 25: 401–26.

54 Renner CH, Renner MJ. But I thought I knew that: using confidence estimation as a debiasing technique to improve classroom performance. Appl Cogn Psychol 2001; 15: 23–32.

55 Arkes H, Faust D, Guilmette T et al. Eliminating the hindsight bias. J Appl Psychol 1988; 73: 305–7.

56 Haynes AB, Weiser TG, Berry WR et al. A surgical safety checklist to reduce morbidity and mortality in a global population. N Engl J Med 2009; 360: 491–9.

57 Pronovost P, Needham D, Berenholtz S et al. An intervention to decrease catheter-related bloodstream infections in the ICU. N Engl J Med 2006; 355: 2725–32.

58 Thammasitboon S, Cutrer WB. Diagnostic decision-making and strategies to improve diagnosis. Curr Probl Pediatr Adolesc Health Care 2013; 43: 232–41.

59 Hales BM, Pronovost PJ. The checklist – a tool for error management and performance improvement. J Crit Care 2006; 21: 231–5.

60 Weiser TG, Haynes AB, Lashoher A et al. Perspectives in quality: designing the WHO Surgical Safety Checklist. Int J Qual Health Care 2010; 22: 365–70.

61 Garg AX, Adhikari NK, McDonald H et al. Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: a systematic review. JAMA 2005; 293: 1223–38.

62 Sibbald M, de Bruin ABH, Cavalcanti RB et al. Do you have to re-examine to reconsider your diagnosis? Checklists and cardiac exam. BMJ Qual Saf 2013; 22: 333–8.

63 Shimizu T, Matsumoto K, Tokuda Y. Effects of the use of differential diagnosis checklist and general de-biasing checklist on diagnostic performance in comparison to intuitive diagnosis. Med Teach 2013; 35: e1218–29.

64 Sibbald M, de Bruin ABH, van Merrienboer JJG. Checklists improve experts’ diagnostic decisions. Med Educ 2013; 47: 301–8.

65 Tang R, Ranmuthugala G, Cunningham F. Surgical safety checklists: a review. ANZ J Surg 2014; 84: 148–54.

66 Treadwell JR, Lucas S, Tsou AY. Surgical checklists: a systematic review of impacts and implementation. BMJ Qual Saf 2014; 23: 299–318.

67 Birkmeyer JD, Miller DC. Surgery: can checklists improve surgical outcomes? Nat Rev Urol 2009; 6: 245–6.

68 Fong GT, Krantz DH, Nisbett RE. The effects of statistical training on thinking about everyday problems. Cogn Psychol 1986; 18: 253–92.

69 Kosonen P, Winne PH. Effects of teaching statistical laws on reasoning about everyday problems. J Educ Psychol 1995; 87: 33–46.

70 Aczel B, Bago B, Szollosi A et al. Is it time for studying real-life debiasing? Evaluation of the effectiveness of an analogical intervention technique. Front Psychol 2015; 6: 1120.

71 Loewenstein J, Thompson L, Gentner D. Analogical learning in negotiation teams: comparing cases promotes learning and transfer. Acad Manag Learn Educ 2003; 2: 119–27.

72 Borak J, Veilleux S. Errors of intuitive logic among physicians. Soc Sci Med 1982; 16: 1939–47.

73 Mannes AE, Soll JB, Larrick RP. The wisdom of select crowds. J Pers Soc Psychol 2014; 107: 276–99.

74 Herzog SM, Hertwig R. The wisdom of many in one mind: improving individual judgments with dialectical bootstrapping. Psychol Sci 2009; 20: 231–7.

75 Dunbar NE, Miller CH, Adame BJ et al. Implicit and explicit training in the mitigation of cognitive bias through the use of a serious game. Comput Human Behav 2014; 37: 307–18.

76 Cassam Q. Diagnostic error, overconfidence and self-knowledge. Palgrave Commun 2017; 3: 17025.

77 Barton M, Symborski C, Quinn M et al. The use of theory in designing a serious game for the reduction of cognitive biases. In: DiGRA ’15 – Proceedings of the 2015 DiGRA International Conference. Finland: Digital Games Research Association; 2015.

78 Dunbar NE, Wilson S, Adame B et al. The development of a serious game for the mitigation of cognitive bias. Int J Game-Based Learn 2013.

79 Symborski C, Barton M, Quinn M et al. Missing: a serious game for the mitigation of cognitive biases. In: Interservice/Industry Training, Simulation and Education Conference 2014; 2014 Dec 1–5; Arlington: I/ITSEC; 2014. p. 1–13.

80 Fioratou E, Flin R, Glavin R. No simple fix for fixation errors: cognitive processes and their clinical applications. Anaesthesia 2010; 65: 61–9.

81 Graber M. Metacognitive training to reduce diagnostic errors: ready for prime time? Acad Med 2003; 78: 781.

82 Jenkins MM, Youngstrom EA. A randomized controlled trial of cognitive debiasing improves assessment and treatment selection for pediatric bipolar disorder. J Consult Clin Psychol 2016; 84: 323–33.

83 Wallsten TS. Physician and medical student bias in evaluating diagnostic information. Med Decis Making 1981; 1: 145–64.

84 Scherer LD, de Vries M, Zikmund-Fisher BJ et al. Trust in deliberation: the consequences of deliberative decision strategies for medical decisions. Health Psychol 2015; 34: 1090–9.

85 Tversky A, Kahneman D. Extensional versus intuitive reasoning: the conjunction fallacy in probability judgment. Psychol Rev 1983; 90: 293–315.

86 Brannon LA, Carson KL. The representativeness heuristic: influence on nurses’ decision making. Appl Nurs Res 2003; 16: 201–4.

1Renal Registrar, Department of Renal Medicine, Royal Infirmary of Edinburgh, UK; 2Senior Lecturer, Centre for Medical Education, University of Dundee, UK


Correspondence to:
ED O’Sullivan
Department of Renal Medicine
Royal Infirmary of Edinburgh
51 Little France Crescent
Edinburgh EH16 4SA

UK

Email:
eoindosullivan@gmail.com

Figure 1 The influence of disease prevalence (base rate), test sensitivity and false-positive rate on the interpretation of a positive test result (grids A–C)

Table 1 Bias in clinical medicine

Availability bias
More recent and readily available answers and solutions are preferentially favoured because of ease of recall and incorrectly perceived importance.42,43
Example: A recently missed pulmonary embolism prompts excessive CT pulmonary angiogram scanning in low-risk patients.

Base rate neglect
This occurs in medicine when the underlying incidence rates of conditions, or population-based knowledge, are ignored as if they do not apply to the patient in question.82
Example: A positive exercise stress test in a young woman prompting an angiogram. The ‘base rate’ is so low in this population that the result is more likely to be a false positive than a true positive.

Confirmation bias
Diagnosticians tend to interpret the information gained during a consultation to fit their preconceived diagnosis, rather than the converse.83,84
Example: Suspecting that the patient has an infection and taking the raised white cell count as proof, rather than asking ‘I wonder why the white cells are raised; what other findings are there?’

Conjunction rule
The incorrect belief that the probability of multiple events being true is greater than that of a single event. This relates to ‘Occam’s razor’: a simple and unifying explanation is statistically more likely than multiple unrelated explanations.85
Example: A confused patient with hypoxia and deranged renal function is far more likely to simply have a pneumonia than a subdural haematoma, a pulmonary embolism and an obstruction simultaneously.

Overconfidence
An inflated opinion of one’s own diagnostic ability, leading to subsequent error; confidence in one’s judgements does not align with their accuracy.54
Example: A doctor trusting their assessment more than they should – particularly problematic with inaccurate examinations, such as auscultation for pneumonia.

Representativeness
Misinterpreting the likelihood of an event by weighing both the key similarities to its parent population and the individual characteristics that define that event.86
Example: A man with classic symptoms of a heart attack who is also anxious and whose breath smells of alcohol. The latter details have no bearing on the likelihood of a heart attack, nor do they alter the degree to which he is a member of his risk demographic, but they distract and decrease the diagnostic pick-up rate.

Search satisfying
Ceasing to look for further information or alternative answers when the first plausible solution is found.
Example: When encountering an acutely dyspnoeic patient, treating the obvious pneumonia and stopping investigations at that point, failing to search for and recognise the secondary myocardial infarction.

Diagnostic momentum
Continuing a clinical course of action instigated by previous clinicians without considering the information available and changing the plan if required (particularly if the plan was commenced by a more senior clinician).
Example: Fixating on a previously assigned label of ‘possible pulmonary embolism’ and organising CT imaging for a patient who may have subsequent results that suggest otherwise (e.g. positive blood cultures the following day).

The framing effect
Reacting to a particular choice differently depending on how the information is presented to you.
Example: A pharmaceutical company may present new drug A as having a 95% cure rate, and suggest this is superior to drug B, which has a significant 2.5% failure rate.

Commission bias
A tendency towards action rather than inaction; the opposite tendency is ‘omission bias’.
Example: Historical transfusion targets in gastrointestinal bleeds – the approach was traditionally to aim for higher targets rather than do nothing: ‘better to be safe than sorry’ and raise the haemoglobin ‘just in case’.

 
