SURGICAL PERSPECTIVES

Recognizing Heuristics and Bias in Clinical Decision-making

Hughes, Tasha M. MD, MPH*; Dossett, Lesly A. MD, MPH*; Hawley, Sarah T. PhD; Telem, Dana A. MD*

doi: 10.1097/SLA.0000000000003699

Cognitive bias in surgical decision-making is poorly understood and may represent a bias blind spot relative to the more commonly described and discussed forms of overt bias (eg, race or sex bias). Surgical care is unique in that we often must make rapid, decisive clinical decisions. Surgeons become acclimated to this fast-acting, quick-thinking style throughout training, where this type of decision-making is critical to our individual success and, in some cases, to the preservation of patients’ lives, limbs, or vital organs.

Although there are many scenarios where rapid decisions and swift action allow surgeons to effectively do their job, this type of reasoning is also the most susceptible to reliance on heuristics and the introduction of cognitive bias. This characterization of fast thinking as susceptible to bias is described by dual process theory, which defines 2 types of reasoning, each with its own strengths and limitations. The implications of these 2 types of reasoning in both professional and personal settings were the topic of the best-selling book “Thinking, Fast and Slow,” in which type I reasoning is intuitive, fast, and usually effective but more likely to fail than type II reasoning, which is analytic, reliable, and safe but also time- and resource-intensive.1–3 Type I reasoning, although fast and often accurate, is at high risk of introducing bias and producing incorrect conclusions. Furthermore, when this model of reasoning leads to an incorrect conclusion, the error is unlikely to be identified and corrected.1 This type of reasoning, defined by shortcuts based on previous “similar” experiences (ie, pattern recognition), is a common adaptive strategy ingrained in all of us through surgical training. Collectively, reasoning that draws on previous experiences and pattern recognition irrespective of the data and facts at hand is termed heuristics. Although the tension between type I and type II reasoning has been studied as a possible source of diagnostic errors in the emergency room setting, it has not been studied in surgical care.1,4 Surgery requires type I reasoning to rapidly synthesize a large volume of data and provide high-impact, time-sensitive care, but the consequent negative implications of this type of reasoning are not often considered.

In the process of making complex treatment decisions, the bias inherent in heuristic thinking poses a potential threat to health care quality and outcomes for our patients. Bias enters type I, or heuristic, reasoning by way of a number of different cognitive biases. Although >100 distinct cognitive biases have been described in the vast cognitive psychology literature, only a subset has been described in health care decision-making, and the majority of research on heuristics and cognitive bias is based on hypothetical patient scenarios rather than analysis of actual medical decision-making.5 Anchoring and availability bias are 2 such cognitive pitfalls that have been described in other contexts of nonsurgical medicine. Anchoring describes the cognitive bias in which we base our reasoning and ultimate decisions on the first piece of information we are offered, irrespective of subsequent data that may be presented.6 Availability refers to the cognitive process of using the most recent or most vivid “similar” experience to define the current experience.7 In clinical medicine, this heuristic may be triggered by a single shared characteristic without equal consideration paid to all facets of the current case or condition.

As practicing surgeons, we know that providers across all specialties work with the best intentions to provide care that is safe, effective, and consistent with best practice within the field. Despite these intentions, heuristics and bias find their way into clinical care. The following example of a patient with newly diagnosed breast cancer, presented at a multidisciplinary tumor board, demonstrates how bias can be introduced even in well-intended care settings.

‘Ms. Smith is an anxious, young woman with a new diagnosis of right breast invasive ductal carcinoma. She is tearful and very emotional throughout our interaction. She has no other medical problems but does report using medication to help her sleep since receiving this diagnosis. Her husband first noticed the mass one year ago. She has used some topical oils hoping it would go away. [laughter] She was brought in by her husband who has palpated the mass as she is too afraid to feel the mass herself. Although she was too overwhelmed to say exactly how she wants to proceed, I bet she will want a bilateral mastectomy.’

Although there is important patient information presented above, this vignette also contains extraneous detail that may trigger certain heuristics (Table 1),6,8–15 potentially contaminating reasoning with provider bias. For instance, the redundant emphasis on her emotional state (recall she was anxious, tearful, emotional, and overwhelmed) may lead other providers hearing the case to believe she is likely to be time-consuming and to require additional resources to complete her encounter. Before we learned more than her name, we were told she was anxious. We are also told that her husband is responsible for both finding her cancer and bringing it to medical attention, both of which may diminish her own autonomy in the subsequent encounter. Lastly, we are told that she may have employed alternative therapies for an extended period before presentation, which may provoke providers to assume she will decline whatever recommendations we plan to offer, or to view her as foolish or less intelligent for her belief in these therapies.

TABLE 1: Clinical Heuristics, Associated Cognitive Bias, and Nonmedical and Medical Mitigation Efforts

Sterilizing our clinical reasoning of all bias is unrealistic; recognizing, measuring, and attempting to mitigate the effect of bias on clinical decision-making and outcomes is nonetheless imperative. Since the release of the Institute of Medicine report “To Err Is Human” in 1999,16 a vigorous focus on patient safety has emerged, including attention to the contribution of cognitive bias and resultant cognitive errors to patient safety. In a study of 100 cases of diagnostic error, each error was classified as “no fault,” “system-related,” or “cognitive”; at least 1 cognitive error was associated with the diagnostic error in 74% of cases.17 Beyond identifying types of bias in diagnostic errors, systematic approaches to address these issues have been proposed and, to a lesser degree, attempted in nonsurgical settings. Croskerry et al propose many workplace-specific strategies to balance fast, automatic thinking with slower, more measured data acquisition and analysis to reduce cognitive bias. These include structured data acquisition; deliberate decoupling and reflection on initial impressions; “slowing down” strategies that place purposeful stopping points in clinical reasoning; and affective debiasing, or the intentional separation of emotional and cognitive assessments.4 Similar efforts within surgery or the multidisciplinary cancer setting have not been described and, as such, represent a substantial opportunity to address these challenging issues in our own care delivery patterns.

Recognition of cognitive errors, including those associated with provider bias and heuristic reasoning, has focused largely on diagnostics and patient safety; much less work has addressed the effect on treatment decision-making, and even less is known about the downstream effects on patient outcomes. In the context of surgical care, where “fast thinking” is part of the job and essential for success, there remains a need to understand where biases can be introduced. A further challenge will be designing and introducing interventions into this fast-paced clinical environment, but understanding the vulnerable points where bias can occur is a first step. Simply asking providers to think more slowly, harbor less bias, and use kinder and more thoughtful language, without demonstrating the influence of these forces on downstream outcomes, is premature. Understanding the specific features of the patient, provider, or care environment most susceptible to cognitive biases, and exploring downstream outcomes, is necessary to begin the challenging task of discussing and reducing bias in surgical decision-making.

REFERENCES

1. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf 2013; 22(suppl 2):ii58–ii64.
2. Evans JS. In two minds: dual-process accounts of reasoning. Trends Cogn Sci 2003; 7:454–459.
3. Kahneman D. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux; 2011.
4. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: impediments to and strategies for change. BMJ Qual Saf 2013; 22(suppl 2):ii65–ii72.
5. Blumenthal-Barby JS, Krieger H. Cognitive biases and heuristics in medical decision making: a critical review using a systematic search strategy. Med Decis Making 2015; 35:539–557.
6. Kahneman D, Slovic P, Tversky A, eds. Judgment Under Uncertainty: Heuristics and Biases. New York: Cambridge University Press; 1982.
7. Schwarz N, Bless H, Strack F, et al. Ease of retrieval as information—another look at the availability heuristic. J Pers Soc Psychol 1991; 61:195–202.
8. MacLeod C, Mathews A, Tata P. Attentional bias in emotional disorders. J Abnorm Psychol 1986; 95:15–20.
9. Greitzer F, Andrews D, Herz R, et al. Training strategies to mitigate expectancy-induced response bias in combat identification: a research agenda. In: Air Force Research Laboratory/RHA WRRD, ed. Ashgate Publishing; 2010:173–189.
10. Available at: https://www.faasafety.gov/files/gslac/courses/content/258/1097/AMT_Handbook_Addendum_Human_Factors.pdf. Accessed September 12, 2019.
11. Rodman J. Cognitive Biases and Decision Making: A Literature Review and Discussion of Implications for the US Army. USACAC Repository: Fort Leavenworth, KS: Mission Command - Capabilities Development and Integration Directorate (CDID); 2015.
12. Stein CT, Drouin M. Cognitive bias in the courtroom: combating the anchoring effect through tactical debiasing. Univ San Francisco Law Rev 2018; 52(3):393–428.
13. Gilbert D, Malone P. The correspondence bias. Psychol Bull 1995; 117:21–38.
14. Yu L, Zellmer-Bruhn M. Introducing team mindfulness and considering its safeguard role against conflict transformation and social undermining. Acad Manag J 2018; 61:324–347.
15. Hopthrow T, Hooper N, Mahmood L, et al. Mindfulness reduces the correspondence bias. Q J Exp Psychol 2017; 70:351–360.
16. Institute of Medicine. To Err Is Human: Building a Safer Health System. Washington, DC: The National Academies Press; 2000.
17. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med 2005; 165:1493–1499.
Copyright © 2020 Wolters Kluwer Health, Inc. All rights reserved.