Academic Medicine, February 2014 - Volume 89 - Issue 2
doi: 10.1097/ACM.0000000000000116
From the Editor

How Do We Think? Can We Learn to Think Better?

Sklar, David P. MD


Editor’s Note: The opinions expressed in this editorial do not necessarily reflect the opinions of the AAMC or its members.

An intern and I were called to see a 60-year-old man who had dangerously low blood pressure. He had been feeling light-headed all day and described an uncomfortable sensation in his stomach, as if he were going to vomit. As we were walking toward the man, I asked the intern to describe his differential diagnosis for hypotension.

“The pump (heart), the volume (blood), and the pipes (blood vessel tone),” he said.

“Good,” I said. “So how is that going to help you in this case?”

“Well,” he said, “I will do a history and see if he has any symptoms that might go along with a heart problem, or maybe with an infection that could indicate sepsis, because these are probably our most common causes of hypotension in someone his age. Then I will do a physical to confirm what I find in the history. And then I will order labs to check for myocardial infarction or infection or blood loss.”

We soon found ourselves face to face with the patient, and I immediately noticed that he looked uncomfortable. He was grimacing and there was sweat on his forehead. The monitor above his bed indicated a blood pressure of 70/30 and a pulse of 88. His respiratory rate was 22 and his oxygen saturation was 95%. The intern began his history while I worked with the nurses and technicians to establish intravenous lines for fluid administration. I had this terrible feeling that something bad was about to happen. I listened as the patient described a surgery a week ago in which his gallbladder had been removed, and I immediately wondered whether the current problem might be related to the surgery—perhaps a complication causing sepsis or bleeding. The man explained that he was diabetic, did not drink alcohol, and had been doing well after his surgery until today. The very fact that the man could talk cogently temporarily reassured me; it meant that the flow of blood to his brain was sufficient to allow him to think and talk, and that there was likely enough time to gather more information without taking other action.

At this point I was searching for some obvious patterns—chest pain that might lead us toward a cardiac cause for the low blood pressure; or chills, fever, or redness around the surgical site that might lead toward a diagnosis of infection. Doctors in our department had recently misdiagnosed some early sepsis cases, and I had attended lectures about the importance of early recognition of sepsis and treatment with antibiotics. I was tempted to order broad-spectrum antibiotics just in case this could be septic shock. I listened to the intern’s questions and the patient’s answers, hoping that some clues would appear, but there was no history of chest pain or shortness of breath, fever, or blood loss, just a fullness in his stomach that made him want to vomit. I put my hand on the man’s abdomen wondering whether he might have a ruptured aortic aneurysm, but the abdomen felt soft and the man did not wince when I applied pressure. I was beginning to feel very uneasy, as no patterns had emerged, yet the man had a bad look on his face. I did not feel comfortable waiting for the results of blood tests and imaging, because I sensed that this man might deteriorate and have a cardiac arrest, and then it would be too late to alter the course of his illness. But I did not have a specific diagnosis that would justify a treatment.

As the intern began his physical exam, the man suddenly sat forward and had a look on his face that I have come to recognize over the years. Something very bad was about to happen. And then he vomited about a liter of blood. This was both dramatic and terrifying and gave us the answer for the cause of the hypotension as well as some clear direction for actions we could now take. We hurried to administer blood through our three intravenous lines, called the intensive care unit and the gastroenterology fellow, and began to puzzle out why this man’s stomach might have filled with blood. An ulcer? Gastritis? Fortunately, the bleeding slowed and the man responded well to the blood we gave. His blood pressure came up and he was soon resting comfortably in the ICU.

The intern and I discussed the approach to hypotension and how the “uncomfortable feeling” in the man’s stomach and the hypotension might have led us to the right diagnosis if not for all the other confusing information about the recent surgery, the recent cases of sepsis that had been missed, and the time pressures that clouded our thinking. Although we may not have committed any errors, we also had not come up with the correct diagnosis. The case got me to thinking about what we know about the process of diagnostic decision making in medicine and how errors and delays occur.

In this month’s issue of Academic Medicine we have two reports1,2 and a Commentary3 about diagnostic decision making. The reports are by researchers who have contributed significantly to the scholarship of diagnostic thinking over the years. The first authors of the two reports, Henk G. Schmidt, PhD, and Geoffrey R. Norman, PhD, had previously collaborated with a third author on an article4 published in 1990 in this journal that outlined a theory of clinical reasoning that emphasizes how expert decision making progresses through stages related to the accumulation of experience with specific clinical problems. According to their theory, novice physicians generally follow a slow, deliberate, analytic approach that depends upon an understanding of basic mechanisms of disease and pathophysiology to reach a diagnosis. Experts, however, often rely on recognizing patterns of clinical presentations that they have seen in the past, and can switch to a more analytic approach when the patterns do not match their previous experience. The theory informs thinking about the nature of expertise and how to help our trainees attain it. Rather than training students and residents only in how to think and solve problems, the authors emphasize the importance of exposure to a variety of problems to help create mental models and patterns that can augment the understanding of basic pathological mechanisms.

In our current issue, Schmidt et al1 explore the effect of exposure to recent clinical information upon the later diagnostic thinking of physicians and show that such exposure can create an unconscious bias that will lead to erroneous diagnoses. In the case I described earlier, the recent lectures about the early diagnosis of sepsis may have created a bias that made me incorrectly elevate the possibility of sepsis as a cause for hypotension for my patient. The authors also suggest that there may be a way to counteract bias through slowing down, reexamining the information, and using careful analytic thinking.

Also in the current issue, Norman et al2 explore whether being slow and thorough has any advantage in producing accurate diagnoses. Using prepared written cases with resident physicians as participants, they show that slowing down to take more time in analyzing a clinical vignette did not improve accuracy. They conclude that this finding may contradict the prevailing view of the source of most errors in clinical decision making, in which errors have been blamed on the use of the more rapid, intuitive, pattern-based thinking, and a slower, more analytic approach has been considered better.

Croskerry et al3 in their Commentary caution against interpreting the results of Norman and colleagues’ study to mean that slowing down during a difficult case would not help reduce errors in decision making. In fact, they strongly maintain that in teaching the strategies for optimal clinical decision making, we not only need to encourage slowing down but also need to explain why it is necessary, that is, what is involved in the analytic process. Moulton et al,5 in their review of theories of expertise, suggest that experts are able to recognize when patterns do not seem to fit their previous experience and that a slower, more analytic type of reasoning will then be needed to find an appropriate solution. It is the ability to recognize ill-defined, unstructured problems, and the need to approach them differently from the usual pattern recognition problems, that may differentiate experts from individuals who are experienced but lack true expertise. In the case that I described above, it was quickly apparent to me that this patient’s presentation of hypotension did not fit a pattern I was familiar with, which set me on a course toward a more analytic approach. I abandoned that approach when the patient vomited blood, which clarified the cause of the hypotension.

So what are the implications of these reports and the Commentary? I find three, which I have stated as questions below:

1. What are the implications for prevention of medical error in the clinical environment? Such error is caused by a combination of individual provider errors and systems problems. An understanding of how providers may err in their diagnostic thinking may help us develop aids such as care guidelines and checklists that can assist providers at different levels of expertise. We may also be able to improve our education of providers so that they can recognize high-risk situations and modify their approach when needed. However, a comprehensive approach to reducing medical error and improving patient safety will also involve examination of the environment of care and improvement of care systems. Patient safety experts assume that errors will occur, and they attempt to create care environments that will catch errors before they cause harm to patients. For example, if a physician prescribes a medication to which a patient is allergic, the nurse or pharmacist will usually catch the error through his or her routine review of allergies with the patient, and prevent the error from causing harm.

Provider cognition problems should be considered in the overall context of error reduction within the environment of care.6 We need to continue to work on designing safer clinical environments that will support providers in their decision-making activities and build in redundancy and other mechanisms so that errors that do occur will not cause patient harm. As we continue to gain an understanding of how providers make clinical decisions, we will need to link what we learn with implications for improving the safety of our clinical systems of care.

2. What are the implications for research on physician cognition? The two studies and Commentary continue the debate about the roles of various types of thinking (slower and more analytic, or faster and more intuitive) in medical error, and whether slowing down during certain situations and using a more analytic type of thinking will reduce errors. However, we must consider whether the research models of future studies can be extrapolated to actual clinical environments. Do the results of studies on cognition based upon written scenarios apply to actual clinical situations in which information is often unreliable, complex, and disorganized? I think Croskerry et al3 address these questions when they recommend that further research on clinical decision making

should investigate operating characteristics of the decision maker under both natural and experimental conditions as well as the influence of the many contextual variables in clinical medicine…. However, the clinical applicability of findings remains paramount.

3. What are the implications for teaching about diagnostic thinking and medical error? Students must be assisted in their progression toward expertise in diagnostic thinking by being exposed to a variety of cases and problems, under the supervision of expert faculty. Assessment will need to be appropriate for the different stages through which students’ diagnostic expertise progresses. The development of cognitive clinical expertise should be an essential objective of medical education. Furthering our understanding of the mechanisms whereby that occurs and how to best support and nurture it should be a high priority for the medical education community, both for the improved expertise of our students and the safety of our patients.

David P. Sklar, MD


References

1. Schmidt HG, Mamede S, Berge KV, et al. Exposure to media information about a disease can cause doctors to misdiagnose similar-looking clinical cases. Acad Med. 2014;89:285–291.

2. Norman G, Sherbino J, Dore K, et al. The etiology of diagnostic errors: a controlled trial of system 1 versus system 2 reasoning. Acad Med. 2014;89:277–284.

3. Croskerry P, Petrie DA, Reilly JB, Tait G. Deciding about fast and slow decisions. Acad Med. 2014;89:197–200.

4. Schmidt HG, Norman GR, Boshuizen HP. A cognitive perspective on medical expertise: theory and implication. Acad Med. 1990;65:611–621.

5. Moulton CA, Regehr G, Mylopoulos M, MacRae HM. Slowing down when you should: a new model of expert judgment. Acad Med. 2007;82(10 suppl):S109–S116.

6. Sklar DP, Crandall C. What do we know about emergency department safety? Agency for Healthcare Research and Quality: Morbidity and Mortality Rounds on the Web. June 2010. http://webmm.ahrq.gov/perspective.aspx?perspectiveID=88. Accessed October 16, 2013.

© 2014 by the Association of American Medical Colleges
