Breaking News: Under Pressure: Dual Process Theory for Avoiding Errors

Shaw, Gina

doi: 10.1097/01.EEM.0000437838.30259.f3

    A 60-year-old woman presented to the ED after a fall with a deep laceration to her palm. The experienced physician checked her neurovascular function, and then irrigated and sutured the wound. He instructed the patient to follow up with her family doctor 10 days later to have the sutures removed.

    But when she arrived at her doctor's office for suture removal, the woman commented that she couldn't move her thumb well. “Well, at your age, when your thumb's been immobilized for this long, it can become stiff,” her doctor said. He prescribed physiotherapy.

    Arriving at the hospital about a week later for her physiotherapy appointment, the patient decided to go back to the emergency department first. She recounted her story, and this time the treating emergency physician did an x-ray that revealed that the thumb was dislocated at the interphalangeal joint. The dislocation had gone on so long — nearly three weeks — that all that could be done was to refer the patient to plastics, where the joint was fused.

    “How did a very competent ED physician miss a dislocated thumb?” asked Pat Croskerry, MD, PhD, a professor of emergency medicine at Dalhousie University in Nova Scotia and a leading expert on cognitive psychology and diagnostic errors in emergency medicine. “At the end of the day, he seems to have focused too much attention on the soft tissue wound, the laceration.”

    Dr. Croskerry encountered this case as a new department head in emergency medicine a number of years ago. Concerned and intrigued by the many errors he was seeing from his department-wide perspective, he recorded and categorized this case as one of many.

    “I went to the psychology literature to see if anything corresponded to these patterns of behavior, and lo and behold, psychologists had done their work on cognitive biases and heuristics starting back in 1975, and they had documented many of these cognitive errors experimentally,” Dr. Croskerry said.

    Wikipedia now lists more than 90 such “cognitive biases” that can lead to error, including “recency bias” (the tendency to weigh recent events more than earlier events), the “ambiguity effect” (avoiding options for which missing information makes the probability seem unknown), “anchoring or focalism” (the tendency to rely too heavily on one trait or piece of information when making decisions), and “search satisficing” (calling off the search before all reasonable possibilities have been examined), which was the error that led the emergency physician to miss the dislocated thumb.

    After “more or less abandoning” psychology for medicine, Dr. Croskerry found that cognitive psychology was an ideal vehicle for understanding medical errors, particularly those made in the fraught, intense environment of the emergency department.

    “Some people have described the ED as a natural laboratory for the study of error, and it could also be one for the study of decision-making,” he said. “In the ED, you get conditions like cognitive overload because you are backed up and under pressure to move patients quickly. You get fatigue, you get sleep deprivation with people coming off night shifts, and you can also get a sort of dysphoria, a disruption of your affect. Those categories have proven to be the precise conditions under which decision-making is threatened.”

    One of Dr. Croskerry's greatest contributions to the understanding of medical errors in general and emergency medicine in particular is his elucidation of dual process theory as it applies to the physician under pressure. Dual process theory “broadly divides decision-making into intuitive (System 1) and analytical (System 2) processes. System 1 is especially dependent on contextual cues,” he wrote in “Context Is Everything or How Could I Have Been That Stupid?”

    “There appears to be a universal human tendency to contextualize information, mostly in an effort to imbue meaning but also, perhaps, to conserve cognitive energy. Most decision errors [have] ... two major implications. The first is that insufficient account may have been taken of the context in which the original decision was made. Secondly, in trying to learn from decision failures, we need the highest fidelity of context reconstruction as possible.” (Healthc Q 2009;12(Sp):e171.)

    Most emergency physicians live much of their lives in the System 1 mode of decision-making, noted Scott Weingart, MD, an associate professor and the director of emergency critical care at the Icahn School of Medicine at Mount Sinai in New York. “But the problem is that if our System 1 processing has built-in errors due to lack of experience or poor reflection, we might automatically respond in a way that leads to patient damage.”

    He cites the sacred heuristic of the ABCs as an example: any time an emergency physician is presented with an unknown situation with a patient, the fallback is to manage airway, breathing, and circulation. “That will usually take care of any life-threatening priorities,” Dr. Weingart said.

    But not always. What if, for example, you have a patient experiencing an ST-elevation MI who suddenly loses consciousness while talking to you? “If you go to the ABC fallback, you'll manage the airway and start bagging and only then assess circulation,” he said. “But when a patient with an ST-elevation MI becomes unconscious and goes into arrest, that means the rhythm has deteriorated into V-fib. In that instance, you need to just grab the defibrillator and shock after checking the rhythm. I've seen residents, before they've seen an example of this, grabbing difficult airway management equipment and getting someone to do CPR. But each second you waste decreases the patient's chance of being successfully converted.”

    That scenario, Dr. Weingart said, starts out as a System 2 — requiring more analytical thought processing — but should become a System 1 after experience. On the other hand, what if you have a patient with ST-elevation MI who is receiving heparin and about to be sent to the cath lab, but you notice that something doesn't seem quite right, that this patient appears to be in significantly more pain than the typical ST-elevation MI patient?

    “The natural, System 1 instinct would be to rush the patient to the cath lab even more quickly,” Dr. Weingart said. “But an experienced emergency physician should learn that every time you see something a little bit weird, you need to take a step back and adopt a System 2 stance. In other words, slow down. That seems deliberately counterintuitive to your natural inclination, which is that when a patient appears sicker, you move more quickly. But if your ST-MI patient is experiencing an aortic dissection, which could be the cause of the extreme pain, sending him to the cath lab is not what he needs.”

    Dr. Weingart said he and his colleagues in clinical education simulate all the situations they can imagine in which intuitive processing will fail. “We want people to get the System 1 corrections that make for good heuristics instead of bad ones by being forced into bad situations.”

    And to do that, they have to be, well, not particularly nice to their trainees. “Part of our simulation is to induce enormous amounts of stress in our participants so that the things we build in during the simulation will be replicated in real-life stressful environments,” he said.

    A simulation actor playing a nurse may act inept, for example, and a simulated surgical consultant may offer a barrage of criticism. “If we're nice and gentle and there are no bad stressors, they may be able to respond nicely for the oral boards, but when they are under stress because a real-life patient is dying in front of them, they won't be able to activate the same area of their brain. Our sims are quite trying and leave you with a pulse rate as high as if you were running a marathon.”

    Medical schools are now putting more effort into teaching future physicians about how decision-making actually works in practice, Dr. Croskerry said. “We do believe that you can teach people how to ‘de-bias’ themselves and employ strategies to avoid cognitive errors.” He laid the groundwork for developing “cognitive de-biasing” strategies in two BMJ Quality & Safety papers. (Aug. 7, 2013, and Aug. 30, 2013 [Epub ahead of print].)

    “Most people who work in the ED have brains that are automatically in System 1. Your brain doesn't consult you. It sees a pattern and reacts,” he said. “But the key is to know when to get out of System 1. Some might say there's no shortcut to that kind of clinical acumen; you have to get burned and get a lot of miles under your belt.”

    But Dr. Croskerry said he believes you can teach at least some of it. Critical thinking is integrated into medical education at Dalhousie from the very beginning. “We teach students the dual process model and the properties of the two categories from week one,” he said. “They learn about cognitive biases, the four conditions that threaten decision-making, and how to recognize red flags. These are not things medical students are usually taught, but if they understand them, I believe they will become better decision-makers sooner.”


    © 2013 by Lippincott Williams & Wilkins