Emergency Medicine News: December 2006 - Volume 28 - Issue 12
doi: 10.1097/01.EEM.0000288911.80528.7a
Quality Matters

Human Error in the Emergency Department

Welch, Shari J. MD


Author Information

Dr. Welch is the quality improvement director in the emergency department at LDS Hospital in Salt Lake City, a clinical faculty member at the University of Utah School of Medicine, faculty at the Institute for Healthcare Improvement and the Urgent Matters Project for the Robert Wood Johnson Foundation, a quality improvement consultant to Utah Emergency Physicians, and a member of the Emergency Department Benchmarking Alliance.

Problem solving by the human mind is an amazingly complex phenomenon, and through research in human factors and cognitive function, we are learning that errors are a byproduct of human cognition. (JAMA 1994;272[23]:1851.) In the book To Err Is Human, the Institute of Medicine brought the notion of human error and its role in patient safety to the national consciousness. (National Academy Press, Washington, D.C., 1999.)


This has helped health care professionals make an important and fundamental paradigm shift from the “bad practitioner/incident report” model to the “bad system/safety culture” model for addressing medical error. The shift is still in progress, and the attendant work has only just been undertaken. (JAMA 2002;287[15]:1997; Ann Emerg Med 1999;34[3]:373; Qual Saf Health Care 2004;13:255.)

The study of mistakes is fascinating and tremendously important in emergency medicine, where practitioners make hundreds of decisions in a single shift. By understanding how we are likely to err, we can build an environment that helps us get it right, so that reliability and patient safety are incorporated into every operation and clinical process in the emergency department.

As I pointed out in a previous column (EMN 2006;28[9]:41), reliability in medicine (by definition, the right care for the right patient in the right time frame) is only 80 percent to 90 percent overall, and adverse events occur in four percent of admitted patients, accounting for one million injuries and 180,000 deaths a year. (N Engl J Med 1991;324:377.) As Lucian Leape, MD, has pointed out, “Systems that rely on error-free performance are doomed to fail.”

Cognitive errors can be categorized as slips or mistakes. Slips are unintended acts and can be further broken down into four types. First, there is capture, in which a frequent schema takes over. An example would be driving to work when you left the house to go to the dentist. The second type of slip is referred to as a description error. In this instance, the right object is used for the wrong action, such as eating soup with a fork. The third type of slip is associative activation, an incorrect mental association, such as answering the door when the phone rings. The fourth type of slip is referred to as loss of activation, which is quite simply a memory lapse, such as walking into a room and forgetting why you came. These slips are frequently triggered by interruption.

A number of factors increase the likelihood of slips, including frequent interruptions, fatigue, busyness, noise, heat, and visual stimuli. Alcohol and drugs also increase the risk. That sounds like the milieu of a busy working emergency department, doesn't it? (On your next shift, count how many slips you make!)


Cognitive Mistakes

Mistakes involve more complex cognitive functioning and are more difficult to understand. They can be categorized as knowledge-based errors and rule-based errors. With knowledge-based errors, the problem-solver lacks the critical knowledge or misinterprets the situation with wrong pattern matching. An example is a physician who matches respiratory distress and rales to CHF instead of ARDS. With rule-based errors, the rules are misapplied. A similar example is a physician who gives Lasix to a patient with rales, though the patient has pneumonia. There are four mechanisms that can lead to these types of mistakes:

* Availability heuristic.

* Confirmation bias.

* Coning of attention.

* Reversion.

In the availability heuristic, the operator uses a biased memory or the first information that comes to mind. The rest of the problem-solving may be flawless, but the process began with a flawed premise. Similarly, in confirmation bias, the operator looks for data to support an early hypothesis rather than letting the data lead to the diagnosis. This may be the most common process behind diagnostic mistakes in the emergency department. In coning of attention, the operator focuses on one source of information, ignoring other data and arriving at a flawed assessment. Finally, in reversion, the operator under adverse conditions falls back on old patterns of behavior even though they have proved inadequate in the past, and this leads to mistakes (usually operational ones).

By recognizing the common types of slips and mistakes that occur regularly, we can strive to create a working environment that minimizes them. For instance, we know that interruptions are the enemy of problem solving and cognitive functioning. Many emergency departments are creating “interruption-free zones.” One important zone is being created wherever nurses obtain and prepare medications for patients. Because medication errors account for almost 20 percent of adverse events, this would seem an obvious innovation for patient safety. Some departments have delineated these interruption-free zones with markings on the floor. In a similar vein, one could argue for such zones for physicians wherever they sit to review patient data and complete their charts.


A cultural shift must acknowledge the need for uninterrupted time and space for problem-solving. According to one recent study, an emergency physician performed on average 67 discrete tasks in a 180-minute interval and was interrupted (an interruption defined as lasting more than 10 seconds) more than 30 times! (Acad Emerg Med 2000;7[11]:1239.) The practice of staff interrupting physicians while they are dictating or processing patient information is widespread.

By understanding some of the mechanisms for making mistakes, physicians can make an effort to guard against them. In particular, confirmation bias and the availability heuristic are two mechanisms about which practitioners should be especially vigilant. Time constraints often push the clinician into making decisions and drawing conclusions quickly and with little data. Emergency physicians (perhaps more than other specialists) are frequently forced to operate in an information vacuum.

Most practitioners treat patients daily without a shred of information about their past medical history or medications. Often patient acuity makes this practice necessary, but it likely promotes a habit of reaching conclusions without adequate information, and perhaps with the confirmation bias I described. Hopefully, as we march into a future with the promise of the electronic medical record, this will become a thing of the past.

Besides creating an environment less conducive to slips and mistakes and understanding the mechanisms by which mistakes are made, the emergency department and its workers need to cultivate a heightened team awareness of where mistakes and errors are occurring. As part of the cultural changes mentioned at the top of this discussion, we need to pay attention to latent errors and near-misses. I recently heard about a scenario in which a dangerous situation for a patient was managed well by a physician and staff. The typical response on most clinical units would likely be congratulatory, with the tale told over and over as a story of heroics and successful management.

In high-reliability organizations (HROs), however, this near-miss would be analyzed, studied, picked apart, and seen as a potential failure. HROs maintain a heightened awareness of the mistakes and errors that lurk in their work environment, keep a keen eye out for them, and come up with plans for managing these near-misses. (Managing the Unexpected: Assuring High Performance in an Age of Complexity. San Francisco, Jossey-Bass, 2001.) This is something at which most emergency departments are woefully unskilled.

Another example of a near-miss in the emergency department recently occurred in one of my shops. There is currently a national shortage of Mefoxin (cefoxitin), the antibiotic of choice at our Level I trauma center for penetrating trauma and peritonitis. When the shortage hit, each surgeon wanted a different substitute antibiotic, and nurses found themselves working with medications with which they were not familiar. In one instance, three trauma team members (the team leader physician, the trauma PA, and the intern) each ordered a different antibiotic. The nurse caught it before the patient received three broad-spectrum antibiotics.

Rather than waiting for another near-miss medication error, a physician suggested that the surgeons agree on an alternative regimen and that nurses be quickly educated about the substitution. This type of preemptive strike is not typical in emergency medicine. In the old paradigm, the department would wait for a sentinel event before acting. In the new paradigm, an ever-vigilant team is on the lookout for those near-misses and ready to design systems to prevent them.

This is a new approach to our working environment. No longer should we blame the operator or wait for adverse events before acting. We are all about designing an environment where mistakes are anticipated and mitigated, where we create a setting conducive to uninterrupted problem-solving, and where we anticipate our own failures. And we are not ashamed or abashed to stand up and say, “To err is human,” and from that kernel of truth, build a health care system that is safe and reliable.

© 2006 Lippincott Williams & Wilkins, Inc.
