Emergency Medicine News:
Dr. Welch is a fellow with Intermountain Institute for Health Care Delivery Research, an emergency physician with Utah Emergency Physicians, and a member of the board of the Emergency Department Benchmarking Alliance. She has written two books on ED operational improvement; the latest, Quality Matters: Solutions for the Efficient ED, is available from Joint Commission Resources Publishing.
You will make mistakes today. You are hardwired to. Given a simple arithmetic problem, there is a three percent chance you will make an error in solving it, and a 10 percent chance that even on inspecting the problem you will fail to recognize the error. (Patient Safety in Emergency Medicine. Philadelphia: Lippincott Williams & Wilkins; 2009.) Depending on the task, human performance can be expected to fail between one and 10 percent of the time. Human factors engineering studies the limits of human performance and strategies for improving it when people use tools, equipment, systems, and environments. It also examines the user and the activity, and how that interaction can go awry.
The ED is an inherently unsafe environment, one highly conducive to error. Many aspects of our work environment make this so. Distractions and interruptions of work flow abound in the emergency department; emergency physicians are interrupted 10 times per hour. (Acad Emerg Med 2000;7:1239.) Between time pressures and the huge amount of information to be managed, hand-offs are casual, and communication occurs on the fly and in sound bites. Groups of people who are barely acquainted work together, performing tasks under pressure that they do not perform regularly, and these tasks are not executed in a consistent fashion. Much of the work is not sequenced, scripted, or performed as teamwork. Standardization has yet to be embraced by EDs, so similar scenarios are managed differently and learning is difficult. The technical work we do requires equipment and supplies that are often not standardized or predictable, and this can lead to operator error. In short, the chaos of the work environment promotes medical error, not patient safety.
There are no quick fixes for safety in the ED, but human factors research is a good place to start this monumental task. Sometimes the fixes are very simple and low cost. In a high-profile medical error case, the actor Dennis Quaid went public about an adverse event involving his twin infants at Cedars-Sinai Medical Center. The newborns were being treated for possible neonatal sepsis, and each had an IV line. The infants and a third child accidentally received high-dose heparin instead of heparin flush. Look at the old vials of the two solutions produced by Baxter.
They were of similar size and shape, and the labels differed only in shade of blue, not in color. Baxter made numerous changes to the design of the bottle, including its size, shape, and color, and added a tear-off warning label to the more dangerous solution. Because this error arises from the limits of human performance, the design was changed to prevent it.
Dr. Croskerry, in his book on ED patient safety, recounts another tragedy due to human error. A child was being connected to an EKG monitor, and the lead cable was accidentally inserted into the power source of an infusion pump. Though made by different companies and designed for different purposes, the cables were compatible and interchangeable, and the child was electrocuted. There also have been cases in newborn nurseries of infants inadvertently receiving breast milk tube feedings through an IV. The possibilities for such errors in our work environment are too numerous to count. Shouldn't the wrong thing be, by design, the impossible thing to do?
This is going to require that we get upstream of our errors and develop systems for dealing with near misses and full-blown errors. Do you have a system in your department for catching and learning from near misses? Too often in the fast-paced ED that never closes, it takes a sentinel event with a catastrophic outcome to bring about any change for safety. This should be the challenge to all of us. We get up in the morning knowing that we will inevitably make mistakes, and we know these mistakes may exact a higher price from patients and society than, say, the mistakes of a pastry chef. Shouldn't we mandate that, at the very least, we look for these mistakes and try to design around them? Shouldn't the ideal ED be an environment where it is impossible to do the wrong thing?