Human Factors in Medicine: A Medical Error Model that Isn't Full of Holes : Emergency Medicine News


Human Factors in Medicine


A Medical Error Model that Isn't Full of Holes

Jedick, Rocky MD, MBA

doi: 10.1097/01.EEM.0000898248.08659.aa
    Keywords: Swiss cheese model, medical error

    I was remembered for two things in my residency senior roast video: always bringing my daughter to resident events (aside from the annual pub crawl) and reliably referencing the Swiss cheese model at morbidity and mortality conferences.

    I was first introduced to this model during an aviation mishap course early in my career as a U.S. Air Force flight surgeon. When a plane crashes, a massive investigation is launched immediately to determine causal factors that can lead to real change and prevent future incidents. This is flight safety.

    This approach has been fine-tuned over decades. It is standardized, thorough, and scientific. Emotional insults and blame are removed to encourage transparency and participation.

    The safety feedback loop utilized by the Federal Aviation Administration, the National Aeronautics and Space Administration, and the Department of Defense is quite impressive. The Swiss cheese model is intimately intertwined within the mishap investigation process. This approach lies in stark contrast to the one for medical errors in which blame is readily assigned to individuals.

    The Cheese Stands Alone

    Studying the causation of human error actually has a fairly long history dating back to the Industrial Revolution. Many concepts developed during that time were later incorporated into aviation, nuclear energy, space, and other highly reliable industries where an error rate close to zero is eagerly sought. Many argue that health care should operate in a highly reliable way, but we know this seldom happens in practice.

    The Swiss cheese model was developed in the 1990s by James Reason, PhD, a psychology professor at the University of Manchester. It was an advancement over earlier frameworks, which viewed poor outcomes as the final events in a linear chain of occurrences, often illustrated as a line of falling dominoes.

    Each slice of cheese in this model represents a human system of defense against error. The holes in each slice represent the potential for error. A bad outcome occurs when the holes line up and permit a trajectory of accident opportunity. (BMJ. 2000;320[7237]:768.) This paradigm can be used to minimize error potential when designing complex systems and to identify causes of error during mishap investigation.

    Each slice of cheese can fit into one of four levels of potential failure: unsafe acts, preconditions for unsafe acts, unsafe supervision, and organizational influences. To understand what causes a plane to crash, aviation safety organizations layer a taxonomy of possible causes of error (the Human Factors Analysis and Classification System, or HFACS) over the Swiss cheese model. Investigators trying to determine what went wrong identify contributing and causal factors. Unsafe acts are typically the proximate cause of the mishap and are classified as errors or violations: a violation is intentional, an error is not. Preconditions are issues within the environment, an individual's physiology, or an interpersonal dynamic that may have contributed or led to the final unsafe act. The other two levels consider patterns of behavior or culture at the supervisory and organizational levels.
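    The layered-defense idea above can be sketched in a few lines of code. This is a minimal illustration, not part of HFACS or Reason's work: it assumes each of the four levels is an independent "slice" with a made-up probability that its hole lines up, and counts how often all four holes align in simulation. The level names match the model; the probabilities and function names are hypothetical.

```python
import random

# Illustrative failure probabilities per HFACS level (invented for
# this sketch; real investigations do not assign numbers like these).
HFACS_LEVELS = {
    "organizational influences": 0.05,
    "unsafe supervision": 0.05,
    "preconditions for unsafe acts": 0.10,
    "unsafe acts": 0.20,
}

def accident_occurs(failure_probs, rng):
    """A bad outcome requires the hole in EVERY slice to line up."""
    return all(rng.random() < p for p in failure_probs.values())

def simulate(trials=100_000, seed=42):
    """Estimate the accident rate over many independent trials."""
    rng = random.Random(seed)
    accidents = sum(accident_occurs(HFACS_LEVELS, rng) for _ in range(trials))
    return accidents / trials

rate = simulate()
```

    Under the independence assumption, the accident rate is roughly the product of the per-layer probabilities (0.05 × 0.05 × 0.10 × 0.20 = 0.00005), which is the model's core insight: even leaky defenses, stacked, make the aligned-holes trajectory rare.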

    Where these processes really demonstrate their power, however, is in how the industry incorporates findings into a continually improving feedback loop, leading to real change that prevents similar future incidents. The proof is in the numbers: in barely 100 years, aviation has gone from a highly dangerous new technology to the safest mode of transportation. (International Air Transport Association.)

    Applications in Medicine

    Elaine Bromiley tragically died during routine surgery in 2005 because of a number of egregious errors in judgment and procedure. Her husband, Martin, a commercial pilot, responded to her death by starting the Clinical Human Factors Group to bring aviation's safety culture and procedures to medicine. (Clinical Human Factors Group.) He famously said, “It wasn't the clinicians that failed; it was the system and training that failed by making it hard to do the right things.”

    This high-visibility case and a growing interest in patient safety allowed Mr. Bromiley's group and others like it to push health care to incorporate best practices from highly reliable organizations to prevent errors. But we still have a long way to go.

    Health care faces many obstacles that make the aviation model for safety challenging. Medical errors often go unknown or unreported, unlike the national attention paid when an airplane crashes, which allows the industry to stop and conduct a massive investigation. And because of the tempo of operations, a medical event rarely receives any analysis. Even in the Bromiley case, it took incredible persistence from her husband to force the hospital to produce a document similar to an aviation mishap report. (EMCrit.)

    Except for massive sentinel events, we are just too busy to stop, debrief, and learn from our mistakes. The health care system is actually a non-system of private, public, and nonprofit entities that interact with a variety of affiliated intermediaries—the pharmaceutical and insurance companies. This makes standardization of procedure, data collection, and incorporation of safety findings nearly impossible at scale. It also seems medical providers have a different cultural appreciation for safety compared with their pilot counterparts. Pilots train with checklists from day one. Physicians train by applying working memory.

    It's nerdy and I was often teased about using it, but the Swiss cheese model is the best approach I know to identify the elusive underlying causes of poor outcomes. It provides a well-researched vocabulary and a standardized approach, avoids the temptation to blame individuals, and allows users to identify triggers that can be changed or modified to prevent future mishaps or sentinel events. Medicine can be practiced more safely.

    Dr. Jedick is a board-certified emergency physician who works in EDs in Las Vegas and as clinical faculty at the University of Utah. He also practices aviation medicine, previously serving as an active-duty flight surgeon in the U.S. Air Force with several fighter squadrons and now in the Utah Air National Guard. He is also an FAA aviation medical examiner and previously completed a space medicine clerkship with NASA. Follow him on Twitter @RockyJedickMD. Read his previous columns at

    Copyright © 2022 Wolters Kluwer Health, Inc. All rights reserved.