Emergency Medicine News: April 2007 - Volume 29 - Issue 4
doi: 10.1097/01.EEM.0000269598.43898.63
Quality Matters

From Aviation to Anesthesia: Creating a Culture of Safety

Welch, Shari J. MD


Author Information

Dr. Welch is the quality improvement director in the emergency department at LDS Hospital in Salt Lake City, a clinical faculty member at the University of Utah School of Medicine, faculty at the Institute for Healthcare Improvement and the Urgent Matters Project for the Robert Wood Johnson Foundation, a quality improvement consultant to Utah Emergency Physicians, and a member of the Emergency Department Benchmarking Alliance.

In 1848, a young woman visited her physician to have a problem toenail removed. Fifteen-year-old Hannah Greener was given chloroform to sedate her for the procedure, but received too much and sustained a cardiac arrest. Efforts at resuscitation, which included pouring brandy and water down the unconscious girl's throat, were unsuccessful. Hers was the first recorded death due to complications of anesthesia. More than 150 years later, the Hannah Greener case is still debated and discussed.


More people die each year from medical error than from breast cancer, motor vehicle crashes, or AIDS. (To Err Is Human. 1999. Institute of Medicine. National Academy Press, Washington, D.C.) In fact, according to the Harvard Medical Practice Study, which estimated medical injuries in hospitals, mortality from iatrogenic causes likely approaches 100,000 deaths a year in the U.S. (Such projections were the foundation for the 100,000 Lives Campaign organized by the Institute for Healthcare Improvement last year.)

In addition, the National Patient Safety Foundation revealed in a study 10 years ago that roughly one in six Americans had a personal experience with a medical error. (Louis Harris and Associates, 1997. National Patient Safety Foundation at the AMA: Public Opinion of Patient Safety Issues Research Findings, Rochester, NY.)

Part of the problem lies in the culture and craft of medicine, which emphasizes the skill and character of the physician as the main weapons against medical error. The suggestion is that quality and safety will be achieved if the physician has integrity, cares about his patients, and works and studies hard. Meanwhile, a growing body of research has shown that the vast majority of medical errors (more than 80 percent) are system-derived, meaning system flaws set good people up to fail.

In fact, most experts suggest that only five percent of medical errors are due to incompetent or poorly rendered care, while 95 percent of errors that cause harm involve conscientious, competent caregivers trying to achieve good outcomes for their patients. (Michael Leonard, 2004. Achieving Safe and Reliable Healthcare. Health Administration Press. Chicago, IL.)

These concepts are important to grasp when looking at the current system for monitoring and managing medical errors. The model widely used in health care is a find-the-bad-apple approach: it identifies individuals and individual performance as the cause of a bad outcome, and it becomes active only after a mistake or sentinel event is recognized. Rather than anticipating human error and designing systems to prevent it, this shame-and-blame model has been applied, with little success, for several decades.

The incident report, a retrospective investigation conducted in whispers that punishes individuals, is known to be a flawed and ineffective strategy for creating a “safety culture,” yet it remains the most common methodology hospitals use for dealing with adverse events. Even if weeding out health care's bad apples through these means were successful, it would decrease medical errors by only five percent because 95 percent of errors derive from system flaws and failures. So how do we build a “safety culture”? How has such a culture been created in the aviation industry? In anesthesia?

In the late 1970s, a series of commercial airline mishaps led to several exhaustive studies looking for the root causes of aviation errors. One of the sentinel events involved a United Airlines plane heading into Portland, OR, after a shock absorber broke and the landing gear descended prematurely. Though the crew had contingency plans for landing safely, the pilot was preoccupied with the malfunction and ran out of fuel while waiting for instructions from the ground. The pilot failed to verbalize his problems, and the crew did not voice their concerns about fuel. The plane crashed in a wooded area six miles from the runway, and 10 people died because of a lack of communication and teamwork.


Aviation's Lessons

A sequence of similar mishaps and an investigative study by Helmreich led to the conclusion that 70 percent of commercial airline crashes were the result of communication errors. (Scientific American 1997;5:62.) Interestingly, the Joint Commission on Accreditation of Healthcare Organizations, in its analysis of adverse events in hospitals, has concluded that a similar percentage of these events in health care are due to communication failures. (Joint Commission Sentinel Event Alert, Issue 26, June 17, 2002; www.jointcommission.org; accessed Jan. 9, 2007.)

Around the same time, the commercial aviation industry in the U.S. began to look at human factors research, including the influence of teamwork, communication, and fatigue on performance and safety. The major airlines began mandating teamwork training for cockpit crews, starting the cultural change that has evolved into the “safety culture.” This culture fosters open reporting of near-misses and incidents without fear of reprisal; it acknowledges that human errors will occur and strives to create safeguards against them. A culture of teamwork, in turn, is valued and promoted.

Most importantly, the aviation industry embraced standardized procedures, guidelines, and protocols. Critical actions are checked and rechecked in the cockpit. By decreasing variation, the airlines increased predictability and safety, a recipe now beginning to gain favor in the house of medicine. Anesthesiologists have applied it rigorously in the OR in a movement to prevent anesthetic complications that began more than 50 years ago.

In 1954, Henry Beecher, an anesthesiologist at Massachusetts General Hospital, began to examine the safety of anesthesia. His study concluded that anesthesia was “twice as deadly as polio!” In 1984, Cooper examined the management of errors and equipment failures in the OR, and noted that 70 percent of the incidents were due to human error. (Anesthesiology 1984;60[1]:34.) Anesthesiologists began an open dialogue about how to prevent harm and standardize care, and technology was deployed to prevent accidents.

Some of the innovations instrumental in improving the reliability and safety of anesthesia include pulse oximetry, end-tidal CO2 monitoring, ultrasound for line placement, continuous EKG monitoring, frequent blood pressure monitoring, and intraoperative documentation standards.

Anesthesia has become safer as a result of these innovations and standardizations, and it is now the most reliable subspecialty in medicine. In 1954, one of every 1,500 patients died of anesthetic complications; by 2001, that number was one in 250,000.


For health care as a whole to become safer and more reliable, it must work to duplicate the culture of safety seen in the air and in the OR. It will need to build systems that compensate for human error, with safeguards and redundancies that make human performance better than it is. (Nolan T. 2004. Improving the Reliability of Healthcare. Innovation Series 2004. Institute for Healthcare Improvement, www.ihi.org, accessed Dec. 12, 2006.) It also will need to take the gags off frontline workers and encourage a climate of open disclosure about near-misses and anticipation of these errors.

Team behaviors will need to be incorporated throughout health care, and the captain-of-the-ship model will have to give way to something more akin to a SWAT-team model. Won't that be a refreshing change? When the stakes and stress are high in the trenches, the team works to solve problems, and everybody watches everybody else's back to guard against the errors we know we will make.

© 2007 Lippincott Williams & Wilkins, Inc.
