It is a fact of history that many of us who pioneered simulation in healthcare took as inspiration the experience of other intrinsically high-hazard industries such as commercial aviation. Many of us have argued forcefully that healthcare should adopt simulation comprehensively, in part to follow the model set by these industries. Yet, with simulation in healthcare having roots that go back decades, and with even aviation-inspired curricula in healthcare (like ACRM, first offered in September 1990) approaching their 20th anniversary, one can wonder why the healthcare industry has not embraced and implemented simulation as fully as have other industries, including commercial aviation, the military, or nuclear power. I would like to explore this analogy a little more deeply, reviewing what I see as the meaningful similarities, as well as the profound differences, between healthcare as an industry and these other risky human endeavors, focusing especially on commercial aviation.
This paper combines my own personal recollections with a more objective assessment of the industries. Some elements of this analysis are taken from my paper Structural and Organizational Issues in Patient Safety: A Comparison of Health Care to Other High-Hazard Industries (California Management Review, Fall 2000)1 while other aspects come from the talks I have given over the last 7 years.
The Flight Deck as a Cognitive Parallel for Dynamic Healthcare Settings
A little-known point about my own laboratory's development of the mannequin-based simulator in the late 1980s is that we started with the goal of creating a tool for understanding the cognition of anesthesiologists as they handle adverse events. We were driven to this by our analysis of the “chain of accident evolution” in anesthesia – the ways in which inciting triggers end up as major catastrophes if not interrupted by the intervention of the anesthesiologist. We made a number of conjectures about cognition in these settings:
“To recover from anesthesia incidents, the anesthesiologist must: 1) detect one or more of the manifestations of the incident in progress; 2) verify the manifestations and reject false alarms; 3) recognize that the manifestations represent an actual or potential threat; 4) assure continued maintenance of life-sustaining functions; 5) implement “generic” diagnostic or corrective strategies to provide failure compensation and allow continuation of surgery if possible; 6) achieve specific diagnosis and therapy for the underlying causes; and 7) provide follow-up of recovery to ensure adequate correction or compensation.”2
To provide empirical data to confirm these conjectures we needed a way to provide standardized adverse events to different clinicians to tease out the typical response behaviors. My own background as an avowed “aviation and space nut” stood me in good stead, giving me the knowledge of the existence of simulators in these arenas. That led us to create the CASE simulator series initially for this cognition research.3 Our experiments were described in several papers appearing in Anesthesia and Analgesia.4–6
As we looked for models in medicine of such dynamic decision-making processes, we didn't find them. Most of the literature on medical decision making was about quite static decisions and concerned highly mathematical and probabilistic techniques that couldn't readily account for the behavior of anesthesiologists that we had observed. But we did find models for such cognition in other industries, particularly aviation. The flight deck of an aircraft does have a number of cognitive parallels with anesthesia and other medical domains, such as intensive care or emergency medicine.
In all such settings most time is spent in ordinary and routine activities. As in the operating room, even in a busy ICU or emergency department the true crises are rare. “Hours of boredom, moments of terror” is a mantra on the flight deck as well as in the operating room. A “flight” has many similarities to “an anesthetic.” Each has a phase of preparation – analysis of the situation (weather versus underlying diseases) – and equipment checks. Takeoff is like induction, cruise is like maintenance, and landing is like emergence. Certain flights, and certain surgical cases, have key midstream milestones that must be anticipated and planned for. On both the flight deck and in the operating room there is a plethora of information sources, some of them mutually redundant, requiring dynamic allocation of attention but allowing cross-checking between sources.
Our own research had shown that anesthesiologists did not solve acute problems by direct application of deep abstract reasoning but rather by applying “precompiled” knowledge, doing the usual things about the usual problems. Aviation does the same thing, except that many of the responses have been codified into an emergency procedures manual; pilots are expected to know from memory the first few items of these procedures, but to use the written protocols themselves for anything further.
Both commercial flying and healthcare are conducted in crews and teams. We consider a crew to be one or more individuals from a specific discipline, sometimes with their own hierarchy; multiple crews working together make up a team. The flight deck crew are pilots (the days of nonpilot flight engineers in a 3-person flight deck are almost gone) with a “Captain” and a “First Officer.” The flight deck crew combine with the cabin crew to make up the aircraft team, and the team is larger still when air traffic control, airline dispatch, and maintenance are included. Similarly, the OR has a surgeon crew, a nursing crew, an anesthesia crew, and sometimes specialist technicians (such as cardiopulmonary bypass perfusionists). The crews work together as a team.
Aviation found that intracrew and intercrew coordination was a major feature of good problem solving and that failures of coordination were at the root of many accidents.7 We also felt, by introspection, that a substantial part of expertise in anesthesiology lay in the ability to coordinate the anesthesia crew members (whenever there was more than one) and in the ability to coordinate with the other crews, especially the surgeons.
Given these similarities, it is no wonder that we believed intuitively then, and still believe now, that it was worth adapting many practices of aviation for use in healthcare. The use of simulation in aviation has been extensive, both for teaching practical “stick and rudder” skills and, since the mid-1980s, for the so-called “nontechnical” skills known as Crew Resource Management (CRM).8,9 Having first heard about CRM, possibly in an episode of the PBS show NOVA called “Why Planes Crash,” we were fortunate that one key architect of early CRM worked at the nearby NASA Ames Research Center. This contact facilitated our exposure to the CRM approach, allowing us to rapidly adapt many elements of CRM into a simulation-based curriculum for anesthesiology (ACRM). The widespread adoption of the simulation-based CRM approach within anesthesiology and across healthcare disciplines and domains has been gratifying to watch. Clearly, the resonance we perceived between the cognitive and social-psychological aspects of work on the flight deck and in the hospital has been shared by thousands of others.
Nearly 20 years down the line in applying aviation concepts to health care, I still stand by the marked parallels at the level of the “sharp end” work itself. The dynamic thinking of people in dynamic fields of health care is much like that of pilots (and where it isn't yet, it probably should be closer). The 2 activities are not the same of course. Patients are not airplanes. Some aspects of health care are intrinsically different from aviation because of this fact. Other aspects are different not because of an intrinsic difference in the work but rather because of differences in the organizational structure of health care as an industry versus air transport as an industry. Let me explore some of each kind of these differences.
What Does It Mean That Patients Are Not Airplanes?
A major feature of the notion that “patients are not airplanes” is that people don't design and build human beings, whereas they do design and build airplanes. I like to say that no one provides an instruction manual for humans. These facts mean that the level of uncertainty about human beings is enormously greater than that about airplanes. Each plane of a given type will behave in nearly the same fashion given the same small set of key characteristics (eg, thrust, weight, altitude, angle of attack), whereas the diversity among human beings is enormous. Designers instrument airplanes to give key data that can be relied on to fly the plane, whereas in health care clinicians typically obtain a smattering of data (eg, blood pressure, ECG, oxygen saturation) from noninvasive external sources. Airplanes are usually in good shape when we fly them, and there aren't mechanics in the back working on the aircraft during a flight, liable to sever a hydraulic line or the like. A daily variable in flying is the weather, which has some parallel to the routine diversity of “patient acuity” that we deal with in health care. Still, in commercial flying, if the weather is bad enough the planes don't fly, regardless of how badly the passengers need to get where they're going. In health care, if the surgery is important enough it must go ahead regardless of whether the underlying disease state poses a danger. Another consideration is that health care is very personal. Most people don't care who the pilot of their airliner is as long as she or he is good at the job, and we don't care whether the same pilot flies us on one leg of our trip as on the next. But we do care a lot about our physicians, and having a personal relationship with them is perceived to be very important. Moreover, health care is full of issues of social norms and ethics that rarely enter the sphere of aviation.
Organizational Differences Between Healthcare and Other Industries
Integration and Economies of Scale: Both aviation and health care are extremely decentralized, in contrast to some other high-hazard undertakings that have been studied extensively (like aircraft carriers, of which the entire world has only 20, in the hands of but a few nations' navies). Annually in the U.S. there are more than 11 million departures by large airlines and somewhat over 30 million surgical procedures (the actual number is hard to come by), a roughly comparable figure. Both endeavors are conducted at hundreds or thousands of sites scattered all across the nation, some very large and some relatively small. In this respect the industries are comparable. But whereas only about 10 airlines are responsible for the 11 million flights, the surgical procedures are conducted at (on the order of) 4,000–6,000 hospitals and a similar number of standalone surgicenters (not to mention office-based surgery). These are owned by on the order of 1,000–6,000 firms (no one really knows the number of firms; there are some large hospital chains, but most hospitals are one or 2 of a kind).
The small number of firms gives airlines a huge economy of scale, and it greatly simplifies both official and unofficial safety regulation of the industry. A good safety idea, even if not an official “de jure” regulation, can be adopted by the industry nationwide if 10 firms think it is worth doing. In healthcare one would have to convince each of the many thousands of firms. There are a few examples of very large integrated health care organizations in the U.S. (the Veterans Affairs health system is one and Kaiser Permanente is another; both have had much-publicized safety efforts). Whether safety is actually greater in such integrated systems than in a nonintegrated collection of institutions of similar size, diversity, and scope remains to be seen, but it is conceivable that a system with some of the economies of scale and integration of the airline industry could come into being and demonstrate the safety benefits of such organization.
Accidents and the Means of Production
The rates of fatal accidents in these industries are markedly disparate. Between 2002 and 2006 there were on the order of 10.5 to 11.5 million departures annually on major airlines (Part 121 – see http://www.ntsb.gov/aviation/Table5.htm), with between 0 and 3 fatal accidents per year killing 0–50 people (median about 20). This yields a rate of fatal crashes on the order of 0.020 per 100,000 departures. In healthcare we do not know the rates of fatal accidents so clearly. An airplane is never supposed to crash, and when it does it is highly public and may harm dozens or hundreds of people. When it comes to health and disease, all humans are destined to die, and in the industrial world nearly all will die in proximity to healthcare activities. Sorting out the deaths that were due to accidents from those that were due only to the natural course of disease is difficult. Further, healthcare accidents are hidden and usually harm only one person. Still, taking the most wildly optimistic estimate of healthcare safety – say, the rate of fatal accidents due only to anesthesia care for healthy patients having routine surgery, on the order of 0.5 deaths per 100,000 cases10,11 – healthcare is still 25 times more dangerous than flying. For healthcare as a whole the gap is probably considerably larger.
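The arithmetic behind the 25-fold comparison is simple enough to check directly. A minimal sketch, using the approximate rates quoted above (both already normalized to events per 100,000):

```python
# Approximate rates quoted in the text, per 100,000 events.
aviation_fatal_crash_rate = 0.020  # fatal crashes per 100,000 airline departures
anesthesia_death_rate = 0.5        # anesthesia-related deaths per 100,000
                                   # healthy patients having routine surgery

# How many times riskier the (optimistic) healthcare figure is than flying.
ratio = anesthesia_death_rate / aviation_fatal_crash_rate
print(ratio)  # 25.0
```

Note that this already uses the most favorable healthcare denominator (healthy patients, routine surgery, anesthesia-only deaths); any broader definition of healthcare accident widens the gap.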
In aviation and other hazardous industries accidents harm workers and often the public, are highly publicized by the media, and generate lawsuits; cynically, of even greater concern is that a catastrophic accident destroys the means of production. In such a case the airplane (or, even worse in the power industry, a power plant) is removed from service and has to be replaced at great cost and disruption. This gives even the hard-hearted “bean counters” a healthy interest in avoiding accidents. In healthcare, by contrast, accidents or other episodes of suboptimal care harm a patient but do not (generally) harm workers or the means of production. When clinicians hurt a patient they may feel bad about it, but then they send for the next patient scheduled for that site. The recent announcement by the major U.S. government payer for healthcare (the Centers for Medicare and Medicaid Services) that it will no longer pay for certain preventable conditions, mistakes, or infections resulting from a hospital stay is a slight step toward stronger “business reasons” to avoid accidents, but even so the means of production are left intact. Imagine how much more seriously this would be taken if, every time there was a serious problem in care in the OR or ICU, that site had to be taken out of service for a year.
In the U.S. a single federal agency regulates air transport, and comprehensively oversees nearly every level of equipment, personnel, and detailed operations all the way down to some flight crew processes. Beyond the official regulator there is a national independent agency (the National Transportation Safety Board) that investigates accidents and makes safety recommendations to the regulatory body. The firms themselves exert strong control over the pilots with standard operating procedures and company policies that are strongly adhered to. In health care, while a federal agency regulates drugs and devices, it does not regulate the practice of medicine. Each of 50 states and 3 federal jurisdictions (Department of Defense, Department of Veterans Affairs, and the Indian Health Service) regulates the practice of medicine. The level of government regulatory control is variable, and in general it is very limited (or nonexistent) at the level of actual physician/crew practices. Individual firms impose only modest controls over the clinical practices of individual clinicians. There is no independent official safety organization for health care.
There is de facto regulation by accreditation. Organizations like the Joint Commission (now JC, formerly JCAHO) impose a level of semi-official regulation through accreditation standards. Where such accreditation is required to qualify for payment for services (such as from government payers) the voluntary accreditation becomes effectively official. Yet some healthcare institutions choose not to be accredited by the JC and even the scrutiny of this agency has been challenged in the past12 as being relatively ineffective. The JC has moved toward more stringent assessment procedures, especially in using relatively unannounced survey visits (though the hospitals often know in advance that “the JC is in town”). Nonetheless, the level of control imposed by such indirect regulation is still rather limited.
In fact, some have compared healthcare's massive decentralization and extreme physician autonomy (only somewhat reduced from the “Golden Age” perceived by many middle-aged or older physicians) to the old “Guild Workshop” model of work (see, for example, Paul Starr's book The Social Transformation of American Medicine).13 In this model the hospital developed historically like a workshop to which the independent artisan guild members came to do their work. The workshop did not impose any level of control over the guild members, although the guilds collectively might have done so. Today, some hospitals do belong to voluntary safety consortia that attempt to share best practices. I have personally been involved in several such consortia. They are well-meaning, but the level of organizational standardization and control is weak and patchy at best.
Many of the organizational structures in health care date from more than 150 years ago, and some are truly ancient. But the nature of health care work has changed greatly in this time. Whereas before there was often little harm physicians could do to patients (leeches and other methods of bleeding were probably harmful but rarely lethal in and of themselves), in many modern settings the “lethality per square meter” is quite high. The nature of modern industrial, high-tech, and high-risk medical care may not match well to organizational structures from antiquity. Although over the past 40 years there has been some erosion of physician autonomy, and the nature of how healthcare is paid for has changed markedly, the day-to-day work of physicians remains one of the most autonomous of intrinsically risky human endeavors. A substantial amount of autonomy and flexibility is necessary in the system because human beings are not airplanes or nuclear power plants. Healthcare does not need to achieve the same level of proceduralization and control that marks these industries. But the pendulum in healthcare, in my opinion, is quite far to the other extreme, and finding a happy medium is an important goal on the road to improving safety and quality.
Education and Training
Differences between the industries in the structure and organization of education and training are also profound. The major airlines in the U.S. rely on other systems to conduct primary flight training (some airlines elsewhere in the world conduct “ab initio” flight training, taking pilot candidates from zero experience and doing all the training themselves). In the past many pilots came from the military; another pathway is now common – pilots working their way up in the civilian sector and then through commuter airlines to the majors. Selection in both systems is fierce, but it rests not so much on knowledge-based tests as on actual job skill and performance. The airlines have highly focused training programs to teach and assess performance according to their own strictures and requirements. Regular recurrent training is mandated by the firm and by the government – it can be done either in a real airliner or in a properly approved simulator; obviously, the simulator is the safer and cheaper alternative. On top of this is a regimen of yearly performance assessment of pilots by the government regulator, both during simulated flights and during actual “line” flights.
Healthcare uses intense selection at the level of professional school entry (based on school grades and tests but not on performance aptitudes for the jobs themselves) followed by a long period of “book learning” mostly of concepts and basic skills, then followed by an intense period of apprenticeship or on-the-job training, taking care of patients under (variable levels of) supervision. What one learns depends heavily on which patients “roll in the door” during one's rotation and the peculiarities of the specific faculty members acting as supervisors. When very challenging situations or crises arise the junior personnel are bumped out of the way so that senior personnel can best treat the patient. There is little systematic performance assessment. Training mostly ends with completion of residency and/or fellowship period, with a minimal (and variable) state-based requirement of “continuing education.” Especially for physicians the continuing education requirement can be met by a wide diversity of modalities, few of them intense or focused on real issues of job skill or patient safety.
These differences, like most of the other organizational differences suggested above, are not intrinsic to the fact that patients are not airplanes. Healthcare could use exactly the same kinds of structures and methods to assure the optimal initial and recurrent training of personnel. The status quo is a result of historical, political, and economic forces, not the analytical or experience-based consideration of how best to achieve the desired results.
If by some magic the healthcare system could be constructed from scratch it would probably be structured entirely differently, and it might look much more like some of the other high-hazard industries. In fact, I sometimes wonder – thinking about my own field of anesthesiology – whether things would have been different had powered flight been accomplished in 1846 and anesthesia demonstrated only in 1906, rather than the other way around. But it isn't possible to rebuild the healthcare system from first principles; instead, fixes must be made incrementally while the primary structures and organizational elements of the system remain intact. This is one of the key barriers to matching the processes of simulation-based training and performance assessment we admire in aviation. Considerable progress has been made in the last 20 years using analogies from such industries, but it will take decades before the fruits of this approach are fully realized in healthcare. Readers of this Journal will be heavily involved in this process, but I have learned over more than 20 years that we had better have a lot of patience and a long time-horizon in mind!
1. Gaba D. Structural and organizational issues in patient safety: a comparison of health care to other high-hazard industries. Calif Manag Rev
2. Gaba D, Maxwell M, DeAnda A. Anesthetic mishaps: breaking the chain of accident evolution. Anesthesiology
3. Gaba D, DeAnda A. A comprehensive anesthesia simulation environment: re-creating the operating room for research and training. Anesthesiology
4. Gaba D, DeAnda A. The response of anesthesia trainees to simulated critical incidents. Anesth Analg
5. DeAnda A, Gaba D. Unplanned incidents during comprehensive anesthesia simulation. Anesth Analg
6. DeAnda A, Gaba D. The role of experience in the response to simulated critical incidents. Anesth Analg
7. Billings CE, Reynard WD. Human factors in aircraft incidents: results of a 7-year study. Aviat Space Environ Med
8. Helmreich R. Theory underlying CRM training: psychological issues in flight crew performance and crew coordination, Cockpit Resource Management Training (NASA Conference Publication 2455). Orlady HW, Foushee HC, eds. Washington, D.C.: National Aeronautics and Space Administration; 1986:15–22.
9. Helmreich RL. Does CRM training work? Air Line Pilot
10. Lagasse RS. Anesthesia safety: model or myth? A review of the published literature and analysis of current original data. Anesthesiology
11. Cooper J, Gaba D. No myth: anesthesia is a model for addressing patient safety (editorial). Anesthesiology
12. Dame L, Wolfe S. The failure of “private” hospital regulation: an analysis of the Joint Commission on Accreditation of Healthcare Organizations' inadequate oversight of hospitals. Washington, D.C.: Public Citizen Health Research Group; 1996.
13. Starr P. The Social Transformation of American Medicine. New York: Basic Books; 1982.