At first glance, it might seem incongruous to take lessons from the world’s leading manufacturers—Toyota, Alcoa, and the like—and apply them to reduce the risk of medical error and otherwise improve patient safety and the quality of health care delivery. Designing and building products to sell for profit would seem to have little in common with treating patients with minimal nosocomial risk and maximum responsiveness and efficiency. But in fact, there are important similarities. In both domains, great mastery of one’s discipline is not enough to ensure successful delivery of products and services. (Although it may be sufficient to make one responsible for doing so, as happened to the laser angioplasty specialist we will discuss later in this paper.) Individual elements must be integrated coherently into remarkably complex work processes comprising a great many interdependent elements. Yet, some manufacturers do integrate individual expertise so harmoniously that they can consistently generate services and products with quality, variety, safety, efficiency, and responsiveness that far outstrip those of their competitors.
How they accomplish this provides lessons for achieving better health care outcomes. Not that it would make sense to try to make hospitals and clinics resemble factories, even the very best factories. But what if some hospitals were as much better than others at providing health care as the best automakers, such as Toyota, are when compared to their peers—developing products with “half the engineering hours … in half the time,” producing them with “half the human effort … half the factory space … and half the inventory”?1 And then—because hospitals aren’t competing with each other quite the way automakers are—what if all hospitals were that good? This is demonstrably possible.
The remainder of this discussion will begin with a look at the etiology of medical system failure—why health care systems perform far below the skills, training, aspirations, and efforts of the people they employ and the science and technology they engage. It will continue with a look at why some manufacturers get so much more yield out of their people and processes than their competitors do. Next come examples of applying lessons from industry in health care. But if these lessons are to become part of health care, they must become part of residency training; that is taken up in the final part of this discussion.
This is not a hypothetical discussion. Proof-of-concept projects at hospitals in Boston, Pittsburgh, Seattle, Salt Lake City, Appleton (Wisconsin), and elsewhere have already yielded impressive and encouraging—not to mention life- and limb-saving—results. A cohort of hospitals in Pittsburgh reduced central-line-associated blood stream infections 68% on average,2 with the leaders in the group scoring reductions greater than 90%.3 Other hospitals reduced patient falls markedly, improved the efficacy of diet and nutrition in the patient treatment process, and increased the quality and efficacy of presurgical nursing care.4 Dr. Stephen Raab and his colleagues at the University of Pittsburgh Medical Center (UPMC) Shadyside Hospital achieved marked improvements in Papanicolaou test quality by applying lessons from Toyota.5 Given the well-publicized rates of medical errors,6 making improvements that reduced error levels by an order of magnitude on a national level would save tens of thousands of lives and billions of dollars.
Ambiguity and Workarounds as Contributors to Medical Error
In 2002, the Annals of Internal Medicine launched a series, “Quality Grand Rounds,” in which examples of medical error were detailed to demonstrate how the shortcomings of work systems—in contrast to the incompetence, inattention, or malfeasance of individuals—contribute to adverse events.7 My colleague, Mark Schmidhofer, and I asked ourselves if the medical errors detailed in the series had common root causes from which lessons could be drawn to improve health care quality and to reduce the potential for catastrophe.8 We did indeed find a common etiology. In each case, people were asked to perform work in which there was ambiguity. In some cases, it was unclear precisely what objectives they were trying to achieve individually or as a group. In some cases, the objectives were clear but there was ambiguity as to how the work should be carried out. In some cases, objectives and methods were clear, but there was still ambiguity as to whether the process was actually proceeding as expected.
To the curse of ambiguity was added the curse of workarounds. In the face of uncertainty of objectives, methods, or current state—and even when it was clear that something wasn’t right—people made do, working around the difficulties, often acting heroically to “get the job done.” The problem with “coping” is that the underlying problems are always left in place to confound other people, often catastrophically.8
For example, a 68-year-old patient, “Mrs. Grant,” was discovered at 8:15 am with full body seizures.9 Computed tomography (CT) scans revealed no neurological explanation such as a clot or mass, but the laboratory reported an “undetectable” serum glucose level. Infusions of dextrose were futile; she lapsed into an irreversible coma, and life support was later withdrawn. What had happened? At 6:45 that morning, when an alarm had indicated a possible central line occlusion, the night nurse had rushed to Mrs. Grant’s room and administered heparin. Unfortunately, according to the case’s author, it appears that the nurse hadn’t given heparin after all, but had injected insulin into the central line with disastrous consequences. Why the mistake? Heparin and insulin, both clear and colorless, were stored near each other in vials of similar size, shape, and weight, labeled with small type.
The ambiguity due to medication packaging and presentation—do I or don’t I have the right medication?—meant that when the nurse rushed to help Mrs. Grant, he had to be vigilant, checking, double-checking, and triple-checking that he had the right vial and not the wrong one. However, vigilance is a poor guarantor of safety, particularly when people are rushing (as this nurse was), stressed, or tired (as he might well have been at the end of a night shift).10
But there was more than this one mistake leading to Mrs. Grant’s avoidable death. For every death due to medication error, there are multiple injuries; for every injury, there are multiple near misses; for every near miss, there are multiple slips and mistakes.11 Why does this matter? Studies show that when health care workers experience an interruption, difficulty, or other disruption in routine, they work around the problem without bringing attention to it. They get the job done—right now when it needs to be done—but whatever caused their problem remains uninvestigated and waiting to cause more trouble another time.12
Therefore, in Mrs. Grant’s particular case, there may have been hundreds if not thousands of times when a nurse was momentarily unsure whether he or she was holding a vial of insulin or of heparin—they were so much alike. But what happened when these warnings of a dangerous situation were encountered? The nurse double-checked, perhaps catching an error and grabbing a different vial, and hurried on. He or she had managed to step over the landmine, but left it in the ground where it eventually tripped up a nurse and killed a patient.
There is another dimension to this case: system failures leading to human error. If we ask, “Who killed Mrs. Grant?” the answer is not really “Mrs. Grant’s nurse.” In some regards, he did exactly what was right. He heard an alarm, rushed to respond, sized up the situation correctly—that an occlusion might be forming—and acted appropriately, administering what he thought to be heparin as an anticoagulant.
Regrettably, vials of the two drugs were easily confused. So, was it the pharmacy that killed Mrs. Grant? The medications were presumably of the right concentration, purity, and dosage, and the labeling was probably accurate. But the pharmacy’s packaging, storage, and presentation of the medication were flawed from the perspective of the nurses. Apparently, mechanisms weren’t in place to allow the handoff from one discipline to the other to be well designed in the first place or improved as its deficiencies became clear.
Manufacturing can offer lessons on dealing with ambiguity, workarounds, and the effect of systems on individuals.
Managing Complex Systems for High Performance
My experience is that when many people think of manufacturing, they imagine highly repetitive, often automated production of repeatable items—a stark contrast with health care, in which the patients, the manifestation and causes of their symptoms, and their treatments are far less standardized or predictable.
However, the iconic image of routinized production of repeated products overlooks a fundamental challenge of managing industrial processes. When product designers and process designers begin their work, they face tremendous uncertainty as to what is required to succeed. There is an enormous space in which they must search to determine what product functionality will delight customers and which technical configurations will provide it. On the production side too, the right process is not known in advance but must be discovered through intensive, iterative design and development.
Converging from an initial state with so many unknowns and so many possible solutions to a final state in which products and processes are highly explicit and finely tuned is nontrivial. A major model upgrade—changing a product’s styling and performance or introducing technological advances to keep pace with (or ahead of) competitors—can comprise hundreds of engineering-years of effort and intellectual content.13 Designing a new automobile engine and the plant to build it has engineering costs of tens of millions of dollars and total costs in the hundreds of millions.14
Great manufacturers distinguish themselves by how quickly and effectively they close the gap between initial uncertainty and the certainty of predictable work on predictable products. They achieve better designs in less time with less effort than do their competitors.13 As a product’s final form is emerging, these companies can develop processes that are far more capable, stable, and adaptable than those of their competitors, and they can do it in less time and with less effort.15 Disparities can be twofold in efficiency and tenfold in quality.16–18
What sets these best-in-class manufacturers apart is that they are inveterate experimenters. As I will explain further, they treat their processes as if they are “ill” and figure out how to make them “healthy” with the same approach and discipline used by well-trained, experienced doctors and nurses.
Although the great manufacturers may start with great uncertainty, not knowing the “right” answer, they resolve this by prototyping their processes in much the same way that they prototype their products. They begin by specifying what is to be considered “normal” for the process in question. This means specifying the objectives (what mix and volume of product they expect to have to deliver, to whom, and by when); the pathways (the sequencing and assignment of responsibility for the work tasks expected to lead to success); the connections (mechanisms for handing off information, products, and services that tell people what work they need to do and when, and that let people provide what is requested in a usable format); and the methods (the actual design of the tasks for which people are responsible).
Having specified a temporary but explicit prototype solution, they test it to discover all the ways in which it is inadequate, in much the same way that a doctor will do an initial work-up and diagnosis on a patient, develop a treatment plan, and follow the patient’s progress to see if his or her actual reaction matches expectations. In some cases, the specification may be exceptionally short lived—seconds, minutes, or hours—but regardless of its duration, the specification serves two purposes. It momentarily captures the best, shared understanding of what will lead to success. And, by making explicit what is expected to happen with what results, it makes it far easier to see problems as they occur, which is critically important. Just as doctors will do a new round of workup, diagnosis, treatment plan development, and testing if the first approach fails, the great manufacturers quickly make changes in the design of their products and processes as inadequacies are discovered, thereby rapidly converging on superlative solutions.19,20
In contrast to these extraordinarily fast, frequent, iterative cycles of experimentation, discovery, and convergent learning, the industrial also-rans are far more ponderous in their approach. Their actions suggest a belief that it is possible to do without high-speed, iterative experimentation and to think one’s way to adequate answers for exceedingly complex problems. They lock down more parameters much earlier and make much bigger initial commitments, based on less knowledge and experience, gained from fewer prototyping and learning cycles. Although this might seem to be a potentially speedier approach, the results are far inferior. Without an energetic discovery process, these also-rans make large commitments, discover their inadequacies, and then have to go through another ponderous exercise of making commitments that again prove to be flawed. Rather than quickly converging from many possible solutions to the best one, they bounce from one inadequate solution to the next, embedded in learning cycles that are fewer, slower, and less frequent.13 (Imagine how discomforting it would be to be cared for by a doctor who took a similar approach—detailing a comprehensive treatment plan without a thorough examination and without energetic monitoring, simply because a comparable approach had worked for a previous patient.)
The highest-performing manufacturers’ commitment to rapid discovery continues even after a product and its manufacturing processes would seem to be finalized. In what might seem a counterintuitive approach, these companies manage their work processes as if they are in intensive care, with great clarity as to what is expected to happen and immediate feedback from built-in monitors when something goes wrong. When a problem is identified, it is quickly contained and investigated, both to prevent its propagation and to prevent its recurrence.21 In an ironic contrast, the also-rans operate their processes as if they are convinced their designs are in strapping good health. With less clarity as to what is really expected to occur, it is harder for these companies to recognize discrepancies. Even when discrepancies are identified, they are worked around and little time is devoted to preventing problems from recurring. In other words, those with the healthiest processes act like hypochondriacs while those with the sickest processes act as if they are as healthy as horses.
Consider the case of Alcoa, which converted itself into the safest employer in America despite the inherent dangers to which its workers are exposed. Alcoa has vast mining operations, but the danger of mining is only the beginning. The bauxite that is extracted is refined into alumina—a compound of aluminum and oxygen—which is then smelted by running enormous currents of electricity through the material to strip off the oxygen, leaving molten aluminum behind. Massive volumes of hot metal are then poured and shaped, adding velocity and pressure to an already hazardous set of conditions. Yet, the rate of workplace injury for Alcoa employees is less than one tenth of the overall rate for U.S. workers.
How is this possible? Starting in the late 1980s, Alcoa leadership and Alcoa’s workforce underwent a sea change in attitude and behavior.22 There had been a view within Alcoa that processes with such complex chemistry and physics were inherently unstable and consequently that they were inherently dangerous and could not be made any “healthier.” Alcoa arguably had some of the world’s smartest material scientists and process engineers in its research and development laboratories. But no matter what was invested in the design of a new process, there were things that could not be known about what was inherently leading-edge technology. Problems were inevitable because knowledge was not and could not be complete initially.
However, the problems were not unsolvable. The key was to swarm individual problems as they occurred, using rapidity of response to capture the unanticipated contextual factors—the unforeseen combinations of people, process, product, place, and time—that created the opportunity for harm. By catching glitches early on, when they manifested themselves as odd perturbations, Alcoans created opportunities for themselves to delve deeper into just those parts of their processes about which their knowledge was weakest. By doing so, they were able to repeatedly modify their systems so that they became ever more capable and reliable. Alcoans realized that their processes suffered problems with idiosyncratic causes but that these were typically precursor indications of major failures, so that if the minor problems were addressed at the time and place of their source, big failures could be avoided. Certainly, the parallels with disease diagnosis, treatment, and management are many.
Avoiding Ambiguity and Workarounds in Health Care
For the past several years, a number of hospitals have practiced flipping their behavior from a persistent tolerance for ambiguity and workarounds to a habit of treating their “sick” underperforming processes with the same structure and discipline with which they treat their sickest patients. They have learned that eschewing ambiguity and specifying processes in detail makes it much easier to see deviations when they occur and, when problems are experienced, to contain and resolve them.23
For example, a presurgical nursing unit at West Pennsylvania Allegheny Hospital had persistent problems with sign-in, registration, timely blood draws, and so on. Nurses, secretaries, and technicians would scurry to make do and compensate for incomplete documentation and missing information, only to have to work around the same problems the next day. The staff broke this loop by investigating failures as they occurred, redesigning their work with exceptional frequency, and moving on to the next problem, rather than coming back to the same one repeatedly. Initially, for instance, 7 out of 42 patients each day didn’t have blood work complete before they were ready for transport to the operating room. Patient by patient, the staff extracted ambiguity out of the process—by designating who was to draw blood from each patient, by making clear how registration would signal that blood was to be drawn, and by providing a designated space, properly equipped and stocked, where samples could be taken. Over a matter of weeks, a persistent problem went away and stayed away—because it had finally been solved rather than worked around. A similar approach of solving problems as they appeared rather than constantly working around them reduced the time secretaries spent each day “building” patient charts from nine hours to two. The reduction in registration time per patient was like gaining another full-time equivalent (FTE) for free. And the nurses, freed from their own compensating workarounds, effectively increased the nursing-to-patient ratio without an additional hire.23
A primary care team at Massachusetts General Hospital’s Revere Health Center ran three two-hour flu shot clinics. Rather than depend on shared memory of previous clinics and on a rough idea of how the clinics should be operated, the staff scripted their check-in, vaccination, documentation, and departure processes in advance, with as much detail as possible. Once the first clinic had begun, they quickly discovered that their “script” was imbalanced and incomplete. Rather than merely cope with the process’s initial inadequacies, the staff quickly huddled, determined why they thought the process was suffering glitches, created “treatments” such as reordering, reassigning, or redesigning work, and carried on. Over the three two-hour clinics, they kept responding quickly to glitches in their ever-evolving process. Because they were willing to make changes in the course of work—several dozen times, in fact—they raised their vaccination count from 43 on the first day to 151 on the third. Because they had improved efficiency so much that they could free up a staff member from the clinic, they actually raised their shots per staff-hour from 6 to 30.24
One of the most successful examples of putting industrial process improvement methods to work was an effort by several hospitals to reduce central line infection rates. Although the specifics of placing and maintaining central lines are quite different from the specifics of preparing patients for surgery or of managing flu vaccinations, the same approach—specify in advance how work is expected to be conducted, identify every break in procedure as it occurs, and quickly institute process changes to prevent a recurrence of the problem—led to fantastic gains in patient safety. Hospitals participating in this effort averaged a 68% reduction in central line infection rates, with some hospitals achieving reductions of 90% and more.2,3
Implications for Residency Training
A cardiologist I know was brought up short when his eight-year-old daughter asked, “Daddy, what grade did you go to?” Including primary and secondary school, premed, medical school, internship, residency, a detour for a master’s in physics, and fellowships, he realized that he had gone to 27 grades. His daughter, not at all impressed, replied, “Well, I’m starting the third grade today.”
Despite his daughter’s condescension, my colleague reflected on what had occurred during those 27 grades. As he had progressed, his expertise had become ever deeper in an ever-narrower field, culminating in his cutting-edge knowledge about a particular subspecialty—not cardiology or even angioplasty, but laser angioplasty.
Finally, as a master of his subdiscipline, he faced a conundrum. As a senior physician, he was “in charge”—responsible, he was told, for the care of the patients in his unit when he was the attending on call. It was clear to him that, although he was tremendously competent at contributing the piece in which he had specialized, patient care depended on the myriad contributions of people on the other side of one boundary or another—those in other disciplines such as pulmonology, endocrinology, surgery, and psychiatry; those in other professions such as nursing and pharmacy; and those in his own profession but at other levels, such as residents and medical students. He knew that no matter how great everyone’s individual expertise might be, if their contributions didn’t mesh well, patients would pay the price.
When he objected that he had deep knowledge within his discipline’s silo, the vertical element of his role, but far less expertise in creating and managing the horizontal flows across disciplinary boundaries, he was told, “Don’t worry, you’ll figure it out,” probably not the advice for which he hoped.
We certainly wouldn’t want or expect doctors, nurses, engineers, or firefighters to master their disciplines by “figuring it out” on the job. Rather, we would expect that each discipline has a guiding set of principles, frameworks, and theories that can be taught in some didactic fashion and mastered through coached, hands-on experience. Mechanical engineers, for example, learn the basic principles and formulas in courses on solid mechanics, system dynamics, and thermodynamics. They master these principles with problem sets, lab exercises, and projects, gradually progressing from experiences that are simpler, faster, and more directed to those that are more complex and slower to resolve and for which there is less guidance. From there, it still takes time to develop more practical expertise, but the whole process will take far less time and arrive at a higher level of practical expertise than if each aspiring engineer were required to rediscover Newtonian mechanics and reinvent calculus on his or her own, having to design a jet engine as a first exercise.
Similarly, doctors start with courses on the basics of anatomy, physiology, biochemistry, and so on, but master and learn the principles underpinning these fields in labs, clerkships, internships, residencies, and fellowships—likewise starting with problems that are simpler, faster in feedback, more bounded, and more directed, then gradually progressing to situations of greater complexity. The clinical training has two benefits. Doctors treat patients in need of care, and while treating patients, master the principles they will need to treat patients in the future.
Though process management is an entirely different expertise, it is one that can be mastered in exactly the same fashion. It is essential, then, to provide those who will be responsible for managing care delivery processes—chief residents and other senior housestaff, attending physicians, charge and chief nurses, and so on—with formal training in managing the delivery of care, rather than asking them (or silently expecting them) to “figure it out” as they go.
And the basics of process design, process operation, and process improvement are teachable. Companies such as Toyota and Alcoa, already mentioned above, along with Vanguard and Southwest Airlines—which have created leadership positions in financial services and commercial aviation on the strength of their outstanding internal operations—combine superlative levels of quality, efficiency, responsiveness, and safety. The principles underpinning their success are codifiable—they lend themselves to explanation and didactic teaching. I myself documented the experience of a manager, newly hired at Toyota although highly experienced elsewhere, as he underwent Toyota “basic training” in how to specify work, how to diagnose problems as soon as they happen, how to arrive at good solutions through repeated experiments, and how to design the work so that problems not only announce themselves but often suggest their own solutions.20 Although he already had a stellar track record as a manufacturing manager at one of Toyota’s competitors, someone with tremendous training and expertise in the technical aspects of a manufacturing environment, this training was a revelation to him because it was the first time he had received similarly disciplined training in managing the integration of disparate technologies into capable, reliable, adaptive processes.
These same principles can be learned and applied by health care organizations through experiential practice. So far, this has taken place largely in special projects, not as institution wide transformations, but the health care professionals who have been trained in the principles of process design, operation, and improvement have already achieved dramatic improvements in a variety of medical specialties and clinical settings.
It is most important to understand that, in the most encouraging cases, the improvements described earlier in this discussion were achieved by applying the principles of process improvement that staff members had learned previously while attacking different problems. At Massachusetts General Hospital’s Revere Health Center, for example, staff members worked on a project to improve general patient registration. Some weeks later, when the center was preparing to offer flu shot clinics, staff members eagerly applied the principles they had learned previously in order to design, operate, and continually improve this new process as well as possible. Likewise, at UPMC Shadyside, the principles of process management were practiced in one area and then applied with great success in the pathology lab.
But such training needs to be the rule, not the exception. One can imagine building rotations into residency programs during which physicians learn to diagnose and treat sick processes just as they spend rotations developing their expertise in diagnosing and treating sick patients. Residents would have a four- to six-week rotation in process design, operation, and improvement. Then, before becoming a chief resident or otherwise acquiring supervisory responsibility, a resident would have to run a process improvement rotation himself or herself, teaching these skills to junior residents.
One might make several objections to this proposal. First, one might argue that residency training is already long enough—or already too long—and that we cannot afford to add a few months to it. True, residency is already long, but consider the amount of time that doctors and nurses spend struggling with broken systems, checking and double-checking what they are doing, tracking down lab results, searching for gloves or gowns, reconstructing information that was poorly conveyed in reports or charts, and so on. You would likely win back so much productive time by improving work processes that residencies could actually be shortened, even with this new training added.
One might object that a process improvement rotation would require teaching hospitals to offer up examples of their own poor process management in order to give the residents something to improve. There would be formidable institutional barriers to such confessions, but since everyone working in a hospital knows full well which processes are not only inefficient and aggravating but have the potential to be dangerous, these are barriers over which dedicated leaders could lead the charge.
One might ask: Who in our health care facility is qualified to teach these skills? The answer may well be no one, initially, in which case, the organization might need to look outside itself and, quite possibly, outside the health care industry to manufacturers who are the best in their industry. This may be less of a barrier than it seems, because many experienced process managers will consider it an exciting and worthwhile use of their hard-won skills to help improve health care, and their companies may well relish the chance to earn good will by “loaning” their experts to such an effort.
Pilot projects have shown that if you apply process management principles and skills to health care delivery, the outcomes are much better and the effort needed to produce those outcomes is actually less. In cases I’ve mentioned and others, it was as if the facility had gained extra staff. While the transition from pilot project to policy would encounter institutional obstacles, the potential benefits are so great that one can argue that it would be unfair to doctors and irresponsible to patients not to do so. No doctor should be in the position of my cardiologist friend who found himself responsible for processes on which people’s lives depended but untrained for such a responsibility—and no doctor need be. No nurse should be in the position of having his or her best efforts turned into disaster because one vial of colorless fluid looks like another—and no nurse need be. No patient should be in the position of Mrs. Grant—and no patient need be.
Thank you to Mr. John Elder for invaluable contributions to earlier drafts of this manuscript.