To Err is Human, the Institute of Medicine's recent report on medical errors, has quite understandably prompted a great hue and cry over how to enhance the safety of patients in our complex health care system. From Capitol Hill to the White House and beyond, answers are being demanded and potential solutions proposed.
The concern ignited by the IOM report is altogether appropriate. The report makes the facts alarmingly clear—our health care system produces an unconscionable number of errors, resulting in costly morbidity and in numerous unnecessary deaths. The report also makes clear that the key to reducing medical errors is not through blaming individuals for negligence, but through correcting deficiencies of the system.
So, without in any way diminishing our efforts to protect patients from incompetent or incapacitated individuals, we must find ways to reduce the harm stemming from the flaws inherent in our error-prone health care system. In my view, the barriers to enhancing patient safety through improvements in the system fall into two broad categories: (1) our culture-driven attitudes about mistakes and (2) our disinclination to apply systems thinking, aggravated by our limited knowledge about how complex systems work.
I needn't belabor the former; the point has been made repeatedly about how loath doctors are to admit mistakes, their own or anyone else's. We have been inculcated throughout our training with the belief that perfection is the goal of every physician. But that's only part of the explanation. Our irrational belief in unattainable perfection has, unfortunately, been strongly reinforced by our litigious society, which fosters yet another irrational belief: every bad outcome must be somebody's fault. The consequence is that mistakes of all kinds are too often covered up, partly because of shame in having “failed” and partly because of fear of malpractice suits.
Our reluctance to acknowledge error will not yield easily to legislative or administrative fixes. Indeed, this barrier to enhancing patient safety could well be buttressed rather than breached if, in responding to the IOM report, policymakers choose to focus not on ways to prevent future errors, but on ways to punish past mistakes, thereby unleashing a fruitless search for the “bad apples.”
Conversely, by helping us gain mastery over the complex systems in which modern health care is delivered, policymakers can usher in a new era of quality improvement efforts. To be human is, indeed, to err. Memory is faulty, knowledge of relevant patient-specific facts is often unavailable where and when it is needed, attention can be diverted by other pressing priorities, and fatigue is occasionally unavoidable. What health care professionals need to safeguard their patients from these frequent sources of inadvertent error are better systems to shore up their human frailties: timely information to support clinical decisions, routine availability of evidence-based standards of care, automatic reminders about drug dosages and potential adverse reactions, and easy-to-use electronic clinical information systems.
Many institutions have incorporated these and other quality-improving, error-reducing enhancements into their health care systems. Examples include the computer-based decision-support system at the LDS Hospital in Salt Lake City, the electronic patient record system deployed in most VA hospitals, and the multiple approaches to evidence-based decision making spearheaded by John Wennberg's Center for the Evaluative Clinical Sciences at the Dartmouth-Hitchcock Medical Center. Much progress could be made quickly by the wide dissemination of already-proven "best practices." Their adoption will, in some cases, undoubtedly require new or redeployed resources, but not in every instance. In any case, as important as, or more important than, new money is a commitment from top institutional leaders to place the issue of patient safety at the head of the priority list, to encourage the recognition rather than the concealment of error, and to value errors and "near misses" as golden opportunities to learn how to prevent future occurrences.
Still more is required, however. Even if we implemented everywhere all of the error-reducing methods we currently know of, our task would not be complete. Much remains to be learned, both about how our complex health care system operates and about how to modify physicians' behavior. It would be a colossal tragedy if lawmakers failed to seize this moment of heightened public awareness about patient safety as an opening to increase our country's investment in quality and outcomes research. As the IOM report says, “… there is still much to learn about the types of errors committed in health care and why they occur.” The report's first of eight recommendations calls for the creation of a Center for Patient Safety within the Agency for Healthcare Research and Quality (formerly AHCPR). Among the Center's responsibilities would be to “develop knowledge and understanding of errors in health care by developing a research agenda, funding Centers of Excellence, evaluating methods for identifying and preventing errors, and funding dissemination and communication activities to improve patient safety.”
At long last, the patient-safety genie is out of the bottle. Our task is to tame it, not force it back into hiding.