Nearly a decade ago, the Institute of Medicine's landmark report, To Err Is Human,1 highlighted the shortcomings in patient safety in the United States. Despite resolute efforts, progress in patient safety has proven slow and arduous. One factor contributing to the labored progress is the paucity of trained physicians with dedicated time who can help advance the science and practice of safety. Although isolated cases of quality and safety curricula are starting to surface,2–5 there are few combined degree programs (e.g., an MD plus a master's degree) in quality improvement, patient safety, or related disciplines, such as human factors research or cognitive psychology. This deficit contrasts sharply with the multiple combined MD–PhD programs in basic research. To make substantial progress in advancing the science of patient safety, we will need well-trained researchers with a deep understanding of their discipline.6
Physician leadership models for quality and safety are underdeveloped.7 Many hospitals have physicians who serve as the chief medical officer (CMO) or vice president for medical affairs. Yet, precious few of these individuals have formal training in quality of care or patient safety,8 and the broader infrastructure to support their work is fragmented and varies widely.9 As a result, proactive quality and safety activities are simply added to myriad competing duties, such as dealing with physician credentialing, patient complaints, and reactive regulatory quality and safety efforts, such as the National Patient Safety Goals. Leaders must actively promote a culture of safety and make safety and quality a priority across their organizations.10
The problem is amplified in academic medical centers (AMCs) because of bifurcated governance. In most schools, the medical school and the hospital have separate governance structures; quality and safety leaders generally reside in the hospital. Few medical schools have vice deans of patient safety or quality of care. This separation makes it awkward and sometimes difficult to engage faculty in quality and safety efforts, and makes it complex to implement safety interventions that span multiple departments. Thus, organizational structure, competing job responsibilities, and insufficient training limit the ability of AMCs to effectively, efficiently, and proactively lead quality and safety improvement efforts. Moreover, without this investment in training and a better integrated organizational structure, it will be difficult to make substantial improvements in quality and create a future generation of quality and safety leaders.
In this essay, we briefly explore the factors that contribute to the limited success of AMCs in quality and safety efforts, focusing on an underdeveloped scientific footing and the lack of strong physician quality and safety leadership. On the basis of that discussion, we suggest a structure to organize the physician leadership component of AMC quality and safety efforts, and outline a path to develop leaders in AMCs. The concepts discussed in this essay are based on our reflections and collective experiences, given that no significant body of literature or science exists on this key topic.
Root Cause Analysis of Poor Progress on Quality and Safety
The striking contrast between the United States' success in biomedical research and its failure to deliver high-quality, cost-effective care is largely explained by our failure to view the delivery of health care as a science. As a result, there is scant evidence to help guide quality and safety decisions, and there are few clinical quality and safety leaders who are adequately trained and supported in their roles.
Insufficient quality and safety science evidence
At the root of the insufficient evidentiary base on quality and safety is an imbalance between the substantial support for basic biomedical knowledge and the slight support available to translate that knowledge into improved clinical outcomes. The benefits from the U.S. government's investment in biomedical science are both awe inspiring and potentially lifesaving. Diagnostic imaging studies provide detailed tours of diseased vessels, blood flow, and suspicious cancer cells. Average American life expectancy has increased from 69 years in 1955 to 78 years today. Many terminal cancers are now curable, AIDS is a manageable chronic illness, and mechanical heart valves allow some patients with cardiovascular disease to live longer. The United States is more productive in biomedical research than the entire European Union,11 and the world looks to us for major breakthroughs in medical research.
Yet this same U.S. medical system leaves surgical instruments in patients, overdoses children with blood-thinner medications, operates on the wrong side of the body, delivers appropriate therapies only 50% of the time, and kills nearly a hundred thousand people each year through preventable errors.1,12 Perhaps most disturbing is a recent Commonwealth Fund report13 that ranked the U.S. health care system last among industrialized nations in quality, access, efficiency, equity, and outcomes. Despite these poor outcomes, our median per capita expenditure for hospital services and drugs is three times that of the 29 other member countries of the Organization for Economic Cooperation and Development.14 How can the United States be so successful in biomedical research yet flounder in delivering high-quality care?
Without trivializing this complex problem, we believe that the breakdown largely stems from failing to view the delivery of health care as a science. Most in biomedical research view science as finding new genes or effective therapies. Yet, the use of those therapies has generally been viewed as the art of medicine. For every dollar the U.S. government spends on traditional biomedical research, it spends a penny on research to ensure patients actually receive the interventions identified through biomedical research.15,16 Given this imbalance, it is predictable to find this disconnect between America's extraordinary basic and clinical science research and its dismal patient health outcomes. Patients and other stakeholders pay a substantial price for this myopic focus on biomedical research and the erroneous assumption that these biomedical efforts will immediately translate into better and safer care. In progressing from new knowledge to improved health, which is the goal of our investment in biomedical research, we have a narrow bottleneck in delivering interventions to patients. We know precious little about how to translate evidence into practice.
Nevertheless, there are examples of significant benefit from research aimed at ensuring patients receive evidence-based interventions. In a 2003 project funded in part by the Agency for Healthcare Research and Quality (AHRQ), a research team from Johns Hopkins University School of Medicine partnered with the Michigan Health & Hospital Association and 127 Michigan intensive care units (ICUs) to eliminate central-line-associated bloodstream infections (CLABSIs) throughout the state. Within three months of implementing the program, which included simple interventions like using a checklist to ensure doctors followed recommended practices, over 50% of participating ICUs reduced their rate of CLABSIs to zero, and that rate has persisted for four years. The overall rate of these infections was reduced by two thirds.17
There is a growing body of quality and safety improvement work occurring in the United States and abroad. Although the enthusiasm and concerted efforts are encouraging, the science underlying most initiatives is still immature,18,19 and our ability to even measure (let alone improve) patient safety is limited.20 Thus, the robust improvements found in the AHRQ-funded study mentioned above are rare.
Even for national programs to improve quality of care, like the Centers for Medicare and Medicaid Services' program to not pay for preventable complications, the science is poor. Little is known regarding how to accurately measure these complications or the extent to which they are preventable.21 Without advances in the science of quality and safety, improvements in quality of care will remain elusive. Yet, to advance the science, we need to build the capacity of quality and safety researchers and leaders to design, implement, and evaluate improvement interventions.22 Moreover, efforts must be cooperative and interdependent rather than independent, and they must focus on achieving measurable results to demonstrate progress.
Lack of clinical quality and safety leadership
The second factor contributing to the United States' insufficient progress on quality and safety is the paucity of appropriately trained and supported clinical quality and safety leaders and researchers. Many AMCs have appointed clinicians as patient safety officers.23 Yet, few clinicians have leadership skills or the training required to conduct rigorous quality improvement research or to develop, implement, and evaluate quality and safety improvement programs. Neither medical schools nor residencies across the board provide the requisite skills. Although little is known about how best to train quality and safety leaders, we believe formal training in a master's or doctoral program is essential. Superficial training or "weekend" seminars will likely not suffice. We learned this from basic science research, for which medical schools developed and now support multiple MD–PhD programs. To our knowledge, there are no MD–PhD programs in human factors engineering or other safety-related disciplines.
Unfortunately, most clinical leaders have inherited quality and safety responsibilities as a by-product of strategic planning efforts rather than through training and skill development.23 Far too many quality and safety leaders are physicians who had difficulty sustaining consistent research funding or who had an interest in quality and safety without having a clearly defined set of skills. This leaves many newly anointed leaders unprepared for their roles, with a low probability of making substantial improvements in quality and safety.
From our informal review and networking with colleagues, most national leaders in quality and safety came to this field after training as clinical researchers (often with public health degrees in clinical investigation or epidemiology) or through organic evolution, resulting from prior experience in engineering science, aviation, or manufacturing, rather than as the result of deliberate training.
Yet many institutions recognize the need to have well-trained and competent quality and safety leaders. Demand for quality and safety leadership has quickly outstripped the serendipitous supply. A recent perusal of online health care employment opportunities with titles such as “director of care management,” “vice president for quality,” “director of quality improvement and care management,” and “director of performance improvement” identified 150 active recruitments (www.monster.com; accessed August 11, 2009). Though the terms describing these positions vary widely, they all involve responsibility for overseeing the quality and safety of care. This often encompasses meeting regulatory requirements, credentialing, managing risk and liability claims, and delivering the highest-quality and safest care possible.
Barriers in roles of quality and safety leaders
Even within organizations that hire quality and safety leaders, the structures of the roles often limit their effectiveness. This occurs for several reasons.
Inadequate time allocation.
First, the time formally allocated (and paid) to the role is often inadequate or nonexistent. The quality and safety responsibilities are often additional duties for busy clinical leaders, and the roles and responsibilities are ambiguous. As regulatory and local efforts to improve quality have increased, quality leaders must make trade-offs regarding where to focus their attention and efforts. In general, job descriptions and resources have not kept pace with the growing responsibilities. In part for these reasons, leaders often spend most of their time implementing “reactive” regulatory efforts or publicly visible interventions, rather than addressing the real issues that plague their microsystems. Health care organizations must recognize that no physician–leader can take on both a traditional CMO role and all the expanded responsibilities demanded by the modern safety and quality fields.
Underresourced offices.
Second, the office of the quality and safety leader is generally underresourced. Though there may be support for a physician–leader, there is limited support for staff and data management to design, implement, and evaluate interventions. Hiring a physician–leader is often seen as an end point rather than as a down payment on the infrastructure required to effectively manage quality and safety.24 Resources are generally invested at the senior leader level with limited or no resources at the department or unit levels. As a result, there is neither the framework of clinical governance nor the protected time to implement proactive quality and safety initiatives at the department and unit levels.
Separation of hospital and medical school.
Third, quality and safety leaders usually “live” organizationally within hospital operations, with little connection to the medical school or graduate medical education. Though we did not formally evaluate how many quality leaders have formal roles in the medical school and hospital, our experience suggests that such dual roles are the exception. Although in some AMCs the governance of the medical school and hospital are integrated, in most they are not. Even in those few AMCs with integrated organizational charts, medical schools and medical centers continue to possess distinct cultures, finances, and reward systems. With few exceptions, this organizational dichotomy (with the quality leader on the hospital side) limits the effectiveness of the leader in engaging department chairs and their faculty, and limits scholarly efforts that would advance the science of quality and safety. It also fails to create an environment in which science is translated into educational programs to develop the next generation of physician quality and safety leaders.
Attracting physicians into administrative roles has always been a challenge, but there are some lessons to be learned from recent experience with medical directors. The growth of managed care and, later, pay for performance has increased the demand and urgency for rigorous medical management that extends beyond the reasonable expectations for medical directors who volunteer their time. The medical director role was made robust through position descriptions, explicit performance agreements, and compensated protected time for these activities. The same will be required to develop expertise at the department and unit levels in quality and safety.
Marginalized reporting relationships.
Fourth are the reporting relationships and scope of authority of clinical quality and safety leaders. It has been argued that quality and safety should receive attention in the health care organizational structure equal to or greater than that given financial issues.25 Yet, in our experience, most clinical quality and safety leaders do not report directly to the CEO or have direct access to the board. In stark contrast, it is difficult to find a health care organization in which the CFO does not report directly to the CEO. The implicit message is that quality and safety are less important, when, in fact, poor performance on both will hurt an institution's financial bottom line. Moreover, the location of a quality leader's office also signals its relative importance. Whereas the CFO or vice dean's office is often located in the "C-suite," the quality and safety leader is often located in a lower-rent district.
Physicians can learn much from nursing regarding an appropriate structure that supports improvement. Nursing has a well-defined leadership structure in which resources are invested and management structures exist at the unit, department, and hospital or health system levels.
In contrast, physician leadership structures are underdeveloped and underresourced. The required competencies are poorly defined, and reporting relationships marginalize the role. Without protected time and sufficient resources, efforts at the unit and department levels amount to reactive fixes. Resources can be scraped together to deal with the crisis of the day, but little time and few resources are allocated to safeguarding against future events. As a result, it is difficult to design, implement, and evaluate proactive quality and safety improvement programs. Quality and safety dashboards, where they exist, are generally based on what is expedient rather than what is accurate and important. Although some faculty at the departmental level volunteer time to undertake quality and safety efforts, fiscal constraints and effort-reporting requirements substantially limit their ability to do so. As such, it should not be surprising that clinician leaders have made limited progress toward improving quality and safety.
A Way Forward
Though there is limited empirical evidence to guide recommendations, we support four initiatives to accelerate our national progress on quality and safety: (1) invest in quality and safety science, (2) revise quality and safety governance in AMCs, (3) integrate roles within the hospital and medical school, and (4) create a leadership training pipeline for quality and safety.
Invest in quality and safety science
For patients to receive the full benefits of our national investment in biomedical research, we must invest in studies to ensure patients receive the beneficial therapies discovered by that research. Just as research funding supported our ability to look at blood flow in the brain, research funding is needed to identify effective methods to ensure patients receive those therapies without being harmed. Aiming for a more rational balance between biomedical research and the translational research that applies the fruits of that science in clinical settings is a start. A first step would be a concerted program to double spending for translational research through AHRQ, the Veterans Administration, and private foundations over a five-year period, complementing the Clinton-era effort that did the same for the National Institutes of Health. One key role for physicians is to add their voice and advocacy to correcting this imbalance. With this funding, medical schools could start combined MD–PhD programs in quality and safety or related disciplines, such as human factors engineering.
In addition, research should include feedback as well as feed-forward systems. Traditional biomedical research plods along from basic to clinical research. Although efforts to increase translational research have grown, progress is slow. This contrasts with a feedback system (typically used by venture capitalists), in which we start with a problem (e.g., people are dying from preventable infections) and work backwards. Although feed-forward is good for developing new knowledge, it is less effective at improving population health. Both feedback and feed-forward systems are needed.
Revise quality and safety governance in AMCs and integrate with medical schools
The senior quality and safety leader should have dual roles as a vice dean or equivalent in the medical school and the quality and safety leader in the hospital, should report to both the CEO and the dean, and should have an appropriate organizational infrastructure. This structure would link the academic and clinical missions for improving quality. Each clinical department (e.g., medicine, surgery) should have a vice chair with at least 50% time dedicated to quality and safety and a sufficient infrastructure, such as a safety nurse and data analyst. The duties of the vice chairs should be explicitly enumerated in a position description to ensure that their role proactively addresses salient quality and safety problems rather than merely reacting to events and regulatory pressures when clinical downtime permits. To project that framework to the front lines, each patient care area or unit should have a physician–leader with 20% effort dedicated to quality and safety, working with unit-based nurse safety leaders.
Within this structure, the vice dean of quality can work with and mentor department vice chairs of quality to obtain scholarly productivity and strategize ways to improve quality and safety in their department. In turn, vice chairs can work with and mentor unit leaders to implement and evaluate improvement efforts in the context of their unit culture. Such a structure provides not only a mechanism for mentoring and improving quality but also a mechanism for organizational learning.
Quality of care and patient safety are applied sciences, and physicians leading these efforts need to balance service (internal program development) and scholarship (manuscripts, grants, national reputation); this is no easy task.22 They must meet regulatory requirements, manage hospital safety efforts, monitor progress for improving safety, and develop and implement new knowledge regarding how to improve quality. As these academic physicians devote increasing time to quality and safety, it is essential that they do so in a scholarly way to secure promotion, further the school's academic mission, and ensure that internal quality and safety efforts are scientifically conducted and evaluated. Although criteria for promoting physicians who work in quality have yet to be articulated,22 they should include both service and scholarship. Like clinician–educators who produce a teaching portfolio, quality improvement clinicians should produce a project portfolio in which they document their quality improvement efforts.
The maintenance and growth of quality and safety efforts within AMCs will be secured only if they are valued as scholarly pursuits by the chief of service, who controls the academic promotions process in many AMCs. Physicians leading quality and safety work need protected, supported time for these efforts. As regulatory and hospital demands increase, physicians are spending increasing amounts of time on quality improvement. To move through the promotion track, it is imperative that they link their scholarly and administrative efforts.
In addition to linking to the hospital and department chairs, the vice dean for quality should connect with the vice deans of education and research. This would help ensure that quality and safety programs are integrated into medical school and graduate medical education curricula and that training in quality and safety research is integrated into clinical research training. It would also help clarify IRB requirements for quality improvement efforts.
Create a leadership training pipeline for quality and safety
We likely need tiered training to develop a robust cadre of physician quality and safety leaders. For example, there are core skills in the science of quality improvement and patient safety that all clinicians should understand.26,27 Greater skills are needed for leaders of units and perhaps small departments; certificate programs could meet this need. Finally, even greater training is needed for leaders of large departments and health systems. This will likely require a formal graduate-level degree, such as a master's or doctorate.
The increased demand for clinical quality and safety leaders can no longer be met through the happenstance development of required skills, knowledge, and attitudes. Producing researchers and administrators is a complex adaptive system, like geese flying in formation. Yet, these complex systems are often governed by simple rules. Researchers, whether in basic, clinical, or quality science, must obtain formal training, connect with a qualified mentor, and receive protected time to participate in a research experience. These same rules likely apply to the training of quality and safety leaders. The core skills needed for leaders in quality and safety include formal training in epidemiology, study design, biostatistics, behavioral change, organizational change management, health care financing, systems analysis, process improvement, and leadership.28 Beyond this formal coursework, physician quality and safety leaders should follow the same well-proven path as basic science and clinical researchers.
Formal degree programs in quality improvement are needed from schools of public health. Much of the required training already exists in training grant programs for clinical researchers, such as K12 programs. A K12 program provides junior faculty who have recently completed clinical training or a postdoctoral fellowship with grant support and protected time for clinical research, as well as tuition for formal training in research methods. Such clinical research training can form the core of quality training, supplemented with additional coursework. This type of joint training could help accelerate the translation of evidence into practice by creating a common vocabulary and culture among clinical and quality improvement researchers.
In addition, novel relationships between medical schools and disciplines with formal education in quality and safety science, such as schools of engineering and business, will be helpful. The Johns Hopkins University School of Medicine, for example, added a patient safety course to its second-year curriculum, incorporated safety and quality components into its core clerkships, and added an elective course in which medical students shadow and experience the work of other types of care providers. Other AMCs are looking to include quality in their medical school curricula and faculty development programs.29 Also, graduate medical education requirements state that residents must "participate in interdisciplinary teams as it relates to quality of patient care and systems issues."30 Whether this requirement has improved quality, and whether we have sufficient numbers of faculty to teach this material, is not known. Developing the attitudinal aptitudes for quality and safety leadership, such as an understanding of organizational behavior and the primacy of culture, would benefit from collaborations with the social sciences. Unfortunately, there are few programs to support this type of formal training, although the Dartmouth Institute for Health Policy, the Veterans Administration quality scholars program, and the Johns Hopkins Bloomberg School of Public Health doctorate in improving the clinical and economic performance of organizations are promising.
Hospitals increasingly rely on physicians to help lead their quality and safety efforts. Without investing in formally training physician–leaders in quality and safety methods, and without investing in time and infrastructure for these physicians to serve as “bridge researchers” between internal and academic quality and safety efforts, we will likely mark the 20th anniversary of To Err Is Human with the same disappointing progress in improving health care quality and safety.
The authors wish to acknowledge Christine G. Holzmueller for her assistance in editing the manuscript.
1 Kohn L, Corrigan J, Donaldson M, eds; Institute of Medicine. To Err Is Human: Building a Safer Health System. Washington, DC: National Academies Press; 1999.
2 Thompson DA, Cowan J, Holzmueller C, Wu AW, Bass E, Pronovost P. Planning and implementing a systems-based patient safety curriculum in medical education. Am J Med Qual. 2008;23:271–278.
3 Varkey P, Karlapudi S, Rose S, Swensen S. A patient safety curriculum for graduate medical education: Results from a needs assessment of educators and patient safety experts. Am J Med Qual. 2009;24:214–221.
4 Varkey P, Karlapudi S, Rose S, Nelson R, Warner M. A systems approach for implementing practice-based learning and improvement and systems-based practice in graduate medical education. Acad Med. 2009;84:335–339.
5 Boonyasai RT, Windish DM, Chakraborti C, Feldman LS, Rubin HR, Bass EB. Effectiveness of teaching quality improvement to clinicians: A systematic review. JAMA. 2007;298:1023–1037.
6 Sung NS, Crowley WF Jr, Genel M, et al. Central challenges facing the national clinical research enterprise. JAMA. 2003;289:1278–1287.
7 Flin R, Yule S. Leadership for safety: Industrial experience. Qual Saf Health Care. 2004;13(suppl 2):ii45–ii51.
8 Donaldson LJ. Safe high quality health care: Investing in tomorrow's leaders. Qual Health Care. 2001;10(suppl 2):ii8–ii12.
9 Rosenberg RN. Translating biomedical research to the bedside: A national crisis and a call to action. JAMA. 2003;289:1305–1306.
10 Pronovost PJ, Weast B, Holzmueller CG, et al. Evaluation of the culture of safety: Survey of clinicians and managers in an academic medical center. Qual Saf Health Care. 2003;12:405–410.
11 Soteriades ES, Falagas ME. Comparison of amount of biomedical research originating from the European Union and the United States. BMJ. 2005;331:192–194.
12 McGlynn EA, Asch SM, Adams J, et al. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348:2635–2645.
13 Davis K. President's Message. A Prescription for Our Nation's Ailing Health Care System. New York, NY: The Commonwealth Fund; 2008.
14 Schoen C, Davis K, How SK, Schoenbaum SC. U.S. health system performance: A national scorecard. Health Aff (Millwood). 2006;25:w457–w475.
15 Loscalzo J. The NIH budget and the future of biomedical research. N Engl J Med. 2006;354:1665–1667.
16 Moses H 3rd, Dorsey ER, Matheson DH, Thier SO. Financial anatomy of biomedical research. JAMA. 2005;294:1333–1342.
17 Pronovost P, Needham D, Berenholtz S, et al. An intervention to decrease catheter-related bloodstream infections in the ICU. N Engl J Med. 2006;355:2725–2732.
18 Leape LL, Berwick DM. Five years after To Err Is Human: What have we learned? JAMA. 2005;293:2384–2390.
19 Pronovost PJ, Miller MR, Wachter RM. Tracking progress in patient safety: An elusive target. JAMA. 2006;296:696–699.
20 Agency for Healthcare Research and Quality. National Healthcare Quality Report 2007. Rockville, MD: Agency for Healthcare Research and Quality; 2008.
21 Pronovost PJ, Goeschel CA, Wachter RM. The wisdom and justice of not paying for “preventable complications.” JAMA. 2008;299:2197–2199.
22 Shojania KG, Levinson W. Clinicians in quality improvement: A new career pathway in academic medicine. JAMA. 2009;301:766–768.
23 Shook J. Reflections of a patient safety officer. Pediatr Radiol. 2008;38(suppl 4):S690–S692.
24 Pronovost PJ, Rosenstein BJ, Paine L, et al. Paying the piper: Investing in infrastructure for patient safety. Jt Comm J Qual Patient Saf. 2008;34:342–348.
27 Wachter RM. Understanding Patient Safety. 1st ed. Columbus, OH: McGraw-Hill; 2008.
28 Pronovost PJ, Goeschel CA, Marsteller JA, Sexton JB, Pham JC, Berenholtz SM. A framework for patient safety research and improvement. Circulation. 2009;119:330–337.
29 The Academic Medical Center Working Group of the Institute for Healthcare Improvement. The imperative for quality: A call for action to medical schools and teaching hospitals. Acad Med. 2003;78:1085–1089.