
Creating a High-Reliability Health Care System

Improving Performance on Core Processes of Care at Johns Hopkins Medicine

Pronovost, Peter J. MD, PhD; Armstrong, C. Michael; Demski, Renee MSW, MBA; Callender, Tiffany MSW; Winner, Laura MBA; Miller, Marlene R. MD, MSc; Austin, J. Matthew PhD; Berenholtz, Sean M. MD, MHS; Yang, Ting PhD, MHS; Peterson, Ronald R. MHA; Reitz, Judy A. ScD; Bennett, Richard G. MD; Broccolino, Victor A. MBA; Davis, Richard O. PhD; Gragnolati, Brian A. FACHE; Green, Gene E. MD; Rothman, Paul B. MD

Author Information
doi: 10.1097/ACM.0000000000000610


Patients continue to suffer preventable harm, much of which stems from our health care system failing to deliver recommended therapies.1,2 Up to 30% of health care costs, approximately one trillion dollars or $9,000 per U.S. household, cover therapies that do not help patients recover.3 To remedy these problems and respond to payment policy changes, hospitals, physician practices, managed care organizations, and other medical facilities are merging to form large, integrated health systems. Although such mergers offer promise, they create complex management structures that pose challenges in coordinating improvement efforts and creating accountability across merged entities.

Health systems with an academic medical center (AMC) have the capacity to reduce costs and improve quality of care. AMCs, especially those based at universities, can leverage clinician–researchers and safety scientists to advance quality improvement and can apply that science to provide better care throughout the health system. Nonetheless, explicit demonstrations of such activities are scarce.

Johns Hopkins Medicine (JHM), the academic health system that incorporates the Johns Hopkins University School of Medicine and the Johns Hopkins Health System (JHHS), made a commitment to improve value and advance improvement science. Leaders of JHM formed the Armstrong Institute for Patient Safety and Quality (Armstrong Institute) in 2011 to coordinate research, training, and operations for quality improvement and patient safety efforts throughout JHM. The Armstrong Institute’s mission is to partner with patients, their loved ones, and others to eliminate preventable harm, to continuously improve patient outcomes and experience, and to eliminate waste in health care. The institute aims to build capacity for improvement; advance improvement science through transdisciplinary research; design, implement, and evaluate improvement interventions; create structures and systems for peer and organizational learning; and create transparency and robust accountability systems. It is housed within the Johns Hopkins University School of Medicine, employs about 70 staff, and has 140 core faculty from more than eight schools and institutes (e.g., public health, nursing, engineering, and ethics institute), representing 18 different disciplines. The institute’s annual budget is about $15 million, with 75% funded through grants, philanthropy, and contracts and 25% from JHM to support operations.

Parallel to those efforts, after discussion with JHHS leaders, the JHM Board of Trustees set its first empirical quality goal: to ensure that the patients served by JHM received the recommended care for acute myocardial infarction, heart failure, pneumonia, surgical care, and children’s asthma at least 96% of the time. They chose 96% because it aligned with the target performance on inpatient core measures that hospitals needed to achieve the Delmarva Foundation for Medical Care’s 2013 Excellence Award for Quality Improvement in Hospitals4 and The Joint Commission’s Top Performer on Key Quality Measures award5; both awards had remained elusive for most AMCs and JHHS hospitals. The Delmarva Foundation award considered Centers for Medicare and Medicaid Services (CMS) quality indicators that were part of hospital pay-for-performance programs and public reporting. There was no financial incentive to achieve either award; however, some measures were part of the CMS value-based purchasing program, and most were processes of care that were important, were associated with improved patient outcomes, and were, we felt, easy to improve, which would motivate those participating in the initiative to sustain the work. Our efforts to improve performance on core measures and achieve The Joint Commission award at The Johns Hopkins Hospital have been described previously.6 In this article, we describe the structure, process, and accountability across JHHS to meet or exceed the 96% goal, evaluate the health system’s ability to realize this goal, and discuss the effectiveness of the new governance structure.

Organization of the Improvement Initiative

We used a conceptual model (briefly described below, with details published previously)6 nested within a quality improvement infrastructure to implement this prospective core measure initiative. The infrastructure was organized as a fractal. That is, at every level of JHM (i.e., health system, hospital, department, and unit), there was a similar quality infrastructure, defined as people with the skills, time, and accountability to lead the improvement work. The levels were connected vertically to provide a means for accountability, with the higher-level entity (e.g., health system) creating a structure for the entities in the next level lower (e.g., each hospital). The new governance organization, described below, manages this vertical connection. Yet, the power of the fractal model stems from the horizontal connections, facilitated by clinical communities among colleagues, to support peer and organizational learning, share common goals, and influence social norms.

Five JHHS inpatient hospitals (described in Table 1) participated in the initiative, which started in March 2012 and was fully implemented by June 2012. For this initiative, we identified 7 of the 25 process-of-care measures (core measures; see Table 2) in which JHHS as a whole or a hospital individually performed below 96% in 2011 and targeted them for improvement.

Table 1: Characteristics of the Five Hospitals That Participated in the Johns Hopkins Health System Quality Improvement Initiative and the Awards They Received

Table 2: Compliance on Targeted Core Process-of-Care Measures Before (2011) and After (2012 and 2013) Implementation of the Johns Hopkins Health System Quality Improvement Initiative

Governance structure

With the formation of the Armstrong Institute, the JHM Board of Trustees created a new governance structure (see Figure 1) for the oversight of quality and safety, seeking to ensure that all care delivered by JHM mapped into this structure. The JHM Board of Trustees is the governing body. The JHM Patient Safety and Quality Board Committee (JHM Quality Board Committee) provides oversight and ensures accountability for all JHM leaders; it includes representatives from JHM leaders and all entities in the health system with a board or quality committee, which affords peer protection when entities share privileged information and ensures compliance with regulatory requirements. The JHM Quality, Safety, and Service Executive Committee brings together all quality, clinical, and executive leaders from all JHHS hospitals to review strategic priorities and decide what topic or issues are taken to the JHM Quality Board Committee. The Armstrong Institute, which was created to improve quality and patient safety across the health system, is part of the Johns Hopkins University School of Medicine and JHM. It connects the JHM Quality Board Committee’s strategic leadership goals with the activities of departments, units, and affiliate groups to meet these goals.

Figure 1: Governance structure for the oversight of quality of care and patient safety for Johns Hopkins Medicine (JHM), which is the academic health system for the Johns Hopkins University School of Medicine and the Johns Hopkins Health System (JHHS). The hospitals and affiliates reside under JHHS. The JHM Patient Safety and Quality Board Committee (JHM Quality Board Committee) reports to the Board of Trustees and provides oversight and ensures accountability for all JHM leaders; it includes representatives from all entities in the health system with a board or quality committee. The JHM Quality, Safety, and Service Executive Committee convenes quality, clinical, and executive leaders from all JHHS hospitals to review strategic priorities and determines the JHM Quality Board Committee’s next focus area. The Armstrong Institute for Patient Safety and Quality connects the JHM Quality Board Committee’s strategic leadership goals with the activities of departments, units, and affiliate groups to meet these goals.

Conceptual model

Armstrong Institute staff developed a four-part, sequential conceptual model to guide the initiative. The model addressed the challenges that accompany quality and safety efforts.7 Each part is summarized below.

Define and communicate goals and measures.

The chair of the JHM Quality Board Committee and the director of the Armstrong Institute sent a memorandum to everyone from hospital presidents and their trustees to frontline staff, clarifying the need for the 96% goal, identifying the Armstrong Institute as the coordinating body for the initiative, and asking everyone to collaborate in achieving the goal.

Create an enabling infrastructure.

Armstrong Institute staff conducted a gap analysis to identify poor-performing core measures. Armstrong Institute leaders reviewed each hospital’s compliance scores in 2011 and targeted a measure for improvement if the aggregate score for the five participating hospitals was below 96%, or if any hospital performed below 96% for 2 of the 12 months in 2011. Through this process, they identified seven measures for improvement (see Table 2).
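The selection rule reduces to a simple filter over monthly compliance scores. The sketch below is hypothetical (the function name and data are ours, and real scoring would weight months by patient volume), but it captures the two triggers described above:

```python
TARGET = 96.0  # compliance goal set by the JHM Board of Trustees, in percent

def should_target(monthly_scores_by_hospital):
    """Flag a core measure for improvement if the aggregate score across
    hospitals falls below target, or if any single hospital was below
    target in at least 2 of the 12 months.

    monthly_scores_by_hospital: dict mapping hospital name to a list of
    monthly compliance percentages (hypothetical, unweighted).
    """
    all_scores = [s for scores in monthly_scores_by_hospital.values() for s in scores]
    aggregate = sum(all_scores) / len(all_scores)
    if aggregate < TARGET:
        return True
    return any(
        sum(1 for s in scores if s < TARGET) >= 2
        for scores in monthly_scores_by_hospital.values()
    )
```

For example, a measure whose aggregate score clears 96% is still targeted if any one hospital dipped below 96% in two separate months.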

In June 2012, Armstrong Institute leaders formed a clinical work group for each targeted core measure. Each group had an improvement team (physicians and nurses from the relevant departments [e.g., heart failure on a medical floor], information technology staff, and quality improvement staff) from each hospital and an Armstrong Institute team (project manager, improvement science faculty member, and Lean Sigma Master Black Belt). The Armstrong Institute team coordinated the work and provided clinical and process improvement expertise.

Engage frontline clinicians and create clinical communities to support peer and organizational learning.

The clinical work groups formed clinical communities that examined the work processes, barriers, and best practices for their core measure, enabling social support and influencing peer norms.8 Groups met weekly for the first three months, biweekly over the subsequent four-month period, and then monthly; many continue their work today.

Clinical work groups used a variety of Lean Sigma tools, such as process maps, fishbone diagrams, and the Define–Measure–Analyze–Improve–Control (DMAIC) framework,9 to systematically identify failures and find ways to improve and control performance. Improvement strategies included education, reminders on checklists, and forcing functions, such as requiring a field in the electronic health record. Hospital teams used the A3 Lean Sigma tool (example described previously)6 to manage and communicate their DMAIC work. The A3 tool was used to aggregate performance for each measure, both at the hospital level and the JHHS level, and to produce reports for JHHS leaders and the JHM board.

Establish transparent reporting and ensure accountability.

The JHM Board of Trustees committed to transparently report results and to develop a robust accountability plan for JHHS. To meet these promises, the Armstrong Institute created an electronic dashboard for each hospital that displayed core measure performance by month for the year to date. Core measures below 96% displayed in red, and measures ≥ 96% displayed in green.

The trustees and JHHS leadership developed a formal accountability plan, which had an escalating four-level performance review process that was activated when a hospital performed below the 96% target on any given measure. Briefly, a first-level review by a local performance improvement team was triggered when a hospital missed the target for one month. A missed target for two months activated a review by the hospital’s quality board committee and president; a missed target for three months warranted a review by the JHM Quality, Safety, and Service Executive Committee; and a missed target for four months warranted a review by the JHM Quality Board Committee, accompanied by an audit by Armstrong Institute staff. A detailed description of this process has been described previously.6
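As a sketch of this escalation ladder: the mapping below is taken from the text, while the function and its name are hypothetical illustrations of how a dashboard might route a missed target.

```python
# Four-level escalating performance review from the accountability plan;
# the names of the review bodies come from the text.
ESCALATION = {
    1: "local performance improvement team",
    2: "hospital quality board committee and president",
    3: "JHM Quality, Safety, and Service Executive Committee",
    4: "JHM Quality Board Committee, with Armstrong Institute audit",
}

def review_level(consecutive_months_below_target):
    """Return the body that reviews a measure after the given number of
    consecutive months below the 96% target (None if the target is met;
    reviews cap at level 4)."""
    if consecutive_months_below_target < 1:
        return None  # at or above target: no review triggered
    return ESCALATION[min(consecutive_months_below_target, 4)]
```

The cap at level 4 reflects that a measure still failing after four months stays under board-level review rather than escalating further.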

The accountability plan helped to sustain results. Once a hospital maintained at least 96% compliance on a core measure for four consecutive months, it entered the sustainability phase. The hospital improvement team and Armstrong Institute team reviewed the sustainability criteria, examined the team’s failure mode analysis, and executed interventions to ensure that they had a process that would sustain compliance. If any risks were found, the hospital team addressed them before drafting a sustainability plan using the A3. If the process was sound, the sustainability plan was approved by clinical, quality, and executive leadership and submitted for review to the JHM Quality, Safety, and Service Executive Committee and the JHM Quality Board Committee.

Evaluation of Core Measure Performance

The primary outcome measure was the percentage of patients at each hospital who received the recommended process of care. We aggregated performance for each core measure into an annual mean performance score and compared annual hospital performance before (2011) and after (2012 and 2013) the initiative. We reported the percentage of the core measures that met or exceeded 96% for each hospital, and the percent compliance for the seven measures targeted for improvement. To test the hypothesis that the initiative improved care across JHHS, we also aggregated the data from all five participating hospitals and performed Fisher exact tests to compare aggregated preinitiative data and aggregated postinitiative data for each of the seven measures. We considered P < .05 significant; we used statistical software R 3.0.3 (R Foundation for Statistical Computing, Vienna, Austria) for the analyses.
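The per-measure comparison can be reproduced with a two-by-two Fisher exact test. The sketch below uses only the Python standard library in place of R's fisher.test, and the patient counts are invented for illustration, not JHHS data:

```python
from math import comb

def fisher_exact_p(a, b, c, d):
    """Two-sided Fisher exact test p-value for the 2x2 table
    [[a, b], [c, d]]: sum the hypergeometric probabilities of all
    tables with the same margins that are no more likely than the
    observed table (the convention R's fisher.test uses)."""
    row1, row2, col1 = a + b, c + d, a + c
    denom = comb(row1 + row2, col1)
    def prob(x):  # P(top-left cell == x) with all margins fixed
        return comb(row1, x) * comb(row2, col1 - x) / denom
    p_obs = prob(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    return sum(p for p in (prob(x) for x in range(lo, hi + 1))
               if p <= p_obs * (1 + 1e-9))  # small tolerance for float ties

# Hypothetical counts: patients who did / did not receive the
# recommended care for one measure, before and after the initiative.
p_value = fisher_exact_p(870, 60,    # 2011 baseline (illustrative)
                         1880, 45)   # 2012-2013 combined (illustrative)
significant = p_value < 0.05  # threshold used in the evaluation
```

Real conclusions, of course, depend on the actual patient denominators behind each percentage.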

Table 2 includes the percent compliance before and after the initiative for the seven targeted measures by hospital. The percentage of targeted measures with an annual aggregate performance ≥ 96% for each hospital increased from 2011 to 2013 (see Figure 2). JHHS achieved ≥ 96% compliance on six of the seven targeted measures by 2013 (see Figure 3). We aggregated the data for each of the seven measures and compared baseline (2011) and postinitiative (2012 and 2013) data, finding no statistical significance. Of the five hospitals, four received the Delmarva Foundation award and two received The Joint Commission award in 2013 (see Table 1 and Supplemental Digital Table 1).

Figure 2: Percentage of targeted core measures at baseline (2011) and postinitiative (2012 and 2013) for which each hospital achieved an annual aggregate performance ≥ 96%. Core measures reported varied by hospital depending on the patient population served: n = 7 for The Johns Hopkins Hospital (JHH); n = 6 for Johns Hopkins Bayview Medical Center (Bayview), Howard County General Hospital (HCGH), and Suburban Hospital (Suburban); and n = 4 for Sibley Memorial Hospital (Sibley). The variation in sample size was based on whether the hospital was required to track the measure or performed the clinical service.
Figure 3: Overall annual percent compliance for seven targeted core measures at baseline (2011) and postinitiative (2012 and 2013) for the Johns Hopkins Health System. Abbreviations: CY, calendar year; PCI, percutaneous coronary intervention; ED, emergency department; CAC, children’s asthma care.

Other Outcomes of the Improvement Initiative

In this article, we describe a new governance structure to oversee the quality of care provided across JHM and how the governing bodies, supported by the Armstrong Institute, used a conceptual model nested in a fractal infrastructure to improve performance on core measures. The governance structure provided a much-needed platform to help health system and hospital leaders manage quality-of-care improvement efforts throughout a complex health system. Within this structure, the JHM Quality Board Committee was created; its members answer to the JHM Board of Trustees and serve as a subcommittee for each hospital board of trustees, connecting all participating entities and engaging them in common goals.

As health systems expand, especially academic health systems, it is unclear who is overseeing the quality of the care delivered, and how that oversight connects practitioners in different settings (hospitals, ambulatory clinics, ambulatory surgical centers, and other care areas) and through multiple organizational levels to a board structure. Accountability is a relationship involving obligations among individuals, and each level must clarify roles and provide resources to ensure that quality goals can be met. Although this oversight system is not yet fully charted for JHM, our new governance structure is allowing such mapping to occur. For example, Armstrong Institute and JHM leaders are examining the oversight of all ambulatory surgery facilities, an area our safety leaders identified as high risk. In this effort, Armstrong Institute leaders are working to clearly define the roles and responsibilities for quality with the hospital, ambulatory, and home care executives and with the academic department chairs. This work is technically and socially complex, and the JHM Quality Board Committee provides both a home and an urgency for such efforts.

Once the conceptual model was implemented, all five hospitals improved their performance on the core measures, and some hospitals received the Delmarva Foundation and Joint Commission awards. We believe that this model nested in the fractal infrastructure enabled the quality improvement initiative to succeed among the five hospitals, balancing their independence and interdependence. We clearly communicated the goals of the initiative and the hospitals’ performances in meeting them, built capacity for the initiative through core measure clinical work groups and the use of Lean Sigma methods, and established processes for accountability and sustainability. Intrinsic motivation, through the Armstrong Institute’s mission to partner with patients, their loved ones, and others to eliminate preventable harm, to continuously improve patient outcomes and experience, and to eliminate waste in health care, was the catalyst to improve care. Nevertheless, the Delmarva Foundation and Joint Commission awards were galvanizing forces, providing a specific goal toward which to focus the improvement efforts.

Our work is novel and has several important implications. Foremost, it demonstrated the importance of strong leadership from the board and an independent committee of leaders in quality and safety at a large AMC. Many AMCs are acquiring hospitals and developing lower-cost off-campus service sites (e.g., ambulatory surgery centers), and the governance and management of quality and patient safety in these AMCs can be ambiguous or lag behind the pace of expansion. Our governance structure helped leaders focus and effectively coordinate the initiative across five hospitals. Second, it demonstrated the value of AMCs in advancing improvement science.10,11 Our conceptual model was robust, structured, disciplined, informed by science regarding why improvement efforts can stall,7 and tested before being implemented across the health system. Third, we demonstrated that AMCs can achieve high reliability on core measures, suggesting that other AMCs can do the same with the right leadership and a structured and robust improvement process.5

Importantly, our work adds to the field of health system governance and performance improvement. The JHM governance structure ensured accountability for the safety and quality of the care delivered throughout our complex, diverse, and growing academic health system. The fractal infrastructure established a cascading, explicit plan for accountability that mapped to a process for peer-to-peer learning and support, and was a powerful intervention in changing peer norms and improving performance.8 Our work highlights the importance of the commitment of leadership and the interdependence of hospitals, while encouraging independence through local innovation and implementation of such a model to improve quality of care. Hospital leaders and clinical department directors embraced the accountability plan and agreed to use it in future improvement efforts.

The conceptual model we used to achieve the 96% goal helped create a high-reliability system.7,12 In this initiative, hospital and affiliate trustees, hospital presidents, and other corporate officers deliberated over the goal, how to measure it, and how to achieve it. Once they reached consensus, they clearly communicated the goal to executive leadership and staff. At all board, executive, and departmental quality meetings, leaders reiterated the goal and reviewed hospital performance. Before this initiative, core measure performance was only reported quarterly at board and quality improvement meetings. Moreover, no performance goal had been articulated or accountability process defined to address inadequate performance. Armstrong Institute staff recognized that performance had to be reviewed monthly rather than quarterly, results had to be transparently reported using a standard format, and accountability had to be created through an escalating process of performance review, progressing from the bedside (i.e., the local hospital improvement team) to the board room (i.e., the JHM Board of Trustees).

We built capacity to ensure that clinicians had the time and skills needed to conduct robust improvement efforts.13,14 Local hospital clinicians were engaged through the clinical work groups, understood performance expectations, and felt ownership for improvement, while Armstrong Institute staff supported them throughout the initiative. Accountability led to quicker investigations into poor performance. For example, children’s asthma care compliance dropped in 2011 because we upgraded the computer order entry system and lost the decision support rules that helped ensure compliance. Once this problem was identified, a new decision support rule was written, and performance dramatically improved. Finally, to close the loop, we included a sustainability plan for each core measure.

Whether to focus on process or outcome measures is a contentious issue, but both are important. Too few valid outcome measures have been identified, so the development of such measures should be a research and policy priority. Process measures help target specific therapies to improve outcomes. Yet, process measures must be significant and valid, and some do not correlate with outcomes. Carefully selected and valid process measures can help focus improvement efforts. Still, patient outcomes are influenced by teamwork, culture, and leadership, factors that are not measured by care processes. To improve patient outcomes, we need to focus on the entire package—processes, culture, and outcomes.

Our evaluation has several limitations. First, we cannot establish a causal relationship between the initiative and care improvements because we used a pre–post study design, not a randomized controlled trial. Second, the study was conducted at one large academic health system, and its generalizability is unknown. JHM has a strong, long-standing culture of patient safety and quality improvement and scholarship in this area. Other academic and nonacademic institutions may lack this culture or the resources present at JHM. Nevertheless, the governance structure, fractal infrastructure, accountability model, and Lean Sigma are all general leadership interventions that could be implemented anywhere. Finally, some of the 2012 data were collected before we fully implemented the initiative, which may have affected our results.

Health systems could implement several aspects of this initiative with limited resources. First, board and health system leaders could create a governance structure for oversight to ensure that the quality and safety of care, anywhere it is delivered, are reported to a board committee. Second, leaders could apply the conceptual model we used—define and communicate goals and measures, create a supporting infrastructure by engaging local clinicians and connecting them in clinical communities, and create transparency and accountability systems. Although the resources devoted to creating a supporting infrastructure may vary, all health systems could adopt these two components using limited resources.


Our results suggest that this initiative was associated with improved performance on core process-of-care measures across JHHS. Robust leadership from the board and a fractal infrastructure created a model that fostered interdependence and independence among the participating hospitals and staff. At all five hospitals, performance improved, and some hospitals achieved the Delmarva Foundation and Joint Commission awards. Our initiative was effective not only because the work was implemented locally with staff from each hospital at the core of the process improvements but also because the entire JHM team was committed to and accountable for perfect patient care and safety. We will continue to apply this model to these seven core measures and will expand it to other performance measures across our health system.

Acknowledgments: The authors wish to thank the National Leadership Core Measures Work Groups for participating in the initiative: Cora Abundo, Sharon Allen, Marc Applestein, Walt Atha, Kelly Baca, Deborah Baker, Jennifer Baxter, Ed Bessman, Michael Brinkman, Judy Brown, Tanya Brown, Barbara Bryant, Brendan Carmody, Karen Carroll, Bridget Carver, Jennifer Castellani, Yang Ho Choi, Cathy Clarke, Mel Coker, Margareta Cuccia, Ruth Dalgetty, Richard Day, Andrea DelRosario, Katherine Deruggiero, Denice Duda, Robert Dudas, John Dunn, Damon Duquaine, Joe Dwyer, Alexis Edwards, Deirdre Flowers, Cathy Garger, Kimberly Goldsborough, Susan Groman, Felix Guzman, Leslie Hack, Margie Hackett, Laura Hagan, Judith Haynos, Elizabeth Heck, Genie Heitmiller, Peter Hill, Ann Hoffman, Keith Horvath, Roberta Jensen, Peter Johnson, Ilene Jonas, Kimberly Kelly, Terri Kemmerer, Salwa Khan, Mark Landrum, Karen Lieberman, Barton Leonard, Jackie Lobien, Chepkorir Maritim, Giuliana Markovich, Bernard Marquis, Blanka McClammer, Deborah McDonough, Barbara McGuiness, Janet McIntyre, Danielle McQuigg, Melissa Means, Karen Michaels, Julie Miller, Vicki Minor, Regina Morton, Jennifer Moyer, Hilda Nimako, Sharon Owens, Eric Park, Judith Peck, Peter Petrucci, Brent Petty, Marcy Post, Sarah Rasmussen, Jennifer Raynor, Joanne Renjel, Jon Resar, Sharon Rossi, Leo Rotello, Stuart Russell, Mustapha Saheed, Jacky Schultz, Paige Schwartz, Amanda Shrout, LeighAnn Sidone, Nancy Smith, Rita Smith, Tracey Smith, Evelyn St. Martin, Elizabeth Taffe, Cynthia Thomas, Tina Tolson, Jeff Trost, Cynthia Walters, Carol Ware, Robin Wessels, Glen Whitman, and Chet Wyman. 
The authors also wish to thank the Armstrong Institute Lean Sigma Master Black Belt coaches (Timothy Burroughs, Julie Cady-Reh, Richard Hill, Robert Hody, and Rick Powers); the Johns Hopkins Hospital clinical directors (Henry Brem, George Dover, David Eisele, James Ficke, Julie Freischlag, Brooks Jackson, Sewon Kang, Gabor Kelen, Andrew Lee, Jonathan Lewin, Justin McArthur, William Nelson, Daniel Nyhan, Jeffrey Palmer, Alan Partin, John Ulatowski, and Myron Weisfeldt) for their support of physician engagement and improving patient outcomes; the Quality Improvement Department vice president (Judy Brown), directors (Richard Day, Deborah McDonough, Janet McIntyre, and Katie Servis), and staff; Karthik Rao, for performing the literature review for this article; and Christine G. Holzmueller, for reviewing and editing an earlier version of this article.


1. McGlynn EA, Asch SM, Adams J, et al. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348:2635–2645
2. Hayward RA, Asch SM, Hogan MM, Hofer TP, Kerr EA. Sins of omission: Getting too little medical care may be the greatest threat to patient safety. J Gen Intern Med. 2005;20:686–691
3. Fineberg HV. Shattuck lecture. A successful and sustainable health system—how to get there from here. N Engl J Med. 2012;366:1020–1027
4. Delmarva Foundation. Excellence Award for Quality Improvement. 2013. Accessed October 23, 2014
5. Joint Commission. Facts about top performers on key quality measures program. Accessed October 23, 2014
6. Pronovost PJ, Demski R, Callender T, et al.; National Leadership Core Measures Work Groups. Demonstrating high reliability on accountability measures at the Johns Hopkins Hospital. Jt Comm J Qual Patient Saf. 2013;39:531–544
7. Dixon-Woods M, Baker R, Charles K, et al. Culture and behavior in the English National Health Service: Overview of lessons from a large multi-method study. BMJ Qual Saf. 2014;23:106–115
8. Dixon-Woods M, Bosk CL, Aveling EL, Goeschel CA, Pronovost PJ. Explaining Michigan: Developing an ex post theory of a quality improvement program. Milbank Q. 2011;89:167–205
9. Pande PS, Neuman RP, Cavanaugh RR. The Six Sigma Way: How GE, Motorola, and Other Top Companies Are Honing Their Performance. New York, NY: McGraw-Hill; 2000
10. Marshall M, Pronovost P, Dixon-Woods M. Promotion of improvement as a science. Lancet. 2013;381:419–421
11. Aveling EL, Martin G, Armstrong N, Banerjee J, Dixon-Woods M. Quality improvement through clinical communities: Eight lessons for practice. J Health Organ Manag. 2012;26:158–174
12. Weick KE, Sutcliffe KM. Managing the Unexpected: Resilient Performance in an Age of Uncertainty. 2nd ed. San Francisco, CA: Jossey-Bass; 2007
13. Cohen RI, Jaffrey F, Bruno J, Baumann MH. Quality improvement and pay for performance: Barriers to and strategies for success. Chest. 2013;143:1542–1547
14. Pronovost PJ, Rosenstein BJ, Paine L, et al. Paying the piper: Investing in infrastructure for patient safety. Jt Comm J Qual Patient Saf. 2008;34:342–348

Supplemental Digital Content

© 2015 by the Association of American Medical Colleges