Other Outcomes of the Improvement Initiative
In this article, we describe a new governance structure to oversee the quality of care provided across JHM and how the governing bodies, supported by the Armstrong Institute, used a conceptual model nested in a fractal infrastructure to improve performance on core measures. The governance structure provided a much-needed platform to help health system and hospital leaders manage quality-of-care improvement efforts throughout a complex health system. Within this structure, the JHM Quality Board Committee was created; its members answer to the JHM Board of Trustees and serve as a subcommittee for each hospital board of trustees, connecting all participating entities and engaging them in common goals.
As health systems expand, especially academic health systems, it is unclear who is overseeing the quality of the care delivered, and how that oversight connects practitioners in different settings (hospitals, ambulatory clinics, ambulatory surgical centers, and other care areas) and through multiple organizational levels to a board structure. Accountability is a relationship involving obligations among individuals, and each level must clarify roles and provide resources to ensure that quality goals can be met. Although this oversight system is not fully mapped for JHM, our new governance structure is allowing such mapping to occur. For example, Armstrong Institute and JHM leaders are examining the oversight of all ambulatory surgery facilities, an area our safety leaders identified as high risk. In this effort, Armstrong Institute leaders are working to clearly define the roles and responsibilities for quality with the hospital, ambulatory, and home care executives and with the academic department chairs. This work is technically and socially complex, and the JHM Quality Board Committee provides both a home and an urgency for such efforts.
Once the conceptual model was implemented, all five hospitals improved their performance on the core measures, and some hospitals received the Delmarva Foundation and Joint Commission awards. We believe that this model, nested in the fractal infrastructure, enabled the quality improvement initiative to succeed among the five hospitals by balancing their independence and interdependence. We clearly communicated the goals of the initiative and the hospitals’ performance in meeting them, built capacity for the initiative through core measure clinical work groups and the use of Lean Sigma methods, and established processes for accountability and sustainability. Intrinsic motivation, through the Armstrong Institute’s mission to partner with patients, their loved ones, and others to eliminate preventable harm, to continuously improve patient outcomes and experience, and to eliminate waste in health care, was the catalyst to improve care. Nevertheless, the Delmarva Foundation and Joint Commission awards were galvanizing forces, providing a specific goal toward which to focus the improvement efforts.
Our work is novel and has several important implications. Foremost, it demonstrated the importance of strong leadership from the board and an independent committee of leaders in quality and safety at a large AMC. Many AMCs are acquiring hospitals and developing lower-cost off-campus service sites (e.g., ambulatory surgery centers), and the governance and management of quality and patient safety in these AMCs can be ambiguous or lag behind the pace of expansion. Our governance structure helped leaders focus and effectively coordinate the initiative across five hospitals. Second, it demonstrated the value of AMCs in advancing improvement science.10,11 Our conceptual model was robust, structured, disciplined, informed by science regarding why improvement efforts can stall,7 and tested before being implemented across the health system. Third, we demonstrated that AMCs can achieve high reliability on core measures, suggesting that other AMCs can do the same with the right leadership and a structured and robust improvement process.5
Importantly, our work adds to the field of health system governance and performance improvement. The JHM governance structure ensured accountability for the safety and quality of the care delivered throughout our complex, diverse, and growing academic health system. The fractal infrastructure established a cascading, explicit plan for accountability that mapped to a process for peer-to-peer learning and support, and it was a powerful intervention in changing peer norms and improving performance.8 Our work highlights the importance of leadership commitment and of hospital interdependence, while encouraging independence through local innovation in implementing such a model to improve quality of care. Hospital leaders and clinical department directors embraced the accountability plan and agreed to use it in future improvement efforts.
The conceptual model we used to achieve the 96% goal helped create a high-reliability system.7,12 In this initiative, hospital and affiliate trustees, hospital presidents, and other corporate officers deliberated over the goal, how to measure it, and how to achieve it. Once they reached consensus, they clearly communicated the goal to executive leadership and staff. At all board, executive, and departmental quality meetings, leaders reiterated the goal and reviewed hospital performance. Before this initiative, core measure performance was only reported quarterly at board and quality improvement meetings. Moreover, no performance goal had been articulated or accountability process defined to address inadequate performance. Armstrong Institute staff recognized that performance had to be reviewed monthly rather than quarterly, results had to be transparently reported using a standard format, and accountability had to be created through an escalating process of performance review, progressing from the bedside (i.e., the local hospital improvement team) to the board room (i.e., the JHM Board of Trustees).
We built capacity to ensure that clinicians had the time and skills needed to conduct robust improvement efforts.13,14 Local hospital clinicians were engaged through the clinical work groups, understood performance expectations, and felt ownership for improvement, while Armstrong Institute staff supported them throughout the initiative. Accountability led to quicker investigations into poor performance. For example, children’s asthma care compliance dropped in 2011 because we upgraded the computer order entry system and lost the decision support rules that helped ensure compliance. Once this problem was identified, a new decision support rule was written, and performance dramatically improved. Finally, to close the loop, we included a sustainability plan for each core measure.
Whether to focus on process or outcome measures is a contentious issue, but both are important. Too few valid outcome measures have been identified, so the development of such measures should be a research and policy priority. Process measures help target specific therapies to improve outcomes. Yet process measures must be clinically meaningful and valid, and some do not correlate with outcomes. Carefully selected and valid process measures can help focus improvement efforts. Still, patient outcomes are influenced by teamwork, culture, and leadership, factors that are not captured by measures of care processes. To improve patient outcomes, we need to focus on the entire package—processes, culture, and outcomes.
Our evaluation has several limitations. First, we cannot establish a causal relationship between the initiative and care improvements because we used a pre–post study design, not a randomized controlled trial. Second, the study was conducted at one large academic health system, and its generalizability is unknown. JHM has a strong, long-standing culture of patient safety and quality improvement and of scholarship in this area. Other academic and nonacademic institutions may lack this culture or the resources present at JHM. Nevertheless, the governance structure, fractal infrastructure, accountability model, and Lean Sigma are all general leadership interventions that could be implemented anywhere. Finally, some of the 2012 data were collected before we fully implemented the initiative, which may have affected our results.
Health systems could implement several aspects of this initiative with limited resources. First, board and health system leaders could create a governance structure for oversight to ensure that the quality and safety of care, anywhere it is delivered, are reported to a board committee. Second, leaders could apply the conceptual model we used—define and communicate goals and measures, create a supporting infrastructure by engaging local clinicians and connecting them in clinical communities, and create transparency and accountability systems. Although the resources devoted to creating a supporting infrastructure may vary, all health systems could adopt these two components using limited resources.
Our results suggest that this initiative was associated with improved performance on core process-of-care measures across JHHS. Robust leadership from the board and a fractal infrastructure created a model that fostered interdependence and independence among the participating hospitals and staff. At all five hospitals, performance improved, and some hospitals achieved the Delmarva Foundation and Joint Commission awards. Our initiative was effective not only because the work was implemented locally with staff from each hospital at the core of the process improvements but also because the entire JHM team was committed and accountable for perfect patient care and safety. We will continue to apply this model to these seven core measures and will expand it to other performance measures across our health system.
Acknowledgments: The authors wish to thank the National Leadership Core Measures Work Groups for participating in the initiative: Cora Abundo, Sharon Allen, Marc Applestein, Walt Atha, Kelly Baca, Deborah Baker, Jennifer Baxter, Ed Bessman, Michael Brinkman, Judy Brown, Tanya Brown, Barbara Bryant, Brendan Carmody, Karen Carroll, Bridget Carver, Jennifer Castellani, Yang Ho Choi, Cathy Clarke, Mel Coker, Margareta Cuccia, Ruth Dalgetty, Richard Day, Andrea DelRosario, Katherine Deruggiero, Denice Duda, Robert Dudas, John Dunn, Damon Duquaine, Joe Dwyer, Alexis Edwards, Deirdre Flowers, Cathy Garger, Kimberly Goldsborough, Susan Groman, Felix Guzman, Leslie Hack, Margie Hackett, Laura Hagan, Judith Haynos, Elizabeth Heck, Genie Heitmiller, Peter Hill, Ann Hoffman, Keith Horvath, Roberta Jensen, Peter Johnson, Ilene Jonas, Kimberly Kelly, Terri Kemmerer, Salwa Khan, Mark Landrum, Karen Lieberman, Barton Leonard, Jackie Lobien, Chepkorir Maritim, Giuliana Markovich, Bernard Marquis, Blanka McClammer, Deborah McDonough, Barbara McGuiness, Janet McIntyre, Danielle McQuigg, Melissa Means, Karen Michaels, Julie Miller, Vicki Minor, Regina Morton, Jennifer Moyer, Hilda Nimako, Sharon Owens, Eric Park, Judith Peck, Peter Petrucci, Brent Petty, Marcy Post, Sarah Rasmussen, Jennifer Raynor, Joanne Renjel, Jon Resar, Sharon Rossi, Leo Rotello, Stuart Russell, Mustapha Saheed, Jacky Schultz, Paige Schwartz, Amanda Shrout, LeighAnn Sidone, Nancy Smith, Rita Smith, Tracey Smith, Evelyn St. Martin, Elizabeth Taffe, Cynthia Thomas, Tina Tolson, Jeff Trost, Cynthia Walters, Carol Ware, Robin Wessels, Glen Whitman, and Chet Wyman. 
The authors also wish to thank the Armstrong Institute Lean Sigma Master Black Belt coaches (Timothy Burroughs, Julie Cady-Reh, Richard Hill, Robert Hody, and Rick Powers); the Johns Hopkins Hospital clinical directors (Henry Brem, George Dover, David Eisele, James Ficke, Julie Freischlag, Brooks Jackson, Sewon Kang, Gabor Kelen, Andrew Lee, Jonathan Lewin, Justin McArthur, William Nelson, Daniel Nyhan, Jeffrey Palmer, Alan Partin, John Ulatowski, and Myron Weisfeldt) for their support of physician engagement and improving patient outcomes; the Quality Improvement Department vice president (Judy Brown), directors (Richard Day, Deborah McDonough, Janet McIntyre, and Katie Servis), and staff; Karthik Rao, for performing the literature review for this article; and Christine G. Holzmueller, for reviewing and editing an earlier version of this article.
References
1. McGlynn EA, Asch SM, Adams J, et al. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348:2635–2645
2. Hayward RA, Asch SM, Hogan MM, Hofer TP, Kerr EA. Sins of omission: Getting too little medical care may be the greatest threat to patient safety. J Gen Intern Med. 2005;20:686–691
3. Fineberg HV. Shattuck lecture. A successful and sustainable health system—how to get there from here. N Engl J Med. 2012;366:1020–1027
6. Pronovost PJ, Demski R, Callender T, et al; National Leadership Core Measures Work Groups. Demonstrating high reliability on accountability measures at the Johns Hopkins Hospital. Jt Comm J Qual Patient Saf. 2013;39:531–544
7. Dixon-Woods M, Baker R, Charles K, et al. Culture and behavior in the English National Health Service: Overview of lessons from a large multi-method study. BMJ Qual Saf. 2014;23:106–115
8. Dixon-Woods M, Bosk CL, Aveling EL, Goeschel CA, Pronovost PJ. Explaining Michigan: Developing an ex post theory of a quality improvement program. Milbank Q. 2011;89:167–205
9. Pande PS, Neuman RP, Cavanaugh RR. The Six Sigma Way: How GE, Motorola, and Other Top Companies Are Honing Their Performance. New York, NY: McGraw-Hill; 2000
10. Marshall M, Pronovost P, Dixon-Woods M. Promotion of improvement as a science. Lancet. 2013;381:419–421
11. Aveling EL, Martin G, Armstrong N, Banerjee J, Dixon-Woods M. Quality improvement through clinical communities: Eight lessons for practice. J Health Organ Manag. 2012;26:158–174
12. Weick KE, Sutcliffe KM. Managing the Unexpected: Resilient Performance in an Age of Uncertainty. 2nd ed. San Francisco, Calif: Jossey-Bass; 2007
13. Cohen RI, Jaffrey F, Bruno J, Baumann MH. Quality improvement and pay for performance: Barriers to and strategies for success. Chest. 2013;143:1542–1547
14. Pronovost PJ, Rosenstein BJ, Paine L, et al. Paying the piper: Investing in infrastructure for patient safety. Jt Comm J Qual Patient Saf. 2008;34:342–348
© 2015 by the Association of American Medical Colleges