Bohmer, Richard M.J. MBBCh, MPH; Bloom, Jonathan D. MD; Mort, Elizabeth A. MD, MPH; Demehin, Akinluwa A. MPH; Meyer, Gregg S. MD, MSc
Since the publication of the Institute of Medicine's (IOM's) first two reports on quality and safety, American health care delivery organizations have been challenged to focus on improving systems of care, adhere to underlying principles for providing quality care, and undertake redesign of health care delivery to ensure safety, effectiveness, efficiency, patient-centeredness, timeliness, and equity.1,2 Whereas previously quality was largely assumed, today it is increasingly quantifiable and requires organized systems for improvement. With the publication of the IOM reports, increased expectations for quality and safety, along with demands for greater transparency, were poised to have a profound effect on American health care. Yet, despite the compelling case for change that was made, to date many of the strongest voices behind those reports have noted relatively little progress.3 Why is this so?
In part, the progress has been slow because the task at hand is enormous. But, more important, a full response to the IOM reports requires more than the gradual evolution of existing structures; it requires a fundamental restructuring to create the cultural and technical revolutions required to fully embrace the IOM's vision. Traditional structures and cultures within academic health centers (AHCs) have been designed or have evolved organically to meet the tripartite missions of teaching, research, and clinical care. Most often, quality and safety structures have been positioned as adjuncts to the core organizational structures and, as such, are not in the position to radically advance the quality and safety agenda. As the IOM noted, advancing quality and safety will require more than “business as usual”; it demands leadership attention and the application of sound organizational management principles to position an organization for success. Here, we use a case study of efforts at Boston's Massachusetts General Hospital (MGH) to restructure quality and safety as an example of the value of using a systematic process of engaging clinical leadership, developing an organizational framework dependent on proven business principles, leveraging focus events, and maintaining executive dedication to execution of the initiative.
Recognizing the Need for Change
The efforts to improve quality and safety at MGH began long before the publication of the IOM quality reports. Early examples of that commitment include a full-disclosure publication4 of a fatal medication safety event in the New York Times in 1852 and the work of Codman5 on his End Results system a half century later. But, as is the case at most AHCs, other aspects of the tripartite mission of teaching, research, and cutting-edge clinical care most often received greater emphasis than clinical quality and safety. During the past half-century, another competitor for institutional attention emerged: the business imperative, best summarized by the maxim “no margin, no mission.”
Before 2003, MGH leadership had always recognized quality and safety as important, but they were not foremost on the institution's agenda. At our institution, as elsewhere, quality and safety initiatives had been distributed among various units, including nursing, compliance, quality assurance, risk management, and our decision support unit. Although each of these units contributed to the overall goal of providing safe and effective care, each had a subtly different immediate goal. Several unrelated efforts to improve quality and safety were being led across the hospital. Some of these, such as developing pathways for accelerating diet advancement to facilitate discharge and developing a telestroke program, were done under the aegis of an institution-wide clinical performance management program. Others, such as the development of radiology information systems, an integrated electronic health record for cardiac surgery, and teamwork training in anesthesia, were the results of departmental efforts. In addition, medical management efforts focused on pay for performance also began to proliferate, often without clear coordination with other organizational efforts.
Despite significant progress in many of these endeavors, the efforts were not centrally coordinated. Some focused on external reporting and meeting hospital regulations, and others addressed quality assurance redesign, while others still were concerned with immediate, short-term “fire-fighting” of emerging problems. The latter often occurred at the department or division levels of the organization with little knowledge or support from central organizational leadership. As a result, quality and safety efforts could achieve measurable progress in a few key areas, but the ad hoc approach created a highly inefficient system for achieving widespread change. Often, this decentralized system resulted in redundant efforts and confusion around roles and responsibilities. Despite tremendous talent and good intentions, we were organizationally ill equipped to meet the challenges outlined in the IOM reports.
To address the deficiencies in our approach to quality and safety improvement, in 2003, we conducted a strategic planning session to outline deliberate efforts for a coordinated, organization-wide approach to quality and safety (see Figure 1 for a timeline of the MGH restructuring efforts). The strategic planning effort was catalyzed by a confluence of changes in the leadership of the hospital and the physicians organization and the recognition that continued growth in the face of capacity constraints, provider competition, and possible softening of patient demand would be a central problem for our institution. Not surprisingly, like similar strategic planning exercises being conducted at academic medical centers around the country in the wake of the release of the IOM reports on quality, we concluded that quality and safety are critical to all core activities of the AHC and should permeate all strategic decisions.
What later distinguished our effort, however, was our choice of tactics to begin to address our “quality chasm” between where we were and where we wanted to be. During the strategic planning process, we generated an MGH-wide call for worthy quality and safety projects to help accelerate our efforts. We received a robust response of 29 submissions, each of which addressed at least one of the six aims identified in the IOM report (Safety, Effectiveness, Efficiency, Timeliness, Patient-Centeredness, and Equity). But as these submissions were reviewed during the strategic planning exercise, it became clear that our aspiration to lead in quality would not be met by a series of stand-alone projects, which ultimately fell short of creating a whole greater than the sum of its parts.
Rather than pursuing the 29 individual projects, we chose to undertake three short-term projects to help define our next steps. The first of the three was a review of our efforts at measuring quality across the institution. The “white paper” produced by this effort documented the variety of quality measurement activities at both the organizational and departmental levels, yet it also revealed the costly lack of coordination among those efforts (e.g., redundant data-collection efforts to collect patient-level data for reporting to different external groups). The second, a review of quality and safety programs at other institutions, revealed a variety of approaches to organizing for quality and safety within academic medical centers, yet it did not produce a clear best practice for doing so. Finally, we measured our institutional culture, focusing on high-risk units using the Agency for Healthcare Research and Quality safety culture survey.6 The results revealed that fewer than half of the physicians and nurses surveyed had an overall positive perception of our safety and that, in each safety domain, there was significant variation in perception across units. Separately, a review of data from our medicolegal insurer revealed that, as at other AHCs, organizational rather than individual failures remained a key problem.
Focusing on culture first
On the basis of the survey results, we reevaluated our tactics, shifting from a focus on interventions (the “seeds”) to preparing our culture (the “soil”) for our quality and safety journey. Officially shifting the focus to our culture acknowledged the importance of being attentive to the early steps of managing organizational change, such as creating a sense of urgency and developing a guiding coalition.7 Attention to culture is also based on the increasing appreciation of the importance of a quality- and safety-centered culture for producing the best possible outcomes.8,9 What started out as an ambitious set of disparate projects united by a common underlying theme (the six aims of the Quality Chasm report) was subsequently distilled to a parsimonious set of cultural interventions initiated in late 2003 and undertaken over a period of three years.
First, we encouraged faculty and staff to use an electronic safety reporting system to raise quality and safety issues forthrightly, without fear of punishment. The electronic system ensured confidentiality, which had been difficult to protect in the prior paper-based incident reporting system. The goal of this intervention was not the conversion from paper-based incident reporting to an electronic format; instead, it was the institutional commitment to make safety reporting easy and to provide a mechanism for feedback to those who reported issues.
The second cultural intervention was a formal commitment to broadly share our quality and safety data within the organization. In pursuit of this effort, we created a performance data-sharing policy and initiated a yearlong effort to convert the findings from the quality measurement white paper into a quality and safety dashboard.
In an effort to broaden our understanding of quality and safety issues beyond the information gleaned from patient experience survey results, we attempted to bring the patient's voice into our discussions through a composite patient experience narrative. This narrative was generated by asking hospital employees who had been patients in our institution to volunteer for a two-hour structured interview about their experiences as patients. Theoretically, our own employees best understand the institution's culture and practices but might also be unforgiving of poor performance in their own organization. We felt that the perspective of these individuals would be more revealing than that of nonemployee patients, whose views could be more readily challenged. In all, 40 interviews were completed and subjected to qualitative analysis, producing a powerful and frank appraisal of our strengths and weaknesses from a patient perspective. A number of respondents were very happy with their overall experience as patients, and the majority expressed confidence in the overall expertise and staff ability at MGH. At the same time, many expressed frustration with impersonal treatment, poor pain management, long waiting times, embarrassing or uncomfortable situations, violations of confidentiality, and systems of care that were poorly designed for both patients and the hospital staff caring for them.
Fostering leadership engagement
Our cultural interventions were complemented by an attempt to increase the engagement in quality and safety issues of the MGH and Massachusetts General Physicians Organization (MGPO) boards of trustees. This required exposing board members to the IOM Quality Chasm framework through presentations on quality and safety and later adopting many of the elements subsequently described in the “Boards on Board” effort of the Institute for Healthcare Improvement.10 As our efforts gained traction, the amount of board meeting time devoted to quality and safety increased to the point at which time spent on quality and safety eclipsed time devoted to finances (Figure 2). Although lay trustees generally have strong backgrounds in business and finance, making them well prepared to review and comment on financial matters, they needed additional time to become equally informed and engaged on quality matters.
Over time, our cultural interventions solidified the commitment of the hospital and physicians organization senior management leadership to making quality and safety the primary institutional focus. However, this shift in emphasis was not immediately embraced by departmental chiefs or clinical leadership, much less by the middle management or frontline clinical or administrative staff. Over time, we grew concerned that a disconnect between these organizational layers could undermine our modest progress. To address this concern, we initiated a series of executive education retreats for MGH, MGPO, and board leadership. During these retreats, participants not only examined case studies from within and outside health care but also studied frank appraisals of the current state of quality and safety at MGH. These retreats were intended to create a level playing field and a common language for future discussions on quality and safety, facilitate sharing the work done to date and the case for why we needed to do much more, tap into board member expertise in large-scale change in a noncrisis environment, and set the stage for the development of a robust multiyear plan. Beyond the content of the retreats, the unprecedented decision to devote six days spread over three retreats to quality and safety sent an important message across the institution about the seriousness of leaders’ devotion to these issues.
Any tendency toward continued incremental improvement in quality and safety was rapidly challenged by events internal and external to our retreat discussions. Internally, a discussion of the parallels between the 2003 Columbia Space Shuttle disaster11 and quality and safety vulnerabilities within our institution illustrated underlying concerns among leaders. After the accident, the Columbia Accident Investigation Board12 concluded that a foam strike on the shuttle's wing was the physical cause of the catastrophic structural failure, but that organizational flaws had allowed a known foam-shedding problem to be normalized rather than resolved before launch. The review of the Columbia case provided a safe metaphorical context to explore topics such as the normalization of deviance and organizational contributions to system failures. The exercise also provided a language to engage in frank conversations around our institution's extant system failures, such as the role of emergency department overcrowding—for which all parts of the institution must take some responsibility—in creating poor patient outcomes.
External to the retreat was a Joint Commission review of our facility during the final days of our third off-site meeting. Although MGH was ultimately reaccredited, the review survey revealed a number of quality-related areas in need of improvement. When confronted with these results, our organizational leadership faced two relatively stark options. The first, and safest, would be to keep our survey results relatively quiet within the institution in the hope that our steady albeit slow trajectory of improvement would eventually result in compliance. The second would be to immediately share the results of our survey throughout the organization in the hope that this transparency would foster the development of an institutional commitment to quality improvement, which would allow us to break through barriers that had condemned us to a slow pace of improvement. That approach, however, ran the risk of public disclosure.
In the end, the leadership decided to bring our results to the broad attention of the organization. Some of the immediate effects of this disclosure were quite positive. The act bolstered institutional resolve, inspired our lay trustees to generate appropriate demands for accountability, and engaged clinical leaders, particularly physicians, in tackling the challenges the Joint Commission identified. There were also negative effects of the disclosure, including a leak of our internal communications to The Boston Globe, the city's major daily newspaper. Our “poor performance” on the survey was covered as a front-page story.13
Ultimately, we chose not only to continue sharing data transparently within the organization but also to share our results and progress directly with our trustees, our patients, and the public.14 In the period immediately following the Joint Commission visit, nearly every senior leadership meeting began with a frank appraisal of our progress in addressing the challenges cited by the Joint Commission surveyors—a practice which continues to this day with a quality and safety update at the beginning of every senior leadership meeting.
Restructuring Quality and Safety
The combination of an obligatory period of cultural improvement followed by two key events—our leadership retreats and the Joint Commission survey—created the policy window for transformational change that escalated institutional momentum behind quality and safety improvement initiatives. This synergy caused us to realize that, despite modest progress, we needed to accelerate our efforts to improve quality and safety; without this window, the timeline for change would have been far longer.15 The context created by the combination of the focus on culture and the two key events demanded a new strategy for creating a systematic, integrated approach to planning, designing, measuring, assessing, and improving the quality and safety of care and service provided to our patients and employees. That approach was guided by a parsimonious set of principles, which were developed during the retreats. These principles are highlighted in List 1.
The six-month restructuring effort began with a systematic review of quality and safety programs both within and outside our institution. The former included an inventory of MGH quality and safety initiatives and a series of focused interviews with key stakeholders around the institution. One key finding from these interviews was that quality assurance activities, such as our Patient Care Assessment Committee and our safety reporting system, were not integrated with our quality improvement activities. The external review included benchmarking of leading programs at other AHCs. We considered various models, such as an academically focused quality and safety institute, a consultancy model in which a small number of experts support decentralized quality initiatives, and a completely centralized program with a large number of staff focused on measurement and improvement activities. Although no single model seemed applicable to our environment, each of these approaches contained elements that could inform our efforts, such as the need to explicitly integrate the development of a robust information-technology-based platform to support ongoing quality measurement.
Creating the organizational framework
Integrating the findings from the internal and external reviews resulted in a broad blueprint for our restructuring effort. Key aspects of that framework included the need for quality and safety to be locally owned (i.e., by departments) and involve all levels of the organization (with special attention to those in middle management, who had not been fully engaged in our prior quality and safety efforts). We felt this decentralized approach was essential in an autonomous, decentralized culture where frontline units were most familiar with issues and problems and were in the best position to devise appropriate solutions. An active central quality organization focused on measurement infrastructure and improvement expertise, however, was considered an essential enabler for this decentralized approach. The central quality organization would need to address our multiple missions of clinical care, research, and academics as well as our community service role. Moreover, we needed to rationalize an existing “quality bureaucracy,” characterized by a proliferation of committees, redundant efforts, confusion around roles and responsibilities, and competition for limited resources.
Given this required framework, it was clear that a traditional reorganization of the existing quality and safety programs would fall short of our ambitions. As a result, we chose to approach organizational redesign with a clean slate, recognizing that quality and safety work should not be morphed into or attached to the preexisting hospital structures. Two early activities provided definition to the effort. First, we recognized that the hospital's existing mission statement did not provide the opportunity to tightly integrate our quality and safety goals:
To provide the highest quality care to individuals and to the local and distant communities we serve, to advance care through excellence in biomedical research, and to educate future academic and practice leaders of the health care professions.
With this in mind, the mission statement was rewritten through an inclusive process which engaged multiple stakeholders across the organization. The resulting mission statement,
Guided by the needs of our patients and their families, we aim to deliver the very best health care in a safe, compassionate environment; we advance that care through innovative research and education; and, we improve the health and well-being of the diverse communities we serve,
was accompanied by an explicit organizational credo and boundaries statement.16 These three declarations, when accompanied by internal quality measurement and ongoing external benchmarking with peers searching for best practices, provided a cohesive organizational control system which has facilitated our quality and safety efforts.17 Both the process and content of the effort on the organizational control system afforded us the opportunity to progress along the change-management pathway through developing a vision, communicating that vision, and empowering a broad base of people to take action.7
The next step in our restructuring was a rigorous definition of the work needed to meet our mission—the functional requirements for the system. The functional requirements were organized into four broad categories which included operation controls (governance structures and incentives), performance measurement and tracking (with an emphasis on trying to discern important quality and safety signals from our data-collection efforts), analysis (including connections to our research enterprise), and process improvement design and implementation (including training in process improvement techniques and clinical decision support design). Matching functional requirements to our current capabilities allowed us to quickly identify important gap areas, decide which activities could be discontinued with resources reprogrammed to address weaknesses, and decide which of our existing activities in quality and safety had a place in our new structure.
Crafting an organizational design
The definition of functional requirements preceded the design of the final form of our Center for Quality and Safety (CQS). Our task was then to design a structure de novo to deliver these requirements. The guiding principle of the organizational redesign has been integration; activities, processes, and structures relating to quality and safety are often fragmented in complex health care delivery organizations. Moreover, the activities relating to performance—measurement and incentives, for example—and those relating to improvement are often separated. The new design attempted to integrate all of the functional requirements on the grounds that they all depended on each other. Hence, measurement and reporting activities, which produced reports for outside audiences and supported physician pay-for-performance incentives, were brought under one unit.
One of the largest innovations in our organizational redesign was the coupling of hospital operations and improvement (Figure 3).18 “Hospital operations” encompassed all of our existing mechanisms of delivering care, including strategy and performance measures. AHCs typically manage this cycle separately, making adjustments via incentives, investments in human and technical capital, and the establishment of behavioral boundaries. Our hospital had operated competently within this model for many years and, like most AHCs using this model, generally assumed it was implementing “best practice.” Though this may be true to a large extent, such an assumption slows progress and makes institution- or even unit-level improvements difficult to evaluate. “Improvement” encompasses a cycle of learning in which new data or experience indicate a need for changes in practice and operations. Though this improvement cycle exists at most other AHCs, deliberately coupling it to hospital operations by design is a novel approach to increasing the efficiency and effectiveness of care. The coupling provides real-time feedback after adjustments are made, allowing our hospital to quickly determine whether a new approach improves the quality or safety of care while retaining the flexibility to continuously change the design to meet emerging and ongoing quality and safety needs.
As in the Toyota Production System, this framework requires all staff, those within the CQS as well as those working at the department level, to have two jobs: do the work (the operations part of the cycle) and improve the work (the improvement cycle).5,19 In addition, it provides a mechanism to tightly link priorities identified through signals detected in the improvement cycle (e.g., results of patient experience surveys) to operations (e.g., the creation of explicit incentives to improve the aberrant signal). Finally, the new framework facilitates disciplined priority setting in which operational efforts are, by design, explicitly linked to signals from our improvement cycle.
Executing the strategy
Execution of the new quality and safety improvement strategy has consisted of seven key components: aligning the hospital's clinical leadership, investing in central and departmental infrastructure, leveraging information technology and its ability to enforce behaviors, implementing institution-wide incentives, emphasizing system-wide initiatives, engaging governance, and promoting transparency. Our hospital's commitment to this undertaking was expressed in many ways from the outset. The first was the creation of the new position of senior vice president of quality and patient safety. This was a clear statement of intent by the hospital leadership, and it granted the CQS improved access to both decision making and resources. The person in this position also serves as the director of the CQS. In addition, the hospital invested significant resources in central and departmental infrastructure to aid in this effort. Over $1.5 million was allocated for immediate support at the unit and department levels. Each department established a paid quality assurance chair with an explicit position description. Further, 30 paid employees—many of whom were redeployed from preexisting positions throughout the hospital—were organized to staff the center. To continue the integration of quality and safety into our organizational fabric, one leadership meeting per month is now devoted to a discussion of quality and safety issues. In addition, each senior leadership meeting now begins with a quality and safety discussion rather than focusing on finances first, as was the previous practice.
The new electronic dashboard, which actively collects real-time data on both outcome and process measures, exemplifies our hospital's effort to improve management efficiency in this way. Released in October 2007, the dashboard has become the primary vehicle for reporting quality, and it serves as a central place where leaders can track our progress on quality and safety initiatives. It also serves as an educational tool, describing each measure, displaying how we are doing relative to benchmark, and indicating whether and how we are improving. This makes it easy to identify measures on which significant improvement is possible. We anticipate the dashboard will play a greater role as the demand for transparency increases, and we are making deliberate efforts to promote transparency both internally and externally.20 Following Kotter's7 change-management process, we embarked on a series of short-term projects to generate early successes, consolidate our gains, and test our new approaches and structure in an effort to begin embedding them within our organizational culture. Following a transparent and inclusive priority-setting process, we have committed ourselves to undertaking two to three hospital-wide quality improvement initiatives per year. In addition, each clinical service or unit was required to undertake and report on one data-driven effort per year.
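The article does not describe the dashboard's internal design, but its three reported functions (describing each measure, comparing performance to benchmark, and indicating the direction of change) can be sketched in code. The sketch below is purely illustrative; every name in it (`DashboardMeasure`, `meets_benchmark`, `trend`) is a hypothetical construction, not MGH's actual system.

```python
from dataclasses import dataclass, field
from enum import Enum

class Trend(Enum):
    IMPROVING = "improving"
    FLAT = "flat"
    WORSENING = "worsening"

@dataclass
class DashboardMeasure:
    """One dashboard entry: a measure, its history, and its benchmark."""
    name: str
    description: str              # educational text describing the measure
    values: list = field(default_factory=list)  # chronological observations
    benchmark: float = 0.0        # external or peer benchmark
    higher_is_better: bool = True # e.g., compliance rates vs. infection rates

    def latest(self) -> float:
        return self.values[-1]

    def meets_benchmark(self) -> bool:
        # Compare the most recent observation against the benchmark,
        # respecting the direction in which the measure improves.
        if self.higher_is_better:
            return self.latest() >= self.benchmark
        return self.latest() <= self.benchmark

    def trend(self) -> Trend:
        # Direction of the most recent change, again direction-aware.
        if len(self.values) < 2 or self.values[-1] == self.values[-2]:
            return Trend.FLAT
        improved = self.values[-1] > self.values[-2]
        if not self.higher_is_better:
            improved = not improved
        return Trend.IMPROVING if improved else Trend.WORSENING
```

For example, a hand hygiene measure with observations `[78, 85, 92]` against a benchmark of 90 would report that it meets the benchmark and is improving, whereas an infection-rate measure (where lower is better) would be evaluated with the comparison reversed.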
We have employed two different methods to enable our improvement cycle to affect our operations cycle. First, we implemented institution-wide incentives. These encouraged activities that were consistent with our renewed focus on safe, high-quality care. Examples include financial incentives for all physicians aligned with network-wide quality goals, pay-for-performance contracts, and hospital-wide quality improvement initiatives.21 We incorporated institutional- and service-based quality goals into a new leadership bonus program so that a portion of the bonus is contingent on the recipient meeting these goals. Similar goals were incorporated in the employee bonus program. This marked an important decision to break the tradition of awarding the entire staff a bonus based on organizational financial performance in favor of splitting the bonus between financial performance and reaching our quality and safety improvement goals. Our initial employee incentive program focused on achieving >90% compliance with hand hygiene hospital-wide. This initiative drew increased attention to the goal and helped shift the culture as groups at the individual unit level launched their own campaigns for improvement. Although the bonus itself was modest (∼$500 per FTE staff member), the association between hand hygiene and the bonus program sent a clear message of institutional priorities—improving quality and safety was not an option and was as essential as the traditional “no margin, no mission.” By the end of the year, the goal was met. As a final example of the use of incentives, each department chair and vice chair was required to report on departmental quality improvement activities as part of their annual evaluation.
The second method was taking better advantage of our information technology to enforce behaviors. The best example lies in our efforts to improve medication reconciliation. A hard-stop was introduced into the orders section of the medical record 24 hours after admission if a preadmission medication list had not been built. Except in a medical emergency, this hard-stop prevented the primary team from writing additional orders until this key step in medication reconciliation had been completed. As a result, preadmission medication lists were built within 24 hours for 90% of all admissions and within 36 hours for 97%, and 96% of patients had the list reconciled with their in-hospital medications at the time of discharge.
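The hard-stop rule described above can be expressed as a simple decision function: block new orders once 24 hours have elapsed without a preadmission medication list, with an override for medical emergencies. This is a minimal sketch under those stated assumptions; the function and parameter names are hypothetical, and MGH's actual order-entry implementation is not described in this article.

```python
from datetime import datetime, timedelta

# Grace period before the hard-stop fires, per the rule described in the text.
HARD_STOP_DELAY = timedelta(hours=24)

def orders_allowed(admitted_at: datetime,
                   med_list_built: bool,
                   now: datetime,
                   medical_emergency: bool = False) -> bool:
    """Return True if the primary team may write new orders.

    Illustrative sketch only: the hard-stop fires 24 hours after
    admission if the preadmission medication list has not been built;
    a medical emergency always overrides the block.
    """
    if medical_emergency:      # emergencies bypass the hard-stop entirely
        return True
    if med_list_built:         # the reconciliation prerequisite is satisfied
        return True
    # Otherwise, ordering is allowed only within the 24-hour grace period.
    return now - admitted_at < HARD_STOP_DELAY
```

For instance, 30 hours after admission with no medication list built, `orders_allowed` would return `False` unless the emergency override were set, which is exactly the behavior that drove list completion within the first day for most admissions.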
For both hand hygiene and medication reconciliation, the cultural transformation engendered by the shift to extreme transparency was immediate and visible. For example, a hospital chaplain once stopped a group of rounding residents who were about to enter a patient's room to remind them of the importance of hand hygiene. The medication reconciliation effort elicited more consistent feedback from clinicians about how to improve our system and fostered acceptance of the need for a “hard-stop” at 24 hours after admission (with ongoing audits of noncompliance for allowable reasons, such as medical emergencies).
Lessons Learned and Next Steps
Although the context of each AHC could be seen as unique, the circumstances we leveraged to facilitate change at our AHC are common. This case study of our institution's efforts to improve the state of quality and safety illustrates a central point that is generalizable to similar opportunities and efforts at other AHCs: The “usual suspects” approach alone (i.e., organizing only peripheral projects focusing on hot topics) will not allow an organization to achieve the goal of superior quality and safety. Quality and safety improvement should become the central focus of the institution, as opposed to a goal that is acted on in addition to daily operations. In other words, better results are achieved when quality and safety become “the business,” as opposed to an augmentation to the business. This change requires the whole organization to become more oriented around these efforts, which in turn requires an explicit design, leadership support, and a coherent strategy.
Despite the ambitious scope of our change, we found the financial resources required to implement our design to be relatively modest. The majority of resources, both budget and staff, were harvested from disparate uncoordinated efforts across the organization and were redeployed under the new CQS. The incremental costs of executing our strategy, which included funding for the department-based quality chairs, investment in measurement infrastructure, and the incentive programs, represent an ongoing commitment of approximately 0.33% of the MGH operational budget. These costs are comparable with, and in many cases less than, the resources required to develop a new clinical program (e.g., developing a new ambulatory specialty practice). The real “costs” of execution cannot be understood in terms of dollars, but in terms of organizational will and leadership commitment—priceless prerequisites.
The organizational redesign we have employed is clearly an evolving work in progress. Sustaining our trajectory of cultural change while introducing new focus areas (e.g., improving the patient experience of care) remains the central focus of our efforts. To do so, we have explored new mechanisms to keep our entire workforce engaged beyond our early success with our hand hygiene program. For example, Excellence Every Day, our broad initiative to ensure continual survey readiness for the Joint Commission, builds off our earlier experience with engaging our community in responding to the 2006 Joint Commission survey. Given the current national emphasis on containing health care costs, we, like other AHCs, also anticipate shifting increased attention to mechanisms to improve efficiency through the introduction of new tools (e.g., robust process improvement).
Despite the need to address new challenges, we have found that the management design framework presented in this case study has remained a useful and adaptable platform for promoting change. As illustrated here, the advantages of applying a thoughtful design framework over a more opportunistic and project-driven approach continue to be realized and present a generic framework for AHCs to improve quality and safety.