
Building the Bridge to Quality: An Urgent Call to Integrate Quality Improvement and Patient Safety Education With Clinical Care

Wong, Brian M. MD; Baum, Karyn D. MD, MSEd, MHA; Headrick, Linda A. MD, MS; Holmboe, Eric S. MD; Moss, Fiona CBE, MD; Ogrinc, Greg MD, MS; Shojania, Kaveh G. MD; Vaux, Emma MD, PhD; Warm, Eric J. MD; Frank, Jason R. MD, MA(Ed)

doi: 10.1097/ACM.0000000000002937


Nearly 20 years after the seminal To Err Is Human report,1 numerous multinational studies suggest that the quality and safety chasm persists,2–4 driven by inadequate leadership preparation, system and process failures, poor communication, and disempowerment of staff and patients. Unfortunately, health professions education does not currently equip learners with the skills needed to reduce these antecedents of poor quality.5–8 This lack of progress is particularly disappointing given the multiple calls for training programs to establish core competencies in quality improvement and patient safety (QIPS) among their graduates. In fact, dating back to 2003, the Institute of Medicine’s “Health Professions Education: A Bridge to Quality” report listed “focusing on quality improvement” as 1 of the 5 key competencies required of all health professionals to ensure that they provide the highest-quality and safest medical care possible.9

Current models of QIPS education are not fully integrated with clinical care delivery, which represents a major impediment toward achieving widespread QIPS competency among health professions learners and practitioners. Specific issues that highlight this lack of integration include:

  • The lack of clarity surrounding the actual role that learners should play in contributing to organizational change and improvement10;
  • The rotational model of training and the lack of integration of learners into the clinical microsystems in which they work11,12;
  • The shortage of faculty who have sufficient expertise in QIPS to support resident learning in many training programs8,13;
  • The lack of alignment between the care delivery system and the educational program with respect to the evaluation of key educational and clinical outcomes14,15; and
  • Clinical learning environments that are unaware of, or continue to achieve, suboptimal safety and quality outcomes and that may imprint upon learners the types of behaviors that are carried forward years into practice, perpetuating these suboptimal outcomes.16,17

To address the urgent need to improve QIPS training for learners as well as outcomes and experiences for patients and their families, the Royal College of Physicians and Surgeons of Canada organized a 2-day consensus conference called Building the Bridge to Quality on September 28 and 29, 2016, in Niagara Falls, Ontario, Canada (see Supplemental Digital Appendix 1, available at http://links.lww.com/ACADMED/A730, for conference faculty list and agenda). The goal was to convene an international group of educational and health system leaders, educators, frontline clinicians, learners, and patients to create a list of actionable strategies that individuals and organizations can use to better integrate QIPS education with clinical care. In this article, we introduce a new framework that acts as a road map toward a more integrated model of QIPS education and clinical care, summarize the process undertaken to develop the list of actionable strategies (framed as action statements organized under 4 key strategic directions), and provide concrete examples of how groups can get started.

The QIPS Education Integration Framework

As a first step, organizations need to determine their current state of teaching and doing QIPS, with a goal to move toward a more integrated model of QIPS education and clinical care. Here, the term “organization” could represent formally recognized institutions (e.g., academic teaching hospital) as well as the units or clinical microsystems (e.g., internal medicine unit, emergency department) within them. The QIPS Education Integration Framework represents an organization’s starting point and desired future state using a 2 × 2 matrix, where the intensity of QIPS education in the training program is represented on the y-axis, and the intensity of QIPS activity in the clinical environment is represented on the x-axis (see Figure 1).

Figure 1: The QIPS Education Integration Framework. The framework represents an organization’s starting point and desired future state using a 2 × 2 matrix, where the intensity of QIPS education in the training program is represented on the y-axis, and the intensity of QIPS activity in the clinical environment is represented on the x-axis. Abbreviation: QIPS indicates quality improvement and patient safety.

Using this QIPS Education Integration Framework, organizations are situated in 1 of 4 quadrants, where quadrant 4 is the desired state.

  • Quadrant 1 organizations are characterized by training programs that handle QIPS education (the y-axis criteria) superficially (e.g., several patient safety presentations during academic “half-days”), with few faculty members actively engaged in QIPS activities or teaching. QIPS activities in the clinical environment (the x-axis criteria) include only those that are mandated by regulatory requirements (e.g., meeting accreditation standards that relate to QIPS).
  • Quadrant 2 organizations have a clinical environment with a well-articulated and visible QIPS strategy (e.g., hospital-wide QIPS strategic plan, strong investment in individual and systems-based solutions to address QIPS concerns), but training programs lag with respect to QIPS education offerings. QIPS activities occurring in the clinical environment do not include learners in advancing QIPS goals.
  • Quadrant 3 organizations have the opposite pattern; training programs actively engage learners in QIPS education, typically in the form of experiential quality improvement (QI) projects, with many faculty proficient in both the practice and the teaching of QIPS. However, the clinical environment does not have the resources in place to support meaningful learner engagement in improvement efforts. The culture of the clinical environment fosters resistance to change among frontline clinical providers, which conveys tacit messages that run counter to the goals of the QIPS education.
  • Quadrant 4 organizations represent the desired state, with QIPS educational activities fully integrated with QIPS activities in the clinical environment. Learners and faculty have many opportunities to contribute to advancing QIPS goals within the clinical environment and see QIPS education and practice as part of daily work. These high-functioning QIPS clinical environments immerse learners in high-quality, safe practices and a culture that fosters continuous learning and improvement.

Consensus-Building Process

With the QIPS Education Integration Framework in mind, we engaged in a consensus-building process that started before the 2-day conference held in September 2016 and continued after the in-person conference. Several principles guided our work. First, we wanted to ensure broad engagement of key stakeholder groups (e.g., clinicians from multiple professions, learners, patients and families, educators including program directors, educational and health system leaders, and policy makers and regulators) in all aspects of the consensus-building process. We also valued international perspectives and prioritized international representation in our planning process. Finally, we deliberately used an iterative approach to planning that allowed consensus statements to develop through discussions and to be refined through iterative loops of feedback. By adopting these guiding principles, we hoped that key stakeholder groups would feel that they had a true “stake” in promoting the eventual implementation and dissemination of the consensus action statements. List 1 provides a detailed overview of the consensus-building process, divided into preconference, in-conference, and postconference activities.

List 1

Consensus-Building Activities, From the Building the Bridge to Quality Conference, Niagara Falls, Ontario, Canada, September 28 and 29, 2016

Preconference activities

Established the conference planning committee
  • Appointed a committee chair and planning committee members from Canada, the United States, and the United Kingdom (including a patient and a learner representative)
  • Held monthly teleconferences to organize the 2-day meeting and develop the overall consensus-building process (including pre- and postconference activities)
  • Developed the conceptual framework (see Figure 1) to guide discussions and brainstorming activities planned for the 2-day conference
  • Planned the overall structure and content of the 2-day meeting (including conference goals and objectives, session design, use of interactive techniques to generate ideas, and facilitated discussion to summarize key themes and recommendations)
Proposed a conceptual framework
  • Collectively created a purpose statement to guide the work of the 2-day conference
  • Developed the QIPS Education Integration Framework (2 × 2 grid) that would inform brainstorming activities to identify strategies that could bridge QIPS education and clinical care
Assembled a Program Advisory Board (PAB)
  • Invited 40 individuals representing clinicians from multiple professions (medicine, nursing, and pharmacy), learners, patients, researchers, and educational and health system leaders from multiple countries (including Canada, the United States, the United Kingdom, and Australia)
  • Engaged the PAB on 2 teleconferences and follow-up email correspondence to solicit advice on the purpose statement for the meeting, the conceptual framework, and the organization of the conference itself (e.g., we modified our program to lead and close the meeting with presentations from a learner and a patient)
Identified key stakeholder groups and organizations and encouraged their constituents to attend the conference
  • Disseminated promotional materials to key stakeholder groups and organizations
  • These included:
    • Medical societies, colleges, and associations (e.g., Royal Society of Medicine in the United Kingdom, Royal Australasian College of Surgeons, Association of American Medical Colleges)
    • Regulatory bodies (e.g., Royal College, Accreditation Canada, Accreditation Council for Graduate Medical Education)
    • Nursing organizations (e.g., Quality and Safety Education for Nurses Institute)
    • Resident organizations (e.g., Resident Doctors of Canada)
    • Patient partners (e.g., Patients for Patient Safety Canada)
    • Provincial health quality councils (e.g., Health Quality Ontario)
    • Medical protective agencies (e.g., Canadian Medical Protective Association)

In-conference activities

  • Setting the stage: Opening remarks from a patient and a learner, introduction of the organizational framework, and panel presentation of innovative QIPS educational models that improve both learner and clinical outcomes
  • Brainstorming activities: Small-group breakout session to engage participants in brainstorming exercises to generate idea maps and, eventually, early versions of action statements
  • Summarizing themes: Reporting ideas and strategies discussed during brainstorming sessions in a facilitated large-group setting, with further refinement of ideas
  • Stakeholder mapping: Small-group discussions to identify key stakeholder groups and to determine how best to partner with them to disseminate, implement, and evaluate changes related to action statements
  • Feedback from education and health system leaders: An international panel of education and health system leaders shared their insights on the meeting as a whole, reflected on the days’ discussions, and provided guidance for how to proceed
  • Commitment to change: Participants reflected on what they could do as individuals to enact change in their local environments and committed to making this change when they returned home

Postconference activities

Prepared preliminary draft of the action statements based on conference data synthesis
  • A subgroup of the conference planning committee synthesized data from the following sources:
    • Hundreds of post-it notes with ideas generated by conference participants
    • Chart paper summarizing small-group discussions and idea maps
    • Meeting notes from large-group discussions as well as faculty debrief meetings at the start and end of each day of the conference
    • Carbon copies of driver diagrams documenting individual participants’ commitments to change
    • Tweets from the conference Twitter feed
    • Personal reflections of the conference planning committee members
    • Preliminary action statements generated during the actual conference
  • Data synthesis led to preliminary draft of action statements
Iterative refinement of action statements
  • Held 2 teleconferences for conference planning committee members, which led to significant reorganization of action statements into 4 key strategic directions and removal of redundant items
  • Organized webinars for PAB and conference participants to share updated version of the action statements; feedback provided to further refine recommendations (e.g., greater emphasis on QIPS education, patient perspective, making the language more interprofessional)
Dissemination of the penultimate version of action statements
  • Workshops delivered as part of the Association of American Medical Colleges Integrating Quality meetings (June 2017, 2018), the Royal College of Physicians and Surgeons of Canada International Conference on Residency Education (October 2017), and the ASPIRE Faculty Development program (December 2017), where strategic directions and action statements were presented
  • Served both as an opportunity for early dissemination and as a form of member checking to ensure that the strategic directions and action statements resonated with the target audience

Abbreviations: QIPS indicates quality improvement and patient safety; PAB, Program Advisory Board; ASPIRE, Advancing Safety for Patients in Residency Education.

Key Strategic Directions and Related Action Statements

Through the consensus-building process, emergent themes were brought together into 4 key strategic directions. A series of action statements was generated for each of these strategic directions (see Table 1). The action statements are intended to provide actionable strategies for achieving the integration of QIPS education and clinical care.

Table 1: Strategic Directions and Action Statements Developed Through the Building the Bridge to Quality Initiative

Several key assumptions provide important context for the strategic directions and action statements that emerged through the consensus-building process. First, it must be emphasized that the central motivation for integrating QIPS education and clinical care is to make patient experiences and outcomes better and that “better patient outcomes” are the ones that are meaningful to patients and families. Second, there was broad consensus that the recommendations should not refer only to a single health profession (i.e., physician training). Instead, we intend the terms “learners” and “faculty” to represent individuals across all health professions. The next sections provide a more detailed description of the action statements with illustrative examples.

Strategic direction 1: Prioritize the integration of QIPS education and clinical care

The action statements within this first strategic direction highlight the need for training programs and clinical environments to work together to fully integrate QIPS education into clinical care. For many, the lack of integration between QIPS education and clinical care remains an unrecognized gap and perpetuates ongoing concerns about educating learners in systems that achieve suboptimal results. Elevating the importance of this integration and celebrating and disseminating successful examples are crucial first steps that lay the foundation for activities at the intersection of QIPS education and clinical care that can improve current and future patient experiences and outcomes.

The Exemplary Care and Learning Site model has shown early promise as an approach to achieving continual improvement in care and learning in the clinical setting.18 The model’s core design elements include patients and families informing process changes; trainees engaging both in care and in the improvement of care; and leaders knowing, valuing, and practicing improvement. The pilot experience demonstrated both the immense value that such an integrated model brings to learning and care and the challenges that some sites faced when trying to implement certain elements (particularly the patient and family element). Other models that simultaneously improve learning about QIPS and optimize patient care include the High Reliability Organization19 and Learning Health System20 frameworks; adapting these to explicitly engage learners and directly inform QIPS education is another potential strategy.

On a broader scale, the Accreditation Council for Graduate Medical Education (ACGME) Clinical Learning Environment Review (CLER) initiative represents the most explicit effort by a national organization to bring about change to integrate QIPS education into clinical care.14 Its stated goal is “to improve how clinical sites engage resident and fellow physicians in learning to provide safe, high quality patient care.”21 The associated “Pursuing Excellence in Clinical Learning Environments” initiative demonstrates a commitment to driving change, with seed funding provided to 8 innovation sites to develop new models of education and care and a plan to disseminate and celebrate successful models for others to adopt and implement.

Investment of resources is critical to achieving better integration of QIPS education and clinical care and will require innovative funding models. For example, the University of California, San Francisco (UCSF) developed the Housestaff Incentive Program to drive resident-led, system-level improvements in both organizational and program-specific QI goals.22 Through this program, which derives its funding from the hospital’s broader pay-for-performance program for frontline providers, each resident receives $400 per goal achieved, up to $1,200 per year, if their program achieves its QI goals for the year. An evaluation of this initiative demonstrated meaningful engagement of housestaff in organizational QIPS efforts that led to significant improvement on numerous QI goals.

Strategic direction 2: Build structures and implement processes to integrate QIPS education and clinical care

Important structural barriers exist that will hinder efforts to prioritize the integration of QIPS education and clinical care. Despite colocation within an academic teaching hospital, community-based residency program, or ambulatory clinic, education and health systems and their leaders have largely worked separately within their own silos. Learners are often unable to engage in QI work even when interested because of conflicting priorities, schedules, and reward systems. Patients and families are pushed even further from the action. Educational systems revolve around an academic structure where faculty are rewarded for grants, publications, and excellence in education. Health care delivery systems, on the other hand, revolve around the chief executive officer and other executives, with providers and staff receiving rewards for the delivery of financially effective and efficient care. Therefore, the action statements for this strategic direction provide organizations with attainable strategies to build new structures and implement new processes targeted at removing barriers between the existing silos.

Several published examples offer guidance for building processes and structures to better integrate QIPS education and clinical care. The Beth Israel Deaconess Medical Center’s internal medicine program reorganized its resident clinical teaching unit assignment process so that residents rotate back to the same unit multiple times to facilitate longitudinal involvement in QI initiatives.23 The University of Cincinnati’s internal medicine program completely restructured its 3-year program to create an “ambulatory long block” of 12 consecutive months to fully embed residents in the interprofessional team of an ambulatory care clinic for a year, thereby “pulling” learners into QI activities occurring in the clinical environment as active participants.24 Both represent structural changes that overcome the rotational nature of residency training to allow for better integration of learners into interprofessional teams within the clinical microsystem. Other institutions, such as the University of Pennsylvania Health System, have created specific QI tracks that align with institutional quality goals and include a core curriculum, faculty mentorship, and integration into an interprofessional health care leadership team that is accountable for quality and safety outcomes in a hospital unit.25

At the level of the care delivery system, some organizations have created formal structures to bridge the gap between QIPS education and clinical care. Examples include the Department of Veterans Affairs Chief Resident in Quality and Patient Safety program26 and the emergence of House Staff Quality Councils,27 as well as faculty leadership roles that bridge the educational and care delivery systems.28 Such roles and committees create a formal platform for resident and faculty engagement in institutional QIPS activities. Adopting promotion language and processes that support advancement based on QIPS work29 creates legitimacy for this type of work and could encourage faculty and resident interest in serving in these roles.

Strategic direction 3: Build capacity for QIPS education at multiple levels

The first 2 strategic directions lay the groundwork for the integration of QIPS education and clinical care, yet success will hinge on sustained and broad-based efforts to build capacity among clinicians, learners, patients and families, and education and health system leaders to carry out, teach, and role model QIPS in their daily work. While some faculty development models exist for QI education,13,30 they tend to pull individuals out of their local settings and may inadvertently widen the divide between QIPS education and care delivery. New capacity-building models are needed.

Treating faculty and residents as co-learners is one potentially promising approach to build capacity across both faculty and learner levels. The University of Toronto has had extensive experience with such a model, with faculty and residents experiencing the QI education together and working alongside one another on QI projects.31 Resident participants learn QI through experiential project work, and faculty participants sharpen their QI skills while simultaneously developing their ability to supervise and teach QI. This faculty–resident pairing has resulted in significant QI capacity building, contributed to the sustainability of the program, and resulted in numerous improvements to clinical processes and outcomes.

Similarly, the University of Missouri–Columbia academic health center integrated medical students, nursing students, and other health care professionals in training with health care workers on interprofessional QI teams.32 Teams received training in QI, accompanied by expert QI mentoring, with the dual goals of increasing expertise in improvement while improving care. In addition to clear evidence that students gained QI skills, many of the teams also improved care processes. In a published description of the program, one of the key drivers cited by the authors was the combination of the idealism of health professions learners with the real-life experience of seasoned health care professionals on the same working team.

Leveraging new models of continuing professional development that align with QI can also build much-needed capacity at the faculty/practitioner level33 and, in some cases, offer opportunities to support resident/learner QIPS education as well.34 The emergence of the American Board of Medical Specialties (ABMS) Multispecialty Portfolio Program,35 which works with a wide variety of health care organizations to support QI activities that physicians can participate in to earn continuing certification credit, is one such model.

QIPS educational efforts must expand beyond faculty members and learners to include education and health system leaders, who have influence over the necessary process and structural changes and can set a strategic direction that prioritizes the integration of QIPS education and clinical care. Such training efforts must explicitly emphasize partnerships with patients and families as integral to their daily improvement work. Similarly, capacity building among patients and families must expand significantly to activate them as key partners in QIPS education,36 as well as continuous improvement activities in general.37–40 Such programs need rigorous evaluation, however, to establish the most appropriate and effective ways to prepare patients and families for this partnership.41

Strategic direction 4: Align educational and patient outcomes to improve quality and patient safety

Successful integration of QIPS education and clinical care ideally results in the dual improvement of both educational and patient outcomes.42,43 Such alignment will make explicit the bidirectional nature of educational and patient outcomes. The action statements describe how exemplary organizations and training programs commit to measuring and reporting on educational outcomes and linking these to outcomes that matter to patients and their families. In the end, clinical care needs must be a key driver of QIPS educational processes. Continuous self-directed and external assessments of performance and structured gap closure must occur at all levels of the learning continuum. For graduate medical education, resident-sensitive quality measures,44 which reflect the work that residents perform, are a potentially promising approach to inform residents of their learning needs as they relate to QIPS. Another approach could involve expanding the use of patient-reported experiences (e.g., Consumer Assessment of Healthcare Providers and Systems45) or outcomes (e.g., patient-reported outcome measures46), which are rich and meaningful sources of data that can inform improvements to both QIPS education and care delivery. Not to be overlooked, continuing professional development activities informed explicitly by performance data will be integral to holding the gains of undergraduate and graduate medical education QIPS training.

Some have cautioned against evaluating the impact of education on patient outcomes,47 yet examples exist where QIPS education has realized this level of impact. For example, the I-PASS handover program combined teamwork-based handover education with institutional changes (e.g., integrating the written sign-out tool with the electronic medical record, workflow changes to avoid paging interruptions during resident handover) and culture change to improve resident handover skills (i.e., the “educational” outcome) while simultaneously reducing preventable adverse events, medical errors, and near misses (i.e., “clinical” or “patient” outcomes).48 Going back to the earlier example of the ambulatory long block reconfiguration at the University of Cincinnati, participating residents self-reported perceptions of improved learning, but, more important, numerous quality indicators for patients receiving care from residents working in this ambulatory clinic have improved and are now among the best in the entire health system.24

A Road Map to Achieve Greater Integration of QIPS Education and Clinical Care

Individuals and organizations can link the QIPS Education Integration Framework to one or more action statements from the 4 strategic directions to build a road map toward greater integration of QIPS education and clinical care. It is important to recognize that clinical microsystems within an organization are highly variable. The various training programs (e.g., internal medicine, general surgery, obstetrics–gynecology) and the diverse clinical environments (e.g., inpatient ward, ambulatory clinic, emergency department, labor and delivery suite) contribute to within-organization variability, so ideally the framework would be applied to the clinical subunit or microsystem within the larger organization that best corresponds to one’s scope of responsibility and authority. Figure 2 provides hypothetical examples to illustrate how road maps can be created.

Figure 2: Potential road maps to achieve greater integration of QIPS education and clinical care. These hypothetical examples demonstrate how organizations can develop a road map based on their starting quadrant on the QIPS Education Integration Framework. The quadrant determines the potential next steps, which then inform the choice of action statement that is most relevant. Abbreviation: QIPS indicates quality improvement and patient safety.
  • Step 1: Determine the organizational current state: Using the QIPS Education Integration Framework (Figure 1), determine the quadrant that best represents the organization’s current state with respect to QIPS education within the training program, the intensity of QIPS activity within the clinical environment, and the degree to which these are integrated. The optimal way for organizations to make this determination is uncertain. They could draw upon existing tools and processes, such as the ACGME CLER review to assess the current state of QIPS education,14 or the Agency for Healthcare Research and Quality Hospital Survey on Patient Safety Culture49 and the High-Value Care Culture Survey50 to assess the current state of QIPS in the clinical environment. Similarly, the Joint Commission developed the High-Reliability Health Care Maturity Model as a framework that defines progress toward high reliability.19 By interviewing a mix of patient safety managers, senior leaders, and clinicians with a special interest in patient safety, investigators applying this model were able to differentiate organizations by stage of maturity and progress toward high reliability.51
  • Step 2: Identify action statements that move the organization to quadrant 4: The choice of action statement(s) will depend largely on the organization’s starting point (i.e., the current quadrant). Select among the action statements to prioritize those changes that are most likely to bridge the divide between QIPS education and clinical care. In doing so, one matches the prioritized solutions to the most relevant areas of need and maximizes the return on the time and effort invested in implementing bridging activities (see Figure 2).
  • Step 3: Test and implement changes or interventions: Guided by the prioritized list of action statements, organizations next move to testing and evaluating specific changes or interventions. Where possible, organizations should seek out successful examples (e.g., by reviewing the literature or sharing knowledge at conferences) and adapt them to their local contexts. Where prior examples of successful interventions do not exist, organizations will need to develop novel and innovative approaches to achieve their integration goals.
  • Step 4: Disseminate work and lessons learned widely: Success at the clinical subunit level must be spread within and across organizations. Broader adoption of effective interventions will require rigorous evaluation efforts, drawing upon a variety of frameworks, including the modified Kirkpatrick framework,52 realist evaluation,53 and contribution analysis.54 Such evaluative efforts can also contribute to knowledge about how best to apply the QIPS Education Integration Framework and to improve upon it.

A Call to Action

After decades of work, many organizations can point to significant improvements in the quality and safety of the health care they provide. Others can point to great progress in QIPS education. Unfortunately, few have achieved both, and even fewer can demonstrate QIPS education and clinical care that are fully integrated and synergistic. The outcomes of the Building the Bridge consensus process can provide a road map toward greater integration of QIPS education and clinical care that is universally relevant and important. Therefore, a major transformation in educational and care delivery models is urgently needed and will require a major commitment and investment of resources to build capacity for change; to innovate, test, and implement interventions; and to align evaluative efforts to demonstrate impact on both learning and patient outcomes.

Too often, such calls to action are taken up by individual groups in an uncoordinated fashion, leading to sporadic examples of success and pockets of excellence. Building the Bridge efforts cannot afford to follow a similar path because the stakes for learners, patients, and families are simply too high. A critical next step will be to engage key stakeholder groups, including:

  • Health professions schools (e.g., medicine, nursing, pharmacy);
  • Academic medical centers and teaching hospitals (including organizations such as the Association of American Medical Colleges and the Association of Faculties of Medicine of Canada);
  • Regulatory bodies (e.g., the Joint Commission, Accreditation Canada, Federation of State Medical Boards, ACGME);
  • Certifying bodies (e.g., ABMS);
  • Payers (e.g., Centers for Medicare & Medicaid Services);
  • Health care quality and patient safety organizations (e.g., National Patient Safety Foundation, Institute for Healthcare Improvement, Canadian Patient Safety Institute); and
  • Patient groups (e.g., Patients for Patient Safety Canada)

Efforts to bring these stakeholder groups together through formal and informal networks can drive the transformation needed at the institutional, regional, and national levels and facilitate shared learning and widespread change.

There is also a risk of “preaching these recommendations to the converted.” In other words, attention must be paid to how we can motivate individuals, groups, and organizations that may not see an immediate need to make these changes. This will require a multifaceted approach that draws upon a variety of incentives. For some, these might be financial incentives or motivators, like the ones used by UCSF to encourage housestaff to address organizational quality issues. Pay-for-performance programs that provide financial rewards or penalties to individual health care providers or institutions according to how well they deliver on specific quality measures have become pervasive in numerous health care jurisdictions, yet there are relatively few examples where such programs have led to demonstrable improvements in actual patient outcomes.55 At a minimum, it behooves educational and care delivery systems to align their financial incentives and work together to achieve the common goal of improved learning and care.

While financial incentives may play a role, they are not why most health professionals get up every morning and work hard all day. Instead, health professionals are much more motivated by providing what is best for their patients and are frustrated when there is a gap between best care and their demonstrable results. Therefore, we firmly believe that we need to move beyond external motivators, emphasize intrinsic motivation, and harness the energy derived from fulfilling a shared purpose to improve patient outcomes and experiences. Part of what is inspiring and energy-giving about health care improvement is that it draws on learning and creativity, key elements that underpin intrinsic motivation.56 Thus, we must find ways to highlight these elements and make explicit moral arguments that clearly articulate our duty as health care professionals to build strong delivery systems because they lead to better care for patients and better education for learners.

Taken together, we hope these efforts will spur a collective international movement to shift the culture of educational and clinical environments, build bridges that connect training programs and clinical environments, align educational and health system priorities, and improve both learning and care, with improved outcomes and experiences for patients, their families, and communities.

Acknowledgments:

The authors would like to thank the 37 members of the Building the Bridge to Quality Program Advisory Board. We would also like to acknowledge Ms. Ginette Bourgeois of the Royal College of Physicians and Surgeons of Canada for the administrative support provided to the Building the Bridge to Quality initiative.

References

1. Kohn LT, Corrigan J, Donaldson MS. To Err Is Human: Building a Safer Health System. 2000. Washington, DC: National Academy Press.
2. Levine DM, Linder JA, Landon BE. The quality of outpatient care delivered to adults in the United States, 2002 to 2013. JAMA Intern Med. 2016;176:1778–1790.
3. Landrigan CP, Parry GJ, Bones CB, Hackbarth AD, Goldmann DA, Sharek PJ. Temporal trends in rates of patient harm resulting from medical care. N Engl J Med. 2010;363:2124–2134.
4. Baines R, Langelaan M, de Bruijne M, Spreeuwenberg P, Wagner C. How effective are patient safety initiatives? A retrospective patient record review study of changes to patient safety over time. BMJ Qual Saf. 2015;24:561–571.
5. Crosson FJ, Leu J, Roemer BM, Ross MN. Gaps in residency training should be addressed to better prepare doctors for a twenty-first-century delivery system. Health Aff (Millwood). 2011;30:2142–2148.
6. Wagner R, Koh NJ, Patow C, Newton R, Casey BR, Weiss KB; CLER Program. Detailed findings from the CLER National Report of Findings 2016. J Grad Med Educ. 2016;8(2 suppl 1):35–54.
7. Moran KM, Harris IB, Valenta AL. Competencies for patient safety and quality improvement: A synthesis of recommendations in influential position papers. Jt Comm J Qual Patient Saf. 2016;42:162–169.
8. Headrick LA, Baron RB, Pingleton SK, et al. Teaching for Quality: Integrating Quality Improvement and Patient Safety Across the Continuum of Medical Education. 2013. Washington, DC: Association of American Medical Colleges.
9. Greiner AC, Knebel E. Health Professions Education: A Bridge to Quality. 2003. Washington, DC: National Academies Press.
10. Gonzalo JD, Lucey C, Wolpaw T, Chang A. Value-added clinical systems learning roles for medical students that transform education and health: A guide for building partnerships between medical schools and health systems. Acad Med. 2017;92:602–607.
11. Holmboe ES, Batalden P. Achieving the desired transformation: Thoughts on next steps for outcomes-based medical education. Acad Med. 2015;90:1215–1223.
12. Holmboe E, Ginsburg S, Bernabeo E. The rotational approach to medical education: Time to confront our assumptions? Med Educ. 2011;45:69–80.
13. Myers JS, Tess A, Glasheen JJ, et al. The Quality and Safety Educators Academy: Fulfilling an unmet need for faculty development. Am J Med Qual. 2014;29:5–12.
14. Nasca TJ, Weiss KB, Bagian JP. Improving clinical learning environments for tomorrow’s physicians. N Engl J Med. 2014;370:991–993.
15. Weiss KB, Co JPT, Bagian JP; CLER Evaluation Committee. Challenges and opportunities in the 6 focus areas: CLER National Report of Findings 2018. J Grad Med Educ. 2018;10(suppl 4):25–48.
16. Asch DA, Nicholson S, Srinivas S, Herrin J, Epstein AJ. Evaluating obstetrical residency programs using patient outcomes. JAMA. 2009;302:1277–1283.
17. Chen C, Petterson S, Phillips R, Bazemore A, Mullan F. Spending patterns in region of residency training and subsequent expenditures for care provided by practicing physicians for Medicare beneficiaries. JAMA. 2014;312:2385–2393.
18. Headrick LA, Ogrinc G, Hoffman KG, et al. Exemplary Care and Learning Sites: A model for achieving continual improvement in care and learning in the clinical setting. Acad Med. 2016;91:354–359.
19. Chassin MR, Loeb JM. High-reliability health care: getting there from here. Milbank Q. 2013;91:459–490.
20. Olsen L, Aisner D, McGinnis JM; Institute of Medicine (U.S.) Roundtable on Evidence-Based Medicine. The Learning Healthcare System: Workshop Summary. 2007. Washington, DC: National Academies Press.
21. Accreditation Council for Graduate Medical Education. Clinical learning environment review. https://www.acgme.org/What-We-Do/Initiatives/Clinical-Learning-Environment-Review-CLER. Accessed July 30, 2019.
22. Vidyarthi AR, Green AL, Rosenbluth G, Baron RB. Engaging residents and fellows to improve institution-wide quality: The first six years of a novel financial incentive program. Acad Med. 2014;89:460–468.
23. Tess AV, Yang JJ, Smith CC, Fawcett CM, Bates CK, Reynolds EE. Combining clinical microsystems and an experiential quality improvement curriculum to improve residency education in internal medicine. Acad Med. 2009;84:326–334.
24. Zafar MA, Diers T, Schauer DP, Warm EJ. Connecting resident education to patient outcomes: The evolution of a quality improvement curriculum in an internal medicine residency. Acad Med. 2014;89:1341–1347.
25. Patel N, Brennan PJ, Metlay J, Bellini L, Shannon RP, Myers JS. Building the pipeline: The creation of a residency training pathway for future physician leaders in health care quality. Acad Med. 2015;90:185–190.
26. Watts BV, Paull DE, Williams LC, Neily J, Hemphill RR, Brannen JL. Department of Veterans Affairs Chief Resident in Quality and Patient Safety Program: A model to spread change. Am J Med Qual. 2016;31:598–600.
27. Fleischut PM, Evans AS, Nugent WC, et al. Ten years after the IOM report: Engaging residents in quality and patient safety by creating a House Staff Quality Council. Am J Med Qual. 2011;26:89–94.
28. Myers JS, Tess AV, McKinney K, et al. Bridging leadership roles in quality and patient safety: Experience of 6 US academic medical centers. J Grad Med Educ. 2017;9:9–13.
29. Staiger TO, Mills LM, Wong BM, Levinson W, Bremner WJ, Schleyer AM. Recognizing quality improvement and patient safety activities in academic promotion in departments of medicine: Innovative language in promotion criteria. Am J Med. 2016;129:540–546.
30. Baxley EG, Lawson L, Garrison HG, et al. The teachers of quality academy: A learning community approach to preparing faculty to teach health systems science. Acad Med. 2016;91:1655–1660.
31. Wong BM, Goldman J, Goguen JM, et al. Faculty–resident “co-learning”: A longitudinal exploration of an innovative model for faculty development in quality improvement. Acad Med. 2017;92:1151–1159.
32. Hall LW, Headrick LA, Cox KR, Deane K, Gay JW, Brandt J. Linking health professional learners and health care workers on action-based improvement teams. Qual Manag Health Care. 2009;18:194–201.
33. Sargeant J, Wong BM, Campbell CM. CPD of the future: A partnership between quality improvement and competency-based education. Med Educ. 2018;52:125–135.
34. Oyler J, Vinci L, Arora V, Johnson J. Teaching internal medicine residents quality improvement techniques using the ABIM’s practice improvement modules. J Gen Intern Med. 2008;23:927–930.
35. American Board of Medical Specialties. ABMS Multi-Specialty Portfolio Program. https://www.abms.org/initiatives/committing-to-physician-quality-improvement/multi-specialty-portfolio-program. Accessed July 30, 2019.
36. Jha V, Buckley H, Gabe R, et al. Patients as teachers: A randomised controlled trial on the use of personal stories of harm to raise awareness of patient safety for doctors in training. BMJ Qual Saf. 2015;24:21–30.
37. Batalden M, Batalden P, Margolis P, et al. Coproduction of healthcare service. BMJ Qual Saf. 2016;25:509–517.
38. Fulmer T, Gaines M. Partnering With Patients, Families and Communities to Link Interprofessional Practice and Education. 2014. New York, NY: Josiah Macy, Jr. Foundation.
39. Cox M, Naylor M. Transforming Patient Care: Aligning Interprofessional Education With Clinical Practice Redesign. Proceedings of a Conference Sponsored by the Josiah Macy Jr. Foundation. 2013. New York, NY: Josiah Macy Jr. Foundation.
40. Carman KL, Dardess P, Maurer M, et al. Patient and family engagement: A framework for understanding the elements and developing interventions and policies. Health Aff (Millwood). 2013;32:223–231.
41. Wong BM, Ackroyd-Stolarz S, Bukowskyi M, et al. Integrating Patient Safety and Quality Improvement Into CanMEDS 2015: Preliminary Recommendations to the Expert Working Group Chairs. 2014. Ottawa, ON, Canada: Royal College of Physicians and Surgeons of Canada.
42. Chahine S, Kulasegaram KM, Wright S, et al. A call to investigate the relationship between education and health outcomes using big data. Acad Med. 2018;93:829–832.
43. Wong BM, Holmboe ES. Transforming the academic faculty perspective in graduate medical education to better align educational and clinical outcomes. Acad Med. 2016;91:473–479.
44. Schumacher DJ, Holmboe ES, van der Vleuten C, Busari JO, Carraccio C. Developing resident-sensitive quality measures: A model from pediatric emergency medicine. Acad Med. 2018;93:1071–1078.
45. Cleary PD, Crofton C, Hays RD, Horner R. Advances from the Consumer Assessment of Healthcare Providers and Systems (CAHPS®) project. Introduction. Med Care. 2012;50(suppl 1):50.
46. Basch E. Patient-reported outcomes—Harnessing patients’ voices to improve clinical care. N Engl J Med. 2017;376:105–108.
47. Cook DA, West CP. Perspective: Reconsidering the focus on “outcomes research” in medical education: A cautionary note. Acad Med. 2013;88:162–167.
48. Starmer AJ, Spector ND, Srivastava R, et al.; I-PASS Study Group. Changes in medical errors after implementation of a handoff program. N Engl J Med. 2014;371:1803–1812.
49. Jones KJ, Skinner A, Xu L, Sun J, Mueller K. The AHRQ Hospital Survey on Patient Safety Culture: A tool to plan and evaluate patient safety programs. In: Henriksen K, Battles JB, Keyes MA, Grady ML, eds. Advances in Patient Safety: New Directions and Alternative Approaches. Vol. 2: Culture and Redesign. 2008. Rockville, MD; 1–22.
50. Gupta R, Moriates C, Harrison JD, et al. Development of a high-value care culture survey: A modified Delphi process and psychometric evaluation. BMJ Qual Saf. 2017;26:475–483.
51. Sullivan JL, Rivard PE, Shin MH, Rosen AK. Applying the High Reliability Health Care Maturity Model to assess hospital performance: A VA case study. Jt Comm J Qual Patient Saf. 2016;42:389–411.
52. Headrick LA, Paull DE, Weiss KB. Patient safety & quality of care. In: Dent JA, Harden RM, Hunt D, eds. A Practical Guide for Medical Teachers. 5th ed. 2017. Edinburgh, UK: Elsevier; 215–221.
53. Wong G, Greenhalgh T, Westhorp G, Pawson R. Realist methods in medical education research: What are they and what can they contribute? Med Educ. 2012;46:89–96.
54. Van Melle E, Gruppen L, Holmboe ES, Flynn L, Oandasan I, Frank JR; International Competency-Based Medical Education Collaborators. Using contribution analysis to evaluate competency-based medical education programs: It’s all about rigor in thinking. Acad Med. 2017;92:752–758.
55. Mendelson A, Kondo K, Damberg C, et al. The effects of pay-for-performance programs on health, health care use, and processes of care: A systematic review. Ann Intern Med. 2017;166:341–353.
56. Dethmer J, Chapman D, Klemp K. The 15 Commitments of Conscious Leadership: A New Paradigm for Sustainable Success. 2015. United States: Conscious Leadership Group.


Copyright © 2019 by the Association of American Medical Colleges