Research has revealed that the quality of health care delivered in the United States is suboptimal.1–3 In response, health care systems have deployed significant resources to improve patient care and to comply with newly enacted quality regulations.4,5 Acknowledging the physician’s role in improving quality, the Accreditation Council for Graduate Medical Education (ACGME) has introduced mandates for residents and fellows to attain competency in systems-based practice (SBP), defined as knowledge of the environmental context and systems within which they function.6 On the horizon are additional proposals, such as the Medicare Payment Advisory Commission’s recommendation on linking quality outcomes to remuneration for teaching hospitals7 and the ACGME’s stipulations for teaching hospitals to demonstrate how trainees use data to improve systems of care.8
Accordingly, teaching hospitals and graduate medical education (GME) programs are dually motivated to engage residents and fellows in improving hospital-wide quality and to facilitate their development of key quality improvement (QI) skills. Many recent efforts have been limited in scope9,10 and have revealed the challenges of engaging frontline physicians in QI.11,12 One promising arena is the use of performance-based incentives, which has yielded positive results in motivating physicians to improve care quality.13–15
To include residents and fellows in hospital-wide QI efforts and to provide them with opportunities to gain SBP experience, we designed a QI program that combines experiential learning and financial incentives. We hypothesized that an institution-wide QI structure incorporating these elements could help our institution’s medical center and its GME programs successfully meet their collective QI challenges. Here we describe the program we developed and report on the outcomes of the first six years of the ongoing University of California, San Francisco (UCSF) Resident and Fellow QI Incentive Program.
Development of the QI Incentive Program
The UCSF School of Medicine has 25 accredited residency programs and 55 accredited fellowships, as well as 40 non-ACGME/non–American Board of Medical Specialties programs. The UCSF Medical Center (UCSFMC) is an urban, 600-bed, tertiary care adult and children’s hospital, where each year approximately 900 residents and fellows spend at least 12 weeks of their training time.
In 2001, the UCSFMC implemented the Incentive Award Program (IAP) for all nonphysician staff. This program, which is designed to engage and align employees with the medical center’s mission, provides a financial bonus to managers and staff if UCSFMC-wide patient safety, quality, patient satisfaction, and financial performance goals are achieved. Although collective efforts are required to meet certain goals (e.g., patient satisfaction, rate of pneumococcal vaccination), physicians (including residents and fellows) were not included in this bonus program.
In 2006, the associate dean for GME (R.B.) and the UCSFMC chief executive officer proposed a variant of the IAP that would include residents and fellows. This generated debate among Graduate Medical Education Committee (GMEC) members. Proponents argued that engaging residents and fellows in QI was necessary and that financial incentives, as a mainstay of current medical practice, should be mirrored during training. Opponents rejoined that financial incentives were unprofessional and unnecessary; trainees should not be paid to “do the right thing,” and such incentives could detract from core training. After substantial discussion, the GMEC passed the proposal, launching the QI incentive program at the start of fiscal year (FY) 2007.
Development process and structure
We developed the UCSF Resident and Fellow QI Incentive Program iteratively, with lessons learned informing subsequent program revisions. Retrospectively, program development can be divided into three phases, which are described below. As the QI incentive program was developed, the GME director of quality and safety programs (A.R.V.) and associate dean of GME (R.B.) met regularly with the UCSFMC associate chief medical officer (A.G.) and the UCSFMC director of patient safety and quality to discuss proposed goals (which were, in turn, vetted by the UCSFMC chief medical officer and the chief executive officer), develop the structure, review data, and implement programmatic changes.
Phase 1: QI incentive program launch with all-program project goals (FY 2007).
In the first phase, we determined eligibility and set project goals that encompassed all training programs (all-program project goals). Any resident or fellow who spent a minimum of 12 weeks per academic year providing care at the UCSFMC was deemed eligible. We set three goals that paralleled goals in the staff IAP, and the UCSFMC collected the data. For each project goal achieved, all eligible residents and fellows received $400, paid by the UCSFMC as a one-time bonus (maximum of $1,200/trainee).
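The payment rule described above is simple enough to express directly. The sketch below is illustrative only — it is not the UCSFMC’s actual payroll logic, and the names are ours — but it captures the per-trainee bonus as a function of goals met:

```python
# Illustrative sketch of the Phase 1 bonus rule described above.
# Constants reflect the text: $400 per all-program goal achieved,
# three goals set, for a maximum one-time bonus of $1,200.
BONUS_PER_GOAL = 400
MAX_GOALS = 3

def trainee_bonus(goals_met: int) -> int:
    """One-time bonus (in dollars) for a single eligible resident/fellow."""
    return BONUS_PER_GOAL * min(goals_met, MAX_GOALS)

print(trainee_bonus(2))  # 800
print(trainee_bonus(3))  # 1200 (the stated $1,200 maximum)
```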
Phase 2: All-program project goal domains and resident/fellow input (FYs 2007–2012).
In the second phase, we further delineated three domains of all-program project goals: patient satisfaction, quality/safety, and operation/utilization. We determined that the patient satisfaction project goal should parallel that set for the staff IAP, to align everyone under a common objective. For the other two domains, we aimed to create project goals that would be relevant to residents and fellows, within their control, and of high UCSFMC strategic priority. We collected resident and fellow input for these goals through surveys as well as from the Resident and Fellow Council and the chief residents. All residents and fellows who met the service requirement (described above) were automatically enrolled into the all-program arm. As the measures and data collected by the UCSFMC reflected the care delivered by and efforts of the group collectively, the eligible residents and fellows were treated as one cohort for payment (i.e., they could not opt out of the program).
Phase 3: Addition of program-specific project goals (FYs 2010–2012).
To further engage residents and fellows and to identify project goals more proximal to their experience, we added a second arm to the QI incentive program focusing on project goals specific to training programs (program-specific project goals). Trainees were encouraged to work with their peers, faculty mentors, and training program leadership to create a project goal specific to their department/program. Training programs that chose to participate in this arm created a proposal describing their project goal and designated a faculty mentor and resident/fellow “champion(s)” as project leader(s). These proposals were presented by the champions to, and underwent iterative review by, the Resident and Fellow Incentive Committee, comprising UCSFMC and GME leadership, peer-selected representatives, QI experts, and GMEC members. Proposals were evaluated on five criteria: importance of topic, educational value, medical center and department alignment, feasibility of measurement, and resident/fellow involvement. The champions and faculty mentors of accepted proposals were invited to GME-sponsored seminars where they could build QI skills to impart to other participating residents/fellows (a “train the trainer” model) and collaborate with other champions and mentors. They also met with the GME director of quality and safety (A.R.V.; later, G.R.) for feedback and mentoring.
All-program project goals
To evaluate whether the Resident and Fellow QI Incentive Program’s all-program project goals were met during FYs 2007–2012, we gathered data from a variety of sources. The data collected, data sources, and analytics specific to each project goal are described below. The all-program project goals are fully described in Table 1.
Patient satisfaction.
Data for this domain were derived from one item on the UCSFMC patient satisfaction survey—“Likelihood of your recommending this hospital to others.” Surveys were collected and analyzed by Press-Ganey, a national patient satisfaction survey company. Data provided to UCSFMC were expressed as respondents’ mean score and a percentile ranking that allowed comparison with hospitals in the University HealthSystem Consortium (www.uhc.edu) that also use the Press-Ganey survey.
Quality/safety.
Project goals in this domain were related to Joint Commission core measures, patient satisfaction with pain control, resident/fellow influenza vaccination rates, resident/fellow completion of an infection control module, and hand hygiene. Joint Commission core measures data were collected and analyzed by the UCSFMC Patient Safety and Quality Department through administrative databases and chart abstraction. Patient satisfaction with pain control was evaluated through responses to the item “How well your pain was controlled” on the aforementioned patient satisfaction survey; these data were collected and analyzed by Press-Ganey. Influenza vaccination rates were determined by the percentage of residents and fellows who received the vaccination at the medical center, attested to receiving it through another source, or submitted a declination of vaccination form. Infection control module completion data were collected through the UCSFMC learning management system and were reported as a percentage of the total resident/fellow cohort. For these two measures, data were collected and analyzed by the Patient Safety and Quality Department. (We also analyzed faculty rates for the purposes of comparison, although faculty were not included in the QI incentive program.)
Hand hygiene data were collected through direct observation of all health care professionals, performed on all hospital units by individuals trained on the observation protocol. These data were analyzed by the Patient Safety and Quality Department and reported as the percent compliance of all health care professionals (including physicians).
Operation/utilization.
Project goals in this domain related to documentation standards, compliance with Centers for Medicare and Medicaid Services (CMS) Conditions of Participation, and lab work. Documentation standards were evaluated by chart reviews conducted by UCSFMC staff. CMS compliance was evaluated by the outcome of a full CMS survey. Lab utilization data were derived from administrative data, analyzed by UCSFMC Decision Support Services, and reported as tests per patient per inpatient day (tests/patient/day).
Data pertaining to the project goals in these three domains were compiled by the UCSFMC into a scorecard that was distributed quarterly by the GME office to all UCSF residents and fellows, core faculty, chief residents, and department/program leaders. For a sample scorecard, see Supplemental Digital Figure 1 at http://links.lww.com/ACADMED/A182.
Program-specific project goals
Residents and fellows collected data for program-specific projects from a variety of sources. These included UCSFMC administrative databases, chart abstraction, and individually designed collection techniques. Data were collated by the project champions and submitted to the GME office quarterly, where they were compiled into a scorecard and distributed, as described above.
To capture the strategies that residents and fellows employed to achieve goals, the GME director of quality and safety programs (A.R.V./G.R.) discussed all-program projects regularly with chief residents and interviewed champions of program-specific projects. The interviews were transcribed and comments were categorized into themes by two independent reviewers. Any discrepancies were discussed until consensus was reached.
This project was deemed exempt from Committee on Human Research approval.
Results
In the first six years of the Resident and Fellow QI Incentive Program (FYs 2007–2012), 5,275 residents and fellows (annual average = 880 trainees) were eligible and participated in the all-program arm. Over the first three years of the program-specific arm (FYs 2010–2012), 540 residents and fellows participated, representing 16 (88%) of the 18 eligible residencies and the Department of Medicine fellowship programs. Training program participation in the program-specific arm grew by one-third each year (FY 2010 = 9 programs, FY 2011 = 12 programs, FY 2012 = 16 programs). On average, $724,450 per FY was paid in bonuses (approximately $800 to each eligible resident/fellow for two goals met/FY).
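As a rough consistency check — our own back-of-the-envelope arithmetic, not an analysis performed by the program — the average payout per trainee implied by these figures matches the “approximately $800” reported:

```python
# Back-of-the-envelope check using the figures reported in the text.
total_trainees = 5275                      # eligible participants over six FYs
avg_trainees_per_fy = total_trainees / 6   # ≈ 879, matching "annual average = 880"
avg_payout_per_fy = 724450                 # average bonuses paid per FY, in dollars

avg_per_trainee = avg_payout_per_fy / avg_trainees_per_fy
print(round(avg_per_trainee))  # 824, consistent with ~$800 for two $400 goals met
```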
All-program project goals
Residents and fellows achieved 11 (61%) of the 18 all-program project goals during FYs 2007–2012. Domain-specific outcomes are highlighted below. Detailed outcomes for each goal in each domain are provided in Table 1.
Patient satisfaction.
The patient satisfaction project goal was met in four of the six years. The average mean score on patients’ likelihood of recommending the UCSFMC to others increased year-on-year.
Quality/safety.
This domain included four project goals attempted over the six years. Goals were met in three of the years. In FY 2007, compliance with Joint Commission core measures was > 90% for > 2 months, meeting that year’s goal. Pain control as reported by patients did not achieve the goal set (a 75th percentile ranking) for FY 2008 (71st percentile) or FY 2009 (63rd percentile). In FY 2010, 95% of residents and fellows complied with influenza vaccination, 86% of residents and fellows completed the infection control module, and 50.2% of health care providers (including residents and fellows) were compliant with hand hygiene standards. Although the composite of these three metrics did not achieve the year’s goal (85% composite compliance), the resident and fellow scores were higher than those of the faculty (faculty vaccination: 79%; faculty module completion: 29%). Hand hygiene compliance was greater than the goal of 85% for 7 months in FY 2011 and for all 12 months of FY 2012.
Operation/utilization.
This domain included four project goals attempted over the six years. Goals were met in four of six years. In FY 2007, residents and fellows achieved > 2 months of documentation standard compliance for 21 of 30 elements, falling short of the goal (> 2 months for all 30 elements). In FY 2008, the UCSFMC met the goal of full compliance with the CMS Conditions of Participation. In FY 2009, residents and fellows maintained an average of 95% compliance with all elements of complete order writing (increased from a baseline of < 50%), thus meeting that year’s goal. In FYs 2010 and 2011 they met the lab utilization goal by decreasing test orders compared with the prior year: In FY 2010 they decreased CBC and CBC plus differential tests by at least 5% (1.07 to 0.99 tests/patient/day), and in FY 2011 they decreased common tests by at least 5% (common electrolytes and CBC; 9.67 to 8.89 tests/patient/day). In FY 2012, however, they did not meet the goal as test ordering remained essentially unchanged (8.89 to 8.90 tests/patient/day).
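The lab utilization goals above hinge on simple year-over-year percent-change arithmetic. A brief sketch (ours, for illustration only) confirms the outcomes stated:

```python
def percent_decrease(prior: float, current: float) -> float:
    """Year-over-year percent decrease in tests/patient/day."""
    return (prior - current) / prior * 100

# FY 2010: CBC and CBC plus differential, 1.07 -> 0.99 tests/patient/day
print(round(percent_decrease(1.07, 0.99), 1))  # 7.5 -> >= 5% goal met
# FY 2011: common tests, 9.67 -> 8.89
print(round(percent_decrease(9.67, 8.89), 1))  # 8.1 -> goal met
# FY 2012: 8.89 -> 8.90, essentially unchanged (a slight increase)
print(round(percent_decrease(8.89, 8.90), 1))  # -0.1 -> goal not met
```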
Program-specific project goals
Thirty-seven program-specific projects were completed during FYs 2010–2012, and the goals were achieved for 28 (76%; see Table 2 for specific project goals and outcomes). The program-specific projects fit into four broad categories: workflow improvements, enhanced communication, effective documentation, and patient-level interventions. Below, we describe a representative goal and outcome for each category.
An example of a workflow improvement was the dermatology residents’ project goal of decreasing clinic wait times by 25%. The residents exceeded this goal, reducing wait times by 41% (from 23.44 minutes to 13.8 minutes) in the target clinic. To enhance communication, the radiology residents attempted to report critical results in ≤ 60 minutes for 95% of eligible cases; they achieved this in 97.3% of cases.
In terms of effective documentation, Department of Medicine fellows set out to improve the quality of written consultation notes by 20% in FY 2011. They developed a rubric to evaluate required elements for notes16 and used it to analyze notes after acquiring a baseline (60% of notes with all required elements) and completing educational interventions. Although they did not meet their goal in the year they set it—there was a decrease of 2.3%—they have continued the project and related interventions to date.
As a patient-level intervention, the neurology residents aimed to decrease the duration of nicardipine use for patients with intracranial hemorrhage to < 72 hours. The residents met their goal by decreasing duration to 46.7 hours.
Residents and fellows described a wide variety of strategies used to achieve their project goals. We categorized these as information distribution, system-wide modifications, and audit/feedback (Table 3). Almost all of the training programs used existing conferences to disseminate QI project information and deliver relevant education. System-wide modifications were largely dependent on departmental resources. For example, the Department of Medicine introduced a new discharge summary template into the electronic medical record, thus facilitating the department’s program-specific project goal for FY 2011. The Department of Neurosurgery modified the responsibilities of certain residents to facilitate completed chart consent forms to improve operating room start times. With regard to audit/feedback, all training programs collected and reported data back to their residents and fellows, although communication mechanisms and frequency varied by program.
Discussion
In its first six years, the UCSF Resident and Fellow QI Incentive Program provided a structure for thousands of residents and fellows to inform the QI agenda of the UCSFMC, to gain SBP experience, and to improve the quality of care delivered across the institution, for which they were financially rewarded.
This program has cultivated collaboration between the medical center and GME, which is essential to QI and for trainees’ development of SBP competency.17–19 Although engaging frontline physicians in QI efforts is challenging,11,20 the vast majority of our residents participated during the first six years. The program gives residents and fellows the chance to voice their insights21 and agency to drive relevant QI projects. Residents and fellows across specialties have advanced dozens of projects, which differentiates our program from other QI educational interventions.22–26 As this study shows, our program also provides residents and fellows with opportunities to identify and act on system errors, advocate for optimal systems, coordinate specialty-relevant care within the system, and incorporate cost-awareness considerations—all of which are components of SBP.6 These aspects make our incentive program particularly relevant to the growing number of academic physicians who face challenges working at the intersection of QI and medical education.27
Our program provides a structure, support, and experience in SBP that we believe will foster learning and, by extension, competence in this arena. Its real-time, problem-focused, self-directed pedagogy mirrors the resident/fellow clinical learning environment and is congruent with adult learning theory.28 Although this study did not assess the acquisition of knowledge, skills, attitudes, or enhanced competence during the first six years of the program, these outcomes are important to assess rigorously in the future, as it will be necessary to report the educational impact of programs designed to meet existing and upcoming ACGME requirements.6,8
Although financial incentives are debated in the literature,29 they are commonplace in medical practice and seemingly effective in improving quality.30,31 Our program’s financial incentives appear to heighten the value of QI involvement, which theoretically influences behavior change32 and may have acted as a motivational instrument for the busy residents and fellows. Although some QI policies are moving toward financial penalties for failure to reach goals, we believe that doing so may have unintended consequences33; therefore, we did not include financial “disincentives” in our program.
There was also debate regarding financial incentives within our own institution. Some training program directors initially disagreed philosophically with providing a financial bonus to residents and fellows, but they agreed that a central QI program laying the foundation for experiential learning in SBP was valuable. Subsequently, with a substantial majority of members voting in favor, the GMEC agreed to implement the financial incentive.
The UCSFMC’s investment in the QI incentive program has yielded high returns. Financially, there have been potential positive impacts on cost avoidance (e.g., regulatory citations), cost savings (e.g., on-time surgical starts), and revenue generation (e.g., patient referral from enhanced satisfaction).34 In the face of current Medicare payment structures that bundle payments based on disease and withhold payment for errors,35,36 as well as proposed links between payment and quality metrics,7 the QI incentive program’s positive financial impact may well increase.
Arguably, the highest returns on this investment are not monetary. During the period studied, residents and fellows worked on 55 projects and achieved 71% of the goals set. Although we cannot be certain of the extent to which residents and fellows contributed to some outcomes (e.g., patient satisfaction), other projects for which goals were met—such as decreasing dermatology clinic wait time and improving ER room-to-exam time—had been unsuccessfully attempted by the UCSFMC previously (using outside consultants). From the UCSFMC leadership’s perspective, this program has heightened residents’ and fellows’ QI awareness, led to alliances between on-the-ground QI professionals and frontline physicians, and created a cohort of individuals armed with experience to tackle our institution’s next QI challenges. The program’s success is reflected in its continued funding and position as the cornerstone of UCSF’s GME quality and safety programs.
Although the majority of project goals were met during the study period, 29% were not. We posit that there are four key factors of project success: importance to residents and fellows, availability of mentors, department buy-in, and measurement assistance. We observed that project goals lacking perceived value did not gain traction among residents and fellows. This informed our decision to add the program-specific arm and to incorporate chief residents and the Resident and Fellow Council into the project selection process. We also noted that mentor involvement varied and that projects with involved faculty mentors tended to succeed. We therefore designed GME QI seminars so that project champions could leverage mentors (including peer mentors), and we added one-on-one mentoring with the GME director of quality and safety. Future iterations of the QI incentive program will include faculty mentor development. In addition, high-level buy-in from department leadership allowed champions to move quickly on project work and facilitated success. Finally, after we observed that programs that used independently designed data collection techniques had difficulty achieving their goals, we encouraged champions to consider aligning their projects with projects for which data were already being collected.
This study has limitations. We did not initially design this program as a research study; only after observing that the gains remained robust over time did we overlay an assessment strategy. As such, we did not have a control group to account for the influence of secular changes or the Hawthorne effect, and further evaluation will be necessary to better understand potential confounders. We also cannot demonstrate a clear association between the financial incentive and project results. Nonetheless, we believe these results would have been unlikely absent this program, as our residents and fellows previously lacked a structure or incentive to work on QI, and providing incentives to take on this additional work is a recommended strategy to maintain involvement.26 The residents and fellows received bonus payments in an all-for-one fashion regardless of their individual efforts; this structure, though, fosters collective teamwork and reflects other financial incentive programs.37 In addition, we were unable to conduct a robust return-on-investment analysis because of the lack of cost transparency; further attempts and modeling will be important in the future. Finally, as noted above, our current findings do not provide insight into residents’ and fellows’ learning, satisfaction, perceptions of the motivating effect of the incentive, or enhanced competency in SBP. These facets will be important to assess rigorously as the QI incentive program continues.
Providers in the United States must transform health care by reducing complexity, decreasing waste, and improving quality.37 For the teaching hospitals that provide care to 18.5 million patients annually,38 engaging trainees in QI efforts that address real problems in real time will help ensure that the next generation of physicians is properly equipped to drive the health care system into the future.39 Programs like ours are likely to return value and to serve as steps forward in accomplishing the transformation.
Acknowledgments: The authors would like to thank the University of California, San Francisco, Resident and Fellow Council, specifically Dr. Patrick Guffey and Dr. Seema Nagpal, for project development; Mark Laret, chief executive officer, Joshua Adler, MD, chief medical officer, and Ernie Ring, MD, former chief medical officer, of the University of California, San Francisco Medical Center for collaborative program inception and ongoing support; Herodia Allen for administrative support; Paul Day for administrative support and data compilation; and Amy J. Markowitz, JD, for editorial assistance on the manuscript.
1. McGlynn EA, Asch SM, Adams J, et al. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348:2635–2645
2. Institute of Medicine Committee on Quality of Health Care in America. Crossing the Quality Chasm: A New Health System for the 21st Century. 2001 Washington, DC National Academy Press
3. Wachter RM. Patient safety at ten: Unmistakable progress, troubling gaps. Health Aff (Millwood). 2010;29:165–173
8. Weiss KB, Wagner R, Nasca TJ. Development, testing, and implementation of the ACGME clinical learning environment review (CLER) program. J Grad Med Educ. 2012;4:396–398
9. Lee AG, Beaver HA, Greenlee E, et al. Teaching and assessing systems-based competency in ophthalmology residency training programs. Surv Ophthalmol. 2007;52:680–689
10. Tess AV, Yang JJ, Smith CC, Fawcett CM, Bates CK, Reynolds EE. Combining clinical microsystems and an experiential quality improvement curriculum to improve residency education in internal medicine. Acad Med. 2009;84:326–334
11. Silversin J. Leading Physicians Through Change: How to Achieve and Sustain Results. 2012 Tampa, Fla ACPE
12. Boonyasai RT, Windish DM, Chakraborti C, Feldman LS, Rubin HR, Bass EB. Effectiveness of teaching quality improvement to clinicians. JAMA. 2007;298:1023–1037
13. Greene SE, Nash DB. Pay for performance: An overview of the literature. Am J Med Qual. 2009;24:140–163
14. Lindenauer PK, Remus D, Roman S, et al. Public reporting and pay for performance in hospital quality improvement. N Engl J Med. 2007;356:486–496
15. Petersen LA, Woodard LD, Urech T, Daw C, Sookanan S. Does pay-for-performance improve the quality of health care? Ann Intern Med. 2006;145:265–272
16. Tuot DS, Sehgal NL, Neeman N, Auerbach A. Enhancing quality of trainee-written consultation notes. Am J Med. 2012;125:649–652
18. National Patient Safety Foundation, Lucian Leape Institute. Unmet Needs: Teaching Physicians to Provide Safe Patient Care. 2010 Boston, Mass National Patient Safety Foundation
19. Jenson HB, Dorner D, Hinchey K, Ankel F, Goldman S, Patow C. Integrating quality improvement and residency education: Insights from the AIAMC National Initiative about the roles of the designated institutional official and program director. Acad Med. 2009;84:1749–1756
20. Pronovost PJ, Miller MR, Wachter RM, Meyer GS. Perspective: Physician leadership in quality. Acad Med. 2009;84:1651–1656
21. Ashton CM. “Invisible” doctors: Making a case for involving medical residents in hospital quality improvement programs. Acad Med. 1993;68:823–824
22. Kim CS, Lukela MP, Parekh VI, et al. Teaching internal medicine residents quality improvement and patient safety: A lean thinking approach. Am J Med Qual. 2010;25:211–217
23. Oyler J, Vinci L, Arora V, Johnson J. Teaching internal medicine residents quality improvement techniques using the ABIM’s practice improvement modules. J Gen Intern Med. 2008;23:927–930
24. Holmboe ES, Prince L, Green M. Teaching and improving quality of care in a primary care internal medicine residency clinic. Acad Med. 2005;80:571–577
25. Peters AS, Kimura J, Ladden MD, March E, Moore GT. A self-instructional model to teach systems-based practice and practice-based learning and improvement. J Gen Intern Med. 2008;23:931–936
26. Fleischut PM, Evans AS, Nugent WC, et al. Ten years after the IOM report: Engaging residents in quality and patient safety by creating a house staff quality council. Am J Med Qual. 2011;26:89–94
27. Shojania KG, Levinson W. Clinicians in quality improvement: A new career pathway in academic medicine. JAMA. 2009;301:766–768
28. Merriam S. Adult learning theory for the twenty-first century. New Dir Adult Contin Educ. 2008;2008(119):93–98 doi: 10.1002/ace.309.
29. Jha AK, Joynt KE, Orav EJ, Epstein AM. The long-term effect of premier pay for performance on patient outcomes. N Engl J Med. 2012;366:1606–1615
30. Sutton M, Nikolova S, Boaden R, Lester H, McDonald R, Roland M. Reduced mortality with hospital pay for performance in England. N Engl J Med. 2012;367:1821–1828
31. Flodgren G, Eccles MP, Shepperd S, Scott A, Parmelli E, Beyer FR. An overview of reviews evaluating the effectiveness of financial incentives in changing healthcare professional behaviours and patient outcomes. Cochrane Database Syst Rev. 2011;(7):CD009255
32. Blue C. The predictive capacity of the theory of reasoned action and the theory of planned behavior in exercise research: An integrated literature review. Res Nurs Health. 1995;18:105–112
33. Mookherjee S, Vidyarthi AR, Ranji SR, Maselli J, Wachter RM, Baron RB. Potential unintended consequences due to Medicare’s “no pay for errors” rule? A randomized controlled trial of an educational intervention with internal medicine residents. J Gen Intern Med. 2010;25:1097–1101
34. Mourad M, Cucina R, Ramanathan R, Vidyarthi AR. Addressing the business of discharge: Building a case for an electronic discharge summary. J Hosp Med. 2011;6:37–42
35. Hackbarth G, Reischauer R, Mutti A. Collective accountability for medical care—toward bundled Medicare payments. N Engl J Med. 2008;359:3–5
36. Wachter RM, Foster NE, Dudley RA. Medicare’s decision to withhold payment for hospital errors: The devil is in the details. Jt Comm J Qual Patient Saf. 2008;34:116–123
39. Hackbarth G, Boccuti C. Transforming graduate medical education to improve health care value. N Engl J Med. 2011;364:693–695