Research Reports

The Projected Responses of Residency-Sponsoring Institutions to a Reduction in Medicare Support for Graduate Medical Education

A National Survey

Riaz, Mahrukh MPH; Palermo, Tia PhD; Yen, Michael MPH; Edelman, Norman H. MD

doi: 10.1097/ACM.0000000000000695


Approximately 70% of the nationwide expenditure on graduate medical education (GME) is provided through Medicare reimbursement to teaching hospitals, using two separate calculations. Direct medical education (DME) funds are intended to reflect the actual costs of training, such as the salaries of residents and their educators and administrative costs. Indirect medical education (IME) funds are intended to reimburse hospitals for the additional costs inherent in maintaining a GME program. Accordingly, the IME payment calculation is based on the intensity of the teaching program as measured by the ratio of residents to operated beds. Currently, IME reimbursement to a hospital is about two-thirds of the total Medicare payment to that institution.1

In 2010, a congressional advisory committee—the Medicare Payment Advisory Commission—concluded that no more than half of Medicare’s total IME payment could be “empirically validated” and suggested that the other half be redistributed among teaching hospitals to foster improvements in the quality of GME.2 This finding caught the attention of policy makers who were seeking to control the cost of Medicare. Accordingly, the 2010 Simpson–Bowles proposal called for the elimination of 60% of IME reimbursement, whereas the president’s budget proposal for the 2015 fiscal year called for the elimination of 10% of this payment.3 The president’s proposed budget for the 2016 fiscal year includes a $16 billion reduction in Medicare support for GME over 10 years. This cut is approximately twice the 10% reduction in IME reimbursement proposed in the 2015 budget.4

In this study, we sought to assess the projected responses of residency-sponsoring institutions if faced with such cuts. Given the real possibility of a reduction in IME payment in future Medicare budgets, our findings are of interest. In addition, this study provides insight into how institutions value the core specialty residencies (i.e., first certification, such as internal medicine), which has implications for the future direction of GME. We believe that the combined pressures of a growing and aging population, increased insurance coverage, and the increasing numbers of medical school graduates will eventually force an increase in the number of GME positions. The sources of funding for this increase are not clear, although several have been proposed. Whatever the sources of funding may be in the future, institutions will play a key role in shaping any expansion of GME. This expansion will be driven by their individual needs and preferences, which we explore in this study.

Our study expands on the findings of a 2011 study conducted by the Accreditation Council for Graduate Medical Education (ACGME) by investigating individual institutional GME strategies, the value they place on various residency programs, and any differences related to the number of residencies at sponsoring hospitals.5


Method

Data and sample

We surveyed all residency-sponsoring institutions in the United States accredited by the ACGME and the American Osteopathic Association that were listed as general medical/surgical institutions and that received Medicare support for GME training.6,7 We used a Web-based survey instrument (see Supplemental Digital Appendix 1), deployed through SurveyMonkey, and made three attempts from February through October 2012 to elicit responses through e-mail requests addressed to the director of GME, usually referred to as the “designated institutional official” or “DIO” in the ACGME database. We chose this study design because, in the past, surveys we sent to hospital CEOs yielded a poor response rate. Additionally, we believed that the DIO would be the institutional official most cognizant of the institution’s residency programs as well as most likely to understand the values of the key stakeholders, including the CEO and trustees.

The Stony Brook institutional review board for research on human subjects approved our study.


Survey development

We developed the survey in consultation with residency program directors at Stony Brook. We field-tested the survey by circulating it to all DIOs in the Stony Brook system and made revisions based on their feedback.

As part of the survey, we asked respondents if their institution had developed a plan to deal with a reduction in Medicare support for GME. We then asked at what level of reduction in IME reimbursement their institution would consider downsizing programs, providing several response options. To determine what strategy the institution would employ for downsizing, we asked respondents to indicate the first step their institution would take to reduce GME expenditures. Here, too, we provided multiple response choices; however, respondents could select only one.

To determine which core specialty residencies would be most at risk should an institution decide to downsize and which would most likely be protected, we assessed a second set of data. We were concerned that asking this question directly would discourage responses, despite our assurance of confidentiality. Therefore, we asked respondents to rate the value of their institution’s core specialty residency programs on a five-point scale (very low to very high). We asked respondents to rate the value of these programs from an economic/operational point of view and from an educational/public service point of view. We used a dropdown list of all core specialty residencies and asked respondents to rate only those that were active at their institution.

For analysis, we divided respondents into two approximately equal groups: large (six or more residencies) and small (five or fewer residencies) institutions. We also divided respondents into groups according to self-reported density of the population served by their institution (rural, suburban, and urban).

Statistical analysis

We first described the demographic characteristics of the institutions in our sample, including the program size and density of the population served. Next, we summarized responses to the questions regarding the presence of an institutional plan for downsizing and the projected first step to reduce GME expenditures. At each level of reduction, we performed bivariate chi-square tests to examine whether the reported plans for downsizing varied by residency program size (large versus small). Next, we described the reported value (economic/operational and educational/public service) of each residency program. Because each institution has a different set of residency programs, we could not compare and rank all residencies directly (i.e., the sample size for each residency program is different) to see which were most valued; thus, we calculated only descriptive statistics. Finally, we compared the reported residency program values by program size. For this analysis, we collapsed the reported values into two categories (very high and high versus moderate, low, and very low) and performed chi-square tests, for which the independent variable was large versus small program size and the dependent variables were (1) economic/operational value and (2) educational/public service value for each residency program separately. All analyses were performed using Stata version 12 (StataCorp, College Station, Texas).
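The authors ran these tests in Stata; as an illustration only (not the authors' code), one of the bivariate chi-square comparisons can be re-created in Python from the counts later reported in the Results (33 of 100 large institutions versus 18 of 92 small institutions projecting downsizing at a 10% cut):

```python
# Illustrative re-creation (not the authors' Stata code) of one bivariate
# chi-square test: large vs. small institutions that would begin downsizing
# at a 10% cut in IME reimbursement (counts taken from the Results section).
from math import erfc, sqrt

def pearson_chi2_2x2(a, b, c, d):
    """Uncorrected Pearson chi-square for the 2x2 table [[a, b], [c, d]];
    returns (chi2, p) with 1 degree of freedom."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = erfc(sqrt(chi2 / 2))  # survival function of chi-square with 1 df
    return chi2, p

# Rows: large programs (33 of 100 would downsize), small (18 of 92 would).
chi2, p = pearson_chi2_2x2(33, 67, 18, 74)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # p ≈ .035, consistent with the reported P = .034
```

The uncorrected Pearson statistic is used here because it matches the P values reported in the Results; a Yates-corrected test on the same table would give a slightly larger P value.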


Results

Of the 555 institutions invited to participate in our study, 192 responded, for an overall response rate of 35%. Of these, 92 institutions were classified as small and 100 as large. We found no significant difference between responding and nonresponding institutions with regard to program size and total number of residencies. Only 8 institutions identified as rural, and we found no significant difference in responses between urban and suburban institutions.

Sixteen (17%) responding institutions with small GME programs and 29 (29%) with large GME programs indicated that they had developed a plan to deal with a possible reduction in IME payment. The level of reduction at which all respondents (not only those with a plan) thought their institution would begin downsizing residency programs varied with the number of residencies sponsored (see Figure 1). Thirty-three (33%) institutions with large GME programs would begin downsizing if their IME payment was reduced by 10%, whereas only 18 (20%) institutions with small programs would do so at this level (P = .034). Conversely, only 18 (18%) institutions with large GME programs would maintain all their programs intact even if all IME support were lost (see “100%” in Figure 1), whereas 31 (34%) institutions with small programs would do so (P = .017).

Figure 1
Figure 1:
The level of reduction in indirect medical education (IME) funding at which institutions may begin downsizing, by program size. Many institutions, especially the larger ones (six or more residencies), would decrease support for programs when faced with a 10% cut in IME reimbursement. Conversely, many smaller institutions (five or fewer residencies) would maintain their support even if all IME funding were eliminated (100%). Standard error is shown.

The strategies to make these reductions also varied with the number of residencies sponsored (see Figure 2). The first step for 42 (42%) institutions with large GME programs would be a reduction in support for specific residency programs; only 23 (25%) institutions with small programs projected this first step (P = .016). Conversely, 33 (36%) institutions with small GME programs would apply across-the-board reductions; only 19 (19%) institutions with large programs projected this first step (P = .012).

Figure 2
Figure 2:
The projected first step institutions would take in response to cuts in indirect medical education funding, by program size. The first step for 42% of larger institutions (six or more residencies) would be decreased support for specific programs. The first step for 36% of smaller institutions (five or fewer residencies) would be to decrease support for programs across the board. Standard error is shown.

Figures 3 and 4 show the mean value score (on the five-point scale) for each specialty. The range for mean economic/operational value (1.67–4.39) was considerably greater than the range for mean educational/public service value (2.70–4.54). Although these mean scores cannot be compared statistically, we identified important differences between them. For example, internal medicine, surgery, and emergency medicine were valued “high” or “very high” (i.e., the two highest values on the five-point scale) in both the economic/operational category and the educational/public service category by 136 (71%) respondents. In contrast, dermatology, nuclear medicine, plastic surgery, and preventive medicine were scored “high” or “very high” in both categories by only 60 (31%) respondents. All other residency programs fell within these two extremes.

Figure 3
Figure 3:
The mean value of each core specialty residency program to responding institutions from an economic/operational point of view. Responses were on a scale of 1 to 5, 1 being very low and 5 being very high. Values ranged widely, with hospital-intensive residencies scoring highest. Standard error is shown.
Figure 4
Figure 4:
The mean value of each core specialty residency program to responding institutions from an educational/public service point of view. Responses were on a scale of 1 to 5, 1 being very low and 5 being very high. Scores were generally higher and in a narrower range than those from an economic/operational point of view. Standard error is shown.

Family medicine was the only residency program for which value varied significantly with the size of the institution’s overall GME program (see Figure 5). Institutions with small GME programs scored family medicine “high” or “very high” from both the economic/operational (36/47; 77%) and the educational/public service (44/47; 94%) points of view. Institutions with large programs frequently scored family medicine “high” or “very high” from the educational/public service point of view (44/55; 80%) but less frequently from the economic/operational point of view (23/55; 42%). These differences by GME program size were statistically significant (P < .001).

Figure 5
Figure 5:
The value of a family medicine residency program to responding institutions, by program size. Both larger (six or more residencies) and smaller (five or fewer residencies) programs valued family medicine highly from an educational/public service point of view. Only smaller institutions valued it highly from an economic/operational perspective. Standard error is shown.


Discussion

Our study provides a detailed description from institutions across the United States of projected responses to proposed cuts to IME funding. Four salient findings emerged from our study: (1) Institutions were cognizant of possible IME payment cuts, yet only about 20% had developed a contingency plan; (2) even a 10% reduction in IME reimbursement would lead a substantial number of institutions, especially those with large GME programs, to consider downsizing; (3) institutions with small and large GME programs would take different approaches to downsizing; and (4) the degree to which each core specialty residency is valued from the economic/operational point of view varies greatly, yet the degree to which each is valued from the educational/public service point of view varies less.

The differences between responses from large and small institutions are informative. Institutions with small GME programs were less likely to decrease their support for GME at modest reductions in IME reimbursement, were more likely to make across-the-board cuts than specific program cuts, and valued the economic/operational contributions of family medicine residencies more highly than institutions with large GME programs. These findings are consistent with each other. Institutions with small GME programs rely more on payment for primary and secondary care for their revenue than do institutions with large GME programs. In this setting, primary care residents are valuable both because they provide inpatient services and because they staff outpatient facilities, which refer individuals for inpatient care. Thus, not surprisingly, these institutions train a greater percentage of primary care physicians than institutions with large GME programs.8 Accordingly, we expect them to value and protect their relatively few, mostly primary care, residency programs.

Interestingly, 33% of institutions with large GME programs indicated that they would begin downsizing at a modest 10% cut in IME reimbursement and that the reductions would most likely be focused on specific programs. That they would focus on specific programs may simply reflect the fact that these institutions have many more residency programs, both core and subspecialty, to choose from in a downsizing process. On the other hand, their apparent willingness to begin downsizing upon a rather modest reduction in IME support is surprising. This willingness may reflect a preexisting desire to make such changes, with the reduction in IME reimbursement providing a convenient rationale. Most large teaching hospitals support their GME program with often-considerable internal funds beyond the capped Medicare reimbursement. Officials at hospitals experiencing financial stress might opt to reduce hospital GME funding by eliminating or downsizing less valued programs. Of note, hospital administrators tend to be evaluated from an economic point of view based on operating margin (profit or loss in a for-profit setting) rather than total expenditures. For example, in a medium-sized hospital with an operating budget of $500 million and a negative margin of 1% (not an uncommon circumstance), eliminating 10 residency positions would result, on average, in a savings of $1 million, thereby reducing the hospital’s unfavorable margin by 20%.1
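The back-of-the-envelope arithmetic behind that example can be made explicit. All figures come from the text; the roughly $100,000 average cost per position is implied by the stated $1 million savings for 10 positions, not separately sourced:

```python
# Worked version of the operating-margin example in the text.
budget = 500_000_000        # medium-sized hospital operating budget
loss = -budget // 100       # negative 1% operating margin -> -$5,000,000
savings = 10 * 100_000      # eliminate 10 residency positions at ~$100k each
new_loss = loss + savings   # deficit shrinks to -$4,000,000
improvement = savings / abs(loss)
print(f"deficit falls from ${-loss:,} to ${-new_loss:,} "
      f"({improvement:.0%} reduction)")  # 20% reduction in the unfavorable margin
```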

Our findings are similar to those reported by the ACGME in their 2011 study.5 That study made no distinction between institutions with large and small GME programs, nor did it ask about specific residencies; thus, we use overall averages here to compare our findings with theirs. The ACGME reported that 68% of respondents to their survey would begin to reduce core residency programs at a 33% reduction in IME reimbursement; we found that, at that level of IME payment reduction, 70% of institutions would downsize their programs. Similarly, at a 50% reduction in IME funds, the ACGME reported that 82% of institutions would begin downsizing, compared with 77% in our study.

Our findings regarding the value that institutions place on different residency programs are consistent with previous research findings, yet also provide new insights. In another study, we found that when training institutions in New York State were asked which programs they would add or expand if given additional funds, they valued hospital-intensive programs far more than programs that are largely based in ambulatory care settings.9 The values shown in Figures 3 and 4 confirm this finding for the nation as a whole. Medical and surgical residents are the “workhorses” of inpatient medicine and are less expensive than their hospitalist or nonphysician clinician (e.g., nurse practitioner and physician assistant) peers. Emergency medicine residents are less expensive than fully trained emergency medicine specialists. In addition, in recent years, emergency departments have become a growing source of inpatients for hospitals, which has an economic impact on institutions.10 Conversely, the specialties less valued in our study are practiced most often in ambulatory or ambulatory surgical settings, which have less impact on hospital finances.

Preventive medicine deserves special mention. Not surprisingly, it was considered to be of little value from an economic/operational point of view. That it was also considered to be of little value from an educational/public service point of view, however, is of concern. Many agree that disease prevention must be a high priority in the United States and that physicians who practice this specialty are in short supply.11 Thus, there is a disconnect between the public discourse on the need for more primary care providers, like those who practice preventive medicine, and the value of such specialties within institutions.

Our study has a number of limitations. First, our response rate was modest, though typical of surveys directed at similar populations. Still, we remain confident in our findings, given the lack of difference between respondents and nonrespondents with regard to the number of residency programs at their institution. However, there may be other differences between these two groups that we could not examine. For example, DIOs from institutions under financial stress—those contemplating downsizing their GME program—might have been more likely to respond to the survey. On the other hand, our findings are comparable to those of the ACGME survey, which had a greater response rate. Furthermore, even if we somewhat overestimated the overall propensity for downsizing GME programs, our response rate does allow us to conclude that a modest reduction in IME reimbursement would lead a significant number of institutions to contemplate downsizing their GME program. We believe that this reduction could be of sufficient magnitude to affect the national GME output. In some circumstances, this downsizing would not be in the national interest—for example, reductions in family medicine programs in larger institutions or in preventive medicine and other “less valued” specialties for which an unmet need exists.

Another potential limitation is that we surveyed DIOs rather than CEOs, who are the final decision makers at institutions. However, DIOs are more likely to defend residency programs than potentially bottom-line-oriented CEOs. Thus, any bias introduced by surveying DIOs should have been toward not downsizing when faced with reductions in IME payments, rather than toward increased willingness to downsize. Finally, the multiple-choice questions allowed only one response per institution, perhaps precluding more nuanced responses.


In summary, we found that the differences in the degree to which institutions value each core specialty residency are quite great when viewed from an economic/operational point of view, somewhat less when viewed from an educational/public service point of view, and, in the case of family medicine, related to the size of the institution’s GME program. If and how institutions downsize or eliminate residencies when faced with a reduction in Medicare support varies with the number of residencies the institution sponsors. Institutions with fewer residencies were less likely to downsize and more likely to support their family medicine program.

Our findings suggest that, to maintain and enhance training in those specialties that are less valued by their sponsoring institutions from an economic/operational point of view but considered valuable to society, it may be necessary to increase external support for these residency programs. The number of graduating physicians in the United States is increasing, yet federal support for GME remains static at best.12 Now is the ideal time to make changes to how GME programs are funded to ensure that institutions retain an adequate number of trainees even for those specialties they currently value less from an economic point of view.

Acknowledgments: The authors thank Erin Emanuele for research assistance.


1. Edelman NH, Romeiser J. Financing of graduate medical education. In: Johns MME, chair. Ensuring an Effective Physician Workforce for America: Recommendations for an Accountable Graduate Medical Education System. Atlanta, GA: Josiah Macy Jr. Foundation; 2011.
2. Medicare Payment Advisory Commission. Graduate medical education financing: Focusing on educational priorities. In: Report to the Congress: Aligning Incentives in Medicare. Washington, DC: MedPAC; 2010.
3. Association of American Medical Colleges. President’s FY 2015 budget once again cuts Medicare providers. Washington Highlights. March 7, 2014. Accessed January 22, 2015.
4. Association of American Medical Colleges. President’s proposed budget would be one step forward, two steps back for America’s patients [press release]. Washington, DC; 2015.
5. Nasca TJ, Miller RS, Holt KD. The potential impact of reduction in federal GME funding in the United States: A study of the estimates of designated institutional officials. J Grad Med Educ. 2011;3:585–590.
6. Accreditation Council for Graduate Medical Education. List of ACGME Accredited Programs and Sponsoring Institutions. 2013. Accessed January 22, 2015.
7. American Osteopathic Association. Osteopathic Medical Internships and Residencies. 2013. Accessed January 22, 2015.
8. Chen C, Petterson S, Phillips RL, Mullan F, Bazemore A, O’Donnell SD. Toward graduate medical education (GME) accountability: Measuring the outcomes of GME institutions. Acad Med. 2013;88:1267–1280.
9. Edelman NH, Goldsteen RL, Goldsteen K, Yagudayev S, Lima F, Chiu L. Institutions with accredited residencies in New York State with an interest in developing new residencies or expanding existing ones. Acad Med. 2013;88:1287–1292.
10. Schuur JD, Venkatesh AK. The growing role of emergency departments in hospital admissions. N Engl J Med. 2012;367:391–393.
11. Committee on Training Physicians for Public Health Careers. Training Physicians for Public Health Careers. Washington, DC: National Academies Press; 2007.
12. Jolly P, Erikson C, Garrison G. U.S. graduate medical education and physician specialty choice. Acad Med. 2013;88:468–474.

Supplemental Digital Content

© 2015 by the Association of American Medical Colleges