Graduate Medical Education (GME) plays a key role in the makeup of the U.S. physician workforce, and it represents the largest public investment in health workforce development through Medicare, Medicaid, and other federal funding. Yet, the physician workforce is struggling to meet the nation’s health care needs, particularly in primary care and geographically underserved areas. Amid increasing calls for greater accountability in the GME system, we propose a method for examining institutional GME outcomes that can ultimately inform future education and policy decisions.
Background
The GME system dictates the overall size and specialty mix of the U.S. physician workforce. With few exceptions, physician licensing in every state requires at least one year of U.S. GME. Therefore, the total availability of U.S. training positions defines the overall size of the physician workforce, and the number of GME training positions available for each specialty effectively determines the number of individuals who can pursue a career in that specialty. The location of GME programs affects long-term practice locations because physicians tend to locate in the same geographic area as their residency,1–3 and exposure to rural and underserved settings during GME increases the likelihood of continuing to work with these populations after graduation.4–7
GME has been publicly funded since the passage of Medicare in 1965. In 2009, Medicare contributed $9.5 billion8 to GME. Medicaid provided an additional $3.18 billion.9 These two contributions represent the largest public investment in U.S. health workforce development.10 Despite this public investment, physician shortages persist in certain specialties, including primary care, general surgery, and psychiatry, as well as in rural and underserved areas.11–18 These shortages limit access to care, and a growing number of studies suggest that health systems built on strong primary care bases improve quality and constrain the cost of health care.19–22 Even with good evidence that the composition of the physician workforce affects access, quality, and cost, federal GME funding is provided without specialty training expectations or requirements to evaluate training outcomes.
As early as 1965 and as recently as 2011, advisory bodies have recommended that GME be more accountable to the public's health needs.23–25 In 2010, there were three prominent calls for increased GME accountability. The Josiah Macy Jr. Foundation issued a report concluding that, because GME is financed with public funds, it should be accountable to the public.26 The Medicare Payment Advisory Commission recommended greater transparency with, and accountability for, Medicare GME payments.27 The Patient Protection and Affordable Care Act mandated that the Council on Graduate Medical Education (COGME) develop performance measures and guidelines for the longitudinal evaluation of programs funded under Title VII of the Public Health Service Act, which include some GME programs.28 COGME's broader mandate, however, is to make recommendations to Congress on GME programs relative to health workforce needs in the United States.
Despite these calls for accountability, important characteristics of GME programs, such as training in priority health needs and relevant delivery systems, and workforce outcomes, including specialty and geographic distribution, remain unaddressed. The impact of residency programs on local or regional physician workforces is not measured or tracked. Nonetheless, measuring GME outcomes is essential to inform deliberations about medical workforce problems and policies, particularly given current GME resource constraints and the reexamination of the adequacy of the U.S. physician workforce following the passage of the Patient Protection and Affordable Care Act.29,30
Attention has been paid to geographic and specialty outcomes of undergraduate medical education31; however, relatively little scholarship has been applied to these issues in GME programs. Measuring GME outcomes is difficult because of the complex arrangement of training institutions and the variable paths traveled by trainees. Currently, 111,586 “residents” and “fellows” are employed in 8,967 training programs in 150 specialty areas.32 These programs are usually part of larger institutions designated as “sponsoring institutions” for accreditation purposes or as “primary teaching sites” for Medicare reimbursement purposes. In 2011, there were approximately 679 Accreditation Council for Graduate Medical Education (ACGME)-accredited sponsoring institutions and more than 1,135 ACGME-accredited primary teaching sites.33 For the purposes of this study, we focus on the workforce outcomes of these GME programs.
We propose a method for measuring workforce-relevant outcomes of GME by sponsoring institutions and primary teaching sites, using existing data. We purposefully examine both. Sponsoring institutions, identified for accreditation purposes, assume the ultimate financial and academic responsibility for the GME program.34 Primary teaching sites are generally hospitals, organizations that directly receive Medicare GME payments. Both sponsoring institutions and teaching sites often represent a consortium of academic institutions, hospitals, and ambulatory clinics that collectively take responsibility for residency training programs. Useful tracking systems with different emphases could be constructed using either sponsoring institutions or primary teaching sites.
Method
With approval from the institutional review boards of the George Washington University and the American Academy of Family Physicians, we used the 2011 American Medical Association (AMA) Masterfile and its GME historical supplement to identify physicians completing residency between 2006 and 2008 (117,504 physicians). We selected a historical cohort to ensure that physicians had had time to locate after training and that the AMA Masterfile had had time to update their information. Given our focus on characterizing institutional and training site outcomes, physicians who had completed more than one residency during this period (8,977 physicians) were represented more than once in our data set. We used the same AMA Masterfile to characterize these physicians three to five years after they had completed their residency programs in order to estimate primary care, general surgery, psychiatry, and obstetrics–gynecology output. When physicians completed training beyond their primary specialty, we used the specialty of their final training program as their practicing specialty. Primary care was defined as family medicine, general internal medicine (GIM), general pediatrics, internal medicine–pediatrics, internal medicine geriatrics, and family medicine geriatrics. Obstetrics–gynecology data were not included in the primary care outcome but were reported separately.
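As a concrete illustration of these classification rules, the sketch below assigns a practicing specialty and a primary care flag from an ordered training history. The specialty strings, function names, and data layout are our own illustrative assumptions, not the AMA Masterfile's actual coding.

```python
# Illustrative sketch of the specialty rules described above.
# Specialty labels are plain strings here; the AMA Masterfile uses
# coded specialty fields, so this layout is an assumption for clarity.

PRIMARY_CARE = {
    "family medicine",
    "general internal medicine",
    "general pediatrics",
    "internal medicine-pediatrics",
    "internal medicine geriatrics",
    "family medicine geriatrics",
}

def practicing_specialty(training_programs):
    """Rule from the text: when a physician trained beyond the primary
    specialty, the specialty of the final training program is used."""
    return training_programs[-1]  # programs listed in chronological order

def is_primary_care(training_programs):
    # Obstetrics-gynecology is reported separately, not counted here.
    return practicing_specialty(training_programs) in PRIMARY_CARE

# Example: an internal medicine graduate who completed a cardiology
# fellowship counts as cardiology, not primary care.
print(is_primary_care(["general internal medicine", "cardiology"]))  # False
print(is_primary_care(["family medicine"]))                          # True
```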
We calculated GIM retention as the number of GIM graduates who did no further training beyond their primary residency divided by the number of all GIM graduates at each sponsoring institution or primary teaching site (including those who completed subspecialty training). General surgery retention rates were similarly calculated.
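Restating this definition as a formula, for a given sponsoring institution or primary teaching site $i$:

\[
\text{GIM retention}_i = \frac{\text{GIM graduates of } i \text{ with no training beyond the primary residency}}{\text{all GIM graduates of } i}
\]

For example, a hypothetical institution graduating 30 GIM residents, 12 of whom pursued no subspecialty training, would have a GIM retention of 12/30 = 40%; the numbers here are invented for illustration.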
We used AMA Masterfile addresses to determine physician location, supplemented with information from the National Provider Identifier (NPI) database35 to improve the quality of the practice addresses found in the AMA Masterfile. Using unique combinations of name and address, we matched 97% of the physicians in the 2011 NPI with physicians in the Masterfile. We preferentially used the NPI physician address if the NPI update year was later than the physician's last year of residency. Because the 2006–2008 graduates formed a relatively recent cohort, this NPI correction increased the likelihood of capturing current work addresses. We geocoded practice addresses to determine practice in a rural (nonmetro) county and in a primary care Health Professional Shortage Area (HPSA). Rural was defined using the U.S. Department of Agriculture Rural–Urban Continuum Codes.36 The Health Resources and Services Administration (HRSA) Data Warehouse was used to identify HPSA geographies.37
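The address-preference and rural-flagging steps can be sketched as follows. The record fields and function names are hypothetical; only the treatment of nonmetro Rural–Urban Continuum Codes (4–9) as rural reflects the actual USDA scheme.

```python
# Sketch of the address-preference and rural-flagging logic described above.
# Record fields (update_year, last_residency_year, etc.) are hypothetical.

NONMETRO_RUCC = {4, 5, 6, 7, 8, 9}  # USDA Rural-Urban Continuum Codes, nonmetro

def practice_address(masterfile, npi):
    """Prefer the NPI address when its update year postdates the physician's
    last year of residency; otherwise fall back to the Masterfile address."""
    if npi is not None and npi["update_year"] > masterfile["last_residency_year"]:
        return npi["address"]
    return masterfile["address"]

def is_rural(county_rucc):
    """Treat nonmetro counties (continuum codes 4-9) as rural."""
    return county_rucc in NONMETRO_RUCC

# Example: a 2007 graduate whose NPI record was updated in 2010
# keeps the more current NPI address.
mf = {"address": "123 Main St", "last_residency_year": 2007}
npi = {"address": "456 Clinic Rd", "update_year": 2010}
print(practice_address(mf, npi))  # 456 Clinic Rd
print(is_rural(6))                # True
```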
We also matched our data with 2009 Medicare claims data to identify physicians working in a Federally Qualified Health Center (FQHC) or Rural Health Clinic (RHC). We used the AMA Masterfile–NPI match to link physicians with a unique physician identification number, which we then matched to a 100% sample of 2009 FQHC and RHC Medicare claims files. Using this method, we identified 2,373 physicians who had at least one claim in an RHC or FQHC. Using data provided by HRSA,38 we identified graduates who had ever participated in the National Health Service Corps (NHSC), matching on unique combinations of first name, last name, specialty, and birth year. We used hospital cost reports (2008)39 to identify Medicare GME funding for hospitals.
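The NHSC match on unique combinations of first name, last name, specialty, and birth year amounts to a key lookup, sketched below with invented field names and toy records; it is a simplified stand-in for the study's actual matching code.

```python
# Sketch of the NHSC matching step: build a set of keys from the HRSA data
# and flag graduates whose key appears in it. Field names are hypothetical.

def match_key(rec):
    # Unique combination of first name, last name, specialty, and birth year,
    # normalized for case.
    return (rec["first_name"].lower(), rec["last_name"].lower(),
            rec["specialty"], rec["birth_year"])

nhsc_records = [
    {"first_name": "Ana", "last_name": "Lopez",
     "specialty": "family medicine", "birth_year": 1975},
]
graduates = [
    {"first_name": "Ana", "last_name": "Lopez",
     "specialty": "family medicine", "birth_year": 1975},
    {"first_name": "Sam", "last_name": "Hill",
     "specialty": "psychiatry", "birth_year": 1978},
]

nhsc_keys = {match_key(r) for r in nhsc_records}
for grad in graduates:
    grad["ever_nhsc"] = match_key(grad) in nhsc_keys  # True, then False
```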
The AMA Masterfile GME supplement assigns an “icode” to each residency program. The icode most often corresponds with the ACGME sponsoring institution code and less frequently with the primary teaching site code. In all cases, we were able to uniquely assign individuals to sponsoring institutions. Where the icode matched the sponsor code, we assigned primary teaching sites using 2011 data from the ACGME that identified all residency programs, by specialty, with their sponsoring institutions and primary teaching sites. This match raised some methodological challenges. First, a single sponsoring institution's residency programs in different specialties may be situated at different primary teaching sites; to address this, we linked unique combinations of sponsoring institution and specialty in both the AMA Masterfile and the ACGME data. Second, residencies in the same specialty can be situated at two or more primary teaching sites; in these cases, we could not uniquely match a residency with a particular primary teaching institution, and we flagged these cases in the analysis file. Third, because we matched later (2011) ACGME lists of sponsoring institutions and primary teaching sites with earlier (2006–2008) AMA information, we were unable to match programs that had closed, opened, or changed their affiliation during the intervening period. Finally, some ACGME primary teaching site information was missing, and we did not have institutional information for osteopathic or Canadian residency programs. We hand-edited nonmatches when possible, searching the Internet to determine whether programs had closed, changed names, or changed affiliations, and calling programs to confirm.
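A minimal sketch of this sponsoring institution/specialty crosswalk and its ambiguity flag appears below; the sponsor codes, specialty strings, and site names are invented for illustration.

```python
from collections import defaultdict

# Hypothetical crosswalk rows: (sponsor, specialty, primary teaching site).
acgme_rows = [
    ("SPONSOR_A", "pediatrics", "Hospital One"),
    ("SPONSOR_A", "family medicine", "Hospital Two"),
    ("SPONSOR_B", "internal medicine", "Hospital Three"),
    ("SPONSOR_B", "internal medicine", "Hospital Four"),  # same specialty, two sites
]

sites_by_key = defaultdict(set)
for sponsor, specialty, site in acgme_rows:
    sites_by_key[(sponsor, specialty)].add(site)

def assign_site(sponsor, specialty):
    """Return (site, flagged). A unique sponsor/specialty match yields a site;
    same-specialty programs at multiple sites, or no match at all, cannot be
    uniquely assigned and are flagged, as in the analysis file."""
    candidates = sites_by_key.get((sponsor, specialty), set())
    if len(candidates) == 1:
        return next(iter(candidates)), False
    return None, True

print(assign_site("SPONSOR_A", "pediatrics"))         # ('Hospital One', False)
print(assign_site("SPONSOR_B", "internal medicine"))  # (None, True)
```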
After hand-editing, we were able to find unique matches for 7,219 of the 8,810 unique sponsoring institution/specialty combinations. This corresponds to 101,304 of the 117,504 residents in our sample. Our inability to situate a resident in a primary teaching institution was mainly due to those cases where a sponsoring institution sponsored programs in the same specialty in multiple primary teaching sites (10,089 residents).
We used pairwise correlation analysis, weighted for the number of residents, to examine the relationships of institution-level primary care, GIM retention, and rural outcomes with institutional characteristics, including number of specialties trained, rurality, percent female, percent osteopathic (i.e., osteopathic physicians training in these allopathic residency programs), percent international medical graduate (IMG), and average age.
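A resident-weighted Pearson correlation of the kind described here can be computed as in the following sketch; this is a generic implementation, not the study's actual code, and the toy inputs are invented.

```python
import numpy as np

def weighted_pearson(x, y, w):
    """Pearson correlation of x and y with observation weights w
    (here, the number of residents at each institution)."""
    x, y, w = (np.asarray(a, dtype=float) for a in (x, y, w))
    mx, my = np.average(x, weights=w), np.average(y, weights=w)
    cov = np.average((x - mx) * (y - my), weights=w)
    sx = np.sqrt(np.average((x - mx) ** 2, weights=w))
    sy = np.sqrt(np.average((y - my) ** 2, weights=w))
    return cov / (sx * sy)

# Toy example: percent primary care output versus number of specialties
# trained, weighted by institution size (all values invented).
pct_primary_care = [0.60, 0.35, 0.20, 0.12]
n_specialties = [3, 12, 25, 40]
n_residents = [40, 150, 600, 1200]
print(weighted_pearson(pct_primary_care, n_specialties, n_residents))  # negative
```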
Results
Summary outcomes
Sponsoring institutions.
Table 1 provides summary outcome measures for sponsoring institutions and primary teaching sites. For the 2006–2008 period, we identified 759 sponsoring institutions, whose weighted mean percentage of graduates in primary care was 24.2% (median, 17.7%; see Figure 1). Considering only unique individuals, the average rose to 25.2%; however, this overestimates primary care production because we could not account for primary care physicians practicing as hospitalists. We found that 158 institutions produced no primary care graduates, whereas 184 institutions' output was more than 80% primary care; the latter tended to be smaller institutions. For sponsoring institutions providing internal medicine training, retention in GIM ranged widely from 8.3% to 95.2%, with an average of 37.9% (limited to programs training at least the ACGME-required minimum40 in one year and weighted for the number of GIM graduates). A total of 255 sponsoring institutions graduated general surgery residents between 2006 and 2008, with an average general surgery retention of 38.4% (weighted for the number of general surgery graduates). We identified 183 sponsoring institutions graduating psychiatry residents.
Figure 1: Relationship between percentage of graduates in primary care and number of residents trained in U.S. graduate medical education sponsoring institutions. Data are limited to sponsoring institutions with more than three graduates during 2006–2008. Institutions in Puerto Rico are not included.
Table 1: Summary of U.S. Graduate Medical Education Outcome Measures for Residents Graduating in 2006–2008
Overall, 198 institutions produced no rural physicians, whereas at 10 institutions all graduates went to rural areas (weighted mean for all programs, 8.5% rural; median, 6.3%). Considering only unique individuals, the average percentage of graduates providing direct patient care in rural areas was 4.8%. We found that 283 institutions produced no physicians practicing in FQHCs or RHCs, and 479 institutions produced no NHSC physicians.
Primary teaching sites.
We identified 957 primary teaching sites for the 2006–2008 period. Of the 117,504 physicians in our study, we were unable to uniquely assign 16,200 (13.8%) to primary teaching sites, and 99 primary teaching sites had incomplete data because their sponsoring institutions sponsored multiple same-specialty GME programs at different primary teaching sites. At 63 of these 99 sites, the residents who could not be uniquely assigned included residents in primary care fields, most commonly family medicine and internal medicine.
Program-level outcomes
We compared program-level outcomes for the 161 sponsoring institutions producing more than 200 graduates over the three-year study period; together, these institutions trained more than three-fourths of all residents (90,217). Appendix 1 shows the bottom and top 20 primary care producers. These institutions had an average of 40 training programs (SD = 20). This group of larger training institutions could similarly have been ranked on production of rural physicians, which ranged from none to 61.2%, or on other measures.
Appendix 1: Outcomes of the Top and Bottom Producers of Primary Care Graduates, U.S. Graduate Medical Education Sponsoring Institutions With More Than 200 Graduates Between 2006 and 2008
For primary teaching sites, 158 sites produced more than 150 graduates between 2006 and 2008, collectively training 60.8% (61,632) of the graduates who could be assigned to primary teaching sites. Appendixes 2 and 3 show the characteristics and outcomes of the top and bottom primary care producers, excluding primary teaching sites for which we were unable to uniquely assign all residents. The top 20 primary care producing sites graduated 1,658 primary care graduates out of a total of 4,044 graduates (41.0%) and received $292.1 million in total Medicare GME payments ($72,230 per resident). The bottom 20 graduated 684 primary care graduates out of a total of 10,937 graduates (6.3%) and received $842.4 million ($77,004 per resident).
Appendix 2: Characteristics of the Top and Bottom Producers of Primary Care Graduates, U.S. Graduate Medical Education (GME) Primary Teaching Sites With More Than 150 Graduates Between 2006 and 2008*
Appendix 3: Outcomes of the Top and Bottom Producers of Primary Care Graduates, U.S. Graduate Medical Education Primary Teaching Sites With More Than 150 Graduates Between 2006 and 2008*
Full sponsoring institution and primary teaching site outcomes are available at www.graham-center.org/gmemapper.
Associations
There was a negative relationship between the number of specialties trained and the percentage of graduates practicing in rural areas (see Figure 2). Increasing rurality of a sponsoring institution was associated with increasing rural output. Evaluating these relationships also identified outliers. For example, despite training residents in more than 20 different specialties, Geisinger Health System and Mary Hitchcock Memorial Hospital each had more than 40% of graduates practicing in rural areas. Both institutions are located in nonmetropolitan areas. This example points to the need for further analysis that could be done using program-level outcomes. Correlation analysis suggests positive associations between percent primary care output and percent internal medicine residents retained in primary care, percent rural output, rurality of the program, percent female, percent osteopathic graduates, percent IMGs, and mean age. We also observed positive correlations between percent rural output and rurality of the program, percent internal medicine residents retained in primary care, percent osteopathic graduates, percent IMGs, and mean age. We found negative associations between percent primary care output and number of specialties trained, and between percent rural output and both number of specialties trained and percent female. Table 2 provides the full correlation analysis.
Figure 2: Relationship between percentage of graduates practicing in rural areas and number of specialties trained at U.S. graduate medical education sponsoring institutions. Data are limited to sponsoring institutions with more than three graduates during 2006–2008. Institutions in Puerto Rico are not included.
Table 2: Correlation Analysis of Graduate Medical Education Outcome Measures for Sponsoring Institutions*
Discussion
GME accountability
In public policy discussions, Medicare GME funding is being targeted simultaneously for reduction and for increased accountability, highlighting a need for recipient organizations to be able to measure relevant outcomes of their GME expenditures. This analysis shows that outcomes can be measured for all sponsoring institutions and for approximately 90% of ACGME primary teaching sites, demonstrating that outcome measurement for GME training is feasible.
Additionally, it provides perspective to policy makers and educators by allowing direct comparisons between GME training institutions similar in size and scope and by identifying institutions that have achieved particular success in producing physicians for primary care and geographically underserved areas despite prevailing trends. Given critical health workforce needs that vary at national, state, and local levels, a better understanding of institution-level outputs will allow educators and local, regional, and national policy makers to assess the performance of programs relative to local and national workforce needs and to focus interventions and policies for improvement. This analytic approach can also be used to examine any number of specialty and geographic outcomes.
GME outcomes
Beyond demonstrating a method to measure GME outcomes, some findings bear comment. Primary care physician production of 25.2% and rural physician production of 4.8% will not sustain the current workforce, solve problems of maldistribution, or address acknowledged shortages. The relatively small number of physicians choosing to work in RHCs, FQHCs, HPSAs, and the NHSC will not support a doubling of the capacity of safety net services envisioned by the Affordable Care Act.41
Past GME policies have often relied on proxies, such as choice of residency specialty or statements of intent to practice in rural or shortage areas, to measure institutional production of physicians for primary care and underserved areas. However, a substantial portion of internal medicine and general surgery graduates subsequently subspecialize; the results reported here show that some institutions retain less than 10% of their internal medicine residents in primary care. Measuring actual outcomes will enable much higher precision in designing institutional, regional, and national workforce training policies. Although these findings represent a cross-section of GME graduates, these measures can be repeated on an ongoing basis, with the potential to monitor trends, target limited resources, and prioritize institutions producing physicians in high-need specialties. These measures also have potential use in evaluating GME demonstration projects and the long-term impact of GME policy changes.
Evaluating relationships between various institutional characteristics and outcomes in high-need specialties and underserved areas also provides an opportunity to identify outliers. For example, rural physician production and retention of internal medicine residents in primary care are negatively associated with training larger numbers of specialties; however, some programs appear to defy the trends. Geisinger Health System and Mary Hitchcock Memorial Hospital both train more than 20 different specialties, yet more than 40% of their graduates practice in rural areas. Wright State University School of Medicine, Madigan Healthcare System, and the National Capital Consortium train in more than 15 different specialties, yet retain more than 60% of their internal medicine residents in GIM. The ability to identify these outliers allows further study of the factors that contribute to their success.
Training patterns
It is not surprising that large teaching hospitals and academic health centers train sizable numbers of subspecialists. What is striking, however, is the magnitude and consistency of these numbers, relative to primary care graduates, across these institutions. This bifurcation of outcomes invites the conclusion that institutions with more subspecialty training programs are inhospitable environments for the production of primary care physicians. Do residents choose large teaching hospitals for the subspecialty opportunities available, or does the environment of multiple specialties influence the subsequent training choices of generalist trainees, or both? The low primary care output observed in specialty-rich training institutions is reinforced by the current Medicare GME formulae, which result in higher payments to those large institutions, and by the ability of more specialized GME programs to support services that are generally more highly reimbursed. These are important questions to consider in the national discussion about imbalance in the workforce and strategies to increase primary care physician output.
A similar pattern emerges for rural physicians, whose training sites are predominantly institutions with fewer specialties. Yet there are academic health centers with substantial numbers of training programs that graduate significant numbers into rural practice. Geisinger Health System and Mary Hitchcock Memorial Hospital are located in less urban areas and train residents using local facilities. Although major medical centers are not often based in rural areas, the pattern of graduates in the general analysis and the success of these two programs in rural health staffing suggest that targeted funding for rurally based residencies, in small or large residency programs, offers a strategy for augmenting the rural physician workforce.
Limitations
The AMA Masterfile has known limitations in accuracy; however, the GME supplement is generally more accurate because of how these data are collected. Concerns exist regarding specialty and practice self-designation by physicians, address inaccuracies, and delays in information updating.42–44 When possible, we addressed these issues by correcting specialties when residency training information suggested more recent training in a different field. We preferentially used secondary addresses when the primary address was a home address, and we used NPI addresses when the NPI update year was more recent than the last year of residency training.
The inability to uniquely associate approximately 16,000 individuals with primary teaching sites produced incomplete primary teaching site outcomes. In reporting program-level outcomes for primary teaching sites, we indicate those programs for which we were unable to uniquely assign all graduates.
Further, the ACGME database allowed identification only of primary teaching sites, which do not represent all teaching hospitals. In 2008, an additional 460 hospitals received Medicare GME payments according to Centers for Medicare and Medicaid Services (CMS) hospital cost reports. These are likely secondary teaching sites and represent a relatively small portion of total Medicare spending on GME, approximately $706 million (7.6%) of $9.3 billion. However, to implement an accountability system using our findings, these hospitals would need to identify either their sponsoring institution or their primary teaching site affiliations for their residency training programs.
Our study also largely excludes physicians trained in osteopathic residency programs. Because the allopathic and osteopathic medical school and GME systems have separate accreditation processes, the AMA Masterfile is slower to capture individuals trained purely in the osteopathic pathway. In the future, these individuals could be added to the analysis by collaborating with the American Osteopathic Association, which maintains a database similar to the AMA Masterfile.
Conclusions
Medicare GME financing is the largest public investment in health care workforce development in the nation, with two-thirds of nearly $10 billion in annual funding going to the 200 hospitals training the largest number of residents. Despite this funding, the physician workforce continues to face critical shortages in specific specialties and locations, most of which are minimally served by the graduates of those 200 hospitals. As a result, Medicare GME-funded institutions face increasing scrutiny and calls for greater accountability. Our findings demonstrate that outcome measures in key workforce areas at the institution and hospital level are achievable. These outcomes can be used to develop an accountability system, inform policy and education, and evaluate the results of changes in the GME system.
Acknowledgments: Dr. Chen would like to thank Dr. Marion Danis in the Department of Clinical Bioethics in the Clinical Center of the National Institutes of Health for acting as her intramural mentor for her DREAM Award. The authors are also grateful for the input of key stakeholders who participated in a qualitative study that informed the outcome measures used in this study.
References
1. Dorner FH, Burr RM, Tucker SL. The geographic relationships between physicians’ residency sites and the locations of their first practices. Acad Med. 1991;66:540–544
2. Seifer SD, Vranizan K, Grumbach K. Graduate medical education and physician practice location. Implications for physician workforce policy. JAMA. 1995;274:685–691
3. Steele MT, Schwab RA, McNamara RM, Watson WA. Emergency medicine resident choice of practice location. Ann Emerg Med. 1998;31:351–357
4. Pathman DE, Steiner BD, Jones BD, Konrad TR. Preparing and retaining rural physicians through medical education. Acad Med. 1999;74:810–820
5. Brooks RG, Walsh M, Mardon RE, Lewis M, Clawson A. The roles of nature and nurture in the recruitment and retention of primary care physicians in rural areas: A review of the literature. Acad Med. 2002;77:790–798
6. Morris CG, Johnson B, Kim S, Chen F. Training family physicians in community health centers: A health workforce solution. Fam Med. 2008;40:271–276
7. Reese VF, McCann JL, Bazemore AW, Phillips RL Jr. Residency footprints: Assessing the impact of training programs on the local physician workforce and communities. Fam Med. 2008;40:339–344
8. Centers for Medicare and Medicaid Services. Cost reports.
https://www.cms.gov/costreports/02_hospitalcostreport.asp. Accessed May 1, 2013
9. Henderson TM. Medicaid Direct and Indirect Graduate Medical Education Payments: A 50-State Survey. Washington, DC: Association of American Medical Colleges; 2010
https://members.aamc.org/eweb/upload/Medicaid%20Direct_Indirect%20GME%20Payments%20Survey%202010.pdf. Accessed May 1, 2013
10. Wynn B, Guarino C, Morse L, Cho M. Alternative Ways of Financing Graduate Medical Education. Santa Monica, Calif: RAND Health; 2006
http://aspe.dhhs.gov/health/reports/06/AltGradMedicalEdu/report.pdf. Accessed May 1, 2013
11. Cooper RA, Getzen TE, McKee HJ, Laud P. Economic and demographic trends signal an impending physician shortage. Health Aff (Millwood). 2002;21:140–154
12. Doescher MP, Fordyce MA, Skillman SM, Jackson JE, Rosenblatt RA. Persistent Primary Care Health Professional Shortage Areas (HPSAs) and Health Care Access in Rural America. Seattle, Wash: WWAMI Rural Health Research Center; 2009
http://depts.washington.edu/uwrhrc/uploads/Persistent_HPSAs_PB.pdf. Accessed May 1, 2013
13. Williams TE Jr, Satiani B, Thomas A, Ellison EC. The impending shortage and the estimated cost of training the future surgical workforce. Ann Surg. 2009;250:590–597
14. Lynge DC, Larson EH, Thompson MJ, Rosenblatt RA, Hart LG. A longitudinal analysis of the general surgery workforce in the United States, 1981–2005. Arch Surg. 2008;143:345–350
15. Thomas CR, Holzer CE 3rd. The continuing shortage of child and adolescent psychiatrists. J Am Acad Child Adolesc Psychiatry. 2006;45:1023–1031
16. U.S. Department of Health and Human Services, Health Resources and Services Administration, Bureau of Health Professions. The Physician Workforce: Projections and Research Into Current Issues Affecting Supply and Demand. Washington, DC: Health Resources and Services Administration; 2008
http://bhpr.hrsa.gov/healthworkforce/reports/physwfissues.pdf. Accessed May 1, 2013
17. U.S. Department of Health and Human Services, Health Resources and Services Administration, Bureau of Health Professions. Physician Supply and Demand: Projections to 2020. Washington, DC: Health Resources and Services Administration; 2006
http://www.achi.net/HCR%20Docs/2011HCRWorkforceResources/Physician%20Supply%20and%20Demand-2020%20kl.pdf. Accessed May 1, 2013
18. Association of American Medical Colleges Center for Workforce Studies. The Complexities of Physician Supply and Demand: Projections Through 2025. Washington, DC: Association of American Medical Colleges; 2008
https://members.aamc.org/eweb/upload/The%20Complexities%20of%20Physician%20Supply.pdf. Accessed May 1, 2013
19. Chang CH, Stukel TA, Flood AB, Goodman DC. Primary care physician workforce and Medicare beneficiaries’ health outcomes. JAMA. 2011;305:2096–2104
20. Jerant A, Fenton JJ, Franks P. Primary care attributes and mortality: A national person-level study. Ann Fam Med. 2012;10:34–41
21. Starfield B, Shi L, Macinko J. Contribution of primary care to health systems and health. Milbank Q. 2005;83:457–502
22. Fisher ES, Wennberg DE, Stukel TA, Gottlieb DJ, Lucas FL, Pinder EL. The implications of regional variations in Medicare spending. Part 1: The content, quality, and accessibility of care. Ann Intern Med. 2003;138:273–287
23. Coggeshall LT. Planning for Medical Progress Through Education. Washington, DC: Association of American Medical Colleges; 1965
24. Institute of Medicine. Primary Care Physicians: Financing Their Graduate Medical Education in Ambulatory Settings. Washington, DC: Institute of Medicine; 1989
http://iom.edu/Reports/1989/Primary-Care-Physicians-Financing-Their-Graduate-Medical-Education-in-Ambulatory-Settings.aspx. Accessed May 1, 2013
25. Council on Graduate Medical Education. Advancing Primary Care. Washington, DC: Health Resources and Services Administration; 2010
http://www.hrsa.gov/advisorycommittees/bhpradvisory/cogme/Reports/twentiethreport.pdf. Accessed May 1, 2013
26. Josiah Macy Jr. Foundation. Ensuring an Effective Physician Workforce for America: Recommendations for an Accountable Graduate Medical Education System. New York, NY: Josiah Macy Jr. Foundation; 2011
http://www.josiahmacyfoundation.org/docs/macy_pubs/Effective_Physician_Workforce_Conf_Book.pdf. Accessed May 1, 2013
27. Medicare Payment Advisory Commission. Report to the Congress: Aligning Incentives in Medicare. Washington, DC: Medicare Payment Advisory Commission; 2010
http://www.medpac.gov/documents/jun10_entirereport.pdf. Accessed May 1, 2013
28. Patient Protection and Affordable Care Act. Pub L No. 111-148, 152.
29. Hofer AN, Abraham JM, Moscovice I. Expansion of coverage under the Patient Protection and Affordable Care Act and primary care utilization. Milbank Q. 2011;89:69–89
30. Kirch DG, Henderson MK, Dill MJ. Physician workforce projections in an era of health care reform. Annu Rev Med. 2012;63:435–445
31. Mullan F, Chen C, Petterson S, Kolsky G, Spagnola M. The social mission of medical education: Ranking the schools. Ann Intern Med. 2010;152:804–811
32. Brotherton SE, Etzel SI. Graduate medical education, 2010–2011. JAMA. 2011;306:1015–1030
33. Miller RS. Senior vice president of applications and data analysis, Accreditation Council for Graduate Medical Education. Personal communication with R. Phillips, April 6, 2011
34. Accreditation Council for Graduate Medical Education. Glossary of Terms.
http://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramRequirements/ab_ACGMEglossary.pdf. Accessed May 1, 2013
35. Centers for Medicare and Medicaid Services. NPI Files.
http://nppes.viva-it.com/NPI_Files.html. Accessed May 1, 2013
36. United States Department of Agriculture. Measuring rurality: Rural–urban continuum codes.
http://www.ers.usda.gov/Briefing/Rurality/RuralUrbCon/. Accessed May 1, 2013
37. Health Resources and Services Administration. Data Warehouse.
http://datawarehouse.hrsa.gov/. Accessed May 1, 2013
38. Berry M. Office of Shortage Designation, Health Resources and Services Administration. Personal communication with R. Phillips, September 2, 2010
39. Centers for Medicare and Medicaid Services. Cost reports.
https://www.cms.gov/costreports/02_hospitalcostreport.asp. Accessed May 1, 2013
40. Accreditation Council for Graduate Medical Education. ACGME Program Requirements for Graduate Medical Education in Internal Medicine.
http://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramRequirements/140_internal_medicine_07012009.pdf. Accessed May 1, 2013
41. Geiger Gibson/RCHN Community Health Foundation Research Collaborative. Policy Research Brief No. 19: Strengthening Primary Care to Bend the Cost Curve: The Expansion of Community Health Centers Through Health Reform.
http://sphhs.gwu.edu/departments/healthpolicy/dhp_publications/pub_uploads/dhpPublication_895A7FC0-5056-9D20-3DDB8A6567031078.pdf. Accessed May 1, 2013
42. Freed GL, Nahra TA, Wheeler JR; Research Advisory Committee of the American Board of Pediatrics. Counting physicians: Inconsistencies in a commonly used source for workforce analysis. Acad Med. 2006;81:847–852
43. Grumbach K, Becker SH, Osborn EH, Bindman AB. The challenge of defining and counting generalist physicians: An analysis of Physician Masterfile data. Am J Public Health. 1995;85:1402–1407
44. Konrad TR, Slifkin RT, Stevens C, Miller J. Using the American Medical Association physician masterfile to measure physician supply in small towns. J Rural Health. 2000;16:162–167