Internal Medicine Residency Program Directors’ Views of the Core Entrustable Professional Activities for Entering Residency: An Opportunity to Enhance Communication of Competency Along the Continuum

Angus, Steven V., MD; Vu, T. Robert, MD; Willett, Lisa L., MD, MACM; Call, Stephanie, MD, MSPH; Halvorsen, Andrew J., MS; Chaudhry, Saima, MD, MSHS

doi: 10.1097/ACM.0000000000001419
Research Reports

Purpose To examine internal medicine (IM) residency program directors’ (PDs’) perspectives on the Core Entrustable Professional Activities for Entering Residency (Core EPAs)—introduced into undergraduate medical education to further competency-based assessment—and on communicating competency-based information during transitions.

Method A spring 2015 Association of Program Directors in Internal Medicine survey asked PDs of U.S. IM residency programs for their perspectives on which Core EPAs new interns must or should possess on day 1, which are most essential, and which have the largest gap between expected and observed performance. PDs were also asked for their views and preferences regarding the communication of competency-based information at the transitions from medical school to residency and from residency to fellowship/employment.

Results The response rate was 57% (204/361 programs). The majority of PDs felt new interns must/should possess 12 of the 13 Core EPAs. PDs’ rankings of Core EPAs by relative importance were more varied than their rankings by the largest gaps in performance. Although preferred timing varied, most PDs (82%) considered it important for medical schools to communicate Core EPA-based information to PDs; nearly three-quarters (71%) would prefer a checklist format. Many (60%) would be willing to provide competency-based evaluations to fellowship directors/employers. Most (> 80%) agreed that there should be a bidirectional communication mechanism for programs/employers to provide feedback on competency assessments.

Conclusions The gaps identified in Core EPA performance may help guide medical schools’ curricular and assessment tool design. Sharing competency-based information at transitions along the medical education continuum could help ensure production of competent, practice-ready physicians.

S.V. Angus is internal medicine residency program director and vice chair of education, Department of Medicine, University of Connecticut School of Medicine, Farmington, Connecticut.

T.R. Vu is associate professor of clinical medicine and associate director of internal medicine clerkships, Department of Medicine, Indiana University School of Medicine, Indianapolis, Indiana.

L.L. Willett is internal medicine residency program director and vice chair of education, Department of Medicine, University of Alabama, Birmingham, Birmingham, Alabama.

S. Call is associate chair for education and program director, Internal Medicine Training Program, Department of Internal Medicine, Virginia Commonwealth University, Richmond, Virginia.

A.J. Halvorsen is project and data manager, Office of Educational Innovations, Internal Medicine Residency Program, Mayo Clinic, Rochester, Minnesota.

S. Chaudhry is vice president for academic affairs and chief academic officer, Memorial Healthcare System, Fort Lauderdale, Florida.

Supplemental digital content for this article is available at http://links.lww.com/ACADMED/A394.

Funding/Support: None reported.

Other disclosures: None reported.

Ethical approval: This study was approved by the Mayo Clinic institutional review board.

Correspondence should be addressed to Steven V. Angus, Department of Medicine, University of Connecticut School of Medicine, 263 Farmington Ave., Farmington, CT 06030; e-mail: angus@uchc.edu.

Medical educators are grappling with ways to advance competency-based education (CBE). Milestones and entrustable professional activities (EPAs) are two frameworks that operationalize CBE.1–4 While these frameworks are supported in both graduate medical education (GME) and undergraduate medical education (UME), only GME programs are required to report CBE outcomes, via milestones data submitted to the Accreditation Council for Graduate Medical Education (ACGME).1 In 2012, the Alliance for Academic Internal Medicine (AAIM) Educational Redesign Committee published a list of 16 proposed end-of-training EPAs, comprising the core of the profession, that graduating internal medicine (IM) residents must be able to perform independently to enter unsupervised practice.5 This introduction of end-of-training EPAs for GME has sparked interest in developing similar EPAs for UME.6

In 2013, the Association of American Medical Colleges (AAMC) convened a drafting panel to identify a “common core set of behaviors that might be expected of all graduates” that could drive curriculum development, aid in assessing competency using the EPA framework, and better define the path to entrustment.7 The panel’s work resulted in the definition of 13 Core Entrustable Professional Activities for Entering Residency (Core EPAs)—similar to AAIM’s end-of-training EPAs for graduating residents—that all medical school graduates on day 1 of residency training, regardless of specialty choice, should be expected to perform without direct supervision (i.e., with indirect supervision).7 Direct supervision means the supervising physician is physically present with the patient; indirect supervision means the physician is nearby and available to the resident by telephone or electronic modality (e.g., text message, e-mail).8

Although the Core EPAs are not mandated for reporting purposes, they provide an opportunity for UME programs to communicate information about the competency of medical school graduates to residency programs. In previous studies, IM residency program directors (PDs) have described their expectations of new interns9 and reported a perceived lack of preparation among some medical school graduates.10,11 A secondary goal of the Core EPAs initiative is to “demonstrate improvement in the gap between performance and expectations for students entering residency who have been entrusted on the Core EPAs.”7

PDs are involved in transitions between phases of the medical education continuum, as graduating students move from UME into GME and as graduating residents move from GME into fellowship or employment and continuing medical education. Thus, they have a unique perspective on how to enhance communication across the continuum. At the UME-to-GME transition, PDs receive graduates from multiple medical schools, each with differing curricula, graduation requirements, and methods of communicating about individual students’ competency. By creating a core set of expectations for graduating medical students, the Core EPAs could standardize the information that schools transmit to PDs. Such standardized, competency-based handoffs from medical schools to residency programs would be akin to the new (2016) ACGME requirement that PDs transmit milestones-based evaluations of each graduating resident to the fellowship program where that resident matched.8 Both underscore the importance of communicating competency-based information at transitions along the continuum so that no learner is functioning beyond his or her capabilities.

However, the UME perspective regarding the skills expected of graduating medical students may not be congruent with the GME perspective. Given PDs’ role in receiving medical school graduates, now “day 1 interns,” their expectations can inform discussions about how to use the Core EPAs most effectively. As we found no published studies exploring IM PDs’ perspectives on the Core EPAs, we sought to determine their views on how essential each Core EPA is for new interns to perform without direct supervision, as well as how and when Core EPA information may be best transmitted and used at the transition from UME to GME. Because PDs are also involved in transitioning their graduating residents into subspecialty fellowship training or independent practice, we also sought their perspectives on feeding forward competency-based information as their graduates move along the continuum.

Method

The Association of Program Directors in Internal Medicine (APDIM) Survey and Scholarship Committee surveys its members to obtain feedback on important issues. The spring 2015 survey was sent via SurveyMonkey (Palo Alto, California) to the PDs of 361 member programs, representing 95% of the 382 ACGME-accredited IM residency programs in the United States. This 18-question survey sought PDs’ opinions of the 13 Core EPAs—specifically, how essential each is for day 1 interns; which Core EPAs show gaps between expected and observed performance; whether, how, and when Core EPA-based information about medical students should be shared with PDs; and whether having this information would change how new interns are supervised. To quantify the relative importance of the Core EPAs, a series of questions asked PDs how essential they considered each for new interns to possess on day 1 without direct supervision and then to rank-order the 3 Core EPAs they felt were most essential. PDs were also asked to rank-order the 3 Core EPAs for which they had observed the largest gaps between the expected and observed performance of new interns.

Additional questions focused on PDs’ perspectives on communication at the transition from IM residency to fellowship or employment—specifically, whether competency-based information on graduating residents should be communicated to fellowship directors and employers using AAIM’s 16 end-of-training EPAs (analogous to medical schools transmitting Core EPA-based information to GME PDs), and when EPA-based evaluations should be provided. PDs were also asked whether there should be a way for them to provide feedback to medical schools, and for fellowship directors/employers to provide feedback to them, on the level of agreement between assessments of competency.

The survey items were developed by subcommittees of APDIM and the Clerkship Directors in Internal Medicine to provide GME and UME perspectives, respectively. Survey items were pilot tested for content and response process validity by PDs on the APDIM Survey and Scholarship Committee. All of the authors participated in the survey development process, and most served on the Survey and Scholarship Committee (S.V.A., L.L.W., S.C., A.J.H., S.C.).

The survey was conducted between March 2 and May 18, 2015. Three e-mail reminders were sent to PDs, encouraging participation. One response was permitted per program. AAIM (the parent organization of the APDIM) provided us with the study data set, which was merged with data sets for publicly available program characteristics and then deidentified for our review.

Descriptive results for survey items were summarized using frequencies and percentages of all responding programs. To assess the representativeness of the sample, demographics were compared between responders and nonresponders across publicly available variables using Fisher’s exact test or Welch’s t test, as appropriate. The significance level for these comparisons was set at .05.
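
For illustration, the short sketch below shows one way such a responder-versus-nonresponder comparison could be run in Python with pandas and SciPy: Fisher’s exact test for a categorical program characteristic and Welch’s t test for a continuous one, at the .05 level used in this study. The data frame, column names, and values are hypothetical placeholders, not the actual APDIM data set.

```python
# Hypothetical sketch of a responder vs. nonresponder comparison;
# the data and column names are illustrative, not the study data set.
import pandas as pd
from scipy import stats

programs = pd.DataFrame({
    "responded":    [True, True, False, True, False, True, False, True],
    "program_type": ["University", "Community", "University", "University",
                     "Community", "University", "Community", "University"],
    "approved_positions": [90, 45, 60, 120, 30, 75, 40, 105],
})

# Categorical characteristic: Fisher's exact test on the 2 x 2 table
table = pd.crosstab(programs["responded"], programs["program_type"])
odds_ratio, p_categorical = stats.fisher_exact(table)

# Continuous characteristic: Welch's t test (unequal variances)
responders = programs.loc[programs["responded"], "approved_positions"]
nonresponders = programs.loc[~programs["responded"], "approved_positions"]
t_stat, p_continuous = stats.ttest_ind(responders, nonresponders, equal_var=False)

alpha = 0.05  # significance level reported in the Method section
print(f"Fisher's exact p = {p_categorical:.3f}; Welch's t p = {p_continuous:.3f}; alpha = {alpha}")
```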

This study was approved by the Mayo Clinic institutional review board. The survey instrument is available as Supplemental Digital Appendix 1 at http://links.lww.com/ACADMED/A394.

Results

The survey response rate was 57% (204/361 IM programs). Table 1 provides a comparison of responder and nonresponder programs across the publicly available characteristics.

Table 1

The majority of responding IM PDs felt that new interns must or should possess the ability to perform 12 of the 13 Core EPAs without direct supervision on day 1 of their internship. There was only 1 Core EPA, “perform the general procedures of a physician,” that the majority of responding IM PDs felt was not necessary (see Figure 1).

Figure 1

Relative importance of the Core EPAs

There was a wide range of responses in the PDs’ rank-ordering of the three most essential Core EPAs (see Figure 2). “Gather a history and perform a physical examination” was identified by 97% (n = 197) of PDs as one of the three most essential Core EPAs, followed by “provide an oral presentation/summary of a patient encounter” (61%; n = 124) and “develop a prioritized differential diagnosis and select a working diagnosis following a patient encounter” (39%; n = 80).

Figure 2

Core EPAs with the largest gaps between expected and observed performance

The frequency with which PDs identified a Core EPA as having one of the three largest gaps between expected and observed performance is shown in Figure 2. Here the range of responses was narrower, with perceived gaps in the Core EPAs clustering together. “Develop a prioritized differential diagnosis and select a working diagnosis following a patient encounter” was identified by the most respondents (41%; n = 84) as having one of the three largest gaps, followed by “provide an oral presentation/summary of a patient encounter” (33%; n = 67). Three additional Core EPAs were identified by 30% (n = 61) of respondents as having one of the three largest gaps.

Communication at transitions along the medical education continuum

The majority of PDs indicated that communication of competency-based information at the transition from UME to GME was important: Eighty-two percent (n = 167) agreed or strongly agreed with the statement that “once EPAs are fully integrated into medical schools’ curricula and assessment, I would expect medical schools to send information about a student’s performance on each EPA as part of an educational handoff.…” Additionally, 77% (n = 158) indicated that it would be somewhat or very important for medical schools to describe their assessment methodology for determining entrustment. Although PDs expected UME programs to provide this information, their planned use of it varied. Only one respondent (0.5%) would allow new interns to practice under indirect supervision immediately if Core EPA information were provided, whereas 29% (n = 59) would require a limited number of direct observations before allowing interns to practice under indirect supervision, 31% (n = 63) would correlate medical schools’ assessments with their own intern orientation assessments, and 37% (n = 75) would not change how they supervise new interns, requiring direct supervision for all.

Table 2 reports PDs’ preferred timing for communicating competency-based information at the transitions from medical school to residency and from residency to fellowship/employment. As the table shows, PDs would prefer to receive such information sooner during transitions than they would prefer to send it. About half of PDs (49%; n = 100) indicated that the best time for medical schools to provide Core EPA information would be as part of the Electronic Residency Application Service (ERAS) application, whereas 20% (n = 40) thought it should be provided upon graduation. In contrast, 51% (n = 105) indicated that the most appropriate time to provide an EPA-based evaluation to fellowship directors/employers would be at or after graduation, while 27% (n = 55) thought it should be included in the ERAS fellowship application or the employment application.

Table 2

Most PDs felt that communication during transitions should be bidirectional. Eighty-four percent (n = 172) agreed or strongly agreed that a mechanism should be in place for PDs to provide feedback to medical schools on the level of agreement between a school’s and a residency program’s assessments of competency. Similarly, 81% (n = 165) agreed or strongly agreed that a mechanism should be in place for fellowship directors/employers to provide feedback to PDs on the level of agreement between the residency program’s and the fellowship program’s/employer’s assessments of competency.

Table 3 shows PDs’ preferred methods for medical schools to communicate information to them about an individual’s achieved level of competency. Nearly three-quarters of PDs (71%; n = 145) indicated that they would prefer an EPA checklist, either linked to the ACGME milestones or as a stand-alone document. Regarding communication about competency at the transition from residency to fellowship/employment, 60% (n = 123) of PDs agreed or strongly agreed that they would be willing to send fellowship directors/employers an evaluation based on the AAIM’s 16 end-of-training IM EPAs (analogous to the AAMC’s Core EPAs) for each resident; 19% (n = 38) disagreed or strongly disagreed.

Table 3

Discussion

Our work highlights IM PDs’ views on the importance of communication about competency at transition points along the medical education continuum and on the use of EPAs to transmit competency-based information on medical school and residency graduates as they progress along the continuum. Our study is timely, as concerns about reworking, splitting, or rearranging the Core EPAs have emerged recently.12 PDs, who accept graduates from multiple medical schools, are uniquely positioned to comment on graduating medical students’ skill sets and readiness for internship. The IM PDs who responded to our survey agreed that graduating medical students should or must be competent in most of the Core EPAs. They considered several Core EPAs to be less essential than others for new interns to perform under indirect supervision on day 1, however, including performing general procedures of a physician, identifying systems failures, and obtaining informed consent. Interestingly, these three Core EPAs were identified in a 2014 survey of 503 PDs across specialties13 as those for which graduates of Liaison Committee on Medical Education–accredited medical schools were least prepared.

Our study also identified the Core EPAs for which PDs perceived the largest gaps between the level of performance they expected of new interns and the level they observed. The frequencies with which PDs identified Core EPAs as among those with the largest gaps were closely clustered; that is, PDs identified performance gaps across the Core EPAs at roughly similar rates. Identifying the gaps between expected and observed performance may help inform medical schools’ curricula and assessment tools, leading to increased preparedness of all medical school graduates and improved communication of this readiness to GME programs.

As noted above, our study highlights PDs’ perspectives on the importance of communication during transitions. PDs responded favorably to accepting communication of readiness from medical schools, but they had mixed views about when the information should be shared. Almost half felt the best time to receive information about a student’s Core EPA performance would be as part of the ERAS application. Although receiving this information as part of the ERAS application would perhaps allow for selecting “better-prepared” applicants, this timing may be impractical. The AAMC created the Core EPAs as a standard for graduating students.7 Many of the activities that allow students to be deemed entrustable may not be observed until the fourth year of medical school, when students take on a more independent role and assume the responsibilities of an acting intern. Recent publications14,15 describe how the fourth year of medical school in general, and the IM subinternship in particular, can provide medical students with realistic preparation for internship. Thus, the fourth year specifically can be used to assess the competency of students, helping achieve the overarching goals of the Core EPA initiative.14,15 Because the ERAS application is submitted early in the fourth year, schools may not have time by then to fully assess EPAs for all graduating students. A recently published study, though, demonstrated the feasibility of providing a post-Match competency-based medical student performance evaluation (MSPE) to matching residency programs; such information was judged to be useful for all incoming interns.16 Yet, despite wanting Core EPA information, it is unclear if and how PDs would use it. Many PDs would not supervise interns less and would still require them to perform all activities under direct supervision until the residency program determined that they could be entrusted. We hypothesize that this would likely change over time, with more experience in transmitting and receiving standardized competency-based information from medical schools, and at different rates for specific Core EPAs.

Whereas most IM PDs wanted to receive competency-based information from medical schools, fewer felt that they should feed forward such data on their graduates as part of the transition from residency to fellowship/employment. This may be explained by our survey methodology: We asked PDs specifically about the AAIM’s end-of-training IM EPAs because these were most analogous to the AAMC’s Core EPAs. However, end-of-training EPAs may not be a focus for PDs and may not currently be assessed. Our results might have been different if we had asked about transmission of competency-based information in general, given that core residency programs are required to submit milestones data to the ACGME on all trainees and to transmit a resident’s milestones data to the fellowship program where that resident matches.8 Another potential explanation may be related to the dual role of the PD. PDs act as advocates for their residents to help them secure the best fellowships possible, and the PD letter is an important factor in the fellowship application process.17 These letters typically highlight an individual resident’s strengths without reporting his or her weaknesses (analogous to the MSPEs, or dean’s letters, that PDs receive from medical schools). Providing competency-based evaluations for a specific resident might affect that resident’s likelihood of matching into a fellowship. As the fellowship Match rate is an important recruitment factor for residency programs, transmitting this information may create a conflict of interest for PDs.

This consideration is supported at least at face value by our finding that only 27% (n = 55) of PDs felt that competency-based information should be shared as part of the ERAS fellowship application or the employment application; many more (51%; n = 105) felt it should be communicated to fellowship directors/employers at or after graduation. We hypothesize that a similar conflict may arise in medical schools if the descriptive narrative in the MSPE is replaced or supplemented by a standardized assessment using the Core EPAs.

PDs also indicated that there should be a mechanism to provide feedback on how a learner’s competency was assessed when entering a residency program or fellowship as compared with how that learner’s competency was assessed when graduating from medical school or residency, respectively. This open, bidirectional communication around standardized measures of competency could serve to strengthen collaboration between UME and GME programs to the benefit of our learners and patients. Knowing a particular learner’s areas of strength and weakness would allow for individualization of the learning plan, as is already required by some specialties, and assurance that learners’ needs are not overlooked during the transitions along the educational continuum.18,19

Limitations

This study has several limitations. First, we assumed that PDs were knowledgeable about the AAMC’s Core EPAs. Although a definition and brief explanation of the Core EPAs were provided at the beginning of our survey instrument, it is unclear how well versed all PDs were when completing the survey. Responses might have been different if respondents had been given the full description of each Core EPA rather than its title alone. Second, we obtained the viewpoint of only IM PDs. IM is the largest of the ACGME specialties, but it will be important to assess the viewpoints of stakeholders from other specialties because the Core EPAs were developed for the undifferentiated medical school graduate. Third, large university-based and -affiliated programs appear to be overrepresented in our sample (see Table 1); however, an analysis of how PDs of different program types categorized the 13 Core EPAs found no significant differences in which Core EPAs PDs thought new interns must, should, or did not need to possess (data not shown).

Conclusions

This study provides insight into IM PDs’ views on two emerging topics in medical education: the use of Core EPAs as a framework to ensure graduating medical students’ readiness for internship, and the importance of standardized communication of competency across transition points along the educational continuum. More thought and study should be given to sharing CBE outcomes at transition points along the medical education continuum. To the best of our knowledge, our study is the first to ask if such information exchange would be welcomed and used by IM PDs, a key stakeholder group of medical educators. It is encouraging that IM PDs are interested in receiving and providing such information across the continuum. In pediatrics, EPAs are being used to bridge the educational continuum20; further studies are needed to see whether this desire to bridge is generalizable across other specialties and to determine the best framework to operationalize communication of competency-based assessments.

We hope this study can guide medical schools toward designing curricula and assessment tools that close the gaps identified between the expected and observed performance of the Core EPAs. Doing so will help fulfill the goal of the Core EPA project and advance the conversation toward creating a true educational continuum that benefits our learners and patients. While the concept of bridging the educational continuum is in its infancy, the sharing of competency-based information in EPA or milestones format at times of transition from UME to GME and from GME to fellowship/employment could help ensure that our medical education system is producing practice-ready physicians who are competent to meet the needs of the patients they will serve.

Acknowledgments: The authors are grateful for the support of the Association of Program Directors in Internal Medicine (APDIM), members of the APDIM Survey and Scholarship Committee, and the residency program directors who completed this survey.

References

1. Accreditation Council for Graduate Medical Education. Milestones. https://www.acgme.org/acgmeweb/tabid/430/ProgramandInstitutionalAccreditation/NextAccreditationSystem/Milestones.aspx. Accessed July 21, 2016.
2. ten Cate O. Entrustability of professional activities and competency-based training. Med Educ. 2005;39:1176–1177.
3. Ten Cate O. Nuts and bolts of entrustable professional activities. J Grad Med Educ. 2013;5:157–158.
4. Hauer KE, Kohlwes J, Cornett P, et al. Identifying entrustable professional activities in internal medicine training. J Grad Med Educ. 2013;5:54–59.
5. Alliance for Academic Internal Medicine. Internal medicine end of training EPAs. http://www.im.org/p/cm/ld/fid=639. Accessed July 21, 2016.
6. Chen HC, van den Broek WE, ten Cate O. The case for use of entrustable professional activities in undergraduate medical education. Acad Med. 2015;90:431–436.
7. Association of American Medical Colleges. Core entrustable professional activities for entering residency. https://www.mededportal.org/icollaborative/resource/887. Updated May 28, 2014. Accessed July 21, 2016.
8. Accreditation Council for Graduate Medical Education. ACGME common program requirements. Effective July 1, 2016. https://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramRequirements/CPRs_07012016.pdf. Accessed July 21, 2016.
9. Angus S, Vu TR, Halvorsen AJ, et al. What skills should new internal medicine interns have in July? A national survey of internal medicine residency program directors. Acad Med. 2014;89:432–435.
10. Lypson ML, Frohna JG, Gruppen LD, Woolliscroft JO. Assessing residents’ competencies at baseline: Identifying the gaps. Acad Med. 2004;79:564–570.
11. Wagner D, Lypson ML. Centralized assessment in graduate medical education: Cents and sensibilities. J Grad Med Educ. 2009;1:21–27.
12. Ten Cate O. Trusting graduates to enter residency: What does it take? J Grad Med Educ. 2014;6:7–10.
13. Association of American Medical Colleges. The Core Entrustable Professional Activities for Entering Residency: Spring update 2015. https://www.aamc.org/download/427456/data/spring2015updatepptpdf.pdf. Accessed July 21, 2016.
14. Santen SA, Seidelman JL, Miller CS, et al. Milestones for internal medicine sub-interns. Am J Med. 2015;128:790–798.e2.
15. Vu TR, Angus SV, Aronowitz PB, et al; CDIM–APDIM Committee on Transitions to Internship (CACTI) Group. The internal medicine subinternship—Now more important than ever: A joint CDIM–APDIM position paper. J Gen Intern Med. 2015;30:1369–1375.
16. Sozener CB, Lypson ML, House JB, et al. Reporting achievement of medical student milestones to residency program directors: An educational handover. Acad Med. 2016;91:676–684.
17. Mikhail S, Bernstein P. Selection criteria for fellowships: Are we all on the same page? Acad Intern Med Insight. 2007;5(1):1–10.
18. Accreditation Council for Graduate Medical Education. ACGME program requirements for graduate medical education in pediatrics. Effective July 1, 2015. https://www.acgme.org/Portals/0/PFAssets/ProgramRequirements/320_pediatrics_07012015.pdf. Accessed July 21, 2016.
19. Guevara M, Grewald Y, Hutchinson K, Amoateng-Adjepong Y, Manthous C. Individualized education plans in medical education. Conn Med. 2011;75:537–540.
20. Carraccio C, Englander R, Gilhooly J, et al. Building a framework of entrustable professional activities, supported by competencies and milestones, to bridge the educational continuum. Acad Med. 2017;92:324–330.

© 2017 by the Association of American Medical Colleges