The Association of American Medical Colleges (AAMC) recently published a draft set of 13 core entrustable professional activities (EPAs) for entering residency and encouraged medical schools to consider them in determining outcomes for graduating students.1 The AAMC has been soliciting and receiving feedback on the nature or descriptions of the particular EPAs that were chosen.2 Separate from the issue of whether the published EPAs are the right ones,2 many might question the theoretical and practical rationales for the use of EPAs in undergraduate medical education (UME). Reasons range from whether workplace activities are an appropriate framework for medical school outcomes to whether entrustment for unsupervised practice applies to students. We aim to provide context and present arguments for how we might think about EPAs for UME, both by considering and by expanding on the work of the AAMC.
The idea of an outcomes-based approach to curricular design and implementation as well as learner assessment has been proposed in medicine since the 1970s, and it has gained increasing attention since the 1990s.3 This international movement towards greater emphasis on learner outcomes is known more widely as competency-based medical education (CBME) and is compelling the delineation of clearer performance expectations for graduates of medical training.3,4 A variety of CBME frameworks have been adopted for graduate medical education (GME) in multiple countries including the United States, Canada, the United Kingdom, Sweden, Australia, and the Netherlands.5 Despite its growing adoption, significant controversies remain.3,4 One concern of educators is that the adopted CBME frameworks, such as the Accreditation Council for Graduate Medical Education core competency domains in the United States, do not fully capture or focus on the actual performance outcome of caring for patients.3–8 These critics argue that the parts, or the abilities within individual competency domains, do not add up to the whole of practice.4 Mastery of abilities in individual competency domains does not ensure the capability to integrate them across domains or to appropriately apply them to patient care. Also, the capability to provide patient care in one context or clinical circumstance may not necessarily translate to other contexts and circumstances. Lastly, the focus on objective assessment of measurable learner abilities may detract attention from the assessment of how learners actually care for their patients in a variety of clinical contexts.
Educators have argued that performance outcomes should be framed in the context of clinical care, recognizing that professional development requires the integration of abilities across multiple competency domains and application within the health care environment.3,6,8–10 The concept of EPAs, a relatively new CBME-related framework, was introduced as a potential solution to these concerns.9,11
EPAs operationalize medical education outcomes as essential professional activities that one entrusts a professional to perform.12 An example of such an activity is “care of complicated pregnancies.”5 Each EPA is a synthesis of multiple competency domains (e.g., medical knowledge, communication skills, and professionalism) and requires the integration of knowledge, skills, and attitudes.12 Whereas traditional competency frameworks focus on qualities of the person, EPAs focus on qualities of the work to be completed.5 EPAs therefore ground outcomes in the tasks of physicians and offer an approach to CBME that better addresses concerns around integration of competency domains and context than previous CBME frameworks. In the Netherlands, EPAs have been used successfully as a blueprint for obstetrics–gynecology GME and physician assistant training programs.5,13 EPAs for GME training in psychiatry have been implemented in Australia and New Zealand.14,15 Similar to other CBME frameworks, EPAs have primarily been applied to GME. The International CBME Collaborators have suggested that we work backwards from GME competency expectations to build necessary competency expectations for UME as well.16 The question arises as to whether working backwards is the right approach. To consider this question, we must ask an even more fundamental question: Can EPAs work as a competency framework for UME? We answer that question by addressing whether EPAs are appropriate for UME, what UME EPAs would look like, and what entrustment in UME means.
Are EPAs Appropriate for UME?
We believe the answer is yes; EPAs do have a place in and can be advantageous for UME. Our arguments for the suitability of EPAs will center on the continuity and developmental progression of learners, the generalizability and applicability across the continuum of the principles underlying EPAs, and the recognition and quality assurance of student work in the clinical workplace.
Continuity and developmental progression of learners
The end of medical school can be seen as the completion of a specific training period with its own competency expectations. In reality, medical school completion is just one point along the continuum of physician training.17 Medical school prepares and provides learners with generic knowledge and skills to support the continued development of more advanced and additional specialty-specific knowledge and skills in GME. Learners develop progressive proficiency along the continuum from UME to GME training. To reach the true potential of CBME, we need to think about operationalizing expectations across the entire continuum. Medical education curricula and learner expectations at each level should build progressively upon previous levels, ideally demonstrate spiral (i.e., iterative and increasing) development of concepts and skills, and be related parts of a comprehensive system.16 Application of the same competency framework in both UME and GME training would promote this type of vertical integration across the continuum and foster true CBME. For instance, the AAMC is working with four institutions to pilot a competency-based pediatrics training program that will span the UME/GME continuum and employ competency-based rather than time-based advancement. For this pilot, an important early step was to adopt one unifying competency framework that would span the continuum.18 The use of a unifying framework allows better alignment of educational activities and a consistent approach to achievement-based progression throughout the pilot UME/GME training program.
From a developmental perspective, the EPA approach can work well as a unifying competency framework for UME and GME. As previously described by ten Cate and colleagues,19 the entrustment decisions as operationalized in the EPA approach align with the Dreyfus and Dreyfus model for the development of expertise and with the developmental curves described in medical skill development. Using the stages of Dreyfus and Dreyfus, learners would begin as novices for most skills or activities and progress at individual rates during their professional training through the advanced beginner stage to reach the competent stage. Regardless of learner level, the point at which the learner reaches the competent stage for any given activity would correspond to the point at which the learner would be entrusted to perform that professional activity unsupervised.
Generalizability and applicability of EPA principles
As noted above, EPAs are essential professional or workplace activities that one entrusts a professional to perform.12 The key principles that underlie the EPA concept, workplace learning and trust, are generalizable to the continuum of physician training. Both apply to UME as well as GME.
Workplace learning, defined as experiential learning through participation in the workplace, is at the heart of clinical education.19,20 While workplace learning has been recognized as the crux of GME, we would argue that it is also essential for UME. Certainly it has a clear role in clerkship learning, so the use of EPAs there seems evident. One could argue that preclerkship learning is knowledge-focused classroom-based learning in which workplace learning and therefore EPAs (which are workplace activities) do not have a role. However, educators have called for, and medical schools have increasingly incorporated, early/preclerkship workplace-based clinical education to help students in their professional identity formation, provide exposure to aspects of patient and community health, and develop student–patient communication skills.21 Vertically integrated clinical curricula with early clinical experiences and increasing clinical responsibilities over time have been shown to improve clinical capabilities in graduates and their preparation for transition to residency.22 In addition, students in even the first year of medical school have demonstrated the ability to participate in and contribute to the clinical workplace when given the opportunity, clear roles, and adequate support.23
In the clinical workplace, trust is a key element of the supervision of learners. Here, clinical supervisors make decisions to invite learner participation or provide learner responsibilities for patient care based on their trust of the individual learner. This trust is a judgment grounded in multiple factors related to the supervisor, learner, supervisor–learner relationship, situational and workplace context, and activity to be performed.24 The factors that have been described, including learner factors such as competence/experience, attitudes, and insight into limitations, are generalizable to learners at all levels and applicable to different workplace environments.24,25 Similar to the entrustment decisions clinicians make about residents, clinical supervisors also make daily decisions about whether to trust individual students with specific activities.
Recognition and quality assurance of student work
Attention to student abilities framed around clinical workplace activities has several advantages. As noted above, EPAs can help clarify the nature of students’ early clinical engagement and increasing responsibilities over time. They also allow articulation of how students can contribute to the care of patients from the very beginning of medical school, and make these contributions, and the value they add to patient care, visible. This definition and recognition of student work can help educators align student output with student learning goals and motivation, institutional expectations, and societal needs. Explicit recognition of levels of student participation and clarity around activities that can be entrusted promote quality and safety in the clinical workplace. Such recognition can increase transparency for the public about how we are addressing our obligation to provide safe care and may even help teaching hospitals meet regulatory needs.
The Joint Commission International, which accredits hospitals, places attention squarely on student privileges—not just their achievement of competency expectations but whether they can be trusted to safely perform specific patient care activities. Examples of student privileges or activities recently introduced at the University Medical Center Utrecht include “providing non-therapeutic medical information to patients,” “requesting routine laboratory investigations,” and “placing urine catheters,” among many others.26 These smaller activities may eventually be clustered into EPAs. It has been suggested that digital badges encoded with just this type of information about the individual student can be accessed by others in the workplace (faculty, supervising residents, allied health professionals, etc.) to determine delegation of or student participation in patient care responsibilities.27 These EPAs also may well serve as reminders and assurances to the students, workplace community, medical centers, and the public that students can and do provide safe and value-added patient care. In addition, when stakeholders are able to ensure that the contributions made by students are safe and value-added, students may be allowed to assume greater responsibility and participate even more actively in the provision of patient care.
We acknowledge that a significant amount of UME learning is focused on knowledge and foundational skill-building and limited to the classroom, where EPAs do not have a direct role. However, the final expected outcomes of UME training can be captured by EPAs that require the achievement and contextual application of this foundational knowledge and these skills. Also, as early clinical experiences are increasingly introduced and vertically integrated into the later clinical curriculum, it becomes important to apply workplace-based assessments across all years of UME training. Just as the integration of a set of GME EPAs may reflect the professional activities of a subspecialty, so too can an integrated set of thoughtfully constructed UME EPAs reflect the professional activities of a medical school graduate. Care would need to be taken to ensure that the EPAs relate to the overarching goals of the medical school and have defined milestones to allow for assessment of the classroom-based learning that will support these workplace activities. We believe that EPAs may be an excellent key to the legitimate peripheral participation recommended by Lave and Wenger for early learners in a professional community of practice.28
What Would UME EPAs Look Like?
Because medical training is a continuum, logically, UME-level EPAs should align with GME-level EPAs. One approach would be to use the same or similar EPAs in UME as in GME. For instance, one could use the same EPA title but explicitly limit the scope of the EPA in its description for UME learners (e.g., limit activity to cooperative or medically stable patients). However, even with limitations in scope, the EPAs developed for GME are large units that combine complex activities and require higher-order skills11 (e.g., care of complicated pregnancies [obstetrics–gynecology],5 care for a well newborn [pediatrics],29 manage care of patients with chronic diseases [medicine],30 care for a patient with delirium [psychiatry]).15 With approximately 20 EPAs encompassing the competency expectations for a specialty,5,11,29,30 these EPAs are likely too high-level and too broad to be practically useful for assessment in UME.
We recognize a hierarchy in the organization of learning, such that more complex, higher-order skills or activities are built from simpler, subordinate skills or activities.31 Therefore, another approach would be to develop UME-specific EPAs that represent subsets of activities that will eventually integrate together and nest within broader EPAs to provide the foundation for GME-level activities. These would be more practical for implementation by targeting assessment at the expected UME level of development. Such EPAs could be defined on the basis of the list of graduation competency expectations or objectives most medical schools have. Alternatively, all beginning GME learners are entrusted with certain activities on day one of their training, such as gathering a history and performing a physical examination appropriate to the clinical situation. These can serve as a starting point for defining core EPAs for UME. This is the approach taken by the AAMC. Stakeholders recently convened a national committee to define EPAs for UME and have published a set of 13 draft core EPAs for entering residency.1 Together, these core EPAs represent the baseline activities required to support GME EPAs across all specialties. The AAMC notes that these are the very basic core EPAs. They do not address different expectations across individual institutions, nor are they meant to encompass specialty-specific graduation competency expectations of individual specialties.
In addition to general skills such as the 13 defined by the AAMC, different fields will hold slightly different competency expectations of medical school graduates. For instance, the expectations for a beginning surgical resident are generally different from those for a beginning psychiatry resident. Discussion about whether medical school should prepare graduates generically or in a more specialized manner is ongoing.2,17 At the moment, medical students graduate with core skills as well as early specialty-specific skills, mostly gained through electives in their final year of medical school.32 Even careful consideration of the traditional clerkship rotations may reveal overlap of core and specialty-specific knowledge and skill expectations. Thus, we should also define specialty-specific EPAs for UME that can serve as selective achievements for students preparing to enter specific GME programs of their choosing. These specialty-specific EPAs would link more directly to GME-level EPAs, and the level of entrustment that should be achieved would differ by student based on career path. These specialty-specific EPAs could guide student selection of senior year electives as well as help program directors ensure a baseline competency level of their entering residents. If operationalized properly, these specialty-specific EPAs could ease advising during the fourth year, ensure more adequately prepared entering residents, and obviate the need for extracurricular “boot camps”33–35 before or during residency. Lastly, in addition to the basic core EPAs mandatory for all students and specialty-specific EPAs mandatory for students preparing for specific GME programs, we could define optional EPAs that individual students could achieve on the basis of their capacities and interest. For instance, schools offering required or elective scholarly concentration programs could have EPAs related to each area of scholarly concentration36,37 (see Table 1).
Entrustment in UME
EPAs set forth definitions of clear workplace tasks for students that can allow students to assume greater responsibility and participate more actively in patient care, which can in turn increase their motivation to learn.20 Some students could potentially demonstrate readiness for practice of certain core or specialty-specific EPAs earlier than typically expected in the training continuum. To support safe escalation of student responsibilities, we need to very clearly and thoughtfully define the degrees of supervision for students. One consideration is whether the entrustment and supervision scale currently in use in GME can be applied to UME.
The GME entrustment and supervision scale uses five different levels of supervision to define the levels of entrustment, providing few levels of gradation for the beginning learner19 (see Table 2). As noted previously, medical students may never practice without supervision. Under the GME entrustment and supervision scale, students would only progress from level 1 (not allowed to practice EPA) to levels 2 (practice EPA under proactive/full supervision) and 3 (practice EPA under reactive/on-demand supervision) for most activities. Therefore, it may be helpful and more practical for UME to include additional levels resulting in more granular progression in the decrease in supervision. These additional levels would be particularly helpful if EPAs are to be operationalized for assessments along the entire trajectory of UME training. One possible consideration could be to develop a different entrustment and supervision scale for UME. However, if one advantage of using EPAs as a UME competency framework is the potential for continuity of UME and GME training, one would ideally prefer to use a single entrustment and supervision scale throughout the course of medical training.
We therefore recommend using the current entrustment and supervision scale but expanding the lower levels of the scale to include more gradations of supervision, allowing additional layers of progressive learner autonomy. For instance, full or active supervision can be subdivided into two levels. To start, the learner practices the activity in collaboration with the supervisor as a coactivity. Then as the learner advances, he or she performs the activity on his or her own with the supervisor in the room and ready to assist when needed. This distinction between the types of full supervision may be particularly useful for procedural skills. Similarly, practice under reactive or on-demand supervision could be further broken down into levels with the supervisor outside the room but physically nearby and immediately available or, for a more advanced learner, with the supervisor at a distance and readily available by phone. The AAMC in its description of expected level of achievement for its core EPAs for entering residency proposed a similar expansion of the definition of reactive/on-demand or indirect supervision.1 Within these levels of reactive supervision, we also recommend adding gradations in the amount of verification the supervisor performs on the learner’s work. For instance, to support graduated autonomy, the supervisor can initially check all completed work, then check just a sample of the completed work, and finally only review the learner’s report of the completed work2 (see Table 2). Naturally, full entrustment for unsupervised activities may never happen within UME for most tasks. However, the progression from supervisor presence in the room to trusting the student to ask for help only when needed is a significant milestone towards autonomy.
To conclude, EPAs can bring added value to UME. In contrast to GME, the United States has not implemented a comparable standard competency framework for medical student performance expectations. UME EPAs, such as those proposed by the AAMC, may help to focus UME assessment more directly on workplace activities as well as provide tangible ways to address other challenges in medical education. Adoption of the EPA framework in UME would allow alignment with the EPAs being developed in GME and provide a true continuum in medical training. EPAs can increase transparency in the workplace regarding student abilities and activities from the very beginning of medical training. We believe EPAs can be operationalized for UME if we develop UME-specific EPAs, as suggested by the AAMC. However, we should expand beyond the AAMC recommendations to include EPAs that represent specialty-specific and elective professional activities and further refine and expand the entrustment scale to include additional gradations of supervision. If operationalized appropriately, EPAs may prove to be a powerful way to assess students in the workplace and allow students to truly contribute to patient care while ensuring patient safety.
Acknowledgments: The authors wish to thank Drs. Patricia O’Sullivan, Arianne Teherani, and Gersten Jonker for their critical review of the manuscript.
1. Association of American Medical Colleges. Core Entrustable Professional Activities for Entering Residency (CEPAER): AAMC CEPAER Drafting Panel Report. 2014 Washington, DC Association of American Medical Colleges. https://www.mededportal.org/icollaborative/resource/887. Accessed September 30, 2014
2. ten Cate O. Trusting graduates to enter residency: What does it take? J Grad Med Educ. 2014;6:7–10
3. Frank JR, Snell LS, ten Cate O, et al. Competency-based medical education: Theory to practice. Med Teach. 2010;32:638–645
4. Lurie SJ. History and practice of competency-based assessment. Med Educ. 2012;46:49–57
5. ten Cate O, Scheele F. Competency-based postgraduate training: Can we bridge the gap between theory and clinical practice? Acad Med. 2007;82:542–547
6. Brooks MA. Medical education and the tyranny of competency. Perspect Biol Med. 2009;52:90–102
7. Iobst WF, Sherbino J, ten Cate O, et al. Competency-based medical education in postgraduate medical education. Med Teach. 2010;32:651–656
8. Whitcomb ME. Redirecting the assessment of clinical competence. Acad Med. 2007;82:527–528
9. ten Cate O. Competency-based education, entrustable professional activities, and the power of language. J Grad Med Educ. 2013;5:6–7
10. ten Cate O, Billett S. Competency-based medical education: Origins, perspectives and potentialities. Med Educ. 2014;48:325–332
11. Carraccio C, Burke AE. Beyond competencies and milestones: Adding meaning through context. J Grad Med Educ. 2010;2:419–422
12. Pangaro L, ten Cate O. Frameworks for learner assessment in medicine: AMEE [Association for Medical Education in Europe] guide no. 78. Med Teach. 2013;35:e1197–e1210
13. Mulder H, ten Cate O, Daalder R, Berkvens J. Building a competency-based workplace curriculum around entrustable professional activities: The case of physician assistant training. Med Teach. 2010;32:e453–e459
14. Boyce P, Spratt C, Davies M, McEvoy P. Using entrustable professional activities to guide curriculum development in psychiatry training. BMC Med Educ. 2011;11:96
16. Harris P, Snell L, Talbot M, Harden RM. Competency-based medical education: Implications for undergraduate programs. Med Teach. 2010;32:646–650
17. ten Cate O. What is a 21st century doctor? Rethinking the significance of the MD degree. Acad Med. 2014;89:966–969
18. Powell DE, Carraccio C, Aschenbrener CA. Pediatrics redesign project: A pilot implementing competency-based education across the continuum. Acad Med. 2011;86:e13
19. ten Cate O, Snell L, Carraccio C. Medical competence: The interplay between individual ability and the health care environment. Med Teach. 2010;32:669–675
20. Dornan T, Boshuizen H, King N, Scherpbier A. Experience-based learning: A model linking the processes and outcomes of medical students’ workplace learning. Med Educ. 2007;41:84–91
21. Yardley S, Littlewood S, Margolis SA, et al. What has changed in the evidence for early experience? Update of a BEME systematic review. Med Teach. 2010;32:740–746
22. Wijnen-Meijer M, ten Cate O, van der Schaaf M, Harendza S. Graduates from vertically integrated curricula. Clin Teach. 2013;10:155–159
23. Chen HC, Sheu L, O’Sullivan P, ten Cate O, Teherani A. Legitimate workplace roles and activities for early learners. Med Educ. 2014;48:136–145
24. Hauer KE, ten Cate O, Boscardin C, Irby DM, Iobst W, O’Sullivan PS. Understanding trust as an essential element of trainee supervision and learning in the workplace. Adv Health Sci Educ Theory Pract. 2014;19:435–456
25. Wijnen-Meijer M, van der Schaaf M, Nillesen K, Harendza S, ten Cate O. Essential facets of competence that enable trust in graduates: A delphi study among physician educators in the Netherlands. J Grad Med Educ. 2013;5:46–53
26. Handbook of Quality and Safety for Medical Students. 2013 Utrecht, the Netherlands: University Medical Center Utrecht
27. Mehta NB, Hull AL, Young JB, Stoller JK. Just imagine: New paradigms for medical education. Acad Med. 2013;88:1418–1423
28. Lave J, Wenger E. Situated Learning: Legitimate Peripheral Participation. 1991 Cambridge, UK Cambridge University Press
31. Swing SR; International CBME [Competency-Based Medical Education] Collaborators. Perspectives on competency-based medical education from the learning sciences. Med Teach. 2010;32:663–668
32. Lyss-Lerman P, Teherani A, Aagaard E, Loeser H, Cooke M, Harper GM. What training is needed in the fourth year of medical school? Views of residency program directors. Acad Med. 2009;84:823–829
33. Naylor RA, Hollett LA, Castellvi A, Valentine RJ, Scott DJ. Preparing medical students to enter surgery residencies. Am J Surg. 2010;199:105–109
34. Fernandez GL, Page DW, Coe NP, et al. Boot cAMP: Educational outcomes after 4 successive years of preparatory simulation-based training at onset of internship. J Surg Educ. 2012;69:242–248
35. Cohen ER, Barsuk JH, Moazed F, et al. Making July safer: Simulation-based mastery learning during intern boot cAMP. Acad Med. 2013;88:233–239
36. Green EP, Borkan JM, Pross SH, et al. Encouraging scholarship: Medical school programs to promote student inquiry beyond the traditional medical curriculum. Acad Med. 2010;85:409–418
37. Bierer SB, Chen HC. How to measure success: The impact of scholarly concentrations on students—a literature review. Acad Med. 2010;85:438–452