Calls for medical education reform advocate the use of curricular designs that integrate the teaching of didactic knowledge with experiential learning.1 Medical schools strive to provide this integrated approach by embedding in the preclerkship curriculum a variety of early clinical experiences, such as preceptorship placements in a hospital or ambulatory clinic setting.2,3 Ideally, these preceptorships allow students to learn clinical and reasoning skills by interacting with patients in the clinical workplace.4–6 The richness of learning in the workplace, however, depends on the degree of student engagement in workplace activities.7 Yet preceptors, who frequently are volunteer faculty, have difficulty identifying activities appropriate for students in their clinical practices. Similarly, students are unclear about their role in the workplace. Consequently, preceptorships often consist mostly of shadowing experiences rather than the desired hands-on clinical experiences.8–11
Entrustable professional activities (EPAs) offer a potential solution to this problem by identifying the work activities that preclerkship students can be expected to perform. EPAs are aligned with workplace learning, and they place competencies in the context of practice.12 Here, multiple competencies can be evaluated through the lens of the specific activities a physician performs.13 By defining the educational goals and objectives of early clinical experiences in the language of EPAs, we can provide students and preceptors with explicit guidance regarding students’ roles. Using this model, we can clarify what developmentally appropriate activities students can be expected to perform and how they may be effectively integrated into the preceptor’s clinical workplace.
Although EPAs primarily have been applied to graduate medical education (GME), we have argued that they also are applicable to undergraduate medical education (UME).14 Also, EPAs define expectations specifically at transition points in medical training—at entry into residency, fellowship, and/or practice.13,15–19 Entry into clerkship is just an earlier transition point. We therefore developed EPAs to clarify competency expectations for students entering clerkship.
To develop these EPAs, we first determined a level of supervision or entrustment with which to anchor the EPAs to ensure consistency. The published supervision scale for EPAs includes five levels of supervision/entrustment.12 EPAs for entry into practice are defined at level 4: The learner may practice unsupervised. Because students are not expected to perform clinical activities without supervision, we designed our EPAs at entrustment level 3: The learner may practice under reactive supervision. For some activities, this means that a student may complete the activity alone in the room with the patient, while the supervisor is outside the room but immediately available.
Approaches to EPA development have commonly included the use of the Delphi or nominal group process.15,16,20–25 Both use experts to define what the EPAs should be, leading to a list of EPA titles. However, the real essence of an EPA is in its fully elaborated description, which includes a detailed explanation containing limitations of the activity; expected knowledge, skills, and attitudes; associated competencies; and assessment information.26,27 Although the tasks described in an EPA title may seem self-evident and require little explanation for a learner transitioning to practice, novices to the workplace, and those working with them, need explicit details to understand the specifications and limitations of the activities they are being asked to perform. This detailed information is needed to implement EPAs for performance assessment.
The purpose of this article is to report the novel methodology we used to develop fully elaborated descriptions of EPAs for entry into clerkship. Such a description includes seven parts: (1) EPA title; (2) specifications and limitations; (3) required knowledge, skills, and attitudes; (4) link to an existing competency framework; (5) information sources to assess progress; (6) methods to arrive at the entrustment decision; and (7) conditions and implications for entrustment.26,27 We also emphasize the steps we took during development to ensure content validity.
In the sections that follow, we describe the methods and results of our multistep process, providing the rationale, data sources, and data analysis for each phase (see Figure 1). The entire process took approximately two years, and all phases were approved by the University of California, San Francisco (UCSF) institutional review board.
Phase I: Identification of EPA Content Domains
The identification of EPA content domains should be based on demonstrated student capabilities. Thus, to determine what professional activities preclerkship students are capable of, and what activities could feasibly be incorporated into the curriculum, we started with a study of preclerkship students’ roles and activities in UCSF’s student-run clinics (SRCs). We found that students performed direct patient care activities, such as history taking and physical examinations, patient education, encounter documentation, and minor procedures (e.g., phlebotomy, vaccinations).28 They also helped patients access local health programs and performed quality improvement activities.28 We confirmed this information via preceptor interviews and student focus groups.
Between November 2012 and June 2013, we conducted 20- to 45-minute structured telephone interviews with preceptors in UCSF’s preclerkship clinical skills course. We selected preceptors with demonstrated ability to incorporate students into workplace activities. Seventy-one of 363 preceptors met these criteria by having consistently high student ratings (over three academic years, 2009–2012) for their ability to provide learners with history taking and physical examination opportunities and for their teaching effectiveness. To capture student activities across various clinical workplaces, we used purposive sampling to recruit 40 of the 71 preceptors who worked in outpatient clinics, emergency departments, and inpatient wards; who worked in general or subspecialty practices; and who cared for adult or pediatric patients. We asked the preceptors about the patient care activities they have students perform and their ability to provide students with clinical skills practice opportunities.
We invited all second-year (MS2) and third-year (MS3) UCSF students to participate in focus groups, and included the first students who responded from each class. We conducted two 60-minute focus groups, one with MS2 students who had recently completed their first preclerkship year and one with MS3 students who were currently completing their clerkships. We asked them to reflect on what they found most valuable in their preceptorship experiences, their ability to practice various clinical skills, and whether they could have engaged in additional learning or patient care activities.
All interviews and focus groups were audio-recorded, transcribed, and deidentified. One author (H.C.C.) used the constant comparative method with open coding to analyze the transcripts for types of clinical workplace activities.29,30 Two authors (H.C.C., M.M.) triangulated these activities with those described in the SRC study.28
Twenty-two of the 40 preceptors responded, and 19 were interviewed. Eight students participated in the MS2 focus group and three in the MS3 focus group. Using the clinical workplace activities described in the SRC study, the preceptor interviews, and the student focus groups, we identified five initial EPA content domains: (A) information gathering, (B) information sharing with providers, (C) information sharing with patients, (D) patient advocacy and quality improvement, and (E) information management for lifelong learning.
Phase II: EPA Content Domain Mapping and Confirmation
To ensure the relevance and adequacy of each EPA content domain, we mapped them to existing competency frameworks, as ten Cate and Young31 recommended for establishing credibility with stakeholders and providing a framework for observation and assessment.
We performed this mapping between May and November 2013. First, we mapped the EPA content domains to UCSF’s preceptorship objectives and preclerkship clinical skills course competencies. Then we mapped the domains to our institution’s graduation competencies and milestones. Finally, we mapped the domains to the GME EPAs from medicine18 and pediatrics17 and to the Association of American Medical Colleges (AAMC) Core Entrustable Professional Activities for Entering Residency (CEPAER)15 when they became available in November 2013. Two authors (H.C.C., M.M.) independently completed each step of this mapping and met to discuss and reconcile differences.
We were able to link all existing curricular objectives and competencies to the five EPA content domains, with one exception: “facilitate learning by giving, receiving, and applying feedback.” This competency was important but not specific to, and therefore not linked to, any EPA content domain. We also were able to map the five EPA content domains to 11 of the 16 medicine EPAs,18 8 of the 16 draft pediatrics EPAs,17 and 10 of the 13 AAMC CEPAER.15 Because our domains were narrower in scope, representing foundational and smaller units of activity than the GME EPAs, some of the domains mapped to multiple GME EPAs. Our mapping results confirmed the relevance of our EPA content domains to current and future expectations of preclerkship students.
Phase III: EPA Content Description and Expert Consultation
To be used in practice, the content of each EPA must be expanded beyond its content domain to include a detailed delineation of the expected observable behaviors and the context for those behaviors. To achieve this step, we developed comprehensive seven-part descriptions for each EPA using ten Cate’s published guidelines27 and the assistance of an EPA expert (O.t.C.).
From July to December 2013, using level 3 supervision (practice under reactive supervision) as a guide and the results of our curricular mapping, we developed titles for each EPA content domain and delineated the scope of each EPA by specifying the parameters or conditions limiting each activity. For instance, for the activity of gathering information from the history and physical examination of a patient, we limited the activity to types of patients appropriate for a preclerkship student (e.g., those who are medically stable). We also identified the knowledge, skills, and attitudes needed for the successful completion of each EPA. We then used the preclerkship course objectives to identify where in the curriculum students would learn, for instance, the required foundational science knowledge to support these clinical activities. Finally, we identified the sources of information for determining students’ progress, the conditions and methods for granting level 3 entrustment, and the implications for students once they have been granted level 3 entrustment.
As we developed these seven-part EPA descriptions,27 we received guidance and iterative feedback from our EPA expert (O.t.C.) regarding the structure, clarity, and adequacy of the descriptions. We also sought and received feedback on the clarity of our EPA descriptions from health professions educators working with our EPA expert at the University Medical Center Utrecht.
We developed fully elaborated descriptions for each EPA (see Appendix 1 for an example). Upon review, we revised the EPA domains to improve their suitability for preclerkship students (see Chart 1). For example, we separated generating a differential diagnosis and plan from information sharing to create a separate EPA. Doing so makes explicit an important activity that naïve preclerkship students may not recognize and emphasizes it as an entrustable contribution to patient care that can serve as a prerequisite for more advanced patient care activities. We also merged the last two EPAs (D and E) into one. For preclerkship students, the practice of information management is most often observable when applied to researching resources for patients or the health care team, where it is also a patient advocacy and quality improvement behavior.
Phase IV: Assurance of Appropriate EPA Content
The standards for content-related validity set out in the Standards for Educational and Psychological Testing state that the process for specifying assessment content should be described and justified according to the intended population and the construct the assessment is intended to measure.32 To ensure the appropriateness of each EPA, we engaged internal and external subject matter experts to assess the specifications/limitations of each activity for the correct level of complexity and alignment with expected student competencies. We opted to use focused workshop discussions to elicit in-depth feedback from diverse stakeholders in medical education. These workshops allowed us to explain the intended use of the EPAs and expected supervision/entrustment level, explore perspectives, understand concerns, and work collaboratively with workshop participants to refine content.
From January to April 2014, we held four content validation workshops: two local, one national, and one international. The first two workshops were held at UCSF; one with preclerkship clinical skills course directors and the other with the clerkship curriculum committee. The third workshop was held at the 2014 annual meeting of the AAMC Western Group on Educational Affairs. The last workshop was held at the 2014 Ottawa Conference, a biennial international medical education conference focused on assessment. See Table 1 for details about workshop participants, procedures, and outcomes.
All workshops were led by two authors (H.C.C., M.M.) and followed the same format. Up to two additional authors (A.T., O.t.C., P.O.) participated as small-group facilitators. After a brief introduction explaining how the EPAs were developed and the expected level of supervision/entrustment, participants divided themselves into small working groups by background (clinician/nonclinician, UME/GME, institution) to ensure diversity in each group (see Table 1). Each small group focused on a specific EPA, discussed its title and specifications/limitations, and provided written comments. A large-group discussion of each EPA followed. Finally, participants discussed whether any EPAs were missing. Participant and small-group notes were collected. All workshops were audio-recorded, and workshop facilitators took additional notes. Two authors (H.C.C., M.M.) compiled and reviewed all information.
On the basis of the discussions at the local workshops, we refined the EPA titles and detailed specifications/limitations, including a substantive change in language for EPA 5. Smaller refinements followed each subsequent workshop. Table 1 details this refinement process. Workshop participants reached consensus that no EPAs were missing, although some suggested adding an EPA for common clinical procedures.
Phase V: Finalizing EPA Content With Expert and Stakeholder Review
Because our EPA descriptions underwent several refinements, we performed a final review with our EPA expert as well as with local stakeholders to ensure adherence to EPA principles and the appropriateness and alignment of the EPA content with curricular objectives.
In May 2014, our EPA expert (O.t.C.) reviewed the refined EPA descriptions for conceptualization, wording, and semantics. The resulting final version of the EPA descriptions was sent to UCSF’s preclerkship clinical skills course directors and clerkship curriculum committee for review and approval in June 2014.
Our EPA expert found that our EPAs were written with a focus on the learner, similar to that seen in typical competency language (i.e., what a learner will do). With his guidance, we reframed the EPAs to focus on the activity/unit of work within a specific context. We made no other alterations. Local stakeholders approved the final version of the EPAs for implementation in the fall of 2014. See Chart 1 for the final titles of the EPAs and Appendixes 1 and 2 for the specifications/limitations.
In this article, we described the methodology we used to identify and develop detailed descriptions of five core EPAs for entry into clerkship. These EPAs clarify the developmentally appropriate activities that preclerkship students can perform as part of their engagement in the clinical workplace. We mapped these EPAs to our institution’s curricular objectives and aligned them with EPAs developed by other groups. Our EPAs also are supported by content validity evidence from both internal and external subject matter experts. Our detailed descriptions and validity evidence may allow others to operationalize these EPAs to improve early clinical experiences at their institutions.
Workshop participants readily accepted the constructs and content domains for the five EPAs we identified. They agreed on the expected level of supervision (practice under reactive supervision) and helped to tailor the content of the EPAs (expanding or limiting the breadth/complexity of the detailed specifications and limitations) to fit this level of supervision. Despite variable preclerkship clinical preparation across institutions, workshop participants were able to agree on what is expected of preclerkship students.
Some participants did express interest in a procedures-oriented EPA, which highlights that ours are core EPAs. Individual institutions may choose to include additional, elective EPAs, such as one related to procedures, to suit their institution-specific objectives and student needs. In addition, the sources of information used to arrive at an entrustment decision likely will differ on the basis of local resources and circumstances. For these reasons, we did not include any information specific to implementation at UCSF in the full EPA description in Appendix 1. We encourage institutions interested in implementing these core preclerkship EPAs to complete parts 3, 4a, 4b, 5, and 6 of the EPA descriptions according to their local curriculum. Discussions about the information sources for assessment should look beyond existing assessments and address the validity of the assessments for making entrustment decisions. We recommend that multiple and preferably different types of information sources (e.g., faculty evaluation, multisource feedback, standardized patient exams) be used to gauge progress and that entrustment decisions be based on the input of more than one person or time point (e.g., three faculty members recommending entrustment).
A valuable outcome of our EPA development process was the promotion of discussion among the preclerkship and clerkship faculty, who may have differing expectations of students’ clinical skills.33 At our institution, this discussion created consensus among the faculty regarding expectations of students entering clerkship. Now these expectations can be explicit and clearly delineated for students. The framing of students’ roles and responsibilities in the clinical workplace also generated valuable conversations about the importance of learning through participation and the ability of preclerkship students to contribute to the care of patients. By carefully considering the level of supervision, meaning of entrustment, and detailed specifications/limitations for each EPA, faculty were able to reach consensus on which tasks and in what circumstances preclerkship students could safely engage in authentic patient care activities. Such discussions may help alter faculty expectations and encourage them to see preclerkship students as contributors in the clinical workplace, rather than as potential burdens.
We began this project a year before the AAMC CEPAER were publicly available.15 Later, we found the CEPAER too broad in scope relative to preclerkship students’ capabilities. Broad EPAs may prevent preclerkship students from assuming the degree of responsibility needed for legitimate participation in the workplace, and our primary goal for this work was to define EPAs that could promote student participation. Also, whether preclerkship students should be capable of performing each of the 13 CEPAER to some degree or only a subset of them, with the expectation that they would expand their skills during clerkship, was unclear. Therefore, we elected not to work backward from the CEPAER but to work forward on the basis of evidence of demonstrated preclerkship student capabilities. The ability to link or nest our preclerkship EPAs to the CEPAER, as demonstrated by our mapping process, reinforces their content validity and allows their use with the CEPAER.
We acknowledge that EPAs should not be too granular.34 However, preclerkship students may see seemingly small tasks as major responsibilities that only later will become part of a broader task. For instance, EPA 2, “Integrate information gathered about a patient to construct a reasoned and prioritized differential diagnosis as well as a preliminary plan for common chief complaints,” becomes part of the residents’ broader task to “manage care of patients with acute common diseases.”18 We deliberated the validity of EPA 2 because it is an activity that would not necessarily be prohibited without supervision. We retained it as part of our EPAs, deciding that for very junior learners, the activity should be conceptualized and presented as a responsibility that contributes to patient care. This evolution of small tasks into broader responsibilities adds to the holistic sense of growth in clinical performance that is the goal of medical education.
It is important to note where our EPAs may have digressed from recommendations in the literature. As is evident in Chart 1, the EPA titles increased in length during our development process, despite the recommendation to keep titles short.34,35 We felt that adding limitations to the titles, rather than mentioning them only in the elaborated EPA description, would emphasize their appropriateness for preclerkship students and help prevent any misunderstandings or concerns that might occur when faculty and students encounter these EPAs for the first time. Also, EPA 5 (“Provide the health care team with resources to improve an individual patient’s care or collective patient care”) is similar to and incorporates elements of two AAMC CEPAER (“Form clinical questions and retrieve high-quality evidence to advance patient care” and “Identify systems failures and contribute to a culture of safety and improvement”). ten Cate has questioned whether these two AAMC CEPAER are true EPAs.35 However, unlike the two AAMC CEPAER, EPA 5 meets the definition of an EPA in two respects: (1) It is a discrete task that relates back to the care of a patient rather than to an ongoing habit, and (2) students can advance to higher levels of autonomy for this task.34
Our methodology has several limitations. First, it was a lengthy process, taking almost two years. However, basing our EPAs on evidence of demonstrated preclerkship student capabilities facilitated their acceptance by various stakeholders. This acceptance was particularly important because many stakeholders do not appreciate the extent to which preclerkship students are capable of engaging in patient care activities.33,36 Second, our use of local and conference workshops restricted the number of content experts we could engage in our process. It also limited our ability to control who was engaged at the national and international levels, as participation was based on conference attendance. However, we did specifically hold our workshops at meetings we knew would be attended by individuals experienced in medical education, and both workshops were well attended. We also kept track of our participants’ educational roles, clinical background, and institutions; had them work in diverse groups; and checked for broad representation of institutions and regions in evaluating the workshop feedback. In fact, we found the workshop approach to collecting content validity evidence highly advantageous for allowing rich discussions and for fine-tuning the detailed specifications and limitations of the EPAs, which would have been more difficult using strategies such as the Delphi or nominal group process. Third, the validity evidence that we collected focused entirely on content validity. Additional validity evidence regarding use in student assessment is required. The EPAs currently are being implemented locally, and future work will focus on collecting this validity evidence.
In conclusion, we developed full descriptions for five core EPAs for entry into clerkship, following published guidelines for EPA development and with special attention to validity standards for educational testing. We endorse their use by other institutions because we anticipate that the EPAs can provide explicit guidance for the engagement of preclerkship students in clinical workplace activities with attention to patient safety.
Acknowledgments: The authors would like to thank Charlie DeVries and Dr. Charlotte Wills for their assistance in conducting the student focus groups and preceptor interviews. They also would like to thank the University of California, San Francisco, School of Medicine (UCSF) preclerkship clinical skills course directors, members of the UCSF clerkship curriculum committee, and workshop participants at the 2014 Association of American Medical Colleges Western Group on Educational Affairs annual meeting in Honolulu, Hawaii, and at the 2014 Ottawa Conference in Ottawa, Ontario, Canada.
1. Cooke M, Irby DM, O’Brien BC. Educating Physicians: A Call for Reform of Medical School and Residency. San Francisco, Calif: Jossey-Bass; 2010.
2. Dornan T, Littlewood S, Margolis SA, Scherpbier A, Spencer J, Ypinazar V. How can experience in clinical and community settings contribute to early medical education? A BEME systematic review. Med Teach. 2006;28:3–18
3. Yardley S, Littlewood S, Margolis SA, et al. What has changed in the evidence for early experience? Update of a BEME systematic review. Med Teach. 2010;32:740–746
4. Dolmans DH, De Grave W, Wolfhagen IH, van der Vleuten CP. Problem-based learning: Future challenges for educational practice and research. Med Educ. 2005;39:732–741
5. Durning SJ, Artino AR. Situativity theory: A perspective on how participants and the environment can interact: AMEE guide no. 52. Med Teach. 2011;33:188–199
6. Koens F, Mann KV, Custers EJ, ten Cate OT. Analysing the concept of context in medical education. Med Educ. 2005;39:1243–1249
7. Billett S. Workplace participatory practices: Conceptualizing workplaces as learning environments. J Workplace Learn. 2004;16:312–324
8. Alford CL, Currie DM. Introducing first-year medical students to clinical practice by having them “shadow” third-year clerks. Teach Learn Med. 2004;16:260–263
9. Başak O, Yaphe J, Spiegel W, Wilm S, Carelli F, Metsemakers JF. Early clinical exposure in medical curricula across Europe: An overview. Eur J Gen Pract. 2009;15:4–10
10. Hopayian K, Howe A, Dagley V. A survey of UK medical schools’ arrangements for early patient contact. Med Teach. 2007;29:806–813
11. McLean M. Sometimes we do get it right! Early clinical contact is a rewarding experience. Educ Health (Abingdon). 2004;17:42–52
12. ten Cate O, Snell L, Carraccio C. Medical competence: The interplay between individual ability and the health care environment. Med Teach. 2010;32:669–675
13. ten Cate O, Scheele F. Competency-based postgraduate training: Can we bridge the gap between theory and clinical practice? Acad Med. 2007;82:542–547
14. Chen HC, van den Broek WE, ten Cate O. The case for use of entrustable professional activities in undergraduate medical education. Acad Med. 2015;90:431–436
16. Boyce P, Spratt C, Davies M, McEvoy P. Using entrustable professional activities to guide curriculum development in psychiatry training. BMC Med Educ. 2011;11:96
19. Rose S, Fix OK, Shah BJ, Jones TN, Szyjkowski RD. Entrustable professional activities for gastroenterology fellowship training. Hepatology. 2014;60:433–443
20. Fessler HE, Addrizzo-Harris D, Beck JM, et al. Entrustable professional activities and curricular milestones for fellowship training in pulmonary and critical care medicine: Report of a multisociety working group. Chest. 2014;146:813–834
21. Hauer KE, Kohlwes J, Cornett P, et al. Identifying entrustable professional activities in internal medicine training. J Grad Med Educ. 2013;5:54–59
22. Touchie C, De Champlain A, Pugh D, Downing S, Bordage G. Supervising incoming first-year residents: Faculty expectations versus residents’ experiences. Med Educ. 2014;48:921–929
23. Slipp S, Wycliffe-Jones K, Weston W. When is a resident “good to go”? Using a modified Delphi technique to define and benchmark entrustable professional activities for family medicine residency training. Abstract presented at: 16th Ottawa Conference; April 29, 2014; Ottawa, Ontario, Canada.
24. Holzhausen Y, Maaz A, Degel A, Peters H. Identifying EPAs for undergraduate medical education. Abstract presented at: Annual Meeting of the Association for Medical Education in Europe; September 3, 2014; Milan, Italy.
25. Wong E. Using nominal group technique to develop entrustable professional activities for family medicine. Abstract presented at: Annual Meeting of the Association for Medical Education in Europe; September 3, 2014; Milan, Italy.
26. ten Cate O. AM last page: What entrustable professional activities add to a competency-based curriculum. Acad Med. 2014;89:691
27. ten Cate O. Nuts and bolts of entrustable professional activities. J Grad Med Educ. 2013;5:157–158
28. Chen HC, Sheu L, O’Sullivan P, ten Cate O, Teherani A. Legitimate workplace roles and activities for early learners. Med Educ. 2014;48:136–145
29. Watling CJ, Lingard L. Grounded theory in medical education research: AMEE guide no. 70. Med Teach. 2012;34:850–861
30. Corbin J, Strauss A. Basics of Qualitative Research. 3rd ed. Thousand Oaks, Calif: Sage; 2008.
31. ten Cate O, Young JQ. The patient handover as an entrustable professional activity: Adding meaning in teaching and practice. BMJ Qual Saf. 2012;21(suppl 1):i9–i12
32. American Educational Research Association; American Psychological Association; National Council on Measurement in Education. Validity. In: The Standards for Educational and Psychological Testing. Washington, DC: American Educational Research Association; 2014:11–31.
33. Wenrich M, Jackson MB, Scherpbier AJ, Wolfhagen IH, Ramsey PG, Goldstein EA. Ready or not? Expectations of faculty and medical students for clinical skills preparation for clerkships. Med Educ Online. 2010;15. doi: 10.3402/meo.v15i0.5295.
34. ten Cate O, Chen HC, Hoff RG, Peters H, Bok H, van der Schaaf MF. Curriculum development for the workplace using entrustable professional activities (EPAs): AMEE guide 99 [published online July 14, 2015]. Med Teach. doi: 10.3109/0142159X.2015.1060308.
35. ten Cate O. Trusting graduates to enter residency: What does it take? J Grad Med Educ. 2014;6:7–10
36. Yardley S, Brosnan C, Richardson J, Hays R. Authentic early experience in medical education: A socio-cultural analysis identifying important variables in learning interactions within workplaces. Adv Health Sci Educ Theory Pract. 2013;18:873–891