Since the advent of the Flexner Report in 1910,1 medical education in the United States has conformed to a fixed-time structure. Following completion of an undergraduate degree, physicians-in-training have entered a four-year medical school curriculum. This curriculum has generally delivered two years of “basic science” education followed by two years of “clinical” education. Upon obtaining this four-year medical degree, graduates have entered some form of graduate medical education (GME) program offering further, discipline-specific clinical training to become licensed and qualified for specialty board certification. As GME programs have evolved in the latter half of the 20th century, they too have relied on a time-based path to certification in a medical specialty. While there have been periodic experiments with three-year medical school curricula, notably in the late 1970s and early 1980s and continuing sporadically in the 21st century,2,3 medical training during the past century has adhered to a time-defined process marked by four years of undergraduate medical education (UME) and a defined number of years in residency and fellowship training.
Although competency-based medical education (CBME) has been advocated for many years,4 the initiation of the Accreditation Council for Graduate Medical Education (ACGME) Outcomes Project in the late 1990s was instrumental in reviving a shift to CBME.5 Conceptually, this effort focused on the outcomes of medical education programs rather than the process of curriculum delivery.6 The natural progression of this effort was to challenge the concept that completion of a certain period of time in training was necessary and sufficient to be qualified to enter medical practice. Rather, CBME emphasizes outcomes, or competencies, in defined areas as the pathway to practice, suggesting that learners would logically progress at variable rates toward this goal.
Setting the Foundation: The Origins of EPAC
The idea for what would become EPAC was originally developed by Deborah Powell, dean emerita at the University of Minnesota Medical School. While serving on an ACGME Residency Review Committee and later as an ACGME board member, she saw the ACGME/American Board of Medical Specialties shift to CBME as an opportunity to test and fully realize a framework that individualizes learning and advances students through education and training in a time-variable manner based on carefully assessed achievement of specific competencies. In 2009, recognizing that an organizational champion would be required to oversee a pilot project of this magnitude, she engaged Carol Aschenbrener, then executive vice president of the AAMC, to host a small preliminary meeting to explore the idea of what was then called the Pediatric Redesign Project. Pediatrics was chosen as the specialty for the pilot based on the readiness of the discipline to engage in innovation. In that same year, the American Board of Pediatrics (ABP), having just completed a four-year self-study of residency training, announced the implementation of the Initiative for Innovation in Pediatric Education, believing that advancements in GME training were predicated on an ongoing series of controlled experiments.7–9 Present at the initial meeting in Washington, DC, were representatives of the AAMC, the ACGME, the ABP, the National Association of Children’s Hospitals and Related Institutions (NACHRI), and faculty and administrators from three medical schools (University of Colorado, University of Maryland, and University of Minnesota), two of which would later become part of EPAC. The attendees endorsed the concept, and a second meeting was held in Minneapolis in January 2011 where representatives from the AAMC and the National Board of Medical Examiners (NBME), faculty members from the three medical schools, and other medical educators continued planning for the pilot project.
Dr. Powell recruited two other medical schools to join the project, and a third meeting was held in May 2011 at AAMC headquarters. This meeting included faculty from all five initial participant schools (University of California, San Francisco; University of Colorado; University of Maryland; University of Minnesota; University of Utah), as well as individuals who would become consultants to the project and representatives from the AAMC, ACGME, ABP, NBME, and NACHRI. Representatives from the schools, with Drs. Powell, Carraccio, and Aschenbrener, established a pattern of biannual meetings at the AAMC, which has continued to the present. Later in 2011, the first publication describing the project appeared in Academic Medicine.10 In addition to the strong interest of the ABP in educational innovation, the specialty of pediatrics was a natural choice to pilot the concept of competency-based advancement as there is evidence that pediatrics is a relatively early and stable career choice for entering medical students.11 This was a key consideration, as the Pediatric Redesign Project required an early commitment to a specialty for the participating students.
The authors of this paper were all members of the initial planning group and have participated throughout the project. We notified other key stakeholders in UME and GME, including the Liaison Committee on Medical Education (LCME), the Federation of State Medical Boards (FSMB), and the National Resident Matching Program (NRMP), of our project plans and asked for their input. We also consulted hospital administrators in anticipation of the economic implications of true competency-based advancement potentially altering the time spent in residency.
In its initial meetings, the planning group proposed details of the pilot project and defined practical barriers to its implementation and success. The pilot would identify students early in their UME training, guarantee them a residency position in pediatrics at the institution where they were students, and advance them to the residency program when they met predetermined competency standards for entry into GME. Although the hypothesis was that this approach could, on average, shorten the duration of medical education and training, it was understood from the outset that the length of education and training might be extended for some learners to achieve necessary competence. We identified and sought to mitigate potential barriers, including requirements for licensure, medical school graduation requirements, eligibility for GME training, and funding for administrative support. Furthermore, we recognized that the current approach to assessment of student performance based on grades and time-limited clerkship performance would need considerable change to ensure that participants were confident in competency-based advancement decisions. The success of the project would thus hinge upon its careful and considered approach to student assessment.
In 2011, the project was endorsed for implementation by the ABP, the ACGME, and the FSMB. The ABP and ACGME agreed to flexibility in their requirements, and the FSMB agreed to work with project leaders to address any potential barriers to state licensure. Each medical school obtained LCME approval for EPAC as a medical education pathway. Important to this approval process was that schools illustrate that all LCME requirements would be met and that an opt-out path was possible for students selected for the pilot program who might later wish to change their career choice. We also consulted with the NRMP, whose advice was helpful in interpreting Match rules so as not to commit a violation when we advanced our learners to residency. Each of our students has matched to their program, but the timing of their matriculation was adjusted by mutual consent from the student and the program.
Developing the Design of EPAC
Five schools initially agreed to participate in the project. Two of the institutions were schools at which members of the early planning group held faculty appointments. Dr. Powell also approached three other schools whose deans were pediatricians and which had expressed interest in the project. The school leaders’ participation in the early conceptualization of the project facilitated orientation to the program at those institutions. All had large pediatric residency programs (45–90 residents each) that also agreed to participate. The leaders understood that the success of competency-based, time-variable progression across the UME-to-GME transition, and across the GME-to-practice or -fellowship continuum, required commitments from both the UME and the GME leadership for each program. Thus, the “core” and minimum participation of each team became defined as the third-year pediatric clerkship director and the residency program director.
The group of schools began to meet with the planning committee to define guiding principles for the program. Much of the early discussion was devoted to what constituted appropriate outcomes for learners and how those outcomes would be assessed. These discussions led to the understanding that considerable faculty development would be necessary and that new approaches to assessment would be required. The traditional summative, end-of-rotation feedback would need to be replaced with tools that tracked individual professional development over time and that lent themselves to performance thresholds for advancement. While the ACGME competencies existed and milestones were being implemented, the concept of entrustable professional activities (EPAs) was emerging. Importantly, the AAMC and the ABP were simultaneously working on a set of EPAs as a framework for assessment in UME and pediatric specialty training, respectively. The planning group thus agreed to use EPAs as the guiding framework for assessment across the continuum of education and training. To support this approach, we enlisted Olle ten Cate, the originator of the concept of EPAs,12 early in the design phase.
Although EPAC was originally conceived as a uniform pilot curriculum across five medical schools, it quickly became apparent that the group had to reach agreement not only on the guiding principles for the program (e.g., all students would have a longitudinal pediatrics experience throughout medical school and carry these patients into residency), but also on how the guiding principles and the EPAC program could be incorporated into the existing educational structure at each site. For example, while stakeholders agreed that the outcome measures of time-variable progression and the measures defining “do no harm” to learners would be consistent across all five sites, the paths to achieving those outcomes would vary. The group also explored and proactively addressed challenges and local barriers to implementation (e.g., at the time, California statute required four years of medical school for medical license eligibility). One school eventually withdrew: at the time, the University of Maryland was developing competing curricular priorities concerning student participation in research that would have been difficult to accommodate in this clinical education pilot program. Soon after the May 2011 meeting, the Pediatric Redesign Project was renamed Education in Pediatrics Across the Continuum (EPAC), and EPAC was officially born. The two goals of the EPAC project are to establish a model for true CBME through variable-time, meaningfully assessed demonstration of competence across the continuum of UME, GME, and independent practice; and to develop a continuous educational pathway linking the “continuum” of UME, GME, and practice using pediatrics as a model.
As the group began to move from the design to the implementation phase, it became clear that support for faculty with specific oversight of EPAC implementation would be critical to the project’s success. With support from the EPAC group, the AAMC applied for and received a grant from the Josiah Macy Jr. Foundation in 2013. The Macy Foundation (http://macyfoundation.org), known for facilitating innovative projects in medical education, recognized the far-reaching implications of a program that could truly test the paradigm shift to competency-based, time-variable advancement across the continuum of education and training. This was an important milestone as the funding provided external validation of the project concept and design, and supported faculty effort and administrative personnel at each of the schools to ensure sufficient protected time for implementation. As part of the Macy application, each school agreed to commit to four cohorts of learners entering and advancing through residency training at their respective institutions. The proposed cohort size was four learners per year at each school.
In 2013, program leaders at each school introduced the EPAC program to first-year medical students. For purposes of introducing the program to potential participants and for consistency in planning across sites, three phases of engagement were defined within the programs: “EPAC Explore,” “EPAC Focus,” and “EPAC Match.” Through EPAC Explore, students are introduced to the concept of time-variable advancement, provided information about careers in pediatrics, and informed that they would be offered a pediatrics residency position if accepted as part of the pilot project. Additionally, the schools have designed approaches to expose interested students to pediatrics and to develop their aptitude in and confidence about that path, including didactic sessions, mentor connections, and/or clinical experiences. In EPAC Focus, preclerkship and early clerkship curricula are modified to create further opportunities for students to more directly address their interest in pediatrics. While EPAC Explore is informational and typically targets many interested students, EPAC Focus involves more active and identifiable participation. EPAC Match is the phase of the program in which each institution selects and commits to a cohort of up to four students for focused pediatric clinical education, a guaranteed residency position, and an agreement to advance to residency based on attainment of competency and meeting the school’s graduation requirements. The timing of the EPAC Match phase varies across the four schools from early in the second year of medical school to the middle of the traditional third year. The selection process involves an application developed at each school and a common set of structured interview questions that were adapted from questions already in use for residency selection at the University of Minnesota. These tools were used to assess not only academic aptitude but also professionalism characteristics that are considered critical to success in this program. 
Because this pilot is a feasibility study and we knew participating students would be our partners in change management, these characteristics include independence, initiative, interest in participating in an educational pilot, and comfort with uncertainty.
EPAC leadership, consisting of clerkship directors, program directors, and site leaders at each school, as well as consultants, steering committee members (Drs. Powell, Carraccio, and Englander), and AAMC representatives, has divided into a number of groups that span the participating institutions to engage in focused work to address assessment, program evaluation, and implementation challenges at the local sites. These groups meet by phone at least once or twice per month. The group of team leaders from each site meets by phone monthly with one or more members of the steering committee. To address the need for oversight and analysis of both learner assessment and program evaluation in the national pilot, two additional PhD consultants, one with a background in cognitive psychology and competency-based assessment and the other with extensive medical education and qualitative program evaluation experience, were hired for a small percentage of their time (5%–10%) as consultants to work directly on the EPAC project.
A major challenge of the implementation phase has been identifying and developing a standard set of assessment tools, to complement local tools, that would inform the decisions about student readiness to transition from UME to GME and from GME to fellowship/practice using the agreed-on framework of EPAs linked to competencies and milestones.13,14 Of note, the pediatric EPAs build on the Core EPAs for Entering Residency15 (List 1). EPAC leadership agreed that faculty would assess students on all 13 Core EPAs using site-specific tools. For 5 of the 13 Core EPAs, the schools agreed that their local assessments were insufficient. After researching the literature, all sites agreed to use a set of tools with published validity evidence to fill these gaps (Table 1). Each site uses a published supervision scale and an agreed-on target for level of supervision16 to determine entrustment on each of the 13 EPAs. These entrustment decisions then serve as a foundation for the decision to advance to GME. The Maslach Burnout Inventory17 and the Jefferson Scale of Physician Empathy18 are also being used to help assess our commitment to “do no harm.” The qualitative research consultant conducting the program evaluation is interviewing all participants, including students, to better understand the impact of the EPAC program on them and their learning.
Core Entrustable Professional Activities for Entering Residency15
- EPA 1: Gather a history and perform a physical examination
- EPA 2: Prioritize a differential diagnosis following a clinical encounter
- EPA 3: Recommend and interpret common diagnostic and screening tests
- EPA 4: Enter and discuss orders/prescriptions
- EPA 5: Document a clinical encounter in the patient record
- EPA 6: Provide an oral presentation of a clinical encounter
- EPA 7: Form clinical questions and retrieve evidence to advance patient care
- EPA 8: Give or receive a patient handover to transition care responsibility
- EPA 9: Collaborate as a member of an interprofessional team
- EPA 10: Recognize a patient requiring urgent or emergent care and initiate evaluation and management
- EPA 11: Obtain informed consent for tests and/or procedures
- EPA 12: Perform general procedures of a physician
- EPA 13: Identify system failures and contribute to a culture of safety and improvement
Abbreviation: EPA indicates entrustable professional activity.
For a subset of assessments in UME, each site has a comparison group consisting of students interested in a career in pediatrics who are not participating in EPAC. Each GME program will include two such controls who are non-EPAC residents from the same medical school and two residents from other medical schools.
Each school now has at least three cohorts of students in its EPAC program (Table 2). In total, 48 students have been enrolled across the four sites. Entrustment is being monitored through regular clinical competency committee (CCC) meetings at each school. We borrowed the CCC concept from the GME experience, where it has been in use for several years.19 Our CCCs are aggregating assessments from faculty, residents, and interprofessional team members for each EPA, as well as student self-assessments, as the basis for entrustment decisions. Upon the CCC’s decision that a student has reached the level of entrustment to perform each of the 13 EPAs with indirect supervision, the student is able to enter a transition-to-residency phase. On the basis of their progress to date, 12 students have advanced to pediatric residency. The first of these students began internship in the fall of 2016, six to nine months before their peers in the traditional, time-based curriculum. Other students have advanced throughout the year at all four schools as they have met standards for entry into GME. At two of the schools, advancement to GME occurred when the students reached the required level of entrustment. At the other two schools, this level of entrustment was documented when it occurred, and the students advanced to residency with their peers on the traditional calendar. Work is ongoing at these schools to address approaches to local curricular and administrative requirements that will permit true time-variable advancement. Four students have withdrawn from the program before advancement to residency. Three chose to pursue training in other disciplines (family medicine, otolaryngology, and child neurology), and the fourth withdrew to pursue a leave of absence for personal reasons.
An important perception on the part of our students is that their EPAC preceptors provide formative assessments that are candid and direct, with the goal of learning and improvement rather than of informing grades. Although the program is young, the feedback from both students and faculty on this point has been positive. We plan to explore the effect further as our numbers grow through subsequent cohorts. We believe that lowering the stakes of these assessments may make them a more meaningful contribution to our learners’ career development as pediatricians.
To our knowledge, this is the first time that medical students in the United States are transitioning from UME to a GME program based on the assessment and attainment of competence rather than completion of a time-based curriculum. Other experiences with non-time-delimited medical education include a program for orthopedic surgery residents at the University of Toronto, in which residents complete rotations based on the attainment of competence rather than on weeks of service completed.20 That program has focused on early acquisition of technical skills, although ultimately all CanMEDS competencies are assessed. Importantly, the leaders of that effort have also acknowledged the importance of more frequent assessment and feedback, and the value of feedback that is formative, not just summative. This is a cornerstone of competency-based advancement that has guided our program development. There are also “short tracks” toward advancement to residency in primary care and other specialties.2 However, these tracks have not been developed to test competency-based advancement. They are designed to facilitate career development for those with specific career interests in a shortened, yet still fixed, time frame. Our early success supports the feasibility of a shift in thinking from medical education as a fixed-time, variable-outcome system of curriculum delivery to medical education as a fixed-outcome, variable-time effort to truly assess the development and competence of learners. We recognize that success in GME is necessary to validate the UME portion of the pilot. As we continue to follow our learners, we hope their outcomes will underscore the benefits of competency-based advancement.
Implementing programs such as EPAC, while providing the opportunity to redesign medical education, presents many challenges. The effort requires firm commitments from institutional leaders, both at the medical school and residency level, and from external accreditation, certification, and licensing bodies. Each of our schools has had to reform its approach to the assessment of participating students and engage in substantial faculty development. Furthermore, this type of curriculum delivery may not be appropriate for all learners or all specialties. There is still much to learn about the characteristics of students who stand to benefit most from this approach, and about those who enter but eventually opt out of the program (as has happened with four students to date). We have also raised the question of whether students who begin this program in medical school might transition to a residency program at another institution, perhaps one of those that is also participating in EPAC. Our participating students have a very positive attitude about entering pediatric residency at the same institution where they attended medical school. Many of them have said they hoped to do so whether or not they participated in EPAC and that this was a factor in their decision to participate. Accepting local entrustment decisions for off-cycle advancement from one medical school to another institution’s residency program was a practical challenge we chose not to introduce into this pilot. We acknowledge that additional flexibility might influence the interest of other students in the program in the future.
Although we are at a point where we have successfully advanced students to GME based on attainment of competence in the Core EPAs for Entering Residency, other outcomes demand further study. For example, is the performance of learners in this program different from their peers in the traditional curriculum, both in relation to important educational outcomes and also for interpersonal characteristics like empathy and job satisfaction? What is the impact, if any, of early career choice on their professional development? And, can these concepts be generalized to specialties other than pediatrics? We are gathering data about these issues, and our approach to the pilot should help us begin to answer these questions over the ensuing years. We are enthusiastic about what we have accomplished, but recognize that demonstrating objective outcomes that are comparable, if not superior, to those currently being achieved in a fixed-time educational process will be important.
Our involvement with learners in this program is intensive, including their frequent formative feedback discussions with preceptors and regular meetings with program leadership about progress within the program and individualized curricular modifications. This raises valid questions about whether it is scalable to larger cohorts. As successive cohorts enter the program, we expect to learn more about this aspect of implementation. We have also debated whether the fundamental principles of the EPAC program can be applied at the GME level. Can a resident be graduated to independent practice when competent rather than after three years of training? We believe the answer to this question is yes, but recognize practical limitations on the ability of GME programs to have trainees begin and end their training “off-cycle.” However, these practical barriers have more to do with the way residency training is structured than the professional development of physicians.
In summary, the EPAC program is in the process of advancing its first cohorts of learners from medical school to residency based on the attainment of competence rather than time. We believe that the EPAC pilot demonstrates that competency-based advancement, independent of time, is possible. We look forward to studying the outcomes of our learners and subsequent cohorts as we seek to move beyond feasibility to an understanding of value.
Education in Pediatrics Across the Continuum (EPAC) Study Group: University of California, San Francisco School of Medicine: Sara Buckelew, MD, MPH; Duncan Henry, MD; Michele Long, MD; Daniel C. West, MD; University of Colorado School of Medicine: Janice L. Hanson, PhD, EdS; M. Douglas Jones Jr, MD; Paritosh Kaul, MD; J. Lindsey Lane, BMBCh; Carol Okada, MD; Adam Rosenberg, MD; Jennifer B. Soep, MD; University of Minnesota Medical School: Robert D. Acton, MD; John S. Andrews, MD; Emily C. Borman-Shoap, MD; Kathleen D. Brooks, MD, MBA, MPA; Michael J. Cullen, PhD; Aaron L. Friedman, MD; Sophia P. Gladding, PhD; Patricia M. Hobday, MD; Katherine E. Murray, MD, MPH; Deborah Powell, MD; University of Utah School of Medicine: James F. Bale Jr, MD; Kristen A. Bettin, MD, MEd; Jaime Bruse; Edward B. Clark, MD; Brittany Esty, MD; Tiffany S. Glasgow, MD; Brian P. Good, MD; Bruce E. Herman, MD; Sara M. Lamb, MD; Meghan O’Connor, MD; Kristin Randall; Danielle Roussel, MD; Wayne M. Samuelson, MD; Adam T. Stevenson, MD.
EPAC Study Group contributors: Additional EPAC Study Group members who contributed to this article are Dorene Balmer, PhD (University of Pennsylvania Perelman School of Medicine); Carol Carraccio, MD, MA (American Board of Pediatrics); H. Carrie Chen, MD, PhD (Georgetown University School of Medicine); Abhay Dandekar, MD (Kaiser Permanente Northern California); Lisa Howley, PhD (Association of American Medical Colleges); and Alan Schwartz, PhD (University of Illinois College of Medicine at Chicago). J.S. Andrews is vice chair for education and associate professor, Department of Pediatrics, and associate dean for graduate medical education, University of Minnesota Medical School, Minneapolis, Minnesota; ORCID: http://orcid.org/0000-0002-2008-2686. J.F. Bale is professor of pediatrics and neurology, Department of Pediatrics, University of Utah School of Medicine, Salt Lake City, Utah. J.B. Soep is associate professor of pediatrics, Department of Pediatrics, and pediatric clerkship director, University of Colorado School of Medicine, Aurora, Colorado. M. Long is associate professor of pediatrics, Department of Pediatrics, University of California, San Francisco, School of Medicine, San Francisco, California; ORCID: http://orcid.org/0000-0002-8399-5589. C. Carraccio is vice president for competency-based assessment programs, American Board of Pediatrics, Chapel Hill, North Carolina. R. Englander is professor of pediatrics, Department of Pediatrics, and associate dean for undergraduate medical education, University of Minnesota Medical School, Minneapolis, Minnesota. D. Powell is professor of laboratory medicine and pathology, Department of Laboratory Medicine and Pathology, and dean emerita, University of Minnesota Medical School, Minneapolis, Minnesota.
Acknowledgments: The authors wish to acknowledge the generous support of the Association of American Medical Colleges (AAMC) and the Josiah Macy, Jr. Foundation, as well as important contributions from the following individuals: Thomas Nasca, MD, Timothy Brigham, MDiv, PhD, Joseph Gilhooly, MD, and Susan Day, MD, from the Accreditation Council for Graduate Medical Education; Humayun Chaudhry, DO, MS, MACP, FACOI, from the Federation of State Medical Boards; Stephen Clyman, MD, MS, from the National Board of Medical Examiners; Gail McGuinness, MD, and James Stockman, MD, from the American Board of Pediatrics; Carol Aschenbrener, MD, former chief medical education officer at the AAMC; Olle ten Cate, PhD, Linda Lewin, MD, David Hirsh, MD, Tara Kennedy, MD, and Linda Perkowski, PhD. Many staff members at the AAMC have provided invaluable support to the project, including Chris Hanley, MBA, Beatrice Schmider, MA, Natalie Floyd, Chris Dawson, MHA, Jan Bull, MA, Nancy Moeller, PhD, Emory Morrison, PhD, Susan Bodilly, PhD, Mary Ellen Gusic, MD, and Alison Whelan, MD. The authors also thank the students at the participating medical schools, without whom this work would not have been possible.
1. Flexner A. Medical Education in the United States and Canada. New York, NY: Carnegie Foundation for the Advancement of Teaching; 1910.
2. Cangiarella J, Fancher T, Jones B, et al. Three-year MD programs: Perspectives from the Consortium of Accelerated Medical Pathway Programs (CAMPP). Acad Med. 2017;92:483–490.
3. Raymond JR Sr, Kerschner JE, Hueston WJ, Maurana CA. The merits and challenges of three-year medical school curricula: Time for an evidence-based discussion. Acad Med. 2015;90:1318–1323.
4. McGaghie WC, Miller GE, Sajid AW, Telder TV. Competency-based curriculum development in medical education: An introduction. Public Health Pap. 1978;68:11–91.
5. Carraccio C, Wolfsthal SD, Englander R, Ferentz K, Martin C. Shifting paradigms: From Flexner to competencies. Acad Med. 2002;77:361–367.
6. Batalden P, Leach D, Swing S, Dreyfus H, Dreyfus S. General competencies and accreditation in graduate medical education. Health Aff (Millwood). 2002;21:103–111.
7. Jones MD Jr, McGuinness GA, Carraccio CL. The Residency Review and Redesign in Pediatrics (R3P) project: Roots and branches. Pediatrics. 2009;123(suppl 1):S8–S11.
8. Jones MD Jr, McGuinness GA; Residency Review and Redesign in Pediatrics (R3P) Committee. The future of pediatric residency education: Prescription for more flexibility. J Pediatr. 2009;154:157–158.
9. Jones MD Jr. Innovation in residency education and evolving pediatric health needs. Pediatrics. 2010;125:1–3.
10. Powell DE, Carraccio C, Aschenbrener CA. Pediatrics redesign project: A pilot implementing competency-based education across the continuum. Acad Med. 2011;86:e13.
11. Compton MT, Frank E, Elon L, Carrera J. Changes in U.S. medical students’ specialty interests over the course of medical school. J Gen Intern Med. 2008;23:1095–1100.
12. ten Cate O, Scheele F. Competency-based postgraduate training: Can we bridge the gap between theory and clinical practice? Acad Med. 2007;82:542–547.
13. Englander R, Flynn T, Call S, et al. Toward defining the foundation of the MD degree: Core entrustable professional activities for entering residency. Acad Med. 2016;91:1352–1358.
14. Carraccio C, Englander R, Gilhooly J, et al. Building a framework of entrustable professional activities, supported by competencies and milestones, to bridge the educational continuum. Acad Med. 2017;92:324–330.
15. Association of American Medical Colleges. Core entrustable professional activities for entering residency: Curriculum developers’ guide. www.mededportal.com/icollaborative/resource/887. Published 2014. Accessed October 1, 2017.
16. Chen HC, van den Broek WE, ten Cate O. The case for use of entrustable professional activities in undergraduate medical education. Acad Med. 2015;90:431–436.
17. Maslach C, Jackson S. The measurement of experienced burnout. J Occup Behav. 1981;2:99–113.
18. Hojat M, Mangione S, Nasca TJ, et al. The Jefferson Scale of Physician Empathy: Development and preliminary psychometric data. Educ Psychol Meas. 2001;61:349–365.
19. Andolsek K, Padmore J, Hauer KE, Holmboe E. Accreditation Council for Graduate Medical Education Clinical Competency Committees: A guide book for programs. https://www.acgme.org/Portals/0/ACGMEClinicalCompetencyCommitteeGuidebook.pdf. Published 2015. Accessed October 1, 2017.
20. Ferguson PC, Kraemer W, Nousiainen M, et al. Three-year experience with an innovative, modular competency-based curriculum for orthopaedic training. J Bone Joint Surg Am. 2013;95:e166.
References cited in tables only
21. Ramos KD, Schafer S, Tracz SM. Validation of the Fresno test of competence in evidence based medicine. BMJ. 2003;326:319–321.
22. Upton D, Upton P. Development of an evidence-based practice questionnaire for nurses. J Adv Nurs. 2006;53:454–458.
23. Kersten HB, Frohna JG, Giudice EL. Validation of an Evidence-Based Medicine Critically Appraised Topic Presentation Evaluation Tool (EBM C-PET). J Grad Med Educ. 2013;5:252–256.
24. Olupeliyawa AM, O’Sullivan AJ, Hughes C, Balasooriya CD. The Teamwork Mini-Clinical Evaluation Exercise (T-MEX): A workplace-based assessment focusing on collaborative competencies in health care. Acad Med. 2014;89:359–365.
25. West DC, Ferrell CL, Boscardin C, Jannicelli A, Rosenberg A. The Western PedSCO: A direct observation tool to measure resident performance in pediatric patient encounters. Acad Pediatr. 2011;11:e9.
26. Archibald D, Trumpower D, MacDonald CJ. Validation of the Interprofessional Collaborative Competency Attainment Survey (ICCAS). J Interprof Care. 2014;1820:1–6.
27. Glissmeyer EW, Ziniel SI, Moses J. Use of the Quality Improvement (QI) Knowledge Application Tool in assessing pediatric resident QI education. J Grad Med Educ. 2014;6:284–291.
28. Wittich CM, Beckman TJ, Drefahl MM, et al. Validation of a method to measure resident doctors’ reflections on quality improvement. Med Educ. 2010;44:248–255.
29. Rosenbluth G, Burman N, Ranji S, Baron RB. UCSF Quality Improvement (QI) Project Assessment Tool. https://icollaborative.aamc.org/resource/841/. Published 2013. Accessed October 13, 2017.
30. Fall LH, Berman NB, Smith S, White CB, Woodhead JC, Olson AL. Multi-institutional development and utilization of a computer-assisted learning program for the pediatrics clerkship: The CLIPP project. Acad Med. 2005;80:847–855.