The Application of Entrustable Professional Activities to Inform Competency Decisions in a Family Medicine Residency Program

Schultz, Karen, MD; Griffiths, Jane, MD; Lacasse, Miriam, MD, MSc

doi: 10.1097/ACM.0000000000000671

Assessing entrustable professional activities (EPAs), or carefully chosen units of work that define a profession and are entrusted to a resident to complete unsupervised once she or he has obtained adequate competence, is a novel and innovative approach to competency-based assessment (CBA). What is currently not well described in the literature is the application of EPAs within a CBA system. In this article, the authors describe the development of 35 EPAs for a Canadian family medicine residency program, including the work by an expert panel of family physician and medical education experts from four universities in three Canadian provinces to identify the relevant EPAs for family medicine in nine curriculum domains. The authors outline how they used these EPAs and the corresponding templates that describe competence at different levels of supervision to create electronic EPA field notes, which have allowed educators to use the EPAs both as a formative tool to structure day-to-day assessment and feedback and as a summative tool to ground competency declarations about residents. They then describe the system to compile, collate, and use the EPA field notes to make competency declarations and how this system aligns with van der Vleuten’s utility index for assessment (valid, reliable, of educational value, acceptable, cost-effective). Early outcomes indicate that preceptors are using the EPA field notes more often than they used the generic field notes. EPAs enable educators to evaluate multiple objectives and important but unwieldy competencies by providing practical, manageable, measurable activities that can be used to assess competency development.

K. Schultz is associate professor and program director, Department of Family Medicine, Queen’s University, Kingston, Ontario, Canada.

J. Griffiths is assistant professor and assessment director, Department of Family Medicine, Queen’s University, Kingston, Ontario, Canada.

M. Lacasse is assistant professor and assessment director, Département de médecine familiale et de médecine d’urgence, Faculté de médecine, Université Laval, Quebec City, Quebec, Canada.

Funding/Support: None reported.

Other disclosures: None reported.

Ethical approval: Reported as not applicable.

Previous presentations: A workshop on writing entrustable professional activities was presented at the Association for Medical Education in Europe annual conference, August 2013, Prague, Czech Republic. A short presentation on the entrustable professional activity field note and its use in academic advisor meetings was given at the Association for Medical Education in Europe annual conference, September 2014, Milan, Italy.

Supplemental digital content for this article is available at http://links.lww.com/ACADMED/A262.

Correspondence should be addressed to Karen Schultz, 115 Clarence St., Suite 319, Kingston, Ontario, Canada, K7L 5N6; telephone: (613) 533-9300, ext. 73933; e-mail: karen.schultz@dfm.queensu.ca.

While competencies, as personal attributes, can be difficult to assess, activities are observable and measurable.1 Assessing entrustable professional activities (EPAs) is a novel and innovative approach to competency-based assessment (CBA).2 EPAs are carefully chosen “professional activities that together constitute the mass of critical elements that operationally define a profession.”1 Faculty can observe and assess these units of work and entrust them to a resident to complete unsupervised once she or he has obtained adequate competence.1 For example, in family medicine, the care of a patient with a chronic disease and the care of an intrapartum patient are two EPAs involving the integrated competencies of medical expert, communicator, collaborator, and professional in different situations. A resident who successfully completes these EPAs has achieved these competencies.

The literature on the theoretical value of EPAs is rapidly expanding. What is currently not well described in the literature, however, is the application of EPAs within a CBA system, a critical next step if EPAs are to be of practical use. In this article, we discuss the development of EPAs for a Canadian family medicine residency program, the incorporation of these EPAs into EPA field notes, and their use as both a formative tool to structure day-to-day assessment and feedback and a summative tool to ground competency declarations about residents. Much of the CBA terminology in the literature overlaps, and many terms are used in different ways. In this article, we use the terms competence, competency, milestones, and benchmarks. Competence and competency have been defined in a number of different ways.3–7 However, an examination of those definitions reveals that they both contain the concepts of context; judgment; an integration of knowledge, skills, and attitudes; a measurable outcome; and being habitual and impermanent. Therefore, our working definition, combining these concepts and building on the work from the Scottish Doctor Project,8 is that competence is the ability to repeatedly do the right thing at the right time in the right way to the right person. We conceptualize competence to be made up of a number of competencies, each in turn being marked by a number of developmental stages. These developmental stages, although blending together, do have transition points or, like markers on a highway, milestones. We refer to the descriptions of how a resident performs at these milestones as benchmarks.

Since 2010, we have worked to transform our family medicine residency program into a competency-based model. In considering how to assess competency, we considered two issues. First, we discussed how to bridge our curriculum, which had hundreds of objectives—too many to assess individually—and our existing curriculum and assessment frameworks. These frameworks were made up of important competencies but were too broad to assess easily and not integrative enough to reflect real-life performance.9 Our concern about unwieldy, nonintegrative competencies has been raised by others regarding other curriculum and assessment frameworks, such as the CanMEDS roles, the Accreditation Council for Graduate Medical Education (ACGME) competencies, and the Scottish Doctor domains.5,10

Our second consideration was how to ensure that our assessment strategy fulfilled van der Vleuten’s11 utility index for assessment tools (validity × reliability × educational value × acceptability × cost-effectiveness). In the clinical setting, these terms are operationalized as follows: validity—assessing the performance of the resident working as a physician (i.e., at the “does” level of the Miller pyramid)12; reliability—using multiple expert assessors trained to recognize levels of competence and assessing performance over time and across contexts13; educational value (or assessment for learning)—having assessments also support resident learning; acceptability—measuring items that have face value and meaning for all stakeholders in a practical way using a well-accepted system; and cost-effectiveness—using strategies that would not require a lot of time or great expense.

We believed that EPAs would be the appropriate bridge between our many curriculum objectives and our large, nonintegrative curriculum and assessment frameworks.14 By first translating the competencies into carefully chosen clinically relevant core activities, known to all preceptors and residents, and then by assessing those activities, we would be able to evaluate the competencies. By linking our EPAs to daily feedback and assessment in the form of electronic EPA field notes, we envisioned using our EPAs formatively. By collating the EPA field notes into individual EPAs, we also could use them in summative decision making and competency declarations. In the following sections, we outline the steps we took to establish our EPA-based assessment system, the lessons we learned along the way, the positive outcomes of the system to date, and our next steps for the future.

Preliminary Work

Early in 2010, we searched the literature and considered societal needs15 and the future needs of our residents16 to confirm that our existing curriculum objectives covered the skills we wanted our residents to have at the end of their training. We then reworded the objectives using competency-based language, with observable assessable actions (e.g., changing an objective from “will know” to “demonstrates”). Next, we organized the objectives into nine curriculum domains relevant to our program. For example, our domains reflected that family medicine residents need to learn how to provide care across a patient’s lifespan (e.g., care of children, care of adults, care of the elderly, etc.) (see Table 1). Other specialties likely will organize their competencies using other domains.17–19

Table 1

Our next step was to form an expert panel to ensure that we solicited diverse, informed perspectives to help identify the relevant EPAs for each curriculum domain. We initially envisioned a national consensus project involving all 17 Canadian family medicine residency programs to broaden the consensus on the relevant EPAs. Logistically, however, this proved impossible within our time constraints. Instead, we invited interested family physician and medical education experts from four Canadian family medicine residency programs in three provinces to join the expert panel. We felt that this expert panel reflected an acceptably broad consensus.

Step 1: Decide on the EPAs

As an expert panel, we met in person and by teleconference six times from late 2010 through 2011 for three to four hours at a time to brainstorm what “operationally defines us as a profession,” incorporating information from the literature and from our personal experiences. We agreed on a workable number of EPAs by considering the main presentations of patients within each curriculum domain. We worked to ensure that most, if not all, patient presentations would fall into one of the EPAs, and we considered EPAs that spanned all the clinical domains (i.e., the physicianship EPAs). Ultimately, we decided on 35 EPAs (see Table 1).

Step 2: Design the EPA Framework

Designing the EPA framework required determining how the EPAs would be incorporated into our assessment system. We decided that a template, or matrix, for writing the EPAs was needed. Accrediting bodies often delineate the competencies that residents in accredited residency programs must attain to graduate. Residency programs could use these competencies as the foundation for their EPAs. For us, the College of Family Physicians of Canada (CFPC) defined our curriculum and assessment frameworks and the corresponding competencies as the CanMEDS–Family Medicine roles,20 skill dimensions, phases of the clinical encounter, domains of clinical care, and priority topics (see Table 2).21

Table 2

Others have used different models for the vertical axis of their framework, such as the CanMEDS roles and the ACGME competencies. We knew that we would be asking hundreds of preceptors to use the EPA field notes that we chose; thus, for our vertical axis, after much discussion, we selected the phases of the clinical encounter. We felt that this design would be both practical (busy preceptors would not need to observe a whole encounter but only a more manageable part of an encounter) and intuitive, and therefore it would be an acceptable framework. For our horizontal axis, we chose the three levels of supervision as our milestones of competency attainment,6 which we felt were also intuitive for our preceptors. Supervisors naturally make these “entrustment” decisions as they work closely with residents (see Table 3). Again, others have used different milestones for their horizontal axis, depending on the anticipated use of their framework (e.g., postgraduate year, Dreyfus model level, etc.).22,23

Table 3

After setting the axes of our framework, we created a generic EPA template with descriptions in each cell of competence in that phase of the clinical encounter at that level of supervision. To generate this generic template, which we built iteratively from late 2011 to summer 2012, we drew from the competency literature and our experience working with learners at all stages of residency.10,20,24–31 We considered ideas about trustworthiness2,10,32; the concept of continuity in family medicine, realizing that each visit for a patient is not isolated in time but must incorporate previous and future care concerns; and educational theories about the development of increasing expertise and what that looks like in practice.23 We endeavored to capture what was unique about family medicine and what sets our discipline apart from others (see Table 3).
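
To make the structure of this template concrete, the following sketch (in Python, used here purely for illustration, not the authors' actual tooling) shows one way to represent the generic template as a grid of narrative benchmarks keyed by phase of the clinical encounter and level of supervision. The phase and level labels and the placeholder benchmark text are assumptions for illustration, not the CFPC's exact terminology or the authors' benchmark descriptions.

```python
# Illustrative sketch only: the generic EPA template as a grid of narrative
# benchmarks. Phase and supervision-level labels are assumed placeholders,
# not the exact CFPC terminology or the authors' benchmark text.

PHASES_OF_ENCOUNTER = [
    "hypothesis generation",
    "history",
    "physical examination",
    "investigations",
    "diagnosis",
    "management",
    "follow-up",
]

SUPERVISION_LEVELS = [
    "close supervision",          # assumed label for the earliest milestone
    "minimal supervision",
    "supervision for refinement",
]

# generic_template[phase][level] holds the narrative benchmark for that cell.
generic_template = {
    phase: {
        level: f"<description of competent performance in {phase} at {level}>"
        for level in SUPERVISION_LEVELS
    }
    for phase in PHASES_OF_ENCOUNTER
}

def benchmark(phase: str, level: str) -> str:
    """Return the narrative benchmark for one cell of the template."""
    return generic_template[phase][level]
```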

To ensure that our 35 EPAs also covered the competencies within the CanMEDS–Family Medicine roles, the priority topics, and the skill dimensions, two of us (K.S., J.G.) mapped/blueprinted the generic template back to the CanMEDS–Family Medicine roles and skill dimensions and the EPAs back to the priority topics (mapping available from the authors on request). Doing so reassured us that if our residents completed all the EPAs, we could confidently make decisions about their competency in all of the CFPC curriculum and assessment frameworks (Table 2).
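
To illustrate the blueprinting step just described, the sketch below shows a simple coverage check: confirm that the set of EPAs, taken together, maps onto every CanMEDS–Family Medicine role. The role names follow the CanMEDS–Family Medicine framework, but the EPA-to-role mapping shown is a made-up fragment for illustration, not the authors' actual blueprint.

```python
# Hedged sketch of a blueprinting coverage check: does every CanMEDS-Family
# Medicine role appear in at least one EPA's mapping? The mapping entries
# below are illustrative, not the authors' actual blueprint.

CANMEDS_FM_ROLES = {
    "family medicine expert", "communicator", "collaborator",
    "manager", "health advocate", "scholar", "professional",
}

epa_to_roles = {
    "Care of a patient with a chronic disease":
        {"family medicine expert", "communicator", "manager"},
    "Care of the intrapartum patient":
        {"family medicine expert", "collaborator", "professional"},
    # ... the remaining EPAs would be mapped here
}

covered = set().union(*epa_to_roles.values())
missing = CANMEDS_FM_ROLES - covered
if missing:
    print("Roles not yet covered by any EPA:", sorted(missing))
else:
    print("Every CanMEDS-FM role is covered by at least one EPA.")
```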

Step 3: Describe the Benchmarks Within Each EPA

From summer 2012 to spring 2013, we modified each cell in the generic template to create the 35 EPA-specific templates, incorporating the knowledge, skills, attitudes, values, and concepts of that EPA. This process was iterative, with two of us (K.S., J.G.) writing the majority of the EPA-specific templates, using topic-specific guidelines and our combined 55 years of clinical experience. The expert panel then provided feedback verbally and via e-mail, which we incorporated into the templates. We anticipate further modifications as we collect feedback from the preceptors and residents who are now using these templates in practice.
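
As a concrete (and entirely hypothetical) example of this modification step, the sketch below derives an EPA-specific template from a generic one by overriding individual cells. The EPA name comes from the article; the generic stub and the benchmark wording are invented for illustration.

```python
# Illustrative sketch: specializing the generic template for one EPA by
# overriding selected (phase, level) cells. All benchmark text is invented.

import copy

# A minimal stand-in for the full generic template sketched earlier.
generic_template = {
    "history": {"minimal supervision": "<generic benchmark for history>"},
    "management": {"minimal supervision": "<generic benchmark for management>"},
}

def make_epa_template(generic, overrides):
    """Copy the generic template and replace selected (phase, level) cells."""
    specific = copy.deepcopy(generic)
    for (phase, level), text in overrides.items():
        specific[phase][level] = text
    return specific

chronic_disease_template = make_epa_template(
    generic_template,
    {
        ("management", "minimal supervision"):
            "Negotiates an evidence-informed management plan that accounts for "
            "comorbidities, the patient's context, and continuity of care.",
    },
)
```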

Step 4: Decide How to Integrate the EPAs Into an Assessment System

For formative assessment, we wanted a system in which preceptors could assess the EPAs in the workplace on a daily basis. We have used field notes since 2009, first on paper, then electronically starting in 2011, for daily workplace-based assessment and feedback. Field notes are brief notes that document a resident’s performance in the clinical environment and summarize the verbal feedback given about his or her performance.33 In summer 2013, we linked the EPAs to our existing electronic field notes, allowing preceptors to make daily competency declarations about part of an EPA based on brief observations of performance. These assessments from different preceptors in different settings would accumulate over time. With the introduction of the EPA field note, a preceptor now codes for the CanMEDS–Family Medicine role, skill dimension, curriculum domain, phase of the clinical encounter, and EPA being assessed (a task that takes less than one minute); decides on a level of competency for that brief performance; and provides narrative feedback. Adding the EPAs to the field notes restructured a random process of feedback to focus on those activities/competencies deemed critical for our residents.
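
To make the content of a single EPA field note concrete, the sketch below models the fields described in this paragraph as a simple record. This is an illustrative assumption in Python, not the program's actual software or database schema, and the example values are invented.

```python
# Hedged sketch of one EPA field note as a record; field names and example
# values are illustrative, not the program's actual schema.

from dataclasses import dataclass
from datetime import date

@dataclass
class EPAFieldNote:
    resident_id: str
    preceptor_id: str
    observed_on: date
    canmeds_fm_role: str          # e.g., "communicator"
    skill_dimension: str          # e.g., "patient-centered approach"
    curriculum_domain: str        # e.g., "care of adults"
    phase_of_encounter: str       # e.g., "management"
    epa: str                      # e.g., "Care of a patient with a chronic disease"
    supervision_level: str        # the competency milestone for this brief observation
    narrative_feedback: str       # the written feedback given to the resident

note = EPAFieldNote(
    resident_id="R-042", preceptor_id="P-117", observed_on=date(2013, 9, 12),
    canmeds_fm_role="communicator", skill_dimension="patient-centered approach",
    curriculum_domain="care of adults", phase_of_encounter="management",
    epa="Care of a patient with a chronic disease",
    supervision_level="minimal supervision",
    narrative_feedback="Negotiated a realistic follow-up plan; next, confirm the patient's own goals first.",
)
```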

Our narrative descriptions of the expected performance within each cell of the EPA-specific templates could inform decisions about these brief performances. In the electronic field note, when a user hovers the cursor over the “details” link beside each level of supervision (see Supplemental Digital Appendix 1 at http://links.lww.com/ACADMED/A262), the narrative description of performance expected for that particular EPA, phase of the clinical encounter, and level of supervision appears. These descriptions are not meant to reduce complex tasks to a checklist of actions but to paint a picture of competence at the different levels of supervision.

Much as clinicians gain expertise by creating “illness scripts” for different patient presentations,34 these descriptions help preceptors gain expertise by creating “competency scripts” for the different levels of a resident’s performance. In addition, we hope that these “competency scripts,” in making performance standards explicit and describing each of the three levels of performance, will calibrate what preceptors expect of residents, thereby decreasing the subjectivity of their decisions. The descriptions also can inform the narrative feedback that preceptors provide. Finally, this design creates a road map for residents, explaining how they can improve their performance to reach the next milestone. This assessment tool, then, is not only one of learning but also one for learning.

Another important consideration in a CBA system is how summative competency declarations are made and distributed to stakeholders. At our institution, the existing CBA system is the portfolio assessment support system (PASS), which has two equally important components—each resident has an electronic portfolio and an academic advisor with whom she or he meets three times a year. While the electronic portfolio collects a number of assessments (e.g., rotation evaluations, objective structured clinical exam results, simulation course results, resident-as-teacher evaluations, multisource feedback), the most important CBA tool for evaluating most competencies is the EPA field note. Because the field notes are electronic, multiple preceptors in multiple settings over time can enter information, and that information can be collated and sorted according to the relevant skills and roles (see Supplemental Digital Appendix 2 at http://links.lww.com/ACADMED/A262) described in Table 2, then resorted according to the specific EPA (see Supplemental Digital Appendix 3 at http://links.lww.com/ACADMED/A262). By reviewing the electronic portfolio every four months, academic advisors can pull together data, make competency declarations, discuss these declarations with residents, and decide next steps to further residents’ competency development. Academic advisors then pass on these competency declarations to the program director who determines program completion and who forwards these declarations to the regulatory bodies.
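
The sketch below illustrates the kind of collation the portfolio supports: accumulated field notes are grouped by EPA and the most recently documented supervision level for each EPA is surfaced for the academic advisor. It assumes records like the EPAFieldNote sketch above (objects with epa, observed_on, and supervision_level attributes); it is not the PASS implementation.

```python
# Hedged sketch of collating field notes for an academic advisor's review:
# group by EPA, newest first, and surface the latest documented supervision
# level per EPA. Assumes records with .epa, .observed_on, and
# .supervision_level attributes (see the EPAFieldNote sketch above).

from collections import defaultdict

def collate_by_epa(notes):
    """Group field notes by EPA, most recent first within each group."""
    by_epa = defaultdict(list)
    for note in notes:
        by_epa[note.epa].append(note)
    for epa_notes in by_epa.values():
        epa_notes.sort(key=lambda n: n.observed_on, reverse=True)
    return dict(by_epa)

def latest_levels(notes):
    """Most recently documented supervision level for each EPA."""
    return {
        epa: epa_notes[0].supervision_level
        for epa, epa_notes in collate_by_epa(notes).items()
    }
```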

Tracking the time each resident takes to attain competency will identify outliers, both residents who are having difficulty and those who are excelling. Family medicine is a specialty in which residents develop competency simultaneously in many domains over the course of their residency, rather than sequentially, as may be the case in other specialties. In Canada, family medicine residency is a two-year program. We expect that our residents will have reached the minimal supervision level for all EPAs by the beginning of their second year and the supervision-for-refinement level by the end of their training.

In the future, we plan to collect data on all residents’ progression through the EPAs to better define expected trajectory and outliers. For residents falling behind this pace, we will determine an educational diagnosis to identify the cause(s) and modify their program accordingly. Likewise, for residents exceeding these expectations, we will incorporate enriched experiences into their remaining training and further develop their competency towards the next levels of proficiency and expertise.22 By doing so, we hope to counter the criticism that CBA does not promote development beyond competence.7,35
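
The sketch below illustrates one way such an outlier check could work, flagging a resident who has not reached the minimal supervision level on every EPA by the start of the second year. The level labels, their rank ordering, and the 12-month threshold are assumptions drawn from the expectations stated above, not a published rule.

```python
# Hedged sketch of a trajectory/outlier check. Level labels, their ordering,
# and the 12-month threshold are assumptions based on the expectations stated
# in the text, not the program's actual rule.

LEVEL_RANK = {
    "close supervision": 0,          # assumed label for the earliest milestone
    "minimal supervision": 1,
    "supervision for refinement": 2,
}

def epas_lagging(latest_level_by_epa, months_into_residency, all_epas):
    """Return the EPAs on which a resident lags the expected trajectory."""
    if months_into_residency < 12:
        return []  # the expectation applies from the start of year two onward
    expected = LEVEL_RANK["minimal supervision"]
    lagging = []
    for epa in all_epas:
        level = latest_level_by_epa.get(epa)
        if level is None or LEVEL_RANK[level] < expected:
            lagging.append(epa)
    return lagging
```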

Other Considerations

Introducing EPAs into our assessment system involved asking preceptors, academic advisors, and residents to do something new. Thus, our change management process was critical to the uptake of the EPAs. Concurrently with the steps described above, we held faculty development sessions to inform, seek input, and garner buy-in from preceptors and academic advisors and to further develop their assessment expertise. CBA occurs primarily in the workplace; hence, it relies heavily on expert opinions. Such a system is only as good as those experts, so faculty development is critical.35 We involved the residents in a similar process for the same reasons—to seek their input, secure their buy-in, and build their expertise in self-directed learning. We also recognize that the EPAs as we have defined them are a work in progress—curriculum objectives will change as societal needs change, which must be reflected in the EPAs, and the “competency scripts” will evolve, necessitating revisions to the EPA-specific templates.

We were not able to start this work with national consensus around the EPAs we included or the generic template we designed. We hope to conduct this work in the future and to modify the EPAs and template accordingly. However, we are optimistic that the expert panel with members from four programs in three provinces developed an accurate and helpful model to use as the foundation of future work and research.

Outcomes to Date

We have used the EPA field notes since July 2013. The total number of field notes completed for our 140 residents increased by approximately 10% (from 6,072 in the first 10 months of the 2012–2013 academic year when we used the generic field notes to 6,658 in the first 10 months of the 2013–2014 academic year when we used the EPA field notes). We developed a number of initiatives to encourage field note completion in general, so this increase cannot necessarily be attributed to the implementation of the EPA field notes alone. We are glad to see, though, that they are not a deterrent to preceptor engagement. It is still too early to formally assess the impact of the EPA field notes, but our future plans include conducting qualitative research to study the impact of the EPA field notes on residents’ competency development and preceptors’ and academic advisors’ confidence in making competency declarations. Although not the primary intention of the EPA field notes, the richness of the documentation of residents’ performance and the traceable trajectory of their competency development (or lack thereof) have successfully supported the decision to extend or (rarely) to terminate a resident’s program.

Finally, one of our expert panelists at the Université Laval is using our EPAs not as EPA field notes but as benchmarks for new competency-based, in-training evaluation reports, demonstrating the adaptability of the family medicine EPAs for other assessment strategies.

Conclusions

By developing EPAs for family medicine and incorporating them into electronic field notes and our PASS, we translated our multiple objectives and important but unwieldy, nonintegrative competencies into practical, manageable, measurable activities that allow us to formatively and summatively assess competency development. EPA field notes are a foundational tool to assess competency development in residents in a way that addresses some concerns about CBA and aligns with van der Vleuten’s utility index (validity × reliability × educational value × acceptability × cost-effectiveness).36 EPA field notes exhibit validity because they integrate the knowledge, skills, attitudes, and values of each EPA and apply them directly to patient care. They exhibit reliability because they allow multiple assessors to evaluate residents over time and in different contexts, and because, by creating “competency scripts,” they increase the expertise of the assessors. EPA field notes have educational value because the “competency scripts” give residents a road map to further develop their competency, making them tools for learning, not just assessments of learning. They exhibit acceptability because they are intuitive to residents and preceptors (incorporating phases of the clinical encounter and levels of supervision). Finally, EPA field notes are cost-effective because they keep assessment as part of the existing workplace environment.

Acknowledgments: The authors wish to acknowledge Jonathan Kerr, Wayne Weston, Robert Connelly, and Laura McEwen for their invaluable input in designing the entrustable professional activities; Danielle O’Keefe, Patti McCarthy, and Olle ten Cate for their review of this work and helpful comments; Glenn Brown, chair of the Queen’s University family medicine department, for his support of this work; and the Queen’s University family medicine residents and preceptors for their patience and feedback as this work was implemented.

References

1. ten Cate O, Scheele F. Competency-based postgraduate training: Can we bridge the gap between theory and clinical practice? Acad Med. 2007;82:542–547
2. ten Cate O. Trust, competence, and the supervisor’s role in postgraduate training. BMJ. 2006;333:748–751
3. Albanese MA, Mejicano G, Mullan P, Kokotailo P, Gruppen L. Defining characteristics of educational competencies. Med Educ. 2008;42:248–255
4. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287:226–235
5. Govaerts MJ. Educational competencies or education for professional competence? Med Educ. 2008;42:234–236
6. Frank JR, Mungroo R, Ahmad Y, Wang M, De Rossi S, Horsley T. Toward a definition of competency-based education in medicine: A systematic review of published definitions. Med Teach. 2010;32:631–637
7. Frank JR, Snell LS, ten Cate O, et al. Competency-based medical education: Theory to practice. Med Teach. 2010;32:638–645
8. Harden RM, Crosby JR, Davis MH, Friedman M. AMEE guide no. 14: Outcome-based education: Part 5—From competency to meta-competency: A model for the specification of learning outcomes. Med Teach. 1999;21:546–552
9. ten Cate O, Young JQ. The patient handover as an entrustable professional activity: Adding meaning in teaching and practice. BMJ Qual Saf. 2012;21(suppl 1):i9–i12
10. ten Cate O, Snell L, Carraccio C. Medical competence: The interplay between individual ability and the health care environment. Med Teach. 2010;32:669–675
11. van der Vleuten CP. The assessment of professional competence: Developments, research and practical implications. Adv Health Sci Educ Theory Pract. 1996;1:41–67
12. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 suppl):S63–S67
13. van der Vleuten CP, Schuwirth LW, Scheele F, Driessen EW, Hodges B. The assessment of professional competence: Building blocks for theory development. Best Pract Res Clin Obstet Gynaecol. 2010;24:703–719
14. ten Cate O. Entrustability of professional activities and competency-based training. Med Educ. 2005;39:1176–1177
15. Ontario College of Family Physicians. Vision 2020: Raising the Bar in Family Medicine and Ontario’s Primary Care Sector. Toronto, Ontario, Canada: Ontario College of Family Physicians; 2011. http://ocfp.on.ca/docs/publications/vision-2020---raising-the-bar-in-ontarios-healthcare-system---final.pdf. Accessed December 29, 2014
16. The Future of Medical Education in Canada (FMEC). http://www.afmc.ca/future-of-medical-education-in-canada/. Accessed December 29, 2014
17. Mulder H, Ten Cate O, Daalder R, Berkvens J. Building a competency-based workplace curriculum around entrustable professional activities: The case of physician assistant training. Med Teach. 2010;32:e453–e459
18. Jones MD Jr, Rosenberg AA, Gilhooly JT, Carraccio CL. Perspective: Competencies, outcomes, and controversy—linking professional activities to competencies to improve resident education and practice. Acad Med. 2011;86:161–165
19. Scheele F, Teunissen P, Van Luijk S, et al. Introducing competency-based postgraduate medical education in the Netherlands. Med Teach. 2008;30:248–253
20. Allen T, Bethune C, Brailovsky C, et al. Defining competence for the purposes of certification by the College of Family Physicians of Canada: The evaluation objectives in family medicine. Report of the Working Group on the Certification Process. 2010. http://www.cfpc.ca/uploadedFiles/Education/Certification_in_Family_Medicine_Examination/Definition%20of%20Competence%20Complete%20Document%20with%20skills%20and%20phases.pdf. Accessed December 29, 2014
21. Tannenbaum D, Konkin J, Parsons E, et al. CanMEDS–Family Medicine: Working Group on Curriculum Review. 2009. http://www.cfpc.ca/uploadedFiles/Education/CanMeds%20FM%20Eng.pdf. Accessed December 29, 2014
22. Dreyfus HL, Dreyfus SE. Mind Over Machine: The Power of Human Intuition and Expertise in the Era of the Computer. New York, New York: Free Press; 1988
23. Carraccio CL, Benson BJ, Nixon LJ, Derstine PL. From the educational bench to the clinical bedside: Translating the Dreyfus developmental model to the learning of clinical skills. Acad Med. 2008;83:761–767
24. McWhinney IR. A Textbook of Family Medicine. 3rd ed. Oxford, England: Oxford University Press; 2009
25. Sanche G, Audétat M-C, Laurin S. Aborder le raisonnement clinique du point de vue pédagogique—III. Les difficultés de raisonnement clinique à l’étape du traitement et du raffinement des hypothèses: La fermeture prématurée. Pédagogie Médicale. 2012;13:103–108
26. Laurin S, Audétat M-C, Sanche G. Aborder le raisonnement clinique du point de vue pédagogique—IV. Les difficultés de raisonnement clinique à l’étape du raffinement et du traitement des hypothèses: Les difficultés de priorisation. Pédagogie Médicale. 2012;13:109–114
27. Croskerry P. A universal model of diagnostic reasoning. Acad Med. 2009;84:1022–1028
28. Levenstein JH, McCracken EC, McWhinney IR, Stewart MA, Brown JB. The patient-centred clinical method. 1. A model for the doctor–patient interaction in family medicine. Fam Pract. 1986;3:24–30
29. Audétat M-C, Laurin S, Sanche G. Aborder le raisonnement clinique du point de vue pédagogique—II. Les difficultés de raisonnement clinique à l’étape du recueil initial des données et de la génération d’hypothèses. Pédagogie Médicale. 2012;12:231–236
30. Bordage G, Grant J, Marsden P. Quantitative assessment of diagnostic ability. Med Educ. 1990;24:413–425
31. Sanche G, Audétat M-C, Laurin S. Aborder le raisonnement clinique du point de vue pédagogique—VI. Les difficultés de raisonnement clinique à l’étape de l’élaboration du plan d’intervention. Pédagogie Médicale. 2012;13:209–214
32. Sterkenburg A, Barach P, Kalkman C, Gielen M, ten Cate O. When do supervising physicians decide to entrust residents with unsupervised tasks? Acad Med. 2010;85:1408–1417
33. Donoff MG. Field notes: Assisting achievement and documenting competence. Can Fam Physician. 2009;55:1260–1262, e100
34. Schmidt HG, Norman GR, Boshuizen HP. A cognitive perspective on medical expertise: Theory and implication. Acad Med. 1990;65:611–621
35. Leung WC. Competency based medical training: Review. BMJ. 2002;325:693–696
36. Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR. The role of assessment in competency-based medical education. Med Teach. 2010;32:676–682

Supplemental Digital Content

© 2015 by the Association of American Medical Colleges