A Core Components Framework for Evaluating Implementation of Competency-Based Medical Education Programs

Van Melle, Elaine PhD; Frank, Jason R. MD, MA(Ed); Holmboe, Eric S. MD; Dagnone, Damon MD, MSc, MMEd; Stockley, Denise PhD; Sherbino, Jonathan MD, MEd on behalf of the International Competency-based Medical Education Collaborators

doi: 10.1097/ACM.0000000000002743
Research Reports

Purpose The rapid adoption of competency-based medical education (CBME) provides an unprecedented opportunity to study implementation. Examining “fidelity of implementation”—that is, whether CBME is being implemented as intended—is hampered, however, by the lack of a common framework. This article details the development of such a framework.

Method A two-step method was used. First, a perspective indicating how CBME is intended to bring about change was described. Accordingly, core components were identified. Drawing from the literature, the core components were organized into a draft framework. Using a modified Delphi approach, the second step examined consensus amongst an international group of experts in CBME.

Results Two different viewpoints describing how a CBME program can bring about change were found: production and reform. Because the reform model was most consistent with the characterization of CBME as a transformative innovation, this perspective was used to create a draft framework. Following the Delphi process, five core components of CBME curricula were identified: outcome competencies, sequenced progression, tailored learning experiences, competency-focused instruction, and programmatic assessment. With some modification in wording, consensus emerged amongst the panel of international experts.

Conclusions Typically, implementation evaluation relies on the creation of a specific checklist of practices. Given the ongoing evolution and complexity of CBME, this work, however, focused on identifying core components. Consistent with recent developments in program evaluation, where implementation is described as a developmental trajectory toward fidelity, identifying core components is presented as a fundamental first step toward gaining a more sophisticated understanding of implementation.

E. Van Melle is senior education scientist, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada.

J.R. Frank is director of specialty education, strategy and standards, Office of Specialty Education, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada.

E.S. Holmboe is senior vice president for milestones development and evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois.

D. Dagnone is competency-based medical education faculty lead and associate professor, Department of Emergency Medicine, Faculty of Health Sciences, Queen’s University, Kingston, Ontario, Canada.

D. Stockley is professor and scholar in higher education, Office of the Vice-Provost (Teaching and Learning), Queen’s University, Kingston, Ontario, Canada.

J. Sherbino is assistant dean, Program for Education Research and Development, Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada.

Funding/Support: None reported.

Other disclosures: Eric Holmboe receives royalties from Elsevier Publishers.

Ethical approval: This study received ethical clearance through the Queen’s University Health Sciences and Affiliated Teaching Hospitals Research Ethics Board (HSREB) #6015151.

Previous presentations: An initial version of the framework was presented at the World Summit on Competency-Based Education, August 27 and 28, 2016, Barcelona, Spain.

Supplemental digital content for this article is available at http://links.lww.com/ACADMED/A670.

Correspondence should be addressed to Elaine Van Melle, 33 Hill St., Kingston, Ontario, Canada, K7L 2M4; email: vanmelle@queensu.ca.

Competency-based medical education (CBME) is rapidly being adopted across the globe.1 Consequently, educators and program leaders are in an unprecedented position to study implementation and evaluate outcomes of innovative curricula. However, these studies are hampered by the lack of a common description of what constitutes a CBME program. A similar situation faced problem-based learning (PBL) three decades ago when the absence of a standard description contributed to a series of inconsistent and inconclusive arguments regarding the impact of PBL.2–4 The purpose of this article is to describe a common framework that will permit reaching a deeper understanding of CBME programs, the influence of context, and the conditions under which they can work most effectively.5

The Importance of Evaluating CBME Program Implementation

Implementation evaluation is a specific form of program evaluation that examines the question “Is the program operating as intended?”6 It allows researchers to open the “black box” of program functioning.7 Without this information, there is a risk of producing a Type III error—that is, attributing negative findings to a failure in program theory when they may actually reflect an error in program implementation.8

For example, a 2013 study concluded that competency-based curricula do not produce graduates who are better prepared for medical practice,9 thereby challenging a key assumption underlying CBME. The implementation of CBME that was studied, however, was described as devoting 15% of curriculum time to competency development without any significant changes to teaching or learning. In contrast, the landmark description of CBME states, “Implementation of such a system demands substantial redefinition of faculty and student roles and responsibilities.”10(p55) Consequently, it is questionable whether this study actually examined a CBME curriculum.

Avoiding a Type III error is the most common reason cited for undertaking implementation evaluation. Other reasons include documenting deviations from, and differences in, implementation; allowing for more meaningful comparisons of interventions; and promoting external validity by providing adequate guidelines for implementation.11 Implementation evaluation also allows researchers and educators to provide evidence of whether what occurred in the program can reasonably be connected to outcomes.6 Asking questions about the connection between CBME program activities and outcomes is particularly important so that education can be connected to health care practice or patient outcomes.12,13

The Need for a Common Framework

Evaluating the fidelity of design elements in CBME, however, has been hampered by the lack of a shared understanding of what constitutes a CBME program.14 For example, Spady15 lists outcomes, time, instruction, and measurement as the four absolute minimum defining characteristics of competency-based education. In translating competency-based education to the medical field, Frank and colleagues16 describe a focus on curricular outcomes, an emphasis on abilities, a de-emphasis on time-based training, and the promotion of learner centeredness as key elements. More recently, Carraccio and Englander1 expanded the qualities of CBME to include the standardization of desired outcomes; a clear model of the trajectory for becoming an expert physician; evidence-based learning strategies; assessment tools based on care delivery; an emphasis on formative assessment; direct observation of learners; and quality relationships between learners and patients, mentors, and health care team members as essential practices in a CBME program.

Challenges in Creating a Common CBME Framework

Traditionally, the term “fidelity of implementation” (FOI) is applied to studies that evaluate program implementation. FOI is defined as the “proportion of program components that were implemented” and so represents “the adherence of actual, treatment delivery to the protocol originally developed.”17(p316) Simply stated, ensuring fidelity means implementing a program as designed exactly the same way every time.18 Typically, fidelity studies rely on creating consensus regarding the essential practices one would expect to see in a particular program.19,20 Although generic overarching categories can be used to organize the criteria (e.g., structure or process elements), the end result tends to be a checklist of specific activities used to measure and rate FOI.17,21,22

One issue with a checklist approach “involves the dynamic nature of programs”17(p330) because judgments about which program activities are essential evolve over time. For example, in CBME program implementation, the establishment of a clinical competency committee (CCC) seems only recently to be emerging as an essential practice.23,24 Furthermore, even assuming that there could be consensus regarding the essential practices, the intent of such lists is to create a dichotomous yes-or-no judgment.19,20 The quality of the implementation is not taken into consideration. Quality of implementation, however, makes a significant difference. For example, recent research reveals that not all CCCs operate in a common fashion.25 Some committees adopt a problem approach focusing primarily on identifying residents in difficulty, whereas others use a developmental approach focusing on the progress of all residents. A checklist approach does not allow for this critical differentiation.

Another challenge is that educational innovations such as CBME are often complex26; they have many different operating parts that can contribute to a variety of outcomes.27 Such innovations, therefore, are highly sensitive to context,28 and so the expectation of FOI—that a program will be implemented exactly the same way, using exactly the same practices in every circumstance—is simply unrealistic. Pérez et al29 argue that under such conditions adaptation—that is, allowing for changes to the original design as long as the integrity of the innovation is not compromised—should be the primary concern. Meeting this challenge requires avoiding a checklist or recipe-like approach and, rather, identifying components that “provide guidance that must be interpreted and applied contextually.”18(p254)

The purpose of this study was therefore to develop a common framework that allows the central question in implementation evaluation, “Has CBME been implemented as intended?” to be consistently applied across differing contexts.21,30

Method

To meet these challenges, Cousins et al19 describe focusing on important components of the innovation rather than on a checklist of practices. Defined as “an essential and indispensable element of an intervention,”31(p3) these core components provide an overarching organizer. The components are specific to the innovation yet robust enough to embrace different practices; in other words, they can be applied to a range of program contexts. Such core components are identified by making explicit the underlying perspective framing the change and identifying the key components that align with the viewpoint. Cousins et al describe the development as taking place in a collaborative fashion in order to strengthen validity and use.19 Accordingly, this was the approach taken in developing the CBME Core Components Framework (CCF). More specifically, we used a two-step method to develop a framework and then to achieve consensus among an international group of medical education experts on a CBME CCF.

Step 1: Developing a draft CCF

To develop the CBME CCF, we used an iterative process that unfolded over a 10-month period between February and December 2015. As recommended, we began by making explicit our understanding of the underlying perspective characterizing the nature of the intended curricular change.18 Situating the change within the educational literature on curriculum development led to an initial identification of core components. To create the framework, we drew from the literature on educational innovations and program evaluation. Accordingly, the CBME CCF was designed using three successive layers.32 The foundational layer links each core component to theories, models, or best practices informing CBME.33 A principles layer offers “a form of rich high-level counsel”34(p195) that can be used to guide implementation, and a practice layer captures the details of implementation. With support from health science and education librarians, two authors (E.V.M. and D.S.) examined the literature on medical education and education theory to identify seminal or key influential works informing the core components and accompanying layers. The CBME CCF was developed in consultation with a cross-section of stakeholders at Queen’s University in Kingston, Ontario, Canada, a Canadian university integrally involved in implementing CBME. Consisting of program directors at the planning, early, and later stages of implementation; institutional CBME leads; education and assessment specialists; and education researchers, this group of 12 individuals met twice to review the draft framework.

Step 2: Determining consensus

We adopted a Delphi approach to validate two layers of the CBME CCF.35,36 The Delphi approach was selected because, as a consensus technique, it allows for views to be expressed anonymously, thereby eliminating the influence that can occur through face-to-face meetings.37 We focused on two layers of the framework, the core components and principle statements, because the Delphi approach works best with higher-level concepts as opposed to an in-depth exploration of the topic.38 We created the expert panel by inviting an international group of CBME scholars to participate in the study. Referred to as the International Competency-based Medical Education Collaborators, this unique partnership was convened approximately 10 years ago to examine conceptual issues and current debates in CBME.39,40 With approximately 60 members, this group participates in monthly teleconferences to discuss developments in CBME, has produced a series of publications on CBME, and has hosted a set of webinars as well as two world summits.41 For the purposes of this study, those indicating interest among this international group of experts formed the Delphi expert panel.

At the beginning of each round, we emailed participants a survey. The five core components of the CBME CCF were used to create the survey for Round 1. Initiated in January 2016, this round also included a sixth question asking whether a core component was missing from the list. The principle statements from the CBME CCF formed the survey for Round 2, which took place in March 2016. In both rounds, panel members were asked to indicate their opinion using one of these options: “agree as worded,” “agree with rewording,” “disagree,” or “not sure.” Both rounds also offered the opportunity to provide additional comments on each survey item. The surveys were developed by the principal author (E.V.M.) and reviewed by the research team before distribution to the expert panel.

Because there is no standard method for defining consensus in a Delphi study, it is important to be explicit about the choice of decision point.42 In our case, the research team agreed that agreement from ≥ 70% of respondents was a reasonable predefined standard for consensus. The responses “agree as worded” and “agree with rewording” were added together to determine whether we had reached the ≥ 70% level. At the end of each round, the principal author (E.V.M.) summarized the results for discussion by the team of authors. The summary included a synthesis of comments regarding implications for required revisions to the CBME CCF.
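To make this decision rule concrete, the minimal Python sketch below shows how pooled agreement for a single survey item could be checked against the predefined ≥ 70% threshold; it is an illustration only, and the response counts are hypothetical rather than the study data.

```python
# Illustrative sketch of the consensus rule described above.
# The response counts are hypothetical and are not the study data.

CONSENSUS_THRESHOLD = 0.70  # predefined decision point


def reaches_consensus(responses: dict) -> bool:
    """Pool 'agree as worded' and 'agree with rewording' and test the threshold."""
    total = sum(responses.values())
    agree = responses.get("agree as worded", 0) + responses.get("agree with rewording", 0)
    return total > 0 and (agree / total) >= CONSENSUS_THRESHOLD


# Hypothetical item from a 25-member panel: 19/25 (76%) pooled agreement.
item_responses = {
    "agree as worded": 10,
    "agree with rewording": 9,
    "disagree": 4,
    "not sure": 2,
}
print(reaches_consensus(item_responses))  # True
```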

Results

Developing a draft CCF

As we worked to make explicit our understanding of the intended curricular change through CBME, we found two different perspectives used to frame the change to an outcomes-based approach in medical education: production and reform (Table 1).43,44 The production viewpoint draws from the manufacturing industry, where “medical schools, like factories, can produce highly desirable products adapted to user needs and desires.”43(pS41) Alternatively, the reform position focuses on “flexible, individually tailored programs that can adapt to variable rates of competence attainment.”43(pS4) These different perspectives of CBME lead to different curricular elements being emphasized. For example, in the production perspective, emphasis is placed primarily on assessment for the purposes of identifying problem learners. In the reform scenario, however, equal importance is placed on all curricular elements, aligned in such a way that all learners have the opportunity to develop the required competencies.44,45

Table 1

In deciding which perspective should guide the creation of the CBME CCF, we noted that the adoption of CBME is often described as a transformative change,1,46,47 requiring a significant shift in behaviors.48 Consequently, we adopted a reform characterization in developing the framework.

To identify specific core components, and in keeping with the reform perspective, we drew from the educational concept of “constructive alignment,” which emphasizes the importance of all curricular elements supporting each other and existing in a balanced ecosystem.44 More specifically, in an outcomes-based curriculum, constructive alignment requires teaching, learning, and assessment practices to be oriented toward learning outcomes.44,45

We then explored these curricular elements in light of our search for seminal literature informing CBME. Creating an ecosystem was supported by the description of CBME as an outcomes-based curriculum where “competence represents the goal of an educational programme, and a curriculum provides the mechanism through which competence is to be acquired.”10(p51) Accordingly, the articulation of explicit outcome competencies was identified as a central core component guiding the development of teaching, learning, and assessment. Furthermore, the literature on teaching, learning, and assessment in a CBME curriculum described distinct theoretical foundations reinforcing their inclusion as separate core components. We noted, however, that mastery learning, a concept fundamental to CBME, was not yet represented in the evolving framework. Described as the need to arrange competencies as a “sequential path through the programme” while still allowing for considerable flexibility in individual learner progression, mastery learning is a fundamental conceptual framework illuminating how CBME is supposed to work.33,49 Accordingly, the sequenced progression of competencies was identified as a core component. Conceptualizing CBME from a reform perspective therefore resulted in identification of the following five core components:

  • Outcome competencies
  • Sequenced progression
  • Tailored learning experiences
  • Competency-focused instruction
  • Programmatic assessment

Responding to the characterization of constructive alignment as occurring within an ecosystem, we envisioned the outcome competencies and sequenced progression as central core components guiding the development of learning, teaching, and assessment practices (Figure 1). In turn, this ecosystem is moderated by features unique to the local context such as the size of the program, availability of resources including learning experiences, and qualities of the learning environment.

Figure 1

These five core components were then used to organize and populate the full framework, with each layer informing the next, allowing for a core components framework specifically customized for a CBME program (Figure 2). We also created a list of seminal articles, identified through our review, that were critical to informing each of the core components (Supplemental Digital Appendix 1, available at http://links.lww.com/ACADMED/A670).
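As an illustration of how this layered structure might be represented for evaluation purposes, the sketch below encodes the framework as a simple nested data structure. The component names follow the list above; the foundation, principle, and practice entries are abbreviated placeholders, not the published wording of the CBME CCF.

```python
# Illustrative sketch only: component names come from the framework, but the
# layer entries below are abbreviated placeholders, not the published CCF wording.
from dataclasses import dataclass, field
from typing import List


@dataclass
class CoreComponent:
    name: str
    foundations: List[str] = field(default_factory=list)  # theories, models, best practices
    principles: List[str] = field(default_factory=list)   # high-level guiding counsel
    practices: List[str] = field(default_factory=list)    # context-specific implementation details


ccf = [
    CoreComponent("Outcome competencies", foundations=["outcomes-based curriculum design"]),
    CoreComponent("Sequenced progression", foundations=["mastery learning"]),
    CoreComponent("Tailored learning experiences"),
    CoreComponent("Competency-focused instruction"),
    CoreComponent("Programmatic assessment"),
]

# Local context (e.g., program size, available resources, learning environment)
# moderates how each component's practices are enacted in a given program.
for component in ccf:
    print(component.name)
```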

Figure 2

Reaching consensus on the five core components

The five core components and associated principle statements formed the basis for the consensus process. Of the 59 members of the international CBME collaborators group, 25 agreed to participate. Because it is suggested that a panel size should range from 10 to 30 participants,50 with an ideal size of no less than 10,37 this panel size was considered adequate. The two rounds had response rates of 100% and 96%, respectively. The specific results are described as follows.

During Round 1, over 70% agreement was reached for all five of the core components (Figure 3). From the analysis of the comments, two specific wording changes were identified. First, it was notable that 56% of the respondents chose “agree with rewording” for Component 2 (Figure 3). Analysis of the comments resulted in a revision of the original wording from “competencies are arranged progressively” to “competencies and their developmental markers are sequenced progressively” (Figure 2). The second change was in response to respondents consistently noting that learners do not follow a predefined trajectory as they acquire competencies. Accordingly, the phrase “the progressive development of competencies,” which appeared in three of the five components, was revised to “the developmental acquisition of competencies.” All changes described above were fully discussed and endorsed by the research group.

Figure 3

The principle statements were the specific focus of Round 2. Once again, agreement reached the predefined consensus level of 70% for all of the principle statements (Figure 3). Analysis of the comments did not reveal any specific themes related to suggested rewording. Rather, the comments indicated agreement with the principles but a desire to add more detail and to clarify terminology.

Discussion

We began development of a common framework to guide evaluation of CBME program implementation by making explicit our perspective that the adoption of CBME requires comprehensive curricular reform (Table 1). Drawing from the literature on medical education and education theory, we then identified five core components central to CBME program implementation. Initially, based on the concept of “constructive alignment,” the five core components were portrayed as an ecosystem, thereby reinforcing their applicability across all settings, balanced with the possibility for variation in specific practices as influenced by local context (Figure 1). We then used the five core components to form the basis of a multilevel framework (Figure 2). Using a Delphi approach, an expert panel validated the identification of five core components (Figures 3 and 4). This work addresses a number of challenges identified in evaluating CBME programs.

The focus on core components, rather than a checklist of practices, ensures that the framework can be applied equally across all contexts. For example, although specific assessment practices, or what are referred to as more surface-level features of CBME,32 might differ from program to program, the existence of a programmatic approach to assessment should be common across all programs. Embracing the influence of context is consistent with recent developments in program evaluation where it is recognized that under dynamic conditions, “one size does not fit all,” and so the reliance on rigid rules is being replaced by guiding principles.51 FOI accordingly focuses on the extent to which the program exemplifies integrity to key concepts rather than specific practices.18 In this fashion, the use of core components allows for the promotion of local innovation while fidelity is maintained.

The possibility for local innovation suggests that various configurations of CBME may emerge over time. For example, programs may use different combinations of academic advisors, CCCs, coaches, and mentors. Given this potential for variability in specific practices to evolve, implementation itself is described as a developmental trajectory toward fidelity.20 Over the course of this development, however, there may come a point where changes in surface features compromise the attainment and/or maintenance of fidelity.32 Because a synthesis of studies will be required to provide the rich explanatory analysis needed to understand program impact,5 identifying when an innovation such as CBME actually represents the proposed model becomes a critical question.52 The CCF we present here can help answer this question, thereby avoiding a Type III error.

As well, CBME is a complex service intervention consisting of multiple activities and outcomes. Therefore, creating an in-depth understanding of the relationship between activities and outcomes as implementation unfolds is critical.27 The five core components can support a more systematic and organized approach toward implementation evaluation—an approach that also allows for the identification of unanticipated outcomes, which is an important aspect of any evaluation effort. Ultimately, this level of clarity is called for if we wish to establish how to enhance future educational practice in a way that leads to the improvement of patient outcomes.53

Finally, beyond assisting program evaluation efforts, this framework can also provide guidance for the adoption of CBME in settings new to these concepts. For example, framed by the principle of local implementation being guided by global considerations, or “glocalization,” the CCF is being used to translate CBME into training in Taiwanese specialty medical education.54

Limitations of the study

The identification of core components specific to CBME was dependent on articulating a particular perspective seen to underlie the intended change. As described, we chose a reform perspective in which all curricular elements, not just assessment practices, are described as being integral to the change process. We fully recognize, however, that as a medical education innovation, CBME is still in the early stages of implementation. It is very possible that other perspectives will continue to emerge as we gain more experience with implementation, suggesting the need for further work in identifying and describing core components.

As well, the CCF is primarily focused on the design elements of CBME. It has recently been proposed, however, that the learning conditions underpinning the educational model are also important in influencing fidelity.47 That different learning conditions may exist is supported by our identification of two different orientations when implementing CBME: a problem approach and a developmental approach.25 Indeed, there is evidence to suggest that the experience of learning in an environment that focuses on deficiencies is very different from that driven by a growth mindset.55 Although the reform perspective underlying the core components assumes the importance of developing competence in all learners (Table 1), an explicit connection between the core components and learning conditions has not yet been considered. The application of complexity science could be particularly useful in understanding this relationship.28,56–58

In conclusion

As with PBL, it has been suggested that there are many challenges to the widespread uptake of CBME.59 Nonetheless, CBME is an innovation that is rapidly being adopted into practice. Initial evaluation of CBME programs points to the need for a more sophisticated understanding of implementation. Drawing on lessons learned from PBL, this article provides a standardized approach to moving this recommendation forward. With a focus on CBME as a transformative innovation, designed to promote the growth and development of all learners, the CBME CCF promotes clarity and consistency in evaluating program implementation.

The need for these studies to be theory based is also a recommendation stemming from our experience with PBL.60,61 This lesson learned parallels the call for more theory-driven efforts in the field of program evaluation.62 The deliberate linking of theory and practice, as exemplified by the full CBME CCF, provides a significant step forward toward meeting this challenge, ultimately bringing us one step closer to creating a refined understanding of the conditions under which CBME works most effectively to enhance patient care outcomes.

Figure 4

Acknowledgments: The authors would like to acknowledge and thank the group of medical education leaders and stakeholders at Queen’s University in Kingston, Ontario, Canada for providing feedback to the initial draft framework. The authors would also like to acknowledge support for conducting the Delphi process provided through the Royal College of Physicians and Surgeons of Canada. In particular, the authors would like to thank the International Competency-based Medical Education Collaborators for their ongoing interest in, and support of, this project.

References

1. Carraccio CL, Englander R. From Flexner to competencies: Reflections on a decade and the journey ahead. Acad Med. 2013;88:1067–1073.
2. Albanese MA, Mitchell S. Problem-based learning: A review of literature on its outcomes and implementation issues. Acad Med. 1993;68:52–81.
3. Berkson L. Problem-based learning: Have the expectations been met? Acad Med. 1993;68(10 suppl):S79–S88.
4. Vernon DT, Blake RL. Does problem-based learning work? A meta-analysis of evaluative research. Acad Med. 1993;68:550–563.
5. Pawson R, Greenhalgh T, Harvey G, Walshe K. Realist review—A new method of systematic review designed for complex policy interventions. J Health Serv Res Policy. 2005;10(suppl 1):21–34.
6. Patton MQ. Essentials of Utilization-Focused Evaluation. 2012. Thousand Oaks, CA: Sage Publications.
7. Funnell SC, Rogers PJ. Purposeful Program Theory: Effective Use of Theories of Change and Logic Models. 2011. San Francisco, CA: Jossey-Bass.
8. Dobson D. Avoiding a Type III error in program evaluation: Results from a field experiment. Eval Program Plann. 1980;3:269–276.
9. Kerdijk W, Snoek JW, van Hell EA, Cohen-Schotanus J. The effect of implementing undergraduate competency-based medical education on students’ knowledge acquisition, clinical performance and perceived preparedness for practice: A comparative study. BMC Med Educ. 2013;13:76.
10. McGaghie WC, Miller GE, Sajid AW, Telder TV. Competency-Based Curriculum Development in Medical Education: An Introduction. 1978. Geneva, Switzerland: World Health Organization.
11. Bryson JM, Patton MQ, Bowman RA. Working with evaluation stakeholders: A rationale, step-wise approach and toolkit. Eval Program Plann. 2011;34:1–12.
12. Chahine S, Kulasegaram KM, Wright S, et al. A call to investigate the relationship between education and health outcomes using big data. Acad Med. 2018;93:829–832.
13. Weinstein DF, Thibault GE. Illuminating graduate medical education outcomes in order to improve them. Acad Med. 2018;93:975–978.
14. Glasgow NJ, Wells R, Butler J, Gear A. The effectiveness of competency-based education in equipping primary health care workers to manage chronic disease in Australian general practice settings. Med J Aust. 2008;188(8 suppl):S92–S96.
15. Spady WG. Competency based education: A bandwagon in search of a definition. Educ Res. 1977;6:9–14.
16. Frank JR, Mungroo R, Ahmad Y, Wang M, De Rossi S, Horsley T. Toward a definition of competency-based education in medicine: A systematic review of published definitions. Med Teach. 2010;32:631–637.
17. Mowbray CT, Holter MC, Teague GB, Bybee D. Fidelity criteria: Development, measurement, and validation. Am J Eval. 2003;24:315–340.
18. Patton MQ. What is essential in developmental evaluation? On integrity, fidelity, adultery, abstinence, impotence, long-term commitment, integrity, and sensitivity in implementing evaluation models. Am J Eval. 2016;37:250–265.
19. Cousins BJ, Aubry TD, Fowler HS, Smith M. Using key component profiles for the evaluation of program implementation in intensive mental health case management. Eval Program Plann. 2004;27:1–23.
20. Hall GE, Hord SM. Implementing Change: Patterns, Principles, and Potholes. 2015. 4th ed. New York, NY: Pearson.
21. Century J, Cassata A, Rudnick M, Freeman C. Measuring enactment of innovations and the factors that affect implementation and sustainability: Moving toward common language and shared conceptual understanding. J Behav Health Serv Res. 2012;39:343–361.
22. Holter MC, Mowbray CT, Bellamy CD, MacFarlane P, Dukarski J. Critical ingredients of consumer run services: Results of a national survey. Community Ment Health J. 2004;40:47–63.
23. Andolsek K, Padmore J, Hauer KE, Edgar L, Holmboe E. Clinical Competence Committees: A Guidebook for Programs. 2017. 2nd ed. Chicago, IL: Accreditation Council for Graduate Medical Education; https://www.acgme.org/Portals/0/ACGMEClinicalCompetencyCommitteeGuidebook.pdf. Accessed March 28, 2019.
24. Royal College of Physicians and Surgeons of Canada. Competence by Design. Competence Committees. 2017. Ottawa, Ontario, Canada: Royal College of Physicians and Surgeons of Canada; http://www.royalcollege.ca/rcsite/cbd/assessment/competence-committees-e. Accessed March 28, 2019.
25. Hauer KE, Chesluk B, Iobst W, et al. Reviewing residents’ competence: A qualitative study of the role of clinical competency committees in performance assessment. Acad Med. 2015;90:1084–1092.
26. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and Evaluating Complex Interventions: New Guidance. 2006. London, UK: Medical Research Council; https://mrc.ukri.org/documents/pdf/complex-interventions-guidance. Accessed March 28, 2019.
27. Van Melle E, Gruppen L, Holmboe ES, Flynn L, Oandasan I, Frank JR; International Competency-based Medical Education Collaborators. Using contribution analysis to evaluate competency-based medical education programs: It’s all about rigor in thinking. Acad Med. 2017;92:752–758.
28. Holmboe E. The journey to competency-based medical education: Implementing milestones. Marshall J Med. 2017;3. http://dx.doi.org/10.18590/mjm.2017.vol3.iss1.2
29. Pérez D, Van der Stuyft P, Zabala Mdel C, Castro M, Lefèvre P. Erratum to: “A modified theoretical framework to assess implementation fidelity of adaptive public health interventions.” Implement Sci. 2016;11:106.
30. Carraccio C, Englander R, Van Melle E, et al; International Competency-based Medical Education Collaborators. Advancing competency-based medical education: A charter for clinician–educators. Acad Med. 2016;91:645–649.
31. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.
32. Varpio L, Bell R, Hollingworth G, et al. Is transferring an educational innovation actually a process of transformation? Adv Health Sci Educ Theory Pract. 2012;17:357–367.
33. Bordage G. Conceptual frameworks to illuminate and magnify. Med Educ. 2009;43:312–319.
34. Shulha LM, Whitmore E, Cousins BJ, Gilbert N, al Hudib H. Introducing evidence-based principles to guide collaborative approaches to evaluation: Results of an empirical process. Am J Eval. 2016;37:193–215.
35. Humphrey-Murto S, Varpio L, Wood TJ, et al. The use of the Delphi and other consensus group methods in medical education research: A review. Acad Med. 2017;92:1491–1498.
36. Valentijn PP, Vrijhoef HJ, Ruwaard D, Boesveld I, Arends RY, Bruijnzeels MA. Towards an international taxonomy of integrated primary care: A Delphi consensus approach. BMC Fam Pract. 2015;16:64.
37. Waggoner J, Carline JD, Durning SJ. Is there a consensus on consensus methodology? Descriptions and recommendations for future consensus research. Acad Med. 2016;91:663–668.
38. Hsu CC, Sanford BA. The Delphi technique: Making sense of consensus. Pract Assess Res Eval. 2007;12(10). http://pareonline.net/getvn.asp?v=12&n=10. Accessed April 10, 2019.
39. Frank JR, Snell L, Englander R, Holmboe ES; ICBME Collaborators. Implementing competency-based medical education: Moving forward. Med Teach. 2017;39:568–573.
40. Snell LS, Frank JR. Competencies, the tea bag model, and the end of time. Med Teach. 2010;32:629–630.
41. International Competency-based Medical Education Collaborators. http://gocbme.org/icbme-site/index.html#about. Accessed March 28, 2019.
42. von der Gracht HA. Consensus measurement in Delphi studies: Review and implications for future quality assurance. Technol Forecast Soc Change. 2012;79:1525–1536.
43. Hodges BD. A tea-steeping or i-Doc model for medical education? Acad Med. 2010;85(9 suppl):S34–S44.
44. Biggs J. Teaching for Quality Learning at University. 2003. 2nd ed. New York, NY: Open University Press.
45. Morcke AM, Dornan T, Eika B. Outcome (competency) based education: An exploration of its origins, theoretical basis, and empirical evidence. Adv Health Sci Educ Theory Pract. 2013;18:851–863.
46. Carraccio C, Wolfsthal SD, Englander R, Ferentz K, Martin C. Shifting paradigms: From Flexner to competencies. Acad Med. 2002;77:361–367.
47. Holmboe ES, Batalden P. Achieving the desired transformation: Thoughts on next steps for outcomes-based medical education. Acad Med. 2015;90:1215–1223.
48. Eckel PD, Kezar A. Key strategies for making new institutional sense: Ingredients to higher education transformation. Higher Educ Policy. 2003;16:39–53.
49. McGaghie WC. Mastery learning: It is time for medical education to join the 21st century. Acad Med. 2015;90:1438–1441.
50. Nair R, Aggarwal R, Khanna D. Methods of formal consensus in classification/diagnostic criteria and guideline development. Semin Arthritis Rheum. 2011;41:95–105.
51. Patton MQ. Principles-Focused Evaluation: The Guide. 2018. New York, NY: The Guilford Press.
52. Horsley T, Regehr G. When are two interventions the same? Implications for reporting guidelines in education. Med Educ. 2018;52:141–143.
53. Cook DA, West CP. Perspective: Reconsidering the focus on “outcomes research” in medical education: A cautionary note. Acad Med. 2013;88:162–167.
54. Chou F, Hsiao C, Chen CC. Does CBME translate across systems? Learning from the Taiwanese experience. Workshop presented at: International Conference on Residency Education; October 18–20, 2018; Halifax, Nova Scotia, Canada.
55. Dweck CS. Mindset: The New Psychology of Success. 2016. New York, NY: Random House.
56. Hawe P, Shiell A, Riley T. Complex interventions: How “out of control” can a randomised controlled trial be? BMJ. 2004;328:1561–1563.
57. Cilliers P. Understanding complex systems. In: Sturmberg JP, Martin CM, eds. Handbook of Systems and Complexity in Health. 2013. New York, NY: Springer.
58. Snyder S. The Simple, the Complicated and the Complex: Educational Reform Through the Lens of Complexity Theory. Organisation for Economic Co-operation and Development (OECD) Education Working Papers. 2013. Vol 96. Paris, France: OECD Publishing.
59. Boucher A, Frank JR, Van Melle E, Oandasan I, Touchie C. Competency Based Medical Education: A White Paper Commissioned by the AFMC Board of Directors. 2017. Ottawa, Ontario, Canada: Association of Faculties of Medicine of Canada; https://mededconference.ca/sites/default/files/AFMC-CompetencyBasedMedicalEducation_en.pdf. Accessed March 28, 2019.
60. Norman GR, Schmidt HG. Effectiveness of problem-based learning curricula: Theory, practice and paper darts. Med Educ. 2000;34:721–728.
61. Norman GR, Schmidt HG. Revisiting “Effectiveness of problem-based learning curricula: Theory, practice and paper darts.” Med Educ. 2016;50:793–797.
62. Brousselle A, Champagne F. Program theory evaluation: Logic analysis. Eval Program Plann. 2011;34:69–78.

© 2019 by the Association of American Medical Colleges