Using Contribution Analysis to Evaluate Competency-Based Medical Education Programs: It’s All About Rigor in Thinking

Van Melle, Elaine PhD; Gruppen, Larry PhD; Holmboe, Eric S. MD; Flynn, Leslie MD, MEd; Oandasan, Ivy MD, MHSc; Frank, Jason R. MD, MAEd; for the International Competency-Based Medical Education Collaborators

Academic Medicine. 2017;92(6):752–758. doi: 10.1097/ACM.0000000000001479

Competency-based education is a distinct approach to medical education that applies particularly, but not exclusively, to residency education. Organized around the sequential acquisition of the competencies required for practice, it represents a significant change from the standard time-based model.1–5 This change is being driven by many concerns, one of which is that the current medical education system is not well equipped to prepare physicians to practice in today’s rapidly evolving health care system.6–10 Consequently, competency-based medical education (CBME) is being implemented around the globe.11–14 With its emphasis on matching competencies to population needs, CBME provides a disciplined approach to ensuring that the curriculum serves the needs of the health care system.4,8

Despite the fact that CBME is founded on sound educational principles and advancements in educational practice,15 we have little evidence regarding whether, relative to traditional practices, it produces physicians who are better prepared for today’s practice environments, let alone whether it ultimately contributes to improved patient outcomes. Gathering such evidence requires rigorous evaluation of CBME programs and outcomes. Consequently, as an international group of collaborators, we set out to provide guidance for the evaluation of CBME programs.

Discussions at a two-day summit held in September 2013 and a review of the literature led us to consider CBME as a complex intervention that entails multiple activities combined in various ways to contribute to the achievement of outcomes.16 Accordingly, we suggest that nontraditional approaches that do not assume a direct cause-and-effect relation between activities and outcomes may offer the best approach to CBME program evaluation. The purpose of this article is to describe such an approach.

CBME as a Complex Service Intervention

A hallmark of CBME is that all curricular elements—learning experiences, teaching methods, and systems of assessment—work together in a way that promotes cumulative learning along the continuum of increasing medical sophistication.4,17 CBME therefore consists of many activities that interact in multiple ways. Another feature of CBME is the variety of outcomes stemming from program activities, a sample of which is provided in List 1. As shown, these outcomes can be categorized as proximal shorter-term outcomes primarily associated with program activities, and distal system-wide outcomes resulting from a variety of factors, which can include but are not limited to residency education. For example, in achieving the distal outcome of improved patient care, other factors, such as accessibility of health care services, also play a role. Distal outcomes tend to take years to unfold and, due to their broader scale, are typically referred to as impacts. These distal impacts are equivalent to the fourth, or “results,” level described in the Kirkpatrick model18 and as such are often the most challenging to attain and evaluate.

List 1

A Sample of Outcomes Associated With Competency-Based Medical Educationa

Proximal Outcomes: Program-Related Outcomes

  • Learners who are actively engaged in setting learning objectives in keeping with received feedback (e.g., learners who demonstrate self-regulated learning)
  • Learners who are competent at entry to practice
  • Ease of transition into practice
  • Earlier identification of learners who are not progressing as quickly as expected
  • Earlier identification of learners who are progressing faster than expected
  • Development of faculty expertise in providing feedback
  • A shift in the culture of assessment

Distal Outcomes: System-Wide Impacts

  • Improved patient care outcomes
  • Alignment of medical education with population health needs
  • A seamless learning experience across the continuum of medical education
  • More focused and relevant continuing professional development activities
  • Standard setting for maintenance of competence in relevant professional activities
  • Physicians who are able to engage in the pursuit of expertise and mastery
  • Enhanced portability of practice across jurisdictions

aA feature of competency-based medical education is the variety of outcomes stemming from program activities; this list provides a sample.

With its range of interacting components leading to multiple outcomes that unfold over time, CBME can be understood as a complex service intervention.16 Under conditions of complexity, components interact in different combinations, leading to the emergence of different outcomes. Child rearing might be said to exemplify this kind of complexity, in that “raising one child provides experience but is no guarantee of success with the next.”19(p9) And, just as parent–child interactions lie at the heart of a child’s development toward adulthood, so too do the interactions between faculty and a resident lie at the heart of the resident’s progress toward becoming a competent physician. Although parents typically exert the main influence in a family system, other factors also shape a child’s development. A similar complexity is at work in CBME: The interactions between faculty and residents are central to residents’ development of competence, but multiple other factors also come into play (see Figure 1). To any learning situation, each resident brings unique experiences and his or her own capacity to engage in CBME as a self-regulated learner,20,21 and each faculty member brings a unique ability to respond to this heterogeneity.22 As with any complex system, it is the quality of the interaction between the individuals within the CBME intervention that is most critical.23 This interaction must create the necessary feedback loops to amplify appropriate behavior and thus support the emergence of new patterns of competent performance.24

Figure 1:
Competency-based medical education as a complex service intervention, with a range of interacting curricular elements (e.g., learning experiences, teaching methods, and systems of assessment) which lead to multiple outcomes that unfold over time. The two inner arrows symbolize how the relationship between the interacting agents (e.g., teachers and learners) leads to emergent patterns of behavior, or competencies, over time.

Context is another key aspect of complex service interventions.25 For example, institutional context, including the system of assessment, level of faculty expertise, and organization of learning experiences, as well as the organization of local health care resources, can all serve to modulate faculty and resident interactions. By way of illustration, although residents are expected to achieve the same competencies, their contextualized learning experiences and interactions with faculty influence the specific trajectory required to attain competence. Consequently, no two CBME programs will ever be exactly alike, which makes evaluating CBME programs all the more challenging.

Challenges in CBME Program Evaluation: Contribution Versus Attribution

Evaluating CBME requires a change in mind-set.26 We need to move away from a traditional approach, in which “goals are easily agreed upon and can be precisely measured, and the cause and effect are well described,”27(p980) toward an appreciation that multiple possible pathways contribute to the achievement of program goals.28 In this section we elaborate on this theme of moving away from attribution analysis in favor of contribution analysis (CA).

Attribution analysis

Traditionally, attribution analysis has been used for program evaluation (see Table 1).29 This approach has worked well for reproducible interventions with clearly definable objectives that can be achieved relatively quickly. When these conditions are met, a direct causal relationship between the program activity and outcome may be inferred. Positivist experimental methods are most often used with attribution analysis; these rely on comparison groups and randomization to focus on a specific outcome variable and distribute “unwanted” variables evenly among groups.

Table 1:
Attribution Analysis Versus Contribution Analysis

Under conditions of complexity, however, the path to success is so involved that it cannot be articulated in advance.30 Controlling for variables is extremely challenging, as is the task of ensuring that interventions (learning experiences and curricula) are consistent and reproducible regardless of context. Consequently, traditional reductionist methods that rely on control and predictability tend to be inadequate in generating meaningful understanding of program impacts.27,28,31,32 Under conditions of complexity an approach is needed that examines the “full continuum and range of degrees of connection and relationship between the activities undertaken and the results observed.”26(p376) CA provides such an approach.

Contribution analysis

In CA, the key question posed in evaluation is “How much of a difference (or contribution) has the program made to the observed outcomes?”31(p54) In undertaking CA, a critical task is to open up the “black box” of program implementation33 to ensure a full understanding of both how and why the program has been implemented.34 Understanding both the how and the why helps us to describe an impact pathway: that is, a pathway that draws from theory to articulate the assumptions underlying the linkages between program activities and expected outcomes. The impact pathway is a way to describe the “theory of change”; CA is used to test this theory “against logic and evidence available … and either confirms or suggests revisions … where reality appears otherwise.”31(p54) The end result is a robust and credible contribution story that provides a clear description of what a program looks like in a particular context, how it has made a difference, and why.

Undertaking a CA requires a mind-set in which rigor in thinking becomes an overriding concern (see Table 1).26,35 By rigor in thinking, we mean the ability to provide a plausible description of the linkages between multiple program activities, their relationship to proximal and distal outcomes, and the underlying assumptions informing those connections. It entails the capacity to draw from different theories in articulating assumptions. In explaining findings, rigor in thinking requires the ability to consider multiple hypotheses beyond the original framing, as well as to identify where inferences may be stronger or weaker within the impact pathway.31,36 Rigor in methodology remains important, but focuses on collecting information from a variety of sources that shed light on the interpretation of how the program activities lead to observed outcomes.31 In fact, in keeping with the condition of complexity, it is expected that the contribution story will require several iterations to unfold within a particular context. In the following section we consider how such a CA could be applied to CBME.

Applying CA to CBME

CA was first developed as a way to move past the practice of simply monitoring the implementation of government programs to actually articulating the link between government programs and key outcomes, such as more jobs, a healthier public, or better living conditions.37 CA has been described as a theory-based approach to evaluation,38 and Mayne39 proposes that it provides a systematic way to make credible causal claims under conditions of complexity. Indeed, over the past 15 years CA has been applied to different settings26,40–42 and is developing into an increasingly sophisticated approach to program evaluation.31,39,42,43

Although CA has not yet been applied to medical education, it has been suggested that it can be a powerful approach to evaluating a major curriculum innovation for which multiple causal claims must be examined in a rigorous and credible fashion.27,35 However, recognizing that CA requires new understanding and capabilities, we wanted to provide guidance for its application to a CBME program. To do so we adapted an example from the literature31(p68) and developed a potential theory of change for CBME (see Figure 2). We then used this theory of change to describe how Mayne’s six-step CA model could apply to evaluating a CBME program.31,37

Figure 2:
An example of a theory of change for competency-based medical education that captures the focus and scope of a contribution analysis. The overall program theory is that a focus on competencies required for practice allows for the individualized and progressive development of competence and therefore results in learners who are better prepared to enter into practice and provide quality patient care. In the figure, assumptions and risks are labeled (C) if the program has control over the assumptions and risks, (DI) if it has direct influence over them, (II) if it has indirect influence over them, or (0) if it has no influence over them.

Step 1: Set out the cause-and-effect issue to be addressed

Acknowledging that CBME is a complex service intervention, and accordingly embracing the question of contribution versus attribution (see Table 1), is a critical first step in conducting a CA. An understanding that the contribution of program activities to outcomes will be multifaceted and multilayered will naturally follow. The challenge then is to determine the desired focus and scope of the CA in a particular setting where CBME has been implemented. Clearly articulating how other influencing factors may shape the implementation of CBME in a particular institutional context is an important part of this opening discussion.

Step 2: Develop a postulated theory of change

A central task in CA is to illustrate a theory of change that captures the focus and scope of the CA. Figure 2 presents an example of such a theory for CBME. The external influences identified in Figure 2 provide the starting point. The challenge is to identify and highlight factors that are most likely to influence the outcomes of interest. The results chain, in which the proximal and distal outcomes are arranged in a temporal fashion, forms the backbone of the theory of change.33,34 In this particular example, a number of outcomes were selected from List 1 to create the results chain presented in Figure 2.

Figure 2 provides a generic model that can be adapted to any CBME program. The outcome statements are deliberately nonspecific—for instance, learners who are competent at entry to practice (proximal outcome), or improved patient-care outcomes (distal outcome). Outcome statements, however, should become more precise when they are used to create a theory of change for a particular specialty program—for example, patient care outcomes of particular relevance to general surgery—in which case the results chain can be recast accordingly. Indeed, depending on the focus of the CA and the number of influences or factors involved, the results chain may range from being quite simple to being very complex with many boxes and cross-linking arrows.31,44

Linking the outcomes to program activities is another critical element of the results chain. This is captured by the outputs box of Figure 2, which describes the anticipated products that should arise as the CBME program is implemented—that is, patterns of progression, individualized learning pathways, and personal learning plans. Therefore, articulating program outputs, or the anticipated products that lead to outcomes, is also an important aspect of forming a results chain.

To complete the theory of change, the assumptions and risks anticipated for the outcomes and outputs should be described. The statements should be grounded in theories related to the CBME field and should reflect, and readily resonate with, various stakeholder assumptions.34 This can be particularly challenging in CBME, where multiple theories underlie program activities.15,45 Accordingly, a guide capturing mechanisms of action can be helpful.26 Chart 1 is an example of a guide that may be used to inform the assumptions and risks (see Figure 2). In this particular example, we have drawn from the concepts of self-regulated learning and programmatic assessment to shape a list of assumptions and risks. In Chart 1, the list of assumptions (right-hand column) points toward theories of action to explain why CBME works. The risks (left-hand column) relate to how the program was implemented, or the implementation theory. Attention must also be paid to the extent to which the program has control over the assumptions and risks (see Figure 2’s legend). Generally, the more distal the outcome (e.g., improved patient care outcomes), the more other variables outside the control of the program come into play. The program theory, or the overarching statement that provides a summary of the theory of change, is a grounding feature of the process31 (see the first complete sentence of Figure 2’s legend).

Chart 1:
Mechanisms of Action Applicable to Competency-Based Medical Educationa
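
For programs that wish to operationalize Step 2, the postulated theory of change can also be captured in a simple, structured form so that links, assumptions, risks, and the degree of program control can be tracked across iterations of the CA. The sketch below (in Python) is purely illustrative: the outcome and output names are drawn from List 1 and Figure 2, but the data structure itself and field names such as "influence" are our own assumptions rather than part of Mayne's model.

from dataclasses import dataclass, field
from enum import Enum
from typing import List

class Influence(Enum):
    """Degree of program influence, mirroring the labels in Figure 2."""
    CONTROL = "C"
    DIRECT = "DI"
    INDIRECT = "II"
    NONE = "0"

@dataclass
class Link:
    """One link in the results chain, with its underlying assumptions and risks."""
    from_element: str            # activity, output, or outcome at the tail of the arrow
    to_element: str              # output or outcome at the head of the arrow
    assumptions: List[str] = field(default_factory=list)
    risks: List[str] = field(default_factory=list)
    influence: Influence = Influence.DIRECT

@dataclass
class TheoryOfChange:
    program_theory: str          # the overarching summary statement
    external_influences: List[str]
    links: List[Link]

# A fragment of the generic CBME theory of change in Figure 2 (illustrative only).
toc = TheoryOfChange(
    program_theory=(
        "A focus on competencies required for practice allows individualized, "
        "progressive development of competence, producing learners who are better "
        "prepared to enter practice and provide quality patient care."
    ),
    external_influences=["organization of local health care resources"],
    links=[
        Link(
            from_element="systems of assessment",
            to_element="individualized learning pathways (output)",
            assumptions=["faculty provide feedback that residents use to set learning goals"],
            risks=["assessment data are collected but not reviewed with the resident"],
            influence=Influence.DIRECT,
        ),
        Link(
            from_element="learners competent at entry to practice (proximal outcome)",
            to_element="improved patient care outcomes (distal outcome)",
            assumptions=["competencies are aligned with population health needs"],
            risks=["access to health care services limits the effect of training"],
            influence=Influence.NONE,
        ),
    ],
)

Recording the theory of change in this way is not required by CA; it is simply one means of keeping the links, assumptions, and risks visible as the contribution story is revisited in later steps.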

Step 3: Gather existing evidence

In Step 3, the focus is to test the theory of change against logic and available evidence. A wide range of evidence could be gathered, and so the key is to determine where concrete evidence is available and is most needed. For example, in Figure 2 any one of the following questions could form the focus for gathering evidence.

  1. What do pathways look like for individual residents as they progress toward competence?
  2. How do the curricular activities (e.g., learning experiences, teaching methods, assessment systems) associated with CBME contribute to the development of these pathways?
  3. How have these pathways contributed to the development of competence upon entry to practice?
  4. To what extent are these competencies aligned with the health needs of the local population?
  5. What is the relationship between the competencies and patient care outcomes?

These questions cover both program processes (e.g., Was the program implemented as intended?—See questions 1 and 2) as well as outcomes (e.g., What was the impact of the program?—See questions 3, 4, and 5).

CA relies on multiple sources of evidence (e.g., quantitative, qualitative, a review of previous studies or the literature) to tell the story of how a program is making a difference in achieving specific proximal and distal outcomes. For example, producing pathways of progression, or learning curves, requires the compilation of quantitative data.46 In CBME this can include using milestones to chart measurable progression toward competence.47 Qualitative data, such as interviews with learners and/or teachers, are useful for describing the types of instructional strategies used and how they may influence the trajectory of the learning curve. A review of local reports might be required to determine the health needs of the local population and other relevant local needs. Gathering evidence primarily focuses on three areas in the theory of change: the observed results articulated in the results chain, the testing of assumptions, and the influence of external factors.
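
As a concrete illustration of the quantitative strand of this evidence gathering, milestone ratings collected over time can be compiled into a cohort learning curve and compared against individual pathways of progression. The short Python sketch below is a hypothetical example only; the record format (resident identifier, month of training, numeric milestone level) is assumed for illustration and is not prescribed by CA.

from collections import defaultdict
from statistics import mean

# Hypothetical assessment records: (resident_id, month_of_training, milestone_level).
records = [
    ("R01", 1, 1.5), ("R01", 6, 2.0), ("R01", 12, 3.0),
    ("R02", 1, 1.0), ("R02", 6, 2.5), ("R02", 12, 2.5),
]

def learning_curve(records):
    """Average milestone level at each month of training, across residents."""
    by_month = defaultdict(list)
    for _resident, month, level in records:
        by_month[month].append(level)
    return {month: mean(levels) for month, levels in sorted(by_month.items())}

def individual_pathway(records, resident_id):
    """A single resident's trajectory, for comparison against the cohort curve."""
    return sorted((month, level) for r, month, level in records if r == resident_id)

print(learning_curve(records))             # {1: 1.25, 6: 2.25, 12: 2.75}
print(individual_pathway(records, "R01"))  # [(1, 1.5), (6, 2.0), (12, 3.0)]

Summaries such as these address questions 1 and 2 above; they would then be read alongside qualitative and contextual evidence rather than interpreted on their own.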

Step 4: Assemble the contribution story

Using the available evidence, a contribution story can now be assembled. In keeping with Steps 1 to 3, a credible contribution story contains the following elements31(p87):

  • the context is thoroughly described;
  • the theory of change is plausible;
  • activities, resulting outputs, and observed outcomes are well described;
  • the evidence behind the key links in the theory of change is presented, including a discussion of the strengths and weaknesses;
  • the results of the analysis are explained; and
  • other influencing factors are discussed.

Creating such a story helps to determine which links in the chain are supported by strong evidence and which may be weak.43,44,48
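
Continuing the illustrative structure sketched under Step 2, the strength of evidence behind each link in the results chain can be recorded alongside the link itself, making explicit which parts of the contribution story are well supported and which should become priorities for Step 5. The rating scheme below is again our own hypothetical convention, not part of the CA literature.

from enum import Enum

class Evidence(Enum):
    STRONG = "multiple converging sources support this link"
    MODERATE = "some supporting evidence; alternative explanations not ruled out"
    WEAK = "little or conflicting evidence; a priority for further data collection"

# Hypothetical ratings keyed by (from_element, to_element) pairs from the results chain.
link_evidence = {
    ("systems of assessment", "individualized learning pathways (output)"): Evidence.STRONG,
    ("learners competent at entry to practice (proximal outcome)",
     "improved patient care outcomes (distal outcome)"): Evidence.WEAK,
}

# Links rated WEAK become the focus of Step 5 (seek out additional evidence).
to_strengthen = [link for link, rating in link_evidence.items() if rating is Evidence.WEAK]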

Step 5: Seek out additional evidence

In assembling the contribution story, it may become apparent that more data are needed to improve understanding or to strengthen certain linkages of the theory of change. Additional data can be gathered through multiple methods, including direct measurement of results and linkages, surveys, more in-depth literature searches, and examining existing databases. Because the overall goal is to improve the credibility of the contribution story, triangulation of findings is an important consideration. In undertaking both Steps 4 and 5, it may also become apparent that the theory of change needs to be revised. Indeed, creating a theory of change should always be viewed as an ongoing, iterative process allowing for an ever-deepening understanding of program activities and outcomes.

Step 6: Revise and strengthen the contribution story

Using this new evidence, a stronger contribution story can be created. Indeed, the entire process of CA is best used as an iterative approach in which Steps 3 to 6 form an ongoing cycle that deepens our understanding of the relationship between CBME program activities, proximal outcomes related to individual pathways of progression, and distal outcomes of better patient care. Program evaluation using CA is not intended to be a one-off activity; rather, it is meant to provide a story that unfolds over time. Experience suggests that it may be too challenging to examine all causal factors at once.36 At times, it may be more feasible for a program to explore one factor one year and another the following year, thereby building an overall contribution story. When the overall theory of change is particularly complex, as may be the case with CBME, a number of smaller, focused evaluations may be preferable to a single comprehensive CA.31

Getting Started

Unlike other theory-based approaches (e.g., realistic evaluation), CA requires that the theory of change clearly capture stakeholders’ assumptions.27 Consequently, CA relies on deliberation and consensus building among multiple stakeholders, particularly at the inception phase.35,36 Thus, it is recommended that those implementing CBME be engaged in this process at the outset.26,29 Furthermore, they should have access to individuals with expertise in theories related to CBME and quantitative and qualitative methods of research.

The Importance of CA to CBME

As a complex service intervention, in which multiple activities combine in various ways to contribute to the achievement of outcomes, CBME requires a different approach to evaluation: an approach that links activities, outcomes, and theory. We provide a six-step approach to undertaking such an analysis. The central challenge of this approach, which we refer to as CA, is to develop a theory of change and then test this theory against logic and available evidence.

Accordingly, in conducting a CA, rigor in thinking, together with the ability to consider multiple hypotheses and pathways, becomes foundational. This focus fits well with current directions in health professions education where opening the black box of program implementation can provide a powerful understanding of how theory can advance practice.25,49,50 CA is also consistent with current directions in program evaluation that aim to capture the increasing sophistication of practice51 and for which the ability to integrate complex information is becoming a key skill.52

Given the level of expertise, experience, and effort required, it is important to acknowledge that undertaking a CA may not be feasible in every situation. Nonetheless, acquiring an understanding of CA, particularly the rigor in thinking required to link program activities, outcomes, and theory, will serve to strengthen our understanding of how CBME can lead to better-prepared physicians and improved patient outcomes. Indeed, it is increasingly recognized that conditions of complexity require the synthesis of multiple studies.53 With CBME on the verge of widespread adoption,5 the time is right to carefully elucidate what CBME looks like in a particular context, how it has made a difference, and why. CA is a powerful vehicle for providing such an approach.

Acknowledgments: The authors would like to acknowledge editorial support provided through the Royal College of Physicians and Surgeons of Canada.

References

1. Carraccio C, Wolfsthal SD, Englander R, Ferentz K, Martin C. Shifting paradigms: From Flexner to competencies. Acad Med. 2002;77:361–367.
2. Frank JR, Snell LS, Cate OT, et al. Competency-based medical education: Theory to practice. Med Teach. 2010;32:638–645.
3. Hodges BD. A tea-steeping or i-Doc model for medical education? Acad Med. 2010;85(9 suppl):S34–S44.
4. McGaghie WC, Miller GE, Sajid AW, Telder TV. Competency-Based Curriculum Development in Medical Education: An Introduction. Geneva, Switzerland: World Health Organization; 1978. Public health papers no. 68.
5. Carraccio CL, Englander R. From Flexner to competencies: Reflections on a decade and the journey ahead. Acad Med. 2013;88:1067–1073.
6. Baker GR, Norton PG, Flintoft V, et al. The Canadian Adverse Events Study: The incidence of adverse events among hospital patients in Canada. CMAJ. 2004;170:1678–1686.
7. Choudhry NK, Fletcher RH, Soumerai SB. Systematic review: The relationship between clinical experience and quality of health care. Ann Intern Med. 2005;142:260–273.
8. Frenk J, Chen L, Bhutta ZA, et al. Health professionals for a new century: Transforming education to strengthen health systems in an interdependent world. Lancet. 2010;376:1923–1958.
9. Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academies Press; 2001.
10. Shaw R, Drever F, Hughes H, Osborn S, Williams S. Adverse events and near miss reporting in the NHS. Qual Saf Health Care. 2005;14:279–283.
11. Frank JR, Danoff D. The CanMEDS initiative: Implementing an outcomes-based framework of physician competencies. Med Teach. 2007;29:642–647.
12. Batalden P, Leach D, Swing S, Dreyfus H, Dreyfus S. General competencies and accreditation in graduate medical education. Health Aff (Millwood). 2002;21:103–111.
13. Simpson JG, Furnace J, Crosby J, et al. The Scottish doctor—Learning outcomes for the medical undergraduate in Scotland: A foundation for competent and reflective practitioners. Med Teach. 2002;24:136–143.
14. Confederation of Postgraduate Medical Education Councils. Australian curriculum framework for junior doctors. http://curriculum.cpmec.org.au/. Accessed August 22, 2016.
15. Swing SR; International CBME Collaborators. Perspectives on competency-based medical education from the learning sciences. Med Teach. 2010;32:663–668.
16. Medical Research Council. Developing and Evaluating Complex Interventions: New Guidance. London, UK: Medical Research Council; 2008.
17. Gruppen LD, Mangrulkar RS, Kolars JC. The promise of competency-based education in the health professions for improving global health. Hum Resour Health. 2012;10:43.
18. Kirkpatrick DL, Kirkpatrick JD. Evaluating Training Programs. 3rd ed. San Francisco, CA: Berrett-Koehler Publishers; 2006.
19. Westley F, Zimmerman B, Patton MQ. Getting to Maybe: How the World Is Changed. Toronto, Ontario, Canada: Vintage Canada; 2006.
20. Artino AR, Brydges R, Gruppen LD. Self-regulated learning in healthcare profession education: Theoretical perspectives and research methods. In: Cleland J, Durning SJ, eds. Researching Medical Education. Oxford, UK: John Wiley & Sons; 2015:155–166.
21. Schunk DH, Zimmerman BJ. Motivation and Self-Regulated Learning: Theory, Research, and Applications. Hillsdale, NJ: Lawrence Erlbaum Associates; 2012.
22. Cooper H, Geyer R. Using “complexity” for improving educational research in health care. Soc Sci Med. 2008;67:177–182.
23. North American Primary Care Research Group. A complexity science primer: What is complexity science and why should I learn about it? NAPCRG Resources. August 2009. https://www.napcrg.org/Portals/51/Documents/Beginner%20Complexity%20Science%20Module.pdf. Accessed August 22, 2016.
24. Capra F. The Hidden Connections: A Science for Sustainable Living. New York, NY: Random House; 2004.
25. Regehr G. It’s NOT rocket science: Rethinking our metaphors for research in health professions education. Med Educ. 2010;44:31–39.
26. Patton MQ. A utilization-focused approach to contribution analysis. Evaluation. 2012;18:364–377.
27. Dauphinee WD. The role of theory-based outcome frameworks in program evaluation: Considering the case of contribution analysis. Med Teach. 2015;37:979–982.
28. Sturmberg JP, Martin CM. Complexity in health: An introduction. In: Sturmberg JP, Martin CM, eds. Handbook of Systems and Complexity in Health. New York, NY: Springer; 2013:1–17.
29. Patton MQ. Essentials of Utilization-Focused Evaluation. Thousand Oaks, CA: Sage Publications; 2012.
30. Rogers PJ. Using programme theory to evaluate complicated and complex aspects of interventions. Evaluation. 2008;14:29–48.
31. Mayne J. Contribution analysis: Addressing cause and effect. In: Forss K, Marra M, Schwartz R, eds. Evaluating the Complex: Attribution, Contribution and Beyond. New Brunswick, NJ: Transaction Publishers; 2011.
32. Haji F, Morin MP, Parker K. Rethinking programme evaluation in health professions education: Beyond “did it work?” Med Educ. 2013;47:342–351.
33. Funnell SC, Rogers PJ. Purposeful Program Theory: Effective Use of Theories of Change and Logic Models. San Francisco, CA: Jossey-Bass; 2011.
34. Weiss CH. Which links in which theories shall we evaluate? New Dir Eval. 2000;87:35–45.
35. Moreau KA, Eady K. Connecting medical education to patient outcomes: The promise of contribution analysis. Med Teach. 2015;37:1060–1062.
36. Delahais T, Toulemonde J. Applying contribution analysis: Lessons from five years of practice. Evaluation. 2012;18:281–293.
37. Mayne J. Addressing attribution through contribution analysis: Using performance measures sensibly. Can J Program Eval. 2001;16:1–24.
38. Chen HT. Theory-Driven Evaluations. Newbury Park, CA: Sage Publications; 1990.
39. Mayne J. Contribution analysis: Coming of age? Evaluation. 2012;18:270–280.
40. Biggs JS, Farrell L, Lawrence G, Johnson JK. A practical example of contribution analysis to a public health intervention. Evaluation. 2014;20:214–229.
41. Rotem A, Zinovieff MA, Goubarev A. A framework for evaluating the impact of the United Nations fellowship programmes. Hum Resour Health. 2010;8:7.
42. Wimbush E, Montague S, Mulherin T. Applications of contribution analysis to outcome planning and impact evaluation. Evaluation. 2012;18:310–329.
43. Lemire ST, Nielsen SB, Dybdal L. Making contribution analysis work: A practical framework for handling influencing factors and alternative explanations. Evaluation. 2012;18:294–309.
44. Mayne J. ILAC brief 26: Making causal claims. http://www.managingforimpact.org/resource/ilac-brief-26-making-causal-claims. Published October 2012. Accessed August 22, 2016.
45. Hodges BD, Kuper A. Theory and practice in the design and conduct of graduate medical education. Acad Med. 2012;87:25–33.
46. Pusic MV, Boutis K, Hatala R, Cook DA. Learning curves in health professions education. Acad Med. 2015;90:1034–1042.
47. Meade LB, Borden SH, McArdle P, Rosenblum MJ, Picchioni MS, Hinchey KT. From theory to actual practice: Creation and application of milestones in an internal medicine residency program, 2004–2010. Med Teach. 2012;34:717–723.
48. Mayne J. Contribution analysis: An approach to exploring cause and effect. ILAC brief 16. betterevaluation.org/resources/guides/contribution_analysis/ilac_brief. Published May 2008. Accessed August 22, 2016.
49. Bordage G. Conceptual frameworks to illuminate and magnify. Med Educ. 2009;43:312–319.
50. Cook DA, West CP. Reconsidering the focus on “outcomes research” in medical education: A cautionary note. Acad Med. 2013;88:1–6.
51. Altschuld JW, Engle M. The inexorable historical press of the developing evaluation profession. New Dir Eval. 2015;145:5–19.
52. Scriven M. On the differences between evaluation and social science research. Eval Exchange. 2004;9:1–19.
53. Rycroft-Malone J, McCormack B, Hutchinson AM, et al. Realist synthesis: Illustrating the method for implementation research. Implement Sci. 2012;7:1–10.
Copyright © 2017 by the Association of American Medical Colleges