Research Reports

Flipping the Quality Improvement Classroom in Residency Education

Bonnes, Sara L. MD, MS; Ratelle, John T. MD; Halvorsen, Andrew J. MS; Carter, Kimberly J. MD; Hafdahl, Luke T. MD; Wang, Amy T. MD; Mandrekar, Jayawant N. PhD; Oxentenko, Amy S. MD; Beckman, Thomas J. MD; Wittich, Christopher M. MD, PharmD

doi: 10.1097/ACM.0000000000001412


Experts in medical education have observed that in-depth learning and retention require teaching approaches that are electronically accessible, learner focused, and interactive.1–4 Graduate medical training traditionally entails many hours of attending lectures and completing assignments, yet research suggests that learners’ engagement and knowledge retention are limited by this pedagogical approach.5–8 The flipped classroom (FC) model—which for this study is defined as delivery of core content to students independently before class, often using electronic technology, with class time devoted to applying the core content in facilitated group discussions1,6,9—has the potential to revolutionize residency education. However, few studies of the FC model in medical education have been conducted.

The FC concept originated for use with high school and undergraduate students.5,10 Recently, the Khan Academy developed online video sessions for elementary and high school students to improve their understanding of core educational topics, and some school districts have used this instructional content to flip their classrooms.1,4 There are some examples of the FC model within the health professions as well; when implemented in pharmacy education, the FC increased class attendance, student learning, and perceived value of the model.6,11 Additionally, flipping the core biochemistry course at Stanford Medical School increased class attendance from 30% to 80%.4 In the era of electronic learning, the FC model has the potential to engage and educate residents who must balance their educational and clinical demands.4,12

In particular, with this study, we saw an opportunity to apply the FC model to quality improvement (QI) curricula. The Accreditation Council for Graduate Medical Education (ACGME) expects residents to participate in identifying and solving systems errors as part of the systems-based practice and practice-based learning and improvement requirements.13 Most programs have developed QI curricula to fulfill this obligation.14,15 Published resident QI curricula have described traditional didactic sessions and independent participation in QI projects.8,14–16 Only a minority of residency programs have implemented online curricula,17–20 and none of these online curricula have explicitly addressed QI. The team-based nature of QI lends itself to group interaction, yet we are unaware of previous studies that investigate implementing an FC for resident QI education.

From prior research, we hypothesized that resident physicians’ perceptions of the FC would improve after experiencing the curriculum and that the degree of improvement would be associated with certain resident characteristics, including resident demographics, baseline knowledge, and participation in the FC curriculum.4,6,11 Therefore, the goals of this study were to (1) develop and validate an instrument to measure residents’ perceptions of the flipped QI classroom, (2) determine whether participation in a flipped QI curriculum improved residents’ perceptions of this method versus a traditional didactic curriculum, and (3) identify associations of residents’ characteristics with changes in QI knowledge and FC perception scores.


Method

Study design and participants

We conducted a prospective validation study of all categorical residents in the Mayo Clinic Internal Medicine Residency Program during the 2014–2015 academic year. The program consists of categorical residents in three-year training programs. All categorical postgraduate year 1 (PGY-1) (n = 48) and postgraduate year 3 (PGY-3) (n = 47) residents participated in a new FC QI (“flipped QI”) curriculum that was implemented for the first time during the 2014–2015 academic year. The flipped QI curriculum occurred during a one-month required outpatient rotation and consisted of core content that was to be completed before attending in-class sessions.

Flipped QI curriculum.

Core content for the flipped QI curriculum, which was completed before class, was adapted from the Institute for Healthcare Improvement (IHI) Open School, an online curriculum that has enrolled and trained more than 150,000 residents and students around the world.21 The IHI modules use the Model for Improvement methodology to educate trainees on how to successfully improve health care quality. All the residents participating in the flipped QI curriculum were required to complete selected IHI Open School modules on Fundamentals of Improvement, the Model for Improvement, and Measuring for Improvement.22 Patient safety was addressed in the context of being the underlying driver for the need to improve the quality of health care. The modules took the intervention residents approximately five total hours to complete. The framework for the Model for Improvement module involves setting an aim; choosing measurements and change strategies; and then testing, implementing, and spreading those changes through plan–do–study–act (PDSA) cycles.14

For the in-class portion of the flipped QI classroom, small groups (four PGY-1 and four PGY-3 residents) applied the online content in a facilitated group QI project during each monthly rotation. Residents were asked to complete the preassigned IHI online content before attending each session. During the monthlong rotation, twice-weekly one- to three-hour sessions were facilitated by residency faculty (S.L.B., J.T.R., and C.M.W.) along with chief medical residents (K.J.C. and L.T.H.). In-class time used active learning techniques, including collaborative learning, problem solving, self-reflection, small-group debate, and application of the online content through participation in the group QI project.6,7 In-class sessions were focused around each deliverable section of the QI project, including (1) developing an aim statement, (2) selecting measurements, (3) identifying and interviewing stakeholders, (4) performing a systems analysis, (5) selecting tests of change, (6) implementing a PDSA cycle, and (7) completing a run chart. The residents also completed a project charter midway through the curriculum and a project summary at the end of the curriculum. In addition to meeting in scheduled and facilitated in-class sessions, the resident teams met independently to work on their QI projects. Examples of group projects included reducing unnecessary daily blood tests, decreasing care costs by choosing alternate deep vein thrombosis prophylaxis, decreasing constipation among patients receiving narcotics, and improving blood sugar management in diabetic patients. These projects were successful at reducing costs or improving process outcomes, as tracked in project control charts.

Traditional QI curriculum.

The postgraduate year 2 (PGY-2) residents (n = 48), who had not previously participated in the new flipped QI classroom curriculum, participated in a traditional (nonflipped) QI and patient safety curriculum.23 This curriculum built upon four 30-minute passive in-person QI lectures provided during the PGY-1 year that covered aim statements, outcome measurement, root cause analysis, interventions, and PDSA cycles. The content covered similar key concepts taught to the intervention group through the IHI modules but in a lecture format. The PGY-2 residents did not participate in the intervention group’s QI project; instead, they reviewed the concepts from the QI lectures and applied them to a morbidity and mortality conference patient for whom safety concerns had been identified. During this independent application experience, the PGY-2 resident reviewed the case, developed an aim statement around an identified QI problem, conducted a root cause analysis, and suggested possible interventions. This study was deemed exempt by the Mayo Clinic Institutional Review Board.

FC Perception Instrument development

Content for the FC Perception Instrument (FCPI) was derived from existing instruments6,7 and with input from the authors (S.L.B., J.T.R., A.T.W., T.J.B., and C.M.W.) who have experience in residency education and instrument design. Specifically, items were created around an FC’s major components, including off-loaded content, in-class work, active learning techniques, and teamwork.5–7,24 After iterative revision, eight items structured on a five-point Likert scale (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree) were selected for inclusion.

Data collection and analysis

The residents were surveyed with SurveyMonkey (SurveyMonkey Inc., Palo Alto, California), a commercially available online survey program, at the start and at the completion of the monthlong flipped QI curriculum. The survey was available through an e-mail link, and classroom time was given to complete the survey. The survey collected information on (1) residents’ preferences for the traditional classroom or FC, (2) residents’ past experiences with an FC, (3) self-reported completion of online modules, and (4) the eight items of the FCPI. Demographic characteristics, including sex, postgraduate year, and the American College of Physicians Internal Medicine In-Training Examination (ITE) scores, were retrieved from existing administrative databases. Additionally, all residents completed the previously validated QI Knowledge Assessment Tool (QIKAT) before and after the flipped QI curriculum.16,20,25 QIKAT responses were evaluated by study authors (S.L.B., J.T.R., K.J.C., and L.T.H.); the maximum score possible was 27. The QIKAT measures application of QI knowledge to unique settings, which was part of the experience for both the intervention and control group residents. Discrepancies in QIKAT scores were discussed, and a final aggregate score for each resident was reached by consensus. The PGY-2 control group also completed all survey elements and QIKAT assessments at the start and end of the month without participating in the flipped QI curriculum.

Factor analysis was completed on the FCPI items. The clustering of multiple ratings (before and after the FC curriculum) for each resident was accounted for with the use of an adjusted correlation matrix and generalized estimating equations. Orthogonal rotation was applied to the correlation matrix in the factor analysis. As a sensitivity analysis, factor analysis was also performed with an unadjusted correlation matrix to compare the instrument results from before and after use of the FC curriculum. We used minimal proportion criteria to extract factors and a scree plot to confirm the final model. The cutoff for keeping items was a factor loading of 0.50 or more. Cronbach α (α > 0.7 was considered acceptable) was used in calculations of internal consistency reliability for items within each factor and overall.26
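As a concrete sketch of the reliability calculation, Cronbach α for a set of instrument items is computed as α = k/(k − 1) × (1 − Σ item variances / variance of total scores). The following minimal Python implementation illustrates the formula; the Likert responses shown are hypothetical, not the study's data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for an instrument.

    items: list of equal-length lists, one per item, each holding the
    respondents' scores for that item. Implements the standard formula
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)).
    """
    k = len(items)            # number of items
    n = len(items[0])         # number of respondents

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

    item_var_sum = sum(variance(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]  # per-respondent total
    return k / (k - 1) * (1 - item_var_sum / variance(totals))

# Hypothetical five-point Likert responses: three items, five respondents.
scores = [
    [4, 5, 3, 4, 4],
    [4, 4, 3, 5, 4],
    [5, 4, 2, 4, 3],
]
alpha = cronbach_alpha(scores)  # values above ~0.7 are conventionally acceptable
```

In practice a statistical package (the authors used SAS) would be used, but the computation reduces to this variance ratio.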

Categorical variables were summarized as numbers of participants and percentages of the sample. Continuous variables were summarized as mean (SD). Baseline differences in perception and knowledge scores across postgraduate years were assessed with Kruskal–Wallis analysis of variance (ANOVA). Differences in perception scores from before and after the FC curriculum were assessed for each item and overall with the Wilcoxon signed rank test. Identifying information was available only to the study statistician, who used it to link survey responses to resident variables; after linking the data, all resident identifiers were removed. Associations between resident variables and the change (before and after the FC curriculum) in QIKAT and FCPI scores were determined with Kruskal–Wallis ANOVA or the Wilcoxon rank-sum test, as appropriate. The threshold for statistical significance was P < .05. Statistical analyses were conducted with SAS version 9.3 (SAS Institute Inc, Cary, North Carolina).
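For readers unfamiliar with the paired test used here, the Wilcoxon signed-rank statistic drops zero differences, ranks the absolute differences (averaging ranks for ties), and takes the smaller of the positive- and negative-rank sums. The sketch below illustrates the statistic on hypothetical paired Likert ratings; it is not the study's analysis code, which used SAS:

```python
def wilcoxon_w(pre, post):
    """Wilcoxon signed-rank statistic W for paired ordinal scores.

    Zero differences are discarded, tied absolute differences receive
    averaged ranks, and W is the smaller of the positive- and
    negative-rank sums.
    """
    diffs = [b - a for a, b in zip(pre, post) if b != a]  # drop zeros
    abs_sorted = sorted(abs(d) for d in diffs)

    def avg_rank(v):
        positions = [i + 1 for i, x in enumerate(abs_sorted) if x == v]
        return sum(positions) / len(positions)  # average rank for ties

    w_pos = sum(avg_rank(abs(d)) for d in diffs if d > 0)
    w_neg = sum(avg_rank(abs(d)) for d in diffs if d < 0)
    return min(w_pos, w_neg)

# Hypothetical pre/post ratings (1-5) for one FCPI item -- not study data.
pre = [3, 4, 3, 2, 4, 3]
post = [4, 4, 4, 3, 3, 4]
w = wilcoxon_w(pre, post)  # a small W suggests a consistent directional shift
```

The P value would then be read from the W distribution; in open-source workflows, scipy.stats.wilcoxon performs the whole procedure.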


Results

FCPI validation

Precourse surveys were completed by 36 of 48 (75%) PGY-1 residents, 30 of 48 (63%) PGY-2 residents, and 33 of 47 (70%) PGY-3 residents. Postcourse surveys were completed by 40 (83%), 36 (75%), and 41 (87%), with paired survey responses available for 32 (67%), 27 (56%), and 31 (66%) PGY-1, PGY-2, and PGY-3 residents, respectively. Factor analysis of all completed perception instruments revealed a two-dimensional model to measure perceptions of the flipped QI curriculum experience. The two identified factors were (1) the perception that preclass activity enhances learning (three items) and (2) perceptions that in-class application enhances learning (five items). The internal consistency reliabilities (Cronbach α) were 0.81 for the preclass activity factor, 0.88 for the in-class application factor, and 0.84 for all eight items overall.

Precourse and postcourse FC perceptions

For the PGY-1 and PGY-3 residents (the intervention groups that experienced the flipped QI classroom curriculum) who completed pre- and postcourse surveys (n = 63), the mean (SD) scores for the FCPI ranged from 3.16 (0.94) to 4.24 (0.61) (see Table 1). Among these residents, postcourse perception scores were higher than precourse scores for three of the eight instrument items: (1) “Online modules enhance my learning” (precourse, 3.16 [0.94]; postcourse, 3.49 [0.93]; P = .006); (2) “I participate and engage in projects in-class” (precourse, 3.86 [0.74]; postcourse, 4.06 [0.50]; P = .02); and (3) “Working on a team enhances my learning” (precourse, 3.77 [0.73]; postcourse, 4.00 [0.70]; P = .01).

Table 1
Table 1:
Resident Physicians’ Perceptions of the Mayo Clinic Internal Medicine Flipped Quality Improvement Curriculum, 2014–2015a

When asked whether they preferred a traditional classroom or an FC, more of these 63 residents preferred an FC after completing the flipped QI curriculum (traditional, 22 [35%]; flipped, 41 [65%]) compared with before completing the curriculum (traditional, 31 [49%]; flipped, 32 [51%]; P < .0001) (Figure 1). Additionally, residents who preferred a traditional classroom at the end of the study had lower mean (SD) FC perception scores before the FC curriculum was instituted (traditional, 3.65 [0.36]; flipped, 4.08 [0.50]; P = .0006).

Figure 1
Figure 1:
Comparison of residents’ preferences for traditional or flipped classroom before and after the Mayo Clinic Internal Medicine Residency flipped quality improvement (QI) curriculum (2014–2015). Pre- and postcourse data were available for 63 postgraduate year 1 or 3 residents who experienced the flipped QI classroom curriculum.

QIKAT scores

Precourse QIKATs were completed by 45 PGY-1 residents (94%), 40 PGY-2 residents (83%), and 34 PGY-3 residents (72%). Postcourse QIKATs were completed by 43 (90%), 27 (56%), and 28 (60%), with paired QIKAT scores available for 42 (88%), 22 (46%), and 20 (43%) PGY-1, PGY-2, and PGY-3 residents, respectively. Baseline QI knowledge, as measured with mean (SD) QIKAT score, was similar for PGY-1, PGY-2, and PGY-3 residents (18.4 [3.2], 17.7 [3.9], and 18.6 [3.9], respectively; P = .55). At the end of the rotation, for the PGY-2 residents with paired QIKAT scores (n = 22), who did not complete the FC curriculum but participated in the traditional (nonflipped) QI and patient safety curriculum, the mean (SD) score remained relatively unchanged, with an increase of 0.7 (3.6) points (P = .13). However, for residents participating in the curriculum, the mean (SD) QIKAT scores increased 5.2 (3.2) points for PGY-1 residents and 5.0 (3.3) points for PGY-3 residents, with an overall increase of 5.1 (3.2) points (all P < .0001) (Figure 2).

Figure 2
Figure 2:
Comparison of residents’ Quality Improvement Knowledge Assessment Tool (QIKAT) scores before and after the Mayo Clinic Internal Medicine Residency flipped quality improvement curriculum (2014–2015). For the 62 residents in the intervention group, the mean (SD) change in QIKAT scores was 5.1 (3.2) (P < .0001); the mean (SD) change in scores for the 22 control group residents (0.7 [3.6]) did not significantly change (P = .13). Data are from residents who had pre- and post-QIKAT and perception scores.

Associations between resident characteristics and FCPI scores

Forty-seven PGY-1 and PGY-3 residents experienced the flipped QI curriculum and provided pre- and postcourse QIKAT and perception instrument data. There was a positive association between the percentage of modules for which the resident reported completion and the mean (SD) change in FC perception scores as follows: −0.06 (0.36) for 0% to 74% of modules completed, and 0.19 (0.47) for 75% to 100% of modules completed (P = .04) (see Table 2). Additionally, there was an association between previous FC exposure and the mean (SD) change in QIKAT score as follows: 3.1 (2.7) for prior exposure, and 6.1 (3.0) for no prior exposure (P = .002). There was also a difference in baseline mean (SD) QIKAT scores between these two groups (prior exposure, 19.5 [4.1]; no prior exposure, 17.4 [3.5]; P = .004). There were no associations between change in QIKAT scores or change in FC perception scores according to sex, postgraduate year, or ITE scores (all P > .15).

Table 2
Table 2:
Associations Between Resident Characteristics and Change in QIKAT and Flipped Classroom (FC) Perception Scores Before and After Experiencing the Mayo Clinic Internal Medicine Residency Flipped QI Curriculum (2014–2015)a

Perceptions of FC components

Of the 81 residents who completed surveys at the end of the FC curriculum, 6 (7%) felt that their QI knowledge was most enhanced by the online modules, 29 (36%) felt that it was most enhanced by the in-class sessions, and 45 (56%) felt that it was enhanced by both equally (Figure 3).

Figure 3
Figure 3:
Residents’ perceptions of the component of the flipped classroom that most enhanced their quality improvement (QI) knowledge after participating in the Mayo Clinic Internal Medicine Residency flipped QI curriculum (2014–2015). Data are from 81 residents.


Discussion

In this study, we report the first investigation, to our knowledge, of a flipped QI classroom in residency education. We found that the FCPI had compelling validity evidence, including content; internal consistency reliability; and an intuitive, two-factor structure (preclass activity and in-class application). Residents’ perceptions of the FC improved after exposure to the curriculum and were associated with more engagement in the online modules. Additionally, residents who participated in the curriculum demonstrated improved QI knowledge after the curriculum compared with the control group that participated in a separate nonflipped QI and patient safety curriculum. Finally, residents valued the in-class application sessions more than the online component. These findings have important implications for QI curricula and graduate medical education in general, as residency training programs increasingly use FC models.

Residents’ perceptions of the FC improved after exposure to this FC curriculum. Residents were also more likely to engage in class projects and value the educational benefit of working on a team after completing the curriculum. These findings suggest that the application component of the FC enhances overall perceptions of an FC. These findings are congruent with a prior study on FCs in pharmacy education by McLaughlin et al,6 who also found that exposure to FCs increased students’ preference for this method.

The present study found that among residents exposed to the FC, compared with a control group of residents, QI knowledge significantly increased. Similarly, Ramar et al27 noted that an FC is an effective instructional method for teaching QI to fellows. We found that residents with no prior exposure to an FC had larger improvements in QI knowledge scores, suggesting that the novelty of this method may have resulted in greater learning and engagement. Although the FC has generally been found to be effective, trainees have reported areas for improvement11,27; therefore, residents in our study who had previously experienced FCs may have entered the experience with preconceptions and biases that dampened their enthusiasm for the QI FC and, in turn, their learning.

A majority of residents expressed that their QI knowledge was enhanced more by the in-class sessions than by the online content. Online learning has been proposed as an alternative to traditional classroom formats to balance the time constraints of education and clinical practice.28 Nonetheless, studies have shown that interactive online learning was preferable to static, computer-based modules and that online learning is as effective as traditional instructional formats.29,30 Our study further emphasizes that the in-class interactive component is greatly valued by residents and contributes significantly to their learning.6,11 These findings should caution educators not to load curricular content into online modules without also including in-class sessions, where the learned content should be discussed and applied to practical scenarios.

The cost of developing and maintaining an FC has been identified as a limitation to adopting a flipped approach.31 However, in a recent cost-effectiveness analysis, the FC was less expensive than a traditional classroom.32 Furthermore, as noted by McLaughlin et al,33 cost should be carefully weighed against the established educational benefits of the FC model. An advantage of the current study is that the content used for the intervention group was the IHI module series, which is available free to any learner.

The FCPI described in this study is supported by strong validity evidence. In medical education research, construct validity is based on content, internal structure, and relations to other variables.34–38 We developed content validity from previous research on the FC model and iterative revision by study investigators with experience in QI education and instrument design.6,7 Internal structure validity was based on factor analysis that revealed two dimensions of the FC model: online preparation and in-class application. Criterion validity (i.e., relations to other variables) was established by associations of resident characteristics and prior FC exposure with changes in FC perception scores and QI knowledge.

This study has several limitations. First, it is a single-institution study, which could limit its generalizability. However, teaching QI is a universal component of residency training programs, and the IHI teaching modules used in this study are widely recognized and accessible to all programs and learners. Second, educational outcomes of this study represented Kirkpatrick’s level 1 (reaction) and level 2 (learning) and not the higher levels (behavior and patient results).39 Nonetheless, most education studies report reaction level outcomes and, unlike this study, lack comparison groups.40 Third, some PGY-2 and PGY-3 residents had absences at the end of this rotation, limiting our paired QIKAT results. Fourth, the residents in the intervention and control groups received similar but not identical didactic content and application experiences. Therefore, it may be that the content or application experiences—not the FC—had the biggest impact on perceptions and outcomes. Future research should focus on comparing the FC with other instructional designs.


This is the first study to report the implementation of a flipped QI classroom curriculum in residency education. These findings underscore the importance of combining both online learning and in-class discussion to promote active learning. We are hopeful that the validated FCPI that was developed in this study will be useful when applied to future FC curricula and research in graduate medical education. The findings of this study suggest that future active learning curricula should use the FC model to engage learners in small-group discussions and encourage the application of knowledge to real scenarios.


References

1. Prober CG, Khan S. Medical education reimagined: A call to action. Acad Med. 2013;88:1407–1410.
2. Alberts B. Failure of skin-deep learning. Science. 2012;338:1263.
3. Mehta NB, Hull AL, Young JB, Stoller JK. Just imagine: New paradigms for medical education. Acad Med. 2013;88:1418–1423.
4. Prober CG, Heath C. Lecture halls without lectures—a proposal for medical education. N Engl J Med. 2012;366:1657–1659.
5. Hamdan N, McKnight P, McKnight K, Arfstrom KM. The flipped learning model: A white paper based on the literature review titled a review of flipped learning. Accessed July 15, 2016.
6. McLaughlin JE, Roth MT, Glatt DM, et al. The flipped classroom: A course redesign to foster learning and engagement in a health professions school. Acad Med. 2014;89:236–243.
7. Pierce R, Fox J. Vodcasts and active-learning exercises in a “flipped classroom” model of a renal pharmacotherapy module. Am J Pharm Educ. 2012;76:196.
8. Tomolo AM, Lawrence RH, Aron DC. A case study of translating ACGME practice-based learning and improvement requirements into reality: Systems quality improvement projects as the key component to a comprehensive curriculum. Postgrad Med J. 2009;85:530–537.
9. Moffett J. Twelve tips for “flipping” the classroom. Med Teach. 2015;37:331–336.
10. Bishop JL, Verleger MA. The flipped classroom: A survey of the research. In: Proceedings of the 120th ASEE Annual Conference & Exposition, June 2013, Atlanta, Georgia. Washington, DC: American Society for Engineering Education; 2013.
11. Khanova J, Roth MT, Rodgers JE, McLaughlin JE. Student experiences across multiple flipped courses in a single curriculum. Med Educ. 2015;49:1038–1048.
12. Martin SK, Farnan JM, Arora VM. Future: New strategies for hospitalists to overcome challenges in teaching on today’s wards. J Hosp Med. 2013;8:409–413.
13. Accreditation Council for Graduate Medical Education. ACGME program requirements for graduate medical education in internal medicine. Accessed July 15, 2016.
14. Langley GL, Moen RD, Nolan KM, Nolan TW, Norman CL, Provost LP. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance. 2nd ed. San Francisco, CA: Jossey-Bass; 2009.
15. Varkey P, Karlapudi S, Rose S, Nelson R, Warner M. A systems approach for implementing practice-based learning and improvement and systems-based practice in graduate medical education. Acad Med. 2009;84:335–339.
16. Ogrinc G, Headrick LA, Morrison LJ, Foster T. Teaching and assessing resident competence in practice-based learning and improvement. J Gen Intern Med. 2004;19(5 pt 2):496–500.
17. Ogrinc G, Headrick LA, Mutha S, Coleman MT, O’Donnell J, Miles PV. A framework for teaching medical students and residents about practice-based learning and improvement, synthesized from a literature review. Acad Med. 2003;78:748–756.
18. Oyler J, Vinci L, Arora V, Johnson J. Teaching internal medicine residents quality improvement techniques using the ABIM’s practice improvement modules. J Gen Intern Med. 2008;23:927–930.
19. Oyler J, Vinci L, Johnson JK, Arora VM. Teaching internal medicine residents to sustain their improvement through the quality assessment and improvement curriculum. J Gen Intern Med. 2011;26:221–225.
20. Vinci LM, Oyler J, Johnson JK, Arora VM. Effect of a quality improvement curriculum on resident knowledge and skills in improvement. Qual Saf Health Care. 2010;19:351–354.
21. Institute for Healthcare Improvement. Open school. Chapters/Documents/OpenSchool%20Brochure.pdf. Accessed July 15, 2016.
22. Institute for Healthcare Improvement. IHI open school courses. Accessed July 19, 2016.
23. Szostek JH, Wieland ML, Loertscher LL, et al. A systems approach to morbidity and mortality conference. Am J Med. 2010;123:663–668.
24. McDonald K, Smith CM. The flipped classroom for professional development: Part I. Benefits and strategies. J Contin Educ Nurs. 2013;44:437–438.
25. Singh MK, Ogrinc G, Cox KR, et al. The Quality Improvement Knowledge Application Tool Revised (QIKAT-R). Acad Med. 2014;89:1386–1391.
26. DeVellis RF. Scale Development: Theory and Applications (Applied Social Research Methods Series; no. 26). Newbury Park, CA: Sage; 1991.
27. Ramar K, Hale CW, Dankbar EC. Innovative model of delivering quality improvement education for trainees—a pilot project. Med Educ Online. 2015;20:28764.
28. Cook DA, Dupras DM. Teaching on the Web: Automated online instruction and assessment of residents in an acute care clinic. Med Teach. 2004;26:599–603.
29. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Instructional design variations in Internet-based learning for health professions education: A systematic review and meta-analysis. Acad Med. 2010;85:909–922.
30. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Internet-based learning in the health professions: A meta-analysis. JAMA. 2008;300:1181–1196.
31. Spangler J. Costs related to a flipped classroom. Acad Med. 2014;89:1429.
32. Maloney S, Nicklen P, Rivers G, et al. A cost-effectiveness analysis of blended versus face-to-face delivery of evidence-based medicine to medical students. J Med Internet Res. 2015;17:e182.
33. McLaughlin JE, Roth MT, Mumper RJ. In reply to Spangler. Acad Med. 2014;89:1429–1430.
34. Beckman TJ, Cook DA, Mandrekar JN. What is the validity evidence for assessments of clinical teaching? J Gen Intern Med. 2005;20:1159–1164.
35. Cook DA, Beckman TJ. Current concepts in validity and reliability for psychometric instruments: Theory and application. Am J Med. 2006;119:166.e7–166.e16.
36. Downing SM. Validity: On meaningful interpretation of assessment data. Med Educ. 2003;37:830–837.
37. Messick S. Validity of psychological assessment: Validation of inferences from persons’ responses and performances as scientific inquiry into score meaning. ETS Res Rep Series. 1994;1994(2):i–28.
38. American Educational Research Association, American Psychological Association, National Council on Measurement in Education. Standards for Educational and Psychological Testing. Washington, DC: American Educational Research Association; 1999.
39. Kirkpatrick D. Great ideas revisited: Techniques for evaluating training programs (then): Revisiting Kirkpatrick’s four-level model (now). Train Dev. 1996;50(1):54–59.
40. Reed DA, Cook DA, Beckman TJ, Levine RB, Kern DE, Wright SM. Association between funding and quality of published medical education research. JAMA. 2007;298:1002–1009.
Copyright © 2016 by the Association of American Medical Colleges