Medical educators and accrediting organizations have shifted their emphasis from what is taught in the curriculum to what a medical student, resident, or practicing physician can perform. Whereas most trainees and practicing physicians can demonstrate competence in clinical and communication skills, a minority fail to meet the expected standard and require remediation. Despite widespread endorsement of the expectation that physicians-in-training and practicing physicians be assessed for their competence, it remains challenging to identify accurately and reliably those trainees and physicians who are incompetent or less than fully competent and to remediate their deficiencies effectively. Less than fully competent physicians or trainees fail to maintain acceptable standards in one or more areas of professional physician practice, whereas incompetent physicians lack the abilities (cognitive, noncognitive, and communicative) and qualities needed to perform effectively within the scope of professional physician practice.1
Remediation begins with the identification of trainees or physicians in practice who fail to demonstrate competence during assessments of their skills. Identification of trainees needing remediation may be easiest at the undergraduate level because the performance expectations of students are relatively homogeneous, and students are frequently tested within their schools. The advent of the United States Medical Licensing Examination (USMLE) Step 2 Clinical Skills (CS) exam has prompted an increase in the assessment of clinical skills in medical schools, both to evaluate students' achievement of skills emphasized in their schools' curricula and to prepare students for the licensing exam.2 Assessment at the graduate medical education (GME) level becomes more challenging because training differentiates along specialty lines and because trainees are expected not only to learn but also to provide necessary service to patients. There has been broad adoption of the competency framework for assessment in GME, but this construct remains unsupported by the literature, and valid and reliable methods of assessing competencies do not yet exist.3 At all levels of training, supervisors in nonprocedural specialties rarely observe trainees directly with patients, which leaves supervisors to draw inferences about the competence of students and residents from their oral presentations and their interactions with other health care providers.4 Similarly, physicians in practice are rarely assessed in their work environments, in part because of the paucity of reliable, valid, and feasible assessment tools.5 Nonetheless, the public assumes and desires that physicians are monitored regularly and will receive remedial intervention when needed.6
When deficits go undetected or unaddressed, physician performance and patient safety are jeopardized. For instance, performance problems in the domains of knowledge and professionalism have been linked to subsequent disciplinary action by state medical boards.7,8 Medical schools are investing resources to prepare their students to perform core clinical and communication skills effectively on the USMLE Step 2 CS exam,9,10 and residency programs are developing innovative methods of teaching and assessing competence in the six competency domains defined by the Outcome Project of the Accreditation Council for Graduate Medical Education,11 including those domains, such as professionalism, that previously received less attention.12 However, it remains unclear how a lack of competence should be addressed before advancement, and medical education lags behind other areas of education13 in developing robust strategies for remediation.
The learning sciences offer guidance for structuring remediation programs in medical education. For example, when dealing with knowledge and reasoning problems, the focus should be on helping learners to build strong knowledge structures and representations (e.g., schema, scripts, exemplars, and prototypes).14–19 For both gaining knowledge and learning skills (procedural and communication), students need to participate in deliberate (i.e., conscious and focused) practice and need to receive feedback.20 These interventions assist learners in thinking deeply, reasoning soundly, and practicing deliberately and repetitively. To remedy deficiencies in professionalism, learners may need explicit instruction, guided practice, mentored reflection, and observation of and interaction with role models.7,21–26
Our purposes in this study were to review the literature on remediation interventions in undergraduate, graduate, and continuing education and to determine whether this literature is congruent with research from the learning sciences. Specifically, we sought to identify interventions that have been used for remediation, to examine the areas that were targeted for remediation, and to determine the outcomes of remediation efforts. Our goal was to develop an ideal model of remediation based on the literature and on the learning sciences.
Method
We defined remediation as having three components that were based on criteria proposed by the Federation of State Medical Boards.27 First, deficiencies in the individual's performance are identified through an assessment process. Second, an attempt is made to provide remedial education to the individual. Third, after the remedial intervention, the individual is reassessed in the area of his or her deficient performance.
The literature search focused on studies of remediation that took place in undergraduate medical education (UME), GME, and continuing medical education (CME) of postlicensure physicians. We searched the MEDLINE database through April 2008 for citations by using terms related to remediation (remediation or remedial teaching), level of practitioner (medical students, clinical clerks, internship and residency, and physicians), and other related terms (clinical competence; program evaluation or program development; educational measurement, curriculum, or model; and mentors). We extended the search through October 2008 to identify any newly published studies. In addition, we manually searched the bibliographies of relevant retrieved articles and identified articles from our personal knowledge of the field. We included English-language studies and excluded opinion articles, review articles, descriptions of curricula without a remediation group, and surveys on remediation.
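To make the search strategy concrete, the short Python sketch below assembles the concepts listed above into a single Boolean query string. It is an illustration only: the grouping of terms (OR within each concept, AND across concepts) and the absence of MEDLINE field tags or MeSH mappings are assumptions made for the sketch, not the exact syntax used in the search.

# Illustrative sketch only: the concept terms come from the text above, but the
# grouping (OR within each concept, AND across concepts) and the omission of
# MEDLINE field tags are assumptions, not the authors' documented search syntax.
remediation_terms = ["remediation", "remedial teaching"]
practitioner_terms = ["medical students", "clinical clerks",
                      "internship and residency", "physicians"]
related_terms = ["clinical competence", "program evaluation", "program development",
                 "educational measurement", "curriculum", "model", "mentors"]

def or_block(terms):
    # Quote each phrase and OR the terms within one concept block.
    return "(" + " OR ".join('"{}"'.format(t) for t in terms) + ")"

# Combine the three concept blocks with AND to form the final query string.
query = " AND ".join(or_block(block) for block in
                     [remediation_terms, practitioner_terms, related_terms])
print(query)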
We developed a standardized data-extraction form based on the Best Evidence Medical Education Collaboration protocol.28 We extracted the following information from each article: level and number of learners/physicians, study location, description of assessment, skill area, criteria for remediation (standard setting), remediation activities, retesting, and outcomes of remediation. We assessed the strength of each intervention's behavioral impact by using the Kirkpatrick hierarchy.29 We defined the levels of impact as follows: Level 0 = descriptive study only (no assessment of impact); Level 1 = participation (a description of the participants' views of the experience); Level 2a = modification of participants' attitudes/perceptions; Level 2b = modification of participants' knowledge/skills; Level 3 = behavior change (documentation of the transfer of learning to the workplace); Level 4a = wider changes in the organizational delivery of care attributable to the educational program; and Level 4b = benefits to patients/trainees (any improvement in the health/well-being of patients/trainees as a direct result of an educational program). We did not perform a meta-analysis because this review was not a systematic review and because the measurements used to assess competence were highly variable.
One of us (K.E.H.) performed the literature search with the assistance of a health sciences librarian, and all other authors reviewed and confirmed the appropriateness of the retained and excluded articles on the basis of their review of titles and abstracts. Next, we worked in three teams—the UME team (K.E.H., M.A.P., and D.M.I.), the GME team (T.R.H., W.A.N., A.C., and P.K.), and the CME team (T.R.H. and W.A.N.)—to abstract each article. To validate accuracy, one investigator from another team reviewed each team's abstracted articles. Finally, three of us (K.E.H., M.A.P., and D.M.I.) reviewed each abstracted article and the abstracted information to confirm accuracy and to standardize the abstracted information. We used consensus to resolve disagreements about search criteria, data extraction, and classification of study results.
Results
Of 207 citations identified, 170 (63 UME, 43 GME, and 64 CME) were selected for further review on the basis of the title, abstract, and, when relevant, the full article. Selected articles contained all three components of remediation listed in the Method section (i.e., identification of a performance deficit, a remediation intervention, and reassessment of performance after the intervention). Thirteen studies met the eligibility criteria; the results are described here and in Appendix 1. Articles that were initially reviewed but excluded were of several types: descriptions of performance-problem identification only or of problem identification and remediation without reassessment of performance, surveys of program directors or other educators about performance problems or remediation, and opinion pieces.
Eligible studies
UME.
Seven articles addressed the remediation of deficits of medical students: one reported on preclinical students, and six reported on clinical clerkship students. Two articles described interventions limited to addressing poor scores on written examinations and improving knowledge.30,31 Five articles used standardized patient examinations to identify clinical skills deficits,32–36 and one of these combined the objective structured clinical examination format with knowledge assessments to identify students who needed remediation.34 No studies based the diagnosis of learner deficits on clinical performance with actual patients.
GME.
Both of the studies addressing the remediation of deficits of residents used in-training examinations to identify residents with knowledge deficits and then provided remediation.37,38 Whereas one remediation program addressed knowledge acquisition through a program of reading and study skills,38 the other mandated repeat clinical rotations in addition to reading.37
CME.
At the practicing physician level, four studies assessed physicians' practice and remediated a variety of deficiencies. Deficits were identified by peer assessors in two studies39,40 and by a licensing organization in two other studies,41,42 both of which included some physicians who had referred themselves for remediation.
Methodological quality
The methodological quality of the studies varied with the subjects' training level. Eight of nine studies evaluating trainees, both undergraduate and graduate, were coded as Level 2b in the Kirkpatrick hierarchy for “modification of participants' knowledge/skills.”30–35,37,38 Three studies of practicing physicians were coded as Level 3 (behavior change [documentation of the transfer of learning to the workplace]). Physicians' practice behaviors were evaluated after the remediation intervention by using expert judgments.39,41,42 One study of practicing physicians40 was coded as Level 1 (participation [a description of the participants' views of the experience]) because the main outcome measure was a behavior change as self-assessed by the physicians involved in the intervention, on the basis of their own learning goals.
Two studies did not describe the criteria for remediation.33,34 None of the studies included a contemporaneous control group of low performers who did not receive remediation. Two studies did not describe a retest or reassessment beyond self-assessment or satisfaction.36,40
Remediated skill areas
Six of the nine studies that addressed UME or GME described the remediation of knowledge deficits identified through written examinations.30,34,35,37,38,43 Four of these nine studies32,33,35,36 focused on remediation of clinical skills, and one of those four also addressed knowledge deficiencies.35 The four articles on postlicensure physicians described remediation of generalist office or subspecialty practice, which encompassed multiple skills assessed through chart reviews, chart-stimulated recall, interviews, and peer assessments.39–42
Outcomes of remediation
The seven studies on remediation of the deficits of medical students used written examinations of knowledge,30,31 standardized patient assessments,32,33,35,36 or a combination of the two34 to diagnose learners in need of remediation. All but one of these studies36 used the same assessments as outcome measures after remediation, and those six studies demonstrated improvements in scores. These six studies were classified as Level 2b (modification of knowledge/skills), a level that does not include any assessment of behavior change.
At the GME level, the two studies diagnosed learner deficiencies through in-training examinations and remediated those deficiencies through individualized study plans that included faculty mentoring (surgery)38 or additional clinical rotations (radiology).37 Outcome measures were scores on subsequent in-training and other examinations, which improved for most participants. These studies also were classified as Level 2b.
All four of the studies examining practicing physicians came from Canada.39–42 They diagnosed performance deficiencies in physicians' actual clinical practice by using a combination of methods including chart review, chart-stimulated recall, and interviews. Two of the studies used peer assessments.39,40 Three of them showed improved outpatient clinical practice after remediation, as assessed by interviewers or through chart reviews, and their impact was classified as Level 3 (behavior change [documentation of the transfer of learning to the workplace]).39,41,42 One study40 assessed physician satisfaction with the program and showed that participants felt their performance had improved; this study was classified as having a Level 1 impact (a description of the participants' views of the experience).
Discussion
This review of the literature on remediation of the deficiencies of physicians across the educational continuum yielded surprisingly few studies that described remediation interventions coupled with assessments of remediation efficacy. The studies that we did identify were predominantly small, single-institution efforts. This paucity of studies evaluating remediation efforts is concerning, and it highlights the need to perform more large-scale, outcome-based remediation interventions and to publish the results of those interventions.
Because medical school would seem the ideal location for remediation, we anticipated finding multiple studies evaluating remediation efforts in UME. However, only three studies described outcomes of remediation of medical students' clinical skills.32,33,35 Medical students are prime candidates for remediation when it is needed: because they function in a training environment without direct, unsupervised responsibility for patients, they are free of employer–employee contractual issues, and they receive more direct clinical supervision than do residents or physicians in practice. Developmental education, a conceptual framework used at the college level, treats remediation as a comprehensive effort to help individual students mature both academically and personally through course work, advice and mentoring from faculty, and other aspects of their training.13 This type of approach would be more feasible for a medical student than for a resident physician who shoulders clinical responsibilities.
The only studies we found evaluating the outcomes of resident remediation focused on knowledge but not on any of the other core domains of competence. Although residents practice independently, usually without direct observation by their supervising attendings, no studies addressed the remediation of the clinical skills of residents demonstrating performance deficits. There may be several reasons for the lack of published remediation interventions in GME. Remediation requires a large investment of resources, and residents are needed to staff clinical services; removing them from clinical duties to participate in remediation can be challenging. Remediation is likely to be conducted on an individual basis, using untested methods and anecdotal outcomes.44 Residency program directors, unlike medical student educators, may feel limited by the legal policies inherent in their employer–employee relationship with the resident. The reliance on in-training examinations also reflects the availability of these knowledge-based examinations, which efficiently provide a mechanism for testing and retesting residents' mastery of knowledge free from the confounders that pervade assessments of clinical practice.
The studies of remediation of the deficits of physicians in practice were the only studies we found that examined clinical performance with patients; studies of trainees relied on measures of performance obtained through written and clinical skills examinations. This shift in focus from assessments based on examinations to assessments based on actual clinical practice reflects the progression of a physician's professional development from the acquisition of knowledge and skills to clinical practice. High-quality patient care is inherently difficult to assess because it requires integration of knowledge with both clinical and communication skills during service to patients. The Dreyfus model of the development of expertise45 and Miller's pyramid46 similarly emphasize that, at the highest levels of competence, physicians can understand each case in a broader context, recognize elements that do not fit usual patterns, and exercise mature judgment. Nevertheless, in the four studies of remediation of physicians in practice that we identified,39–42 the assessed outcomes of the remedial intervention were relatively "soft" (e.g., physician interviews, chart reviews, and physician satisfaction with the process) in comparison with "harder" outcome measures, such as patient satisfaction or improved measures of disease control.
Proposed model for remediation
On the basis of our review of the literature on remediation and selected studies in the learning sciences, we propose essential elements of successful remediation programs that would enhance existing efforts. These four core components of a powerful remediation program would be (1) initial assessment (or screening) using multiple assessment tools to identify deficiencies, (2) diagnosis of problems and development of an individualized learning plan, (3) provision of instruction that includes deliberate practice, feedback, and reflection, and (4) reassessment and certification of competence (Figure 1).
Figure 1:
Proposed model of a program for remediation of performance deficits of medical trainees and practicing physicians.
The first component of remediation includes the identification of those individuals who need remediation and the diagnosis of their performance deficits. Remediation requires multiple, reliable, and valid assessment tools for identification of trainees and physicians with deficiencies.47 Because deficiencies may exist in many domains of competence (e.g., knowledge, clinical and communication skills, or professionalism), multiple assessments, which are more likely than a single tool to identify deficiencies, are required.13 Examples of these assessment tools include observed encounters with actual patients, standardized patient encounters, written or Web-based assessments of clinical reasoning, record reviews, chart-stimulated recall, supervisor and peer observations, and multiple-choice examinations of knowledge. These assessment modalities not only uncover deficiencies but also can help target remediation strategies to the identified areas of need.
A two-step approach to the identification of poorly performing physicians in practice, combining peer assessment with tests of knowledge and clinical skills, has been proposed in the United Kingdom.48 For diagnosing deficits at the student level, performance problems in clinical skills examinations have been characterized in six domains: fund of knowledge, clinical reasoning, history-taking, physical examination, communication, and professionalism.49 These categories are applicable to both GME trainees and practicing physicians.
The second component of a remediation program has two parts: (1) diagnosis of the underlying problem that led to the performance deficits and (2) development of an individualized learning plan based on learner characteristics and identified needs. The development of such a plan involves, after the diagnosis of the problem, an articulation of clear expectations for acceptable performance. Next, learners need guidance in assessing their own performance accurately in light of this external standard, as well as coaching in self-reflection and in planning for improvement. Because learners are not always accurate self-assessors, guidance from an expert is essential. A mentor who is familiar with the individual's strengths and weaknesses is helpful for establishing an individualized learning plan. There should be clarity about whether this remediation is required or voluntary and about what the consequences of remediation or nonremediation will be.
The third component of the remediation program is the provision of the prescribed learning activities. On the basis of this diagnostic and reflective process, a set of specific experiences should be prescribed. It may also be appropriate to recommend a range of services for personal and professional development, tailored to the student's needs.13 Medical students with deficiencies in clinical skills may harbor coexisting cognitive and noncognitive deficits.49 Thus, cognitive strategies associated with gaining knowledge, activation of and connection to prior knowledge, an understanding of the rationale for recommended standards, and the application and use of knowledge in practice may be needed. Problems with professionalism may be better addressed through a behavioral approach that involves identifying the problematic behaviors, offering rationales for the dysfunctional nature of those behaviors, and practicing new behaviors, such as courtesy, respect, and reliability.
The prescribed remediation activities should offer participants opportunities for deliberate practice followed by feedback. These activities might include guided clinical experience, practice with simulations or standardized patients, study and knowledge testing, review of medical charts with stimulated recall, and observation of their clinical performance. The key to success is deliberate, conscious practice under the guidance of experienced supervisors who can offer specific and timely feedback. The usefulness of simulators in procedural skills training50 suggests that the increasing sophistication of clinical simulators may offer opportunities for this type of practice. Whether at the level of a student who is learning to face new clinical problems or that of a practicing physician who is treating multiple patients with complex conditions, a cognitive approach would guide participants in thinking about the concepts raised by patient presentations, examining how they relate to other facets of the case and to prior knowledge of similar clinical problems, and discerning how that knowledge can be applied to future cases. This type of cognitive strategy might also be suited to problems with clinical skills that stem from faulty clinical reasoning, such as a failure to ask the right questions in taking the history or a failure to perform important elements of the physical examination because of an incorrect differential diagnosis. Participants in remediation might work individually with a preceptor in a problem-based learning format or with standardized or actual patients to practice addressing clinical problems, generating differential diagnoses and management plans, and analyzing different diagnosis and treatment strategies.
In contrast, a behavioral approach, which emphasizes observable behaviors that can be taught and measured, might better address dyscompetence arising from problems with technique. For instance, incorrect performance of the physical examination can be remediated through practice with standardized patients, after which expert observers provide feedback and coaching and evaluate the behaviors associated with performance. This approach would be bolstered by simultaneous cognitive efforts to help learners develop reflective abilities that will allow them to review their own performance and evaluate how it compares with the desired standard. Reflection-in-action (also called meta-cognition) is the act of analyzing the impact of one's actions as they are occurring and modifying one's behavior on the basis of that analysis.51 In a candidate for remediation, this level of reflective ability would require significant insight into and understanding of the benchmark to be attained. Regardless of the learning strategy selected, multiple forms of practice with feedback will be required for remediation.
The fourth and final component of remediation is the retesting of participants to ensure that acceptable levels of performance have been achieved, so that competence can be certified. Retesting may involve the same examination modalities as were originally used to identify deficiencies or areas of dyscompetence, or it may involve more-customized assessment methods addressing selected areas of difficulty. In this report, we have assumed that remediation efforts succeed and that the participant is deemed competent. However, medical education leaders and licensing boards need to take appropriate action if remediation does not achieve the desired result.52 These remediation efforts must be coupled with outcomes-based research to demonstrate a change in performance with patients and the effect on patient outcomes and satisfaction.
Efforts at enhancing remediation are most likely to occur at the UME level, where there is centralized oversight of the learners and where assessment is a routine part of the educational environment. Remediation at the GME level should take advantage of existing assessment systems to identify deficiencies and measure the impact of remediation; developmental efforts should focus on assessing and remediating competencies not effectively targeted by existing measurement systems. Implementing remediation at the CME level is more daunting because of the culture of physician independence, the logistical challenges of observing physicians in practice, and the absence of assessment systems comparable with those present in UME and GME settings.
Learners are generally reluctant to be identified as needing remediation, and institutions may be similarly reluctant to identify practitioners as needing it because the institutions lack expertise in remediation or are unwilling or unable to provide remedial services.5 A trainee or physician with recognized dyscompetence who is in need of remediation may be embarrassed or may feel stigmatized. Students will benefit from an environment that affords some anonymity and safety during the remediation process. However, a stigma-free approach may not be optimal or possible. It can lead to confusion among learners about the status of their performance relative to that of their peers, and it can minimize their understanding of the severity of their deficiencies.32,53 One study we reviewed found that carefully designed group workshops were rated highly by learners, despite the somewhat public focus on remediation of their dyscompetence.36
The critical importance of remediation must be weighed against the high cost of remediation interventions. At the college level, the expense of remediation challenges institutions and policy makers.54 One approach to increasing the efficiency of remediation, particularly in the medical education setting, in which the number of learners needing intensive remediation may be small, is the implementation of collaborative programs across institutions. Collaborative efforts that span institutions and the somewhat artificial UME–GME–CME boundaries and that combine knowledge, resources, and experience seem desirable. Certain centers could develop expertise and resources for remediation and become referral centers for training programs or hospitals around the country.5,55 This cost-effective model would concentrate expertise in remediation. Another efficient remediation strategy, described in a study published after the dates of our literature search, involved learners' self-assessment of their performance, both on their own and with faculty guidance, to identify potential areas for improvement.56
This literature review has several limitations. We did not review abstracts from national professional meetings, which might be more likely to include negative results. Thus, our findings may be subject to publication bias and to underreporting of remediation efforts around the country. The methodological quality of the studies reviewed was moderate at best, and the findings from these studies do not allow firm conclusions about remediation efforts that will lead to behavior change. However, we also drew on the learning sciences for guidance.
Conclusion
There is surprisingly little evidence to guide "best practices" of remediation in medical education at all levels. Our findings highlight the dire need for multi-institutional, outcomes-based research on strategies for remediation of the deficiencies of incompetent and less-than-competent trainees and physicians, accompanied by long-term follow-up, to determine the impact on future performance. Absent such research, we are left to extrapolate from the small number of available studies and from the literature in the learning sciences. These resources do, in fact, all point to a model that includes multiple assessment tools for identifying deficiencies, individualized instruction, deliberate practice followed by feedback and reflection, and reassessment.
Acknowledgments
The authors thank Josephine Tan for help with the literature search.
References
1 Federation of State Medical Boards. Essentials of a Modern Medical Practice Act. Dallas, Tex: Federation of State Medical Boards; 2006.
2 Hauer KE, Hodgson CS, Kerr KM, Teherani A, Irby DM. A national study of medical student clinical skills assessment. Acad Med. 2005;80(10 suppl):S25–S29.
3 Lurie SJ, Mooney CJ, Lyness JM. Measurement of the general competencies of the accreditation council for graduate medical education: A systematic review. Acad Med. 2009;84:301–309.
4 Howley LD, Wilson WG. Direct observation of students during clerkship rotations: A multiyear descriptive study. Acad Med. 2004;79:276–280.
5 Leape LL, Fromson JA. Problem doctors: Is there a system-level solution? Ann Intern Med. 2006;144:107–115.
6 Federation of State Medical Boards. Maintenance of licensure: Frequently asked questions. Available at: http://www.fsmb.org/m_mol_faqs.html. Accessed August 10, 2009.
7 Papadakis MA, Teherani A, Banach MA, et al. Disciplinary action by medical boards and prior behavior in medical school. N Engl J Med. 2005;353:2673–2682.
8 Papadakis MA, Arnold GK, Blank LL, Holmboe ES, Lipner RS. Performance during internal medicine residency training and subsequent disciplinary action by state licensing boards. Ann Intern Med. 2008;148:869–876.
9 Papadakis MA. The Step 2 clinical-skills examination. N Engl J Med. 2004;350:1703–1705.
10 Hauer KE, Teherani A, Kerr KM, Irby DM, O'Sullivan PS. Consequences within medical schools for students with poor performance on a medical school standardized patient comprehensive assessment. Acad Med. 2009;84:663–668.
11 Accreditation Council for Graduate Medical Education. Outcome Project. Available at: http://www.acgme.org/outcome. Accessed August 10, 2009.
12 Medical professionalism in the new millennium: A physician charter. Ann Intern Med. 2002;136:243–246.
13 Boylan HR, Bonham BS, White SR. Developmental and remedial education in postsecondary education. New Dir Higher Educ. 1999;108:87–101.
14 Bordage G. Elaborated knowledge: A key to successful diagnostic thinking. Acad Med. 1994;69:883–885.
15 Bordage G, Lemieux M. Semantic structures and diagnostic thinking of experts and novices. Acad Med. 1991;66(9 suppl):S70–S72.
16 Norman G. Research in clinical reasoning: Past history and current trends. Med Educ. 2005;39:418–427.
17 Norman G, Schmidt H. The psychological basis of problem-based learning: A review of the evidence. Acad Med. 1992;67:557–565.
18 Schmidt H, Norman G, Boshuizen H. A cognitive perspective on medical expertise: Theory and implication. Acad Med. 1990;65:611–621.
19 Schmidt H, Rikers R. How expertise develops in medicine: Knowledge encapsulation and illness-script formation. Med Educ. 2007;41:1133–1139.
20 Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 suppl):S70–S81.
21 Cruess RL, Cruess SR. Teaching professionalism: General principles. Med Teach. 2006;28:205–208.
22 Cruess SR, Cruess RL. Professionalism must be taught. BMJ. 1997;315:1674–1677.
23 Hafferty F. Professionalism: The next wave. N Engl J Med. 2006;355:2151–2152.
24 Inui TS. A Flag in the Wind: Educating for Professionalism in Medicine. Washington, DC: Association of American Medical Colleges; 2003.
25 Stern D, Papadakis M. The developing physician—Becoming a professional. N Engl J Med. 2006;355:1794–1799.
26 Chen D, Mills A, Werhane P. Tools for tomorrow's health care system: A systems-informed mental model, moral imagination, and physicians' professionalism. Acad Med. 2008;83:723–732.
27 The Special Committee on Evaluation of Quality of Care and Maintenance of Competence. Evaluation of Quality of Care and Maintenance of Competence. Dallas, Tex: Federation of State Medical Boards; 1999.
28 Best Evidence Medical Education Collaboration Web site. Available at: http://www.bemecollaboration.org. Accessed August 10, 2009.
29 Kirkpatrick DL. Evaluation of training. In: Craig R, Mittel I, eds. Training and Development Handbook. New York, NY: McGraw-Hill; 1967:87–112.
30 Magarian GJ, Campbell SM. A tutorial for students demonstrating adequate skills but inadequate knowledge after completing a medicine clerkship at the Oregon Health Sciences University. Acad Med. 1992;67:277–278.
31 Schwartz PL, Loten EG. Effect of remedial tutorial help on students who fail in-course assessments. Acad Med. 1998;73:913.
32 Chou CL, Chang A, Hauer KE. Remediation workshop for medical students in patient–doctor interaction skills. Med Educ. 2008;42:537.
33 Faustinella F, Orlando PR, Colletti LA, Juneja HS, Perkowski LC. Remediation strategies and students' clinical performance. Med Teach. 2004;26:664–665.
34 Sayer M, Chaput De Saintonge M, Evans D, Wood D. Support for students with academic difficulties. Med Educ. 2002;36:643–650.
35 Stillman PL, Ruggill JS, Rutala PJ, Dinham SM, Sabers DL. Students transferring into an American medical school. Remediating their deficiencies. JAMA. 1980;243:129–133.
36 Chang A, Chou CL, Hauer KE. Clinical skills remedial training for medical students. Med Educ. 2008;42:1118–1119.
37 Edeiken BS. Remedial program for diagnostic radiology residents. Invest Radiol. 1993;28:269–274.
38 Harthun NL, Schirmer BD, Sanfey H. Remediation of low ABSITE scores. Curr Surg. 2005;62:539–542.
39 McAuley RG, Paul WM, Morrison GH, Beckett RF, Goldsmith CH. Five-year results of the peer assessment program of the College of Physicians and Surgeons of Ontario. CMAJ. 1990;143:1193–1199.
40 Wenghofer EF, Way D, Moxam RS, Wu H, Faulkner D, Klass DJ. Effectiveness of an enhanced peer assessment program: Introducing education into regulatory assessment. J Contin Educ Health Prof. 2006;26:199–208.
41 Goulet F, Gagnon R, Gingras ME. Influence of remedial professional development programs for poorly performing physicians. J Contin Educ Health Prof. 2007;27:42–48.
42 Goulet F, Jacques A, Gagnon R. An innovative approach to remedial continuing medical education, 1992–2002. Acad Med. 2005;80:533–540.
43 Schwartz RW, Witzke DB, Donnelly MB, Stratton T, Blue AV, Sloan DA. Assessing residents' clinical performance: Cumulative results of a four-year study with the Objective Structured Clinical Examination. Surgery. 1998;124:307–312.
44 Steinert Y, Levitt C. Working with the “problem” resident: Guidelines for definition and intervention. Fam Med. 1993;25:627–632.
45 Carraccio CL, Benson BJ, Nixon LJ, Derstine PL. From the educational bench to the clinical bedside: Translating the Dreyfus developmental model to the learning of clinical skills. Acad Med. 2008;83:761–767.
46 Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(suppl):S63–S67.
47 Epstein RM. Assessment in medical education. N Engl J Med. 2007;356:387–396.
48 Southgate L, Cox J, David T, et al. The assessment of poorly performing doctors: The development of the assessment programmes for the General Medical Council's Performance Procedures. Med Educ. 2001;35(suppl 1):2–8.
49 Hauer KE, Teherani A, Kerr KM, O'Sullivan PS, Irby DM. Student performance problems in medical school clinical skills assessments. Acad Med. 2007;82(suppl):S69–S72.
50 Lynagh M, Burton R, Sanson-Fisher R. A systematic review of medical skills laboratory training: Where to from here? Med Educ. 2007;41:879–887.
51 Schon D. The Reflective Practitioner: How Professionals Think in Action. New York, NY: Basic Books; 1983.
52 Irby DM, Milam S. The legal context for evaluating and dismissing medical students and residents. Acad Med. 1989;64:639–643.
53 Deil-Amen R, Rosenbaum JE. The unintended consequences of stigma-free remediation. Sociol Educ. 2002;75:249–268.
54 Griffith SR, Meyer JM. Remediation in Texas: A prototype for national reform? New Dir Higher Educ. 1999;27:103–114.
55 Norcross WA, Henzel TR, Freeman K, Milner-Mares J, Hawkins RE. Toward meeting the challenge of physician competence assessment: The UCSD Physician Assessment and Clinical Education (PACE) Program. Acad Med. 2009;84:1008–1014.
56 White CB, Ross PT, Gruppen L. Remediating students' failed OSCE performances at one school: The effects of self-assessment, reflection, and feedback. Acad Med. 2009;84:651–654.
Appendix 1: Published Studies Describing Remediation Interventions (Diagnosis of Performance Deficits, Remediation, or Reassessment) in Undergraduate, Graduate, and Continuing Medical Education Through October 2008