Response to the 2011 Question of the Year
Medical education can lead to better health for individuals and populations when it has effective, evidence-based features and is delivered under the right conditions. Effective, evidence-based features include mastery learning (ML), deliberate practice (DP), and rigorous outcome measurement (ML and DP are both defined below). The right conditions include a committed and skillful faculty, curriculum integration and institutional endorsement, and health care system acceptance. Translation of medical education outcomes into measurable downstream improvements in patient care practices and in health for individuals and populations is demonstrated by educational and health services research programs that are thematic, sustained, and cumulative.
ML is an especially stringent form of competency-based education in which learners acquire essential knowledge and skill measured rigorously against fixed achievement standards, without regard to the time needed to reach the outcome. Mastery indicates a much higher level of performance than competence alone, and evidence shows that ML leads to longer skill maintenance without significant decay. Educational outcomes are uniform in ML with little or no variation, whereas educational time varies among trainees.1 In medical education, ML has been used chiefly for acquisition and maintenance of clinical procedural skills such as advanced cardiac life support (ACLS), thoracentesis, and central venous catheter (CVC) insertion. ML can also be used to acquire and refine cognitive and affective educational outcomes. The ability to engage a family in a difficult conversation about end-of-life issues is a clinical skill amenable to ML, just as performance of a lumbar puncture is. Work is now under way to evaluate these and other clinical mastery outcomes.
DP embodies strong and consistent educational interventions grounded in information processing and behavioral theories of skill acquisition and maintenance.2 DP has at least nine elements: (1) highly motivated learners with good concentration, (2) well-defined learning objectives that address knowledge or skills that matter clinically, at an (3) appropriate level of difficulty for the medical learners, with (4) focused, repetitive practice of the knowledge or skills, which leads to (5) rigorous measurements that yield reliable data, which provide (6) informative feedback from educational sources (e.g., teachers, simulators) that promotes frequent (7) monitoring, error correction, and more DP, which enables (8) performance evaluation toward reaching a mastery standard and allows (9) advancement to the next clinical task or unit. The goal of DP is constant skill improvement. Research shows that DP is a much more powerful predictor of professional accomplishment than experience or academic aptitude.
Medical education and evaluation research programs that incorporate ML and DP principles, and evaluate outcomes with measurement and methodological rigor, are beginning to show translational results in patient care practices and patient outcomes.3 Many of these educational programs use health care simulation technology as a curriculum driver. Examples of improved patient care include reduced complications and higher success rates at CVC insertion, improvement in laparoscopic surgical skill, better adherence to guidelines during ACLS team responses, and increased competence in several types of endoscopy. Better health for individuals and populations linked directly to medical education programs has been demonstrated through reduced rates of catheter-related bloodstream infections; reduced birth complications due to shoulder dystocia (brachial plexus injury), low Apgar scores, and infant brain injury from neonatal hypoxic–ischemic encephalopathy; and lower postsurgical complications among cataract surgery patients.3 Advancements in medical education, evaluated rigorously, can produce better patient health as judged statistically and clinically.
Powerful and effective medical education programs do not exist in a vacuum. They include not only such curriculum features as ML, DP, and reliable outcome measurement but also faculty and administrative commitment, curriculum support expressed as financial and human capital, and a health care system whose culture embraces professional competence evaluation in service of patient care quality and patient safety at all levels. Medical education programs are being recognized as complex service interventions that are affected by the context in which they are delivered. This context is highly variable but has a powerful role in determining the ultimate success of the program. A new, interdisciplinary academic field called implementation science, and the scholarly journal that bears its name, hold promise to teach the medical education community how to develop, launch, and sustain educational programs that improve health for individuals and populations.
Medical school and residency curricula must change to adopt a competency-based approach featuring structured learning experiences tied to assessments that yield reliable data. Research shows convincingly that ML and DP linked to competence assessment can improve health outcomes. Expansion of this model is needed to better prepare trainees for independent and group practice and to ensure competent medical care for patients and society.
1 Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med. 2006;21:251–256.
2 McGaghie WC, Siddall VJ, Mazmanian PE, Myers J. Lessons for continuing medical education from simulation research in undergraduate and graduate medical education. Effectiveness of continuing medical education: American College of Chest Physicians evidence-based educational guidelines. Chest. 2009;135(3 suppl):62S–68S.
3 McGaghie WC, Draycott TJ, Dunn WF, Lopez CM, Stefanidis D. Evaluating the impact of simulation on translational patient outcomes. Simul Healthc. 2011;6(suppl):S42–S47.