The development of US medical education during the 20th and early 21st centuries has been both linear and cautious. The development has been linear because it was shaped by scientific and technical advances, including breakthrough research in the biomedical and clinical sciences, constantly improving medical and information technology, growth in the number of medical schools and medical students, and more efficient and effective systems of health care delivery.1 Medical education has also improved steadily through increasingly sophisticated methods of personnel measurement and evaluation.2 The development has been cautious because, as historians point out, the basic 2 + 2 structure (2 years of basic science + 2 years of clinical experiences) has not changed since at least 1905.3 The structure and format of clinical medical education in 2015 are nearly identical to those advocated by Sir William Osler in 1903.4 This lack of progress has prompted calls for significant reform in US medical education1 and spawned the argument that we must stop educating 21st century physicians using 19th century ideas, technologies, and curricula.5
What is a medical curriculum? Janet Grant teaches that a curriculum is, “A statement of the intended aims and objectives, content, experiences, outcomes and processes of an educational programme, including:
- a description of the training structure (entry requirements, length and organization of the programme, including its flexibilities, and assessment system)
- a description of expected methods of learning, teaching, feedback and supervision
The curriculum should cover both generic professional and specialty-specific areas. The syllabus content of the curriculum should be stated in terms of what knowledge, skills, attitudes and expertise the learner will achieve.”6
Medical curricula have been developed using a variety of models ranging from subject centered to integrated and competency based.7 Kern et al8 have presented a model of medical curriculum development that embodies a 6-step approach. The 6 steps are as follows:
- Problem identification and general needs assessment,
- Targeted needs assessment,
- Goals and objectives,
- Educational strategies,
- Implementation, and
- Evaluation and feedback.
The Kern curriculum development team proposes 2 addenda to round out the stepwise approach: curriculum maintenance and enhancement as well as dissemination. The model of Kern et al and its derivatives have been used to create and evaluate simulation-based medical education (SBME) curricula to address a variety of medical specialties including primary care gynecology,8 general surgery,9 and pediatrics.10
The idea of mastery learning in medical education is congruent with the 6-step approach to medical curriculum development by Kern et al. Mastery learning takes the Kern model as a foundation and adds layers of very high expectations involving educational objectives, achievement standards, educational strategies, and evaluation and feedback. Mastery learning begins with the expectation of “excellence for all” and sets a high standard for the answer to the question, “How good is good enough?”
McGaghie et al11 outlined 7 principles of mastery learning curricula in medical education, grounded on a 50-year history of the idea.5 The 7 mastery learning principles are as follows11:
- Baseline or diagnostic testing;
- Clear learning objectives, sequenced as units with increasing difficulty;
- Engagement in educational activities (eg, deliberate practice (DP), data interpretation, reading) focused on the objectives;
- A set minimum passing standard (MPS) (eg, test score) for each educational unit;
- Formative testing to gauge unit completion at a preset MPS for mastery;
- Advancement to the next educational unit given measured achievement at or above the mastery standard; and
- Continued practice or study of an educational unit until the mastery standard is reached.
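The 7 principles above amount to a simple control loop: test, compare the result with the unit's minimum passing standard, practice more when below it, and advance only at or above it, so that achievement is fixed and training time is what varies. The loop can be sketched as follows (a minimal illustration only; the function names, unit structure, and simulated learner are hypothetical and not part of the published curriculum):

```python
def advance_through_units(units, take_test, extra_practice, max_attempts=10):
    """Walk a learner through sequenced units, advancing only when the
    unit's minimum passing standard (MPS) is met or exceeded."""
    attempts_log = {}
    for unit in units:
        attempts = 0
        while True:
            attempts += 1
            score = take_test(unit)      # formative test for this unit
            if score >= unit["mps"]:
                break                    # mastery standard met; advance
            if attempts >= max_attempts:
                raise RuntimeError(
                    f"no mastery after {max_attempts} attempts: {unit['name']}"
                )
            extra_practice(unit)         # time varies; the standard does not
        attempts_log[unit["name"]] = attempts
    return attempts_log
```

The log records how many formative tests each unit required; under mastery learning that number, not the final level of achievement, is what differs across learners.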
The mastery model of medical education ensures that all learners achieve all educational objectives with little or no outcome variation. In contrast with traditional time-based medical curricula, in mastery learning, the time needed to reach a unit’s educational objectives may vary among learners.11
This article provides a framework for using the 6-step curriculum development approach of Kern et al to help educators develop a mastery learning curriculum. Specifically, we used the Kern model, together with mastery learning features, to develop an Advanced Cardiac Life Support (ACLS) SBME curriculum that was later transformed into a simulation-based mastery learning (SBML) curriculum. The ACLS curriculum, which was developed, implemented, evaluated, and refined at Northwestern University Feinberg School of Medicine from 2003 to 2014, illustrates each step.
This article is organized into 5 sections that address the evolution of the ACLS mastery learning curriculum using the Kern model: (a) origins and early development, (b) measurement, (c) implementation and short-run goals, (d) transformation to a mastery model, and (e) maintenance and enhancement. These sections comprise a comprehensive curriculum plan as seen in Table 1. We conclude the article with a section on limitations and challenges. The concepts used in ACLS curriculum development have also been used successfully for education in other medical procedures, knowledge, and communication skills.30
ADVANCED CARDIAC LIFE SUPPORT MASTERY LEARNING CURRICULUM
Origins and Early Development
The ACLS curriculum was initially developed because the American Board of Internal Medicine requires all internal medicine (IM) residents to perform ACLS safely and effectively (Kern step 1, problem identification and general needs assessment).12 To meet this goal, IM trainees typically complete a 1-day provider course offered by the American Heart Association (AHA) every 2 years.13 Advanced Cardiac Life Support provider courses involve a series of classroom lectures followed by skill building sessions taught in a large-group format. Objective assessment is conducted at Objective Structured Clinical Examination (OSCE)-like stations where case-based scenarios are presented.13
In-hospital cardiac arrests are rare events, and concerns have been expressed about the adequacy of resident training to manage them.31 Data from the University of Chicago show that the quality of cardiac resuscitation attempts by trained hospital personnel varies widely and often does not meet published standards.21 Experts in ACLS argue that health care providers should attend refresher courses frequently to maintain their knowledge and skill.32,33
In academic medical centers, ACLS provider teams including residents from IM, anesthesiology, and surgery are the usual responders to in-hospital cardiac arrests. Before 2003, second- and third-year IM residents at Northwestern Memorial Hospital (NMH) acted as code leaders after completing an AHA provider course. Based on nursing feedback, we determined that completing the standard training course was not preparing residents for this role (Kern step 2, targeted needs assessment). The NMH has a quality improvement (QI) team for cardiac arrest responses. Nursing leadership received feedback from critical care and floor nurses that it was difficult to determine which resident was the code leader during cardiac arrest events. Nurses also reported that residents varied in their adherence to AHA ACLS guidelines. Given these observations, the chief medical residents and IM residency director (D.B.W.) were asked to meet with the QI team and develop a solution to the problem. Concerns about ACLS skill acquisition and retention prompted the development, implementation, and evaluation of an educational program at NMH designed to boost IM residents’ skills and management of in-hospital cardiac events (Kern step 3, goals and objectives).
The ACLS core curriculum originated from the 6 most common cardiac arrest events at NMH and matched content in the 2001 ACLS Provider Manual published by the AHA.14 The ACLS scenarios were as follows: asystole, ventricular fibrillation, supraventricular tachycardia, ventricular tachycardia, symptomatic bradycardia, and pulseless electrical activity. Simulation-based medical education was selected as the ACLS learning and teaching platform because of its legacy of success in medical education and its ability to provide a safe, controlled practice environment (Kern step 4, educational strategy).28,34 We used realistic clinical scenarios with a high-fidelity simulator (HPS, METI LLC, Sarasota, FL), which allowed for DP35 with focused real-time feedback. Training occurred over four 2-hour simulation education sessions during a 2-week period.17 The METI simulator uses computer software to display multiple physiologic and pharmacologic responses observed in ACLS. The mannequin has respiratory responses, reactive pupils, heart sounds, and peripheral pulses. Cuff (systemic) blood pressure, arterial oxygen saturation, electrocardiogram, and arterial line blood pressure can be monitored. Educational objectives focused on clinical procedures were addressed using task trainers for airway management, line insertion, and compressions.
The 8-hour program was not intended to mirror an AHA ACLS provider course. Instead, the goal was to enhance the knowledge, skills, and attitudes necessary to prepare IM residents to lead and participate in “code” events. During a 4-month preintervention monitoring period, chief medical residents reviewed hospital cardiac arrest logs and developed scenarios based on the 6 most commonly occurring ACLS events. The measures and scenarios for ACLS were originally published on MedEdPortal (see Document, Supplementary Digital Content 1, http://links.lww.com/SIH/A250 checklist).25 The scenarios were pilot tested with attending physicians, ACLS instructors, and other content experts and revised as needed. The use of the human patient simulator in a center equipped with 1-way glass and audiovisual technology allowed the residents to react and care for simulated in-hospital cardiac events repeatedly while managing a team of their peers in a “safe” learning environment.
Checklists were developed using rigorous step-by-step procedures for each of the 6 ACLS conditions.25,26 Two faculty members (including D.B.W.), an ACLS instructor (V.J.S.), and 2 chief medical residents used the modified Delphi technique to develop the checklists based on the AHA ACLS guidelines. Within each original checklist patient assessment, clinical examination, medication administration, chest compressions, monitoring, and other actions were listed in the order recommended by the AHA.14 All checklist items were given equal weight. A dichotomous scoring scale ranging from 0 (not done/done incorrectly) to 1 (done correctly) was used for each checklist item.25
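With dichotomous items and equal weights, a scenario score is simply the fraction of checklist items done correctly, and the pass/fail decision compares that fraction with the scenario's minimum passing standard. A small sketch of this scoring rule (the function names are illustrative, not from the published instrument):

```python
def checklist_score(item_marks):
    """Equal-weight dichotomous checklist: each item is scored
    0 (not done/done incorrectly) or 1 (done correctly)."""
    if not item_marks:
        raise ValueError("empty checklist")
    return sum(item_marks) / len(item_marks)

def meets_standard(item_marks, mps):
    """Pass/fail decision against a minimum passing standard (a fraction)."""
    return checklist_score(item_marks) >= mps
```

For example, 15 of 20 items done correctly yields a score of 0.75, which would pass the asystole standard reported later (74.3%) but not the ventricular fibrillation standard (76.4%).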
Faculty raters completed training and calibration during the pilot testing phase. All 6 scenarios were pilot tested for 2 weeks on 10 nonstudy subjects to calibrate raters and clarify measures. Each evaluator was trained to use the checklist uniformly to ensure that reliable data were generated. Video-recorded pilot sessions were regraded by faculty to assess interrater reliability using Cohen’s κ. A κ greater than 0.8 for each item was the reliability standard. Checklist items that did not have acceptable reliability were modified until agreement was reached. Faculty raters received feedback about their scoring and had refresher training and practice every 4 to 6 months to ensure continued high checklist scoring reliability.
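For two raters scoring the same dichotomous items, Cohen’s κ corrects raw percent agreement for the agreement expected by chance given each rater’s marginal frequencies; values above 0.8 (the standard used here) indicate very strong agreement. A minimal sketch of the computation (illustrative only; in practice a statistics package would be used):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' item-by-item scores:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("ratings must be non-empty and of equal length")
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal category frequencies.
    categories = set(rater_a) | set(rater_b)
    expected = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    if expected == 1:
        return 1.0  # both raters use a single category identically
    return (observed - expected) / (1 - expected)
```

Perfect item-by-item agreement yields κ = 1, while agreement no better than chance yields κ = 0, which is why κ rather than raw percent agreement was the calibration target.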
Implementation and Short-Run Goals (Kern Step 5)
All second- and third-year residents were required to participate in the ACLS curriculum by the program director (D.B.W.) for the IM residency. The Northwestern ACLS curriculum project addressed 3 short-run questions. (a) Does SBME produce better ACLS knowledge and skill acquisition than traditional medical education that includes AHA provider courses?17 (b) Are graduating IM residents who have not received SBME for ACLS procedures accurate judges of their ACLS skills?18 (c) Do ACLS skills acquired in the simulation laboratory translate to better IM resident responses to real hospital ACLS events?19 Deliberate practice, measurement, feedback, and correction in a supportive environment were the operational rules of the simulation-based educational intervention.
Simulation education sessions were conducted uniformly as either teaching or testing occasions.17 The original research report describes the SBME intervention in detail. “Teaching sessions gave groups of two to four residents time to practice protocols and procedures and to receive structured education from simulator faculty. Debriefing allowed the residents to ask questions, review algorithms, and receive feedback. The four teaching sessions were presented in uniform order: (a) procedures—intubation, central line placement, pericardiocentesis, and needle decompression of tension pneumothorax; (b) pulseless arrhythmias—asystole, ventricular fibrillation, pulseless electrical activity; (c) tachycardias—supraventricular and ventricular; and (d) bradycardias—second- and third-degree atrioventricular block.
Two residents were present at each testing session. While one resident directed resuscitation efforts, the other resident performed cardiopulmonary resuscitation or other tasks but did not make management decisions or lead the arrest scenario. The presentation order of the six scenarios was randomized within each testing session. As described in the ACLS guidelines, residents were expected to obtain a history; perform a physical examination; request noninvasive and invasive monitoring; order medications, procedures, and tests; and direct resuscitative efforts of other participants. Residents did not review the scenarios before the session and were not permitted to use written materials while directing the simulations”17 [reprinted with permission of Lawrence/Erlbaum Associates, Inc].
Early curriculum evaluation (Kern step 6) involved 3 studies. The first investigation was a randomized trial with a wait-list control condition designed to evaluate whether the SBME intervention produced significant skill acquisition. We found the educational intervention yielded powerful results in terms of ACLS skill acquisition in the simulation laboratory (38% improvement) and resident morale compared with traditional clinical education.17 Second, we compared graduating IM residents’ self-assessment of their ability to manage ACLS scenarios with measured performance on a simulator. Residents self-assessed their ability to manage ACLS scenarios using a 100-point scale (0 = very low to 100 = very high). These IM residents’ ACLS self-assessments did not correlate with measured performance.18 We also performed a third, case-control study of IM residents’ responses to actual NMH ACLS events to assess curriculum translational outcomes. Responses from SBME-trained residents were 68% adherent to AHA guidelines compared with 44% adherence when care was delivered by residents without SBME training.19 This outcome had statistical and clinical significance. Postevent survival was not significantly different between the 2 groups, but a trend toward increased mean survival time was seen in the simulator-trained group (195 hours) compared with the traditionally trained group (107 hours, P = 0.11), unadjusted for patient risk. Ten percent of the patients in the simulator-trained group survived until hospital discharge compared with 3.6% in the traditionally trained group (P = 0.36).
Transformation From SBME to SBML
The Northwestern ACLS curriculum did not originally use the mastery model. The empirical evidence clearly demonstrated that SBME could produce powerful ACLS skill improvement in the medical simulation laboratory (T1 translational science) and better patient care practices in the hospital (T2 translational science).19,36 This led us to pose 3 new questions: Can the SBME intervention be transformed to an SBML educational model? Are SBML ACLS learning outcomes achieved in the simulation laboratory retained over time? Does mastery learning yield improved educational and clinical outcomes compared with SBME outcomes? We chose to transform the curriculum to SBML on patient safety grounds because all learners would be held to a very high achievement standard before completion of training.
Transformation of the ACLS core curriculum from an SBME format to SBML was accomplished after a 2-year phase-in trial. The 7 principles of mastery learning were grafted to the ACLS curriculum. Approximately 5 hours of professional development time was needed to educate faculty and staff about the principles of mastery learning. Baseline data were also needed to establish an MPS for the SBML curriculum. Basic ACLS curriculum objectives, structure, and simulation-based instructional operations did not change.
The ACLS SBML curriculum continued to emphasize history taking, physical examination, clinical decision making, procedural competence, team leadership, and professional communication. Simulation laboratory sessions featuring DP with feedback were complemented by reading assignments and team debriefing exercises. We provided residents with reading materials including the AHA ACLS guidelines and textbook 1 week before practice sessions. We asked residents to review these materials in advance of specific sessions so they would be prepared to assume a leadership role in simulated ACLS scenarios. Debriefing sessions followed each teaching session. Residents reviewed what went well during practice sessions and what could be improved. Faculty facilitators reviewed concepts, including adherence to structured roles and closed-loop communication, and compared simulation sessions with actual patient care scenarios.
Transforming the SBME curriculum to SBML required 3 improvements: (a) systematically establishing a “mastery” MPS using an expert panel and state-of-the-art methods, (b) holding all of the IM resident trainees to this very high “mastery” achievement standard, and (c) allowing the amount of training time (eg, DP in the simulation laboratory) to vary between individual residents. The intent was to produce very high achievement among IM residents in the ACLS procedures and skills required for board certification with little or no outcome variation.15
The MPS for each ACLS procedure was set by 12 clinical experts using the Angoff and Hofstee standard setting methods.22 The final MPS for each clinical scenario was the average of the Angoff and Hofstee derived standards. Both approaches use a panel of judges composed of individuals with expertise, experience, and knowledge of the subject area. In the Angoff method, experts are asked to review each checklist item and determine the percentage of borderline trainees (a group that has a 50% chance of passing the examination) who would perform each item correctly. The Hofstee method asks judges 4 questions as follows: (a) What is the maximum acceptable passing score? (b) What is the minimum acceptable passing score? (c) What is the maximum acceptable failure rate? (d) What is the minimum acceptable failure rate? The MPS was set at 74.3% checklist items correct for asystole, 76.4% for ventricular fibrillation, 72.4% for supraventricular tachycardia, 74.4% for ventricular tachycardia, 71.5% for symptomatic bradycardia, and 76.6% for pulseless electrical activity.
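The aggregation itself is straightforward arithmetic: the Angoff judgments are averaged into one standard, and the final MPS averages the Angoff and Hofstee standards. A minimal sketch under stated assumptions (the Hofstee compromise score is normally derived graphically from the judges’ 4 answers and the observed score distribution, so it is taken here as a given panel-derived value; all function names are illustrative):

```python
def angoff_standard(judge_item_probs):
    """Angoff: each judge estimates, for every checklist item, the probability
    that a borderline trainee performs it correctly; the standard is the
    grand mean across judges of each judge's mean item probability."""
    judge_means = [sum(items) / len(items) for items in judge_item_probs]
    return sum(judge_means) / len(judge_means)

def combined_mps(angoff, hofstee):
    """Final minimum passing standard: average of the two derived standards."""
    return (angoff + hofstee) / 2.0
```

Averaging the two methods hedges against either approach alone setting the standard unrealistically high (Angoff can) or letting acceptable-failure-rate considerations dominate (Hofstee can).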
Outcomes from the first round of SBML of ACLS skills among 41 IM residents at NMH were compelling. “Thirty-three of the 41 medicine residents (80.5%) achieved mastery within the standard 8-hour training and deliberate practice ACLS curriculum. The remaining eight residents (19.5%) needed extra time to reach mastery ranging from 15 minutes to one hour. Only five residents needed a full extra hour of deliberate practice to reach all six ACLS scenario MPSs.”15 Residents who did not meet or exceed the MPS at posttest participated in further DP to focus on identified deficits. Because residents were required to meet or exceed the MPS in each scenario, several residents each academic year were required to return to the simulation laboratory for additional practice to remediate 1 or 2 scenarios. During remediation sessions, residents completed individualized DP in the particular area of difficulty. These sessions were well received by residents who did not perceive a negative impact of returning for additional practice.15 The pretest-to-posttest contrast in overall ACLS performance represented a 24% improvement, a highly significant difference (P ≤ 0.0001).15 A follow-up study of the first SBML cohort of IM residents showed that the ACLS skills acquired under mastery conditions did not decay after 6 and 14 months.29
Maintenance and Enhancement
Maintenance and enhancement of the ACLS curriculum based on SBML principles involve constant attention to external organizations that can affect teaching and evaluation, curriculum content, local improvements to achieve educational efficiency, and continued studies of downstream, translational outcomes to demonstrate that powerful educational interventions have an impact on clinical care practices.
External organizations that shape ACLS education and personnel evaluation include the AHA and the International Liaison Committee on Resuscitation and Emergency Cardiovascular Care, which are responsible for updating and revising ACLS guidelines on a 5-year cycle.16 A 2010 revision of the guidelines changed the focus on ACLS training from airway-breathing-circulation (ABC) to circulation-airway-breathing (CAB).23,24 This change prompted our group to revise the ACLS outcome checklists to represent improved clinical practice guidelines.27
All new NMH IM residents continue to undergo SBML for ACLS skill acquisition in addition to required AHA provider courses. We also continue to study translational ACLS outcomes in terms of residents’ responses to in-hospital ACLS events. After the SBML ACLS curriculum was in place, we reduced training time for several reasons. First, we received course feedback that some residents felt that 3 training sessions were sufficient to teach ACLS, and 4 were not needed because IM residents did not routinely perform procedures such as intubation, pericardiocentesis, and needle decompression during actual codes. Second, we planned to institute other SBML courses (eg, for central venous catheter insertion) and wanted to be as efficient as possible with faculty and resident time and resources. For these reasons, we limited the procedural training and incorporated it into relevant scenarios. Despite shortened training time, our research documented that patient care quality continued to improve. After the original curriculum, chart audits showed that residents displayed 68% adherence to AHA protocols during actual ACLS events at our primary teaching hospital.19 A follow-up report showed that after institution of the mastery model guideline, adherence rose to 86% to 88% during actual ACLS events.20 This finding supports the decision to reduce training time and also confirms evidence from a recent meta-analysis that SBML is more effective and efficient than SBME.37
Why were the SBME and SBML interventions successful? Faculty members were highly engaged and had the ability to redesign the checklists and curriculum when ACLS AHA protocols changed (twice). Residents were also highly satisfied with the training and enjoyed participating. We formed a lasting partnership with our hospital QI teams. This relationship allowed us to study the impact of simulation-based interventions as a QI strategy and to document improved and sustained clinical care over time.
What did not work? We were never able to fully incorporate all members of the cardiac arrest team, such as pharmacists and bedside nurses, because of a large number of clinicians and scheduling challenges. We had better success training rapid response nurses alongside IM residents because the nurses had fewer personnel and these numbers were manageable.
Limitations and Challenges
Creation of a rigorous SBML curriculum for individual clinical skills can be either straightforward or challenging, depending on the skill. Developing curricula and outcome measures for complex tasks such as ACLS events is more difficult because it involves training dynamic clinical teams. We were fortunate to have a “criterion standard” approach to ACLS clinical scenarios published by the AHA as a curriculum foundation. Using the mastery model in scenarios that simultaneously involve complex clinical activities and advanced clinical reasoning and decision making is challenging.38 These clinical conditions may not have well-articulated and easily measured metrics and milestones, and it is unknown whether clinical skills training and assessments in these areas can be supported by SBML curricula. However, it is reassuring to know that SBML has successfully been adapted to nonprocedural tasks such as code status discussions.30 Further study is needed on the use of the mastery model to assess management of acute and complex clinical conditions beyond ACLS.38
Additional challenges to developing SBML curricula include administrative support, “buy-in” from key stakeholders, and funding. Attention to each of these is recommended to educators considering the use of SBML. Administrative support is required to schedule trainees for SBML despite competing clinical responsibilities. One reason for our success was direct involvement of the residency program director, who ensured that the ACLS training program was required and attendance was mandatory. We also received strong support from 2 successive department of medicine chairs, hospital quality personnel, and nursing leadership. Support from department chairs allowed us to commit faculty time to the project. Simulation-based mastery learning requires sufficient rater training to develop high interrater reliability and ensure the validity of pass/fail decisions. Because of the support from department leadership, the chief medical residents donated time and effort to review ACLS events and helped develop and pilot test scenarios. The department chairs also supported 2 faculty members at 10% effort for 1 year to participate. Hospital administration contributed to the costs of SBML including facility and space rental. Nursing and quality leaders who supported the project also helped secure a multiyear NMH grant to cover other costs. Endorsement from these sources about the importance of SBML promoted the program’s initial and continued success. We strongly encourage other health professions educators to consider the local environment before selecting SBML targets. Linkage with organizational quality initiatives is beneficial to maximize institutional commitment to a robust SBML intervention.
Finally, an additional challenge (and opportunity) in SBML is linking education in the simulated environment to downstream impacts on patient care quality and health care costs. We demonstrated that our SBML intervention improved the quality of care delivered to patients during ACLS events19,20 and that SBML reduced complications and improved patient outcomes for other medical procedures.39,40 Establishing such translational links is challenging and stems from education and research programs that are thematic, sustained, and cumulative.41 We encourage health profession educators who intend to use the mastery model to extend the end point of their work from the classroom or laboratory to the clinical setting.
Simulation center costs (including 2 ACLS instructors) were $100 per hour when the program began. We used between 120 and 150 hours of simulator time each year. The chief medical residents donated their time to curriculum development and pilot testing as part of their teaching responsibilities during the program’s first years. Two faculty members’ salaries were supported at 10% for 1 year by the department of medicine. We estimate total costs of approximately $45,000 for the first year and approximately $20,000 in subsequent years. We have not measured the return on investment from improved ACLS care. However, SBML has been shown to be highly cost-effective for other medical procedures chiefly by reducing patient complications.42,43
A direct expression of the values held by today’s medical profession is found in the curricula used to educate the next generation of doctors. A medical curriculum leaves a lasting legacy. These curricula must be revised and updated to continue to fulfill the professional practice requirements of trainees and the care expectations of a diverse and informed public. Medical curricula are designed to meet the needs of learners in a local environment. The ACLS core curriculum began with a clinical need and had extensive planning and pilot testing. It changed over time because it was first implemented as a simulation-based education program and later transitioned to the mastery model. Subsequent curricular innovation and remodeling in response to external forces illustrate the teaching of Kern et al that continuous maintenance and enhancement are needed to keep a medical curriculum fresh and timely.8 We anticipate that the SBML ACLS curriculum will continue to mature and change because of medical education research findings about better SBML teaching practices44 and standard setting methods.45 Curricula in SBML are not easy to develop but are justified by the important and sustained educational and downstream clinical outcomes they yield. An additional reward of detailed curriculum planning is the ability to successfully disseminate programs such as the ACLS SBML curriculum to other institutions.46
Our experience with the transformation from SBME to SBML allows us to provide suggestions and advice to others considering the mastery model of education. To succeed, we believe a mastery learning intervention must be (a) powerful with impact on patient care and sustainable, (b) grounded in mastery learning principles including DP and individualized feedback, (c) developed with early attention to implementation science issues including institutional culture as well as faculty and learner engagement, (d) embedded in a unified patient care culture, and (e) linked to rigorous assessments of skill acquisition and patient care outcomes. Mastery learning is no longer a novelty in medical education, and its educational value has been verified via meta-analysis.37 We anticipate widespread adoption of the mastery learning model to promote acquisition and maintenance of a variety of skill and knowledge outcomes in medical education.
1. Cooke M, Irby DM, O’Brien BC. Educating Physicians: A Call for Reform of Medical School and Residency. San Francisco, CA: Jossey-Bass; 2010.
2. Clauser BE, Margolis MJ, Case SM. Testing for licensure and certification in the professions. In: Brennan RL, ed. Educational Measurement. 4th ed. Westport, CT: American Council on Education and Praeger Publishers; 2006:701–731.
3. Ludmerer KM. Learning to Heal: The Development of American Medical Education. Baltimore, MD: The Johns Hopkins University Press; 1985.
4. Osler W. The hospital as a college. In: Osler W, ed. Aequanimitas. Philadelphia, PA: P. Blakiston’s Son & Co; 1932:313–324.
5. Issenberg SB, McGaghie WC. Looking to the future. In: McGaghie WC, ed. International Best Practices for Evaluation in the Health Professions. London, United Kingdom: Radcliffe Publishing, Ltd; 2013:341–359.
6. Grant J. Principles of curriculum design. In: Swanwick T, ed. Understanding Medical Education: Evidence, Theory and Practice. Oxford, United Kingdom: John Wiley & Sons; 2010:1–15.
7. McGaghie WC, Miller GE, Sajid A, Telder TV. Competency-Based Curriculum Development in Medical Education. Public Health Paper No. 68. Geneva, Switzerland: World Health Organization; 1978.
8. Kern DE, Thomas PA, Hughes MT, eds. Curriculum Development for Medical Education: A Six-Step Approach. 2nd ed. Baltimore, MD: The Johns Hopkins University Press; 2009.
9. Stefanidis D, Colavita PD. Simulation in general surgery. In: Levine AI, DeMaria S Jr, Schwartz AD, Sim AJ, eds. The Comprehensive Textbook of Healthcare Simulation. New York, NY: Springer; 2013:353–366.
10. Severin PN, Cortez EP, McNeal CA, Kramer JE. Considerations of pediatric simulation. In: Kyle RR, Murray WB, eds. Clinical Simulation: Operations, Engineering and Management. Burlington, MA: Elsevier; 2008:411–421.
11. McGaghie WC, Siddall VJ, Mazmanian PE, Myers J. Lessons for continuing medical education from simulation research in undergraduate and graduate medical education: effectiveness of continuing medical education: American College of Chest Physicians evidence-based educational guidelines. Chest 2009;135(suppl 3):62S–68S.
12. American Board of Internal Medicine. Policies and Procedures for Certification 2004. Available at: http://www.abim.org/default.aspx. Accessed October 22, 2014.
13. American Heart Association. Advanced Cardiac Life Support-Classroom. Available at: http://www.heart.org/HEARTORG/CPRAndECC/HealthcareProviders/AdvancedCardiovascularLifeSupportACLS/Advanced-Cardiovascular-Life-Support--Classroom_UCM_306643_Article.isp. Accessed October 21, 2014.
14. Cummins RO, ed. ACLS Provider Manual. Dallas, TX: American Heart Association; 2001.
15. Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med 2006;21:251–256.
16. Nolan JP, Hazinski MF, Billi JE, et al. Part 1: executive summary. 2010 international consensus on cardiopulmonary resuscitation and emergency cardiovascular care science with treatment recommendations. Resuscitation
2010; 81S: e1–e25.
17. Wayne DB, Butter J, Siddall VJ, et al. Simulation
-based training of internal medicine residents in advanced cardiac life support
protocols: a randomized trial. Teach Learn Med
2005; 17(3): 210–216.
18. Wayne DB, Butter J, Siddall VJ, et al. Graduating internal medicine residents’ self-assessment and performance of advanced cardiac life support
skills. Med Teach
2006; 28(4): 365–369.
19. Wayne DB, Didwania A, Feinglass J, et al. Simulation
-based education improves the quality of care during cardiac arrest team responses at an academic teaching hospital: a case-control study. Chest
2008; 133: 56–61.
20. Didwania A, McGaghie WC, Cohen ER, et al. Progress toward improving the quality of cardiac arrest medical team responses at an academic teaching hospital. J Grad Med Educ
2011; 3: 211–216.
21. Abella BS, Alvarado JP, Myklebust H, et al. Quality of cardiopulmonary resuscitation during in-hospital cardiac arrest. JAMA
2005; 293: 305–310.
22. Wayne DB, Fudala MJ, Butter J, et al. Comparison of two standard-setting methods for advanced cardiac life support
training. Acad Med
2005; 80(Suppl 10): S63–S66.
23. International Liaison Committee on Resuscitation and Emergency Cardiovascular Care Science with Treatment Recommendations. Circulation
2005; 112(Suppl III)III-1-136.
24. Proceedings of the 2005 International Consensus on Cardiopulmonary Resuscitation and Emergency Cardiovascular Care Science with Treatment Recommendations. Resuscitation
2005; 67: 157–341.
25. Wayne DB, Butter J, Didwania A, et al. Advanced cardiac life support
checklists for simulation
-based education. MedEdPORTAL 2009. Available at: http://www.mededportal.org
/publication/1773. Accessed October 23, 2014.
26. Stufflebeam DL. Guidelines for developing evaluation checklists: The Checklists Development Checklist (CDC)
. Western Michigan University Evaluation Center, July 2000. Available at: http://http://www.wmich.edu
/evalctr/checklists/evaluation-checklists/. Accessed October 16, 2014.
27. Wayne DB, Nitzberg M, Reddy S, et al. Advanced cardiac life support
checklists for simulation
-based education. MedEdPORTAL 2014. Available at: http://www.mededportal.org
/publication/9698. Accessed September 26, 2014.
28. Issenberg SB, McGaghie WC, Hart JR, et al. Simulation
technology for health care professional skills training and assessment. JAMA
1999; 282: 861–866.
29. Wayne DB, Siddall VJ, Butter J, et al. A longitudinal study of internal medicine residents’ retention of advanced cardiac life support
skills. Acad Med
2006; 81(suppl 10): S9–S12.
30. Cohen ER, Barsuk JH, Moazed F, et al. Making July safer: simulation
-based mastery learning
during intern boot camp. Acad Med
2013; 88: 233–239.
31. Peberdy MA, Kaye W, Ornato JP, et al. Cardiopulmonary resuscitation of adults in the hospital: a report of 14,720 cardiac arrests from the National Registry of Cardiopulmonary Resuscitation. Resuscitation
2003; 58: 297–308.
32. Makker R, Gray-Siracusa K, Evers M. Evaluation of advanced cardiac life support
in a community teaching hospital by use of actual cardiac arrests. Heart Lung
1995; 24: 116–120.
33. Kaye W. Research on ACLS training—which methods improve skill and knowledge retention? Respir Care
1995; 40: 538–546.
34. Fincher R-ME, Lewis LA. Simulations used to teach clinical skills. In: Norman GR, van der Vleuten CPM, Newble DI, eds. International Handbook of Research in Medical Education, Part I
. Dordrecht, NL: Kluwer Publishing Co; 2002: 499–523.
35. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med
2004; 79(Suppl 10): S70–S81.
36. McGaghie WC. Medical education research as translational science. Sci Transl Med
2010; 2: 19cm8.
37. Cook DA, Brydges R, Zendejas B, et al. Mastery learning
for health professionals using technology-enhanced simulation
: a systematic review and meta-analysis. Acad Med
2013; 88: 1178–1186.
38. McGaghie WC, Kristopaitis T. Deliberate practice and mastery learning
origins of expert medical performance. In: Cleland J, Durning S, eds. Researching Medical Education
. Chichester, West Sussex, United Kingdom: John Wiley and Sons, Inc; 2015: 219–230.
39. Barsuk JH, McGaghie WC, Cohen EC, et al. Simulation
-based mastery learning
reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med
2009; 37(10): 2697–2701.
40. Barsuk JH, Cohen ER, Feinglass J, et al. Use of simulation
-based education to reduce catheter-related bloodstream infections. Arch Intern Med
2009; 169(15): 1420–1423.
41. McGaghie WC, Draycott TS, Dunn WF, et al. Evaluating the impact of simulation
on translational patient outcomes. Simul Healthc
2011; 6(Suppl 3): S42–S47.
42. Cohen ER, Feinglass J, Barsuk JH, et al. Cost savings from reduced catheter-related bloodstream infection after simulation
-based education for residents in a medical intensive care unit. Simul Healthc
2010; 5: 98–102.
43. Barsuk JH, Cohen ER, Feinglass J, et al. Cost savings of performing paracentesis procedures at the bedside after simulation
-based education. Simul Healthc
2014; 9: 312–318.
44. Hunt EA, Duval-Arnould JM, Nelson-McMillan KL, et al. Pediatric resident resuscitation skills improve after “rapid cycle deliberate practice” training. Resuscitation
2014; 85: 945–951.
45. Yudkowsky R, Tumuluru S, Casey P, et al. A patient safety approach to setting pass/fail standards for basic procedural skills checklists. Simul Healthc
2014; 9: 277–282.
46. Colquitt JD, Parish DC, Trammell AR, et al. Mastery learning
of ACLS among internal medicine residents. Analg Resus Current Res