Academic Medicine: June 2011 - Volume 86 - Issue 6
doi: 10.1097/ACM.0b013e318217e119
Comparative Effectiveness Research

Does Simulation-Based Medical Education With Deliberate Practice Yield Better Results Than Traditional Clinical Education? A Meta-Analytic Comparative Review of the Evidence

McGaghie, William C. PhD; Issenberg, S. Barry MD; Cohen, Elaine R.; Barsuk, Jeffrey H. MD; Wayne, Diane B. MD


Author Information

Dr. McGaghie is Jacob R. Suker, MD, Professor of Medical Education, professor of preventive medicine, and director of evaluation, Northwestern University Clinical and Translational Sciences Institute, Northwestern University Feinberg School of Medicine, Chicago, Illinois.

Dr. Issenberg is Michael S. Gordon, MD, Professor of Medicine and assistant director, Gordon Center for Research in Medical Education, University of Miami Miller School of Medicine, Miami, Florida.

Ms. Cohen is research assistant, Department of Medicine, Northwestern University Feinberg School of Medicine, Chicago, Illinois.

Dr. Barsuk is assistant professor of medicine, Division of Hospital Medicine, Northwestern University Feinberg School of Medicine, Chicago, Illinois.

Dr. Wayne is associate professor of medicine and director, Internal Medicine Residency Training Program, Northwestern University Feinberg School of Medicine, Chicago, Illinois.

Correspondence should be addressed to Dr. McGaghie, Center for Education in Medicine, Northwestern University Feinberg School of Medicine, 1-003 Ward Building, 303 East Chicago Avenue, Chicago, IL 60611; telephone: (312) 503-0174; fax: (312) 503-0840; e-mail: wcmc@northwestern.edu.

First published online April 20, 2011.


Abstract

Purpose: To compare the effectiveness of traditional clinical education versus simulation-based medical education (SBME) with deliberate practice (DP) in achieving specific clinical skill acquisition goals.

Method: This is a quantitative meta-analysis that spans 20 years, 1990 to 2010. A search strategy involving three literature databases, 12 search terms, and four inclusion criteria was used. Four authors independently retrieved and reviewed articles. Main outcome measures were extracted to calculate effect sizes.

Results: Of 3,742 articles identified, 14 met inclusion criteria. The overall effect size for the 14 studies evaluating the comparative effectiveness of SBME compared with traditional clinical medical education was 0.71 (95% confidence interval, 0.65–0.76; P < .001).

Conclusions: Although the number of reports analyzed in this meta-analysis is small, these results show that SBME with DP is superior to traditional clinical medical education in achieving specific clinical skill acquisition goals. SBME is a complex educational intervention that should be introduced thoughtfully and evaluated rigorously at training sites. Further research on incorporating SBME with DP into medical education is needed to amplify its power, utility, and cost-effectiveness.

This article addresses the comparative effectiveness of traditional methods of clinical medical education, especially the Halstedian “see one, do one, teach one” approach,1 versus simulation-based medical education (SBME) with deliberate practice (DP). SBME2–4 engages learners in lifelike experiences with varying fidelity designed to mimic real clinical encounters. DP embodies strong and consistent educational interventions grounded in information processing and behavioral theories of skill acquisition and maintenance.5–8 DP has at least nine elements (List 1).

[List 1: The nine elements of deliberate practice]

The goal of DP is constant skill improvement, not just skill maintenance. The power of DP has been demonstrated in many professional domains including sports, commerce, performing arts, science, and writing.9 Research shows that DP is a much more powerful predictor of professional accomplishment than experience or academic aptitude.6

Comparative effectiveness research (CER), also known as patient-centered outcomes research, refers to studies that compare the benefits and liabilities of different interventions and strategies to prevent, diagnose, treat, and monitor health conditions.10–12 The aim is to “make head-to-head comparisons of different health care interventions [that] outline the effectiveness—or benefits and harms—of treatment options.”13 Conventional clinical treatment options include drugs, surgery, rehabilitation, and preventive interventions that (1) improve patient health, (2) contribute to quality of life, and (3) boost longevity. Treatment options grounded in comparative research have efficacy in controlled laboratory settings and are also effective in clinical patient care where health care delivery, its receipt, and patient adherence vary widely.14

U.S. CER policies have been published recently by the Institute of Medicine (IOM) under the title, Knowing What Works in Health Care: A Roadmap for the Nation.15 Complementary work by the U.S. Agency for Healthcare Research and Quality (AHRQ) outlines comparative health care research priorities.16 These expressions of CER policies and priorities focus on conventional treatment options. However, they do not address the value of a skillful medical and health professions workforce and the importance of its education for the delivery of effective health care. We assert that human capital, embodied in competent physicians and other health care professionals, is an essential feature of health care delivery even though IOM policies and AHRQ research priorities are silent about the contribution of health professions education to health care delivery.

The purpose of medical education at all levels is to prepare physicians with the knowledge, skills, and features of professionalism needed to deliver quality patient care. Medical education research seeks to make the enterprise more effective, efficient, and economical. Short- and long-run goals of research in medical education are to show that educational programs contribute to physician competence measured in the classroom, simulation laboratory, and patient care settings. Improved patient outcomes linked directly to educational events are the ultimate goal of medical education research and qualify this scholarship as translational science.17

This article reviews and evaluates evidence about the comparative effectiveness of SBME with DP versus traditional clinical education. The aim of the study is to perform a "head-to-head" comparison of these two educational methods with respect to clinical skill acquisition. This is a quantitative meta-analysis of SBME research that spans 20 years, from 1990 to 2010. The comparative review is selective and critical; we also believe it is exhaustive, because the number of existing comparative studies is small.


Method

This article was prepared using most reporting conventions described in the MOOSE (Meta-analysis Of Observational Studies in Epidemiology) statement18 and the QUOROM statement19 for reports of meta-analyses of randomized controlled trials.

Study eligibility and identification

Quantitative research synthesis begins with a systematic search of existing literature. Our search strategy covered three literature databases (MEDLINE, Web of Knowledge, PsycINFO) and employed 12 single search terms and concepts (clinical education, clinical outcomes, deliberate practice, fellows, mastery learning, medical education, medical simulation training, medical students, patient outcomes, quality of care, residents, simulator) and their Boolean combinations. We searched publications from 1990 to April 2010. We also reviewed reference lists of all selected manuscripts to identify additional reports. The intent was to perform a detailed and thorough search of peer-reviewed publications, already judged for academic quality, to evaluate the comparative effectiveness of SBME with DP versus traditional clinical education.
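The article does not reproduce its exact query strings, so the following Boolean combination of the single terms is purely illustrative of the style of search used, not the actual query:

```
(simulator OR "medical simulation training" OR "deliberate practice"
 OR "mastery learning")
AND ("medical students" OR residents OR fellows)
AND ("clinical outcomes" OR "patient outcomes" OR "quality of care")
```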

Study selection

Four authors (W.C.M., S.B.I., E.R.C., and D.B.W.) independently retrieved articles using the 12 search terms and reviewed titles and abstracts. The full text of each article thought to be eligible for the study was also reviewed by the four authors. Four inclusion criteria were used to select the pool of eligible studies for the final analysis: Each study had to (1) feature SBME with DP as an educational intervention, (2) have an appropriate comparison group featuring traditional, clinical education or a preintervention baseline measurement for single-group designs, (3) assess trainee skill acquisition rather than knowledge or attitudes, and (4) present sufficient data to enable effect size calculation. Conflicts were resolved by consensus.

Data abstraction and synthesis

The following data were extracted from selected studies: (1) study design (i.e., randomized trial, cohort study, case–control study, pre-post baseline study), (2) sample size, (3) outcome variables (i.e., what competency was assessed), and (4) reported skill outcome values (mean and standard deviation).

Two authors (W.C.M., E.R.C.) abstracted information about the main outcome measure for each study and performed statistical analyses. For studies with a comparison group, effect sizes were calculated as the difference in means between the intervention and control groups, divided by the pooled standard deviation. For these studies, the intervention group comprised all medical trainees receiving SBME with DP, whereas the control group included all medical trainees receiving traditional clinical education. Effect size calculations for pre-post baseline studies (within-subjects designs) were performed by dividing the mean difference between posttest and pretest outcomes by the pretest standard deviation. When sufficient data were not available, we used t test values and degrees of freedom to calculate effect size correlation.20 Effect sizes were derived for individual studies and then combined across research reports. Effect size estimates were corrected for sample size. Correlation coefficients were calculated from effect size estimates. For each outcome of interest, pooled estimates and 95% confidence intervals (CIs) of effect size correlations were calculated using an inverse-variance weighted random-effects meta-analysis.20 Statistical significance was defined as P < .05. Data analyses were done using Comprehensive Meta-Analysis, Version 2 (Biostat, Englewood, New Jersey).
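In standard notation (following Borenstein et al.20; the symbols below are ours, not the article's), the calculations just described are:

$$ d = \frac{\bar{X}_{\mathrm{SBME}} - \bar{X}_{\mathrm{trad}}}{SD_{\mathrm{pooled}}}, \qquad d_{\mathrm{pre\text{-}post}} = \frac{\bar{X}_{\mathrm{post}} - \bar{X}_{\mathrm{pre}}}{SD_{\mathrm{pre}}}, \qquad r = \sqrt{\frac{t^{2}}{t^{2} + df}} $$

and the inverse-variance weighted random-effects pooled estimate is

$$ \hat{\theta} = \frac{\sum_{i} w_{i}^{*}\,\theta_{i}}{\sum_{i} w_{i}^{*}}, \qquad w_{i}^{*} = \frac{1}{v_{i} + \hat{\tau}^{2}}, $$

where $\theta_i$ and $v_i$ are study $i$'s effect estimate and within-study variance, and $\hat{\tau}^2$ is the estimated between-study variance.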


Results

Each reviewer screened the 3,742 citations identified using our search strategies. We excluded papers if they were not original research, did not involve medical learners, or were not published in English. This left a subset of 328 articles for further review. Each author evaluated this group in detail, and discussion continued until full consensus was reached on whether each article met the inclusion criteria described above. Of this group, 314 were excluded from the final analysis because they did not feature DP, lacked a comparison group or preintervention baseline, did not use a simulation-based intervention, or presented insufficient data. Several studies were included even though the term "deliberate practice" was not used in the text. In these cases, descriptions of the type, intensity, and quality of the educational interventions were synonymous with the DP model.

The search strategy and inclusion and exclusion criteria resulted in a final set of 14 research reports addressing medical clinical skill acquisition.21–34 The 14 journal articles are listed in Table 1 in four descending categories ordered by the rigor of their research design.14

[Table 1: The 14 included studies, listed in four descending categories ordered by the rigor of their research design]

A total of 633 learners participated, including 389 internal medicine, surgical, and emergency medicine residents, 226 medical students, and 18 internal medicine fellows. The SBME studies address a number of competencies and skills including advanced cardiac life support, laparoscopic surgical techniques, central venous catheter insertion, cardiac auscultation, and thoracentesis. Six of the studies demonstrated improvement in laparoscopic surgical skills including cholecystectomy, instrument and camera navigation and handling, and suturing live tissues.22–26,34 Two of the studies showed improved performance and adherence to American Heart Association advanced cardiac life support guidelines including responses during actual patient codes.21,30 Cardiac auscultation skills including identification and interpretation of heart sounds and murmurs were improved among medical students and residents in two studies.27,29 Four of the studies demonstrated improved ability among residents and fellows to perform three invasive procedures (hemodialysis catheter insertion, thoracentesis, central venous catheter insertion).28,31–33

Results from the meta-analysis of the 14 studies comparing SBME with DP versus traditional clinical education are displayed quantitatively and as a forest plot in Figure 1.35 The figure shows CER results with 95% CIs for each individual study and overall; the size of each box indicates a study's relative sample size. Without exception and with very high confidence, the CER data favor SBME with DP over traditional clinical education or a preintervention baseline measure: every study's confidence interval excludes the null value. The overall effect size correlation (0.71) qualifies as a large effect size36 and summarizes the quantitative power of SBME with DP educational interventions for skill acquisition compared with traditional clinical education.
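To make the pooling procedure concrete, the sketch below implements inverse-variance weighted random-effects pooling of effect size correlations via Fisher's z transformation, assuming the common DerSimonian-Laird between-study variance estimator. The authors used the commercial Comprehensive Meta-Analysis package, so this code, the function name, and its example inputs are illustrative assumptions rather than their implementation:

```python
import math

def pooled_effect_correlation(studies):
    """Inverse-variance weighted random-effects pooling of effect size
    correlations via Fisher's z transformation (DerSimonian-Laird).

    studies: list of (r, n) pairs, one per study, where r is the
    effect size correlation and n is the study's sample size.
    Returns (pooled_r, ci_lower, ci_upper) on the correlation scale.
    """
    # Fisher z transform; the variance of z is approximately 1/(n - 3)
    z = [math.atanh(r) for r, _ in studies]
    v = [1.0 / (n - 3) for _, n in studies]

    # Fixed-effect weights and Cochran's Q heterogeneity statistic
    w = [1.0 / vi for vi in v]
    z_fixed = sum(wi * zi for wi, zi in zip(w, z)) / sum(w)
    q = sum(wi * (zi - z_fixed) ** 2 for wi, zi in zip(w, z))

    # DerSimonian-Laird estimate of between-study variance tau^2
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(studies) - 1)) / c)

    # Random-effects weights add tau^2 to each within-study variance
    w_re = [1.0 / (vi + tau2) for vi in v]
    z_re = sum(wi * zi for wi, zi in zip(w_re, z)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))

    # Back-transform the pooled z and its 95% CI to the r scale
    return (math.tanh(z_re),
            math.tanh(z_re - 1.96 * se),
            math.tanh(z_re + 1.96 * se))

# Hypothetical (r, n) inputs for illustration only
print(pooled_effect_correlation([(0.65, 40), (0.75, 30), (0.70, 50)]))
```

Pooling is done on the z scale because the sampling variance of a raw correlation depends on the correlation itself; transforming, combining, and back-transforming is the standard remedy.20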

[Figure 1: Forest plot of effect size correlations with 95% confidence intervals for the 14 individual studies and the overall random-effects estimate]

Discussion and Conclusions

Only a small number of studies were identified that address “head-to-head” comparative effectiveness of SBME with DP and traditional clinical education or a preintervention baseline. However, the results of this meta-analysis are clear and unequivocal. The meta-analytic outcomes favoring SBME with DP are powerful, consistent, and without exception. There is no doubt that SBME is superior to traditional clinical education for acquisition of a wide range of medical skills represented in this study: advanced cardiac life support, laparoscopic surgery, cardiac auscultation, hemodialysis catheter insertion, thoracentesis, and central venous catheter insertion. We are confident that demonstrations of the utility and cost-effectiveness37 of educational interventions featuring SBME with DP will increase as the technology is applied to other skill acquisition and maintenance opportunities in health care.

A growing body of evidence shows that clinical skills acquired in medical simulation laboratory settings transfer directly to improved patient care practices and better patient outcomes. Examples of improved patient care practices linked directly to SBME include studies of better management of difficult obstetrical deliveries (e.g., shoulder dystocia),38 laparoscopic surgery,39 and bronchoscopy.40 Better patient outcomes linked directly to SBME have been reported in several studies using historical control groups that address reductions in catheter-related bloodstream infections41 and postpartum outcomes (e.g., brachial palsy injury,38 neonatal hypoxic–ischemic encephalopathy42) among newborn infants. Such work suggests that traditional, clinical education is insufficient if the goal is skill acquisition and downstream patient safety.

The power and utility of SBME with DP toward the goal of skill acquisition are no longer in doubt, especially compared with traditional models of clinical education. However, we also acknowledge that SBME with DP is a complex intervention with a variety of elements: a long implementation chain, features that mutate through refinement and adaptation to local circumstances, and open-system properties that feed back on the intervention itself. Pawson and colleagues43 assert, "As interventions are implemented, they change the conditions that made them work in the first place." There is much to learn about the organizational effects of SBME with DP on the medical schools and postgraduate residency programs that adopt these new educational technologies. Best practices for developing faculty to teach using SBME with DP also warrant attention. Finally, we agree with Eva,44 who asserts that the medical education community needs to "move away from research that is intended to prove the effectiveness of our educational endeavours and towards research that aims to understand the complexity inherent in those activities."

The results of this CER study underscore the importance of finding new ways to invest in and grow the human capital embodied in a highly skilled workforce to improve health care delivery and patient safety. CER policies and priorities should endorse the importance of medical and health professions education in addition to investments in basic science research, drug design, medical device fabrication, and other established mechanisms of medical translational science. A recent conference hosted by Harvard Medical School involving educational leaders from eight other U.S. medical schools concluded that "investigation of the efficacy of simulation in enhancing the performance of medical school graduates received the highest [priority] score."45 Enhancement of the traditional clinical educational model with evidence-based practices like SBME with DP should be a high priority for medical education policy and research.

This study has several limitations. First, the final number of research studies contained in the meta-analysis (14) is small even though the data involve 633 medical learners. Second, the meta-analysis primarily addresses acquisition of medical procedural skills. It does not cover acquisition of many other clinical skills, such as judgment under pressure, medical decision making, situation awareness, teamwork, or professional behavior. It is not known whether the DP model is suited to these skills, and research is warranted. Third, we are aware of many potential sources of bias that may affect meta-analyses of quasi-experimental research including cohort, case–control, and pre-post studies.46 Despite these limitations, the direction, strength, and consistency of results from this study indicate that the outcomes are robust. We conclude that DP is a key variable in rigorous SBME research and training. Further CER on SBME with DP versus traditional clinical education will refute or endorse this conclusion.


Acknowledgments:

The authors acknowledge the graphical expertise of Sheila Macomber. They thank Douglas Vaughan, MD, at Northwestern University and Michael S. Gordon, MD, PhD, at the University of Miami for their support of this work.


Funding/Support:

This research was supported in part by the Jacob R. Suker, MD, professorship in medical education and by grant UL1 RR025741 from the National Center for Research Resources, National Institutes of Health (NIH) (Dr. McGaghie). The NIH had no role in the preparation, review, or approval of the manuscript.


Other disclosures:

None.


Ethical approval:

Not applicable.


References

1. Halsted WS. The training of the surgeon. Bull Johns Hopkins Hosp. 1904;15:267–275.

2. Issenberg SB, McGaghie WC, Hart IR, et al. Simulation technology for health care professional skills training and assessment. JAMA. 1999;282:861–866.

3. Issenberg SB, McGaghie WC, Petrusa ER, et al. Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Med Teach. 2005;27:10–28.

4. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44:50–63.

5. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 suppl):S70–S81. http://journals.lww.com/academicmedicine/Fulltext/2004/10001/Deliberate_Practice_and_the_Acquisition_and.22.aspx. Accessed February 25, 2011.

6. Ericsson KA. The influence of experience and deliberate practice on the development of superior expert performance. In: Ericsson KA, Charness N, Feltovich PJ, Hoffman RR, eds. The Cambridge Handbook of Expertise and Expert Performance. New York, NY: Cambridge University Press; 2006:683–703.

7. Cordray DS, Pion GM. Treatment strength and integrity: Models and methods. In: Bootzin RR, McKnight PE, eds. Strengthening Research Methodology: Psychological Measurement and Evaluation. Washington, DC: American Psychological Association; 2006:103–124.

8. Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med. 2006;21:251–256.

9. Ericsson KA, Charness N, Feltovich PJ, Hoffman RR, eds. The Cambridge Handbook of Expertise and Expert Performance. New York, NY: Cambridge University Press; 2006.

10. U.S. Department of Health and Human Services. Comparative Effectiveness Research Funding: Federal Coordinating Council for Comparative Effectiveness Research. http://www.hhs.gov/recovery/programs/cer. Accessed February 28, 2011.

11. Institute of Medicine. National Priorities for Comparative Effectiveness Research. Washington, DC: National Academies Press; 2009.

12. Hochman M, McCormick D. Characteristics of published comparative effectiveness studies of medications. JAMA. 2010;303:951–958.

13. Agency for Healthcare Research and Quality. AHRQ Effective Health Care Program. http://effectivehealthcare.ahrq.gov. Accessed February 25, 2011.

14. Fletcher RH, Fletcher SW, Wagner EH. Clinical Epidemiology—The Essentials. 3rd ed. Baltimore, MD: Lippincott Williams & Wilkins; 1996.

15. Institute of Medicine. Knowing What Works in Health Care: A Roadmap for the Nation. Washington, DC: National Academies Press; 2008.

16. Agency for Healthcare Research and Quality. What is the Effective Health Care Program? http://effectivehealthcare.ahrq.gov/index.cfm/what-is-the-effective-health-care-program1. Accessed February 25, 2011.

17. McGaghie WC. Medical education research as translational science. Sci Transl Med. 2010;2:19cm8.

18. Stroup DF, Berlin JA, Morton SC, et al, for the Meta-analysis Of Observational Studies in Epidemiology (MOOSE) Group. Meta-analysis of observational studies in epidemiology: A proposal for reporting. JAMA. 2000;283:2008–2012.

19. Moher D, Cook DJ, Eastwood S, et al, for the QUOROM Group. Improving the quality of reports of meta-analyses of randomized controlled trials: The QUOROM statement. Lancet. 1999;354:1896–1900.

20. Borenstein M, Hedges LV, Higgins JPT, Rothstein HR. Introduction to Meta-Analysis. Chichester, UK: John Wiley & Sons; 2009.

21. Wayne DB, Butter J, Siddall VJ, et al. Simulation-based training of internal medicine residents in advanced cardiac life support protocols: A randomized trial. Teach Learn Med. 2005;17:202–208.

22. Ahlberg G, Enochsson L, Gallagher AG, et al. Proficiency-based virtual reality training significantly reduces the error rate for residents during their first 10 laparoscopic cholecystectomies. Am J Surg. 2007;193:797–804.

23. Andreatta PB, Woodrum DT, Birkmeyer JD, et al. Laparoscopic skills are improved with LapMentor training: Results of a randomized, double-blinded study. Ann Surg. 2006;243:854–863.

24. Korndorffer JR, Dunne JB, Sierra R, et al. Simulator training for laparoscopic suturing using performance goals translates to the operating room. J Am Coll Surg. 2005;201:23–29.

25. Korndorffer JR, Hayes DJ, Dunne JB, et al. Development and transferability of a cost-effective laparoscopic camera navigation simulator. Surg Endosc. 2005;19:161–167.

26. Van Sickle KR, Ritter EM, Baghai M, et al. Prospective, randomized, double-blind trial of curriculum-based training for intracorporeal suturing and knot tying. J Am Coll Surg. 2008;207:560–568.

27. Issenberg SB, McGaghie WC, Gordon DL, et al. Effectiveness of a cardiology review course for internal medicine residents using simulation technology and deliberate practice. Teach Learn Med. 2002;14:223–228.

28. Barsuk JH, Ahya SN, Cohen ER, et al. Mastery learning of temporary hemodialysis catheter insertion by nephrology fellows using simulation technology and deliberate practice. Am J Kidney Dis. 2009;53:A14–A17.

29. Butter J, McGaghie WC, Cohen ER, et al. Simulation-based mastery learning improves cardiac auscultation skills in medical students. J Gen Intern Med. 2010;25:780–785.

30. Wayne DB, Didwania A, Feinglass J, et al. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital. Chest. 2008;133:56–61.

31. Wayne DB, Barsuk JH, O'Leary KJ, et al. Mastery learning of thoracentesis skills by internal medicine residents using simulation technology and deliberate practice. J Hosp Med. 2008;3:48–54.

32. Barsuk JH, McGaghie WC, Cohen ER, et al. Use of simulation-based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit. J Hosp Med. 2009;4:397–403.

33. Barsuk JH, McGaghie WC, Cohen ER, et al. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med. 2009;37:2697–2701.

34. Stefanidis D, Sierra R, Korndorffer JR, et al. Intensive continuing medical education course training on simulators results in proficiency in laparoscopic suturing. Am J Surg. 2006;191:23–27.

35. Anzures-Cabrera J, Higgins JPT. Graphical displays for meta-analysis: An overview with suggestions for practice. Res Syn Meth. 2010;1:66–80.

36. Cohen J. Statistical Power Analysis for the Behavioral Sciences. 2nd ed. Hillsdale, NJ: Lawrence Erlbaum Associates; 1988.

37. Cohen ER, Feinglass J, Barsuk JH, et al. Cost savings from reduced catheter-related bloodstream infection after simulation-based education for residents in a medical intensive care unit. Simul Healthc. 2010;5:98–102.

38. Draycott TJ, Crofts JF, Ash JP, et al. Improving neonatal outcome through practical shoulder dystocia training. Obstet Gynecol. 2008;112:14–20.

39. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: Results of a randomized, double-blinded study. Ann Surg. 2002;236:458–463.

40. Blum MG, Powers TW, Sundaresan S. Bronchoscopy simulator effectively prepares junior residents to competently perform basic clinical bronchoscopy. Ann Thorac Surg. 2004;78:287–291.

41. Barsuk JH, Cohen ER, Feinglass J, et al. Use of simulation-based education to reduce catheter-related bloodstream infections. Arch Intern Med. 2009;169:1420–1423.

42. Draycott T, Sibanda T, Owen L, et al. Does training in obstetric emergencies improve neonatal outcome? BJOG. 2006;113:177–182.

43. Pawson R, Greenhalgh T, Harvey G, Walshe K. Realist review—A new method of systematic review designed for complex policy interventions. J Health Serv Res Policy. 2005;10(suppl 1):21–34.

44. Eva KW. The value of paradoxical tensions in medical education research. Med Educ. 2010;44:3–4.

45. Fincher RM, White CB, Huang G, Schwartzstein R. Toward hypothesis-driven medical education research: Task force report from the Millennium Conference 2007 on educational research. Acad Med. 2010;85:821–828. http://journals.lww.com/academicmedicine/Abstract/2010/05000/Toward_Hypothesis_Driven_Medical_Education.27.aspx. Accessed February 25, 2011.

46. Colliver JA, Kucera K, Verhulst SJ. Meta-analysis of quasi-experimental research: Are systematic narrative reviews indicated? Med Educ. 2008;42:858–865.

© 2011 Association of American Medical Colleges
