Design, Implementation, and Evaluation of a Simulation-Based Clinical Correlation Curriculum as an Adjunctive Pedagogy in an Anatomy Course

Coombs, Carmen M. MD, MPH; Shields, Ryan Y. MD; Hunt, Elizabeth A. MD, MPH, PhD; Lum, Ying Wei MD; Sosnay, Patrick R. MD; Perretta, Julianne S. MSEd; Lieberman, Rhett H. MD, MPH; Shilkofski, Nicole A. MD, MEd

doi: 10.1097/ACM.0000000000001387
Innovation Reports

Problem Because reported use of simulation in preclinical basic science courses is limited, the authors describe the design, implementation, and preliminary evaluation of a simulation-based clinical correlation curriculum in an anatomy course for first-year medical students at Perdana University Graduate School of Medicine (in collaboration with Johns Hopkins University School of Medicine).

Approach The simulation curriculum, with five weekly modules, was a component of a noncadaveric human anatomy course for three classes (n = 81 students) from September 2011 to November 2013. The modules were designed around major anatomical regions (thorax; abdomen and pelvis; lower extremities and back; upper extremities; and head and neck) and used various types of simulation (standardized patients, high-fidelity simulators, and task trainers). Several methods were used to evaluate the curriculum’s efficacy, including comparing pre- versus posttest scores and comparing posttest scores against the score on 15 clinical correlation final exam questions.

Outcomes A total of 81 students (response rate: 100%) completed all pre- and posttests and consented to participate. Posttest scores suggest significant knowledge acquisition and better consistency of performance after participation in the curriculum. The comparison of performance on the posttests and final exam suggests that using simulation as an adjunctive pedagogy can lead to excellent short-term knowledge retention.

Next Steps Simulation-based medical education may prove useful in preclinical basic science curricula. Next steps should be to validate the use of this approach, demonstrate cost-efficacy or the “return on investment” for educational and institutional leadership, and examine longer-term knowledge retention.

C.M. Coombs is assistant professor, Division of Emergency Medicine, Seattle Children’s Hospital, Seattle, Washington.

R.Y. Shields is a resident in obstetrics and gynecology, Yale University School of Medicine, New Haven, Connecticut. At the time of writing, R.Y. Shields was a medical student, Johns Hopkins University School of Medicine, Baltimore, Maryland.

E.A. Hunt is associate professor, Departments of Anesthesiology and Critical Care Medicine, Pediatrics, and Health Informatics, Johns Hopkins University School of Medicine, Baltimore, Maryland.

Y.W. Lum is assistant professor, Division of Vascular Surgery, Johns Hopkins University School of Medicine, Baltimore, Maryland, and former course director for human anatomy, Perdana University Graduate School of Medicine, Kuala Lumpur, Malaysia.

P.R. Sosnay is assistant professor, Division of Pulmonary and Critical Care Medicine, Johns Hopkins University School of Medicine, Baltimore, Maryland, and former director, Genes to Society Curriculum, Perdana University Graduate School of Medicine, Kuala Lumpur, Malaysia.

J.S. Perretta is instructor, Division of Anesthesia and Critical Care Medicine, Johns Hopkins University School of Medicine, and lead simulation educator, Johns Hopkins Medicine Simulation Center, Baltimore, Maryland.

R.H. Lieberman is associate professor, Division of Pediatric Emergency Medicine, Children’s Hospital of Pittsburgh, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania.

N.A. Shilkofski is assistant professor, Departments of Anesthesiology and Critical Care Medicine and Pediatrics, Johns Hopkins University School of Medicine, Baltimore, Maryland, and former vice dean for education, Perdana University Graduate School of Medicine, Kuala Lumpur, Malaysia.

Funding/Support: None reported.

Other disclosures: None reported.

Ethical approval: This study was reviewed and approved by the institutional review board of Johns Hopkins University School of Medicine.

Previous presentations: The study data have not been previously published elsewhere. They have been presented only as preliminary data in abstract form at the Ottawa Conference in Kuala Lumpur, Malaysia, in 2012. A brief description of the curriculum was presented as a case study in the Routledge International Handbook of Medical Education published in 2015.

Correspondence should be addressed to Nicole A. Shilkofski, Johns Hopkins University School of Medicine, Bloomberg Children’s Center, 1800 Orleans St., Suite 6349, Baltimore, MD 21287; telephone: (410) 955-2393; e-mail: nshilko1@jhmi.edu; Twitter: @HopkinsMedicine.


Problem

Simulation-based medical education (SBME) has become a commonly used pedagogy because of advances in technology, increasing attention to patient safety, and greater awareness of active learning as a core principle within adult learning.1 SBME has also gained recognition as an effective teaching and assessment method that lends itself well to clinical training and is supported by a significant body of research.2

In a 2011 Association of American Medical Colleges survey, all 90 responding schools reported using simulation at some point during the four years of medical school training.3 In the preclinical years, however, the most commonly used forms of SBME were animal models or standardized patient (SP)-based simulation, and SBME was rarely used within basic science courses.3

Reported use of simulation in preclinical basic science courses is limited. A small body of literature describes the use of simulation to teach physiologic principles, such as cardiovascular function4 and shock physiology,5 as well as pharmacology.6 Although over 40% of responding medical schools in the Association of American Medical Colleges survey reported using simulation to teach anatomy,3 to the best of our knowledge, there are no published descriptions of dedicated simulation curricula for anatomy. Further, many medical schools reporting the use of simulation within anatomy curricula classified cadaveric dissection and simulated surgical procedures using cadavers as simulation techniques.7 Several studies have reported the use of virtual surgical procedures to identify anatomical structures and the integration of laparoscopic robotic procedures into traditional anatomy courses.8 However, few examples exist of high-fidelity simulators and task trainers being used as comprehensive tools to demonstrate surface anatomy and reinforce clinically relevant anatomical concepts.9

To help address the gap in the literature noted above, we describe the design, implementation, and preliminary evaluation of a simulation-based clinical correlation curriculum in an anatomy course for first-year medical students at Perdana University Graduate School of Medicine, Kuala Lumpur, Malaysia (in collaboration with Johns Hopkins University School of Medicine, Baltimore, Maryland). We designed the simulation-based clinical correlation curriculum as an adjunctive pedagogy to highlight key anatomical concepts, reinforce surface anatomy, and emphasize clinically relevant anatomy. Our purpose was to determine if simulation could be effectively integrated into an anatomy curriculum with a demonstrated impact on knowledge transfer and exam performance.


Approach

We designed, implemented, and evaluated a simulation-based clinical correlation curriculum with weekly modules as a component of a noncadaveric human anatomy course for three consecutive classes of first-year medical students (n = 81) from September 2011 to November 2013. Other separate but concurrent components of the anatomy course not specifically evaluated here included didactic lectures, gross anatomy labs using plastinated prosection models, radiologic correlations, computer-based virtual dissections, and case-based learning sessions.

The institutional review board of the Johns Hopkins University School of Medicine approved this study. Students verbally consented to participate in the study and had the option of excluding their performance data from analysis. Data were anonymized and tracked by random unique identifiers. All performance data were confidential and accessible only to the research team.


Description of the curriculum

An overview of the simulation-based clinical correlation curriculum is presented in Table 1. The curriculum design used high-fidelity mannequin simulators, task trainers, anatomical models, and SP scenarios. We designed five distinct modules around major anatomical regions (thorax; abdomen and pelvis; lower extremities and back; upper extremities; and head and neck) correlating with the region being studied in the course during a given week. We integrated the modules with the other components of the course (see above), with the modules occurring midweek. Each module included three separate simulation stations that each had four to seven learning objectives. Students spent approximately 45 minutes at each station. We developed the learning objectives using Bloom’s taxonomy, emphasizing cognitive and psychomotor domains and active application of anatomical knowledge. Our learning objectives focused on the introduction of these skills as a way to apply anatomical knowledge in a pragmatic way, not on procedural competence or skill mastery, given the target learners’ level of training and the short time frame for practice at each station. Accordingly, our evaluation methods targeted the assessment of knowledge rather than skill attainment. Each station also included a corresponding PowerPoint presentation highlighting key anatomical concepts.

Table 1

Simulation modules, PowerPoint materials, learning objectives, and pre- and posttest questions were developed and taught by anatomy faculty and clinicians (including C.M.C., Y.W.L., P.R.S., and N.A.S.) who were experts in their fields (head and neck, musculoskeletal, pulmonary and thoracic anatomy, etc.). These faculty formed a consensus panel, which chose clinically relevant anatomy topics to be included in the curriculum through a Delphi method survey process. In addition, we had the curricular materials vetted by multiple education experts with extensive experience in SBME (E.A.H., J.S.P., N.A.S.). We describe below 3 of the 15 total stations as examples of the curricular integration of various types of simulation (SP based, high-fidelity mannequin based, and task trainer based). Table 1 further describes the simulation equipment used at each station.

During the week focusing on thoracic anatomy, one station taught the surface landmarks required for needle decompression of a tension pneumothorax and relevant pleural anatomy. The high-fidelity simulator was programmed to have a unilateral pneumothorax, and students were given clinical history, vital signs, physical exam, and radiologic findings consistent with tension pneumothorax. Once students determined the diagnosis, they identified anatomical surface landmarks on mannequins for needle decompression and subsequent chest tube placement. Faculty emphasized the relevant regional anatomy of neurovascular bundles, ribs, muscles, and parietal or visceral pleural structures as well as the avoidance of regional structures during procedures. Students then demonstrated needle decompression on mannequins, with concurrent verbalization of surface landmarks and underlying anatomy.

During the week of abdominal and pelvic anatomy, one station used SP simulation to highlight important anatomical features of the abdomen and pelvis. The SP presented with a classic history for appendicitis. Guided by faculty, students examined the SP and found physical exam signs such as pain over McBurney’s point and psoas sign. On the basis of their findings, they generated an anatomically based differential diagnosis, and faculty led them through a discussion on relevant abdominal anatomy.

During head and neck anatomy week, one station reinforced laryngeal, supraglottic, and subglottic anatomy using neonatal and adult task trainers with the capability for endotracheal intubation. After a faculty-led discussion on direct laryngoscopy technique and clinically relevant laryngeal anatomy for intubation, students individually demonstrated direct laryngoscopy with verbal identification of anatomical landmarks during intubation, including the epiglottis, vallecula, arytenoids, and true or false vocal cords. Faculty also noted the anatomic relationship between the trachea and esophagus and discussed erroneous esophageal intubation.


Data analysis

We used several methods to evaluate the efficacy of the curriculum. We designed multiple-choice exams for each module based on the specific learning objectives for each station. The exams were administered immediately before and after each module. We assessed data for normality of distribution and calculated means and standard deviations for continuous variables and proportions for categorical variables. Paired t tests compared pre- and posttest scores. To assess conceptual retention and knowledge decay, we included a subset of questions from the posttests (n = 15, representing 1 question per station) on the course’s final summative exam, which contained a total of 120 questions. These 15 questions used slightly different clinical vignette stems on the final exam to minimize test–retest bias. The course director chose these questions prior to the analysis of any posttest performance data. We compared total posttest scores with the scores obtained on the 15 clinical correlation final exam questions using paired t tests. Additionally, we compared student scores on the clinical correlation final exam questions with their scores on the 105 nonclinical correlation final exam questions by paired t test. We considered P < .05 significant and used SPSS Statistics 22.0 (IBM Corp., Armonk, New York) for statistical calculations.
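The core comparison described above (each student's pretest score paired with his or her posttest score) can be sketched with a minimal paired t test. The scores below are invented for illustration, not study data, and the study itself used SPSS rather than code:

```python
# Minimal sketch of a paired t test on pre- vs. posttest scores.
# The scores are hypothetical examples, not the study data.
import math
import statistics

pre = [48, 55, 60, 42, 51, 58, 47, 53]   # pretest scores, scaled to 100
post = [85, 90, 88, 80, 87, 92, 84, 86]  # posttest scores, same students

diffs = [b - a for a, b in zip(pre, post)]        # per-student gain
mean_d = statistics.mean(diffs)                   # mean paired difference
sd_d = statistics.stdev(diffs)                    # sample SD of differences
t_stat = mean_d / (sd_d / math.sqrt(len(diffs)))  # paired t statistic
print(f"mean gain = {mean_d:.1f}, t = {t_stat:.2f}, df = {len(diffs) - 1}")
```

The paired design matters here: because each student serves as his or her own control, the test is computed on the within-student differences rather than on the two group means independently.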

To assess students’ perceptions of the achievement of learning objectives, we had students complete evaluations with five-point Likert scale items (1 = strongly disagree; 5 = strongly agree) and open-ended questions after each module.


Outcomes

A total of 81 students (response rate: 100%) completed all pre- and posttests and consented to participate. Students ranged in age from 21 to 31 years, with a mean of 24 years. Fifty-three (65%) students were female, and 28 (35%) were male. All students held undergraduate degrees in science disciplines. Thirty-seven (46%) students reported previous anatomy training. Only 3 (4%) students reported prior exposure to simulation.

Figure 1 shows mean pre- and posttest scores for each module and overall. For comparison, all scores were scaled to 100. For each module, mean posttest scores were significantly higher than mean pretest scores (P < .001 for all comparisons). The mean (± standard deviation) total pretest score was 51% (± 10.8%) compared with 87% (± 6.7%) for the total posttest score, a 71% relative improvement. This comparison suggests significant knowledge acquisition and more consistent performance (a smaller standard deviation) across the cohort after participation in the simulation curriculum.
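The reported relative improvement follows directly from the two reported means (a quick arithmetic check, not study code):

```python
# Relative improvement of the mean total posttest score over the mean
# total pretest score, using the two means reported above.
pretest_mean = 51.0   # mean total pretest score (%)
posttest_mean = 87.0  # mean total posttest score (%)
relative_improvement = (posttest_mean - pretest_mean) / pretest_mean * 100
print(round(relative_improvement))  # prints 71
```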

Figure 1

To assess knowledge retention from the simulation modules, we calculated each student’s total score across the posttests for all five modules and compared it against his or her total score on the 15 clinical correlation questions used on the course’s final exam. For comparison, all scores were scaled to 100. The mean score of 91% (± 8.5%) on the 15 final exam questions was significantly higher than the mean total posttest score of 87% (± 6.7%) (P < .001). This comparison of performance on the posttests and final exam suggests that this pedagogical strategy can lead to excellent short-term knowledge retention.

A comparison between students’ performance on the 15 clinical correlation final exam questions and the remaining 105 nonclinical correlation final exam questions (which related to knowledge obtained from the other concurrent components of the anatomy course) is shown in Figure 2. Students scored significantly higher on the clinical correlation questions (mean 91% ± 8.5%) than on the nonclinical correlation questions (mean 69% ± 12.6%; P < .001). This suggests that this pedagogical strategy is a highly effective method to help students learn and retain key anatomical concepts. However, because some of the information taught in the simulation stations was also covered in concurrent components of the anatomy course, we cannot definitively conclude that this knowledge was obtained exclusively from participation in the simulation curriculum.

Figure 2

We administered evaluations to students after each module to assess their perceptions of the achievement of learning objectives. For all modules, 77 (95%) students either agreed or strongly agreed that the stations met the stated learning objectives. Themes that emerged from the open-ended questions included a positive sense of learner engagement, an appreciation for the interactive nature of the modules, and a desire for more time per station and fewer students per group.

Data from both our formative and summative evaluations show that the simulation curriculum was well received and perceived to be a valuable part of the course by students, thereby demonstrating face validity of the curriculum. From the students’ perspective, the main drawbacks were inadequate time allotted per station and too many students per group.


Next Steps

Our outcomes demonstrate the potential utility of SBME as an adjunctive pedagogy within a preclinical basic science curriculum. However, our sample size was small, and therefore a randomized controlled trial was not a feasible study design. Next steps should include a multi-institutional endeavor to study the integration of SBME in basic science courses in this fashion to develop tools to validate the use of this pedagogic strategy. This endeavor should also integrate lessons learned from our pilot implementation, including using smaller numbers of students per group at simulation stations and allotting more time per station for students to practice skills.

Simulation is very resource intensive in terms of the number of faculty required to execute small-group sessions and the cost of simulation equipment and facilities.10 With the high demands already on faculty and increasing financial pressures on schools, this may limit the feasibility of SBME curricula in some cases. Therefore, another next step should be to utilize study designs that compare educational benefits against a cost evaluation framework for larger-scale implementation, thereby evaluating the SBME pedagogy’s potential “return on investment” for educational and institutional leadership.

Future studies should also attempt to demonstrate longer-term knowledge retention of anatomical concepts through correlation with students’ performance on validated standardized exams, such as the National Board of Medical Examiners gross anatomy and embryology subject exam and the anatomy section of the United States Medical Licensing Exam Step 1.

Although more research is needed, our outcomes suggest that the integration of simulation pedagogies into basic science curricula could offer opportunities for preclinical students to engage in highly active learning strategies that lend themselves to understanding the clinical and translational relevance of basic science. Rather than replacing traditional teaching methods, we suggest that SBME could be a useful adjunctive pedagogy and should be the subject of further study within preclinical basic science curricula that use blended pedagogies.

Acknowledgments: The authors would like to acknowledge the anatomy faculty of Johns Hopkins University School of Medicine, the faculty at Perdana University Graduate School of Medicine who assisted in curriculum instruction, and the simulation educators at the Johns Hopkins Medicine Simulation Center who were instrumental in curriculum design and implementation.


References

1. Motola I, Devine LA, Chung HS, Sullivan JE, Issenberg SB. Simulation in healthcare education: A best evidence practical guide. AMEE guide no. 82. Med Teach. 2013;35:e1511–e1530.
2. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44:50–63.
3. Association of American Medical Colleges. Medical simulation in medical education: Results of an AAMC survey. https://members.aamc.org/eweb/upload/Medical%20Simulation%20in%20Medical%20Education%20Results%20of%20an%20AAMC%20Survey.pdf. Published September 2011. Accessed July 7, 2016.
4. Harris DM, Ryan K, Rabuck C. Using a high-fidelity patient simulator with first-year medical students to facilitate learning of cardiovascular function curves. Adv Physiol Educ. 2012;36:213–219.
5. Cendan JC, Johnson TR. Enhancing learning through optimal sequencing of Web-based and manikin simulators to teach shock physiology in the medical curriculum. Adv Physiol Educ. 2011;35:402–407.
6. Via DK, Kyle RR, Trask JD, Shields CH, Mongan PD. Using high-fidelity patient simulation and an advanced distance education network to teach pharmacology to second-year medical students. J Clin Anesth. 2004;16:144–151.
7. Nutt J, Mehdian R, Parkin I, Dent J, Kellett C. Cadaveric surgery: A novel approach to teaching clinical anatomy. Clin Teach. 2012;9:148–151.
8. Hariri S, Rawn C, Srivastava S, Youngblood P, Ladd A. Evaluation of a surgical simulator for learning clinical anatomy. Med Educ. 2004;38:896–902.
9. Torres K, Torres A, Pietrzyk L, et al. Simulation techniques in the anatomy curriculum: Review of literature. Folia Morphol (Warsz). 2014;73:1–6.
10. AAMC Institute for Improving Medical Education. Effective use of educational technology in medical education—Colloquium on educational technology: Recommendations and guidelines for medical educators. https://members.aamc.org/eweb/upload/Effective%20Use%20of%20Educational.pdf. Published March 2007. Accessed July 7, 2016.
© 2017 by the Association of American Medical Colleges