High-fidelity patient simulators, originally designed for high-risk training in anesthesia,1 are becoming increasingly available for widespread use in general medical education.2–6 Although these mannequin-robots can reliably replicate a variety of clinical encounters, they have seen limited application in internal medicine clerkships. Beyond simply reducing error and promoting safety in high-risk environments, patient simulation may be particularly effective at promoting reflective and comparative analysis among core internal medicine concepts. We sought to develop a general framework for using simulation in the internal medicine clerkship, apply the model to specific curricular domains, and pilot the technique among students.
Adult educational theory suggests that learning occurs through sequential steps of cognitive analysis.7,8 Any initial experience is retained as a memory, synthesized with existing knowledge, and then internally transformed to provide general meaning or understanding of the event. Repetition of similar meaningful experiences, as well as advance preparation by the student for a planned experience, appears to result in improved learning. Reinforcement is thus a critical component of the learning process. Arguably, however, the highest level of learning occurs when reflection and metacognitive analysis take place, allowing each individual event to be compared, contrasted, and contextually placed relative to other personal experience.8
Traditionally, clinical clerkships are structured to combine didactic and apprenticeship experiences, where students participate in a daily routine of patient care comprising data acquisition and the formulation of differential diagnoses and treatment plans. Concurrently, students attend lectures and/or small-group teaching sessions emphasizing understanding of basic concepts and models of disease.9 In relation to adult learning theory, this approach allows students to amass a large body of experiential knowledge to which general meaning is applied.
Arguably, however, this system is inefficient in promoting the highest level of learned knowledge, as reflection and metacognitive analysis occur independently, often without guidance, and only after extended periods of time when students are able to piece together prior isolated experiences.10 Indeed, students only rarely witness several patients exemplifying the full spectrum of any disease process within a concise time period. Furthermore, while didactic sessions attempt to teach disease concepts, such an approach may not affect performance at patients’ bedsides.10,11
We believe that high-fidelity patient simulation offers a unique opportunity to engage students in advanced learning. In this paper, we describe the development of a simulator-based curriculum model designed to enhance comparative and reflective analysis within the internal medicine clerkship.
The authors formed a consensus panel of five clinician educators (including two internal medicine course directors) and identified learning objectives for a simulator-based module. These objectives, based on accepted learning goals for the third-year internal medicine clerkship,12 also represented disease categories that appeared well understood on the basis of preclinical knowledge, yet poorly translated into the clinical care of patients. We applied a general curricular framework to individual disease categories and piloted the program among clerkship students. The instructors (GTM, CM, and EKA) collected observational data as well as information directly from the students. The protocol was reviewed and deemed exempt from continuing oversight by the Institutional Review Board at Harvard Medical School (HMS).
Setting and population
We provided simulator-based teaching sessions to a total of 90 third-year HMS students assigned to complete their core internal medicine clerkship at the Brigham and Women's Hospital in Boston, Massachusetts, over an 18-month period in 2002–03. Sessions were conducted using a high-fidelity patient simulator maintained on the campus of Harvard Medical School (Human Patient Simulator™, Medical Education Technologies, Inc., Sarasota, Florida). This simulator is a computer-controlled, full-scale mannequin that replicates human physiology in real time. It possesses mechanical lungs with physiologic air exchange and auscultatory breath and heart sounds, palpable pulses, a voice transmitter, and reactive pupils. Vital signs are displayed on a bedside monitor, and any (simulated) medication may be administered through an actual intravenous port. An embedded software model allows the simulator to respond physiologically to all interventions and medications. As a result, time-appropriate changes in clinical condition can be seen, heard, and felt by the students, providing an impressively robust experience. Over 80% of the students had prior experience with the patient simulator as part of coursework completed during their first and second years of medical school, although none had participated in a curriculum similar to that described below. Teaching groups comprised three to four students taught by two to three faculty trained in use of the simulator. Each 90-minute session occurred in the latter half of the students’ 12-week internal medicine clerkship, and was additive to the current clerkship curriculum, which was otherwise unaltered.
We used a general model for each 90-minute teaching session built upon review of learning theory and the consensus of the authors. The framework called for a rapid sequence of three closely related case simulations intermixed with two focused teaching sessions, all culminating in a final 15-minute exercise for comparative and contextual analysis (see Figure 1). Students began each session with a basic introduction to the patient simulator, during which they had an opportunity to listen to the simulator's ‘normal’ heart and lung sounds, review the bedside monitor, and interact in this novel environment. The instructors emphasized to the students that each session was not evaluative, and that a lack of expertise was expected. Each student group entered the exercise without advance preparation or knowledge of the goal topics. Only in the last 15 minutes of the session, during instructor-facilitated comparative reflection, was the pathophysiologic concept for each session made clear to the students. At the completion of the 90-minute exercise, all student questions were answered, and a general debriefing ensued.
Protocol Phase I: introduction.
Each 90-minute teaching session began with an initial 15-minute case (Case #1) emphasizing history and physical exam skills and initial evaluation of an ill patient. The session also introduced a common illness that was representative of the pathophysiologic concept and allowed students to become comfortable in this unique learning environment. Case #1 was followed by a ten-minute interactive teaching session (led by one instructor) focused on history taking, critical vital signs, and the basics of drug delivery—learning concepts applicable to any patient or disease concept.
Protocol Phase II: reinforcement.
The second and third 15-minute cases (Case #2 and Case #3, respectively) were designed to demonstrate separate illnesses built upon the same pathophysiologic concept. To do so, we simulated either different clinical presentations of a single illness, or various presentations stemming from a common pathophysiologic mechanism, with each case increasing in complexity. Between Case #2 and Case #3, a similar ten-minute interactive discussion ensued. This discussion was tied to the case experienced immediately prior, and focused on basic diagnostic and management strategies for the chosen disease concept.
Protocol Phase III: reflective and comparative analysis.
At the completion of Case #3, one instructor led the students through a careful 15-minute interactive case review. We designed this discussion to encourage students to compare and contrast the previous three cases, enhancing links across the scenarios, applying the pathophysiology discussed to the cases they experienced, and promoting immediate reflection on the concept. The responsible instructor and students first reviewed all cases, with emphasis on differences and similarities in clinical presentation, physical examination, laboratory findings, and response to treatment. The instructor encouraged the students to formulate hypotheses for the varying presentations and responses witnessed between cases despite their common link to one underlying pathophysiologic concept. For example, the instructor queried the students as to how the same process of myocardial infarction, though occurring in differing locations (left anterior descending artery versus right coronary artery), could lead to differing heart rates, pulmonary examination findings, and response to medications affecting preload and afterload. We purposefully postponed students’ questions concerning specific details of management until students were able to demonstrate reflection and comparison of the pathophysiologic concept. The session ended with a five-minute debriefing during which the students were able to reflect on the total experience and the emotions generated.
We used three questionnaires to assess this teaching process. Each student was asked to complete an entry questionnaire at the beginning of the clerkship, and an exit questionnaire at the end of the simulator session. The entry questionnaire asked about students’ preparation for and prior experience with critical incidents. The exit questionnaire asked about students’ experiences with the simulator, such as perceived utility and desire for further exposure to this method of learning. A follow-up questionnaire invited the last 29 consecutive students to interpret and compare their prior didactic and clinical learning to their experiences with the simulator modules. This student group represented two separate three-month clerkship blocks from July to December of 2003 and was studied to provide quantitative data during the final six months of the study.
Instructors recorded both their own observations of the students’ performances and feedback from the students themselves. Quantitative data are reported as the percentage of respondents agreeing with questionnaire statements. Qualitative data were analyzed according to the method of Miles and Huberman.13
We successfully applied the general model to two pathophysiologic concepts in which the preclinical knowledge seemed well understood by our medical students, but poorly translated into their clinical years: (A) coronary ischemia with accompanying right, left or biventricular failure; and (B) hypoxemia and respiratory failure resulting from airway obstruction, congestive heart failure or pneumothorax (see Table 1). Ninety students attended the simulator session and all completed the entry and exit questionnaires, and provided direct feedback of their experiences.
Students’ responses from the exit questionnaire showed that they found this curriculum beneficial and of high quality (see Table 2). Furthermore, 94% felt that the simulator exercise should become a routine part of the third-year medicine clerkship curriculum. A majority desired three or more simulator sessions included in their 12-week internal medicine clerkship.
In the follow-up questionnaire we asked a convenience sample of the final 29 consecutive students to contrast their prior didactic and clinical experiences with their learning at the simulator. Of the 29 students, 24 (83%) reported receiving didactic teaching sessions focused on the pathophysiologic concepts covered in the simulated cases within the prior six months. However, only four (17%) students reported that such sessions provided comparative analysis, and 20 (91%) found the simulator to be a more valuable learning experience than were the didactic sessions. Eleven respondents (38%) had seen clinical corollaries of all the cases encountered, although these cases were separated by a mean of two months. When asked to rank the utility of the six components of each module (three cases and three discussions), 21 (72%) recognized the centrality of the reflective analysis component by giving this portion (Protocol Phase III) their highest rank of importance.
The instructors noted that, despite prior experience, the majority of the students were hesitant to engage in the first scenario, and often seemed intent upon matching the current case diagnosis with a prior exposure to the simulator. Therefore, the basic first case (Case #1) appeared critical to ensuring students’ comfort and attention. As the ‘patient’ became increasingly unwell in subsequent cases, students’ engagement noticeably increased. Instructors documented that students were, in general, able to extract the history, perform a brief physical examination, and order appropriate tests. In the learning sessions, the students demonstrated knowledge of the core concepts clearly taken from both preclinical and clinical learning, but consistently demonstrated errors in application of knowledge to clinical circumstance. For example, all 32 groups (100%) gave vasodilator medication to a hypotensive patient who was preload dependent, the majority failed to recognize the significance of a rising pCO2 in a patient with asthmatic airway obstruction, and most groups did not intervene when the hypotensive patient consistently had a heart rate of 39 beats per minute. Furthermore, these errors occurred immediately after a didactic experience that addressed the corresponding topics.
Iterative experience with multiple, related scenarios resulted in improved performance of critical tasks such as the administration of oxygen, or request for an electrocardiogram. Instructors observed more rapid application of knowledge, and the use of physiologic principles in decision making, with the group sharing discussions about the relative physiologic benefits and risks of pharmacologic interventions. Students were observed to reflect on their experiences, reviewing the case details and identifying errors and solutions. In the final discussion session, with specific guidance, the students were able to compare and contrast the cases and extract generalizable principles from their experience. Notably, however, the three simulated cases by themselves, even when presented in rapid order, did not independently promote apparent comparison and reflection. The final 15-minute instructor-led session appeared critical to this desired endpoint.
The student comments on the exit questionnaires provided qualitative insight into their appreciation of the simulator experience. Of the 130 comments received, 44 (34%) were related to the teaching style and tutors, 24 (18%) requested more time with the simulator, 18 (14%) were ideas for improvement, nine (7%) brought our attention to technical matters, nine (7%) requested additional written or didactic material, and 26 (20%) were nonspecific, general comments. Representative comments included “It would be optimal to have a weekly session because it ties together the classroom and the ward in an efficient manner” and “Handouts outlining each concept would be helpful at the end.” The students felt this curriculum was well suited for inclusion in this internal medicine clerkship. As one student commented, “...it actually meets my needs; i.e., now I need to know how to manage patients.” Learning through action was apparent, as a student noted, “being put on the spot in a safe context is a great way to learn” and another thought the program offered a “very supportive environment – tolerant of mistakes and therefore conducive to learning.” Students appreciated the style of teaching, one noting it was “excellent, interrelated, concise, and practical explanations of diagnosis, treatment and intervention,” and another, a “good discussion of how physiologic situations translate into decision making.”
In clinical medical education, learning is traditionally promoted via experiential, ‘hands-on’ learning intermixed with didactic teaching sessions. In the United States, preclinical curricula have been modified over the last two decades, moving away from didactic content and increasingly focusing on the facilitation of problem solving, group learning, and reflective comparative processing.14 Adult education models support this approach,15 and use of these models has resulted in higher levels of understanding and retention among students.16 It has been much more difficult, however, to modify clinical curricula to incorporate these principles in a meaningful way. Patient simulation may provide one such way to promote this goal.17
Here we have described the successful introduction of a novel simulation-based educational curriculum into the third-year internal medicine clerkship. This general 90-minute model for sequential, concept-focused case learning appears able to promote reflective analysis independent of the pathophysiologic concept applied to it, demonstrating its plasticity and robustness. In our study, we successfully applied this model to the concept of coronary ischemia, as well as to hypoxia. Based on our initial experience, a wide range of other concepts could presumably be applied just as effectively, depending on the goals and unmet educational needs of each group of educators or medical students. Trained faculty were necessary to facilitate the comparative discussion among students. We found that a group size of three students was optimal for the process.
A minority of students had experienced more than one of the subtypes of ischemic heart disease or hypoxemia, and students with broader experience reported a two-month mean time-interval between these exposures. Our observations support the conclusion that learning is optimized when comparative analysis can occur within a short time span, avoiding the temporal disconnect that typically interferes with this process in the traditional model of clinical education. Reflective analysis was critical to the success of the program and identified as such by both students and instructors.
Our simulator sessions were resource intensive, although the instructors assigned to the project were core members of the clerkship faculty. Sessions necessitated a dedicated classroom and at least two trained operators during each simulation. We acknowledge that the financial cost (as well as time cost) of simulation mannequins could limit the generalizability of this model to other institutions. However, data suggest rapid growth of simulator centers worldwide,4,6,12,18 as well as frequent shared interdepartmental use (and thus shared cost) of a common simulator at many institutions. We reason that a curriculum such as this could thus be added to many internal medicine clerkships with only modest financial strain, although further cost-benefit analysis would prove useful.
Our study was intended to develop and pilot a framework for simulator-based medical education in an internal medicine clerkship. While we report pilot data and observations here, our data neither compare alternative learning methods, nor address transference of performance outside of the study environment. Based on our observations, however, this approach appears promising. Further investigations will need to employ validated outcome measures for formal assessment of the simulator environment compared to other teaching methods.5 While we did not employ independent observers to confirm the assessment of study instructors, there was reasonable consensus among the three faculty, which seemed consistent with anonymously collected student data.
In conclusion, curriculum design committees in medical education have been challenged by their desire to increase integrated, reflective learning in the clinical years while still preserving the benefits of our current educational model. Through attention to the learning method and promotion of comparative analysis, our study demonstrates that medical simulation facilitates clinically applicable, higher-level learning in an internal medicine clerkship.
Dr. McMahon is supported by the National Institutes of Health K30 Grant # HL04095. Dr. Alexander is supported by a Morgan-Zinnser Fellowship Award from the Academy at HMS. Dr. Gordon is supported by a grant from the Josiah Macy, Jr. Foundation to the Harvard-MIT Division of Health Sciences and Technology and the Center for Medical Simulation, Cambridge, MA. Dr. Gordon is a member of the Clinical Education Advisory Board for Medical Education Technologies, Inc., without personal compensation; all proceeds are donated to Harvard-affiliated teaching programs.
1 Gaba DM, Howard SK, Fish KJ, Smith BE, Sowb YA. Simulation-based training in anesthesia crisis resource management (ACRM): a decade of experience. Simulation and Gaming. 2001;32:175–93.
2 Gordon JA, Wilkerson WM, Shaffer DW, Armstrong EG. “Practicing” medicine without risk: students’ and educators’ responses to high-fidelity patient simulation. Acad Med. 2001;76:469–72.
3 Euliano TY, Mahla ME. Problem-based learning in residency education: a novel implementation using a simulator. J Clin Monit Comput. 1999;15:227–32.
4 Morgan PJ, Cleave-Hogg DM. Cost and resource implications of undergraduate simulator-based education. Can J Anaesth. 2001;48:827–8.
5 Gordon JA, Tancredi DN, Binder WD, Wilkerson WM, Shaffer DW. Assessment of a clinical performance evaluation tool for use in a simulator-based testing environment: a pilot study. Acad Med. 2003;78:S45–7.
6 Gordon JA, Oriol NE, Cooper JB. Bringing good teaching cases “to life”: a simulator-based medical education service. Acad Med. 2004;79.
7 Donovan MS, Bransford JD, Pellegrino JW (eds). How People Learn: Bridging Research and Practice. Washington DC: National Academy Press, 2000.
8 Bleakley A, Farrow R, Gould D, Marshall R. Making sense of clinical reasoning: judgement and the evidence of the senses. Med Educ. 2003;37:544–52.
9 Gordon J. Fostering students’ personal and professional development in medicine: a new framework for PPD. Med Educ 2003;37:341–9.
10 Davis D, O'Brien MA, Freemantle N, Wolf FM, Mazmanian P, Taylor-Vaisey A. Impact of formal continuing medical education: do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? JAMA. 1999;282:867–74.
11 Kramer AW, Jansen JJ, Zuithoff P, et al. Predictive validity of a written knowledge test of skills for an OSCE in postgraduate training for general practice. Med Educ. 2002;36:812–9.
12 Simulation centers established worldwide. Bristol, United Kingdom: Bristol Medical Simulation Center, 2004 〈http://www.bris.ac.uk/Depts/BMSC/〉. Accessed 4 October 2004.
13 Miles M, Huberman A. Qualitative Data Analysis. Thousand Oaks, CA: Sage Publications, 1994.
14 Hall KH. Reviewing intuitive decision-making and uncertainty: the implications for medical education. Med Educ. 2002;36:216–24.
15 Armstrong EG. A hybrid model of problem based learning. In: Boud D (ed). The Challenge of Problem Based Learning. London: Kogan Page, 1997.
16 Antepohl W, Herzig S. Problem-based learning versus lecture-based learning in a course of basic pharmacology: a controlled, randomized study. Med Educ. 1999;33:106–13.
17 Gordon JA, Medical Readiness Trainer Team. The human patient simulator: acceptance and efficacy as a teaching tool for students. Acad Med. 2000;75:522.
18 Ziv A, Wolpe PR, Small SD, Glick S. Simulation-based medical education: an ethical imperative. Acad Med. 2003;78:783–8.