What We Already Know about This Topic
* Simulation has previously been shown to enhance development of cognitive skills in transesophageal echocardiography
What This Article Tells Us That Is New
* This study showed that mannequin-based transesophageal echocardiography simulation training was superior to conventional didactic training for the acquisition of good-quality echocardiographic images in patients by residents in training
The clinical impact of perioperative transesophageal echocardiography (TEE) is well documented.1
The American Society of Anesthesiologists and the Society of Cardiovascular Anesthesiologists emphasize TEE training at all levels; however, translation of this training in residency programs is highly variable. A multitude of barriers continue to limit TEE education, including time constraints in the operating room, a limited number of appropriate cases, and a lack of experienced teaching faculty. The art of echocardiographic image acquisition requires a complex interaction of knowledge in physics, anatomy, and physiology, along with repetitive supervised hands-on experience. Typically, TEE skill acquisition occurs in real time in the operating room while caring for a patient. When used as an educational adjunct, standardized TEE simulation training in a virtual reality environment may lead to a more rapid development of psychomotor skills. Further, simulation training in TEE before patient exposure may enhance patient safety or comfort.2
Studies have shown that simulation-based medical education is superior to the conventional didactic system.3
Only a handful of trials address the benefits of simulation-based echocardiography teaching in anesthesia.5
Previous work in TEE simulation has been limited to the demonstration of enhanced cognitive skill acquisition.5
We hypothesized that simulation training would also enhance performance in TEE image acquisition among anesthesia residents. Our study is novel in that it is the first to compare mannequin-based TEE simulation training with conventional training on the practical skill of obtaining echocardiographic images in actual patients.
Materials and Methods
This prospective randomized study was conducted at University of North Carolina Hospitals. Anesthesiology residents in clinical anesthesia (CA) years 1 to 3 (n = 42) were eligible to participate. The residents were sent e-mails describing the specifics of the study and were offered meal tickets as compensation for participation. Institutional Review Board (Chapel Hill, North Carolina) approval was obtained before beginning the study. After informed consent, 42 residents (CA-1, n = 14; CA-2, n = 14; CA-3, n = 14) with varied TEE experience were enrolled. None of the participants had previous exposure to mannequin-based TEE simulation training. Residents from each training class were randomized to one of two groups: a control group, which received traditional didactic training, and a simulator group, whose training used a TEE mannequin simulator (fig. 1). Upon enrollment, each resident was assigned a unique identification number, which was used in randomization procedures and on data collection forms. An independent statistician performed group allocation by computer-generated randomization, stratified by CA year.
All residents took a written TEE baseline knowledge test, which consisted of 30 multiple-choice questions. The test covered concepts such as basic principles of ultrasound, echocardiographic anatomy, clinical correlation, and standard imaging views. Test scores were assessed to verify that there was no significant difference in baseline TEE knowledge between the two groups. Demographic data, previous TEE experience, as well as variables potentially affecting technical skills (past use of videogames, handedness, and type of internship training) were collected via a Web-based questionnaire.
Residents completed individual TEE training sessions according to their assigned group. Both groups received approximately 45 min of didactic training. Teaching sessions were moderated by one of four TEE-certified (National Board of Echocardiography) faculty members. Both groups were instructed on echocardiographic anatomy and techniques for acquiring the 20 basic TEE views.7
The educational sessions in the two groups differed only in the method of instruction. The control group’s traditional didactic session was taught using PowerPoint (Microsoft Corporation, Redmond, WA) and included instruction on probe manipulation for image acquisition with anatomic correlates. These residents were shown video clips of each of the 20 standard views, with a demonstration of the echocardiographic planes on an anatomical heart model (fig. 2). The simulator group’s hands-on didactic session centered on the use of a mannequin-based TEE simulator (Heartworks; Inventive Medical Ltd., London, United Kingdom). Residents in this group, under faculty guidance, acquired each of the 20 standard views on the simulator, including a demonstration of the imaging planes corresponding to each view on the three-dimensional heart model built into the software (fig. 3). For independent study, both groups were given a link to a Web-based TEE simulation system* and a copy of the American Society of Echocardiography/Society of Cardiovascular Anesthesiologists guidelines,7 which review the standard TEE views. The simulator group had the option of using the TEE simulator under faculty guidance in their free time, whereas the control group had the option of reviewing the PowerPoint slides with a faculty member. Participants were asked to track the time they spent on independent study, which included reviewing the provided material as well as practicing on the simulator or reviewing the PowerPoint under faculty supervision, according to their group assignment.
TEE evaluations were conducted within 1 week of didactic instruction, whereby each resident was expected to obtain 10 preselected standard views on an actual patient (table 1). If more than a week had elapsed before the intraoperative patient-based examination, the resident underwent a 25- to 30-min repeat teaching session. This session addressed lapses in short-term recall due to the time gap and was intended as a recapitulation of what had already been taught; it was identical to the initial session in both material covered and method of instruction, and was likewise led by one of the four study instructors, according to scheduling availability.
Each participating resident was assigned to a nonemergent cardiac case in which intraoperative TEE was clinically indicated. Ten commonly used standard views were preselected for this study (table 1). To standardize for interpatient anatomical variability and the technical difficulty of obtaining images, a TEE-certified attending anesthesiologist first obtained the 10 specified views for each patient and ascertained that all 10 were obtainable before inviting the participating resident into the operating room. The resident then attempted to obtain the same 10 views. For patient safety, the residents were supervised by the attending anesthesiologist at all times but received no feedback or assistance while obtaining the views. A maximum of two residents were permitted to perform TEE examinations on a given patient. The images acquired by the resident and the attending were stored for offline analysis. Residents were not given a time limit but were aware that the total time taken to obtain the views was being recorded. All patient, resident, and attending identifiers were removed from the stored images for confidentiality and to blind the subsequent offline grading.
Image Evaluation and Grading
To our knowledge, only one previous study has evaluated TEE simulators as a teaching tool,5 and it focused on the assessment of cognitive skills and anatomical knowledge rather than on the quality of the images obtained. As such, there are no standardized grading scales for assessing the quality of acquired views. We devised a grading system whereby each of the 10 selected echocardiographic views was evaluated on a scale of 0 to 10 according to predetermined criteria: imaging angle, overall clarity, and visualization of three major anatomic structures pertinent to each view (table 1 and fig. 4). Each of the 10 views obtained by the resident and by the faculty anesthesiologist was independently graded by three TEE-certified cardiac anesthesiologists. The evaluators were blinded to the identity of the patient, resident, and faculty member being evaluated. An overall score was calculated for each resident and the corresponding attending anesthesiologist; each view could receive a maximum score of 10 and each study a maximum score of 100. We defined a difference of 1 SD between the mean scores of the two groups as the minimal meaningful impact of the study intervention. Additionally, each view with a score of 8 or greater was deemed acceptable for clinical use.
Statistical Analysis
Baseline characteristics of the residents were compared between the two study groups using a Fisher exact test (for categorical variables) or a t test (for continuous variables). Because each image was graded by three independent experts, the average of the three scores was calculated for each view and for each study. Before the rest of the studies were graded, the interrater reliability of the scoring system was first assessed using the grades from nine randomly selected studies. An intraclass correlation coefficient (ICC), using the absolute-agreement definition for average measures suggested by McGraw and Wong,8 was calculated using PASW Statistics 18 (SPSS Inc., 2009, Chicago, IL):
ICC(A,k) = (MSR − MSE) / [MSR + (MSC − MSE)/n], where MSR = mean square for rows (“subjects”); MSE = mean square error; MSC = mean square for columns (“judges”); n = number of subjects (resident participants); and k = number of judges. A 95% CI for the ICC was calculated using the formula for ICC(A,k) provided by McGraw and Wong8 and implemented in PASW Statistics 18. The ICC was then recalculated using the score data from all participants.
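For readers without access to PASW/SPSS, the ICC(A,k) of McGraw and Wong can be computed directly from the two-way mean squares; a minimal NumPy sketch follows (the function name is ours):

```python
# ICC(A,k): two-way model, absolute agreement, average of k judges
# (McGraw & Wong, 1996). Rows of `scores` are subjects, columns are judges.
import numpy as np

def icc_a_k(scores):
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)                  # per-subject means
    col_means = x.mean(axis=0)                  # per-judge means
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_error = ((x - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                     # mean square for rows
    msc = ss_cols / (k - 1)                     # mean square for columns
    mse = ss_error / ((n - 1) * (k - 1))        # mean square error
    return (msr - mse) / (msr + (msc - mse) / n)
```

With perfectly agreeing judges the coefficient is 1; because this is an absolute-agreement definition, a constant offset between judges lowers the coefficient even when their rankings agree exactly.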
The effect of the intervention was assessed using a linear mixed model implemented in SAS 9.3 (SAS Institute Inc., Cary, NC). To account for the fact that on some patients TEE studies were performed by two residents, the correlations between such measurements within each patient were controlled for by specifying random effects for intercept in mixed models. P values less than 0.05 were considered statistically significant.
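The random-intercept idea can be sketched as follows. This is an illustration only: the data are synthetic, the column names ("score", "group", "patient") are our inventions, and statsmodels stands in for the SAS mixed-model procedure the study used.

```python
# Illustrative sketch: a random intercept per patient accounts for the
# correlation between two residents examining the same patient.
# Synthetic data; group means (83 vs 67) borrowed from the study's results.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for pid in range(30):                        # 30 hypothetical patients
    patient_effect = rng.normal(0, 5)        # shared random intercept
    for grp in ("control", "simulator"):     # two residents per patient
        base = 83.0 if grp == "simulator" else 67.0
        rows.append({"patient": pid, "group": grp,
                     "score": base + patient_effect + rng.normal(0, 8)})
df = pd.DataFrame(rows)

# Linear mixed model: fixed effect of group, random intercept per patient
model = smf.mixedlm("score ~ group", df, groups=df["patient"]).fit()
print(model.params["group[T.simulator]"])    # estimated simulator effect
```

The random intercept absorbs patient-to-patient variation (e.g., examination difficulty), so the fixed-effect estimate isolates the training-group difference.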
A power calculation was performed assuming a normal distribution of the image quality scores in each study group and an α = 0.05, for mean differences between groups = 0.75, 1.0, 1.5, and 2.0 SDs. A total of 42 subjects (21 in each group) provides sufficient power (0.89) to detect a mean difference of 1.0 SD between the study groups. A mean difference of 1.0 SD between the image quality scores is deemed to be a minimal meaningful effect in this educational intervention. Therefore, the study was sufficiently powered to detect the difference in the mean image scores.
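This power calculation can be reproduced from the noncentral t distribution; a minimal SciPy sketch (the function name is ours):

```python
# Sketch of the reported power calculation: two-sided two-sample t test,
# equal group sizes, alpha = 0.05, standardized mean difference d in SDs.
from scipy.stats import nct, t as t_dist

def two_sample_power(d, n_per_group, alpha=0.05):
    df = 2 * n_per_group - 2                    # degrees of freedom
    ncp = d * (n_per_group / 2) ** 0.5          # noncentrality parameter
    t_crit = t_dist.ppf(1 - alpha / 2, df)      # two-sided critical value
    # Power = P(|T'| > t_crit) under the noncentral t distribution
    return 1 - nct.cdf(t_crit, df, ncp) + nct.cdf(-t_crit, df, ncp)

power = two_sample_power(1.0, 21)               # close to the reported 0.89
```

With n = 21 per group and d = 1.0 SD, this yields a power close to the 0.89 reported in the text.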
Results
A total of 42 residents (14 from each training year) were approached for inclusion in this study. All 42 agreed to participate and were randomized into the simulator or control group, with an equal number in each group. There was no difference between the simulator and control groups with regard to age, sex, or training year (table 3). No significant differences between the study groups were found in baseline TEE knowledge scores, in-training examination percentile scores, optional self-study time, or the number of didactic sessions before testing (table 3). The control group contained more residents with previous TEE experience. The average time from the residents’ last didactic session to their examination differed between the groups: 5 (range, 0 to 14) days in the simulator group versus 2 (range, 0 to 8) days in the control group (t test).
Before scoring all the obtained views, the interrater reliability of the grading scale was assessed by using nine randomly selected studies. There was a significant correlation between the scores for the TEE studies evaluated by the three independent blinded graders, with pairwise Pearson correlations greater than 0.95. Calculation of the ICC (ICC = 0.98; 95% CI, 0.89 to 0.99) using all collected data demonstrated an excellent reliability of the grading scale.
Results from patient-based testing are summarized in table 4 and figure 5. Residents in the simulation group obtained significantly higher-quality images, with a mean total image quality score of 83 (95% CI, 74 to 92) versus 67 (95% CI, 58 to 76) in the control group (P = 0.016). A breakdown of image quality by view showed that the simulator group obtained a higher score for every view except the midesophageal four-chamber view. On average, 71% (95% CI, 58 to 85) of the images acquired by each resident in the simulator group were acceptable for clinical use compared with 48% (95% CI, 35 to 62) in the control group (P = 0.021) (table 4 and fig. 5). No difference was demonstrated between groups with respect to the time required to complete the examination (10 min [95% CI, 7 to 13] in both groups).
To control for patient-specific variables, such as examination difficulty, we also examined adjusted scores, obtained by subtracting resident scores from the attending scores for each view on the same patient, in both the simulation and the control group (table 5). The average adjusted score was significantly lower for the simulation group (mean, 11 units) than for the control group (mean, 26 units), a 15-point difference (P = 0.027). This was similar to the 16-point overall score difference between the groups when attending scores were not taken into account; incorporating the attending score into our analyses therefore had no effect on the results. In other words, the simulation group outperformed the control group to the same extent with either method of analysis. It is also notable that image quality scores were substantially higher (paired t test, P < 0.001) among attendings (mean ± SD, 94 ± 4; range, 83 to 99) than among residents (75 ± 20; range, 21 to 99).
Multiple demographic variables were assessed for association with examination performance, as summarized in figure 5 and tables 6 and 7. A breakdown of TEE examination performance by training level and previous TEE experience showed that the score difference between groups was greatest for the CA-1 residents and for those with no previous TEE training; these differences were statistically significant (table 7). The percentage of acceptable images per study showed the greatest difference between groups for those with no previous TEE experience (fig. 5 and table 7); when assessed by training level, the CA-2 residents demonstrated the greatest between-group difference in percentage of acceptable images. Table 6 illustrates that image quality increased steadily with training level (mean image score: CA-1, 64; CA-2, 76; CA-3, 88; P = 0.009) and with previous TEE experience (mean image score: 0 weeks of TEE experience, 67; 1 week, 71; 2 weeks, 87; 4 weeks, 95; P = 0.014). Other variables analyzed, including video game use, handedness, sex, and internship type, did not demonstrate a statistically significant association with residents’ image quality scores. The Pearson correlation coefficient demonstrated a positive association between written pretest score and image quality (r = 0.48; P = 0.001) and a nonsignificant negative association between average time per patient-based examination and image quality (r = −0.25).
Discussion
The results of this study show that a 45-min hands-on didactic session using a mannequin-based TEE simulator substantially improved residents’ ability to obtain 10 standard TEE views, with better image quality and anatomic accuracy than their traditionally trained counterparts. The simulation-trained group outperformed their peers on all but 1 of the 10 views tested. The most marked benefit of simulation training was seen in obtaining the midesophageal right ventricular inflow–outflow, midesophageal bicaval, and deep transgastric views, for which image quality increased by 2.1 to 3 points on the 0 to 10 scale. This improvement in quality occurred despite no difference in the time required for image acquisition. Our results affirm that simulation-based education in TEE enhances the acquisition of the technical skills associated with this procedure.
Few studies have investigated the usefulness of mannequin-based echocardiographic simulators in residency training, and none to our knowledge has assessed the transfer of skills from TEE or transthoracic echocardiography (TTE) mannequin-based simulation training to actual patients in the operating room. A recent study by Neelankavil et al.6 showed that mannequin-based simulators are more effective than traditional methods for the cognitive and technical components of TTE training, as assessed by written tests and image acquisition on human volunteers: the simulation-trained residents performed better at TTE image acquisition, overall TTE efficiency, and identification of echocardiographic anatomy than their traditionally trained counterparts. Bose et al.5 published the only investigation assessing the utility of mannequin-based TEE training; in that study, the simulator-trained group performed substantially better than their counterparts in the cognitive aspects of TEE when evaluated by written testing, but technical skill performance in the clinical setting was not assessed.
To our knowledge, ours is the first prospective randomized study comparing mannequin-based TEE simulation training with traditional TEE teaching methods as assessed by intraoperative performance of residents on actual patients. Our data indicate that simulator-based TEE training is more effective than conventional methods. The simulation group outperformed the control group with regard to both metrics assessed: image quality and percentage of acceptable images acquired. A short 45-min training session resulted in direct and measurable benefits. This finding supports the adoption of mannequin-based TEE simulation training into residency education. Although not statistically significant, simulation also appeared to have a positive effect on resident initiative for self-study.
Our data showed that image quality correlated positively with TEE pretest scores, previous TEE experience, and clinical anesthesia training year. When analyzed by training level, the simulation-trained CA-1 residents demonstrated the largest difference in overall image score compared with controls. Similarly, when analyzed by previous TEE experience, the simulation-trained residents with no previous TEE training demonstrated the most substantial increase in percentage of acceptable images acquired compared with controls. This suggests that simulation training may have the greatest impact when implemented early in the learning process. Currently, interns and CA-1 residents in most anesthesiology programs have minimal to no TEE education built into their curriculum. On the basis of our results, we postulate that adding TEE simulation early in residency training may help achieve TEE proficiency in an expedited fashion. With this rapid acquisition of basic skills through early simulation training, “near perfect” TEE image acquisition may occur sooner in this group of residents. Having developed basic psychomotor skills before patient contact, learners could invest their limited time in refining imaging techniques to accommodate the subtleties of anatomical variability among patients. It may also allow a greater focus on patient safety rather than on learning the basics of image acquisition.
Basic competency in TEE is now a central and expected component of contemporary anesthesiology training.9 However, resident work-hour restrictions and the variable intraoperative availability of appropriate cases limit training opportunities during residency. Although skills in basic TEE can be acquired in some programs with suitable training and clinical opportunity, expertise in perioperative TEE is typically possible only with fellowship training. Similar constraints have affected other fields, including surgery, pediatrics, interventional radiology, cardiology, obstetrics and gynecology, orthopedics, and emergency medicine.11
This has led the medical community to increasingly embrace simulation-based training as a means to circumvent logistical dilemmas, to standardize training, and to improve both resident education and patient safety. Moreover, many argue that trainees must obtain a higher level of technical skill before performing a procedure on a patient.12
Simulation training aids the development of technical skills so that, when confronted with the procedure in the clinical setting, the trainee can concentrate more on the cognitive aspects of the task at hand.13
A recent study by Sturm et al.14 demonstrated that procedural skills gained through simulation training in laparoscopic cholecystectomy and colonoscopy correlated with significant improvements in patient safety and operating-room efficiency.
TEE is not without risk, as the procedure may cause serious and, rarely, fatal complications.7 Proficiency in TEE is achieved through repetition of its psychomotor and technical skills. This was indeed observed in our study, in which simulator-trained residents, who had the opportunity to practice on a lifelike model before testing, outperformed their peers in the technical aspects. Whether the addition of simulation to TEE training translates into reduced complication rates and better patient outcomes remains to be evaluated.
A potential limitation of this study was the difference in time between initial training sessions and evaluations, which averaged 3 days longer for the simulation group than for the controls. The longer gap between the simulation group’s training and testing may have negatively affected the residents’ short-term recall and overall scores, although it may also have provided them additional time to study and practice. Similarly, residents with substantial lags (>1 week) between their initial training session and the patient-based evaluation received a second didactic session. We assumed this discrepancy would be evenly dispersed between the two groups; however, 23% more residents in the control group than in the simulation group completed an additional didactic session, which could have given the control group an unfair advantage. Further, randomization was stratified by training level to control for variation in TEE experience. Despite this, the control group contained 29% more residents with 4 weeks of TEE training and 10% fewer residents with no TEE experience, and its average in-training examination score was 11 points higher. These factors likely decreased the magnitude of the positive impact that simulation training had on resident performance.
Standardized patients were not used for this study, as this would have posed logistical and ethical challenges. Accordingly, we controlled for patient variability by using the adjusted score for each graded component, whereby the attending anesthesiologist’s score was incorporated into our analysis. However, as explained in Results, the results were similar when analyzed with or without the adjusted scores, indicating that patients with technically difficult examinations were likely evenly distributed across both groups. Also adding to the validity of this study is the strong correlation of image quality score with CA level and with previous TEE experience (table 6). Because of the paucity of investigational reports and standardized grading methods,15 we devised our own grading system to assess image quality and evaluated its interrater reliability using nine randomly selected studies. It is possible that a more robust grading system could further enhance scoring consistency.
Although it is impossible to completely eliminate subtle differences in teaching styles among individual instructors, the “human factor” being an integral part of the teaching process, we tried to minimize them by selecting instructors with similar teaching skills. Each of the four instructors was fellowship-trained in cardiac anesthesiology, certified in Advanced Perioperative TEE by the National Board of Echocardiography, and a past recipient of at least one teaching award. Two of the four instructors had more than 10 yr of teaching experience; the other two had 3 to 5 yr. As is typical in a busy residency program, instructor assignments were allotted according to daily scheduling availability. Of a total of 51 teaching sessions (nine residents had two sessions), the control group had 14 sessions with senior instructors and 14 with junior instructors, whereas the simulator group had 11 sessions with senior instructors and 12 with junior instructors (chi-square test; P = 1.0). Instructors were therefore distributed evenly between the two groups according to teaching experience. It would be difficult to quantify whether some instructors were more engaged with one method of teaching than the other. However, at the time of this study our department had only recently acquired the simulator, so all four instructors were in the early stages of developing teaching styles for simulator training but had well-established styles for traditional PowerPoint teaching. Thus, any bias was likely in favor of the traditional teaching method, with which each instructor had been comfortable and familiar for years.
In conclusion, our results suggest that simulation-based training in TEE enhances the acquisition of technical skills, with the greatest impact when implemented early in anesthesia training. Additional aspects of mannequin-based TEE simulator education could be analyzed, such as its efficacy in teaching cardiac anatomy and its utility in diagnosing uncommon pathologic states. Our study design could be modified to assess long-term retention of TEE skills by repeating the written and practical examinations. Further standardization could be achieved by requiring all residents to spend an equal amount of time on independent study. It remains to be determined whether increasing the duration or number of didactic sessions would magnify the effect of simulation training on technical TEE skills. Additional studies are warranted to explore whether the use of TEE simulation in residency training translates into clinical benefits and enhanced patient safety. Future studies could also determine the average number and duration of TEE simulation sessions a novice needs to become proficient. This information could inform a curriculum that helps junior residents obtain basic imaging skills before their first contact with a real patient, hence expediting their learning curve, and could help determine how many simulation hours may be needed to train or certify anesthesiologists interested in acquiring this skill set.
* “TEE Views Module,” Toronto General Hospital Department of Anesthesia. Available at: http://pie.med.utoronto.ca/TEE/. Accessed September 4, 2013.
References
1. Guarracino F, Baldassarri R. Transesophageal echocardiography in the OR and ICU. Minerva Anestesiol. 2009;75:518–29
2. Graber MA, Wyatt C, Kasparek L, Xu Y. Does simulator training for medical students change patient opinions and attitudes toward medical student procedures in the emergency department? Acad Emerg Med. 2005;12:635–9
3. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011;86:706–11
4. Frost DW, Cavalcanti RB, Toubassi D. Instruction using a high-fidelity cardiopulmonary simulator improves examination skills and resource allocation in family medicine trainees. Simul Healthc. 2011;6:278–83
5. Bose RR, Matyal R, Warraich HJ, Summers J, Subramaniam B, Mitchell J, Panzica PJ, Shahul S, Mahmood F. Utility of a transesophageal echocardiographic simulator as a teaching tool. J Cardiothorac Vasc Anesth. 2011;25:212–5
6. Neelankavil J, Howard-Quijano K, Hsieh TC, Ramsingh D, Scovotti JC, Chua JH, Ho JK, Mahajan A. Transthoracic echocardiography simulation is an efficient method to train anesthesiologists in basic transthoracic echocardiography skills. Anesth Analg. 2012;115:1042–51
7. Shanewise JS, Cheung AT, Aronson S, Stewart WJ, Weiss RL, Mark JB, Savage RM, Sears-Rogan P, Mathew JP, Quiñones MA, Cahalan MK, Savino JS. ASE/SCA guidelines for performing a comprehensive intraoperative multiplane transesophageal echocardiography examination: Recommendations of the American Society of Echocardiography Council for Intraoperative Echocardiography and the Society of Cardiovascular Anesthesiologists Task Force for Certification in Perioperative Transesophageal Echocardiography. Anesth Analg. 1999;89:870–84
8. McGraw KO, Wong SP. Forming inferences about some intraclass correlation coefficients. Psychol Methods. 1996;1:30–46
9. Mahmood F, Christie A, Matyal R. Transesophageal echocardiography and noncardiac surgery. Semin Cardiothorac Vasc Anesth. 2008;12:265–89
10. Schulmeyer MC, Santelices E, Vega R, Schmied S. Impact of intraoperative transesophageal echocardiography during noncardiac surgery. J Cardiothorac Vasc Anesth. 2006;20:768–71
11. Matyal R, Bose R, Warraich H, Shahul S, Ratcliff S, Panzica P, Mahmood F. Transthoracic echocardiographic simulator: Normal and the abnormal. J Cardiothorac Vasc Anesth. 2011;25:177–81
12. Niazi AU, Haldipur N, Prasad AG, Chan VW. Ultrasound-guided regional anesthesia performance in the early learning period: Effect of simulation training. Reg Anesth Pain Med. 2012;37:51–4
13. Castanelli DJ. The rise of simulation in technical skills teaching and the implications for training novices in anaesthesia. Anaesth Intensive Care. 2009;37:903–10
14. Sturm LP, Windsor JA, Cosman PH, Cregan P, Hewett PJ, Maddern GJ. A systematic review of skills transfer after surgical simulation training. Ann Surg. 2008;248:166–79
15. Frederiksen CA, Juhl-Olsen P, Nielsen DG, Eika B, Sloth E. Limited intervention improves technical skill in focus assessed transthoracic echocardiography among novice examiners. BMC Med Educ. 2012;12:65
© 2014 American Society of Anesthesiologists, Inc.