
Review Article

Using Simulation to Teach Echocardiography

A Systematic Review

Rambarat, Cecil A. MD; Merritt, Justin M. MD; Norton, Hannah F. MSIS; Black, Erik PhD; Winchester, David E. MD, MS

doi: 10.1097/SIH.0000000000000351

INTRODUCTION

Echocardiography (echo) has been an essential diagnostic imaging tool for decades. Once limited to cardiologists, echo is now used in several medical specialties including anesthesia, critical care medicine, emergency medicine, and surgery.1–3 In cardiology and other fields, echo has traditionally been learned through an apprenticeship model reinforced with didactic learning.

High-fidelity simulation is a teaching method that offers distinct advantages over learning in a clinical environment: learners have the opportunity to fail without risking patient harm, and teachers have greater control over the learning environment. In a survey of medical educators, the use of simulation for graduate medical education varied by specialty between 3% and 85%.4 With the development of high-fidelity echo simulators, the opportunity now exists to teach echo via simulation.

Although several publications describe the use of high-fidelity simulation to teach echo, no systematic review of this literature has been performed. We sought to appraise the current literature on the use of simulation to teach echo across all specialties.

METHODS

We performed a systematic review of the medical literature for reports of using high-fidelity simulation to teach echo (Fig. 1). Databases searched included PubMed/MEDLINE, Web of Science, Cochrane CENTRAL Register of Controlled Trials, and ERIC. In each database, we searched on variations of terms related to echo, simulation, and education. Searches in PubMed, Cochrane CENTRAL, and ERIC included medical subject headings and ERIC subject headings. No language or date restrictions were placed on the search results. We sought to maximize the number of eligible articles by including all medical specialties (cardiology, anesthesiology, etc.) as well as all types of echo teaching (transthoracic and transesophageal). Meeting abstracts were considered eligible if no subsequent peer-reviewed article of the data was found. Full search strategies are available in the Supplementary Digital Content (see document, Supplementary Digital Content 1, http://links.lww.com/SIH/A397; search strategies). Exclusion criteria were the following: articles that did not pertain to using simulation to teach echo, articles that were not research (eg, editorials or review articles), or articles that did not perform a statistical comparison to demonstrate whether simulation to teach echo was effective (eg, curriculum design or concept papers).
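
As an illustration only, the kind of database search described above can be scripted; the Boolean string and the use of Biopython's Entrez interface below are hypothetical assumptions for demonstration and are not the review's actual search strategy, which is provided in the Supplementary Digital Content.

```python
# Hypothetical sketch of a scripted PubMed search combining echocardiography,
# simulation, and education terms. This is NOT the review's actual strategy;
# the full strategies are in the Supplementary Digital Content.
from Bio import Entrez  # Biopython

Entrez.email = "reviewer@example.org"  # placeholder address required by NCBI

query = (
    '("Echocardiography"[MeSH Terms] OR echocardiograph*[Title/Abstract]) '
    'AND ("Simulation Training"[MeSH Terms] OR simulat*[Title/Abstract]) '
    'AND (educat*[Title/Abstract] OR teach*[Title/Abstract])'
)

handle = Entrez.esearch(db="pubmed", term=query, retmax=200)
record = Entrez.read(handle)
handle.close()

print(f"{record['Count']} records matched; first IDs: {record['IdList'][:5]}")
```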

FIGURE 1: A flow diagram of the articles included in the systematic review.

After the initial search, duplicates were removed and 67 studies were selected by authors C.A.R. and J.M.M. for abstract review; at each screening stage, a study was retained if either author suggested retention. After full-text review, any disagreements about inclusion were resolved by author D.E.W.

The following data were extracted from each study: publication type (article or abstract), study methodology (pre-post comparison, randomized trial, or cohort study), and number of learners (defined as the number of participants who were exposed to simulation as part of the study). Learners were further classified by educational level (undergraduate, graduate, or continuing medical education [CME]) and specialty (anesthesiology, cardiology, or other). We evaluated whether each study focused on transthoracic (TTE) or transesophageal (TEE) echocardiography, used computer or hands-on simulation, and reported qualitative or quantitative outcomes. We established whether the studies were conducted in the United States or internationally, whether they were funded (internally, by nonprofit sources, or by industry), and whether educational theory was applied in study/curriculum design. We determined the Best Evidence in Medical Education (BEME) and Kirkpatrick's levels for the outcomes associated with each study (Table 1).6
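
To picture the coding scheme concretely, the sketch below recasts the extracted fields as a simple record type; the field names and category labels are illustrative assumptions drawn from the list above, not the authors' actual data-collection instrument.

```python
# Illustrative record type mirroring the data extracted per study; field names
# and category values are assumptions based on the variables listed above,
# not the authors' actual extraction form.
from dataclasses import dataclass
from typing import Literal

@dataclass
class StudyRecord:
    publication_type: Literal["article", "abstract"]
    methodology: Literal["pre-post comparison", "randomized trial", "cohort"]
    n_learners: int                      # participants exposed to simulation
    learner_level: Literal["undergraduate", "graduate", "CME"]
    specialty: Literal["anesthesiology", "cardiology", "other"]
    echo_mode: Literal["TTE", "TEE", "both"]
    simulation_type: Literal["computer", "hands-on"]
    outcome_type: Literal["qualitative", "quantitative"]
    setting: Literal["US", "international", "mixed"]
    funding: Literal["none", "internal", "nonprofit", "industry"]
    used_educational_theory: bool
    beme_rating: int                     # BEME strength-of-findings rating
    kirkpatrick_level: str               # e.g., "2a", "2b", "3", "4a"
    positive: bool                       # effective/superior to comparator
```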

TABLE 1: Data Regarding Learners and Content of the Studies Included in the Final Review (n = 24)

Because of the variation in measured outcomes, a formal meta-analysis of results was not possible. We classified each study as “positive” if it demonstrated that simulation was effective or superior to any comparator and “negative” if it did not demonstrate either. A κ statistic was calculated to assess agreement on study inclusion. Intraclass correlation was calculated for agreement on Kirkpatrick levels as assessed by authors D.E.W. and E.B.; disagreements were resolved by H.F.N.7 We followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses statement in conducting the investigation.8
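
As a toy illustration of the agreement statistics described above (not the authors' code), Cohen's κ can summarize two reviewers' inclusion decisions and an intraclass correlation can summarize agreement on Kirkpatrick ratings; the data and the choice of scikit-learn, pandas, and pingouin below are assumptions for demonstration only.

```python
# Toy demonstration of the two agreement statistics described above; the
# ratings are invented, and scikit-learn/pandas/pingouin are one possible
# tool choice, not the software used in the review.
import pandas as pd
import pingouin as pg
from sklearn.metrics import cohen_kappa_score

# Inclusion decisions (1 = include, 0 = exclude) by two hypothetical reviewers
reviewer_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
reviewer_b = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]
print(f"Cohen's kappa for inclusion: {cohen_kappa_score(reviewer_a, reviewer_b):.2f}")

# Kirkpatrick levels coded numerically (2a=2.0, 2b=2.5, 3=3.0, 4a=4.0) by two
# hypothetical raters, arranged in long format for pingouin
ratings = pd.DataFrame({
    "study": list(range(6)) * 2,
    "rater": ["A"] * 6 + ["B"] * 6,
    "level": [2.0, 2.5, 3.0, 3.0, 2.5, 4.0,
              2.0, 2.5, 3.0, 2.5, 2.5, 4.0],
})
icc = pg.intraclass_corr(data=ratings, targets="study",
                         raters="rater", ratings="level")
print(icc[["Type", "ICC", "CI95%"]])
```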

RESULTS

After individual review and resolution of differences, 24 studies were included.1–3,5,9–28 The agreement between reviewers was excellent (κ = 0.94, 95% confidence interval = 0.70–1.18, P < 0.0001). Despite no language restriction, all articles were written in English and all were available for full-text review.

Learners and Content

A total of 503 learners were exposed to simulation-based teaching of echo. Studies taught TTE (n = 9), TEE (n = 14), or both TTE and TEE (n = 1). There was broad representation of medical subspecialties, including cardiology (n = 3), anesthesiology (n = 12), and emergency medicine and pulmonary critical care (grouped as other, n = 9). Teaching at the graduate medical education level (n = 20) predominated, with few studies of undergraduate medical education (n = 1) or continuing medical education (n = 2). Most studies were conducted in the United States (n = 14), with the remainder being international or a combination of international and domestic (n = 10) (Table 2).

TABLE 2: Data Regarding Rigor and Quality of Studies Included in the Final Review (n = 24)

Rigor and Quality of Studies

Randomized controlled trials (RCTs) (n = 10) and pre-post comparisons (n = 10) comprised most of the study designs, whereas cohort designs were the minority (n = 4). When analyzed using the BEME appraisal system,6 11 studies received a rating of 3 and 8 received a rating of 4. The remaining studies received a rating of 1 (n = 3) or 2 (n = 2). Most studies (n = 17) had no noted source of funding; six received funding from nonprofit sources, and only one received industry funding. Educational theory was rarely applied (n = 1) (Table 3).

TABLE 3: Definitions of Kirkpatrick Levels and Best Evidence in Medical Education Appraisal Scales6

Outcomes

Of the 24 included studies, 22 (91.7%) were “positive.” One of the “negative” studies was a published abstract of pilot data from a randomized trial with a sample size of 5.14 The other was a randomized trial that exposed surgical residents to TEE training via simulation versus teaching in the operating room and found both strategies effective for improving performance on a knowledge assessment.25 Most studies (n = 13, 54.1%) were Kirkpatrick level 2a or 2b (modification of perception/attitude/knowledge/skills) (Fig. 2). A substantial portion (n = 10, 41.7%) were Kirkpatrick level 3 (behavioral change), and the remainder were level 4a (n = 1, change in organizational practice); none were level 1 (participation). The intraclass correlation for Kirkpatrick levels was 0.83 (95% confidence interval = 0.65–0.92, P < 0.01).

FIGURE 2: A pie chart depicting the distribution of Kirkpatrick levels among the final articles. Kirkpatrick level 2a: Modification of attitudes/perceptions: outcomes relate to changes in the reciprocal attitudes or perceptions between participant groups toward the intervention/stimulus. Kirkpatrick level 2b: Modification of knowledge/skills: for knowledge, this relates to the acquisition of concepts, procedures, and principles; for skills, this relates to the acquisition of thinking/problem-solving, psychomotor, and social skills. Kirkpatrick level 3: Behavioral change: documents the transfer of learning to the workplace or willingness of learners to apply knowledge and skills. Kirkpatrick level 4a: Change in organizational practice: wider changes in the organization or delivery of care attributable to an educational program.

DISCUSSION

In the present review of teaching echo through high-fidelity simulation, most studies focused on house staff trainees enrolled in anesthesia training programs located in the United States. When analyzed using the BEME grading scale, most of the studies received a favorable grade; however, educational theory was rarely applied in the evaluation of teaching echo with simulation. Measured outcomes ranged from surveys on self-confidence (Kirkpatrick 2a) to improved organization in clinical practice (Kirkpatrick 4a) with most studies demonstrating efficacy in the use of simulation to teach echo.

Despite echo being traditionally associated with cardiology, most studies (n = 11) were performed with anesthesia house staff trainees, whereas only three were performed with cardiology trainees.11,18,20 One explanation is that TTE is more common among cardiologists than TEE, and TTE is associated with lower patient discomfort and risk; cardiologists have therefore been slower to adopt simulation in training. Anesthesiology, by contrast, more commonly uses TEE, which carries higher levels of patient discomfort and risk and requires refinement of skills such as probe placement and image acquisition. In addition, cardiology trainees often have a longer and steeper learning curve when it comes to echo because they require a more in-depth understanding of the complex hemodynamic and physiologic variables associated with image acquisition to make a number of cardiac diagnoses. Anesthesiology has also long cultivated a culture of pioneering the development and use of simulators in medical education.29 The use of simulation centers has not been a focus of training in cardiology, and few curricula incorporate simulation into cardiology training. A further limitation is cost; adopting simulation may be prohibitively expensive for many training programs, including both the direct cost of the simulators (on average greater than US $60,000) and indirect costs (dedicated time for faculty and fellows to participate in simulation-based learning).11,30 Other challenges reported in studies with cardiology fellows included low participant motivation and smaller numbers of available participants because of fellowship class sizes.11,18

Of the three studies that focused on cardiology trainees, two examined the use of high-fidelity simulators to teach TEE and one examined their use to teach TTE. In the TEE studies, measured outcomes included formal assessment of TEE performance, self-assessment of ability and comfort level with TEE, and assessment of manual skills with motion analysis. The study focused on TTE measured motion analysis as an outcome. Each of the studies showed favorable results with the use of high-fidelity simulators to teach echo. None of the studies of cardiology trainees used an educational theory. The highest Kirkpatrick level among studies of cardiology trainees was 3, which translates to acquisition of knowledge and its direct transfer to the workplace through observed behavioral changes.

Of the 12 studies that focused on anesthesia trainees, four examined the use of high-fidelity simulators to teach TTE, whereas most (n = 8) examined their use to teach TEE. This is consistent with the fact that anesthesiologists primarily use TEE intraoperatively to assess cardiac function and that TEE in particular requires the development of fine motor skills. Measured outcomes among anesthesia trainees included written and computer-based tests, satisfaction surveys, kinematic assessments, clinical skills assessments on human models, and image acquisition evaluations. Five studies measured outcomes on live patients in clinical assessments.1,3,17,19,22 Anesthesia trainees exposed to high-fidelity simulators showed improvement in anatomical identification (transversus abdominis, brachial plexus, sciatic nerve), image quality and acquisition time, identification of cardiac structures (parasternal long axis, parasternal right ventricular inflow, parasternal short axis, subcostal view, apical 4-chamber, and apical 2-chamber views), identification of pulmonary structures, vascular access (internal jugular vein, radial artery), and endotracheal tube placement in human subjects.1,3,17,19,22 Of note, the use of high-fidelity simulators in anesthesia training was also found to have a degree of clinical impact through the identification of new clinical diagnoses via point-of-care echo used intraoperatively.22 The main reasons for obtaining echocardiograms included significant medical history, hemodynamic instability, and respiratory failure. Correlation of new diagnoses with clinical outcomes was not measured.

Although accurate image acquisition and quality, knowledge, kinematics, and self-confidence are important competency requirements for TTE and TEE, one major competency requirement that was not assessed by the studies was image interpretation. An assessment of image interpretation would have allowed trainees to integrate echo into the broader framework of patient symptoms and clinical findings. One explanation for the surprising lack of measurement of image interpretation is that most of the subjects exposed to simulation-based echo teaching were trainees still in the process of becoming familiar with image interpretation. Although many studies assessed image acquisition as a primary outcome among anesthesia trainees, there were no similar studies among cardiology trainees. Because high-fidelity simulators have also been developed that can aid with the understanding of image acquisition, the integration of image interpretation into simulation curricula could provide benefit to learners. In addition, trainees should be challenged to interpret images within distinct clinical scenarios to maximize the diagnostic utility of echo.

Most of the studies we have included seem to have been well planned and of high quality in terms of design and training effectiveness. The number conducted as randomized trials is surprising because this study design is often a challenge to implement in educational settings.31 We evaluated the confidence of the findings and the rigor of the outcomes using BEME and Kirkpatrick levels, respectively. Although no studies achieved the highest Kirkpatrick level (change in patient outcomes), it would be uncommon to see a study designed to achieve this degree of impact. The BEME levels underscore our conclusion that simulation is an effective strategy to teach echo. While the preponderance of positive studies suggests that simulation is effective, we should still proceed with caution. We could not conduct a formal meta-analysis and therefore could not assess for study bias using a funnel plot or other standard bias assessment metrics. It is worth noting that previous studies comparing simulation to traditional methods of instruction in other kinesthetically challenging medical domains (eg, laparoscopy, acute airway management) are not as overwhelmingly positive.32,33 Thus, results should be interpreted with caution, because absence of evidence is not evidence of absence.

Our analysis found an absence of discussion related to educational theory in the articles reviewed. Linking theory to applied practice continues to be a challenge in contemporary medical education, although there is a considerable body of literature describing the benefits of theoretical perspectives in the design and delivery of educational interventions.34–37 As medical education (undergraduate, graduate, and continuing) continues its advancement toward competency- and mastery-based instruction and assessment, theory will play an increasingly important role in promoting efficiency. The incorporation of educational theory provides a road map for instructional methodologies that may offer promise in differing domains.

Prominent learning theories associated with simulation in healthcare, particularly at the resident and fellow level, include experiential learning and adult learning theory.38 Experiential learning describes the method through which learners are brought into contact with the realities of the content studied.39 It emphasizes the sensory, emotional, and contextual elements associated with learning activities. Adult learning theory, which can operate in parallel with experiential learning, emphasizes the instructional strategies necessary to motivate adult learners. Four fundamental principles are associated with adult learners:40 the need to be involved in the planning and evaluation of their instruction; experience, including mistakes, as the basis for learning activities; interest in learning topics that have immediate relevance to and impact on their job and personal life; and problem-centered rather than content-oriented learning. As illustrated by Clapper38 and Cheng et al,41 we can operationalize these theories in simulation by creating a safe and collaborative learning environment that draws on learners' previous experiences, promoting in-time reflection, and focusing on learners' abilities to translate new skills and knowledge into clinical practice. Unfortunately, even for the most experienced clinical educator, incorporating these elements into simulation practice may not always come naturally; adoption can take time, effort, and practice.42 For many faculty, role modeling effective simulation instruction remains a significant barrier to advancing learners.43 Although the articles reviewed failed to explicitly describe theory, the learner-centered approaches described within them incorporated concepts of learning theory such as repeated practice and reinforcement, expert assistance to guide learners, contextualized learning environments, and recognition and incorporation of affective components.44

One study used educational theory in evaluating the use of simulation to teach perioperative echo to anesthesia interns. Anesthesia interns exposed to simulation in that study had higher postassessment scores at 6 months and higher kinematic analysis scores at 90 days when compared with anesthesia seniors who were not exposed to simulation.19 Concepts of educational theory applied in this study included a large initial load with multiple modes of reinforcement yielding automaticity. In a separate article by the same author, the concepts of educational theory used in a multimodal approach to basic TEE teaching were described: clear goals and carefully structured objectives, convenient access to graduated longitudinal instruction, a protected and optimal learning environment, repetition of concepts and technical skills, progressive expectations for understanding and skill development, introduction of abnormalities after understanding of normal anatomy and probe manipulation, live learning sessions and individualized proctoring sessions, use of multiple approaches to teaching, regular feedback, and application of performance and compliance measures.45

Looking beyond the published literature on effectiveness, simulation is being incorporated into training and education in a variety of ways. In the most recent version of the Core Cardiovascular Training Statement (COCATS), published by the American College of Cardiology Foundation, simulation is repeatedly mentioned as an option for documenting competency and reporting of milestones.46 Cardiology fellowship and anesthesiology residency programs have incorporated simulation and published on their experiences.47,48 The American Board of Internal Medicine first embraced simulation in 2008 with an option to earn maintenance of certification credits through simulation of interventional cardiology procedures. Despite this, in a recent survey of Accreditation Council for Graduate Medical Education-accredited interventional cardiology fellowships, only 24% reported the use of simulation in teaching trainees.49 A recent survey of practicing cardiologists indicated that 81% would be interested in pursuing simulation-based education if it would fulfill maintenance of certification requirements.50 Simulation-based tutoring on echo has also been made available by private-sector vendors51 (http://www.echosim.com/index.html).

Although high-fidelity simulation seems to be effective, educators face difficult choices about the costs of simulation, faculty and learner time constraints, which topics to teach via simulation, shared resources, and internal development versus outsourcing of education. As we have reported, the current literature is focused on teaching TEE to anesthesia trainees, which does not represent the bulk of the clinical application of echo. As the drive for reduced trainee work hours continues, simulation offers an avenue for increased exposure and experience before encountering live patients.

CONCLUSIONS

The use of high-fidelity simulation to teach echo leads to improved self-confidence, modification of knowledge and skills, and improved organizational practice among a variety of learners in differing clinical settings. Future research in this field should focus on the use of educational theory in simulation-based teaching, the formation of standardized curricula for using simulation to teach echo, and the development of further evidence that simulation is an effective way to teach echo, such as its effect on image interpretation, new diagnoses, and clinical outcomes.

REFERENCES

1. Ferrero N, Bortsov A, Arora H, et al. Simulator training enhances resident performance in transesophageal echocardiography. Anesthesiology 2014;120(1):149–159.
2. Ogilvie E, Vlachou A, Edsell M, et al. Simulation-based teaching versus point-of-care teaching for identification of basic transoesophageal echocardiography views: a prospective randomised study. Anaesthesia 2015;70(3):330–335.
3. Neelankavil J, Howard-Quijano K, Hsieh T, et al. Transthoracic echocardiography simulation is an efficient method to train anesthesiologists in basic transthoracic echocardiography skills. Anesth Analg 2012;115(5):1042–1051.
4. Passiment M, Sacks H, Huang G. Medical simulation in medical education: results of an AAMC survey. Available at: https://www.aamc.org/download/259760/data/medicalsimulationinmedicaleducationanaamcsurvey.pdf. Accessed November 8, 2018.
5. Arntfield R, Pace J, McLeod S, et al. Focused transesophageal echocardiography for emergency physicians-description and results from simulation training of a structured four-view examination. Crit Ultrasound J 2015;7(1):27.
6. Yardley S, Dornan T. Kirkpatrick's levels and education 'evidence'. Med Educ 2012;46(1):97–106.
7. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics 1977;33(1):159–174.
8. Moher D, Liberati A, Tetzlaff J, Altman DG. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med 2009;6(7): e1000097.
9. Bose R, Matyal R, Warraich H, et al. Utility of a transesophageal echocardiographic simulator as a teaching tool. J Cardiothorac Vasc Anesth 2011;25(2):212–215.
10. Carmona P, Londono M, Navarro L, Pena JJ, Marques I. Impact of transesophageal echocardiography simulation-based training on learning basic cardiac structures recognition and navigation between the twenty standard views. Br J Anaesth 2012;108:168.
11. Damp J, Anthony R, Davidson MA, Mendes L. Effects of transesophageal echocardiography simulator training on learning and performance in cardiovascular medicine fellows. J Am Soc Echocardiogr 2013;26(12):1450–1456.e1452.
12. Edrich T, Seethala R, Olenchock B, et al. Providing initial transthoracic echocardiography training for anesthesiologists: simulator training is not inferior to live training. J Cardiothorac Vasc Anesth 2014;28(1):49–53.
13. Jelacic S, Bowdle A, Togashi K, VonHomeyer P. The use of TEE simulation in teaching basic echocardiography skills to senior anesthesiology residents. J Cardiothorac Vasc Anesth 2013;27(4):670–675.
14. Kirby D, Sueda L, Fagley R, et al. Acquisition of basic and rescue transesophageal echocardiography skills and knowledge by novice anesthesiology residents using high fidelity simulation. Anesth Analg 2015;120:S529.
15. Kusunose K, Yamada H, Suzukawa R, et al. Effects of transthoracic echocardiographic simulator training on performance and satisfaction in medical students. J Am Soc Echocardiogr 2016;29(4):375–377.
16. Madhivathanan PR, Smith A, Jain S, Walker D. Simulation-based transthoracic echo teaching: a tertiary centre experience. Intensive Care Med 2014;40:S158–S159.
17. Matyal R, Mitchell JD, Hess PE, et al. Simulator-based transesophageal echocardiographic training with motion analysis: a curriculum-based approach. Anesthesiology 2014;121(2):389–399.
18. Matyal R, Montealegre-Gallegos M, Mitchell JD, et al. Manual skill acquisition during transesophageal echocardiography simulator training of cardiology fellows: a kinematic assessment. J Cardiothorac Vasc Anesth 2015;29(6):1504–1510.
19. Mitchell JD, Montealegre-Gallegos M, Mahmood F, et al. Multimodal perioperative ultrasound course for interns allows for enhanced acquisition and retention of skills and knowledge. A A Case Rep 2015;5(7):119–123.
20. Montealegre-Gallegos M, Mahmood F, Kim H, et al. Imaging skills for transthoracic echocardiography in cardiology fellows: The value of motion metrics. Ann Card Anaesth 2016;19(2):245–250.
21. Prat G, Charron C, Repesse X, et al. The use of computerized echocardiographic simulation improves the learning curve for transesophageal hemodynamic assessment in critically ill patients. Ann Intensive Care 2016;6(1):27.
22. Ramsingh D, Rinehart J, Kain Z, et al. Impact assessment of perioperative point-of-care ultrasound training on anesthesiology residents. Anesthesiology 2015;123(3):670–682.
23. Sharma V, Chamos C, Valencia O, Meineri M, Fletcher S. The impact of internet and simulation-based training on transoesophageal echocardiography learning in anaesthetic trainees: a prospective randomised study. Anaesthesia 2013;68(6):621–627.
24. Skinner AA, Freeman RV, Sheehan FH. Quantitative feedback facilitates acquisition of skills in focused cardiac ultrasound. Simul Healthc 2016;11(2):134–138.
25. Smelt J, Corredor C, Edsell M, et al. Simulation-based learning of transesophageal echocardiography in cardiothoracic surgical trainees: a prospective, randomized study. J Thorac Cardiovasc Surg 2015;150(1):22–25.
26. Townsend NT, Kendall J, Barnett C, Robinson T. An effective curriculum for focused assessment diagnostic echocardiography: establishing the learning curve in surgical residents. J Surg Educ 2016;73(2):190–196.
27. Vegas A, Meineri M, Jerath A, et al. Impact of online transesophageal echocardiographic simulation on learning to navigate the 20 standard views. J Cardiothorac Vasc Anesth 2013;27(3):531–535.
28. Wagner R, Razek V, Grafe F, et al. Effectiveness of simulator-based echocardiography training of noncardiologists in congenital heart diseases. Echocardiography 2013;30(6):693–698.
29. Denson JS, Abrahamson S. A computer-controlled patient simulator. JAMA 1969;208(3):504–508.
30. Gosai J, Purva M, Gunn J. Simulation in cardiology: state of the art. Eur Heart J 2015;36(13):777–783.
31. Carney PA, Nierenberg DW, Pipas CF, Brooks WB, Stukel TA, Keller AM. Educational epidemiology: applying population-based design and analytic approaches to study medical education. JAMA 2004;292(9):1044–1050.
32. Kennedy CC, Cannon EK, Warner DO, Cook DA. Advanced airway management simulation training in medical education: a systematic review and meta-analysis. Crit Care Med 2014;42(1):169–178.
33. Zendejas B, Brydges R, Hamstra SJ, Cook DA. State of the evidence on simulation-based training for laparoscopic surgery: a systematic review. Ann Surg 2013;257(4):586–593.
34. Tolsgaard MG, Kulasegaram KM, Ringsted C. Practical trials in medical education: linking theory, practice and decision making. Med Educ 2017;51(1):22–30.
35. Bolander Laksov K, Dornan T, Teunissen PW. Making theory explicit - an analysis of how medical education research(ers) describe how they connect to theory. BMC Med Educ 2017;17(1):18.
36. Frank JR, Snell LS, Cate OT, et al. Competency-based medical education: theory to practice. Med Teach 2010;32(8):638–645.
37. Dornan T, Mann K, Scherpbier A, Spencer J. Medical Education: Theory and Practice. Edinburgh: Elsevier; 2011.
38. Clapper TC. Beyond knowles: what those conducting simulation need to know about adult learning theory. Clin Simul Nurs 2010;6(1):e7–e14.
39. Keeton MT, Tate PJ, eds. Learning by Experience—What, Why, How. San Francisco: Jossey-Bass; 1978.
40. Knowles M. Andragogy in Action. San Francisco: Jossey-Bass; 1984.
41. Cheng A, Morse KJ, Rudolph J, et al. Learner-centered debriefing for health care simulation education: lessons for faculty development. Simul Healthc 2016;11(1):32–40.
42. Friedrich MJ. Harvard Macy Institute helps physicians become better educators and change agents. JAMA 2002;287:3197–3199.
43. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ 2010;44(1):50–63.
44. Kneebone R. Evaluating clinical simulations for learning procedural skills: a theory-based approach. Acad Med 2005;80(6):549–553.
45. Mitchell JD, Mahmood F, Bose R, et al. Novel, multimodal approach for basic transesophageal echocardiographic teaching. J Cardiothorac Vasc Anesth 2014;28(3):800–809.
46. Halperin JL, Williams ES, Fuster V, et al. ACC 2015 Core Cardiovascular Training Statement (COCATS 4) (Revision of COCATS 3): A Report of the ACC Competency Management Committee. J Am Coll Cardiol 2015;65(17):1721–1723.
47. Westerdahl DE. The necessity of high-fidelity simulation in cardiology training programs. J Am Coll Cardiol 2016;67(11):1375–1378.
48. Winchester DE, Wokhlu A, Dusaj RS, Schmalfuss CM. Simulation-based training of transesophageal echocardiography for cardiology fellows. J Echocardiogr 2017;15(3):147–149.
49. Green SM, Klein AJ, Pancholy S, et al. The current state of medical simulation in interventional cardiology: a clinical document from the Society for Cardiovascular Angiography and Intervention's (SCAI) Simulation Committee. Catheter Cardiovasc Interv 2014;83(1):37–46.
50. American College of Cardiology. Simulation-based education: a popular tactile learning technique. 2016. Available at: http://www.acc.org/latest-in-cardiology/articles/2016/11/29/16/31/simulation-based-education#sthash.IXqAOIa7.dpuf. Accessed April 21, 2017.
51. Scottsdale Arizona Echo Simulation Laboratory. Welcome to the Scottsdale Echo Simulation Training Lab. Available at: http://www.echosim.com/index.html. Accessed April 21, 2017.
Keywords:

Simulation; echocardiography; graduate medical education


© 2018 Society for Simulation in Healthcare