Review Article

The 100 Most Cited Articles on Healthcare Simulation

A Bibliometric Review

Walsh, Chloe MSc; Lydon, Sinéad PhD; Byrne, Dara MMedEd; Madden, Caoimhe MSc; Fox, Susan PhD; O'Connor, Paul PhD

doi: 10.1097/SIH.0000000000000293

Abstract

As the use of simulation has become more established within the delivery of healthcare education and training, the healthcare simulation research literature has experienced corresponding growth.1,2 A trend in medical and healthcare literature has been to conduct citation analyses to identify the most cited works in a particular field.3,4 To our knowledge, no such works have been conducted pertaining to the use of simulation in healthcare education and research. As such, with the increase in the use of simulation in healthcare education and the growth of healthcare simulation research over the past three to four decades, it seems that this is an appropriate time to take stock and to examine the nature of those articles that have been cited the most within the field of healthcare simulation.

The number of citations received is often used as a measure of the impact an individual article, author, journal, or institution has had on a field of research, with a basic assumption that high numbers of citations are indicative of high levels of impact.5,6 Therefore, identifying the most cited articles can be useful in identifying “classic” works in a field and highlighting areas for future research.6 Although a frequently used approach,7 there are a number of limitations to the use of citation analysis that must be taken into account, such as the risk of bias arising from older publications having had longer to accumulate citations, or the way in which the citation practices of individuals can affect citation rates.8 Nevertheless, when appropriately applied, interpreted, and analyzed, citation analysis can be a useful tool to provide some indication of which authors, articles, and topics are influencing or motivating research in a field.9,10

This article aims to identify the 100 most highly cited articles in healthcare simulation education and research and to report on the characteristics of these publications. The hope is that this analysis will (1) provide insight into the history and development of the field of simulation in healthcare education and research and (2) provide an overview of research in the area and insight into the articles that have helped shape current knowledge and practice.

METHODS

Study Design

This article describes a citation analysis of journal articles in the field of healthcare simulation education and research. Although somewhat arbitrary, the top 100 articles were chosen as this is consistent with other bibliometric reviews in the healthcare fields3,4,11,12 and offers the reader more information than reporting just the top 10 or the top 50 articles in a field.

Search Strategy and Study Selection

Searches were conducted in the Scopus and the Web of Science databases (Clarivate Analytics, Philadelphia, PA) in July 2017. Publications pertaining to “simulation-based education” or “simulation-based research” and “health care professionals” were identified. No time limits were imposed on the searches, although searches were limited to the English language because of resource limitations (see table, Supplemental Digital Content 1, http://links.lww.com/SIH/A357, for detailed search strategy). In both databases, the retrieved articles were sorted using the sorting option “times cited – highest to lowest.” The outputs from both databases were presented in descending order, with the higher cited articles at the top of the list, and were exported to two separate Excel spreadsheets for further analysis. Two reviewers (C.W. and S.F.) independently applied the inclusion and exclusion criteria to each article within the two lists. The full texts were then screened, independently, until each reviewer had developed a list of 100 eligible articles. Cohen's κ indicated high interrater reliability (κ = 0.83, sensitivity = 0.94, specificity = 0.82). Disagreements were resolved by the research team reviewing the article(s) in question together and discussing them until consensus was reached.
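The interrater reliability statistics reported above (κ, sensitivity, specificity) can be computed directly from the two reviewers' binary include/exclude decisions. A minimal sketch, with illustrative function names and made-up decisions rather than the study's actual screening data:

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters' binary include (1) / exclude (0) decisions."""
    n = len(rater1)
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n  # observed agreement
    p1, p2 = sum(rater1) / n, sum(rater2) / n              # each rater's "include" rate
    p_e = p1 * p2 + (1 - p1) * (1 - p2)                    # agreement expected by chance
    return (p_o - p_e) / (1 - p_e)

def sensitivity_specificity(reference, test):
    """Sensitivity and specificity of one rater's decisions against the other's."""
    tp = sum(r == 1 and t == 1 for r, t in zip(reference, test))
    tn = sum(r == 0 and t == 0 for r, t in zip(reference, test))
    fn = sum(r == 1 and t == 0 for r, t in zip(reference, test))
    fp = sum(r == 0 and t == 1 for r, t in zip(reference, test))
    return tp / (tp + fn), tn / (tn + fp)
```

For example, perfect agreement yields κ = 1, and systematic disagreement yields κ = −1; the reported κ of 0.83 sits well above the conventional 0.8 threshold for "almost perfect" agreement.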

Eligibility Criteria

Studies were eligible for inclusion if they:

  1. described the employment of simulation using virtual reality (VR) simulators, standardized or simulated patients (SPs), inanimate part-task trainers, high-fidelity and static manikins, live animals, inert animal products, cadavers, and role-plays;
  2. included healthcare personnel (eg, doctors, nurses, health profession students, allied healthcare professionals);
  3. were focused on the teaching and/or assessment of clinical or procedural skills; and
  4. were published in an English language, peer-reviewed journal.

Studies were ineligible for inclusion if they:

  1. described simulation-based education with individuals other than healthcare professionals or health profession students;
  2. had a focus on computational simulation or mathematical modeling;
  3. did not have healthcare simulation education or research as a key focus; or
  4. featured written clinical vignettes alone. Because of the lack of interactive components, for the purpose of this review, these do not constitute simulation technology.13

Data Extraction

For both databases, data were extracted independently by two reviewers. For Scopus, the data were extracted by C.W. and C.M., and for Web of Science, the data were extracted by C.W. and S.F. Information was extracted on the following variables: (1) author and publication year; (2) number of citations; (3) country of publication (based on the first author's affiliation at the time of publication); (4) participants/population of focus; (5) type of simulator used; (6) type of targeted skill (ie, technical or nontechnical); (7) subject/discipline; and (8) article type (eg, intervention, systematic review). High percentage agreement between the raters was found for both databases (Scopus percentage agreement = 94%; Web of Science percentage agreement = 96%) (see table, Supplemental Digital Content 2, http://links.lww.com/SIH/A358, for data extraction table).

Top 100 List Citations

The list of the top 100 cited articles was compiled and ranked according to the outputs from the Scopus database search. Scopus was selected as the primary database because it covers more journals (approximately 21,500 journals)14 compared with Web of Science (approximately 18,200 journals).15 Furthermore, a number of key simulation journals, such as Simulation in Healthcare, Clinical Simulation in Nursing, and BMJ Simulation and Technology Enhanced Learning, are indexed within Scopus but not Web of Science. To obtain the citation counts from Google Scholar and Web of Science, each article from the Scopus list was individually searched for in these other databases and reported alongside the Scopus outputs (see table, Supplemental Digital Content 3, for list of citations for all three databases). Publications with the same number of citations were ranked at the same position.
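Ranking tied publications at the same position corresponds to standard competition ranking, in which the next distinct citation count resumes at its ordinal position. A minimal sketch (the citation counts shown are illustrative, not the study's data):

```python
def competition_rank(citations):
    """Rank a list of citation counts sorted highest-to-lowest; ties share a rank."""
    ranks = []
    for i, count in enumerate(citations):
        if i > 0 and count == citations[i - 1]:
            ranks.append(ranks[-1])  # tie: same rank as the previous article
        else:
            ranks.append(i + 1)      # otherwise, rank is the 1-based position
    return ranks
```

For instance, `competition_rank([500, 400, 400, 300])` gives `[1, 2, 2, 4]`: the two tied articles share rank 2 and the next article takes rank 4.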

Self-citations

Using the “exclude self-citations” option in Scopus, the percentage of self-citations within the list of 100 most highly cited articles derived from Scopus was calculated.

Statistical Analysis

The Pearson correlation coefficient (r) was calculated to determine whether the number of years since publication was correlated with total number of citations among the included articles.
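The coefficient above is the standard Pearson formula applied to paired values of years-since-publication and citation count. A self-contained sketch, with made-up example data rather than the study's:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)
```

A value near zero, as found here (P = 0.48 for the test of association), would indicate that article age explains little of the variation in citation counts among the included articles.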

Publication Trends

Additional searches using the terms “simulation in healthcare” and “simulation-based medical education” were conducted within Scopus. These searches and the resultant data provide a broad overview of the publication trends of articles using these terms.

RESULTS

Study Selection

Because of the differences in the outputs from each database, it was impossible to compile a single list of the top cited articles. The flow diagram for the Scopus screening process is provided in Figure 1. The initial search in Scopus returned 479,216 results, of which the titles and abstracts of the 3075 highest cited texts were screened against the eligibility criteria. The full texts of 381 articles were then examined until the 100 most cited articles remained. A table of the full list of the top 100 publications according to Scopus, along with the number of citations those articles received in Web of Science and Google Scholar, is provided in Supplemental Digital Content 3 (see table, Supplemental Digital Content 3, http://links.lww.com/SIH/A359, for a list of publications with citations). A full list of the 100 most cited publications returned from the Web of Science search is also provided (see table, Supplemental Digital Content 4, Web of Science list, http://links.lww.com/SIH/A360).

FIGURE 1: PRISMA flowchart of the study selection process, with the numbers of retrieved records at each stage.

Self-citations

Self-citations were found to comprise 5.7% of the total number of citations in the Scopus top 100 list.

Publication Trends

Figure 2 provides an overview of publication trends within Scopus under the search terms “healthcare simulation” and “simulation-based medical education.” An increase in articles using these terms was observed from 1994 onward, with substantial growth evident from 2005 onward. Figure 2 also indicates key events that have occurred in healthcare simulation in terms of establishing it as a discipline (eg, founding of international societies and discipline-specific journals). Figure 3 shows the publication trends for the articles in the included list between 1988 and 2011. A general increase in the number of publications was observed, which peaked in 2004 and 2006.

FIGURE 2: Publications by year in Scopus under the terms “healthcare simulation” and “simulation-based medical education” from 1994 to 2017. Important events in healthcare simulation history are also presented. SESAM indicates Society in Europe for Simulation Applied to Medicine; IMMS, International Meeting on Medical Simulation; SSH, Society for Simulation in Healthcare; IMSH, International Meeting for Simulation in Healthcare; ASPIH, Association for Simulated Practice in Healthcare. *Later in 2006, this was renamed as the Journal of the Society for Simulation in Healthcare.
FIGURE 3: Publication trends for the articles included in this review.

Study Characteristics

The characteristics of the top cited articles retrieved from Scopus are shown in Table 1. The most commonly cited articles were published between 1988 and 2011, with citation counts ranging from 186 to 1411 (mean [SD] = 361.55 [237.27]). No significant correlation was found between years since publication and number of citations (P = 0.48) (see figure, Supplemental Digital Content 5, http://links.lww.com/SIH/A361, which illustrates the Pearson coefficient).

TABLE 1: Study Characteristics of Top 100 Published Articles

The 100 Most Cited Articles

Country of Publication

As can be seen within panel 1 of Table 1, almost half of the studies within the 100 most cited articles were published in the United States (45%),1,13,16–58 with the United Kingdom as the next highest contributor (27%),59–85 followed by Canada (16%).86–101

Type of Study

The distribution of the number of publications by type of study is shown in panel 2 of Table 1. Nonsystematic reviews (35%)1,13,20,23,24,28,29,31,32,35,37–39,42,44,48,50,51,58,60,62,64–66,68,69,73,80,82,83,85,87,93,102,103 were the most common type of study within the 100 most cited articles, followed by interventions (28%)18,25,26,30,36,41,43,45,49,53–55,63,75–78,81,90,92,95,96,104–108 and studies focusing on tool evaluation/development (18%).22,34,52,59,61,70,74,76,79,86,88,89,91,94,98–100,109 Systematic reviews/meta-analyses (7%),19,27,33,84,110–112 observational studies (6%),16,46,47,71,90,97 articles describing performance assessments (5%),17,67,99,109,113 user evaluations (2%),40,57 and curriculum development studies (1%)113 were less commonly cited. The focus of a small number of articles (5%)74,76,99,109,113 was deemed to fall across two or more categories.

Subject/Discipline

The breakdown by subject/discipline is presented in panel 3 of Table 1. Many studies are related to several subjects or disciplines, and this is reflected in the results. For example, crisis resource management was often studied within anesthesiology17,30,35,40,49,54,56,59,73,106 or surgical teams.30,53,95 Most studies (86%)1,13,16,18–21,23–26,30–33,35–50,53–55,58–60,62–66,68–73,75–77,79–112 had a key focus of medical education/training, defined as a “discipline covering the education, and the practice of, skills by students enrolled in medical schools/colleges to become doctors; residency training, continuing medical education; and specializing postgraduate training.”114(p.1148) Surgery accounted for 45% of articles.18,26,28,30,39,41,53,58,61,62,66–70,74–77,79,84,86–96,98,99,101–105,107–109,111–113 This was followed by crisis/crew resource management (15%),17,30,35,37,40,43,49,53,54,56,59,73,80,95,106 doctor-patient communication (11%),25,36,45,48,55,63,72,81,82,85,97 and anesthesiology (9%).17,30,35,40,54,56,59,73,106

Targeted Skills

It can be seen from panel 4 of Table 1 that of the 100 most cited articles, the majority (41%)16,18,26,28,39–41,46,47,61–63,67,69,70,75–77,84,86–96,98,99,101,103–105,107–109,111–113 were focused primarily on technical skills (ie, the knowledge, skills, and ability to accomplish a specific medical task, such as suturing or performing a physical examination115(p.38)). Articles focused specifically on nontechnical skills (eg, communication, decision-making, history taking skills) were less common (25%).25,30,36,37,43–45,48,50,51,53–55,59,63,66,72–74,79–82,85,100 Finally, 34%1,13,17,19–24,27,29,31–35,38,42,49,52,56–58,60,64,65,68,71,78,83,97,102,106,110 of articles focused on both technical and nontechnical skills. For example, in full-scale simulations, the focus was often directed at training/assessing communication skills as well as technical ability.78

Type of Simulator

Panel 5 of Table 1 shows the frequency of the different types of simulators used. Most articles (40%)1,13,19–21,23,24,27,29,32–34,38,41,44,50,56,58,62,64–66,68,74–78,80,83,86,87,92,93,95,102,103,107,111,112 discussed or used multiple types of simulators. For example, review articles sometimes provided an overview of or comparisons between various tools useful for a specific discipline (eg, various tools available for training and assessment in laparoscopic surgery69); the history and development of the field (eg, the development of simulation for medical education64); and future directions.13

Manikins were the next most commonly used type of simulator (20%).16,17,30,35,37,40,43,46,47,49,51,54,57,59,71,73,78,79,106,110 They were used to teach a variety of technical and nontechnical skills within medical education and training (15% of all articles)16,30,35,37,40,43,46,47,49,54,59,71,73,79,106; crisis/crew resource management (11% of all articles)17,30,35,37,40,43,49,54,59,73,106; anesthesiology (7% of all articles)17,35,40,54,59,73,106; nursing (4% of all articles)51,57,78,110; emergency medicine (3% of all articles)43,46,47; team training (2% of all articles)37,43; surgery (2% of all articles)30,79; and obstetrics and gynecology (1% of all articles).71

Simulated patients were featured in 16%22,25,34,36,42,45,48,52,60,63,72,81,82,85,97,100 of articles in the teaching/assessment of nontechnical skills across a number of subject areas/disciplines, eg, medical education (13% of all articles),25,36,42,45,48,60,63,72,81,82,85,97,100 doctor-patient communication (10% of all articles),25,36,45,48,63,72,81,82,85,97 oncology (5% of all articles),36,63,72,81,82 and primary care (4% of all articles).22,25,34,52

Inanimate part-task trainers were also featured in 16% of articles,26,41,70,75,76,88–92,94–96,98,99,101 all of which were surgical articles, in particular laparoscopic surgery (10% of all articles).26,41,75,76,89,90,94,96,99,101

Fully simulated environments were featured in 17% of articles.17,30,35,37,40,43,49–51,54,57,59,73,78,79,106,110 These full-scale simulations were often used in relation to crisis/crew resource management training (11% of all articles)17,30,35,37,40,43,49,54,59,73,106 and in nursing (4% of all articles).51,57,78,110

Virtual reality part-task trainers were featured in 14% of articles18,28,39,67,69,75,76,84,104,105,107–109,113 and were used almost entirely in relation to laparoscopic (12% of all articles)18,67,69,75,76,84,104,105,107–109,113 and minimally invasive surgery (MIS) (1% of all articles).28 One study compared computerized clinical vignettes with SPs for the assessment of clinical competence.34

Live animals/inert animal products (4%),41,61,95,107 role-play (3%),53,55,82 and cadavers (1%)92 were less commonly featured as a key mode of simulation.

Journal

The journals that have published the top cited articles are presented in Table 2. Surgical journals have contributed the most articles (33%),18,26,28,39,41,53,58,66,69,70,75,77,79,84,86,88–90,92–96,99,101,103–105,107,109,111–113 followed by general medical journals (22%)20,22–25,27,32,34,36,38,45,46,52,55,60,62,63,67,85,87,97,108 and medical education journals (15%).1,19,21,33,42,48,64,65,68,74,83,91,98,100,102 Simulation-specific journals have contributed 3% of the top cited articles.31,35,44 Journals that have contributed fewer than two articles to the top cited list are grouped under the category of “other” in Table 2.

TABLE 2: Journals That Have Published the Highly Cited Articles in the List Retrieved From Scopus

The 10 Most Cited Articles

Table 3 presents a summary of the 10 most highly cited articles. These articles include the following: two controlled interventions in which VR task trainers were used to improve laparoscopic surgery skills18,104; a systematic review investigating the features and uses of high-fidelity simulation associated with effective learning19; and three nonsystematic reviews, which discussed the use of various simulators in the assessment and training of both technical and nontechnical skills.20,23,87 Two studies evaluated the use of part-task trainers88 or a combination of part-task trainers and live animals86 to assess/train technical skills, whereas one study22 compared the use of SPs with written clinical vignettes in the assessment of the quality of care provided by physicians. Finally, one of the top cited articles was an invited address21 in which the use of simulation is discussed in relation to deliberate practice in medical education/training.

TABLE 3: The 10 Highest Cited Publications in Healthcare Simulation as Listed in Scopus

DISCUSSION

The aims of this study were to identify the 100 most cited articles in simulation in healthcare education and research and to describe the characteristics of these articles. The top cited articles, according to the Scopus database, were published between 1988 and 2011 and came mainly from the United States (45%). Nonsystematic reviews (35%) and interventions (28%) were the most common type of publication and 40% of the articles used/discussed multiple simulators. Most articles focused on the education and/or training of medical professionals (86%) across domains, but surgery was the most common specialty (45%). Furthermore, technical skills (41%) were more often featured than nontechnical skills (25%), and surgery-specific journals (33%) have contributed the largest number of highly cited articles in the current list. The increasing trend of publications relating to healthcare simulation education and research suggests that it has become a widespread and accepted methodology in the education of healthcare professionals.

The specialty of surgery dominated the top 100 most cited articles, more specifically, laparoscopic surgery. It is well recognized that basic laparoscopic skills are well suited to practice on a simulator.58 This finding may also be partly attributable to the fact that the rise in simulation has occurred concurrently with the growth of MIS. Assessing the acquisition of laparoscopic skills fits well within the “scientific method” with clear, measurable performance metrics (eg, time, instrument motion).

The high representation of interventions (28%) in the 100 most cited articles may suggest that there has been a “burden of proof” to demonstrate that healthcare simulation is an effective training and education tool. In the past decade, however, systematic reviews and meta-analyses have been conducted and are attaining high citation counts, despite only being published in recent years.19,27,33,84,110–112 Such studies are useful for advancing research, directing practice, and facilitating decision-making, so the increasing appearance of these types of studies is unsurprising and provides further evidence of the maturation of the field.116–118

A general increase in the number of publications per year was observed for the period covered by this review (1988–2011), which reflects the growth of the field. Similar findings have been observed elsewhere; in a systematic review spanning 34 years (1969–2003), Issenberg and colleagues19 also illustrated the rapid increase in journal articles on high-fidelity simulation in medical education, with 57% of the included studies published in the latter years of the review (2000–2003). Although there is a subsequent decline in publications after the peak years of 2004/2006, this is to be expected given the time-of-publication bias inherent in citation analyses, whereby older publications have had more time to accrue citations than recent studies.119

Strengths and Limitations

Strengths of the study include the following: the use of an accepted and frequently used method119,120; stringent, transparent inclusion and exclusion criteria; a comprehensive systematic search process across two key electronic databases; and the presentation of a novel overview and synthesis within the field.

There were also some limitations that should be noted. First, the search was limited to the English language, and so articles in other languages, which may have global impact, were potentially excluded. Second, the citation counts provided by Scopus, Web of Science, and Google Scholar and the records returned by Scopus and Web of Science vary considerably, making it impossible to produce a single comprehensive list of the 100 most cited articles. There are various arguments in favor of and against both Scopus and Web of Science, so it is unlikely that one database is superior to the other.7 For example, Web of Science's coverage spans as far back as 1900, whereas coverage of articles before 1996 is more limited in Scopus.7,121 However, Scopus covers more journals,14 including many key simulation journals, which Web of Science does not. For this reason, the list of the top 100 articles is based on Scopus. A further limitation is that the methodological rigor of the included studies was not examined. An assessment of study quality would have allowed for the examination of the association between citation count and study quality as well as the determination of whether high-quality studies receive more citations. This is something that future research may wish to address.

Finally, there are limitations to the assumption that highly cited articles are highly influential. First, different disciplines/domains often have different rates of citations, which makes cross-comparison difficult. For example, even within the same specialties, different citation rates can be found in different countries.122 Second, the size and type of specialty also influence citation rates, whereby larger fields, with more journals, tend to have higher citation rates because of the increased opportunity to gather citations.123 Likewise, there is a risk of bias in relation to the time of publication, whereby older publications have had longer to accumulate citations. Citation practices of individuals can also affect citation rates. For example, formal citations are not always correctly cited, and informal citations, such as grey literature or conference presentations, do not get routinely cited.114 Biased citing can also occur because of halo and Matthew effects or self/in-house citing and can affect overall counts.8,124,125 However, within our data set, the rates of self-citations were relatively low (<6%). Finally, there are considerable variations in the items retrieved from different databases, as observed in the present study, and at different time points.126 It should also be noted that there are alternative methods of conducting bibliometric analyses to the method employed in the current study.127 For example, impact may be measured based on the mean citations per year or via the number of article views. Nevertheless, citation analysis is a key methodology used across many disciplines,127 and despite the previously mentioned limitations, when applied appropriately, citation analysis can be a useful approach to identify which authors, articles, and topics are influencing or motivating research in a field.9,10

Implications for Future Research

This review has provided a historical overview of the highly cited publications within the field of healthcare simulation education and research, as opposed to looking to the future of the field. Nevertheless, based on our findings, it is possible to provide some thoughts on what we may see as the field of simulation continues to advance. Healthcare simulation research has clearly matured over the last three decades, with the top 10 articles all being published within the last 20 years. There is now sufficient evidence that healthcare simulation is an effective educational intervention when it is used under the right conditions.128 Therefore, there will be a move away from research questions focused on “does healthcare simulation work?” to an assessment of the conditions under which simulation is most effective, ie, “why, how, and with whom does healthcare simulation work?”

We anticipate an increasing number of systematic reviews and meta-analyses within specific areas of healthcare simulation such as those by McGaghie et al33 and Sturm et al.112 Until recently, there may not have been a sufficient number of journal articles to allow for these types of reviews. However, as the body of research increases, these types of reviews will become more commonplace in the healthcare simulation literature.

It is likely that surgery will continue to be a dominant specialty in the most cited healthcare simulation research, with a growing role for simulation across surgical specialties.28 Simulation is a potentially effective training and education tool to support the continued growth in MIS, and other new surgical methods (eg, robot-assisted surgery) are also techniques that can be taught and practiced using simulators.129

With further advances in healthcare simulation technology (eg, improved haptics), it is also likely that there will be a “blurring” of the categories of simulators and increases in the use of hybrid simulation (ie, the combination of two or more simulation modalities to create a more realistic experience,115 such as those used by Kneebone et al129 and Crofts et al130), and VR simulator technologies such as the MIST-VR for surgical training (eg, Aggarwal et al69 and Grantcharov et al109).

CONCLUSIONS

The use of healthcare simulation is no longer an exception but has become a key component of modern health profession education and training.13,131 This review has demonstrated that healthcare simulation is a vibrant and growing field of research. As the use of healthcare simulation has become more commonplace in healthcare education, there has been an exponential rise in the number of research articles published and a corresponding increase in the outlets that publish this research. Healthcare simulation will always be changing and developing.128 However, it is hoped that identifying those articles that have had the most influence during the last decades will help inform the healthcare simulation research that will be carried out in the future.

REFERENCES

1. McGaghie WC, Issenberg SB, Petrusa ER, et al. A critical review of simulation-based medical education research: 2003-2009. Med Educ 2010;44(1):50–63.
2. Cheng A, Kessler D, Mackinnon R, et al. Reporting guidelines for health care simulation research: extensions to the CONSORT and STROBE statements. Adv Simul 2016;1(1):25.
3. Müller M, Gloor B, Candinas D, et al. The 100 most-cited articles in visceral surgery: a systematic review. Dig Surg 2016;33(6):509–519.
4. Shuaib W, Acevedo JN, Khan MS, et al. The top 100 cited articles published in emergency medicine journals. Am J Emerg Med 2015;33(8):1066–1071.
5. Marx W, Schier H, Wanitschek M. Citation analysis using online databases: feasibilities and shortcomings. Scientometrics 2001;52(1):59–82.
6. Azer SA. The top-cited articles in medical education: a bibliometric analysis. Acad Med 2015;90(8):1147–1161.
7. Kulkarni AV, Aziz B, Shams I, et al. Comparisons of citations in Web of Science, Scopus, and Google Scholar for articles published in general medical journals. JAMA 2009;302(10):1092–1096.
8. MacRoberts MH, MacRoberts BR. Quantitative measures of communication in science: a study of the formal level. Soc Stud Sci 1986;16(1):151–172.
9. Hsu Y, Ho Y. Highly cited articles in health care sciences and services field in Science Citation Index Expanded. Method Inform Med 2014;53(6):446–458.
10. Garfield E. Essays of an Information Scientist, Vol. 8, p. 469–479. Curr Contents 1985;49:3–13.
11. Nason GJ, Tareen F, Mortell A. The top 100 cited articles in urology: an update. Can Urol Assoc J 2013;7(1–2):E16.
12. Kim Y, Yoon DY, Kim JE, et al. Citation classics in stroke: the top-100 cited articles on hemorrhagic stroke. Eur Neurol 2017;78(3–4):210–216.
13. Gaba DM. The future vision of simulation in health care. Qual Saf Health Care 2004;13:I2–I10.
14. Elsevier. Scopus Content Coverage Guide. January 2016. Available at: https://www.elsevier.com/__data/assets/pdf_file/0007/69451/scopus_content_coverage_guide.pdf. Accessed September 11, 2017.
15. Clarivate Analytics. Web of Science platform: Web of Science: Summary of Coverage. August 2017. Available at: http://clarivate.libguides.com/webofscienceplatform/coverage. Accessed September 11, 2017.
16. Wayne DB, Didwania A, Feinglass J, et al. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: a case-control study. Chest 2008;133(1):56–61.
17. Gaba DM, Howard SK, Flanagan B, et al. Assessment of clinical performance during simulated crises using both technical and behavioral ratings. Anesthesiology 1998;89(1):8–18.
18. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg 2002;236(4):458–463.
19. Issenberg SB, McGaghie WC, Petrusa ER, et al. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005;27(1):10–28.
20. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA 2002;287(2):226–235.
21. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med 2004;79(10):S70–S81.
22. Peabody JW, Luck J, Glassman P, et al. Comparison of vignettes, standardized patients, and chart abstraction - a prospective validation study of 3 methods for measuring quality. JAMA 2000;283(13):1715–1722.
23. Epstein RM. Assessment in medical education. N Engl J Med 2007;356(4):387–396.
24. Issenberg SB, McGaghie WC, Hart IR, et al. Simulation technology for health care professional skills training and assessment. JAMA 1999;282(9):861–866.
25. Roter DL, Hall JA, Kern DE, et al. Improving physicians' interviewing skills and reducing patients' emotional distress: a randomized clinical trial. Arch Intern Med 1995;155(17):1877–1884.
26. Scott DJ, Bergen PC, Rege RV, et al. Laparoscopic training on bench models: better and more cost effective than operating room experience? J Am Coll Surg 2000;191(3):272–283.
27. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA 2011;306(9):978–988.
28. Gallagher AG, Ritter EM, Champion H, et al. Virtual reality simulation for the operating room - proficiency-based training as a paradigm shift in surgical skills training. Ann Surg 2005;241(2):364–372.
29. Jeffries PR. A framework for designing, implementing, and evaluating simulations used as teaching strategies in nursing. Nurs Educ Perspect 2005;26(2):96–103.
30. Howard SK, Gaba DM, Fish KJ, et al. Anesthesia crisis resource management training: teaching anesthesiologists to handle critical incidents. Aviat Space Environ Med 1992;63(9):763–770.
31. Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc 2007;2(2):115–125.
32. Cooke M, Irby DM, Sullivan W, et al. American medical education 100 years after the Flexner report. N Engl J Med 2006;355(13):1339–1344.
33. McGaghie WC, Issenberg SB, Cohen ER, et al. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med 2011;86(6):706–711.
34. Peabody JW, Luck J, Glassman P, et al. Measuring the quality of physician practice by using clinical vignettes: a prospective validation study. Ann Intern Med 2004;141(10):771–780.
35. Gaba DM, Howard SK, Fish KJ, et al. Simulation-based training in anesthesia crisis resource management (ACRM): a decade of experience. Simulat Gaming 2001;32(2):175–193.
36. Back AL, Arnold RM, Baile WF, et al. Efficacy of communication skills training for giving bad news and discussing transitions to palliative care. Arch Intern Med 2007;167(5):453–460.
37. Baker DP, Day R, Salas E. Teamwork as an essential component of high-reliability organizations. Health Serv Res 2006;41(4):1576–1598.
38. Okuda Y, Bryson EO, DeMaria S Jr, et al. The utility of simulation in medical education: what is the evidence? Mt Sinai J Med 2009;76(4):330–343.
39. Satava RM. Virtual reality surgical simulator. The first steps. Surg Endosc 1993;7(3):203–205.
40. Gaba DM, DeAnda A. A comprehensive anesthesia simulation environment: re-creating the operating room for research and training. Anesthesiology 1988;69(3):387–394.
41. Korndorffer JR Jr, Dunne JB, Sierra R, et al. Simulator training for laparoscopic suturing using performance goals translates to the operating room. J Am Coll Surg 2005;201(1):23–29.
42. Downing SM. Reliability: on the reproducibility of assessment data. Med Educ 2004;38(9):1006–1012.
43. Shapiro MJ, Morey JC, Small S, et al. Simulation based teamwork training for emergency department staff: does it improve clinical team performance when added to an existing didactic teamwork curriculum? Qual Saf Health Care 2004;13(6):417–421.
44. Rudolph JW, Simon R, Dufresne RL, et al. There's no such thing as “nonjudgmental” debriefing: a theory and method for debriefing with good judgment. Simul Healthc 2006;1(1):49–55.
45. Yedidia MJ, Gillespie CC, Kachur E, et al. Effect of communications training on medical student performance. JAMA 2003;290(9):1157–1165.
46. Barsuk JH, Cohen ER, Feinglass J, et al. Use of simulation-based education to reduce catheter-related bloodstream infections. Arch Intern Med 2009;169(15):1420–1423.
47. Barsuk JH, McGaghie WC, Cohen ER, et al. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med 2009;37(10):2697–2701.
48. Duffy FD, Gordon GH, Whelan G, et al. Assessing competence in communication and interpersonal skills: the Kalamazoo II report. Acad Med 2004;79(6):495–507.
49. Steadman RH, Coates WC, Huang YM, et al. Simulation-based training is superior to problem-based learning for the acquisition of critical assessment and management skills. Crit Care Med 2006;34(1):151–157.
50. Beaubien JM, Baker DP. The use of simulation for training teamwork skills in health care: how low can you go? Qual Saf Health Care 2004;13(Suppl 1):i51–i56.
51. Lasater K. High-fidelity simulation and the development of clinical judgment: students' experiences. J Nurs Educ 2007;46(6):269–276.
52. Luck J, Peabody JW, Dresselhaus TR, et al. How well does chart abstraction measure quality? A prospective comparison of standardized patients with the medical record. Am J Med 2000;108(8):642–649.
53. Awad SS, Fagan SP, Bellows C, et al. Bridging the communication gap in the operating room with medical team training. Am J Surg 2005;190(5):770–774.
54. Holzman RS, Cooper JB, Gaba DM, et al. Anesthesia crisis resource management: real-life simulation training in operating room crises. J Clin Anesth 1995;7(8):675–687.
55. Brown JB, Boles M, Mullooly JP, et al. Effect of clinician communication skills training on patient satisfaction. A randomized, controlled trial. Ann Intern Med 1999;131(11):822–829.
56. Gaba DM. Improving anesthesiologists' performance by simulating reality. Anesthesiology 1992;76(4):491–494.
57. Feingold CE, Calaluce M, Kallen MA. Computerized patient model and simulated clinical experiences: evaluation with baccalaureate nursing students. J Nurs Educ 2004;43(4):156–163.
58. Scott DJ, Dunnington GL. The new ACS/APDS skills curriculum: moving the learning curve out of the operating room. J Gastrointest Surg 2008;12(2):213–221.
59. Fletcher G, Flin R, McGeorge P, et al. Anaesthetists' Non-Technical Skills (ANTS): evaluation of a behavioural marker system. Br J Anaesth 2003;90(5):580–588.
60. Wass V, Van der Vleuten C, Shatzer J, et al. Assessment of clinical competence. Lancet 2001;357(9260):945–949.
61. Appleyard M, Fireman Z, Glukhovsky A, et al. A randomized trial comparing wireless capsule endoscopy with push enteroscopy for the detection of small-bowel lesions. Gastroenterology 2000;119(6):1431–1438.
62. Moorthy K, Munz Y, Sarker SK, et al. Objective assessment of technical skills in surgery. BMJ 2003;327(7422):1032–1037.
63. Fallowfield L, Jenkins V, Farewell V, et al. Efficacy of a Cancer Research UK communication skills training model for oncologists: a randomised controlled trial. Lancet 2002;359(9307):650–656.
64. Bradley P. The history of simulation in medical education and possible future directions. Med Educ 2006;40(3):254–262.
65. Maran NJ, Glavin RJ. Low- to high-fidelity simulation - a continuum of medical education? Med Educ 2003;37:22–28.
66. Yule S, Flin R, Paterson-Brown S, et al. Non-technical skills for surgeons in the operating room: a review of the literature. Surgery 2006;139(2):140–149.
67. Taffinder N, McManus I, Gul Y, et al. Effect of sleep deprivation on surgeons' dexterity on laparoscopy simulator. Lancet 1998;352(9135):1191.
68. Kneebone R. Simulation in surgical training: educational issues and practical implications. Med Educ 2003;37(3):267–277.
69. Aggarwal R, Moorthy K, Darzi A. Laparoscopic skills training and assessment. Br J Surg 2004;91(12):1549–1558.
70. Datta V, Mackay S, Mandalia M, et al. The use of electromagnetic motion tracking analysis to objectively measure open surgical skill in the laboratory-based model. J Am Coll Surg 2001;193(5):479–485.
71. Draycott TJ, Crofts JF, Ash JP, et al. Improving neonatal outcome through practical shoulder dystocia training. Obstet Gynecol 2008;112(1):14–20.
72. Fallowfield L, Lipkin M, Hall A. Teaching senior oncologists communication skills: results from phase I of a comprehensive longitudinal program in the United Kingdom. J Clin Oncol 1998;16(5):1961–1968.
73. Fletcher GC, McGeorge P, Flin RH, et al. The role of non-technical skills in anaesthesia: a review of current literature. Br J Anaesth 2002;88(3):418–429.
74. Yule S, Flin R, Paterson-Brown S, et al. Development of a rating system for surgeons' non-technical skills. Med Educ 2006;40(11):1098–1104.
75. Munz Y, Kumar B, Moorthy K, et al. Laparoscopic virtual reality and box trainers: is one superior to the other? Surg Endosc 2004;18(3):485–494.
76. Taffinder N, Sutton C, Fishwick RJ, McManus IC, Darzi A. Validation of virtual reality to teach and assess psychomotor skills in laparoscopic surgery: results from randomised controlled studies using the MIST VR laparoscopic simulator. In Westwood JD, Hoffman HM, Stredney D, Weghorst SJ, editors. Medicine Meets Virtual Reality: Art, Science, Technology: Healthcare and (R)evolution. The Netherlands: IOS Press; 1998:124–130.
77. Aggarwal R, Ward J, Balasundaram I, et al. Proving the effectiveness of virtual reality simulation for training in laparoscopic surgery. Ann Surg 2007;246(5):771–779.
78. Alinier G, Hunt B, Gordon R, et al. Effectiveness of intermediate-fidelity simulation training technology in undergraduate nursing education. J Adv Nurs 2006;54(3):359–369.
79. Yule S, Flin R, Maran N, et al. Surgeons’ non-technical skills in the operating room: reliability testing of the NOTSS behavior rating system. World J Surg 2008;32(4):548–556.
80. Flin R, Maran N. Identifying and training non-technical skills for teams in acute medicine. Qual Saf Health Care 2004;13(Suppl 1):i80–i84.
81. Fallowfield L, Jenkins V, Farewell V, et al. Enduring impact of communication skills training: results of a 12-month follow-up. Br J Cancer 2003;89(8):1445.
82. Parle M, Maguire P, Heaven C. The development of a training model to improve health professionals' skills, self-efficacy and outcome expectancies when communicating with cancer patients. Soc Sci Med 1997;44(2):231–240.
83. Kneebone R, Scott W, Darzi A, et al. Simulation and clinical practice: strengthening the relationship. Med Educ 2004;38(10):1095–1102.
84. Gurusamy K, Aggarwal R, Palanivelu L, et al. Systematic review of randomized controlled trials on the effectiveness of virtual reality training for laparoscopic surgery. Br J Surg 2008;95(9):1088–1097.
85. Maguire P, Pitceathly C. Key communication skills and how to acquire them. BMJ 2002;325(7366):697–700.
86. Martin J, Regehr G, Reznick R, et al. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg 1997;84(2):273–278.
87. Reznick RK, MacRae H. Teaching surgical skills—changes in the wind. N Engl J Med 2006;355(25):2664–2669.
88. Reznick R, Regehr G, MacRae H, et al. Testing technical skill via an innovative “bench station” examination. Am J Surg 1997;173(3):226–230.
89. Fried GM, Feldman LS, Vassiliou MC, et al. Proving the value of simulation in laparoscopic surgery. Ann Surg 2004;240(3):518–525.
90. Derossis AM, Bothwell J, Sigman HH, et al. The effect of practice on performance in a laparoscopic simulator. Surg Endosc 1998;12(9):1117–1120.
91. Regehr G, MacRae H, Reznick RK, et al. Comparing the psychometric properties of checklists and global rating scales for assessing performance on an OSCE-format examination. Acad Med 1998;73(9):993–997.
92. Anastakis DJ, Regehr G, Reznick RK, et al. Assessment of technical skills transfer from the bench training model to the human model. Am J Surg 1999;177(2):167–170.
93. Reznick RK. Teaching and testing technical skills. Am J Surg 1993;165(3):358–361.
94. Vassiliou MC, Ghitulescu GA, Feldman LS, et al. The MISTELS program to measure technical skill in laparoscopic surgery - evidence for reliability. Surg Endosc 2006;20(5):744–747.
95. Grober ED, Hamstra SJ, Wanzel KR, et al. The educational impact of bench model fidelity on the acquisition of technical skill: the use of clinically relevant outcome measures. Ann Surg 2004;240(2):374–381.
96. Sroka G, Feldman LS, Vassiliou MC, et al. Fundamentals of laparoscopic surgery simulator training to proficiency improves laparoscopic performance in the operating room-a randomized controlled trial. Am J Surg 2010;199(1):115–120.
97. Tamblyn R, Abrahamowicz M, Dauphinee D, et al. Physician scores on a national clinical skills examination as predictors of complaints to medical regulatory authorities. JAMA 2007;298(9):993–1001.
98. Faulkner H, Regehr G, Martin J, et al. Validation of an objective structured assessment of technical skill for surgical residents. Acad Med 1996;71(12):1363–1365.
99. Fraser SA, Klassen DR, Feldman LS, et al. Evaluating laparoscopic skills: setting the pass/fail score for the MISTELS system. Surg Endosc 2003;17(6):964–967.
100. Hodges B, Regehr G, McNaughton N, et al. OSCE checklists do not capture increasing levels of expertise. Acad Med 1999;74(10):1129–1134.
101. Derossis AM, Fried GM, Abrahamowicz M, et al. Development of a model for training and evaluation of laparoscopic skills. Am J Surg 1998;175(6):482–487.
102. Ziv A, Wolpe PR, Small SD, et al. Simulation-based medical education: an ethical imperative. Acad Med 2003;78(8):783–788.
103. van Hove PD, Tuijthof GJ, Verdaasdonk EG, et al. Objective assessment of technical surgical skills. Br J Surg 2010;97(7):972–987.
104. Grantcharov TP, Kristiansen VB, Bendix J, et al. Randomized clinical trial of virtual reality simulation for laparoscopic skills training. Br J Surg 2004;91(2):146–150.
105. Ahlberg G, Enochsson L, Gallagher AG, et al. Proficiency-based virtual reality training significantly reduces the error rate for residents during their first 10 laparoscopic cholecystectomies. Am J Surg 2007;193(6):797–804.
106. Chopra V, Gesink BJ, de Jong J, et al. Does training on an anaesthesia simulator lead to improvement in performance? Br J Anaesth 1994;73(3):293–297.
107. Hyltander A, Liljegren E, Rhodin P, et al. The transfer of basic skills learned in a laparoscopic simulator to the operating room. Surg Endosc 2002;16(9):1324–1328.
108. Larsen CR, Soerensen JL, Grantcharov TP, et al. Effect of virtual reality training on laparoscopic surgery: randomised controlled trial. BMJ 2009;338:b1802.
109. Grantcharov TP, Bardram L, Funch-Jensen P, et al. Learning curves and impact of previous operative experience on performance on a virtual reality simulator to test laparoscopic surgical skills. Am J Surg 2003;185(2):146–149.
110. Cant RP, Cooper SJ. Simulation-based learning in nurse education: systematic review. J Adv Nurs 2010;66(1):3–15.
111. Sutherland LM, Middleton PF, Anthony A, et al. Surgical simulation - a systematic review. Ann Surg 2006;243(3):291–300.
112. Sturm LP, Windsor JA, Cosman PH, et al. A systematic review of skills transfer after surgical simulation training. Ann Surg 2008;248(2):166–179.
113. Aggarwal R, Grantcharov TP, Eriksen JR, et al. An evidence-based virtual reality training program for novice laparoscopic surgeons. Ann Surg 2006;244(2):310–314.
114. MacRoberts MH, MacRoberts BR. Problems of citation analysis: a critical review. J Am Soc Inf Sci 1989;40(5):342.
115. Lopreiato JO, Downing D, Gammon W, Spain AE, et al.; Terminology and Concepts Working Group. Healthcare Simulation Dictionary. 2016. Available at: https://www.ssih.org/dictionary. Accessed July 30, 2017.
116. Moher D, Liberati A, Tetzlaff J, et al. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med 2009;6(7):e1000097.
117. Swingler GH, Volmink J, Ioannidis JP. Number of published systematic reviews and global burden of disease: database analysis. BMJ 2003;327(7423):1083–1084.
118. Oxman AD, Cook DJ, Guyatt GH, et al. Users' guides to the medical literature. VI. How to use an overview. Evidence-Based Medicine Working Group. JAMA 1994;272(17):1367–1371.
119. van Raan AFJ. The use of bibliometric analysis in research performance assessment and monitoring of interdisciplinary scientific developments. Technikfolgenabschätzung – Theorie und Praxis 2003;1(12):20–29.
120. Cronin B. Bibliometrics and beyond: some thoughts on web-based citation analysis. J Inf Sci 2001;27(1):1–7.
121. Falagas ME, Pitsouni EI, Malietzis GA, et al. Comparison of PubMed, Scopus, Web of Science, and Google Scholar: strengths and weaknesses. FASEB J 2008;22(2):338–342.
122. MacRoberts M, MacRoberts B. Problems of citation analysis. Scientometrics 1996;36(3):435–444.
123. Garfield E, Welljams-Dorof A. Citation data: their use as quantitative indicators for science and technology evaluation and policy-making. Sci Publ Policy 1992;19(5):321–327.
124. Armstrong D. The impact of papers in Sociology of Health and Illness: a bibliographic study. Sociol Health Illn 2003;25(3):58–74.
125. Kostoff R. The use and misuse of citation analysis in research evaluation. Scientometrics 1998;43(1):27–43.
126. Bornmann L, Marx W. Methods for the generation of normalized citation impact scores in bibliometrics: which method best reflects the judgements of experts? J Informetr 2015;9(2):408–418.
127. Moed HF. New developments in the use of citation analysis in research evaluation. Arch Immunol Ther Exp (Warsz) 2009;57(1):13–18.
128. McGaghie WC, Issenberg SB, Petrusa ER, et al. Revisiting 'A critical review of simulation-based medical education research: 2003–2009'. Med Educ 2016;50(10):986–991.
129. Kneebone R. Evaluating clinical simulations for learning procedural skills: a theory-based approach. Acad Med 2005;80(6):549–553.
130. Crofts JF, Bartlett C, Ellis D, et al. Training for shoulder dystocia - a trial of simulation using low-fidelity and high-fidelity mannequins. Obstet Gynecol 2006;108(6):1477–1485.
131. McGaghie WC, Issenberg SB, Petrusa ER, et al. Effect of practice on standardised learning outcomes in simulation-based medical education. Med Educ 2006;40(8):792–797.
Keywords:

Healthcare simulation; simulation-based medical education; medical simulation; bibliometric; review

© 2018 Society for Simulation in Healthcare