Publication of Abstracts Presented at an International Healthcare Simulation Conference

Cheng, Adam MD, FRCPC, FAAP; Lin, Yiqun MD, MHSc; Smith, Jeremy MD; Wan, Brandi BKin; Belanger, Claudia; Hui, Joshua MD, MSc

doi: 10.1097/SIH.0000000000000229
Empirical Investigations

Introduction We aimed to determine the publication rate for abstracts presented at the International Meeting for Simulation in Healthcare (IMSH) and the time between abstract presentation and publication. We also aimed to describe the study features influencing subsequent publication and the relationship between these features and journal impact factors (IFs).

Methods All types of accepted abstracts from the 2012 and 2013 IMSH were reviewed. We extracted the following data from each abstract in duplicate: presentation format, subject, type of scholarship, research method, study design, outcome measure, number of institutions in authorship group, and number of study sites. PubMed and Google Scholar were searched (January 1, 2012 to August 1, 2016) using the names of the first, second, and last author for comparison with abstracts. Journal of publication and IF were recorded. Data were summarized with descriptive statistics. Bivariate and multivariate analyses were performed to explore the association between publication status and other variables.

Results Of 541 abstracts, 22% (119/541) were published with a median time to publication of 16 months (interquartile range = 8.5–25 months), ranging from 0 to 43 months. The study characteristics associated with a greater likelihood of publication were the following: research-type abstracts, quantitative studies, randomized trials, studies with patient or healthcare-related outcomes, multiple institutions represented in the authorship group, and multicenter studies. Studies with multiple institutions in the authorship group and multicenter studies were published in higher IF journals (P < 0.05).

Conclusions The publication rate of 22% for abstracts presented at IMSH is low, indicative of the relatively new nature of simulation-based research in healthcare.

From the KidSIM Simulation Program, Department of Pediatrics (A.C., Y.L.), Alberta Children's Hospital and the University of Calgary, Calgary, Alberta, Canada; Department of Emergency Medicine (J.S.), University of North Carolina, Chapel Hill, NC; School of Nursing (B.W.), University of British Columbia, Vancouver, British Columbia, Canada; School of Kinesiology and Health Studies (C.B.), Queen's University, Kingston, Ontario, Canada; and Department of Emergency Medicine (J.H.), Kaiser Permanente Los Angeles Medical Center, Los Angeles, CA.

Reprints: Adam Cheng, MD, FRCPC, FAAP, KidSIM Simulation Program, Department of Pediatrics, Alberta Children's Hospital, University of Calgary, 2888 Shaganappi Trail NW, Calgary, Alberta, Canada T3B 6A8 (e-mail: chenger@me.com).

Supported by an infrastructure grant jointly provided by the Alberta Children's Hospital Research Institute, the Alberta Children's Hospital Foundation, and the Department of Pediatrics, Cumming School of Medicine, University of Calgary.

The authors declare no conflict of interest.

Medical conferences are an important avenue for scholars, educators, and researchers to share their scholarly work in the form of abstracts.1 Oral and poster presentations of abstracts offer an opportunity for preliminary dissemination of research findings, allowing the audience to ask questions and provide constructive feedback. As a next step, peer-reviewed publication facilitates dissemination of results and knowledge translation. Failure to publish completed research can lead to unnecessary duplication of effort as other researchers conduct similar studies.2 On the other hand, timely publication of research can, in some circumstances, inform changes to standards of practice that positively influence patient care.3,4

Not all abstracts presented at scientific meetings are subsequently published. The rate of publication of abstracts previously presented at international conferences for various specialties ranges from 8% to 81%.1–15 The publication rate of the abstracts presented at medical education conferences is 34.7%.2 The exponential growth and use of simulation-based education and assessment in clinical training programs have spawned international meetings focused on simulation-based education.16–26 The International Meeting for Simulation in Healthcare (IMSH) is the world's largest annual healthcare simulation conference, with more than 2500 attendees from different healthcare professions and medical specialties. IMSH abstracts are predominantly simulation-based education research studies, with attendees having opportunities for oral or poster presentation of accepted abstracts. It is unknown how many of the abstracts presented at IMSH resulted in journal publication and what characteristics are common to published projects.

The aim of this study was to determine the rate at which abstracts presented at IMSH were subsequently published in full and the time between abstract presentation and full publication. We also aimed to describe the study features influencing subsequent journal publication and the relationship between these features and journal impact factor (IF).


METHODS

This study received exempt status from the Conjoint Health Research Ethics Board, University of Calgary. We reviewed all accepted abstracts (ie, research, program evaluation, technology, and innovation) from the 2012 and 2013 IMSH. Works-in-progress abstracts were excluded. Two reviewers per year independently extracted the following information from each abstract: presentation format (oral, poster), subject of study (health professions education research, other), type of scholarship (research, program evaluation, technology, and innovation), research method (quantitative, qualitative, or mixed), study design (descriptive, single-group posttest only, single-group pre-post, two-group nonrandomized, randomized controlled), outcome measure [satisfaction or attitudes, knowledge, skills and behaviors (in simulated environment), behaviors (in clinical environment), patient or healthcare outcome], number of institutions represented in authorship group, and number of study sites. Conflicts were resolved by discussion with a third reviewer with advanced training in research methodology. Interrater agreement, defined as the percentage of agreement between the two reviewers across all items extracted from the abstracts, was calculated.

Two databases were searched to minimize the likelihood of missing publications. Two investigators searched PubMed and Google Scholar (January 1, 2012–August 1, 2016) using the names of the first, second, and last author, followed by key words from the title, abstract, or both, for comparison with IMSH abstracts. Retrieved publications were compared with the corresponding abstract to ensure that they represented the same scholarly work. Journal name, journal IF (for the year of publication), and publication date were recorded. The publication date was then used to calculate the time from meeting presentation to full publication.


Statistical Analysis

We performed all analyses with a significance level of 0.05 using R software (Version 3.3.2, https://www.R-project.org). The proportion of published abstracts, time to publication, and IFs of peer-reviewed journals were summarized with descriptive statistics (count and percentage for categorical variables and median and interquartile range for numerical variables). Bivariate analyses were performed using simple logistic regression to explore the association between publication status and all other variables. Multivariate analysis was performed using a multiple logistic regression model including study-specific variables (ie, study design, outcome measures, etc) that were highly significant (P < 0.01) in bivariate analyses. Odds ratios (ORs) as well as 95% confidence intervals (CIs) were calculated. We used the Wilcoxon rank sum test to compare journal IFs of published articles with and without certain desirable study features that are associated with publication. The number of desirable study features for each published article was calculated, and a linear regression model was used to explore the association between IF and the total number of desirable features per publication.
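
As an illustration only, the following is a minimal sketch of this analytic workflow in R. It assumes a hypothetical data frame named abstracts with columns such as published, study_design, outcome, n_institutions, multicenter, impact_factor, and n_features; these names are placeholders and not the authors' original code or variables.

# Descriptive statistics for publication status and journal impact factor
table(abstracts$published)
summary(abstracts$impact_factor)

# Bivariate analysis: simple logistic regression of publication status on one predictor
biv <- glm(published ~ study_design, data = abstracts, family = binomial)
exp(cbind(OR = coef(biv), confint(biv)))   # odds ratios with 95% CIs

# Multivariate analysis: multiple logistic regression including study-specific
# variables that were highly significant in the bivariate analyses
multi <- glm(published ~ study_design + outcome + n_institutions,
             data = abstracts, family = binomial)
exp(cbind(OR = coef(multi), confint(multi)))

# Wilcoxon rank sum test: journal IF with vs. without a desirable feature
# (eg, multicenter design), restricted to published abstracts
published_only <- subset(abstracts, published == 1)
wilcox.test(impact_factor ~ multicenter, data = published_only)

# Linear regression: journal IF vs. number of desirable study features
summary(lm(impact_factor ~ factor(n_features), data = published_only))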


RESULTS

In total, 541 abstracts were reviewed, with an interrater agreement of 97.5%. Eighty-eight percent (476/541) of the abstracts were related to health professions education. Among all abstracts, 22.0% (119/541) were published, with a median time to publication of 16 months [interquartile range (IQR) = 8.5–25 months], ranging from less than 1 to 43 months (Fig. 1). Among the published abstracts, 38% (46/119) were published within 12 months, 73% (87/119) within 24 months, and 93% (111/119) within 36 months. In terms of abstract type, 33.9% (80/236) of research abstracts, 12.6% (29/231) of program evaluation abstracts, and 13.5% (10/74) of technology and innovation abstracts were eventually published. Articles were published in a total of 63 different peer-reviewed journals (Table 1), with a median IF of 1.69 (IQR = 1.24–2.93). A total of 44.5% (53/119) of the abstracts were published in 1 of 18 different healthcare education journals. Simulation in Healthcare (18.5%; IF 1.59 in 2013) and Medical Education (7%; IF 3.62 in 2013) were the top healthcare simulation and medical education journals publishing IMSH abstracts.

FIGURE 1

TABLE 1

In bivariate analysis, publication was significantly less likely for program evaluation abstracts than for research abstracts (12.6% vs. 33.9%, OR = 0.28, 95% CI = 0.17 to 0.44, P < 0.001), for technology and innovation abstracts than for research abstracts (13.5% vs. 33.9%, OR = 0.30, 95% CI = 0.14 to 0.60, P = 0.001), and for qualitative than for quantitative methods (12.5% vs. 33.8%, OR = 0.28, 95% CI = 0.18 to 0.43, P < 0.001). Compared with single-center studies, multicenter studies were more likely to be published (39.7% vs. 21.0%, OR = 2.38, 95% CI = 1.09 to 5.00, P = 0.024) (Table 2).

TABLE 2

In multivariate analysis, we found that study design, outcome measures, and number of institutions in the authorship group were independently associated with publication. Stronger study designs were associated with significantly higher publication rates, with a dose-response relationship. Compared with descriptive studies, single-group posttest-only studies (adjusted OR = 2.16, 95% CI = 1.09 to 4.62, P = 0.035), single-group pre/posttest studies (adjusted OR = 2.70, 95% CI = 1.15 to 6.60, P = 0.025), two-group nonrandomized studies (adjusted OR = 4.39, 95% CI = 1.51 to 12.99, P = 0.007), and randomized trials (adjusted OR = 5.12, 95% CI = 2.00 to 13.75, P < 0.001) were more likely to be published. Eventual publication was also more likely for studies reporting patient or healthcare outcomes than for studies reporting learner reactions or satisfaction as an outcome (adjusted OR = 7.04, 95% CI = 1.65 to 36.7, P = 0.011). Having multiple institutions represented in the authorship group was positively associated with publication compared with a single institution (adjusted OR = 1.58, 95% CI = 1.02 to 2.44, P = 0.041) (Table 2).

For the published abstracts, we further explored the relationship between journal IF and the study features shown to be positively associated with publication in the previous analysis: randomized trial design, higher level of outcome measures (behavior in clinical environment or patient and/or healthcare outcomes), multiple institutions in authorship, and multicenter studies. Studies with these features tended to be published in higher IF journals; however, only multiple institutions in authorship group and multicenter studies yielded statistical significance (Table 3). We used linear regression to explore the association between the number of desirable study features and the IF of the journal. Compared with studies without any of these features (mean IF 1.78), studies with one or two desirable features were published in journals with higher IF, but these differences were not statistically significant (coefficient = 0.59, 95% CI = −0.03 to 1.22, P = 0.06 and coefficient = 0.73, 95% CI = −0.09 to 1.55, P = 0.08, respectively). Studies with three desirable features were published in journals with an IF 1.97 higher than those without any of these features, representing a statistically significant difference (95% CI = 0.76 to 3.18, P = 0.001). No studies had all four desirable study features (Fig. 2).

TABLE 3

FIGURE 2


DISCUSSION

Presenting abstracts at scientific meetings is an important step in the dissemination of research findings, contributing to knowledge translation and influencing future research in the field. We found that 22% of the abstracts presented at IMSH were eventually published in peer-reviewed journals. This is less than the publication rate for abstracts presented at other leading medical education conferences: 37% for the Research in Medical Education Conference2 and 32% for the Canadian Conference on Medical Education.2 This is also less than the reported 44.5% mean publication rate for biomedical and clinical research conferences,1 although this comparison may not be as valid given the differing nature of abstracts presented at these conferences and IMSH.

The publication rate provides only a crude representation of quality. Some abstracts present preliminary results that may differ from those shared in the full publication,15 whereas other abstracts may have sample sizes and/or treatment effects that differ from those reported in subsequent publications.27 Furthermore, the quality of some abstracts may not be properly represented by publication rate, particularly for scholarly work whose maximum impact is best measured by means other than publication. Nonetheless, publication rates provide conferences with a way to track the progression of abstracts to full publication.

There are many possible reasons for the low publication rate of abstracts presented at IMSH. As a relatively new field of research, the criteria for abstract acceptance may be more lenient,15 because the scientific program at IMSH is “just as much about educating researchers as it is about showcasing good work” (personal communication, A. Calhoun and J. Hui, January 2017). Accepting abstracts with a broader range of methodological rigor allows for IMSH to serve as a venue for training researchers. Acceptance rates for abstracts submitted to IMSH hover between 50% and 70% (personal communication, A. Calhoun and J. Hui, January 2017), in stark contrast to the 21% to 23% acceptance rates for articles submitted to Simulation in Healthcare (personal communication, M. Scerbo and K. Durst, January 2017). Other medical education conferences, such as the Ottawa Conference (25% abstract acceptance rate), had relatively lower abstract acceptance rates in prior years (personal communication, B. Issenberg, January 2017). This supports the hypothesis that the low publication rate of IMSH abstracts is in part due to relatively lenient acceptance criteria that are intended to support the education and growth of healthcare simulation researchers.

Other possible reasons for the low publication rate include the following: investigators may lack the expertise, time, or resources to pursue publication; the quality of research may be low, resulting in a lack of interest (or unlikely acceptance) from journals6,15; and some abstract types (eg, technology and innovation) may be ill suited for publication because of the descriptive nature of the projects. Conference attendance for many individuals may be contingent upon abstract acceptance. For some conferences trying to maximize attendance, this may factor into the standards set for abstract acceptance. Although many medical education and specialty journals are publishing simulation-based research, our results suggest that healthcare simulation journals (eg, Simulation in Healthcare as the top journal of publication) may be more receptive to simulation-based research than other journal types. Other healthcare simulation journals (eg, Advances in Simulation, BMJ Simulation and Technology Enhanced Learning, Clinical Simulation in Nursing) introduced over the years may help address this issue in the future.

The relatively low publication rate of abstracts presented at IMSH highlights the need to support researchers in publishing their completed work. International simulation conferences, such as IMSH, have been running workshops (eg, how to design and implement a simulation study, how to write an article, etc), expert panels (eg, how to obtain research grants, how to do multicenter research, etc), and networking sessions for many years. Other opportunities to support researchers exist, including expert feedback sessions during abstract presentations structured to enhance the likelihood of publication and special networking events during conferences that promote mentorship of novice investigators. Mentorship programs that pair early-career investigators with seasoned researchers facilitate the sharing of tips, tricks, and resources that promote scientific rigor and future academic productivity. For example, the International Network for Simulation-based Pediatric Innovation, Research and Education (INSPIRE) offers longitudinal mentorship for investigators, from study design through to publication.28,29 Simulation societies and networks should continue to assist with the dissemination of research by developing content (and/or programs) to support investigators with publication of their work.

In our study, we describe several variables (ie, quantitative studies, randomized controlled studies, studies with clinical/patient outcomes, projects with collaborators from multiple institutions, and multicenter studies) that were predictive of subsequent publication. In other medical specialties, randomized controlled trials are also published more often than abstracts with other study designs, and multicenter research is similarly associated with a greater likelihood of subsequent publication compared with single-center studies.1 Some of the predictive variables listed previously (ie, randomized controlled studies, studies with clinical/patient outcomes, multicenter studies) are also indicators of high-quality medical education research, as described by the medical education research study quality instrument.30 Previous systematic reviews of the simulation education literature report variability in the quality of published studies, with anywhere from 43% to 75% of studies achieving a medical education research study quality instrument score of 12 points or greater.16–19 We were encouraged to find a positive association between a combination of three of these predictive variables and higher journal IF and to see that several abstracts were published in higher-IF journals. That being said, abstracts without these features may still represent high-quality research if conducted in a rigorous manner. Although these variables may be predictive of publication, we recognize the importance of a diversity of approaches to research and scholarship in simulation; the freedom to select among various research approaches empowers researchers to explore innovative lines of inquiry with methods best suited to their expertise and interest.


Limitations

Our study has several limitations. We reviewed abstracts from the 2012 and 2013 IMSH meetings because this provided authors ample time to reach publication (ie, up to 4 years for 2012 abstracts). Because the field of simulation-based research has matured in the past few years, it is possible that publication rates for abstracts presented from 2014 to 2016 may be different. We analyzed abstracts from only one simulation conference, thus introducing a potential selection bias because most IMSH attendees are from the United States. We selected IMSH because it is the oldest, largest, and most mature healthcare simulation conference and because other healthcare simulation conferences had significantly fewer abstracts presented in 2012 and 2013. Although we attempted to be thorough in our search strategy to identify published studies, it is possible that we missed some publications. Some abstracts (eg, technology and innovation) may have been better suited for publication in journals that are not indexed in PubMed, although these should have been captured in our Google Scholar search. Other abstracts may have been published between the time we completed our search (August 1, 2016) and the publication of this article. Looking ahead, future research could assess the effect of interventions (eg, mentorship programs) designed to improve publication rates and explore the rate of publication for abstracts presented at other simulation education conferences around the world.


REFERENCES

1. Scherer RW, Langenberg P, von Elm E. Full publication of results initially presented in abstracts. Cochrane Database Syst Rev 2007;(2):MR000005.
2. Walsh CM, Fung M, Ginsburg S. Publication of results of abstracts presented at medical education conferences. JAMA 2013;310(21):2307–2309.
3. Fosbol EL, Fosbol PL, Harrington RA, Eapen ZJ, Peterson ED. Conversion of cardiovascular conference abstracts to publications. Circulation 2012;126(24):2819–2825.
4. DeMola PM, Hill DL, Rogers K, Abboud JA. Publication rate of abstracts presented at the shoulder and elbow session of the American Academy of Orthopaedic Surgery. Clin Orthop Relat Res 2008;467(6):1629–1633.
5. Bydder SA, Joseph DJ, Spry NA. Publication rates of abstracts presented at annual scientific meetings: how does the Royal Australian and New Zealand College of Radiologists compare? Australas Radiol 2004;48:25–28.
6. De Bellefeuille C, Morrison CA, Tannock IF. The fate of abstracts submitted to a cancer meeting: factors which influence presentation and subsequent publication. Ann Oncol 1992;3:187–191.
7. Goldman L, Loscalzo A. Fate of cardiology research originally published in abstract form. N Engl J Med 1980;303(5):255–259.
8. Halikman R, Scolnik D, Rimon A, Glatstein MM. Peer-reviewed journal publication of abstracts presented at an International Emergency Medicine Scientific Meeting: outcomes and comparison with the previous meeting [published online August 20, 2016]. Pediatr Emerg Care. doi: 10.1097/PEC.0000000000000831.
9. Meranze J, Ellison N, Greenhow DE. Publications resulting from anesthesia meeting abstracts. Anesth Analg 1982;61:445–448.
10. Post RE, Mainous AG 3rd, O'Hare KE, King DE, Maffei MS. Publication of research presented at STFM and NAPCRG conferences. Ann Fam Med 2013;11(3):258–261.
11. Ravn AK, Petersen DB, Folkestad L, Hallas P, Brabrand M. Full-text publication of abstracts in emergency medicine in Denmark. Scand J Trauma Resusc Emerg Med 2014;22(1):33.
12. Scherer RW, Dickersin K, Langenberg P. Full publication of results initially presented in abstracts. A meta-analysis. JAMA 1994;272:158–162.
13. Sprague S, Bhandari M, Devereaux PJ, et al. Barriers to full-text publication following presentation of abstracts at annual orthopaedic meetings. J Bone Joint Surg Am 2003;85-A(1):158–163.
14. Varghese RA, Chang J, Miyanji F, Reilly CW, Mulpuri K. Publication of abstracts submitted to the Annual Meeting of the Pediatric Orthopaedic Society of North America: is there a difference between accepted versus rejected abstracts? J Pediatr Orthop 2011;31:334–340.
15. Walby A, Kelly AM, Georgakas C. Abstract to publication ratio for papers presented at scientific meetings: how does emergency medicine compare? Emerg Med (Fremantle) 2001;13:460–464.
16. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA 2011;306(9):978–988.
17. Cheng A, Lang TR, Starr SR, Pusic M, Cook DA. Technology-enhanced simulation and pediatric education: a meta-analysis. Pediatrics 2014;133(5):e1313–e1323.
18. Cook DA, Brydges R, Hamstra SJ, et al. Comparative effectiveness of technology-enhanced simulation versus other instructional methods: a systematic review and meta-analysis. Simul Healthc 2012;7(5):308–320.
19. Ilgen JS, Sherbino J, Cook DA. Technology-enhanced simulation in emergency medicine: a systematic review and meta-analysis. Acad Emerg Med 2013;20(2):117–127.
20. Cheng A, Eppich W, Grant V, Sherbino J, Zendejas B, Cook DA. Debriefing for technology-enhanced simulation: a systematic review and meta-analysis. Med Educ 2014;48(7):657–666.
21. Brydges R, Hatala R, Zendejas B, Erwin PJ, Cook DA. Linking simulation-based educational assessments and patient-related outcomes: a systematic review and meta-analysis. Acad Med 2015;90(2):246–256.
22. Mundell WC, Kennedy CC, Szostek JH, Cook DA. Simulation technology for resuscitation training: a systematic review and meta-analysis. Resuscitation 2013;84(9):1174–1183.
23. Cheng A, Lockey A, Bhanji F, Lin Y, Hunt EA, Lang E. The use of high-fidelity manikins for advanced life support training—a systematic review and meta-analysis. Resuscitation 2015;93:142–149.
24. Cheng A, Goldman RD, Aish MA, Kissoon N. A simulation-based acute care curriculum for pediatric emergency medicine fellowship training programs. Pediatr Emerg Care 2010;26(7):475–480.
25. Qayumi K, Pachev G, Zheng B, et al. Status of simulation in health care education: an international survey. Adv Med Educ Pract 2014;5(5):457–467.
26. Grant EC, Grant VJ, Bhanji F, Duff JP, Cheng A, Lockyer JM. The development and assessment of an evaluation tool for pediatric resident competence in leading simulated pediatric resuscitations. Resuscitation 2012;83(7):887–893.
27. Weintraub WH. Are published manuscripts representative of the surgical meeting abstracts? An objective appraisal. J Pediatr Surg 1987;22(1):11–13.
28. Cheng A, Kessler D, Mackinnon R, et al. Conducting multicenter research in healthcare simulation: lessons learned from the INSPIRE network. Advances in Simulation 2017;2:6.
29. Hunt EA, Duval-Arnould J, Chime NO, et al. Building consensus for the future of paediatric simulation: a novel ‘KJ Reverse-Merlin’ methodology. BMJ Simulation and Technology Enhanced Learning. Published online April 12, 2016.
30. Reed DA, Cook DA, Beckman TJ, Levine RB, Kern DE, Wright SM. Association between funding and quality of published medical education research. JAMA 2007;298(9):1002–1009.
Keywords:

Simulation; research; education; abstracts; publication; conference

© 2017 Society for Simulation in Healthcare