Conversations about the use and value of reporting guidelines in research publications often elicit polarized views. Such views signal a need for more information and thoughtful consideration. As an example from our own field of health professions education, the editors’ panel discussion at the 2019 meeting of the Association for Medical Education in Europe in Vienna, Austria, entitled Controversies and Challenges in Publishing Health Professions Education Research, prompted a lively discussion among editors and authors around the use of reporting guidelines in health professions education research publications. Some participants strongly endorsed use of such guidelines, while others staunchly opposed them. In this editorial, we aim to inform the Academic Medicine community about reporting guidelines, share our perspective on them as editors of the journal, and promote ongoing dialogue on these guidelines.
What Are Reporting Guidelines?
Reporting guidelines list and describe the key information that authors should include in a manuscript that describes a research study or literature review, to ensure that readers can understand, evaluate, and/or replicate the study or review.1 Reporting guidelines complement a journal’s instructions for authors, which tend to focus on technical requirements such as word count and structure,2 and differ from critical appraisal tools, which are used to assess the trustworthiness, relevance, and quality of evidence presented in a manuscript or publication.3–5 Some maintain that evaluation of thorough information reporting should be kept distinct from the critical appraisal process, since incomplete reporting is often interpreted as a sign of low methodologic quality when, in fact, it may reflect an authors’ decision to abide by word limits or a simple oversight.6 Others consider thorough reporting of information to be a necessary precursor to critical appraisal and part of a process that fulfills high scientific standards.7,8
A Brief History of Reporting Guidelines
Reporting guidelines evolved over time in response to concerns about the quality of evidence published in scientific journals and the risks associated with trusting what may turn out to be weak evidence in practical decision making. Across many fields of medicine, researchers’ efforts to verify evidence by replicating study findings have been confounded by insufficient information about study methods and results.9 Two reporting guidelines for randomized controlled trials in biomedical research were independently developed to address this problem; the 2 were ultimately combined to form CONSORT (Consolidated Standards of Reporting Trials).9,10
The perceived value of reporting guidelines has increased, given current trends in scholarship such as the Open Science movement, which calls for greater transparency in research,11 and the recent boom in systematic reviews, meta-analyses, and other forms of knowledge synthesis.1,12 CONSORT laid the foundation for subsequent reporting guidelines spanning multiple genres and approaches to scholarship—for example, CARE (for case reports),13 PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses),14,15 SQUIRE (Standards for Quality Improvement Reporting Excellence) 2.0,16 COREQ (Consolidated Criteria for Reporting Qualitative Research),17 and STROBE (Strengthening the Reporting of Observational Studies in Epidemiology).18
While many of the early guidelines originated in biomedical and health research, scholars in the social sciences and education adopted, adapted, and created their own versions of reporting guidelines. Guidelines developed for health professions education include SQUIRE-EDU (Standards for Quality Improvement Reporting Excellence in Education),19 GREET (Guideline for Reporting Evidence-Based Practice Educational Interventions and Teaching),20 and SRQR (Standards for Reporting Qualitative Research).21 Other guidelines frequently referenced in health professions education include CHERRIES (Checklist for Reporting Results of Internet E-Surveys),22 PRISMA,14,15 RAMESES (Realist and Meta-Narrative Evidence Syntheses: Evolving Standards),23 and STROBE.18 Table 1 presents examples of guidelines that are applicable to research in health professions education.
The proliferation of reporting guidelines created a need for quality control and management to reduce redundancy and enhance dissemination. The EQUATOR (Enhancing the Quality and Transparency of Health Research) Network, established in 2008, has played a large role in shaping how members of the biomedical and social science communities conceptualize reporting guidelines.1 The EQUATOR Network states that the purpose of reporting guidelines is “to remind researchers of what information to include in the manuscript, not to tell them how to do research.”9(p73)
Conflicting Arguments on the Value of Reporting Guidelines
Use of reporting guidelines varies. Some journals require authors to complete reporting guideline checklists when submitting articles, other journals encourage reviewers to refer to reporting guidelines when completing reviews, and still other journals do not mention reporting guidelines at all.24,25 Proponents tend to view reporting guidelines as a welcome tool for transparency. Guidelines make basic standards and expectations freely available to all, regardless of publication experience. Guidelines may also improve the quality of scholarship by reducing the omission of key information, a problem noted in several studies that have evaluated published articles for reporting quality.7,26 Reporting key information about a study in a consistent way rewards authors who invest in rigorous methodology.27
Those opposed to reporting guidelines raise concerns about their prescriptive and uncritical use by authors and reviewers.28 The conversion of reporting guidelines into reporting “checklists” can inhibit creativity and imply that the approaches or techniques that conform to guidelines are more valuable than those that do not. The use of reporting guidelines may disadvantage researchers with fewer resources who report smaller-scale studies with visible limitations (e.g., design flaws, lower response rates).29 Some also argue that guidelines are not necessary in a robust peer review system, as it is the duty of expert reviewers to evaluate the sufficiency of information provided by authors and request additional clarification as needed.29
Editors and reviewers appreciate the need for nuance. While it is true that the quality standards for medical education research have been heavily influenced by those of biomedical research, the influx of robust social science research has diversified the reviewer pool, expanded the expertise of journals’ editorial teams, and encouraged many thoughtful discussions of how to ensure fair and accurate evaluation of manuscripts that reflect different philosophies of science and employ methodologies from various disciplines.30
As the editorial team at Academic Medicine, we advocate judicious use of reporting guidelines. Authors, reviewers, and journal editors must apply good judgment when deciding what information should be reported in a given study to maximize clarity and coherence. In some circumstances, it may be acceptable not to include certain information (e.g., information that would compromise participant anonymity, that cannot be obtained, or that does not make sense in the context of the particular study design or methodology or available resources). After all, efforts to investigate many important educational research questions are limited by the amount and quality of the research that aims to answer them. In these cases, new studies, even with important methodological limitations of their own, can substantially and meaningfully add to the literature. This recognition means that we all have to be open to judging research in light of what it adds to our knowledge and not judging that research only by reporting standards that may be overly stringent given the state of a field.
Education research grows by accretion, and small gains can be valuable. Furthermore, authors may write excellent articles and reviewers may complete superb reviews without using reporting guidelines. We therefore do not mandate the use of reporting guidelines, yet we believe that reporting guidelines have value. In the sections that follow, we describe the ways that we envision authors, reviewers, and editors using reporting guidelines.
We encourage authors to become familiar with reporting guidelines endorsed by the EQUATOR Network as well as those commonly used in social sciences such as education31–33 and psychology.34,35 Judicious use means that authors review reporting guidelines carefully, selecting the guidelines most appropriate for their work, and confidently deciding when certain guideline items do not apply. Authors always have the option to explain their rationale to reviewers and editors if they are asked to provide information that they think is not appropriate for their study.
In short, we advise authors to refer to guidelines as a resource throughout the design, conduct, and write-up of a study.28 We do not mandate the use of reporting guidelines for any submission to Academic Medicine. As previously indicated, reports involving novel scholarly questions, or questions that can only be addressed in lower-resource settings, may be more modest or less adherent to reporting guidelines and still represent truly important contributions to the literature. Authors must be careful when making claims or interpretations and should clearly state the limitations of their work. All research reports should be transparent in their descriptions of their methodologies.
The goal is to make research accessible to a diverse audience, not to overly standardize research reports. This ambitious goal requires authors, reviewers, and editors to negotiate a delicate balance between (1) a clear, comprehensive description of the researchers’ assumptions, approach, processes, and findings and (2) respect for researchers’ integrity, ingenuity, and judgment. Finding this balance is particularly important when researchers draw on approaches and methodologies that are well established outside of health professions education.
We rely heavily on the expertise of peer reviewers, who are a critical part of our scholarly community and process. We are committed to finding reviewers with the necessary expertise to fairly and critically judge the quality of manuscripts. At the same time, we recognize that the breadth of philosophical orientations and methodologies in a multidisciplinary field such as health professions education makes it challenging to identify reviewers with both content and methodological expertise for all studies, especially for studies introducing new ideas and approaches to health professions education research. We see reporting guidelines as a potentially helpful resource for reviewers who encounter unfamiliar methodology in an area where they may have content expertise. We also see reporting guidelines as useful reminders for experts who may know the methodology well and take for granted decisions, procedures, or terminology that warrant more explanation for nonexpert readers. In sum, we view guidelines as a tool to enhance the excellent insights that reviewers bring to each manuscript, not as an attempt to standardize, undermine, or de-skill the review process.
Of note, Academic Medicine provides review criteria for research manuscripts (available at https://store.aamc.org/review-criteria-for-research-manuscripts.html). Review criteria are meant to assist reviewers in evaluating manuscripts and providing helpful feedback to authors and editors.
As the journal’s editorial team, we are committed to ensuring that manuscripts accepted for publication provide readers with the information they need to judge the credibility of studies and make informed decisions about how to use study findings. Some manuscripts will require more words than the suggested limits to convey their messages clearly, and we are willing to consider this on a case-by-case basis. Every manuscript submitted to Academic Medicine is also assessed by at least 2 members of the editorial and staff team. We learn about new methodologies ourselves and discuss challenging decisions about manuscripts to ensure that we bring multiple perspectives to bear. While we do not require authors to use reporting guidelines as a checklist, we may recommend that authors review a particular guideline as a way to ensure that they have optimized the reporting of their work.
Our recommendation of judicious use of reporting guidelines will, we hope, unify different perspectives on the role of such guidelines in health professions education publications. We aim to increase awareness of the existence of reporting guidelines, encourage ongoing education about them, and stimulate thoughtful dialogue about the use of reporting guidelines in our community. Ongoing dialogue must include consideration of the utility and relevance of reporting guidelines for research in health professions education and, ideally, will lead to new opportunities for research on the value of reporting guidelines in fostering clear communication and furthering rigor in our field.
1. The EQUATOR Network. Enhancing the QUAlity and Transparency Of health Research (EQUATOR) Network Resource Center: Reporting guidelines for main study types. https://www.equator-network.org. Published 2020. Accessed August 3, 2020.
2. Moher D, Altman DG, Schulz KF, Simera I, Wager E, eds. Guidelines for Reporting Health Research: A User’s Manual. 1st ed. Annapolis, MD: John Wiley & Sons; 2014
3. Critical Appraisal Skills Programme (CASP). Critical appraisal. https://casp-uk.net/glossary/critical-appraisal. Published 2020. Accessed June 22, 2020.
4. Joanna Briggs Institute. Critical appraisal tools. http://joannabriggs-webdev.org/research/critical-appraisal-tools.html. Accessed August 3, 2020.
5. Buccheri RK, Sharifi C. Critical appraisal tools and reporting guidelines for evidence-based practice. Worldviews Evid Based Nurs. 2017;14:463–472
6. Huwiler-Muntener K, Juni P, Junker C, Egger M. Quality of reporting of randomized trials as a measure of methodologic quality. JAMA. 2002;287:2801–2804
7. Cook DA, Beckman TJ, Bordage G. Quality of reporting of experimental studies in medical education: A systematic review. Med Educ. 2007;41:737–745
8. Des Jarlais DC, Lyles C, Crepaz N; TREND Group. Improving the reporting quality of nonrandomized evaluations of behavioral and public health interventions: The TREND statement. Am J Public Health. 2004;94:361–366
9. Altman DG, Simera I. A history of the evolution of guidelines for reporting medical research: The long road to the EQUATOR Network. J R Soc Med. 2016;109:67–77
10. Schulz KF, Altman DG, Moher D; CONSORT Group. CONSORT 2010 statement: Updated guidelines for reporting parallel group randomised trials. BMJ. 2010;340:c332
11. Grove J. Measuring research transparency: New system will measure journals’ research transparency. Times Higher Education. https://www.insidehighered.com/news/2020/01/31/new-system-will-measure-journals-research-transparency. Published January 31, 2020. Accessed August 3, 2020.
12. Maggio LA, Thomas A, Durning SJ. Knowledge synthesis. In: Swanwick T, Forrest K, O’Brien B, eds. Understanding Medical Education. Hoboken, NJ: John Wiley & Sons; 2018:457–469
13. Gagnier JJ, Kienle G, Altman DG, Moher D, Sox H, Riley D; CARE Group. The CARE guidelines: Consensus-based clinical case reporting guideline development. Glob Adv Health Med. 2013;2:38–43
14. Moher D, Liberati A, Tetzlaff J, Altman DG; PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Med. 2009;6:e1000097
15. Tricco AC, Lillie E, Zarin W, et al. PRISMA Extension for Scoping Reviews (PRISMA-ScR): Checklist and explanation. Ann Intern Med. 2018;169:467–473
16. Ogrinc G, Davies L, Goodman D, Batalden P, Davidoff F, Stevens D. SQUIRE 2.0 (Standards for QUality Improvement Reporting Excellence): Revised publication guidelines from a detailed consensus process. BMJ Qual Saf. 2016;25:986–992
17. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): A 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19:349–357
18. von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP; STROBE Initiative. Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: Guidelines for reporting observational studies. BMJ. 2007;335:806–808
19. Ogrinc G, Armstrong GE, Dolansky MA, Singh MK, Davies L. SQUIRE-EDU (Standards for QUality Improvement Reporting Excellence in Education): Publication guidelines for educational improvement. Acad Med. 2019;94:1461–1470
20. Phillips AC, Lewis LK, McEvoy MP, et al. Development and validation of the guideline for reporting evidence-based practice educational interventions and teaching (GREET). BMC Med Educ. 2016;16:237
21. O’Brien BC, Harris IB, Beckman TJ, Reed DA, Cook DA. Standards for reporting qualitative research: A synthesis of recommendations. Acad Med. 2014;89:1245–1251
22. Eysenbach G. Peer-review and publication of research protocols and proposals: A role for open access journals. J Med Internet Res. 2004;6:e37
23. Wong G, Greenhalgh T, Westhorp G, Buckingham J, Pawson R. RAMESES publication standards: Realist syntheses. BMC Med. 2013;11:21
24. Hirst A, Altman DG. Are peer reviewers encouraged to use reporting guidelines? A survey of 116 health research journals. PLoS One. 2012;7:e35621
25. Sharp MK, Tokalic R, Gomez G, Wager E, Altman DG, Hren D. A cross-sectional bibliometric study showed suboptimal journal endorsement rates of STROBE and its extensions. J Clin Epidemiol. 2019;107:42–50
26. Maggio LA, Tannery NH, Kanter SL. Reproducibility of literature search reporting in medical education reviews. Acad Med. 2011;86:1049–1054
27. Horsley T. Tips for improving the writing and reporting quality of systematic, scoping, and narrative reviews. J Contin Educ Health Prof. 2019;39:54–57
28. Barbour RS. Checklists for improving rigour in qualitative research: A case of the tail wagging the dog? BMJ. 2001;322:1115–1117
29. Wharton T. Rigor, transparency, and reporting social science research: Why guidelines don’t have to kill your story. Res Soc Work Pract. 2017;27:487–493
30. Varpio L, MacLeod A. Philosophy of science series: Harnessing the multidisciplinary edge effect by exploring paradigms, ontologies, epistemologies, axiologies, and methodologies. Acad Med. 2020;95:686–689
31. AERA Task Force on Reporting of Research Methods in AERA Publications. Standards for reporting on empirical social science research in AERA publications: American Educational Research Association. Educ Res. 2006;35:33–40
32. AERA Task Force on Standards for Reporting on Humanities-Oriented Research in AERA Publications. Standards for reporting on humanities-oriented research in AERA publications: American Educational Research Association. Educ Res. 2009;38:481–486
33. Newman M, Elbourne D. Improving the usability of educational research: Guidelines for the Reporting of Primary Empirical Research Studies in Education (The REPOSE Guidelines). Eval Res Educ. 2004;18:201–212
34. Appelbaum M, Cooper H, Kline RB, Mayo-Wilson E, Nezu AM, Rao SM. Journal article reporting standards for quantitative research in psychology: The APA Publications and Communications Board task force report. Am Psychol. 2018;73:3–25
35. Levitt HM, Bamberg M, Creswell JW, Frost DM, Josselson R, Suarez-Orozco C. Journal article reporting standards for qualitative primary, qualitative meta-analytic, and mixed methods research in psychology: The APA Publications and Communications Board task force report. Am Psychol. 2018;73:26–46