Varieties of Integrative Scholarship

Why Rules of Evidence, Criteria, and Standards Matter

McGaghie, William C. PhD

doi: 10.1097/ACM.0000000000000585


The academic community has acknowledged for decades that scholarship takes many forms. Ernest Boyer1 proposed a typology for scholarship in a 1990 monograph, Scholarship Reconsidered: Priorities of the Professoriate. The Boyer typology identified four varieties of scholarly work and expression. They are the scholarship of discovery or original research; integration, which involves intellectual synthesis; application, the responsible use of knowledge to address practical problems; and teaching, educating others about the products of scholarship and enticing future scholars. The Boyer typology is accepted widely across academic disciplines and the learned professions.

In this article, I will address and amplify the scholarship of integration in medical education and in education for the other health professions. My intent is to show that integrative scholarship offers a variety of complementary methods and modes of expression for educating the academic medical community. This plurality of methods and forms of writing matters on both practical and epistemological grounds. For practical reasons, we need to know how and why medical education works in its many forms. For epistemic reasons, we also need to know about the complementary rules of evidence and evidence synthesis that embody our knowledge base and inform educational practice. Each type of integrative scholarship has utility and value for education in the health professions.

Reading and reflection about integrative scholarship reveal that this approach to academic expression not only has a long history dating from the early 19th century but also can take several forms that use different rules of evidence, criteria, and quality standards.2 Research and experience also teach that value judgments always underlie the analysis and synthesis of evidence3 and that experts frequently disagree about the value and publishability of scientific and professional scholarship,4 whether quantitative5,6 or qualitative.7 Scholars should also acknowledge that single publications are rarely definitive and that approximations to truth in science and scholarship rest on the weight and quality of evidence assembled and judged over time.8 Scientific advancement stems from research programs—which include integrative reviews—that are theory based, thematic, sustained, and cumulative.9

What is integrative scholarship? Ernest Boyer1 argues that integrative scholarship “give[s] meaning to isolated facts, putting them in perspective … making connections across the disciplines, placing the specialties in larger context, illuminating data in a revealing way, often educating nonspecialists, too…. What we mean is serious, disciplined work that seeks to interpret, draw together, and bring new insight to bear on original research.” Strong integrative scholarship in the form of research reviews enlightens and informs readers broadly, telling a coherent story about data and their purpose, origins, uniqueness, timeliness, quality, value, meaning, and implications for future science. Writing in Research Synthesis and Meta-Analysis: A Step-by-Step Approach (4th ed.), psychologist Harris Cooper asserts: “Research syntheses focus on empirical studies and seek to summarize past research by drawing overall conclusions from many separate investigations that address related or identical hypotheses. The research synthesist’s goal is to present the state of the knowledge concerning the relation(s) of interest and to highlight important issues that research has left unresolved.”10 Cooper also states that “research synthesists must be required to meet the same rigorous methodological standards that are applied to primary researchers.”10 Integrative scholarship is hard work, grounded in several research traditions, and requires polished writing to convey its message with clarity and simplicity.

Integrative or synthetic scholarship can vary in breadth. It can be conducted within academic disciplines or, after Boyer,1 across academic disciplines. Within a discipline (e.g., surgery), a systematic review covering 11 empirical studies shows that simulation-based training and practice of surgical skills transfer directly to operating room performance.11 Cross-discipline integrative scholarship combines research data from several fields to address and illuminate complex education and health care questions. A recent example is a qualitative review by Dixon-Woods and colleagues12 that cites evidence from medical education, clinical epidemiology, public policy, health services research, economics, and sociology to explain how the Michigan Intensive Care Unit Project led by Peter Pronovost13,14 achieved a statewide reduction in catheter-related bloodstream infections from a quality improvement program.12 The breadth of an integrative scholarly project and its published reports stems from one’s scientific goals and scholarly horizon.

Journals that publish integrative scholarship in health professions education typically have eclectic and inclusive policies about legitimate synthetic work. Journals’ instructions to authors are telling. Academic Medicine, for example, announces that it seeks “General scholarly articles … [that] cover topics of broad concern to academic medicine … [where] The article combines elements of research and description, where the research is not sufficiently robust or central enough to the article’s message to constitute a full-fledged research report.”15 Similar editorial policy statements have been expressed by many other journals that cover health professions education, including Advances in Health Sciences Education,16 American Journal of Nursing,17 Journal of Graduate Medical Education,18 Medical Science Educator,19 and Teaching and Learning in Medicine,20 and by neighboring clinical and behavioral science journals that publish integrative educational scholarship, such as CHEST,21 Annals of Surgery,22 Health Psychology,23 and Psychological Bulletin.24 The message is clear: many journals welcome manuscripts that present integrative scholarship in health professions education, provided the work addresses important issues and meets quality standards.

Cooper10 and Cooper and Koenka25 propose a seven-step approach to frame the structure and organize the process of research integration. Their idea is that this approach can be used to address integrative scholarship done in a variety of research traditions despite acknowledged differences in culture (e.g., language, meaning of rigor) and terminology (e.g., similarity of a cohort study in clinical research and a behavioral science nonequivalent control group design) across disciplines. The seven steps are:

  1. Formulate the problem;
  2. Search the literature;
  3. Gather information from studies;
  4. Evaluate the quality of studies;
  5. Analyze and integrate the outcomes of studies;
  6. Interpret the evidence; and
  7. Present the results.

Most reports of integrative scholarship embody these steps, explicitly or implicitly. This seven-step approach is an organizing framework to discuss and evaluate integrative scholarship in medical and health professions education.

This article has two objectives. First, I aim to address integrative scholarship in the crucible of five research review traditions in health professions education: narrative, systematic, scoping, critical-realist, and open peer commentary. These five traditions were chosen because they account for the majority of published research reviews in medical and health professions education and can be judged on the intellectual quality of their syntheses and the utility of their conclusions.26,27 Second, I offer general guidelines for reviewers and editors who judge whether integrative scholarship produced by peers is fit for publication to the worldwide academic community in health professions education.

A summary of the five research review traditions crossed by the seven-step approach10,25 (5 × 7 = 35 cells) to research synthesis is presented as an advance organizer in Table 1. Each cell of the table addresses a substantive issue that producers and judges of integrative scholarship must address to preserve the rigor and value of this work.

Table 1: Five Research Review Traditions and Seven Steps of Research Synthesis

Five Research Review Traditions

Here, I describe the five research review traditions and discuss how each approach gathers, judges, and synthesizes evidence to produce integrative scholarship.


Narrative reviews

Narrative reviews have a broad and deep history of scholarly synthesis across the academic disciplines. Examples include the humanities,28 behavioral sciences,29 and health professions education.30 Until recently, narrative reviews were the most common, influential, and widely endorsed approach to research integration across the sciences and humanities.

Operationally, individual authors or scholarly teams who conduct narrative reviews “stake out” a domain of published writing and qualitatively, based on expert opinion or judgment, aggregate the evidence using Imel’s31 three tacit steps, which incorporate the Cooper10 and Cooper and Koenka25 seven-step model:

  1. Define the boundaries, contents, and structure of a manuscript that presents integrative scholarship;
  2. Qualitatively assemble, analyze, interpret, evaluate, and synthesize evidence to create plausible arguments about what the evidence means; and
  3. Write a report that integrates evidence and argument in a convincing way to an audience of one’s peers.

Narrative reviews are typically judged using equally qualitative criteria and standards.32 To illustrate, the Review of Educational Research (RER)33 classifies synthetic scholarship in four categories: integrative, theoretical, methodological, and historical. RER33 also instructs manuscript reviewers to use eight standards and criteria when evaluating manuscripts for publication in the journal:

  1. Quality of the literature;
  2. Quality of analysis;
  3. Significance of the topic;
  4. Impact of the article;
  5. Advancement of the field;
  6. Style;
  7. Balance and fairness; and
  8. Purpose.

The eight RER criteria and standards are given without elaboration or operational definition. Reviewers are “on their own” to interpret boundaries and meaning.31

Cooper10 is very critical of the narrative approach to research integration. Cooper asserts: “Research syntheses conducted in the traditional narrative manner have been much criticized. Opponents of the traditional research synthesis have suggested that this method—and its resulting conclusions—is imprecise in both process and outcome. In particular, traditional narrative research syntheses lack explicit standards of proof. Readers and users of these syntheses do not know what standard of evidence was used to decide whether a set of studies supported its conclusion.34 The combining rules used by traditional synthesists are rarely known to anyone but the synthesists themselves, if even they are consciously aware of what is guiding their inferences.”10

Cooper10 also argues that there are three other disadvantages to traditional [narrative] research synthesis:

  • “First, traditional research syntheses rarely involve systematic techniques to ensure that (a) all relevant research was located and included in the synthesis and (b) information from each study was gathered accurately.”
  • “Second, traditional narrative syntheses were prone to use post hoc criteria to decide whether individual studies met an acceptable threshold of methodological quality.”
  • “Finally, traditional narrative syntheses, by their very nature, fail to result in statements regarding the overall magnitude of the relationship under investigation.”10

Despite these real and potential flaws, narrative reviews have been, and continue to be, important contributions to the integrative scholarly literature in medical and health professions education. To illustrate, a narrative review authored by K. Anders Ericsson,35 “Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains,” published in Academic Medicine in 2004, has had a major impact. Its message is that new educational approaches grounded in learning science, standardized curricula, and rigorous outcome measurement are replacing the traditional model of clinical medical education based solely on patient care experience. The article had received 985 Google Scholar citations as of March 24, 2014; an Academic Medicine article qualifies as a “citation classic” when cited by peers more than 50 times. A more recent example of narrative integrative scholarship of interest to health professions educators, “Socioeconomic status and the health of youth: a multilevel, multidomain approach to conceptualizing pathways,” was published in Psychological Bulletin.36


Systematic reviews

The [Australian] National Health and Medical Research Council (NHMRC) states the aim of a systematic literature review: “The purpose of a systematic literature review is to evaluate and interpret all available research evidence relevant to a particular question…. This differs from a traditional review in which previous work is described but not systematically identified, assessed for quality and synthesized.”37 The NHMRC description is amplified by Albanese and Norcini,38 who assert that a systematic review “is an effort to make the review process transparent and to define the rules of evidence in generally agreed upon ways. It is also an approach that holds reviews to predetermined standards of quality.”

Lang39 describes the characteristics of systematic reviews as an extension of the scientific method and presents a nine-step approach for their conduct. Lang’s nine steps are:

  1. Become interested in a biological or human problem;
  2. Learn what is known about the problem;
  3. Formulate a research question about the problem;
  4. Design an experiment to test one or more possible answers to the question;
  5. Select a sample to study;
  6. Collect the data needed to answer the question;
  7. Analyze and interpret the data, perhaps statistically;
  8. Derive conclusions on the basis of the data; and
  9. Publish the results of the study.

Lang’s nine steps are similar, but not identical, to the Cooper10 and Cooper and Koenka25 seven-step approach to conducting a systematic review.

Professional staff members for the journal Academic Medicine excerpted and adapted information from the Lang article39 to create an informal resource comparing narrative and systematic reviews on four features: reproducibility and research question, search and selection process, analysis, and interpretation.40 These distinctions are shown in Figure 1. In short, a systematic review follows a careful plan of action and is organized, rule governed, and (perhaps) quantitative; a narrative review, by contrast, usually rests on idiosyncratic, qualitative procedures that apply tacit rules of evidence and data synthesis.

Figure 1: Four features of narrative versus systematic reviews. Source: Lang TA. The value of systematic reviews as research activities in medical education. Acad Med. 2004;79:1067–1072.39 Excerpted and adapted with permission from the Association of American Medical Colleges.

Three collaborative research groups sponsor and conduct systematic reviews of interest to medical and health science educators: the Best Evidence Medical and Health Professional Education (BEME) Collaboration,41 the Campbell Collaboration,42 and the Cochrane Collaboration.43 BEME Collaboration systematic reviews aim to improve evidence-informed education in medicine and the health professions. Campbell Collaboration systematic reviews take questions in the behavioral sciences, public policy, and health services as their synthesis targets. By contrast, Cochrane Collaboration systematic reviews address health care; health policy; the effects of interventions for disease prevention, treatment, and rehabilitation; and diagnostic test accuracy.

Systematic reviews performed under auspices of the BEME Collaboration, the Campbell Collaboration, and the Cochrane Collaboration undergo rigorous peer review from the beginning steps of problem formulation and rule-governed literature searching to abstracting data from individual studies,44,45 evaluating the quality of evidence (e.g., medical education research study quality instrument score46), data synthesis, data interpretation, and presentation of results, analogous to the seven-step procedure.10 This deliberate, step-by-step approach to integrative scholarship is intended to produce outcomes that conform to the scientific method, reduce bias, yield reproducible results, and contribute to a cumulative body of scientific evidence.47

Data analysis in the context of a systematic review can be either quantitative or qualitative. Quantitative data analysis in systematic reviews is called meta-analysis, where data from individual studies are aggregated, effect sizes and confidence intervals are calculated, and data presentation and interpretation are done according to established conventions such as forest plots.48–50 In particular, Beretvas48 sets forth 15 desiderata for quantitative data in systematic reviews that employ meta-analysis. Qualitative data synthesis in the context of a systematic review is performed when features of selected studies include heterogeneous research designs, educational interventions, outcome measures, and time frames that rule out quantitative synthesis using meta-analysis.51
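The core calculation behind such a quantitative synthesis can be made concrete with a small sketch. The snippet below illustrates inverse-variance (fixed-effect) pooling, the basic machinery that produces the pooled effect sizes and confidence intervals displayed in forest plots. The three studies and their values are hypothetical, invented only for illustration; a real meta-analysis would follow reporting desiderata such as those of Beretvas48 and would typically also consider a random-effects model.

```python
import math

def fixed_effect_meta(effects, variances):
    """Pool per-study effect sizes by inverse-variance weighting
    (fixed-effect model); return the pooled effect and its 95% CI."""
    weights = [1.0 / v for v in variances]          # precision of each study
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))              # SE of the pooled effect
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)   # normal-approximation 95% CI
    return pooled, ci

# Hypothetical standardized mean differences (Cohen's d) and their
# variances from three invented studies -- illustrative values only.
effects = [0.40, 0.55, 0.30]
variances = [0.04, 0.09, 0.02]
pooled, (lo, hi) = fixed_effect_meta(effects, variances)
print(f"Pooled d = {pooled:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
# -> Pooled d = 0.36, 95% CI [0.15, 0.57]
```

Note that the more precise studies (smaller variances) dominate the pooled estimate, which is exactly why systematic reviewers must first judge study quality before combining results.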

Research reporting conventions for systematic reviews are governed by rules that specify the elements of an integrative research report and their order of presentation. Lang and Secic52 point out basic expectations for reporting a systematic review in the medical literature. Moher and colleagues53 present a detailed account of the “preferred reporting items for systematic reviews and meta-analyses,” known as the PRISMA Statement. They provide a 27-item checklist of elements, covering quantitative meta-analysis of integrative data, that authors of systematic reviews should include in a research report. Many journals that publish reports of systematic literature reviews in health professions education insist that submitted manuscripts conform to the PRISMA Statement guidelines.

There are many examples of systematic reviews in health professions education that use either quantitative or qualitative methods for data synthesis. Illustrative systematic reviews that employ quantitative meta-analysis for data integration include a report by Cook and colleagues, “Technology-enhanced simulation for health professions education,”54 and an article by McGaghie and colleagues55 on the comparative effectiveness of simulation-based medical education (SBME) with deliberate practice versus traditional clinical education on clinical skill acquisition among medical learners. A systematic literature review that uses qualitative methods for data synthesis, “The attributes of the clinical trainer as a role model,” was published recently in Academic Medicine by Ria Jochemsen-van der Leeuw and colleagues.56

Some critics contend that results from systematic reviews are not always trustworthy because the methodology is too specific and rigid.57,58 These and other scholars argue that the systematic review approach to integrative scholarship is acontextual. It fails to account for the conditions under which the original research was performed. Thus, systematic reviews with or without meta-analysis of data are considered by some scholars to be sterile, detached from the complex, ever-changing conditions that affect the way educational interventions work (or do not work) in different settings and circumstances. The third, scoping, and fourth, critical-realist, approaches to integrative scholarship aim to address this deficit.


Scoping studies

Scoping studies are a relatively new addition to the family of integrative scholarship methods. The aim of a scoping study is to map the key concepts contained in a research domain—their breadth, limits, and features—and the primary sources and types of available evidence. The intent is to produce a quick, narrative, descriptive account of the scope of current literature addressing a key research question. The strategy used to identify relevant literature in a scoping study aims to achieve broad, boundary-setting results.59–61

Arksey and O’Malley62 state four reasons to perform a scoping study: to examine the extent, range, and nature of research activity; determine the value of undertaking a full systematic review; summarize and disseminate research findings; and identify research gaps in the current literature. A scoping study is not governed by strict methodological rules (e.g., specific research design, measurement methods) like a systematic review. Instead, “the scoping study method is guided by a requirement to identify all relevant literature regardless of study design.” In addition, “The process is not linear but interactive, requiring researchers to engage with each [scoping review] stage in a reflexive way and, where necessary, repeat steps to ensure that the literature is covered in a comprehensive way.”62

Arksey and O’Malley62 state that there are five required stages of a scoping study:

  1. Identify the research question;
  2. Identify relevant studies;
  3. Select studies;
  4. Chart the data; and
  5. Collate, summarize, and report results.

A sixth, optional stage in a scoping study is to consult with stakeholders, individuals, and groups who have a vested interest in the integrative scholarship, to assess their views about the value and impact of such work.

Scoping study goals are broad and flexible. The inclusive goal is to identify a wide variety of potentially relevant research studies without regard to design, measurement, statistical methods, or qualities of the research presentation. In epidemiological terms, the intent is to conduct a sensitive rather than a specific literature search. There is no requirement to assess the weight or quality of evidence, just its breadth and detail. The relevance or utility of individual research studies to the purpose of a scoping study is judged post hoc, after general criteria are set.

Step four (chart the data) and step five (collate, summarize, and report results) are distinctive to scoping studies. Charting involves creating a literature “map” based on early searching and discussion with experts to identify the range and features of literature relevant to the review question. “The map is likely to be multi-layered (like a geological map), since researchers from different . . . sciences and with differing theoretical orientations can be expected to have studied the same phenomenon in different ways.”63 Writing the scoping report, step five, is not rule governed. Instead, the structure, content, and length of the manuscript are determined by the volume of relevant literature.64 More details about scoping study methodology are available from Levac and colleagues.65


Critical-realist reviews

A critical-realist tradition for integrative scholarship is a hybrid of the narrative, systematic, and scoping review methods. The critical-realist approach relies simultaneously on both professional judgment and rigorous methodology. Rycroft-Malone and colleagues66 teach that “A realist review focuses on understanding and unpacking the mechanisms by which an intervention works (or fails to work), thereby providing an explanation, as opposed to a judgment about how it works. The realist approach is fundamentally concerned with theory development and refinement, accounting for context as well as outcomes in the process of systematically and transparently synthesizing relevant literature.” Moreover, “the realist approach is particularly suited to the synthesis of evidence about complex implementation interventions.”

Integrative scholarship performed within a critical-realist framework is grounded in the idea that “an intervention is a theory; because interventions are implemented on a hypothesis of if we do X in this way, then it will bring about an outcome.” In addition, “the resultant model must be outcome focused because a realist synthesis is concerned with uncovering ‘what works’ within differing contextual configurations.”66

The methods used to conduct a critical-realist research synthesis are deliberate and stepwise, similar but not identical to those of a systematic review. The methodological procedures for a critical-realist review involve four steps, two having subsidiary actions, as articulated by Rycroft-Malone and colleagues66 and Pawson.67 The four steps are:

  1. Define the scope of the review:
    • identify the question,
    • clarify the review purpose(s),
    • find and articulate program theories;
  2. Search for and appraise the evidence:
    • search for evidence,
    • test relevance;
  3. Extract and synthesize findings; and
  4. Develop a narrative.

One key distinction between the critical-realist and systematic approaches to integrative scholarship is that data synthesis in the critical-realist tradition is iterative and qualitative, “focus[ing] on four dimensions: questioning the integrity of a theory, adjudicating between competing theories, considering the same theory in comparative settings, or comparing the ‘official’ theory with actual practice.”66 Quantitative data analysis is not a feature of integrative scholarship in the critical-realist mode.

These ideas conform with Norman and Eva’s68 “critical review” approach to literature synthesis along with the “realist review” tactic espoused by Pawson and colleagues.57,67 Eva69 states: “A good educational research literature review … is one that presents a critical synthesis of a variety of literatures, identifies knowledge that is well established, highlights gaps in understanding, and provides some guidance regarding what remains to be understood. The result should give a new perspective of an old problem…. The author … should feel bound by a moral code to try to represent the literature (and the various perspectives therein) fairly, but need not adopt a guise of absolute systematicity.” Pawson et al57 agree with the assertion, “the review question must be carefully articulated so as to prioritise which aspects of which interventions will be examined.”

Two recently published articles are examples of integrative scholarship that was performed using the critical-realist approach. The first is a critical review of SBME research from 2003 to 2009. This report concluded: “Development of and research into SBME has grown and matured over the past 40 years on substantive and methodological grounds…. The impact and educational utility of SBME are likely to increase in the future. More thematic programmes of research are needed.”70 The second critical-realist review article addresses the impact of mental workload on rater-based assessments. This work “highlight[s] how the inherent cognitive architecture of raters might beneficially be taken into account when designing rater-based assessment protocols.”71

Open peer commentary

The open peer commentary format as a means of integrative scholarship is novel in health professions education. The general approach is for a journal editor (or editorial board) to solicit or commission a “target article” that is provocative, controversial, or at the leading edge of science or scholarship in an academic field. The target article undergoes peer review, perhaps revision, and is published. It is followed by a wide-ranging series of open peer commentaries (also critically reviewed) that may endorse, refute, amplify, or refine its methods, substance, or conclusions. The author of the target article then has the final say in the form of rebuttal, summary remarks, and comments. The result is integrative scholarship via saturation, diverse perspectives, and spirited disputation. It also makes for great reading, sometimes spiked with irony or humor.

Behavioral and Brain Sciences (BBS) is the scholarly journal best known for featuring the open peer commentary format, although other journals (e.g., American Journal of Bioethics, Psychological Bulletin) use it occasionally. The BBS masthead states that it “is the internationally renowned journal with the innovative format known as Open Peer Commentary. Particularly significant and controversial pieces of work are published from researchers in any area of psychology, neuroscience, behavioral biology or cognitive science, together with 10–25 commentaries on each article from specialists within and across these disciplines, plus the author’s response to them. The result is a fascinating and unique forum for the communication, criticism, stimulation, and particularly the unification of research in behavioral and brain sciences from molecular neurobiology to artificial intelligence and the philosophy of the mind.”72

The BBS Instructions for Target Article Authors clearly state the journal’s intentions: “A BBS target article can be: (i) the report and discussion of empirical research that the author judges to have broader scope and implications than might be more appropriately reported in a specialty journal; (ii) an unusually significant theoretical article that formally models or systematizes a body of research; or (iii) a novel interpretation, synthesis, or critique of existing experimental or theoretical work. Occasionally, articles dealing with social or philosophical aspects of the behavioral and brain sciences will be considered.”72 Detailed instructions are also provided to shape the tone and organization of commentary responses.

A recent example of a BBS exchange of interest to scholars in medical education is a target article authored by psychologist Nelson Cowan73 titled, “The magic number 4 in short-term memory: a reconsideration of mental storage capacity,” followed by 39 open peer commentaries, and a detailed response by Cowan. The target article has five major sections: introduction to the problem of mental storage capacity; theoretical framework; empirical evidence for the capacity limit; theoretical account of the capacity limits: unresolved issues; and conclusion. The 39 open peer commentaries are authored by 59 scholars from 10 North American, European, Asian, and Australasian countries. The target article, open peer commentaries, and author’s response together cite 526 references in cognitive and experimental psychology, neuroscience, developmental biology, philosophy of science, and other specialties. The reader leaves this academic conversation not only with a broad perspective on recent research about human memory storage capacity but also with great respect for the tradition of vigorous scholarly disputation.

Judging Integrative Scholarship

The five complementary approaches to the conduct of integrative scholarship suggest that criteria used to judge the quality, utility, and value of this work should also vary. Focused evaluation guidelines are given in the rows and cells of Table 1. The rows10,25 are a distillate of evaluation criteria and reporting conventions for integrative scholarship expressed as lists ranging from 3 to 27 items proposed by contemporary scholars.31,33,39,48,53,62,66,72 These evaluation guidelines complement but do not duplicate the six generic criteria for evaluating scholarship proposed by Glassick et al74: clear goals, adequate preparation, appropriate methods, significant results, effective presentation, and reflective critique.

The summary advanced by Table 1 suggests that judging integrative scholarship is relatively straightforward, perhaps formulaic, when the approach to research synthesis is stepwise and rule governed, as is the case for systematic and critical-realist reviews. For these reviews, judgments about the quality and utility of integrative scholarship can be guided by checklists to gauge whether a manuscript fulfills a priori criteria and reporting standards such as the PRISMA checklist53 or the RAMESES (Realist and Meta-narrative Evidence Syntheses: Evolving Standards) publication standards for critical-realist syntheses.75

Judgments about the quality and utility of integrative scholarship performed using the narrative, scoping, and open peer commentary frameworks are more difficult to render because the rules of evidence, criteria, and reporting standards are frequently opaque. The evaluator or journal referee is expected to exercise expert judgment when evaluating such work, to act as an educational connoisseur.76 Evaluations of integrative scholarship under these circumstances will likely be idiosyncratic and unstandardized, similar to the work being judged.

The key message to reviewers and editors about judging integrative scholarship, expressed as the five columns of Table 1, is that the several approaches to research synthesis are not necessarily better or worse than one another, just different. This is important to Academic Medicine readers because integrative scholarship has many faces, and all contribute to our fund of knowledge. Reviewers and editors should recognize and respect the five integrative scholarship traditions and also be ready to embrace new approaches to research synthesis, such as network analysis, now on the horizon.77 Medical school promotion and tenure committees should also recognize this broad spectrum of scholarship for its significance and impact when making decisions about faculty advancement. Integrative scholarship in medical and health professions education must not be performed with a static, “cookie cutter” mentality, and neither should evaluations of its quality and utility.

Judging the value of integrative scholarship is very different from evaluating its quality and utility. Judgments about value are governed by personal, professional, and national priorities. Value judgments, often tacit or occult, are always at the heart of science and scholarship even though they are rarely acknowledged. As a community of scholars, we may wish to invite colleagues to express the values that underlie the conduct of the original and integrative scholarship reported in their manuscripts.

In conclusion, I have tried to demonstrate that there are a variety of approaches to integrative scholarship that involve different rules of evidence, criteria, and standards when the work is performed and judged. Evaluators of integrative scholarship (e.g., journal editors and referees) must be informed about and respect these distinctions when judging synthetic manuscripts.

References
1. Boyer EL. Scholarship Reconsidered: Priorities of the Professoriate. Princeton, NJ: Carnegie Foundation for the Advancement of Teaching; 1990
2. Chalmers I, Hedges LV, Cooper H. A brief history of research synthesis. Eval Health Prof. 2002;25:12–37
3. Strech D, Tilburt J. Value judgments in the analysis and synthesis of evidence. J Clin Epidemiol. 2008;61:521–524
4. Cicchetti DV. The reliability of peer review for manuscript and grant submissions: A cross-disciplinary investigation. Behav Brain Sci. 1991;14:119–186
5. Fiske DW, Fogg L. But the reviewers are making different criticisms of my paper! Diversity and uniqueness in reviewer comments. Am Psychol. 1990;45:591–598
6. Rothwell PM, Martyn CN. Reproducibility of peer review in clinical neuroscience. Is agreement between reviewers any greater than would be expected by chance alone? Brain. 2000;123(pt 9):1964–1969
7. Dixon-Woods M, Sutton A, Shaw R, et al. Appraising qualitative research for inclusion in systematic reviews: A quantitative and qualitative comparison of three methods. J Health Serv Res Policy. 2007;12:42–47
8. Murad MH, Montori VM. Synthesizing evidence: Shifting the focus from individual studies to the body of evidence. JAMA. 2013;309:2217–2218
9. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Translational educational research: A necessity for effective health-care improvement. Chest. 2012;142:1097–1103
10. Cooper H. Research Synthesis and Meta-Analysis: A Step-by-Step Approach. 4th ed. Thousand Oaks, Calif: Sage Publications; 2010
11. Sturm LP, Windsor JA, Cosman PH, Cregan P, Hewett PJ, Maddern GJ. A systematic review of skills transfer after surgical simulation training. Ann Surg. 2008;248:166–179
12. Dixon-Woods M, Bosk CL, Aveling EL, Goeschel CA, Pronovost PJ. Explaining Michigan: Developing an ex post theory of a quality improvement program. Milbank Q. 2011;89:167–205
13. Pronovost P, Needham D, Berenholtz S, et al. An intervention to decrease catheter-related bloodstream infections in the ICU. N Engl J Med. 2006;355:2725–2732
14. Pronovost PJ, Goeschel CA, Colantuoni E, et al. Sustaining reductions in catheter related bloodstream infections in Michigan intensive care units: Observational study. BMJ. 2010;340:c309
15. Academic Medicine. Editorial policy, publication ethics, and complete instructions for authors. Accessed September 23, 2014
16. Advances in Health Sciences Education. Interested in publishing your article in this journal? Accessed September 23, 2014
17. American Journal of Nursing. Writing for the American Journal of Nursing: Author guidelines. Accessed September 23, 2014
18. Journal of Graduate Medical Education. Author instructions. Accessed September 23, 2014
19. Medical Science Educator. Types of submissions. Accessed September 23, 2014
20. Teaching and Learning in Medicine. Instructions for authors. Accessed September 23, 2014
21. CHEST. For authors: Instructions & policies. Accessed September 23, 2014
22. Annals of Surgery. Instructions for authors. Accessed September 23, 2014
23. Health Psychology. Instructions to authors. Accessed September 23, 2014
24. Psychological Bulletin. Instructions to authors. Accessed September 23, 2014
25. Cooper H, Koenka AC. The overview of reviews: Unique challenges and opportunities when research syntheses are the principal elements of new integrative scholarship. Am Psychol. 2012;67:446–462
26. Cooper HM. Organizing knowledge syntheses: A taxonomy of literature reviews. Knowl Soc. 1988;1:104–126
27. Strike K, Posner G. Types of synthesis and their criteria. In: Ward SA, Reed LJ, eds. Knowledge Structure and Use: Implications for Synthesis and Interpretation. Philadelphia, Pa: Temple University Press; 1983
28. Booth WC, Colomb GG, Williams JM. The Craft of Research. Chicago, Ill: University of Chicago Press; 1995
29. Bem D. Writing a review article for Psychological Bulletin. Psychol Bull. 1995;118:172–177
30. Busari JO, Berkenbosch L, Brouns JW. Physicians as managers of health care delivery and the implications for postgraduate medical training: A literature review. Teach Learn Med. 2011;23:186–196
31. Imel S. Writing a literature review. In: Rocco TS, Hatcher T, eds. The Handbook of Scholarly Writing and Publishing. San Francisco, Calif: Jossey-Bass; 2011
32. Côté L, Turgeon J. Appraising qualitative research articles in medicine and medical education. Med Teach. 2005;27:71–75
33. Review of Educational Research. Standards and criteria. Accessed September 23, 2014
34. Johnson BT, Eagly AH. Quantitative synthesis of social psychological research. CHIP Documents. Paper 13. Accessed September 23, 2014
35. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 suppl):S70–S81
36. Schreier HM, Chen E. Socioeconomic status and the health of youth: A multilevel, multidomain approach to conceptualizing pathways. Psychol Bull. 2013;139:606–654
37. National Health and Medical Research Council. How to Review the Evidence: Systematic Identification and Review of the Scientific Literature. 1999 Canberra, Australia National Health and Medical Research Council
38. Albanese M, Norcini J. Systematic reviews: What they are and why should we care? Adv Health Sci Educ. 2002;7:147–151
39. Lang TA. The value of systematic reviews as research activities in medical education. Acad Med. 2004;79:1067–1072
40. Academic Medicine staff. Tips for submitting systematic reviews to Academic Medicine. Accessed September 23, 2014
41. Best Evidence Medical and Health Professional Education Collaboration (BEME). Accessed September 23, 2014
42. Campbell Collaboration. Accessed September 23, 2014
43. Cochrane Collaboration. Accessed September 23, 2014
44. Haig A, Dozier M. BEME guide no. 3: Systematic searching for evidence in medical education—part 1: Sources of information. Med Teach. 2003;25:352–363
45. Haig A, Dozier M. BEME guide no. 3: Systematic searching for evidence in medical education—part 2: Constructing searches. Med Teach. 2003;25:463–484
46. Reed DA, Cook DA, Beckman TJ, Levine RB, Kern DE, Wright SM. Association between funding and quality of published medical education research. JAMA. 2007;298:1002–1009
47. Valentine JC, Cooper H, Patall EA, Tyson D, Robinson JC. A method for evaluating research syntheses: The quality, conclusions, and consensus of 12 syntheses of effects of after-school programs. Res Syn Methods. 2010;1:20–38
48. Beretvas SN. Meta-analysis. In: Hancock GR, Mueller RO, eds. The Reviewer’s Guide to Quantitative Methods in the Social Sciences. New York, NY: Routledge; 2010
49. Valentine JC. Meta-analysis. In: Cooper H, ed. APA Handbook of Research Methods in Psychology. Vol 3: Data Analysis and Research Publication. Washington, DC: American Psychological Association; 2012
50. Anzures-Cabrera J, Higgins JPT. Graphical displays for meta-analysis: An overview with suggestions for practice. Res Syn Methods. 2010;1:66–80
51. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Med Teach. 2005;27:10–28
52. Lang TA, Secic M. Reporting systematic reviews and meta-analyses. In: How to Report Statistics in Medicine. 2nd ed. Philadelphia, Pa: American College of Physicians; 2006
53. Moher D, Liberati A, Tetzlaff J, Altman DG; PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA Statement. Ann Intern Med. 2009;151:264–269, W64
54. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: A systematic review and meta-analysis. JAMA. 2011;306:978–988
55. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011;86:706–711
56. Jochemsen-van der Leeuw HG, van Dijk N, van Etten-Jamaludin FS, Wieringa-de Waard M. The attributes of the clinical trainer as a role model: A systematic review. Acad Med. 2013;88:26–34
57. Pawson R, Greenhalgh T, Harvey G, Walshe K. Realist review—a new method of systematic review designed for complex policy interventions. J Health Serv Res Policy. 2005;10(suppl 1):21–34
58. McCormack B, Wright J, Dewar B, Harvey G, Ballintine K. A realist synthesis of evidence relating to practice development: Methodology and methods. Pract Dev Health Care. 2007;6:5–24
59. Anderson S, Allen P, Peckham S, Goodwin N. Asking the right questions: Scoping studies in the commissioning of research on the organisation and delivery of health services. Health Res Policy Syst. 2008;6:7
60. Davis K, Drey N, Gould D. What are scoping studies? A review of the nursing literature. Int J Nurs Stud. 2009;46:1386–1400
61. Armstrong R, Hall BJ, Doyle J, Waters E. Cochrane update: “Scoping the scope” of a Cochrane review. J Public Health (Oxf). 2011;33:147–150
62. Arksey H, O’Malley L. Scoping studies: Towards a methodological framework. Int J Soc Res Method. 2005;8:19–32
63. Mays N, Roberts E, Popay J. Synthesising research evidence. In: Fulop N, Allen P, Clarke A, Black N, eds. Studying the Organization and Delivery of Health Services: Research Methods. London, UK: Routledge; 2001
64. Brien SE, Lorenzetti DL, Lewis S, Kennedy J, Ghali WA. Overview of a formal scoping review on health system report cards. Implement Sci. 2010;5:2
65. Levac D, Colquhoun H, O’Brien KK. Scoping studies: Advancing the methodology. Implement Sci. 2010;5:69
66. Rycroft-Malone J, McCormack B, Hutchinson AM, et al. Realist synthesis: Illustrating the method for implementation research. Implement Sci. 2012;7:33
67. Pawson R. Evidence-Based Policy: A Realist Perspective. Thousand Oaks, Calif: Sage Publications; 2001
68. Norman GR, Eva KW. Quantitative Research Methods in Medical Education. Edinburgh, Scotland: Association for the Study of Medical Education; 2009
69. Eva KW. On the limits of systematicity. Med Educ. 2008;42:852–853
70. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44:50–63
71. Tavares W, Eva KW. Exploring the impact of mental workload on rater-based assessments. Adv Health Sci Educ Theory Pract. 2013;18:291–303
72. Behavioral and Brain Sciences. Instructions for authors. Accessed September 23, 2014
73. Cowan N. The magical number 4 in short-term memory: A reconsideration of mental storage capacity. Behav Brain Sci. 2001;24:87–185
74. Glassick CE, Huber MT, Maeroff GI. Scholarship Assessed: Evaluation of the Professoriate. San Francisco, Calif: Jossey-Bass; 1997
75. Wong G, Greenhalgh T, Westhorp G, Buckingham J, Pawson R. RAMESES publication standards: Realist syntheses. J Adv Nurs. 2013;69:1005–1022
76. Eisner EW. The Enlightened Eye. New York, NY: Macmillan Publishing Co.; 1991
77. van de Wijngaert L, Bouwman H, Contractor N. A network approach toward literature review. Qual Quant. 2014;48:623–643
© 2015 by the Association of American Medical Colleges