The Value of Synthesizing Evidence to Inform Cancer Nursing

Noyes, Jane PhD

doi: 10.1097/NCC.0000000000000824

Making best use of evidence to inform cancer nursing practice is a global priority. Synthesizing evidence is an efficient way of maximizing the use of existing evidence and preventing the research waste that comes from commissioning unwarranted new research. The importance of the systematic review to informing clinical decisions is signified by the establishment of global clinical guideline developers such as the World Health Organization, the US Agency for Healthcare Research and Quality, the National Health and Medical Research Council of Australia, and the National Institute for Health and Care Excellence in the United Kingdom. The entire clinical guideline development process is predicated on the systematic review of evidence from which recommendations for practice can be made. The field of cancer has also benefited from national and international consensus statements on treatments and interventions that draw on systematic reviews in combination with clinical expertise and patient preferences.

Cancer was one of the first clinical specialties to embrace quantitative systematic reviews of the effects of drugs and other types of treatments. The Cochrane Library, for example, has more intervention effect reviews on cancer than on any other topic. This large body of systematic reviews on cancer topics has subsequently underpinned clinical guideline development, transforming treatment options and associated nursing care and improving outcomes for patients. Of specific interest, recent developments include a review to establish the effectiveness and value of European cancer nursing, one of the first of its type.1

While the Cochrane type of quantitative intervention effect review has achieved a state of supremacy, the last 20 years have seen prolific development of other review methodologies that address different types of questions with diverse types of evidence (such as qualitative and mixed methods). Interestingly, nurses have been highly influential in the methodological development of diverse review types that are more likely to be useful in developing new theory and new insights into patient experience and nursing care. The new Cochrane Handbook, for example, includes a chapter on qualitative evidence synthesis,2 and Cochrane has an Effective Practice and Organization of Care review group. In a more general context (Figure 1), it is now possible to use diverse evidence synthesis methods for a much wider set of purposes, such as to

Figure 1: New hierarchy of evidence that responds to user requirements for the inclusion of diverse evidence for decision-making and timely reviews. Adapted from
  • determine the pool of known evidence on a topic
  • formulate review questions/determine outcomes and clarify review parameters
  • clarify concepts and synthesize theory
  • synthesize policy intentions and outcomes
  • synthesize system-wide policy outcomes
  • develop theory to inform a primary study
  • develop theory as a primary purpose
  • understand illness experiences
  • determine how promising practices work
  • understand patient, carer, and key stakeholder experiences, values, and preferences concerning interventions
  • determine factors that impact on intervention implementation, fidelity, reach, acceptability, and feasibility and to identify benefits and harms
  • estimate the cost and effectiveness of interventions
  • determine prognosis
  • determine diagnostic test accuracy
  • determine the psychometric properties of instruments
  • determine the effects and impacts of complex, health system-wide interventions
  • integrate quantitative and qualitative evidence

Guideline developers and decision makers increasingly require qualitative and mixed-methods syntheses, as well as reviews of intervention effects, diagnostic test accuracy, and prognosis to populate specific aspects of the “evidence to decision” framework,3 such as patient values, preferences, and experiences; feasibility; implementation and resource considerations; and equity implications (Figures 1 and 2). It is these specific phenomena that can be addressed by newer review types and methods to better inform cancer nursing and underpin guideline development.

Figure 2: The DECIDE framework.3 Courtesy of Simon Lewin.

The majority of published contemporary reviews in cancer nursing, however, use very few of the many newer methods available, especially the qualitative evidence synthesis methods designed to advance new theory and theoretical insights that go beyond the primary studies.4 There are currently, for example, approximately 18 different and sometimes overlapping qualitative evidence synthesis methods that vary in complexity.5–7 Reviewers commonly find it difficult to select an appropriate synthesis methodology for their specific context. Reviewers also tend to stick with a methodology they are familiar with rather than consider the best methodology for the type of available evidence. To support reviewers in making the best choice, Booth and colleagues5,6 have produced the RETREAT checklist (Table) of considerations when selecting a methodology.

Many cancer interventions that involve nursing also tend to be “complex interventions,” and there is less (although growing) experience of undertaking mixed-methods reviews of complex interventions that focus on complexity and involve health systems–level change.4–8 Theory is also increasingly used to design reviews and interpret evidence, helping review authors produce a more theory-informed and useful product for decision-making. Theory in the form of logic models and social theories can help structure and focus a systematic review of any design and can be used as an integrative or interpretive lens. Cochrane has produced detailed guidance on the choice of theory for use in systematic reviews.9

Reviewers who do apply newer methods frequently find it challenging to interpret and apply the evidence synthesis methods and tools as intended by their originators.10,11 While acknowledging that funded reviews often need to be undertaken rapidly, there appears to be considerable confusion about synthesis methods and designs, blurring of different methods, and shortcuts that omit important stages and processes when it is not appropriate to do so. Reviewers have also found it challenging to report their reviews in a way that has maximum utility for decision makers.10 In recent years, methodologists recognized that too many qualitative evidence syntheses were so poorly reported that they could not be used to make decisions, and they responded by developing detailed reporting guidelines to support both the better conduct and the better reporting of generic qualitative evidence syntheses and meta-ethnographies.12,13 Likewise, many reports of meta-analyses do not meet the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) reporting requirements, and reports of quantitative syntheses without meta-analysis have been particularly poor, leading to the new SWiM (Synthesis Without Meta-analysis) reporting guidance, an extension of PRISMA.14,15 There is at present no specific reporting guideline for mixed-methods reviews, but Flemming and colleagues16 outline some principles to follow. Other recent developments include GRADE-CERQual for assessing confidence in synthesized qualitative findings.17 This latter development is important because decision makers have become accustomed to the similar GRADE method for assessing the certainty of evidence of intervention effects and were keen to have a comparable system for qualitative evidence syntheses. Syntheses of qualitative evidence are of more value to decision makers if they have confidence in the quality of the review and the strength of the evidence.

Of particular concern, with some notable exceptions, patient and public involvement has been much slower to be fully integrated into the conduct of evidence syntheses, especially reviews that are not funded. Many reviews are conducted without any patient and public involvement, whereas for most funded reviews such input is expected, because a coproduced review product is likely to be more patient-centered and of greater value to decision makers. Cochrane, for example, has a huge consumer network of people to draw on. Many local not-for-profit organizations and individuals are, moreover, more than willing to contribute to nonfunded reviews because they want to see improved treatments and services for people with cancer. There is also a substantial evidence gap in the conduct of reviews at the interface between health and social cancer care, and of reviews focused purely on social care and cancer. People living with cancer experience a myriad of psychological, social, and domestic problems that impact their lives and well-being. Cancer nurses are well placed to fill this known evidence gap to benefit patients.

It is, however, positive to see a mixture of different review types and designs in the current themed issue on evidence synthesis. Selected reviews include a scoping review, a priority-setting review, meta-analyses, and qualitative evidence syntheses addressing various questions of importance to cancer nursing. Of particular interest, Carney et al18 use meta-ethnography to transform the findings of primary qualitative studies to better understand experiences during childhood cancer survivorship. Meta-ethnography is one of the more complex qualitative evidence synthesis methods and requires experience of conducting primary qualitative research to fully utilize the power of the methodology to develop new theory and interpretations that move beyond the primary study findings. Cadorin et al19 undertook a mixed-methods review and published their protocol in PROSPERO (International Prospective Register of Systematic Reviews). It is a marker of best practice to make the review protocol publicly available prior to conducting the review. Although the summary of findings is articulated very briefly and not in the way originally intended, it is good to see that Diaw et al20 applied GRADE-CERQual to assess confidence in their synthesized qualitative findings. Presenting a summary of findings table with associated assessments of confidence can be exceptionally helpful for decision makers. It was also encouraging to see Han et al21 use a symptom management theory as the theoretical framework to inform the design and interpretation of their quantitative review and meta-analysis.

The RETREAT Framework for Selecting an Appropriate Methodology

In summing up the current state of the art of evidence synthesis in the first 2 decades of the 21st century, the best one can say is that it is a mixed picture of great progress and unfulfilled potential. Different evidence synthesis methods for varying purposes continue to evolve. Global evidence synthesis producers, guideline producers, and decision makers are now much more aware of the value of syntheses of diverse evidence types. There is further potential for cancer nurses to embrace the full range of synthesis methods available in order to make best use of the available evidence in health and social care. But they need to apply evidence synthesis methods carefully and rigorously to produce higher-quality reviews that are valued and used by decision makers. There is now much better methodological guidance to support the conduct and reporting of reviews to further improve their quality and utility for decision-making.

Jane Noyes, PhD

School of Health Sciences, Bangor University, Bangor, UK


1. Campbell P, Torrens C, Kelly D, et al. Recognising European cancer nursing: protocol for a systematic review and meta-analysis of the evidence of effectiveness and value of cancer nursing. J Adv Nurs. 2017;73(12):3144–3153.
2. Noyes J, Booth A, Cargo M, et al. Chapter 21: qualitative evidence. In: Higgins JPT, Thomas J, Chandler J, Cumpston M, Li T, Page MJ, Welch VA (editors). Cochrane Handbook for Systematic Reviews of Interventions Version 6.0 (updated July 2019). Cochrane.
3. DECIDE Evidence to Decision Framework, 2015. Accessed January 10, 2020.
4. Flemming K, Booth A, Garside R, et al. Qualitative evidence synthesis for complex interventions and guideline development: clarification of the purpose, designs and relevant methods. BMJ Glob Health. 2019;4(suppl 1):e000882.
5. Booth A, Noyes J, Flemming K, et al. Guidance on choosing qualitative evidence synthesis methods for use in health technology assessments of complex interventions [online]. 2016. http://www.integrate-hta.eu/downloads/. Accessed January 10, 2020.
6. Booth A, Noyes J, Flemming K, et al. Structured methodology review identified seven (RETREAT) criteria for selecting qualitative evidence synthesis approaches. J Clin Epidemiol. 2018;99:41–52.
7. Booth A, Noyes J, Flemming K, et al. Formulating questions to explore complex interventions within qualitative evidence synthesis. BMJ Glob Health. 2019;4(suppl 1):e001107.
8. Noyes J, Booth A, Moore G, et al. Synthesising quantitative and qualitative evidence to inform guidelines on complex interventions: clarifying the purposes, designs and outlining some methods. BMJ Glob Health. 2019;4(suppl 1):e000893.
9. Noyes J, Hendry M, Booth A, et al. Current use was established and Cochrane guidance on selection of social theories for systematic reviews of complex interventions was developed. J Clin Epidemiol. 2016;75:78–92.
10. France EF, Ring N, Thomas R, et al. A methodological systematic review of what's wrong with meta-ethnography reporting. BMC Med Res Methodol. 2014;14:119.
11. France EF, Uny I, Ring N, et al. A methodological systematic review of meta-ethnography conduct to articulate the complex analytical phases. BMC Med Res Methodol. 2019;19(1):35.
12. Tong A, Flemming K, McInnes E, et al. Enhancing transparency in reporting the synthesis of qualitative research: ENTREQ. BMC Med Res Methodol. 2012;12:181.
13. France EF, Cunningham M, Ring N, et al. Improving reporting of meta-ethnography: the eMERGe reporting guidance. J Adv Nurs. 2019;75(5):1126–1139.
14. Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate healthcare interventions: explanation and elaboration. BMJ. 2009;339:b2700.
15. Campbell M, McKenzie JE, Sowden A, et al. Synthesis Without Meta-analysis (SWiM) in systematic reviews: reporting guideline. BMJ. 2020;368:l6890.
16. Flemming K, Booth A, Hannes K, et al. Cochrane qualitative and implementation methods group guidance series-paper 6: reporting guidelines for qualitative, implementation, and process evaluation evidence syntheses. J Clin Epidemiol. 2018;97:79–85.
17. Lewin S, Booth A, Glenton C, et al. Applying GRADE-CERQual to qualitative evidence synthesis findings: introduction to the series. Implement Sci. 2018;13(suppl 1):2.
18. Carney KB, Guite JW, Young EE, Starkweather ER. Forced enlightenment: a metasynthesis of experiences during childhood cancer survivorship. Cancer Nurs. 2020;43(3):E159–E171.
19. Cadorin L, Bressan V, Truccolo I, Suter N. Priorities for cancer research from the viewpoints of cancer nurses and cancer patients: a mixed-method systematic review. Cancer Nurs. 2020;43(3):238–256.
20. Diaw M, Sibeoni J, Manolios E, et al. The lived experience of work-related issues among oncology nurses: a metasynthesis. Cancer Nurs. 2020;43(3):200–221.
21. Han CJ, Yang GS, Syrjala K, et al. Symptom experiences in colorectal cancer survivors after cancer treatments: a systematic review and meta-analysis. Cancer Nurs. 2020;43(3):E132–E158.
Copyright © 2020 Wolters Kluwer Health, Inc. All rights reserved