Systematic review: the first step in developing a complex intervention

Bannigan, Katrina

JBI Database of Systematic Reviews and Implementation Reports: May 2018 - Volume 16 - Issue 5 - p 1079–1080
doi: 10.11124/JBISRIR-2017-003788
Editorial

The University of Plymouth Centre for Innovations in School of Health Professions and Social Care: a Joanna Briggs Institute Centre of Excellence, Plymouth, UK

Correspondence: Katrina Bannigan, katrina.bannigan@plymouth.ac.uk

Systematic reviews are a valuable research methodology with a range of uses, including establishing an existing evidence base, identifying gaps in a knowledge base, identifying and explaining inconsistencies in data, informing guidelines and research priorities, and shaping methodology in subsequent primary research studies. In recent years, systematic reviews have come in for criticism, with the suggestion that systematic reviewing has almost become a self-perpetuating industry; this critique is captured in Bastian's1 conceptualization of systematic reviews as a research hydra, where for every question answered, another two take its place. In the context of this criticism, it is helpful to focus on the role systematic reviews play within the process of developing and evaluating complex interventions.

There are various definitions of a complex intervention but essentially:

All complex interventions have two common characteristics; they have multiple components (intervention complexity) and complicated/multiple causal pathways, feedback loops, synergies, and/or mediators and moderators of effect (pathway complexity). In addition, they may also have one or more of the following three additional characteristics; target multiple participants, groups, or organizational levels (population complexity); require multifaceted adoption, uptake, or integration strategies (implementation complexity); or work in a dynamic multidimensional environment (contextual complexity).2 (p.7)

As such, most of the interventions offered by the nursing and allied health professions would be deemed to be complex interventions. To fully understand whether an intervention is effective or not, we need to know not only if it works but also how, when, why and in what circumstances it works.3

While it is generally acknowledged that randomized controlled trials are used to test the effectiveness of interventions, they are only conducted at the end of a long process that involves a number of phases. The development and testing phases, for feasibility and piloting, precede the evaluation phase in which effectiveness is tested.4 These phases are needed because the intervention must be developed to the point where it can reasonably be expected to have a worthwhile effect before a substantial evaluation, such as a randomized controlled trial, is undertaken.4 Systematic reviews play an important role in the development phase. It is expected that the findings of one or more systematic reviews will inform the development – or design – phase to clearly establish what is known about the intervention to date.

Systematic reviews, whether reviews of effectiveness or qualitative reviews, can provide a lot of information to answer, or highlight the absence of answers to, questions about whether an intervention works and how, when, why and in what circumstances it works. New types of reviews have also evolved: “Realist and meta-narrative reviews are systematic, theory-driven interpretative techniques, which were developed to help make sense of heterogeneous evidence about complex interventions applied in diverse contexts in a way that informs policy.”5 (p.2)

These types of reviews are different from a Joanna Briggs Institute review of effects because, instead of establishing a causal relationship between an outcome and an intervention, they explore the relationship between context, mechanism and outcome from a realist perspective to explain the success or failure of an intervention.5

If a recent, well-conducted systematic review exists or is currently underway, a de novo systematic review will not be required. Equally, to fully inform the multiple characteristics of a complex intervention, a different type of review may be needed; for example, a qualitative meta-synthesis may be conducted to understand the perspective(s) of those involved: patients, caregivers and professionals. In person-centered goal setting, for instance, individual differences and preferences may influence the “effectiveness” of the intervention. To illustrate, the experiences of stroke survivors, their families and unpaid caregivers in goal setting within stroke rehabilitation need to be synthesized, fully explored and integrated before an effective person-centered intervention can be developed.6 As developing a complex intervention involves a number of phases, requiring several studies in a program of research, all associated reviews will need to be kept up to date for the duration of the research.4

It is important to appreciate that a systematic review of complex interventions is not straightforward to conduct; synthesizing research findings becomes more challenging the more complex an intervention is.3 Guidance has been developed to support researchers grappling with this challenge.2 Similarly, Petticrew et al.3 have outlined a pragmatic approach to dealing with the complexities of a systematic review. These authors caution against overcomplicating the review: “systematic reviews should be as complex as they need to be and no more.”3 (p.1214)

To conclude, systematic reviews have an important role to play in the development and evaluation of complex interventions. Thinking about systematic reviews within a program of research is an important reminder that a systematic review is not an end in itself. Systematic reviews are not only an invaluable source of knowledge to inform our decision making, but they also inform and inspire innovation in practice in health and social care.

References

1. Bastian H. Tackling the twin beasts of information overload and reviewer overload. Presented at Plenary II: information overload: are we part of the problem or part of the solution? Cochrane Colloquium, Vienna, 5 October 2015 [Internet]. Available from: http://bit.ly/2yOOx0e. [Cited May 9, 2016].
2. Guise JM, Chang C, Butler M, Viswanathan M, Tugwell P. AHRQ series on complex intervention systematic reviews-paper 1: an introduction to a series of articles that provide guidance and tools for reviews of complex interventions. J Clin Epidemiol 2017; 90:6–10.
3. Petticrew M, Anderson L, Elder R, Grimshaw J, Hopkins D, Hahn R, et al. Complex interventions and their implications for systematic reviews: a pragmatic approach. J Clin Epidemiol 2013; 66(11):1209–1214.
4. Medical Research Council. Developing and evaluating complex interventions: new guidance [Internet]. London: Medical Research Council; 2008. Available from: www.mrc.ac.uk/complexinterventionsguidance. [Cited October 30, 2017].
5. Greenhalgh T, Wong G, Westhorp G, Pawson R. Protocol – realist and meta-narrative evidence synthesis: evolving standards (RAMESES). BMC Med Res Methodol 2011; 11:115.
6. Lloyd A, Bannigan K, Sugavanam T, Freeman J. The experiences of stroke survivors, their families and unpaid carers regarding goal setting within stroke rehabilitation: a systematic review protocol. JBI Database System Rev Implement Rep 2016; 14(1):77–88.
© 2018 by Lippincott Williams & Wilkins, Inc.