Evaluation is fundamental to evidence implementation; the Joanna Briggs Institute (JBI) model indicates that for evidence-based healthcare (EBHC) to be implemented, evaluation is a critical component in demonstrating the impact or sustainability of a change process.1 The lack of robust evaluation studies has historically been one of the perennial criticisms of the EBHC field. Compounding these criticisms is the fact that quantitative reviews with meta-analysis may tell us (with a degree of precision or confidence) the effect size of an intervention for a given outcome, but this is not the same as evaluating the impact on the processes or delivery of clinical practice. In this editorial, we will address process evaluation within the context of evidence implementation.
Process evaluation can be understood in more than one way. First, following Donabedian principles, process evaluation facilitates examination of how care has been provided. More specifically, in this context process evaluation informs judgements of appropriateness, acceptability, completeness or degree of competency: for example, measuring changes in waiting times before surgery or the percentage of patients with diabetes who receive their annual foot inspection. Process evaluation in this sense uses quality indicators, for example auditing compliance with recommended immediate newborn care. Using an audit of nine process indicators (e.g. the percentage of healthcare staff who received education on essential newborn care; the percentage of newborns for whom exclusive breastfeeding was initiated within 1 h of birth), Kitila et al.2 showed that their implementation effort resulted in higher compliance with recommended care. In this way, evaluation of process is used as feedback, or a 'wake-up call', highlighting areas for clinical improvement. Process evaluations informed by the Donabedian model of evaluating the quality of care are reported in each JBI Implementation Case Report, with several important examples in this issue.
Three implementation case studies in this issue include process evaluations comparing current practice against evidence-based quality indicators for high-quality care. Each study included a baseline measure of current practice, followed by site-specific implementation strategies and then a postimplementation audit to observe, measure and report the extent to which processes of care had changed. Successful examples of process evaluation have been published addressing medication administration,3 nursing metrics for the prevention and management of oral mucositis in chemotherapy (including oral hygiene, pain management and patient education),4 and nonpharmacological analgesic interventions for newborn infants in Sao Paulo.5
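By way of illustration only, the brief sketch below (written in Python, with hypothetical indicator names and audit counts rather than data from the cited projects) shows the simple arithmetic that underlies such audits: percentage compliance is computed for each process indicator at baseline and after implementation, and the change between the two audits is reported.

# A minimal sketch of a baseline vs. postimplementation compliance audit;
# indicator names and counts are hypothetical, not taken from the cited studies.
from dataclasses import dataclass

@dataclass
class AuditResult:
    compliant: int  # audited cases that met the quality indicator
    audited: int    # total cases audited

    @property
    def compliance_pct(self) -> float:
        # Percentage compliance, guarding against an empty audit
        return 100.0 * self.compliant / self.audited if self.audited else 0.0

# Hypothetical process indicators with (baseline, postimplementation) audit counts
audits = {
    "staff educated on essential newborn care": (AuditResult(12, 40), AuditResult(36, 40)),
    "exclusive breastfeeding initiated within 1 h of birth": (AuditResult(55, 100), AuditResult(82, 100)),
}

for indicator, (baseline, followup) in audits.items():
    change = followup.compliance_pct - baseline.compliance_pct
    print(f"{indicator}: {baseline.compliance_pct:.0f}% -> "
          f"{followup.compliance_pct:.0f}% ({change:+.0f} percentage points)")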
Process evaluation also plays an important role in health services research. The Medical Research Council has published guidance on performing process evaluations of complex interventions.6 In this guidance, a process evaluation is defined as 'a study which aims to understand the functioning of an intervention, by examining implementation, mechanisms of impact and contextual factors. Process evaluation is complementary to, but not a substitute for, high-quality outcomes evaluation'.6(p.8) Other authors have used this guidance to develop and test process evaluations of evidence-based care pathways and other implementation strategies.7
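As a purely illustrative aid, the sketch below (a hypothetical structure, not part of the MRC guidance) groups the kinds of findings a process evaluation might record under the three components named in this definition.

# A hypothetical way to organize findings from an MRC-style process evaluation;
# the field names mirror the three components named in the guidance.
from dataclasses import dataclass, field

@dataclass
class ProcessEvaluation:
    implementation: list[str] = field(default_factory=list)        # what was delivered, and how
    mechanisms_of_impact: list[str] = field(default_factory=list)  # how the intervention produced change
    contextual_factors: list[str] = field(default_factory=list)    # external factors shaping delivery and effects

# Example entries are invented for illustration only
evaluation = ProcessEvaluation(
    implementation=["80% of planned education sessions were delivered"],
    mechanisms_of_impact=["audit feedback prompted ward-level protocol reminders"],
    contextual_factors=["staff turnover during the follow-up period"],
)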
While both forms of process evaluation fit within the JBI model for EBHC, the role within quality improvement as a form of evidence generation is widely taught and disseminated across the JBI Collaboration. In JBI, evidence generation is described as a legitimate means of knowledge creation to inform policy and practice, arising from the results of well designed studies grounded in any methodological position. The challenge for JBI authors, methodologists and the Collaboration is to consider how process evaluation fits within evidence generation and informs other components of the model. Moving from this statement to a method and methodology that integrate with the JBI model has yet to be explored.1 Future issues of JBI Evidence Implementation may well be the test cases for the development of a primary research agenda that is collaborative, aligned with the JBI model, informs and concurrently draws from relevant methods groups, and contributes to the JBI mission and vision. For 20 years, JBI has built a reputation on clinical implementation and on ensuring systematic reviews are both high quality and focused on generating recommendations and lines of action that meet knowledge needs in policy and practice. It is important to know whether a certain intervention is proven effective (proof of concept). It is equally important to know whether an intervention works in a specific context (proof in context).
The challenge for JBI and other EBHC groups is to consider how to integrate external evidence operationally and to define scholarship within this context. As described by the Dutch Council for Public Health and Society, it is not only clinicians and policy makers who need to consider the integration of evidence in decision making and the actions that precede it.8 This shift will have strategic and operational consequences and must be part of an emergent learning process. JBI must therefore consider, plan for and design systems that make use of local data from practice and that plan for the integration of operational, methodological and scholastic research (integration here meaning that, within the JBI Model, a suitable pathway is developed that closes the loop from evidence generation through to dissemination and knowledge translation). Such a process would support and build on the evaluation of implementation and could then be extended into a formalized guide from which pedagogical innovations that further knowledge of process evaluation can be translated throughout the Collaboration, as has been the case with the current JBI Implementation Handbook.9
Acknowledgements
Conflicts of interest
The authors report no conflicts of interest.
REFERENCES
1. Jordan Z, Lockwood C, Munn Z, Aromataris E. The updated Joanna Briggs Institute Model of Evidence-Based Healthcare. Int J Evid Based Healthc 2019; 17:58–71.
2. Kitila SB, Sudhakar M, Feyissa GT. Compliance to immediate newborn care practice among midwives working in maternity wards: a best practice implementation project. JBI Evid Implement 2020; 18:337–344.
3. Wright KM, Bonser M. The essential steps of medication administration practices project: medication administration improvement practices among acute inpatients in a tertiary hospital: a best practice implementation project. JBI Evid Implement 2020; 18:408–419.
4. Huang TJ, Mu PF, Chen MB, Florczak K. Prevention and treatment of oral mucositis among cancer patients in the hematology–oncology setting: a best practice implementation project. JBI Evid Implement 2020; 18:420–430.
5. Brito APA, Shimoda GT, Aragaki IMM, et al. Nonpharmacological analgesic interventions among newborn infants in the University Hospital of the University of Sao Paulo: a best practice implementation project. JBI Evid Implement 2020; 18:431–442.
6. Moore GF, Audrey S, Barker M, et al. Process evaluations of complex interventions. UK Medical Research Council (MRC) guidance. London, UK: Medical Research Council; 2015.
7. Van Zelm R, Coeckelberghs E, Sermeus W, Aeyels D, Panella M, Vanhaecht K. Protocol for process evaluation of evidence-based care pathways: the case of colorectal cancer surgery. Int J Evid Based Healthc 2018; 16:145–153.
8. Council for Public Health and Society. No evidence without context: about the illusion of evidence-based practice in healthcare. The Hague: The Council; 2017. Available from: https://www.raadrvs.nl/documenten/publicaties/2017/06/19/zonder-context-geen-bewijs (English version). [Accessed 16 September 2020].
9. Porritt K, McArthur A, Lockwood C, Munn Z. JBI handbook for evidence implementation. Adelaide: JBI; 2020. Available from: https://implementationmanual.jbi.global; https://doi.org/10.46658/JBIMEI-20-01. [Accessed 3 December 2020].