
An evaluation of a training intervention to support the use of evidence in healthcare commissioning in England

Sabey, Abigail MSc1,2

Author Information
International Journal of Evidence-Based Healthcare: March 2020 - Volume 18 - Issue 1 - p 58-64
doi: 10.1097/XEB.0000000000000208


What is known about the topic?

  • Healthcare commissioners in clinical commissioning groups (CCGs) in England allocate the majority of the funding for healthcare services, but there is no real evidence-based culture in this sector.
  • Healthcare commissioners make less use of empirical forms of evidence compared with practical, local intelligence.
  • Many healthcare managers are unaware of evidence-based sources and library services.

What does this article add?

  • Healthcare commissioners can be supported through training to find and use evidence in their decision-making.
  • Systematically searching for and appraising grey literature as part of an evidence-informed commissioning process should be promoted.
  • Some practical tips for locating and appraising grey literature are offered to help healthcare commissioners include this important but underused source of evidence.


In contrast to the wealth of literature about the importance and promotion of evidence-based clinical practice across the healthcare professions, the use of evidence in UK healthcare commissioning organizations has not been the focus of significant research or scrutiny. Yet clinical commissioning groups (CCGs) in England allocate approximately two-thirds of the National Health Service (NHS) England budget (currently £79.9bn)1 to buy services for their local populations. Other services, such as ‘specialized’ services for rare conditions, military health services, prison healthcare and some public health services, are commissioned by NHS England.2 The focus here is on healthcare commissioning by CCGs in England.

CCGs are responsible for the health of their local populations; they assess health needs, decide priorities and buy services from providers such as hospitals and community health organizations, to meet those needs. Their success is measured in terms of how much they improve health outcomes, placing considerable responsibility on commissioners for the choices they make. Basing healthcare commissioning decisions on the best available evidence about what works would therefore seem both important and necessary, in the same way that clinical staff are expected to deliver evidence-based care.3,4

The limited research that has been carried out in England on the sources of evidence used in healthcare commissioning includes a survey of 11 organizations, which showed that commissioners rate the importance of empirical evidence, such as national guidance and journal articles, lower than that of practical evidence such as local public health intelligence, expert advice and best practice examples.5 This study concluded that the evidence culture in these organizations is one of plurality rather than hierarchy, highlighting the contrast with the traditional evidence-based medicine (EBM) model. A mixed methods study focused more broadly on healthcare managers’ access to and use of research-based knowledge in decision-making6 uncovered the complex social processes involved in the flow and exchange of multiple formal and informal types of information. This study observed a tension between ‘relationship-based and experientially based knowledge’ and evidence-based knowledge.6 A similar picture of collective, negotiated use of evidence emerges from a national survey of healthcare managers,7 which found that, despite the growth of NHS and healthcare evidence-based sources, many managers remain unaware of them. Qualitative research based on case studies of four commissioning organizations8 revealed how the different decision-making context in commissioning drives a more pragmatic selection of evidence, values different modes of communication of evidence and places importance on coproduced evidence at the local level.

It was in this context that a programme to promote evidence-informed commissioning was established in the West of England in 2013, as part of the work of a newly formed Academic Health Science Network (WEAHSN), one of 15 such networks across England.9 This broad programme of work included two specific elements seeking to support CCG staff with evidence uptake: first, establishing new roles focused on supporting the uptake of evidence in CCGs and, second, a training programme to help build a culture of evidence and evaluation in these organizations. This latter programme was set up in partnership with the National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care, based in the West of England (CLAHRC West).10 This article focuses on the evidence training element of the programme, carried out as a partnership between WEAHSN and CLAHRC West, and how it has promoted the use of evidence in healthcare commissioning.


A needs assessment exercise was carried out by CLAHRC West that informed this project and is reported elsewhere.11 This included data from a local survey of managers in a CCG, which highlighted that staff lacked confidence in finding and appraising evidence, an unmet training need. This informed the decision to offer short, practical training workshops to all seven CCGs in the West of England to support evidence use in this context.

The workshop was developed by the author, based on a course devised jointly with the information scientist at CLAHRC West, to support healthcare commissioners with finding and using evidence in decision-making. Each workshop was designed to be only 2 h long to make it attractive to staff working in the local CCGs. The content covered a brief background to evidence-based practice; discussion of definitions and uses of evidence by participants; guidance and a demonstration of how to search for evidence (covering both traditional and grey literature sources); and a brief look at the quality of evidence, including an overview of sources of bias and a short critical appraisal activity. Clear learning outcomes were linked to the content: to be able to explain what is meant by evidence and why it is important; to know how a search for evidence is conducted and how to get help with searching; to be able to use some simple questions to assess the quality of evidence; and to know how to access further resources to support evidence use and help others to use evidence.

As the workshops were a practical intervention implemented in the complex, dynamic environment of large organizations, the evaluation was designed in a pragmatic way to capture the elements of the training that were valued by participants and that have assisted in supporting practice change in this setting. In keeping with Kirkpatrick's training evaluation approach12 this included the immediate response to the training as well as exploring the potential for any longer term change in work practices.

Between March and September 2016, workshops were delivered in each of the seven CCGs in the West of England by the author and a range of local librarians. The training was open to anyone in the CCGs rather than a select sample. This was a new type of training for staff, and there was hesitation in some areas about what it was intended to do and why it was being offered; we therefore chose not to ask participants to rate their knowledge and use of evidence before the workshop, so as not to appear to be judging or criticizing them. Although this meant there was no formal baseline for individuals, we wanted the training to be as open and constructive as possible, to reveal the potential ways we could support evidence uptake.

Participants were asked for their immediate response or ‘reaction’12 to the relevance and quality of the training content and delivery, using a scale from 1 to 4 (where 1 was ‘poor’ and 4 was ‘excellent’). They were also asked to indicate one action they would be taking as a result of the workshop. This question was a way of assessing impact of the training on intended behaviours, also described as ‘transfer’ in this context.12 This was intentionally a very open question with the idea that eliciting an unprompted response would reveal the most impactful features of the training. Training can have different outcomes to those that are expected or predefined13 and revealing these may offer valuable insight into unknown barriers and challenges in influencing changes in evidence use in this context.

The evaluation included a brief follow-up phase to find out whether these intended actions had been carried out, to explore the potential for an impact on longer term change in work practices or behaviours. A subsample, selected randomly from workshops held in different CCGs, was invited to take part in a brief telephone interview 3–6 months later. Interviews were semistructured around four main questions, enquiring about the individual's role in relation to evidence; if and how the knowledge gained at the workshop had been used in their role since the training; an example of a change made at work as a result of something learned at the workshop; and anything else arising from the training. This could be seen as a crude measure of educational effect, but it is important to acknowledge again that this project was set in a real-world setting and did not seek to control all the variables that may influence evidence use, such as other training taken by staff. The author carried out all the interviews, which lasted 15 min on average; these were not recorded, but detailed notes were captured and immediately written up after each interview. Given the scale of the evaluation, the data were analyzed using a simplified framework approach based on Ritchie et al.,14 involving careful reading of responses to interview questions, identification of themes in the data, mapping of data onto the framework and highlighting of key illustrative quotes. This approach was judged proportional to the data from a small-scale evaluation, rather than a thematic analysis involving multiple stages of coding and verifying.


In total, 63 participants attended the evidence workshops. Participants worked in a variety of roles, such as project support, contract lead, clinical effectiveness lead, commissioning manager and primary care manager. Evaluation forms (n = 39) showed that 95% rated the workshop overall as either ‘excellent’ or ‘good’. New actions that participants said they would take as a result of the workshop included coordinating a thorough search for evidence to support new projects; accessing Google Advanced; using [bibliographic] databases and other trusted evidence sources; accessing library services; setting up an evidence alert; and obtaining an NHS Athens login.

Overall, the training workshops revealed that healthcare commissioners welcomed support in how to find, access and appraise evidence, and were bracingly honest about the lack of a systematic approach to using evidence in this context. For example, they talked of never searching databases, and of relying on rapid Google searches and ad hoc local knowledge. They were appreciative of the chance to talk about this and to learn new skills. Furthermore, discussion during workshops about the meaning of evidence in this context confirmed that the type of questions that arise in commissioning necessitates a reliance on grey literature over academic papers published in peer-reviewed journals. For example, healthcare commissioners may need evidence to answer questions about unusual or complex health interventions tailored to a specific patient group, rather than a single intervention such as a drug (e.g. a multicomponent lifestyle programme for patients with type 2 diabetes); they may ask whether service users will find a new service acceptable, or what factors will support successful implementation. These are questions that are less likely to be answered by a traditional research approach such as a randomized controlled trial, or to be addressed by a systematic review. This drives a need to search for evidence beyond the bibliographic databases, in grey literature.

Discussions during the workshops further revealed that while healthcare commissioners informally seek out and use evidence from grey literature, there is a lack of a systematic approach to this or any critical appraisal of this type of evidence. Perhaps because it sits outside the traditional hierarchy of evidence, grey literature is not perceived to require appraisal in the same way as an academic paper.

Longer term impact

The follow-up interviews were completed with nine people from five CCGs, together with four replies by e-mail, giving a total of 13 participants. Those interviewed were in roles such as project management, contract management, commissioning delivery, quality assurance and medicines management. Formal data saturation was not pursued, given the practical constraints in securing interviews, but there were strong similarities in responses across individuals from different organizations. These interviews showed that the workshops also had a positive impact in the longer term. Themes in the data identified three types of change: a simple personal change, such as raised interest and motivation; change in processes, such as how evidence is used in business cases; and change in the form of decisions leading to financial savings.

At the simple level, for some the learning had triggered an interest in using evidence more and had been motivating. Information about access to library services for evidence searching, which many participants had not previously known was available to them, was being shared with colleagues. Understanding the role of evidence in other people's roles, such as commissioning managers, had also proved helpful in broadening discussions about evidence.

The theme of change in processes reflects how some participants were now searching for evidence for use in business cases and other decision-making in a less ‘ad hoc’ way: ‘the way I would search is different – definitely’ and accessing ‘more reliable evidence as a result’. The task of locating evidence was being shared among colleagues helping to broaden the responsibility for finding and filtering evidence. This sort of change makes it more realistic that evidence use becomes a routine part of business processes and decision-making in healthcare commissioning.

The decision-making that led to an impact on financial savings from this educational intervention was highlighted in one interview. The workshop had prompted a senior manager to look into the evidence behind a procedure routinely carried out as part of total knee replacement that adds approximately £3K to the cost of each operation; strong evidence supported the conclusion that the procedure offers no clinical benefit, and this led to consultations with clinical staff, a review of policy and, ultimately, a change in funding policy with projected annual savings of £400K in this one commissioning group. The team went on to look at the evidence behind other policies. As this participant said, ‘the workshop made me go out and check some of these things and not take things at face value’. While this is only one case, it exemplifies what is possible from just one commissioning manager implementing a change based on evidence.


It is clear from this evaluation that offering short, interactive training workshops is valued by healthcare commissioners and can make a difference to their approach to and use of evidence in decision-making. As seen in other research,6 participants valued the chance to step outside their normal environment and engage with others about evidence use. Furthermore, showcasing library services as part of the workshops emerged as a particularly valuable component. Having library support for identifying evidence is seen to be an essential part of improving evidence use in healthcare.15 As many commissioners in this and other studies7 are seen not to use or be aware of local or national library searching services, promoting specialist support for finding and appraising evidence could increase the use of high-quality evidence in healthcare commissioning. A qualitative study of evidence use in eight CCGs similarly concluded that commissioning stakeholders need support to develop capabilities for evidence to ensure effective, evidence-based commissioning.16 With the continual financial pressures in the NHS, CCGs must constantly look for ways to improve efficiency, making the use of evidence ‘critical to the survival of England's NHS’.16

In an environment that currently tends towards only ‘ad hoc’ use of research, where other support initiatives have not succeeded,17 delivering contextualized, practical training, including the spread of librarian expertise, could encourage a broader culture change in healthcare commissioning and help shift behaviour towards more systematic and consistent use of evidence. This is similar to the model of training we have developed at CLAHRC West (now recommissioned as ARC West), to build a research culture and develop a health and social care workforce receptive to evidence which is helping to meet the immediate, practical needs of these professionals.11

The work has also highlighted the need for a flexible approach to the concept of evidence in healthcare commissioning, one that includes grey literature as a legitimate form of evidence alongside the traditional forms that make up the evidence hierarchy established by the EBM movement.18 Grey literature is generally excluded because it is more narrative in form and does not fit into the EBM model, but this is exactly why it is so useful for commissioning. This demands a different way of conceptualizing the value of evidence, away from a hierarchy and towards a matrix in which a blend of different types of evidence may contribute answers to the complex questions raised by healthcare commissioners. A similar idea comes from the field of public health.19 However, there is evidently resistance to this notion, given that grey literature is still not readily accepted as a legitimate source of evidence in healthcare, despite acknowledgement in influential models such as those of the Joanna Briggs Institute that a narrow definition of evidence is problematic.20 They advocate that a diverse array of sources is required, with the legitimacy of evidence determined by its purpose, which might be the feasibility, appropriateness, meaningfulness or effectiveness of an intervention or other activity in healthcare, to inform changes in practice or other decision-making. Conceptualizing evidence in this way results in a far broader concept in which less rigorous sources can have value if they fill a gap in knowledge, so becoming the ‘best available’ source of evidence.20,21

This supports the idea of an evidence matrix for commissioning that gives specific recognition to the value of grey literature in this context. This type of evidence has evolved considerably in the past 20 years with the advent of desktop publishing. Its origins have been traced back to early traditions of sharing scientific and other technical and policy knowledge,22 with information in the form of reports being shared only among groups with a common interest (sometimes for reasons of confidentiality), rather than widely through the published press. Today, grey literature encompasses a much wider array of publications outside the commercially produced peer-reviewed journals: for example, working papers from expert committees, reports from government agencies or research groups, conference papers, other unpublished or ongoing work, as well as archival material, statistics and informal communications from experts. This more dynamic approach to production and distribution means the creators of grey literature can disseminate their work far more quickly and widely than conventionally published literature, making this type of evidence often more up-to-date and accessible, as it is not subject to publishers' requirements on timing or cost of access.

Our workshops have certainly helped to promote the use of a wide range of evidence in healthcare commissioning decisions, but it is evident that healthcare commissioners would benefit from guidance on finding and using grey literature as part of their evidence searching. A local healthcare librarian with whom we have worked at CLAHRC West has developed a list of repositories, indexes and web tools to facilitate the search for grey literature, which are hosted on the website of the Trust where she works.23 Such resources will be equally useful to commissioners and we now promote this link in training courses delivered at ARC West.

Promoting and enhancing the use of grey literature must also include becoming adept at appraising it; as with any type of evidence, it is vital to consider quality, perhaps even more so given how grey literature is produced and distributed. Following this study, we now promote the use of the AACODS checklist24 in our subsequent training workshops. AACODS is a simple approach first proposed by a librarian in Australia in 2008;25 it stands for Authority, Accuracy, Coverage, Objectivity, Date and Significance, with evidence for each aspect considered in depth and a judgement reached about its adequacy. The checklist has been widely used in academic studies, including systematic reviews. Like the commonly used appraisal tools from organizations such as the Critical Appraisal Skills Programme,26 the AACODS checklist offers a realistic, structured approach to the consideration of quality in grey literature and should be applied routinely to help maintain the use of high-quality evidence in healthcare commissioning.


Short, targeted workshops to promote the use of evidence were delivered successfully across seven clinical commissioning organizations in England; the training was rated highly, and subsequent telephone follow-up highlighted some valuable longer term impact on evidence-seeking activities and decision-making. A particularly valued feature of the training was the inclusion of healthcare librarians, helping to spread expert skills and awareness of library services among commissioning staff. Grey literature emerged as highly relevant in this context and should be included in this type of training to encourage a systematic approach to the search for, and appraisal of, this type of evidence. The use of the AACODS checklist is recommended as a key step in selecting the right evidence to use.

Training is an important and valued element in building the evidence culture in healthcare commissioning and a flexible approach to the concept of evidence in this context is important.


The current work was completed by the author as part of the Commissioning Evidence Informed Care team at West of England Academic Health Science Network, whose support and encouragement were invaluable. The expertise of Alison Richards (Information Scientist at CLAHRC West) in developing the searching element of the workshop is also gratefully acknowledged. The author also wishes to thank the local healthcare librarians who helped deliver the evidence workshops with her during 2016, and especially Joanna Hooper, whose expert knowledge on grey literature enhanced the training and this article. Thanks also to my colleagues at CLAHRC West, Professor Selena Gray and Michele Biddle, for comments on an earlier draft of this article.

As this was an educational evaluation and did not identify participants, ethical review was not applicable.

The work was carried out as part of a secondment from UWE, Bristol (UK) that was jointly funded by the National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care West (NIHR CLAHRC West), now recommissioned as NIHR Applied Research Collaboration West (NIHR ARC West), at University Hospitals Bristol NHS Foundation Trust and the West of England Academic Health Science Network. The views expressed are those of the author(s) and not necessarily those of NHS England, NHS Improvement, the NIHR or the Department of Health and Social Care.

Conflicts of interest

The author reports no conflicts of interest.


1. NHS Clinical Commissioners. About CCGs – NHS clinical commissioners. London: NHS Clinical Commissioners; 2019. Available at: [Accessed 1 April 2019].
2. NHS England. Who commissions NHS services? London: NHS England; 2019. Available at: [Accessed 1 April 2019].
3. Nursing and Midwifery Council. The code: professional standards of practice and behaviour for nurses, midwives and nursing associates. London: Nursing and Midwifery Council; 2018. Available at: [Accessed 14 March 2019].
4. General Medical Council. Good medical practice. London: General Medical Council; 2014. Available at: [Accessed 14 March 2019].
5. Clarke A, Taylor-Phillips S, Swan J, et al. Evidence-based commissioning in the English NHS: who uses which sources of evidence? A survey 2010/2011. BMJ Open 2013; 3:e002714.
6. Dopson S, Bennet C, Fitzgerald L, et al. Healthcare managers’ access and use of management research. Final report. Southampton: NIHR Service Delivery and Organisation programme; 2012. Available at: [Accessed 14 March 2019].
7. Edwards C, Fox R, Gillard S, et al. Explaining health managers’ information seeking behaviour and use. Final report. Southampton: NIHR Service Delivery and Organisation programme; 2013. Available at: [Accessed 14 March 2019].
8. Wye L, Brangan E, Cameron A, Gabbay J, Klein JH, Pope C. Evidence based policy making and the ‘art’ of commissioning – how English healthcare commissioners access and use information and academic research in ‘real life’ decision-making: an empirical qualitative study. BMC Health Serv Res 2015; 15:430.
9. West of England Academic Health Science Network. Who we are. [Online] N/D. Available at: [Accessed 14 March 2019].
10. National Institute for Health Research. Applied Research Collaboration, West. About us. N/D. [Online] Available at: [Accessed 16 October 2019].
11. Sabey A, Bray I, Gray S. Building capacity to use and undertake applied health research: establishing a training programme for the health workforce in the West of England. J Public Health 2019; 167:62–69.
12. Praslova M. Adaptation of Kirkpatrick's four level model of training criteria to assessment of learning outcomes and program evaluation in Higher Education. Educ Asse Eval Acc 2010; 22:215–225.
13. Havnes A, Prøitz TS. Why use learning outcomes in higher education? Exploring the grounds for academic resistance and reclaiming the value of unexpected learning. Educ Asse Eval Acc 2016; 28:205–223.
14. Ritchie J, Spencer L, O’Connor W. Carrying out qualitative analysis. In: Ritchie J, Lewis J, editors. Qualitative research practice: a practical guide for social science students and researchers. London: Sage; 2003. p. 219–262.
15. Health Education England. National health services library and knowledge services in England Policy. [Online] N/D. Available at: [Accessed 19 March 2019].
16. Swan J, Gkeredakis E, Manning RM, Nicolini D, Sharp D, Powell J. Improving the capabilities of NHS organisations to use evidence: a qualitative study of redesign projects in Clinical Commissioning Groups. 2017; Southampton:NIHR Health Services and Delivery Research, No. 5.18. Available at: [Accessed 19 March 2019].
17. Wilson PM, Farley K, Bickerdike L, et al. Effects of a demand-led evidence briefing service on the uptake and use of research evidence by commissioners of health services: a controlled before-and-after study. 2017; Southampton:NIHR Health Services and Delivery Research, No. 5.5. Available at: [Accessed 19 March 2019].
18. Centre for Evidence Based Medicine. Levels of evidence. Oxford: University of Oxford; 2009. Available at: [Accessed 19 March 2019].
19. Petticrew M, Roberts H. Evidence, hierarchies and typologies: horses for courses. J Epidemiol Community Health 2003; 57:527–529.
20. Pearson A, Wiechula R, Court A, et al. A reconsideration of what constitutes ‘evidence’ in the healthcare professions. Nurs Sci Q 2007; 20:85–88.
21. Pearson A, Jordan Z, Munn Z. Translational science and evidence-based healthcare: a clarification and reconceptualization of how knowledge is generated and used in healthcare. Nurs Res Pract 2012; 2012:792519.
22. Lawrence A. Electronic documents in a print world: grey literature and the internet. Media Int Aust 2012; 143:122–131.
23. National Health Service University Hospitals Bristol NHS Foundation Trust. Grey literature. [Online] N/D. Available at: [Accessed 19 March 2019].
24. Tyndall J. AACODS checklist. 2010; Adelaide:Flinders University, Available at: [Accessed 19 March 2019].
25. Tyndall J. How low can you go? Toward a hierarchy of grey literature. Presented at: Dreaming 08 – Australian Library and Information Association Biennial Conference, 2–5 September 2008; Alice Springs, Australia. [Online] Available at: [Accessed 21 March 2019].
26. Critical Appraisal Skills Programme. CASP checklists. [Online] N/D. Available at: [Accessed 19 March 2019].

Keywords: commissioning; critical appraisal; evidence-based practice; grey literature; training

International Journal of Evidence-Based Healthcare © 2020 The Joanna Briggs Institute