A commentary on the quality improvement practices in leading an organizational response to audit feedback : JBI Evidence Implementation

A commentary on the quality improvement practices in leading an organizational response to audit feedback

Sykes, Michael Dip(Nursing), MBA, PhD

doi: 10.1097/XEB.0000000000000338



What is known about the topic?

  • Audit may lead to greater improvement if feedback recipients receive support for their quality improvement capabilities.
  • Common improvement methods recommend ‘planning’ to improve, often including a ‘situational analysis’.
  • Improvement methods are sometimes not specified or reported in a way that enables replication.

What does this paper add?

  • Describing the specific practices within the quality improvement response provides practical guidance to support feedback recipients.
  • Describing the specific practices supports audit providers and quality improvement leads to consider the capabilities required for such practices.
  • This paper proposes practices that tailor the response to local context and resonate with organizational readiness to change theory.


Clinical audit, also known as audit and feedback, seeks to improve care by reviewing clinical practice against an explicit standard and providing a summary of performance over a period of time.1 The main commissioner of English national audit states that, ‘healthcare providers require additional support to make best use of performance feedback data’.2 Stakeholders prioritize the importance of recipients having the capabilities to respond to the feedback,3 and theory points to the effectiveness of audit and feedback being associated with health professionals’ quality improvement capabilities.4 This article, written for feedback recipients, audit providers and quality improvement leads, describes considerations for the content and delivery of support to feedback recipients, and provides an ‘action model’ for feedback recipients to plan their organizational improvement activity. An action model ‘provides practical guidance in the planning and execution of implementation endeavours and/or implementation strategies to facilitate implementation’.5 There are different levels of organization; this article is written for people at team, division, hospital or hospital/practice group level.

Improvement methods are sometimes not specified or reported in a way that enables replication.6 Theory-informed methods are recommended, but not always applied.3 This article will describe an evidence-based and theory- and stakeholder-informed method for enhancing organizational response to audit feedback. The article draws upon work to develop an action model of the practices involved in enhanced organizational response to audit,7 as well as wider literature. The action model (Fig. 1) was co-designed using data describing the current response to a national audit from diverse hospitals and theory-informed hypotheses describing how to enhance audit.8 The action model specifies the quality improvement practices that inform, and develop commitment for, the response to audit feedback. The action model was then refined through feasibility and co-design studies aligned to two national audits (diabetes and dementia).7 The model describes practices to appraise information and generate change commitment and resonates with Weiner's organizational readiness to change theory.9 The practices are intended to be undertaken by a multidisciplinary team including a clinical lead appropriate to the organizational level of the feedback.

Figure 1. A co-designed action model of an enhanced organizational response to audit feedback (v2). Key: asterisk indicates the practices requiring stakeholder engagement.

Select target and specify goal(s) that address local performance

Improvement capacity is limited; this creates an opportunity cost of those clinical audit standards that are not prioritized for improvement. When selecting priorities, we found that national audit recipients were influenced by national priorities, often undertaking improvement activities on these standards in spite of having high performance and at the expense of standards with weaker local performance.8 Selecting the target for improvement could be enhanced by reviewing local absolute and relative (that is, compared with other organizations) performance and considering impact upon outcomes. Analysis of local performance may identify sub-populations where improvement may both have the greatest impact on performance and serve to reduce inequalities. For example, the English and Welsh national diabetes audit describes differences in the use of guideline-recommended insulin pumps by age, sex, ethnicity and social deprivation.10

After selecting the target for improvement, specify the goal(s), that is, the behaviour(s) to be achieved. Goals are specified by identifying the action, actor, context, target, time6; for example, a goal to address gaps in delirium screening might be specified as, all patients over 65 (target) will be screened for delirium using the 4AT tool (action) by a nurse (actor) within 6 h of admission (time) to the hospital (context). There may be multiple goals that reflect the pathway behind performance in the standard being addressed. The specification of the goal(s), as with the other practices in our action model, involves stakeholder engagement.
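The action, actor, context, target, time structure above can be sketched as a simple record, which may help teams check that every element of a goal has been specified before it is shared with stakeholders. This is an illustrative sketch only; the class and field names are the author of this edit's assumptions, not part of any published AACTT tooling.

```python
from dataclasses import dataclass


@dataclass
class Goal:
    """One improvement goal specified using the AACTT elements.

    Field names are illustrative; they mirror the AACTT framework
    (action, actor, context, target, time) but are not a published schema.
    """
    action: str   # the behaviour to be performed
    actor: str    # who performs the behaviour
    context: str  # where the behaviour occurs
    target: str   # whom the behaviour applies to
    time: str     # when, or by when, the behaviour should occur

    def as_sentence(self) -> str:
        # Render the goal as a single auditable statement.
        return (f"{self.target} will receive '{self.action}' "
                f"by a {self.actor} {self.time} {self.context}.")


# The delirium-screening example from the text:
delirium_goal = Goal(
    action="screening for delirium using the 4AT tool",
    actor="nurse",
    context="in the hospital",
    target="all patients over 65",
    time="within 6 h of admission",
)
print(delirium_goal.as_sentence())
```

Writing each goal out this way makes a missing element (for example, an unspecified actor) immediately visible.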

Engage stakeholders

Engaging stakeholders provides more perspectives upon: the goal; influences upon the goal; and actions to meet the goal. Engaging stakeholders may help to develop commitment.9 Such social aspects of improvement are as important as technical aspects but can be challenging.11 Stakeholder engagement in the organizational response to national audit currently includes discussions with a positional leader (e.g. an associate director of nursing), an existing group related to the audit topic (e.g. discussing the national audit with the dementia steering group) and presentation at organization-level committee(s).8

Stakeholder analysis12 may help identify a broader range of stakeholders to engage. This analysis can be applied to stakeholders in the audit results, in related priorities (see below) and, later, to stakeholders in the draft actions. We found that stakeholders were keen to discuss data quality, including consideration of the data source, method of collection and triangulation with other data (e.g. complaints data).8

Stakeholders should be asked how best to engage them. Stakeholder engagement might be enhanced by including both informal face-to-face discussions and presentation at committees. These discussions should include a brief description of the source and method of the audit.8 The informal discussion may be enhanced by those involved in this discussion giving their perspectives upon:

  • (1) current performance, the reason for selecting the target and the care practices behind that target;
  • (2) personal and organizational priorities and how they relate to the audit data;
  • (3) influences upon performance;
  • (4) how to improve;
  • (5) existing actions related to the response to feedback;
  • (6) additional stakeholders in the proposed actions and goals.

Discussion of what change might mean for the stakeholders personally may help explore influences upon commitment and support them to consider the skills and resources to enact a change.12 Where organizational approval for the response is required, the above topics should be incorporated into the presentations at formal committees with responsibility for monitoring, or leading improvement in, quality.

Analyse influences upon care

Understanding what influences performance provides the foundation for selecting actions to improve.13–15 Currently, there may not be a structured analysis of influences upon performance.8 Such an analysis could involve discussions with stakeholders, observations of practice and/or systematic review. The use of theory (e.g. Normalization Process Theory13) or a framework (e.g. the Theoretical Domains Framework14) may enhance the analysis of influences upon performance.

Consideration of how to undertake this analysis might include negotiation of resources; for example, through time within job plans, through a junior doctor improvement project required for accreditation or with corporate quality improvement team support. Engaging stakeholders in the analysis can provide broader perspectives upon influences resulting in the selection of more effective improvement actions, and may start to develop buy-in to the actions.

Link influence to improvement action

Tailoring actions to influences can address barriers and facilitators to performance.15 Currently, team leads select actions based upon what they could do personally,8 which may be unlikely to address underlying influences upon performance. Selecting actions in this way may reflect their beliefs about capabilities; for example, their beliefs about others’ ability to act or their own capability to gain commitment from others. This article proposes that the alignment between the influence and the action could be enhanced through the use of logic models. Logic models articulate the underlying theory of change,15 describing how an action addresses an influence. Drawing a logic model may help clarify the proposed link between the influence and the action. The logic model may also help to communicate the link to others as part of the work to develop commitment for the improvement action.

Link to priorities

Commitment can be at different levels, for example, from individuals, or at team or hospital level.13 Organizational commitment refers to a ‘shared resolve to pursue the courses of action involved in change implementation’9 (p. 2). Organizational commitment is developed through discussions and includes consideration of risks to priorities, notably regulatory, reputational and financial objectives.8 Considering stakeholders’ priorities and presenting information linking the need for improvement and the proposed actions to those priorities may help to generate commitment.7 For example, describing how the work influences patient wellbeing, efficiency and/or individual or organizational reputation may make it easier for stakeholders to commit to improvement. Presenting comparison data (e.g. how this team compares to another team) may help stakeholders consider whether they are meeting their aim to be a high performing team.16

Espoused organizational priorities are often documented in strategy or workplan documents; for example, national strategies to reduce inequalities, organizational visions, strategic goals to be high performing or ward mission statements to provide safe and effective care. Currently, clinicians may be unaware of these priorities.7 Feedback recipients could seek conversations with people who might be aware of the local organizational priorities (e.g. clinical director, associate director of nursing) and review organizational documents, in order to identify priorities linked to the target for improvement and/or the improvement action.

Identify related existing work and collaborate

Audit and feedback may lead to greater improvement when the costs involved in making changes are lower.3,4 Implementation may be easier if task demands are acceptable.9 Explicit consideration of workstreams related to the standard for improvement, the goal(s) or the improvement action may both link the work to existing priorities and reduce the cost of change interventions. For example, if performance is influenced by health professionals’ memory, linking the audit to existing work to amend the health record may provide a low-cost way to build in prompts that address the influence of memory. Seeking collaborations with related teams who have undertaken similar work may reduce costs associated with the change; for example, the time cost of developing training materials or writing business cases.

Monitor feedback

Changes do not always lead to improvement. In line with work describing the cyclical nature of audit and plan-do-study-act,17,18 this article proposes monitoring the change, where possible using existing audit data. Monitoring should be frequent, presented in writing and verbally, and should be discussed with stakeholders in groups.1 The aims from monitoring discussions are to evaluate the impact of changes and inform decisions about the need for further action. If new feedback mechanisms are needed (e.g. to gain frequent feedback), allow time and resources for set-up.11


This paper summarizes how the response to audit feedback may be enhanced and seeks to provide practical guidance to support feedback recipients by specifying practices in the response to feedback. The action model is similar to previous approaches to improvement, for example:

  • (1) it sets a goal and uses a cyclical approach similar to plan, do, study, act;17,18
  • (2) by selecting targets for improvement based upon performance and impact, and specifying goals informed by stakeholder engagement, the action model resonates with failure modes and effects analysis19 (failure mode, likelihood, failure effects, cause of failure).

However, the action model provides greater specificity of the practices within the planning18 or situational analysis stage.20 Importantly, the action model also gives explicit consideration of commitment to change, an under-addressed component.11 In doing so, it responds to work describing current responses to national audit,8 to calls to provide additional support to feedback recipients2 and to papers describing the opportunity to enhance response to audit and feedback by developing feedback recipients’ quality improvement capabilities.3,4

By describing practices in the organizational response, this paper aims to support feedback providers and quality improvement leads to consider influences upon the implementation of these improvement practices; for example, the barriers and facilitators to clinical leads engaging stakeholders or exploring influences upon clinical performance. Important barriers and facilitators to implementation included how feedback recipients differentiate the approach from current practice, how they work with others to organize themselves to participate in a new practice and how they buy in to the new approach.7 Theory-informed co-design work identified that implementation may involve demonstration of the practices, supporting feedback recipients to plan how and when to carry them out and a credible source communicating in favour of the practices.7 Creating both the physical opportunity (e.g. time11) and the social opportunity to collaborate was important.7 Quality improvement collaboratives incorporating educational workshops and facilitated multisite meetings provide a structure to deliver these capabilities.7

There are strengths and limitations to the proposed action model. The action model was co-designed iteratively through stakeholder discussions of evidence and theory and feasibility tests in different national audits.7 The model describes the quality improvement practices that inform, and develop commitment for, the response to audit feedback. There are potential limitations of the action model. Theory describing how audit and feedback might lead to improvement describes earlier stages than those addressed within the co-designed approach. As such, there may be further enhancements, for example, relating to the nature of the feedback and steps leading to the intention to change.4 The action model focusses on how feedback recipients develop an organizational response; different practices may enhance individual or national responses to feedback. The action model is a simplification describing stronger relationships between selected components. It is anticipated that there are further interactions (e.g. consideration of existing work may affect both the assessment of opportunity cost proposed to influence commitment and the informational appraisal of potential improvement actions). The intervention may reflect the English healthcare context; current work is seeking to adapt the action model to a different national context. There is the opportunity to specify further the practices within the model, for example, who will analyse influences upon performance, where, when and with what materials. However, further specification may be context-specific. Instead, this paper uses a level of abstraction that both provides clarity and supports adaptation. Moore and colleagues provide guidance on how to undertake such adaptation.21


There is evidence, theory and policy for the need to support audit feedback recipients to improve care. Specifying important practices within the quality improvement response provides an action model for feedback recipients and enables audit providers and quality improvement leads to consider how such practices might be implemented. The action model presented here will be further refined through work to extend its application, content and delivery.


I would like to thank all those involved in the work cited in this commentary paper, and in particular the stakeholders and previous co-authors involved in earlier co-design studies. I would also like to thank Robbie Foy, Craig Lockwood and the reviewers for comments on earlier drafts of this manuscript.


Ethics approval and consent to participate: Not applicable.

Consent for publication: Not applicable.

Availability of data and materials: Not applicable.

Competing interests: None.

Funding: This paper includes a description of work undertaken during a Doctoral Research Fellowship (DRF-2016-09-028) supported by the National Institute for Health Research (NIHR). The views expressed in this presentation are those of the authors and not necessarily those of the NHS, the National Institute for Health Research or the Department of Health and Social Care.


1. Ivers NM, Sales A, Colquhoun H, Michie S, Foy R, Francis JJ, Grimshaw JM. No more ‘business as usual’ with audit and feedback interventions: towards an agenda for a reinvigorated intervention. Implement Sci 2014; 9:1–8.
2. Healthcare Quality Improvement Partnership (HQIP). Maximising the quality improvement potential of the National Clinical Audit and Patient Outcomes Programme. 2021. HQIP, London.
3. Colquhoun HL, Carroll K, Eva KW, Grimshaw JM, Ivers N, Michie S, et al. Advancing the literature on designing audit and feedback interventions: identifying theory-informed hypotheses. Implement Sci 2017; 12:1-0.
4. Brown B, Gude WT, Blakeman T, van der Veer SN, Ivers N, Francis JJ, et al. Clinical performance feedback intervention theory (CP-FIT): a new theory for designing, implementing, and evaluating feedback in healthcare based on a systematic review and meta-synthesis of qualitative research. Implement Sci 2019; 14:1–25.
5. Nilsen P. Making sense of implementation theories, models, and frameworks. Implement Sci 2015; 10:53.
6. Presseau J, McCleary N, Lorencatto F, Patey AM, Grimshaw JM, Francis JJ. Action, actor, context, target, time (AACTT): a framework for specifying behaviour. Implement Sci 2019; 14:1–3.
7. Sykes M, O’Halloran E, Mahon L, McSharry J, Allan L, Thomson R, et al. Enhancing national audit through addressing the quality improvement capabilities of feedback recipients: a multiphase intervention development study. Pilot Feasibil Stud 2022; 8:143. doi: 10.1186/s40814-022-01099-9.
8. Sykes M, Thomson R, Kolehmainen N, Allan L, Finch T. Impetus to change: a multisite qualitative exploration of the national audit of dementia. Implement Sci 2020; 15:1–3.
9. Weiner BJ. A theory of organizational readiness for change. Implement Sci 2009; 4:1–9.
10. National Diabetes Audit. Type 1 diabetes. 2021. NHS Digital, Leeds. Available at: https://digital.nhs.uk/.
11. Stephens TJ, Peden CJ, Pearse RM, Shaw SE, Abbott TE, Jones EL, et al. Improving care at scale: process evaluation of a multicomponent quality improvement intervention to reduce mortality after emergency abdominal surgery (EPOCH trial). Implement Sci 2018; 13:1–6.
12. Brugha R, Varvasovszky Z. Stakeholder analysis: a review. Health Policy Planning 2000; 15:239–246.
13. May C. Towards a general theory of implementation. Implement Sci 2013; 8:1–4.
14. Atkins L, Francis J, Islam R, O’Connor D, Patey A, Ivers N, et al. A guide to using the Theoretical Domains Framework of behaviour change to investigate implementation problems. Implement Sci 2017; 12:1–8.
15. Powell BJ, Fernandez ME, Williams NJ, Aarons GA, Beidas RS, et al. Enhancing the impact of implementation strategies in healthcare: a research agenda. Front Public Health 2019; 7:3.
16. Festinger L. A theory of social comparison processes. Hum Relations 1954; 7:117–140.
17. Reed JE, Card AJ. The problem with plan-do-study-act cycles. BMJ Qual Saf 2016; 25:147–152.
18. NHS England. Plan, Do, Study, Act (PDSA) cycles and the model for improvement. 2018. Available at: https://www.england.nhs.uk/wp-content/uploads/2022/01/qsir-pdsa-cycles-model-for-improvement.pdf.
19. Institute for Healthcare Improvement. QI Essentials Toolkit: Failure Modes and Effects Analysis (FMEA). 2017.
20. Jordan Z, Lockwood C, Munn Z, Aromataris E. The updated Joanna Briggs Institute model of evidence-based healthcare. JBI Evid Implement 2019; 17:58–71.
21. Moore G, Campbell M, Copeland L, Craig P, Movsisyan A, Hoddinott P, et al. Adapting interventions to new contexts – the ADAPT guidance. BMJ 2021; 374:n1679.

Keywords: audit and feedback; collaborative; commitment; quality improvement; tailoring

© 2022 JBI. Unauthorized reproduction of this article is prohibited.
