Implementation of Clinical Practice Guidelines in the Health Care Setting

A Concept Analysis

Beauchemin, Melissa MPhil, RN, CPNP; Cohn, Elizabeth PhD, RN; Shelton, Rachel C. ScD, MPH

doi: 10.1097/ANS.0000000000000263
Original Articles

The literature is replete with clinical practice guidelines (CPGs) and evidence supporting them. Translating guidelines into practice, however, is often challenging. We therefore conducted a concept analysis, using Walker and Avant's methodology, to define the concept of “implementation of CPGs in health care settings.” This included a focused review of the literature, definition of the relevant attributes, a definition of implementation, case examples, and the antecedents and potential consequences of implementation of CPGs in health care settings. The concept of “implementation” is complex, with numerous frameworks, facilitators, and barriers to implementation described in the literature. The existing literature supports our definition of implementation of CPGs in a health care setting as a process of changing practice in health care while utilizing the best level of evidence that is available in the published literature. The definition comprises 7 attributes necessary for effective implementation. Implementation of CPGs in health care settings requires an ongoing, iterative process that considers these attributes and is inclusive of administrators, clinicians, and patients to ensure guidelines are understood, accepted, implemented, and evaluated for continued adoption of best practices. Ongoing, multilevel efforts inclusive of all steps of implementation are needed to effectively change practice.

School of Nursing (Ms Beauchemin) and Mailman School of Public Health (Dr Shelton), Columbia University, New York; and Hunter College, The Graduate Center, City University of New York, New York (Dr Cohn).

Correspondence: Melissa Beauchemin, MPhil, RN, CPNP, School of Nursing, Columbia University, 560 West 168th St, New York, NY 10032 (mmp2123@cumc.columbia.edu).

Ms Beauchemin's participation in this research was made possible by the Reducing Health Disparities through Informatics (RHeaDI) training grant (T32 NR007969) funded by NINR (PI: Bakken) as well as a Doctoral Degree Scholarship in Cancer Nursing, DSCN-18-068-01, from the American Cancer Society. The content is solely the responsibility of the authors and does not represent the official views of the National Institutes of Health or the American Cancer Society.

The authors have disclosed that they have no significant relationships with, or financial interest in, any commercial companies pertaining to this article.

CLINICAL PRACTICE GUIDELINES (CPGs) are developed through a rigorous systematic methodology synthesizing the ever-increasing amounts of published literature into a practical and digestible set of clinical recommendations to be used in a health care setting.1,2 The Grading of Recommendations Assessment, Development and Evaluation (GRADE) collaboration has established a widely accepted approach for developing guidelines by rating both the quality of the evidence and the strength of the recommendation.3 The US Preventive Services Task Force also develops guidelines using a similarly rigorous and transparent methodology.4 Both approaches recognize that guidelines need to be trustworthy and understandable from the perspective of key stakeholders, and the ultimate goal is to direct clinicians in providing the most up-to-date, evidence-based, and highest-quality care for their patients.


Statement of Significance

What is known or assumed to be true about this topic?

We know that the literature is replete with CPGs and evidence supporting them. In addition, as nurses we know that the provision of evidence-based practice can improve patient outcomes. Translating these guidelines into clinical practice, however, is often challenging. Implementation science, or the study of translating evidence into health care delivery, is a relatively new field, and the concept of “implementation” is complex. Frameworks for implementation science and related domains have been described, as have facilitators of and barriers to implementation.

What does this article add?

Although it has been widely discussed within the dissemination and implementation (D&I) field, a clear definition of the concept “implementation of CPGs in clinical settings” has not been described. This article adds to the nursing knowledge about evidence-based practice by defining this complex concept using a widely accepted methodology. Our thorough search of the literature and concept analysis supports the operational definition as “a process of changing practice in health care while utilizing the best level of evidence that is available in the published literature.”


COMPLEXITIES IN IMPLEMENTING CPGs

Although our literature is replete with guidelines, the expected improvements in patient outcomes and reduction in health care–related costs have not been realized.1,5,6 Research suggests that, on average, it takes up to 17 years for only 14% of published evidence to translate into practice.7,8 In response, research has shifted to focus on how guidelines can be effectively implemented in a clinical or community setting.9,10 Implementation strategies are extensive and varied; as one example, the Expert Recommendations for Implementing Change (ERIC) project10 names 73 strategies with clear definitions, highlighting the breadth of possible implementation strategies and their combinations. These strategies are defined as “techniques used to enhance the adoption, implementation, and sustainability of a clinical program or practice”11 and often involve multifaceted components.12 Another nuance of implementation research is that its measures and outcomes often differ from traditional clinical trial outcomes, which focus on the efficacy or effectiveness of an intervention. Implementation outcomes instead include the acceptability, appropriateness, costs, and sustainability of the evidence-based intervention and/or the implementation strategy.13 Because of the notable time lag in implementing research into practice, the barriers to implementation have also been described; they vary by the implementation strategy being utilized, setting characteristics, provider and patient cultural characteristics, and the clinical practice change itself.5,14

The new evidence or practice change being introduced into a clinical or community setting is important to consider. Traditionally, the highest level of evidence has been the randomized controlled trial, often used to inform robust systematic reviews and CPGs.15 These trials are designed to answer questions about the efficacy of an intervention and therefore have strict inclusion criteria to maximize internal validity. This is an important consideration when interpreting the results of a published trial, because the results tend to be limited in generalizability and external validity.16 As such, clinicians often have little understanding of whether results from randomized controlled trials are applicable across settings, populations, and conditions.16 More recently, focus has shifted toward effectiveness or pragmatic trials, which can better evaluate the real-world effectiveness of an intervention.17 Such trials can help address the mistrust of CPGs cited among clinicians and decision-makers, which arises from actual and perceived limitations in generalizability to broader populations.18,19

Implementation science has emerged over the past 10 years, taking a transdisciplinary research approach and seeking to advance understanding of the methods and strategies used to promote the systematic uptake and integration of evidence-based practices and programs across diverse real-world clinical or community settings.20 The field of implementation science continues to grow, as the facilitators, barriers, frameworks, and strategies for uptake of the evidence into practice are developed and identified across contexts and populations.5,10,20,21


NEED FOR A CLEAR DEFINITION OF GUIDELINE IMPLEMENTATION IN HEALTH CARE SETTINGS

Despite the large volume of research in implementation science, the concept of implementation is relatively new to the clinical setting. It is instrumental to changing clinical practice and, ultimately, providing the highest quality of care to patients. However, the theories and models that have defined this field may not be apparent or accessible to the clinicians providing care to patients and the community. It is therefore important to define and describe this concept to ensure nursing and other clinical communities understand its full implications. To provide clarification on this critical and timely concept, the intent of this article is to conduct a concept analysis of implementation of CPGs in health care settings.


METHODS

This concept analysis was developed using Walker and Avant's 8-step methodology.22 This process includes (1) choosing a concept; (2) determining the purpose of the concept analysis; (3) identifying all uses of the concept; (4) identifying the defining aspects or attributes; (5) describing a model case; (6) describing a borderline, related, contrary, invented, and illegitimate case; (7) identifying antecedents and consequences; and (8) defining empirical referents. This method is useful for defining and describing concepts that are not well-defined and can subsequently be used to inform future studies relating to the concept of interest.

We conducted a focused literature search to identify existing definitions of implementation of CPGs, as outlined in step 3 of Walker and Avant's methodology. We searched PubMed using a combination of keywords and MeSH terms relating to “implementation,” “evidence-based practice,” and “delivery of health care” after consultation with experts in the field and an information specialist (see Table 1 for our search strategy). Studies were included if they were systematic, scoping, or integrative reviews of guideline implementation in a health care setting. We included conceptual frameworks and focused on broader implications of implementation processes. We excluded studies that focused on a specific disease or specific clinical setting. We also excluded textbooks from our formal literature review, although we acknowledge one major textbook in this area.23 Finally, we searched the National Institutes of Health/National Cancer Institute's Grid-Enabled Measures Database (GEM), an interactive Web site of scientific measures organized by theoretical constructs.24 We searched GEM for the construct “implementation” to ensure we did not miss any relevant theoretical frameworks or operational definitions.
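
For readers who wish to reproduce or adapt such a search programmatically, the sketch below queries PubMed through NCBI's E-utilities via the Biopython package. The query string is a simplified illustration combining the three concept areas named above; it is not the study's actual search strategy, which appears in Table 1.

# Illustrative sketch only: the terms below are placeholders, not the
# Table 1 strategy. Requires Biopython (pip install biopython).
from Bio import Entrez

Entrez.email = "researcher@example.org"  # NCBI requires a contact address

# OR'd keyword/MeSH terms for each concept area, joined with AND.
query = (
    '(implementation[Title/Abstract] OR "implementation science") '
    'AND ("evidence-based practice"[MeSH Terms] OR guideline*[Title/Abstract]) '
    'AND "delivery of health care"[MeSH Terms]'
)

handle = Entrez.esearch(db="pubmed", term=query, retmax=20)
record = Entrez.read(handle)
handle.close()

print(f"Total hits: {record['Count']}")
print("First PMIDs:", record["IdList"][:5])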

Table 1. Search strategy

We also searched Google to identify the various settings in which the concept of “implementation” is used outside of the health care arena.


FINDINGS

Our search strategy retrieved 3226 articles from PubMed, and 1 additional study was identified through reference searching. In addition, utilizing GEM helped ensure that we identified all relevant existing frameworks and measures for implementation as a construct; no additional studies were identified through this database. We excluded the majority of retrieved articles on title and abstract review and ultimately identified 17 publications to include in this review (see Figure 1 for the PRISMA flowchart).

Figure 1. PRISMA flowchart


Clarification on definitions of implementation

Implementation is derived from the verb “to implement,” defined in the Merriam-Webster Dictionary as “to give practical effect to and ensure of actual fulfillment by concrete measures.” Implementation is discussed frequently not only in health care settings but also in relation to computer science, information technology, political science, and conservation studies. With the availability of increasing volumes of data and new information, many disciplines are finding that implementation of new or developing strategies or evidence has become fundamental, yet challenging.

The literature search confirmed a similar response in the health care community, where the overwhelming amount of published data has resulted in a shift toward implementation science. Our results are summarized in Table 2; the included publications described relevant frameworks and high-level concepts and domains relating to implementation in a clinical setting, and a few related specifically to the nursing field.25,26

Table 2. Publications included in the review

The literature reviewed for this concept analysis identified that “implementation” is a term used in many health care settings but particularly relates to clinical health care delivery systems,34,41,42 quality improvement,43–45 and patient safety initiatives.46,47 The results described 4 important domains within the construct of implementation: (1) the characteristics of the practice change; (2) the guidelines or other forms of evidence, which should inform the practice change; (3) strategies to implement an intervention or CPG; and (4) the challenges and, less frequently, potential interventions to overcome these barriers. Table 2 provides further information on the domains discussed in each included publication. The literature supports the operational definition used for this concept analysis: implementation of CPGs in the health care setting is a process of changing practice in health care while utilizing the best level of evidence that is available in the published literature.


Defining attributes

The defining attributes are the characteristics that are necessary for understanding and defining the concept. The attributes may be examined through the published literature, general consensus, or expert opinion; a clear outline of attributes improves the ability to convey the conceptual meaning to individuals as well as across disciplines.22 We drew the defining attributes from the literature search conducted for this concept analysis.


Attribute #1: Current practice and policy

In any clinical setting, there are common practices in the local health care delivery model. Current practice may be informed by previously published evidence and may include adapted guidelines, clinical care models, standard procedures, or clinical pathways operationalized in inpatient hospitals, outpatient clinics, or private offices providing health care services. Current practice may also not be formally defined but instead can be viewed as the usual care in a specific setting for a defined diagnosis or symptom. In addition, current practice is often influenced by local, state, and federal policies. The World Health Organization defines health policy as the “decisions, plans, and actions that are undertaken to achieve specific health care goals within a society.”48 These decisions, plans, and actions guide the funding, development, accountability, and implementation of health-related topics.23 Although some current practices may not be directly affected by a specific policy, policies from the Centers for Medicare & Medicaid Services and other governmental organizations often dictate health care–related coverage and reimbursement, so it is important to understand and acknowledge the influence that policy may have on clinical practice.


Attribute #2: New evidence/innovation

When new evidence regarding a practice or care delivery mechanism, which can also be thought of as the innovation, is published or discovered in a literature review, it may either support current practice or demonstrate the superiority or efficiency of a different practice that should be adopted to provide the best care to patients. This evidence may take the form of a CPG developed from meta-analyses of randomized controlled trials, historically considered the strongest level of evidence,15,49 or may come from the results of a single randomized controlled trial comparing a newer treatment to the standard of care. It may also be evidence from a prospective cohort or observational study comparing the outcomes of multiple cases and controls, or perhaps of a single group of patients. Because the level of evidence may vary, its strength should be assessed using a standardized tool, such as the Appraisal of Guidelines for Research & Evaluation (AGREE) Instrument for guideline assessment or the GRADE approach, used by Cochrane, for other research designs.50,51 This process, however, is burdensome and often requires a trained, dedicated team to operationalize. In addition, the process of appraising the evidence should include consideration of the generalizability of the results.16,18
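
As a concrete instance of such appraisal, AGREE II asks multiple appraisers to rate the items within each of its 6 quality domains on a 7-point scale and then scales the summed ratings against the minimum and maximum possible scores. A minimal sketch of that domain-score calculation (the example ratings are hypothetical):

def agree_scaled_domain_score(ratings):
    # `ratings` holds every appraiser's 1-7 rating for every item in one
    # domain, e.g., 2 appraisers x 3 items -> 6 integers.
    n = len(ratings)
    obtained = sum(ratings)
    min_possible = 1 * n  # every item rated 1
    max_possible = 7 * n  # every item rated 7
    return 100 * (obtained - min_possible) / (max_possible - min_possible)

# Two appraisers rating a hypothetical 3-item domain:
print(agree_scaled_domain_score([5, 6, 6, 4, 5, 7]))  # 75.0 (%)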


Attribute #3: Introduction of new practice or new evidence

The new evidence or practice can be introduced into the health care setting through various entry points. Ultimately, each of these is an important component of this attribute.

Institutional assessment of the new practice

Once the evidence or innovation is published or made publicly available, an assessment may be conducted by hospital administrators, a guideline review committee, risk management, or other committees responsible for evaluating the current practices and policies of an institution, hospital, private practice, or other community-based health care system. This phase may be the most feasible and appropriate point at which to introduce the evidence-assessment tools discussed previously.3,50,51 In addition to the stakeholders named above, the health care setting should, whenever possible, include the health information technology team or clinical informaticians at this phase of implementation. Implementation processes should include integration into the clinical workflow wherever possible, and this often translates into electronic health record integration or clinical decision support systems.44,52,53 The literature cites integration into workflow as a potential barrier to implementation,26,36,39 and having those who can help integrate a guideline in an automated approach may optimize the implementation strategy.
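
To illustrate what such automation can look like, the sketch below encodes a guideline-style algorithm as a simple clinical decision support rule. It is purely illustrative: the risk categories, age threshold, and drug classes are hypothetical placeholders, not the content of any actual guideline.

# Illustrative only; categories, threshold, and recommendations are
# hypothetical and not drawn from any published antiemetic guideline.
from dataclasses import dataclass

@dataclass
class Patient:
    age_years: float
    emetogenicity: str  # "high", "moderate", or "low" risk regimen

def recommend_antiemetics(p: Patient) -> list:
    # Return the drug classes a guideline-based order set might suggest.
    if p.emetogenicity == "high":
        regimen = ["5-HT3 antagonist", "dexamethasone"]
        if p.age_years >= 0.5:  # hypothetical minimum age for a third agent
            regimen.append("NK1 antagonist")
        return regimen
    if p.emetogenicity == "moderate":
        return ["5-HT3 antagonist", "dexamethasone"]
    return ["5-HT3 antagonist"]

print(recommend_antiemetics(Patient(age_years=4, emetogenicity="high")))

In practice, such a rule would live inside the electronic health record's order-entry pathway so that the recommendation surfaces within the clinician's existing workflow rather than in a separate tool.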

Individual clinician assessment of the new practice

The next attribute of implementation is the assessment of the practice change recommendation by an individual clinician or health care provider. Because this involves an individual's personal assessment, this attribute is likely to introduce greater variability into an implementation process. For example, although the entire process of implementation is built on an underlying foundation of evidence-based practice, the individual clinician's assessment of the new practice will depend heavily on that individual's perception and trust of the evidence.37 If the individual understands and values the methodology that informs the evidence-based practice and guidelines, and the source is credible, the clinician is likely to determine the process to be credible and acceptable, allowing progress from this attribute toward subsequent practice change.32,54

Patient education/involvement in decision-making about new practice

Many changes in clinical practice will directly affect the patient, and consideration of the patient as a stakeholder is an additional requirement for effective implementation. Depending on the level of direct contact that a patient will have with a proposed change in practice, involvement in the decision-making and implementation plan may vary. For example, updating a cancer screening process or a vaccination schedule relies heavily on the patient to comply with the recommendation; by contrast, an update to a method for measuring blood pressure or assessing risk of stroke relies less heavily on patient compliance and buy-in. Regardless, the patient's decision-making should be a consideration during the development of a CPG implementation plan.36


Attribute #4: Practice change

Practice change, the next phase and attribute, is often complex and multidimensional, depending on the modification to the practice and the setting in which the practice occurs.36 This phase involves change across the health care system, the health care providers and other staff involved in the provision of care, and patients.39 It may also be referred to as the active implementation phase.33 Because of this multidimensionality, the attribute is instrumental: its presence signals that the previous attributes have been established and that the decision has been made to change current practice while considering the complexities that affect a successful process change.

As discussed in May's general theory of implementation, practice change is best considered a process embedded within a broader social context; without integration into the already existing workflow, a change is unlikely to be effectively implemented.35 This attribute requires not only the prior assessments and acceptance of the proposed practice, but also a system that will support the adaptation. It is also where the majority of the described multilevel and multifactorial facilitators and barriers become evident. For example, a health care system that is culturally adaptive to change and has a well-integrated clinical informatics program to support the technical requirements of the change will likely implement the guideline more effectively than one without these characteristics.53


Attribute #5: Plan for reassessment and evaluation

Following implementation of an updated or new CPG, a clear plan for evaluating its effectiveness should be established. This often takes place in clinical settings as a quality improvement/quality assurance protocol. It can also be formalized through an established group of clinician champions tasked with evaluating the literature at regular intervals. This reassessment phase will also help identify challenges in the implementation plan and may prompt discussion about adapting current procedures to make them more feasible in a particular clinical setting. Just as there are published, transparent methodologies for developing and updating guidelines, an institution should establish similarly transparent practices for updating and adapting its procedures.
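
One common reassessment measure is guideline adherence over time, for example, the proportion of relevant orders that follow the implemented algorithm in each review interval. A minimal sketch (the field names and records are hypothetical):

from collections import Counter

# Each record marks whether one order followed the implemented algorithm.
orders = [
    {"quarter": "2019-Q1", "adherent": True},
    {"quarter": "2019-Q1", "adherent": False},
    {"quarter": "2019-Q2", "adherent": True},
    {"quarter": "2019-Q2", "adherent": True},
]

totals, adherent = Counter(), Counter()
for order in orders:
    totals[order["quarter"]] += 1
    adherent[order["quarter"]] += order["adherent"]  # True counts as 1

for quarter in sorted(totals):
    rate = 100 * adherent[quarter] / totals[quarter]
    print(f"{quarter}: {rate:.0f}% adherence")

A falling adherence rate flags either a problem with the implementation strategy or a need to adapt the procedure, feeding the result back into the iterative process.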

Deimplementation

Because implementation by definition requires a change in practice, we recommend consideration of deimplementation as an important component, often occurring during the reassessment and evaluation phase. Deimplementation occurs when a previously accepted practice, studied with new or refined research methods, is found to be ineffective or even harmful.55 A recent concept analysis defined deimplementation as “the process of identifying and removing harmful, non-cost-effective, or ineffective practices based on tradition and without adequate scientific support.”56 Sometimes referred to as medical reversal, this attribute should be developed in parallel with implementation strategies for practice change. Although deimplementation describes a reversal of a previously established practice, the plan to fill the gap created by a deimplemented practice needs to be clear: one cannot simply stop providing care in a certain way without a plan to continue providing the needed care in an accepted and predetermined approach. Deimplementation, in fact, has been cited as political and contentious.57 Financial, political, and other special interests may present additional barriers to deimplementation of a specific practice.58


Cases

Cases provide exemplars of the concept in a contextual format and help to further clarify the concept of interest.22 The ideal case presents the concept in an optimal setting or use and can be thought of as a gold standard. The other cases either represent suboptimal uses of the concept or do not meet the conceptual criteria at all.


Ideal case

Westwell Hospital had been prescribing antiemetic medications for their pediatric oncology patients utilizing an algorithm developed by the providers caring for the patients, based on a 2013 guideline published in a relevant oncology journal. The algorithm was developed in early 2014, following the guideline's publication, by the division's Clinical Guidelines Committee and implemented in the electronic health record in collaboration with the clinical informatics department, through a clinical decision support system that provides recommendations based on the algorithm. A nurse involved in the division's quality assurance research initiatives recently attended a conference and learned about a more current guideline, published in 2018 by the Pediatric Oncology Group of Ontario, on the prevention of acute chemotherapy-induced nausea and vomiting. The updated guideline presented data that no longer supported a previously used medication for antiemetic purposes and permitted another antiemetic to be used in younger patients than previously published. The nurse presented this updated guideline to the policy and procedure committee and suggested revising the algorithm for prescribing antiemetics to remove the ineffective medication. The committee evaluated the guideline using the AGREE approach, and it received scores that supported implementation. The nurse subsequently presented these data to the nurses and physicians in the division to assess their interest in, and agreement with, modifying the prescribing process and removing the outdated medication from the algorithm as a first-line medication. They came to consensus and agreed to modify their practice, and the pharmacists, providers, clinical informaticians, and health information technology team worked together to integrate the new guideline into the prior automated clinical decision support in conjunction with the computerized provider order entry algorithm.

In addition to developing the algorithm, the implementation process included an educational initiative for patients and their families to ensure they understood any changes in individual antiemetic regimens and that any questions related to changes in their medication schedules were addressed. A team of nurses developed a research project to assess the success of the implementation strategy by examining adherence to the guideline; they also explored the barriers and facilitators to adopting the guideline through a qualitative study. The research project was approved by the local institutional review board, and ongoing assessments were conducted.

This is an ideal case because it includes all of the attributes previously defined as necessary in the implementation process. Specifically, all relevant stakeholders were involved in the process, including nurses, physicians, informaticians, pharmacists, and, in this case, patients. The key champion, a nurse, assessed the evidence and considered it rigorous enough to implement into practice. In addition, the ongoing assessment in the form of a research project strengthened the practice change process and helped to ensure that the relevant providers would continue to value optimal antiemetic prescribing practices.


Borderline case

Westwell Hospital had been prescribing antiemetics for their pediatric oncology patients utilizing an algorithm that was developed by the physicians caring for the patients. The algorithm was not developed from the literature even though extensive literature is available on this topic. The hospital had an administrative committee to develop policies and procedures, and this committee decided to update their guideline for antiemetic prescribing practices. An administrator searched online for a guideline and subsequently chose an outdated publication to use as a template for their updated algorithm. After implementing the new guideline into their algorithm, they posted the updated policy on the internal hospital Web site and sent an e-mail to the providers in pediatric oncology to notify them of the implementation. They also notified the pharmacy department of the implemented practice change. The division of pediatric oncology provided ad hoc education to both physicians and nurses about the changes, and some of the nurses provided education to their patients and families.

Subsequently, the hospital committee received updated practice guidelines through a central e-mailing mechanism, and one of the committee members noted that there was an updated guideline on the prevention of chemotherapy-induced nausea and vomiting in pediatric oncology patients, which omitted one medication that had previously been widely used by the hospital. The committee reviewed the guideline and followed the same procedures to update their algorithm and reverse, or deimplement, the prior practice that included this medication. E-mails were again sent to the physicians, nurses, and pharmacists to notify them of the deimplementation. The updates were also briefly discussed at a division meeting.

This case is borderline because, although it includes some of the attributes required for implementation, the vital component of individual clinician assessment is missing from the process. Without this important step, it is not clear that the algorithm will actually be practiced; clinicians who do not understand or believe in a practice change are less likely to adopt it. Furthermore, without the input of these important stakeholders, it is unlikely that the nurses and physicians providing care would extend themselves to provide additional education to their patients, even if they did accept the deimplementation of the practice, or conduct an assessment of patient outcomes after the change in practice, as the nurses did in the ideal case.


Related case

Westwell Hospital identified a need to update its policies and procedures for a Joint Commission inspection. A committee was formed of pharmacists, nursing supervisors, nurse practitioners, physician assistants, and physical therapists, with a physician designated as the chair. They examined each policy in alphabetical order and described whether the policy was followed as stated or whether the usual practice in the hospital was a variation on it. If a policy was not followed as prescribed, they explained why it had been adapted. The usual practice was then assessed by the committee to determine whether the variation was within standard acceptable procedure, and if so, they modified the policy to match the practice. Occasionally, a published guideline was introduced and, if everyone on the committee agreed, the guideline informed an amendment to the policy. When more than one guideline was available from the different disciplines, the committee voted on which one to incorporate.

This is a related case because policy and practice are being aligned, but the result is not necessarily consistent with guidelines or best practice. There is no individual provider assessment, nor any step to allow for education and, ultimately, adoption by the clinicians. This case demonstrates a concept related to, but distinct from, implementation.


Contrary case

Westwell Hospital had a new pediatric oncology division chief who did not agree with the current antiemetic algorithm for pediatric oncology patients. The chief decided to change the algorithm to utilize an outdated medication not backed by evidence, because that was the way she had always practiced. There was no plan in place to review the algorithm to identify new information or modify the current practice.

This is a contrary case because it does not contain any of the required attributes. One person makes a determination that is based on no evidence and has no stakeholder input, and the practice is set without a plan for any future assessment or modification.


Illegitimate case

Westwell Hospital advertised an interest in implementation science and delegated to an administrative committee the task of reviewing the medications prescribed in the hospital. The committee focused on the most expensive and most frequently prescribed medications and realized that a certain class of antiemetics was at the top of the list. Under the premise of implementation science, the committee decided to remove the class of drugs from the formulary and replace the recommendation with an older class of drugs in the antiemetic algorithm. A notice was sent to the nurses, physicians, and pharmacists in the pediatric oncology division to update them on the new algorithm, which was presented as the product of a literature review.

This is an illegitimate case because the hospital cites this process as implementation when, in fact, it is not: the evidence being cited is illegitimate or nonexistent. As in the related case, new evidence, the initiator of the process, is not present. This case is deemed illegitimate because it hides under the ruse of implementation.


Antecedents, consequences, and empirical referents

The antecedents and consequences of a concept describe the preexisting requirements and subsequent value or potential risks of the concept if uniformly understood. Empirical referents provide the relationships to the defining attributes and, therefore, benchmarks to measure the conceptual outcome of interest.22 These components of the concept analysis can provide a framework to guide the development and evaluation of future related studies.

Antecedents, or requirements, for implementation of guidelines include a health care setting that provides care to individuals. The concept further requires established providers of care as well as patients who receive that care. In addition to the setting, the concept of implementation requires the availability of evidence-based CPGs; this is important to acknowledge, as guidelines can only be developed when there is evidence to support them.19 These factors must be in place to create the setting for this concept. It is important to acknowledge, however, that although the described cases involve a hospital setting, implementation of guidelines can occur in any setting where health care is provided, such as a public health clinic or through community health workers.

Although implementation of guidelines should lead to the provision of high-quality, evidence-based care, there are potential consequences to consider. Because the process of implementation involves multiple steps and individuals, errors may occur along the way. These may include, but are not limited to, practice change to a non-evidence-based practice; individual fatigue with the evidence-based process, leading to distrust of the resulting evidence and recommendations; and increased tension between health care institutions and individual clinicians due to discord in the assessment of the evidence. Each of these potential outcomes may lead to poorer-quality care.

The defining attributes of implementation can also provide benchmarks to measure progress or highlight incorrect or ineffective implementation strategies. Intervention or measurement at any of the 5 previously described attributes will guide further understanding and improvement of this process. Prior frameworks, specifically those discussed in Tabak and colleagues' evaluation of 61 models, provide a historical reference point for evaluating implementation of CPGs.34 These are consistent with our findings, which highlight the multilevel process necessary for effective implementation.


IMPLICATIONS FOR CLINICAL PRACTICE

Clinical practice is constantly shifting and adapting to new evidence and published guidelines. Practice may also be mandated at the policy level of a hospital, clinic, or other health care setting. Because implementation of these evidence-based guidelines directly affects clinicians in their work environment, it is necessary to clearly define and delineate the specific attributes of this concept. We have demonstrated that the health care provider is an integral part of an effective implementation process, and the uninformed or opposed clinician may delay or arrest even the most well-designed implementation strategy.

In addition, health care providers, as the frontline of the care delivery system, should be actively involved in the process of adaptation of guidelines to integrate them into the local workflow.59 Without clinical input, it is unlikely that an implementation strategy will be effective and sustainable.13 The informed clinician can and should be an integral part of sustaining new, effective clinical care and conducting ongoing evaluation to monitor its impact over time.


LIMITATIONS

This concept analysis describes implementation in the health care setting. The cases focus on guideline implementation in a clinical setting, but the scope of implementation is broad and includes health policy and other system-level applications. The breadth of the concept may be oversimplified in a limited concept analysis such as this, in which only implementation of CPGs in a specific setting, health care, is examined. We propose, however, that limiting the concept to this setting will aid clinicians in better understanding the concept and its importance to their local care delivery model.

In addition, the narrow focus of the theoretical cases may simplify the concept and underappreciate its extensive value. It is important, however, to provide concrete examples to clarify the processes involved in implementation. The cases were chosen specifically because of their concreteness in the hope that clinicians will find them relatable and tangible.

Another potential limitation of this concept analysis is the literature review. Although we included broad terms in the search, it is possible that we missed important publications that contributed to the implementation science literature. To reduce this limitation, the included publications were reviewed by an expert in implementation science, and the GEM search was also conducted. In addition, we prioritized systematic reviews and theoretical frameworks or reviews of multiple frameworks to ensure comprehensiveness.


CONCLUSION

This concept analysis provides a comprehensive definition of implementation of CPGs in clinical settings. It is our hope that this analysis will provide a clear definition of the concept for clinicians and health care providers, as they are exposed to implementation science studies at their institutions. Quality improvement projects, implementation strategies, and full-scale interventional studies are being conducted more often in a wide variety of clinical settings, and it is imperative that clinicians and researchers are able to communicate clearly and understand the meanings of these terms, concepts, and projects.

Implementation and deimplementation are distinct but related processes, and both are vital. In the case of deimplementation, the required practice change signifies that the previous practice is accepted as no longer effective. The health care system has determined, through the same process as implementation, to stop the ineffective practice in exchange for either a previously established practice or, where the evidence supports it, a newly established one.

We propose a repeating process model (Figure 2) that rests continually on a belief in, and reliance on, evidence-based practice. Each of these attributes is a potential area of focus for future interventions in implementation and deimplementation research. The attributes are supported by a previously established framework of implementability that describes multiple domains necessary for a guideline to be implemented.60 Although the two are not identical, they share concepts and overlap substantially. Our conceptual model, however, visualizes the process as iterative, which is necessary to define and integrate implementation and deimplementation processes into clinical practice.

Figure 2. Repeating process model


REFERENCES

1. Haines A, Donald A. Making better use of research findings. BMJ. 1998;317(7150):72–75.
2. Woolf S, Schunemann HJ, Eccles MP, Grimshaw JM, Shekelle P. Developing clinical practice guidelines: types of evidence and outcomes; values and economics, synthesis, grading, and presentation and deriving recommendations. Implement Sci. 2012;7:61.
3. Guyatt GH, Oxman AD, Vist GE, et al. GRADE: an emerging consensus on rating quality of evidence and strength of recommendations. BMJ. 2008;336(7650):924–926.
4. US Preventive Services Task Force. Procedure Manual. Rockville, MD: US Preventive Services Task Force; 2017.
5. Cabana MD, Rand CS, Powe NR, et al. Why don't physicians follow clinical practice guidelines? A framework for improvement. JAMA. 1999;282(15):1458–1465.
6. Bostrom AM, Kajermo KN, Nordstrom G, Wallin L. Barriers to research utilization and research use among registered nurses working in the care of older people: does the BARRIERS scale discriminate between research users and non-research users on perceptions of barriers? Implement Sci. 2008;3:24.
7. Westfall JM, Mold J, Fagnan L. Practice-based research—“Blue Highways” on the NIH roadmap. JAMA. 2007;297(4):403–406.
8. Balas EA, Boren SA. Managing clinical knowledge for health care improvement. Yearb Med Inform. 2000;(1):65–70.
9. Prior M, Guerin M, Grimmer-Somers K. The effectiveness of clinical guideline implementation strategies—a synthesis of systematic review findings. J Eval Clin Pract. 2008;14(5):888–897.
10. Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21.
11. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139.
12. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.
13. Proctor E, Silmere H, Raghavan R, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38(2):65–76.
14. Varsi C, Ekstedt M, Gammon D, Ruland CM. Using the consolidated framework for implementation research to identify barriers and facilitators for the implementation of an internet-based patient-provider communication service in five settings: a qualitative study. J Med Internet Res. 2015;17(11):e262.
15. The periodic health examination. Canadian Task Force on the Periodic Health Examination. Can Med Assoc J. 1979;121(9):1193–1254.
16. Glasgow RE, Green LW, Ammerman A. A focus on external validity. Eval Health Prof. 2007;30(2):115–117.
17. Shelton RC, Cooper BR, Stirman SW. The sustainability of evidence-based interventions and practices in public health and health care. Annu Rev Public Health. 2018;39:55–76.
18. Green LW, Glasgow RE. Evaluating the relevance, generalization, and applicability of research: issues in external validation and translation methodology. Eval Health Prof. 2006;29(1):126–153.
19. Institute of Medicine. Clinical Practice Guidelines We Can Trust. Washington, DC: National Academies Press; 2011.
20. Eccles MP, Mittman BS. Welcome to implementation science. Implement Sci. 2006;1(1):1.
21. Michie S, Johnston M, Abraham C, et al. Making psychological theory useful for implementing evidence based practice: a consensus approach. Qual Saf Health Care. 2005;14(1):26–33.
22. Walker LO, Avant KC. Strategies for Theory Construction in Nursing. 5th ed. Upper Saddle River, NJ: Pearson Prentice Hall; 2011.
23. Brownson RC, Colditz GA, Proctor EK. Dissemination and Implementation Research in Health: Translating Science to Practice. 2nd ed. Oxford, England: Oxford University Press; 2017.
24. Moser RP, Hesse BW, Shaikh AR, et al. Grid-Enabled Measures. Am J Prev Med. 2011;40(5, suppl 2):S134–S143.
25. Brouwers MC, Garcia K, Makarski J, Daraz L. The landscape of knowledge translation interventions in cancer control: what do we know and where to next? A review of systematic reviews. Implement Sci. 2011;6:130.
26. Abrahamson KA, Fox RL, Doebbeling BN. Original research: facilitators and barriers to clinical practice guideline use among nurses. Am J Nurs. 2012;112(7):26–35.
27. Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: a conceptual framework. Qual Saf Health Care. 1998;7(3):149–158.
28. Bero LA, Grilli R, Grimshaw JM, Harvey E, Oxman AD, Thomson MA. Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. The Cochrane Effective Practice and Organization of Care Review Group. BMJ. 1998;317(7156):465–468.
29. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–1327.
30. Grol R, Grimshaw J. From best evidence to best practice: effective implementation of change in patients' care. Lancet. 2003;362(9391):1225–1230.
31. Wandersman A, Duffy J, Flaspohler P, et al. Bridging the gap between prevention research and practice: the interactive systems framework for dissemination and implementation. Am J Community Psychol. 2008;41(3/4):171–181.
32. Feldstein AC, Glasgow RE. A practical, robust implementation and sustainability model (PRISM) for integrating research findings into practice. Jt Comm J Qual Patient Saf. 2008;34(4):228–243.
33. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38(1):4–23.
34. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43(3):337–350.
35. May C. Towards a general theory of implementation. Implement Sci. 2013;8:18.
36. Flottorp SA, Oxman AD, Krause J, et al. A checklist for identifying determinants of practice: a systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implement Sci. 2013;8(1):35.
37. Weng YH, Kuo KN, Yang CY, Lo HL, Chen C, Chiu YW. Implementation of evidence-based practice across medical, nursing, pharmacological and allied healthcare professionals: a questionnaire survey in nationwide hospital settings. Implement Sci. 2013;8(1):112.
38. Gagliardi AR, Brouwers MC. Do guidelines offer implementation advice to target users? A systematic review of guideline applicability. BMJ Open. 2015;5(2):e007047.
39. Jun J, Kovner CT, Stimpfel AW. Barriers and facilitators of nurses' use of clinical practice guidelines: an integrative review. Int J Nurs Stud. 2016;60:54–68.
40. Birken SA, Rohweder CL, Powell BJ, et al. T-CaST: an implementation theory comparison and selection tool. Implement Sci. 2018;13(1):143.
41. Byhoff E, Freund KM, Garg A. Accelerating the implementation of social determinants of health interventions in internal medicine. J Gen Intern Med. 2018;33(2):223–225.
42. Wiggers J, McElwaine K, Freund M, et al. Increasing the provision of preventive care by community healthcare services: a stepped wedge implementation trial. Implement Sci. 2017;12(1):105.
43. Dobrasz G, Hatfield M, Jones LM, Berdis JJ, Miller EE, Entrekin MS. Nurse-driven protocols for febrile pediatric oncology patients. J Emerg Nurs. 2013;39(3):289–295.
44. Grimshaw J, Eccles M, Thomas R, et al. Toward evidence-based quality improvement. Evidence (and its limitations) of the effectiveness of guideline dissemination and implementation strategies 1966-1998. J Gen Intern Med. 2006;21(suppl 2):S14–S20.
45. Moore K, Johnson G, Fortner BV, Houts AC. The AIM Higher Initiative: new procedures implemented for assessment, information, and management of chemotherapy toxicities in community oncology clinics. Clin J Oncol Nurs. 2008;12(2):229–238.
46. Preston RM. Drug errors and patient safety: the need for a change in practice. Br J Nurs. 2004;13(2):72–78.
47. Nieva VF, Murphy R, Ridley N, et al. Advances in patient safety from science to service: a framework for the transfer of patient safety research into practice. In: Henriksen K, Battles JB, Marks ES, Lewin DI, eds. Advances in Patient Safety: From Research to Implementation (Volume 2: Concepts and Methodology). Rockville, MD: Agency for Healthcare Research and Quality (US); 2005.
48. World Health Organization. Health Policy. http://www.who.int/topics/health_policy/en/. Published 2018.
49. Burns PB, Rohrich RJ, Chung KC. The levels of evidence and their role in evidence-based medicine. Plast Reconstr Surg. 2011;128(1):305–310.
50. Brouwers MC, Kho ME, Browman GP, et al. Development of the AGREE II, part 2: assessment of validity of items and tools to support application. CMAJ. 2010;182(10):E472–E478.
51. Ryan R, Hill S. How to GRADE the quality of the evidence. Cochrane Consumers and Communication Group; 2016.
52. Yu PP. Knowledge bases, clinical decision support systems, and rapid learning in oncology. J Oncol Pract. 2015;11(2):e206–e211.
53. Kilsdonk E, Peute LW, Jaspers MW. Factors influencing implementation success of guideline-based clinical decision support systems: a systematic review and gaps analysis. Int J Med Inform. 2017;98:56–64.
54. Lorencatto F, Gould NJ, McIntyre SA, et al. A multidimensional approach to assessing intervention fidelity in a process evaluation of audit and feedback interventions to reduce unnecessary blood transfusions: a study protocol. Implement Sci. 2016;11(1):163.
55. Prasad V, Ioannidis JP. Evidence-based de-implementation for contradicted, unproven, and aspiring healthcare practices. Implement Sci. 2014;9:1.
56. Upvall MJ, Bourgault AM. De-implementation: a concept analysis. Nurs Forum. 2018;53(3):376–382.
57. Johns DM, Bayer R, Fairchild AL. Evidence and the politics of deimplementation: the rise and decline of the “counseling and testing” paradigm for HIV prevention at the US Centers for Disease Control and Prevention. Milbank Q. 2016;94(1):126–162.
58. Montini T, Graham ID. “Entrenched practices and other biases”: unpacking the historical, economic, professional, and social resistance to de-implementation. Implement Sci. 2015;10(1):24.
59. Rabin BA, Brownson RC, Haire-Joshu D, Kreuter MW, Weaver NL. A glossary for dissemination and implementation research in health. J Public Health Manag Pract. 2008;14(2):117–123.
60. Gagliardi AR, Brouwers MC, Palda VA, Lemieux-Charles L, Grimshaw JM. How can we improve guideline use? A conceptual framework of implementability. Implement Sci. 2011;6(1):26.

Keywords:

clinical practice guidelines; evidence-based practice; health care settings; implementation

      Copyright © 2019 Wolters Kluwer Health, Inc. All rights reserved.