Embedding Implementation Research in Community-Based Health Care Systems

Observations of a Field Scientist

Knighton, Andrew J., PhD, CPA; Allen, Todd L., MD; Srivastava, Rajendu, MD, MPH

Quality Management in Healthcare: April/June 2019 - Volume 28 - Issue 2 - p 114–116
doi: 10.1097/QMH.0000000000000210
Intermountain Advances

Healthcare Delivery Institute, Intermountain Healthcare, Salt Lake City, Utah (Drs Knighton, Allen, and Srivastava); and Department of Pediatrics, Division of Inpatient Medicine, Primary Children's Hospital/University of Utah School of Medicine, Salt Lake City (Dr Srivastava).

Correspondence: Andrew J. Knighton, PhD, CPA, Healthcare Delivery Institute, Intermountain Healthcare, 36 S State St, Salt Lake City, UT 84111 (andrew.knighton@imail.org).

The authors declare no conflicts of interest.

Reliable translation of clinical discoveries into routine clinical practice remains a challenge, and widespread variation in the use of evidence-based practice (EBP) persists. Health care could do better, and improve quality, by embedding implementation science into the way health care organizations operate. The choice of implementation strategies used to deploy evidence-based clinical practices has been linked to increased adherence to EBP. This has led to strong interest in implementation science and in advancing implementation research to determine which implementation strategies work best under which circumstances. Much of the interest stems from health care delivery systems seeking solutions to improve care quality and lower the cost of care in a fee-for-value environment.1

As a community-based health care delivery system, Intermountain Healthcare (Intermountain) has a rich legacy of quality improvement. Like other health care systems, Intermountain struggles to increase the use of EBP across all clinical areas and to sustain adherence in areas where high achievement has occurred. The Healthcare Delivery Institute (HDI) at Intermountain seeks to sustain Intermountain's mission of “Helping People Live the Healthiest Lives Possible” by educating internal and external practitioners in the science of quality improvement. Through internal and external funding mechanisms, the HDI supports embedded implementation scientists responsible for studying the best methods for increasing adherence to EBP across the care delivery continuum. We want to highlight both the opportunities and the challenges that community-based health care systems face in translating implementation science methods and techniques into practice to improve health care quality.

SUCCESSFUL IMPLEMENTATIONS DON'T JUST HAPPEN

A surprisingly common misconception is that most ideas will spread with high adherence on their own or by word of mouth. The persistence of care variation, the number of developed EBP artifacts that go unused, and the limited investments many health care delivery systems have made in implementation resources all suggest otherwise. Implementation success in deploying EBP is highly contextual and depends on addressing interrelated factors involving people, processes, and technology (when applicable). Even well-resourced and well-planned projects require organizational will from all levels of an organization, and robust execution against agreed-upon metrics of progress, to succeed.

PARTNER WITH CLINICAL EFFECTIVENESS TEAMS

Implementation scientists rarely work alone when embedded in delivery systems. Internal and external funding opportunities often involve close alignment with a clinical effectiveness team whose goal is to increase adherence to an EBP. Most often, clinical partners engage the implementation scientist when they encounter barriers to adherence and the strategies they have tried are not working. The more closely the clinical and operations teams are aligned with the implementation scientists, the more successful the process and outcomes are likely to be.

WORKING WITH AN IMPLEMENTATION SCIENTIST—CONSULTING, PROJECT MANAGEMENT, AND/OR RESEARCH?

One of the first challenges faced by the embedded implementation scientist is helping stakeholders understand what implementation science is and how it might help them meet their needs in effectively spreading EBP. Expectations regarding the role the implementation scientist can play vary widely. The field of implementation science developed out of a need to study “methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice.”2(p1) Practically speaking, implementation science brings a bundle of techniques that assist the stakeholder in (1) identifying barriers and facilitators to the use of EBP in clinical care; (2) identifying strategies for implementing EBP in practice; (3) deploying strategies; and (4) evaluating the results of this work. This applied research in a community-based setting can support the development of playbooks for conducting implementations across the care delivery continuum. Typically, clinical and operational stakeholders are seeking advice from embedded implementation scientists on how to increase adherence to an existing EBP, or they are looking for a project manager to lead the effort. This kind of support has real value to the stakeholders but may not advance the field of implementation research. Pragmatic approaches are required that balance consultation with efforts to embed implementation research objectives into the project.

NOT ALL IMPLEMENTATION PROJECTS ARE THE SAME

Two broad categories of implementation projects are emerging at Intermountain, consistent with the literature: (1) implementation projects that focus on introducing or increasing adherence to an EBP; and (2) de-implementation projects that focus on stopping the spread or use of an existing practice. These are meaningfully different implementation problems requiring different field engagement strategies. One recent Intermountain project focused on aligning resources within an emergency department (ED) to improve door-to-needle times for stroke treatment. This type of project required a high level of local field engagement and collaboration to align the multiple clinical and administrative decision makers at the site. In contrast, a recent de-implementation project focused on reducing the use of computed tomography (CT) scans in pediatric patients with low-risk head trauma. In this situation, the emphasis was on influencing ED physicians to forgo ordering a CT scan for low-risk pediatric head trauma patients, using a data-rich alert in the electronic health record (EHR).
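
As a rough illustration of how such a data-rich alert might be expressed as decision-support logic, the sketch below encodes a simplified rule set in Python. The encounter fields, risk predictors, and alert text are illustrative assumptions, loosely modeled on published pediatric head-trauma prediction rules; they are not Intermountain's actual criteria or EHR implementation.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical encounter record; field names are assumptions for illustration.
@dataclass
class HeadTraumaEncounter:
    age_years: float
    gcs_score: int                  # Glasgow Coma Scale at presentation
    altered_mental_status: bool
    signs_of_skull_fracture: bool
    loss_of_consciousness: bool
    severe_mechanism: bool          # e.g., high-speed collision, long fall

def is_low_risk(e: HeadTraumaEncounter) -> bool:
    """Return True when none of the (assumed) high-risk predictors is present."""
    return not (
        e.gcs_score < 15
        or e.altered_mental_status
        or e.signs_of_skull_fracture
        or e.loss_of_consciousness
        or e.severe_mechanism
    )

def ct_order_alert(e: HeadTraumaEncounter) -> Optional[str]:
    """Return an alert message when a CT order is placed for a low-risk
    pediatric head trauma patient; otherwise return None."""
    if e.age_years < 18 and is_low_risk(e):
        return (
            "Low-risk pediatric head trauma: observation is generally "
            "preferred over CT; the risk of clinically important brain "
            "injury is very low when no high-risk predictor is present."
        )
    return None  # no alert; the order proceeds unchanged

# Example: a 6-year-old with a minor fall, GCS 15, and no other predictors
print(ct_order_alert(HeadTraumaEncounter(6, 15, False, False, False, False)))
```

In practice, logic of this kind would sit inside the EHR's own decision-support tooling and be paired with patient-specific data in the alert itself; the point of the sketch is simply that a de-implementation strategy can be reduced to an explicit, auditable rule applied at the moment of ordering.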

FOCUS ON THE MOST IMPORTANT INNOVATIONS

There is no shortage of EBP to consider when determining what to deploy. Given the strong role that clinical champions play in advancing changes in care, a plethora of good ideas arise from within the organization. These ideas should be harnessed, given their implications for improving patient outcomes, reducing costs, and increasing workforce engagement. However, implementation also carries a cost to the delivery system, in both time and effort, that is often borne disproportionately by frontline staff inundated with change requests. When determining which EBP to scale across an enterprise, a robust organizational model, with clear vertical and horizontal accountabilities, becomes essential. Securing buy-in, from key organizational leaders to frontline clinicians, requires that the selected projects have clear objectives, transparent reporting of results, and, if executed well, a meaningful impact on patient outcomes.

ADAPT IMPLEMENTATION FRAMEWORKS TO THE PROBLEM

A primary criticism of implementation science as an academic discipline is the failure of scientists to use hypothesis-driven methods to conduct generalizable research.3 The lack of hypothesis-driven approaches in field research challenges the underlying intellectual rigor of the discipline. Numerous frameworks now exist that are useful in implementation work. Pragmatic methods for selecting and using these frameworks, given project characteristics, are key. For example, implementation scientists initiating a project in a new context may adopt a more comprehensive framework to ensure completeness. In situations involving a well-understood environment, where the goal is increasing uptake of an already deployed practice, a simpler framework may be more appropriate.

RETHINK QUALITATIVE FIELD RESEARCH

Implementation scientists generally use some form of mixed methods to conduct their work. The qualitative component includes engaging clinicians in the field to understand barriers and facilitators to the use of EBP. We find that the rigorous qualitative fieldwork methods traditionally used in the social sciences are not always responsive to the immediate needs of clinical and operational stakeholders. A 2-step reporting process for presenting results, consistent with the operational reviews conducted by large consulting firms, has value. Initial fieldwork uses a hypothesis-driven field interview guide based on published frameworks. Preliminary field results are reported rapidly (within a few days) to obtain stakeholder feedback and guide internal project planning. In the final report, traditional qualitative methods are then used to analyze the interviews, supporting a more generalizable analysis publishable in the peer-reviewed literature.

USE TECHNOLOGY AS AN ENABLER

One important vehicle for increasing adherence is to embed EBP into clinical workflows. Given the federal meaningful-use incentives to implement EHRs, many organizations are now challenged to derive better patient outcomes from these substantial investments. The electronic workflow capabilities within EHRs make this functionality an attractive enabler for increasing EBP adherence. However, not all EBPs require technology to increase adherence. When incorporating technology, such as computerized workflows or decision support tools, it is important to resolve the people and workflow issues first, before enabling them with technology. When you build it, they do not always come: technology is often built without a clear implementation plan. As a result, the implementation scientist, the continuous improvement field teams, and the health information technology development teams need to be closely aligned, and the development properly sequenced, to deliver optimal adherence to EBP and, ultimately, better patient care results.

LOOKING AHEAD

The health care delivery environment has been likened to a complex adaptive system. Significant unmet challenges remain in our efforts to improve health care quality. Strengthening the capacity to improve health care system performance remains essential in tackling these unmet challenges.4 We are finding that implementation scientists can play a vital, pragmatic role in helping organization leadership and frontline staff understand and respond to these headwinds. The time is right for implementation science to prove its worth.

REFERENCES

1. James BC, Poulsen GP. The case for capitation. Harv Bus Rev. 2016;94(7/8):102–111, 134.
2. Eccles MP, Mittman BS. Welcome to Implementation Science. Implement Sci. 2006;1:1.
3. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10:53.
4. Whicher D, Rosengren K, Siddiqi S, Simpson L, eds. The Future of Health Services Research: Advancing Health Systems Research and Practice in the United States. Washington, DC: National Academy of Medicine; 2018.
© 2019 Wolters Kluwer Health | Lippincott Williams & Wilkins