Supplement Article

Using Implementation Mapping to Ensure the Success of PrEP Optimization Through Enhanced Continuum Tracking (PrOTECT AL): A Structural Intervention to Track the Statewide PrEP Care Continuum in Alabama

Creger, Thomas PhD, MPHa; Burgan, Kaylee MAa; Turner, Wesli H. MSa; Tarrant, Ashley MPHb; Parmar, Jitesh MPH, MBA, MPAb; Rana, Aadia MDa; Mugavero, Michael MD, MHSCa; Elopre, Latesha MD, MSPHa

JAIDS Journal of Acquired Immune Deficiency Syndromes: July 1, 2022 - Volume 90 - Issue S1 - p S161-S166
doi: 10.1097/QAI.0000000000002976

Abstract

  • Evidence-based innovation: Cross-agency data sharing to improve the HIV status-neutral care continuum.
  • Innovation recipients: Decision-makers and data managers at county health departments, AIDS services organizations, and community-based organizations.
  • Setting: County health departments, AIDS services organizations, and community-based organizations.
  • Implementation gap: Evidence supports cross-agency data sharing to improve HIV prevention and care; however, the technical and legal barriers to doing so can be challenging.
  • Primary research goal: Select/pilot implementation strategies.
  • Implementation strategies: Distribute educational materials, make training dynamic, provide local technical assistance, and change record systems.

INTRODUCTION

Forty years into the HIV epidemic, public health initiatives in the United States are falling short in reducing inequities in HIV prevention, diagnoses, and treatment outcomes. Federal agencies are working in a coordinated effort to End the HIV Epidemic (EHE) with a national strategy that prioritizes prevention efforts that optimize utilization of highly effective biomedical tools, such as HIV pre-exposure prophylaxis (PrEP), in geographic areas with significant HIV rates.1 As with many Southern states, Alabama bears a disproportionate HIV burden due in part to the social and economic contexts of the region. Poverty, racial segregation, low literacy, stigma, and rurality all contribute to health disparities, including the underutilization of PrEP, a safe and effective preventive intervention that could play a central role in preventing HIV infection among individuals vulnerable to infection.2–7 To address inequities in HIV rates, implementation of evidence-based HIV prevention practices must be guided by epidemiological data similar to those modeled in the HIV treatment (care) continuum.8 Our project, PrOTECT AL (PrEP Optimization Through Enhanced Continuum Tracking), is a multiphase study using a participatory approach to investigate the potential to track and report Alabama's PrEP care continuum. The purpose of this article is to demonstrate how an implementation scientific approach, specifically Implementation Mapping (IM), can lead to a better understanding of the contextual nuances at diverse implementation sites and result in more appropriate and effective implementation strategies.

APPROACH

The planning process for PrOTECT AL has been guided by the tenets of IM. IM incorporates insights from both the field of Implementation Science and the Intervention Mapping framework, a 6-step protocol for developing theory-based and evidence-based health promotion programs.9 IM is a 5-step approach to support researchers and community partners in the systematic development, selection, and/or tailoring of implementation strategies to increase program adoption, implementation, and sustainability.10 The 5 steps are as follows: (1) conduct a needs assessment; (2) identify adoption outcomes, performance objectives, determinants, and change objectives; (3) select theoretical methods and design implementation strategies; (4) produce implementation protocols and materials; and (5) evaluate implementation outcomes.

Our study was guided by an overarching Implementation Research Logic Model11 (IRLM) (Fig. 1), which has been adapted iteratively to provide clear representation of the causal pathways between implementation strategies and short-term and long-term study outcomes.12 Through this approach, we are creating a framework that will improve our intervention's impact in the community through the selection of strategies that address barriers to implementation at the site level and will address current health inequities in PrEP engagement, our outcome of interest.13 Our IRLM consists of implementation strategies based on the determinants discovered through engagement with key community partners and through interpretation of the findings from semistructured interviews, focus group interviews and meetings with implementation partners, and quantitative surveys. In our IRLM, we provide clear definitions of our implementation strategies and potential mechanisms of action through which our implementation strategies operate to enable measurement and promote reproducibility of data sharing.14–16

FIGURE 1: Implementation Research Logic Model for PrOTECT AL.

Our implementation partners for PrOTECT AL include the Alabama Department of Public Health and members of the Alabama Quality Management Group (AQMG), a consortium of 13 geographically dispersed Ryan White–funded parts C and D programs (see Map, Supplemental Digital Content 1, https://links.lww.com/QAI/B870, which indicates locations of the AQMG members). This project received approval from the Institutional Review Board at the University of Alabama at Birmingham, protocol number IRB-300004157.

Results by IM Step

Step 1: Conduct an Implementation Needs Assessment

To better understand the data currently collected around PrEP education, referral, and provision at each site, and to inform metrics for a potential PrEP data dashboard,17 the research team invited representatives of the AQMG members (n = 12) to complete a 76-item online survey that was developed using Qualtrics CoreXM software (see Supplemental Digital Content 2, https://links.lww.com/QAI/B871, PrOTECT AL Initial Survey). The survey included items related to PrEP screening, education, referrals, linkage to prevention services and PrEP prescriptions (defined as prescriptions written by providers) as well as questions about the client-level demographic data that each site collects.

Twelve AQMG organizations participated in the survey, conducted from January 2020 through April 2020. Eleven (92%) provided PrEP education, 9 (75%) provided PrEP screening, and 10 (83%) provided PrEP care. Based on data from the organizations reporting PrEP care, Alabama's PrEP care continuum for 2019 included 6481 individuals screened for PrEP, defined as providing counseling to patients and conducting an initial assessment of patients' risk for HIV. Of those with a PrEP indication, 625 received PrEP care, defined as having had at least 1 PrEP medical visit; 478 were prescribed PrEP; and 395 were retained in PrEP care (see Graph, Supplemental Digital Content 3, https://links.lww.com/QAI/B872, which illustrates Alabama's PrEP care continuum in 2019).
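The drop-off across the continuum stages reported above can be summarized as step-to-step ratios. A minimal sketch using the 2019 survey figures (note that the number of screened individuals with a PrEP indication was not reported, so the first ratio conflates the screening and indication steps):

```python
# Alabama's 2019 PrEP care continuum counts, as reported in the survey.
continuum = [
    ("Screened for PrEP", 6481),
    ("Received PrEP care", 625),
    ("Prescribed PrEP", 478),
    ("Retained in PrEP care", 395),
]

# Step-to-step conversion: what share of each stage reached the next one.
for (stage, n), (next_stage, next_n) in zip(continuum, continuum[1:]):
    print(f"{stage} -> {next_stage}: {100 * next_n / n:.1f}%")
```

By these figures, fewer than 1 in 10 of those screened went on to receive PrEP care, by far the continuum's largest gap.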

More than half of those screened for PrEP (57.8%, n = 1563) were between the ages of 45 and 65 years. Among those screened, 48% (1320) were designated female at birth; however, only 49 of those women received PrEP services. Black or African American individuals represented more than 60% (n = 1670) of those screened, of which only 192 received PrEP services. Most of the individuals screened for PrEP (81%, n = 2190) were uninsured; however, more than 80% (n = 386) of those prescribed PrEP had insurance coverage.

To help clarify survey responses and to gauge interest in sharing data across sites, the research team scheduled semistructured interviews (see Supplemental Digital Content 4, https://links.lww.com/QAI/B873, PrOTECT AL Interview Guide) with each of the respondents, 11 of whom completed an interview. The interview guide was informed by the IRLM. All interviews were conducted from April 2020 through July 2020 and were recorded and transcribed verbatim for rapid qualitative analysis (RQA). Framework-guided RQA is used increasingly to streamline the analysis process and quickly produce actionable information.18 This analysis method generally begins with the development of an interview guide informed by a framework, theory, or the project's research questions. Concurrently, a draft summary table is created to correspond to the interview guide's associated concepts/constructs. In analysis, the research team populates the table with data extracted from interview transcripts, including illustrative quotes. The draft summary table is reviewed and modified based on the analysis, by multiple researchers, of a single interview transcript. That process is repeated with the modified template on a second interview transcript before a final template is determined to aid in the analysis of all remaining interview transcripts. Achieving a balance between scientific rigor and the need for actionable information is challenging; however, because RQA is less resource-intensive than in-depth analysis, it allows researchers to disseminate findings among research team members quickly and integrate those findings into ongoing implementation.19

In our analysis, we found that most respondents expressed a desire to have real-time data visualization on a data dashboard to inform policy and organizational decisions regarding improvement in PrEP access for their service region. Themes that emerged from the coding of key-informant interviews included constructs from the Consolidated Framework for Implementation Research (CFIR), including 3 that were hypothesized as determinants that may affect adoption: adaptability, relative advantage, and complexity.20 Regarding adaptability, all interview participants were open to adapting their workflow to share data to track the PrEP care continuum. Not all the participating organizations collected this information; however, even sites that did not currently collect PrEP care data recognized the value in doing so and indicated willingness to begin collecting those data.

Respondents acknowledged the ability to have a central location to store and share data across participating organizations as one relative advantage of a standardized data management system, along with the functionality of tracking patient care, especially the ability to identify those who have fallen out of care. The system was also viewed as a mechanism to harmonize data collection around social determinants of health.

There were some concerns about the complexity of implementing a uniform data management system, including the added burden of data capture beyond the array of data systems already in use for both internal and external purposes. Most organizations, however, agreed that a comprehensive database for the PrEP care continuum would be beneficial, particularly for reporting outcomes to funders and federal agencies.

Several of the respondents endorsed a concern related to client privacy; however, most agreed that launching a PrEP data dashboard would lead to greater collaboration and better understanding of the affected populations and help guide delivery of PrEP services to those populations. Ultimately, a naming algorithm was developed to ensure client privacy.
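The article states that a naming algorithm was developed to protect client privacy but does not describe it; the sketch below is purely a hypothetical illustration of one common approach to cross-agency record linkage without sharing identities, a keyed (HMAC) hash of normalized identifying fields. The key name and field choices are assumptions.

```python
# Hypothetical sketch only: NOT the actual PrOTECT AL naming algorithm,
# which the article does not specify. A keyed hash lets agencies link the
# same client across sites without ever uploading the underlying name.
import hashlib
import hmac

# Assumed: a secret key distributed to participating sites out of band
# and never uploaded alongside the data.
SITE_SHARED_KEY = b"example-shared-secret"

def pseudonymous_id(first_name: str, last_name: str, dob: str) -> str:
    """Derive a stable, non-reversible client code from normalized fields."""
    normalized = "|".join(
        part.strip().lower() for part in (first_name, last_name, dob)
    )
    digest = hmac.new(SITE_SHARED_KEY, normalized.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]  # short code used for dashboard joins
```

Because every agency normalizes inputs the same way and holds the same key, the same client yields the same code at every site, allowing deduplication across organizations while keeping names out of the shared dashboard.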

With the understanding that the community-based organization/AIDS services organization partners and the Alabama Department of Public Health found a PrEP data dashboard appealing, the final step in the needs assessment was to finalize a data dictionary of potential metrics for discussion with our partners and present a mockup of the dashboard for feedback. Through the survey and interviews, it was clear that a great deal of data were already being collected by each organization. All were collecting basic demographics, and most were collecting HIV risk factors, history of STIs, and information on condom use and other forms of protection. Beyond that, some were collecting information on mental health, housing stability, and food security. Drawing on recommendations from the partner organizations as to the data elements they believed essential to collect, a data dictionary was developed that included patient demographics, laboratory information, PrEP care, clinical visit data, and patient-reported outcomes (eg, a diagnosis of depression and/or anxiety, sexual behavior, housing stability, and transportation).

The data dictionary was then shared with the partners for feedback through a series of 4 focus group discussions, conducted between April 2021 and May 2021, each consisting of 2 to 8 representatives. The goal of the focus group discussions was to collect feedback on the proposed data elements, discuss the feasibility of collecting those data elements, and better understand the determinants that would influence implementation, including training and other resources needed for adoption of the dashboard. All focus groups were recorded, transcribed, and analyzed using the same RQA approach described above. The results of the focus group discussions are reported in the next section.

Step 2: Identify Adoption Outcomes, Performance Objectives, Determinants, and Change Objectives

Having determined that our partners were interested in creating and participating in a PrEP data dashboard, and with an understanding of both the data currently collected by the partners and the data they believed would be important to collect for the dashboard, the team moved to the next step in IM: establishing adoption and implementation outcomes specific to each adopter and implementer, establishing performance objectives for those same actors, and identifying determinants that might influence adoption and implementation. Adoption, implementation, and maintenance outcomes state the goals for each class of actor at each stage of the implementation. Performance objectives are the specific tasks required of each actor to achieve those outcomes (see Supplemental Digital Content 5, https://links.lww.com/QAI/B874, Partial Matrix of Adoption and Implementation Outcomes and Performance Objectives).

In the analysis of the focus group discussions described above, 6 primary codes emerged: language, organizational capacity, infrastructure, strategies for upload, training and resources, and other pertinent information. Summaries were then used to develop a table of implementation strategies, informed by the CFIR,20 the taxonomy of behavior change methods,21 and the refined compilation of implementation strategies.22

Step 3: Select Theoretical Methods and Design Implementation Strategies

Table 1 (below) outlines the stages of implementation for PrOTECT AL, the actors engaged in the implementation process, and the change methods and implementation strategies used with each actor at each stage of implementation. Chief among the strategies identified by community partners to support implementation of the PrOTECT AL dashboard were (1) develop and distribute education materials, in which the study team will develop and format study manuals, toolkits, data upload templates, and other supporting materials and distribute these to the partners to assist in learning about the innovation and data management; (2) make the training dynamic by developing process-specific training and varying the information delivery methods to cater to different learning styles (print manuals, live virtual trainings, and on-demand recorded tutorials); (3) conduct ongoing training to assist with staff buy-in and address site-level attrition; (4) centralize technical assistance by creating a dashboard website with access to the resources, tutorials, and training materials; and (5) change the record systems from multiple systems currently being used by the partner organizations to a more streamlined data entry format to allow for better assessment and easier upload of data elements.

TABLE 1. - PrOTECT AL Program Implementation Strategies

Stage: Adoption | Agent: Clinic decision maker
  Determinants*: Leadership engagement; readiness for implementation; tension for change
  Change methods: Discussion; persuasive communication; elaboration
  Implementation strategies: Conduct local consensus discussion; share results of needs assessment; assess for readiness and identify barriers and facilitators; promote adaptability

Stage: Adoption | Agent: Clinic data manager
  Determinants*: Readiness for implementation; self-efficacy; available resources
  Change methods: Discussion; guided practice; goal setting; planning coping responses
  Implementation strategies: Conduct ongoing training; provide ongoing consultation; make training dynamic; use data warehousing techniques

Stage: Implementation | Agent: All
  Determinants*: Outcome expectations; self-efficacy; relative advantage; adaptability; complexity
  Change methods: Discussion; persuasive communication; elaboration
  Implementation strategies: Develop educational materials; change record systems; centralize technical assistance

Stage: Maintenance | Agent: All
  Determinants*: Identified champion; trialability
  Change methods: Persuasive communication; monitoring and feedback; reflecting and evaluating
  Implementation strategies: Provide feedback; make training dynamic; provide technical assistance

*Damschroder L, Hall C, Gillon L, et al. The consolidated framework for implementation research (CFIR): progress to date, tools and resources, and plans for the future. Implement Sci. 2015;10:A12.
Kok G, Gottlieb NH, Peters GJ, et al. A taxonomy of behaviour change methods: an Intervention Mapping approach. Health Psychol Rev. 2016;10:297–312.
Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21.

Step 4: Produce Implementation Protocols and Materials

The next step to be completed will include design of implementation protocols and training materials. Using findings from the survey, the interviews, and the focus group discussions in the first 3 steps, the research team will create a beta version of a PrEP data dashboard and training materials to support implementation of the dashboard. All protocols and materials will be pretested by the implementation partners and refined based on partner feedback. On completion of those steps, a test data upload will be requested of our partners, and they will have a chance to engage interactively with the beta version of the statewide PrEP data dashboard.

Step 5: Evaluate Implementation Outcomes

Uncompromised completeness and implementation fidelity of evidence-based practices are preconditions for achieving desired changes in the health outcomes of target populations; therefore, the final step of IM, step 5, is implementation evaluation. In parallel with the development of the intervention implementation plan, the research team has been developing an implementation evaluation plan. Although still in development, that plan will include an assessment of implementation processes and an assessment of the intervention's feasibility, acceptability, and adaptability from our partners' point of view.23 To better understand the acceptability of the intervention, an in-depth interview guide informed by the constructs of acceptability articulated by Sekhon et al24 will be developed, and interviews will be conducted with adopters, implementers, and maintainers at all partner sites.

DISCUSSION

Important research findings often fail to find their way to enacted policy or practice, or if they do, the process is slow and arduous. Even when research findings are translated into clinical practice, the implementation frequently occurs without careful consideration of context and necessary accommodations, compromising the fidelity and therefore the effectiveness of the program, policy, or procedure. Over the past decade and a half, Implementation Science (IS) has emerged as the best potential solution to reducing the time lag prevalent in the research-to-practice translation process.25 Despite the success of IS, many programs fail, in part because successful translation of research into practice requires context-specific implementation planning. In light of this, the researchers adopted IM, a participatory approach, for the planning and implementation processes in PrOTECT AL.

Relevant to our project, IM supports researcher and community-partner efforts to design interventions and implementation strategies that are context-specific. This systematic process engages stakeholders in the development of the program. Furthermore, IM can be used to systematically adapt existing evidence-based interventions to align them with new populations, geographic regions, or in the case of PrOTECT AL, different implementation contexts. For our project, IM allowed for quick identification of key implementation strategies, including development of a data dashboard for real-time visualization of our PrEP care continuum and the need for additional strategies to support adoption. Subsequent steps resulted in a better understanding of critical support systems and tools needed by community partners to implement uniform data collection across sites and desired shared data elements to include in the dashboard.

Increasingly, data dashboards have contributed to the democratization of data, making relevant information interactive, accessible, and actionable.26,27 AIDSVu is an excellent example of a resource for visualization of HIV surveillance and other population-based data relevant to HIV prevention, care, policy, and impact assessment.28 In September 2015, New York State launched its own EHE Dashboard System, a public-facing website containing current information about New York State's status-neutral HIV care continuum in a visual format.29,30 In contrast to New York's EHE Dashboard, which mines its PrEP care data from the NYS DOH Medicaid data warehouse, our approach works directly with our partners at AQMG organizations and the State health department to track clients across the PrEP care continuum. This allows for relevant identification of contextual barriers and potential selection of evidence-based interventions to close gaps in the continuum. Most importantly, our work has identified critical strategies to improve uptake of the dashboard, and continued engagement in IM steps will likely provide more insight into how public health institutions can effectively use a data dashboard to affect HIV inequities.

Finally, and most germane to this project, IM can help planners to develop, select, or tailor implementation strategies to increase adoption, implementation, and sustainability. For program planning, development, adaptation, and implementation, IM can reduce the gap between effective clinical practices, policies, and programs and their actual use in health care settings and communities.10

Like most participatory processes, IM is time-consuming and requires expertise that not all project teams may have. Program planners would benefit from familiarizing themselves with the processes of IM before determining whether IM is the best approach. Furthermore, our team benefitted greatly from our many years of working closely with all our implementation partners. Developing the necessary trust with implementation partners is essential to ensuring success with IM.

CONCLUSIONS

For PrOTECT AL, we are conducting IM by engaging each of our community partners individually and collectively, using qualitative and quantitative methods to develop implementation strategies tailored to their needs and capacities. This will help us achieve the ultimate goal of our state's and the national EHE initiatives: to decrease the number of new HIV cases annually and to overcome intersectional inequities in the burden of HIV fueled by racism, sexism, homophobia, geography, and poverty.

In the same way that Intervention Mapping has greatly improved the planning and therefore the effectiveness of health promotion and other interventions, IM can lead to an increased understanding of the contextual nuances at implementation sites, leading to more appropriate and effective implementation strategies and ultimately more impactful implementation.

Researchers and practitioners alike have recognized the need for better descriptions of implementation planning processes to facilitate replication and dissemination of evidence-based interventions.31 Our project contributes to the literature on translational science by articulating, evaluating, and reporting on implementation planning using a participatory approach and IM to address potential contextual facilitators and challenges before implementation.

ACKNOWLEDGMENTS

The authors thank all the clinicians, data managers, social workers, staff, and patients of Alabama Quality Management Group–associated clinics.

REFERENCES

1. Fauci AS, Redfield RR, Sigounas G, et al. Ending the HIV epidemic: a plan for the United States. JAMA. 2019;321:844–855.
2. Siegler AJ, Mouhanna F, Giler RM, et al. The prevalence of pre-exposure prophylaxis use and the pre-exposure prophylaxis–to-need ratio in the fourth quarter of 2017, United States. Ann Epidemiol. 2018;28:841–849.
3. Williams DR, Collins C. Racial residential segregation: a fundamental cause of racial disparities in health. Public Health Rep. 2001;116:404–416.
4. Homan P. Structural sexism and health in the United States: a new perspective on health inequality and the gender system. Am Sociological Rev. 2019;84:486–516.
5. Baker DW, Parker RM, Williams MV, et al. The relationship of patient reading ability to self-reported health and use of health services. Am J Public Health. 1997;87:1027–1030.
6. Eaton LA, Earnshaw VA, Maksut JL, et al. Experiences of stigma and health care engagement among Black MSM newly diagnosed with HIV/STI. J Behav Med. 2018;41:458–466.
7. Probst JC, Moore CG, Glover SH, et al. Person and place: the compounding effects of race/ethnicity and rurality on health. Am J Public Health. 2004;94:1695–1703.
8. Myers JE, Braunstein SL, Xia Q, et al. Redefining prevention and care: a status-neutral approach to HIV. Open Forum Infect Dis. 2018;5:ofy097.
9. Bartholomew LK, Parcel GS, Kok G. Intervention mapping: a process for developing theory- and evidence-based health education programs. Health Educ Behav. 1998;25:545–563.
10. Fernandez ME, Ten Hoor GA, van Lieshout S, et al. Implementation mapping: using intervention mapping to develop implementation strategies. Front Public Health. 2019;7:158.
11. Smith JD, Li DH, Rafferty MR. The Implementation Research Logic Model: a method for planning, executing, reporting, and synthesizing implementation projects. Implement Sci. 2020;15:84.
12. Smith J. An implementation research logic model: a step toward improving scientific rigor, transparency, reproducibility, and specification. 11th Annual Conference on the Science of Dissemination and Implementation. AcademyHealth; 2018.
13. Bosch M, Van Der Weijden T, Wensing M, et al. Tailoring quality improvement interventions to identified barriers: a multiple case analysis. J Eval Clin Pract. 2007;13:161–168.
14. Craig P, Dieppe P, Macintyre S, et al. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ. 2008;337:a1655.
15. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139.
16. Kazdin AE. Evidence-based treatment and practice: new opportunities to bridge clinical research and practice, enhance the knowledge base, and improve patient care. Am Psychol. 2008;63:146–159.
17. Nash D. Designing and disseminating metrics to support jurisdictional efforts to end the public health threat posed by HIV epidemics. Am J Public Health. 2020;110:53–57.
18. Watkins DC. Rapid and rigorous qualitative data analysis: the “RADaR” technique for applied research. Int J Qual Methods. 2017;16:1609406917712131.
19. Nevedal AL, Reardon CM, Widerquist MAO, et al. Rapid versus traditional qualitative analysis using the consolidated framework for implementation research (CFIR). Implement Sci. 2021;16:67.
20. Damschroder LJ, Aron DC, Keith RE, et al. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.
21. Kok G, Gottlieb NH, Peters GJ, et al. A taxonomy of behaviour change methods: an intervention mapping approach. Health Psychol Rev. 2016;10:297–312.
22. Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21.
23. Weiner BJ, Lewis CC, Stanick C, et al. Psychometric assessment of three newly developed implementation outcome measures. Implement Sci. 2017;12:108.
24. Sekhon M, Cartwright M, Francis JJ. Acceptability of healthcare interventions: an overview of reviews and development of a theoretical framework. BMC Health Serv Res. 2017;17:88.
25. Morris ZS, Wooding S, Grant J. The answer is 17 years, what is the question: understanding time lags in translational research. J R Soc Med. 2011;104:510–520.
26. Wu E, Villani J, Davis A, et al. Community dashboards to support data-informed decision-making in the HEALing Communities Study. Drug Alcohol Depend. 2020;217:108331.
27. Concannon D, Herbst K, Manley E. Developing a data dashboard framework for population health surveillance: widening access to clinical trial findings. JMIR Form Res. 2019;3:e11342.
28. Sullivan PS, Woodyatt C, Koski C, et al. A data visualization and dissemination resource to support HIV prevention and care at the local level: analysis and uses of the AIDSVu public data resource. J Med Internet Res. 2020;22:e23173.
29. Joshi A, Amadi C, Katz B, et al. A human-centered platform for HIV infection reduction in New York: development and usage analysis of the Ending the Epidemic (ETE) Dashboard. JMIR Public Health Surveill. 2017;3:e95.
30. Braunstein SL, Coeytaux K, Sabharwal CJ, et al. New York City HIV care continuum dashboards: using surveillance data to improve HIV care Among people living with HIV in New York City. JMIR Public Health Surveill. 2019;5:e13086.
31. Hoffmann TC, Glasziou PP, Boutron I, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687.
Keywords:

implementation mapping; PrEP care continuum; community-engaged


Copyright © 2022 Wolters Kluwer Health, Inc. All rights reserved.