Supplement Article

Five Common Myths Limiting Engagement in HIV-Related Implementation Research

Beres, Laura K. PhD, MPHa; Schwartz, Sheree PhD, MPHb; Mody, Aaloke MDc; Geng, Elvin H. MD, MPHc; Baral, Stefan MD, MPHb

JAIDS Journal of Acquired Immune Deficiency Syndromes: July 1, 2022 - Volume 90 - Issue S1 - p S41-S45
doi: 10.1097/QAI.0000000000002964



Implementation research (IR) holds great promise toward realizing the potential of efficacious prevention and treatment tools for reducing HIV incidence and improving HIV treatment outcomes, making IR key to accomplishing the goals of the Ending the HIV Epidemic (EHE) initiative.1,2 There is widespread enthusiasm about this emerging area of applied research, yet all new areas of inquiry require time for practicing scientists to agree on basic definitions, standards, and their nuances. At the same time, although well-articulated standards are useful for bringing cohesion to a field, excessive adherence to standards or nomenclature—particularly when those standards and nomenclature are likely to evolve—can be counterproductive. A growing number of investigators and implementers are looking to engage in IR,3 alongside a growing number of HIV IR-related funding opportunities and studies.4 Simultaneously, prevalent HIV-related IR myths that shape perceptions of what counts as “true” IR create barriers to entry for investigators without specialized training and inhibit innovation. We raise, and aim to debunk, these myths to lower the perceived bar to engagement in IR while promoting methodological consistency.


Broadening engagement in IR—ensuring that it is not esoteric or purely academic—is a core pillar of implementation science. Given the heterogeneity of the HIV epidemic across contexts, locally knowledgeable implementers and scientists (collectively referred to as “implementation researchers”) are best positioned to define relevant implementation research questions. Advancing the field to meet the challenge of ending the epidemic will require innovation in developing, refining, and applying IR frameworks and methods. Thus, engagement of a broader and more diverse range of investigators and implementers in research–practice partnerships is needed to successfully implement the evidence-based tools available to end the HIV epidemic. Based on our collective experience, including consultations with IR trainees, conversations with colleagues and funders during meetings, inconsistent feedback during grant review, and review of the implementation science literature, we have noticed the proliferation of myths about what it means to “do implementation research.” These myths create barriers to engagement in IR precisely at a time when continued expansion is needed. Ensuring quality application of methods and measures is important for scientific integrity and the creation of generalizable knowledge; we believe this can be achieved while simultaneously promoting diversity and improving equity in IR through broader engagement.3


One Must Rigidly Apply All Aspects of an Implementation Framework for the Framework to be Valid

The use of theories, models, and frameworks (collectively, “frameworks”) to strengthen research is a hallmark of IR. Frameworks serve many purposes, including (1) ensuring a thorough and considered approach to IR within contexts and populations of interest, (2) making explicit the theory of how change will occur (ie, anticipated mechanisms of action) to guide selection of appropriate implementation strategies and evaluation metrics, and (3) promoting comparability of IR methods and outcomes across studies and contexts. Multiple reviews have identified and classified the numerous IR frameworks and their uses5–8 while considerable IR resources focus on appropriate model selection and application.8–13 The “hotspot” approach14 of EHE underlines the importance of tailoring frameworks in HIV IR based on heterogeneity in populations, geographies, and contexts. For example, identifying and selectively applying constructs that enhance understanding of key populations15 is critical domestically and in diverse, global settings.16 Alternatively, it may be necessary to add missing constructs or combine frameworks, such as incorporating health equity domains into established IR frameworks.17 Careful adaptation, often optimally guided by communities themselves, is recommended by framework scholarship.16,18–21 Implementation researchers can leverage resources designed to support model adaptation and application to advance their valid application,8,10,12,13 and documentation of study-level adaptations can help frameworks to improve over time.16 To advance IR, implementation researchers can use frameworks during research planning, implementation, and evaluation; apply appropriate measures of key framework constructs; and report and compare findings.9

Implementation Research Limits the Type of Designs Available to Researchers

The overarching goals of IR vary broadly, ranging from effectiveness studies with some implementation outcome measurement to an exclusive focus on the differential impact of implementation strategies on implementation outcomes. Study designs and approaches used by implementation researchers are similarly varied, including observational, quasiexperimental, experimental, participatory, qualitative, mixed methods, costing, and modeling designs, none of which are specific to IR and many of which are not mutually exclusive. Specific to IR, however, are effectiveness-implementation hybrid designs,22,23 which themselves incorporate a range of the approaches listed above but necessitate consideration of both implementation and effectiveness outcomes. It is the research goal, not the approach, that is the primary determinant of IR. For example, a mixed-methods IR trial testing the effect of a blended digital and peer-based education system for ART providers may evaluate improved provider knowledge as the primary outcome, with secondary outcomes including a qualitative assessment of the mechanisms through which knowledge change occurred and improved viral suppression among patients living with HIV. IR often deals with varied, multilevel contexts24 and often seeks a balance between internal and external validity.25 To account for these issues, IR designs often randomize at the cluster level (eg, clinic or community) instead of the individual level and use pragmatic,26 mixed-methods,27 or adaptive28,29 designs, although many designs can achieve IR objectives.

Implementation Strategies Cannot be Patient-Level or Client-Level Approaches

There is a perception that if the mechanism of action being studied is not focused on changing the behavior of the provider, organization, or health system, the study is not IR. Confusion may be augmented by the blurriness that often exists between evidence-based interventions (EBIs) targeting health outcomes and implementation strategies targeting behaviors at multiple levels.30 For HIV, the EBI is often ART or PrEP, and implementation strategies often target individual-level, provider-level, or system-level barriers to optimize ART/PrEP delivery and adherence. Ultimately, the most appropriate implementation strategy, or combination of strategies, must consider the contextual environment and match implementation strategies to the modifiable barriers impeding implementation or use of the EBI. Robust formative and preference-oriented research, combined with the application of logic models, frameworks, and guidance on strategy specification, is essential to ensuring that the strategies proposed are clear and aligned with relevant EBI barriers and priorities.31–34 This may result in implementation strategies targeting patients, providers, or organizational factors. A fair criticism of HIV-related IR is that implementation strategies have frequently been myopic, predominantly geared at the patient/client or, more recently, the ART delivery approach (eg, fast track and pharmacy35), with less focus on provider aspects.4 Increased emphasis on nonpatient approaches is warranted, but it is critical to recognize that strategies to deliver EBIs such as PrEP or ART may need to be tailored to the multifactorial barriers that individuals face to prevention and care.
In settings where health system resources are stretched, engaging patients as the actors in implementation strategies that target the patients themselves or communities often represents the most pragmatic approach to enhancing implementation.36 This may be particularly true for members of stigmatized groups who are not well-served by other health system actors.37 To the extent that key barriers are based in the delivery of health care such as provider attitudes/friendliness or access issues such as transport or long clinic queues, reconceptualization of services to become more patient-centered or to circumvent structural access challenges is key. However, for individual psychosocial or network barriers, patient-oriented approaches remain critical and should not be undervalued.

Only Studies Prioritizing Implementation Outcomes are “True” Implementation Research

Limited reviewer familiarity with IR has emerged as a common thematic challenge in grant and paper review, often leading reviewers to require the inclusion of individual-level effectiveness outcomes, even for well-established EBIs. Perhaps partly in response, some more ensconced in the IR community discount IR studies that prioritize patient-level or client-level outcomes over implementation outcomes. Importantly, prominent outcome frameworks in IR include both client-level and implementation-level outcomes,38,39 with effectiveness as a key component of those measures. In addition, research that assesses downstream clinical events common to non-IR, such as viral suppression, can yield insights beyond simply “clinical effectiveness.” First, no single implementation outcome of interest is likely to mediate the entire effect of a strategy on a downstream clinical outcome. Therefore, measuring both the effect of a strategy on an implementation outcome and its effect on downstream clinical outcomes, especially in different settings, can help reveal the extent to which a particular implementation outcome mediates effects and how that varies. For example, the field has been interested in the effect on retention of same-day HIV diagnosis and ART prescription; however, how the provider offers ART, including the interpersonal dynamic, adjunctive counseling, and the supportiveness of the clinic setting, will influence the effect of same-day ART prescription on outcomes. Indeed, the literature shows effects ranging from improvements to decreases in retention.40 Second, there may be common causes of the extent to which a strategy is implemented and the effects of the strategy on clinical outcomes.
In this case, identification of context-specific factors that influence both implementation and clinical outcomes across different units can reveal important organizational and contextual influences, with possible implications for health equity that IR is positioned to identify and address.17,41,42 For example, owing to structural factors such as poverty, stigma, and racism, a clinic in a socioeconomically deprived area may have lower health care worker morale and patients with a greater psychosocial burden, both of which depress provider uptake and the effect of delivery. Variability in how the level of implementation influences downstream clinical effects is often a question of substantive interest that may be answered by a study powered on clinical effectiveness outcomes.24 In addition, typologies of effectiveness-implementation hybrid designs22 and pragmatic approaches, such as leveraging aggregate data instead of requiring individual-level enrollment, may be useful tools for new implementation researchers in planning their study approach.

If Not Explicitly Labeled Implementation Research, It May Have Limited Impact on Implementation

Implementation research is an inherently multidisciplinary field whose ultimate goal is to advance our understanding of how to close the gaps between evidence from controlled studies and routine practice in real-world contexts. Any research that conceptually seeks to understand the scale of these implementation gaps, the reasons for them, strategies to address them, and the mechanisms through which those strategies may operate helps to serve this purpose.25 Not all research that is conceptually equivalent to IR in its aims, however, will apply the rapidly developing IR nomenclature or use an established IR framework. For example, a growing science of incentives in HIV prevention, care, and treatment is based in behavioral economics, focusing inquiry on variation in uptake of EBIs.43–45 The field of economics is rife with studies that advance implementation46 but are rarely explicitly labeled as IR: econometrics provides robust methodology for assessing the impacts of real-world program implementation,47 and discrete choice experiments quantify preferences to inform program designs optimized for acceptability and adoption.48 Similarly, sociology and social network analyses describe the social dynamics underpinning spread that inform implementation science, including the spread of infectious diseases and the dissemination and diffusion of behaviors.49 The discussion of context and mechanisms, currently enjoying much IR attention, also echoes the context–mechanism–outcome framework central to realist evaluation.50 In many traditions, the approaches to promoting use of evidence-based practices that IR would call “strategies” are called “interventions,” but this difference in nomenclature should not obscure their immediate relevance to the field of IR.
A generation ago, scholars in organizational psychology noted that a balance between knowledge acquisition processes that are open-ended, creative, and emergent and those that are more concrete and standardized is necessary to the vibrancy and health of organizations.51 IR frameworks, designs, and methods already draw heavily from different fields, and as IR advances, incorporating tools and approaches from diverse fields that are particularly suited to providing insights on implementation will be vital to the field's growth and to avoiding “reinventing the wheel.” Ultimately, it is the questions being asked and the goals of the research that determine whether research advances implementation, not whether a specific label, design, method, or framework accompanies said research. Explicitly pursuing cross-disciplinary training and collaborations52 may help to more rapidly advance the field of IR and strengthen its applications in HIV research, while using conceptual definitions of IR aims in addition to specific nomenclature may help to advance HIV IR reviews and practice.

The 5 myths presented here have been consistently encountered in IR training, academic, and funding spaces and can be negotiated to facilitate conceptually congruent transdisciplinary dialog, ensure high-quality research, and foster a more inclusive and dynamic field. Allowing for the full range of implementation frameworks, strategies, methods, and outcomes, and avoiding overly specialized interpretations of IR practice, will help implementation science applied to HIV to meet the scientific needs of the moment. IR can maintain quality across a diverse range of applications through practices including clarity and transparency in the scientific choices made around frameworks, strategies, and methods; collaboration with implementing partners and use of nomenclature that promotes accessibility and understanding across stakeholders; efforts to create generalizable knowledge through comparison across contexts; and rapid, open-access dissemination of findings. Dispelling these myths is particularly important to the field of HIV because HIV researchers are rapidly adopting IR as a way of bridging the research–practice gap. In addition, achieving the aims of the Ending the HIV Epidemic initiative, including better supporting the health of people living with HIV and decreasing transmission, requires the innovation, application, and focus on equity made possible by high-quality IR. Ultimately, promoting the use of implementation science by a wide range of researchers will advance the field, as it has other fields, and expand opportunities to apply IR tools to ensure real-world effectiveness of efficacious interventions to end the HIV pandemic.


The authors thank Dr. Chris Gordon at the National Institutes of Health for feedback on implementation research-related myths.


1. Eisinger RW, Dieffenbach CW, Fauci AS. Role of implementation science: linking fundamental discovery science and innovation science to ending the HIV epidemic at the community level. J Acquir Immune Defic Syndr. 2019;82(suppl 3):S171–S172.
2. Beyrer C, Adimora AA, Hodder SL, et al. Call to action: how can the US Ending the HIV Epidemic initiative succeed? Lancet. 2021;397:1151–1156.
3. Schwartz SR, Smith JD, Hoffmann C, et al. Implementing implementation research: teaching implementation research to HIV researchers. Curr HIV/AIDS Rep. 2021;18:186–197.
4. Smith JD, Li DH, Hirschhorn LR, et al. Landscape of HIV implementation research funded by the National Institutes of Health: a mapping review of project abstracts. AIDS Behav. 2020;24:1903–1911.
5. Tabak RG, Khoong EC, Chambers DA, et al. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43:337–350.
6. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10:53.
7. Strifler L, Cardoso R, McGowan J, et al. Scoping review identifies significant number of knowledge translation theories, models, and frameworks with limited use. J Clin Epidemiol. 2018;100:92–102.
8. Birken SA, Rohweder CL, Powell BJ, et al. T-CaST: an implementation theory comparison and selection tool. Implement Sci. 2018;13:143.
9. Moullin JC, Dickson KS, Stadnick NA, et al. Ten recommendations for using implementation frameworks in research and practice. Implement Sci Commun. 2020;1:42.
10. University of Washington Implementation Science Resource Hub. Pick a Theory, Model, or Framework. University of Washington. Accessed June 14, 2021.
11. Ridde V, Pérez D, Robert E. Using implementation science theories and frameworks in global health. BMJ Glob Health. 2020;5:e002269.
12. Center for Global Health Studies at the Fogarty International Center, National Institutes of Health. Toolkit: Overcoming Barriers to Implementation in Global Health. Accessed June 14, 2021.
13. Rabin B, Glasgow R, Ford B, et al. Dissemination & Implementation Models in Health Research & Practice; 2021. Accessed September 13, 2021.
14. Fauci AS, Redfield RR, Sigounas G, et al. Ending the HIV epidemic: a plan for the United States. JAMA. 2019;321:844–845.
15. Schwartz SR, Rao A, Rucinski KB, et al. HIV-related implementation research for key populations: designing for individuals, evaluating across populations, and integrating context. J Acquir Immune Defic Syndr. 2019;82(suppl 3):S206–S216.
16. Means AR, Kemp CG, Gwayi-Chore MC, et al. Evaluating and optimizing the consolidated framework for implementation research (CFIR) for use in low- and middle-income countries: a systematic review. Implement Sci. 2020;15:17.
17. Woodward EN, Singh RS, Ndebele-Ngwenya P, et al. A more practical guide to incorporating health equity domains in implementation determinant frameworks. Implement Sci Commun. 2021;2:61.
18. Damschroder LJ, Lowery JC. Evaluation of a large-scale weight management program using the consolidated framework for implementation research (CFIR). Implement Sci. 2013;8:51.
19. Damschroder LJ, Goodrich DE, Robinson CH, et al. A systematic exploration of differences in contextual factors related to implementing the MOVE! weight management program in VA: a mixed methods study. BMC Health Serv Res. 2011;11:248.
20. Damschroder LJ, Aron DC, Keith RE, et al. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.
21. CFIR Research Team. Consolidated Framework for Implementation Research Construct Selection Techniques. Center for Clinical Management Research; 2021. Accessed June 14, 2021.
22. Curran GM, Bauer M, Mittman B, et al. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50:217–226.
23. Landes SJ, McBain SA, Curran GM. An introduction to effectiveness-implementation hybrid designs. Psychiatry Res. 2019;280:112513.
24. Wolfenden L, Foy R, Presseau J, et al. Designing and undertaking randomised implementation trials: guide for researchers. BMJ. 2021;372:m3721.
25. Odeny TA, Padian N, Doherty MC, et al. Definitions of implementation science in HIV/AIDS. Lancet HIV. 2015;2:e178–e180.
26. Loudon K, Treweek S, Sullivan F, et al. The PRECIS-2 tool: designing trials that are fit for purpose. BMJ. 2015;350:h2147.
27. Bauer MS, Damschroder L, Hagedorn H, et al. An introduction to implementation science for the non-specialist. BMC Psychol. 2015;3:32.
28. Broder-Fingert S, Kuhn J, Sheldrick RC, et al. Using the multiphase optimization strategy (MOST) framework to test intervention delivery strategies: a study protocol. Trials. 2019;20:728.
29. Hwang S, Birken SA, Melvin CL, et al. Designs and methods for implementation research: advancing the mission of the CTSA program. J Clin Transl Sci. 2020;4:159–167.
30. Eldh AC, Almost J, DeCorby-Watson K, et al. Clinical interventions, implementation interventions, and the potential greyness in between a discussion paper. BMC Health Serv Res. 2017;17:16.
31. Smith JD, Li DH, Rafferty MR. The implementation research logic model: a method for planning, executing, reporting, and synthesizing implementation projects. Implement Sci. 2020;15:84.
32. Kirchner JE, Smith JL, Powell BJ, et al. Getting a clinical innovation into practice: an introduction to implementation strategies. Psychiatry Res. 2020;283:112467.
33. Haley AD, Powell BJ, Walsh-Bailey C, et al. Strengthening methods for tracking adaptations and modifications to implementation strategies. BMC Med Res Methodol. 2021;21:133.
34. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139.
35. Grimsrud A, Bygrave H, Doherty M, et al. Reimagining HIV service delivery: the role of differentiated care from prevention to suppression. J Int AIDS Soc. 2016;19:21484.
36. Webb K, Chitiyo V, Patel D, et al. The Action Birth Card: evaluation of an innovative goal-setting tool to increase demand and uptake of underutilized services along the PMTCT cascade. 8th IAS Conference on HIV Pathogenesis, Treatment and Prevention (IAS 2015); Vancouver, Canada; 2015.
37. Napierala S, Desmond NA, Kumwenda MK, et al. HIV self-testing services for female sex workers, Malawi and Zimbabwe. Bull World Health Organ. 2019;97:764–776.
38. Proctor EK, Landsverk J, Aarons G, et al. Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Adm Pol Ment Health. 2009;36:24–34.
39. Glasgow RE, Harden SM, Gaglio B, et al. RE-AIM planning and evaluation framework: adapting to new science and practice with a 20-year review. Front Public Health. 2019;7:64.
40. Puttkammer N, Parrish C, Desir Y, et al. Toward universal HIV treatment in Haiti: time trends in ART retention after expanded ART eligibility in a National cohort from 2011 to 2017. J Acquir Immune Defic Syndr. 2020;84:153–161.
41. Chinman M, Woodward EN, Curran GM, et al. Harnessing implementation science to increase the impact of health equity research. Med Care. 2017;55(suppl 9 2):S16–S23.
42. Woodward EN, Matthieu MM, Uchendu US, et al. The health equity implementation framework: proposal and preliminary study of hepatitis C virus treatment. Implement Sci. 2019;14:26.
43. Galárraga O, Harries J, Maughan-Brown B, et al. The Empower Nudge lottery to increase dual protection use: a proof-of-concept randomised pilot trial in South Africa. Reprod Health Matters. 2018;26:1510701.
44. Linnemayr S, MacCarthy S, Kim A, et al. Behavioral economics-based incentives supported by mobile technology on HIV knowledge and testing frequency among Latino/a men who have sex with men and transgender women: protocol for a randomized pilot study to test intervention feasibility and acceptability. Trials. 2018;19:540.
45. Linnemayr S, MacCarthy S, Wagner Z, et al. Using behavioral economics to promote HIV prevention for key populations. J AIDS Clin Res. 2018;9:780.
46. Geng E, Hargreaves J, Peterson M, et al. Implementation research to advance the global HIV response: introduction to the JAIDS supplement. J Acquir Immune Defic Syndr. 2019;82(suppl 3):S173–S175.
47. Geldsetzer P, Fawzi W. Quasi-experimental study designs series-paper 2: complementary approaches to advancing global health knowledge. J Clin Epidemiol. 2017;89:12–16.
48. Eshun-Wilson I, Kim HY, Schwartz S, et al. Exploring relative preferences for HIV service features using discrete choice experiments: a synthetic review. Curr HIV/AIDS Rep. 2020;17:467–477.
49. Harling G, Tsai AC. Using social networks to understand and overcome implementation barriers in the global HIV response. J Acquir Immune Defic Syndr. 2019;82(suppl 3):S244–S252.
50. Pawson R, Tilley N. Realistic Evaluation. UK: SAGE Publications Ltd; 1997.
51. Gupta AK, Smith KG, Shalley CE. The interplay between exploration and exploitation. Acad Manag J. 2006;49:693–706.
52. Barnett ML, Dopp AR, Klein C, et al. Collaborating with health economists to advance implementation science: a qualitative study. Implement Sci Commun. 2020;1:82.

Keywords: implementation research; implementation science; HIV; theory; outcomes frameworks

Copyright © 2022 Wolters Kluwer Health, Inc. All rights reserved.