* The New York Academy of Medicine: Lead agency for the Center with primary responsibility for 1) design and implementation of the multisite evaluation; and 2) organization of all cross-site program activities;
* Yale University School of Medicine: Representatives from general internal medicine and psychiatry responsible for providing bup/nx clinical support and training to demonstration site physicians and staff and providing clinical input into the evaluation design and analysis;
* Weill Cornell Medical College, Division of Health Policy: Lead researchers for the cost evaluation, including advising on the development and implementation of appropriate instruments to assess initial and ongoing costs of integrated care;
* HRSA/SPNS: Funder of the initiative, provided technical assistance to the sites on administrative matters, monitored the use of grant funds, and facilitated coordination with related efforts at other federal agencies; and
* National Advisory Committee: Provided oversight and advice to the Center and the demonstration sites on the evaluation design, dissemination of findings, and linkages to external resources. Included patient advocates and experts in HIV care, addiction medicine, and health policy.
EVALUATION AND SUPPORT CENTER
The Evaluation and Support Center developed and implemented a set of mechanisms to address the challenges inherent in: 1) adopting a new clinical practice; and 2) conducting a multisite evaluation of programs, diverse in geography, institutional base, design, and patient population. Although the demonstration sites came to the project with a wealth of HIV care and research expertise, Center activities ensured that each site had access to consistent and ongoing technical assistance and support relevant to bup/nx treatment, program implementation, and evaluation. These activities included:
* Grantee Meetings: Semiannual grantee meetings, with representation from all sites, were a primary venue for providing cross-site training and technical assistance on clinical, programmatic, and evaluation issues. Meetings included formal training sessions, presentations from the demonstration sites and the Center, and relevant updates from HRSA and the National Advisory Committee;
* Clinical Technical Assistance Calls: Monthly calls provided opportunities for physicians, nurses, and other providers to present difficult cases and receive guidance on both routine and complex treatment issues;
* Clinical Technical Assistance Listserv: To complement the clinical calls, a Listserv was established through which the demonstration sites could share resources and ask questions of the Center and their colleagues. The Listserv also provided a forum through which the Center could provide clinical updates and guidance;
* Medication Interaction Tracking System: To track potential adverse events resulting from prescribing bup/nx to HIV-infected patients, the Center developed and implemented a system for documentation of potential medication interactions;
* Study Implementation Calls: Monthly conference calls were established for demonstration site evaluators and other staff involved in study implementation to discuss logistics, share challenges and solutions, and ensure consistent implementation of the evaluation protocol;
* Web Resource Center: The Center developed and implemented a Web Resource Center (www.bhives.org) that included public and password-protected areas, the latter accessible only to staff from the demonstration sites, the Center, and HRSA. The site provided educational resources as well as study documents required for the implementation of the evaluation such as Institutional Review Board documents, meeting information, instruments, and training materials;
* BHIVES Training Manual: The Center created a training manual that provided comprehensive instructions on all aspects of the multisite protocol. Covering a wide range of topics, a primary goal of the manual was to establish consistency in client-level data collection. The manual included detailed instructions and a question-by-question guide for each instrument;
* Individual Technical Assistance: Individual technical assistance was provided through phone calls, e-mail, and in-person at grantee meetings and during site visits. Center staff were available to advise the sites on clinical issues, evaluation design, Institutional Review Board concerns, data systems, and program implementation;
* Data Entry, Transfer, and Tracking Systems: Patient-level data collected by the demonstration sites were submitted to the Center semiannually using a secure electronic File Transfer Protocol. The Center was responsible for cleaning, maintenance, and analysis of multisite data and developed protocols for tracking the receipt of data and the status of enrolled participants across time;
* Site Visits: Center staff conducted site visits at the start of program implementation and again during the final year of the initiative. Site visits included group and individual interviews with project and other relevant site staff. Site visits provided opportunities for the collection of in-depth evaluation data as well as discussion and problem-solving around implementation issues; and
* Facilitated Access to Free Study Medication: Reckitt Benckiser Pharmaceuticals agreed to provide free bup/nx to study patients who had no other means for payment. Free bup/nx was accessed by 154 patients at eight sites. Access to free medication was negotiated and facilitated by the Center.
THE DEMONSTRATION SITES
Demonstration sites were selected by HRSA through a competitive Request for Proposals process. The sites were located throughout the United States and included, as lead agency, academic medical centers (n = 7), community-based clinics (n = 2), and a public hospital. Several sites included two or more programs offered in separate locations with distinct institutional characteristics (eg, two sites included an HIV clinic within an academic medical center as well as a community health center). Although most sites had established connections to local bup/nx experts before the start of their program, only one had experience in the delivery of bup/nx treatment within an HIV care setting.
As a demonstration program with service delivery as a primary mandate, sites were allowed significant autonomy in the design of their programs so as to facilitate models of care that were attractive to patients and could be sustained after the grant period. All sites participating in the initiative were required to offer integrated HIV medical care and bup/nx treatment for opioid dependence. As described in detail in a companion paper,29 definitions of “integration” varied significantly (ranging from integration at the level of the individual physician to integration at the level of the clinic). One of the 10 sites was unable to meet the integrated care requirement; although an integral part of the initiative, data from this site were excluded from the analyses described in the accompanying articles.
At the start, sites were required to have a comparison arm; however, the protocol for enrollment and retention of comparison patients differed across sites. For example, there was variation in 1) prior treatment experience among comparison patients (eg, at some sites, all were untreated at baseline; at other sites, comparisons included patients stable on methadone); 2) treatment modality of comparison patients (including methadone and “drug-free”); and 3) allowance for transferring from the comparison to the intervention arm. The variation reflected demonstration site implementation and research interests but limited the multisite use of the comparisons.
Cross-site eligibility criteria were developed as a part of the collaborative process and incorporated standard human subjects' protections as well as criteria specific to the initiative and to bup/nx treatment guidelines, most notably HIV infection and meeting Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition criteria for opioid dependence (see Table 2). Patients were expected to be starting bup/nx treatment at baseline; there were no expectations regarding timing of HIV treatment, and there was significant variation in HIV treatment experience among patients.
Sites were able to apply the eligibility criteria with some flexibility as appropriate to their patient population and medical expertise; for example, there was some variability across sites in the upper limit of the methadone dose that patients were receiving when being considered for transfer to bup/nx.
Study participants were identified by providers and through self-referral; however, enrollment was slower than expected at most sites. The reasons included an unexpectedly small pool of appropriate patients, which reflected concerns about transferring patients stable on methadone to bup/nx; relatively limited knowledge of bup/nx among patients and potential referring physicians; and lower than anticipated rates of heroin use among clinic patients. In response to low enrollment at certain sites, extensive outreach was conducted within the clinic, community, and provider networks. In addition, low enrollment led to some expansion in program locations. In all, 427 patients were enrolled in the initiative across the 10 sites; 386 were enrolled at the nine sites included in the research described in the accompanying articles. Characteristics of the patient sample are fully described in a companion paper.30
THE MULTISITE EVALUATION PROTOCOL
All sites receiving funds through the initiative were required by HRSA/SPNS to develop a local evaluation plan and to participate in the multisite evaluation. As described in detail subsequently, the multisite evaluation included site- and patient-level data. With the exception of analyses focused on service delivery,31,32 which incorporated comparison patients, the evaluation used a single group pre-post design. The primary objectives of the evaluation focused on examining changes in substance use33,34 and HIV and other health outcomes,3,31,35,36 as well as the identification and analysis of processes of care and promising practices29,32,37-39 for integrated HIV and substance abuse treatment. The BHIVES evaluation protocol was approved by The New York Academy of Medicine Institutional Review Board and the Institutional Review Boards of all demonstration sites.
As indicated in Table 3, the evaluation included patient-level assessment interviews and chart abstractions, in-depth interviews with a subset of patients, provider surveys, patient satisfaction surveys, and site visits involving individual and group interviews with providers and study personnel (see www.bhives.org for data collection forms and instructions). Site-level participation in research studies developed through the collaborative that were not part of the core multisite evaluation (ie, quality of care31 and qualitative patient interviews37) was encouraged but not required.
Instruments to collect patient-level data were developed and piloted through a collaborative process (including in-person meetings, conference calls, and e-mail) involving study investigators and evaluators at all demonstration sites, Center staff and consultants, and HRSA/SPNS staff. To the extent possible, validated scales were incorporated into the patient assessments. Because the assessments were intended to capture multiple domains, we generally opted for brief scales (eg, SF-1240 and Center for Epidemiologic Studies-Depression [CES-D]) as well as those that captured multiple domains (ASI-Lite41) (see Table 3 for a listing of validated scales and the accompanying papers for detailed descriptions of data collected). If appropriate validated scales could not be identified, questions were developed using a consensus process. In addition to the assessment interviews, data were collected from the patients' medical records using standard chart abstraction forms. Data from the medical record included but were not limited to: year of HIV diagnosis, AIDS-defining illnesses, antiretroviral regimen, CD4 lymphocyte count, HIV RNA, and urine toxicology. As a service delivery initiative, sites were permitted to use clinical judgment regarding frequency of laboratory tests.
Study participants were followed for 12 months from the date of their baseline interview independent of whether they were retained in HIV care and/or substance abuse treatment. Patients were disenrolled only on request or if they acted in a way that was inconsistent with site-level regulations (eg, threatening or exhibiting violent behavior). The protocol called for quarterly assessments. Recognizing the challenges inherent in scheduling appointments with the target population, a 45-day window (both before and after the date of the quarter) was built around each assessment. Study participants received an in-kind incentive (generally a gift card for a local grocery store or pharmacy) for completion of each assessment interview.
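For readers implementing a similar follow-up schedule, the quarterly assessments with a 45-day window on either side of each due date can be sketched as a simple date computation. This is an illustrative sketch only, not the study's actual tracking system; the function name and the 91-day approximation of a quarter are assumptions.

```python
from datetime import date, timedelta

def assessment_windows(baseline, quarters=4):
    """Return (opens, due, closes) dates for each quarterly assessment.

    A 45-day window is built before and after each due date, as in the
    BHIVES protocol. The 91-day quarter length is an assumption made
    for illustration.
    """
    windows = []
    for q in range(1, quarters + 1):
        due = baseline + timedelta(days=91 * q)
        windows.append((due - timedelta(days=45), due, due + timedelta(days=45)))
    return windows

# Hypothetical participant with a baseline interview on January 2, 2006
for opens, due, closes in assessment_windows(date(2006, 1, 2)):
    print(opens.isoformat(), due.isoformat(), closes.isoformat())
```

In practice, a tracking system built on such a rule would flag any assessment completed outside its window for review rather than silently accepting it.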
From the perspective of the multisite evaluation, arm assignment was based on treatment received within the first 44 days in the study: bup/nx (n = 303); methadone (n = 41); or other, which included a variety of counseling modalities (n = 42). In all, 1399 baseline and follow-up assessment interviews were available for analysis across the study period, including 1095 for patients in the bup/nx arm, 166 for methadone patients, and 138 for patients participating in other treatment.
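The arm-assignment rule can be expressed as a short classification function. This is a minimal sketch for illustration; the article does not specify how ties are broken when a patient received more than one modality in the first 44 days, so the precedence order below (bup/nx, then methadone, then other) is an assumption.

```python
def assign_arm(treatments_first_44_days):
    """Classify a patient by treatment received in the first 44 study days.

    Precedence (bup/nx > methadone > other) is an assumption for
    illustration; the study protocol does not state a tie-breaking rule.
    """
    if "bup/nx" in treatments_first_44_days:
        return "bup/nx"
    if "methadone" in treatments_first_44_days:
        return "methadone"
    # Counseling and all other modalities fall into the "other" arm
    return "other"
```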
Surveys assessing bup/nx knowledge and attitudes among providers were developed and piloted through the collaborative process described previously. They were administered by demonstration site research staff to medical providers and other site staff three times during the study period.42 At the midpoint of the initiative, a supplementary survey, focused on opioid prescribing for the treatment of pain and addiction, was developed by a subcommittee of the Center and demonstration site staff and administered at all sites.39 Data from the two surveys were submitted to the Center for aggregation, storage, cleaning, and analysis. Additional provider-level data were collected through individual and group interviews conducted during site visits (described previously). Interviews focused on program implementation (including barriers and facilitators), best practice recommendations, and other lessons learned.29
DATA MANAGEMENT AND ANALYSIS
Individual-level data from the demonstration sites were transmitted to The New York Academy of Medicine over a secure server on a semiannual basis and reviewed by the project statistician for consistency and completeness. Preliminary analyses of patient-level data, including enrollment statistics, process indicators (eg, proportion of patients with viral load measures), and preliminary findings (eg, substance use outcomes), were conducted no less than once per year and made available to all Collaborative members. Final analyses were conducted by Center statisticians, working in cooperation with topic-specific workgroups, as represented by the articles included in this supplement. Detailed descriptions of statistical methods are included within each of the analytic articles. All data were analyzed using SPSS (Version 15.0 for Windows; SPSS Inc, Chicago, IL); to the extent possible, a consistent statistical approach was used across analyses.
Qualitative data from site visits and in-depth patient interviews were analyzed using standard qualitative methods, including repeated reviews of the data within and across interviews. All interviews were audio-recorded; patient interviews and Principal Investigator interviews were transcribed and coded using NVIVO (Version 8.0; QSR International, Doncaster, Australia), a software package for maintenance and analysis of qualitative data. Detailed notes were taken during all site visit interviews and were used, along with audio recordings, for analysis when transcripts were unavailable.
As might be expected, multisite initiatives present both opportunities and challenges. The SPNS program encourages demonstration sites to design programs that best fit local conditions considering patient, staff, institutional, and community characteristics. The resultant heterogeneity in approaches yields useful information regarding implementation possibilities and challenges. From the evaluation perspective, heterogeneity, combined with a primary mandate of service delivery (which limits the use of experimental designs), amplifies the challenges and necessitates unique approaches for applying scientific rigor to the evaluation process. The background information provided in this article, although offered primarily as context for understanding the analytic papers developed through the BHIVES initiative, also provides a number of "lessons learned" for the implementation and evaluation of multisite demonstrations. For example, by working collaboratively with the demonstration sites, the Center was able to delineate a minimum uniform data set feasible for all programs. Training, detailed written instructions, and ongoing technical assistance and support proved necessary to maintain quality and consistency of the data being collected. Site visits with qualitative interviews provided invaluable context for interpretation of survey data and opportunities for discussion and problem-solving. Semiannual grantee meetings, listservs, conference calls, and the Web Resource Center were critical to fostering cross-site collaboration and reinforcing and supplementing guidance regarding best practices in clinical care.
However, unanticipated challenges were encountered. As noted previously, at most sites, the number of appropriate patients was fewer than anticipated, resulting in an overall sample size that was smaller than expected. Comparison arms also represented ongoing challenges and proved to be of significantly less analytic value than had been hoped; this reflected their diversity; crossover (allowed at most sites as a result of the service delivery mandate); and the fact that a number of comparison patients continued with the substance abuse treatment modality that they had been receiving before study enrollment. "True" baseline data were not available for these patients. Finally, as a service delivery initiative, there were some limitations on available clinical data. Sites used clinical judgment regarding frequency of viral load, CD4, and urine toxicology screens, meaning that data available for the evaluation were more sparse than would be optimal from a research perspective.
Despite the challenges and limitations of the evaluation, the BHIVES initiative includes the largest sample of HIV-infected bup/nx patients to date and offers a number of key findings regarding bup/nx treatment and the potential for integrated substance use and HIV care. These findings are all the more valuable for the “real-world” context within which they were generated.
REFERENCES
1. Hser YI, Hoffman V, Grella CE, et al. A 33-year follow-up of narcotics addicts. Arch Gen Psychiatry
2. Fischer B, Cruz MF, Rehm J. Illicit opioid use and its key characteristics: a select overview and evidence from a Canadian multisite cohort of illicit opioid users (OPICAN). Can J Psychiatry
3. Korthuis PT, Zephyrin LC, Fleishman JA, et al. Health-related quality of life in HIV-infected patients: the role of substance use. AIDS Patient Care STDs
4. Ross J, Teesson M, Darke S, et al. The characteristics of heroin users entering treatment: findings from the Australian Treatment Outcome Study (ATOS). Drug Alcohol Rev
5. Rodriguez-Arenas MA, Jarrin I, del Amo J, et al. Delay in the initiation of HAART, poorer virological response, and higher mortality among HIV-infected injecting drug users in Spain. AIDS Res Hum Retroviruses
6. Hicks PL, Mulvey KP, Chander G, et al. The impact of illicit drug use and substance abuse treatment on adherence to HAART. AIDS Care
7. Wood E, Kerr T, Zhang R, et al. Poor adherence to HIV monitoring and treatment guidelines for HIV-infected injection drug users. HIV Med
8. Mellins CA, Havens JF, McDonnell C, et al. Adherence to antiretroviral medications and medical care in HIV-infected adults diagnosed with mental and substance abuse disorders. AIDS Care
9. Shannon K, Kerr T, Lai C, et al. Nonadherence to antiretroviral therapy among a community with endemic rates of injection drug use. J Int Assoc Physicians AIDS Care (Chic Ill)
10. Laine C, Zhang D, Hauck WW, et al. HIV-1 RNA viral load monitoring in HIV-infected drug users on antiretroviral therapy: relationship with outpatient care patterns. J Acquir Immune Defic Syndr
11. Celentano DD, Latimore AD, Mehta SH. Variations in sexual risks in drug users: emerging themes in a behavioral context. Curr HIV/AIDS Rep
12. Centers for Disease Control and Prevention. HIV-associated behaviors among injecting-drug users-23 cities, United States, May 2005-February 2006. MMWR Morbid Mortal Wkly Rep
13. Palepu A, Tyndall MW, Joy R, et al. Antiretroviral adherence and HIV treatment outcomes among HIV/HCV co-infected injection drug users: the role of methadone maintenance therapy. Drug Alcohol Depend
14. Joseph H, Stancliff S, Langrod J. Methadone maintenance treatment (MMT): a review of historical and clinical issues. Mt Sinai J Med
15. Stancliff S, Myers JE, Steiner S, et al. Beliefs about methadone in an inner-city methadone clinic. J Urban Health
16. Marsch LA. The efficacy of methadone maintenance interventions in reducing illicit opiate use, HIV risk behavior and criminality: a meta-analysis. Addiction
17. Schwartz RP, Kelly SM, O'Grady KE, et al. Attitudes toward buprenorphine and methadone among opioid-dependent individuals. Am J Addict
18. Sullivan LE, Chawarski M, O'Connor PG, et al. The practice of office-based buprenorphine treatment of opioid dependence: is it associated with new patients entering into treatment? Drug Alcohol Depend
19. Haile CN, Kosten TA, Kosten TR. Pharmacogenetic treatments for drug addiction: alcohol and opiates. Am J Drug Alcohol Abuse
20. Sullivan LE, Fiellin DA. Narrative review: buprenorphine for opioid-dependent patients in office practice. Ann Intern Med
21. Johnson RE, Strain EC, Amass L. Buprenorphine: how to use it right. Drug Alcohol Depend
22. Maremmani I, Pani PP, Pacini M, et al. Substance use and quality of life over 12 months among buprenorphine maintenance-treated and methadone maintenance-related heroin-addicted patients. J Subst Abuse Treat
23. Johnson RE, Chutuape MA, Strain EC, et al. A comparison of levomethadyl acetate, buprenorphine, and methadone for opioid dependence. N Engl J Med
24. Fudala PJ, Bridge TP, Herbert S, et al. Office-based treatment of opiate addiction with a sublingual-tablet formulation of buprenorphine and naloxone. N Engl J Med
25. Ling W, Wesson DR. Clinical efficacy of buprenorphine: comparisons to methadone and placebo. Drug Alcohol Depend
26. Stein MD, Cioe P, Friedmann PD. Buprenorphine retention in primary care. J Gen Intern Med
27. Jaffe JH, O'Keeffe C. From morphine clinics to buprenorphine: regulating opioid agonist treatment of addiction in the United States. Drug Alcohol Depend
28. Cheever L, Kresina TF, Cajina A, et al. A model federal collaborative to increase patient access to buprenorphine treatment in HIV primary care. J Acquir Immune Defic Syndr. 2011;56(Suppl 1):S3-S6.
29. Weiss L, Netherland J, Egan JE, et al. Integration of buprenorphine/naloxone treatment into HIV clinical care: lessons from the BHIVES collaborative. J Acquir Immune Defic Syndr. 2011;56(Suppl 1):S1-S2.
30. Chaudhry AA, Botsko M, Weiss L, et al. Participant characteristics and HIV risk behaviors among individuals entering integrated buprenorphine/naloxone and HIV care. J Acquir Immune Defic Syndr. 2011;56(Suppl 1):S14-S21.
31. Korthuis PT, Fiellin DA, Rongwei F, et al. Improving adherence to HIV quality of care indicators in persons with opioid dependence: the role of buprenorphine/naloxone. J Acquir Immune Defic Syndr. 2011;56(Suppl 1):S83-S90.
32. Schackman BS, Leff J, Botsko M, et al. The cost of integrated HIV care and buprenorphine/naloxone treatment: results of a cross-site evaluation. J Acquir Immune Defic Syndr. 2011;56(Suppl 1):S76-S82.
33. Fiellin DA, Weiss L, Botsko M, et al. Drug treatment outcomes among HIV-infected, opioid-dependent patients receiving buprenorphine/naloxone. J Acquir Immune Defic Syndr. 2011;56(Suppl 1):S33-S38.
34. Sullivan LE, Botsko M, Cunningham C, et al. The impact of cocaine use on outcomes in HIV-infected patients receiving buprenorphine/naloxone. J Acquir Immune Defic Syndr. 2011;56(Suppl 1):S54-S61.
35. Altice FL, Lucas GM, Lum P, et al. HIV treatment outcomes among HIV-infected, opioid-dependent patients receiving buprenorphine/naloxone treatment within HIV clinical care settings: results from a multisite study. J Acquir Immune Defic Syndr. 2011;56(Suppl 1):S22-S32.
36. Vergara-Rodriguez P, Tozzi MJ, Botsko M, et al. Hepatic safety and lack of antiretroviral interactions with buprenorphine/naloxone in HIV-infected opioid-dependent patients. J Acquir Immune Defic Syndr. 2011;56(Suppl 1):S62-S67.
37. Egan JE, Netherland J, Gass J, et al. Patient perspectives on buprenorphine/naloxone treatment in the context of HIV care. J Acquir Immune Defic Syndr. 2011;56(Suppl 1):S46-S53.
38. Finkelstein R, Netherland J, Sylla L, et al. Policy implications of integrating buprenorphine/naloxone treatment and HIV care. J Acquir Immune Defic Syndr. 2011;56(Suppl 1):S98-S104.
39. Lum PJ, Little S, Botsko M, et al. Opioid prescribing and provider confidence recognizing opioid abuse in HIV primary care settings. J Acquir Immune Defic Syndr. 2011;56(Suppl 1):S91-S97.
40. Gandek B, Ware JE, Aaronson NK, et al. Cross-validation of item selection and scoring for the SF-12 Health Survey in nine countries: results from the IQOLA Project. International Quality of Life Assessment. J Clin Epidemiol
41. McLellan AT, Luborsky L, Woody GE, et al. An improved diagnostic evaluation instrument for substance abuse patients. The Addiction Severity Index. J Nerv Ment Dis
42. Netherland J, Botsko M, Egan JE, et al. Factors affecting willingness to provide buprenorphine treatment. J Subst Abuse Treat
43. Derogatis LR, Melisaratos N. The Brief Symptom Inventory: an introductory report. Psychol Med
Keywords: buprenorphine; HIV; AIDS; opioid-related disorders; heroin dependence; program evaluation; integrated care

© 2011 Lippincott Williams & Wilkins, Inc.