In 1981, when the first cases of AIDS were reported in CDC's Morbidity and Mortality Weekly Report,1 two computing technology milestones also occurred: the first US patent was granted for a computer program, and Microsoft released a new computer operating system—MS-DOS 1.0. In its fourth decade, the US HIV epidemic has grown and changed in ways that could not have been imagined in 1981,2 and so has the computing technology industry. Advances in microcomputing speed, data transmission speed, miniaturization of computing technologies, and the development of infrastructure for high mobile bandwidth have revolutionized industries and social structures3 and also offer great potential for reinvigorating certain aspects of HIV research and prevention.4,5 Here, we consider how changes in the US HIV epidemic and in technology present mutual opportunities, summarize several key lessons learned from the first decade of online prevention research, and propose priorities to improve the impact of emerging technologies for HIV prevention going forward.
CHANGING EPIDEMICS, CHANGING TECHNOLOGIES
In its fourth decade, the HIV/AIDS epidemic in the United States has substantially changed. Compared with the early 1980s, cases of pediatric HIV/AIDS in the United States are now almost nonexistent; similarly, cases among injection drug users (IDU) have greatly diminished and are decreasing among heterosexuals.2,6 Earlier concerns about HIV bridging from men who have sex with men (MSM) and IDU into the general population to produce a generalized epidemic have, ultimately, not materialized. Instead, HIV has concentrated among MSM and in some communities of color. The proportion of incident cases attributed to male–male sex has risen in recent years to at least 61% of new infections6; young MSM, especially young MSM of color, have emerged as the most heavily impacted subgroup within this epidemic.7,8 Among other factors, the rise of the Internet as a facilitator of sexual networking has been proposed as a contributor to the reemergence of HIV and STDs among US MSM.9–11 Beyond MSM, high levels of late diagnoses of HIV suggest that lack of awareness of HIV infection is an important ongoing public health challenge,12 and routine HIV screening has been recommended for US adolescents and adults aged 13–64 years.13
Several major trends in computing technologies suggest that the time is right to take greater advantage of electronic media for HIV prevention and research. First, mobile phones have become ubiquitous. Among US adults, 85% own a cell phone, and among millennials (those aged 18–34 years), 94%–95% do.14,15 Importantly, 40% of these phones are “smartphones,”14 which can receive Internet content in mobile web browsers and run “mobile applications,” or “apps.” Second, wireless data speeds and bandwidth continue to increase16; higher data speeds allow more complex content to be presented on mobile devices. Third, the “digital divide,” in which some racial/ethnic minorities have historically had less access to private high-speed Internet, has narrowed considerably in recent years.17 Fourth, new technologies and online venues have emerged that represent interactions between technology and risk taking. For example, sexual networking Web sites and mobile phone applications facilitate sexual partnering of MSM.18 This places sexual negotiation, online innovation, and opportunity for sexual health intervention in a common (if virtual) place. Finally, social networking platforms have grown explosively, to the extent that these virtual communities increasingly approach a census of some key subpopulations of US adults. For example, the proportion of Internet-using US adults with at least 1 social networking account increased from 8% in 2005 to 65% in 201119; among Americans aged 18–33 years, 83% have at least 1 such account.19 Considered together, these changes suggest that broad segments of Americans, especially younger Americans, can be reached and engaged through electronic media for the purposes of HIV prevention research and prevention content delivery.
THE UPS AND DOWNS OF NEW TECHNOLOGIES FOR HIV PREVENTION
The potential advantages of electronic media for HIV research and prevention are many. In research, the use of online surveys offers opportunities to collect large amounts of data in relatively short periods of time. Additionally, online research allows researchers to enroll populations of research subjects from broad geographic areas, increasing the extent to which research results may be generalizable beyond specific cities or regions. For example, the European Men's Internet Survey (EMIS) collected 180,998 online surveys in 12 weeks, representing MSM in 38 countries.20 Using modern computing platforms and survey utilities that allow for complex scripting, more sophisticated methods for measuring complex behavioral patterns can be applied. For example, an Internet-administered questionnaire to collect information about sexual networks was reported to allow better resolution of the timing of sexual relationships compared with interviewer-administered methods.21 For HIV prevention programs, online capacities offer the opportunity to increase fidelity of the delivery of prevention content and the possibility of scaling up prevention efforts with minimal incremental costs.
However, new communication technologies and online facilities are not without limitations for HIV prevention research and programs. In research, there are significant concerns about selection biases in the enrollment of online samples, and these biases are believed to operate across age, race, education, and risk for HIV infection. For example, an analysis of published HIV prevention studies of US MSM suggested that black and Hispanic MSM have historically been underrepresented in online prevention research by 6%–89%.22 Furthermore, the extent to which MSM consented to and began an online survey varied by age, race, education, and urbanicity of residence.22 The compounded selection biases at the levels of exposure to recruitment materials, click-through, consent, and survey completion are complicated and make it difficult to characterize the populations to which study findings may apply. Also, when studies use monetary incentives, there is the potential for fraudulent data collection, in which ineligible individuals misrepresent themselves to participate or individuals enroll and complete study procedures multiple times.23 These instances result in misclassification bias and/or in the skewing of data toward the response profile of those who complete study procedures multiple times. Finally, because technologies change rapidly, the cycle of envisioning, competing for funding, conducting, and analyzing the findings of online prevention research may be so long that by the time research is concluded and reported, the technology environment has substantially changed.
WHAT HAVE WE LEARNED?
Use Emerging Computing and Communication Technologies for What They Can Do Uniquely Better Than Existing Technologies
Generally, emerging technologies are superior to existing technologies for bringing content delivery to scale; collecting data or presenting survey items adaptively (ie, with the sequence of questions or content changing based on previous inputs); covering broad geographic areas; and delivering interventions in highly tailored ways based on the preferences or characteristics of research subjects or clients.24–27 For example, in a randomized controlled trial, researchers demonstrated the feasibility of effectively surveying28 and administering an HIV prevention intervention29 to rural MSM, who are more isolated and dispersed over a larger geographic area than urban MSM.
However, delivery of certain research instruments or prevention services is uniquely suited to being provided by people. For example, motivational interviewing, HIV prevention counseling, and interview schemes that cannot be reduced to an algorithm are inherently better suited to being provided by a trained research interviewer or prevention provider. A corollary of this is that there may be points in the flow of an otherwise online endeavor—for example, the provision of HIV-positive results from at-home HIV testing to participants in a large online HIV prevention study—at which it is important to step out of the virtual environment of study procedures and engage with research participants personally. In our experience with such at-home testing, participants who tested HIV-positive at home were followed up by phone to provide telephone support and linkage support services; 12 of the 14 reported attending a first doctor's visit after such “in-person” interactions.30
Match Technologies and Online Setting to Populations
Certain populations may be well suited to specific technologies; formative research that uses data sources not conventionally used in HIV prevention research, such as public opinion survey data and technical reports of technology utilization,16 may suggest such opportunities. For example, in response to data suggesting that MSM of color were more likely than white MSM to report mobile phone ownership, a recent online prospective prevention cohort tested the impact of collecting follow-up survey data through short message service (SMS, or text messaging) to improve the retention of MSM of color. Preliminary results indicated that, for black MSM, collecting follow-up survey data through SMS (versus a web-based survey) resulted in higher early retention.30 Future research regarding the prevalence of mobile application use among young MSM, which is already at 60% among all US adults aged 18–29 years,31 may also reveal an innovative way to collect data and administer HIV prevention interventions.
Implement Approaches, Both Before and After Research, to Improve Data Quality
As mentioned earlier, fraud and selection and misclassification biases pose threats to the usefulness of research data collected through Internet study procedures. Based on lessons learned to date, there are important opportunities to reduce the impact of these concerns during each phase of study development: researchers can take steps before data collection to reduce the occurrence of fraud and bias, during data collection to identify and mitigate problems related to hacking and recruitment, and after data collection to remove or adjust for suspicious entries.
Before data collection, steps can be taken to reduce enrollment of ineligible participants and duplicate enrollment. To reduce enrollment of ineligible participants, participants can be recruited through venues that suggest the authenticity of their eligibility: men recruited through gay sex–seeking sites, or whose social networking profiles identify them as gay, are unlikely to have attended the site or identified as gay in their profiles solely to seek participation in a monetarily incentivized study. To limit duplicate enrollment, it is possible to restrict repeat enrollment from a single IP address or to require verification of an email address or mobile phone number as part of the study registration process.23 Protection against automated hacking programs, or “bots,” has also improved through methods such as reCAPTCHA, which distinguishes between computer programs and human responders by requiring that a human complete a task before being granted access to the survey.32 Although imperfect, steps such as these make it more challenging for participants to enroll more than once and increase the effort-to-incentive ratio.
During data collection, it is useful to monitor incoming participants' locations and referring URLs, particularly if the survey has been designed to include quotas (eg, for stratified sampling). Abnormally high frequencies from either a specific location or a single referring Web site can indicate that, at the very least, something has gone wrong with the recruitment mechanism. Clusters of entries from the same city may indicate link sharing, either on Web sites or among friends, or individuals completing the survey multiple times from different locations. Similarly, referring URLs can reveal whether the link to a survey has been posted to a Web site that lists surveys that people can take for money (eg, SlickDeals.net).
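The monitoring described above can be reduced to simple frequency checks over incoming entries. As an illustrative sketch in Python (the field names and the 50% flagging threshold are our own assumptions, not part of any published protocol):

```python
from collections import Counter

def flag_anomalies(entries, field, max_share=0.5):
    """Flag values of `field` (eg, "city" or "referrer") that account
    for more than `max_share` of all incoming survey entries so far."""
    counts = Counter(e[field] for e in entries)
    total = sum(counts.values())
    return {value: n for value, n in counts.items() if n / total > max_share}

# Hypothetical incoming entries; a single city dominating the sample
# may indicate link sharing or repeat participation.
entries = [
    {"city": "Atlanta", "referrer": "facebook.com"},
    {"city": "Atlanta", "referrer": "slickdeals.net"},
    {"city": "Atlanta", "referrer": "slickdeals.net"},
    {"city": "Denver",  "referrer": "facebook.com"},
]
print(flag_anomalies(entries, "city"))  # {'Atlanta': 3}
```

In practice, flagged locations or referrers would prompt a manual check of the recruitment mechanism rather than automatic exclusion.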
After data are collected, it is imperative that researchers take steps to deduplicate response sets based on the available data. Typically, deduplication protocols begin with a computer-based screening of datasets for duplicate IP addresses, resolved either to the full address or to the first 3 octets (ie, the first 24 bits).33 Potentially duplicative results are then examined more closely by hand for similarities in other elements of the response profile. In some cases, and as allowed by institutional review board (IRB) procedures, participants suspected of enrolling more than once may also be contacted to help inform appropriate dispositions of suspect records. More recently, improved technologies have been considered that use potentially identifying user characteristics (such as keystroke patterns) to provide additional objective data on which to base dispositions of suspect records.34
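The first, automated screening step might be sketched as follows (the record structure is hypothetical, and IPv4 addresses are assumed; flagged groups would still require manual review):

```python
from collections import defaultdict

def dedupe_candidates(records, full_match=False):
    """Group survey records by IP address, either exactly or by the
    first 3 octets (the /24 network), and return groups with more than
    1 record as candidates for manual review."""
    groups = defaultdict(list)
    for rec in records:
        ip = rec["ip"]
        key = ip if full_match else ".".join(ip.split(".")[:3])
        groups[key].append(rec["id"])
    return {k: ids for k, ids in groups.items() if len(ids) > 1}

records = [
    {"id": 1, "ip": "192.0.2.15"},
    {"id": 2, "ip": "192.0.2.99"},   # same /24 network as record 1
    {"id": 3, "ip": "198.51.100.7"},
]
print(dedupe_candidates(records))  # {'192.0.2': [1, 2]}
```

Matching on the /24 network is deliberately conservative: it catches participants whose address changes within one network but also sweeps in distinct participants who share a provider, which is why flagged groups are reviewed rather than dropped.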
WHAT ARE SOME NEXT STEPS TO MAXIMIZE THE IMPACT OF EMERGING TECHNOLOGIES FOR HIV PREVENTION?
Better Ways to Characterize Bias and Develop Sampling Frames
Given the very high coverage of social networking subscriptions in the United States,19 there are increasing opportunities to improve the rigor of online HIV prevention research by using social networking approaches to recruit samples with high representativeness and to characterize selection biases. Generally, it will be important to address several issues to advance the field in this area. First, researchers will need to develop credible estimates of the number of MSM in online social networks, with stratified estimates for important subgroups (eg, younger men, MSM of color, HIV-positive MSM). Such research may use traditional methods, such as capture–recapture methods based on survey responses, or technology-enabled approaches, such as the use of public domain data about coenrollment in certain subscription venues suggestive of MSM behavior. Second, because many MSM (including “post-gay” millennial MSM) may not identify as such in their social network profiles,35 research is needed to identify proxy markers for MSM in social networking environments and to develop and test peer-referral strategies that take advantage of online social networking structures to allow access to MSM who are not publicly identified as such in their profiles. Third, recruiting from general social networking sites increases the risk of enrollment by ineligible participants for monetary or other motivations.36 Significant challenges in excluding “professional online research participants” and automated computer bots need to be anticipated. For example, in a recent recruitment of MSM from Facebook, the first 750 of 900 subjects failed to pass a cross-validation and deduplication protocol (Rosser, unpublished data). Hence, more research on how to cross-validate eligibility is needed.
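To illustrate the capture–recapture approach mentioned above, a two-sample estimate reduces to a one-line calculation. The counts below are invented for the example, and we use the Chapman-corrected form of the Lincoln–Petersen estimator:

```python
def chapman_estimate(n1, n2, m):
    """Estimate total population size from two overlapping samples:
    n1 individuals observed in sample 1, n2 in sample 2, and m in both.
    The Chapman correction reduces bias when the overlap m is small."""
    if not 0 <= m <= min(n1, n2):
        raise ValueError("overlap m must satisfy 0 <= m <= min(n1, n2)")
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical counts: 400 eligible profiles found via one venue,
# 500 via another, with 50 profiles appearing in both lists.
print(round(chapman_estimate(400, 500, 50)))  # 3938
```

The uncorrected Lincoln–Petersen estimator, n1 × n2 / m, gives 4,000 for the same counts; either estimate carries a wide confidence interval and rests on the assumption that the two samples are drawn independently.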
Having credible estimates of the numbers and characteristics of MSM in online sexual networking communities will also allow for improved characterization of selection biases in recruitment. Furthermore, social networking communities lend themselves naturally to sampling strategies, such as Respondent-Driven Sampling,37 which rely on social network structures. Developing multiple methods to characterize samples and increase representativeness will facilitate triangulation toward an understanding of online MSM populations and the direction and extent of biases in samples recruited in specific studies.
Development of Standards for Reporting Results of Surveys
The field of Internet research for HIV prevention has been fast paced and entrepreneurial to date. To improve the scientific credibility of results from online research efforts, it is necessary to develop a consensus set of expectations for the reporting of results from online research. Such an approach builds on the successes of earlier efforts to standardize reporting of research results from randomized trials,38 field outbreak investigations (ORION),39 epidemiologic studies (STROBE),40 and systematic reviews and meta-analyses (PRISMA).41,42 Of note, Eysenbach's “Checklist for Reporting Results of Internet E-Surveys” (CHERRIES)43 guidelines have been a helpful resource for researchers; however, because of the continued evolution of technology issues, there is substantial opportunity to develop consensus guidelines that might lead to broad adoption by biomedical journal editors.
Develop Improved, Streamlined Funding Mechanisms to Keep Up With Technology
As mentioned herein, the typical federally supported research cycle is at odds with the rapid pace of evolving technology. Prolonged review and funding timelines put scientists at risk of beginning research data collections with outdated technologies and approaches. Furthermore, implementation of technology research projects often requires substantial investment in the development of technology components early in the course of a study; these realities may conflict with NIH annual budget caps that assume consistent levels of spending in each study year. To address these concerns, academic and public health scientists should work with funders to develop expedited funding mechanisms with appropriate flexibility to address the unique project management issues associated with online research.
Because of technical issues related to privacy and the unique challenges of informed consent online, IRBs may have varying approaches to how research protocols for online data collections are treated with respect to risks of participation. Furthermore, there are gaps in understanding of which processes for informed consent may be most helpful to participants. In two studies, the main reasons given by respondents for declining to participate in an online HIV prevention survey study were, in the first study, too long a survey battery (53%), insufficient compensation (32%), and confidentiality concerns (26%); and, in the second study, too much of a hassle (53%), inconvenient timing (32%), and sexually explicit content (15%).44 Survey length, compensation, and timing, in addition to concerns about confidentiality and content, need to be addressed. Traditional (offline) approaches of presenting consent as a document to be read and signed appear inappropriate for most online research.45 Because people read and process information differently online,46 researchers need to adapt offline procedures to develop appropriate online consent processes. Four challenges are primary: (1) designing recruitment and enrollment procedures to ensure adequate attention to human subject considerations, (2) obtaining and documenting subjects' consent, (3) establishing investigator credibility through investigator–participant interactions, and (4) enhancing confidentiality during all aspects of the study.44 There are also opportunities for the development of improved educational programs for IRB members and administrators, and for further research into informed consent processes for participants in online studies.
Oversurveying of MSM
A critical issue for the field is the proliferation of online HIV behavioral and prevention surveys, which may result in challenges in recruiting MSM to future research studies or in confusion among MSM as to whether they have already participated in a particular survey. This problem is exacerbated, at least potentially, because many studies use widely available “stock” images in Internet banner advertisements for recruiting MSM. In our own informal observations, the rates at which men “click through” gay-themed banner advertisements for surveys on social networking sites have decreased in recent years, as have survey completion rates among men who consent and begin surveys. Anecdotally, and through responses to open-ended feedback fields in our own surveys, men express frustration at being asked very similar sets of questions in multiple surveys and at their limited ability to see the results of the research in which they participate (especially for anonymous surveys). This suggests that there may be an opportunity to develop mechanisms that preserve academic freedom and promote investigator-driven research approaches but use common resources to prevent the extinction of MSM's willingness to participate in online sexual health research. Such approaches might include the development of a voluntary, transparent registry of online surveys, with links to published research products, or the development and implementation of a periodic omnibus survey—perhaps through existing Center for AIDS Research infrastructures—with open access to researchers wishing to collect data specific to their research interests at no or low cost. The development of panels of MSM respondents, while presenting other biases, might also offer potential for reducing burden among men in our communities while allowing for rigorous data collections.
The emergence of technology-enabled methods for HIV research and prevention has been fast paced and dynamic. This rapid pace suggests that embracing a set of guiding principles, rather than a codified set of best practices, provides a framework in which prevention researchers and practitioners can use new communication technologies to best advantage. There is a great deal of opportunity to bring increased rigor and impact to technology-enabled research and prevention by addressing traditional issues of behavioral science and epidemiology: doing better at assessing biases and data quality, developing better sampling approaches, improving transparency of research methods and results, and reaching consensus on standards for reporting the results of online studies. A smart combination of technology-enabled interactions and human interactions with clients likely offers an optimal approach for evaluating and implementing combination HIV prevention approaches.
1. Centers for Disease Control and Prevention. Pneumocystis pneumonia—Los Angeles. MMWR Morb Mortal Wkly Rep. 1981;30:1–3.
3. DiMaggio P, Hargittai E, Neuman WR, et al. Social implications of the Internet. Ann Rev Sociol. 2001;27:307–336. doi: 10.1146/annurev.soc.27.1.307.
4. Swendeman D, Rotheram-Borus MJ. Innovation in sexually transmitted disease and HIV prevention: Internet and mobile phone delivery vehicles for global diffusion. Curr Opin Psychiatry. 2010;23:139. doi: 10.1097/YCO.0b013e328336656a.
5. Rietmeijer CA, McFarlane M. Web 2.0 and beyond: risks for sexually transmitted infections and opportunities for prevention. Curr Opin Infect Dis. 2009;22:67–71. doi: 10.1097/QCO.0b013e328320a871.
6. Prejean J, Song R, Hernandez A, et al. Estimated HIV incidence in the United States, 2006–2009. PLoS One. 2011;6:e17502. doi: 10.1371/journal.pone.0017502.
7. Phillips G, Wohl A, Xavier J, et al. Epidemiologic data on young men of color who have sex with men. AIDS Patient Care STDS. 2011;25(suppl 1):S3–S8. doi: 10.1089/apc.2011.9882.
8. Centers for Disease Control and Prevention. Trends in HIV/AIDS diagnoses among men who have sex with men—33 States, 2001–2006. MMWR Morb Mortal Wkly Rep. 2008;57:681–686.
9. Centers for Disease Control and Prevention. Internet use and early syphilis infection among men who have sex with men—San Francisco, California, 1999–2003. MMWR Morb Mortal Wkly Rep. 2003;52:1229–1232.
10. Wong W, Chaw JK, Kent CK, et al. Risk factors for early syphilis among gay and bisexual men seen in an STD clinic: San Francisco, 2002–2003. Sex Transm Dis. 2005;32:458–463. doi: 10.1097/01.olq.0000168280.34424.58.
11. Ciesielski CA. Sexually transmitted diseases in men who have sex with men: an epidemiologic review. Curr Infect Dis Rep. 2003;5:145–152. doi: 10.1007/s11908-003-0051-5.
12. Centers for Disease Control and Prevention. Vital signs: HIV testing and diagnosis among adults—United States, 2001–2009. MMWR Morb Mortal Wkly Rep. 2010;59:1550–1555.
13. Centers for Disease Control and Prevention. Revised recommendations for HIV testing of adults, adolescents, and pregnant women in health-care settings. MMWR Morb Mortal Wkly Rep. 2006;55(RR-14):1–17.
18. Liau A, Millett G, Marks G. Meta-analytic examination of online sex-seeking and sexual risk behavior among men who have sex with men. Sex Transm Dis. 2006;33:576–584. doi: 10.1097/01.olq.0000204710.35332.c5.
21. Rosenberg ES, Sullivan PS. A new online survey module to measure sexual partnership timing, with results from a focus group of MSM. Paper presented at Sex::Tech; February 26, 2010; San Francisco, CA.
22. Sullivan PS, Khosropour CM, Luisi N, et al. Bias in online recruitment and retention of racial and ethnic minority men who have sex with men. J Med Internet Res. 2011;13:e38. doi: 10.2196/jmir.1797.
23. Bowen AM, Daniel CM, Williams ML, et al. Identifying multiple submissions in internet research: preserving data integrity. AIDS Behav. 2008;12:964–973. doi: 10.1007/s10461-007-9352-2.
24. Eysenbach G, Wyatt J. Using the Internet for surveys and health research. J Med Internet Res. 2002;4:e13. doi: 10.2196/jmir.4.2.e13.
25. Wyatt JC. When to use web-based surveys. J Am Med Inform Assoc. 2000;7:426–430. doi: 10.1136/jamia.2000.0070426.
26. Ekman A, Litton JE. New times, new needs; e-epidemiology. Eur J Epidemiol. 2007;22:285–292. doi: 10.1007/s10654-007-9119-0.
27. Ekman A, Dickman PW, Klint Å, et al. Feasibility of using web-based questionnaires in large population-based epidemiological studies. Eur J Epidemiol. 2006;21:103–111. doi: 10.1007/s10654-005-6030-4.
28. Preston DB, D'Augelli AR, Kassab CD, et al. The influence of stigma on the sexual risk behavior of rural men who have sex with men. AIDS Educ Prev. 2004;16:291–303. doi: 10.1521/aeap.16.4.291.40401.
29. Bowen AM, Horvath K, Williams ML. A randomized control trial of Internet-delivered HIV prevention targeting rural MSM. Health Educ Res. 2007;22:120–127. doi: 10.1093/her/cyl057.
30. Khosropour CM, Sullivan PS. At-home HIV testing of MSM enrolled in an online HIV behavioral risk study. Paper presented at: 2011 National HIV Prevention Conference; August 15, 2011; Atlanta, GA.
32. von Ahn L, Maurer B, McMillen C, et al. reCAPTCHA: human-based character recognition via web security measures. Science. 2008;321:1465–1468. doi: 10.1126/science.1160379.
33. Konstan JA, Rosser BRS, Ross MW, et al. The story of subject naught: a cautionary but optimistic tale of Internet survey research. J Comput Mediat Commun. 2005;10:00. doi: 10.1111/j.1083-6101.2005.tb00248.x.
34. Hosseinzadeh D, Krishnan S. Gaussian mixture modeling of keystroke patterns for biometric applications. IEEE Trans Syst Man Cybern C Appl Rev. 2008;38:816–826. doi: 10.1109/TSMCC.2008.2001696.
35. Taraszow T, Aristodemou E, Shitta G, et al. Disclosure of personal and contact information by young people in social networking sites: an analysis using Facebook profiles as an example. Int J Media Cult Polit. 2010;6:81–101. doi: 10.1386/macp.6.1.81/.
36. Mustanski BS. Getting wired: exploiting the Internet for the collection of valid sexuality data. J Sex Res. 2001;38:292–301. doi: 10.1080/00224490109552100.
37. Heckathorn DD. Respondent-driven sampling: a new approach to the study of hidden populations. Soc Probl. 1997;44:174–199. doi: 10.1525/sp.1997.44.2.03x0221m.
38. Schulz KF, Altman DG, Moher D. CONSORT 2010 statement: updated guidelines for reporting parallel group randomized trials. Ann Intern Med. 2010;152:726–732.
39. Stone SP, Cooper BS, Kibbler CC, et al. The ORION statement: guidelines for transparent reporting of outbreak reports and intervention studies of nosocomial infection. Lancet Infect Dis. 2007;7:282–288. doi: 10.1016/s1473-3099(07)70082-8.
40. von Elm E, Altman DG, Egger M, et al. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. Lancet. 2007;370:1453–1457. doi: 10.1016/s0140-6736(07)61602-x.
41. Moher D, Liberati A, Tetzlaff J, et al. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. BMJ. 2009;339:332–336. doi: 10.1136/bmj.b2535.
42. Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate healthcare interventions: explanation and elaboration. J Clin Epidemiol. 2009;62:e1–e34. doi: 10.1016/j.jclinepi.2009.06.006.
43. Eysenbach G. Improving the quality of web surveys: the Checklist for Reporting Results of Internet E-Surveys (CHERRIES). J Med Internet Res. 2004;6:e34. doi: 10.2196/jmir.6.3.e34.
44. Rosser BRS, Gurak L, Horvath KJ, et al. The challenges of ensuring participant consent in Internet-based sex studies: a case study of the Men's INTernet Sex (MINTS-I and II) studies. J Comput Mediat Commun. 2009;14:602–626. doi: 10.1111/j.1083-6101.2009.01455.x.
45. Pequegnat W, Rosser BRS, Bowen AM, et al. Conducting Internet-based HIV/STD prevention survey research: considerations in design and evaluation. AIDS Behav. 2007;11:505–521. doi: 10.1007/s10461-006-9172-9.
46. Gurak LJ, Lannon JM. A Concise Guide to Technical Communication. New York, NY: Pearson Longman; 2003.