Original Study

Predicting Clinical Practice Change: An Evaluation of Trainings on Sexually Transmitted Disease Knowledge, Diagnosis, and Treatment

Voegeli, Christopher PhD, MPH; Fraze, Jami PhD; Wendel, Karen MD; Burnside, Helen MS; Rietmeijer, Cornelis A. MD, PhD; Finkenbinder, Allison MSN, WHNP-BC; Taylor, Kimberly BSW; Devine, Sharon PhD, JD; on behalf of the National Network of STD Clinical Prevention Training Centers

Author Information
Sexually Transmitted Diseases: January 2021 - Volume 48 - Issue 1 - p 19-24
doi: 10.1097/OLQ.0000000000001282

In 2018, more than 2.2 million cases of sexually transmitted diseases (STDs) were diagnosed in the United States, including more than 1.7 million cases of chlamydia, 580,000 cases of gonorrhea, 115,000 cases of syphilis, and 1300 cases of congenital syphilis.1 In the face of this epidemic, health care providers need evidence-based knowledge and skills to screen, diagnose, treat, manage, and prevent STDs in patients. There continue to be large gaps in health professionals’ academic education and continuing education related to STDs and STD care.2–4 In addition, advancements in care such as extragenital testing, increased use and demand for HIV preexposure prophylaxis, and expedited partner therapy continue to drive clinical education needs.

Since 1979, the Centers for Disease Control and Prevention (CDC) has funded the National Network of STD Clinical Prevention Training Centers (NNPTC) to train providers to improve STD prevention and care in the United States and its territories.2 The network includes 8 regional clinical prevention training centers (PTCs) that are responsible for facilitating the incorporation of evidence-based STD screening and treatment guidelines at the individual provider, organization, and system levels. Through the use of both experiential and didactic trainings, the NNPTC has previously been shown to improve clinical STD practices.5–8 Courses vary in length from approximately 1 hour of didactic presentation, often delivered as webinars, to several days of didactic and experiential skills-building training. The NNPTC’s priority training audience is health care providers serving high-risk populations in areas with high STD prevalence.

Previous research has investigated how the Theory of Reasoned Action, Theory of Planned Behavior, and Theory of Intentional Behavior Change provide some insight into the complex psychological mechanisms that underlie facilitation of behavior change.9–11 A review of 78 studies provided evidence of both the ability to predict intention and behavior and the application of the Theory of Planned Behavior to conceptualize the processes that yield these changes.12

A review of 10 studies found a predictable relationship between intention to change and clinicians’ behavior change.13 Although there may be a lack of consensus on the mechanisms by which an intention to change becomes a behavior change, the studies in the review provide evidence that intentions lead to actual changes in behavior. The present analysis identifies factors that elicited individual intention to change and self-reported change in practices from health care workers providing STD care.

METHODS

We report the first aggregate data from the NNPTC. In 2014, the CDC funded the NNPTC to evaluate itself, through the creation of the National Evaluation Center, as part of a cooperative agreement (PS 14-1407).

Evaluation Process

The funder identified a national program delivered by regional centers and requested a national aggregated evaluation to assess outcomes across all regions. After working with the regional PTCs and CDC staff to identify training needs and service gaps, the National Evaluation Center developed an online learning management system (LMS) to collect registration information and standardized measures for both immediate postcourse and 90-day postcourse evaluations of common training outcomes. Trainees completed the online registration form when enrolling and were invited by e-mail to complete immediate postcourse evaluations. The immediate postcourse evaluation was e-mailed to trainees on the last day of the training or at most 3 days later. Approximately 90 days after the end of the course, trainees were invited by e-mail to complete another postcourse evaluation. For both evaluations, trainees were sent 2 e-mail reminder notifications. Except in rare instances, all data were collected online by the LMS. To receive continuing education credits, trainees needed to complete the immediate postcourse evaluation. To decrease the burden on respondents, trainees were not required to answer all evaluation questions. The NNPTC’s registration form and all evaluation instruments were approved for use by the Office of Management and Budget (PRA No. 0920-0995).
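As a rough sketch of this evaluation schedule, the following Python snippet computes the invitation dates for a single trainee. The 3-day window, the 90-day follow-up, and the 2 reminders come from the description above; the weekly reminder spacing is an assumption for illustration only.

```python
from datetime import date, timedelta

def evaluation_schedule(course_end: date, reminder_gap_days: int = 7) -> dict:
    """Hypothetical invitation dates for one trainee.

    The immediate postcourse evaluation is sent on the last day of the
    course (or at most 3 days later); the follow-up is sent ~90 days
    after the course ends. Each invitation is followed by 2 e-mail
    reminders. The reminder spacing is an assumption, not taken from
    the article.
    """
    schedule = {}
    for label, first_send in [("immediate", course_end),
                              ("90-day", course_end + timedelta(days=90))]:
        # Invitation plus 2 reminders.
        schedule[label] = [first_send + timedelta(days=i * reminder_gap_days)
                           for i in range(3)]
    return schedule

print(evaluation_schedule(date(2016, 7, 1)))
```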

Data Collected

Data are presented in aggregate for trainings conducted by the NNPTC’s 8 regional PTCs. The trainings included a wide range of topics from screening, treating, and diagnosing STDs to increasing cultural competency and creating a more inclusive clinic through changes in practices. For the purposes of this study, we identified the inclusion criteria for the regression analysis as all respondents who completed a registration form, a postcourse evaluation survey, and a 90-day postcourse evaluation survey. Using this listwise deletion approach ensured that everyone in the regression was accurately categorized as either making an actual change in their practice or not. In addition, a missing value analysis was performed to determine if the missing data followed any patterns. Previous literature has found that, with a large enough sample size and no pattern in the missing data, complete-case analysis produces unbiased and conservative results.14 On the registration form, among other information, trainees were asked to indicate their profession from a list of 17 possible categories, their functional role at the worksite from a list of 23 options, and type of worksite from a list of 20, including “not working.” For each category, “other” was a permitted write-in, which was manually recoded into existing categories if possible based on the response.
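A minimal pandas sketch of this listwise-deletion step and a first-pass missingness check; the variable names are hypothetical stand-ins, since the NNPTC's actual field names are not published:

```python
import pandas as pd

# Hypothetical variable names for the measures used in the regression.
ANALYSIS_VARS = ["intention_to_change", "direct_patient_care",
                 "primary_care_setting", "course_4h_or_longer",
                 "actual_change_90d"]

def listwise_complete(df: pd.DataFrame) -> pd.DataFrame:
    """Listwise deletion: keep only respondents with no missing analysis variables."""
    return df.dropna(subset=ANALYSIS_VARS)

def missingness_report(df: pd.DataFrame) -> pd.Series:
    """Per-variable missing-data rates, a simple first check for patterns."""
    return df[ANALYSIS_VARS].isna().mean().sort_values(ascending=False)
```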

The immediate postcourse evaluation asked about satisfaction, intention to change, precourse practice patterns, and intended postcourse practice patterns, and, depending on the course content, sometimes included a multiple-choice test of knowledge about STD screening and treatment. The 90-day postcourse evaluation asked about current practice patterns, changes made based on the course, knowledge, and barriers and facilitators to making a practice change.

The clinical practices included in the evaluations were specific to the content taught in the course. Examples of practice change included increasing the proportion of sexually active asymptomatic female patients younger than 25 years screened annually for chlamydia; proper testing and follow-up for gonorrhea; and increasing the proportion of nonvaccinated female patients between the ages of 11 and 26 years who were counseled about getting the HPV vaccine. We coded trainees as having an intention to change and a self-reported actual change in practice based on their responses to the immediate postcourse and 90-day postcourse evaluation surveys. For intention to change, we created an algorithm that denoted an intention to change if trainees responded that they would make a change in their practice. One such item asked, “As a result of information presented, do you intend to make changes in your practice or at your worksite setting?” with possible responses of “yes,” “no,” “I already do this,” “it’s not my job,” and “other.” If a trainee responded “yes,” they were coded as having an intention to change. If a trainee responded “I already do this” or “it’s not my job,” they were coded as not having an intention to change. Another type of question used to measure intention to change was a retrospective pre/post item pair asking trainees to rate how frequently they performed a specific practice before the training and how frequently they intended to perform it after the training. Frequency responses used a 6-point Likert scale: 0%, 1% to 25%, 26% to 50%, 51% to 75%, 76% to 90%, and >91%. If a trainee's immediate postcourse intention exceeded their reported precourse practice for the same item, the trainee was coded as having an intention to change.
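The intention-to-change algorithm can be summarized in a short sketch. This is a reconstruction from the description above, not the evaluators' code; the handling of “other” responses is simplified here.

```python
# Ordered Likert bins from the evaluation instrument, low to high frequency.
LIKERT_ORDER = ["0%", "1-25%", "26-50%", "51-75%", "76-90%", ">91%"]

def intends_to_change(direct_response: str, pre_post_pairs: list) -> bool:
    """Code a trainee as intending to change (True) or not (False).

    direct_response: answer to "do you intend to make changes in your
                     practice or at your worksite setting?"
    pre_post_pairs:  (precourse frequency, intended frequency) tuples,
                     each value one of LIKERT_ORDER.
    """
    if direct_response == "yes":
        return True
    # "no," "I already do this," and "it's not my job" are coded as no
    # intention unless a frequency increase is reported on any item.
    return any(LIKERT_ORDER.index(post) > LIKERT_ORDER.index(pre)
               for pre, post in pre_post_pairs)
```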

Self-reported change in practice was measured with an algorithm in the same manner as intention to change, but using survey items from the 90-day postcourse evaluation. If trainees reported making a change on any item, they were marked as having made an actual change. One question on the 90-day evaluation survey asked, “Did you make a change in your practice or worksite setting as a result of this training?” with possible answer choices of “yes,” “no,” “not applicable to my job or patients,” “I was already using these practices,” and “other reason.” If a trainee responded “yes,” that person was coded as having made an actual change. In addition, a trainee’s response about frequency of a practice before the course, from the immediate postcourse evaluation, was compared with their response to a corresponding item on the 90-day postcourse evaluation asking how often they currently perform the activity. If a trainee reported an increase in practice on any of these measures, the trainee was coded as having made an actual change. The 90-day postcourse evaluation also asked trainees to report all the barriers they encountered when trying to make a change in their practice. Examples of responses included “nothing interfered,” “no opportunity to apply practices,” and “lack of time with patients.” Trainees could select multiple responses.
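The 90-day coding follows the same pattern; a sketch under the same assumptions:

```python
# Same ordered bins as in the previous sketch.
LIKERT_ORDER = ["0%", "1-25%", "26-50%", "51-75%", "76-90%", ">91%"]

def made_actual_change(day90_response: str, pre_current_pairs: list) -> bool:
    """Code self-reported actual change at 90 days (True/False).

    day90_response:    answer to "Did you make a change in your practice
                       or worksite setting as a result of this training?"
    pre_current_pairs: (precourse frequency, current frequency) tuples
                       pairing each immediate postcourse item with its
                       corresponding 90-day item.
    """
    if day90_response == "yes":
        return True
    return any(LIKERT_ORDER.index(cur) > LIKERT_ORDER.index(pre)
               for pre, cur in pre_current_pairs)
```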

Statistical Measures

Descriptive statistics report trainee demographics, length of course, intention to change, self-reported actual change 90 days after a course, and barriers to change. We analyzed the demographics using SPSS 23. We used χ2 tests to compare proportions.
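As an illustration (a sketch, not the authors' SPSS syntax), the Pearson χ2 comparing intention to change with actual change can be reproduced from the 2 × 2 counts reported later in Table 2:

```python
from scipy.stats import chi2_contingency

# Counts from Table 2: rows = intended change (yes, no),
# columns = actual change reported (yes, no).
observed = [[659, 253],
            [204, 267]]
chi2, p, dof, expected = chi2_contingency(observed, correction=False)
print(f"chi2 = {chi2:.2f}, p = {p:.2g}")  # chi2 ~ 110.9, matching Table 2
```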

We performed a logistic regression using Stata15 to test whether intention to change, functional role, employment setting, and length of course predicted self-reported change 90 days after a course. For the regression, professions were recoded into clinical and nonclinical provider professions. Examples of clinical provider professions included registered nurse, physician, and physician assistant. Functional roles were recoded into those providing direct patient care and those not. Direct patient care functional roles included clinician, nurse, care provider, clinical/medical assistant, and intern/resident/fellow. Employment setting was recoded into primary care settings and other settings. Examples of primary care employment settings were academic health center, community health center, correctional facility, and health maintenance organization/managed care organization. Profession and functional role were largely collinear; functional role was used in the regression, as it more accurately reflected the actual activities of the trainee at their workplace.
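A minimal sketch of such a model in Python with statsmodels (the study itself used Stata); the variable names and the simulated data are hypothetical, so the estimates will not reproduce the study's odds ratios:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1383  # size of the analytic sample

# Toy binary predictors standing in for the recoded study variables.
df = pd.DataFrame({
    "intention_to_change": rng.integers(0, 2, n),
    "direct_patient_care": rng.integers(0, 2, n),
    "primary_care_setting": rng.integers(0, 2, n),
    "course_4h_or_longer": rng.integers(0, 2, n),
})
# Simulate an outcome loosely shaped like the reported associations.
log_odds = (-0.6 + 1.1 * df["intention_to_change"]
            + 0.7 * df["course_4h_or_longer"]
            + 0.6 * df["direct_patient_care"])
df["actual_change_90d"] = (rng.random(n) < 1 / (1 + np.exp(-log_odds))).astype(int)

model = smf.logit(
    "actual_change_90d ~ intention_to_change + direct_patient_care"
    " + primary_care_setting + course_4h_or_longer",
    data=df,
).fit(disp=False)
print(np.exp(model.params))      # exponentiated coefficients = odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals
```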

Response Rates

Most evaluation data were collected through the online LMS; however, some courses used a paper form of the immediate postcourse evaluation instrument. Once the paper evaluation instruments were completed, staff entered the results into the online LMS. The LMS does not track the number or frequency of paper evaluation instrument use. Because of this difference, response rates cannot be considered entirely comparable to either in-person or web-based survey response rates. Across 290 courses, 8560 immediate postcourse evaluation requests were sent to trainees through the online LMS or collected with a paper survey, and 5073 were completed, yielding an immediate postcourse response rate of 59.3%. The 90-day postcourse follow-up was done entirely through the online LMS and can be compared with other online survey response rates. The LMS sent 8227 90-day postcourse evaluation e-mail invitations, and 2376 trainees completed the evaluation (28.9%). The totals for the immediate and 90-day postcourse surveys differ because some trainees completed paper postcourse surveys without providing an e-mail address, so a 90-day survey could not be sent, or because a course was not set up to send the 90-day survey. The response rates to the 90-day postcourse evaluation were consistent with previous research.16–18
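Both rates follow directly from the counts above:

```python
sent_immediate, done_immediate = 8560, 5073
sent_90day, done_90day = 8227, 2376
print(f"immediate postcourse: {done_immediate / sent_immediate:.1%}")  # 59.3%
print(f"90-day postcourse:    {done_90day / sent_90day:.1%}")          # 28.9%
```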

Data

Of the 2376 respondents to the 90-day postcourse evaluation, 1572 (66.2%) responded to all the items included in the logistic regression. To ensure the independence of observations, when a trainee took multiple courses, all cases after the earliest training were removed from the data set. Of the 1572 cases, 189 were later trainings from trainees who attended more than one session; removing them left 1383 (88.0%) unique trainees in the data set, representing 187 courses. These courses are categorized into 4 main groups based on content and format. The “clinical/counseling skills” category accounts for 27.8% of courses and is focused on providing hands-on experience working in a clinic or with clients. The “intensive” courses category accounts for 27.3% of courses and consists of multiday trainings that involve lectures, clinical practicums, or both. The “short presentations” category accounts for 23.5% of courses and consists of a presentation on an STD topic that lasts less than 4 hours. The final category, “clinical updates,” accounts for 21.4% of courses and is devoted to teaching attendees about the CDC STD treatment guidelines or updates to the guidelines. Course length varies from less than 4 hours to 3 days.
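A pandas sketch of the deduplication step, keeping each trainee's earliest training; 'trainee_id' and 'course_start_date' are hypothetical column names:

```python
import pandas as pd

def unique_trainees(df: pd.DataFrame) -> pd.DataFrame:
    """Keep only each trainee's earliest training to ensure independence
    of observations; later trainings by the same trainee are dropped."""
    return (df.sort_values("course_start_date")
              .drop_duplicates(subset="trainee_id", keep="first"))
```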

RESULTS

Demographic Data

Descriptive data are reported only on the sample used in the logistic regression (n = 1383). Most trainees were female (n = 1162; 84.0%), white (n = 903; 65.3%), and not Hispanic/Latino (n = 1175; 85.0%). The most commonly reported functional role was clinician/clinical care provider (n = 701; 50.7%). The most commonly reported employment setting was state and local health department (n = 689; 49.8%). More than half of trainees (n = 762; 55.1%) attended a training 4 hours or longer. Table 1 reports trainees’ demographic information.

TABLE 1 - Trainee Demographics
n %
Sex
 Female 1162 84.0
 Male 136 9.8
 Transgender: female to male 6 0.4
 Transgender: male to female 2 0.1
 Missing 77 5.6
Race
 White 903 65.3
 African American 232 16.8
 Asian American 59 4.3
 Multiracial 50 3.6
 American Indian or Alaskan Native 15 1.1
 Native Hawaiian or Pacific Islander 7 0.5
 Missing 117 8.5
Ethnicity
 Not Hispanic, Latino/a or Spanish Origin 1175 85.0
 Hispanic, Latino/a or Spanish Origin 111 8.0
 Missing 97 7.0
Professional discipline
 Registered nurse 584 42.2
 Advanced practice nurse 320 23.1
 Physician 127 9.2
 Community health worker 110 8.0
 Health educator 68 4.9
 Other 38 2.7
 Dentist 29 2.1
 Physician assistant 27 2.0
 Licensed practical nurse 25 1.8
 Social worker 22 1.6
 Clerical 17 1.2
 Missing 16 1.2
Functional role
 Clinician/care provider 701 50.7
 Administrator 253 18.3
 Disease intervention specialist/partner services provider 77 5.6
 Student/graduate student 69 5.0
 Case manager 66 4.8
 Client/patient educator 40 2.9
 Outreach staff 33 2.4
 Clinical/medical assistant 30 2.2
 Med tech/laboratory 3 0.2
 Public health nonclinician 2 0.1
 Missing 109 7.9
Employment setting
 State/local health department 689 49.8
 Community health center 138 10.0
 Hospital/hospital-affiliated clinic 124 9.0
 Academic health center 86 6.2
 Other nonprofit health center 71 5.1
 College/university 70 5.1
 Community-based service organization 48 3.5
 Private practice 36 2.6
 HMO/managed care organization 10 0.7
 Missing 111 8.0
 Total 1383 100.0
HMO, health maintenance organization

To determine if there was a statistically significant difference between trainees included in the regression analysis and trainees that were not included in the analyses, we compared the groups on all demographic and study variables. No statistically significant differences existed between the 2 groups.

Intention to Change and Reported Actual Change

More than half of the trainees included in the regression sample (65.9%; n = 912) reported in the immediate postcourse evaluation that they intended to make a change in their practice. The 3 functional roles with the highest self-reported intention to change were clinician/care provider (73.0%), client/patient educator (72.5%), and clinical/medical assistant (63.3%). The 3 functional roles with the lowest self-reported intention to change were outreach staff (48.5%), case manager (53.0%), and disease intervention specialist (55.8%). The 3 employment settings with the highest proportion of trainees reporting an intention to change were community health center (81.2%), college/university (77.1%), and academic health center (75.6%). The 3 employment settings with the lowest proportion were “other nonprofit health center” (59.2%), community-based service organization (60.4%), and state or local health department (62.1%). Groups comprising fewer than 20 trainees were excluded from these rankings.

A majority of trainee respondents included in the regression analysis (62.4%; n = 863) reported they had made a change in their practice 90 days after a course. The 3 functional roles with the highest reported actual change were clinician/care provider (71.9%), client/patient educator (57.5%), and disease intervention specialist (57.1%). The functional roles with the lowest reported actual change were clinical/medical assistant (43.3%) and student/graduate student (49.3%), with case manager and outreach staff tied for third (51.5% each). The 3 employment settings with the highest proportion of trainees reporting an actual change were community health center (77.5%), “other nonprofit health center” (74.6%), and private practice (72.2%). The 3 employment settings with the lowest reported actual change were community-based service organization (50.0%), state or local health department (58.2%), and college/university (61.4%). Groups with fewer than 20 trainees were excluded from these rankings.

Intention to change and actual change were further stratified by length of course. A statistically significantly lower proportion of trainees in shorter courses said they intended to make a change (62%) compared with trainees who took a course lasting 4 or more hours (69%). A statistically significantly lower proportion of trainees in shorter courses also said they made an actual change (53%) compared with trainees who took a course at least 4 hours long (70%). Table 2 reports the number of trainees who intended to make a change and who reported an actual change, stratified by course length.

TABLE 2 - Percentage of Respondents Reporting Actual Change at 90 Days After a Course (n = 1383)
Course intensity, h (χ2 = 42.64)
 <4 (reference): total 621 (44.90%); actual change 329 (23.79%); no change 292 (21.11%); OR, 1.00
 ≥4: total 762 (55.10%); actual change 534 (38.61%); no change 228 (16.49%); OR, 1.94 (95% CI, 1.54–2.46)
Intended change (χ2 = 110.92)
 No (reference): total 471 (34.10%); actual change 204 (14.75%); no change 267 (19.31%); OR, 1.00
 Yes: total 912 (65.90%); actual change 659 (47.65%); no change 253 (18.29%); OR, 3.12 (95% CI, 2.46–3.97)
Functional role (χ2 = 44.94)
 Provides direct patient care and would treat STDs: total 740 (53.50%); actual change 522 (37.74%); no change 218 (15.76%); OR, 1.77 (95% CI, 1.40–2.24)
 Other (reference): total 643 (46.50%); actual change 341 (24.66%); no change 302 (21.84%); OR, 1.00
Employment setting (χ2 = 0.132)
 A setting that provides direct patient primary care and would treat STDs: total 1137 (82.20%); actual change 712 (51.48%); no change 425 (30.73%); OR, 1.02 (95% CI, 0.75–1.38)
 Other (reference): total 246 (17.80%); actual change 151 (10.92%); no change 95 (6.87%); OR, 1.00
ORs are adjusted estimates from the multivariable logistic regression; percentages are of the full analytic sample (n = 1383). OR, odds ratio; CI, confidence interval.

Trainees who reported an intention to change were significantly more likely to report making an actual change (odds ratio [OR], 3.12; 95% confidence interval [CI], 2.46–3.97; Table 2), making stated intention the strongest predictor of actual change. Trainees who took a course lasting 4 hours or more had 94% higher odds of reporting an actual change (OR, 1.94; 95% CI, 1.54–2.46). Both of these predictors were statistically significant at P < 0.001. Trainees with a functional role providing direct patient care and treating STDs were more likely to report an actual change than trainees who did not provide direct care (OR, 1.77; 95% CI, 1.40–2.24). Employment setting did not predict self-reported change in practice (OR, 1.02; 95% CI, 0.75–1.38).
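As a worked check on these figures, a crude odds ratio can be computed directly from the 2 × 2 counts in Table 2. Because the table's ORs are adjusted estimates from the multivariable model, the crude values differ slightly (roughly 2.08 crude vs. 1.94 adjusted for course length); a sketch:

```python
import math

def crude_or_ci(a: int, b: int, c: int, d: int):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a, b = exposed with/without change; c, d = unexposed with/without change."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Course >= 4 h vs < 4 h, counts from Table 2.
print(crude_or_ci(534, 228, 329, 292))  # ~ (2.08, (1.67, 2.59))
```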

From July 20 to September 19, 2016, the NNPTC LMS experienced an error, and e-mail invitations directing trainees to complete an immediate postcourse or 90-day postcourse evaluation were not sent; invitations went out when the error was discovered, and trainees received their evaluation surveys up to 60 days after the normal surveying period. The error affected up to 742 immediate postcourse evaluations and 714 ninety-day postcourse evaluations. The final sample included 115 respondents who received an immediate postcourse evaluation for a training during this period and 170 respondents who should have received a 90-day postcourse evaluation during it.

Barriers

Of the 1383 trainees included in the logistic regression, 515 (37.2%) reported not making a change in practice. Of these 515, 197 (38.3%) responded to the 90-day postcourse evaluation items asking about barriers to implementing change. Half reported that nothing interfered with applying what they learned (50.8%; n = 100). Other barriers reported were lack of opportunities to apply what they learned (24.4%; n = 48), followed by lack of time with patients (13.2%; n = 26), lack of equipment or supplies (7.1%; n = 14), policies at their worksite (6.6%; n = 13), cost or lack of reimbursement (6.1%; n = 12), more pressing patient concerns (3.6%; n = 7), and resistance to change by supervisors (1.0%; n = 2). The barriers variable was not included in the regression, but it is important for understanding why this group did not make a change.

DISCUSSION

In determining the impact of training on worksite change, intention to change and self-reported actual change are the measures of particular interest. In this evaluation, more than half (65.9%) of trainees completing an evaluation reported an intention to change in the immediate postcourse evaluation, and 62.4% reported that they had made a change in practice after 90 days. These proportions are consistent with findings from other provider trainings.19,20 The greatest predictors of reporting an actual change were stating an intention to change immediately after training and attending a course of 4 hours or more. This finding provides evidence that the more time spent in a course, the more likely a trainee is to incorporate a change into clinical practice. This could be because longer trainings cover more areas for improvement, increasing the likelihood that at least one change is achievable. Other possible reasons are that individuals who take longer courses may enter the training with a greater intention to change, and that longer courses were more often conducted in person, which might increase focus on the training content. There may also be a selection bias: trainees dedicated enough to improving STD care to attend a multiday training may differ from those who attend a 1-hour webinar.

Because half of the trainees for whom registration data are available were clinicians/care providers, it is heartening that this group had the highest proportion of reported actual change. Although almost half of those trained work in a state or local public health department, that employment setting did not rank in the top 3 for actual change. This could be because public health departments already use the latest guidelines in their practice or because they experience more barriers to making a change.

It is intuitive, perhaps, that those attending a longer training would report more actual change 90 days after the training. This finding presents a question of resource allocation for the NNPTC, which has limited funding and differential allocation of resources to regions depending on the epidemiology of STDs in PTCs’ assigned states. Moreover, the provision of STD care varies by state, as some states have expanded Medicaid under the Affordable Care Act, some provide more care through public health clinics, whereas others have more robust health care organizations that serve people presenting with STDs.

Given limited resources, clinical training organizations might focus their activities more narrowly on clinical functional roles and set eligibility requirements for registration rather than training all who wish to attend. The finding that lack of opportunity to use information from the training was the most frequent barrier to practice change might reflect a need to better target invitations to training, or it might reflect workplace circumstances that limit adoption of clinical practices. Training organizations might also consider the length and modality of the courses they offer in light of the predictors of change. Shorter courses, available as webinars, represent a smaller investment of training resources and a larger reach than longer courses. The PTCs were able to reach twice the number of trainees in less intense courses, such as 1-hour grand rounds and webinars, but those attending these shorter courses reported less practice change.

In addition, some trainees identified lack of time with patients and lack of equipment and supplies as barriers to applying what they learned in the trainings. To enhance retention of the skills and knowledge gained, work sites should try to ensure that trainees have the opportunity to practice the skills and use new knowledge in their clinical practice. Training organizations can aid in this by using action planning21 with trainees to help them plan how they will incorporate the skills into daily practice. It is notable that very few respondents reported resistance by supervisors as a reason they did not make a change. This might be because respondents had supervisors who were supportive of change; even so, a change might still have been blocked by people above the supervisor or by structural barriers. If supervisors are supportive of implementing new practices, trainees and training organizations should consider leveraging this support to ensure adoption of new skills and knowledge. Training organizations might consider directing more resources at supervisors to aid in making practice changes with all staff. In addition, training organizations and subject matter experts should consider offering technical assistance and capacity building to organizations to further aid adoption of new practices.

These results focus on the relationship between intention to change and self-reported change. Intention to change does not necessarily mean that a change was made. The literature details the intention-behavior gap and presents numerous reasons why an intention to make a change does not always elicit a behavioral change.22 A meta-analysis by Webb and Sheeran23 found that a medium-to-large change in intention yielded only a small-to-medium change in behavior. Grol and Grimshaw24 discussed the need for comprehensive approaches to implementing clinical changes that address the provider, the clinical team, the health care organization, and the larger environmental system. Our findings identified predictors of reporting an actual change in practice after 90 days. Future research should examine how a comprehensive training approach involving team trainings and organizational capacity building could increase the proportion of trainees who are able to make a change in their practice, and should evaluate change at the organization level.

Because data are self-reported, respondents may overreport positive results, introducing a social desirability bias that may affect this study’s results. The relatively low response rate to the 90-day postcourse evaluation (29%) is also a limitation, because people who made a change might be more likely to fill out the survey. Respondents who complete the 90-day postcourse survey might also be more intrinsically motivated to complete tasks in general, which might itself be related to making a change in practice after a training. The inclusion criteria also removed some of the starting sample. To determine whether those excluded differed, we calculated the percentages of intention to change and actual change for everyone not included in the regression. Of trainees not included in the regression who responded to the postcourse evaluation, 61.8% reported an intention to change; of those who responded to the 90-day postcourse evaluation, 55.4% reported an actual change. These are differences of approximately 4 and 7 percentage points, respectively, from the regression sample, with those excluded less likely to report an intention to change or an actual change. This indicates some selection bias among those who had enough data to be included in the regression analysis. The NNPTC LMS error described in the Results is a further limitation. Although the trainees included in this analysis still responded to the surveys, some answered later than other respondents, giving them more time to make a change in their practice or to forget course content. The delay in requesting 742 immediate postcourse and 714 ninety-day postcourse evaluations may also have decreased trainees’ motivation to respond or to complete the evaluation.

Most importantly, this first national evaluation of the NNPTC focused on intention to change and self-reported change at the individual level only. The evaluation plan did not include an assessment of change in practice at an organization level.

The NNPTC continues to be a national resource for health care workers to increase their knowledge while learning and practicing skills to improve STD care and address the rising rates of STDs around the country. This evaluation provides further evidence of the effectiveness of training in eliciting change in trainees’ clinical practice, along with suggestions for targeting limited training resources.

REFERENCES

1. Braxton J, Davis DW, Emerson B, et al. Sexually Transmitted Disease Surveillance 2017. Atlanta, GA: CDC; 2018. doi:10.15620/cdc.59237.
2. Eng TR, Butler WT. The Hidden Epidemic: Confronting Sexually Transmitted Diseases. Washington, DC: National Academies Press, 1997.
3. Malhotra S, Khurshid A, Hendricks KA, et al. Medical school sexual health curriculum and training in the United States. J Natl Med Assoc 2008; 100:1097–1106.
4. Parish SJ, Clayton AH. Continuing medical education: Sexual medicine education: Review and commentary (CME). J Sex Med 2007; 4:259–268.
5. Dreisbach S, Devine S, Fitch J, et al. Can experiential-didactic training improve clinical STD practices? Sex Transm Dis 2011; 38:516–521.
6. Judson FN, Boyd WA. The Denver sexually transmitted diseases prevention/training center: A two-year performance evaluation. Sex Transm Dis 1982; 9:183–187.
7. Wangu Z, Gray B, Dyer J, et al. The value of experiential sexually transmitted disease clinical training in the digital age. Sex Transm Dis 2016; 43:134–136.
8. Dreisbach S, Burnside H, Hsu K, et al. Improving HIV/STD prevention in the care of persons living with HIV through a national training program. AIDS Patient Care STDS 2014; 28:15–21.
9. Sniehotta FF. Towards a theory of intentional behaviour change: Plans, planning, and self-regulation. Br J Health Psychol 2009; 14:261–273.
10. Perkins MB, Jensen PS, Jaccard J, et al. Applying theory-driven approaches to understanding and modifying clinicians' behavior: What do we know? Psychiatr Serv 2007; 58:342–348.
11. Grimshaw JM, Eccles MP, Walker AE, et al. Changing physicians' behavior: What works and thoughts on getting more things to work. J Contin Educ Health Prof 2002; 22:237–243.
12. Eccles MP, Hrisos S, Francis J, et al. Do self-reported intentions predict clinicians' behaviour: A systematic review. Implement Sci 2006; 1:28.
13. Godin G, Bélanger-Gravel A, Eccles M, et al. Healthcare professionals' intentions and behaviours: A systematic review of studies based on social cognitive theories. Implement Sci 2008; 3:36.
14. Kang H. The prevention and handling of the missing data. Korean J Anesthesiol 2013; 64:402–406.
15. StataCorp. Stata Statistical Software: Release 14. College Station, TX: StataCorp LP; 2015.
16. Sheehan KB. E-mail survey response rates: A review. J Comput Commun 2001; 6:1–19.
17. Cunningham CT, Quan H, Hemmelgarn B, et al. Exploring physician specialist response rates to web-based surveys. BMC Med Res Methodol 2015; 15:32.
18. Kellerman SE, Herold J. Physician response to surveys: A review of the literature. Am J Prev Med 2001; 20:61–67.
19. Boehler M, Schechtman B, Rivero R, et al. Developing the HIV workforce: The MATEC clinician scholars program. J Assoc Nurses AIDS Care 2016; 27:246–260.
20. Lalonde B, Uldall KK, Huba GJ, et al. Impact of HIV/AIDS education on health care provider practice: Results from nine grantees of the special projects of National Significance Program. Eval Health Prof 2002; 25:302–320.
21. KU Center for Community Health and Development. Section 5: Developing an Action Plan. Available at: https://ctb.ku.edu/en/table-of-contents/structure/strategic-planning/develop-action-plans/main. Accessed November 1, 2019.
22. Sniehotta FF, Scholz U, Schwarzer R. Bridging the intention-behaviour gap: Planning, self-efficacy, and action control in the adoption and maintenance of physical exercise. Psychol Health 2005; 20:143–160.
23. Webb TL, Sheeran P. Does changing behavioral intentions engender behavior change? A meta-analysis of the experimental evidence. Psychol Bull 2006; 132:249.
24. Grol R, Grimshaw J. From best evidence to best practice: Effective implementation of change in patients' care. Lancet 2003; 362:1225–1230.
Copyright © 2020 The Author(s). Published by Wolters Kluwer Health, Inc. on behalf of the American Sexually Transmitted Diseases Association.