Using assessment to facilitate learning is a well-established priority in education but has shown variable effectiveness in continuing professional development. The factors that modulate the impact of testing in practitioners remain unclear. We aimed to improve the capacity to support maintenance of competence by exploring variables that influence the value of web-based pretesting.
Family physicians belonging to a practice-based learning program studied two educational modules, either independently or in small groups. Before the learning sessions they completed a needs assessment and were assigned either to complete a pretest or to read a relevant review article. After the learning session, they completed an outcome test, indicated plans to change practice, and subsequently documented the changes they made.
One hundred twelve physicians completed the study, 92 of them in small groups. The average interval between tests was 6.3 weeks. Relative to those given a review article, physicians given a pretest: (1) reported spending less time completing the assigned task (16.7 versus 25.7 minutes); (2) performed better on outcome test questions repeated from the pretest (65.9% versus 58.7%); and (3) when the learning module was completed independently, reported making a greater proportion of the practice changes to which they had committed (80.0% versus 45.0%). Knowledge gain was unrelated to physicians' stated needs.
Low-stakes formative quizzes, delivered with feedback, can influence how much material practicing physicians remember from an educational intervention, independent of their perceived need for continuing professional development on the particular topic.