Testing the Impact of an Asynchronous Online Training Program With Repeated Feedback

Woda, Aimee PhD, RN; Bradley, Cynthia Sherraden PhD, RN; Johnson, Brandon Kyle PhD, RN; Hansen, Jamie PhD, RN; Loomis, Ann PhD, RN; Pena, Sylvia PhD, RN; Singh, Maharaj PhD; Dreifuerst, Kristina Thomas PhD, RN

Author Information: Associate Professor (Dr Woda), Assistant Professor (Dr Pena), Research Associate Professor (Dr Singh), and Professor and Director, PhD Program (Dr Dreifuerst), College of Nursing, Marquette University, Milwaukee, Wisconsin; Assistant Professor and Director of Simulation (Dr Bradley), School of Nursing, University of Minnesota, Minneapolis; Associate Professor and Associate Dean for Simulation (Dr Johnson), Texas Tech University Health Sciences Center, Lubbock; Clinical Professor (Dr Hansen), Carroll University, Waukesha, Wisconsin; and Clinical Assistant Professor (Dr Loomis), School of Nursing, Purdue University, West Lafayette, Indiana.

Correspondence: Dr Woda, College of Nursing, Marquette University, Milwaukee, WI 53233 ([email protected]).

The authors declare no conflicts of interest.

Accepted for publication: March 2, 2023
Early Access: March 30, 2023

Cite this article as: Woda A, Bradley CS, Johnson BK, et al. Testing the impact of an asynchronous online training program with repeated feedback. Nurse Educ. 2023;48(5):254-259. doi:10.1097/NNE.0000000000001405

Nurse Educator 48(5):254-259, September/October 2023. DOI: 10.1097/NNE.0000000000001405

Abstract

Background: Learning to debrief effectively with student learners can be challenging. Currently, there is little evidence to support the best way to train and evaluate a debriefer's competence with a particular debriefing method.

Purpose: The purpose of this study was to develop and test an asynchronous online distributed modular training program with repeated doses of formative feedback to teach debriefers how to implement Debriefing for Meaningful Learning (DML).
Methods: Following completion of the asynchronous distributed modular training program, debriefers self-evaluated their debriefing and submitted a recorded debriefing for expert evaluation and feedback using the DML Evaluation Scale (DMLES).

Results: Most debriefers were competent in DML debriefing after completing the modular training at time A, with DMLES scores increasing with each debriefing submission.

Conclusion: The results of this study support the use of an asynchronous distributed modular training program for teaching debriefers how to implement DML.

© 2023 Wolters Kluwer Health, Inc. All rights reserved.