Assessor training is essential for defensible assessments of physician performance, yet research on the effectiveness of training programs for promoting assessor consistency has produced mixed results. This study explored assessors’ perceptions of the influence of training and assessment tools on their conduct of workplace-based assessments of physicians.
In 2017, the authors used a constructivist grounded theory approach to interview 13 physician assessors about their perceptions of the effects of training and tool development on their conduct of assessments.
Participants reported that training led them to realize the potential for variability in assessors’ judgments, prompting them to change their scoring and feedback behaviors to enhance consistency. However, many participants noted they had not substantially changed their numerical scoring. Nonetheless, most thought training would lead to increased standardization and consistency among assessors, highlighting a “standardization paradox” in which participants perceived a programmatic shift toward standardization but minimal changes in their own ratings. Participants also described an “engagement effect”: those involved in both tool development and training reported more substantial learning than those involved only in training.
Findings suggest that training may help assessors recognize their own subjectivity when judging performance, which may prompt behaviors that support rigorous and consistent scoring but may not lead to perceptible changes in assessors’ numerical ratings. Results also suggest that participating in tool development may help assessors align their judgments with the scoring criteria. Overall, these findings support the continued study of assessor training programs as a means of enhancing assessor consistency.
K. Hodwitz is research associate, College of Physicians and Surgeons of Ontario, Toronto, Ontario, Canada.
A. Kuper is associate professor and faculty co-lead, Person-Centred Care Education, Department of Medicine, scientist and associate director, Wilson Centre for Research in Education, University Health Network, University of Toronto, and staff physician, Division of General Internal Medicine, Sunnybrook Health Sciences Centre, Toronto, Ontario, Canada.
R. Brydges is research director and scientist and holds the professorship in Technology Enabled Education at the Allan Waters Family Simulation Centre, St. Michael’s Hospital, and is associate professor, Department of Medicine and Wilson Centre for Research in Education, University of Toronto, Toronto, Ontario, Canada.
Supplemental digital content for this article is available at http://links.lww.com/ACADMED/A733.
Funding/Support: None reported.
Other disclosures: None reported.
Ethical approval: The University of Toronto’s research ethics review board granted ethical approval.
Previous presentations: This work was originally published, in a different form, as a master of science thesis at the University of Toronto, completed in April 2018. This work was also presented at the University of Toronto’s Department of Medicine Annual Day in June 2018.
Correspondence should be addressed to Kathryn Hodwitz, Research and Evaluation Department, College of Physicians and Surgeons of Ontario, 80 College St., Toronto, Ontario, Canada M5G 2E2; telephone: (416) 967-2600, ext. 522; email: firstname.lastname@example.org.