Coordinating and operationalizing assessment systems that measure fine-grained progression of residents across stages of graduate medical training can be challenging. This article describes the development, administration, and psychometric analysis of a learner analytics system, the Scoring Grid Model, designed to address challenges in implementing milestones and operationalized in an internal medicine (IM) residency program.
A 3-year longitudinal cohort of 34 residents at the University of Illinois at Chicago College of Medicine used this learner analytics system from entry (July 2013) to graduation (June 2016). Scores from 23 assessments administered across the 3 years of training were synthesized by the Scoring Grid Model to generate scores corresponding to the 22 reportable IM subcompetencies. A consensus process, incorporating feedback from IM faculty members and residents, was used to develop and pilot test the model. Scores from the scoring grid informed promotion decisions and the reporting of milestone levels. Descriptive statistics and mixed-effects regression were used to examine data trends and gather validity evidence.
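The synthesis step can be pictured as a weighted mapping from assessment scores to subcompetency composites. The sketch below is purely illustrative: the assessment names, subcompetency labels, and weights are hypothetical assumptions, not the program's actual grid, and the model described in the article may combine scores differently.

```python
# Hypothetical sketch of a scoring-grid synthesis: a weight table maps each
# subcompetency to the assessments that inform it. All names and weights are
# illustrative, not the actual IM program's grid.

def synthesize(assessment_scores, grid):
    """Compute a weighted-average composite score for each subcompetency.

    assessment_scores: dict mapping assessment name -> score
    grid: dict mapping subcompetency -> {assessment name: weight}
    """
    composites = {}
    for subcompetency, weights in grid.items():
        total_weight = sum(weights.values())
        composites[subcompetency] = sum(
            weights[a] * assessment_scores[a] for a in weights
        ) / total_weight
    return composites

# Illustrative grid: two subcompetencies fed by two assessments.
grid = {
    "PC1": {"ITE": 0.4, "MiniCEX": 0.6},
    "MK1": {"ITE": 0.7, "MiniCEX": 0.3},
}
scores = {"ITE": 3.0, "MiniCEX": 4.0}
print(synthesize(scores, grid))  # weighted composites (PC1 ≈ 3.6, MK1 ≈ 3.3)
```

Each composite is a weighted average, so it stays on the same scale as its input assessments, which makes it straightforward to map onto reportable milestone levels.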
Initial validity evidence is presented for content, internal structure, and relations to other variables, demonstrating that systematically integrated assessment scores align with the reportable milestones framework; composite score reliability of the scores generated by the learner analytics system is also reported. The scoring grid provided fine-grained learner profiles and showed predictive utility in identifying low-performing residents.
The Scoring Grid Model and its associated learner analytics data platform may offer a practical solution for generating fine-grained, milestones-based profiles that support decisions about resident progress.