To describe how the authors developed an objective structured clinical examination (OSCE) station to assess aspects of collaborative practice competency, and how they then assessed validity using Kane’s framework.
After piloting the collaborative practice OSCE station in 2015 and 2016, the authors introduced it at the Cumming School of Medicine in 2017. One hundred fifty-five students from the class of 2017 and 22 students from the class of 2018 participated. To create a validity argument, the authors used Kane's framework, which views the argument for validity as 4 sequential inferences: the validity of scoring, generalization, extrapolation, and implications.
Scoring validity is supported by psychometric analysis of checklist items and by the finding that the contribution of rater specificity to students' ratings was similar to that of OSCE stations assessing clinical skills alone. The claim of validity of generalization is backed by structural equation modeling and confirmatory factor analysis, which identified 5 latent variables, including 3 related to collaborative practice ("provides an effective handover," "provides mutual support," and "shares their mental model"). Validity of extrapolation is argued based on the correlation between the rating for "shares their mental model" and the rating on in-training evaluations for "relationship with other members of the health care team," as well as the association between performance on the collaborative practice OSCE station and subsequent ratings of performance during residency. Finally, validity of implications is supported by the finding that pass/fail decisions on the collaborative practice station were similar to those on other stations and by the observation that ratings on different aspects of collaborative practice were associated with pass/fail decisions.
Based on the validity argument presented, the authors posit that this tool can be used to assess both the collaborative practice competence of graduating medical students and the adequacy of their training in collaborative practice.