Translating Theory Into Practice: Implementing a Program of Assessment

Hauer, Karen E., MD, PhD; O’Sullivan, Patricia S., EdD; Fitzhenry, Kristen, EdM; Boscardin, Christy, PhD

doi: 10.1097/ACM.0000000000001995
Innovation Reports

Problem A program of assessment addresses challenges in learner assessment using a centrally planned, coordinated approach that emphasizes assessment for learning. This report describes the steps taken to implement a program of assessment framework within a medical school.

Approach A literature review on best practices in assessment highlighted six principles that guided implementation of the program of assessment in 2016–2017: (1) a centrally coordinated plan for assessment aligns with and supports a curricular vision; (2) multiple assessment tools used longitudinally generate multiple data points; (3) learners require ready access to information-rich feedback to promote reflection and informed self-assessment; (4) mentoring is essential to facilitate effective data use for reflection and learning planning; (5) the program of assessment fosters self-regulated learning behaviors; and (6) expert groups make summative decisions about grades and readiness for advancement. Implementation incorporated stakeholder engagement, use of multiple assessment tools, design of a coaching program, and creation of a learner performance dashboard.

Outcomes The assessment team monitors adherence to principles defining the program of assessment and gathers and responds to regular feedback from key stakeholders, including faculty, staff, and students.

Next Steps Next steps include systematically collecting evidence for the validity of individual assessments and the program overall. Iterative review of student performance data informs curricular improvements. The program of assessment also highlights technology needs that will be addressed with information technology experts. Ultimately, the outcome will be evidence of validity demonstrating that the program produces physicians who engage in lifelong learning and provide high-quality patient care.

K.E. Hauer is professor, Department of Medicine, University of California, San Francisco, San Francisco, California; ORCID: http://orcid.org/0000-0002-8812-4045.

P.S. O’Sullivan is professor, Department of Medicine, University of California, San Francisco, San Francisco, California; ORCID: http://orcid.org/0000-0002-8706-4095.

K. Fitzhenry is manager of student assessment, School of Medicine, University of California, San Francisco, San Francisco, California.

C. Boscardin is associate professor, Department of Medicine, University of California, San Francisco, San Francisco, California; ORCID: http://orcid.org/0000-0002-9070-8859.

Funding/Support: None reported.

Other disclosures: None reported.

Ethical approval: Reported as not applicable.

Correspondence should be addressed to Karen E. Hauer, University of California, San Francisco, 533 Parnassus Ave., U80, Box 0710, San Francisco, CA 94143; telephone: (415) 502-5475; e-mail: karen.hauer@ucsf.edu.

Problem

A program of assessment can foster development of lifelong learning skills and adaptive expertise essential for physician practice.1 As a centrally planned, coordinated approach to assessment, a program of assessment entails collecting multiple pieces of information to generate holistic views of learners and their progression. van der Vleuten et al1 describe how a program of assessment shifts priority from assessment of learning to assessment for learning through frequent formative assessments, yielding rich feedback to promote further learning. Periodic high-stakes assessments identify readiness for advancement. Despite the conceptual benefits of a program of assessment, published literature contains few examples of systematic implementation. This report describes the specific steps we took to apply a program of assessment framework at the University of California, San Francisco, School of Medicine.

Guiding principles

As our medical school undertook major curriculum redesign,2 we identified an opportunity to incorporate a program of assessment and to demonstrate how this compelling approach can be implemented. We aimed to design a program of assessment that supports each student’s progress, emphasizing informed self-assessment, formative feedback, and individual progress review. Recognizing the complexity inherent in applying a program of assessment across a curriculum,1,3 three authors (K.E.H., P.S.O’S., C.B.) iteratively reviewed the literature and synthesized key findings into six guiding principles, reported in Table 1, to translate theory into practice. We mapped the key literature findings to our plans to ensure that our six principles captured both the literature and the corresponding features of our implementation plan. We illustrate below and in Figure 1 how the principles facilitated implementation of the program of assessment.

Table 1

Figure 1

Challenges and opportunities related to assessment

Examination of international and national contexts, our literature review, and local experience highlighted common, long-standing challenges related to assessment within the dominant medical education culture, which treats assessment as synonymous with accreditation demands and high-stakes decision making.4 Emphasis on high-stakes assessment limits opportunities for students to receive timely feedback, monitor their learning, and make improvements. Intense focus on licensing examination scores prioritizes medical knowledge over other competencies. Clinical education suffers from inadequate direct observation and formative feedback and limited student–supervisor continuity. Assessment information about learners is generated primarily within individual courses or clerkships rather than collected centrally and systematically to characterize learning trajectories. We must shift from the classic dichotomous distinction between formative and summative assessment to a model of ongoing assessments that provide varying types of performance information along a continuum of stakes or consequences.4,5

Approach

Intervention: Program of assessment principles

Based on literature about best practices in assessment,1 we derived six principles to guide design and implementation of a program of assessment, described below and in Table 1. We describe our approach in the context of our new Bridges curriculum, a three-phase integrated curriculum launched in 2016–2017 for our first-year class of 152 students (Foundations 1: pre-core clerkship; Foundations 2: core clerkships; Career Launch: advanced clerkships, scholarly project). Foundational sciences, clinical and health systems skills, and inquiry skills (learning through discovery, using evidence) progress and are integrated throughout all curricular phases. We present each principle and how we enacted it.

Principle 1: A centrally coordinated plan for assessment aligns with and supports a curricular vision

To achieve central coordination and standardization of assessment, we developed assessment guidelines for each of the three curricular phases. As course leadership designed learning activities, the assessment team co-designed assessment activities to meet curricular objectives. We met with each course team to plan frequent formative assessments and align high-stakes assessments with curricular content.

Our implementation process included communicating the rationale and procedures of new assessments to ensure faculty buy-in and learner engagement. Faculty champions requiring a high-level understanding of the program of assessment included curriculum redesign leaders, course and clerkship directors, and coaches (clinicians who guide students’ patient care and systems skills and provide mentoring). A director of faculty development collaborated on the design of in-person, video, and written materials to support faculty understanding of the assessment program and its procedures. Frontline faculty received essential information in abbreviated written and video formats. Students received in-class and written communications distinguishing the program of assessment, which emphasizes long-term retention and growth, from traditional approaches that prioritize short-term, compartmentalized memorization. Students and faculty were oriented to expectations for integrated assessments that focus on application rather than recall of information, through formats such as weekly practice essay questions and summative assessments using open-ended questions.

Principle 2: Multiple assessment tools used longitudinally generate multiple data points

Longitudinal assessments generate multiple data points that enable progressive monitoring of competence development. Table 2 shows example assessments.

Table 2

To generate rich performance data, we needed assessment tools that charted progress longitudinally. The assessment team reviewed existing tools for alignment with curricular milestones and co-created new tools with subject matter experts where needed. Assessment tools include both new applications of existing tools (e.g., in Foundations 1, progress testing in advance of licensing exam preparation) and locally created tools based on the school’s milestones (e.g., in Foundations 1, a checklist with developmental descriptive anchors for assessing inquiry behaviors). Course/clerkship leaders implement each assessment activity with guidance from the assessment team.
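
To make the idea of longitudinal data points concrete, here is a minimal sketch of how such records might be structured; the class names, fields, and competency labels are illustrative assumptions, not the school's actual data model.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical record of a single assessment data point, tagged by
# competency so points can be aggregated longitudinally across tools.
@dataclass
class AssessmentPoint:
    student_id: str
    competency: str   # e.g., "medical knowledge", "inquiry" (illustrative labels)
    tool: str         # e.g., "progress test", "inquiry checklist"
    score: float      # normalized 0-1 against milestone expectations (assumption)
    observed_on: date

@dataclass
class LongitudinalRecord:
    student_id: str
    points: list[AssessmentPoint] = field(default_factory=list)

    def trajectory(self, competency: str) -> list[tuple[date, float]]:
        """Chronological scores for one competency, for progress monitoring."""
        matching = [p for p in self.points if p.competency == competency]
        return sorted((p.observed_on, p.score) for p in matching)
```

A structure like this would let each course or clerkship contribute data points from its own tools while the central team views one trajectory per competency.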

Principle 3: Learners require ready access to information-rich feedback to promote reflection and informed self-assessment

Assessment for learning must generate frequent, low-stakes feedback that allows learners to gauge their progress toward milestones along their expected developmental trajectory. By reflecting on personal performance data, learners can set individualized learning goals.

We created a new student performance dashboard with an integrated view of individual assessment data. Performance data are synthesized by competency across multiple assessment activities, with opportunities to drill down for detailed views of individual and class-average data. Color coding indicates performance at expectations (green), of concern (yellow), or needing immediate intervention (red). The dashboard also serves as a repository for performance reports from single assessment activities. Learning analytics in the form of aggregate class data provide information about students’ use of curricular and formative assessment resources, and enable prediction of student performance on future assessments to identify early on those students who need extra support.
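
As an illustration of the dashboard's traffic-light logic, the sketch below maps synthesized competency scores to the three colors; the cut points are invented for illustration, as the report does not specify them.

```python
# Hypothetical traffic-light coding for the dashboard. The 0.70 and 0.55
# thresholds are illustrative only: the report does not publish cut points.
GREEN_CUTOFF = 0.70   # at or above milestone expectations
YELLOW_CUTOFF = 0.55  # of concern; flag for coach review

def dashboard_color(scores: list[float]) -> str:
    """Synthesize multiple data points for one competency into a color code."""
    mean_score = sum(scores) / len(scores)
    if mean_score >= GREEN_CUTOFF:
        return "green"   # performing at expectations
    if mean_score >= YELLOW_CUTOFF:
        return "yellow"  # of concern
    return "red"         # needs immediate intervention
```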

Principle 4: Mentoring is essential to facilitate effective data use for reflection and learning planning

The skills of data interpretation, reflection, and generation of learning plans are unfamiliar to many students and require careful guidance. To support students in these skills, we created a new faculty coach role, separate from the assessor role.

We recruited and funded coaches, each supported at 0.20 full-time equivalent, to mentor and guide 12 students (6 first-year and 6 third-year students) longitudinally throughout the four-year curriculum. Coaches also teach foundational clinical skills to their first-year cohorts and provide ongoing feedback in a safe learning environment. Coaches regularly review student progress and meet individually with each student, four times in Foundations 1 and then twice yearly, to review performance data holistically and guide the student in developing learning goals. Coaches elicit students’ insights and concerns and promote students’ ability to build on strengths. Coaches received training in effective communication, including inquiring, listening, and supporting, based on the American Academy on Communication in Healthcare curriculum (http://www.aachonline.org/). Coaches receive ongoing professional development via in-person meetings, workshops, and an online coach handbook.

Principle 5: The program of assessment fosters self-regulated learning behaviors

Self-regulated learners strategically establish learning goals, monitor progress, and make adjustments to achieve those goals.6 Incorporating structured self-regulated learning activities into the curriculum, with guidance on goal setting and review of progress toward learning goals, builds the attention to self-improvement and the metacognitive skills needed to manage one’s learning.

During dedicated, quarterly ARCH (Assessment, Reflection, Coaching, Health) Weeks, students and coaches meet individually to review and share feedback on students’ learning goals. Each student drafts SMART (specific, measurable, attainable, relevant, time-bound) goals for discussion with the coach. Students and coaches review prior learning goals, discuss progress, and revise goals as needed during subsequent ARCH Weeks. Coaches refer students, when appropriate, to additional learning and well-being resources to optimize performance and experience.

Principle 6: Expert groups make summative decisions about grades and readiness for advancement

High-stakes decisions, including assignment of course and clerkship grades, achievement of adequate progress, and readiness to advance, should be made by groups of trained, experienced committee members who review accumulated data. To ensure trustworthy decision making, no high-stakes decisions are rendered based on a single data point or by a single individual.

Following guidelines based on literature on group decision making,7 grading committees review performance data after each course or clerkship. In Foundations 1, directors within an integrated course (i.e., foundational science, clinical/health systems skills, inquiry) together determine satisfactory achievement of course expectations across competencies and assign course grades (pass/fail). In Foundations 2, grading committees comprising clerkship and site directors and other experienced educators review numerical and narrative data against criteria and assign grades (honors/pass/fail). An academic progress committee meets yearly throughout the curriculum to review all students’ longitudinal progress and identify students with performance below benchmarks. Course directors and subject matter experts aid coaches in designing plans for students needing extra support or remediation. Coaches do not participate in high-stakes assessment or grading decisions for their own students.

Outcomes

We are mindful of lessons from implementation science8 as we monitor our adherence to the principles defining the program of assessment. Our implementation process includes gathering and responding to regular feedback from key stakeholders through meetings with course leadership, including planning meetings before each course and exam debrief meetings after each course. Student feedback sessions and written student feedback prompt adjustments. One example is allowing limited learning resources (diagrams of complex metabolic processes, risk calculators) during high-stakes examinations to emphasize application of knowledge rather than memorization of material that clinicians typically look up. For students unfamiliar with open-ended question exams, we have offered learning resources to build this skill.

Attention to the intervention context includes collaborating with stakeholders (course directors, teachers, students) on implementation and modifications. All stakeholders, including the many educators at clinical sites distant from core teaching hospitals, require continued education to counteract the temptation to view curricular components and assessments as separate and decontextualized. Dedicated administrative support is essential for successful implementation of assessment activities; the school shifted the reporting structure for course leadership and administrative staff from departments to the school to enhance buy-in and standardization.

We communicate the value and benefits of the program of assessment. Students initially questioned the usefulness of the dashboard beyond serving as a grade repository. Therefore, we continually reinforce the importance of monitoring progress over time, contextualized around milestones, and of using assessment data to inform next steps in learning. Ongoing reinforcement with coaches and other educators emphasizes the importance of assessment for learning and of recognizing and incorporating feedback. Through active participation in assessment design and standard-setting meetings, faculty members come to appreciate the need for application and integration of knowledge.

The implementation process has included planning, engaging, executing, and reflecting/evaluating. Development of new assessment tools was guided by Kane’s9 validity framework; we pilot tools before implementation and are now systematically collecting evidence for validity. The program of assessment prompts identification of students needing remediation. Strategies we have implemented include tracking performance in foundational science subjects across examinations, and requiring students with longitudinal performance below expectations to meet with subject experts for individual learning planning. The iterative process of reviewing student performance data enables curricular improvements based on collective evidence about students’ learning.
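
A flagging rule of the kind described above might be sketched as follows; the benchmark and the two-consecutive-exam trigger are assumptions for illustration, not the school's actual criteria.

```python
# Hypothetical flagging rule: a student whose subject scores fall below a
# benchmark on consecutive examinations is referred to a subject expert
# for individual learning planning.
BENCHMARK = 0.60          # illustrative benchmark, not the school's cut score
CONSECUTIVE_TRIGGER = 2   # illustrative: two exams in a row below benchmark

def needs_learning_plan(exam_scores: list[float]) -> bool:
    """Flag longitudinal performance below expectations across examinations."""
    streak = 0
    for score in exam_scores:  # scores in chronological order
        streak = streak + 1 if score < BENCHMARK else 0
        if streak >= CONSECUTIVE_TRIGGER:
            return True
    return False
```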

Next Steps

Commitment to building the technical infrastructure and partnership with information technology experts are essential. We recognize the need for two additional dashboards to display program-level data: (1) an administrative dashboard to monitor completion of required assessment activities and flag students who request or require additional help, and (2) a group progress dashboard that synthesizes information for grading committees and academic progress committees.
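
For the planned administrative dashboard, a minimal completion check might look like the following; the activity names and flagging logic are hypothetical.

```python
# Hypothetical completion check for the planned administrative dashboard:
# surface required assessment activities a student has not yet completed.
def incomplete_activities(required: set[str], completed: set[str]) -> set[str]:
    """Return required assessment activities still outstanding for a student."""
    return required - completed

# Example: flag a student for follow-up by the assessment team.
missing = incomplete_activities(
    required={"progress test 1", "inquiry checklist", "clinical skills exam"},
    completed={"progress test 1"},
)
if missing:
    print(f"Flag for follow-up; outstanding: {sorted(missing)}")
```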

Implementing the program of assessment has enhanced alignment among all elements of the curriculum. Changes in institutional culture around assessment are emerging, with students demonstrating receptivity to feedback and willingness to work with coaches on self-improvement. We continue to monitor whether we successfully identify students progressing as expected, seeking more individualized opportunities, or needing additional resources. Ultimately, the outcome of the program of assessment will be demonstrated with evidence of validity showing that the program produces physicians who engage in lifelong learning and provide high-quality patient care.

Acknowledgments: The authors wish to thank Victoria Ruddick for help with the figure and the Educational Scholarship Conference (ESCape) works in progress group for critical feedback.

References

1. van der Vleuten CP, Schuwirth LW, Driessen EW, et al. A model for programmatic assessment fit for purpose. Med Teach. 2012;34:205–214.
2. Lucey CR. Medical education: Part of the problem and part of the solution. JAMA Intern Med. 2013;173:1639–1643.
3. Konopasek L, Norcini J, Krupat E. Focusing on the formative: Building an assessment system aimed at student growth and development. Acad Med. 2016;91:1492–1497.
4. Lau AMS. “Formative good, summative bad?”—A review of the dichotomy in assessment literature. J Furth High Educ. 2016;40(4):509–525.
5. Norcini J, Anderson B, Bollela V, et al. Criteria for good assessment: Consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach. 2011;33(3):206–214.
6. Sandars J, Cleary TJ. Self-regulation theory: Applications to medical education: AMEE guide no. 58. Med Teach. 2011;33:875–886.
7. Hauer KE, Cate OT, Boscardin CK, et al. Ensuring resident competence: A narrative review of the literature on group decision making to inform the work of clinical competency committees. J Grad Med Educ. 2016;8:156–164.
8. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.
9. Kane MT. Current concerns in validity theory. J Educ Meas. 2001;38(4):319–342.

Reference cited in Table 1 only:
10. Hauer KE, Boscardin C, Fulton TB, Lucey C, Oza S, Teherani A. Using a curricular vision to define entrustable professional activities for medical student assessment. J Gen Intern Med. 2015;30:1344–1348.
© 2018 by the Association of American Medical Colleges