Academic Medicine, May 2007 - Volume 82 - Issue 5
doi: 10.1097/ACM.0b013e31803ead30
Educational Strategies

The Portfolio Approach to Competency-Based Assessment at the Cleveland Clinic Lerner College of Medicine

Dannefer, Elaine F. PhD; Henson, Lindsey C. MD, PhD


Author Information

Dr. Dannefer is director of medical education research and assessment, Cleveland Clinic Lerner College of Medicine of Case Western Reserve University, Cleveland, Ohio.

Dr. Henson is professor and chair, Department of Anesthesiology and Perioperative Medicine, University of Louisville School of Medicine, Louisville, Kentucky. During the development and first year of implementation of the curriculum and this assessment system, Dr. Henson was vice dean for education and academic affairs, Case Western Reserve University School of Medicine, and vice dean for education, Cleveland Clinic Lerner College of Medicine of Case Western Reserve University, Cleveland, Ohio.

Correspondence should be addressed to Dr. Dannefer, Director, Medical Education Research and Assessment, Cleveland Clinic Lerner College of Medicine of Case Western Reserve University, Cleveland Clinic/NA24, 9500 Euclid Avenue, Cleveland, OH 44195; telephone: (216) 445-1058; fax: (216) 445-7442; e-mail: dannefe@ccf.org.


Abstract

Despite the rapid expansion of interest in competency-based assessment, few descriptions of assessment systems specifically designed for a competency-based curriculum have been reported. The purpose of this article is to describe the design of a portfolio approach to a comprehensive, competency-based assessment system that is fully integrated with the curriculum to foster an educational environment focused on learning.

The educational design goal of the Cleveland Clinic Lerner College of Medicine of Case Western Reserve University was to create an integrated educational program—curriculum and instructional methods, student assessment processes, and learning environment—to prepare medical students for success in careers as physician investigators. The first class in the five-year program matriculated in 2004. To graduate, a student must demonstrate mastery of nine competencies: research, medical knowledge, communication, professionalism, clinical skills, clinical reasoning, health care systems, personal development, and reflective practice.

The portfolio provides a tool for collecting and managing multiple types of assessment evidence from multiple contexts and sources within the curriculum to document competence and promote reflective practice skills. This article describes how the portfolio was developed to provide both formative and summative assessment of student achievement in relation to the program’s nine competencies.

Medical education assessment practices continue to be shaped by the international competency movement.1–4 At least four themes regarding competency-based assessment are reported in recent literature. First, the competency movement has focused educators’ attention on the need to assess a full range of competencies, concentrating on students’ performance in actual and simulated practice settings where complex tasks require the ability to integrate multiple competencies.4,5 Second, specification of performance criteria directly linked to educational objectives has increased attention to the need for qualitative, formative feedback to help learners improve their performance.6–8 Third, because performance related to competencies is context specific, measurement concerns are focused on sampling from multiple contexts and sources.9,10 Finally, psychometric approaches alone are no longer adequate; new models are needed to triangulate and analyze multiple types and sources of evidence.7,9,10

Few examples of competency-based assessment systems are reported in the literature on undergraduate medical education programs.11–15 In this article we contribute to the literature on competency-based assessment by describing a portfolio assessment system developed as an integral component of the new Cleveland Clinic Lerner College of Medicine (CCLCM) of Case Western Reserve University, a track within the Case Western Reserve University School of Medicine, established in 2004. This educational track is distinct from the existing MD program at the Case School of Medicine and was developed de novo, providing the opportunity to design all aspects of the program to achieve CCLCM’s stated goal: to prepare medical students for successful careers as physician investigators. The program’s competency-based curriculum interweaves three main areas of study—research, sciences basic to medical practice, and clinical experiences. The five-year program provides each student with several mentored research experiences, including a master’s-level thesis, combined with a formal, required curriculum in research skills. The basic science curriculum fosters significant independent study and problem solving and provides opportunities to hone teamwork skills. Clinical skills training and care of adult and pediatric outpatients begin in the first year of study. Year three introduces students to inpatient care in a series of core and advanced clinical blocks supplemented by a longitudinal curriculum in advanced basic science concepts, research, and clinical skills. A detailed description of the curriculum is provided elsewhere.16,17

In this article, we describe the design of CCLCM’s portfolio approach for a comprehensive, competency-based assessment system that is fully integrated with the curriculum. To provide context, we describe the program’s competencies, performance standards, core assessment principles, and unique assessment instrument before addressing the design of the overall assessment system.


Components of the Assessment System

Competencies and performance standards

The nine competencies (List 1) agreed on by the central planning committee for the CCLCM program encompass the six Accreditation Council for Graduate Medical Education core competencies for U.S. residency programs, thus providing a continuum of competency-based education for students that begins in their first year of undergraduate medical education. Professional competence, however, is more than possessing a set of competencies; it requires the ability to integrate competencies and to use judgment in the practice of medicine.10,18 Competency-based assessment has the potential to fragment performance by using methods that target specific components of competencies, thereby neglecting the complexity and integrated nature of practice. Thus, reflective practice is the foundational competency for the CCLCM program, underscoring the critical importance of learning from experience and engaging in conversations about practice to develop professional judgment.19 Five competencies (research, medical knowledge, clinical skills, clinical reasoning, and health care systems) relate to the core curriculum in basic and clinical science knowledge, clinical medicine, and research; two others address the general professional competencies of communication and professionalism. The faculty also recognized that finding a balance between personal and professional life is an ongoing struggle for physicians and thus included the personal development competency.

List 1

Groups of faculty experts identified specific desired outcomes, referred to as “standards,” for each competency.20 A modified Delphi approach was used to engage these groups of experts to define the developmentally appropriate standards for each competency at the ends of year one and year two and at graduation (year five). Because the CCLCM curriculum is individualized during years three through five, students can potentially meet year five standards at variable time points. To illustrate the standards, List 2 presents the communication competency standards. Across all competency areas, the standards directly relate to the curricular experiences of the students, yet they are general enough to warrant assessment throughout the five-year educational program. Faculty members conduct an annual review of the adequacy of curricular experiences and availability of assessment evidence to inform yearly revisions of the standards.

List 2
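
For readers who want the mechanics of the Delphi step made concrete, the sketch below shows one round of a modified Delphi process in Python. It is our illustration only: the consensus threshold, the vote encoding, and the function name are hypothetical assumptions, not details reported by the program.

```python
def delphi_round(ratings: dict[str, list[int]],
                 threshold: float = 0.8) -> tuple[set[str], set[str]]:
    """One modified Delphi round over draft competency standards.

    ratings maps each draft standard to the expert panel's votes
    (1 = developmentally appropriate as written, 0 = needs revision).
    The 0.8 consensus threshold is a hypothetical value.
    """
    kept, recirculate = set(), set()
    for standard, votes in ratings.items():
        if sum(votes) / len(votes) >= threshold:
            kept.add(standard)          # consensus reached; standard adopted
        else:
            recirculate.add(standard)   # revised and re-rated next round
    return kept, recirculate
```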
Assessment principles

The overall goal of the CCLCM assessment system is to help students become reflective practitioners of medicine and science with a drive for lifelong learning complemented by a critical approach to self-assessment and self-improvement. Faculty recognized that a competency-based curriculum designed to foster self-directed learning would not achieve its goal if student assessment focused on what the “teacher” said in class and factual recall. They wanted an assessment process that would reward students for identifying gaps in their abilities and developing effective ways to correct those gaps. Faculty also recognized that developing reflective practice skills would be facilitated by cultivating close advising relationships between students and faculty. On the basis of these tenets, the faculty developed core principles for design of the student assessment process (List 3) after a series of presentations by assessment experts, participation in focus groups, and review of the literature.

List 3

Those involved in the planning process also articulated the practical implications of these principles early on. For example, the principle that “assessment should enhance learning” committed the faculty to using only formative assessments designed to provide feedback for improving performance and documenting progress longitudinally. The principle that “assessment should be progressive and cumulative” meant that no summative midcourse or end-of-course grades would be given. Using competency-based assessment rather than grades for all components of the curriculum, including clinical courses, precluded the common practice in U.S. medical schools of competitive grading in clerkships with a targeted percentage of students receiving an honors grade. Expecting all students to demonstrate achievement of the standards for all nine competencies at graduation meant that faculty efforts should focus on helping students attain the standards, rather than on documenting differences in students’ abilities. Finally, the goal of developing students into reflective practitioners suggested that the system should require students to take an active role in evaluating their own performance (a student-centered rather than a faculty-centered process).

Assessment template

The assessment principles guided the choice of assessment methods. We wished to avoid exclusive reliance on traditional assessment methods, such as multiple-choice question (MCQ) examinations, lab practicals, and short-station objective structured clinical examinations (OSCEs), which fractionate professional competence and focus the students on short-term mastery of specific information or skills.18,21 A major step toward achieving this goal was the development of a unique CCLCM assessment instrument designed to demonstrate that competencies cut across courses, learning experiences within courses, and years in the program, and that it is necessary to integrate multiple competencies in tasks relevant to the practice of medicine. The template for this CCLCM instrument is used in all courses to collect feedback on performance from faculty and peers and for self-assessment. The template is organized by competencies and related performance criteria (behavioral descriptors). The behavioral descriptors are related to the phase of the curriculum and specific context in which the student is being assessed, providing assessors with guidance in determining performance expectations for each student. The instrument offers no numeric ratings or “met/not met” options; instead, it requires narrative feedback that identifies areas needing improvement and reinforces areas of strength. Appendix 1 provides an example of a specific assessment form (based on the template) that is completed by the research preceptor at the end of a student’s summer research experience.
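
To make the template’s organization concrete, the following sketch models a form derived from it. This is our illustration, not CCLCM’s electronic system; the class and field names are hypothetical, and the only constraint taken from the text is that the form accepts narrative comments rather than numeric ratings or met/not-met choices.

```python
from dataclasses import dataclass, field

@dataclass
class CompetencyFeedback:
    # One competency section of a form; names are illustrative.
    competency: str                    # e.g., "communication"
    behavioral_descriptors: list[str]  # criteria for this curricular context
    strengths: str                     # required narrative comment
    needs_improvement: str             # required narrative comment

@dataclass
class AssessmentForm:
    student: str
    assessor: str       # faculty, peer, or self
    context: str        # e.g., "year 1 summer research block"
    sections: list[CompetencyFeedback] = field(default_factory=list)

    def validate(self) -> None:
        # The template deliberately has no numeric ratings or met/not-met
        # checkboxes; narrative comments are the only accepted input.
        for s in self.sections:
            if not s.strengths.strip() or not s.needs_improvement.strip():
                raise ValueError(f"narrative feedback missing for {s.competency}")
```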

Information derived from forms based on this template is supplemented by multiple other assessment methods, including OSCEs, observed histories and physicals, weekly MCQ self-assessments, weekly knowledge application essay questions, and clinical progress tests. Faculty use all assessment methods to provide formative feedback, rather than summative grades, to students. For example, students do not pass or fail their year one OSCE; rather, they receive specific formative feedback on their performance of the clinical skills expected during each OSCE station. Student-generated work from different curricular experiences provides authentic performance evidence, such as grant proposals, PowerPoint presentations, lab notebooks, research theses, concept maps, and patient logs and journals. The only summative examinations CCLCM students are required to pass for graduation are those required for medical licensure (USMLE Steps 1 and 2).


Design of the Portfolio Assessment System

The search for an assessment system consistent with our principles led us to the literature on portfolios for medical student assessment14,15,21–23 and to firsthand investigation of experiences with portfolio assessment within and beyond national, disciplinary, and professional borders. At the time (2002–2003), the portfolio assessment approach was being implemented in the United States in grades K–12 and at the undergraduate college level,24–26 and in the United Kingdom and Europe at a number of medical schools. For example, at the University of Dundee College of Medicine, as a key component of the final examination process, each final-year medical student was required to submit a portfolio documenting his or her progress in achieving the outcomes of the curriculum.14,15 The first-year medical student portfolio at the University of Maastricht Faculty of Medicine included students’ reflective essays on their own development in four identified roles of the doctor as a method of encouraging integration of competencies.22

Portfolios used for assessment can be defined as purposeful collections of evidence used by students to document and reflect on learning outcomes.14,27 The literature suggested that a portfolio approach could be designed to promote reflection on learning, accommodate a wide range of assessments, including authentic performance-based methods, and give students responsibility for integrating and assessing evidence of their own learning. The faculty and oversight committee approved the use of a portfolio assessment process for both formative and summative purposes.

Our portfolio assessment system was designed on the basis of the following considerations derived from review of the literature and from consultations with experts:

1. If reflective practice is a goal, setting aside time and providing mentors is critical to helping students reflect on evidence of their learning and professional development.21,28–30

2. Distinctly separate processes and reviewers for formative portfolio (FP) and summative portfolio (SP) assessments can ensure that confidentiality of reflections of a personal nature is not compromised by the rigor and judgments required for making promotion decisions.22

3. Student responsibility for selecting evidence and analysis of their learning is critical to maintaining student engagement in assessing progress.21

4. Requiring essays aimed at integrating the competencies needed for the practice of medicine, such as asking students to reflect on the various roles of the doctor, can facilitate reflection on learning.22,28

5. Rigorous measurement standards are necessary if portfolios are to be used for summative purposes: fairness (clear instructions, equal assistance, and due process), validity (appropriate standards, evaluators capable of making sound judgments, and quality authentic evidence), and reliability (trained evaluators and adequate curricular experiences providing multiple sources of assessment).25,26,30–35

On the basis of these considerations, we developed a portfolio assessment system (Figure 1) that uses the same evidence database (the broad foundation at the base of the triangle in the illustration), but separate processes, for formative and summative assessments. We carefully designed our approach to achieve fairness, validity, and reliability in both processes. Our approach differs from most other portfolio assessment systems in which the portfolio is used only for formative assessment36,37 or else as a method for making summative judgments about competencies that are difficult to assess by other means.14,15,22,29,30

Figure 1

The advising system

Central to the portfolio assessment system is a robust advising system. At the beginning of medical school, each student is assigned to a physician advisor. Advisors are assigned no more than 10 students and are given protected, nonclinical time to fulfill their advising responsibilities. Descriptions in the literature consistently note that the usefulness of portfolios for developing reflective practice skills is dependent on facilitating a dialogue between students and mentors about the students’ experiences and learning needs.22 Accurate self-assessment, a critical step in reflective practice, is more likely to occur when students work with a mentor to interpret assessment data.38 Development of students’ professional competence depends on their critical reflection on the concept of practice, especially their exploration of areas of uncertainty and reasoning about how to practice that can be enriched by conversations with colleagues.19,22,29

In the CCLCM program, physician advisors partner with students across the five-year curriculum to engage in reflective practice cycles around the FPs (see below). Advisors monitor assessment data, coursework assignments, and unique student-generated work using an electronic portfolio system. Students and advisors communicate informally by e-mail and in one-to-one meetings. We use written guidelines to ensure consistency in monitoring students’ progress and reviewing portfolios, conduct workshops for advisors to set standards for review of FPs, and hold weekly meetings attended by all advisors to create a shared understanding of the advisor role and methods to fulfill advisor responsibilities.


The FP process

Periodically (one to three times a year), students construct an FP for review with their physician advisors (Figure 1). Chart 1 provides a timetable of the FPs in relation to the curriculum; Chart 2 provides a summary of the competency-related evidence available for each of the FPs from multiple sources, contexts, and methods. To ensure that the process is meaningful to the students, the focus of the FP changes depending on the phase of the curriculum and related professional development issues. We chose this approach on the basis of our consultations and focus groups in 2004 with faculty and students at Peninsula Medical School (Universities of Exeter and Plymouth) and the University of Maastricht Faculty of Medicine, where students found that structure was helpful at first but that it became a constraint as they became more comfortable with the process. At CCLCM, the first FP in year one is quite structured and requires students to reflect on the four competencies (research, medical knowledge, communication, and professionalism) most relevant to performance in the year one summer research block and on their development as researchers. In contrast, the first FP in year two asks students to “analyze your personal and professional development as a physician and researcher, discussing your successes and challenges and identifying specific competencies you need to address.” FPs become less frequent as students become more skilled in developing learning plans and identifying professional development issues. A major goal of repetitive cycles of self-assessment and reflection with close mentoring by advisors is to develop the students’ habits of reflective practice.18

Chart 1
Chart 2

We place responsibility on the students for demonstrating achievement of the competencies. For each FP review, students analyze their evidence and write reflective essays to document their mastery of the competencies and their development as doctors and/or researchers. They prepare a structured learning plan with measurable outcomes and include reflections on progress in meeting previous learning goals. The advisors read the FPs, focusing on three specific qualities: (1) whether students have addressed patterns of performance apparent in the evidence presented in the students’ electronic portfolios, (2) the students’ level of reflection, and (3) the students’ learning plans. In reviewing the FPs with the students, advisors serve as coaches, engaging students in dialogue about the evidence and their self-assessments, helping them to reconcile their learning plans with their identified learning goals, and ensuring the goals are achievable. These ongoing cycles of self-assessment and consultation with advisors are designed to ensure that students master areas of relative weakness and develop further in areas of relative strength and interest.

Below, we use a hypothetical example to demonstrate how the FP process requires students to triangulate data from multiple sources for each competency and to identify patterns in their own performance. The example relates to the students’ first FP in year one, which occurs after two months of basic science research. Although students are required to address four competencies in their FP essays, we focus here on one standard for the year one communication competency: “Demonstrates effective written and oral communication in basic science research” (List 2). For evidence of progress in meeting this communication standard, the student has assessments from his or her research preceptor and lab colleagues, assessments from faculty and peers of journal club presentations, and facilitator feedback on performance in bioinformatics small-group sessions. The student’s own work includes a required grant proposal, two PowerPoint presentations, and other unique evidence of his or her choice.

Example.

A review of a student’s evidence from several different assessors identifies a theme of “quietness” noted by faculty and peers, whereas the student’s written communication is noted for its excellence. The student may or may not recognize and write about this pattern in his or her portfolio. If the student recognizes the pattern, his or her level of reflection may be descriptive and the relevance dismissed (“I’ve always been shy”). Alternatively, the student may attribute his or her quietness to cultural background. Depending on the student’s level of reflection, the advisor may need to help the student recognize the pattern as important, analyze the pattern, and/or begin to evaluate what this pattern may mean for functioning on an inpatient team where the life of a patient may depend on the student speaking out. Although this example is about feedback on communicating in the basic science research context, the theme of quietness could potentially be observed in upcoming courses in feedback from problem-based learning faculty and peers and clinical skills preceptors. Thus, the student’s learning plan should address the issue, and in the next FP, the student should provide new evidence to document success in meeting his or her learning goals related to this pattern.

This example illustrates how formative feedback linked to criteria for competency standards, when collected from multiple contexts and sources, can be used to identify patterns of performance. It also emphasizes the key role of the advisor as a coach to the student, rather than as a faculty evaluator of the evidence who tells the student what the evidence shows.

The SP process

The qualities that make portfolios attractive for formative assessment present numerous psychometric challenges when portfolios are used for summative, standards-based decisions about performance. Critical to the summative use of portfolios are trained evaluators capable of making sound judgments.15,22,34

The CCLCM summative process requires students to submit an SP at the end of years one and two and at the beginning and end of year five (Chart 1) to document that they have achieved requirements for promotion or graduation. Detailed instructions for preparing the SP are distributed to students and physician advisors and explained at a workshop. SP structure and format guidelines are rigid (unlike the FP) to ensure ease and fairness in the review process. The instructions emphasize that the student must demonstrate achievement of each standard by writing an essay that cites a representative sample of evidence that is balanced and that draws from different curricular contexts. The reflective practice competency requires students to address variability in performance and to provide evidence that identified learning needs were addressed. Advisors review the SP and certify that the portfolio is the student’s own work and that the selected evidence is representative of the student’s overall evidence. This validation assures the promotions committee that the students accurately represent their achievements in relation to faculty-defined competency standards.

Students submit their SPs to a medical student promotions and review committee (MSPRC) composed of 15 clinical and basic science faculty members who determine students’ eligibility for promotion to the next phase of the curriculum and graduation from medical school. The MSPRC members are seasoned in making expert judgments about performance because most have served as residency training directors or have extensive experience supervising graduate students. A series of mandatory training sessions introduced MSPRC members to the curriculum, related assessments, and the portfolio system and used hypothetical student cases to help members develop a shared understanding of approaches to judging student performance.

The systematic process for the summative review begins with a two-step standard-setting procedure. In preparation for the standard setting, each committee member independently reviews the same sample of eight randomly selected portfolios. This provides them with a perspective on how students at that phase of the curriculum present evidence of meeting the standards and what evidence they choose to cite. During step 1, the committee members reach consensus as to which standards are essential for demonstrating achievement of each competency. Our committee therefore sets a conjunctive standard39 that identifies which standards must be met for each competency. These conjunctive standards, or “cut-points,” are used to judge each student’s evidence of performance. This process was designed to improve interrater reliability by recalibrating individual reviewers’ expectations and understanding of the standards. During step 2, the committee discusses each of the eight sample portfolios, standard by standard for each competency, and votes on whether the student has met, not met, or provided insufficient evidence for them to make a judgment of achieving that standard. The outcomes for the sample of students are then reviewed to determine whether the cut-points are acceptable.
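
The conjunctive cut-point can be expressed as a simple decision rule. The sketch below is our restatement under assumed vote labels (the standard names in the example are hypothetical), not the committee’s actual procedure:

```python
# Possible judgments recorded for a single standard (labels ours).
MET, NOT_MET, INSUFFICIENT = "met", "not met", "insufficient evidence"

def competency_achieved(judgments: dict[str, str], essential: set[str]) -> bool:
    """Conjunctive rule: every essential standard must be judged MET.

    judgments maps each standard within one competency to MET, NOT_MET,
    or INSUFFICIENT; anything other than MET on an essential standard
    blocks achievement of the competency.
    """
    return all(judgments.get(std) == MET for std in essential)

# Hypothetical communication-competency example: one essential standard
# has insufficient evidence, so the competency is not yet achieved.
votes = {
    "written communication": MET,
    "oral presentation": MET,
    "team communication": INSUFFICIENT,
}
print(competency_achieved(votes, {"written communication",
                                  "oral presentation",
                                  "team communication"}))  # -> False
```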

After the standard-setting process, each of the remaining SPs is given to two reviewers who independently evaluate whether the student has met, not met, or provided insufficient evidence for them to make a judgment. The two reviewers then meet to reach a consensus, and they may request more evidence from the student. Recommendations of the reviewers are presented to the whole committee for discussion and approval. If the two reviewers recommend dismissal or cannot reach consensus, the portfolio is read by the entire committee, and consensus is reached on recommendations. Every student receives a letter that summarizes the outcome of the committee’s deliberations and that notes specific strengths and areas needing improvement.
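
Read as a workflow, the dual-review step reduces to a small routing rule. The sketch below is only our code-form summary of the process described above; the judgment labels are hypothetical, and the real procedure is deliberative, not automated.

```python
def route_portfolio(reviewer_a: str, reviewer_b: str) -> str:
    """Decide how a summative portfolio proceeds after dual review.

    Each argument is one reviewer's position after independent review
    and the consensus meeting, e.g. "promote", "remediate", or
    "dismiss" (labels hypothetical).
    """
    if "dismiss" in (reviewer_a, reviewer_b) or reviewer_a != reviewer_b:
        # A dismissal recommendation or a failure to reach consensus
        # sends the portfolio to the entire MSPRC for a full read.
        return "full-committee read and consensus"
    # Agreed non-dismissal outcome: the joint recommendation is
    # presented to the whole committee for discussion and approval.
    return f"present '{reviewer_a}' recommendation for committee approval"
```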


Recommendations for Portfolio Practice

Although our portfolio system was designed specifically for our educational program, two years of experience with portfolio assessment provides the basis for some lessons learned that may be of use to other institutions seeking to adopt a similar system.

Training materials

Initially, one of the most difficult barriers to overcome for portfolio assessment training was the lack of real portfolios. Although we constructed mock portfolios for training advisors, students, and the MSPRC, these materials did not convey the individualized nature of the rich, robust evidence that students use to document performance in the nine competencies. Even after real portfolios existed, they could not be shared with faculty until some students gave written permission for their summative portfolios to be used for faculty development purposes. These deidentified student competency essays and supporting evidence have proved very helpful in training faculty to give useful feedback and physician advisors to develop advising skills.

Quality evidence

The quality of evidence is critical for this primarily qualitative approach to assessment. We learned very quickly that faculty and students need to be trained to provide observation-based, narrative feedback on the performance-based criteria to identify areas needing improvement and areas of strength. All teaching faculty who provide feedback are expected to participate in a rigorous faculty development program. Students also participate in a workshop on giving feedback, and they learn in the course of receiving feedback what constitutes useful feedback.

Sufficient evidence

Students need sufficient curricular experiences and feedback to ascertain achievement of standards for competencies. If a review by the curriculum committees finds that curricular experiences and/or opportunities to obtain quality feedback are insufficient to demonstrate achievement of standards, standards are revised or the educational program (curriculum, assessments, etc.) is improved.

Physician advisors

The role of the advisor is critical, and appointing faculty already recognized for their mentoring abilities is vital. Protected time and regular meetings of the advisors allow problem solving, ensure similar support of students, help develop skills for encouraging reflective practice, and offer collegial support.

Summative assessors

Appointing senior faculty interested in education and experienced in exercising judgment with regard to trainees is critical to ensure solid decisions that are viewed as credible by students and faculty. Agreeing on standards for portfolios before beginning the review process takes time and should not be shortchanged.

Assessment culture

Consistently embracing the competency-based, formative feedback approach is challenging for faculty. Therefore, communication is critical for developing a shared understanding of the portfolio system. Giving students responsibility for documenting achievement of competencies challenges the tradition of teacher-centered assessment. Creating a culture supportive of a portfolio system requires multiple approaches, especially in the early stages. For example, the MSPRC sent a letter to all faculty after the first SP review, stressing the value of the portfolio for student learning and the soundness of the process for making summative decisions. In the fall of the second year, an education conference featured the portfolio system, during which the faculty had the opportunity to read and discuss portions of actual portfolios and to view a video of students being interviewed about the process.

Oversight committee

A central committee to oversee the development and implementation of the assessment system ensures its integrity by establishing policies consistent with the assessment principles. The committee also creates portfolio “experts” for various subgroups within the program who educate others and protect the integrity of the system.

Program evaluation

An evaluation plan that collects regular feedback from all participants in the portfolio system is essential for identifying changes that need to be made in the course of initial implementation and for ongoing program improvement.

Fairness, validity, and reliability

A rigorous approach to establishing the reliability and validity of data and fairness of judgments in a portfolio-based assessment is critical. We have taken the first step by carefully considering these issues as we designed our assessment system. The important next step, which includes detailed analysis of the quality and interpretation of assessment data from measures used for the portfolios, is beyond the scope of this article, and we plan to address it in a subsequent manuscript.

Electronic portfolio design

During the first year, our portfolio was primarily paper based. This allowed us to refine our assessment instruments and use focus groups of both students and physician advisors to make sure the planned electronic portfolio met user needs. The assessment director met weekly with the information technology team to ensure that the conceptual framework of the portfolio assessment system informed the overall development of the electronic portfolio.

Consultants

Consultants have provided critical expertise for the development and implementation process and have saved the project from costly mistakes. Portfolios are complex tools, and experience provides useful lessons.

Accreditation considerations

A frequently voiced concern on the part of faculty and administrative leadership at many medical schools is that innovative approaches to student assessment, such as our portfolio process, will not meet Liaison Committee on Medical Education (LCME) accreditation standards. Although it is important to know the standards and consider them in developing new approaches, it is not useful to be bound by a rigid interpretation. We found that we needed to violate the “letter of the law” for only one standard (ED-30) to adhere to our core assessment principles.40 More important, we provided the LCME with a clear rationale for every aspect of our portfolio process during our accreditation site visit. The LCME, in fact, cited our innovative approach to assessment as a strength of the program in its report.


An Answer to the Challenges of Competency-Based Assessment

We have described a portfolio approach to a competency-based assessment system specifically designed to support the educational goals of a unique MD program to educate physician investigators. In our opinion, this system and its components should be applied in other settings across the medical education continuum, because this innovative method of student assessment addresses major themes in the current debate about assessment in medical education: integration across competencies, competency-based assessment methods, sampling from multiple contexts and multiple sources, triangulation of information, and training of learners in reflective practice.


Acknowledgments

The authors wish to acknowledge the support and advice of individuals who provided suggestions throughout the design and implementation process and thank Drs. Beth Bierer, Christine Taylor, Dale Dannefer, and Alan Hull for their invaluable feedback on the manuscript. Our particular thanks are due to Dr. Margery Davis from the University of Dundee for her extraordinary efforts and encouragement during the last four years. Dr. Davis, along with Dr. Patricia O’Sullivan (University of California, San Francisco) and Dr. Kathleen O’Brien (Alverno College), served as consultants for the project. We also wish to thank Dr. Charlotte Rees from Peninsula Medical School and faculty from the University of Maastricht and Groningen University for sharing their experience with us. The authors are grateful to Ann Honroth, the portfolio coordinator, for her organizational expertise and support. This project was supported in part by a grant (Student Portfolio Approach to Education Physician Educators L2004-0031, E. Dannefer, PI) from the Cleveland Foundation.


References

1 Accreditation Council for Graduate Medical Education Outcomes Project. Available at: (http://www.acgme.org/outcome). Accessed January 22, 2007.

2 Royal College of Physicians and Surgeons of Canada. CanMEDS 2000: Skills for the New Millennium: Report of the Societal Needs Working Group. Available at: (http://rcpsc.medical.org/publications/index.php#canmeds). Accessed January 22, 2007.

3 General Medical Council. Tomorrow’s Doctors. Available at: (http://www.gmc-uk.org/education/undergraduate/index.asp). Accessed January 22, 2007.

4 Metz JCM, Verbeek-Weel AMM, Huisjes HJ, eds. Training of Doctors Blueprint 2001. Adjusted Objectives of Undergraduate Medical Education in the Netherlands. Available at: (http://ifas.klinikum.uni-muenster.de/mcfragen/files/blueprintNL2001.pdf). Accessed January 22, 2007.

5 Schuwirth LW, van der Vleuten CP. Changing education, changing assessment, changing research? Med Educ. 2004;38:805–812.

6 Davis MH, Harden RM. Competency-based assessment: making it reality. Med Teach. 2003;25:565–568.

7 Schuwirth LWT, Southgate L, Page GG, et al. When enough is enough: a conceptual basis for fair and defensible practice performance assessment. Med Educ. 2002;36:925–930.

8 Whitcomb ME. Competency-based graduate medical education? Of course! But how should competency be assessed? Acad Med. 2002;77:359–360.

9 van der Vleuten CP, Schuwirth LW. Assessing professional competence: from methods to programmes. Med Educ. 2005;39:309–317.

10 Shumway JM, Harden RM. AMEE Guide No. 25: the assessment of learning outcomes for the competent and reflective physician. Med Teach. 2003;25:569–584.

11 Smith SR, Dollase RH, Boss JA. Assessing students’ performances in a competency-based curriculum. Acad Med. 2003;78:97–107.

12 Epstein RM, Dannefer EF, Nofziger AC, et al. Comprehensive assessment of professional competence: the Rochester experiment. Teach Learn Med. 2004;16:186–196.

13 Dannefer EF, Henson LC, Bierer BS, et al. Peer assessment of professional competence. Med Educ. 2005;39:713–722.

14 Davis MH, Friedman BDM, Harden RM, et al. Portfolio assessment in medical students’ final examinations. Med Teach. 2001;23:357–366.

15 Friedman BDM, Davis MH, Harden RM, Howie PW, Ker J, Pippard M. AMEE Medical Education Guide No. 24: portfolios as a method of student assessment. Med Teach. 2001;23:535–551.

16 Hull AL, Dannefer EF, Hutzler JC, Fishleder AJ, Henson LC. The need for physician investigators. The Advisor. 2003;12:29–32.

17 Fishleder AJ, Henson LC, Hull AL. Cleveland Clinic Lerner College of Medicine: innovation in approach to medical education and the training of physician investigators. Acad Med. In press.

18 Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287:226–235.

19 Coles C. Developing professional judgment. J Contin Educ Health Prof. 2002;22:3–10.

20 Carraccio C, Wolfsthal SD, Englander R, Ferentz K, Martin C. Shifting paradigms: from Flexner to competencies. Acad Med. 2002;77:361–367.

21 Challis M. AMEE Medical Education Guide No. 11 (revised): portfolio based learning and assessment in medical education. Med Teach. 1999;21:370–386.

22 Driessen EW, van Tartwijk J, Vermunt JD, van der Vleuten CPM. Use of portfolios in early undergraduate medical training. Med Teach. 2003;25:18–23.

23 Snadden D, Thomas ML. The use of portfolio learning in medical education. Med Teach. 1998;20:192–199.

24 Schmitz J. Student Assessment as Learning at Alverno College. Milwaukee, WI: Alverno College Institute; 1994.

25 Koretz D. Large-scale portfolio assessments in the US: evidence pertaining to the quality of measurement. Assess Educ Princ Policy Pract. 1998;5:309–331.

26 Wilkerson JR, Lang WS. Portfolios, the pied piper of teacher certification assessments: legal and psychometric issues. Education Policy Analysis Archives. 2003;11. Available at: (http://epaa.asu.edu/epaa/v11n45). Accessed January 22, 2007.

27 Arter JA, Spandel V. Using Portfolios of Student Work in Instruction and Assessment. Available at: (http://www.ncme.org/pubs/items/18.pdf). Accessed January 22, 2007.

28 Driessen EW, van Tartwijk J, Overeem K, Vermunt JD, van der Vleuten CPM. Conditions for successful reflective use of portfolios in undergraduate medical education. Med Educ. 2005;39:1230–1235.

29 Rees C, Shepherd M, Chamberlain S. The utility of reflective portfolios as a method of assessing first year medical students’ personal and professional development. Reflective Pract. 2005;7:3–14.

30 Driessen EW, Overeem K, van Tartwijk J, van der Vleuten CPM, Muijtjens AMM. Validity of portfolio assessment: which qualities determine ratings? Med Educ. 2006;40:862–866.

31 Pitts J, Coles C, Thomas P, Smith P. Enhancing reliability in portfolio assessment: discussions between assessors. Med Teach. 2002;24:197–201.

32 Rees C, Sheard C. The reliability of assessment criteria for undergraduate medical students’ communication skills portfolios: the Nottingham experience. Med Educ. 2004;38:138–144.

33 Roberts C, Newble D, O’Rourke A. Portfolio-based assessments in medical education: are they valid and reliable for summative purposes? Med Educ. 2002;36:899–900.

34 Driessen E, van der Vleuten C, Schuwirth L, van Tartwijk J, Vermunt J. The use of qualitative research criteria for portfolio assessment as an alternative to reliability evaluation: a case study. Med Educ. 2005;39:214–220.

35 O’Sullivan PS, Cogbill KK, McClain T, Reckase MD, Clardy JA. Portfolios as a novel approach for residency evaluation. Acad Psychiatry. 2002;26:173–179.

36 Gordon J. Assessing students’ personal and professional development using portfolios and interviews. Med Educ. 2003;37:335–340.

37 Hays R. Reflecting on learning portfolios. Med Educ. 2004;38:800–804.

38 Eva KW, Regehr G. Self-assessment in the health professions: a reformulation and research agenda. Acad Med. 2005;80(10 suppl):S46–S54.

39 Ben-David MF. AMEE Guide No. 18: standard setting in student assessment. Med Teach. 2000;22:120–130.

40 Liaison Committee on Medical Education. Structure and Functions of a Medical School. Available at: (http://www.lcme.org/functions2006june.pdf). Accessed January 22, 2007.

Appendix 1
Table. Research Prec...


© 2007 Association of American Medical Colleges
