Donato, Anthony A. MD; George, David L. MD
Portfolios have been used as a tool to collect longitudinal evidence of expertise for artists, architects, and writers, for students from grade school through undergraduate education, and, in the past decade, in nursing and undergraduate medical education in Europe and Australia.1 More recently, both Canada and the United Kingdom have introduced online portfolios for continuing medical education.2,3 In the United States, the Accreditation Council for Graduate Medical Education's (ACGME's) recent interest in competency-based education has prompted it to encourage teaching programs to move their emphasis from educational process to educational outcomes. This, in turn, has prompted those programs to search for tools that can best evaluate educational outcomes of their students with regard to the ACGME's six core competencies (Patient Care, Medical Knowledge, Practice-Based Learning and Improvement, Interpersonal and Communication Skills, Systems-Based Practice, and Professionalism).
Of the 13 recommended tools in the ACGME's toolbox of assessment methods, only the portfolio tool is preferred for assessment of all six core competencies.4 Accordingly, in 2005, the ACGME developed its own online learning portfolio, which is now available to programs for voluntary use and is scheduled for full implementation in 2016.5 Program directors will, therefore, need to develop familiarity with the portfolio as an evaluation system, to learn what content adds value, what time and personnel resources are required, and what pitfalls to avoid. In this article, we give a blueprint for the implementation of a portfolio in a graduate medical education setting based on the five-year implementation experience in one U.S. internal medicine residency program.
Educational Rationale for Portfolios
A portfolio is defined as “a collection of materials that represents a learner's efforts, progress and achievements in multiple areas of the curriculum.”5 Evidence from systematic reviews indicates that use of portfolios may improve a student's self-awareness, augment the student's ability to learn independently and to integrate theory with practice, and increase a mentor's awareness of the student's needs.6 Guided reflection on the contents of the portfolio by trained mentors seems critical to a successful portfolio implementation.7
Educational theorists define reflection as “the mental process of organizing or reorganizing cognitive structures that represent existing knowledge and beliefs and guide perceptions of experiences, situations and problems.”8 Reflection is described as a key component in Kolb's9 experiential learning theory, is a critical step in dealing with unfamiliar problems in Ericsson's10 deliberate practice theory of expertise development, is vital in Knowles'11 paradigm of the creation of the self-directed learner, is a crucial step in translating experiences into transformative learning for Epstein's12 mindful practitioner, is essential to the development of Mylopoulos and Regehr's13 “adaptive expertise,” and is central to understanding what professionals do in Schön's “reflection in action” paradigm.14
Mentor-guided portfolio use is believed to successfully promote reflection.15 However, given the infrastructure, time, and resources needed to properly develop mentors, train evaluators, and win the hearts of those doing the reflecting as well as their mentors, the learning environment surrounding a portfolio may be the most important variable to its successful implementation.16 In addition, how best to build and implement a portfolio tool is still not clearly defined.
Development of Our Portfolio System
Analysis of the need
The Reading Hospital and Medical Center's internal medicine residency program had, throughout the five-year implementation (2006–2011), a total of 26 residents (21 categorical medicine, 5 transitional year) mentored by seven key clinician faculty. In 2005, while reviewing our department's resident assessment tools, we found that our current tools (monthly evaluation forms from the American Board of Internal Medicine [ABIM] completed by 100 mostly volunteer faculty, an in-training exam, 360-degree evaluations, and an internally developed direct observation tool) were not adequately aligned to allow us to longitudinally assess practice-based learning, systems-based practice, and professionalism with the degree of specificity required to document progress in these domains.4 Before 2005, each resident was assigned a faculty mentor who assessed the resident's progress triannually after reviewing results of the existing assessment tools and dictated a brief letter to the program director documenting the resident's progress and any potential barriers to graduation. We wanted to improve learner assessment while aligning it with this well-regarded individual mentoring program. A portfolio tool was identified as one potential solution that would allow both mentors and our program director to assess longitudinal resident growth while enhancing the creation of personalized learning plans (see Table 1).
Determining the goal of a portfolio is crucial to successful implementation.7 Our goal was to follow the ACGME's competency-based educational model to develop a portfolio for purposes of learner-centered formative feedback and assessment.4 With this goal in mind, our objectives were (1) to provide some structure so that learners were clear about goals, while allowing them to choose or withhold content to preserve authenticity (i.e., to maintain the learner's ownership and control of the content and reflections),7 (2) to foster reflection and the creation of personal growth plans, (3) to decentralize control, so that mentors could individualize portfolios to meet the needs of their learners, and (4) to create a single repository for all other evaluations from the resident so that longitudinal growth could be better assessed by learner and mentor, and to foster reflection on that growth by the learner.
Format and content
The initial format was developed by a consensus of the faculty and tested with three volunteer senior residents in the spring of 2006, followed by program-wide implementation in September 2006. The format consisted of a three-ring binder with folders, each arranged by section. Within each section were instructions, an example, and a series of reflective questions specific to that section's contents. Initial content for the portfolio included a curriculum vitae, teaching presentations, research notes, rotation evaluations, samples of clinical documentation (i.e., dictated histories, discharges, consults), quality improvement projects, action plans from direct observations, evidence-based medicine searches, and critical incidents. Residents were encouraged to assemble and review all of the components, reflecting on whatever documents they felt were significant to their growth. They would submit the portfolio for mentor review, but they could withhold any reflections from the review (though not the assessment documents themselves). Later additions in 2008 and 2010 included an ambulatory portfolio, care audits, in-training exam results, a sleepiness scale, and the Stephen Covey time management grid.
Following refinements in 2008, contents were moved online. Web pages were designed to allow the resident to choose and upload any file, video, or scanned item, which the learner could then view alongside a series of reflective questions and a space for typing his or her own reflections (see Figure 1). All reflections from any one section (e.g., Teaching Presentations) can be opened simultaneously and viewed in time sequence by learners or invited mentors to assess longitudinal growth. The ability to individually password-protect any single document allows that document to be kept private to preserve authenticity and resident ownership (approximately 2%–3% of all reflections are locked, mostly in critical incident review sections). Each night, a data link with our online evaluation vendor automatically transferred all currently used online evaluation forms (360-degree evaluations, ABIM end-of-month evaluations) to a page where they could be viewed and a reflection could be written. Other sections prompted the resident to choose and upload his or her own documents representing his or her work (e.g., teaching, research, critical incident sections). A filter was built to transfer the contents of a PubMed search into the portfolio, where the resident could detail the search question, the search strategy employed, and what was learned. Any upload of scholarly activity generated a reminder on the resident's “to-do” list, as well as a message to the program coordinator. A “view-only” window was created so that a learner could invite a mentor or fellowship director to have time-limited, read-only access to unlocked portions of the portfolio.
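The core data model behind such a system can be sketched simply. The following is an illustrative sketch only, not the actual vendor implementation; the names (Entry, Portfolio, section_view) and the in-memory design are our hypothetical simplification of the behaviors described above: per-section entries viewable in time sequence, and individually lockable documents hidden from mentor view.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class Entry:
    """One uploaded portfolio item with its reflection."""
    section: str          # e.g., "Teaching Presentations"
    uploaded: datetime
    reflection: str = ""
    locked: bool = False  # password-protected entries stay private to the learner

@dataclass
class Portfolio:
    owner: str
    entries: List[Entry] = field(default_factory=list)

    def add(self, entry: Entry) -> None:
        self.entries.append(entry)

    def section_view(self, section: str, viewer_is_mentor: bool = False) -> List[Entry]:
        """Return one section's entries in time order, so longitudinal growth
        can be reviewed; locked entries are hidden from invited mentors."""
        visible = [e for e in self.entries
                   if e.section == section and not (viewer_is_mentor and e.locked)]
        return sorted(visible, key=lambda e: e.uploaded)
```

In this sketch, a mentor's “view-only” access simply calls `section_view` with `viewer_is_mentor=True`, which filters out any locked reflections while leaving the learner's own view complete.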
Training of learners and mentors
Residents were introduced to the portfolio rationale in a one-hour lecture in 2005 and then introduced to individual sections in monthly sessions, during which they were given a 20-minute introduction followed by 40 minutes of time to reflect individually. Learners told us in a 2007 focus group that they did not find the monthly sessions useful, so in 2008 we replaced them with individual portfolio introductions: four brief (15-minute), weekly one-on-one sessions with mentors over the course of one month, in which topics are introduced and technical difficulties are reviewed. After these introductions, learners were coached over the next three years in triannual, face-to-face mentor sessions with their portfolios, each lasting 60 to 90 minutes, during which session-specific contents (see Table 2) are reviewed and learning plans are developed and uploaded. Reminders regarding the sections to be reviewed are sent to faculty and residents two weeks before each triannual mentor meeting, allowing residents time to upload and reflect on topics specific to that session and year in residency and allowing mentors to review the uploads for discussion during the meeting.
Mentors were involved in the initial selection of content and the teaching of monthly sessions. In a 2007 focus group, five of the seven mentors participating identified several barriers, including a lack of personal experience with portfolios, concerns about the amount of time needed to complete them, and a lack of comfort with assessment. Based on their needs, a series of faculty development sessions was conducted, mixing review of technical and procedural aspects with frame-of-reference training for assessment of specific sessions.17 Faculty members were given space to create their own portfolios in 2008, and five of the seven did.
Assessment of portfolio
In 2006 and 2007, mentors used the portfolio in a purely formative fashion to enhance their mentor–mentee discussions. Starting in 2008, we piloted a basic scoring rubric for reflective capacity, adapted from the work of Epstein,12 and trained mentors to use it; the rubric reduces his five-point hierarchical paradigm to three levels, using the “reporter-interpreter-manager” model introduced by Pangaro18 (see Chart 1). Interrater reliability testing, in which two untrained raters familiar with the paradigm scored 150 randomly chosen individual reflections, showed a weighted kappa of 0.62.
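For readers unfamiliar with the statistic, a weighted kappa on an ordinal three-level scale such as reporter-interpreter-manager can be computed as follows. This is a generic sketch using linear agreement weights; the article does not specify which weighting scheme was used, so that choice is our assumption.

```python
def weighted_kappa(rater1, rater2, k=3):
    """Linear-weighted Cohen's kappa for two raters scoring the same items
    on an ordinal scale coded 0..k-1 (e.g., reporter=0, interpreter=1, manager=2)."""
    n = len(rater1)
    # Observed joint proportions: obs[i][j] = fraction of items rated i by
    # rater 1 and j by rater 2.
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater1, rater2):
        obs[a][b] += 1.0 / n
    # Marginal proportions for each rater.
    marg1 = [sum(obs[i][j] for j in range(k)) for i in range(k)]
    marg2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    # Linear agreement weight: 1 for exact agreement, 0 for maximal disagreement.
    w = lambda i, j: 1.0 - abs(i - j) / (k - 1)
    # Weighted observed and chance-expected agreement.
    po = sum(w(i, j) * obs[i][j] for i in range(k) for j in range(k))
    pe = sum(w(i, j) * marg1[i] * marg2[j] for i in range(k) for j in range(k))
    return (po - pe) / (1.0 - pe)  # undefined if pe == 1 (degenerate ratings)
```

Perfect agreement yields a kappa of 1.0, and values in the 0.6–0.8 range (such as the 0.62 reported above) are conventionally interpreted as substantial agreement.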
A summary statement of performance (performed by a nonmentor) was added in 2008. The statement is given to each mentor and mentee to guide their future growth and to provide information for residents' letters of reference when they apply to future programs. The statement is not used for high-stakes, pass–fail decisions. Using a model successful in other portfolio implementations,19 the content of the statement includes a qualitative assessment of the learner's teaching ability, research skill, and self-directed learning skills, as well as the mentor's scores of the resident's reflective skill.
Integration Into the Curriculum
Although initially added as an enhancement to mentoring sessions, the portfolio tool was incorporated into the curriculum in 2008. Expectations for uploading evidence-based medicine searches (once a month for interns, twice monthly for second years, weekly for third years) were set for all ward, consult, and night float rotations. Critical incident reflections and four patient audits were added as requirements to the night float rotation. All resident-led teaching sessions receive immediate oral and written feedback, which is uploaded during or after the discussion.
Time and financial investment
Costs for the custom Web developer who also supports the site were $15,000. A 2010 upgrade was an additional $5,000. Residents receive personal computers ($950/resident) with wireless access to the portfolio site, which they also use for clinical work.
Mentoring sessions currently take about 45 minutes preparation time by mentors, and 45 to 90 minutes for the resident per session. Final preparation for annual reviews takes 1.5 to 6 hours by residents. The program director invests 1 hour per resident to review each portfolio and dictate a qualitative summary.
Components of successful portfolio implementation
List 1 provides a short summary of these and additional components of successful portfolio implementation.
Determine your goals in advance.
Portfolio goals can range from unstructured collections of resident work used purely for formative feedback to structured documentation tools used for high-stakes decisions. Failure to articulate a clear vision may limit buy-in from both learners and mentors. Given the time investment required of learners, aligning portfolio goals with learners' perceived needs seemed to help participation, as we found in our learner focus groups.
Get mentor buy-in.
Mentors will have to champion the assessment method to residents, which means mentors must buy into the method; without mentor acceptance, full engagement of the residents is quite unlikely.16 Protected mentor time for portfolio review and for individual coaching with learners seems to be important for success.19 Gaining faculty support can be aided by having them review the educational benefits (exhaustively discussed in Buckley et al6) and the gaps in the current assessment systems that a portfolio may fill. Failure to convince faculty of the value added with portfolios, and in competency-based education methods in general, can result in a misdirected backlash against the product, the visible symbol of the change, which may overshadow the merits of the overall process.7 If initial buy-in from faculty seems weak, directors should reconsider any portfolio implementation, as implementation efforts in places with limited support have met with poor mentor and mentee acceptance.16,20 Frequent evaluation of the portfolio process from mentors at faculty meetings or in focus groups following mentor sessions helped us recalibrate content, direct faculty development, and share best practices between mentors.
Start small, and build with a long-term plan.
Given that the learning environment may be as important as, or more important than, the content in portfolio implementation, a gradual, evolutionary rollout is likely to meet with less resistance than any sweeping change. Establishing clearly defined goals and building an open-ended structure will allow the portfolio to be a flexible collection of the learner's abilities that does not compromise authenticity.21 Agreeing with faculty on a long-term vision and a stepwise process of implementation will help early adopters lead the way for others. Starting with the most learner-relevant and least time-intensive components—such as introducing the curriculum vitae, reflective writing, and reflective chart audits during a night float experience, or adding a reflective component to monthly evaluations—can help mentor and mentee discover the benefits of the process and may add depth to an existing assessment method. We found that mentors' experiences with reviewing learners' critical incidents and their reflections on constructive feedback made them more vested coaches in the process. Mentors also noted greater ease in developing educational plans for those residents whose performance was far ahead of or far behind others'; this provided more objective evidence to support schedule redesigns for both remediation (for those behind) and advanced projects in research and teaching (for those ahead).
Make the case to your learners.
To gain success with a structured portfolio, learners must first be convinced of its intrinsic value to them personally.22 We attempted to anchor the portfolio around the creation of their curricula vitae, and we built in reminder systems for pertinent uploads, allowing them to see how the materials they are assembling tie directly into their future careers. Adding structure that makes sense to learners is critical to keep intent clear. We also added a section for learning plans at the end so that the formative intent is obvious. Developing clear instructions that describe aims without rigidly defining content and avoiding rigid summative criteria are cited as critical steps to maintain learner ownership of the contents.19,23 Similar to the ACGME online portfolio, we chose to offer a series of optional reflective questions for each section that allow the learner the flexibility to create a reflection in his or her own voice. We allowed residents to withhold selected reflections (not their evaluations) so that their privacy can be maintained, because, as Driessen15 points out, student ownership of the process is critical to success, and authenticity is based on their ability to control its contents.
An effective introduction of the portfolio is necessary, both to sell learners on its necessity and to introduce concepts of reflection.24 We found that lecture-based introductions were less well received, probably because we could not individually identify barriers and explain the educational importance to each resident. We have found more success with a decentralized, one-on-one approach: short meetings with mentors that address barriers and technical issues, which became especially valuable after the move to online content introduced new technical challenges. Assimilating the portfolio process into the curriculum also increased acceptance, as a systematic review of portfolios suggests.15
Lack of time is listed as a common barrier in the literature on portfolio implementation.1,22,25 Keeping the purpose clear and emphasizing quality over quantity of content may help most learners overcome this barrier and may reduce the feeling that they are “jumping through hoops.”4,24
Many reflection experts believe that reflective habits can be taught to anyone, but openness to reflection seems to be a prerequisite for success.26,27 Although most learners were open to reflection, we found that adding a summary statement of performance was associated with improved participation. Although some authors argue that formal assessment of the portfolio will have negative effects on honesty and authenticity,16,28 we did not see those effects, similar to the finding reported by Driessen.21 Even after initiating the summary statement, we found that a substantial minority (10% of our learners) participated minimally or not at all, a finding also noted by other authors.22 The characteristics of those who decline to participate are unknown; identifying them is an important topic for future portfolio research.
Train your mentors and allow them to benchmark as you go.
Training mentors was seen as instrumental to success in a systematic review by Driessen,15 and a lack of training was cited as a critical reason for failure by Ryland.29 Our mentors were initially uncomfortable evaluating portfolio sections, even formatively. Group training of raters using real-world examples (similar to the assessor training exercises described by Driessen15) allowed each rater to benchmark against others' responses and helped us improve mentors' comfort in rating clinical documentation performance, evidence-based medicine measures, and quality of reflection. Open-forum sessions following mentoring meetings allowed for review of specific information technology issues and global educational problems, which then helped inform curricular changes and alter portfolio system evaluation processes. This became especially important after the transition to the Web-based format, when technical issues became important barriers, as other authors similarly note.28
Paper or electronic?
Advantages of paper portfolios over electronic media include lower costs and fewer technical issues. Initial resistance to portfolio evaluation systems often focuses on product instead of process.7 Early introduction of electronic formats may heighten this resistance, as learners report that electronic versions typically require more time than paper versions.15 We recommend introducing the portfolio on paper with only a few components until buy-in from participants is achieved, as Pinsky30 reported, even if the eventual intent is to use an electronic format. Advantages of electronic portfolios include greater portability for review by mentors, summative evaluators, and fellowship directors, as well as easier transfer to future training sites. However, electronic versions require more training than paper versions, and custom development, cost, and ongoing site maintenance can be issues.22 Many commercial electronic formats are now available, including the ACGME's version,5 which may be less costly and have fewer start-up technical issues than in-house custom designs.
Develop an effective process for assessment.
Integration of a portfolio into the residency program's competency assessment strategy is critical. Best practice models support a primarily formative process for competency development, with some summary review to increase compliance.15 Separating the roles of the formative assessor (“coach”) and the summative assessor (“evaluator”) is recommended to avoid ethical issues for mentors.7 Our qualitative performance summary is performed by the program director, who is not a mentor.
Because portfolios contribute most meaningfully to overall assessment when flexibly structured, their contents are often diverse and difficult to assess quantitatively. We chose a global measurement of reflective ability as our only scored rating domain and used the rest of the review (1) to inform a global qualitative letter commenting on teaching and research skill and (2) to guide self-directed learning, taking the holistic approach recommended by many authors.25,31,32
The Importance of the Portfolio
One author on portfolio implementation concludes his work with the statement that “portfolio implementation is like buying shoes—a good fit is important.”7 The portfolio is an important tool to promote reflection and independent learning, and it may be helpful in enriching the depth of mentor–mentee discussions.6 However, much like other components of competency-based education, successful implementation requires deployment of a complex, time-intensive mentoring and assessment process. If the eventual goal in competency-based education is not only to assess whether our learners can apply the knowledge they have to real-world situations but also to see how they practice at the highest level (i.e., what a physician “does”) of Miller's33 pyramid, we must first be able to measure the adequacy of residents' reflection skills while formatively helping them assimilate the habits of lifelong learners. A portfolio tool is well suited to be that assessment instrument. Areas for future study include factors associated with openness to reflection, as well as relationships between participation with portfolios and lifelong self-directed learning, burnout, and enhancement of mentor–mentee relationships.
1. Challis M. AMEE medical education guide no. 11 (revised): Portfolio-based learning and assessment in medical education. Med Teach. 1999;21:370–386.
2. Davies H. Portfolios, appraisal, revalidation, and all that: A user's guide for consultants. Arch Dis Child. 2005;90:165–170.
3. Dornan T, Carroll C, Parboosingh J. An electronic learning portfolio for reflective continuing professional development. Med Educ. 2002;36:767–769.
6. Buckley S, Coleman J, Davison I, et al. The educational effects of portfolios on undergraduate student learning: A best evidence medical education (BEME) systematic review. BEME guide no. 11. Med Teach. 2009;31:282–298.
7. Van Tartwijk J, Driessen EW. Portfolios for assessment and learning: AMEE guide no. 45. Med Teach. 2009;31:790–801.
8. Korthagen FAJ, Kessels JPAM. Linking theory and practice: Changing the pedagogy of teacher education. Educ Res. 1999;28:4–17.
10. Ericsson KA. An expert-performance perspective of research on medical expertise: The study of clinical performance. Med Educ. 2007;41:1124–1130.
12. Epstein RM. Mindful practice. JAMA. 1999;282:833–839.
13. Mylopoulos M, Regehr G. Cognitive metaphors of expertise and knowledge: Prospects and limitations for medical education. Med Educ. 2007;41:1159–1165.
15. Driessen E, van Tartwijk J, van der Vleuten C, Wass V. Portfolios in medical education: Why do they meet with mixed success? A systematic review. Med Educ. 2007;41:1224–1233.
16. Snadden D, Thomas ML. Portfolio learning in general practice vocational training—Does it work? Med Educ. 1998;32:401–406.
17. Holmboe ES, Ward DS, Reznick RK, et al. Faculty development in assessment: The missing link in competency-based medical education. Acad Med. 2011;86:460–467.
18. Pangaro L. A new vocabulary and other innovations for improving descriptive in-training evaluations. Acad Med. 1999;74:1203–1207.
19. Dannefer EF, Henson LC. The portfolio approach to competency-based assessment at the Cleveland Clinic Lerner College of Medicine. Acad Med. 2007;82:493–502.
20. Webb TP, Aprahamian C, Weigelt JA, Brasel KJ. The Surgical Learning and Instructional Portfolio (SLIP) as a self-assessment educational tool demonstrating practice-based learning. Curr Surg. 2006;63:444–447.
21. Driessen EW, van Tartwijk J, Overeem K, Vermunt JD, van der Vleuten CPM. Conditions for successful reflective use of portfolios in undergraduate medical education. Med Educ. 2005;39:1230–1235.
22. Tochel C, Haig A, Hesketh A, et al. The effectiveness of portfolios for post-graduate assessment and education: BEME guide no. 12. Med Teach. 2009;31:299–318.
23. Case SH. Will mandating portfolios undermine their value? Educ Leadersh. 1994;52:46–47.
24. Wade RC, Yarbrough DB. Portfolios: A tool for reflective thinking in teacher education? Teach Teach Educ. 1996;12:63–79.
25. McCready T. Portfolios and the assessment of competence in nursing: A literature review. Int J Nurs Stud. 2007;44:143–151.
26. Mamede S, Schmidt HG. The structure of reflective practice in medicine. Med Educ. 2004;38:1302–1308.
27. Dobie S. Viewpoint: Reflections on a well-traveled path: Self-awareness, mindful practice, and relationship-centered care as foundations for medical education. Acad Med. 2007;82:422–427.
28. Kjaer NK, Maagaard R, Wied S. Using an online portfolio in postgraduate training. Med Teach. 2006;28:708–712.
29. Ryland I, Brown J, O'Brien M, et al. The portfolio: How was it for you? Views of F2 doctors from the Mersey Deanery Foundation Pilot. Clin Med. 2006;6:378–380.
30. Pinsky LE, Fryer-Edwards K. Diving for PERLS: Working and performance portfolios for evaluation and reflection on learning. J Gen Intern Med. 2004;19:582–587.
31. Friedman Ben David M, Davis MH, Harden RM, et al. AMEE medical education guide no. 24: Portfolios as a method of student assessment. Med Teach. 2001;23:535–551.
32. Driessen EW, Overeem K, van Tartwijk J, van der Vleuten CPM, Muijtjens AMM. Validity of portfolio assessment: Which qualities determine ratings? Med Educ. 2006;40:862–866.
33. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 suppl):S63–S67.
The Reading Hospital IRB approved a waiver for the interrater reliability study of resident reflections, titled “Interobserver Agreement for Portfolio Reflections,” on November 15, 2010 (protocol number TRHMC-010E-10). The Reading Hospital IRB approved the qualitative study of portfolio acceptance, titled “Factors Affecting Success and Performance on a Portfolio Collection Device in an Internal Medicine Residency,” on October 13, 2006 (protocol number TRHMC-035-06). These studies provided important qualitative data from a resident focus group and the interrater reliability for reflection cited in this article.
Significant portions of this work were presented in a lecture at the Association of Program Directors of Internal Medicine Annual Meeting, October 31, 2008, Orlando, Florida; at the Association of Program Directors of Internal Medicine Annual Meeting, April 9, 2011, Las Vegas, Nevada; and at the Canadian Conference of Medical Educators 2011 Annual Meeting, May 9, 2011, Toronto, Ontario, Canada.