
Developing and Successfully Implementing a Competency-Based Portfolio Assessment System in a Postgraduate Family Medicine Residency Program

McEwen, Laura A. PhD; Griffiths, Jane MD, CCFP, FCFP; Schultz, Karen MD, CCFP, FCFP

doi: 10.1097/ACM.0000000000000754

Abstract

The move towards competency-based models of postgraduate residency education has emphasized the role of assessment. Portfolios have been recognized as useful structures for collecting, organizing, and managing the large volume of assessment information necessary to support these educational models.1 The potential for portfolios to function as catalysts for learning by promoting residents’ active engagement in and responsibility for the learning process makes these tools even more appealing.2 The structural and functional advantages, along with the endorsement of regulating bodies, have spurred uptake.3 However, the processes by which these assessment systems are designed, implemented, and maintained are emergent.4–7

In this article, we describe the needs assessment, development, implementation, and continuing quality improvement processes that have shaped the Portfolio Assessment Support System (PASS) used in residency training in the Department of Family Medicine at Queen’s University (Kingston, Ontario, Canada). As we built PASS, we used our educational philosophy—that supporting high-quality resident learning and assessment is a purposeful, deliberate activity—to guide our decisions. PASS supports competency development, scaffolds the use of self-regulated learning skills, and promotes professional identity formation. We understand that individual PASS components and processes are not unique, but in sharing our overall experience and explaining PASS in toto, we hope to support others who are currently engaged in systematic portfolio design or who could adapt aspects of PASS for their local programs.

Needs Assessment

Two events provided the impetus for major curricular reform and assessment system development in the family medicine department at Queen’s University. First, in 2009, the College of Family Physicians of Canada (CFPC), which is the accrediting body for Canadian family medicine training programs, launched the new competency-based Triple C Curriculum. The new curriculum mandated that programs provide training that (1) focuses on producing comprehensively trained family physicians; (2) incorporates continuity of patient care, curriculum, and supervision; and (3) centers that training in family medicine. According to the CFPC, the new outcomes-oriented approach required “carefully designed curricular elements to achieve clearly stated desired outcomes” that would replace “traditional time-based educational strategies.”8

Second, the Queen’s University family medicine residency program grew from a single site with 100 residents centrally located in Kingston, Ontario, to a distributed program with 130 residents at four sites spread over a radius of 200 kilometers (125 miles). In 2008, 17 academic family medicine physicians supported the clinical training of the 100 residents. Now, approximately 20 salaried academic family physicians are affiliated with the original Kingston site, and 15 new part-time salaried physicians are affiliated with the three expansion sites. These physicians plan and implement programs, teach and assess residents, and provide individualized learner support and competency-based assessment by functioning as “academic advisors” or AAs (see Creating processes, below). An additional 750 to 1,000 community family physicians and specialists receive “sessional” payments for the teaching they provide, including supervising clinical care and/or facilitating academic sessions. Given our distributed program and our large number of residents and faculty, we felt that a Web-based assessment platform was necessary.

Operationalizing the Triple C Curriculum at Queen’s has resulted in three different two-year curriculum structures. The original Kingston site maintains a traditional, rotation-based structure of 26 four-week blocks over two years. There are 12 blocks of family medicine training, and the 14 remaining blocks provide experiences relevant to family medicine such as obstetrics and pediatrics. In this structure, residents move from rotation to rotation with a consequent change in learning environment. Two of our new training sites (Belleville and Oshawa) have adopted predominantly longitudinal structures with few rotations. In this structure, residents are situated in ongoing longitudinal family medicine training environments. Short off-service learning experiences are threaded throughout the program, and residents bring what they learn during these experiences back to apply in the family medicine environment. Our third new training site (Peterborough) uses a hybrid model, combining rotation-based and longitudinal learning experiences.

The diversity in curriculum structure across our sites, coupled with the shift to a competency-based educational model, rendered conventional assessment processes in the form of rotation-based in-training evaluation reports (ITERs) inadequate. From an administrative perspective, the ITER system lacked the flexibility to accommodate the primarily longitudinal curriculum structure without rotations. Even for our more traditional rotation-based sites, the ITER system lacked the comprehensiveness to document residents’ emerging competence as it developed over time and across different rotations. Competence in caring for children, for example, builds over time through rotations in family medicine, pediatrics, emergency medicine, and psychiatry. The rotation-based ITER system simply could not capture information of the quality or granularity necessary to formulate judgments about family medicine residents’ growing competence. We required an assessment system that would allow us to sample, collate, and interpret resident performance more longitudinally.8–10

As the Queen’s family medicine residency program has expanded, other needs have arisen. Although Queen’s University sets family medicine program objectives centrally, individual teaching sites leverage local strengths and resources, ultimately enacting curriculum in unique ways. Further, some important program objectives (e.g., those related to global health) are not achievable at all sites. Consequently, we have developed centralized online learning resources and assessments for these objectives such that learners across all sites can access and use them. Finally, we have been mindful that, given the wide scope of practice in the field of family medicine, there is no way to predict residents’ future needs in terms of knowledge and skills. Therefore, we have prioritized both supporting the development of residents’ self-regulated learning skills and fostering their early identity formation as family physicians.

Our competency-based assessment needs are not unique. Medical education in many areas of the world is transitioning to competency-based educational models.11,12 Longitudinal programs that incorporate bounded learning environments and continuity of patient and supervisor relationships, both of which seem to increase authentic learning experiences, are increasingly popular.13–15 We believe, therefore, that assessment systems like PASS, which feature not only purposefully defined components for capturing learners’ longitudinal growth but also processes for compiling and interpreting this assessment information to inform declarations of competency, have widespread applicability.

Development

The development of Queen’s innovative postgraduate family medicine portfolio assessment system began in 2008. Anticipating the move to competency-based medical education (CBME), our assessment director (J.G.) conducted an extensive review of the growing literature on the topic. She noticed that, with respect to assessment, portfolios were a recurrent theme and that educators generally considered this system of capturing progress to align well with the tenets of CBME.1

According to the research literature, portfolios offer a useful structure for collecting, organizing, and managing the wide variety of assessment information required for competency-based assessment.16,17 They also have the potential to function as catalysts for learning by enhancing residents’ active engagement in and responsibility for their learning processes.2,18 The literature, however, cautions that the extent to which this catalytic benefit is realized in practice relates closely to how faculty facilitate and implement portfolio assessment processes.1,19 Specifically, the literature highlights the role of mentors who function as guides in supporting residents’ active reflection about their learning process.20,21 Overall, the literature emphasizes the purposeful inclusion of informed mentors as a critical element of successful portfolio design (Figure 1).

Figure 1: Components of the Portfolio Assessment Support System (PASS) used by the Family Medicine Postgraduate Training Program at Queen’s University in Kingston, Ontario, Canada. EPA indicates entrustable professional activity.25

Considering context

As we began to blueprint assessment components and develop assessment processes, we were cognizant of our contextual constraints. Our assessment system had to accommodate the needs of multiple training sites—some of which were community based and some of which were hospital based. Ultimately, we needed a flexible system that was easily accessible for a diverse range of geographically distributed users. These requirements highlighted the need for a Web-based solution.

Selecting assessment components

Initially, the requirements of our accrediting college (CFPC) guided our assessment system blueprinting. The assessment director (J.G.) conducted a comprehensive analysis of the CFPC’s skills dimensions, the 99 priority topics,22 the “Phases of the Clinical Encounter,”9 and the CanMEDS–Family Medicine roles.10 Next, the assessment director consulted with the postgraduate assessment specialist (L.A.M.), a consultation that helped us consider the future practice demands of residents through the lens of self-regulated learning theory. On the basis of this analysis of the literature and consultation, we defined a preliminary list of components for inclusion in our portfolios. Each component mapped to an established need (e.g., addressing an objective; providing structured opportunities for residents to reflect, self-assess, and collaboratively plan their learning; responding to an accreditation requirement; or providing small biopsies of competency assessment, which, when collated, would assist AAs in formulating competency decisions).

In addition to the mapping of specific needs to particular components, we purposefully elicited feedback from all stakeholders including educational program leaders, clinical teachers, and residents. Stakeholders provided comments and critiques on a regular basis to ensure utility and feasibility and to foster community ownership of the system. These sessions raised practical concerns about issues regarding confidentiality and the possible legal implications of sharing personal writing or reflections. Such concerns prompted consultations with university legal services that, in turn, served to inform guidelines around confidentiality and professional responsibilities for reportable issues (e.g., portfolios can be viewed only by a select few to ensure privacy, with exceptions for responding to concerns that fall under the Mandatory Reporting policy of the College of Physicians and Surgeons of Ontario). Finally, we also sought insights from assessment experts external to Queen’s. This comprehensive process served to shape a purposeful, thorough portfolio blueprint for Queen’s postgraduate family medicine program (see Appendix 1).

Creating processes

The program director (K.S.) recognized early that because of the sheer number of learners, coupled with their geographical distribution, she would need to delegate responsibility for monitoring their growth and making decisions about their attainment of competency. We developed the AA role to assume these responsibilities. As the portfolio assessment system developed, scheduling regular AA–resident meetings became a key process issue. We conceptualized that AAs would function essentially as academic coaches. They would review portfolios, interpret assessment data, coach and support residents’ self-regulated learning (SRL) processes, engage in deliberate mentoring, promote residents’ professional identity formation as family physicians, and ultimately make decisions about progress and advancement. We engaged faculty in discussions about these responsibilities early and sought feedback about the kinds of support they felt they would need to fulfill this critical role. In effect, we collaboratively defined the parameters of the academic advising task, which empowered the community and further fostered ownership of assessment processes.

Implementation

Phase 1

We strategically phased in the implementation of PASS over two academic years. In 2009 we initiated the portfolio components. We provided data sticks for residents to store their work and share it with AAs in advance of meetings. During this phase we formalized academic advising processes. Prior to implementing PASS, faculty advisors met with residents only twice a year to discuss career plans and general progress through the program. The lack of specific objectives to guide the process resulted in wide variation in the focus, quality, and usefulness of these meetings. We addressed this weakness by establishing specific objectives for academic advising meetings and by instituting a scheduling protocol (meetings are to occur at four-month intervals). We also clearly defined preparatory processes for residents (e.g., reflective writing tasks, collating paper-based field notes23 [see Appendix 1 for detailed information about field notes]), and we delineated AAs’ responsibilities for reviewing assessment information in preparation for meetings. This first phase served to orient AAs to their new role, focusing on assessment rather than strictly mentoring. It also familiarized residents and faculty with the main components of the portfolio.

Phase 2

Simultaneously, the assessment director (J.G.) worked with our Web designer to develop the electronic portfolio structure. Together, they created a Web-based solution, tailored to meet our exact requirements and specifications. Again, we engaged residents and faculty in the development process to ensure that the interface was intuitive and met their needs. Their periodic involvement eased their transition to the electronic format in 2010 because they were familiar with the interface. Further, users welcomed the move away from data sticks. After graduation, residents’ assessment information is removed from the active Web-based platform. Objective assessment data are kept in a digital format according to university policy; however, all subjective data are retained in a deidentified format for program quality improvement and research.

Support

In addition to engaging in ongoing community consultation to help shape PASS and to stage implementation over two years, we have also developed an extensive network of user supports. We have identified “assessment leads” for each educational site who assume responsibility for orienting local users of the system. All residents receive a one-hour tutorial at the beginning of the program. Further, each portfolio component includes instructions, and we developed resource documents and videos for the “help” sections of PASS. Finally, we revise all of these annually to ensure that they remain current and useful.

In addition to the technical supports, the platform includes embedded meeting checklists, including regular progress check-ins and the requirements for progression from first to second year and for graduation from the program. These support features help focus meetings and ensure that residents stay on track. In addition, we developed review protocols for site and program directors whereby they are able to review all portfolios one month prior to residents’ completion of the program. Residents with incomplete portfolios, as well as their AAs, receive a reminder that a complete portfolio is a condition for graduation.

Initially, AAs attended several one-hour faculty development sessions covering topics such as the parameters of the AA role, writing field notes, and navigating PASS. Annual individualized AA development sessions provide ongoing support. In addition, AAs receive compensation not only for the time they spend meeting with residents but also for the time they spend preparing for those meetings (e.g., reading and commenting on reflective exercises, reviewing modules, and assessing performance data). Remuneration for preparatory effort is an institutionalized acknowledgment of the investment required by AAs to effectively facilitate meetings with residents and actively support their individualized growth. Furthermore, remuneration makes explicit the value Queen’s University places on the AAs who play a vital role in the assessment system by providing informed, professional judgments about residents’ progress and declaring them competent for independent practice.

Continuing Quality Improvement

We have several quality improvement mechanisms in place that function to identify and address system problems and weaknesses as they arise. Our assessment director (J.G.) has a dedicated 0.2 FTE (one day per week) for the continued development, support, and oversight of assessment processes for the entire family medicine program. Assessment leads at each site have just under a half-day of dedicated time per week to support their work in that role. Our full-time Web designer manages all technical troubleshooting in a timely manner. In keeping with the community ownership model that we initiated during the design phase, we highly value, encourage, and swiftly address user feedback. The Resident Assessment Advisory Group meets with the assessment director on a regular basis to bring forward ideas and concerns. For example, when residents raised concerns about differences in the quality of AA meetings, we not only shared that feedback with faculty but also developed an AA evaluation system. Residents now complete evaluations of their AAs at regular intervals just as they do for their clinical preceptors. Residents’ feedback has been incorporated into the academic physician review processes along with field note completion rates (as compared with departmental averages).

Likewise, in response to feedback from faculty about the burden of scheduling AA meetings with residents, we have provided administrative support for scheduling these sessions. All meetings occur during regular working hours and are scheduled at the beginning of the academic year. If and when rescheduling is required, administrative support staff organize the new meeting, relieving AAs of this burden. Furthermore, there is an institutional expectation that physicians reserve a 30-minute time slot in every clinical day for directly observing, providing feedback to, and documenting the performance of residents under their purview. Our educational philosophy is that supporting high-quality learning for and assessment of residents does not occur by happenstance; rather, it is a purposeful, deliberate activity that requires dedicated clinician time and attention.

Program Evaluation

We use multiple measures to monitor the impact of PASS at the individual, programmatic, and institutional levels. At the individual level, we gather and examine learners’ and faculty members’ perspectives about the impact and value of PASS in supporting CBME. At the programmatic level, we monitor field note completion rates and use cohort assessment data to inform curriculum development. Finally, at the institutional level, we monitor our effectiveness in identifying and remediating residents in difficulty. Ethical clearance for the use of interview quotes and survey data was granted by Queen’s Health Sciences Research Ethics Board, and an exemption was granted for the use of program evaluation statistics.

Individual level

Residents’ perspectives.

In February 2012, the CFPC appointed a Working Group for Survey Development. The group’s mandate was to develop a series of surveys (Entry, Graduate, and Practice) as a component of a longitudinal outcomes-based program evaluation of the Triple C Curriculum.24 The purpose of these surveys was to gather residents’ perspectives about their preparedness for practice and their practice intentions. Respondents are asked to indicate their level of agreement with statements about their residency experiences, including assessment, on a five-point Likert scale. The Graduate survey was piloted in the spring of 2013 at Queen’s, and 67% (32/48) of the graduating family medicine residents completed it. Queen’s residents’ responses to the assessment items are one measure of the impact of PASS.

The results of the 2013 survey indicate that 75% of Queen’s family medicine residents (24/32) agreed or strongly agreed that they understood programmatic expectations and were actively aware of their progress throughout the program. Furthermore, 66% (21/32) reported involvement in tailoring their learning when individual needs were identified, and the great majority (91%, n = 29) indicated that they were confident in their ability to identify personal learning needs.

Overall, these results suggest that residents have a solid grasp of the standards for performance and use those standards to self-monitor their development over the course of the residency program. The fact that many report both assuming an active role in directing their learning and feeling confident in their ability to do so going forward into practice suggests that our investment in academic advising is having the intended effect of supporting the development of learners’ self-regulated learning skills.

Faculty members’ perspectives.

Interviews with AAs for an upcoming article suggest that they value the electronic portfolio as a one-stop shop or central repository for resident performance information. In particular, AAs value the capacity to sort field notes; as one AA relates, “You can look at the grid of how they did in different domains to get a big-picture view. And then you can drill down within to get a small-picture view.” AAs perceive this kind of access to assessment data as a means to facilitate the early identification of performance gaps and to support planning for individualized learning opportunities. One AA described how he, in collaboration with a resident, “made an assessment of [her] competency and then looked to see, okay, [what] are your areas that you need to continue to work on? What do you have coming up? Let’s see if we can tailor this a bit for what you need.”

AAs also recognize the value of the system beyond the accessibility and comprehensiveness of assessment information. In the words of one AA, regular meetings with residents “have drawn the academic advisors and their residents closer.” Another mentioned how reflective components of the portfolio supported her own self-reflection because “it’s a time for me to think—what do I do as a family doc?” Although AAs acknowledge the challenges associated with more intensive assessment processes, they also recognize the value of those same processes. To illustrate, one AA commented, “It’s not easy, but it’s an important thing to feel like you’re making a difference.” Another AA explicitly discussed his/her great sense of pride about being “acknowledged as being a leader in the area.”

Programmatic level

Field note completion rates.

Another important measure we use to monitor the impact of PASS is field note completion rates. Prior to implementation of the electronic field note, residents received, on average, 8 paper field notes annually. In 2011, that number had risen to 14. By 2012, residents received an average of 41 field notes annually, in 2013 the average was 46, and in 2014 the average was 59. We now have more than 23,000 field notes written and compiled in our system. Having successfully sustained the upward trend in field note completion rates over three years, we are now shifting our attention to quality.
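
To make the arithmetic behind these completion rates explicit, the sketch below shows one way an annual average of field notes per resident could be computed from a list of field note records. This is an illustration only; the record format, field names, and choice of denominator are assumptions, not details of the PASS platform.

```python
# Illustrative sketch only: compute the average number of field notes per
# resident per year from a flat list of field note records. The record format,
# field names, and the choice of denominator (residents with at least one note
# in a given year) are assumptions for this example, not details of PASS.

from collections import defaultdict

field_notes = [
    {"year": 2013, "resident_id": "R001"},
    {"year": 2013, "resident_id": "R002"},
    {"year": 2014, "resident_id": "R001"},
    # ... one record per completed field note
]

def mean_notes_per_resident(records):
    """Return {year: mean number of field notes per resident}."""
    notes_by_year = defaultdict(int)      # total field notes written each year
    residents_by_year = defaultdict(set)  # residents receiving notes each year
    for record in records:
        notes_by_year[record["year"]] += 1
        residents_by_year[record["year"]].add(record["resident_id"])
    return {year: notes_by_year[year] / len(residents_by_year[year])
            for year in sorted(notes_by_year)}

print(mean_notes_per_resident(field_notes))  # e.g., {2013: 1.0, 2014: 1.0}
```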

In keeping with our community ownership orientation, we have developed a three-pronged approach to the assessment of field note quality. Resident input is sought through a nomination process whereby residents share field notes they found particularly impactful, and field note functionality is being further enhanced to allow residents to indicate how a field note impacted their learning. We are also developing a field note audit tool that will be used by a trained administrative assistant to randomly monitor timeliness, tone, and focus. Finally, we plan to engage preceptors in a supported self-reflective process about the quality of feedback they provide in field notes. Preceptors will receive a bundle of 10 recently composed field notes along with the preceptor field note audit tool to support the review process. Our hope is to have this activity certified for the CFPC continuous professional development process, as a means of formally acknowledging our preceptors’ contribution to educational quality.

Cohort assessment data.

The utility of cohort assessment data to inform curriculum revision is another measure of the impact of PASS. Graduating residents complete a survey indicating their level of confidence in 99 key topic areas22 and clinical procedures. The curriculum committee reviews survey results annually to identify any gaps as perceived by graduates. When a substantial percentage of residents (10% or more) report lacking confidence in or not having had the opportunity to learn about a particular topic or procedure during residency, we consider curriculum revisions. These curriculum revisions may take the form of additional didactic teaching sessions during academic days (e.g., adding a talk on seizures) or adding procedures to the simulation sessions (e.g., simulated toenail removal).
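
To illustrate the decision rule described above, the sketch below shows one way exit-survey responses could be aggregated and checked against the 10% threshold. This is our illustration only; the topic names, rating labels, and data structures are assumptions, not part of PASS or the CFPC survey instruments.

```python
# Illustrative sketch only: flag exit-survey topics where at least 10% of
# graduating residents report lacking confidence or lacking the opportunity
# to learn. Topic names, rating labels, and data layout are assumptions for
# this example, not the survey instrument itself.

from collections import Counter

REVIEW_THRESHOLD = 0.10  # 10% or more of respondents triggers curriculum review

# One dictionary of {topic: rating} per graduating resident.
survey_responses = [
    {"seizures": "not confident", "toenail removal": "confident"},
    {"seizures": "confident", "toenail removal": "no opportunity to learn"},
    {"seizures": "not confident", "toenail removal": "confident"},
    # ... one entry per respondent
]

def topics_needing_review(responses, threshold=REVIEW_THRESHOLD):
    """Return {topic: proportion reporting a gap} for topics at/above threshold."""
    gap_counts = Counter()
    for response in responses:
        for topic, rating in response.items():
            if rating in ("not confident", "no opportunity to learn"):
                gap_counts[topic] += 1
    n = len(responses)
    return {topic: count / n
            for topic, count in gap_counts.items()
            if count / n >= threshold}

print(topics_needing_review(survey_responses))
# Flagged topics become candidates for added didactic sessions or simulation
# (e.g., a talk on seizures, simulated toenail removal).
```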

Institutional level

Identification and remediation of residents in difficulty is an important measure of the impact of PASS. We monitor this particular measure at the institutional level through the number of cases referred to our Educational Advisory Board (EAB). The EAB is a special committee, convened by the associate dean of postgraduate medical education, that is responsible for assisting with academic planning for residents in need. Of the 24 cases reviewed by the EAB since 2013, 11 were family medicine cases. In all 11 cases, the EAB commended the family medicine program for the extensive resident performance information made available and for the high-quality remediation plans proposed.

Outside acknowledgment

Although external acknowledgment of PASS is not officially part of its evaluation system, recognition from other groups involved in postgraduate medical education is encouraging. Winning the Professional Association of Interns and Residents of Ontario’s prestigious Residency Program Excellence Award in 2012 (and being nominated again in 2015) is evidence that Queen’s residents value the quality of learning and assessment that PASS enables. Additionally, the official acknowledgment and high praise that PASS received from the CFPC after its most recent accreditation visit affirm the value and quality of the system.

In Sum

In summary, the strategically designed components; the thoughtfully orchestrated processes grounded in educational theory; the dedicated, informed AAs; and the robust, customized, flexible electronic platform are the cornerstones of PASS. PASS has been in use for five years, and its processes, components, and platform functionality all continue to evolve so as to serve the needs of faculty and residents. In spite of the complexity of change that moving to PASS has involved, it has been strongly endorsed by family medicine faculty and residents—and by the Queen’s postgraduate medical education community more generally. Impact measures at the individual, programmatic, and institutional levels provide evidence that the program is realizing our initial intentions and goals.

Acknowledgments: The authors acknowledge editors at Academic Medicine, Elizabeth S. Karlin and Anne L. Farmakidis, for their patience and guidance. They also thank Ulemu Luhanga for her contribution in finalizing this manuscript.

References

1. Tochel C, Haig A, Hesketh A, et al. The effectiveness of portfolios for post-graduate assessment and education: BEME guide no 12. Med Teach. 2009;31:299–318
2. Van Tartwijk J, Driessen EW. Portfolios for assessment and learning: AMEE guide no. 45. Med Teach. 2009;31:790–801
3. Donato AA, George DL. A blueprint for implementation of a structured portfolio in an internal medicine residency. Acad Med. 2012;87:185–191
4. Nagler A, Andolsek K, Padmore JS. The unintended consequences of portfolios in graduate medical education. Acad Med. 2009;84:1522–1526
5. Webb TP, Merkley TR, Wade TJ, Simpson D, Yudkowsky R, Harris I. Assessing competency in practice-based learning: A foundation for milestones in learning portfolio entries. J Surg Educ. 2014;71:472–479
6. Cooney CM, Redett RJ 3rd, Dorafshar AH, Zarrabi B, Lifchez SD. Integrating the NAS milestones and handheld technology to improve residency training and assessment. J Surg Educ. 2014;71:39–42
7. Hurtubise L, Roman B. Competency-based curricular design to encourage significant learning. Curr Probl Pediatr Adolesc Health Care. 2014;44:164–169
8. Tannenbaum D, Kerr J, Konkin J, et al. Triple C Competency-Based Curriculum: Report of the Working Group on Postgraduate Curriculum Review—Part 1. March 2011. Mississauga, Ontario, Canada: College of Family Physicians of Canada. http://www.cfpc.ca/uploadedfiles/education/_pdfs/wgcr_triplec_report_english_final_18mar11.pdf. Accessed March 27, 2014
9. Oandasan IF, Saucier D. Triple C Competency-Based Curriculum Report—Part 2: Advancing Implementation. 2013. Mississauga, Ontario, Canada: College of Family Physicians of Canada. http://www.cfpc.ca/uploadedfiles/education/_pdfs/triplec_report_pt2.pdf. Accessed March 27, 2015
10. Tannenbaum D, Konkin J, Parsons E, et al. CanMEDS—Family Medicine. October 2009. Mississauga, Ontario, Canada: College of Family Physicians of Canada. http://www.cfpc.ca/uploadedFiles/Education/CanMeds%20FM%20Eng.pdf. Accessed March 27, 2015
11. Frank JR, Snell LS, ten Cate O, et al. Competency-based medical education: Theory to practice. Med Teach. 2010;32:638–645
12. Iobst WF, Sherbino J, ten Cate O, et al. Competency-based medical education in postgraduate medical education. Med Teach. 2010;32:651–656
13. Holmboe ES, Ward DS, Reznick RK, et al. Faculty development in assessment: The missing link in competency-based medical education. Acad Med. 2011;86:460–467
14. Torbeck L, Wrightson AS. A method for defining competency-based promotion criteria for family medicine residents. Acad Med. 2005;80:832–839
15. Litzelman DK, Cottingham AH. The new formal competency-based Indiana University School of Medicine: Overview and five-year analysis. Acad Med. 2007;82:410–421
16. Mathers NJ, Challis MC, Howe AC, Field NJ. Portfolios in continuing medical education—Effective and efficient? Med Educ. 1999;33:521–530
17. Carraccio C, Englander R. Evaluating competence using a portfolio: A literature review and Web-based application to the ACGME competencies. Teach Learn Med. 2004;16:381–387
18. Norcini J, Anderson B, Bollela V, et al. Criteria for good assessment: Consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach. 2011;33:206–214
19. Van Tartwijk J, Driessen E, Van Der Vleuten C, Stokking K. Factors influencing the successful introduction of portfolios. Qual High Educ. 2007;13:69–79
20. Donato AA, Pangaro L, Smith C, et al. Evaluation of a novel assessment form for observing medical residents: A randomised, controlled trial. Med Educ. 2008;42:1234–1242
21. Tigelaar DEH, Dolmans DHJM, Wolfhagen IHAP, van der Vleuten CPM. Using a conceptual framework and the opinions of portfolio experts to develop a teaching portfolio prototype. Stud Educ Eval. 2004;30:305–321
22. Allen T, Bethune C, Brailovsky C, et al. Defining Competence for the Purposes of Certification by the College of Family Physicians of Canada: The Evaluation Objectives in Family Medicine. 2010. Mississauga, Ontario, Canada: College of Family Physicians of Canada. http://www.cfpc.ca/uploadedFiles/Education/Defining%20Competence%20Complete%20Document%20bookmarked.pdf. Accessed March 27, 2015
23. Donoff MG. Field notes: Assisting achievement and documenting competence. Can Fam Physician. 2009;55:1260–1262, e100
24. Oandasan IF, Archibald D, Authier L, et al. Giving curriculum planners an edge: Using entrance surveys to design family medicine education. Can Fam Physician. 2015;61:e204–210

References cited in Figure 1 or Appendix 1

25. Ten Cate O. Nuts and bolts of entrustable professional activities. J Grad Med Educ. 2013;5:157–158
26. Schultz K, Griffiths J, Lacasse M. The application of entrustable professional activities to inform competency decisions in a family medicine residency program [published online ahead of print February 23, 2015]. Acad Med. doi: 10.1097/ACM.0000000000000671.
27. McEwen LA, Luhanga U, Griffiths J, Schultz K, Spiller A, Acker A. Queen’s Multisource Feedback Rubrics: Operationalizing Frames of Reference for Raters and Residents. Presented at: 2013 AAMC Annual Meeting MedEdPORTAL Poster Session on “Excelling in Health Education Assessment.” MedEdPortal iCollaborative, Resource ID 820. https://www.mededportal.org/icollaborative/resource/820. Accessed March 27, 2015
28. The Evaluation Objectives in Family Medicine: Procedure Skills. Mississauga, Ontario, Canada: College of Family Physicians of Canada. http://www.cfpc.ca/uploadedFiles/Education/Procedure%20Skills.pdf. Accessed March 27, 2015
Appendix 1: Queen’s University Family Medicine (FM) Postgraduate Training Program Portfolio Assessment and Support System Blueprint

© 2015 by the Association of American Medical Colleges