Implementing Competency-Based Medical Education in a Postgraduate Family Medicine Residency Training Program: A Stepwise Approach, Facilitating Factors, and Processes or Steps That Would Have Been Helpful

Schultz, Karen MD, CCFP, FCFP; Griffiths, Jane MD, CCFP, FCFP

doi: 10.1097/ACM.0000000000001066
Innovation Reports

Problem In 2009–2010, the postgraduate residency training program at the Department of Family Medicine, Queen’s University, wrestled with the practicalities of competency-based medical education (CBME) implementation when its accrediting body, the College of Family Physicians of Canada, introduced the competency-based Triple C curriculum.

Approach The authors used a stepwise approach to implement CBME; the steps were to (1) identify objectives, (2) identify competencies, (3) map objectives and competencies to learning experiences and assessment processes, (4) plan learning experiences, (5) develop an assessment system, (6) collect and interpret data, (7) adjust individual residents’ training programs, and (8) distribute decisions to stakeholders. The authors also note overarching processes, costs, and facilitating factors and processes or steps that would have been helpful for CBME implementation.

Outcomes Early outcomes are encouraging. Residents are being directly observed more often with increased documented feedback about performance based on explicit competency standards (24,000 data points for 150 residents from 2013 to 2015). These multiple observations are being collated in a way that is allowing the identification of patterns of performance, red flags, and competency development trajectory. Outliers are being identified earlier, resulting in earlier individualized modification of their residency training program.

Next Steps The authors will continue to provide and refine faculty development, are developing an entrustable professional activity field note app for handheld devices, and are undertaking research to explore what facilitates learners’ competency development, what increases assessors’ confidence in making competence decisions, and whether residents are better trained as a result of CBME implementation.

K. Schultz is associate professor and program director, Department of Family Medicine, Queen’s University, Kingston, Ontario, Canada.

J. Griffiths is assistant professor and assessment director, Department of Family Medicine, Queen’s University, Kingston, Ontario, Canada.

Funding/Support: Funding and support were provided from the Queen’s University Department of Family Medicine’s departmental budget.

Other disclosures: None reported.

Ethical approval: Reported as not applicable.

Previous presentations: Implementation framework presented at Queen’s University Competency-Based Medical Education Retreat, the Association for Medical Education in Europe (AMEE) Conference, Memorial University’s Preceptor Meeting, and the International Conference on Residency Education in 2015.

Correspondence should be addressed to Karen Schultz, 115 Clarence St., Suite 319, Kingston, ON K7L 5N6, Canada; telephone: (613) 533-9300; e-mail: karen.schultz@dfm.queensu.ca.

Problem

Competency-based medical education (CBME) is a developing approach to medical education that is intuitive and theoretically beneficial. However, little has been written about practical CBME implementation. In 2009–2010, our postgraduate residency training program at the Department of Family Medicine, Queen’s University, Kingston, Ontario, Canada, wrestled with the practicalities of CBME implementation when our accrediting body, the College of Family Physicians of Canada (CFPC), introduced its competency-based Triple C curriculum. The new CBME accreditation standards prompted us to develop an assessment system that would also collect, collate, and interpret assessment data about our residents’ development in critical competencies over their whole residency, as well as a process to adjust individual residents’ training programs to meet their unique competency development needs. This report will outline the steps we took to implement CBME in our program, highlighting facilitating factors and processes or steps that in retrospect we wish we had taken.

Approach

In 2009–2010, we used a stepwise approach to implement CBME at our postgraduate family medicine residency training program (Figure 1). The steps in the approach were to (1) identify objectives, (2) identify competencies, (3) map objectives and competencies to learning experiences and assessment processes, (4) plan learning experiences, (5) develop an assessment system, (6) collect and interpret data, (7) adjust individual residents’ training programs, and (8) distribute decisions to stakeholders.

Figure 1

Steps 1–3: Identify objectives, identify competencies, and map them to learning experiences and assessment processes

From September to December 2009, three of our academic physicians considered CFPC accreditation standards,1 societal needs (local and global, present and future), and program-specific needs to identify objectives and competencies (steps 1 and 2). Some authors describe first identifying competencies and then writing objectives, but like most programs, we already had objectives. These were reviewed for relevance and reworded using competency-based language (e.g., active verbs permitting observation and assessment). The objectives and competencies were then mapped, or blueprinted, to the draft learning experiences and assessment processes (step 3), an important exercise to ensure that the collective learning experiences supported the development of all the competencies and objectives. Objectives that did not map to or fit within planned learning experiences were reworked or dropped (or, if the objective was critical, new learning experiences were developed), as we knew they would otherwise not be met. For curriculum planning and assessment documentation purposes, objectives were then grouped into relevant themes; given family medicine’s breadth, we organized ours into life cycle domains.1 Other specialties will have different relevant themes.
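
The mapping exercise lends itself to a simple tabular check. The sketch below (Python) is illustrative only: the objective text, themes, experiences, and assessment names are hypothetical placeholders, not items from our curriculum documents. It shows how a program might record a blueprint and flag objectives that map to no planned learning experience.

```python
# Illustrative only: a minimal blueprint check. All objective text, themes,
# experiences, and assessment names below are hypothetical placeholders.
from dataclasses import dataclass, field

@dataclass
class Objective:
    text: str
    theme: str                                        # e.g., a life cycle domain
    experiences: list = field(default_factory=list)   # planned learning experiences
    assessments: list = field(default_factory=list)   # planned assessment processes

blueprint = [
    Objective("Manage common acute adult presentations",
              theme="Adults",
              experiences=["family medicine clinic", "emergency medicine block"],
              assessments=["EPA field note", "in-training evaluation report"]),
    Objective("Counsel on routine childhood immunizations",
              theme="Children and adolescents",
              experiences=[],   # unmapped: rework, drop, or add a new experience
              assessments=[]),
]

# Objectives that map to no learning experience will not be met and need attention.
for obj in blueprint:
    if not obj.experiences:
        print(f"Needs rework or a new learning experience: {obj.text!r} ({obj.theme})")
```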

Step 4: Plan learning experiences

Planning learning experiences involved deliberately organizing and/or reorganizing competency-building experiences, utilizing local resources, and incorporating education theories such as adult learning, self-directed learning, and expertise development theory.2 We considered relevant experiential learning opportunities that would successively layer learning and repeatedly have residents bring new knowledge back into the family medicine setting. We dropped rotations we no longer felt were relevant; scaffolded experiences (e.g., residency now starts with a boot camp in which simulation courses precede clinical experiences, and clinical experiences [e.g., consecutive blocks of family, emergency, and internal medicine] are separated so that residents repeatedly cycle back to them over two years with the expectation of increasingly sophisticated performance); and timed learning to our residents’ developmental stages (e.g., foundational clinical knowledge sessions early, behavioral medicine sessions later, and practice management sessions near graduation).

Step 5: Develop an assessment system

Competency-based assessment must be authentic (i.e., workplace-based assessment of residents caring for patients), reliable, and practical, and it should ideally promote learning. Our preceptors—our real assessment “instrument” in workplace-based assessment—need to directly observe “biopsies” of resident performance; teach and provide constructive, behaviorally based feedback; and assess and illustratively document residents’ performance. Most preceptors appear to use themselves as their “gold standard,”3 with consequent low interrater reliability. Therefore, to improve reliability, we needed multiple informed, dedicated assessors. This required an assessment system that included faculty development, with a faculty development lead, so that assessors understood the standards of performance and their own assessment biases; multiple assessors for each resident; practical, intuitive tools to capture assessments; and time for preceptors to do this work.

Entrustable professional activities (EPAs)—those “professional activities that together constitute the mass of critical elements that operationally define a profession”4—allowed us to translate our hundreds of objectives (too many to individually assess) and our desired competencies (too large and not integrative enough to reflect the complex roles and tasks a physician carries out in individual patient encounters) into carefully chosen, clinically relevant activities known to all preceptors and residents. By assessing these EPAs, we would have a way to assess competencies. Having decided to use EPAs, we needed an intuitive, efficient tool for our busy preceptors to document their assessments of our residents’ performance, one that would ideally also be cost-effective and promote learning.5 Therefore, we incorporated the 36 EPAs we wrote for family medicine into our existing generic field notes (FNs), a tool we had been using on paper since 2008 and electronically since 2011. FNs are brief notes that document a resident’s performance in the clinical environment and summarize the verbal feedback given about his or her performance.6 Slipping the EPAs into this familiar tool and process of direct observation made the introduction of EPAs an easy step for our preceptors. (The development of our EPAs, associated performance standards, and the EPA FNs by an expert panel, which occurred between 2010 and 2011, is described elsewhere.7)
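
To make the idea of an EPA FN concrete, the minimal sketch below (Python) shows one possible shape for such a record; the field names, rating labels, and example content are our illustrative assumptions rather than the program’s actual FN form.

```python
# Illustrative only: one possible shape for an EPA field note record.
# Field names, rating labels, and example content are hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class EPAFieldNote:
    resident_id: str
    preceptor_id: str
    epa: str            # the entrustable professional activity observed
    observed_on: date
    setting: str        # where the "biopsy" of performance occurred
    feedback: str       # brief, behaviorally based feedback given verbally
    rating: str         # e.g., "below expected" / "as expected" / "above expected"

note = EPAFieldNote(
    resident_id="R-042",
    preceptor_id="P-007",
    epa="Periodic health examination",
    observed_on=date(2014, 3, 12),
    setting="family medicine clinic",
    feedback="Well-organized history; next time, summarize the plan back to the patient.",
    rating="as expected",
)
print(note.epa, note.rating)
```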

Steps 6 and 7: Collect and interpret data and adjust individual residents’ training programs

With our EPA FNs in place, resulting in over 150 low-stakes data points about each resident’s performance (see below), we needed a system to collect, collate, and display those data so that they could be interpreted for patterns of performance, red flags, and competency development trajectory (step 6) and used to adjust training programs to meet individual residents’ unique learning needs (step 7). Our Portfolio Assessment Support System allowed us to do that. (Our development of this system and the supporting CBME literature from 2010 that we used are described elsewhere.8) Briefly, each of our residents has an electronic portfolio where a number of items, including assessment data, accumulate. Each resident has an academic advisor (AA) with whom the resident meets every four months to review all assessment data, identifying patterns of performance, red flags, and competency development trajectory, which form the basis for the AA’s competency declarations. Our EPA FNs form most of the competency data points about our residents; however, the AAs also consider other data (e.g., in-training evaluation reports, objective structured clinical examinations, multisource feedback, and simulation course results), which triangulate and enhance assessment reliability. If an AA is unsure about a resident’s progress, the program director is consulted. Once a competency declaration is complete, the resident’s learning plan for the upcoming four months is developed, with electives and other learning experiences deliberately chosen to meet that resident’s competency development needs (step 7). This use of assessment data to individualize residents’ training programs is a critical step in maximizing their competency development.
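
As a rough illustration of what “collate and interpret” can mean in practice, the sketch below (Python, with hypothetical ratings, thresholds, and flag rules; not the actual Portfolio Assessment Support System logic) groups low-stakes field note ratings by resident and EPA and surfaces possible red flags for an AA to review.

```python
# Illustrative only: collating many low-stakes data points so patterns and
# possible red flags can be reviewed by an academic advisor. The ratings,
# threshold, and flag rule are hypothetical, not the program's actual rules.
from collections import defaultdict

# (resident_id, epa, rating) as such data might be exported from a portfolio
field_notes = [
    ("R-042", "Periodic health examination", "as expected"),
    ("R-042", "Periodic health examination", "above expected"),
    ("R-042", "Undifferentiated acute presentation", "below expected"),
    ("R-042", "Undifferentiated acute presentation", "below expected"),
    ("R-042", "Undifferentiated acute presentation", "as expected"),
]

grouped = defaultdict(list)
for resident, epa, rating in field_notes:
    grouped[(resident, epa)].append(rating)

for (resident, epa), ratings in sorted(grouped.items()):
    share_below = ratings.count("below expected") / len(ratings)
    flag = "possible red flag" if len(ratings) >= 3 and share_below >= 0.5 else "on track"
    print(f"{resident} | {epa}: n={len(ratings)}, "
          f"below expected={share_below:.0%} -> {flag}")
```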

The AA role is a crucial one made necessary by the size of our program; it is supported through faculty development and paid protected time. Each AA oversees approximately six residents, allowing for close oversight.

Step 8: Distribute decisions to stakeholders

The final step is making summative decisions about a resident’s overall competency at key junctures in training (e.g., transitions and end of training) and distributing these decisions to pertinent stakeholders (i.e., accrediting bodies, the university, and regulatory authorities). Given our already short two-year program, we do not anticipate training being shortened, because some competencies, such as understanding continuity of care, cannot be compressed; for other programs, however, modification of program length could be an outcome of such summative decisions.

Overarching processes, costs, and facilitating factors and processes or steps that would have been helpful

It cannot be stated strongly enough that overarching all these steps are change management strategies, faculty development, research, and quality assurance (ongoing program refinement and adjustment through evaluation and adaptation) (Figure 1). For example, because CBME assessment takes more time than the assessment that now occurs in most training programs, and therefore challenges preceptor buy-in, it requires careful change management. Change management seems similar to working with patients to promote compliance: You need to explain proposals so that they make sense, address concerns, give people a voice in proposed changes, identify and provide feedback when things are going well, and explore and adapt when things are not. We consulted preceptors and residents early on and have continued to seek their input. Having our accrediting body articulate CBME principles as national accreditation standards was very beneficial in motivating change. To get buy-in from our non-CFPC colleagues, who also work with and assess our residents, it would have been helpful to have CBME principles articulated as university deliverables and/or as standards of their accrediting bodies. Making critical roles and expectations (e.g., someone designated and supported to oversee CBME implementation, the AA role, and expected frequent low-stakes assessments) program deliverables at an institutional level is something our institution is now discussing and that we anticipate will be helpful.

Implementing CBME as we have envisioned has required extra resources. Costs (reported as time because time-to-money conversion varies across settings) have been as follows: program director (1 day/week), curriculum director (half day/week), and assessment director (half day/week) time to design, implement, evaluate, and improve processes; 36 hours/year/AA (each AA has 6 residents whom he or she meets with individually 3 times a year; each meeting takes 2 hours—1 for preparation and 1 with the resident); and 1 blocked preceptor clinic appointment slot/resident/clinic for direct observation and feedback (which does not always happen because of patient add-ins but is attempted). Portfolio and FN software development took approximately 600 hours; upkeep and improvement since then have taken about 150 to 250 hours/year. Access to an information technology developer for quick responses to issues and upgrades has been critical. All costs have been paid by our department, highlighting the importance of departmental buy-in. We have also added a helpful role, an “academic support person” (half day/week)—someone who can build skills in educational diagnosis and remediation and serve as an additional resource and assessor for residents identified as being in difficulty.

As CBME implementation has progressed, we have discovered several facilitating factors and processes or steps that in retrospect would have been helpful (Table 1).

Table 1

Having a few local champions within the program has been beneficial in broadening thinking and maintaining enthusiasm. Networking on a wider scale, facilitated by the CFPC, has been similarly beneficial. We look forward to more local networking and sharing of resources as other programs at our institution implement CBME.

Faculty development, and a faculty development lead, have also been critical. Our faculty development for CBME implementation has primarily focused on developing preceptors’ skills in day-to-day assessment and in their new AA role. This development has occurred through multimodal faculty development (small-group sessions, one-on-one sessions, online presentations), incorporating standards of performance into the EPA FNs, and providing feedback from residents to preceptors about their performance in both their preceptor and assessor roles. In retrospect, given that our preceptors really are our instruments for competency-based assessment, we would have benefited from a process (parallel to the one outlined in Figure 1) that built our preceptors’ expertise as teachers and assessors—a process that would, for example, identify the objectives and competencies needed for teaching and assessment expertise, outline a curriculum and assessment strategy to facilitate ongoing preceptor development, and ideally assess and provide feedback to preceptors about their teaching and assessing. We anticipate that a new CFPC initiative to structure faculty development by detailing preceptor and academic coaching competencies and by collecting and organizing faculty development resources will be very helpful in beginning the work of laying out a faculty development program.9

In retrospect, developing residents’ self-assessment, feedback-seeking, and self-directed learning skills would have been beneficial. Therefore, we plan to start this development during the first-year boot camp.

Building on the change and excitement of CBME implementation has been greatly facilitated by employing a part-time PhD education researcher. Her expertise has enriched our research and quality assurance work. If other programs have education researchers available to them, we strongly encourage this addition.

Outcomes

Early post-CBME-implementation outcomes are encouraging. Our residents are now being directly observed more often, with increased documented feedback about performance based on explicit competency standards (from 2013 to 2015 we gathered over 24,000 data points for our 150 residents). These multiple observations are being collated in a way that allows the AAs to identify patterns of performance, red flags, and competency development trajectory. We are identifying outliers earlier, both residents in difficulty and those excelling, resulting in earlier individualized modification of their residency training programs. The documentation of patterns of performance, red flags, and competency development trajectory has been critical in supporting program decisions to extend or, rarely, terminate training. Following review by our Resident Assessment Committee (where all high-stakes recommendations to our associate dean about residents in difficulty are discussed), training was extended for 8 of about 350 residents from 2010 to 2015, for a total of 19 extra training blocks (where 1 block is 4 weeks). More often, remediation has been addressed by using one or both of the resident’s elective blocks and/or by changing some of his or her planned rotations or learning experiences. The majority are successful with these modifications (ultimately passing their certification exam), but 4 residents have been asked to leave the program after due process, remediation, and probation were exhausted. All decisions have been upheld when scrutinized by appeal boards (as per our university’s appeal process).

Next Steps

The extra time and effort CBME requires need constant reinforcement and troubleshooting. We will continue to provide and refine faculty development. In addition, we are developing an EPA FN app for handheld devices that our community and hospital preceptors have told us they would prefer to use over the current desktop version. We are also undertaking research to explore what facilitates our learners’ competency development, what increases assessors’ confidence in making competence decisions, and whether residents are better trained as a result of CBME implementation. Finally, this work should never be considered finished because once CBME has been implemented it must be maintained and adapted. Societal needs will change, objectives and competencies will need revisiting, and curriculum and assessment standards will evolve, all of which necessitate ongoing program evaluation and refinement.

Acknowledgments: The authors wish to acknowledge Drs. Glenn Brown, head of the Department of Family Medicine, Ross Walker, associate dean of postgraduate education, and Richard Reznick, dean of the Faculty of Health Sciences, Queen’s University, for their support of this work; Drs. Elaine Van Melle, Laura McEwen, and Tom Laughlin for their insightful ideas; Rachelle Porter for her information technology development expertise; and the Queen’s University family medicine residents and preceptors for their patience and feedback as this work was implemented.

References

1. Oandasan I, Saucier D. Triple C competency-based curriculum report—Part 2: Advancing implementation. Mississauga, Ontario, Canada: College of Family Physicians of Canada; 2013. www.cfpc.ca/uploadedFiles/Education/_PDFs/TripleC_Report_pt2.pdf. Accessed April 19, 2015.
2. Swing SR; International CBME Collaborators. Perspectives on competency-based medical education from the learning sciences. Med Teach. 2010;32:663–668.
3. Berendonk C, Stalmeijer RE, Schuwirth LW. Expertise in performance assessment: Assessors’ perspectives. Adv Health Sci Educ Theory Pract. 2013;18:559–571.
4. ten Cate O, Scheele F. Competency-based postgraduate training: Can we bridge the gap between theory and clinical practice? Acad Med. 2007;82:542–547.
5. Van Der Vleuten CP. The assessment of professional competence: Developments, research and practical implications. Adv Health Sci Educ Theory Pract. 1996;1:41–67.
6. Donoff MG. Field notes: Assisting achievement and documenting competence. Can Fam Physician. 2009;55:1260–1262, e100–e102.
7. Schultz K, Griffiths J, Lacasse M. The application of entrustable professional activities to inform competency decisions in a family medicine residency program. Acad Med. 2015;90:888–897.
8. McEwen LA, Griffiths J, Schultz K. Developing and successfully implementing a competency-based portfolio assessment system in a postgraduate family medicine residency program. Acad Med. 2015;90:1515–1526.
9. Walsh A, Antao V, Bethune C, et al. Fundamental teaching activities in family medicine: A framework for faculty development. Mississauga, Ontario, Canada: College of Family Physicians of Canada; 2015. http://www.cfpc.ca/uploadedFiles/Education/_PDFs/FTA_GUIDE_TM_ENG_Apr15_REV.pdf. Accessed November 17, 2015.

Reference cited in Table 1 only

10. Kotter JP. Leading change: Why transformation efforts fail. Harv Bus Rev. March/April 1995:1–10.