The Accreditation Council for Graduate Medical Education (ACGME) initiated its Outcome Project to better prepare physicians-in-training to practice in the rapidly changing medical environment and mandated assessment of competency in six outcomes, including Practice-Based Learning and Improvement (PBLI) and Systems-Based Practice (SBP). Before the initiation of the Outcome Project, these competencies were not an explicit element of most graduate medical education training programs. Since 1999, directors of ACGME-accredited programs nationwide have been challenged to teach and assess these competencies. The authors describe an institution-wide curriculum intended to facilitate the teaching and assessment of PBLI and SBP competencies in the 115 ACGME-accredited residency and fellowship programs (serving 1,327 trainees) sponsored by Mayo School of Graduate Medical Education. Strategies to establish the curriculum in 2005 included development of a Quality Improvement (QI) curriculum Web site, one-on-one consultations with program directors, a three-hour program director workshop, and didactic sessions for residents and fellows on core topics. An interim program director self-assessment survey revealed a 13% increase in perceived ability to measure competence in SBP, no change in perceived ability to measure competence in PBLI, a 15% increase in ability to provide written documentation of competence in SBP, and a 35% increase in ability to provide written documentation of competence in PBLI between 2005 and 2007. Nearly 70% of the programs had trainees participating in QI projects. Further research is needed to evaluate the cost-effectiveness of such a program and to measure its impact on learner knowledge, skills, and attitudes and, ultimately, on patient outcomes.
Dr. Varkey is associate professor of medicine, preventive medicine and medical education, Division of Preventive and Occupational Medicine, College of Medicine, Mayo Clinic, Rochester, Minnesota, and director, Quality Improvement Curricula, Mayo School of Graduate Medical Education and Mayo Medical School, Rochester, Minnesota.
Mr. Karlapudi is a research trainee, Division of Preventive and Occupational Medicine, College of Medicine, Mayo Clinic, Rochester, Minnesota.
Dr. Rose is associate professor of anesthesiology and vice dean, Mayo School of Graduate Medical Education, College of Medicine, Mayo Clinic, Rochester, Minnesota.
Dr. Nelson is professor of medicine, Division of Endocrinology, College of Medicine, Mayo Clinic, Rochester, Minnesota.
Dr. Warner is professor of anesthesiology and dean, Mayo School of Graduate Medical Education, College of Medicine, Mayo Clinic, Rochester, Minnesota.
Correspondence should be addressed to Dr. Varkey, Mayo Clinic, Baldwin 5A, 200 First Street SW, Rochester, MN 55905; telephone: (507) 284-9966; fax: (507) 284-4251; e-mail: (Varkey.email@example.com).
Front-line physicians, including residents and fellows, frequently confront systems inefficiencies, hand-offs, and medical errors. However, most are not adequately trained to practice in this kind of health care environment. To address some of these issues, the Accreditation Council for Graduate Medical Education (ACGME) initiated its Outcome Project1 and identified six competencies in which all residents and fellows should be proficient at the time of graduation. Among them, the Practice-Based Learning and Improvement (PBLI) and Systems-Based Practice (SBP) competencies are related to the more general domain of Quality Improvement (QI).2 Because these two competencies have not been an explicit part of the traditional curriculum of most specialty programs, program directors (PDs) have found it challenging to teach and assess them.3,4 Lack of faculty with expertise, lack of institutional support for patient safety and QI,5 and residents’ perception of this work as a diversion from learning medicine6 are some of the common concerns raised by PDs about incorporating these two competencies into training repertoires.
Although many aspects of PBLI and SBP require a common body of knowledge and skills that translates across all medical specialties, there is little published literature describing institution-wide initiatives to teach and assess PBLI and/or SBP competencies across a broad range of specialty programs. Medio et al7 have described an institution-wide core curriculum for 47 residency programs using didactics and discussions of topics including resident as a teacher, Medicare, hospital practice, ethics, medical–legal issues, statistics, socioeconomics, cost containment, communication skills, research design, and critical review of literature; the curriculum was well received by faculty and residents. Impact on a wider audience and interaction among residents from different specialties7 are among the potential benefits of such a system-wide strategy, as well as efficient use of content experts, prevention of resource burnout, and cost-effectiveness. With these benefits in mind, an institution-wide program on SBP/PBLI for all ACGME-accredited programs across the three Mayo Clinic sites was designed and implemented. In this paper, we describe its implementation and our analysis of its impact during its second year.
Mayo’s Institution-Wide PBLI and SBP Program
Mayo School of Graduate Medical Education (MSGME) provides administrative support and oversight to 115 ACGME-accredited graduate medical education programs (97 specialties; 1,327 residents) offered across the three Mayo Clinic campuses. These are in Rochester, Minnesota (75 programs; 1,068 residents), Jacksonville, Florida (22 programs; 139 residents), and Phoenix/Scottsdale, Arizona (18 programs; 120 residents). In 2005, through self-identification on a survey, several PDs requested MSGME's assistance with implementing the PBLI and SBP competencies. A lack of faculty expertise and resources in these areas was noted, especially among smaller programs. In response to these needs, the MSGME dean’s office initiated an institution-wide program (the MSGME QI program) in October 2005. The MSGME QI program was created with the following strategic objectives: (1) to facilitate the implementation of the SBP and PBLI competencies in MSGME programs and (2) to foster knowledge and skills in QI among all MSGME learners. The dean of MSGME appointed a physician QI and patient safety expert as director of this QI program. This physician is also a PD with an interest in QI who had successfully implemented a PBLI and SBP curriculum for preventive medicine fellows during the three prior years.
As a first step, extensive benchmarking was conducted during the program's first three months. This included a review of the health professions and management literature to identify teaching and assessment strategies used at other institutions. The general director also met with PDs within the institution who had successfully implemented PBLI/SBP curricula, and advice was sought from local experts who had previously implemented an institution-wide curriculum in ethics and professionalism.
The MSGME QI curriculum was developed based on the framework of Kern's model for curriculum development.8 Kern's model emphasizes six steps of curriculum development: general needs assessment, targeted needs assessment, goals and objectives for the curriculum, teaching methodologies, implementation strategies, and evaluation and feedback. In addition to didactics, case-based learning exercises and learner involvement in educational QI projects (QIPs)9,10 were identified as preferred experiential teaching/learning methodologies. To facilitate implementation of this curriculum, the director met one-on-one with PDs, facilitated a PD workshop on teaching PBLI and SBP, developed a centralized curriculum Web site, and organized institution-wide didactic sessions for learners on specific topics.
Meetings with PDs
As part of the effort to assist PDs in defining specialty-appropriate content and developing teaching methods and assessment strategies for PBLI and SBP, the director of the QI program met with every PD individually at all Mayo Clinic sites. In preparation for these meetings, a formal announcement of the goals of the program was distributed to all PDs as an e-mail from the dean of MSGME, raising awareness of the program and the resources available. On the Rochester campus, meetings with PDs were prioritized based on program size and/or requests for assistance. Meetings with PDs on the Arizona and Florida campuses were scheduled over a two-day period at each site. The general director of the QI program met with each PD at least once (one-hour sessions). At the discretion of the PD, the meeting often included the associate PD, the education program coordinator, and, occasionally, residents/fellows.
An interactive discussion on the purpose, scope, and objectives of PBLI and SBP (as described by the ACGME) was part of every meeting. Training in PBLI aims to provide learners with the skills and knowledge necessary to identify the strengths and deficiencies in their medical knowledge, systematically evaluate patient care, assimilate scientific evidence, and implement changes aimed at providing higher-quality patient care. Training in SBP aims to develop resident skills in systems thinking, team collaboration, health care financing, safety, and patient advocacy. A wide variety of tactics including teaching methods, reading resources, MSGME Web resources, assessment tools,11 and examples of QIPs done by learners in various programs both within and outside of the MSGME were also discussed. This was followed by brainstorming on existing MSGME educational strategies and curricula that could be enhanced to meet some of the needs for the teaching and assessment of PBLI and SBP, with special emphasis placed on discussing specialty-specific QIP opportunities (e.g., standardizing the management of urinary incontinence in the geriatric fellowship, enhancing medication reconciliation in the preoperative clinic served by anesthesia residents). Timing of QIPs in the curriculum and preparation for site visits were among other topics discussed in detail in the meetings.
In June 2006, a three-hour workshop was conducted on the Rochester campus and video-telecast to the Florida and Arizona campuses to assist PDs in implementing the PBLI and SBP curriculum. This in-depth workshop, facilitated by QI educators at all three sites, detailed effective strategies for teaching and assessing the two competencies such as experiential learning through QIPs, case-based learning, simulations, videos followed by debriefs, and morbidity and mortality rounds on systems issues. The workshop also presented case study examples from multiple programs, identified various experts/champions (physician, administrative, and allied health), provided suggestions on how to efficiently incorporate these competencies into existing curricula (stand-alone curriculum versus longitudinal curriculum), and addressed potential barriers to implementation including resources, faculty development, and teaching QI in a clinical context.
QI curriculum Web site
A Web site was created as a centralized resource for sharing content, teaching and assessment strategies, and QIP examples for PBLI and SBP. This Web page was linked to the MSGME Web site and consists primarily of static content: presentations, interactive exercises (e.g., exercises on cost-effective practice, systems thinking), reading resources, evaluation methods, and examples of QIPs. The Web site is frequently updated with best practices identified during PD meetings and internal reviews, and it offers resources geared toward both faculty and learner needs.
Institution-wide didactic sessions
The majority of PDs expressed a desire to have MSGME present didactics on insurance systems (an element of SBP), basic principles of QI, and systems thinking (elements of PBLI and SBP). Each didactic session was conducted twice in Rochester in 2007, and some PDs mandated the attendance of their residents. These didactic sessions were made interactive through group exercises and an audience response system (ARS) that prompted immediate learner responses to, and discussion of, questions posed by the lecturer. In each session, three questions were asked of the participating learners via the ARS both before and after the session to gauge their change in knowledge.
Impact of the Program
In the first 18 months of the program, the general director of the MSGME QI program met with 111 (96.5%) PDs and/or their representatives. To gauge the effectiveness of the QI program, we compared PD self-assessment surveys conducted in 2005 and 2007. These surveys used Likert scales for assessing preparedness of programs in the areas of SBP and PBLI competencies. The surveys in both years were identical except for an additional question on the usefulness of the one-on-one PD meetings in the 2007 survey. In 2005, 68 (59.1%) PDs (48 in Rochester, 15 in Arizona, and 5 in Florida) responded to the survey; 72 (62.6%) PDs (50 in Rochester, 12 in Arizona, and 10 in Florida) responded in 2007. PD perceptions of their ability to measure competency in PBLI in 2005 and 2007 were similar (see Table 1). However, in 2007, 46% (31/68) as compared with 33% (22/67) of PDs in 2005 perceived they were in “good shape” to measure competency in SBP. Similarly, in 2007, 78% (54/69) of the PDs stated they would be able to provide written evidence of assessing SBP at a site visit as compared with 63% (41/65) in 2005, an increase of 15%. In 2007, 79% (54/69) of the PDs stated they would be able to provide written evidence of assessing PBLI as compared with 44% (29/66) in 2005, an increase of 35%. Fifty-seven of the 72 PDs who responded to the 2007 survey provided feedback on the usefulness of specific strategies discussed in the one-on-one meetings with them. Fifty-four percent (31/57) found the definition and scope of competencies to be useful, 70% (40/57) found the teaching methodologies to be useful, and 61% (35/57) found examples of QIPs to be useful (see Figure 1).
At the time of the 2007 survey, 44% (30/68) of the responding PDs had all residents or fellows involved in a QIP; 26% (18/68) had some (but not all) of their residents involved in a QIP. About 21% (14/68) stated they were in the process of working out strategies to involve their residents in QIPs, whereas 9% (6/68) of the programs did not have any plans to incorporate QIPs in residents’ training at the time of the survey. Although this information was not available from the 2005 survey, the qualitative information available suggests that fewer than 10 programs had residents engaged in QIPs before the program began.
Since the creation of the QI curriculum Web site, there have been more than 3,500 views. We were unable to track the number of downloads from the site. In terms of the institution-wide didactic sessions, the insurance systems sessions were attended by a total of 192 (18%) learners. Significant improvements in learner knowledge of insurance systems were noted for all three questions assessing knowledge (P < .0001). Most learners (97%; 158/162) rated the overall usefulness of the sessions to be good, excellent, or outstanding. Two hundred twenty-five (21%) learners attended the health care QI didactics. Significant improvements in learner knowledge were noted for two of the three questions (P < .0006). Most learners (97%; 150/154) rated the usefulness of these sessions to be good, excellent, or outstanding.
It is important to explore efficient and cost-effective strategies as programs develop and implement SBP and PBLI curricula. One such strategy is to develop a system-wide initiative to facilitate implementation of these competencies. This approach has the advantage of overcoming issues related to scarce faculty resources and expertise. In this report, we have described the implementation of an institution-wide PBLI-SBP curriculum across 115 residency and fellowship programs. Our interim analysis in the second year of the program showed that the majority of programs had all or some of their residents or fellows involved in a QIP and that most PDs were confident they could demonstrate resident competency in PBLI and SBP.
Support of senior officials has been described by Medio et al7 as a key factor in the success of institution-wide programs. The MSGME QI program was initiated by the dean of MSGME and continues to be a centralized program very closely supervised by the dean and MSGME administrative staff. The general director’s dual experience as a QI expert and a PD also helped her relate to the concerns of other PDs. Furthermore, the development of a common shared vision, brainstorming, and active goal setting in the one-on-one meetings, as well as the flexible and decentralized approach to the implementation of this curriculum, were critical to the success of the program. Components such as these have been well described in a study examining effective curriculum reform.12 The program process also closely followed the principles of QI in benchmarking, shared leadership,13 and participatory meetings.14
A centralized Web site was designed to provide up-to-date, reliable information on best educational practices, resulting in the sharing of knowledge, tools, and resources across specialties. This Web site was often linked to the electronic curricula of specific residencies and fellowships, and PDs required learners to complete educational modules related to these competencies. Although not related to SBP or PBLI, York et al15 describe a similar Web-based approach to providing a core curriculum of 13 modules for 1,000 residents in 60 training programs across 18 training sites at the University of Illinois at Chicago. What we found especially interesting in our case was that although the centralized Web site was linked to several program Web sites, it was not noted among the most helpful interventions of the QI program. This may be because the Web site does not yet offer interactive educational materials linked to competency assessment. We are in the process of developing such modules for 2009.
Comparative analysis of PDs’ perceptions in 2005 and 2007 revealed that within the first 18 months of the program, more PDs knew how to demonstrate competence in PBLI and SBP. About 70% of all programs had residents involved in QIPs, and 21% of programs were planning to involve their residents in QIPs. We are hopeful that this form of experiential curriculum will translate into improved patient care, as has been noted in other studies,10,16 as well as sustained learner knowledge and skills in SBP and PBLI.
Although no improvement was demonstrated in PDs’ perceptions of their ability to measure competency in PBLI, a 13% increase was noted in their ability to measure competency in SBP. Of the educational tools discussed during the meetings with the general director of the QI curriculum, most PDs found teaching methodology, examples of QIPs, assessment tools, and definition and scope to be the most useful. For other institutions that have interest in initiating such a QI program, this subject matter would be important to address with PDs.
On the basis of PD requests, we conducted didactic sessions on the basics of QI and insurance systems, attended by 21% and 18% of MSGME residents, respectively. The low attendance is likely attributable to the fact that similar sessions had previously been held in six large programs, attended by 229 residents (21% of the total residents at the Rochester campus), before the initiation of the MSGME didactics, and to the fact that attendance at the didactics was not mandated by MSGME or the individual programs. The institution-wide sessions themselves were well received by residents in attendance, and postsession assessments revealed statistically significant improvements in knowledge of both topics. Similar institution-wide sessions, as described by Medio et al,7 can be an effective strategy for meeting the educational needs of residents in a variety of specialties. The ARS encouraged active interaction by the audience and allowed the presenter to receive real-time feedback on learners' understanding of key concepts, thus allowing for real-time improvements to the presentation. ARS use is known to enhance learner knowledge compared with didactics in which no interaction occurs.17 We also found the ARS useful for tracking attendance and for relaying learner competence in the subject matter to the PDs.
Our evaluations and analysis have limitations. Approximately 40% of the PDs did not respond to the surveys, so possible selection bias limits our interpretation of the outcomes and impact of this program. However, this response rate is similar to those reported in other PD surveys.18 In addition, 44 programs had a change of PD between the two survey years. It is also possible that improvements in meeting the requirements of the ACGME are not attributable to this institutional program but instead reflect national trends or preparation for a midcycle internal review or a site visit. Finally, this is an interim analysis of a new system-wide initiative, and we do not have information on the cost-effectiveness of such a program or on its impact on patient outcomes or accreditation at site visits.
Future initiatives of this program focus on providing ongoing support to PDs; creating interactive Web-based modules linked to pre- and posteducation competency assessment tools that could automatically populate the learner’s portfolio; and creating experiential simulation (or objective structured clinical examination) modules for education and assessment in specific patient-safety-related competencies, as relevant to individual specialty programs.
Anecdotally, many PDs find it challenging to teach and assess the PBLI and SBP competencies. To our knowledge, documented strategies to address and measure these competencies have been limited to individual programs or related groups of specialty programs. A systems approach centralized at the level of the school of graduate medical education, similar to what we have described at MSGME, may prove to be an efficient and cost-effective method to facilitate the implementation of these competencies across the different specialty programs in an institution. Further research is needed to evaluate the cost-effectiveness of such a program and its sustained impact on learner knowledge, skills, and attitudes as well as patient outcomes.
This study was partially funded through a Medicine Innovation Development and Advancement System Grant.
1 Swing SR. Assessing the ACGME general competencies: General considerations and assessment methods. Acad Emerg Med. 2002;9:1278–1288.
2 Varkey P, Natt N, Lesnick T, Downing S, Yudkowsky R. Validity evidence for an objective structured clinical examination to assess competency in systems-based practice and practice-based learning and improvement: A preliminary investigation. Acad Med. 2008;83:775–780.
3 Moskowitz EJ, Nash DB. Accreditation Council for Graduate Medical Education competencies: Practice-based learning and systems-based practice. Am J Med Qual. 2007;22:351–382.
4 Mohr JJ, Randolph GD, Laughon MM, Schaff E. Integrating improvement competencies into residency education: A pilot project from a pediatric continuity clinic. Ambul Pediatr. 2003;3:131–136.
5 Tomolo A, Caron A, Perz ML, Fultz T, Aron DC. The outcomes card. Development of a systems-based practice educational tool. J Gen Intern Med. 2005;20:769–771.
6 David RA, Reich LM. The creation and evaluation of a systems-based practice/managed care curriculum in a primary care internal medicine residency program. Mt Sinai J Med. 2005;72:296–299.
7 Medio FJ, Arana GW, McCurdy L. Implementation of a college-wide GME core curriculum. Acad Med. 2001;76:331–336.
8 Kern DE, Thomas PA, Howard DM, Bass EB. Curriculum Development for Medical Education: A Six-Step Approach. Baltimore, Md: The Johns Hopkins University Press; 1998.
9 Allen E, Zerzan J, Choo C, Shenson D, Saha S. Teaching systems-based practice to residents by using independent study projects. Acad Med. 2005;80:125–128.
10 Varkey P, Reller MK, Smith A, Ponto J, Osborn M. An experiential interdisciplinary quality improvement education initiative. Am J Med Qual. 2006;21:317–322.
11 Accreditation Council for Graduate Medical Education. Toolbox of Assessment Methods—ACGME Outcome Project. Chicago, Ill: Accreditation Council for Graduate Medical Education; 2000.
12 Bland CJ, Wersal L. Effective leadership for curricular change. In: Norman GR, van der Vleuten CPM, Newble DI, eds. International Handbook of Research in Medical Education. Dordrecht, The Netherlands: Kluwer Academic Publishers; 1999:1136.
13 Casady WM, Dowd TA. Shared leadership and the evolution of “one great department.” Radiol Manage. 2005;27:52–54, 56–59.
14 Green LW. Study of Participatory Research in Health Promotion: Review and Recommendations for the Development of Participatory Research in Health Promotion in Canada. Ottawa, Canada: Royal Society of Canada; 1995.
15 York JW, Stapleton G, Sandlow LJ. A Web-based core curriculum to meet certification and training needs for medical residents. J Asynchronous Learn Netw. 2003;7:96–104.
16 Ogrinc G, Headrick LA, Morrison LJ, Foster T. Teaching and assessing resident competence in practice-based learning and improvement. J Gen Intern Med. 2004;19:496–500.
17 Schackow TE, Chavez M, Loya L, Friedman M. Audience response system: Effect on learning in family medicine residents. Fam Med. 2004;36:496–504.
18 Heard JK, Allen RM, Clardy J. Assessing the needs of residency program directors to meet the ACGME general competencies. Acad Med. 2002;77:750.
© 2009 Association of American Medical Colleges