
A Continuous Quality Improvement Curriculum for Residents: Addressing Core Competency, Improving Systems

Djuricich, Alexander M.; Ciccarelli, Mary; Swigonski, Nancy L.

Section Editor(s): Pinsky, Linda MD

Papers: Forces Impacting Graduate Medical Education

Purpose. To describe the development, implementation, and evaluation of a residency continuous quality improvement (CQI) curriculum.

Method. Forty-four medicine and pediatrics residents participated in a CQI curriculum. Resident-designed projects were scored for CQI construct skills using a grading tool. Pre- and post-tests evaluated knowledge, perceived knowledge, interest, and self-efficacy.

Results. Differences between pre- and post-test perceived knowledge and self-efficacy were highly significant (p < .001). The mean project score was 81.7% (SD 8.3%). Higher knowledge was associated with higher ratings of self-efficacy. There was no correlation of measured knowledge with project score or interest.

Conclusions. Resident education and learning in CQI served to produce innovative and creative improvement projects that demonstrated individual residents’ competency in practice-based learning and improvement.

This research was supported in part by the Anne E. Dyson Community Pediatrics Training Initiative.

Correspondence: Alexander M. Djuricich, MD, OPW-M200, 1001 West 10th Street, Indianapolis, IN 46202; e-mail: 〈〉.

Practice-based learning and improvement (PBLI), one of the six competencies outlined by the Accreditation Council for Graduate Medical Education (ACGME) Outcomes Project, is a challenge for residency educators to assess and evaluate.1 In addition to incorporating the use of evidence-based medicine, information technology, and the teaching of other health care providers, the PBLI competency is designed to help residents perform practice-based improvement activities using a systematic methodology.2 As an alternative approach to chart auditing, a common tool used to demonstrate resident competency in PBLI, we designed a curriculum in which residents constructed self-initiated continuous quality improvement (CQI) proposals to improve systems.

CQI is a methodology designed to improve systems in an organized fashion. Many residency programs have developed quality improvement projects to improve some aspect of patient care,3,4 and have involved residents in those projects.5 Indeed, some of these ideas have been innovative and well implemented, with a resultant measurable improvement in particular processes or systems of care.6 One aim of resident involvement in quality improvement is to have residents explore system operations and the critical component of interdisciplinary teamwork in system function and improvement. Another benefit is resident reflection leading towards active engagement in the creation of solutions, which is an important step beyond identifying the problems.

Although resident participation in quality improvement activities is not new, evaluating their ability to lead this type of project development is a higher-level skill.7 The purpose of this research was to determine whether residents can learn about CQI in an organized fashion, and demonstrate project development within the context of their own patient care and the residency program.

Method

Our university-based medical school is affiliated with an adult tertiary care hospital, a children's hospital, an inner-city county hospital, a Veterans’ Administration hospital, and a community hospital. The departments of medicine and pediatrics train 105 categorical medicine, 13 transitional medicine, 51 combined internal medicine/pediatrics, two medicine/neurology, 75 categorical pediatrics, 11 combined emergency medicine/pediatrics, and five combined psychiatry/child psychiatry/pediatrics residents.

In September 2002, we conducted a needs assessment consisting of a review of the medical education and quality literature and a series of discussions with residency program directors, ambulatory clinic directors, and faculty. This information guided the creation of an approach in which residents could learn CQI concepts within the context of real clinical experiences. In January 2003, a preliminary curriculum was piloted with the medicine residents. Evaluation and feedback from the pilot reshaped the current curriculum. The pediatrics residency joined the pilot in March 2003, and an interdepartmental curriculum for both residencies was adopted in July 2003.

Initially, the one-month required postgraduate year three (PGY3) ambulatory block rotation was selected as the best potential time to teach a system-based skill like CQI. PGY3 medicine residents, who are familiar with the hospital systems and the training program, are stimulated to learn about practice management topics as they recognize their approaching graduation. An ambulatory PGY2 rotation, which focused on community pediatrics, was chosen as the rotation in which to begin the curriculum for the pediatrics residents. All consecutive upper-level residents on these required ambulatory block rotations participated in the CQI curriculum.

Specific content areas within CQI were identified as important for the residents to learn. A curricular structure including objectives, teaching content, an individual project template, and evaluation methods was created. The objectives of the curriculum were:

  1. Residents will learn basic principles and methodology of CQI, specifically the Plan-Do-Study-Act (PDSA) cycle and the “Model for Improvement.”8
  2. Residents will create quality improvement projects that focus on improving either patient care or a particular aspect of the residency program itself.

The curriculum was divided into three portions. A 60-minute didactic session describing CQI theory and its applications was given to the residents early in the rotation. Examples of CQI in health care9 were reviewed using a case-based format. The residents were then tasked with selecting and developing their own CQI project ideas, which could address either patient care issues or the residency program itself.

Another one-hour session was scheduled exclusively for the participating resident group, without the faculty facilitator, to “brainstorm” ideas for their projects. They were encouraged to e-mail their initial ideas to the preceptor, who provided individual resident feedback. This formative evaluation helped residents refine the focus of their projects.

At the final one-hour session, each resident delivered a five-minute presentation of his or her own project to the resident group and teaching faculty. Presentations emphasized feasibility of and barriers to project implementation. The faculty preceptor, a medicine–pediatrics physician with faculty development experience but no formal quality improvement training, provided an evaluation of the residents’ written project using specific criteria on a formal evaluation tool.

The evaluation tool measured adherence to the PDSA cycle, ability to measure outcomes, feasibility, relevance, and affordability. Adherence to the PDSA cycle accounted for the majority of the project score (32 points). The other four criteria were each scored on a 0–4-point scale. The highest attainable score was 48 points. Graded evaluations were returned to the resident with comments and suggestions.
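The scoring arithmetic described above can be sketched as follows. This is an illustrative reconstruction only: the evaluation tool itself is not reproduced in the paper, and the criterion names and example scores below are hypothetical.

```python
# Illustrative sketch of the project-scoring arithmetic (hypothetical
# criterion labels and example scores; the actual tool is not published here).

PDSA_MAX = 32  # adherence to the PDSA cycle carries the majority of points
OTHER_CRITERIA = ["outcome measurement", "feasibility", "relevance", "affordability"]
OTHER_MAX = 4  # each of the other four criteria is scored on a 0-4 scale

MAX_SCORE = PDSA_MAX + OTHER_MAX * len(OTHER_CRITERIA)  # 48 points total

def project_percent(pdsa: int, others: dict) -> float:
    """Convert raw criterion scores to the percentage reported in the paper."""
    assert 0 <= pdsa <= PDSA_MAX
    assert all(0 <= others[c] <= OTHER_MAX for c in OTHER_CRITERIA)
    total = pdsa + sum(others[c] for c in OTHER_CRITERIA)
    return 100 * total / MAX_SCORE

# A hypothetical resident scoring 27/32 on PDSA adherence and 3/4 on each
# remaining criterion earns 39 of 48 points, i.e. 81.25% -- close to the
# mean project score of 81.7% (39.2/48) reported in the Results.
example = project_percent(27, {c: 3 for c in OTHER_CRITERIA})
```

Under this rubric, PDSA adherence alone determines two-thirds (32/48) of the total score, which reflects the curriculum's emphasis on the improvement cycle itself.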

In addition to constructing a project, each resident completed a pre- and a post-test. Test questions were developed by the faculty preceptor, then reviewed for face and content validity by three senior faculty members, one of whom had training in survey development and design. Questions covered perceived knowledge (“How much do you know about continuous quality improvement?”), self-efficacy (“I believe I am able to develop and implement a CQI project.”) and interest (“I would like to participate in a project if it helped improve patient care or improve the residency program.”). Responses were rated on a five-point scale. A knowledge score, obtained using five items (“Who are the customers of a residency?,” “What is the difference between quality and CQI?,” “Give a clinical example of CQI in health care,” “List elements of the PDSA cycle,” and “Which is NOT a core concept of CQI?”), was coded as a percentage of the correct responses. Paired t-tests were used to test for a significant difference in pre- and post-test scores. Pearson correlations were used to measure the association of perceived knowledge, self-efficacy, interest, and the project score with the pre- and post-test–measured knowledge scores.
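The analysis plan above — paired t-tests on pre/post scores and Pearson correlations with the attitudinal items — can be sketched with the standard library alone. The study data are not reproduced in the paper; the scores below are synthetic and purely illustrative.

```python
# Sketch of the statistical analysis described above, using only the
# Python standard library. All data values are synthetic/illustrative.
from math import sqrt
from statistics import mean, stdev

def paired_t(post, pre):
    """t statistic for a paired t-test: mean difference over its standard error."""
    diffs = [a - b for a, b in zip(post, pre)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired samples."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / sqrt(sum((a - mx) ** 2 for a in x) *
                      sum((b - my) ** 2 for b in y))

# Hypothetical pre/post knowledge scores (percent correct) and
# post-course self-efficacy ratings on a 5-point scale.
pre      = [40, 60, 20, 60, 40, 60, 40, 80]
post     = [80, 100, 80, 80, 100, 80, 80, 100]
efficacy = [3, 5, 4, 4, 5, 4, 3, 5]

t = paired_t(post, pre)        # large positive t -> pre-to-post improvement
r = pearson_r(post, efficacy)  # positive r -> knowledge tracks self-efficacy
```

The t statistic would then be compared against a t distribution with n − 1 degrees of freedom to obtain the p values reported in the Results.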

Results

A total of 44 residents participated in the curriculum beginning in July 2003. Two medicine residents were excluded from the analyses because of missing data. Descriptive statistics and results of paired t-tests are given in Table 1. Differences between pre- and post-test perceived knowledge and self-efficacy items were highly significant (p < .001). The pretest mean on the interest item was high (4.3 out of 5) and did not differ significantly from the post-test. Residents nearly doubled their knowledge (47.9% to 88.6%, p < .001). The mean project score was 81.7% (39.2 out of 48) with a standard deviation of 8.3%.

Table 1


Table 2 shows the correlation of pretest and posttest scores with perceived knowledge, interest, self-efficacy, and the project score. Higher pretest knowledge was associated with higher perceived knowledge and self-efficacy at the beginning of the rotation. Higher post-test knowledge was associated with higher ratings of self-efficacy at the completion of the course. There was no correlation of the measured knowledge scores with the project score or interest.

Table 2


Discussion

The CQI curriculum significantly improved residents’ knowledge, perceived knowledge, and self-efficacy with a modest investment of their time in an organized curriculum. A change in interest in CQI was not demonstrated. However, interest prior to the course was already very high. The inability to show a difference may be due to a ceiling effect of the instrument. Correlation of pre- and post-test knowledge with self-efficacy implies that the curriculum is useful for all levels of learners.

All participating residents completed the project assignment and generated reasonable, specific project aims and measurement methods for outcomes within the context of their own patient care. While some utilized the formal framework of the PDSA cycle without further instruction, others requested additional feedback. The lack of correlation between any pre- or post-test variables and the project score is likely due to the large amount of formative feedback given at the residents' request. Many residents have begun meeting with the faculty relevant to implementing their individual projects. A group of residents particularly motivated to implement their projects sought additional feedback for project enhancement. Several of these projects have been completed with measurable improvement.

A major discovery during the pilot phase was that many PGY3 residents wished to implement their projects but could not do so because they needed additional time to complete the projects, yet were near graduation. The curriculum was therefore moved to the PGY2 year. These residents have repeatedly commented that the PGY2 year is the ideal time to learn CQI, as designing and initiating a project, implementing it, and studying the measurement differences all take a significant amount of time.

There are three major limitations to this study. First, a single preceptor required a total of 15 hours per month to deliver the curricular material and grade each individual resident's project. Plans are underway to train additional faculty to serve as facilitators in future months, which would allow measurement of interrater reliability. Second, the generalizability of the curriculum to other institutions has not yet been tested. Nevertheless, the interdisciplinary success and collaboration between the medicine and pediatric residents demonstrate generalizability of this curriculum in separate primary care disciplines. Third, we were unable to demonstrate full implementation of each individual resident project. Although every resident completed a written project, implementation is unlikely to occur within the given one-month time frame.

Other institutions have demonstrated that CQI can be taught to residents during elective rotations.10 In contrast, our curriculum focuses on learning and applying CQI principles within the resident's clinical realm and educational process as a portion of a required one-month rotation. The advantage of this method is that all residents will complete the compact curriculum by the end of residency, and competency in PBLI can be documented for every resident.

Summary

  1. Internal medicine and pediatrics residents can develop knowledge and skills in CQI methodology within a three-hour curriculum during a one-month ambulatory block rotation.
  2. Residents use their unique training experiences to create quality improvement projects designed to improve the residency and patient care.
  3. A CQI curriculum is generalizable across primary care residency programs.
  4. Residency programs can successfully demonstrate individual resident competency in certain aspects of practice-based learning and improvement.
References

1. Ogrinc G, Headrick LA, Mutha S, Coleman MT, O'Donnell J, Miles PV. A framework for teaching medical students and residents about practice-based learning and improvement, synthesized from a literature review. Acad Med. 2003;78:748–53.
2. Accreditation Council for Graduate Medical Education. Outcomes project: general competencies; 2004 〈〉. Accessed 9 June 2004.
3. Headrick LA, Richardson A, Priebe GP. Continuous improvement learning for residents. Pediatrics. 1998;101:768–73.
4. Parenti CM, Lederle FA, Impola CL, Peterson LR. Reduction of unnecessary intravenous catheter use. Internal medicine house staff participate in a successful quality improvement project. Arch Intern Med. 1994;154:1829–32.
5. Volpp KGM, Grande D. Residents’ suggestions for reducing errors in teaching hospitals. N Engl J Med. 2003;348:851–5.
6. Coleman MT, Nasraty S, Ostapchuk M, Wheeler S, Looney S, Rhodes S. Introducing practice-based learning and improvement ACGME core competencies into a family medicine residency curriculum. Jt Comm J Qual Saf. 2003;29:238–47.
7. Weingart SN. A house officer-sponsored quality improvement initiative: leadership lessons and liabilities. Jt Comm J Qual Improv. 1998;24:371–8.
8. Langley GJ, Nolan KM, Nolan TW. The foundation of improvement. Qual Prog. 1994;June:81–6.
9. Shine KI. Health care quality and how to achieve it. Acad Med. 2002;77:91–9.
10. Ogrinc G, Headrick LA, Morrison LJ, Foster T. Teaching and assessing resident competence in practice-based learning and improvement. J Gen Intern Med. 2004;19:496–500.
© 2004 Association of American Medical Colleges