
Individual QI projects from single institutions

Impact of a Longitudinal Quality Improvement and Patient Safety Curriculum on Pediatric Residents

Vachani, Joyee G. MD, MEd*; Mothner, Brent MD*; Lye, Cara MD*; Savage, Charmaine MSW*; Camp, Elizabeth PhD; Moyer, Virginia MD, MPH

Pediatric Quality and Safety: November 2016 - Volume 1 - Issue 2 - p e005
doi: 10.1097/pq9.0000000000000005


INTRODUCTION

In 2006, the Society of Hospital Medicine specified quality improvement (QI) and practice-based learning and improvement as core competencies relating to healthcare systems.1 In 2007, the Accreditation Council for Graduate Medical Education mandated that residents across specialties participate in interprofessional teams to enhance patient safety (PS) and improve the quality of patient care, systematically analyze their practice using QI methods, and use this information to effect change with the goal of practice improvement.2 This charge required residency training programs to develop educational plans for QI and PS and to incorporate these skills into care plans and systems-based learning.3–7

A review of the literature suggests that improvements are still needed in pediatric resident QI/PS training and that successful programs require QI project work along with faculty mentorship.5,6 Applying key principles of adult learning theory to resident education and QI/PS methodology, we developed and implemented the QI and PS curriculum (QIPSC) for Baylor College of Medicine/Texas Children’s Hospital pediatric residents. The purpose of this study was to develop and test QIPSC to improve resident competence in QI/PS knowledge, skills, and attitudes.

METHODS

The curriculum evaluation was approved by the institutional review board, and the project was part of the Academic Pediatric Association’s Educational Scholars Program.

Curriculum Development and Implementation

We used the principles of andragogy described by Knowles,8 which suggest engaging adult learners early, drawing on their experience, making learning relevant, and creating problem-centered curricula.8 The first steps in curriculum development as described by Kern9 are problem identification and a needs assessment. We began by developing a needs assessment survey, to which 75 pediatric residents responded in academic year (AY) 2011 to 2012. A majority of respondents (n = 64) preferred short educational modules of 30 minutes or less. Residents ranked teaching methods roughly equally, identifying didactic in-person lectures, workshops, online quizzes, and online self-running lectures all as effective delivery methods. Residents identified the topics of greatest interest as evidence-based practice, PS, health policy, team effectiveness, and leadership. Most respondents (n = 52) felt that learning about QI during residency was important or very important.

Using the results of our needs assessment and continuing with the subsequent steps in the model described by Kern9 for curriculum development, we created specific goals and objectives and developed educational strategies and an implementation plan for our curriculum. Content was reviewed by QI and education experts. Given limited available curriculum time, QIPSC was split into 2 phases: a didactic learning phase and a hands-on, active learning phase. The first phase, implemented in July 2012, consisted of “ignite” presentations (interactive, online self-paced modules with voice-overs) and an interactive noon conference series on QI/PS topics that engaged learners through team-based learning, process mapping, and fishbone diagram activities.

In July 2013, we began the second phase of the curriculum: ongoing mentored projects in which pediatric residents actively participated. In AY 2013 to 2014, residents participated in 4 projects under the mentorship of Pediatric Hospital Medicine faculty. These projects were selected based on stage of completion, faculty support, and educational value; all were QI or PS initiatives with clear and precise goals and timelines that allowed substantial resident participation. The initial mentored project topics were resident handoffs, assignment of severity scores to asthma patients, a standardized oxygen weaning initiative, and identification of correct patient care teams. Each project used the Institute for Healthcare Improvement Model for Improvement, incorporating specific aims, testing cycles of change, and measuring outcomes. The handoff project and a new patient flow project were continued in AY 2014 to 2015. Each project had 1 or more resident champions, both self-identified and appointed by the project mentor. These resident champions participated throughout the project duration and mentored junior residents as several projects became hospital-wide initiatives and were presented at peer-reviewed national conferences. Residents were involved in various aspects of each QI project, including data collection and interpretation, review of Plan, Do, Study, Act cycles, and creation and implementation of new initiatives for future cycles. Maintenance of Certification credit was offered to faculty who served as project mentors. The first cohort of residents trained in QIPSC graduated in June 2015.

Curriculum Evaluation

The final steps of curriculum development as described by Kern9 are evaluation and feedback. The model described by Kirkpatrick10 for training evaluation describes 4 levels of learning, progressing from the most basic level, reaction, through learning and transfer to results. The pyramid of clinical competence described by Miller11 describes the progression from a novice, who gathers, interprets, and applies facts, to an expert, who demonstrates learning and integrates it into practice. Integrating these frameworks, we developed tools to assess learner knowledge, skills, and attitudes.

Knowledge and Attitudes Assessments

Knowledge questions and an attitude survey (AS) previously described in the literature were administered at 3 points in the curriculum: a pretest and survey administered in the first year of residency, before exposure to any components of the curriculum; a posttest and survey administered directly after completion of the 4 online modules; and an end-of-curriculum (EOC) survey (with the knowledge test embedded) administered in the final year of residency, after participation in all components of QIPSC.

Ten knowledge questions were identical across all 3 tests and were analyzed. A previously described 12-question AS from the Quality Assessment and Improvement Curriculum was administered to cohort 1 at all 3 points in time.12–14

Skills Assessment

We also developed and administered a 12-question pre– and post–skill survey focusing on project work, teamwork, and leadership roles. The survey was completed by residents, and results were based on their perception of their skills; no direct observations were performed. Resident perceptions of their skills before and after the curriculum were surveyed upon its completion in AY 2014 to 2015.

RESULTS

Knowledge and Attitudes Assessment

Of the 57 eligible residents in cohort 1, 43 completed at least 1 of the 3 administered QI/PS knowledge tests and were included in the final analysis: 42 residents completed the pretest, 20 completed the posttest, and 31 completed the EOC survey. Because of the variability in sample size for each test, we compared pre-, post-, and EOC survey scores as unmatched/unpaired data. There were significant differences between pretest and posttest scores and between pretest and EOC knowledge scores; however, there were no significant differences between the posttest and the EOC survey using either parametric or nonparametric testing (Tables 1 and 2).

Table 1.:
Unmatched Comparisons of Knowledge Tests: Pre-, Post-, and EOC Surveys Using the Independent t Test
Table 2.:
Unmatched Comparisons of Knowledge Tests: Pre-, Post-, and EOC Surveys Using the Mann–Whitney Test
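The unmatched comparisons above can be sketched in a few lines. This is a minimal illustration using hypothetical score lists rather than the study's data; it shows how 2 independent groups of unequal size are compared with both the parametric independent t test and the nonparametric Mann–Whitney test.

```python
# Minimal sketch of an unmatched (unpaired) comparison, as used for the
# knowledge tests above. Scores are hypothetical, not the study's data.
from scipy import stats

pre_scores = [5, 6, 4, 7, 5, 6, 5, 4, 6, 5]  # hypothetical pretest scores (out of 10)
post_scores = [8, 7, 9, 8, 7, 8, 9, 7]       # hypothetical posttest scores (unequal n is fine)

# Parametric comparison of 2 independent groups
t_stat, t_p = stats.ttest_ind(pre_scores, post_scores)

# Nonparametric analogue for the same unpaired design
u_stat, u_p = stats.mannwhitneyu(pre_scores, post_scores, alternative="two-sided")

print(f"Independent t test: t = {t_stat:.2f}, p = {t_p:.4f}")
print(f"Mann-Whitney:       U = {u_stat:.1f}, p = {u_p:.4f}")
```

Because a different number of residents responded at each administration (42, 20, and 31), both tests operate on independent samples of unequal size rather than on matched pairs.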

Of the 57 eligible residents in cohort 1, 11 residents completed the pre-AS, 13 completed the post-AS, and 37 completed the EOC AS; all surveys were included in the final analysis. The pre- and post-AS scores were compared as unpaired groups (ie, comparisons were made between pre- and postgroups, not between pre- and postmatched scores). The EOC AS did contain linked pre- and postresponses by residents and thus was analyzed as 2 paired groups (comparisons were made between individual residents’ pre- and postmatched scores). In the comparison of cohort 1 pre-AS and post-AS responses, significant attitude differences were found for 3 questions, and 1 approached significance. In the comparisons of pre-AS and EOC “after” responses and of EOC AS “before” and EOC “after” responses, there were statistically significant differences for all attitude questions (Tables 3 and 4).

Table 3.:
Unmatched Comparison of Pre- and EOC AS: Before and after Your Participation in QIPSC, How Comfortable Were You with the following? (1) Not at all, (2) Slightly, (3) Moderately, and (4) Extremely
Table 4.:
Before and after Perceptions in EOC AS: Before and after Your Participation in QIPSC, How Comfortable Were You with the following? (1) Not at all, (2) Slightly, (3) Moderately, and (4) Extremely

Skills Assessment

Thirty-seven eligible participants completed the skill survey and were included in the final analysis. Using the Wilcoxon test, there were statistically significant differences for all skill questions (Table 5).

Table 5.:
Before and after Perceptions in EOC Skills Assessment: Before and after Your Participation in QIPSC, to What Degree Did You Agree with the following Statements? (1) Strongly Disagree, (2) Disagree, (3) Neutral, (4) Agree, and (5) Strongly Agree
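The paired analysis above can be sketched similarly. This is a minimal illustration with hypothetical Likert ratings rather than the study's data; the Wilcoxon signed-rank test compares matched before/after responses without assuming normality, which suits ordinal survey scales.

```python
# Minimal sketch of the paired before/after comparison used for the
# skill survey above. Ratings are hypothetical Likert responses (1-5).
from scipy import stats

before = [2, 3, 2, 1, 3, 2, 2, 3, 2, 1, 2, 3]  # hypothetical "before" ratings
after = [4, 4, 5, 3, 4, 4, 5, 4, 3, 4, 4, 5]   # the same respondents' "after" ratings

# Wilcoxon signed-rank test: nonparametric test for matched pairs
w_stat, w_p = stats.wilcoxon(before, after)
print(f"Wilcoxon: W = {w_stat:.1f}, p = {w_p:.4f}")
```

The key difference from the unmatched knowledge-test analysis is that each "before" rating is linked to the same respondent's "after" rating, so the test examines within-person differences.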

DISCUSSION

The QIPSC combines didactic components (lectures provided as interactive online ignite modules), an interactive monthly noon conference series, and practical application of learned concepts in ongoing mentored projects to create an innovative longitudinal curriculum. The key study findings include increased QI/PS knowledge, skill attainment, and improved attitudes.

Our analysis suggests that QIPSC modestly improved resident knowledge. As healthcare shifts from a volume-based to a value-based culture, QI knowledge is a vital component of residency training because it directly impacts organizational success and patient outcomes.2,5

We found that resident attitudes and skills improved through participation in QIPSC. These results are consistent with previous studies in QI/PS education; however, the multifaceted and longitudinal approach employed by QIPSC was integral to its success and may increase the longevity of the curriculum. Additionally, residents were able to participate in faculty-mentored QI projects, many of which were based on ideas and feedback from the residents. For example, a focus group of residents was frustrated with the process of handoffs and sought a standardized method to improve it. Through an independent literature review, residents discovered the I-PASS© tool and created a mentored project to implement it. Since the project’s inception, I-PASS© has been accepted as the standard handoff tool throughout the hospital.

The leadership and hands-on experience gained through direct resident involvement in projects was an important aspect of the curriculum. Although we did not measure specific degrees of resident project participation in this study, resident engagement in project work proved essential to project success; Plan, Do, Study, Act cycles were largely driven by resident ideas and feedback. Additionally, faculty involved in the projects identified the residents most invested in each project and approached them about becoming a “resident project champion.” These residents were not only empowered to lead their peers but were also able to present the projects locally and nationally. This experience enabled learners in training to see firsthand how QI and PS work can lead to tangible improvements and scholarly output.

As residents embark on their own careers, seeing the output and improvements that can be achieved through QI/PS work is extremely meaningful.15,16 The results of the knowledge test and the attitude/skill survey point to the value in both the explicit and the implicit curriculum of the QIPSC course.17 Improvement in knowledge scores suggests that the explicit curriculum (didactic instruction from online modules, guided project work, and noon conference presentations) gave residents a foundation of QI/PS knowledge. Improvement in attitudes toward and skills in QI/PS suggests that the implicit curriculum (project work, mentorship, and open dialogue with residents) was impactful and meaningful. Resident champions, who were mentored by QI-trained faculty members, went on to lead and present their work at an internal resident conference at the end of the AY. All projects and the QIPSC curriculum as a whole were subsequently presented at peer-reviewed national conferences.

There were several lessons learned while developing this curriculum and studying its impact over time. Getting stakeholder buy-in for a new or additional curriculum in an already stretched learner group is difficult and takes time; involving residents, program directors, and faculty mentors early in the process was vital to success. It is important to work within the given framework, prove the new model, and then ask for more time. Additionally, training and engaging faculty is key to sustaining a longitudinal curriculum; incentives such as Maintenance of Certification credit should be offered. Finally, culture change takes time… but is priceless.

LIMITATIONS

Barriers to curriculum implementation included competing demands within the residency program, faculty time constraints, and variable placement of QIPSC within the residency curriculum. Initial placement of curriculum components during a busy pediatric inpatient rotation led to challenges in participation and preparation for group discussion, ultimately resulting in a lack of resident engagement and understanding of project goals and progress. Technology-related issues also arose after the first year, as the online platform for module delivery had to be changed, requiring adjustments to orientation materials, evaluations, and faculty orientations.

Although our curriculum was mandatory, participation in surveys was not, and there were no external incentives for participation. Variation in survey completion is attributed to the setting, location, and timing of test administration (eg, busy inpatient rotation vs nonclinical scholarly rotation, and protected time vs personal time for completion). Low response rates and the inability to pair an individual resident’s pre- and postresponses on the knowledge tests made analysis challenging. A shorter observation period could have mitigated this, although our curriculum and study were designed to be longitudinal and to assess residents over a longer period of time. Additionally, because our response rate fluctuated throughout the study and we used unmatched comparisons, it is possible that less interested learners chose not to repeat testing, introducing volunteer bias. Data to analyze this potential bias were not available because learner confidentiality was maintained. The results from the skill survey and ASs are significant but limited by being perception surveys completed by the learner at a single point in time. The study could have been strengthened by direct observation and assessment of skills.

Finally, as the culture of medicine has evolved toward value-based care and discussion of patient quality and safety has become more prevalent in medical education, it is difficult to assess what impact this broader cultural shift had on learners’ attitudes. As the curriculum gained momentum, cycles of improvement were applied to the curriculum itself, including moving components to ambulatory rotations (Fig. 1).

Fig. 1.:
Cycles of improvement in QIPSC through each AY. PGY, Post-Graduate Year; PHM, Pediatric Hospital Medicine; RCA, Root Cause Analysis.

CONCLUSIONS

The QIPSC increased QI/PS knowledge, improved skill attainment, and improved attitudes. Key lessons learned in successfully developing and implementing a quality/safety curriculum include acknowledging learner, faculty, and institutional needs; being flexible and responsive to those needs; and integrating key concepts of adult learning theory and QI/PS methodology. Ongoing goals for the curriculum include improving acquired resident knowledge, developing more robust evaluation tools, expanding QIPSC to additional learner groups and programs, and linking learner training to patient-level outcomes. Follow-up results from subsequent QIPSC learner cohorts will be necessary to measure the true impact of this curriculum: behavior change and improvements in practice.

REFERENCES

1. Co JP. Educating for quality: quality improvement as an activity of daily learning to improve educational and patient outcomes. Acad Pediatr. 2014;14:1–3.
2. Lurie SJ, Mooney CJ, Lyness JM. Measurement of the general competencies of the accreditation council for graduate medical education: a systematic review. Acad Med. 2009;84:301–309.
3. Kelz RR, Sellers MM, Reinke CE, et al. Quality in-training initiative: a need for education in quality improvement: results from a survey of program directors. J Am Coll Surg. 2013;217:1126–1132.e1–e5.
4. AAMC Expert Panel. Teaching for Quality: Integrating Quality Improvement and Patient Safety across the Continuum of Medical Education. 2013.
5. Craig MS, Garfunkel LC, Baldwin CD, et al. Pediatric resident education in quality improvement (QI): a national survey. Acad Pediatr. 2014;14:54–61.
6. Mann KJ, Craig MS, Moses JM. Quality improvement educational practices in pediatric residency programs: survey of pediatric program directors. Acad Pediatr. 2014;14:23–28.
7. Gupta M, Ringer S, Tess A, et al. Developing a quality and safety curriculum for fellows: lessons learned from a neonatology fellowship program. Acad Pediatr. 2014;14:47–53.
8. Knowles MS. Andragogy in Action: Applying Modern Principles of Adult Education. San Francisco, CA: Jossey-Bass; 1984.
9. Kern DE. Curriculum Development for Medical Education: A Six-Step Approach. 2nd ed. Baltimore, MD: Johns Hopkins University Press; 2009.
10. Kirkpatrick DL. Evaluating Training Programs: The Four Levels. San Francisco, CA: Berrett-Koehler Publishers; 1994.
11. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65:S63–S67.
12. Reed D, Wittich C, Drefahl M, et al. A quality improvement curriculum for internal medicine residents. MedEdPORTAL. Available at: Accessed October 14, 2013.
13. Ogrinc G, Headrick LA, Morrison LJ, et al. Teaching and assessing resident competence in practice-based learning and improvement. J Gen Intern Med. 2004;19:496–500.
14. Oyler J, Vinci L, Johnson JK, et al. Teaching internal medicine residents to sustain their improvement through the quality assessment and improvement curriculum. J Gen Intern Med. 2011;26:221–225.
15. Philibert I, Gonzalez Del Rey JA, Lannon C, et al. Quality improvement skills for pediatric residents: from lecture to implementation and sustainability. Acad Pediatr. 2014;14:40–46.
16. Simasek M, Ballard SL, Phelps P, et al. Meeting resident scholarly activity requirements through a longitudinal quality improvement curriculum. J Grad Med Educ. 2015;7:86–90.
17. Balmer DF, Quiah S, DiPace J, et al. Learning across the explicit, implicit, and extra-curricula: an exploratory study of the relative proportions of residents’ perceived learning in clinical areas at three pediatric residency programs. Acad Med. 2015;90:1547–1552.
Copyright © 2016 the Author(s). Published by Wolters Kluwer Health, Inc. All rights reserved.