
Impact of a Longitudinal Quality Improvement and Patient Safety Curriculum on Pediatric Residents

Vachani, Joyee G., MD, MEd*; Mothner, Brent, MD*; Lye, Cara, MD*; Savage, Charmaine, MSW*; Camp, Elizabeth, PhD; Moyer, Virginia, MD, MPH

Pediatric Quality and Safety: November 2016 - Volume 1 - Issue 2 - p e005
doi: 10.1097/pq9.0000000000000005

Introduction: The effectiveness of longitudinal quality/safety resident curricula is uncertain. We developed and tested our longitudinal quality improvement (QI) and patient safety (PS) curriculum (QIPSC) to improve resident competence in QI/PS knowledge, skills, and attitudes.

Methods: Using core features of adult education theory and QI/PS methodology, we developed QIPSC that includes self-paced online modules, an interactive conference series, and mentored projects. Curriculum evaluation included knowledge and attitude assessments at 3 points in time (pre- and posttest in year 1 and end of curriculum [EOC] survey in year 3 upon completion of all curricular elements) and skill assessment at the EOC.

Results: Of 57 eligible residents in cohort 1, variable numbers of residents completed the knowledge (n = 42, 20, and 31) and attitude (n = 11, 13, and 37) assessments at the 3 points in time; 37 residents completed the EOC skills assessment. For the knowledge assessments, there were significant differences between pre- and posttest scores and between pretest and EOC scores, but not between posttest and EOC scores. In the EOC self-assessment, residents’ attitudes and skills improved in all areas evaluated. Additional outcomes from project work included dissemination of QI projects into hospital-wide quality/safety initiatives and at peer-reviewed national conferences.

Conclusions: Successful implementation of a QIPSC must be responsive to a number of learner, faculty, and institutional needs and must integrate adult learning theory and QI/PS methodology. QIPSC is an initial effort to address this need; follow-up results from subsequent learner cohorts will be necessary to measure the true impact of this curriculum: behavior change and practice improvements.

From the Sections of *Pediatric Hospital Medicine and Emergency Medicine, Department of Pediatrics, Baylor College of Medicine, Houston, Tex.; and US Preventive Services Task Force and Maintenance of Certification and Quality, American Board of Pediatrics, Chapel Hill, N.C.

An institutional education grant from Texas Children’s Hospital was used to develop this curriculum.

Preliminary data were presented as a platform presentation at the Pediatric Hospital Medicine conference in San Antonio, Tex., July 24, 2015.

Received for publication July 12, 2016; accepted October 3, 2016.

Disclosure: The authors have no financial interest to declare in relation to the content of this article.

*Corresponding author. Address: Joyee G. Vachani, MD, MEd, Section of Pediatric Hospital Medicine, Department of Pediatrics, Texas Children’s Hospital, 1102 Bates Street, Suite FC.1860, Houston, TX 77030; Ph: 832-824-5447; E-mail: JGVachan@texaschildrens.org

This is an open-access article distributed under the terms of the Creative Commons Attribution-Non Commercial-No Derivatives License 4.0 (CC BY-NC-ND), where it is permissible to download and share the work provided it is properly cited. The work cannot be changed in any way or used commercially.


INTRODUCTION

In 2006, the Society of Hospital Medicine specified quality improvement (QI) and practice-based learning and improvement as core competencies relating to healthcare systems.1 In 2007, the Accreditation Council for Graduate Medical Education mandated that residents across specialties participate in interprofessional teams to enhance patient safety (PS) and improve the quality of patient care, systematically analyze their practice using QI methods, and use this information to effect change with the goal of practice improvement.2 This charge required residency training programs to acknowledge and create educational plans for QI and PS and to incorporate these skills into care plans and systems-based learning.3–7

A review of the literature suggests that improvements are still needed in QI/PS training for pediatric residents and that successful programs require QI project work along with faculty mentorship.5,6 Using key principles of adult learning theory as they apply to resident education, together with QI/PS methodology, we developed and implemented the QI and PS curriculum (QIPSC) for Baylor College of Medicine/Texas Children’s Hospital pediatric residents. The purpose of this study was to develop and test QIPSC to improve resident competence in QI/PS knowledge, skills, and attitudes.


METHODS

Evaluation of the curriculum was institutional review board approved, and the project was part of the Academic Pediatric Association’s Educational Scholars Program.


Curriculum Development and Implementation

We used the principles of andragogy described by Knowles,8 which suggest engaging adult learners early, drawing on their experience, making learning relevant, and creating curricula that are problem centered. The first steps in curriculum development as described by Kern9 are problem identification and a needs assessment. We began by developing a needs assessment survey, to which 75 pediatric residents responded in academic year (AY) 2011 to 2012. A majority of respondents (n = 64) preferred short educational modules of 30 minutes or less. Residents rated the teaching methods roughly equally, identifying didactic in-person lectures, workshops, online quizzes, and online self-running lectures as effective delivery methods. The topics of greatest interest were evidence-based practice, PS, health policy, team effectiveness, and leadership. Most respondents (n = 52) felt that learning about QI was important or very important during residency.

Utilizing the results of our needs assessment and continuing with the subsequent steps in the curriculum development model described by Kern,9 we created specific goals and objectives and developed educational strategies and an implementation plan for our curriculum. Content was reviewed by QI and education experts. Given limited available curriculum time, the curriculum was split into 2 phases: a didactic learning phase and a hands-on active learning phase. The first phase of QIPSC, implemented in July 2012, consisted of “ignite” presentations (interactive, online self-paced modules with voice-overs) and an interactive noon conference series on QI/PS topics that engaged learners through team-based learning, process mapping, and fishbone diagram activities.

In July 2013, we began the second phase of the curriculum: ongoing mentored projects in which pediatric residents actively participated. In AY 2013 to 2014, residents participated in 4 projects under the mentorship of Pediatric Hospital Medicine faculty. These projects were selected based on stage of completion, faculty support, and educational value. All projects were QI or PS initiatives with clear, precise goals and timelines that allowed substantial resident participation. The initial mentored project topics were resident handoffs, assignment of severity scores to asthma patients, a standardized oxygen weaning initiative, and identification of correct patient care teams. Each project utilized the Institute for Healthcare Improvement Model for Improvement, incorporating specific aims, testing cycles of change, and measuring outcomes. The handoff project and a new patient flow project were continued in AY 2014 to 2015. Each project had one or more resident champions, either self-identified or appointed by the project mentor. These resident champions participated throughout the project and mentored junior residents as several projects became hospital-wide initiatives and were presented at peer-reviewed national conferences. Residents were involved in various aspects of each QI project, including data collection and interpretation, review of Plan-Do-Study-Act (PDSA) cycles, and creation and implementation of new initiatives for future PDSA cycles. Maintenance of Certification credit was offered to faculty who served as project mentors. The first cohort of residents trained in QIPSC graduated in June 2015.


Curriculum Evaluation

The final steps of curriculum development as described by Kern9 are evaluation and feedback. The training evaluation model described by Kirkpatrick10 outlines levels of learning that progress from the most basic level, reaction, through learning, transfer, and results. The pyramid of clinical competence described by Miller11 traces the transition from a novice who gathers, interprets, and applies facts to an expert who demonstrates learning and integrates it into practice. Integrating these frameworks, we developed tools to assess learner knowledge, skills, and attitudes.


Knowledge and Attitudes Assessments

Knowledge questions and an attitude survey (AS) previously described in the literature were administered at 3 points in the curriculum: a pretest and survey administered in the first year of residency before exposure to any components of the curriculum; a posttest and survey administered directly after completion of the 4 online modules; and an end of curriculum (EOC) survey (with the knowledge test embedded) administered in the final year of residency after participation in all components of QIPSC.

Ten knowledge questions were identical across all 3 tests and were analyzed. A previously described 12-question AS from the quality assessment and improvement curriculum was administered to cohort 1 at all 3 points in time.12–14


Skills Assessment

We also developed and administered a 12-question pre- and post-skill survey focusing on project work, teamwork, and leadership roles. The survey was completed by residents, and results were based on their perception of their skills; no direct observations were done. Residents’ perceptions of their skills before and after the curriculum were both collected upon completion of the curriculum in AY 2014 to 2015.


RESULTS

Knowledge and Attitudes Assessment

Of the 57 eligible residents in cohort 1, 43 participants completed at least 1 of the 3 administered QI/PS knowledge tests and were included in the final analysis: 42 residents completed the pretest, 20 completed the posttest, and 31 completed the EOC survey. We used unmatched/unpaired comparisons of pre-, post-, and EOC survey scores because of the variability in sample size for each test. There were significant differences between pre- and posttest scores and between pretest and EOC knowledge scores; however, there were no significant differences between the posttest and EOC survey using either parametric or nonparametric testing (Tables 1 and 2).
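To illustrate the unpaired approach described above, a minimal sketch (using hypothetical score vectors, not the study data or the authors’ actual analysis code) might look like the following; both a parametric and a nonparametric unpaired test are shown, mirroring the "parametric or nonparametric testing" wording.

```python
# Minimal sketch of unpaired group comparisons on hypothetical
# knowledge scores (illustration only; not the study data).
from scipy import stats

# Placeholder knowledge scores (0-10) for illustration only.
pretest = [4, 5, 6, 5, 4, 7, 5, 6]
posttest = [7, 8, 6, 7, 9, 8]
eoc = [7, 6, 8, 7, 8, 9, 7]

comparisons = [("pre vs post", pretest, posttest),
               ("pre vs EOC", pretest, eoc),
               ("post vs EOC", posttest, eoc)]

for name, a, b in comparisons:
    t_stat, t_p = stats.ttest_ind(a, b, equal_var=False)              # parametric (Welch's t-test)
    u_stat, u_p = stats.mannwhitneyu(a, b, alternative="two-sided")   # nonparametric (Mann-Whitney U)
    print(f"{name}: t-test p={t_p:.3f}, Mann-Whitney p={u_p:.3f}")
```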

Table 1

Table 2

Of the 57 eligible residents in cohort 1, 11 residents completed the pre-AS, 13 completed the post-AS, and 37 completed the EOC AS; all surveys were included in the final analysis. The pre- and post-AS scores were compared as unpaired groups (ie, comparisons were made between the pre- and postgroups, not between matched pre- and postscores). The EOC AS contained linked pre- and postresponses from each resident and was therefore analyzed as 2 paired groups (comparisons were made between each resident’s matched pre- and postscores). In the comparison of cohort 1 pre-AS and post-AS responses, significant attitude differences were found for 3 questions, and 1 approached significance. In the comparisons of pre-AS responses with EOC “after” responses and of EOC “before” responses with EOC “after” responses, there were statistically significant differences for all attitude questions (Tables 3 and 4).
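For the paired EOC comparison, where each resident supplied linked “before” and “after” ratings, a nonparametric matched-pairs test is the natural choice. The sketch below, again using hypothetical Likert-style ratings rather than the study data, shows how such a matched comparison might be run with a Wilcoxon signed-rank test (the same Wilcoxon test named in the skills analysis that follows).

```python
# Minimal sketch of a paired, nonparametric comparison of linked
# "before" and "after" ratings, using hypothetical 5-point Likert
# responses for illustration only (not the study data).
from scipy import stats

before = [2, 3, 2, 3, 2, 3, 4, 2, 3, 2]  # retrospective "before" self-ratings
after  = [4, 4, 3, 5, 4, 4, 5, 4, 4, 3]  # "after" self-ratings from the same residents

stat, p_value = stats.wilcoxon(before, after)  # Wilcoxon signed-rank test on matched pairs
print(f"Wilcoxon signed-rank: statistic={stat}, p={p_value:.4f}")
```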

Table 3

Table 4


Skills Assessment

Thirty-seven eligible participants completed the skill survey and were included in the final analysis. Wilcoxon testing showed statistically significant differences for all skill questions (Table 5).

Table 5


DISCUSSION

The QIPSC combines didactic components (lectures provided as interactive online “ignite” modules), an interactive monthly noon conference series, and practical application of learned concepts in ongoing mentored projects to create an innovative longitudinal curriculum. The key study findings were increased QI/PS knowledge and improved skills and attitudes.

Our analysis suggests that QIPSC modestly improved resident knowledge. As healthcare shifts from a volume-based to a value-based culture, QI knowledge is a vital component of residency training because it directly impacts organizational success and patient outcomes.2,5

We found that, through participation in QIPSC, resident attitudes and skills improved. These results are consistent with previous studies in QI/PS education; however, the multifaceted and longitudinal approach employed by QIPSC was integral to its success and may increase the longevity of the curriculum. Additionally, residents were able to participate in faculty-mentored QI projects, many of which were based on ideas and feedback from the residents. For example, a focus group of residents was frustrated with the handoff process and sought a standardized method to improve it. Through an independent literature review, residents discovered the I-PASS© tool and created a mentored project to implement it. Since the inception of that project, I-PASS© has been accepted as the standard handoff tool throughout the hospital.

The leadership and hands-on experience gained by direct resident involvement in projects was an important aspect of the curriculum. Although degrees of resident project participation were not specifically measured in this study, engagement of the residents in project work proved essential to project success, and PDSA cycles were largely shaped by their ideas and feedback. Additionally, faculty involved in the projects were able to identify the residents who were most invested in a project and approach them about becoming a “resident project champion.” Residents involved in this aspect of the project were not only empowered to lead their peers but were also able to present the project locally and nationally. This experience enabled learners in training to see firsthand how QI and PS work can lead to tangible improvements and scholarly output.

As residents embark on their own careers, seeing the output and improvements that can be achieved through QI/PS work is extremely meaningful.15,16 The results of the knowledge test and the attitude/skill surveys point to the value of both the explicit and the implicit curriculum of the QIPSC course.17 Improvement in knowledge scores suggests that the explicit curriculum (didactic instruction from online modules, guided project work, and noon conference presentations) gave residents a foundation of QI/PS knowledge. Improvement in attitudes toward and skills in QI/PS suggests that the implicit curriculum (project work, mentorship, and open dialogue with residents) was impactful and meaningful. Resident champions, who were mentored by QI-trained faculty members, went on to lead and present their work at an internal resident conference at the end of the AY. All projects, and the QIPSC curriculum as a whole, were subsequently presented at peer-reviewed national conferences.

There were several lessons learned while developing this curriculum and researching its impact over time. Getting stakeholder buy-in for a new or additional curriculum in an already stretched learner group is difficult and takes time. Involving residents, program directors, and faculty mentors early in the process was vital to success. It is important to work within the given framework, prove the new model, and then ask for more time. Additionally, training and engaging faculty is key to sustaining a longitudinal curriculum; incentives such as Maintenance of Certification credit should be offered. Finally, culture change takes time… but is priceless.


LIMITATIONS

Barriers to curriculum implementation included competing demands within the residency program, time constraints of faculty, and variable placement of QIPSC within the residency curriculum. Initial placement of curriculum components on a busy pediatric inpatient rotation led to challenges in participation and group discussion preparation, ultimately resulting in a lack of resident engagement with and understanding of project goals and progress. Technology-related issues also arose after the first year when the online platform for module delivery had to be changed, requiring adjustments to orientation materials, evaluations, and faculty orientations.

Although our curriculum was mandatory, response to and participation in the surveys were not, and there were no external incentives for participation. Variation in survey completion is attributed to the setting, location, and timing of test administration (eg, a busy inpatient rotation vs a nonclinical scholarly rotation, and protected time for completion vs use of personal time). Low response rates and the inability to pair an individual resident’s pre- and postresponses on the knowledge tests made analysis challenging. A shorter observation period could potentially have mitigated this, although our curriculum and study were designed to be longitudinal and to assess residents over a longer period of time. Additionally, because our response rate fluctuated throughout the study and we utilized unmatched comparisons, it is possible that less interested learners chose not to repeat testing, leading to volunteer bias. Data to properly analyze this potential bias were not available because learner confidentiality was maintained. The results of the skill survey and ASs are significant but limited in that they reflect learners’ self-perceptions reported at a single point in time. The study could have been strengthened by direct observation and assessment of skills.

Finally, as the culture of medicine has evolved toward value-based care and discussion of patient quality and safety has become more prevalent in medical education, it is difficult to assess what impact a cultural shift outside our curriculum had on learners’ attitudes. As the curriculum gained momentum, cycles of improvement were applied to the curriculum itself, including moving components to ambulatory rotations (Fig. 1).

Fig. 1


CONCLUSIONS

The QIPSC increased QI/PS knowledge and improved resident skills and attitudes. Key lessons learned in successfully developing and implementing a quality/safety curriculum include acknowledging a number of learner, faculty, and institutional needs; being flexible and responsive to these needs; and integrating key concepts of adult learning theory and QI/PS methodology. Ongoing goals for the curriculum include improving acquired resident knowledge, developing more robust evaluation tools, expanding QIPSC to additional learner groups and programs, and linking learner training to patient-level outcomes. Follow-up results from subsequent QIPSC learner cohorts will be necessary to measure the true impact of this curriculum: behavior change and improvements in practice.


REFERENCES

1. Co JP. Educating for quality: quality improvement as an activity of daily learning to improve educational and patient outcomes. Acad Pediatr. 2014;14:1–3.
2. Lurie SJ, Mooney CJ, Lyness JM. Measurement of the general competencies of the accreditation council for graduate medical education: a systematic review. Acad Med. 2009;84:301–309.
3. Kelz RR, Sellers MM, Reinke CE, et al. Quality in-training initiative: a need for education in quality improvement: results from a survey of program directors. J Am Coll Surg. 2013;217:1126–1132.e1–e5.
4. AAMC Expert Panel. Teaching for Quality: Integrating Quality Improvement and Patient Safety Across the Continuum of Medical Education. 2013.
5. Craig MS, Garfunkel LC, Baldwin CD, et al. Pediatric resident education in quality improvement (QI): a national survey. Acad Pediatr. 2014;14:54–61.
6. Mann KJ, Craig MS, Moses JM. Quality improvement educational practices in pediatric residency programs: survey of pediatric program directors. Acad Pediatr. 2014;14:23–28.
7. Gupta M, Ringer S, Tess A, et al. Developing a quality and safety curriculum for fellows: lessons learned from a neonatology fellowship program. Acad Pediatr. 2014;14:47–53.
8. Knowles MS. Andragogy in Action: Applying Modern Principles of Adult Education. San Francisco, CA: Jossey-Bass; 1984.
9. Kern DE. Curriculum Development for Medical Education: A Six-Step Approach. 2nd ed. Baltimore, MD: Johns Hopkins University Press; 2009.
10. Kirkpatrick DL. Evaluating Training Programs: The Four Levels. San Francisco, CA: Berrett-Koehler Publishers; 1994.
11. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65:S63–S67.
12. Reed D, Wittich C, Drefahl M, et al. A Quality Improvement Curriculum for Internal Medicine Residents. MedEdPORTAL. Available at: http://www.aamc.org/mededportalID=7733. Accessed October 14, 2013.
13. Ogrinc G, Headrick LA, Morrison LJ, et al. Teaching and assessing resident competence in practice-based learning and improvement. J Gen Intern Med. 2004;19:496–500.
14. Oyler J, Vinci L, Johnson JK, et al. Teaching internal medicine residents to sustain their improvement through the quality assessment and improvement curriculum. Online Appendix 3 – QAIC Toolkit. J Gen Intern Med. 2011; 26:221–225.
15. Philibert I, Gonzalez DRJA, Lannon C, et al. Quality improvement skills for pediatric residents: from lecture to implementation and sustainability. Acad Pediatr. 2014;14:40–46.
16. Simasek M, Ballard SL, Phelps P, et al. Meeting resident scholarly activity requirements through a longitudinal quality improvement curriculum. J Grad Med Educ. 2015;7:86–90.
17. Balmer DF, Quiah S, DiPace J, et al. Learning across the explicit, implicit, and extra-curricula: an exploratory study of the relative proportions of residents’ perceived learning in clinical areas at three pediatric residency programs. Acad Med. 2015;90:1547–1552.
Copyright © 2016 The Authors. Published by Wolters Kluwer Health, Inc. All rights reserved.