RIME

Educating Future Physicians to Track Health Care Quality

Feasibility and Perceived Impact of a Health Care Quality Report Card for Medical Students

O’Neill, Sean M., PhD; Henschen, Bruce L., MD, MPH; Unger, Erin D., MD; Jansson, Paul S., MS; Unti, Kristen; Bortoletto, Pietro; Gleason, Kristine M., MPH, RPh; Woods, Donna M., PhD; Evans, Daniel B., MD

doi: 10.1097/ACM.0b013e3182a36bb5


New models of care such as the patient-centered medical home (PCMH)1 show promise in lowering costs and improving patient care by empowering patients and mobilizing primary care physicians as leaders of health teams.2,3 The core principles of the PCMH align with many of the Carnegie Foundation recommendations for medical education reform,4,5 and leading physician organizations have called on medical schools to teach the principles of the PCMH.6,7 However, current medical school curricula provide minimal, if any, exposure to emerging primary care models such as the PCMH.8,9 Some schools offer students meaningful opportunities to follow patients longitudinally,10–12 but few such models exist in medical school curricula.13 Opportunities for medical students to track health care quality metrics and measure outcomes for an authentic patient panel as part of a longitudinal clerkship have not been reported in the literature.

In 2011, the Education-Centered Medical Home (ECMH) launched at Northwestern University’s Feinberg School of Medicine (NUFSM).14 The ECMH is a longitudinal clerkship designed to integrate teams of medical students into outpatient clinics that focus on adopting the principles of the PCMH, including continuity with a personal physician; team-based care; care coordination and integration; quality and safety; and enhanced access to care.6 Specific components of the ECMH learning environment include longitudinal clinical experiences; care coordination of medically complex patients; peer teaching; quality assessment and improvement activities; and monthly “grand rounds” didactic sessions. After a one-year pilot, the ECMH was expanded to 13 clinics and 202 students (29% of the NUFSM student body) for the 2012–2013 academic year (see Table 1).

Table 1: Medical Students Participating in the Education-Centered Medical Home and Chart Abstraction Practicum, Northwestern University Feinberg School of Medicine, 2012–2013

A core objective of the ECMH curriculum was for every student to contribute to the development of a quality metric scorecard using National Quality Forum–endorsed measures.15 Our goal in this study was to introduce medical students, who had little background knowledge related to quality improvement (QI), to the concepts and process of QI by having them participate in a quality assessment practicum.

We set out to create a practical tool for tracking ECMH program performance centrally and for highlighting quality deficits and areas for improvement at the individual clinic level. We tasked all medical students enrolled in the ECMH with measuring the quality of their clinic’s care, with the intention of providing firsthand experience in abstracting QI data, using available QI metrics, applying those metrics to an authentic patient panel, and then using clinical data to drive improvement efforts. Here we report the year one results regarding the feasibility and perceived impact of our ECMH quality report card curriculum.

Method

ECMH setting and student participants

The organizational design of the ECMH has been previously reported,14,16 but here we provide a brief summary. We recruited students from all four years of medical education at NUFSM to participate in the ECMH curriculum. We matched them in teams of 16 (4 students from each class) to existing faculty practices. Each week, 7 or 8 students from each team would attend clinic and provide direct patient care under the traditional preceptor model, with third- and fourth-year students directly observing and mentoring their first- and second-year peers. Students also assumed the roles of health coaches and care coordinators in an effort to deliver PCMH-level care.1,6,14,17

In the 2011–2012 academic year, 56 students worked across 4 pilot clinics. On the basis of positive reactions from students, preceptors, and patients, the program was expanded to 202 students across 13 clinics for the 2012–2013 academic year (see Table 1). Three of the pilot clinics continued uninterrupted, and we refer to these as “experienced” clinics in this report. One hundred ninety-seven out of the 202 available students consented to be part of the educational study and to allow their aggregated assessment results and surveys to be analyzed and reported. The Northwestern University institutional review board approved this study.

Patient selection

From their existing panels, preceptors recruited “high-risk” patients, defined as any patient who had complex medical or social issues, required three to four office visits per year, and/or had two or more emergency room visits or hospital admissions during the past year. Preceptors verbally explained to potential enrollees that they would be seeing medical students in clinic who would serve as their care coordinators and would contact them outside clinic for follow-up. Patients who agreed to participate had their routine appointments steered toward ECMH clinic days but could also schedule appointments with their primary physician for urgent issues at other times. Patients were defined as “enrolled” with the ECMH team on the date they were identified by the preceptor as “in need of outreach” and were contacted by phone for enrollment, or the date of their first visit with the student team, whichever came first; this rule captured patients whom students contacted by phone before their first clinic visit. We recorded each patient’s date of enrollment.
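As a minimal sketch of this enrollment rule, the enrollment date is simply the earlier of the two qualifying dates. The function name and fields below are illustrative, not part of the study's actual tooling:

```python
from datetime import date
from typing import Optional

def enrollment_date(outreach_contact: Optional[date],
                    first_team_visit: Optional[date]) -> Optional[date]:
    """Return the earlier of the outreach-contact date and the first
    student-team visit date; either date may be missing."""
    candidates = [d for d in (outreach_contact, first_team_visit) if d is not None]
    return min(candidates) if candidates else None

# A patient phoned on March 3, 2012, who first attended clinic on
# April 10, 2012, counts as enrolled on March 3.
print(enrollment_date(date(2012, 3, 3), date(2012, 4, 10)))  # 2012-03-03
```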

ECMH interventions

Students were assigned to serve as “health coaches” for one to six patients, depending on the student’s training level, under the supervision of their clinic preceptor. In addition to coordinating their patients’ ambulatory visits, students in the ECMH provided a range of services to empaneled patients, including telephone outreach, calls to specialists to coordinate care, health behavior coaching, and real-time identification of quality measure deficits during outpatient visits (e.g., immunization or diabetic foot exam deficiencies).

Quality indicators

We incorporated a total of 27 adult and 3 pediatric National Quality Forum–endorsed quality indicators into an ECMH “Quality Scorecard.”15 These included 19 measures reflecting chronic disease management as well as 9 adult and 2 pediatric preventive care measures. Metrics suggested by the Commonwealth Fund for tracking QI in the PCMH environment were heavily represented on the ECMH scorecard.18 Scorecards were used by each ECMH clinic to view their clinic’s aggregate performance and discuss strategies for real-time clinical improvement.

Data abstraction and analysis

Students participating in the ECMH attended a large-group session on quality measurement and data abstraction. Most students reported little previous experience with quality measurement or improvement (5 students reported > 50 hours, but the remaining 192 students estimated an average of 2.8 hours of prior QI training). Over two months, all students were given a series of background readings on health care QI,18–22 were sent a link to a secure Web-based abstraction instrument built using SurveyMonkey.com (SurveyMonkey, Palo Alto, California), and were instructed to use the tool to abstract quality metrics from patients in their personal panel. This abstraction tool included 157 items designed to calculate each of the quality indicators; skip logic built into the instrument reduced the number of items answered for patients who did not meet qualifying criteria (e.g., questions for diabetes-specific indicators were skipped if a patient did not have diabetes).
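To illustrate how skip logic shortens the instrument, here is a minimal sketch assuming a simple mapping from qualifying conditions to question groups; the item texts and groupings are invented and do not reproduce the actual 157-item instrument:

```python
# Hypothetical question groups keyed by qualifying condition; "all"
# items are asked for every patient. Invented for illustration only.
ITEM_GROUPS = {
    "all":      ["Weight documented?", "Influenza immunization given?"],
    "diabetes": ["HbA1c measured this year?", "Dilated eye exam this year?",
                 "Foot exam this year?"],
    "asthma":   ["Inhaled corticosteroid prescribed?"],
}

def items_for_patient(conditions: set) -> list:
    """Assemble the questions shown for one patient, skipping groups
    whose qualifying condition is absent (the instrument's skip logic)."""
    items = list(ITEM_GROUPS["all"])
    for condition, questions in ITEM_GROUPS.items():
        if condition != "all" and condition in conditions:
            items.extend(questions)
    return items

# A patient without diabetes is never shown the diabetes-specific items.
print(items_for_patient({"asthma"}))
```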

Using the electronic medical record, students entered deidentified patient data from 2010, 2011, and 2012 into the abstraction instrument. Patient data were abstracted for 2010, prior to the ECMH launch, to establish each patient’s baseline performance on each quality metric; patients with no 2010 data still contributed their 2011 and 2012 data to the study. Students practiced using the abstraction instrument by entering “test patients” to become familiar with the data-gathering process; these practice entries were excluded from the final dataset. We deidentified patients by assigning each a generic ECMH identification number. To assess the validity of the data, a physician (D.E.) performed an implicit review of a random 5% sample (n = 20) of the student-abstracted charts. To assess the reliability of our abstraction process, a team of three trained chart reviewers independently abstracted the same sample. We then calculated pairwise agreement and the kappa statistic between the trained and student reviewers on the eligibility and pass determinations for all indicators.
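For concreteness, pairwise percent agreement and Cohen's kappa can be computed directly from matched pass/fail determinations. The following sketch uses invented ratings, not study data:

```python
def percent_agreement(a, b):
    """Fraction of determinations on which two abstractors agree."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa for two raters on binary (pass/fail) determinations."""
    n = len(a)
    p_obs = percent_agreement(a, b)
    # Expected chance agreement from each rater's marginal "pass" rate.
    pa, pb = sum(a) / n, sum(b) / n
    p_chance = pa * pb + (1 - pa) * (1 - pb)
    return (p_obs - p_chance) / (1 - p_chance)

# Invented pass/fail calls by a student and a trained reviewer.
student = [True, True, False, True, False, True, True, False]
trained = [True, True, True,  True, False, True, False, False]
print(percent_agreement(student, trained))       # 0.75
print(round(cohens_kappa(student, trained), 2))  # 0.47
```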

We asked students to complete the previously developed Quality Assessment and Improvement Curriculum (QAIC) Toolkit survey of self-perceived QI skills both before and after the intervention.23,24 We compared pre- and postintervention surveys using the Wilcoxon signed rank test because the QAIC survey collects responses on an ordinal, four-point Likert-type scale.
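A minimal sketch of this paired comparison, assuming invented four-point Likert ratings (1 = not at all confident through 4 = very confident); note that SciPy's default zero-handling discards tied pre/post pairs, which are common on short ordinal scales:

```python
from scipy.stats import wilcoxon

# Invented paired ratings for ten students (not study data).
pre  = [1, 2, 2, 1, 3, 2, 2, 1, 2, 3]
post = [2, 3, 2, 2, 3, 3, 2, 2, 3, 3]

# The test uses the signed ranks of the nonzero pre/post differences;
# pairs with zero difference are dropped under the default zero_method.
stat, p = wilcoxon(pre, post)
print(f"W = {stat}, p = {p:.3f}")
```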

We calculated descriptive statistics for the student abstraction process and the patient panel. For each quality indicator in each year, we determined the proportion of eligible patients who received the specified care. All statistical analyses were conducted in Stata statistical software, version 12.0 (StataCorp, College Station, Texas). As data were collected, we communicated ongoing results to students at monthly grand rounds large-group sessions, as well as through regular e-mails from the ECMH director (D.E.). Clinic-specific quality scorecards were generated from the final abstraction results and given to each student team to identify quality deficits and encourage patient outreach.
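Conceptually, each scorecard cell is the proportion of indicator-eligible patients whose charts show the specified care in a given year. A minimal pandas sketch with toy records (column names and values are assumptions, not the study's dataset):

```python
import pandas as pd

# Toy abstraction records: one row per patient-indicator-year.
records = pd.DataFrame(
    [
        (1, 2012, "diabetic_foot_exam",     True,  True),
        (2, 2012, "diabetic_foot_exam",     True,  False),
        (3, 2012, "diabetic_foot_exam",     False, False),  # not eligible
        (1, 2012, "influenza_immunization", True,  True),
        (2, 2012, "influenza_immunization", True,  True),
    ],
    columns=["patient_id", "year", "indicator", "eligible", "met"],
)

# Restrict to eligible patients, then take the mean of the boolean
# "met" column within each indicator-year cell.
scorecard = (
    records[records["eligible"]]
    .groupby(["indicator", "year"])["met"]
    .mean()
    .rename("proportion_met")
)
print(scorecard)  # foot exam: 0.50; influenza immunization: 1.00
```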

Results

Seventy-six percent of students (149/197) completed at least one patient record abstraction (range: 1–10; see Table 1). In total, 405 patient records were abstracted (mean: 2.7 per student; SD: 1.8). Third-year students abstracted 3.3 records on average compared with 1.6 records for first-year students. Median abstraction time was 21.8 minutes (interquartile range: 13.1–37.1). Students in the experienced ECMH clinics that had participated in the pilot year tended to abstract more records (3.3 versus 2.5) but showed no difference in abstraction time. Mean agreement across all abstractor pairs (n = 20) was 86%, and the kappa statistic was 0.59. Mean agreement across abstractor pairs that included a first-year student abstractor (n = 4) was much poorer (61%, kappa = 0.12) than across pairs with second- through fourth-year abstractors (92%, kappa = 0.60).

Among the abstracted records, 100 patients were enrolled by the end of 2011, 355 by the end of 2012, and 405 by the end of the study in February 2013. Patients’ characteristics are displayed in Table 2. Patients were eligible for an average of 8.0 indicators (SD 5.0, range 0–21), meaning that they met criteria to be screened for those indicators, and more than 10% of the sample population were eligible for 22 of the 27 adult indicators (see Table 3). Weight screening and influenza immunization were the indicators for which patients were most commonly eligible (85% each); by contrast, less than 5% of patients were eligible for several heart-disease-related metrics and pediatric measures.

Table 2: Characteristics of Abstracted Education-Centered Medical Home Patients, Northwestern University Feinberg School of Medicine, 2012–2013

Table 3: Education-Centered Medical Home Patient Performance on Quality Report Card Indicators for Calendar Year 2012, Northwestern University Feinberg School of Medicine

Overall performance on quality measures ranged from a high of 100% for beta-blocker use among patients with a history of myocardial infarction to a low of 24% for dilated diabetic eye exams (see Table 3). We performed exploratory analyses examining trends in performance by year, ECMH enrollment, duration of enrollment, and clinic experience. From 2010 (pre-ECMH) to 2012, the greatest overall performance gains were observed in diabetic foot exams (22% versus 55%), chlamydia screening rates (32% versus 62%), medical attention to diabetic nephropathy (65% versus 91%), and the use of inhaled steroids for moderate-to-severe persistent asthma (71% versus 100%).

A total of 147 of 197 students (75%) completed both the pre- and postintervention QAIC surveys of self-assessed QI skills (see Table 4).25 Mean baseline confidence in QI skills was at least “slightly” confident in most domains, with postintervention confidence ratings advancing to between “slightly” and “moderately” confident. At the end of the ECMH QI project, students were asked to rate the educational value and impact of the exercise. Sixty-six percent of students agreed or strongly agreed with the statement “reviewing the quality of care for my individual patients was a valuable exercise,” and 77% agreed or strongly agreed with the statement “prospectively following ECMH quality metrics going forward will be a valuable exercise.”

Table 4: Students’ (n = 147) Mean Ratings of Their Self-Assessment of Quality Improvement Learning Objectives, Northwestern University Feinberg School of Medicine, 2012–2013

Discussion and Conclusions

Our results demonstrate the feasibility of medical students participating firsthand in health care quality measurement, using a Web-based abstraction instrument, for patients they have seen on a longitudinal clerkship. A quick-turnaround “quality scorecard” can highlight specific care deficits on which to focus improvement efforts. A substantial proportion of patients were eligible for most of the indicators used, which illustrates the suitability of this particular scorecard for this set of primary care clinics.

Several aspects of this project are distinctive from an educational perspective. First, data abstraction for quality measurement is typically carried out by trained abstractors who specialize in quality measurement. We showed that a distributed data collection mechanism, employing medical students for at most a few hours each, could rapidly produce useful quality measurement data. Second, by being so deeply involved in the quality measurement process, students experienced firsthand the limitations and frustrations that accompany translating the complex act of clinical care into objective statistics.

Students’ self-assessments further reflect these frustrations: on average, they reported only moderate comfort with their skills after the exercise. The low pre- and postassessment QAIC ratings are similar to students’ and residents’ QI self-confidence ratings in recent studies26,27 and may also stem from little prior exposure to data abstraction and from the short study length. Further exposure to directed didactics and training modules for QI skills, beyond the much broader ECMH “grand rounds” curriculum, may have increased self-confidence ratings. Despite these challenges, student feedback was largely positive and may reflect an appreciation of the importance of accurate and timely data collection for informing QI efforts. After seeing their clinics’ performance outcomes, participating students shared during grand rounds discussions that the process forced them to consider clinical care at a level they had not fully appreciated before. We believe that exposure to such a “deep-dive” view of their own clinical care during medical school will engender familiarity with, and an informed perspective on, the quality metrics by which they will be assessed for the remainder of their careers. Future planned analyses of the ECMH intervention include tracking students’ academic and clinical performance over time.

This study has several limitations. First, our data collection process was distributed widely among 149 independent abstractors, and an agreement statistic of 86% may not be optimal. There are no widely accepted thresholds for interrater reliability, but high reliability is desirable when quality indicators are used to drive organizational improvement.28 However, given the wide variation in abstractor training level (first- through fourth-year students) and experience, and the ECMH’s primary emphasis on hands-on educational experience (versus “gold standard” quality measurement), our results are an encouraging first step toward building the practical tools by which our students will eventually assess their performance using clinical measures. Second, the ECMH intervention occurred across multiple clinics in multiple settings with differing resources, practice habits, and patient populations, and the challenges facing each clinic varied. Thus, the treatment effect of the ECMH model itself on patient care is difficult to ascertain, and some improvements may be attributable to the Hawthorne effect. Third, we did not prospectively identify a group of control clinics against which to randomize the ECMH intervention. However, applying traditional biomedical research methodology to complex social and organizational interventions is difficult, and its utility is controversial.29,30 The ECMH is an evolving, ongoing intervention within a medical school and multiple independent physician groups, and research projects are under way to evaluate its educational and QI effectiveness through a variety of methodological approaches.

In summary, we demonstrated the feasibility of an authentic QI curriculum for medical students embedded in a longitudinal clerkship. Using a rapid and convenient abstraction instrument, students identified opportunities to improve the quality of care they delivered to their patients, and their perceptions of their own QI skills improved from baseline. We hope that by participating integrally in a real-time quality measurement process, medical students will be inspired to generate meaningful QI interventions within the ECMH. Furthermore, learning QI techniques in a practical setting may provide ECMH graduates with the tools needed to measure and improve care in their future workplaces. Further work is required to refine our abstraction instrument and to follow these data over time to determine whether our students can improve the quality of care for their personal patient panels.

References

1. Patient-Centered Primary Care Collaborative. Defining the medical home. http://www.pcpcc.net/content/joint-principles-patient-centered-medical-home. Accessed June 6, 2013.
2. Stange KC, Nutting PA, Miller WL, et al. Defining and measuring the patient-centered medical home. J Gen Intern Med. 2010;25:601–612.
3. Jackson GL, Powers BJ, Chatterjee R, et al. The patient-centered medical home: A systematic review. Ann Intern Med. 2013;158:169–178.
4. Irby DM, Cooke M, O’Brien BC. Calls for reform of medical education by the Carnegie Foundation for the Advancement of Teaching: 1910 and 2010. Acad Med. 2010;85:220–227.
5. Cooke M, Irby D, O’Brien B. Educating Physicians: A Call for Reform of Medical School and Residency. San Francisco, Calif: Jossey-Bass; 2010.
6. Baxley E, Dearing J, Esquivel M, et al. Joint Principles for the Medical Education of Physicians as Preparation for Practice in the Patient-Centered Medical Home. http://www.acponline.org/running_practice/delivery_and_payment_models/pcmh/understanding/educ-joint-principles.pdf. Accessed June 6, 2013.
7. Lieberman SA, McCallum RM, Anderson GD. A golden opportunity: The coevolution of medical and education homes. Acad Med. 2011;86:1342.
8. Joo P, Younge R, Jones D, Hove J, Lin S, Burton W. Medical student awareness of the patient-centered medical home. Fam Med. 2011;43:696–701.
9. Lausen H, Kruse JE, Barnhart AJ, Smith TJ. The patient-centered medical home: A new perspective for the family medicine clerkship. Fam Med. 2011;43:718–720.
10. Teherani A, Irby DM, Loeser H. Outcomes of different clerkship models: Longitudinal integrated, hybrid, and block. Acad Med. 2013;88:35–43.
11. Hirsh D, Gaufberg E, Ogur B, et al. Educational outcomes of the Harvard Medical School–Cambridge integrated clerkship: A way forward for medical education. Acad Med. 2012;87:643–650.
12. Poncelet A, Bokser S, Calton B, et al. Development of a longitudinal integrated clerkship at an academic medical center. Med Educ Online. 2011;16:5939. http://med-ed-online.net/index.php/meo/article/view/5939. Accessed August 7, 2013.
13. Ogrinc G, Mutha S, Irby DM. Evidence for longitudinal ambulatory care rotations: A review of the literature. Acad Med. 2002;77:688–693.
14. Henschen BL, Garcia PM, Jacobson B, et al. The patient centered medical home as curricular model: Perceived impact of the “education-centered medical home.” J Gen Intern Med. 2013;28:1105–1109.
15. National Quality Forum. Measures, reports and tools. http://www.qualityforum.org/Measures_Reports_Tools.aspx. Accessed June 6, 2013.
16. Evans D. The patient-centered medical home as a curricular model: Medical students need an “educational home.” Acad Med. 2011;86:e2.
17. American College of Physicians. The Patient-Centered Medical Home Neighbor: The Interface of the Patient-Centered Medical Home With Specialty/Subspecialty Practices. http://www.acponline.org/advocacy/where_we_stand/policy/pcmh_neighbors.pdf. Published August 1, 2010. Accessed June 6, 2013.
18. Rosenthal MB, Abrams MK, Bitton A, and the Patient-Centered Medical Home Evaluators’ Collaborative. Recommended Core Measures for Evaluating the Patient-Centered Medical Home: Cost, Utilization, and Clinical Quality. http://www.commonwealthfund.org/~/media/Files/Publications/Data%20Brief/2012/1601_Rosenthal_recommended_core_measures_PCMH_v2.pdf. Accessed June 6, 2013.
19. McGlynn EA, Asch SM, Adams J, et al. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348:2635–2645.
20. Batalden PB, Davidoff F. What is “quality improvement” and how can it transform healthcare? Qual Saf Health Care. 2007;16:2–3.
21. Campbell SM, Braspenning J, Hutchinson A, Marshall MN. Research methods used in developing and applying quality indicators in primary care. BMJ. 2003;326:816–819.
22. Aron DC, Headrick LA. Educating physicians prepared to improve care and safety is no accident: It requires a systematic approach. Qual Saf Health Care. 2002;11:168–173.
23. Oyler J, Vinci L, Johnson J, Arora V. Quality Assessment and Improvement Curriculum (QAIC) Toolkit. http://medqi.bsd.uchicago.edu/documents/QAICToolKit5-09.pdf. Accessed June 6, 2013.
24. Oyler J, Vinci L, Arora V, Johnson J. Teaching internal medicine residents quality improvement techniques using the ABIM’s practice improvement modules. J Gen Intern Med. 2008;23:927–930.
25. American Academy of Family Physicians. Model for Improvement Video. http://www.aafp.org/practice-management/pcmh/overview/videos.html. Accessed August 7, 2013.
26. Shunk R, Dulay M, Julian K, et al. Using the American Board of Internal Medicine practice improvement modules to teach internal medicine residents practice improvement. J Grad Med Educ. 2010;2:90–95.
27. Levitt DS, Hauer KE, Poncelet A, Mookherjee S. An innovative quality improvement curriculum for third-year medical students. Med Educ Online. 2012;17:18391. http://med-ed-online.net/index.php/meo/article/view/18391/html. Accessed August 7, 2013.
28. Quality Measurement and Health Assessment Group (QMHAG), Centers for Medicare and Medicaid Services. CMS Measures Management Special Project. Adopted from NQF Final Evaluation Criteria. http://www.hsag.com/services/special/mms.aspx. Accessed June 6, 2013.
29. Grembowski D, Anderson ML, Conrad DA, et al. Evaluation of the group health cooperative access initiative: Study design challenges in estimating the impact of a large-scale organizational transformation. Qual Manag Health Care. 2008;17:292–303.
30. Berwick DM. The science of improvement. JAMA. 2008;299:1182–1184.
© 2013 by the Association of American Medical Colleges