
A Model for Catalyzing Educational and Clinical Transformation in Primary Care: Outcomes From a Partnership Among Family Medicine, Internal Medicine, and Pediatrics

Eiff, M. Patrice MD; Green, Larry A. MD; Holmboe, Eric MD; McDonald, Furman S. MD; Klink, Kathleen MD; Smith, David Gary MD; Carraccio, Carol MD; Harding, Rose; Dexter, Eve MS; Marino, Miguel PhD; Jones, Sam MD; Caverzagie, Kelly MD; Mustapha, Mumtaz MD; Carney, Patricia A. PhD

doi: 10.1097/ACM.0000000000001167
Research Reports

Purpose To report findings from a national effort initiated by three primary care certifying boards to catalyze change in primary care training.

Method In this mixed-method pilot study (2012–2014), 36 faculty in 12 primary care residencies (family medicine, internal medicine, pediatrics) from four institutions participated in a professional development program designed to prepare faculty to accelerate change in primary care residency training by uniting them in a common mission to create effective ambulatory clinical learning environments. Surveys administered at baseline and 12 months after initial training measured changes in faculty members’ confidence and skills, continuity clinics, and residency training programs. Feasibility evaluation involved assessing participation. The authors compared quantitative data using Wilcoxon signed-rank and Bhapkar tests. Observational field notes underwent narrative analysis.

Results Most participants attended two in-person training sessions (92% and 72%, respectively). Between baseline and 12 months, faculty members’ confidence in leadership improved significantly for 15/19 (79%) variables assessed; their self-assessed skills improved significantly for 21/22 (95%) competencies. Two medical home domains (“Continuity of Care,” “Support/Care Coordination”) improved significantly (P < .05) between the two time periods. Analyses of qualitative data revealed that interdisciplinary learning communities formed during the program and served to catalyze transformational change.

Conclusions Results suggest that improvements in faculty perceptions of confidence and skills occurred and that the creation of interdisciplinary learning communities catalyzed transformation. Lengthening the intervention period, engaging other professions involved in training the primary care workforce, and adopting a more discriminating evaluation design are needed to scale this model nationally.

M.P. Eiff is professor and vice chair, Department of Family Medicine, Oregon Health & Science University, Portland, Oregon.

L.A. Green is professor of family medicine, Epperson-Zorn Chair for Innovation in Family Medicine and Primary Care, University of Colorado, Denver, Colorado.

E. Holmboe is senior vice president, Milestone Development and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois.

F.S. McDonald is senior vice president, Academic and Medical Affairs, American Board of Internal Medicine, Philadelphia, Pennsylvania.

K. Klink is director, Medical & Dental Education, Department of Veterans Affairs Office of Academic Affiliations, Washington, DC.

D.G. Smith is director, Graduate Medical Education, Abington Memorial Hospital, Abington, Pennsylvania, and clinical associate professor of medicine, Temple University School of Medicine, Philadelphia, Pennsylvania.

C. Carraccio is vice president, Competency-Based Assessment Program, American Board of Pediatrics, Chapel Hill, North Carolina.

R. Harding is research assistant, Department of Family Medicine, Oregon Health & Science University, Portland, Oregon.

E. Dexter is biostatistician, Department of Family Medicine, Oregon Health & Science University, Portland, Oregon.

M. Marino is assistant professor, Department of Family Medicine, Oregon Health & Science University, Portland, Oregon.

S. Jones is program director, Virginia Commonwealth University–Fairfax Residency Program, Fairfax, Virginia.

K. Caverzagie is associate dean for educational strategy, University of Nebraska School of Medicine, Omaha, Nebraska.

M. Mustapha is assistant professor, Department of Internal Medicine and Department of Pediatrics, University of Minnesota, Minneapolis, Minnesota.

P.A. Carney is professor of family medicine, School of Medicine, and professor of public health, School of Public Health, Oregon Health & Science University, Portland, Oregon.

Funding/Support: The initial 18-month Primary Care Faculty Development Initiative pilot program was conducted by Oregon Health & Science University pursuant to contract HHSH250201200023C with the U.S. Department of Health and Human Services, Health Resources and Services Administration. An additional 6-month period for assessment of longer-term outcomes and further training was funded by the Josiah Macy, Jr. Foundation, the American Board of Internal Medicine, the American Board of Family Medicine Foundation, and the American Board of Pediatrics Foundation.

Other disclosures: None reported.

Ethical approval: This project, including all evaluation activities, received an exemption from Oregon Health & Science University’s institutional review board (IRB) in accordance with 45CFR46.101(b)[1] (IRB#9227). Additionally, the IRB of each of the four participating institutions deemed the evaluation exempt.

Disclaimer: The views expressed are those of the authors and do not necessarily represent the views of the U.S. Department of Health and Human Services, including the Health Resources and Services Administration, the U.S. Department of Veterans Affairs, or the U.S. federal government.

Previous presentations: Some of the results of the Primary Care Faculty Development Initiative were presented at the 2015 Accreditation Council for Graduate Medical Education Annual Educational Conference on February 28, 2015, in San Diego, California, and the 11th Annual Association of American Medical Colleges Health Workforce Research Conference on May 1, 2015, in Alexandria, Virginia.

Correspondence should be sent to M. Patrice Eiff, Oregon Health & Science University, 3181 S.W. Sam Jackson Park Rd., MC: FM, Portland, OR 97239; telephone: (503) 494-6610; e-mail: eiff@ohsu.edu.

A better-prepared and more effective primary care workforce is urgently needed to meet the “triple aim” of improved care, health, and cost,1,2 yet existing physician residency programs are not designed to produce the necessary results. The current graduate medical education (GME) system lacks accountability in producing the types of physicians that today’s health care system requires.3–6 The primary care workforce needs better training in systems thinking, quality improvement, cost of care, care coordination, and team-based care. Research has shown that the clinical learning environment in which future physicians train affects their future performance and that effective clinical systems are necessary to produce competent physicians.7–9 These findings suggest that to effectively learn to practice in advanced models of primary care, such as the patient-centered medical home (PCMH), trainees should experience such models during training.10–12 Faculty trained before these models developed are unlikely to have experienced any formal training in practice transformation. Professional development programs are needed to provide faculty with skills to create effective ambulatory clinical learning environments while concomitantly teaching important new primary care competencies.

Traditionally, professional development programs have been delivered within single disciplines as one-time “bolus” workshops. Such programs fail to address the reality that the acquisition of expertise occurs both longitudinally and experientially.13 Many of the PCMH competencies, such as team-based care and quality improvement, require ongoing practical experiences and reinforcement.14 Iterative educational experiences should serve as core features of collaborative learning through which participants support one another and contribute to one another’s success.15 These strategies have gained traction in other practice redesign collaboratives and are especially important during early stages of practice transformation.16,17 Given the evolving nature of the PCMH model and the lack of faculty with expertise in this model in residency ambulatory clinics, a learning community approach would position the primary care disciplines for effective change.14,18

The Primary Care Faculty Development Initiative (PCFDI), which takes just such a collaborative, iterative approach to preparing faculty to teach in and for 21st-century primary care practices, is described in detail elsewhere.19 The initiative began as a collaborative effort of the certifying boards in family medicine, internal medicine, and pediatrics. The Health Resources and Services Administration and the Josiah Macy Jr. Foundation supported the initiative. The overarching goal was to catalyze meaningful change in preparing the primary care workforce using an evidence-guided professional development program focused on the competencies needed to transform primary care training practices and residencies. The aim of the pilot project resulting from the initiative was to successfully enroll faculty teams from the three primary care disciplines in an educational program that fostered collaboration which would, in turn, lead to shared efforts. An additional aim was to assess the feasibility and acceptability of this type of interdisciplinary faculty development program. Here we report the effects of this professional development program on faculty participants’ skills and leadership confidence, on the primary care continuity clinics, and on the residency training programs.


Method

The PCFDI

As mentioned, we have described the initial training curriculum, organizational approach, selection process, and evaluation for the PCFDI previously.19 Briefly, we piloted the PCFDI with faculty teams from family medicine, internal medicine, and pediatric residencies at four institutions in the Midwest from 2012 to 2014. We selected these institutions from a pool of 53 applicants through a competitive process that required a team application from representatives of each of the three disciplines. Each institution and the evaluation team, based at Oregon Health & Science University (OHSU), submitted study activities for institutional review board review and were granted approvals, exemptions, or waivers. Neither faculty members nor their programs received compensation for participating, and faculty were not required to give informed consent because of the study’s exempt status.

The PCFDI included two face-to-face training sessions: a two-and-a-half-day meeting in April 2013, during which the training curriculum was implemented and institutional goals were established,19 and a one-and-a-half-day meeting in May 2014 that focused on sharing preliminary results, provided an opportunity for each institutional team to share its progress in transforming its continuity clinics and residency training programs, and included content sessions on leadership and sustainability. The curriculum included six component areas that focused on patient-centered care: (1) leadership, (2) change management, (3) teamwork, (4) population management, (5) systems of care, and (6) competency assessment. Core faculty with expertise in specific areas of the curriculum provided instruction and coaching at both sessions. Designated core faculty and members of the evaluation team visited each institution in September or October 2013 to provide coaching and assess changes in continuity clinics and residency training programs. We conducted two webinars (in June and November 2013) for teams to report their progress and share ideas.


Study measures

We used participation rates to assess the feasibility and acceptability of this faculty development program. We used surveys (see below) to evaluate changes in faculty members’ self-reported skills and leadership confidence. We also used surveys to assess changes in continuity clinics and residency training programs during the time faculty were participating in the pilot program so as to better understand how faculty applied their learning in their respective programs. Finally, we also used qualitative methods (see below) to gain richer information about the PCFDI.

We captured faculty participation through attendance logs for each activity.

We measured perceived confidence in leadership using the validated 19-item Leadership Behavior tool,20 and self-assessed competency using a 38-item Faculty Skill & Program Feasibility Survey designed specifically for this pilot. For both of these measures, we used a post and retrospective pre-assessment design: 6 months after the initial training (November 2013), we asked participants to rate their perceived skill level at baseline, prior to the PCFDI. At follow-up (12 months after the initial training, June 2014), we again surveyed faculty participants, assessing both self-reported leadership confidence and skill competency.

The PCFDI Continuity Clinic Survey, a 17-item instrument also developed specifically for this project, measured continuity clinic characteristics (e.g., number of trainees and clinic sessions) and was administered at baseline (directly after the training, May 2013) and at follow-up (12 months after the initial training, May 2014). To assess PCMH activities in each clinic, we used the PCMH Monitor,21 which has been administered extensively to primary care residencies engaged in residency and practice transformation. Although the PCMH Monitor has undergone revisions over time, we used a version that underwent a factor analysis using data collected from residencies in Colorado over five years. (The factor analysis was done on 43 common or very similar items and yielded 32 items reflecting five domains: quality improvement process [Cronbach alpha = 0.89], team-based care [Cronbach alpha = 0.81], data and population management [Cronbach alpha = 0.91], self-management support and care coordination [Cronbach alpha = 0.87], and continuity of care [Cronbach alpha = 0.76].) Members of the clinical leadership team from each residency program completed this tool at baseline and 12 months after the initial training.
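
For reference, Cronbach’s alpha (the internal consistency coefficient reported above) for a domain of $k$ items is conventionally computed as

$$\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right),$$

where $\sigma^{2}_{Y_i}$ is the variance of item $i$ and $\sigma^{2}_{X}$ is the variance of the summed domain score; values of roughly 0.7 or higher, as observed for all five domains, are generally taken to indicate acceptable internal consistency.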

The Faculty Skill & Program Feasibility Survey captured, in addition to faculty self-reported skill competency, the extent of continuity clinic and residency training practice redesign efforts.

We captured the effects of the redesign efforts on participating residencies qualitatively via field notes recorded by multiple trained observers during trainings, site visits, and focus groups of faculty, residents, and clinical staff conducted during the site visits. Focus group questions assessed the effect of participation in the PCFDI on changes in residency programs and practices. To assess the impact of participation in the PCFDI on efforts to develop other faculty, the creation of learning communities, and redesign efforts, we conducted interviews (at 6 months and 12 months after the initial training) with participants representing each discipline at each institution. (Contact authors for interview and focus group questions.)


Quantitative data analysis

Data sources.

Thirty-six participants, nine from each of the four institutional teams, attended one or more of the PCFDI activities. The nine-member team from each institution was assembled by the three primary care residencies at that institution based on faculty members’ interest in the program. We excluded from the analyses participants who did not complete follow-up surveys because we needed paired data to assess changes between the baseline and follow-up periods.


Statistical analyses.

Because this was a pilot study, we conducted nonparametric analyses to explore changes between the pre- and posttest periods. To test changes between baseline and follow-up, we applied the nonparametric Wilcoxon signed-rank test for pre–post comparisons of faculty members’ confidence in leadership (Leadership Behavior Survey). We also used the Wilcoxon signed-rank test to compare clinic changes (PCFDI Continuity Clinic Survey, PCMH Monitor) across all continuity clinics combined, avoiding any assumption about the distribution of the outcome data. To test for homogeneity of the pre- and posttest distributions of faculty skills, we applied the Bhapkar test,22 a generalized McNemar test, which we also used to assess changes in practice and residency redesign efforts. All statistical tests were two sided, and we defined statistical significance as P < .05. We performed all statistical analyses using R, version 3.1.2 (R Foundation for Statistical Computing, Vienna, Austria).
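
To make the analytic approach concrete, the sketch below (ours, with hypothetical data; the study’s analyses were performed in R) shows how the two nonparametric tests can be run in Python, using scipy for the Wilcoxon signed-rank test and statsmodels’ SquareTable.homogeneity for the Bhapkar variant of the marginal homogeneity test.

```python
# Minimal illustrative sketch with hypothetical data (the study's analyses were run in R).
import numpy as np
from scipy.stats import wilcoxon
from statsmodels.stats.contingency_tables import SquareTable

# Hypothetical paired ratings (0-10) of one leadership-confidence item,
# reported by the same 12 faculty members at baseline and at 12-month follow-up.
baseline = np.array([5, 6, 4, 7, 5, 6, 5, 4, 6, 7, 5, 6])
follow_up = np.array([7, 8, 6, 8, 7, 7, 6, 6, 8, 9, 7, 7])

# Wilcoxon signed-rank test for the paired pre-post comparison (two sided).
w_stat, w_p = wilcoxon(baseline, follow_up, alternative="two-sided")
print(f"Wilcoxon signed-rank: W = {w_stat:.1f}, P = {w_p:.3f}")

# Bhapkar test (a generalized McNemar test) of marginal homogeneity for an ordinal
# skill rating: rows = baseline category, columns = follow-up category
# (hypothetical counts; 1 = beginner, 2 = competent, 3 = proficient).
skill_table = np.array([[4, 6, 2],
                        [1, 5, 7],
                        [0, 2, 9]])
bhapkar = SquareTable(skill_table).homogeneity(method="bhapkar")
print(f"Bhapkar: chi2 = {bhapkar.statistic:.2f}, df = {bhapkar.df}, P = {bhapkar.pvalue:.3f}")
```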


Qualitative data analysis

Three experienced observers from the OHSU evaluation team recorded field notes of site visits, focus groups with program faculty and residents, and key informant interviews. These independent documents were then merged into a single document for each recorded “event” (e.g., each site visit) to reflect a composite record. Then two members of the evaluation team (M.P.E. and P.A.C.) independently reviewed the composite documents for emergent themes using narrative analysis techniques. We applied Riessman’s thematic narrative analysis techniques23 because this approach is case centered rather than cross-theorizing (i.e., it does not “fracture” the data). Thus, the themes identified remain contextual in nature according to who conveyed what. We identified emergent themes and extracted exemplar quotations from the source documents. We coded the themes using both open coding (reading through the data several times and creating tentative labels or codes for chunks of data that summarize the meanings that are emerging from the data) and axial coding (identifying relationships among the open codes), and we tagged the codes with specific definitions to allow for consistency that would facilitate consensus determination. We also used constant comparative analysis, which involved review and re-review of the data to ensure that emergent themes did not overlap. Additionally, we applied emergence and crystallization techniques, which involved in-depth review and discussion of emergent themes. Finally, after two of us (M.P.E. and P.A.C.) independently coded the data, we held consensus meetings until we agreed on final themes and representative exemplar quotations.


Results

Demographics and attendance

The 36 participants in the PCFDI included 18 women and 18 men (data not shown). One participant was a clinical psychologist, and the remaining 35 participants were physicians in family medicine (n = 11), internal medicine (n = 12), or pediatrics (n = 12). Participants had been in a faculty position for an average of 11 years (range 2–30 years); 9 were residency program directors, 8 were associate program directors, and 2 were clinic medical directors. Of the 36 participants, 33 (92%) attended the initial April 2013 training, 17 (47%) attended the June webinar, 14 (39%) attended the November webinar, 26 (72%) attended the second training session in May 2014, 17 (47%) attended both trainings and either webinar, and 6 (17%) attended both trainings and both webinars (data not shown).


Changes in faculty skills

Self-reported confidence in leadership (Table 1) increased for 15 of 19 (79%) variables assessed between baseline (prior to initial training) and follow-up (12 months after initial training). Further, confidence in these 15 variables increased in each of the three disciplines during the study period. The greatest changes were noted for the variables “Help people from different departments determine the root cause of problems” (5.6 at baseline vs. 7.5 at follow-up) and “Handle a more challenging job” (6.5 at baseline vs. 8.5 at follow-up). Faculty self-assessment of competence improved significantly in 21 of 22 (95%) skills covered during the PCFDI training (Table 2). Despite variability at baseline, the greatest improvements appeared to occur in family medicine and internal medicine.


Continuity clinic changes

Table 3 illustrates findings from the PCMH Monitor at baseline and follow-up (12 months after initial training). We noted improvements in all five domains between the two time periods, and improvement in two areas was significant. “Continuity of care” improved from 50.7% to 64.4%, and “self-management support and care coordination” improved from 50.3% to 60.6% (P = .046 for both).


Characteristics of the continuity clinics, by primary care discipline at baseline and follow-up, are shown in Table 4. The average number of residents assigned to these clinics ranged from 6 to 12 per postgraduate-year group and did not differ by discipline (data not shown). The number of half-days residents spent in continuity clinics did not change during the study period. Family medicine residents spent more time in clinic as they progressed from Year 1 to Year 3, while the number of half-day sessions remained the same in the other two disciplines throughout training. At baseline, between 50% and 75% of programs in each discipline were able to get resident-generated data from their electronic health record. Across all clinics combined, this proportion increased from 64% at baseline to 82% at follow-up, though the change was not statistically significant. No other changes between baseline and follow-up were significant.


Changes in practices and residencies

Learning community activities and redesign efforts are shown in Table 5. At baseline, the majority of core faculty in all three disciplines were meeting face-to-face with others in their learning community once or twice per month; by follow-up, meeting frequency had decreased to monthly or at least once every six months (P < .001; see also the Discussion). Core faculty in all disciplines participated in other learning community activities, such as stakeholder engagement and forming new collaborations with those beyond the primary team. The extent of these efforts did not change significantly over the study period.


Faculty were engaged in practice and residency redesign efforts to the same extent at baseline compared with follow-up (12 months after initial training) (Table 5), with the exception of implementing processes to enhance and support teamwork, which increased significantly during this period. Two of the four institutional teams sought funding for their transformation efforts, particularly joint professional development activities. Five proposals were submitted to institutional groups or local foundations, and four of these were funded. The two institutions received awards totaling $41,000 and $5,000, respectively (data not shown).


Qualitative results

Two predominant themes emerged from analyses of qualitative data. The first was the value that the interdisciplinary teams derived from the learning communities. The teams met regularly, built relationships, learned from each other, and held a shared commitment to change. To illustrate, one participant commented, “We have built up trust and deepened our relationships such that it makes collaborating or interacting around other aspects of our work easier,” and another noted, “The power of the collaborative is that you play to each other’s strengths and everyone elevates.”

The second theme was that PCFDI was a catalyzing force, providing additional motivation and structure for getting things done via a national spotlight and with coaching from national experts. One participant stated, “We would never be doing what we are doing if the PCFDI didn’t happen.” Another remarked, “Being a part of this is the best thing that ever happened to us.”

Speaking with one unified voice and keeping leadership informed to garner additional resources was a third, less predominant theme identified in analysis of qualitative data. To illustrate, one participant noted, “Because of PCFDI, we feel like we have formed a powerful collaborative that can now ask for things as a three-discipline unit.”

The work of the teams did not proceed without difficulty, and the challenge of managing multiple competing demands in order to do transformation work formed an additional theme. One participant commented, “I think the reality is that everyone is very busy, and while we’re able to meet, there’s not a lot of time to work on this.” Another said, “I like everyone (in the three disciplines), but when you get back home, efforts fizzle.”


Discussion

The results from this pilot suggest that the PCFDI was an important catalyst for family medicine, internal medicine, and pediatrics to work together to transform continuity clinics and residency training programs. At the individual participant level, faculty gained confidence in multiple leadership behaviors, and their perceived competence in several PCMH and learner assessment skills also increased. Even within a short 12-month evaluation period, many patient-centered care features of the continuity clinics improved in all three disciplines. The desire to do this transformation work was present from the outset, but change may not have taken place (or if it had, it may have taken longer) without the external structure and support that the PCFDI provided.

The interdisciplinary learning community model we used appeared to help faculty sustain their redesign efforts, which required thinking and planning that typically expanded beyond their usual institutional patterns. Faculty from the three disciplines built new working relationships, focused on shared goals rather than cultural differences, and integrated ideas and insights from their multiple perspectives into unified change efforts. The teams met more frequently at the beginning of the project to build relationships and gain momentum and then, as they moved into implementing their ideas, their meetings became more occasional but still continued. The significance of breaking down traditionally “siloed” disciplines in academic institutions cannot be overstated.

The fact that faculty perceived improvements in their skills competence and leadership confidence is notable given that self-determination theory posits that perception of competence (along with relatedness and autonomy) is a key ingredient for enhancing motivation.24 Our finding that one of the greatest changes in faculty confidence was in helping other departments solve problems suggests that the PCFDI may have stimulated relatedness through its interdisciplinary design. The PCFDI’s focus on faculty as motivated change agents may provide an important model for sustaining needed GME improvements.

The PCFDI was intentionally focused on providing faculty development so that faculty could better equip residents for their future practices, and better prepare the primary care workforce for the demands of an evolving health care system. Given the growing need to develop interprofessional team-based care models, future faculty development programs should include other professionals involved in training the primary care workforce such as nurse practitioners, physician assistants, pharmacists, and behavioral health specialists.25–27 This inclusiveness has the potential to create a ripple effect, allowing others to study how this model impacts other members of primary care teams and, subsequently, strengthens interprofessional collaborative practice.28,29

These findings are consistent with those of other redesign collaboratives focused on community practices. The participants in the PCMH National Demonstration Project realized the power of “learning to be a learning organization.”16 Leaders in the safety net clinics participating in a primary care renewal project in Oregon shared an “incubator experience,” which was instrumental in developing engaged leaders, especially during the inspiration phase of transformation.17 A model of professional development that emphasizes application of learning in the local environment, with collaboration and ongoing coaching, aligns with research identifying the elements needed for PCMH transformation work.14 Based on the experience of other redesign collaboratives, two to five years is a realistic time frame to create momentum for change and assess meaningful practice and residency training transformation.30–32

One strength of the evaluation was that the measures we selected were sensitive enough to detect many of the activities associated with the pilot. Further, we achieved high response rates for the core quantitative instruments, and the mixed-method approach proved key to evaluating this type of work by providing rich detail about the process of change and how the learning communities developed. An additional strength is that we constructed the program using evidence for effective programmatic design, incorporated flexibility to meet institution-specific needs, and provided coaching elements to reinforce and support participants’ change efforts.

Weaknesses include the lack of a comparison group and a small sample size without sufficient power to test specific a priori hypotheses. Despite these weaknesses, the pilot allowed us to assess feasibility and acceptability, test the precision of the instruments we used, and explore effect sizes in preparation for a larger, more discriminating study. An additional limitation is that our participating programs likely did not represent typical residencies; they were already motivated to change, as evidenced by their decision to apply. Also, the short duration did not allow us to assess sustainability or the program’s impact on long-term outcomes, such as changes in faculty skills over time, the practice profiles of residents who graduated from participating residency training programs, and the outcomes of patients seen in participating continuity clinics. We do not yet know whether the learning communities will continue to function after external support and the national spotlight are removed.

The four institutions involved in the PCFDI were quite heterogeneous in their contextual features (e.g., sponsorship arrangements, leadership structures, program sizes, clinic settings). Despite this heterogeneity, the results were consistent across institutions with regard to the formation of learning communities and the role the PCFDI played in catalyzing change, indicating that it may be possible to reproduce these results in a larger national sample. The participating teams received no funding other than travel support to attend trainings. From the start, they found new resources or leveraged existing resources to assist their efforts, which supports the model’s potential sustainability.

Despite the different professional perspectives, the participating faculty members at each institution all found a way to work collaboratively—a process that started with merely bringing them together for training. The cooperation within teams was strengthened by collaborative visits and ongoing coaching from the PCFDI faculty. Replicating these elements with a different group of institutions has the potential to catalyze further redesign work and diffuse innovation. Building on the success of the PCFDI, a new professional development program entitled Professionals Accelerating Clinical and Educational Redesign (PACER) will begin a broader implementation of this model. PACER includes faculty from multiple health professions and will examine an expanded set of outcomes to better assess changes related to resident training and patient experience.33


Conclusions

The PCFDI was a successful pilot, leading to improvements in faculty perceptions of confidence and skill and to the creation of interdisciplinary learning communities where none previously existed. This professional development model appears capable of catalyzing transformation in primary care residency continuity clinics and training programs. Improvements, including lengthening the intervention period, engaging other health professions, and employing more sophisticated study designs, are needed if we are to scale this model nationally.

Acknowledgments: The authors gratefully acknowledge the contributions of the core faculty for the Primary Care Faculty Development Initiative: Brad Benson, MD, and Brian Sick, MD, of the University of Minnesota; Steven Crane, MD, of Mountain Area Health Education Center; Perry Dickinson, MD, of the University of Colorado; Ana-Elena Jensen, PhD, patient-centered medical home senior consultant/facilitator; Charles Kilo, MD, MPH, of Oregon Health & Science University; Paul V. Miles, MD, of the American Board of Pediatrics; Eric Warm, MD, of the University of Cincinnati; and Will Miller, MD, MA, of Lehigh Valley Health System. The authors also wish to acknowledge the courage and dedication of the 36 faculty from the four participating institutions and thank them for co-creating a successful program.


References

1. Hoff T, Weller W, DePuccio M. The patient-centered medical home: A review of recent research. Med Care Res Rev. 2012;69:619–644.
2. Nielsen M, Gibson A, Buelt L, Grundy P, Grumbach K. The patient-centered medical home’s impact on cost and quality: An annual update of the evidence, 2013–2014. 2015. Patient-Centered Primary Care Collaborative Web site. http://www.pcpcc.org/resource/patient-centered-medical-homes-impact-cost-and-quality. Accessed February 1, 2016.
3. Institute of Medicine. Graduate Medical Education That Meets the Nation’s Health Needs. 2014. Washington, DC: National Academies Press.
4. Crosson FJ, Leu J, Roemer BM, Ross MN. Gaps in residency training should be addressed to better prepare doctors for a twenty-first-century delivery system. Health Aff (Millwood). 2011;30:2142–2148.
5. Hackbarth G, Boccuti C. Transforming graduate medical education to improve health care value. N Engl J Med. 2011;364:693–695.
6. Bohmer RA. Managing the new primary care: The new skills that will be needed. Health Aff (Millwood). 2010;29:1010–1014.
7. Asch DA, Nicholson S, Srinivas SK, Herrin J, Epstein AJ. How do you deliver a good obstetrician? Outcome-based evaluation of medical education. Acad Med. 2014;89:24–26.
8. Carraccio CL, Englander R. From Flexner to competencies: Reflections on a decade and the journey ahead. Acad Med. 2013;88:1067–1073.
9. Sirovich BE, Lipner RS, Johnston M, Holmboe ES. The association between residency training and internists’ ability to practice conservatively. JAMA Intern Med. 2014;174:1640–1648.
10. Bowen JL, Salerno SM, Chamberlain JK, Eckstrom E, Chen HL, Brandenburg S. Changing habits of practice. Transforming internal medicine residency education in ambulatory settings. J Gen Intern Med. 2005;20:1181–1187.
11. Nasca TJ, Weiss KB, Bagian JP. Improving clinical learning environments for tomorrow’s physicians. N Engl J Med. 2014;370:991–993.
12. Green LA, Jones SM, Fetter G Jr, Pugno PA. Preparing the personal physician for practice: Changing family medicine residency training to enable new model practice. Acad Med. 2007;82:1220–1227.
13. Steinert Y, Mann K, Centeno A, et al. A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME guide no. 8. Med Teach. 2006;28:497–526.
14. Clay MA 2nd, Sikon AL, Lypson ML, et al. Teaching while learning while practicing: Reframing faculty development for the patient-centered medical home. Acad Med. 2013;88:1215–1219.
15. Chou CL, Hirschmann K, Fortin AH 6th, Lichstein PR. The impact of a faculty learning community on professional and personal development: The facilitator training program of the American Academy on Communication in Healthcare. Acad Med. 2014;89:1051–1056.
16. Nutting PA, Miller WL, Crabtree BF, Jaen CR, Stewart EE, Stange KC. Initial lessons from the first national demonstration project on practice transformation to a patient-centered medical home. Ann Fam Med. 2009;7:254–260.
17. McMullen CK, Schneider J, Firemark A, Davis J, Spofford M. Cultivating engaged leadership through a learning collaborative: Lessons from primary care renewal in Oregon safety net clinics. Ann Fam Med. 2013;11(suppl 1):S34–S40.
18. Eiff MP, Waller E, Fogarty CT, et al. Faculty development needs in residency redesign for practice in patient-centered medical homes: A P4 report. Fam Med. 2012;44:387–395.
19. Carney PA, Eiff MP, Green LA, et al. Transforming primary care residency training: A collaborative faculty development initiative among family medicine, internal medicine, and pediatric residencies. Acad Med. 2015;90:1054–1060.
20. Irvine D, Leatt P, Evans MG, Baker RG. Measurement of staff empowerment within health service organizations. J Nurs Meas. 1999;7:79–96.
22. Bhapkar VP. A note on the equivalence of two test criteria for hypotheses in categorical data. J Am Stat Assoc. 1966;61:228–235.
23. Riessman CK. Narrative Methods for the Human Sciences. 2008. Los Angeles, Calif: SAGE Publications.
24. Ryan RM, Deci EL. Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. Am Psychol. 2000;55:68–78.
25. Leasure EL, Jones RR, Meade LB, et al. There is no “i” in teamwork in the patient-centered medical home: Defining teamwork competencies for academic practice. Acad Med. 2013;88:585–592.
26. Brienza RS, Zapatka S, Meyer EM. The case for interprofessional learning and collaborative practice in graduate medical education. Acad Med. 2014;89:1438–1439.
27. Kogan JR, Holmboe ES. Preparing residents for practice in new systems of care by preparing their teachers. Acad Med. 2014;89:1436–1437.
28. Ladden MD, Bodenheimer T, Fishman NW, et al. The emerging primary care workforce: Preliminary observations from the primary care team: Learning from effective ambulatory practices project. Acad Med. 2013;88:1830–1834.
29. Interprofessional Education Collaborative Expert Panel. Core Competencies for Interprofessional Collaborative Practice: Report of an Expert Panel. 2011. Washington, DC: Interprofessional Education Collaborative.
30. Crabtree BF, Nutting PA, Miller WL, Stange KC, Stewart EE, Jaén CR. Summary of the National Demonstration Project and recommendations for the patient-centered medical home. Ann Fam Med. 2010;8(suppl 1):S80–S90, S92.
31. McNellis RJ, Genevro JL, Meyers DS. Lessons learned from the study of primary care transformation. Ann Fam Med. 2013;11(suppl 1):S1–S5.
32. Francis MD, Thomas K, Langan M, et al. Clinic design, key practice metrics, and resident satisfaction in internal medicine continuity clinics: Findings of the educational innovations project ambulatory collaborative. J Grad Med Educ. 2014;6:249–255.
33. Professionals Accelerating Clinical and Educational Redesign Web site. 2016. www.pcpacer.org. Accessed January 29, 2016.
© 2016 by the Association of American Medical Colleges