Academic Medicine: November 2002 - Volume 77 - Issue 11
Research Reports

Evaluation of a Faculty Development Program in Managing Care

Peters, Antoinette S. PhD; Ladden, Maryjoan D. PhD, RN; Kotch, Jamie B. SM; Fletcher, Robert H. MD, MSc

Author Information

Dr. Peters is assistant professor, Department of Ambulatory Care and Prevention, Harvard Medical School and Harvard Pilgrim Health Care, and director of curriculum development, Office of Educational Development, Harvard Medical School, Boston, Massachusetts. Dr. Ladden is assistant professor, Ms. Kotch is project coordinator, and Dr. Fletcher is professor, all in the Department of Ambulatory Care and Prevention, Harvard Medical School and Harvard Pilgrim Health Care.

Correspondence and requests for reprints should be addressed to Dr. Peters, Department of Ambulatory Care and Prevention, Harvard Medical School and Harvard Pilgrim Health Care, 133 Brookline Avenue, Boston, MA 02215; e-mail: 〈toni_peters@harvardpilgrim.org〉.

This study was supported by contract #230-99-0036 with the Bureau of Health Professions, Division of Medicine and Dentistry, Health Resources and Services Administration, U.S. Department of Health & Human Services, Jerilyn K. Glass, MD, PhD, Project Officer. The authors gratefully acknowledge the assistance of our advisory committee: the late Sarah Stone, MD (Chair), Janice Benson, MD, John Pascoe, MD, Maryjean Schenk, MD, and Michael Workings, MD.

For an article and two research reports on related topics, see pages 1069, 1112, and 1128.

Abstract

PURPOSE: To evaluate a faculty development program that teaches quality improvement and cost-effectiveness.

METHOD: From October 2000 to February 2001, a two-part faculty development program was offered to 39 physicians from 19 U.S. medical schools supported by grants from the Partnerships for Quality Education (PQE) and Undergraduate Medical Education for the 21st Century (UME-21) programs. Special features of the program included partnerships between academic and community physicians from each school, development of an educational innovation of interest to the participants, concurrent development of teaching skills and new medical knowledge, learning leadership skills (e.g., how to train colleagues to teach), and practice periods. The program focused on quality improvement and cost-effectiveness, but included other "managing care" topics. Before and after the course, participants assessed their knowledge of and competence to teach these and the other managing care topics. They also assessed their competence as medical educators and leaders. After the course, they indicated their progress in implementing their proposed educational innovations.

RESULTS: Thirty-two of the 39 physicians completed evaluations both before and after the program. Self-assessed knowledge and competence to teach quality improvement and cost-effectiveness were significantly higher at the end of the course, as were all self-assessed teaching and leadership skills. The largest change scores occurred in assessments of competency to teach the new topics and to teach in new ways. Participants who implemented their innovations rated their competencies to teach quality improvement and cost-effectiveness higher than did non-implementers.

CONCLUSION: Opportunities for faculty to learn how to teach a topic of stated importance to them, to practice what they have learned, and to work collaboratively with partners improved teaching skills.

Including new content in medical education poses a problem for faculty who have had no prior training in the new content area. The Council on Graduate Medical Education (COGME)1 has recommended new content for residency programs to prepare physicians to practice in a changing health care system. Others have suggested that this content be included in medical school curricula as well.2,3 These topics, often called "managing care competencies," include health promotion and disease prevention, population-based care, evidence-based medicine, cost-effectiveness, quality improvement, systems-based care, ethics, and patient-provider communication. Some of these topics (such as ethics and prevention) have been typical content in medical curricula for some years, but the emergence of managed care and the concomitant changes in health care require that they be taught in new ways. (The term "managing care" is used to differentiate the competencies, which have wide applicability, from "managed care" as a system.)

Two national programs—Undergraduate Medical Education for the 21st Century (UME-21)4 and Partnerships for Quality Education (PQE)5—have helped selected medical schools and residency programs integrate managing care principles into their curricula. UME-21, funded by the U.S. Health Resources and Services Administration, is a program to develop innovations in the clinical curricula of 18 medical schools. Its objective is to help medical students attain the knowledge, skills, and attitudes needed to practice in a more intensively managed and integrated health care system. PQE (supported by the Pew Charitable Trusts from 1996 to 1999 and presently supported by The Robert Wood Johnson Foundation) focuses on education for residents and advanced practice nurses on managing care and interprofessional collaboration. During the time of the faculty development program in our study, PQE supported 66 residency programs. Because faculty have had no prior formal training in the new content areas, medical schools and residency programs face a double challenge—to train both their faculty and their students and residents in these areas. To accomplish this, they need to identify change agents who can develop new curricula and train colleagues to teach new content.

Recognizing that faculty need additional training to teach these managing care principles, the U.S. Health Resources and Services Administration (HRSA) supported an 18-month initiative, Teaching the Managing Care Competencies: A Faculty Development Program for UME-21 and PQE Faculty. The purpose of this initiative was to develop, implement, and evaluate a faculty development program for training UME-21 and PQE faculty to develop curricula and innovative teaching strategies for two of the managing care competencies, quality improvement and cost-effectiveness. The program was also designed to promote skills that would allow faculty to generalize principles and strategies to teaching other managing care topics.

In this report, we describe the HRSA program as it was implemented at Harvard Medical School and Harvard Pilgrim Health Care and evaluate how participants' sense of competency as teachers of new content changed during this program. The goal of the program was to increase participants' knowledge of quality improvement and cost-effectiveness, their knowledge of program development and teaching strategies, and their skill in teaching, as assessed by their own sense of competency.

METHOD

Program and Workshops

The faculty development program in our study was conducted by the Department of Ambulatory Care and Prevention at Harvard Medical School and Harvard Pilgrim Health Care (HPHC). HPHC is a not-for-profit managed care organization and a major teaching affiliate of Harvard Medical School.

To maximize the utility of the program to UME-21 and PQE schools, we designed a four-month program (mid-October 2000 to mid-February 2001) that focused on approaches to developing and implementing curricula and enhancing teaching skills for two managing care competencies—quality improvement and cost-effectiveness. Representatives of the schools, as well as a national advisory board, identified these two competencies as the ones most frequently lacking in training programs. Program participants defined a teaching innovation of interest to each of them and their institutions (a local change in curriculum related either to cost-effectiveness or quality improvement), attended a two-day workshop, returned to their institutions to practice what they had learned in their own developing programs, attended a second two-day workshop, and then, using what they learned, returned to their home schools to teach other faculty. Drawing on adult learning theory, we believed that pre-program work would prepare participants to hear new information in a meaningful way, in relation to work of interest and importance to them.6 Evaluations of continuing medical education programs designed to change physicians' clinical behaviors have suggested that between-workshop practice enhances learning.7,8 Additionally, we believed that the train-the-trainer model would not only promote learning through doing, but also widen the dissemination of new skills.

We invited all UME-21 and PQE schools to nominate faculty members for the workshops and suggested that they identify one community-based and one academic physician who might collaborate on a curricular change that would be of interest to the school (either on a joint project or through mutual support of each other's projects). We believed that these partnerships would provide a level of momentum and responsibility that would promote faster change. Nineteen schools sent 39 faculty members to participate in our program. Twenty faculty members represented ten UME-21 schools, while 19 were from nine PQE programs (one PQE school elected to send three faculty members). Participants came from 15 states, representing all regions of the United States, and most were primary care physicians, although three were from subspecialties, and one was a PhD educator.

Workshop activities centered on quality improvement and cost-effectiveness and on how to teach them. The workshops used lectures and large- and small-group discussions and incorporated individual, team, and reflection exercises. The program also developed participants' knowledge of and skills in educational theory and practice, as well as their skills as change agents and faculty developers. More detailed information about the content and teaching methods included in the program is available at 〈www.hms.harvard.edu/ambulatory/hrsa.html〉. The workshops were designed not only to introduce principles of teaching, learning, and evaluation, but also to model a variety of approaches that the clinical teacher could build into his or her personal repertoire. We modeled a number of pedagogic methods, providing both focused remarks about the rationale for each method prior to the session and opportunities for reflection on and evaluation of the methods at the end of each day. An extensive workshop syllabus was developed to provide resources for the workshop itself and to help participants replicate these sessions at their home schools.

In the two months between the two workshops, we encouraged participants to practice what they learned in the first workshop. Since they would need to train other faculty to teach in new ways, we suggested that they use this time to teach at least five colleagues something that they had learned in the first workshop. During this period, we were available for consultation through telephone calls or e-mail; we also sent readings on cost-effectiveness and program evaluation in preparation for the second workshop.

At the end of the program, we recommended to all participants that they try to use the skills they learned as quickly as possible so that they could deepen their learning and disseminate ideas to colleagues. We suggested that they establish small, trial faculty development programs among their colleagues within the next two months. Moreover, we encouraged use of the project Web site to promote continued conversation among participants and between participants and us, to share materials from the course and from different institutions, and to provide a platform for new ideas.

Evaluation

On three occasions (before the first workshop, just prior to the second workshop, and two months after the second workshop), we asked the participants to complete evaluations rating three kinds of competencies relevant to the course: their knowledge of and competence to teach each of the nine managing care competencies, their skills in teaching in seven different ways (e.g., lectures, precepting, giving feedback), and their skills in developing and evaluating teaching innovations. All ratings were made on a ten-point scale (0 = not knowledgeable or skilled, 10 = highly knowledgeable or skilled). During the third administration of this evaluation, participants were asked to indicate their progress in implementing their proposed teaching innovations around quality improvement and cost-effectiveness prior to and at the end of the program (i.e., 1 = had defined a teaching innovation, 5 = were ready to implement, and 10 = completed implementation of the teaching innovation). Finally, we asked them to describe what they had learned, what they taught others that they had learned in the program, and what recommendations they would make to improve the program.
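For readers who wish to replicate this kind of analysis, the sketch below (in Python) computes pre-to-end change scores and tests them for significance. The report does not name its statistical test; the paired t-test here, along with the file and column names, is an assumption for illustration only, not a detail from the study.

# Illustrative sketch only: the study reports pre-, mid-, and end-program
# self-ratings and significance tests on pre-to-end change scores, but does
# not name its statistical test. A paired t-test is one plausible choice.
# The file and column names below are hypothetical.
import pandas as pd
from scipy import stats

ratings = pd.read_csv("self_ratings.csv")  # columns: id, wave, topic, rating

def change_score_test(df, topic):
    """Mean pre-to-end change and paired-test p-value for one competency."""
    wide = (df[df["topic"] == topic]
            .pivot(index="id", columns="wave", values="rating")
            .dropna(subset=["pre", "end"]))           # paired responses only
    change = wide["end"] - wide["pre"]                # pre-to-end change score
    t, p = stats.ttest_rel(wide["end"], wide["pre"])  # paired t-test
    return change.mean(), p

mean_change, p = change_score_test(ratings, "quality improvement")
print(f"mean change = {mean_change:.1f} points, p = {p:.4f}")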

RESULTS

Eighty-two percent (32 of 39) of the program participants returned evaluations both before and after the program. Sixty-two percent (24) returned the evaluation at all three points. There was no difference between respondents and non-respondents in terms of academic versus community status, region of the United States, or association with UME-21 or PQE.

At the beginning of the program, participants rated their knowledge of patient-provider communication and health promotion and disease prevention moderately high (7.7 and 7.3, respectively, on the ten-point scale; see Table 1). On average, they rated their knowledge of the targeted competencies, quality improvement and cost-effectiveness, 6.3 and 6.0, respectively, while they rated their knowledge of health care systems, population-based care, and systems-based care lower: 4.6, 5.4, and 5.5, respectively.

Table 1

At the end of the faculty development program, we observed statistically significant differences in the respondents' ratings of their knowledge of the targeted managing care competencies—quality improvement and cost-effectiveness (see Table 1). In all tables in this report, we include the mean values at all three evaluation points (pre-, mid-, and end-program); however, the change score was calculated as the change in self-assessment from the beginning to the end of the program. Ratings of knowledge of quality improvement rose 1.6 points (p < .0001) on the ten-point scale, and knowledge of cost-effectiveness rose 0.9 points (p < .02). Participants also perceived growth in related areas that were covered less directly during the program, such as health care systems, systems-based care, population-based care, and ethics of managed care. As expected, knowledge of content areas that have traditionally been taught in medical schools and residency programs, such as disease prevention and patient-provider communication, was not affected by this program.

Participants' pre-program ratings of their competency to teach the managing care topics were somewhat lower than their assessments of their knowledge of the topics (see Table 2). Again, participants felt most competent to teach disease prevention and patient-provider communication. However, by the end of the program, the participants felt that they had gained competency to lecture on and teach clinical applications of all the topics, including those traditionally taught. Change was greatest for competency to lecture on quality improvement (2.1, p < .0001) and to teach its clinical applications (2.2, p < .0001). Ratings of competency to lecture on the second targeted topic, cost-effectiveness, rose 1.5 points, and competency to teach clinical applications of cost-effectiveness rose 1.6 points.

Table 2

Similarly, participants' ratings of their skill in using different methods to teach and in developing and evaluating teaching innovations were higher after the program than before (see Table 3). Although the changes were statistically significant for all skills, we did observe smaller changes in the ratings of skills in giving a lecture (0.8), precepting in the office (0.8), and teaching at the bedside (1.0) (activities most of these faculty would already have engaged in and for which they rated their skills fairly high before the program) than in giving feedback (1.6), leading large interactive groups (1.4), modeling clinical skills (1.1), or tutoring small groups (1.1). The largest changes were in developing and evaluating teaching innovations, with three of five ratings rising more than two points on the ten-point scale.

Table 3

Of the 32 respondents to the final evaluation, 24 provided information about the extent to which they had implemented their originally proposed teaching innovations: 17 said they had implemented the innovations, and seven said they had not. Of the 24 respondents, 18 were community-academic partners (i.e., nine pairs). In six of these nine pairs (12 participants), both partners had implemented their teaching innovation; in the other three pairs, only one partner had implemented the project. Two other participants who had implemented their innovations had partners who did not respond to the final evaluation.

We examined whether the respondents who had implemented their teaching innovations perceived greater changes in their knowledge and competency than did those who had not yet implemented their innovations. We found a consistent trend toward higher self-ratings among the implementing group; however, we found no significant difference between the groups, perhaps because of low statistical power. The differences in the two groups' ratings of end-of-program skills in teaching clinical applications of cost-effectiveness and quality improvement were 1.2 and 1.0, respectively. The differences in self-ratings of competency to lecture on quality improvement and cost-effectiveness were both 0.8, and the differences in knowledge were both 0.6. Differences between the groups were very small on ratings of teaching and developing innovations, although implementers' self-ratings were higher by 0.6 point than were those of non-implementers in terms of both identifying course content and giving feedback (data not shown).
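The low-power explanation is plausible but is not quantified in the report. The sketch below illustrates the point under assumed values: a one-point group difference and a standard deviation of roughly two points on the ten-point scale, neither of which the article reports. Under those assumptions, a 17-versus-7 comparison has power of roughly 0.2.

# Hypothetical power check for a comparison of 17 implementers with 7
# non-implementers. The effect size d = 1.0 / 2.0 = 0.5 assumes a one-point
# group difference and a standard deviation of about two points; the article
# reports neither figure, so this is illustrative only.
from statsmodels.stats.power import TTestIndPower

power = TTestIndPower().power(
    effect_size=0.5,   # assumed Cohen's d
    nobs1=17,          # implementers
    ratio=7 / 17,      # non-implementers per implementer
    alpha=0.05,
)
print(f"approximate power: {power:.2f}")  # far below the conventional 0.80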

We also explored differences in implementation by program (UME-21 versus PQE), specialty (internal medicine, pediatrics, family medicine, and others), and type of faculty (community versus academic). None of the comparisons was statistically significant. More than three-fourths of the UME-21 faculty members (76.9%) and nearly two-thirds of the PQE faculty members (63.6%) had implemented their teaching innovations. Eighty percent of the community faculty members and 64.3% of the academics had implemented their projects.

Participants listed a wide variety of things they had learned and had already used or planned to use. They believed they had learned a lot about quality improvement and cost-effectiveness. They also mentioned learning strategies for developing innovations as well as for teaching in new ways. One respondent wrote, "I have learned and acquired a lot of materials to help me teach the subject…. I acquired and modeled different teaching methods that I could personally use and share with other faculty." Others mentioned the value of learning to write objectives and set up evaluation plans. One respondent wrote, "Teaching methods were more concretely defined for me as a result of the course. This helped me in writing a grant proposal I had not even before considered…. It was funded. The writing surveys/defining outcome measures aspect raised new awareness for me and has helped with my teaching methods."

After the course, participants also tried to integrate some of the methods into their own teaching. One said that he "introduced these topics in resident clinic conferences. Residents were totally in the dark prior to my discussion and were genuinely interested in learning this." He added that his main focus would be on his community preceptors. Several others noted that their faculty development programs were extremely well received by the primary care faculty. Others were continuing to plan programs or were struggling for acceptance.

DISCUSSION

Participants in our faculty development program reported increased knowledge of and skill in teaching, in general, and in teaching the targeted managing care topics, quality improvement and cost-effectiveness, specifically. According to their self-assessments, they learned the most in new areas, whether those areas were about managing care or teaching, as well as about those topics covered most intensively. Participants who implemented teaching innovations at their home schools in the four-month period rated their competencies to teach the targeted topics higher than those who did not.

While we did not directly investigate whether training faculty from the UME-21 (undergraduate) and PQE (residency) programs together was beneficial, several participants said that it was useful to include the continuum of education when considering new curricula. Training academic and community partners together did seem to move teaching innovations at participants' home schools ahead more quickly. We found that approximately one third of all partners succeeded in implementing a teaching innovation around quality improvement or cost-effectiveness during our program. Our data do not tell us whether the partners worked together or whether those innovations involving both partners were stronger than those involving only one member, but the data do suggest that partnering community and academic faculty may have facilitated educational change at their home schools.

Our program was designed based on the best evidence on faculty development7,9 and changing physicians' behaviors10 and included several distinctive features: following training with periods of practice,7 using peer experts as trainers,10 learning while working on projects of importance to the learners,6,11 building collaboration between academic and community faculty, training physicians to teach peers,12,13 and learning new medical and educational content simultaneously. However, the evidence on which we based our design was limited. Evaluations of faculty development programs seldom measure performance in practice or effects on students' learning. Instead, as we did, others have measured participants' perceptions of their own efficacy.8 Given mixed results in prior studies, we followed Greco and Eisenberg's suggestion that "no particular type of intervention is inherently effective, particularly when it is used in isolation…. [C]ombinations of methods are superior to single methods of intervention."14(p1273)

While the number of significant changes in self-assessment might suggest a degree of social desirability in the responses, the pattern of the responses suggests they are credible. First, changes in self-assessed competencies to teach quality improvement were greater than those for cost-effectiveness. Since quality improvement was presented first, and then also embedded in the second workshop on cost-effectiveness, the participants were exposed to this content area over a longer period of time. Moreover, they had more time to practice teaching quality improvement after learning about it. Second, variations in initial ratings of knowledge of and competencies to teach all the managing care competencies suggest that faculty differentiated between those topics that were new to them and those they had previously taught. As we expected, ratings of knowledge of disease prevention and patient-provider communication were high prior to the program and remained approximately the same at the end of the program. Participants rated their knowledge and skill in teaching new content areas, such as health care systems and systems-based care, relatively low prior to the program and substantially higher at the end. This difference could result both from the degree to which the material was covered in the program and from the mutability of participants' perceptions of their skills.

While systems-based care and health care systems were not specifically targeted content in this course, these competencies were embedded in lectures and exercises on quality improvement and cost-effectiveness, whereas disease prevention was not covered and patient-provider communication was the topic of only a few exercises. Moreover, the participants, most of whom were primary care physicians, would have had a strong sense of their abilities to provide (and teach) preventive care and sensitive communication with patients. We cannot know whether increased knowledge could also be attributed to the home environment during the same period, but would assume that learning during the course would deepen with increased exposure to the ideas through discussion with colleagues, further development of one's innovation, and the like.

The participants in our program were selected from UME-21 and PQE schools, which had been funded to implement managing care programs. Other faculty members may not be as interested in teaching these topics. Nonetheless, we believe that the model of this program is generalizable to other settings. First, the faculty members in our program represented 19 schools from across the United States and were both academic and community physicians. Moreover, their assessments of their knowledge and skills in teaching and managing care were modest prior to the program. Second, our program is a model specifically designed to enhance learning for participants with a “need to know.” That is, when participants are selected because they want to develop a teaching innovation, particularly around new content, they are primed, eager, and ready to learn.

We believe that the design of our faculty development program provided an effective means to update faculty on new content areas important to current medical school and residency program curricula, as well as to enhance their teaching skills. The range of new topics that policymakers suggest be integrated into medical education—from genetics to medical economics to cultural diversity—suggests that faculty development programs such as ours might support curricular change and ease the burden of faculty assigned to teach these topics. Teaching physicians to teach new content in which they have a vested interest seems to generate interest and build skills quickly.

References

1. Council on Graduate Medical Education. Preparing Learners for Practice in a Managed Care Environment. Rockville, MD: U.S. Department of Health and Human Services, 1997.

2. Lurie N. Preparing physicians for practice in managed care environments. Acad Med. 1996;71:1044–9.

3. Meyer GS, Potter A, Gary N. A national survey to define a new core curriculum to prepare physicians for managed care practice. Acad Med. 1997;72:669–76.

4. Rabinowitz HK, Babbott D, Bastacky S, et al. Innovative approaches to educating medical students for practice in a changing care environment: the National UME-21 Project. Acad Med. 2001;76:587–97.

5. More information about Partnerships for Quality Education is available at 〈www.pqe.org〉.

6. Knowles MS, Holton ES, Swanson RA. The Adult Learner. 5th ed. Houston, TX: Gulf Publishing, 1998.

7. Wilkerson L, Irby DM. Strategies for improving teaching practices: a comprehensive approach to faculty development. Acad Med. 1998;73:387–96.

8. Davis D, O'Brien M, Freemantle N, Wolf FM, Mazmanian P, Taylor-Vaisey A. Impact of formal continuing medical education: do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? JAMA. 1999;282:867–74.

9. Skeff KM, Berman J, Stratos G. A review of clinical teaching improvement methods and a theoretical framework for their evaluation. In: Edwards JC, Marier RI (eds). Clinical Teaching for Medical Residents: Roles, Techniques, and Programs. New York: Springer-Verlag, 1988:92–120.

10. Davis DA, Thomson MA, Oxman AD, Haynes RB. Changing physician performance: a systematic review of the effect of continuing medical education strategies. JAMA. 1995;274:700–5.

11. Armstrong EG, director, Harvard-Macy Institute, Boston, MA. Personal communication, November 13, 2001.

12. Skeff KM, Stratos GA, Berman J, Bergen MR. Improving clinical teaching. Evaluation of a national dissemination program. Arch Intern Med. 1992;152:1156–61.

13. Albright CL, Farquhar JW, Fortmann SP, et al. Impact of a clinical preventive medicine curriculum for primary care faculty: results of a dissemination model. Prev Med. 1992;21:419–35.

14. Greco PJ, Eisenberg JM. Changing physicians' practices. N Engl J Med. 1993;329:1271–4.

© 2002 Association of American Medical Colleges
