PAPERS: FACULTY DEVELOPMENT: WHY BOTHER?

Longitudinal Outcomes of an Executive-model Program for Faculty Development

TEHERANI, ARIANNE; HITCHCOCK, MAURICE A.; NYQUIST, JULIE G.

Section Editor(s): Cleary, Lynn MD

Increasing numbers of faculty members in medicine seek training to enhance their academic careers. There is a need for faculty development programs that are tailored to teaching in health settings; have track records of graduating outstanding teachers, educational scholars, and leaders; offer degrees to produce credibility for those who choose teaching as a career focus; and are organized so that faculty can participate while continuing to work. Despite a 20-year history of faculty development programs, few programs meet these criteria. This study reports the outcomes of one such program.

The faculty development fellowship program assessed here was launched in the fall of 1998 at a private urban medical school and relied heavily on two new approaches. Classes were moved, over the course of two years, to 12 intensive weekend sessions modeled upon the executive-training model used by business schools. Between weekend sessions, distance learning was used. Participants received assignments online, exchanged drafts of projects, and developed group presentations online or through e-mail. This enabled the participants to complete the program while continuing their academic duties. Topics such as curriculum design, research project development, small-group teaching, and learner evaluation were included. The executive model also attracted participants nationwide who traveled to California for the intensive sessions and then returned to their ongoing faculty duties. The two-year program leads to a master's degree. The focus of year one is on teaching and learning, while the focus of year two is on educational leadership. Outcomes of the first-year teaching and learning fellowship are examined here.

Method

Participants. This study assessed the first two cohorts (n = 21 in year one; n = 15 in year two) who completed the teaching and learning fellowship (1998–99 and 1999–00). Supervisors or colleagues of 15 participants were interviewed. Analyses addressed the following questions:

  • Did knowledge of teaching in clinical and classroom settings improve?
  • Did clinical and classroom teaching skills improve?

Instruments. Each participant completed a survey before the program that included items on gender, educational background, self-reported knowledge (79 items), and relevant experience (e.g., years of teaching or types of students taught). At the end of the program, an external facilitator conducted a one-hour focus group to assess what the participants felt they gained from the program (e.g., What aspects of the program do you feel contributed to your knowledge as a teacher?).

A follow-up questionnaire was administered to the fellows two to three months after the first year. The 79 items in this questionnaire were identical to those on the pre-program survey. Additionally, structured telephone interviews were conducted with the participants three to five months after the first year to assess changes in the participants' skills. The interviewer asked participants whether they felt the program had influenced their teaching knowledge and skills. Participants were then asked to describe instances of how the program had affected their knowledge and skills. At the conclusion of the interviews, participants identified colleagues or supervisors who had observed them teach or plan for teaching both before and after participation in the program. These colleagues or supervisors were interviewed using the same format (i.e., Have you noticed any change in fellow X's teaching since October? Could you provide me with examples of how the program affected fellow X's skills and knowledge?).

Data analysis. Three analyses were completed. First, factor analysis was used to reduce the 79 items to a more concise set of variables. While it was recognized that the results of factor analysis based on correlation coefficients from small sample sizes can be unstable,1 the factor analysis was exploratory. It was undertaken as a springboard to identify factors within the large amount of questionnaire data that could be related to other, external sources of data from interviews and the focus group. The principal-axis factor method was used, and both varimax and promax rotations were considered. Due to the homogeneity of some of the factors and their potential intercorrelation, the oblique promax rotation was selected to allow for intercorrelation of factors. The cutoff used for meaningful loadings was 0.4. Alpha reliability coefficients for the 18 factors retained ranged from .69 to .98. Second, paired t tests were used to compare pre- and post-intervention ratings from the pre-program and follow-up questionnaires. Finally, follow-up and supervisor or colleague interviews and the focus group were coded for all participants, following open and axial coding procedures.2
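The reliability and pre/post comparison steps above can be sketched in Python. This is an illustrative sketch on synthetic data, not the study's analysis: the item count, score distributions, and simulated knowledge gain are invented, and Cronbach's alpha is computed directly from its definition rather than with a statistics package.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Synthetic factor: 5 correlated items rated by 36 respondents on a 1-5 scale
latent = rng.normal(3.0, 0.8, size=(36, 1))
items = np.clip(latent + rng.normal(0, 0.5, size=(36, 5)), 1, 5)
print(f"alpha = {cronbach_alpha(items):.2f}")

# Paired t test comparing pre- and post-program factor scores
pre = items.mean(axis=1)
post = pre + rng.normal(0.6, 0.4, size=36)          # simulated knowledge gain
t, p = stats.ttest_rel(post, pre)
print(f"t = {t:.2f}, p = {p:.4f}")
```

Because the five items load on a single simulated latent trait, alpha comes out high, mirroring how a coherent factor of questionnaire items yields an acceptable reliability coefficient; the paired t test then treats each respondent as his or her own control, as in the pre/post comparison described above.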

Demographics. Fourteen participants were women and 22 were men. Twenty-six were physicians, eight were allied health professionals (e.g., chiropractors or physician assistants), and two were basic scientists.

In terms of motivation or interest for participating in the program, 17 wanted to become better teachers and 17 wanted to gain improved knowledge and skills in specific areas (e.g., how to give better lectures or help learners in difficulty). Seven participants wanted to initiate education courses or departments of education at their own institutions, five wanted to gain more confidence in their abilities as teachers, and three wanted increased networking opportunities.

Results

Question 1: Did Participants' Knowledge of Teaching in Clinical and Classroom Settings Improve?

Pre- and post-program self-ratings. As displayed in Table 1, fellows' self-rated knowledge increased from pre-program to follow-up in all areas assessed.

TABLE 1:
Self-reported Knowledge Gain after a One-year Fellowship as Assessed by Comparing Fellows' Mean Scores on a Pre-survey and a Follow-up Questionnaire*

Interviews and the focus group. When asked whether they had gained knowledge, all 18 interviewed participants responded “yes.” Ten stated that the program had played a critical role in their knowledge gains as teachers. The following comments were typical:

  • “[The program] created ways for me to do everything. Some people say it [the program] enhanced their skills but it gave me structure.”
  • “I have been in teaching for many years so I did not expect to learn so much. … I did not expect such large changes but that is what I got.”
  • “Training gave me the building blocks to design and the basis to develop better … skills.”

Interviews were conducted with supervisors or colleagues of 15 of the fellows. Of the 15 interviewed, nine identified areas of knowledge gain identical to those the fellows themselves had stated. Examples of supervisors' comments follow.

  • “He has a new philosophy of learning … and he picked up this philosophy during the program. He knows more and has more confidence too.”
  • “… I can say now that he has a more complete understanding of the educational process and evaluation.”
  • “He always had a lot of educational knowledge but now he has more and is more up to date than anyone else in the hospital.”
  • “With it [the program] his horizons were broadened. He has a growing interest in research methods and is becoming more realistic and enthusiastic about becoming a full faculty member … now I see [him] thinking a lot about what he learned and also being more committed ….”
  • “One of the things that has been a good thing for her, is that she now brings an academic or educational perspective to the workplace… she constantly focuses us on the educational component.”

Learning took place most often for those who implemented, or continued to implement, what they learned during the fellowship. One participant felt he had learned a great deal about presentation skills because he was able to implement much of what he had learned at work. Another expected her knowledge of small-group and case-based teaching to increase little, because she had no opportunity to practice either. A third reported that the areas she put into practice were the ones she learned best.

The balance between theory and practice contributed to participants' learning. One participant felt the program had given him specific, concrete ideas for improving his teaching. Another noted that the program raised issues she could not have encountered through reading alone. The opportunity to work hands-on and participate actively helped another participant integrate what she learned into her behavior. Similarly, one fellow stated that he had been exposed to a large body of knowledge about teaching that was intellectually challenging yet very practical.

Interaction also encouraged learning. Participants cited opportunities such as networking with colleagues, knowing that all participants struggled together, reflecting in groups, using each other as mentors, and learning from peers about gains and stumbles as means of contributing to learning.

Program faculty contributed to fellows' knowledge gain. When asked what aspect of the program had been crucial in contributing to their knowledge as teachers, participants commented:

  • “Instructors in general.”
  • “Excellent role modeling of good teaching techniques by faculty.”

Question 2: Did Clinical and Classroom Teaching Skills of Participants Improve?

Follow-up interviews. When first asked whether they had implemented what they had learned, all 18 participants responded positively. Behavioral changes took many different forms. The program affected participant behaviors on three very different levels.

First, the fellows reported changes in their routines as a result of the program. Many cited better organization and greater consistency in their teaching; their prior inclinations, and thus their learning and teaching practices, were altered.

Second, participants' behaviors were directly influenced by what they learned. For instance, participants reported conducting needs assessments of their learners before teaching, using more formative assessment techniques, and designing new courses based on what they had learned. These patterns reflected the program's immediate and direct influence on participants' behaviors.

Finally, the participants shared what they had learned with other faculty. Four stated they had returned and taught others some of the skills they had acquired during the program. Others used the educational concepts learned to instruct peers from within and outside their own institutions.

Supervisor or colleague interviews. Similarly, supervisors and colleagues reported that in some cases there were immediate changes in professional performance. Being more sensitive in interactions with students, adopting a positive attitude toward broadening horizons, and taking a more active role in evaluation indicated changes in personal philosophy that in turn led to altered specific behaviors (e.g., better lecturing skills and providing explanations, reflecting upon learning and research, and providing better feedback).

Many of the supervisors' or colleagues' reports corroborated the participants' self-reports in greater detail. An example was the fellow who presented his fellowship project proposal to faculty members in his department. His supervisor related that at the end of his presentation, the fellow enlisted other faculty members to help with the project. Another fellow stated that he had launched an evidence-based medicine (EBM) course within his department. The colleague of this fellow began the interview by stating that the fellow had actually “Initiated and created a revolution in terms of EBM in our department.” The change was not just in designing a course for learners but also in educating other faculty members about EBM. Another fellow initiated and completed a project to affiliate the practice he worked at with a local university. This fellow also initiated a program for learners in difficulty at the practice, identified and planned a new curriculum for learners in difficulty, and involved everyone in the practice to plan better learner training. The fellow's colleague stated, “He has created enthusiasm among us for this material.”

The focus group. During the focus group, two questions assessed whether participants had implemented what they learned during the program. The first question asked, “What aspect of the program do you feel stimulated, maybe even changed you the most?” The second question was “What aspects of the program were most important in contributing to your knowledge and ability as a faculty member?”

Responses to both questions pointed to direct changes in skills.

  • “The chance to discuss with everyone and get together with a group of people who were interested in medical education… served as a catalyst for me to do some things I had wanted to do for a long time.”
  • “All the programs were new to me and stimulated me to get closer to the teaching of our residents.”
  • “Before taking the workshops my usual style for lecture was trying to memorize. Now I am trying to change my presentation style.”

Discussion

The fellows' knowledge and skills increased as a result of program participation. Their self-reports were corroborated by supervisors or colleagues. Implementation during and after the program contributed to greater learning. The balance between theory and the practical, interaction among students, and role modeling by program faculty members contributed to participants' learning. Participants changed their philosophies and routines, directly implemented what they learned, and used what they had learned to educate others.

In reassessing the faculty development literature within the context of this study, a few issues were clarified. First, prior research had indicated that participants' knowledge and satisfaction increase after such programs.3,4 However, little was known about participants' skills. In this study, participants' skills were influenced, indicating that training can affect behavior. Second, most research on outcomes of faculty development pools results from different types of programs.3,5 Design, objectives, and participants vary between programs, rendering research on faculty development programs context-dependent, and it is therefore difficult to measure one outcome consistently across a series of different programs. To better understand outcomes, programs should be examined in context: evaluating a single program established whether that program accomplished its objectives without muddling together the results of many programs, so only one program's results were examined here. Finally, triangulation of methods was used to strengthen the validity of the results. Self-report has traditionally been the method used to assess outcomes of faculty development programs; in this study, supervisors and colleagues of participants were also interviewed to determine whether the participants' knowledge and skills had been influenced by their participation.

Two limitations of this study include the participants' self-selection and the case study method. The usefulness of self-selection has been the subject of continuing debate in studies of faculty development programs. The question is whether faculty members who choose to participate in such programs are already more motivated and committed to their academic careers.6,7 It has yet to be determined whether participants and non-participants differ in relevant personal characteristics such as motivation and professional experience.

The second limitation involved an inherent limitation in case study design. This study was an evaluation of a single program and, as a consequence, caution must be exercised in any attempt to generalize the substantive results from this study to other faculty development programs. However, the evaluation methods can be applied in a variety of settings.

Several issues for future research were generated from this study. Opportunities for implementation of newly acquired skills need to be studied further. One important finding in this study was that the participants learned most when they were able to implement what they learned as they learned it. During the program, two types of implementation opportunities enhanced learning: application opportunities during the fellowship program and implementation of concepts at home institutions between program meetings. The latter form of implementation was occasionally constrained by the fellows' home institutions through complex bureaucracy and lack of release time to pursue change.

Designing faculty development programs in the health professions upon the executive model of education is new. With recent advances in distance learning, this type of program offered a format for educating faculty from afar, with periodic contact opportunities that promoted interaction between learners and faculty and gains in knowledge and skill. This study documented how educational principles can be translated into teaching practices.

References

1. Pedhazur EJ, Schmelkin LP. Measurement, Design, and Analysis: An Integrated Approach. Hillsdale, NJ: Lawrence Erlbaum, 1991.
2. Strauss A, Corbin J. Basics of Qualitative Research: Grounded Theory Procedures and Techniques. Newbury Park, CA: Sage Publications, 1990.
3. Bland CJ, Hitchcock MA, Anderson WA, Stritter FT. Faculty development fellowship programs in family medicine. J Med Educ. 1987;62:632–41.
4. Sheets KJ, Henry RC. Evaluation of a faculty development program for family physicians. Med Teach. 1988;10(1):75–83.
5. Anderson WA, Stritter FT, Mygdal WK, Arndt JE, Reid A. Outcomes of three part-time faculty development fellowship programs. Fam Med. 1997;29:204–8.
6. McGaghie WC, Bogdewic S, Reid A, Arndt JE, Stritter FT, Frey JJ. Outcomes of a faculty development fellowship in family medicine. Fam Med. 1990;22:196–200.
7. Reid A, Stritter FT, Arndt JE. Assessment of faculty development program outcomes. Fam Med. 1997;29:242–7.

Section Description

Research in Medical Education: Proceedings of the Fortieth Annual Conference. November 4–7, 2001.

© 2001 by the Association of American Medical Colleges