In this article, we outline the components and discuss the findings of a distance-based consulting (DBC) program designed to assist faculty development projects through all stages—needs assessment, project design and implementation, and, in particular, program evaluation and the dissemination of results. Based on the DBC experience, we describe institutional characteristics that facilitated the project sites' success, and assess the usefulness of our distance-based assistance efforts. Finally, we offer recommendations for future distance-based consulting programs of this type.
In academic departments of primary care, faculty development initiatives have been changing over time to accommodate the changing demographics and commitments of faculty. As in other areas of the academy, many primary care faculty are now experienced, mid-career, or senior members of their institutions, and thus have different developmental needs than those of newer faculty—who typically have been the focus of most past faculty development efforts.1 In addition to demographic shifts, faculty roles are being substantially redefined. For example, many institutions are asking their faculty to commit more time and energy to conducting research, making research training for primary care faculty highly desirable. Also, many institutions are locating greater portions of their curricula within the community surrounding the medical school, which necessitates the training of the volunteer faculty who assume teaching roles in community clinics and hospitals.
Given this changing landscape, primary care departments are continually challenged to provide their faculty with many new, tailored opportunities for professional development. Unfortunately, little published research and few generalizable evaluations of faculty development programs are available to help faculty development planners choose the most appropriate and effective training strategies.1 Even when internal evaluations of faculty development programs are conducted, their results are often not generalizable beyond the specific program at hand. Further, many evaluations have lacked the rigor necessary to measure outcomes,2 and few institutions have taken the important step of publishing their results.1 What is needed is rigorous research or generalizable evaluations aimed at discerning which components of and strategies for faculty development are most effective in different settings and circumstances. Additionally, if the wider academic community is to benefit from these investigative efforts, these findings must be disseminated through publication.
These pressing needs were recognized by the planners of the December 1998 national faculty development conference, “Models That Work: The Nuts and Bolts of Faculty Development” (MTW). The MTW conference itself—sponsored by the Health Resources and Services Administration (HRSA), along with the Ambulatory Pediatric Association (APA)—was designed to address two of the five basic stages of a faculty development project: project design (i.e., how to select strategies for meeting faculty development needs) and project implementation (i.e., how to organize faculty development efforts within an institution). MTW planners decided to supplement the conference experience with additional support for attendees, such that the remaining three stages of faculty development—needs assessment, project evaluation, and publication of findings—would be achieved. Preconference support focused on needs assessment, delivered via a video teleconference on the topic. In addition, a one-year, post-conference support plan was established, with an emphasis on evaluation and publishing. This support was delivered through distance-based consulting with a team of three consultants (including two of us, CB and WV) chosen for their expertise in needs assessment, measurement and evaluation, and writing and publishing. Last, to further ensure that at least some conference attendees would successfully move through all the stages of a faculty development initiative, 17 institutions were selected as sites to receive individualized focused support before, during, and after the conference.
By October 2000, 22 months after the MTW conference, most of the 17 projects had successfully completed needs assessments and had chosen appropriate faculty development strategies for their desired outcomes. Many sites had also conducted internal evaluations of their projects, or were in the process of conducting evaluations. With respect to disseminating their results, however, only three sites had given conference presentations on their projects or submitted manuscripts for publication.
It is encouraging that primary care faculty development efforts have been furthered through the MTW conference and its accompanying components (the videoconference and follow-up distance-based consulting). Even though many sites have not yet reached the stage of readiness to publish their outcomes, much has been learned through the DBC program about (1) the realities of conducting faculty development projects in primary care departments and (2) how those realities should guide the conduct of future distance-based consulting efforts of this type. We hope that others will benefit from our hindsight and recommendations for future distance-based consulting programs, and that our experience will contribute to their success.
THE DBC PROGRAM
Pre-conference Events and Support
The DBC program had two pre-conference components: (1) the selection of DBC project sites by the MTW conference steering committee, and (2) the production and satellite broadcast of a program on conducting needs assessments.
Project site selection. As part of a pre-conference mailing, prospective MTW conference participants were invited to apply to be a “project team” in the DBC program. These sites would receive special guidance from the DBC consulting team—before, during, and especially after the conference—in the areas of needs assessment, program evaluation, and publishing. Prior to the MTW conference, 17 project teams were selected by the MTW steering committee from among 34 applicants, based on the following criteria:
- clarity of the rationale for the site's proposed faculty development program and the focused identification of a problem;
- quality of the site's preliminary needs assessment plan;
- quality of the site's proposed faculty development strategies or plan to address the identified problem;
- site contacts' enthusiasm for attending the MTW conference and being a participating site, and/or their identification of specific ways that being a site would help their proposed project;
- capacity of each applying institution to implement needs assessment, faculty development, and/or evaluation designs; and
- level of commitment of the institution to the proposed faculty development program.
The applicants were rated for each criterion on a four-point scale (1 = little or no information, 2 = poor, 3 = moderate, 4 = exceptional).
Originally, only six sites were to be selected as project teams; however, given the overwhelmingly positive response of institutions and the impressive quality of many applications, the selection committee suggested that additional funding be secured to expand the pilot study. The funding was obtained from HRSA, and, following the recommendations of the committee, the number of pilot teams was increased to 17 sites: one in Canada, one in Puerto Rico, and the rest in 14 different states across the United States. (Technical assistance for any site outside the United States could not be supported by the federal funds. Thus, assistance for the site in Canada, whose ratings from the selection committee were commendable, was provided as part of the outreach efforts of faculty at the University of Minnesota Medical School.)
Expansion of the project group allowed for the inclusion of a more diverse range of academic institutions (e.g., community-oriented, consortium, research-oriented) as well as a broader representation of geographic locales. Care was also taken to balance the representation of all three primary care disciplines (family medicine, general internal medicine, and general pediatrics). Table 1, which describes the objectives for each project, indicates that three sites planned faculty development projects in family medicine, two planned projects in pediatrics, and another two planned projects in internal medicine. The remaining ten projects involved combinations of these three disciplines.
Video teleconference. In October 1998, two months prior to the MTW conference, a video teleconference on needs assessment was designed and broadcast by the DBC program consultants. Conducting a needs assessment is an important first step for any faculty development project, so that strategies are selected to address the true needs of the faculty. It was recommended to sites that they conduct a needs assessment prior to the conference in order to focus their conference time on the types of strategies most relevant to their faculty members' needs. The satellite teleconference was available to any institution attending the MTW conference. Those who alerted the consultants ahead of time that they planned to participate in the live broadcast were provided with pre-teleconference reference materials on needs assessment. The teleconference itself featured national program evaluation experts, including the DBC program consultants, who presented the “nuts and bolts” of conducting needs assessments and responded to questions called or faxed in by participants. Following the teleconference, the DBC program consultants further assisted project leaders in designing their needs assessments by reviewing assessment plans and providing feedback. Copies of the broadcast were sent to all participating projects and to other institutions upon request.3
Conference Events and Support
The MTW conference, held December 2–4, 1998, in Lake Buena Vista, Florida, was open to any interested participants, but also included special offerings for the project teams. The conference evolved from two earlier meetings sponsored by the Division of Medicine, Bureau of Health Professions, HRSA, which addressed the education of the generalist physician and emphasized the need for more faculty development in medical schools. The intent of the MTW conference was to help participants choose the models of faculty development most appropriate for their settings and gain skills for implementing faculty development programs at their institutions. The conference agenda included numerous workshops on how to address likely faculty development needs in areas such as clinical teaching and building computer skills; presentations of successful local, regional, and national faculty development programs; and informal time for meeting with national faculty development experts. Over 200 participants from around the nation took part.
With respect to the DBC program, the conference supplied project sites with three days of “in person” interaction with each other, with the DBC consultants, and with other faculty development experts. This served as a prelude to what was to be a distance-based experience from that point on. The conference events specifically targeted to the participating projects were as follows:
- Two presentations, “Effective Approaches to Faculty Development” and “Evaluating Faculty Development Programs and Publishing the Results,” were presented by the DBC consultants to set the stage for the project sites' future work.
- At two additional special events (a reception and a luncheon), pilot site participants had a chance to network with each other and with the DBC consultants, and to talk informally with the conference presenters about their faculty development plans, ideas, and issues.
- Conference presenters were available for individual consultations with project site participants on the last day of the conference.
- To encourage networking and sharing of ideas among the sites and continuing communication between sites and the DBC consultants, each site representative received a packet listing project site locations and contact information, the descriptions of each planned faculty development project, the kinds of assistance available from the DBC consultants (see the next section, “Post-conference Technical Assistance”), and other resources for faculty development.
Post-conference Technical Assistance
Distance-based consulting. The strategy of distance-based consulting was selected to bridge the geographically dispersed project sites and national consultants. Using a variety of long-distance telecommunication means (Web site, listserv, e-mail, phone, fax, and U.S. mail), the DBC consultants were available to offer assistance to project teams in seven key areas of the faculty development process (see List 1). Since the videoconference focused on needs assessment and the MTW conference focused on selecting and implementing the appropriate faculty development approach for one's needs, it was expected that the post-conference technical support would be dedicated primarily to measurement, evaluation, and publishing (areas 5–7 in List 1).
MTW Web site and listserv. In addition to one-on-one assistance by the consultants, a Web site and a listserv were developed to provide ongoing information and support to engage the participants in online communication with one another, and to post sites' progress with their faculty development projects. The Web site was periodically updated to highlight information about funding for faculty development programs, national faculty development opportunities, the new structure for HRSA training grants, the types of faculty development programs that can be funded under federal grant legislation (Title VII), new approaches to faculty development (e.g., Web sites highlighting the use of distance-learning and Web-based instruction strategies), and resources for evaluating scholarship.
EVALUATION OF THE DBC PROGRAM

In June 1999, six months after the MTW conference, progress reports were collected from all projects. At the end of nine months (September 1999), an evaluative feedback survey was conducted to assess sites' progress in developing their individual faculty development projects, as well as the overall usefulness of the distance-based consulting. Also in September, the observations of the program consultants were compiled and reviewed. In this section, we describe the methods and findings of this three-part evaluation: the progress reports from the project sites, the follow-up feedback survey, and the program consultants' observations. As described below, the information gleaned from this evaluation served to redirect the remainder of the DBC program.
Part 1: Project Sites' Progress
At the mid-point of the technical assistance follow-up to the MTW conference in June 1999, a progress report form was sent to project coordinators asking them to describe their projects' progress. Specifically, they were asked to identify
- any change(s) in project goals that had occurred since the inception of the project,
- the status of the project in terms of specified stages of development and implementation,
- institutional factors supporting project success,
- barriers or constraints on their projects that sites encountered, and
- assistance desired at this point, either from the distance-based consultants or from elsewhere.
To provide detailed information for the second and third items above, 12 progress markers and seven institutional support indicators were defined (see Tables 2 and 3). Site coordinators were asked to indicate which progress markers had been completed, and which supportive characteristics were present at their institutions. These results are presented next.
Completion of progress markers for faculty development projects. The six-month progress reports revealed that 12 of the 17 sites were actively working on their faculty development initiatives but not yet implementing their plans or strategies; that is, they were still engaged in preparatory work (see Table 2). Four sites were still in the design phase, and one (site 5) did not provide any detail about project status. Only three sites reported being at the point of actually implementing defined faculty development strategies. Three sites reported not even having project teams in place to plan and implement their faculty development projects.
Given these results, it was apparent that the DBC program's planned timeline was unrealistic for many sites. Recall that the objective of the overall DBC program was to move sites from needs assessment to evaluation and preparing manuscripts for publication in approximately 14 months (two months pre-conference and 12 months post-conference). With only six months remaining, only three sites were at the point of implementing their faculty development projects. Further, several sites reported facing significant barriers and constraints, raising concerns about their abilities to complete their projects, conduct evaluations, and publish results.
A review of the DBC consultants' logs six months after the conference indicated that project site personnel were not calling on the program consultants as frequently as had been expected, no doubt because they had not yet reached the stages of evaluation and manuscript writing, which were to be the focus of the consulting services. The types of assistance that the sites did request varied significantly according to their levels of sophistication. Queries ranged from very basic and open-ended, such as “How do I begin a faculty development initiative?” to more specialized concerns about how to overcome specific institutional barriers, funding constraints, or other political or organizational difficulties. The majority of questions pertained to the preparatory areas of faculty development (needs assessment, project planning, organization of tasks, selection of faculty development strategies, reference searches on various faculty development topics, identification of possible funding sources, and the location of key experts in specific faculty development areas), rather than the areas of evaluation design, feedback instruments, data analysis, or planning a manuscript.
Presence of institutional supports or barriers for faculty development projects. A review of sites' responses regarding institutional supports and barriers was conducted in the following manner. Sites were categorized into two groups: those that seemed to be progressing well with their projects, and those that seemed to be progressing slowly or were stalled altogether (delineated in Table 3 as “making progress” and “experiencing difficulty,” respectively). In total, ten sites were categorized as “making progress” and seven sites were identified as “experiencing difficulty.” The latter included the five sites not actively working on faculty development initiatives (as indicated in their progress reports), as well as an additional two sites that had reported major impediments to their progress (via the progress report, phone conversations, and/or e-mail communication with the program consultants). Following this categorization, the progress report responses of the two groups were compared to detect any major differences between the groups in the levels or types of institutional support present.
The large majority of sites—nine of the ten (90%) that were “making progress” and five of the seven (71%) that were “experiencing difficulty”—reported the presence of support from administrative/organizational leadership; thus, this characteristic did not seem especially salient to project progress. In other areas, however, clear differences between the two groups were evident. A considerably higher percentage of the project sites “making progress” indicated having the following support indicators: a project team in place to plan and implement the project; at least one team member with prior experience/training in faculty development; at least one team member with a commitment of 10% or more funded time for the project; and designated funding for the faculty development effort. Further, sites “experiencing difficulty” were more likely to report having experienced a change in the project team since the start of the project and having faced barriers to project implementation. Interestingly, however, even among the sites “making progress,” five of the ten reported experiencing barriers to project implementation. It is likely that the existing supportive institutional structures at these sites helped them to overcome any significant barriers.
Part 2: Feedback Survey
With just six months remaining in the program and only a few sites actually implementing their faculty development projects, it was clear that a redirection of the efforts of the program consultants—and quite possibly of the funds remaining on the contract—might be warranted. Therefore, the program consultants decided to survey the site contacts to solicit feedback about the usefulness of the DBC program components thus far, as well as their preferences regarding assistance for the remainder of the program.
The feedback survey also included a detailed section on institutional support to elicit sites' ratings of the importance of various institutional support characteristics that could be having an impact on their projects' success. These data were expected to complement the information previously collected, which revealed whether or not proposed supports or barriers were present at the sites. The list of supports was expanded to reflect insights into the sites' environments that the program consultants had gleaned through communication with the sites. (In the survey instrument, this expanded list was renamed "potential predictors of faculty development project success," to reflect the possible connection between the presence of certain institutional environment factors and project success.)
The 68-item survey was mailed to the project sites in September 1999. Thirteen of the 17 sites responded; nine of these were among the ten in the "making progress" subset, and four were among the seven previously categorized as "experiencing difficulty." These numbers were themselves telling, as a higher percentage (90%) of the ten "making progress" sites responded to the survey. Even after numerous reminders from the project consultants, only four of the seven "experiencing difficulty" sites (57%) responded. We surmise that the low response from the struggling sites was due in part to the environmental stresses they had reported in their progress reports (e.g., hospital mergers, partnering with managed care organizations, pressures to produce more income via direct patient care).
Next, we present the results from the project-site feedback survey for each of the following survey categories:
- Significance of potential predictors of project success
- Usefulness of the distance-based assistance provided by the consultant team
- Additional information desired from the consultant team (content areas to be covered)
- Sites' preferences regarding types of consulting assistance to be provided during the remainder of the program
- Sites' recommendations regarding future approaches to distance-based consulting
Significance of potential predictors of project success. Table 4 summarizes the sites' responses to survey queries about the presence and significance of potential predictors of faculty development success in their environments.
A high proportion of the sites “making progress” (seven of the nine, or 78%) reported the presence of the following set of potential predictors of project success:
- Stable project team
- Stable project site contact
- Team members with prior faculty development experience
- Support from administrative/organizational leaders
- An environment willing to support faculty development initiatives
Interestingly, a similarly high proportion of sites “experiencing difficulty” (three of the four sites, or 75%) cited the presence of four of these same five predictors. The exception was “support from administrative/organizational leaders,” which was present for only two of the four “experiencing difficulty” sites.
High percentages of both groups—seven of the nine "making progress" sites (78%) and three of the four "experiencing difficulty" sites (75%)—also reported that barriers to project implementation were present at their sites. These barriers included lack of funding, lack of institutional support and enthusiasm from collaborating departments, lack of mechanisms for interdepartmental collaboration, lack of experience/expertise in faculty development, lack of adequate time and money for developing curriculum, lack of protected time, lack of administrative support, and faculty turnover. Notably, for sites "making progress," the number of project teams reporting the presence of barriers had increased by two in the three months since the June progress report.
It is in examining the remaining three potential predictors—all of which had to do with funding—that we see the greatest differences between the site groups. Five of the nine "making progress" sites (56%) reported (1) having designated funding to support the faculty development initiative, (2) having specific funds committed to the project for at least one team member, and (3) being in an environment capable of supporting faculty development initiatives (e.g., no major budget shortfall, few faculty transitions, a strong mission, no threat of mergers). In contrast, of the four responding sites whose projects were "experiencing difficulty," only one (25%) reported an environment capable of supporting faculty development or the existence of money to support at least one team member. Moreover, none of the sites "experiencing difficulty" had designated funding to support its faculty development initiative.
In addition to stating whether a potential predictor was present or absent at their sites, site contacts were asked to rate each potential predictor as to “the extent to which the [potential predictor], or lack of the [potential predictor], impacted the success” of their faculty development projects. For these ratings, a three-point scale was used (1 = not at all, 2 = somewhat, 3 = most definitely). In examining the mean scores for each potential predictor, we found that for most predictors, the mean response hovered between 2.50 and 2.67. The highest mean score (2.92) went to having funds committed to the project for at least one team member. The lowest mean score (2.20) went to having a stable site contact.
From these mean scores, we can see that, on average, the site teams viewed all of the predictors as contributing at least "somewhat" to the success of their projects; indeed, many ratings tended toward the upper ("most definitely") end of the scale. That the predictor receiving the highest mean score was related to funding underscores the importance of funding issues for the projects. In retrospect, using a wider range of possible scores (a five- or seven-point scale, rather than a three-point one) might have revealed greater differentiation in the relative value of each potential predictor.
Usefulness of the distance-based assistance provided by the consultant team. The next segment of the survey assessed the usefulness of each of the DBC program's assistance strategies thus far. Sites rated each strategy on the extent to which it had been useful for their projects (see the top panel of Table 5 for mean ratings), using a three-point scale: 1 = not at all useful, 2 = somewhat useful, 3 = most definitely useful.
Most notable in this segment of the survey was the sites' preference for one-on-one consulting geared to their specific needs. As noted earlier, sites were at varying levels of experience and sophistication in conducting faculty development efforts, and their questions reflected those differences. Also, their queries were unique to their settings, circumstances, and objectives. It is not surprising, then, that the highest-rated item in this segment of the survey (mean = 2.67) was "specific assistance from the DBC program consultants in response to requests from the site." Assistance that was broader in scope, intended to serve the entire group, received much lower ratings. For example, "general assistance/information" (i.e., group e-mail messages and faxes) was rated as only somewhat useful to the project sites (mean = 1.91).
Further evidence that the sites preferred individualized attention over group information is seen in their relatively low ratings of two group-oriented assistance strategies: the MTW Web site and the listserv. Although both were created at the request of the project teams, each was rated less than “somewhat” useful for sites' faculty development projects (mean = 1.80 and mean = 1.67 for the Web site and the listserv, respectively). Likewise, the teleconference on needs assessment was rated less than “somewhat” useful (mean = 1.73), and the videotape of the teleconference was found to be least useful (mean = 1.18).
Additional information desired from the consultant team (content areas to be covered). In addition to their ratings of the assistance received so far, the project sites were asked what additional information they desired from the consultant team. The highest-rated content area (2.83 on a three-point scale, where 1 = not at all useful, 2 = somewhat useful, 3 = most definitely useful) was information about “financing faculty development, developing infrastructure, administrative support, and resources (especially money for initial and ongoing development).” This was followed by information about establishing environments supportive of faculty development (e.g., “how to encourage cultural change, get faculty interested and motivated, get departmental support, find time for faculty development, add faculty development as one component of career development, etc.”); mean = 2.55.
All but one of the ten possible topic areas listed received at least a “2” rating from the sites, indicating that the sites were at least “somewhat” interested in receiving information about those topics. The only choice that received less than a “2” was “needs assessment,” presumably because by this time in the project, all of the sites responding to the survey had completed this phase of project development. These data again reflect the wide range of information the sites desired in various topic areas, and point to the challenge that the project consultants faced in meeting the sites' differing needs, especially since the distance-based consulting that occurred after the MTW conference was intended to focus on just two of these topic areas: evaluation and publishing.
Project sites' preferences regarding types of consulting assistance to be provided during the remainder of the program. Projects were also asked to comment on the kinds of assistance they would like to receive for the remainder of the program. Site contacts were asked to distribute a total of 100 points among seven options according to "how useful [they] thought each type of assistance would be to their site within the next 3–4 months" (more points were to be assigned to the options considered most useful). The bottom panel of Table 5 lists the assistance options with the average points awarded to each.
The sites' responses to this section of the survey indicated that their strongest preference was to be given discretionary funds to support their faculty development projects (mean = 36.82). The second-highest rating went to the option of receiving funds to hire a consultant of the site management's choice (mean = 28.50). Thus, funding again emerged as a central concern.
Project sites' recommendations regarding future approaches to distance-based consulting. In the final section of the survey, the sites were asked to reflect on their experiences in this distance-based consulting program and to forecast what they thought would be most beneficial for participants in future programs of this type. Specifically, they were asked to rate items according to the extent to which each was considered useful and likely to increase the effectiveness of future distance-based programs (see Table 6). The ratings ranged from 2.17 to 2.67, based on the same three-point usefulness scale described earlier. The highest rating (mean = 2.67) went to having a time frame of more than 15 months for sites to move from needs assessment to implementation to evaluation. This was closely followed by having set expectations for being a project site (mean = 2.64), having dedicated time (perhaps two to three hours) with the program consultants at the faculty development conference (mean = 2.58), receiving funding as a result of being selected as a project site (mean = 2.55), and having access to a wider pool of consultants with skills or expertise in a range of faculty development areas and stages of project development (mean = 2.54).
These responses underscore five key themes emerging from the DBC program, themes either already alluded to or discussed further in the remaining sections of this article: (1) the sites required more time than anticipated; (2) the program design did not include requirements of accountability on the part of the participating sites; (3) sites particularly appreciated consulting aimed at their individualized needs; (4) adequate funding was a key element of the sites' faculty development project success; and (5) the sites' consulting needs were broad-ranging.
Part 3: Consultants' Observations
In addition to feedback from the project sites, the DBC program consultants' observations provided another perspective on both the sites' success in their faculty development endeavors and the success of this distance-based consulting approach. The consultants reported that the distance-based approach felt unsatisfying to them: contact with the sites was sporadic, and the lack of face-to-face contact and the reliance on e-mail allowed relationships to “weaken” and feel more impersonal over time. Many of the sites were not ready for, or did not want, coaching on evaluation and publication; their requests for assistance had more to do with political or organizational concerns or institutional barriers than with evaluation plans and publication.
In addition, the needs of the sites tended to be individualized, requiring a broader set of coaching skills and resources than were originally planned and available in our resource/consultant pool. The program consultants could provide sites with names of experts or appropriate resources on such topics as problem-based learning or Web-based instruction, but the DBC consultants were not themselves “jacks of all trades,” and thus had difficulty fielding the wide variety of questions put forth. This led to a “disconnect” between what could be offered, what the sites needed, and in what time frame solutions could be achieved.
In retrospect, it appears either that a larger pool of consultants was needed to handle the wide range of needs, or that a smaller, more “select” group of sites—with similarly focused needs, support systems, and motivations—would have been more manageable for the three-person DBC consultant team.
Other “disconnects” between the planned DBC program and what occurred had to do with timing. As the program progressed, it became clear that the speed at which sites could move from needs assessment to implementation and then to evaluation had been greatly overestimated. Most sites needed more time than had been allowed to plan and implement their proposed projects, let alone evaluate them and prepare manuscripts for publication. Institutional support for faculty development projects, evaluation, and the publishing of results was uneven across sites; in many cases, the individuals responsible for faculty development and their resources were overtaxed with other responsibilities. Therefore, the sites progressed at differing rates. Compounding this timing “disconnect” were unexpected institutional barriers that arose at various sites as the DBC program progressed, overwhelming even those locations where favorable characteristics had initially been present.
Another factor impeding the sites' success had to do with the relative lack of experience of site personnel in conducting faculty development projects. This factor reduced many sites' likelihood of success, and especially reduced the likelihood of publishing, as staff did not have an already-established line of inquiry and body of knowledge on which to build.
There also seemed to be a lack of commitment from the project sites to the DBC program. Although the sites had to apply and were selected, there was no direct funding to sites connected with their acceptance into the program, nor were there prescribed expectations related to the sites' participation. Sites' active participation had been assumed by the DBC program designers, and mechanisms to ensure sites' accountability had not been planned. Thus, there were no guarantees that participating sites would ask for or utilize the resources offered, or that they would respond to requests from the consultants once the program was under way. For example, no site ever posted a question or reported on its progress on the listserv, and timely progress reports from the sites were not forthcoming. As time went on, and some sites' institutions or departments came under great duress, even the distance-based consultants' offer to fund an on-site consultant visit at each site went unanswered by some locations. (The on-site consultant visits are discussed in the next section of this article.)
Given the numerous difficulties surfacing at some sites, which impeded their progress, and the much longer timeline that even the most successful sites were requiring, it appeared that the intended outcome of completed faculty development projects, with evaluation of these projects and published results, was not likely to be forthcoming from most sites.
Redirection of the Project
After careful review of the program results at this point, the HRSA project director and DBC consultants decided to extend the project for an additional year. The remaining funds were allocated for the hiring of an additional consultant for each project site. Each site could choose its own consultant, matching that consultant's expertise to the site's most critical area of faculty development. Travel money and honoraria were provided so that consultants could visit locations in person for consulting and/or to conduct training. The hiring of consultants had been the second-highest-rated choice among the sites. This method of distributing funds was preferred by the funder (HRSA) over the sites' first choice, which was to be granted full discretionary use of funds for any aspect of their projects. The specified use of funds allowed the funder and consultants greater financial oversight. (The hiring of additional consultants required the sites to submit proposals for approval, and required the invited consultants to sign contractual agreements.) Sites were allowed great latitude, however: they had the discretion to hire whatever consultants they thought best, either to work on the faculty development projects that were designated parts of the DBC program or to address other pressing areas of need.
A letter offering the sites the opportunity to hire a consultant was sent December 18, 1999. (One site, located in Canada, could not be included in this opportunity, as HRSA regulations did not allow funding support to extend outside the United States.) Only a few sites had responded to this offer by the application deadline of March 15, 2000, signaling the institutional instabilities and other difficulties several sites were experiencing in continuing their faculty development functions. This also indicated that even sites that had flourishing faculty development efforts required a great deal of time to take advantage of the assistance offered through the DBC program. Most sites that did respond by March 15 requested extensions for completing their arrangements with consultants and submitting proposal letters. These extensions were granted.
A few sites had dropped out of contact with the DBC program consultants, and several e-mail messages and phone calls were required to reestablish contact with site representatives. These sites generally declined the opportunity to hire a consultant, citing lack of time and personnel to make the necessary arrangements, institutional structural and financial problems, changes in staff, personal health problems among the staff, administrative difficulties, and institutional shifts in priorities away from faculty development. One site that declined funds had a happier story to tell: site representatives had secured their own funding to hire a consultant from other sources.
In total, nine of the 16 U.S. project sites used HRSA funds to arrange for on-site consultant visits; of these, three sites were so ambitious that they arranged for two consultants. With these sites, communication between DBC consultants and sites increased significantly, mostly for the purposes of solidifying contracts and arranging for the payment of the invited consultants. Consultant visits began as early as the end of May 2000 and continued into October 2000.
Final Assessment of Project Sites' Progress
As the consultants' visits concluded, a final progress report was solicited from all 17 sites. Reports from 15 sites were returned (an 88% response rate). Findings from these 15 reports echo findings from the earlier progress report, evaluation, and consultants' observations. All but two sites indicated changes in their faculty development plans and goals since the inception of the DBC program. Thirteen also indicated changes in their project timelines. Only three sites reported completing evaluations of their projects. At the time that the final reports were made, three sites had submitted manuscripts for publication; a fourth site was preparing a manuscript for submission; and one site had presented its project at conferences.
When asked to list factors contributing to the success of their faculty development projects, the sites' responses reflected priorities similar to those revealed in earlier evaluations. The two most frequently listed factors were availability of funding and supportive leadership. The usefulness of interactions with the DBC program consultants and other invited consultants was frequently mentioned. Nearly every site that had an on-site consultant funded by reallocated DBC funds lauded the on-site visits as exceptionally valuable. The two most frequently listed factors detracting from project success were lack of financial support and lack of time to devote to the project.
SUMMARY AND RECOMMENDATIONS
We identified three main factors that had the most impact on the success of the participating faculty development projects:
- Funds committed and designated for faculty development
- Funded, protected time for at least one person to implement the faculty development initiative
- An environment capable of supporting faculty development initiatives (e.g., no major budget shortfall, few faculty transitions, a strong mission, no threat of mergers, etc.)
Unfortunately, none of these factors can be influenced by distance-based consulting, or, for that matter, by any form of consulting. We elaborate briefly on these three factors below.
Committed funds. The emphasis on funding needs showed up again and again in the sites' feedback responses. For example, when sites were asked what kinds of additional information they desired, the highest-rated option was information about financing faculty development initiatives. When sites were asked in the ninth month of the DBC program how the remaining program funds should be used, the highest-rated option was for sites to receive money that they could use as needed to support their faculty development projects. When asked for their recommendations on how to conduct future distance-based projects, the sites ranked highly the option of receiving financial support in exchange for their participation in the pilot study. Last, in the sites' final progress reports, the issue of funding surfaced frequently in response to an open-ended survey question about factors detracting from or contributing to project success. Clearly, funding was a detractor from success when lacking, and an important contributor to success when present.
Ample, dedicated time. Having enough time to plan and implement their projects was another obvious challenge for many sites. The sites' responses were slow, even to the offer of money to hire consultants; in many cases, proposals to hire consultants were forthcoming only after several reminders and time extensions. Lack of time was frequently cited as a barrier impeding sites' progress. Most sites expressed a desire to have at least a portion of one person's time funded for faculty development. In addition, the overall timeline of the project proved too ambitious for most sites. Having a time frame of more than 15 months to move from assessment to implementation to evaluation received the highest ratings among options of ways to increase the effectiveness of future distance-based programs.
Supportive, stable environment. The need for a stable, supportive environment was clear, as the lack thereof was the “undoing” of several sites' faculty development projects. These tended to be the sites that “disappeared into the woodwork,” overwhelmed by institutional restructurings of one kind or another, by changes in department or institutional priorities or direction, and/or by turnover in pilot team leadership or membership.
Aside from these factors, which cannot be influenced by consulting, the sites' responses also indicated which types of consulting assistance they found or would find most helpful. First, the sites preferred individualized assistance tailored to their unique needs over general information supplied to the sites en masse. Second, in addition to the information about techniques of evaluation and publication provided as part of the DBC program, the sites would have liked information about funding sources for faculty development, how to establish a “culture” conducive to faculty development, and the availability of collaborative faculty development efforts. Finally, the sites recommended that future DBC programs have the following five characteristics:
- A time frame encompassing more than 15 months for sites to move from needs assessment to implementation to evaluation
- Set expectations for being a participating project site
- Dedicated time (perhaps two to three hours) with the program consultants at the faculty development conference
- Allocated funds for participating sites as a result of being selected
- A wider pool of consultants with skills or expertise in a range of faculty development areas and stages of project development
Recommendations for Future Distance-based Programs
Given the above feedback from sites and consultants, what recommendations can be made for the future conduct of similar distance-based efforts?
First, having funding tied to sites' acceptance as program participants is recommended to provide the sites with much-needed financial support. However, this support should come with expectations for required levels of participation by the sites. Thus, providing funding would demand greater accountability of the sites and provide a mechanism to keep them “on task” with their projects.
Second, limiting the sites to a more manageable number (six or, at most, eight) would serve to focus consultants' energies and thereby enable them to provide what the sites in this study most appreciated—individualized attention to their unique needs. This individualized attention would also serve to help keep the sites on task and progressing in a timely manner. Furthermore, a smaller number of project sites would necessitate a more selective approval process, with even closer scrutiny of the sites' propensities to succeed than was utilized for this study.
Third, a more generous timeline would increase sites' abilities to achieve desired outcomes, that is, to successfully move through all stages of conducting a needs assessment, and then designing, implementing, evaluating, and disseminating the results of a faculty development project.
Fourth, arranging for sites to have more initial, dedicated, face-to-face contact with the program consultants would provide a much stronger foundation for forging productive long-distance relationships.
Fifth, as sites develop their faculty development projects, having a larger pool of available consultants to address the wide range of possible needs would enhance the sites' success.
Upon reflection, it seems that the types of assistance most valued by the participating sites were those interactions that occurred “in person” rather than those that were distance-based. For example, the MTW conference itself, which was not detailed in this report, received highly positive ratings from attendees. In communications between site contacts and DBC consultants, and in the sites' final progress reports, site personnel frequently made positive remarks about the types of “in person” assistance they had received: personal contacts between site personnel and the experts at the conference, and subsequent on-site visits to the sites by the DBC staff or by other consultants of their choosing.
The bottom line is that even in our high-tech, Internet-linked society, the personal touch is what is most appreciated. Even attending to the recommendations above may not raise distance-based consulting to the level of effectiveness that can be achieved by good one-on-one, in-person assistance.