How Do Social Networks and Faculty Development Courses Affect Clinical Supervisors’ Adoption of a Medical Education Innovation? An Exploratory Study

Jippes, Erik PhD; Steinert, Yvonne PhD; Pols, Jan MD, PhD; Achterkamp, Marjolein C. PhD; van Engelen, Jo M.L. PhD; Brand, Paul L.P. MD, PhD

doi: 10.1097/ACM.0b013e318280d9db
Research Reports

Purpose To examine the impact of social networks and a two-day faculty development course on clinical supervisors’ adoption of an educational innovation.

Method During 2007–2010, 571 residents and 613 clinical supervisors in four specialties in the Netherlands were invited to complete a Web-based questionnaire. Residents rated their clinical supervisors’ adoption of an educational innovation, the use of structured and constructive (S&C) feedback. Clinical supervisors self-assessed their adoption of this innovation and rated their communication intensity with other clinical supervisors in their department. For each supervisor, a centrality score was calculated, representing the extent to which the supervisor was connected to departmental colleagues. The authors analyzed the effects of supervisor centrality and participation in a two-day Teach-the-Teacher course on the degree of innovation adoption using hierarchical linear modeling, adjusting for age, gender, and attitude toward the S&C feedback innovation.

Results Respondents included 370 (60%) supervisors and 357 (63%) residents. Although Teach-the-Teacher course participation (n = 172; 46.5%) was significantly related to supervisors’ self-assessments of adoption (P = .001), it had no effect on residents’ assessments of supervisors’ adoption (P = .371). Supervisor centrality was significantly related to innovation adoption in both residents’ assessments (P = .023) and supervisors’ self-assessments (P = .024).

Conclusions A clinical supervisor’s social network may be as important as faculty development course participation in determining whether the supervisor adopts an educational innovation. Faculty development initiatives should use faculty members’ social networks to improve the adoption of educational innovations and help build and maintain communities of practice.

Supplemental Digital Content is available in the text.

Dr. Jippes is manager, Center for Medical Imaging–North East Netherlands, University Medical Center Groningen, University of Groningen, Groningen, Netherlands.

Dr. Steinert is professor of family medicine, Richard and Sylvia Cruess Chair in Medical Education, and director, Centre for Medical Education, Faculty of Medicine, McGill University, Montreal, Quebec, Canada. At the time of writing, she was also Johanna H. Bijtel Chair, Faculty of Medicine, University of Groningen, Groningen, Netherlands.

Dr. Pols is master thesis coordinator, Wenckebach Institute, University Medical Center Groningen, University of Groningen, Groningen, Netherlands.

Dr. Achterkamp is assistant professor, Marketing Department, Faculty of Economics and Business, University of Groningen, Groningen, Netherlands.

Dr. van Engelen is professor, Product Development and Strategy Department, Faculty of Economics and Business, University of Groningen, Groningen, Netherlands, and professor, Design Engineering Department, Faculty of Industrial Design Engineering, Delft University of Technology, Delft, Netherlands.

Dr. Brand is professor of Clinical Medical Education, Postgraduate School of Medicine, Wenckebach Institute, University Medical Center Groningen, University of Groningen, Groningen, Netherlands, and consultant pediatrician, Princess Amalia Children’s Clinic, Isala Klinieken, Zwolle, Netherlands.

Supplemental digital content for this article is available at http://links.lww.com/ACADMED/A116.

Correspondence should be addressed to Dr. Jippes, University Medical Center Groningen, PO Box 30.001, Hanzeplein 1, 9700 RB Groningen, The Netherlands; telephone: (+31) 503619702; e-mail: e.jippes@umcg.nl.

As the theoretical framework of experiential learning has been widely accepted and supported by empirical data,1 most faculty development efforts in medicine take the form of workshops, courses, and seminars.2 At the same time, few experimental studies have documented the effects of such educational interventions on the professional behavior of faculty involved in clinical teaching. A systematic review3 found that faculty development activities were highly valued by participants, who reported positive changes in their attitudes toward teaching in general as well as improvements in their educational knowledge, self-efficacy, and behavior. Participants’ perceptions of their own behavioral changes were not, however, consistently reflected in students’ or residents’ evaluations of their teaching, and few effects on student behavior were demonstrated.3 These findings suggest that we should explore additional factors—beyond formal, structured faculty development activities—that can affect how and whether clinical supervisors transfer knowledge and skills into their teaching practices.

Research on business/management4 and the implementation of health care technologies5 suggests that social networks may be one such additional factor—as well as a promising avenue of research in determining the degree to which individuals adopt innovations. A social network can be defined as the relationships between a finite set of actors.6 Social networks function as channels through which potential adopters of an innovation can communicate, construct, and negotiate that innovation, thereby reducing its novelty and their uncertainty regarding its potential outcomes.4,5 On the basis of the principle that the pattern of relationships among individuals (or groups) has greater influence on outcomes than do the attributes of the individuals (or groups) themselves, social network analysts examine the effects of connections between individuals on the adoption of innovations.7

A key variable in social network analysis is the extent to which an individual is connected to other actors in the network, referred to as actor centrality.6 In business organizations, individuals with high centrality are significantly more likely than others to be promoted.8 Further, students’ centrality in their networks is associated with their enjoyment of learning and their academic success.7 In this study, we focus on the centrality of the clinical supervisor within the social network of his or her clinical department. We define clinical supervisor as a member of the medical faculty who supervises residents in a clinical setting; the notion of centrality represents the clinical supervisor’s embeddedness in this network.

Clinical supervisors in postgraduate medical education (PGME) programs in the Netherlands were recently encouraged to implement an educational innovation—structured and constructive (S&C) feedback—in their teaching of residents. In Dutch residency programs before 2004, feedback from clinical supervisors to residents, if offered at all, was given in an unstructured and sometimes derogatory manner.9,10 In 2004, the Royal Dutch Medical Association–Central College for Medical Specialists (CCMS)11 issued a legal directive requiring all medical specialist societies in the Netherlands to revise their PGME programs on the basis of the Canadian Medical Education Directions for Specialists (CanMEDS) framework of core competencies,12 which includes the physician’s roles as medical expert, collaborator, communicator, professional, health advocate, manager, and scholar. In addition, CCMS11 recommended introducing S&C feedback that follows “Pendleton’s rules”13: (1) Feedback should be structured, (2) clinical supervisors should give residents opportunities to express their opinions, (3) clinical supervisors should provide positive comments, (4) clinical supervisors should provide specific comments regarding areas for improvement, and (5) clinical supervisors should provide feedback in a “safe” way. Both residents and clinical supervisors have since expressed the view that the introduction of S&C feedback was the most important innovation in the renewed PGME curricula.14

All PGME program renewal was required to be completed by the end of 2010.11 To help clinical supervisors to master the skills involved in providing S&C feedback and to improve its implementation in practice, two-day Teach-the-Teacher courses were developed and offered to clinical supervisors from 2004 onward. The government of the Netherlands recommended and supported participation in these courses.

In this study, we explored the relative influence of social networks and participation in Teach-the-Teacher courses on clinical supervisors’ implementation of S&C feedback, as measured by clinical supervisors’ self-assessments and residents’ assessments of supervisors’ adoption of the innovation. Our research question was as follows: What is the effect of a clinical supervisor’s centrality in a social network of peer clinical supervisors, as compared with the effect of a clinical supervisor’s participation in a Teach-the-Teacher course, on the adoption of the S&C feedback innovation?

Method

Study participants

We recruited teams of residents (both junior and senior) and clinical supervisors in surgical (obstetrics–gynecology), medical (pediatrics), diagnostic (radiology), and supportive (anesthesiology) disciplines, from university and general hospitals involved in the clinical teaching of residents. We chose to incorporate a medical specialty of each discipline type and a mix of university and general hospitals for two reasons. First, physicians’ social networks may differ according to the characteristics of their specialties (e.g., anesthesiologists primarily work independently, whereas pediatricians work more closely together). Second, the hospital setting may influence physicians’ communication structures (e.g., faculty members working in smaller teams in general hospitals may have more frequent contact with one another than do their counterparts in university hospitals). We selected this study’s specific specialties and hospital settings on the basis of the authors’ access to key persons.

We determined that we needed to recruit 230 to 475 clinical supervisors for a statistical power of 80% (230 for an explained variance of 5%; 475 for an explained variance of 2.5%; with a P value of .05).15
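
The recruitment targets can be checked against Cohen's noncentral-F method for the overall test of a multiple regression R². The sketch below is an illustration only, not the authors' original computation; in particular, the number of predictors (four) is our assumption.

```python
# Minimal sketch (assumptions noted above) of the power calculation behind the
# 230-475 recruitment target, using Cohen's noncentral-F method for the test of
# a multiple regression R^2.
from scipy.stats import f as f_dist, ncf


def regression_power(n, r_squared, n_predictors, alpha=0.05):
    """Power of the overall F test of a regression with n subjects."""
    u = n_predictors                  # numerator degrees of freedom
    v = n - n_predictors - 1          # denominator degrees of freedom
    f2 = r_squared / (1 - r_squared)  # Cohen's effect size f^2
    ncp = f2 * (u + v + 1)            # noncentrality parameter (Cohen, 1988)
    f_crit = f_dist.ppf(1 - alpha, u, v)
    return 1 - ncf.cdf(f_crit, u, v, ncp)


print(regression_power(230, 0.05, 4))   # ~0.80 for an explained variance of 5%
print(regression_power(475, 0.025, 4))  # ~0.80 for an explained variance of 2.5%
```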

Study questionnaires and innovation adoption scores

During 2007–2010, we invited 571 residents and 613 clinical supervisors from the four medical specialties described above to complete previously validated Web-based questionnaires.16 We first contacted program directors within a specialty to ask for their cooperation. Once we had received their approval, we invited the residents and clinical supervisors within a clinical training unit/team to participate by e-mailing them the link to access the appropriate online questionnaire. We sent two e-mail reminders and stopped collecting data four weeks after the initial invitation was extended to the unit/team.

We collected study data across a four-year period because we wanted to ensure that the clinical supervisors of the participating teams were at the same stage of implementing S&C feedback to residents. Some specialties—such as pediatrics14—were early adopters of this educational innovation and were therefore included early in the data collection period, whereas others—such as radiology17—began to implement the innovation later and therefore were included late in the data collection period. Each team, resident, and clinical supervisor was surveyed only once during the study.

We adapted Rogers’4 definition of adoption—“the decision to make full use of an innovation as the best course of action available”—to the context of our study. The questionnaires therefore asked clinical supervisors and residents to assess the degree to which the supervisor had adopted S&C feedback rather than asking them to indicate whether the supervisor had or had not adopted the innovation. We chose resident- and self-assessed scores for S&C feedback as our dependent variable because another study identified such feedback as the most important innovation in the renewed PGME curricula14 and because the innovation was comparable across all specialties.

Resident questionnaire.

The five-item resident questionnaire asked residents to assess the nature of the S&C feedback given by their clinical supervisors in the six months before the questionnaire was administered (see Supplemental Digital Appendix 1, http://links.lww.com/ACADMED/A116). Residents were provided with a list of the clinical supervisors in their unit and were asked to rate each supervisor on items worded according to the five components of Pendleton’s rules13 (e.g., “The clinical supervisor provides feedback in a structured way,” “The clinical supervisor gives residents the opportunity to express their opinion”). Ratings were made on a five-point Likert scale, ranging from “totally disagree” (scored as 1) to “totally agree” (scored as 5); there was also an option of “not possible to assess this supervisor.”

We calculated the mean score on the five items for each clinical supervisor to determine the supervisor’s resident-assessed innovation adoption score. Only clinical supervisors who were assessed by at least two residents were included in the data analysis; the supervisor’s mean resident-assessed innovation adoption score was used as the dependent variable in analyses.

Prior reliability analysis yielded a Cronbach alpha of 0.82 for the five questions; factor analysis revealed one construct under these questions (eigenvalue of 3.042 and 61% explanation of variance).16
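
As an illustration only (the file and column names below are assumptions, not taken from the study), the following sketch shows how the per-supervisor resident-assessed adoption score and the Cronbach alpha of the five items could be computed:

```python
# Hypothetical data layout: one row per (resident, supervisor) rating with the
# five Pendleton-rule items scored 1-5.
import pandas as pd

ratings = pd.read_csv("resident_ratings.csv")  # assumed file name
items = [f"item{i}" for i in range(1, 6)]

# Cronbach alpha over the five feedback items
k = len(items)
item_variances = ratings[items].var(ddof=1).sum()
total_variance = ratings[items].sum(axis=1).var(ddof=1)
alpha = k / (k - 1) * (1 - item_variances / total_variance)

# Mean of the five items per rating, then mean per supervisor;
# keep only supervisors assessed by at least two residents.
ratings["adoption"] = ratings[items].mean(axis=1)
per_supervisor = ratings.groupby("supervisor_id")["adoption"].agg(["mean", "count"])
resident_assessed_score = per_supervisor.loc[per_supervisor["count"] >= 2, "mean"]
```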

Residents completed the survey anonymously and were not asked to provide any demographic data.

Clinical supervisor questionnaire.

Clinical supervisors completed a similar, 10-item, Web-based questionnaire (see Supplemental Digital Appendix 2, http://links.lww.com/ACADMED/A116). Supervisors rated their own adoption of the five components13 of the S&C feedback innovation (e.g., “I provide feedback in a structured way,” “I give residents the opportunity to express their opinion”) on a five-point Likert scale ranging from “totally disagree” (scored as 1) to “totally agree” (scored as 5). We calculated the supervisor’s self-assessed adoption score as the mean score on these five items and used it as the dependent variable in analyses. Clinical supervisors used the same Likert scale to respond to the item “Structured feedback constitutes an improvement of the quality of the clinical teaching of residents”; this was considered a measure of the supervisor’s attitude toward the S&C feedback innovation. We controlled for this variable because the adopter’s attitude may have had an effect on innovation adoption.18 We collected demographic data (age and gender) for clinical supervisors.

Other items on the questionnaire measured the clinical supervisor’s communication with departmental colleagues, from which we determined his or her centrality in the department’s social network, and asked whether the supervisor had attended a Teach-the-Teacher course (both independent variables, described below).

Teach-the-Teacher course attendance (independent variable)

Teach-the-Teacher training took the form of a two-day course on how to apply adult learning principles in clinical teaching19; approximately 70% of the course time was devoted to providing S&C feedback. Teach-the-Teacher courses were designed by medical schools/universities according to national government guidelines20,21 and were conducted by certified trainers and accredited educational institutes in the Netherlands. For the purposes of this study, we considered these courses to be comparable. The courses employed various methods to provide instruction on S&C feedback, such as interactive discussions, role-play, and mini-lectures. During role-play activities, participants gave S&C feedback to colleagues who had completed five-minute teaching sessions; this was followed by debriefings with other participants on how well participants had applied Pendleton’s rules.13 Clinical supervisors from different specialties and hospitals across the Netherlands, including supervisors in our sample, attended these courses.

In this study, we asked clinical supervisors to indicate on their questionnaire whether they had attended a Teach-the-Teacher course during the past three years so that we could examine whether course attendance had any impact on their adoption of the S&C feedback innovation.

Social network analysis: Centrality score (independent variable)

We used a “full roster” design for social network analysis.6 Following standard practice, we provided each clinical supervisor with a list of his or her fellow departmental clinical supervisors. In the Web-based questionnaire, we asked each supervisor to rate the intensity of his or her communication with each colleague “in the past half year about the introduction of innovations, new methods or procedures, or new developments related to the work situation” using a six-point scale of “never” (scored as 1), “less than once a month” (2), “more than once a month” (3), “weekly” (4), “daily” (5), or “more than once daily” (6).

To calculate supervisor centrality, we created an undirected dichotomous matrix by recoding responses as follows: 1 (never) and 2 (less than once a month) were recoded as “0,” indicating no communication between the individuals, whereas 3 (more than once a month) through 6 (more than once daily) were recoded as “1,” indicating a communication relationship between the individuals. We used the highest rating of communication intensity between two persons or, in the case of missing data, the rating from one person.6

We then calculated a centrality score for each clinical supervisor; this score represents the percentage of fellow clinical supervisors with whom he or she has contact about new developments at least once per month. The index thus ranges from 0 (the clinical supervisor has no contact with other supervisors in the department) to 100 (the supervisor has contact with all supervisors in the department on at least a monthly basis). The full details of the network analysis and calculation of centrality are available in Supplemental Digital Appendix 3 (http://links.lww.com/ACADMED/A116).
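
The sketch below illustrates the dichotomization, symmetrization, and centrality calculation described above under an assumed matrix layout; it is not the authors' code.

```python
import numpy as np

def centrality_scores(intensity: np.ndarray) -> np.ndarray:
    """intensity[i, j] = rating (1-6) that supervisor i gave for communication
    with supervisor j; np.nan marks a missing answer.
    Returns one centrality score (0-100) per supervisor."""
    rated = np.nan_to_num(intensity, nan=0.0)
    sym = np.maximum(rated, rated.T)    # highest rating of the pair, or the one available
    adjacency = (sym >= 3).astype(int)  # codes 1-2 -> no tie; codes 3-6 -> tie
    np.fill_diagonal(adjacency, 0)      # ignore self-ratings
    n = adjacency.shape[0]
    return 100.0 * adjacency.sum(axis=1) / (n - 1)
```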

Statistical analysis

First, we assessed the effects of Teach-the-Teacher course attendance and supervisor centrality on innovation adoption scores using t tests and correlation analyses. Subsequently, we analyzed the independent effects of Teach-the-Teacher course attendance and supervisor centrality on innovation adoption scores after adjusting for age, gender, and attitude toward the S&C feedback innovation. To account for the nested structure of the data (individuals within teams), we used two-level hierarchical linear modeling22 (MLwiN, version 2.17; University of Bristol). We included the interaction between Teach-the-Teacher course attendance and the centrality index to assess possible moderating effects. A P value of less than .05 was considered statistically significant.
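
The models in this study were fitted in MLwiN. As a rough, assumption-laden equivalent, a two-level random-intercept model with the same fixed effects and interaction term could be specified in Python as follows (the variable and file names are hypothetical):

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("supervisors.csv")  # assumed analysis file: one row per supervisor

# A random intercept per team captures the nested structure (supervisors within teams).
model = smf.mixedlm(
    "adoption_resident ~ age + gender + attitude + course + centrality + course:centrality",
    data=df,
    groups=df["team"],
)
result = model.fit()
print(result.summary())
```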

It should be noted that the data collected on the clinical supervisors were identifiable. Residents were asked to assess each clinical supervisor in their clinical training unit. Clinical supervisors identified themselves by name on their questionnaires and were asked to rate their communication intensity with each of the other supervisors in their clinical training unit. Data were anonymized by the study’s principal investigator (E.J.) prior to analysis. Data were linked for analysis via unique identifiers assigned by E.J. All data were stored on secured servers.

At the time of data collection, institutional review board approval was not required for medical education research in the Netherlands.

Results

Study respondents

We invited 613 clinical supervisors (radiology = 370, anesthesiology = 147, obstetrics–gynecology = 50, pediatrics = 46) and 571 residents (radiology = 344, anesthesiology = 141, obstetrics–gynecology = 50, pediatrics = 36) from 30 hospitals (8 university medical centers and 22 general hospitals) and 38 teams (radiology = 24, anesthesiology = 5, obstetrics–gynecology = 4, pediatrics = 5) to participate in this study.

Of the 613 clinical supervisors, 420 responded to the questionnaire. We discarded questionnaires with incomplete answers and analyzed responses from 370 clinical supervisors (radiology = 210, anesthesiology = 75, obstetrics–gynecology = 42, pediatrics = 43; overall response rate: 60%). See Table 1 for characteristics of the responding supervisors. Of the 571 residents, 357 responded (radiology = 210, anesthesiology = 98, obstetrics–gynecology = 41, pediatrics = 23; overall response rate: 63%).

Table 1

Univariate analyses

Teach-the-Teacher course attendance and centrality score.

Clinical supervisors who had attended a Teach-the-Teacher course in the past three years (n = 172; 47%) had a mean (SD) self-assessed innovation adoption score of 4.20 (0.55) as compared with 4.02 (0.58) for those who had not attended such a course (n = 198; 53%; 95% confidence interval [CI] for difference, 0.06 to 0.30; P = .013). The resident-assessed mean (SD) innovation adoption score was 4.04 (0.50) for the clinical supervisors who had attended a Teach-the-Teacher course versus 4.07 (0.45) for those who had not (95% CI, −0.07 to 0.13; P = .272).

Clinical supervisors whose centrality scores were in the highest quartile (centrality scores ≥ 58.33; n = 93; 25%) had a mean (SD) self-assessed innovation adoption score of 4.19 (0.53) compared with 3.96 (0.64) for supervisors whose centrality scores were in the lowest quartile (centrality scores ≤ 13.33; n = 86; 23%; 95% CI for difference, −0.41 to −0.06; P = .004). The resident-assessed mean (SD) innovation adoption score was 4.11 (0.46) for clinical supervisors whose centrality scores were in the highest quartile versus 3.99 (0.47) for supervisors whose centrality scores were in the lowest quartile (95% CI, −0.25 to 0.02; P = .048). Supplemental Digital Figure 1 (http://links.lww.com/ACADMED/A116) presents a visual representation of a social network of clinical supervisors in a participating department, including the centrality of individual faculty members and their resident-assessed innovation adoption scores.

Control variables.

The correlation between clinical supervisors’ self-assessed and resident-assessed innovation adoption scores was considerably higher for supervisors who had attended a Teach-the-Teacher course (r = 0.33, P < .001) than for those who had not (r = 0.20, P = .006).

There were no significant differences in gender, age, and attitude toward the S&C feedback innovation between clinical supervisors who had and had not attended a Teach-the-Teacher course (all P values > .4). Resident-assessed innovation adoption scores were significantly correlated with supervisor age (r = −0.14, P < .001). Male clinical supervisors (n = 261; 71%) rated their own innovation adoption significantly higher (mean 4.16, SD 0.57) than did female supervisors (n = 109; 29%; mean 3.98, SD 0.57; P = .006). Supervisor self-assessed innovation adoption score was significantly correlated with attitude toward the S&C feedback innovation (r = 0.23, P < .001) and with centrality score (r = 0.14, P = .006). Male supervisors were significantly older (mean 47.87 years, SD 8.51) than female supervisors (mean 44.06 years, SD 7.47; P < .001). Finally, male supervisors’ mean centrality score of 42.93 (SD 31.69) was significantly higher than that of female supervisors (mean 25.69, SD 26.18; P < .001).

Multivariate analyses

Table 2 presents the results of the hierarchical linear model at the individual level. (Data for the departmental level are not shown because the model showed no explained variance at that level.) In the multivariate models, all three control variables significantly influenced innovation adoption scores: Male supervisors had higher innovation adoption scores than female supervisors, scores decreased with increasing supervisor age, and more positive supervisor attitudes toward the S&C feedback innovation were associated with greater adoption. After we adjusted for these control variables, Teach-the-Teacher course attendance was weakly but significantly related to the supervisor’s self-assessed innovation adoption score (P = .001; explained variance = 11.49% on the individual level) but not to the resident-assessed innovation adoption score (P = .371). In contrast, clinical supervisor centrality was significantly related to innovation adoption, both as rated by residents (P = .023; explained variance = 4.15%) and by the supervisors themselves (P = .024; explained variance = 9.29%). There was no significant interaction between Teach-the-Teacher course attendance and clinical supervisor centrality score in any of the models.

Table 2

Discussion

In this study, we compared the effects of a faculty development intervention (Teach-the-Teacher courses) and social networks on clinical supervisors’ adoption of a medical education innovation (providing S&C feedback to residents). Although Teach-the-Teacher course participation was significantly related to self-assessed innovation adoption scores, it had no effect on residents’ ratings of their supervisors’ adoption of the innovation. In contrast, the clinical supervisor’s centrality within his or her department’s social network was significantly related to both self-assessed and resident-assessed innovation adoption scores. These associations remained significant after we adjusted for supervisor age, gender, and attitude toward the S&C feedback innovation, which suggests that individuals offering faculty development programs or activities should take faculty members’ social networks into account.

The degree to which an individual in a social network is connected to other individuals in the network (actor centrality) has been shown to have a major influence on that individual’s adoption of business innovations.4,23 A recent publication suggested that taking a network approach toward faculty development programs in medicine might be important in determining the success of such programs.24 In a previous study, we demonstrated the importance of clinical supervisor centrality in the adoption of educational innovations.16 The current study adds to our earlier findings in several ways: We used a larger sample of clinical supervisors and residents, we employed hierarchical linear modeling to account for the nested data structure (individuals within teams), and we incorporated supervisors’ self-assessments in addition to residents’ assessments of supervisors’ innovation adoption. The use of hierarchical linear modeling allows for better assessment of coefficients and error components, and the inclusion of both faculty self-assessments and resident assessments provides valuable insights into how adopters’ perceptions compare with the perceptions of others. In concordance with our earlier work,16 we found that clinical supervisor centrality contributed significantly to the residents’ assessment of the supervisor’s innovation adoption. In the current study, we found that supervisor centrality was similarly significantly related to the supervisor’s self-assessment of innovation adoption; however, Teach-the-Teacher course attendance was related only to the supervisor’s self-assessment.

The latter finding is in line with previous studies2,3 showing that the effect of a faculty development course—in this case, Teach-the-Teacher training—on teaching behavior may be limited. This does not mean that such educational workshops and courses should be abandoned, especially as they are effective methods of disseminating information to groups. On the basis of the results of this study, however, we recommend that faculty development courses include a high proportion of interactive exercises (e.g., role-play, discussions) because (1) active participation is likely to improve participants’ retention of knowledge and skills25 and (2) these interactions may activate social network structures among participants. Such engagement of social network structures may also help explain why long-term, comprehensive faculty development programs appear to be more successful than isolated workshop-based interventions.3

Recent articles have highlighted the role of social practices in faculty development and the importance of faculty development in building communities of practice.26,27 Our results support considering faculty development through this lens and moving beyond workshops as a primary method of delivery.26 Our finding that there were no significant differences in residents’ assessments of clinical supervisors who had and who had not attended Teach-the-Teacher training suggests that clinical supervisors may have learned from other sources how to incorporate S&C feedback into their teaching practices. Our results suggest that social networks of peer clinical supervisors could have provided this input; however, considering that our explained variance is relatively low, other social network sources may be involved (e.g., network connections with medical educators).

It is also interesting to note that clinical supervisors rated their own adoption of S&C feedback more favorably than their residents did, as the effect of Teach-the-Teacher course participation on adoption of the educational innovation was demonstrated only in the supervisors’ self-assessments and not in the residents’ assessments of their supervisors. Supervisors may have overrated their own adoptive behavior; this is consistent with others’ findings that adult learners often perform poorly in assessing their own clinical or educational competence.28 Alternatively, residents may have underrated supervisors’ adoptive behavior as they may not have recognized feedback when it was given or may have confused feedback with teaching.29,30

We examined the influence of three potential confounding factors in this study: age, attitude toward the S&C feedback innovation, and gender. First, we found that clinical supervisors were less likely to show adoptive behavior (as measured by residents’ assessments) with increasing age. Although it could be argued that older faculty members are less able to learn or less willing to adopt new skills,31 we believe two other explanations should be considered: (1) Residents may have been more likely to identify with younger clinical supervisors and therefore rated them more highly, and (2) younger faculty may have been more familiar than older faculty with S&C feedback because younger faculty’s medical training may have been more oriented toward this innovation. Second, we were not surprised to find that clinical supervisors with more positive attitudes toward S&C feedback were more likely to adopt the innovation, according to their self-assessments. This finding is in agreement with previous research on attitude and innovation adoption in health care.18 Third, with regard to gender, residents gave higher innovation adoption scores to male supervisors than to female supervisors. Our finding that gender was only of borderline significance in the multivariate analysis (Table 2) suggests that the higher scores given to the men may be partly due to the stronger embeddedness of male than female clinical supervisors in their respective social networks. Men’s higher centrality may have given them more opportunities to become acquainted with, and adopt, the innovation.

Our results on centrality confirm research showing that individuals may gain social capital benefits from holding central positions in their social networks. A meta-analysis32 of eight business studies found that individuals with high centrality were likely to emerge as leaders, to be more satisfied with team performance, and to participate more in developing and implementing task solutions. Other studies showed that centrality independently predicted individuals’ workplace performance33 and that high centrality increased the likelihood of employees remaining in their positions.34 In a study of an advertising and public relations agency, centrality was found to be the most significant predictor for involvement in innovation.23 Furthermore, individuals who are more central have more opportunities to be introduced to new ideas, to avail themselves of the necessary resources for implementing these new ideas, and to adopt innovations.5

Strengths and limitations

This study is the first to compare the effects of Teach-the-Teacher training and social networks on clinical supervisors’ implementation of a medical education innovation, as rated both by the clinical supervisors themselves and by their residents. The large study sample, which allowed for hierarchical linear modeling, improves the robustness of our findings.

This study had a number of limitations. First, our explained variance was relatively low, which means that there were variables beyond the scope of our study that had significant effects on innovation adoption. Second, the adoption of the educational innovation may have been influenced by social networks other than the one we investigated (e.g., networks with residents, with supervisors from other departments or hospitals, or with medical educators). It is likely that interdisciplinary collaboration increases the likelihood of innovation adoption, but this requires further study in the field of medical education. Third, this study was designed as an exploratory study with a cross-sectional and observational research design. Longitudinal and more experimental approaches, including pre- and posttest measures, are needed to study the dynamics of social networks in PGME innovation and to test the hypothesis that such networks are indeed important.

Fourth, we measured the effect on innovation adoption of clinical supervisors’ participation in a Teach-the-Teacher course in the three years prior to the questionnaire. The fact that the period between completion of the course and administration of the questionnaire differed among the supervisors in our sample may have had an impact on the results. Fifth, all assessments of innovation adoption in this study reflect the perceptions of the resident or supervisor completing the questionnaire. We did not directly observe clinical supervisors giving S&C feedback to residents. Finally, our study was limited to the implementation of S&C feedback in PGME in four disciplines in the Netherlands. In this study, S&C feedback can be considered a genuine innovation because this change was “new” for the adopting organizations.4 However, caution is required in generalizing our findings to other types of innovations and organizations.

Conclusion

Although further studies are necessary to corroborate our preliminary findings, we believe our results provide a novel demonstration of the principle that the structure of a clinical supervisor’s social network may be at least as important in that supervisor’s adoption of new pedagogical methods as his or her participation in faculty development courses. Our results also suggest that faculty development efforts could capitalize on faculty members’ social networks to improve their implementation of medical education innovations. We recommend accomplishing this by using diverse strategies, such as specifically including in faculty development initiatives those medical faculty with high centrality, and by viewing faculty development initiatives as key components in building and maintaining communities of practice.

Funding/Support: None.

Other disclosures: None.

Ethical approval: At the time of data collection, ethical review boards did not exist for medical education research in the Netherlands, and under Dutch law, medical education studies were exempted from ethical review. There were no health or safety risks to participants. Participants received an e-mailed invitation that indicated that participation (i.e., completion of the Web-based survey) was voluntary, detailed the purpose of the study, and described the procedures related to and treatment of respondents’ data. Although some identifiable data were collected, the lead author de-identified the data prior to analysis. All data were stored on secure servers.

Previous presentations: This study was part of Dr. Jippes’ doctoral dissertation, entitled The Role of Social Communication Networks in Implementing Educational Innovations in Healthcare, University of Groningen, May 31, 2012.

References

1. Kolb DA. Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice Hall; 1984.
2. McLean M, Cilliers F, Van Wyk JM. Faculty development: Yesterday, today and tomorrow. Med Teach. 2008;30:555–584.
3. Steinert Y, Mann K, Centeno A, et al. A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME guide no. 8. Med Teach. 2006;28:497–526.
4. Rogers EM. Diffusion of preventive innovations. Addict Behav. 2002;27:989–993.
5. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: Systematic review and recommendations. Milbank Q. 2004;82:581–629.
6. Wasserman S, Faust K. Social Network Analysis. Cambridge, UK: Cambridge University Press; 1994.
7. Baldwin T, Bedell M, Johnson J. The social fabric of a team-based MBA program: Network effects on student satisfaction and performance. Acad Manage J. 1997;40:1369.
8. Brass D. Being in the right place: A structural analysis of individual influence in an organization. Adm Sci Q. 1984;29:518–539.
9. Van Der Hem-Stokroos HH, Scherpbier AJ, Van Der Vleuten CP, De Vries H, Haarman HJ. How effective is a clerkship as a learning environment? Med Teach. 2001;23:599–604.
10. Busari JO, Weggelaar NM, Knottnerus AC, Greidanus PM, Scherpbier AJ. How medical residents perceive the quality of supervision provided by attending doctors in the clinical setting. Med Educ. 2005;39:696–703.
11. Royal Dutch Medical Association—Koninklijke Nederlandsche Maatschappij tot Bevordering der Geneeskunst. Resolution (kaderbesluit) CCMS 2009 [in Dutch]. http://knmg.artsennet.nl/Opleiding-en-Registratie/CGS-1/Regelgeving/Huidige-regelgeving-CCMS.htm. Accessed November 29, 2012.
12. Scheele F, Teunissen P, Van Luijk S, et al. Introducing competency-based postgraduate medical education in the Netherlands. Med Teach. 2008;30:248–253.
13. Pendleton D, Schofield D, Tate P, Havelock P. The New Consultation: Developing Doctor–Patient Communication. Oxford, UK: Oxford University Press; 2003.
14. Jippes E, Van Luijk SJ, Pols J, Achterkamp MC, Brand PL, van Engelen JM. Facilitators and barriers to a nationwide implementation of competency-based postgraduate medical curricula: A qualitative study. Med Teach. 2012;34:e589–e602.
15. Cohen J, West SG, Cohen P. Applied Multiple Regression/Correlation Analysis for the Behavioral Sciences. Mahwah, NJ: Lawrence Erlbaum Associates; 2002.
16. Jippes E, Achterkamp MC, Brand PL, Kiewiet DJ, Pols J, van Engelen JM. Disseminating educational innovations in health care practice: Training versus social networks. Soc Sci Med. 2010;70:1509–1517.
17. Jippes E, van Engelen JM, Brand PL, Oudkerk M. Competency-based (CanMEDS) residency training programme in radiology: Systematic design procedure, curriculum and success factors. Eur Radiol. 2010;20:967–977.
18. García-Goñi M, Maroto A, Rubalcaba L. Innovation and motivation in public health professionals. Health Policy. 2007;84:344–358.
19. Mann KV. Theoretical perspectives in medical education: Past experience and future possibilities. Med Educ. 2011;45:60–68.
20. Boor K, Teunissen PW, Brand PLP. Feedback guidelines in postgraduate medical education [in Dutch]. TMO Neth J Med Educ. 2011;30:43–49.
21. Molenaar WM, Zanting A, van Beukelen P, et al. Designing teacher competencies for medical education [in Dutch]. TMO Neth J Med Educ. 2009;28:201–211.
22. Snijders TAB, Bosker RJ. Multilevel Analysis: An Introduction to Basic and Advanced Multilevel Modeling. London, UK: SAGE Publications; 1999.
23. Ibarra H. Network centrality, power, and innovation involvement: Determinants of technical and administrative roles. Acad Manage J. 1993;36:471–501.
24. Baker L, Reeves S, Egan-Lee E, Leslie K, Silver I. The ties that bind: A network approach to creating a programme in faculty development. Med Educ. 2010;44:132–139.
25. Ramani S, Leinster S. AMEE guide no. 34: Teaching in the clinical environment. Med Teach. 2008;30:347–364.
26. Steinert Y. Faculty development: From workshops to communities of practice. Med Teach. 2010;32:425–428.
27. Steinert Y, Macdonald ME, Boillat M, et al. Faculty development: If you build it, they will come. Med Educ. 2010;44:900–907.
28. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: A systematic review. JAMA. 2006;296:1094–1102.
29. Shute VJ. Focus on formative feedback. Rev Educ Res. 2008;78:153–189.
30. Liberman SA, Liberman M, Steinert Y, McLeod P, Meterissian S. Surgery residents and attending surgeons have different perceptions of feedback. Med Teach. 2005;27:470–472.
31. Decker D, Wheeler GE, Johnson J, Parsons RJ. Effect of organizational change on the individual employee. Health Care Manag. 2001;19:1–12.
32. Mullen B, Johnson C, Salas E. Effects of communication network structure: Components of positional centrality. Soc Netw. 1991;13:169–185.
33. Mehra A, Kilduff M, Brass D. The social networks of high and low self-monitors: Implications for workplace performance. Adm Sci Q. 2001;46:121–146.
34. Feeley T. Testing a communication network model of employee turnover based on centrality. J Appl Commun Res. 2000;28:262.

© 2013 by the Association of American Medical Colleges