Coleman, Michelle M. MD; Blatt, Benjamin MD; Greenberg, Larrie MD
One of the many missions of U.S. medical schools is to ensure continuity of leadership by providing medical students with experiences that would attract them to careers in academic medicine.1–8 Medical educators have expressed concerns, however, about the effectiveness of this mission—concerns amplified by the difficulty of recruiting and retaining high-achieving and talented junior faculty.3 Current information suggests that very few students select an academic medicine career, perhaps as a result of the barriers to this career that they perceive.9,10 Those contemplating academic medicine see a revenue-driven environment in which, in addition to providing patient care, faculty must understand and respond to a bewildering array of issues such as health care reform, gaps in health care delivery, health care disparities, increased productivity expectations, decreased research funding, learner generational differences, and the new Accreditation Council for Graduate Medical Education resident regulations. Added to this complex picture is the need to master the skills necessary to fulfill the academic mission: teaching, mentoring, leadership, and scholarship. For new faculty to be successful in a revenue-driven context, they must bring with them the skills, knowledge, and attitudes to face these challenges. Because of the demands of the current work environment, on-the-job training is a difficult option. It is reasonable and advantageous for new faculty hires to have the necessary training to accomplish the academic mission when they begin their careers. Clearly, training the next generation of academic physicians is critical if the academic medicine community is going to recruit and retain a workforce that is competent in meeting the complex educational and leadership challenges it faces. Many educators now make the case for beginning this training in medical school. 
Consequently, faculty development programs in teaching have filtered down to residents and students.11 Medical educators have described many residents-as-teachers programs, and others, more recently, have reported the results of a national survey of students-as-teachers programs.12
In 2004, the American Medical Student Association (AMSA) responded to the need for early exposure by developing its own annual teacher training program for medical students: Training Tomorrow’s Teachers Today (T4). This annual program has focused on providing knowledge, skills, and attitudes related to teaching and curricular reform to interested students at all levels from across the United States. Since T4’s inception, AMSA has partnered with a sponsoring medical school in order for the program to benefit from faculty expertise and mentoring (previous sponsors have been Mount Sinai School of Medicine [three years] and the University of Michigan Medical School [two years]).13 AMSA leaders have traditionally held this initiative in the summer to accommodate the schedules of the largest number of medical students possible. They presented it over the course of a week to give adequate time to impart a meaningful foundation in teaching and learning principles.
Teacher training, however, is only one component of preparation for an academic career. Recognizing the need for a comprehensive and multifaceted program aimed at students interested in academic medicine, AMSA and the George Washington University School of Medicine and Health Sciences (GWU) teamed up in 2008–2009 to redesign T4 for the sixth annual T4 Medical Education Leadership Institute (hereafter, simply Institute). Educators at the University of Michigan, AMSA’s previous collaborator, recommended GWU because of its strong focus on undergraduate medical education. The primary goal of the sixth annual T4 Institute was to initiate the formation of a cadre of national leaders in academic medicine through a program that goes beyond providing students with teaching skills to exposing them to multiple components important in the evolving field of academic medicine.
The first purpose of this article is to describe the development and implementation of the 2009 Institute and to share lessons learned from the program. The second purpose is to demonstrate the usefulness of a multifaceted approach to career program evaluation, a key part of which is rooted in social cognitive career theory (SCCT).
Multiple studies in several disciplines support SCCT as a means to predict career choice, achievement, and success.6 Only recently have researchers applied it in a medical context.4–6 SCCT holds that career choice and success can be explained by the reciprocal interaction of self-efficacy (belief in one’s ability to succeed), outcomes expectations (how one anticipates external factors beyond one’s control will promote or limit success), and personal characteristics (e.g., predisposition, gender). The first two of these elements are, in turn, influenced by four factors: (1) personal success experience, (2) exposure to successful role models, (3) social and verbal persuasive communications, and (4) positive emotional reactions (e.g., low anxiety, supportive climate). Because a strong body of evidence links SCCT to career promotion and success,4–6 the theory offers a useful framework through which to evaluate the T4 program as a means to foster careers in academic medicine.
Curriculum Development and Implementation
Keeping the goal to expand T4’s focus beyond teaching in mind, national leaders from AMSA and from GWU assumed primary responsibility for creating, planning, and facilitating the sixth annual Institute. Three AMSA national leaders (including M.C.) and four GWU faculty (including L.G.) held both in-person and telephone meetings during 2008 and 2009 to discuss a redesigned curriculum. On the basis of the feedback of previous Institute participants, they agreed on a 41-hour curriculum with the following content areas: teaching, leadership, educational scholarship, and academic career-planning. Although the content of the 2009 Institute expanded to include elements other than teaching and learning, some elements from previous Institutes remained the same. Namely, the program remained student-led (i.e., students helped guide the curriculum) and experiential, and the student participants were required to come with an idea for a scholarly project to address at the Institute and then complete at their respective medical schools. Table 1 summarizes the content, objectives, hours, and evaluation methods of the 2009 T4 Institute.
We designed program sessions to be interactive and learner-centered; that is, medical students would actively engage in all activities. To accommodate learners’ differing learning styles, we provided a mix of teaching methods. These included the following:
* reflective practice,
* shared pairs,
* small- and large-group discussions,
* student-directed faculty panels,
* student presentations,
* observation of clinical teaching and learning in the ambulatory and inpatient settings, and
* interactions with standardized learners.
We also arranged for faculty–student lunches and other informal sessions during unscheduled time.
The AMSA coordinator (M.C.) recruited medical student participants at all levels (rising first through fourth year) from a diverse set of training programs (MD-granting, osteopathic, naturopathic, and dual-degree programs). AMSA advertised for the T4 Institute from November 2008 to March 2009 via mass e-mails through the previously existing AMSA listserv, reaching approximately 33,000 U.S. students (one-third premedical and two-thirds medical). To apply, medical students submitted both an essay explaining why they wanted to participate in the program and a proposal for a medical education project they would complete on return to their home institution. AMSA received 13 applications. After reviewing the applications, the AMSA coordinator (M.C.) determined all students to be eligible and appropriate for the program on the basis of the strength of their proposed projects and their commitment to a career in academic medicine. A panel of three additional AMSA national leaders provided final approval of student selection, and AMSA then sent correspondence confirming acceptance via e-mail in March 2009. Two students withdrew their applications in April secondary to scheduling conflicts. The remaining 11 students participated in the Institute, which occurred on the campus of GWU in Washington, DC, during the first week of July 2009. Participants (first- through fourth-year medical students) were a diverse group representing MD-granting and naturopathic medical schools, private and public medical schools, and schools from across the United States and even from the Caribbean.
Before the start of the 2009 Institute, we gave students key articles on adult learning theory and medical education to provide them with a cognitive foundation for the week. We expected them to read the material before arriving so that they would be familiar with the educational areas the Institute would address.
GWU faculty members facilitated the majority of program sessions; the AMSA coordinator (M.C.), the AMSA president, and the AMSA director of student programming led the remainder. We believed that a great strength of the program would be the opportunity for T4 participants to see fellow students (i.e., the national AMSA representatives) leading Institute sessions and to develop near-peer relationships with these leaders. We recruited faculty members according to their areas of expertise and commitment to medical student teaching.
Both the AMSA leaders and the GWU faculty members volunteered their time and did not receive compensation for their service to the program. The total cost of running the T4 program was approximately $6,000, not including participants’ travel expenses. Program tuition was set at $175, and we encouraged students, via the application materials, to apply to their medical schools for funds to cover their program tuition and travel costs. Nine out of 11 medical students received funding for the program. The two remaining students were unable to obtain funding from their schools and chose to cover their own travel expenses to the institution. After a brief discussion, the AMSA coordinator (M.C.) and the GWU faculty coordinator (L.G.) decided to waive the tuition of these two students. AMSA and GWU shared the remaining costs of running the program (student housing for the week, two meals per day, printing materials for the program), approximately $1,000 and $3,500, respectively.
We performed a multifaceted evaluation of the T4 program by assessing short-term outcomes, long-term outcomes, and theory-based measures. The GWU institutional review board exempted the T4 program evaluation protocol. We tested pre–post differences using either the Wilcoxon signed rank test or the paired t test, depending on the observed distribution of change scores. We used SAS for Windows version 9.2 (SAS Institute, Cary, North Carolina) to perform analyses of our program evaluation results.
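As an illustration only (this is not the authors’ SAS code, and the ratings below are hypothetical), the paired t test on pre–post change scores can be sketched in a few lines; when the change scores depart from normality, the Wilcoxon signed rank test is the nonparametric alternative the authors describe using:

```python
# Hypothetical sketch: paired t statistic on pre-post change scores for
# 11 participants. All ratings here are invented for illustration.
from statistics import mean, stdev
from math import sqrt

# Hypothetical self-efficacy ratings (1-5 scale), one pair per participant.
pre  = [3.1, 2.8, 3.4, 2.9, 3.0, 3.2, 2.7, 3.3, 3.0, 2.9, 3.1]
post = [4.2, 3.6, 4.5, 4.1, 3.9, 4.4, 3.7, 4.6, 4.0, 4.1, 4.3]

changes = [b - a for a, b in zip(pre, post)]
n = len(changes)

# Paired t statistic: mean change divided by its standard error.
# With n = 11, if the change scores look non-normal, the Wilcoxon
# signed rank test would be the appropriate substitute.
t_stat = mean(changes) / (stdev(changes) / sqrt(n))

print(f"mean change = {mean(changes):.2f}, t = {t_stat:.1f}, df = {n - 1}")
```

The choice between the two tests hinges on the distribution of the change scores themselves, not of the raw pre or post scores, which is why the sketch computes `changes` first.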
We evaluated short-term outcomes with both subjective and objective measures. The subjective measures consisted of three surveys which all 11 students completed: the program evaluation, a self-efficacy instrument, and the Commitment to Academic Medicine Survey. The objective measure was an objective structured teaching examination (OSTE).
The program evaluation survey. The program evaluation sampled students’ perceptions of the Institute, asking them to evaluate aspects of the program, including each of its 11 content areas, by responding to the statement “This topic was important and valuable” on a five-point Likert scale (1 = strongly disagree, and 5 = strongly agree). Students rated individual content areas from 3.8 to 4.9; the average across all topics was 4.1. The most highly rated sessions were “Giving Effective Feedback” (4.9), “Adult Learning Theory” (4.6), and “Case-Based Teaching” (4.6). Overall, students rated teaching- and leadership-focused sessions as the most important and valuable (they rated the topics in these categories 4.5 on average), whereas career-building and project-planning sessions were rated as less important and valuable (students rated individual topics in these categories, on average, 3.90 and 3.83, respectively). Students rated the learning climate as 4.5, and they assessed the organization of the Institute as 4.7. Every participant agreed that he or she would “recommend this institute to others.”
The self-efficacy survey. The self-efficacy survey asked students to rate their confidence in their teaching skills before and after the T4 Institute (retrospective pre–post format). This instrument, the Clinical Educator Self-Assessment, developed by Dennis Baker, PhD,14 at Florida State University, consisted of 14 items in areas such as leading case discussions, providing effective feedback, writing and using learning objectives, and using questions effectively to teach. Student ratings of self-efficacy (1 to 5; 5 = high) showed a positive change, from a mean of 3.0 (standard deviation [SD] = 0.6) pre-Institute to 4.1 (SD = 0.5) post-Institute. Mean changes (all increases) on individual survey items ranged from 0.4 to 1.5 points. Internal consistency reliability was 0.82 for pre-Institute and 0.87 for post-Institute. Students also experienced an increase in their confidence to execute their projects successfully—from 3.5 at the beginning of the program to 4.5 at the end of the program.
The Commitment to Academic Medicine Survey. The Commitment to Academic Medicine Survey, comprising just a single question also rated on a five-point scale (5 indicating high commitment), assessed students’ commitment to academic medicine before and after the program. Overall, students indicated a pre–post increase in their commitment to academic medicine: The mean rose from 3.6 pre-Institute to 4.5 post-Institute.
The OSTE. The OSTE, a shorter version of one designed at GWU for residents, served as an objective complement to the surveys and occurred in a pre–post fashion on Day 1 and Day 5 of the five-day program. The OSTE consisted of three stations: teaching a skill, giving a mini-lecture, and giving feedback. At both the beginning and the end of the program, the students encountered these same three OSTE scenarios and performed the assigned tasks with “standardized students,” senior GWU students from the GW TALKS students-as-teachers program (an elective program meant, primarily, to teach medical students how to teach and to expose them to medical education knowledge and skills).15,16 GWU educators who train standardized patients spent over four hours training these standardized students both to play scripted learner roles and, after each encounter, to evaluate the AMSA students using checklists, which included such items as “The teacher explained the relevance of the teaching exercise to me,” “The teacher asked me to summarize the take-home points,” and “The teacher offered me reinforcing feedback.”
On the pre-Institute OSTE, the students scored a mean of 63.4% (SD = 12.2) for the three stations. On the post-Institute OSTE, students scored a mean of 78.5% (SD = 9.3). Of the 11 students, 10 improved from the pre-Institute to the post-Institute OSTE. The remaining student scored slightly lower on the post-Institute OSTE, but this student had earned the second-highest pre-Institute score and thus had less room for improvement.
Students completed (via Survey Monkey) two long-term electronic evaluation surveys 8 months and 18 months after completing the program. The 8-month survey asked students which aspects of the T4 program were the most helpful to them (forced choice) and what T4 knowledge and skills they had put to use (open-ended). The 18-month survey asked students for follow-up information about their projects.
At the 8-month follow-up, 91% of the students (n = 10) said the teaching sessions were the most helpful elements of the T4 Institute, 73% (n = 8) said the project planning sessions were the most helpful, and 64% (n = 7) said the academic medicine career-building sessions were the most helpful. In contrast, fewer students rated the leadership training sessions (45%, n = 5), the ability to network with other students (27%, n = 3), and the ability to network with faculty (18%, n = 2) as most helpful. Of the eight students who chose to provide comments in the open-ended section of the evaluation, seven specifically mentioned that they were using teaching skills (giving feedback [n = 6], teaching a skill [n = 5], and adult learning [n = 5]) in their everyday lives.
At 18 months, students reported on the status of their projects (Table 2). Seven of the 11 students (64%) had completed their projects, and four reported abandoning their projects. Two students abandoned their projects for undisclosed reasons, and two abandoned their projects because the need for them became moot (one institution received a grant to develop a similar project, and one student group took over a student’s project before he or she could complete it). Of those who completed their projects, five had presented their medical education projects formally, resulting in 12 total presentations (5 at the national level and 7 at the students’ home institutions); one student had successfully received funding to support her project; one student had published an abstract; and one student was planning to submit a scholarly article in the current academic year. Five of the 11 students started new projects inspired by their participation in the T4 Institute.
Evaluation using SCCT
In addition to using student outcomes for evaluation, we performed a retrospective program review using SCCT. SCCT allowed us to evaluate the T4 program from a unique perspective: its potential to foster an academic career. As noted, SCCT demonstrates the reciprocal interaction of self-efficacy, outcomes expectations, and personal characteristics in influencing career decision making. Extensive empirical evidence supports its tenets.4–6 Bakken et al6(p96) note that research has demonstrated that academic and career self-efficacy “is predictive of educational and career choice, persistence, and achievement across a large range of career and educational domains and is culturally valid for a diversity of populations.”
Bakken et al6 were the first to apply SCCT to career formation analysis in medicine. Building on Bakken and colleagues’ efforts, O’Sullivan et al5 used SCCT as a framework in a qualitative study of how students and residents make decisions about careers in academic medical research. On the basis of the work of these authors, we have constructed a rubric (Table 3) as a convenient means to use SCCT to evaluate the T4 program. In so doing, we have assumed that the results of O’Sullivan and colleagues’ study, which focuses on academic research careers, can be extended to students considering careers as clinician–educators. In their examinations of SCCT, both O’Sullivan et al5 and Bakken et al6 strongly emphasize the critical role of continuity and mentoring (which, by definition, includes continuity). We have therefore added continuity to our list of factors.
By developing a rubric to assess the degree to which SCCT components were embedded in our program, we were able to use evidence-based means to evaluate its potential for fostering careers in academic medicine. The rubric lists SCCT’s major components (self-efficacy, outcomes expectations, personal characteristics, and continuity) in the left-hand column and, in the right-hand column, the factors that—if present—theoretically support these components.
Self-efficacy and outcomes expectations. As mentioned, the four factors that support both self-efficacy and outcomes expectations are (1) personal success experience, (2) exposure to successful role models, (3) social and verbal persuasive communications, and (4) positive emotional reactions. The results of the self-efficacy survey, the OSTEs, and the personal projects suggest that the T4 Institute successfully provided personal success experiences (factor 1); that is, self-efficacy improved, OSTE scores improved, and not only did most students complete their projects but some also parlayed them into presentations and even funding. Though students did not directly evaluate role modeling and social verbal persuasion (factors 2 and 3), the program planners deliberately incorporated these elements in the 2009 T4 Institute. During the Institute, 27 diverse faculty members served as role models and social verbal persuaders for students; we observed the meaningful connections faculty were able to develop with students through repeated interactions—both formal and informal. Students evaluated the learning climate (factor 4) very positively, rating it a 4.5 on the program evaluation surveys. Further, our observations support their assessment. We minimized the hierarchical differences between faculty and students by establishing rapport with the students, making the Institute’s goals clear, and setting the ground rules, all of which supported a safe learning climate. Many students, in fact, felt comfortable enough to address faculty on a first-name basis.
In addition to personal success experience, exposure to successful role models, social and verbal persuasive communications, and positive emotional reactions, two additional factors support Component 2, Outcomes Expectations: (5) Definition of Clear Career Pathways and (6) Approaches to Addressing Career Challenges.5,6 The 2009 Institute addressed both of these, at least to some extent, through career panels and informal student–faculty interactions. However, we did not define the degree to which they were to be addressed in the curriculum, nor did we evaluate the effectiveness of the elements that were included.
Personal characteristics and continuity. The 2009 T4 Institute minimally addressed the last two components in the model, Personal Characteristics and Continuity. Personal characteristics such as gender, ethnicity, and predisposition can affect the success of mentoring and personal experiences. Although evidence suggests that some students value gender- and race-concordant mentoring,17 we did not attempt to match students with faculty mentors of the same race or gender.
Research suggests that the influence of a single experience, such as T4, risks dissipating if it is not connected to a network of reinforcing experiences offered by a community of dedicated professionals. O’Sullivan et al,5(p339) in emphasizing the need for continuity, speak of the ideal approach as a “fluid pathway.” T4 failed to provide such fluidity; it did not formally connect student participants to one another or to faculty post-Institute, nor did it coordinate with students’ home institutions to help define a clear pathway for students interested in careers in academic medicine. Partnering with home institutions to demarcate a clear career path would have been challenging, especially if our impression—that many institutions themselves have not defined a clear pathway—is correct.
Evaluating the Evaluation
AMSA, in conjunction with GWU, accomplished its overall goal of offering students a peer-led summer institute which went beyond previous teaching programs to provide a broad exposure to academic medicine through curriculum in leadership, scholarship (or project development), and academic career-building, as well as teaching and learning. The Institute promoted students’ formal and informal exploration of academic medicine through repeated interactions with many faculty who presented the challenges and rewards of the field.
AMSA and GWU assessed the Institute’s effectiveness through the use of complementary evaluation strategies: subjective and objective; immediate and long-term. This multifaceted approach allowed us to evaluate the program on three of four Dixon levels18: student perception (surveys), test performance (OSTE), and real-life performance (projects and presentations). (The fourth and final level in Dixon’s evaluation model is impact on others.)18 Evaluators have advocated this sort of multifaceted approach as a means to gain a better picture of the influence of a program on outcomes beyond learner satisfaction.18 The results of this evaluation support the strength of the Institute, particularly in the curricular areas of teaching, scholarship, and career-building. For example, the results of the self-efficacy survey reveal that the students felt increased confidence in their teaching skills, and the positive OSTE results objectively support these perceptions. Similarly, the student scholarly project initiative supports both the Institute’s scholarship/project development and its academic medicine career-building curriculum. Because no control group was available for comparison, we cannot know what portion of this achievement is attributable to the program, but these very concrete products do set a standard of comparison for similar development programs of this type. Finally, the increase in commitment to a career in academic medicine, as evidenced by the results of our commitment survey, also attests to the effectiveness of the Institute’s academic medicine career-building curriculum. Of interest, whereas leadership topics were initially very highly rated in the short-term program evaluation survey, fewer than half of the students rated leadership topics as most helpful eight months following the Institute.
We devoted less time during the week of the Institute to leadership, so perhaps having less time to solidify knowledge and practice concrete skills in leadership made this topic area less useful in the long run, despite being valued by students. One potential solution to overcome this seemingly use-it-or-lose-it phenomenon is to conduct leadership training in a longitudinal model, emphasizing the application of learned knowledge and skills.
Our theory-based evaluation (i.e., the application of the SCCT) was also insightful. The data from our complementary evaluations cannot tell us whether the T4 program will result in its participants ultimately choosing and succeeding in academic careers—a question answerable only many years from now. SCCT, however, served as a surrogate, providing us with a way to look at T4’s potential to achieve this long-term outcome. Because evidence has linked SCCT components to career choice and success, we were able to gain a sense of our program’s career-building potential by determining the degree to which the Institute incorporated the components into the program or curriculum. In the future, T4 Institute planners will use SCCT for program design as well as evaluation. Specifically, because SCCT emphasizes the need for early exposure to a career and multiple follow-up exposures over time, future iterations of T4 (and other career-building programs) will need to include and evaluate continuity options. Planners could expand T4 either in the form of a larger summer institute or by hosting multiple institutes based at different U.S. schools (especially given the high cost of travel and housing associated with a national institute). Also worth consideration is the formation of a Web-based network to support and connect graduates of the Institute.
We believe the AMSA model, in conjunction with its multifaceted evaluation program (incorporating SCCT), may prove useful to medical schools interested in beginning or augmenting a students-as-teachers program. Some of the lessons we learned from the T4 experience may serve as a useful guide for career-building program developers. The first lesson we learned is the importance of student leadership in creating and implementing a successful program. The AMSA leaders brought to the collaboration with GWU faculty a clear sense of what students lacked and wanted. Other important lessons include the following:
* The student-driven, student–faculty collaboration that goes into devising the curriculum supports a program that, in turn, supports students’ career interests.
* The near-peer interaction between AMSA leaders and T4 students creates a program engaging and credible to participants.
* A multifaceted approach to curriculum for a careers-in-academic-medicine program (including leadership, academic careers, and project planning [as well as teaching]) is valuable.
* Challenging students to develop an educational project at the Institute and to then take it back to their home institution for completion helps them to understand how to produce scholarship.
* Casual, unstructured time is valuable for and valued by both faculty and students.
* Approaching program evaluation using complementary strategies, including long-term, short-term, subjective, and objective dimensions, reveals both strengths on which the program can build and areas in which the program can improve.
* Using SCCT to design as well as evaluate career-building programs adds benefit.
* Evaluating the integration of SCCT components into a career-building program may serve as a proxy measure for difficult-to-evaluate long-term program outcomes.
* Career-building programs need to attend to personal factors in career choice, perhaps by providing short-term mentoring and by bringing together students and faculty of similar genders, ethnicities, and backgrounds.
* Developing post-Institute continuity for students’ projects and career development is necessary for the program to be as effective as possible.
* Clearly defining career pathways for students interested in academic medicine helps the students actually walk those pathways and embark on academic medicine careers.
Other disclosures: None.
Ethical approval: The George Washington University institutional review board deemed this research exempt.