Academic Medicine, August 2004 - Volume 79 - Issue 8

Research Report

Documentation Systems for Educators Seeking Academic Promotion in U.S. Medical Schools

Simpson, Deborah PhD; Hafler, Janet EdD; Brown, Diane; Wilkerson, LuAnn EdD

Author Information

Dr. Simpson is associate dean for educational support and evaluation, director, Office of Educational Services, and professor of family and community medicine, and Ms. Brown is educational project analyst, Office of Educational Services, both at the Medical College of Wisconsin, Milwaukee; Dr. Hafler is assistant professor of pediatrics, Office of Educational Development, and associate director for faculty development, Harvard Medical School, Boston; and Dr. Wilkerson is senior associate dean for medical education, University of California, Los Angeles, David Geffen School of Medicine at UCLA.

Correspondence and requests for reprints should be addressed to Dr. Simpson, Associate Dean for Educational Support and Evaluation, Director, Office of Educational Services, 8701 Watertown Plank Road, Milwaukee, WI 53226.

For an article on a similar topic, see pp. 729–736.

Abstract

Purpose. To explore the state and use of teaching portfolios in promotion and tenure in U.S. medical schools.

Method. A two-phase qualitative study using a Web-based search procedure and telephone interviews was conducted. The first phase assessed the penetration of teaching portfolio-like systems in U.S. medical schools using a keyword search of medical school Web sites. The second phase examined the current use of teaching portfolios in 16 U.S. medical schools that reported their use in a survey in 1992. The individual designated as having primary responsibility for faculty appointments/promotions was contacted to participate in a 30- to 60-minute interview.

Results. The Phase 1 search of U.S. medical schools’ Web sites revealed that 76 medical schools have Web-based access to information on documenting educational activities for promotion. A total of 16 of the 17 medical schools responded to Phase 2. All 16 continued to use a portfolio-like system in 2003. Two documentation categories, honors/awards and a philosophy/personal statement regarding education, were each included by six more of these schools than in 1992. Dissemination of work to colleagues is now a key inclusion at 15 of the Phase 2 schools. The most common type of evidence used to document education was learner and/or peer ratings, with infrequent use of outcome measures and internal/external review.

Conclusions. The number of medical schools whose promotion packets include portfolio-like documentation associated with a faculty member's excellence in education has increased by more than 400% in just over ten years. Among early-responder schools the types of documentation categories have increased, but students’ ratings of teaching remain the primary evidence used to document the quality or outcomes of the educational efforts reported.

A portfolio is a systematic collection of information documenting expertise in an area, usually incorporating multiple sources of information collected over time to demonstrate excellence.1 Since the late 1980s, the portfolio concept has been used in higher education as a tool for faculty members to report their educational activities, often with accompanying evidence of effectiveness. This report explores the current status of the portfolio approach in medical education to better understand how this format can make the processes and products of teaching more public and assessable.

Teaching portfolios or dossiers typically contain three types of information: a personal statement to provide a context for reviewing the portfolio, a brief review of major accomplishments and activities, and summarized evidence regarding the quality and effectiveness of the activities.2 In 1990, the ACME-TRI Report showed that only five schools used “educational dossiers” for documenting educational activities and accomplishments.3 A search of the indexed medical literature in 1991 yielded only one brief description of a portfolio-like approach for systematically evaluating educational contributions (the Teaching Dossier from the University of Toronto).4

To identify the actual use of portfolios, Simpson and colleagues5 surveyed 126 medical schools in 1992. At that time, 24 medical schools indicated they had portfolio systems in place under a variety of names: “Promotion Packet,” “Performance Packet,” “Faculty Activities Handbook,” “Promotions Folder/File,” and “Data Summary for Professional Advancement.” A follow-up telephone interview determined that only 17 medical schools met the literature-derived criterion1 (agreed to by the authors) for using a portfolio: providing evidence in at least three education categories for promotion. Typical portfolio inclusions from these 17 schools were philosophy, teaching, curriculum development, advising, learner assessment, educational administration, and dissemination.

In the intervening decade, the interest in and reports on the use of portfolios to document teaching quantity and quality have continued to grow, including reports from individual institutions6–10 and disciplines or medical specialties.11–15 Refinements to the portfolio concept described in the literature include the identification of types of evidence appropriate to different educational roles,15,16 descriptions of portfolios associated with successful academic promotion,6,8,10 and criteria for evaluating portfolio content as a form of scholarship.17–21 Several recent articles have addressed the use and review of portfolios in the selection of faculty members into “academies” of medical educators.22–24

The common theme among these studies and reports is that documenting educational activities and providing associated evidence of excellence that can be judged by peers is feasible, and the results can be successfully used for academic promotion. However, the literature about typical portfolio contents, types of evidence, and standards for evaluating the portfolio in promotion decisions fails to fully represent the degree to which portfolio-like systems have penetrated academic medicine over the past ten years. In this report, we attempt to elucidate the current use of educational portfolios in academic medicine through a two-phase study. In Phase 1, we reviewed the policies and documentation guidelines for teaching accomplishments for purposes of promotion and tenure at U.S. medical schools available on the World Wide Web (WWW). In Phase 2, we interviewed academic leaders at the 17 schools that were the early adopters of teaching portfolios in 1992.

Method

We used a two-phase qualitative research design. Phase 1 of the study used publicly accessible information available via the Web. Phase 2 of the study was reviewed and approved by the Medical College of Wisconsin's Education Institutional Review Board.

Phase 1

Phase 1 focused on assessing the penetration of portfolio-like systems for documenting and demonstrating faculty excellence in education for academic promotion in all U.S. medical schools. The search was conducted online from November 2002 to February 2003. Each medical school's WWW home page was accessed via links from the alphabetical listing of medical schools at the Association of American Medical Colleges' (AAMC) Web site.25

The search of each school's Web site used the following terms: portfolio, teaching portfolio, dossier, faculty dossier, educational dossier, scholarship of teaching, educator's portfolio, electronic portfolio, faculty portfolio, faculty development, promotion, rank and tenure, handbook, faculty handbook, and teaching portfolio template. For searches that failed to yield any results with the initial search terms, another search was initiated using broader terms: faculty guidelines, guidelines, guidelines for promotion and promotion packet. Because portfolios allow a candidate to present an array of educational activities ranging from curriculum development and assessment to teaching, advising, and educational administration,1,26–28 a medical school was judged to have a portfolio system in place if it accepted evidence of at least three different types of education activities. This approach may have underrepresented the actual use of educational portfolios in those schools that do not make academic advancement policies and procedures accessible on their Web sites.

The results from each Web site were coded into a template that listed the school's name and the nature of the accessible information such as policies or guidelines.
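
As a rough illustration of the Phase 1 decision rule and coding template described above, the following sketch (in Python) applies the at-least-three-categories criterion to one hypothetical school record; the category names and the example entry are invented for illustration and were not part of the study materials.

# Hypothetical sketch of the Phase 1 screening rule: a school is coded as
# having a portfolio-like system if its publicly accessible promotion
# materials accept evidence in at least three categories of educational
# activity.

EDUCATION_CATEGORIES = {
    "teaching",
    "curriculum development",
    "advising/mentoring",
    "learner assessment",
    "educational administration",
    "dissemination",
}

def has_portfolio_system(accepted_categories, minimum=3):
    """Return True if at least `minimum` recognized categories are accepted."""
    recognized = EDUCATION_CATEGORIES & set(accepted_categories)
    return len(recognized) >= minimum

# One hypothetical row of the coding template (school name, nature of the
# accessible information, and the education categories it accepts).
example_school = {
    "name": "Example School of Medicine",
    "accessible_information": "promotion guidelines",
    "accepted_categories": ["teaching", "advising/mentoring", "dissemination"],
}

print(has_portfolio_system(example_school["accepted_categories"]))  # prints True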

Phase 2

Phase 2 examined the current status of portfolio use in medical schools that were originally surveyed in 1992.5 One author (DB) searched the Web sites of each school for detailed information about policies and guidelines and for examples of portfolios. The individual designated by the AAMC's Medical Schools of the United States and Canada25 as having primary responsibility for faculty appointments and promotions was contacted via a letter explaining the purpose of the project, committing that information disclosed would not be identifiable by school, and noting that individuals would be contacted to determine their willingness to participate in an interview. We divided the early-adopter schools among three of the authors (DS, JH, LW), each of whom conducted structured 30- to 60-minute telephone interviews from May through July 2003.

The interview protocol (available upon request from the corresponding author) focused on whether a system was in place for faculty members to document and provide evidence demonstrating the quality of their education-related activities for academic promotion. If a system was in place, the content of the educational activities and the types of associated evidence were discussed. The final part of the interview focused on the degree to which education was considered a form of scholarship and on the value of the portfolio-like system for demonstrating educational scholarship in their institution. The protocol combined open-ended questions with closed-option questions using dichotomous (yes/no) or Likert-scale response options. For example, “How effective is your method for documenting education for academic promotion?” was answered using a four-point scale (high, moderate, somewhat, poor), with a follow-up question asking the respondent to please explain. The participants’ responses to quantitative rating-scale questions were recorded directly on the interview template by the interviewer using the rating scale provided. Narrative responses were recorded as field notes by the interviewer on the interview template by question. All interviewers forwarded their completed interview templates to the senior author for analysis. The narrative results were analyzed qualitatively using open coding29 (DS), and the coding results were confirmed by the original interviewer.
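
The quantitative ratings collected with this protocol can be summarized as simple scale means, as reported later in the Results (e.g., means of 1.2 and 1.8 on the four-point scale). The short sketch below (in Python) shows one way such means could be computed; the response distribution is hypothetical and is not the study data.

# Hypothetical sketch: averaging responses on the four-point scale used in
# the interviews (1 = high, 2 = moderate, 3 = somewhat, 4 = poor).
# Lower means indicate more favorable ratings.

SCALE = {"high": 1, "moderate": 2, "somewhat": 3, "poor": 4}

def mean_rating(responses):
    """Convert scale labels to numbers and return their mean."""
    values = [SCALE[label] for label in responses]
    return sum(values) / len(values)

# Invented responses from 16 interviewees to one rating question.
commitment_responses = ["high"] * 13 + ["moderate"] * 3
print(round(mean_rating(commitment_responses), 1))  # prints 1.2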

Results

Phase 1: Penetration of Portfolio-Like Documents in the Promotion Process

The medical school Web-site search revealed that eight of the 126 medical schools had secured Web sites allowing only intrainstitution access. Of the remaining 118 schools, 76 (64%) had links to Web-based information on documenting educational activities for promotion. Only 14 (18%) of those 76 schools had links to both detailed policy descriptions of education-specific inclusions for promotion packet submission and descriptive information (e.g., examples of portfolio entries, guidebooks, or instructions on how to complete a portfolio). However, most of the 76 schools had public access to either detailed policy descriptions (57%) or descriptive information (55%). The 76 schools were representative of the national distribution of public and private schools: 50 (66%) were public and 26 (34%) were private, compared with 59% and 41% nationally.30 Twenty-five of the portfolio schools were among the 30 top-ranked schools for National Institutes of Health (NIH) awards for fiscal year 2002.31

Medical schools or their parent universities described their documentation systems using one or more of an array of terms, including “description of activities,” “dossier,” “promotion packet,” “summary statement,” and “faculty log.” Web links included handouts, guidelines, step-by-step instructions for creating one's own teaching portfolio, and templates for documenting one's educational activities and providing associated evidence. The University of Michigan Medical School's Web site illustrated both guidelines for portfolio inclusions and examples of portfolios.32 An example of promotion guidelines specifying inclusion of education-focused materials was found in Section VII of the University of Minnesota Medical School—Twin Cities’ dossier site.33 In that section of the dossier, candidates were instructed to prepare a one- to two-page summary of their teaching-related activities and associated evaluations. The guidelines described items to be included, ranging from listings of lectures or courses taught to descriptions of mentoring activities.

Some medical schools provided general information about teaching portfolios. For example, the Web site for the University of Medicine and Dentistry of New Jersey's Robert Wood Johnson Medical School34 included information on the rationale and concept of the teaching portfolio, on describing one's teaching philosophy, and on how to create a teaching portfolio. Similarly, the University at Buffalo, State University of New York School of Medicine and Biomedical Sciences’ Center for Teaching and Learning Resources35 had created a well-referenced site on faculty portfolios that included sample portfolios from faculty at other universities and in disciplines beyond medicine, along with extensive references and resource guidelines.

Phase 2: The Ten-Year Follow-Up on Early Portfolio Adopters

Phase 2 centered on the medical schools that had reported portfolio use in the early 1990s.5 Of the 17 schools interviewed in 1992, 16 agreed to participate in our study and to make accessible or provide supplemental information regarding their policies; some included a sample portfolio. Five of the schools were private,30 and three were ranked among the top ten for NIH awards in fiscal year 2002.31

Fourteen of the interviewees were either chairs of their medical school's promotion and tenure (P&T) committee (n = 3) and/or administrative officers of their medical school (n = 14) responsible for academic or faculty development. Ten of the interviewees were members of the P&T committee or were responsible for the decanal reviews.

All 16 institutions continued to have a “system in place for faculty members to document and provide evidence of their contributions for education for academic promotion.”5 Although one respondent was initially unsure whether an actual system was in place, clarification established that the institution met the baseline criterion for inclusion. Almost one-third of the respondents (five of 16) indicated that their system was undergoing change, with either minor modification (e.g., updated standards) or major reform (e.g., a relative value unit-based evaluation system; revamping the approach to expand the types of inclusions and/or evidence accepted).

Documentation system inclusions.

The names used to describe the teaching portfolios at each of the schools have remained unchanged since the 1992 study. These names included Educator's Portfolio (Teacher's); Dossier (Teaching, Promotion); and Promotion Packet, Promotion Folder, and P&T Notebook. Two institutions used no terminology because the information was incorporated into the curriculum vitae. The types of inclusions within the various documents were consistent across institutions and included the ten categories elucidated by Simpson et al.5 (see Table 1). Almost all (14 of 16) of the schools included six of the ten categories (philosophy/personal statement, curriculum development/evaluation, teaching, advising/mentoring, dissemination, and honors/awards) in their documentation systems (see Table 1).

Table 1

The “other” category reflected that one school listed “citizenship” as a separate inclusion category. Whereas others had discussed citizenship as part of the educational leadership/administration category, this respondent explained why it was a separate category by stating, “Effective educators must role model good citizenship. ... They must play well with others in order to effect change as a leader, as a committee member, as curriculum innovator or as a teacher.”

Types of evidence and methods associated with evaluating inclusions by category.

An analysis of the types of evidence used within each category to document quality or educational outcomes (see Table 1) revealed an array of data sources and types of information. However, unlike the consistency seen in the content inclusion categories, these results revealed limited consensus on the types of evidence to be used, with only learner evaluations and peer ratings of teaching used by 12 or more schools. Only four (25%) of the schools reported using learner or peer data for any other content inclusion category. Examples of outcomes-based evidence (e.g., learner-performance data, awards of graduates, residency match results, and committee actions) were explicit parts of the documentation system at only three schools. Several categories provided contextual information about a person's development as an educator (e.g., philosophy/personal statement, continuing education, and long-term goals). When evaluated, these categories were judged by evidence provided in the form of outcomes or peer or external review. However, as one respondent stated, “Every type of contribution can be ‘graded’ using comments and actual learner evaluations.”

When asked how the documentation was evaluated in general, the majority of the respondents indicated that they relied on the judgment of the review committee: “We know what we want to look for. ...but it is not really codified. ...as we have a broad outline of what to expect in promotion.” Variables associated with that judgment paralleled those for scholarship, the foundation for traditional P&T decisions. For example, one participant responded, “We mostly consider if faculty members are moving the field forward, whatever the field is.” Further, creativity, development, and dissemination of transferable products were key elements identified by respondents as being associated with “moving the field forward.” As one respondent put it, “What defines a university is the development of products that can be shared. ...”

Evidence of “excellence” was explicitly identified by six (fewer than half) of the respondents. “Excellence,” as one respondent quoted from their institution's guidelines, “relates to the quality of performance or product of sufficient quantity to be recognized as an appropriate accomplishment,” or, “It must make an impact.” Five respondents explicitly cited reputation in the field as an important variable, including regional to international recognition for educational work.

Education as scholarship.

Education was considered a form of scholarship by 15 of the 16 respondents. The lone dissenter stated that at his/her institution, a “scholarly approach is missing. ... We do things backwards in education.” Only five respondents (fewer than one-third) were familiar with the publications redefining scholarship in higher education, most notably those associated with Boyer19 or Glassick et al.18 A selection of comments reflected the range of these five respondents’ views regarding the evaluation of submitted documents based on the criteria for scholarship. One person said, “We talked about Boyer, but it had no real impact on our system. We gave up defining scholarship because it was eating up so much time and we could not get consensus. We have just been going ahead with the art and the ‘we know it when we see it’ approach.” Another reported using Boyer's19 work as a model, but explained that it failed because “the model was forced on us and we didn't know what it meant. What we've now done is pulled the best parts and used them in our document.”

Using Boyer's19 expanded concept that “teaching is scholarship,” the 16 respondents were asked if teaching itself was considered a form of scholarship in their institutions: nine said “no,” five said “yes,” and two were undecided. These responses typify the diversity of opinion on this issue. As one respondent stated, “The creative work of a course director would be considered scholarship, but not the act of teaching.” Another offered, “We are not going to promote people because they were doing teaching. ... A great, great teacher would not be promotable. ... We would want to see disseminated scholarship—papers, book chapters, teaching tools that are accepted by other schools.” In contrast, another stated, “Teaching could be scholarship if it could meet the criteria of achieving recognition for the candidate beyond (our institution's) boundaries.”

A separate set of interview questions asked the respondents to rate a series of items related to the status of the documentation system at their institution. Respondents uniformly felt that the P&T committee had a high level of commitment to the documentation system that was in place (mean = 1.2, where 1 = high, 2 = moderate, 3 = somewhat, 4 = poor). Clarifying responses included, “The complete dossier is required for any personnel action without exception,” and “The dossier is the crucial document. It's the bible by which we make decisions.”

When queried, using the same scale, about the impact or influence of the evidence of educational activities in the promotion decision, responses were slightly lower (mean = 1.8). However, poor teaching did have an impact on promotion decisions: “We have had a couple of people fail this year who were very productive clinicians. ... doing whatever research they were doing [but they] had poor teaching evaluations and they failed. They [the P&T Committee] are definitively taking it seriously.” “The worst thing that can happen [for a P&T Committee] is to not have data,” said one respondent, “The second worst is to have a disorganized dossier.”

The perceived effectiveness of each institution's education documentation system had the broadest range of responses. Four respondents selected each of the following: “unable to judge/having no opinion,” “low,” “moderate,” and “high.” A typical response from the “unable to judge” or “low” rating respondents was, “We have clarified the issues and the educational piece is in place. But it is not used.” When asked about the value of an institution's system of documentation relative to other approaches of evaluation for educational activities, perceptions were again wide-ranging: six institutional representatives were unable to judge or had no opinion. Typical of this ambivalence was the statement, “It's not the best, but it's where we are. ...” Or it is “the document we use. ... it's not been a howling success with faculty [who describe it] as not user friendly. ... It has not helped people focus on true scholarship. ... It's a bean-counting document: how much, how many courses, students. ...”

The final protocol question asked the respondents to identify any key features they would like to change. Two focal areas for change emerged: six respondents identified increasing the kinds of inclusions and evidence, and five identified clarifying and consistently using the system. The respondents’ own comments best highlight the range of responses and the continuing evolution of the documentation systems. “Personally, I would like to ‘blow it up and redo it’,” said one, adding: “We are struggling with what is substantive educational leadership; what contributions really count.” Another said, “[We are] struggling with interdisciplinary teaching – when they teach outside of their department in interdisciplinary courses. ... or the value of grand rounds in other specialties.” A more moderate approach, one respondent suggested, would be to “develop a mechanism for showcasing creative work as ... education when the product is not a presentation or publication.” Several respondents indicated they were actively considering establishing an academy, such as those at the University of California, San Francisco (UCSF), or Harvard Medical School: “Once that is in place,” said one, “there will be pressure to have the promotion and tenure committee look at (academy selections as) evidence for educational quality.”

Discussion

In 1990, the AAMC reported that only five medical schools used “educational dossiers” for documentation of educational activities and accomplishments in promotion decisions.3 By 1992, 17 medical schools responding to a national survey indicated that they used portfolios.5 In both of these reports, educational portfolios were defined as an approach to documenting educational accomplishments that included at least three different types of educational activities as part of the academic promotion documentation system.5 Based on the results of our Web review (Phase 1), 76 medical schools now offer Web-based access to information on documenting educational activities for promotion that include at least three categories of educational activities, an increase of more than 400% in just over ten years. This result probably underrepresents actual use because some schools have secured Web sites that cannot be searched and others may have portfolios that are not posted on their Web site.

In Phase 2, we found the categories of documentation used in 2003 were consistent with the 1992 results from 17 early-adopting schools.5 In 1992, 13 (81%) of respondents included curriculum development, teaching, advising/mentoring, educational administration/leadership, and dissemination as core elements of their systems. In 2003, all of the 16 schools responding included curriculum development/evaluation and teaching, 14 included advising/mentoring and honors/awards, and 13 included educational leadership/administration.

Dissemination of work to colleagues is now a key inclusion at 15 of the 16 schools. The types of transferable objects have expanded beyond traditional publications and presentations to include CDs/DVDs, course syllabi, Web sites, teaching strategies and innovations, and other products emerging from educational activities. The dissemination criteria are consonant with Glassick et al.'s18 criterion of effective presentation and Beattie's36 application of it to education. Teaching scholarship is, Beattie writes, “incomplete unless communication to peers and other scholars occurs in addition to presentation to the usual audience of students, colleagues or the public.”

Two categories exhibited a major change from 1992 to 2003. Honors and awards were included by just over half of the schools in 1992 but had become a common inclusion (14 of 16 schools) among the schools we surveyed. This increase is consistent with Atasoylu et al.'s37 report that teaching awards were the most highly rated performance measure for promotion of clinician-educators by department of medicine chairs and promotion committee chairs in terms of their importance and quality of information. A philosophy or personal statement, reported by fewer than half of the schools in 1992, was included by 14 schools in our study. However, although the Accreditation Council for Graduate Medical Education38 and other accrediting bodies place growing emphasis on the assessment of learners' performance, only nine of the 16 schools we interviewed reported including information about learning outcomes. In fact, the primary evidence reported in the early-adopter schools’ portfolios was learner and peer ratings of teaching. The dominance of such ratings may be attributable to the relative ease of obtaining rating data, particularly with the advent of online evaluation systems.

Increasing the use of outcome data is more challenging. Cooke et al.23 report that the description of the evidence used in support of applications to the UCSF, School of Medicine's Academy of Medical Educators has changed to include the following types of outcome evidence: a list of educational materials adopted by other institutions, statements from advisees of career effects, and comments from course or clerkship faculty and committee chairs on amount and quality of the faculty member's contributions. Other outcome data that might be included are results of classroom tests, standardized examinations, clinical scores, or alumni surveys. Given the many faculty who contribute to students’ learning outcomes in a single course or clerkship, such data may be most appropriate for those faculty members directing programs, with selected data sets available to those teaching in them. For example, where major contributions are being made, individual faculty members may be able to identify subsets of examinations or performance ratings to which they feel they have directly contributed.

Concurrently, assessing outcomes can also move beyond the focus on individual instructors to a focus on larger units by encouraging faculty to describe their contribution to, and include evidence from, a department or course as an aggregated unit.39 For example, evidence may include the relative rank of a course or rotation in which the faculty member teaches compared with ratings of other courses or rotations, the number of students selecting the faculty member's medical specialty or discipline, residency match data, length of accreditation for their residency or graduate program, and/or academic appointments of fellows or graduate students. By moving beyond the individual as the only unit of evidence, a faculty member can include evidence related to the regional or national stature of the program to which they have contributed. This approach would address the need for dissemination as a criterion for scholarship, now required at 15 of the 16 early-adopter schools, by providing faculty whose primary roles are internal to the institution (e.g., teaching, patient care)40 with a broader array of evidence to document regional and national recognition.

Implications

U.S. medical schools have made progress in addressing the challenge of translating education into the type of scholarly activity associated with academic rewards and promotions. The vast majority of schools have now included at least three education-related categories within their academic promotion documents. The early-adopter institutions have recognized the need to continually enhance their documentation systems for education, however, as evidenced by the fact that more than half of the interviewees reported ambivalence about the effectiveness of their current documentation system and identified areas needing improvement. Given the evidence from Phase 1 that systems for documenting educational accomplishments exist at many medical schools, it is disturbing that recent studies in internal medicine reveal that faculty members, department chairs, and P&T committee chairs have differing perceptions of the relative value of educational contributions to promotion.37,41 Clear communication of relative value is needed at departmental and institutional levels if faculty members are to believe, and act as if they believe, that teaching “counts.”

A publication similar to that of Boyer19 or Glassick et al.,18 but focused on the documentation and evaluation of teaching for promotion in medical schools, could assist more schools in creating or improving their use of educational portfolios. The results from our early-adopter schools may serve as the basis for such a document, given the consensus among these schools regarding the types of educational activities appropriate for P&T documentation and some experience regarding the types of evidence and standards used to assess teaching, advising, educational leadership, and curriculum development activities. We believe this publication should emerge from a consensus conference of leaders in academic medicine and measurement experts charged with determining “what counts” in education. More specifically, participants at such a conference could articulate how evidence other than learner and peer ratings of teaching might be appropriately used as measures of quality for individual faculty members in a team-structured educational system and could recommend whether reputation beyond one's own institution is a reasonable criterion for evaluating educational accomplishments.40

Clearly, it is now time to build on what is known about how to document educational accomplishments and to establish standards for evaluating these accomplishments for purposes of promotion and tenure so that individual faculty members and institutions are better prepared to understand, value, publicly articulate, and reward educational scholarship.

This paper was initially conceptualized as part of the AAMC's Group on Educational Affairs Educational Scholarship Project.

References

1.Seldin P. The Teaching Portfolio: A Practical Guide to Improved Performance and Promotion/Tenure Decisions. 2nd ed. Bolton, MA: Anker Publishing Company, 1997.

2.Cannon R. Broadening the context for teaching evaluation. In: Knapper C, Cranton P (eds). Fresh Approaches to the Evaluation of Teaching. New Directions for Teaching and Learning. San Francisco: Jossey-Bass, 2001:87–97.

3.Swanson AG. ACME-TRI Report: Educating Medical Students. Washington, DC: Association of American Medical Colleges, 1992.

4.Rothman AI, Poldre P, Cohen R. Evaluating clinical teachers for promotion. Acad Med. 1989;64:774–5.

5.Simpson D, Morzinski J, Beecher A, Lindemann J. Meeting the challenge to document teaching accomplishments: the educator's portfolio. Teach Learn Med. 1994;6:203–6.

6.Heestand D. Recognizing educational activities: a search for identity by one school. Basic Sci Educ. 1997;7(1-2):14–5.

7.Miller ME. Troubled times for medical school teachers. Johns Hopkins Med News. 2000;Winter:26-31.

8.Lindemann JC, Beecher AC, Morzinski JA, Simpson DE. Translating family medicine's educational expertise into academic success. Fam Med. 1995;27:306–9.

9.MacDonald SM. Professional Development Guide for the Faculty of the Johns Hopkins University School of Medicine: A Practical Guide for Faculty Professional Development and the Process Involved in Faculty Academic Advancement. Baltimore: Johns Hopkins University School of Medicine, 2001.

10.Hafler JP, Lovejoy FH. Scholarly activities recorded in the portfolios of teacher-clinician faculty. Acad Med. 2000;75:649–52.

11.Association for Surgical Education, Southern Illinois University School of Medicine. Education Clearinghouse: Faculty Development Committee Teaching Dossier, 1989 〈http://www.surgicaleducation.com/educlear/index.htm#table4〉. Accessed 25 July 2003.

12.Carroll RG. Professional development: a guide to the educator's portfolio. Am J Physiol. 1996;271(Part 2):S10–S13.

13.Collins J, Smith WL. Promotion based on teaching efforts requires ongoing documentation of scholarship teaching activities. Acad Radiol. 2001;8:771–6.

14.Kevorkian CF, Rintala DH, Hart KA. Evaluation and promotion of the clinician-educator: the faculty viewpoint. Am J Phys Med Rehabil. 2001;80:47–55.

15.Sachdeva AK, Cohen R, Dayton MT, et al. A new model for recognizing and rewarding the educational accomplishments of surgery faculty. Acad Med. 1999;74:1278–87.

16.Sherertz EF. Criteria of the “educators’ pyramid” fulfilled by medical school faculty promoted on a teaching pathway. Acad Med. 2000;75:954–6.

17.Simpson DE, Fincher RM. Making a case for the teaching scholar. Acad Med. 1999;74:1296–9.

18.Glassick CE, Huber MT, Maeroff GI. Scholarship Assessed: Evaluation of the Professoriate. San Francisco: Jossey-Bass, 1997.

19.Boyer EL. Scholarship Reconsidered: Priorities of the Professoriate. Princeton: The Carnegie Foundation for the Advancement of Teaching, 1990.

20.Fincher RE, Simpson DE, Mennin SP, et al. Scholarship in teaching: an imperative for the 21st century. Acad Med. 2000;75:887–94.

21.Shulman L. The scholarship of teaching. Change. 1999;31(5):11.

22.Richards BF, Moran BJ, Friedland JA, Kirkland RT, Searle NS, Coburn M. A criterion-based, peer review process for assessing the scholarship of educational leadership. Acad Med. 2002;77(suppl 10):S7–S9.

23.Cooke M, Irby DM, Debas HT. The UCSF Academy of Medical Educators. Acad Med. 2003;78:666–72.

24.Thibault GE, Neill JM, Lowenstein DH. The Academy at Harvard Medical School: Nurturing Teaching and Stimulating Innovation. Acad Med. 2003;78:673–81.

25.Association of American Medical Colleges. Medical Schools of the United States and Canada 〈http://www.aamc.org/members/listings/msalphakm.htm#M〉. Accessed 13 November 2002.

26.Seldin P. How Administrators Can Improve Teaching. San Francisco: Jossey-Bass, 1990.

27.Edgerton R, Hutchings P, Quinlan K. The Teaching Portfolio: Capturing the Scholarship in Teaching. Washington, DC: American Association for Higher Education, 1992.

28.Hebert EA. The Power of Portfolios. San Francisco: Jossey-Bass, 2001.

29.Miles MB, Huberman AM. Qualitative Data Analysis: An Expanded Sourcebook. 2nd ed. Thousand Oaks, CA: SAGE Publications, 1994.

30.Association of American Medical Colleges. Table 2: Distribution of U.S. Medical School Faculty by School 〈http://www.aamc.org/data/facultyroster/usmsf00/00table2.pdf〉. Accessed 25 July 2003.

31.National Institutes of Health. Awards to Medical Schools by Rank, FY 2002 〈http://grants2.nih.gov/grants/award/rank/medttl02.htm〉. Accessed 25 July 2003.

32.University of Michigan Medical School. Guidelines for Educator's Portfolio 〈http://www.med.umich.edu/medschool/faculty/portfolio/〉. Accessed 25 July 2003.


34.University of Medicine and Dentistry of New Jersey, Center for Teaching Excellence. Career Development/Teaching Portfolios 〈http://www.umdnj.edu/meg/career_portfolios.htm〉. Accessed 25 July 2003.

35.University at Buffalo, State University of New York, Center for Teaching and Learning Resources. Faculty Portfolios 〈http://wings.buffalo.edu/vpaa/ctlr/files/faculty_portfolios.htm#sample_portfolios〉. Accessed 25 July 2003.

36.Beattie DS. Expanding the view of scholarship: introduction. Acad Med. 2000;75:871–6.

37.Atasoylu AA, Wright SM, Beasley BW, et al. Promotion Criteria for Clinician-educators. J Gen Intern Med. 2003;18:711–6.

38.Accreditation Council for Graduate Medical Education. Outcome Project 〈http://www.acgme.org/outcome/〉. Accessed 25 July 2003.

39.Fenwick TJ. Using student outcomes to evaluate teaching: a cautious exploration. In: Knapper C, Cranton P (eds). Fresh Approaches to the Evaluation of Teaching. New Directions for Teaching and Learning. San Francisco: Jossey-Bass, 2001:63–74.

40.Levinson W, Rubenstein A. Integrating clinician-educators into academic medical centers: challenges and potential solutions. Acad Med. 2000;75:906–12.

41.Beasley BW, Wright SM. Looking forward to promotion: characteristics of participants in the prospective study of promotion in academia. J Gen Intern Med. 2003;18:705–10.

© 2004 Association of American Medical Colleges
