INSTITUTIONAL ISSUES: ARTICLES

Collaboration and Peer Review in Medical Schools' Strategic Planning

Bonazza, Joan MSc; Farrell, Philip M. MD, PhD; Albanese, Mark PhD; Kindig, David MD, PhD


Abstract

The benefits of long-range strategic planning in organizational development have been clearly demonstrated. However, in the intermingled medical education and health care environment—which is in the midst of unparalleled change—achieving effective planning of this type is no small challenge. Nonetheless, many schools have developed strategic plans during the past decade, and the process has been reasonably well accepted by both medical school faculty and leadership.

How to use the strategic plan in the allocation of scarce resources has proved to be more challenging,1 even though the increasing financial constraints on medical schools make the strategic allocation of resources in an efficient and effective manner essential to survival. Faculty who are satisfied with the process that is used to develop a strategic plan may become uncooperative and even hostile to both the process and the product when the plan is used to reallocate resources away from them or to reward others. Achieving “buy-in” for the allocation of resources according to designated strategic priorities can be critical to the success of the strategic planning process as well as the medical school itself.

This report describes the collaborative process used by the University of Wisconsin Medical School throughout the fall of 1997 and spring of 1998 to develop its 1998–2000 strategic plan. The unique approach devised for this purpose culminated in using a peer-review process to select a limited number of program priorities based upon a model of proposal solicitation and peer review similar to that of the National Institutes of Health (NIH), using original criteria predetermined in a collaborative fashion. The results are described along with the statistical methods used to analyze reviewers' evaluations. We hope that the criteria and methods employed by our school are helpful to other medical colleges in achieving a combination of strategic priorities and mission-aligned management.

BACKGROUND

An effective strategic plan establishes priorities and provides direction for school-wide management and resource allocation, serves as a major catalyst for goal-oriented change, and should enlist broad faculty support for implementation within specified time frames. It has become increasingly clear that the process of strategic planning can be as important as the product—both for successful implementation of the school's plan and for promoting stable leadership. A successful plan provides the medical school leadership with a mandate to implement school-wide measures that address both potential threats and promising opportunities for the school, and has been recognized as a factor in securing faculty allegiance and authority to manage.2 A coherent, cohesive strategic plan also serves as a valuable communication tool that a dean can use to inform the faculty and others about where the school is going, i.e., what strategic directions are being pursued as part of an overarching vision.3

As the University of Wisconsin Medical School neared the end date of its first strategic plan period (1995–1997), it had accomplished over 80% of the specified goals, but faced many challenges. These included integration issues with the recently established University of Wisconsin Medical Foundation (integrated/grouped faculty practice plan) and the University of Wisconsin Hospital and Clinics Authority, an upcoming merger with a 227-member physician group of private practitioners, the impact of changes in clinical practice on the academic mission, a perceived need to achieve better integration and collaboration among basic science and clinical departments, and a major fund-raising effort for facilities development. Our experience in developing and implementing a previous strategic plan gave us insight into how the process needed to be improved. A major weakness in the 1995–1997 strategic plan was that program priorities were not specified in sufficient detail, resulting in ambiguities in how to implement those priorities and complicating assessment of the extent to which they were achieved.

In the fall of 1997, faced with limited resources and budget constraints, the school was compelled to evaluate its mission and vision and determine how best to invest its fiscal, human, and physical resources in a new strategic plan. School leaders believed it was imperative that the next strategic plan articulate a clear vision, a sharper focus, and a limited number (approximately five) of strategic program areas targeted for priority resource support over the next three-year period. It was clear that to achieve this, a high degree of faculty participation in and approval of not only the development of each part of the plan but the process itself would be important. Consensus-building processes are especially valuable at the University of Wisconsin, where faculty governance has been a century-long tradition.4

OUR APPROACH

Our approach was developed after critically assessing recommendations from health care organization analysts regarding the stages and structure of strategic management.1,5 We also followed recommendations of leading business experts on steps toward creating organizational transformation8 and enhancing the effectiveness of strategic planning efforts.6 The planning process was led by our Strategic Plan Steering Committee, composed of the dean (chair), eight department chairs (four elected by the 12-member Basic Science Chairs Caucus and four elected by the 13-member Clinical Chairs Council), and four associate deans. The dean charged the steering committee to develop a three-year strategic plan that would build upon the successes of the 1995–1997 strategic plan, articulate a new vision for the medical school, and identify academic programs to be designated strategic priorities as a guide for resource allocation.

The steering committee initiated the process in the fall of 1997, meeting approximately every two weeks from October through December. From the start, the process was envisioned as two components proceeding concurrently: one developing the plan's broader goals and objectives, the other focusing on strategic program priorities. The committee provided direction and functioned as a "clearinghouse" at various stages of the process, and adopted a five-stage approach: (1) review and revision of the school's mission and vision statements; (2) analysis of the institutional environment; (3) development of goals and objectives; (4) development of criteria and guidelines for competitive proposal evaluation; and (5) implementation of a competitive process for selecting strategic program priorities. To achieve faculty acceptance of the new plan, the steering committee designed a strategy that would involve a high degree of creative input at each step of the planning process. Collaboration and review by faculty provided valuable suggestions, particularly in the design and implementation of a competitive process to determine the school's program priorities for the next three years.

Review of the Mission and Vision Statements

The steering committee shortened the mission statement from the 1995–1997 plan and developed a proposed vision statement for 1998–2000. These statements were refined throughout the process in response to comments and suggestions from faculty and staff. Ultimately, a new and more ambitious vision statement was crafted: “The UW Medical School will be one of the nation's preeminent medical schools by excelling in the creation, integration, and transfer of knowledge through a combination of basic, translational, and clinical research; a greater emphasis on active learning; and consistently outstanding patient care.”

Analysis of Strengths, Weaknesses, Opportunities, and Threats

The second stage was an assessment of the environment using a school-wide SWOT (strengths, weaknesses, opportunities, threats) analysis, in which all faculty were asked to identify the school's most impressive strengths, greatest weaknesses, most attractive opportunities, and most serious threats in the current environment. Because faculty are accustomed to receiving electronic communications from the dean, the analysis was conducted and responses were collected via campus e-mail. Faculty were encouraged to respond either individually or collectively as a unit, department, or center. At least one response was received from each of the major medical school sub-units. Responses identified 103 strengths, 98 weaknesses, 76 opportunities, and 83 threats. The results of this survey were reviewed by the steering committee and by the faculty at large at monthly faculty meetings and through electronic distribution. All faculty, staff, and administrators were given an opportunity to add to the SWOT listings.

Development of Goals and Objectives

The goals and objectives (GO) committee was appointed by the dean in December 1997 to analyze the SWOT results, review recommendations for the previous strategic plan, and develop revised goals with clear and measurable objectives for the 1998–2000 strategic plan. This committee consisted of a total of 12 faculty and academic staff (including two associate deans) with balanced representation from the basic science and clinical departments of the school. To provide continuity, this committee had members who also served on the steering committee. The committee identified six broad areas for inclusion in the plan (diversity, clinical issues, central administrative issues, research, education, and statewide connections). The GO committee worked in six subgroups through the winter to develop proposed goals and measurable objectives for each of the six areas. The report from the GO committee was presented to the strategic plan steering committee at a meeting in March 1998, where the recommended goals and objectives were further refined. In April 1998 draft goals and objectives were distributed to all faculty, providing an opportunity for review, comments, and suggestions. The draft was also reviewed by the Medical School Academic Planning Council (which participates in governance and advises the dean on all aspects of school management, including academic policy, budget, and strategic directions). Responses from all groups were compiled and presented to the steering committee for synthesis. Subsequently, the now-well-refined goals and objectives were discussed and further revised at the school's leadership retreat in late April 1998.

Development of Criteria

The dean charged the associate dean for research to lead the development of the criteria and guidelines to evaluate and rank potential program priorities. In October 1997 the associate dean for research presented draft criteria and guidelines to the steering committee. The steering committee requested that an ad-hoc group be formed with broad representation from throughout the school to refine and further develop the proposed criteria. The steering committee was convinced that if enough broad-based efforts were directed toward "battling it out" at this stage and if clear and thorough criteria were developed, the selection of program priorities would become manageable in the designated time period.

Approximately 20 faculty members, center directors, and academic staff participated on this committee. The committee members represented a balance of basic science and clinical faculty, tenure track and clinical health sciences (CHS) track faculty, and individuals from departments not represented on the steering committee. The group met over the course of two months to revise criteria and guidelines. The criteria were comprehensive and included all aspects of the mission and vision, with special efforts made to assess educational value. In an effort to better incorporate educational concerns into the document, the medical school's educational policy council was asked to add criteria to define quality in medical students' education. The final version of the criteria was presented to the steering committee in November 1997 and accepted after minor revisions. (See Appendix A for the "Proposed Criteria and Guidelines for Ranking Medical School Program Priority Areas.")

Creation of the Peer-review Process

To ensure broad-based faculty support for both the process and the results, the steering committee sought to create a process that would generate and rank by priority a slate of strategic program proposals “from the bottom up” and encourage interdisciplinary collaboration among faculty. The committee believed that an NIH-type model of proposal solicitation and peer review would be successful because it would tap into a process that faculty value and are familiar with.

Before soliciting these proposals from all faculty, the dean shared the committee's proposed process and the criteria to rank program proposals with department chairs and center directors in late November and early December 1997. The chairs and directors were encouraged to review the material and solicit comments from their faculty. The process for soliciting proposals was designed to have two stages. The first was a call for abstracts (or letters of intent). Second, after reviewing the abstracts, a limited number of proposers would be invited to submit more extensive proposals to be considered in an NIH-type review format.

In December, a request for proposal (RFP) abstracts was sent to all faculty both electronically and through campus mail. The request letter outlined the purpose and rationale for the school's strategic planning process and noted that future fund-raising directions would be along the lines of the strategic priorities identified through that process. It also outlined the proposal submission and review process and timeline, presented the school's mission and vision statements, and stated the criteria to be used in ranking the strategic program priorities. The deadline for submitting proposal abstracts was approximately three weeks after the RFP was sent out.

In response to the requests sent to approximately 800 faculty, 40 proposal abstracts were received. (See Appendix B for one of these abstracts.) The steering committee conducted a preliminary review and made suggestions to investigators regarding natural collaborations that could enhance their proposals. For instance, ten neuroscience proposals with overlapping features were essentially amalgamated into one very strong proposal following a meeting of the ten faculty members who had submitted those abstracts. The steering committee also held meetings with all faculty who submitted abstracts to provide additional advice and information and to further encourage collaborative efforts in this exercise. Approximately two months were allowed for the investigators to develop and submit full proposals. Ultimately, ten full proposals were received that reflected a broad range of disciplines within the school and featured interdepartmental teaching and research thrusts focused on unifying themes.

Review Methods

Ten members of the strategic plan steering committee functioned as a modified NIH-type study section to review the ten proposals. It was decided in advance to select the top five proposals. The dean, associate dean for research, and associate dean for administration participated in discussions and assessment of the proposals but did not rate or rank any to avoid any concern that the administration unduly influenced the process. Full proposals were rated and ranked in accordance with the criteria approved by the steering committee and accepted by the faculty. Steering committee members were assigned as primary and secondary reviewers and asked to rate each proposal on a scale of 1–5 “as you would as an NIH reviewer,” using the NIH Adjectival Scale for Assignment of Priority Scores7: 1.0–1.5 = outstanding, 1.5–2.0 = excellent, 2.0–2.5 = very good, 2.5–3.5 = good, 3.5–5.0 = acceptable. After that, they were asked to rank the ten proposals, with 1 = most meritorious. The two assessment methods were used to determine whether one or the other was better able to achieve interrater agreement and discriminate more effectively among the proposals. After “blinding” the identity of the proposals and reviewers, the ratings and rankings were assessed for their interrater reliability and then submitted to an analysis-of-variance procedure to determine whether there was a clear statistical basis for discriminating among the various proposals. For purposes of comparing ratings across the ten proposals, Bonferroni post-hoc procedures were used.
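For readers who want to reproduce this kind of analysis, the following is a minimal sketch in Python of the two procedures named above: the two-way analysis of variance on the ratings and the Bonferroni post-hoc comparisons among proposals. The file name, column names, and data layout are hypothetical assumptions; the school's 1998 data are not reproduced here.

    # Sketch only: assumes a long-format table with one row per (rater, proposal)
    # pair. "ratings.csv" and its column names are hypothetical.
    from itertools import combinations

    import pandas as pd
    from scipy import stats
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    ratings = pd.read_csv("ratings.csv")  # columns: rater, proposal, rating

    # Two-way ANOVA without replication: one rating per rater-proposal cell.
    model = ols("rating ~ C(rater) + C(proposal)", data=ratings).fit()
    anova = sm.stats.anova_lm(model, typ=2)
    print(anova)

    # Share of score variability attributable to each effect.
    print(anova["sum_sq"] / anova["sum_sq"].sum())

    # Bonferroni post-hoc comparisons among proposals: pairwise t-tests paired
    # on rater, with the significance level divided by the number of comparisons.
    wide = ratings.pivot(index="rater", columns="proposal", values="rating")
    pairs = list(combinations(wide.columns, 2))
    alpha = 0.05 / len(pairs)
    for a, b in pairs:
        t, p = stats.ttest_rel(wide[a], wide[b])
        if p < alpha:
            print(f"proposal {a} vs {b}: t = {t:.2f}, significant after Bonferroni")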

RESULTS OF THE REVIEW

An intraclass correlation estimate of the reliability of a single rater was 0.82, improving to 0.90 for the mean of two raters. These data suggest that the ratings of the projects met most criteria for being considered good to excellent. Table 1 shows the correlation between the ratings and rankings for each rater. The values ranged from 0.69 to 0.98, with a median of 0.95. Because analyses of the rating and ranking data produced relatively similar results, only results from the ratings are reported. (One clinical chair could not attend the meeting at which proposals were reviewed and rankings and ratings were submitted. However, he had fully participated in the development of the criteria and had reviewed all proposals under consideration. He was allowed to submit a rank-order list of program priorities shortly after the meeting. His rankings were not significantly different from those of the rest of the group.)
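The improvement from 0.82 to 0.90 is consistent with the Spearman-Brown prophecy formula for the reliability of the mean of m raters:

    ρ_m = m ρ_1 / [1 + (m − 1) ρ_1], so ρ_2 = 2(0.82) / (1 + 0.82) ≈ 0.90.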

Table 1: Correlations between Ratings and Rankings of Ten Strategic Program Proposals by Ten Raters, University of Wisconsin Medical School, 1998*

Table 2 shows the analysis of variance for the ratings. The test of rater effects is statistically significant (F = 2.54, R2 = .043, p < .017), showing that the raters differed in the mean ratings they assigned, but differences in rater stringency accounted for less than 5% of the score variability. The difference due to projects was substantially greater, yielding an F value of 42.31, R2 = .805, and a p value beyond .0001. These results demonstrate that the mean ratings effectively discriminated among the projects, accounting for 80.5% of the score variability.
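Read this way, each R2 in Table 2 corresponds to the classical eta-squared effect size, the effect's sum of squares expressed as a share of the total sum of squares:

    R2(effect) = SS(effect) / SS(total), giving R2 = .043 for raters and R2 = .805 for projects.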

Table 2: Analysis of Variance for the Ratings of Ten Strategic Program Proposals by Ten Raters, University of Wisconsin Medical School, 1998*

Figure 1 shows 95% confidence intervals for the individual raters. Rater "I" was a relative outlier, with confidence intervals overlapping those of only four of the other eight raters; the confidence intervals of all the other eight raters overlapped with one another. The results point to modest problems with rater stringency: clearly, rater "I" was a much more lenient rater than the other eight. Figure 2 shows 95% confidence intervals for the projects, ordered from the most highly rated (small numbers) to the least highly rated (large numbers). The proposals fell into three clear sets, each rated similarly within the set but differently from the rest. Especially strong discrimination occurred for the top three programs: (1) cancer, (2) cardiovascular and respiratory sciences, and (3) neuroscience. A trio of interrelated proposals on women's health, aging, and population and community health formed the second group of programs. These three were clearly rated less favorably than the top three projects but were not distinguishable from one another, although aging slightly overlapped with neuroscience. The middle group of three was rated higher than the remaining four projects. The steering committee decided to select the top six proposals, rather than the five program priorities originally intended; the fact that the three "second-tier" programs overlapped synergistically and shared an emphasis on epidemiologic research encouraged this decision.
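The intervals in Figures 1 and 2 can be reconstructed along the following lines, again under the hypothetical data layout assumed in the earlier sketch: a t-based 95% confidence interval around each rater's mean rating and each project's mean rating.

    # Sketch only: t-based 95% confidence intervals like those in Figures 1-2.
    import pandas as pd
    from scipy import stats

    ratings = pd.read_csv("ratings.csv")  # columns: rater, proposal, rating

    def mean_ci(x, level=0.95):
        # Mean of one rater's (or one project's) ratings with its t-based interval.
        m, se = x.mean(), stats.sem(x)
        half = se * stats.t.ppf((1 + level) / 2, len(x) - 1)
        return pd.Series({"mean": m, "lo": m - half, "hi": m + half})

    by_rater = ratings.groupby("rater")["rating"].apply(mean_ci).unstack()
    by_project = ratings.groupby("proposal")["rating"].apply(mean_ci).unstack()
    print(by_rater)                        # Figure 1: non-overlap flags outlier raters
    print(by_project.sort_values("mean"))  # Figure 2: most to least highly rated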

Figure 1: The means of the ratings made by nine raters and the 95% confidence intervals for those raters. The ratings were of ten program proposals to judge the programs' strategic priorities. University of Wisconsin Medical School, 1998.
Figure 2: The mean ratings for each program proposal and the 95% confidence intervals for each program whose strategic priority was being rated, ordered from the most highly rated programs (lower numbers) to the least highly rated (higher numbers). University of Wisconsin Medical School, 1998.

To provide a broader perspective on the results, box plots of the ratings for the programs are shown in Figure 3. The amounts of variability differed among the programs: projects 7 and 8 had the greatest variability, suggesting less consensus on these projects than on the others.

Figure 3: Distribution of ratings for ten proposed programs according to strategic priority. The names of the programs are identified in Figure 2 by the same project numbers used in this figure.

The results of this process were presented to the steering committee, to the medical school academic planning council, to the medical school's leaders at their annual spring retreat, and to the medical school faculty at a scheduled meeting. While there were some surprises, the medical school leaders and faculty voiced support for the criteria, the review process (which was considered thorough and fair), and the results. The 1998–2000 strategic plan was approved by the medical school academic planning council and by the medical school faculty in June 1998.

REFLECTIONS ON THE PROCESS

Experience with strategic planning and management over five years at the University of Wisconsin Medical School has given us valuable insights about effective processes and products. We have reached a conviction that strategic thinking must prevail to establish realistic and inspiring visions for institutional development, directional vectors that advance the missions, and resource allocation plans that align school funds with priorities. It has also become clear that process is at least as important as product (the strategic plan) in academic cultures that feature shared governance. The highest objective in medical school strategic planning, we believe, is for the faculty to regard the product as “our strategic plan” and not “the dean's strategic plan.” This kind of attitude demonstrates the buy-in needed for successful implementation. It can best be achieved by expedited, collaborative planning with highly effective communication and leadership.

To develop an effective strategic plan, the dean, as the medical school's top manager or chief executive officer, must fully understand the desired product, support the process, and provide overall strong leadership.1,5 Furthermore, strategic planning efforts can be especially successful when the dean leads a collaborative process that engages the faculty in a meaningful way. Dean Robert Daugherty notes that the magnitude of change today “requires a more active and dynamic form of collaboration and leadership than has traditionally been a part of the medical school culture.”3 Indeed, he stresses that “today's culture demands more collaboration, with group processes leading to the desired outcomes.” One of the most significant challenges, then, is to develop a collaborative strategic planning process involving the dean that will resonate with the faculty and other administrative leaders, such as department chairs. A collegial approach proved ideal for us and created cohesiveness and synergism as we proceeded collaboratively.

Kotter's eight-stage process of creating major change8 proved to be particularly enlightening and applicable in structuring our planning process. We especially took advantage of his recommendations to: (1) establish a sense of urgency by assessing the internal and external environment; (2) create a guiding coalition; (3) develop a vision and a strategy; (4) use every vehicle to constantly communicate the vision of change; and (5) empower broad-based action by encouraging risk-taking and nontraditional ideas and actions. We considered and employed these external recommendations within the context of our university chancellor's model illustrating how chaos and alignment must coexist to “foster creativity and innovation” and “break through the boundaries of existing knowledge” as a research university moves forward into the future to achieve its mission and vision.9,10

As described earlier, we found it advantageous to use two methods for assessing the merits of the various proposals—ranking and rating. The ranking process ordered the ten proposals from most meritorious to least meritorious. The rating process we employed was modeled after methods traditionally used to review grants at the NIH. Each proposal was assigned a primary and a secondary reviewer; the primary reviewer was responsible for an in-depth analysis of the strengths and weaknesses of the proposal. Secondary reviewers added any insights they felt would be helpful. Raters then assigned a number from 1 to 5 patterned after the NIH adjectival rating scale.7 In a departure from the NIH review process, the raters were asked to rank-order the ten proposals after all the ratings had been completed. Consistent, previously endorsed criteria were used in both rating and ranking the proposals. These original criteria are provided in Appendix A for potential use by other medical schools. Other criteria for analyzing and judging academic programs are available elsewhere.11

An inherent advantage of the ranking approach is that it removes differences in rater stringency; the statistical analysis of the ratings produced a significant rater effect, demonstrating that the raters did differ in stringency. A weakness of the ranking approach is that it can trivialize large differences between proposals and magnify trivial ones. Any difference between two proposals, however small, still registers as at least a one-rank difference. Conversely, if there is a break-out difference in which one or more proposals are substantially better (or worse) than the others, that difference also registers as only a single rank step. On the other hand, a potential shortcoming of the methods we employed is that the large correlation between the rankings and ratings of the proposals in this study may simply reflect the procedures used. The proposals were first put through a lengthy rating process, after which they were ranked; if some other approach to ranking were adopted, different results might be obtained. Also, if there were substantially more than ten proposals, ranking each would be a more difficult task. Thus, the good consistency we observed between the ranking and rating procedures may not generalize to any great extent.
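A toy example (the numbers are invented purely for illustration) makes the compression point concrete: once ratings are converted to ranks, a trivial 0.1-point gap and a substantial 2.0-point gap each register as exactly one rank step.

    # Illustrative only: ranking discards the interval information in ratings.
    from scipy.stats import rankdata

    mean_ratings = [1.4, 1.5, 3.5, 3.6]  # hypothetical mean ratings, lower = better
    print(rankdata(mean_ratings))         # -> [1. 2. 3. 4.]
    # The 2.0-point break between the 2nd and 3rd proposals and the 0.1-point
    # gaps elsewhere all collapse into single rank steps.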

This rating and ranking exercise also produced an unexpected outcome by identifying significant weaknesses in what were viewed by many as being some of the stronger program areas in our medical school. For example, given the wealth of talent among our faculty in infection/immunity and genetics, we expected proposals submitted for these well-established program areas to score relatively well. Their low ranking reflected deficiencies and signaled a need for us to conduct a more thorough examination of these programs to ascertain more clearly what the problems are and how best to address them. This is currently under way and has revealed a need for development of more effective leadership mechanisms, better interdepartmental collaboration and integration, and more clearly organized academic priorities.

Although our school's 1995–1997 strategic plan enabled us, in the end, to achieve over 80% of our stated objectives, progress was hampered by vacancies and changes in key leadership positions and some level of uncertainty about who “owned” various parts of the plan. For the 1998–2000 plan, school leaders wanted to develop and employ mechanisms to track progress and create more accountability among school leaders and faculty for reaching performance targets. Strategic planning experts in the business community echo this desire to “plan for performance.”12 After the medical school faculty approved the 1998–2000 strategic plan, “goal managers” were assigned to each goal and objective. They were asked to develop specific, measurable action plans and to attach targeted timelines to each. The dean and associate deans reviewed these action plans, and discussions ensued with goal managers regarding feasible timelines, targets, and potential future resource allocations. It became clear that the dean's active leadership would need to continue throughout implementation of the plan. With strategic priorities and a management team in place, the next challenge was to develop resources to allocate during a period with budgetary constraints. Internal reallocations had been used to support the 1995–1997 strategic plan. A combination of sources is currently being used in association with the 1998–2000 strategic plan. This includes developing a new academic-funds flow model to better align the school's fiscal allocations with teaching, research, and priority programs.

We believe it is of critical importance to have the school's vision and its strategic program priorities endorsed by the faculty and well established before undertaking a mission-based budgeting exercise. Proceeding in this way helps keep the purpose of mission-based management clear and focused on the need to fund faculty initiatives as opposed to simply supporting the dean or alleviating the medical school's financial problems. In the absence of a school-wide strategic plan, attempts to develop a model to realign and reallocate funds towards achieving stated objectives may not be accepted as legitimate by the faculty. Taking the “high road” both in strategic planning and management and in mission-based budgeting, while frequently communicating the purpose and plan to faculty at large, has proved most successful for us.

As we have now entered the last year of our three-year strategic plan, a number of factors have influenced our decision to extend the current plan with modifications for another two years. Widespread acceptance by the faculty and enthusiastic support of academic leaders, such as department chairs, have created momentum to continue addressing the plan's priorities. After reviewing progress made toward the plan's goals, as well as considering the addition of any new goals not adequately addressed in the current plan (e.g., a rural health program), we believe that it is essential to give program priorities more time to achieve their objectives. Although some well-established programs continue to excel, others are in the process of developing programs and recruiting new leadership (e.g., neuroscience). In addition, we are just beginning to implement our mission-based fiscal management model and want to give the faculty and staff who now work within its parameters enough time to succeed before we begin another strategic planning exercise.

Recognizing the importance of communication throughout all processes to the ultimate success of the effort,1,12 the dean will continue to engage in frequent and regular dialog with school faculty at general faculty meetings, various committee meetings, the school's annual leadership retreat, and electronically with all school faculty and staff through a “Dean's Update.” This communication strategy was built upon the origins of vision, as defined by Duncan et al.,5 to craft a unique vision for the University of Wisconsin Medical School. Taking into account the successes of the school's 1995–1997 strategic plan, the current challenges facing academic medicine (and those facing our school, in particular), and the internal capacity and strengths of the school's faculty, these communication tools have enabled us to build a more ambitious and powerful vision for the school than had been possible before. During development of the strategic plan, these communications served to provide progress reports and solicit comments and recommendations on all parts of the process and developing plan (mission and vision statements, SWOT analysis, criteria development, RFP process, and draft versions of the 1998–2000 strategic plan). In our implementation period, we hope this continuing dialog with faculty deepens credibility and promotes trust in the collaborative process, the product, and the outcomes of our current plan. This will keep the strategic plan alive and active.

REFERENCES

1. Wilson MP, McLaughlin CP. Strategic planning. In: Leadership and Management in Academic Medicine. San Francisco, CA: Jossey-Bass, 1984.
2. Yedidia MJ. Challenges to effective medical school leadership: perspectives of 22 current and former deans. Acad Med. 1998;73:631–9.
3. Daugherty Jr RM. Leading among leaders: the dean in today's medical school. Acad Med. 1998;73:649–53.
4. Fellman D. Faculty governance, 1949–1974. In: Bogue AE, Taylor R (eds). The University of Wisconsin, One Hundred and Twenty-Five Years. Madison, WI: The University of Wisconsin Press, 1975.
5. Duncan WJ, Ginter PM, Swayne LE. The nature of strategic management. In: Strategic Management of Health Care Organizations. 2nd ed. Cambridge, MA: Blackwell, 1995.
6. Powell TC. High performance strategic planning for competitive advantage. In: Glass HE, Cavin BN (eds). Handbook of Business Strategy. New York: Faulkner and Gray, 1994:383–98.
7. Green J, Calhoun F, Nierzwicki L, Brackett J, Meier P. Rating intervals: an experiment in peer review. FASEB J. 1989;3:1987–92.
8. Kotter JP. Leading Change. Boston, MA: Harvard Business School Press, 1996.
9. Ward D. Proud Traditions and Future Challenges—The University of Wisconsin-Madison Celebrates 150 Years. Madison, WI: Office of University Publications, University of Wisconsin, 1999.
10. Cotter M. Systems thinking in a knowledge-creating organization. J Innovative Management. 1996;2:15–30.
11. Dickeson RC. Sifting academic priorities. Trusteeship. 1999;7:13–6.
12. Whitaker L. Strategic planning with a sense of direction. Bryant Business. 1997;Fall:2–9.

APPENDIX A

Proposed Criteria and Guidelines for Ranking Medical School Program Priority Areas

  1. Relevance to mission and vision of medical school and the University of Wisconsin (UW)
    1. Does this proposed or existing program promote horizontal integration across departments and/or centers?
    2. Does this proposed or existing program encourage integration and enrichment of basic, translational, and clinical research and education?
    3. Does this proposed or existing program promote integration among units in health sciences (e.g., UW hospital and clinics, UW Medical Foundation, schools of pharmacy and nursing)?
    4. Does this proposed or existing program promote integration among schools/colleges throughout the UW?
    5. Does this proposed or existing program promote diversity?
    6. Must this proposed or existing program be preeminent in order for the medical school to be ranked among the top ten medical schools nationally?
  2. Program quality and importance to scholarship and education on campus
    1. What measures define quality in research and graduate education?
      1. Amount and diversity of federal support
      2. Publications (number, quality, published in high-quality journals)
      3. Training records for residents, graduate students, post docs (i.e., national reputation of program, placement of graduates, long-term success of graduates)
      4. National rankings by the National Academy of Sciences, the Institute of Medicine, and professional groups
      5. Support and promotion of clinical enterprise
    2. What measures define quality in medical student education?
      1. Student performance outcomes (exam scores, OSCEs, residency director ratings, alumni surveys, AAMC ranking measures)
      2. Peer and student review of courses and curriculum
      3. National recognition for innovation:
        1. Evidence that other medical schools have utilized courses or programs as a model for their educational programs
        2. Publications, textbooks, course materials, information technology
        3. Extramural funding
      4. Meets societal/professional needs or standards (compelling need)
    3. What measures define quality in clinical programs?
      1. Facilitates important educational and/or research enterprises
      2. Achieves national recognition through innovation and excellent process and outcome measures
      3. Achieves recognition as evidenced by residency training program that is nationally competitive
      4. Meets societal/professional needs or standards (compelling need)
      5. Meets the clinical needs of the community and the academic health center
  3. Program impact
    1. Are there emerging scientific or clinical discoveries that make this proposed or existing program a particularly unique window of opportunity?
    2. Does this proposed or existing program add breadth and depth to the students' training? Does this proposed or existing program promote lifelong skills that will enhance students' career opportunities?
    3. Will this proposed or existing program promote development of the health sciences workforce in this state and nationally?
    4. Is this a program that appeals to private donors?
    5. Does this proposed or existing program foster economic growth in the state?
    6. Does this proposed or existing program contribute to a broad and modern view of medicine?
  4. Program leadership
    1. Do we have leaders that possess qualities essential for program development and success? Can we develop leaders internally or must we recruit?
    2. What type of leadership structure does this proposed or existing program require? Does the structure exist?
  5. Feasibility
    1. Can this proposed or existing program raise funds for infrastructure (buildings, capital equipment, computerization) from federal grants and contracts, gift funds and/or clinical revenues?
    2. Are there natural liaisons in this proposed or existing program between research and education?
    3. Will clinical areas help ensure success of this proposed or existing program?
    4. Are the patient base and resources (including computer databases) adequate to sustain research and education efforts in this proposed or existing program?
    5. Will plans for facility development foster growth of this area? Can these plans be altered to mesh better with priorities?
    6. Are funds available for salaries, start-up packages, space and remodeling of space, etc.?
    7. Does needed information infrastructure exist or can it be developed within a reasonable time period?
    8. What size should the proposed or existing program be? Is it sufficient or is growth necessary?
  6. Potential success
    1. Does this proposed or existing program integrate basic and clinical areas and utilize emerging discoveries optimally?
    2. Is the timing right (a unique window of opportunity or really too late) because of national and campus opportunities to enter or enlarge the program?
    3. Can we compete successfully with other universities in this area for funding, national rankings, students, and faculty?
    4. Will current or pending federal and state policies and regulations affect the development of this area?
    5. Does this area appeal to students and/or residents?
© 2000 Association of American Medical Colleges