Since the 1980s, school ranking systems have been a topic of discussion among leaders of higher education. Various ranking systems are based on inadequate data that fail to illustrate the complex nature and special contributions of the institutions they purport to rank, including U.S. medical schools, each of which contributes uniquely to meeting national health care needs. A study by Tancredi and colleagues in this issue of Academic Medicine illustrates the limitations of rankings specific to primary care training programs. This commentary discusses, first, how each school's mission and strengths, as well as its impact on the community it serves, are distinct, and, second, how these unique schools are poorly represented by overly subjective ranking methodologies. Because academic leaders need data that are more objective to guide institutional development, the Association of American Medical Colleges (AAMC) has been developing tools to provide valid data that are applicable to each medical school. Specifically, the AAMC's Medical School Admissions Requirements and its Missions Management Tool each provide a comprehensive assessment of medical schools that leaders are using to drive institutional capacity building. This commentary affirms the importance of mission while challenging the leaders of medical schools, teaching hospitals, and universities to use reliable data to continually improve the quality of their training programs and thereby improve the health of all.
Dr. Kirch is president and chief executive officer, Association of American Medical Colleges, Washington, DC.
Dr. Prescott is chief academic officer, Association of American Medical Colleges, Washington, DC.
Editor’s Note: This is a commentary on Tancredi DJ, Bertakis KD, Jerant A. Short-term stability and spread of the U.S. News & World Report primary care medical school rankings. Acad Med. 2013;88:1107–1115.
Correspondence should be addressed to Dr. Prescott, Association of American Medical Colleges, 2450 N St. NW, Washington, DC 20037; telephone: (202) 828-0533; e-mail: email@example.com.
As a community of physicians, scientists, and academicians who seek to do work grounded in evidence, we can all agree that, as Malcolm Gladwell1 states, “Rankings depend on what weight we give to what variables.” For over three decades, school ranking systems—and the variables they choose to weigh—have been a topic of discussion and concern among leaders of higher education. There is a growing consensus that ranking systems fail to acknowledge and appreciate the varied and complex missions of educational institutions, including those of U.S. medical schools.2 In parallel, there is a growing body of evidence that diversity—a factor not always measured in rankings—leads to stronger organizations.3 The article by Tancredi and colleagues4 in this issue of Academic Medicine reports on shortfalls in one ranking system specific to the quality of primary care training programs in U.S. medical schools.
In this commentary, we reaffirm the importance of clearly articulating the rich and varied missions of medical schools, all of which focus ultimately on meeting the health care needs of our nation. Accordingly, we present a challenge to the leaders of medical schools, teaching hospitals, and health systems, as well as to the leaders of their parent universities and the members of their governing boards. Rather than focusing on flawed rankings that often become little more than marketing tools, we call for a focus on metrics that more accurately measure how well each specific institution is delivering what it promises in its stated mission. The more leaders concentrate on their institutions' clearly articulated missions, the better the academic medicine community is positioned to provide for the health care needs of the nation.
Over the years, publications such as the Gourman Report, the Princeton Review's Best 377 Colleges, Peterson's Four-Year Colleges, and U.S. News & World Report's (USN&WR's) America's Best Colleges have provided the public with rankings and generalized information about thousands of undergraduate and graduate programs across the nation. Such ranking systems are based on a limited mix of subjective impressions and quantitative data, implying "winners and losers" but failing to sufficiently depict the complex nature and unique contributions of the institutions they rank, including medical schools.
The mission and strengths of each medical school—and the impact each has on individual trainees, patients, and the larger community it serves—are unique; thus, schools are poorly served by arbitrary and limited ranking methodologies. While uncritical consumers of USN&WR might believe that rankings provide objective and unbiased data,2 rankings are unable to holistically capture any school's quality and progress in terms of its particular mission and its specific contributions to health.
Ranking Systems—Questioning Validity and Utility
Over 10 years ago, McGaghie and Thompson2 studied the USN&WR ranking of medical schools. The fundamental finding of their comprehensive critique is clear: Rankings lack utility beyond impressionistic marketing. As McGaghie and Thompson state:
The institutions differ in their histories, goals, structures, and aspirations…. In such an environment, a system intended to rate and rank medical schools would need to do so in light of these differences. But even more important, the assessment of quality should be based on criteria of special importance to U.S. society—that is, the measures should be meaningful in ways that go beyond wealth and reputation.
With primary care in the national spotlight, it is logical that Tancredi and colleagues4 focused on the rankings of primary care training programs. They confirm that rankings provide limited insight into the quality of teaching and the attainment of school mission. A school's mission, its learning environment, the diversity of its student body, its curricular innovations, the cost of attendance, the availability of financial aid, and its location are all important factors that medical school applicants consider,5 but these are absent from school rankings. Insofar as the USN&WR methodology relies heavily on reputation and applies limited objective data,2 it cannot possibly provide an all-encompassing review of the merits of each medical school.
Benchmarks, Not Rankings, Will Move Us Forward
Academic leaders, accrediting agencies, and governing bodies, as well as medical school applicants, need data that are more objective. As physicians, we take care to avoid making patient care decisions that are based on unverifiable data or on composite indexes based on such data. Neither should leaders make institutional decisions based on deficient or questionable information. Objective benchmarks are the best tool to use to inform organizational planning and resource allocation.
The Association of American Medical Colleges (AAMC) and its Council of Deans have been developing just such benchmarking tools to provide valid data that are pertinent to every medical school. Such resources are driving individual institutional capacity building and medical school transformation. Examining the benchmarks that most directly influence the medical education continuum and the health of our nation is an important step in the right direction. For example, in our increasingly diverse nation, measuring a medical school's ability to prepare a diverse physician workforce is critical. Examining how medical school graduates evaluate their medical education is also meaningful in identifying a school's strengths and weaknesses.
When we as a community assess schools against accepted educational standards, we begin enhancing medical education. By explicitly aligning individual institutions’ missions and goals with the greater mission of improving the health of all, the education enterprise benefits. The AAMC Medical School Admissions Requirements (MSAR) and the Missions Management Tool (MMT) are two examples of instruments that provide benchmarks and allow a comprehensive assessment of each medical school in light of its mission.
A 2011 AAMC Analysis in Brief demonstrated that aspiring applicants both use and value other sources over highly promoted ranking systems when seeking information about medical schools; their top five sources of information are medical school Web sites, friends and peers, current medical students or recent graduates, the MSAR, and physicians.6 Of these, aspiring applicants rated the MSAR highest for helping them make application decisions.6
The MSAR is a useful resource for trusted and current information on all 158 Liaison Committee on Medical Education–accredited medical schools (141 in the United States and 17 in Canada), as well as on the 44 BS/MD programs.7 The MSAR uses a protected and consistent data-driven methodology to produce a valid, clear, and holistic profile for each school. The profile fosters transparency regarding each school's application processes and requirements, and each profile includes a section called Commitment to Primary Care highlighting the components of the school's mission and curriculum dedicated to primary care.
Additionally, users of the Web-based MSAR tool can compare and contrast up to five medical schools side-by-side, and they can examine data across three editions of the MSAR. Medical schools can edit their profiles as needed, while the AAMC, the custodian of MSAR data, cross-checks the entries. According to internal analytics, since the MSAR went online on April 1, 2011, nearly one million people have visited the site.
Since 2009, the AAMC has distributed the MMT every spring to medical school deans. This tool gives leaders a benchmark to use for assessing their programs directly as they relate to school missions and goals. The MMT intentionally provides valid quantitative data compiled from established systems such as the AAMC Student Records System, the American Medical Association Physician Masterfile, Graduate Medical Education Track Systems (records of residents), the American Medical College Application Service, and the AAMC Faculty Roster. The MMT clearly illustrates a school's position not only in terms of its unique missions but also in terms of its work in six domains crucial to national needs and priorities. The MMT data include 44 measures in six domains:
1. Graduate a workforce that will address the priority health needs of the nation
2. Prepare a diverse physician workforce
3. Foster the advancement of medical discovery
4. Provide high-quality medical education as judged by your recent graduates
5. Prepare physicians to fulfill the needs of the community
6. Graduate a medical school class with manageable debt
Medical school leaders use this tool to have a clearer understanding of their school’s progress toward stated missions and to determine where their school fits with regard to national trends.
The Imperative of Mission
Some institutions allow the convenience of popular rankings to serve as a proxy for achievement, sometimes linking them to everything from public relations, to fundraising campaigns, to executive compensation incentives. Given the challenges faced in the areas of national health care, health professions workforce development, and support for medical research, the times demand that we abandon the pursuit of arbitrary measures such as popular rankings. The challenge for the leaders of medical schools, teaching hospitals, and health systems, and for the leaders of their parent universities and the members of their boards, is clear: to embrace metrics that more accurately reflect how well each institution is delivering on the goals specifically articulated in its mission. We have a great deal of important work to do to transform the health of the nation. No one school can do it all, but together we can achieve transformation beyond our imagination. The tremendous variety in mission emphasis and resources across our medical schools and teaching hospitals is our source of greatest hope in meeting this test. As Scott Page3 has written, innovation depends "as much on collective difference as on aggregate ability."
Acknowledgments: The authors wish to thank Marcy Sutherland for her exceptional research and editorial support.
Other disclosures: None.
Ethical approval: None.