Orthopaedic Academic Activity in the United States: Bibliometric Analysis of Publications by City and State

Hohmann, Erik, FRCS, FRCS (Tr&Orth), MD, PhD; Glatt, Vaida, PhD; Tetsworth, Kevin, MD, FRACS

JAAOS Global Research & Reviews: July 2018 - Volume 2 - Issue 7 - p e027
doi: 10.5435/JAAOSGlobal-D-18-00027
Research Article

Background: The purpose of this study was to conduct a bibliometric analysis of orthopaedic academic output in the United States.

Methods: Publications based on city and state origin, corrected for population size, median household income, total number of surgeons, and the number of various subspecialties were evaluated. The 15 highest-ranked orthopaedic journals were audited from 2010 to 2014 and then subdivided into anatomic regions and 14 subspecialties.

Results: A total of 8,100 articles were published during the study period. Most originated from New York, California, Pennsylvania, Massachusetts, and Minnesota. New York published the greatest number by city, followed by Philadelphia, Boston, Chicago, and Rochester. When adjusted for the number of publications per city, surgeons per population, publications per surgeon per population, publications per population, and publications per median income per capita, Vail and New York each led in two of these metrics and Stanford led in one.

Conclusions: New York was the leader for total publications, the greatest activity within subspecialties, publications per surgeon per population, and publications per median income per capita. Vail was the leader for publications per surgeon and publications per population. The top four cities of New York, Philadelphia, Boston, and Chicago were responsible for 28% of the academic output over the 5-year study period.

From the Valiant Clinic/Houston Methodist Group, Dubai, United Arab Emirates (Dr. Hohmann); the Faculty of Health Sciences, Medical School, University of Pretoria, Pretoria, South Africa (Dr. Hohmann); the Department of Orthopaedic Surgery, University of Texas Health Center, San Antonio, TX (Dr. Glatt); the Orthopaedic Research Centre of Australia, Brisbane, Australia (Dr. Glatt and Dr. Tetsworth); the Department of Orthopaedic Surgery, Royal Brisbane Hospital, Herston, Australia (Dr. Tetsworth); and the Department of Surgery, School of Medicine, University of Queensland, Queensland, Australia (Dr. Tetsworth).

Correspondence to Dr. Hohmann: ehohmann@hotmail.com

Dr. Glatt or an immediate family member serves as a board member, owner, officer, or committee member of Orthopaedic Research Society. Dr. Tetsworth or an immediate family member is a member of a speakers' bureau or has made paid presentations on behalf of Smith & Nephew and Stryker; serves as a paid consultant to Smith & Nephew and Stryker; serves as an unpaid consultant to Avail Technologies; and serves as a board member, owner, officer, or committee member of American Academy of Orthopaedic Surgeons and Australasian Limb Lengthening and Reconstruction Society (President). Neither Dr. Hohmann nor any immediate family member has received anything of value from or has stock or stock options held in a commercial company or institution related directly or indirectly to the subject of this article.

This is an open-access article distributed under the terms of the Creative Commons Attribution-Non Commercial-No Derivatives License 4.0 (CC BY-NC-ND), where it is permissible to download and share the work provided it is properly cited. The work cannot be changed in any way or used commercially without permission from the journal.

The publication of research articles is an important aspect of a medical career, particularly in academic environments.1 Success for both individuals and faculty groups is often measured by the number of publications, citation counts, and external research funding.2,3 These bibliometric measures are also frequently assessed when considering academic promotion, grant allocations, or entry into academic organizations.2-5 Medical journals also use the same measures to define their impact in an attempt to attract a greater number of high-quality authors.3,6,7 For academic orthopaedic departments, publication productivity is extremely important in attracting high-quality researchers and extramural funding.8 Stavrakis et al8 have shown that departmental productivity is closely correlated with leadership productivity and funding. The availability of funding has been shown to result in higher publication output, favoring countries and states with larger populations and more powerful economies.2,9-11

However, the total number of publications alone may not truly represent research activity relative to the competing demands of clinical service and teaching obligations. Human and financial resources may be limited, and dedicated research time will be minimal where patient loads are high, reflecting inherent constraints.12 Therefore, bibliometric analysis of orthopaedic research and publications must also account for population size, economic discrepancies such as median household income, and the number of orthopaedic surgeons within a specific regional area.13

Adjusting for these factors should yield more meaningful data and should facilitate consideration of workload, measured as the number of orthopaedic surgeons per 100,000 population,14 which in turn allows the influence of surgeon workload on the ability to actively pursue research to be estimated. It has been suggested that 4 to 6 orthopaedic surgeons per 100,000 population are required to meet the clinical needs of a region.15,16 Although the number of publications per surgeon (or per capita) is a simple measure that minimizes bias, an equally valid metric combines the two: the number of publications per surgeon per population. This reciprocal approach theoretically compensates for regional differences in underserviced areas and will be employed throughout this study.
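
As a minimal illustration of how these normalized metrics relate to one another, the short Python sketch below computes them for a single hypothetical city; the population, surgeon count, and publication figures are invented for the example and are not data from this study.

    # Hypothetical inputs for one city; values are illustrative only.
    population = 850_000        # city population
    surgeons = 60               # orthopaedic surgeons practising in the city
    publications = 120          # articles attributed to the city over the study period

    pubs_per_surgeon = publications / surgeons                 # simple per-surgeon rate
    pubs_per_100k = publications / (population / 100_000)      # per-capita rate
    surgeons_per_100k = surgeons / (population / 100_000)      # workforce density (cf. the 4-6 benchmark)
    # Reciprocal metric: publications per surgeon, further scaled by the population served,
    # which compensates for regional differences in underserviced areas.
    pubs_per_surgeon_per_100k = pubs_per_surgeon / (population / 100_000)

    print(f"{pubs_per_surgeon:.2f} publications per surgeon")
    print(f"{pubs_per_100k:.2f} publications per 100,000 population")
    print(f"{surgeons_per_100k:.2f} surgeons per 100,000 population")
    print(f"{pubs_per_surgeon_per_100k:.4f} publications per surgeon per 100,000 population")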

The purpose of this study was to investigate the number of publications produced in each state and city in the United States using the 15 highest-rated orthopaedic journals over a 5-year period, based on their 2015 impact factors. The study further related these results to population size, median household income, and the number of orthopaedic surgeons in each location. The second aim was to determine the number of publications specific to each of 14 different recognized orthopaedic subspecialties and to again attribute publications to specific locations within and between the individual states and cities in the United States.

Methods

The 2015 Journal Citation Reports was accessed through the Web of Science (Thomson Reuters),17 and the 15 highest-ranked journals in the category “orthopaedics,” based on their 2015 impact factor, were selected. Journals whose main purpose was to publish narrative reviews, or that were not directly related to the field of orthopaedic surgery (eg, physiotherapy, rheumatoid arthritis, sports medicine), were excluded.
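
This selection step amounts to ranking the category listing by impact factor and dropping out-of-scope titles. The sketch below assumes the JCR listing has been exported to a simple list of records; the journal names, impact factors, and scope flags are placeholders, not actual JCR data.

    # Placeholder records: (journal name, 2015 impact factor, within orthopaedic surgery scope).
    journals = [
        ("Journal A", 5.2, True),
        ("Journal B", 4.1, False),  # eg, a narrative-review or non-orthopaedic journal -> excluded
        ("Journal C", 3.8, True),
    ]

    eligible = [j for j in journals if j[2]]             # exclude out-of-scope journals
    eligible.sort(key=lambda j: j[1], reverse=True)      # rank by 2015 impact factor
    top_15 = eligible[:15]                               # the journals audited in this study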

The specific period investigated extended from January 2010 through December 2014. The abstracts of all articles published during this interval were screened through journal-specific websites. Level I to IV research articles, systematic reviews, meta-analyses, nonsolicited review articles, and case reviews were all included in the analysis. Letters to the editor, editorials, editorial comments, historical articles, errata, proceedings papers, meeting abstracts, and notes were excluded. The level of evidence was recorded for each published article, and if this was not assigned by the journal, the senior author assigned the level of evidence according to the standards of the Journal of Bone and Joint Surgery.18

The location of the main affiliation of the primary author was used to define the origin of the publication, recording the state and city. If the article did not provide details about location, the address of the corresponding author was used. To reduce the number of “city” variables, smaller cities close to major metropolitan areas were grouped together. For example, St. Paul was grouped together with Minneapolis; Berkeley and Oakland were grouped with San Francisco; and Santa Monica and Long Beach were grouped with Los Angeles. It is recognized that allocation of smaller cities under a larger metropolitan area is less precise, but it substantially reduces the number of variables without creating notable bias. Any discrepancies were addressed by performing a Google search and by agreement between the two senior authors.
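
This grouping step can be represented as a simple lookup table applied before any tallying; the mapping below reproduces only the examples named above and is not the full table used by the authors.

    # Map smaller cities onto the larger metropolitan area they were grouped under.
    METRO_GROUPS = {
        "St. Paul": "Minneapolis",
        "Berkeley": "San Francisco",
        "Oakland": "San Francisco",
        "Santa Monica": "Los Angeles",
        "Long Beach": "Los Angeles",
    }

    def normalize_city(city: str) -> str:
        """Return the metropolitan area a smaller city is grouped under, or the city itself."""
        return METRO_GROUPS.get(city, city)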

The total number of publications for each state and city was collated. The total was further subdivided into both anatomic areas and 14 recognized subspecialties within orthopaedics: allied health, basic science, elbow, general orthopaedics, foot and ankle, hand, hip, knee, pediatric orthopaedics, shoulder, spine, sports medicine, trauma, and tumor. These designations facilitated investigation of the geographic distribution of “centers of excellence” for these subspecialties. To account for the number of surgeons per state and city, the American Academy of Orthopaedic Surgeons provided data used to calculate the number of publications per surgeon per city. As with the study location, smaller cities were grouped under the larger metropolitan area in the same manner. Population size and median income per capita were sourced from the United States Census Bureau (http://www.census.gov/en.html) using data from 2015.
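
A sketch of this collation step, assuming each screened article has already been coded with a state, a normalized city, and one of the 14 subspecialties; the records and surgeon counts shown are placeholders rather than study data.

    from collections import Counter

    # Placeholder records: (state, city, subspecialty).
    articles = [
        ("NY", "New York", "sports medicine"),
        ("PA", "Philadelphia", "knee"),
        ("MN", "Rochester", "hand"),
    ]

    pubs_by_state = Counter(state for state, _, _ in articles)
    pubs_by_city = Counter(city for _, city, _ in articles)
    pubs_by_city_and_subspecialty = Counter((city, sub) for _, city, sub in articles)

    # Publications per surgeon for each city, given a surgeon count per city
    # (here invented; the study used AAOS workforce data).
    surgeons_by_city = {"New York": 900, "Philadelphia": 400, "Rochester": 150}
    pubs_per_surgeon = {c: pubs_by_city[c] / n for c, n in surgeons_by_city.items()}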

To describe the relationship between the population size and the number of publications, the total number of publications attributed to each state and city was divided by the total population of that state or city. These data were also used to calculate the number of surgeons per 10,000 population. The publication rate per median income per capita was then calculated to allow a more direct and meaningful comparison that accounted for population size. These values also provided additional information regarding the gross cost per capita associated with producing an individual article.
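
The derived rates described here are straightforward ratios. The sketch below computes them for one hypothetical city record; the population, surgeon count, median income per capita, and publication count are invented for illustration.

    # Hypothetical inputs for a single city; values are illustrative only.
    publications = 315
    population = 115_000
    surgeons = 45
    median_income_per_capita = 33_000     # US dollars

    pubs_per_population = publications / population
    surgeons_per_10k = surgeons / (population / 10_000)
    pubs_per_median_income = publications / median_income_per_capita
    # Reciprocal view: an approximate gross cost per capita of producing one article.
    income_per_publication = median_income_per_capita / publications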

Results

A total of 8,100 orthopaedic articles were published in the 15 highest-ranked orthopaedic surgery journals during the study period, between January 2010 and December 2014 (Table 1, http://links.lww.com/JG9/A15). The highest number of articles was published in Clinical Orthopaedics and Related Research (n = 1,149), followed by Spine Journal (n = 1,146) and the Journal of Bone and Joint Surgery American (n = 1,128). The lowest number of articles was published in Acta Orthopaedica (n = 23), International Orthopaedics (n = 97), and Bone and Joint Journal (n = 115). Table 1 (http://links.lww.com/JG9/A15) shows the distribution of the number of publications for each of the 15 journals included and the number of publications from each state that appeared in these journals.

New York was the leading state with 976 publications, followed by California (n = 931), Pennsylvania (n = 825), Massachusetts (n = 499), and Minnesota (n = 448) (Table 1, http://links.lww.com/JG9/A15). The median number of publications for all states was 60, with New Jersey (n = 63) and Arizona (n = 58) the two states closest to this figure. North Dakota was the only state that did not generate any articles. The city of New York published the greatest number of articles (n = 862), followed by Philadelphia (n = 556), Boston (n = 460), Chicago (n = 424), and Rochester, MN (n = 315) (Table 2, http://links.lww.com/JG9/A16). The median number of publications for all cities and metropolitan areas was 40, and the five cities of Miami (n = 40), Atlanta (n = 40), Bethesda (n = 40), Chapel Hill (n = 40), and Rochester, NY (n = 40) all fell at this median.

Table 3 (http://links.lww.com/JG9/A17) shows an overview of the number of publications attributed to each of the 14 recognized orthopaedic subspecialties per state. New York was the leading state for 5 of the 14 subspecialties (ie, general orthopaedics, foot and ankle, spine, sports medicine, and trauma), California was the leading state for 3 subspecialties (ie, hip, shoulder, and tumor), and Minnesota was the leading state in 2 subspecialties (ie, elbow and hand) (Table 4, http://links.lww.com/JG9/A18). The leading state in both knee and basic science was Pennsylvania, whereas Michigan led in allied health and Texas in pediatric orthopaedics (Table 4, http://links.lww.com/JG9/A18).

New York City was the overwhelming leader, having published the greatest number of articles in 8 of the 14 recognized orthopaedic subspecialties (ie, basic science, foot and ankle, hip, knee, shoulder, spine, sports medicine, and trauma), followed by Rochester, MN, leading in 2 subspecialties (ie, elbow and hand). Burlington, VT, was the leader in allied health; Philadelphia, PA, in general orthopaedics; Dallas, TX, in pediatric orthopaedics; and Boston, MA, in trauma (Tables 5.1, http://links.lww.com/JG9/A19 and 5.2, http://links.lww.com/JG9/A20). However, after adjusting for the number of publications per city, surgeons per 10,000 population, publications per surgeon per 10,000 population, publications per 100,000 population, and publications per median income per capita, Vail, CO, and New York, NY, each led in two categories, whereas Stanford, CT, led in one of these metrics (Table 6, http://links.lww.com/JG9/A21).

Discussion

The results of this study demonstrate that both the state and the city of New York were the overwhelming leaders for the total number of publications and consistently ranked first in activity across most of the recognized orthopaedic subspecialties. However, additional metrics were used to adjust for possible socioeconomic advantages and differences in the number of surgeons per population. After adjusting for population size, publications per surgeon, and publications per median income per capita, New York, NY, was the leading city for publications per surgeon per 10,000 population and publications per median income per capita, and Vail, CO, was the leader for publications per surgeon and publications per 100,000 population.

Several authors have previously suggested that cities with prestigious universities produce more publications.19,20 Three of the top 15 US medical schools (ie, Columbia, New York University, and Cornell) are located in New York, and four others (ie, Stanford; University of California, Los Angeles; University of California, San Diego; and University of California, San Francisco) are located in California (www.topuniversities.com). Not coincidentally, these two states were also the top two publishing states, suggesting that the findings of both Bornmann and Leydesdorff19 and Clauset et al20 are also applicable to orthopaedic surgery.

Another metric used to determine research productivity and cost-effectiveness is the median income per capita.10 Theoretically, a greater number of publications per median income per capita is indicative of higher research productivity. Interestingly, the top 10 cities in this study with the greatest number of publications per median income per capita coincide closely with the top 10 universities. This strongly suggests that the highest-ranking cities not only contain the top universities but also produce more research output at relatively lower cost. This could be related to greater spending on research and development, the availability of mentorship, and the presence of successful NIH-funded projects.3,10,21,22 Meo et al10 have shown that spending on research and development, the number of universities, and the number of scientific indexed journals in a country are all positively associated with the total number of publications, the citations of research documents, and the corresponding h-index.

English has become the international language of medical science,23 and 45 of the current top 50 highest-impact journals in orthopaedics are based in English-speaking countries.17 Furthermore, 56% of these journals are based in the United States.17 Although North America has among the largest numbers of medical schools and scientific journals worldwide, their distribution and geographic location could at least partially explain these differences. For example, the top five states contain 21% of the medical schools in the United States, whereas the five lowest-ranked states together contain only one medical school (Association of American Medical Colleges: www.aamc.org. Accessed March 25, 2017).

Research output is also associated with dedicated research time and mentorship. Beasley et al24 demonstrated that devoting at least 30% of work time to research was an important predictor of publication rates. Mentorship has been identified as another important factor for academic excellence.21 Reid et al21 showed that most academic hospitalists lacked mentorship, and this was associated with failure to produce publications. Valsankar et al3 further reported that the most cited faculty contributed 52% of all publications and 55% of all citations within a surgery department. They emphasized the importance of identifying and promoting these leaders as a critical consideration for the research performance of a clinical department. It is highly likely that the combination of these factors is present in many of the top research departments and is one possible explanation for the findings of this study.

We also examined the publication rates within orthopaedic subspecialties per state and city, in an attempt to recognize centers of excellence. To define clinical excellence, well-defined and objective criteria such as high-volume hospitals, training of providers, performance-based quality metrics, discharge planning, and nurse-to-patient ratios are commonly used.25,26 In contrast, no agreement exists on what constitutes excellence in research. Tijssen27 defined research excellence as the creation of new high-quality scientific and technologic knowledge and suggested that bibliometric indicators are the only currently available systematic metrics based on empirical data.

Similar to the total number of publications, the same top five states were also the leaders in the 14 subspecialty sections. New York, California, Pennsylvania, Massachusetts, and Minnesota occupied 39 of the 42 available top-three positions and 13 of the 14 first-place rankings for the selected subspecialties. The city of New York was the overwhelming leader in the subspecialty analysis, with eight first-place and four second-place rankings. Boston had one first-place, three second-place, and four third-place rankings. Third overall was Philadelphia, with one first-place, two second-place, and five third-place rankings. Other cities from the top five ranked states that appeared within the first three places included Los Angeles (n = 2), Rochester, MN (n = 2), and Pittsburgh (n = 1). The other cities represented were Chicago (n = 4), Baltimore (n = 1), St. Louis (n = 1), and Durham, NC (n = 1). These findings are also consistent with earlier data reporting on overall medical research output.28 Boston (including Harvard Medical School), Los Angeles (University of California, Los Angeles), and Philadelphia (University of Pennsylvania) were among the top five ranked medical schools in the United States.28 However, on the basis of this measure, New York scored poorly: Columbia University ranked only 15th, followed by New York University School of Medicine in 18th position. Prestige is also likely to influence outcomes in research priorities, resource allocation, and other scholarly activities.20 Elite and established research institutions perhaps focus their efforts on providing the resources for scholarly excellence that attract academically driven individuals seeking faculty positions.

After adjusting for population size, publications per surgeon, and publications per median income per capita, New York was the leading city for publications per surgeon per 10,000 population and publications per median income per capita, whereas Vail, CO, was the leader for publications per surgeon and publications per 100,000 population. These metrics may seem to contradict the outcomes above. However, Hendrix28 noted that size-dependent measures quantify overall institutional productivity, whereas size-independent measures describe the impact in the research community and the productivity of the individual faculty member. Stavrakis et al8 suggested that academic success is associated with the scholarly productivity of the department chair and research director.

This study has limitations. Although the total number of publications was determined for each state and city, the impact and value of the individual articles were not assessed. It is therefore possible that lower-quality studies introduced selection bias that could have resulted in discrepancies. Overall impact, such as mean impact factor or citation rates, was also not calculated because, in our opinion, these metrics were not critical. The impact factor is mainly driven by technicalities that are not related to the scientific value of the publication itself.29,30 It is acknowledged that citation rates are indicative of academic impact and rank22,29 but tend to favor larger institutions.31 Furthermore, overcitation, biased citing, audience size, and biased data are also recognized limitations.32

Conclusions

The results of this study demonstrate that both the state and the city of New York were the overwhelming leaders for the total number of publications, and both consistently ranked first in activity across most of the recognized orthopaedic subspecialties. After adjusting for population size and socioeconomic factors, New York remained the leading city for publications per surgeon per 10,000 population and publications per median income per capita, whereas Vail, CO, was the leading city for publications per surgeon and publications per 100,000 population. The top four cities, New York, Philadelphia, Boston, and Chicago, were responsible for 28% of the academic output in the top 15 ranked orthopaedic journals over the 5-year period from 2010 to 2014.

References

1. Tijdink JK, Vergouwen ACM, Smulders YM: Publication pressure and burn out among Dutch medical professors: A nationwide survey. PLoS One 2013;8:e73381.
2. Tetsworth K, Fraser D, Glatt V, Hohmann E: Use of Google Scholar public profiles in orthopaedics: Rate of growth and changing international patterns. J Orthop Surg 2017;25:1–7.
3. Valsankar NP, Zimmers TA, Kim BJ, et al: Determining the drivers of academic success in surgery: An analysis of 3850 faculty. PLoS One 2015;10:e0131678.
4. Beasley BW, Wright SM, Cofrancesco J, Babbott SF, Thomas PA, Bass EB: Promotion criteria for clinician-educators in the United States and Canada: A survey of promotion committee chairpersons. JAMA 1997;278:723–728.
5. Milone MT, Bernstein J: On track to professorship? A bibliometric analysis of early scholarly output. Am J Orthop 2016;45:E119–E123.
6. Saha S, Saint S, Christakis D: Impact factor: A valid measure of journal quality? J Med Libr Assoc 2003;91:42–46.
7. Thornley P, de Sa D, Evaniew N, Farrokhyar F, Bhandari M, Ghert M: An internal survey to identify the intrinsic and extrinsic factors of research studies most likely to change orthopaedic practice. Bone Joint Res 2016;5:130–136.
8. Stavrakis AI, Patel AD, Burke ZDC, et al: The role of chairman and research director in influencing scholarly productivity and research funding in academic orthopaedic surgery. J Orthop Res 2015;33:1407–1411.
9. Lee KM, Ryu MS, Chung CY, et al: Characteristics and trends of orthopedic publications between 2000 and 2009. Clin Orthop Surg 2011;3:225–229.
10. Meo SA, Al Masri AA, Usmani AM, Memon AN: Impact of GDP, spending on R&D, number of universities and scientific journals on research publications among Asian countries. PLoS One 2013;8:e66449.
11. Hohmann E, Glatt V, Tetsworth KD: Worldwide research activity 2010-2014: Publication rates in the top 15 orthopaedic journals. World J Orthop 2017;8:514–523.
12. Ferrer RL, Katerndahl DA: Predictors of short-term and long-term scholarly activity by academic faculty: A departmental case study. Fam Med 2002;34:455–461.
13. Halpenny D, Burke J, McNeill G, et al: Geographic origin of publications in radiological journals as a function of GDP and percentage of GDP spent on research. Acad Radiol 2010;17:768–771.
14. Allen-Mersh T, Earlam RJ: General surgical workload in England and Wales. BMJ 1983;287:1115–1118.
15. Goodman DC, Fisher ES, Bubolz TA, Mohr JE, Poage JF, Wennberg JE: Benchmarking US physician workforce: An alternative to needs-based or demand-based planning. JAMA 1996;276:1811–1817.
16. Hicks LL, Glenn JK: Too many physicians in the wrong places and specialties? Populations and physicians from a market perspective. J Health Care Mark 1989;9:18–26.
17. Journal Citation Reports for Scientific Information. 2015. http://www.webofknowledge.com.
18. Marx RG, Wilson SM, Swiontkowski MF: Updating the assignment of levels of evidence. J Bone Joint Surg Am 2015;97:1–2.
19. Bornmann L, Leydesdorff L: Which cities produce more excellent papers than can be expected? A new mapping approach, using Google Maps, based on statistical significance testing. J Am Soc Inf Sci Technol 2011;62:1954–1962.
20. Clauset A, Arbesman S, Larremore DB: Systematic inequality and hierarchy in faculty hiring networks. Sci Adv 2015;1:e1400005.
21. Reid MB, Misky GJ, Harrison RA, Sharpe B, Auerbach A, Glasheen JJ: Mentorship, productivity and promotion among academic hospitalists. J Gen Intern Med 2011;27:23–27.
22. Ence AK, Cope SR, Holliday EB, Somerson JS: Publication productivity and experience: Factors associated with academic rank among orthopaedic surgery faculty in the United States. J Bone Joint Surg Am 2016;98:e41.
23. Maher J: The development of English as an international language of medicine. Appl Linguistics 1986;7:206–219.
24. Beasley BW, Simon SD, Wright SM: A time to be promoted: The Prospective Study of Promotion in Academia. J Gen Intern Med 2006;21:123–129.
25. Mehrotra A, Sloss EM, Hussey PS, Adams JL, Lovejoy S, Soohoo NF: Evaluation of centres of excellence for knee and hip replacement. Med Care 2013;51:28–36.
26. Ronning PL, Meyer JW: Centres of excellence: An assessment tool for cardiovascular and orthopaedic programs. Hosp Technol Ser 1996;15:1–29.
27. Tijssen RJW: Scoreboards of research excellence. Res Eval 2003;12:91–103.
28. Hendrix D: An analysis of bibliometric indicators, National Institutes of Health funding, and faculty size at Association of American Medical Colleges medical schools, 1997-2007. J Med Libr Assoc 2008;96:324–334.
29. Lefaivre KA, Shadgan B, O'Brien PJ: 100 most cited articles in orthopaedic surgery. Clin Orthop Relat Res 2011;469:1487–1497.
30. Whitehouse GH: Impact factors: Facts and myths. Eur Radiol 2002;12:715–717.
31. Sypsa V, Hatzakis A: Assessing the impact of biomedical research in academic institutions of disparate sizes. BMC Med Res Methodol 2009;9:33.
32. MacRoberts MH, MacRoberts BR: Problems of citation analysis. Scientometrics 1996;36:435–444.

Copyright © 2018 The Authors. Published by Wolters Kluwer Health, Inc. on behalf of the American Academy of Orthopaedic Surgeons