
Editorials and Perspectives: Special Feature

Social Media and Online Attention as an Early Measure of the Impact of Research in Solid Organ Transplantation

Knight, Simon R.1,2,3

doi: 10.1097/TP.0000000000000307


The impact of academic research is an abstract concept that takes into account the academic, economic, and societal impacts of research (1). Assessment of scholarly impact is useful in assessing the academic success of institutions and individuals, making decisions about the distribution of research funding, and making commercial decisions for publishers.

Measuring impact across these domains is difficult. The most commonly used measures are bibliometrics: surrogate measures based on the premise that more important academic works are more likely to be cited in subsequent research articles. From these raw citation data, the impact of an individual researcher can be measured, most commonly using the H index (derived from the number of articles published by a researcher and their citation rates). The impact of a journal is commonly measured using the impact factor (IF), which is calculated as the average number of citations received per article published in that journal during the two preceding years.
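Both measures described above are simple to compute from raw citation counts. A minimal sketch, using invented citation data rather than anything from this study:

```python
def h_index(citations):
    """H index: the largest h such that h of a researcher's
    articles have each received at least h citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def impact_factor(citations_this_year, citable_items):
    """Journal IF: citations received this year to articles from the
    two preceding years, divided by the number of citable items
    published in those two years."""
    return citations_this_year / citable_items

# A researcher with these per-article citation counts has an H index of 4:
print(h_index([10, 8, 5, 4, 3, 1]))  # 4
# A journal receiving 500 citations to 250 recent items has an IF of 2.0:
print(impact_factor(500, 250))       # 2.0
```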

The use of citation metrics as a measure of research impact has come under fire in recent years. Citation metrics only provide a measure of the academic impact of research and contain no information about its societal and economic impact. Citation rates, and therefore IFs, are open to manipulation by self-citation and editorial policies, such as publishing a large number of review articles and reducing the number of poorly cited articles (such as case reports) (2–4). A journal’s IF is poorly representative of the impact of the individual articles published within that journal: most citations within any 1 year are collected by only a small proportion of the articles published (5). For this reason, the practice of assessing the success of an individual researcher or institution by the IFs of the journals in which they publish has been criticized (6). Chasing high-citation publications in this way may also stifle innovation, because significant new developments in techniques and treatments may take years, during which time few, if any, publications will be generated.

Traditional citation metrics are also slow to accumulate. Even with articles being published online as “early-view,” it is usually a minimum of 6 months before citations begin to appear in the literature because of the time required for citing articles to move through the editorial process.

With the rise of online journal activity, and increasing use of social media in the scientific field, there has been considerable interest in what online and social media activity can tell us about the attention that academic work receives. Online services are increasing in popularity with both journals and researchers, including social media (Twitter, Facebook, Google+, Reddit, blog posts, and so on), social bookmarking services (those that allow users to store and share references, such as Mendeley, CiteULike, and so on), online news outlets, and expert recommendation sites, such as Faculty of 1000 (f1000). An increasing number of journals, including Nature and PLOS Medicine, are now incorporating information about online article mentions, page views, and downloads in their article metrics.

The use of alternative online metrics (“Altmetrics,” a term coined by Jason Priem from the University of North Carolina, Chapel Hill) is attractive because it measures a different aspect of research impact, the attention given to research articles by both professionals and the public. Although tweets and Facebook posts may reflect a superficial interest in a publication, blog posts, article downloads, page views, and bookmarks may give a more accurate representation of exactly how often an article has been read. Furthermore, altmetrics have the potential to give a much earlier measure of the impact of a piece of scientific work than citation rates. They may also allow better measurement of the attention given to research beyond academia, for example, by clinical practitioners, patients, and the public.

Many patients with chronic disease, including potential transplant recipients, are turning to the internet and social media to learn more about their condition and connect with other individuals with similar experiences for support. Perhaps unsurprisingly, much interest in social media in the transplant community has concentrated on the potential to increase donation rates with some degree of success (7–10). Social networks may also facilitate the often difficult discussions around living donation (11). Given the increasing number of potential transplant recipients engaging with social media, these tools may also allow researchers to connect to a wider audience than traditional journal publication allows.

This study aims to measure the current use of alternative metrics (social media, social bookmarking, expert recommendation and online news mentions) and to examine the association between alternative metrics and traditional citation measures in the field of solid organ transplantation.


The literature search identified 6,981 articles published in the 1-year period. These articles were published in 1,165 different journals.

Citation Data

Citation data were available in the Scopus database for 6,979 articles (two records had no available citation data). The median number of citations per article was 1 (range, 0–399), with 4,604 (66.0%) having at least one citation and 423 (6.1%) having 10 or more citations. A total of 6,618 of these articles (94.8%) were published in English, and the median citation count was higher for articles published in English (median, 1 vs. 0 citations; P<0.001). A summary of citation metrics and their relationship to alternative metrics is presented in Table 1.

Alternative article metrics and their relationship with citation counts

Type of publication had a significant impact on the number of citations. Citation rates were significantly higher for articles identified as meta-analysis (P<0.001), multicenter studies (P<0.001), randomized controlled trials (RCT) (P<0.001), reviews (P<0.001), comparative studies (P<0.001), and clinical trials (P=0.024). Citation rates were significantly lower for studies identified as editorials, letters, comments, and case reports (all P<0.001). The odds of each publication type becoming highly cited (10 or more citations) are shown in Figure 1.

Relationship between publication type and odds of 10 or more citations. Publication types are as indexed in MEDLINE. Squares represent the summary OR, horizontal lines represent the 95% CI. OR, odds ratio; 95% CI, 95% confidence interval.

Social Media

A total of 1,346 (19.3%) of the articles had at least one recorded mention on social media. Most mentions were on Twitter, with 1,270 (18.2%) articles having at least one associated tweet and 304 articles (4.4%) having more than one tweet. One hundred sixty-six (2.4%) articles had at least one mention on Facebook, and 42 (0.6%) were mentioned on a blog. Only two articles were mentioned on Reddit and one on Pinterest.

A total of 1,320 (98%) of the articles mentioned on social media were published in English, and there was no significant difference in the median number of mentions between English and non-English language publications (P=0.06). Similar patterns were seen in the association between publication type and number of social media mentions, with higher rates of mention for meta-analyses (P=0.004), RCTs (P<0.001), multicenter studies (P<0.001), and comparative studies (P<0.001). Mention rates were significantly lower for studies labeled as letters (P<0.001), comments (P=0.003), and case reports (P=0.001). The odds of a mention on social media for each publication type are shown in Figure 2.

Relationship between publication type and odds of at least one social media mention. Publication types are as indexed in MEDLINE. Squares represent the summary OR, horizontal lines represent the 95% CI. OR, odds ratio; 95% CI, 95% confidence interval.

Correlation between social media mentions and citation numbers was significant but poorly predictive (Spearman’s r=0.16; P<0.001). The median number of citations was significantly higher if an article had a social media mention (median, 2 vs. 1 citations; P<0.001). Odds of an article being highly cited were significantly higher when mentioned on social media (odds ratio [OR], 2.58; P<0.001). A breakdown of the impact on citation rates of mentions in different social media outlets is given in Table 1. The highest citation rates were seen when an article was mentioned on a blog (median, 7 citations; range, 0–399; OR for more than 10 citations, 7.98; P<0.001).

The mean time from an article being listed in the Medline database to the last recorded social media mention was 122±177 days.

Social Bookmarking

Nine hundred fifteen articles (13.1%) were stored in social bookmarking sites, 903 (12.9%) on Mendeley, and 69 (1.0%) on CiteULike. As with citation rates and social media mentions, the number of bookmarks was significantly higher for publications marked as meta-analysis (P=0.013), multicenter studies (P<0.001), reviews (P=0.047), RCTs (P<0.001), and comparative studies (P<0.001). Comments (P=0.011), letters (P<0.001), and case reports (P<0.001) were all significantly less likely to be bookmarked. The odds of bookmarking for articles of different types are displayed in Figure 3.

Relationship between publication type and odds of at least one social bookmark. Publication types are as indexed in MEDLINE. Squares represent the summary OR, horizontal lines represent the 95% CI. OR, odds ratio; 95% CI, 95% confidence interval.

Correlation between the number of social bookmarks for an article and the number of citations was significant, but poorly predictive (Spearman’s r=0.23; P<0.001). Articles that were bookmarked had significantly higher citation counts (median, 3 vs. 1 citations; P<0.001). Odds of an article being highly cited were significantly higher if an article had social bookmark activity (OR, 4.06; P<0.001).

Despite being less frequently used, bookmarking on CiteULike was associated with higher median citation rates than Mendeley (median, 7 vs. 5 citations; OR for being highly cited, 11.99 [95% confidence interval, 7.35–19.55] vs. 4.04 [95% confidence interval, 3.26–6.00]).

Expert Opinion

Sixty-three articles (0.9%) were recommended by expert reviewers on the f1000 website.

Articles mentioned on f1000 showed an increase in citation rate compared to those that were not (median, 6 vs. 1 citations; P<0.001). If an article was mentioned on f1000, odds of being highly cited increased significantly (OR, 8.72; P<0.001).

News Outlets

Only eight articles were mentioned on online news websites. Median numbers of citations were significantly higher for articles picked up by online news outlets (median, 5 vs. 1 citations; P=0.006). Odds for an article being highly cited were also increased if an article was mentioned in a news outlet (OR, 5.19; P=0.04).

Article Content

A total of 5,659 unique Medical Subject Headings (MeSH) were used to index the content of the included articles. Given this degree of heterogeneity, analysis of the different subject headings in relation to citation counts and social media mentions did not yield any clear patterns. Qualitative analysis of the highest cited, mentioned, and bookmarked articles was performed (Tables S1–S3, SDC). Some of the identified articles related to general conditions for which transplantation forms only part of the management, rather than transplant-specific topics.

Most of the highest cited articles relate to kidney (14/32) and liver (11/32) transplantations (Table S1, SDC). Few highly cited articles relate to heart or lung transplantation, and none relate to pancreas transplantation. The main study types seen were clinical trials (11/32), cohort studies or registry analyses (9/32), expert reviews (8/32), and consensus statements or guidelines (4/32).

In renal transplantation, the most common topics seen in highly cited articles related to immunology and immunosuppression (10/14). In liver transplantation, common topics related to the underlying cause of hepatic failure, including hepatocellular carcinoma (3/11), alcoholic liver disease (2/11), nonalcoholic steatohepatitis (2/11), and hepatitis viruses (3/10). The most frequently cited articles related to heart and lung transplantations are related to methods of artificial support including mechanical support devices and extracorporeal membrane oxygenation (3/7).

The use of social bookmarking was more varied, with nine articles in common between the most cited and most bookmarked lists (Table S2, SDC). Again, the most highly bookmarked articles related to kidney (12/30) and liver (10/30) transplantations. Analysis of article types shows that reviews predominate in the most bookmarked cohort (10/30), although there are more preclinical studies with high bookmark rates (6/30). Article topics were far more varied, with no clear trends seen.

Although the organ types most commonly mentioned in social media were similar to citations and bookmarks (kidney 16/36, liver 11/36), the article types and topics were different (Table S3, SDC). There were more editorials, commentaries, and letters (6/36), and more registry and cohort studies (14/36). Topics tended to center around the more emotive and controversial areas of organ transplantation. Topics related to organ donation included payment of living donors, paired kidney donation, complications in live liver donors, and the use of more marginal living and deceased kidney donors. Other controversial topics included access to liver transplantation for patients with alcoholic liver disease and cholangiocarcinoma. Most of the highest-mentioned articles appeared in high-impact general medical journals.


The current study examines the use of alternative metric data from online resources in the field of solid organ transplantation and explores its relationship with traditional methods of assessing research impact (i.e., citation rates). The present data demonstrate that the use of social media and social bookmarking sites is common in the field, with nearly 20% of this cohort of studies being mentioned at least once on social media, and approximately 13% being stored in social bookmarking sites. For comparison, 66% of the articles identified had at least one traditional citation.

The proportion of articles in the current set mentioned on Twitter (18.2%) seems to be slightly higher than comparable data from an analysis of PubMed data during a similar time period in both clinical medicine in general (10.1%) and surgical topics (8.4%) (12). An analysis of publications from the journal PLOS Biology reported that although 94% of all articles had at least one Scopus citation, only 14% had Twitter mentions and 22% Facebook mentions (13). The authors do note, however, that these numbers are increasing with time, and since June 2012, 93% of publications have been discussed on Twitter, and 63% mentioned on Facebook.

There does seem to be some relationship between alternative metrics and traditional citation counts. Articles mentioned on social media or stored on social bookmarking sites in the current cohort showed significantly higher citation rates and were more likely to be highly cited. Although there was a significant correlation between citations and social media measures, the correlation coefficients were low, suggesting a limited ability to predict the number of future citations from the number of social media mentions or bookmarks. This may reflect a different type of article use or readership and has been noted in previous analyses of both biomedical and nonbiomedical literature (12, 14–16). The slightly higher correlation between citation rates and social bookmarks, which has also been noted previously (17), may reflect the fact that the use of these services is more likely to be an academic interaction and therefore more closely reflects citation counts than social media interaction. Previous studies have also reported a correlation between social media activity and a journal’s IF (12).

The pattern of study types receiving the most citations (English language publications, meta-analyses, RCTs, and comparative multicenter trials) was similar between citation metrics and alternative metrics, giving more weight to the idea that alternative metrics can provide useful information about an article’s academic impact.

What is striking is that not all social media outlets had the same impact on citation rates. The most commonly used social media outlets in this cohort of transplant-related studies were Twitter and Facebook. Discussion of a research article on a blog, although rare, was associated with a higher median citation rate and chance of at least one citation than a mention on Twitter or Facebook. Tweeting (or retweeting) a mention of an article takes a matter of seconds and is often done without much thought or consideration of the content of an article. In contrast, writing a blog article about a research article requires a greater degree of involvement and requires the author to have studied the article in some depth. It is likely that articles that are of true importance and impact to a field are more likely to be worthy of the time required to construct a blog article.

There is some similarity between this finding and the finding that articles mentioned on the expert recommendation site f1000 are significantly more likely to be cited. On f1000, a panel of expert reviewers comment on articles from a selection of high-impact journals and provide recommendations. In the current data set, a recommendation on f1000 led to eightfold higher odds of citation. As with blog articles, it is only those articles of interest to the field and published in higher-impact journals that make it onto the f1000 site, which is reflected in this higher citation rate. There is some irony in the fact that, in an academic world where we are told that evidence-based medicine is key and that the days of “expert reviews” are numbered, we are turning back to sources of expert opinion to tell us what to read (and possibly cite). Part of the problem is the vast number of articles being published each year: it is impossible for a practicing clinician to read everything in a given field, and so we are turning to the experts to tell us which new studies deserve our attention.

The degree of heterogeneity in article content (as measured by the number of different MeSH used to index the articles) makes analysis of the relationship between article content and likelihood of mention or citation difficult. Qualitative analysis of the most cited, mentioned, and bookmarked articles suggests that some of the articles with highest social media attention are commentaries (editorials, letters) published in high-impact general medical journals, whereas the most highly cited articles tend to be clinical trials, reviews, consensus guidelines, and large-scale registry analyses. This analysis suggests that some of the most-mentioned articles on social media relate to the more controversial and emotive areas of transplantation, such as organ donation (particularly living donation), funding issues, and access to transplantation. This supports the notion that social media may be allowing research articles to reach a wider audience than academia alone, including patients and the general public, and that the impact measured by social media attention is different to that measured by citation rates and bookmarks.

In the present set of articles, the mean time from an article being added to Medline to the last recorded mention on social media was 122 days. This suggests that most social media mentions of an article appear early after publication, and certainly earlier than one would expect an article to accrue journal citations. Previous studies have suggested that social media mentions follow a power law, with most mentions on the day of publication and rapidly diminishing from that point forward (14, 15). Alternative metrics, therefore, may provide a useful additional tool for journals, researchers, and institutions to measure the impact of their work much earlier than had been previously possible, which could help in editorial decision making and attracting funding for ongoing work.

There are a number of important limitations to the current study. The article set used was identified using MeSH terms for solid organ transplantation and thus is reliant on accurate coding in the MEDLINE database. Some of the included articles have a broad theme of which transplantation is only a small part, although all have some relevance. The collection of alternative metrics data by Altmetric only began in July 2011. This means that the included articles have between 15 and 26 months of citation data available since publication. As demonstrated in the results, alternative metric data tend to cluster around the time of article publication, but citation counts continue to accrue for years after publication; thus, it is possible that the true measure of the impact of an article by citation count is not adequately reflected in the current data. There is, however, some evidence that early citation rates are predictive of later citations, irrespective of the article field or topic (18). Until we have gained more experience with alternative metrics data, it will not be possible to ascertain whether the relationship between alternative metrics and citation counts varies with time. In a similar fashion, the use of altmetrics is continually increasing with time. A more recent cohort would likely have greater rates of social media mention, but of course less citation data available for review. Thus, the current cohort was selected as a compromise between the increased interest in social media and the time taken to accrue citation data.

The present data rely heavily on the accuracy of the data collected by Scopus (for citation data) and Altmetric (for alternative metrics). Although there are some limitations to the collection of citation data, the main concern is in the alternative metrics data, which is still evolving as a concept (19). Altmetric uses a cluster of servers to monitor social media sites for mentions of academic articles. It will only identify an article if the mention provides a link to the original article on the publisher’s website; thus, there is the potential to miss mentions that link indirectly to the article source. Although this may mean that the true rate of alternative metric citation is underestimated, there is no reason to suggest that this would introduce bias to the current sample. Furthermore, Altmetric only collects publicly available tweets and Facebook posts, and will not identify mentions within private profiles. Finally, there are some notable data sources missing from Altmetric, including Wikipedia references and article download or view rates. These limitations are reflected in the fact that different providers of alternative metric data will return slightly different results for the same article set (19).

In conclusion, social media mentions and social bookmarks of research articles in transplantation act as early predictors of future citation and article impact. The use of alternative social media and online metrics for measuring the impact of research output is increasing, and the field of transplantation is likely to follow this trend. Earlier identification of articles of higher impact may help to inform journal editorial policy and applications for research funding. Perhaps more importantly, the use of social media may allow the results of transplant research to reach a wider audience beyond academia, increasing awareness and potentially increasing funding opportunities and donation rates.


Included Articles

The MEDLINE database (NLM PubMed) was searched for articles with MeSH terms for solid organ transplantation (kidney transplantation, pancreas transplantation, liver transplantation, lung transplantation, and heart transplantation) added between August 1, 2011, and July 31, 2012. The search was performed on October 18, 2013. Article details were extracted as Extensible Markup Language (XML) and imported into a database.
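A search of this kind can be reproduced programmatically through NCBI's E-utilities. The sketch below builds such a query; the [MHDA] field tag restricts by MeSH date, and the exact parameters are an illustration rather than the study's actual code:

```python
from urllib.parse import urlencode

# NCBI E-utilities esearch endpoint (public).
ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

MESH_TERMS = [
    "Kidney Transplantation", "Pancreas Transplantation",
    "Liver Transplantation", "Lung Transplantation",
    "Heart Transplantation",
]

def build_search_url(mindate="2011/08/01", maxdate="2012/07/31"):
    """Build a PubMed esearch URL for the solid organ transplantation
    MeSH terms, limited by MeSH date ([MHDA])."""
    term = ("(" + " OR ".join(f'"{t}"[MeSH Terms]' for t in MESH_TERMS) + ")"
            + f' AND ("{mindate}"[MHDA] : "{maxdate}"[MHDA])')
    params = {"db": "pubmed", "term": term, "retmax": 10000, "retmode": "xml"}
    return ESEARCH + "?" + urlencode(params)

print(build_search_url())
```

Fetching the resulting URL returns an XML list of PubMed IDs, which can then be imported into a local database as described above.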

Citation Data

Citation data for each article were retrieved from Scopus (Elsevier B.V.) by means of a unique identifier (PubMed ID or Digital Object Identifier) and automatically appended to the database. Citation counts were correct as of November 1, 2013.

Alternative Metrics

The online Altmetric service was used to obtain information regarding article mentions on social media (Twitter, Facebook, Google+, Reddit, Pinterest), online news outlets, expert recommendation sites (f1000), and social bookmarking sites (Mendeley, CiteULike). Altmetric servers watch these sites for mentions of academic articles and compile these data (with over 9 million mentions captured to date). Information was obtained by querying the Altmetric database with the PubMed IDs of the included articles and the resulting data added to the database. The date of the Altmetric query was November 1, 2013.
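A per-article query of this kind can be sketched against Altmetric's public v1 REST API. The endpoint and JSON field names below follow the v1 API as publicly documented, but should be treated as assumptions of this sketch rather than a description of the study's actual pipeline:

```python
import json
import urllib.error
import urllib.request

# Altmetric v1 detail endpoint, keyed by PubMed ID.
ALTMETRIC_PMID = "https://api.altmetric.com/v1/pmid/{}"

def summarise(record):
    """Pull the social media counts of interest out of one Altmetric
    JSON record (field names per the v1 API; missing fields mean no
    mentions of that kind were recorded)."""
    return {
        "tweets": record.get("cited_by_tweeters_count", 0),
        "facebook": record.get("cited_by_fbwalls_count", 0),
        "blogs": record.get("cited_by_feeds_count", 0),
        "news": record.get("cited_by_msm_count", 0),
    }

def fetch_mentions(pmid):
    """Return mention counts for one article, or None if Altmetric has
    no record of it (the API answers 404 for untracked articles)."""
    try:
        with urllib.request.urlopen(ALTMETRIC_PMID.format(pmid)) as resp:
            return summarise(json.load(resp))
    except urllib.error.HTTPError:
        return None
```

Looping `fetch_mentions` over the PubMed IDs from the literature search would yield the per-article mention counts analyzed in the Results.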

Outcomes of Interest

Outcomes of interest included citation rates and altmetric mention rates. Data were explored for associations between citation and alternative metrics. Factors likely to affect the rate of citation in traditional journals and with online services were also explored, including publication language, article type (as recorded in the Medline record) and article content or topic (by MeSH terms and qualitative analysis). The time to last mention on social media was also extracted to give a measure of the speed at which impact can be measured using alternative metrics.

Data Analysis

Nonparametric data are summarized as median and range. Parametric data are summarized as mean and standard deviation. Correlation between journal citation rates and alternative metrics was measured using Spearman’s rank correlation coefficient. The effects of moderating variables on the likelihood of an article being highly cited (defined here as 10 or more citations) are expressed as odds ratios (OR) with 95% confidence intervals. Comparison of citation counts for articles with and without the variable under analysis was performed using the Wilcoxon rank-sum test. A P value less than 0.05 is regarded as statistically significant. All statistical analyses were performed using the statistical package R.
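The three statistics above (Spearman's rank correlation, the Wilcoxon rank-sum test, and odds ratios with 95% confidence intervals) can be reproduced on toy data; the sketch below uses scipy rather than the study's R code, and the counts are invented for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
citations = rng.poisson(2.0, size=500)  # toy per-article citation counts
mentions = rng.poisson(0.3, size=500)   # toy social media mention counts

# Spearman's rank correlation between the two metrics
rho, p_rho = stats.spearmanr(citations, mentions)

# Wilcoxon rank-sum test: citation counts for mentioned vs. unmentioned
z_stat, p_rs = stats.ranksums(citations[mentions > 0],
                              citations[mentions == 0])

def odds_ratio_ci(a, b, c, d, z_crit=1.96):
    """OR and 95% CI (normal approximation on the log scale) from a
    2x2 table: a = mentioned & highly cited,   b = mentioned & not,
               c = unmentioned & highly cited, d = unmentioned & not."""
    or_ = (a * d) / (b * c)
    se_log = (1 / a + 1 / b + 1 / c + 1 / d) ** 0.5
    return or_, or_ * np.exp(-z_crit * se_log), or_ * np.exp(z_crit * se_log)

print(odds_ratio_ci(20, 80, 10, 190))  # OR = 4.75 with its 95% CI
```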


The author would like to thank Altmetric and Scopus for providing access to their application programming interfaces (APIs) for the purposes of this project. The author also thanks Peter Morris and Liset Pengel for their advice during article preparation.


1. Research Councils UK. What do research councils mean by impact? 2014 [April 29th, 2014]. Available from:
2. Smith R. Journal accused of manipulating impact factor. BMJ 1997; 314: 461.
3. The PLoS Medicine Editors. The Impact Factor Game. PLoS Med 2006; 3: e291.
4. Falagas ME, Alexiou VG. The top-ten in journal impact factor manipulation. Arch Immunol Ther Exp (Warsz) 2008; 56: 223.
5. Editorial. Not-so-deep impact. Nature 2005; 435: 1003.
6. Alberts B. Impact factor distortions. Science 2013; 340: 787.
7. Cucchetti A, Zanello M, Bigonzi E, et al. The use of social networking to explore knowledge and attitudes toward organ donation in Italy. Minerva Anestesiol 2012; 78: 1109.
8. D’Alessandro AM, Peltier JW, Dahl AJ. A large-scale qualitative study of the potential use of social media by university students to increase awareness and support for organ donation. Prog Transplant 2012; 22: 183.
9. D’Alessandro AM, Peltier JW, Dahl AJ. The impact of social, cognitive and attitudinal dimensions on college students’ support for organ donation. Am J Transplant 2012; 12: 152.
10. Cameron AM, Massie AB, Alexander CE, et al. Social media and organ donor registration: the Facebook effect. Am J Transplant 2013; 13: 2059.
11. Chang A, Anderson EE, Turner HT, et al. Identifying potential kidney donors using social networking web sites. Clin Transplant 2013; 27: E320.
12. Haustein S, Peters I, Sugimoto CR, et al. Tweeting biomedicine: an analysis of tweets and citations in the biomedical literature. J Assoc Info Sci Technol 2014; 65: 656.
13. Fenner M. What can article-level metrics do for you? PLoS Biol 2013; 11: e1001687.
14. Eysenbach G. Can tweets predict citations? Metrics of social impact based on Twitter and correlation with traditional metrics of scientific impact. J Med Internet Res 2011; 13: e123.
15. Shuai X, Pepe A, Bollen J. How the scientific community reacts to newly submitted preprints: article downloads, Twitter mentions, and citations. PLoS One 2012; 7: e47523.
16. Thelwall M, Haustein S, Lariviere V, et al. Do altmetrics work? Twitter and ten other social web services. PLoS One 2013; 8: e64841.
17. Li X, Thelwall M, Giustini D. Validating online reference managers for scholarly impact measurement. Scientometrics 2012; 91: 461.
18. Wang D, Song C, Barabasi AL. Quantifying long-term scientific impact. Science 2013; 342: 127.
19. Chamberlain S. Consuming article-level metrics: observations and lessons. Information Standards Quarterly 2013; 25: 4.

Social media; Transplantation; Impact; Internet; Altmetrics

Supplemental Digital Content

© 2014 by Lippincott Williams & Wilkins