Beyond Citation Rates: A Real-Time Impact Analysis of Health Professions Education Research Using Altmetrics

Maggio, Lauren A. PhD; Meyer, Holly S. PhD; Artino, Anthony R. Jr PhD

doi: 10.1097/ACM.0000000000001897
Research Reports

Purpose To complement traditional citation-based metrics, which take years to accrue and indicate only academic attention, academia has begun considering altmetrics or alternative metrics, which provide timely feedback on an article’s impact by tracking its dissemination via nontraditional outlets, such as blogs and social media, across audiences. This article describes altmetrics and examines altmetrics attention, outlets used, and top article characteristics for health professions education (HPE) research.

Method Using Altmetric Explorer, a tool to search altmetrics activity, the authors searched for HPE articles that had at least one altmetrics event (e.g., an article was tweeted or featured in a news story) between 2011 and 2015. Retrieved articles were analyzed using descriptive statistics. In addition, the 10 articles with the highest Altmetric Attention Scores were identified and their key characteristics extracted.

Results The authors analyzed 6,265 articles with at least one altmetrics event from 13 journals. Articles appeared in 14 altmetrics outlets. Mendeley (161,470 saves), Twitter (37,537 tweets), and Facebook (1,650 posts) were most popular. The number of HPE articles with altmetrics attention increased 145%, from 539 published in 2011 to 1,321 in 2015. In 2015, 50% or more of the articles in 5 journals received altmetrics attention. Themes for articles with the most altmetrics attention included social media or social networking; three such articles were written as tips or guides.

Conclusions Increasing altmetrics attention signals interest in HPE research and the need for further investigation. Knowledge of popular and underused outlets may help investigators strategically share research for broader dissemination.

L.A. Maggio is associate professor of medicine and associate director of distance learning and technology, Graduate Programs in Health Professions Education, Uniformed Services University of the Health Sciences, Bethesda, Maryland; ORCID: http://orcid.org/0000-0002-2997-6133.

H.S. Meyer is assistant professor of medicine, Uniformed Services University of the Health Sciences, Bethesda, Maryland; Twitter: @hollysmeyer; ORCID: http://orcid.org/0000-0001-8833-8003.

A.R. Artino Jr is professor of medicine and deputy director, Graduate Programs in Health Professions Education, Uniformed Services University of the Health Sciences, Bethesda, Maryland; Twitter: @mededdoc; ORCID: http://orcid.org/0000-0003-2661-7853.

Data sharing: The data analyzed for this study have been deposited in Figshare and are available at https://doi.org/10.6084/m9.figshare.4884569.v2.

Funding/Support: None reported.

Other disclosures: None reported.

Ethical approval: Reported as not applicable.

Disclaimer: The views expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the Uniformed Services University of the Health Sciences, the Department of Defense, or the U.S. Government.

Supplemental digital content for this article is available at http://links.lww.com/ACADMED/A474.

Correspondence should be addressed to Lauren A. Maggio, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Rd., Bethesda, MD 20814-4799; e-mail: lauren.maggio@usuhs.edu; Twitter: @laurenmaggio.

Written work prepared by employees of the Federal Government as part of their official duties is, under the U.S. Copyright Act, a “work of the United States Government” for which copyright protection under Title 17 of the United States Code is not available. As such, copyright does not extend to the contributions of employees of the Federal Government.

Traditionally, researchers, including health professions education (HPE) investigators, have published research articles with the hope that colleagues will read and ultimately cite their work. In this dissemination model, citation counts are a measure of “scientific impact” and are often rewarded in academia.1,2 However, citation-based metrics provide only a single view of a researcher’s scientific impact, take years to accumulate, and may be poor indicators of practical impact in fields such as clinical medicine.3,4 Additionally, a citation-focused approach disregards calls from funders and the public for broader research dissemination outside academia.5,6 To meet this demand, stakeholders, including researchers, journals, and academic institutions, are sharing and promoting their research via alternative channels, such as news media, blogs, and social media outlets like Facebook and Twitter. While this dissemination approach may help broadcast research discoveries, use of these alternative communication channels is not captured by traditional citation counts. Alternative metrics, or altmetrics, have been developed to complement traditional citation-based metrics and provide a summary of how research is shared and discussed online, including by the public.7,8

Strictly speaking, altmetrics are defined as the collection of digital indicators related to scholarly work, with the indicators derived from activity and engagement among diverse stakeholders and scholarly outputs in the research ecosystem, including the public sphere.9 More simply, altmetrics are “web-based metrics for the impact of scholarly material, with an emphasis on social media outlets as sources of data.”10 In practice, altmetrics may measure activity in outlets that save articles (e.g., Mendeley, CiteULike), discuss articles (e.g., Twitter, blogs), and recommend articles (e.g., Faculty of 1000), among others. See a related AM Last Page11 and Supplemental Digital Appendix 1 at http://links.lww.com/ACADMED/A474 for more about the outlets tracked by Altmetric, one company that measures altmetrics.

To help stakeholders tell a cohesive story of the impact of their work, companies such as Altmetric, Plum Analytics, and Impactstory aggregate these digital indicators and, in some cases, calculate numeric scores to provide an overall sense of an article’s impact. For example, the Academic Medicine article “Why Medical Schools Should Embrace Wikipedia: Final-Year Medical Student Contributions to Wikipedia Articles for Academic Credit at One School”12 has an Altmetric Attention Score of 210 as of July 5, 2017, which puts it in the top 5% of all articles scored by Altmetric (ranked no. 28,207 of 7,842,929 outputs). (For reference, in 2016, Altmetric Attention Scores ranged from 1 to 8,063.13) For this particular article, the Altmetric Attention Score included coverage from news outlets, blogs, Twitter, Facebook, Wikipedia, Reddit, and Mendeley. Notably, this article, which was published online September 13, 2016, has been cited only once, which contrasts with the relatively extensive altmetrics attention it has received since its publication.
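
For readers curious how such per-article figures can be retrieved programmatically, the short sketch below queries Altmetric’s free public details API for a single DOI. It is illustrative only and was not part of this study’s method; the response field names shown (e.g., cited_by_tweeters_count) are assumptions based on the public API and should be checked against Altmetric’s current documentation.

```python
# Minimal sketch: look up the Altmetric Attention Score and a few outlet counts
# for one DOI via Altmetric's free details API. The field names used here are
# assumptions and may differ from the current API response.
import requests

def fetch_altmetric_summary(doi):
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    if resp.status_code == 404:
        return None  # Altmetric has recorded no events for this DOI
    resp.raise_for_status()
    data = resp.json()
    return {
        "score": data.get("score"),
        "tweets": data.get("cited_by_tweeters_count", 0),
        "facebook_posts": data.get("cited_by_fbwalls_count", 0),
        "news_stories": data.get("cited_by_msm_count", 0),
    }

# Example: the DOI of the present article.
print(fetch_altmetric_summary("10.1097/ACM.0000000000001897"))
```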

In addition to complementing traditional citation-based metrics, altmetrics provide authors with several benefits, including measuring the speed, breadth, diversity, and openness of the attention that articles receive.5 Unlike traditional citation counts, altmetrics provide a more immediate indication of attention. For example, an article published Monday morning could be tweeted minutes after its publication or featured on a blog on Tuesday. Altmetrics capture this dynamic activity. Altmetrics also provide broad and diverse coverage from multiple sources (e.g., newspapers, policy documents, syllabi) and fields (e.g., psychology, biomedicine, linguistics). Lastly, altmetrics information tends to be more openly and readily accessible than traditional metrics. For example, when accessing the Academic Medicine Web site, readers can view each article’s Altmetric Attention Score and a detailed report of the related altmetrics events.

Researchers in several fields, including biomedicine,3 have studied the use of altmetrics to provide information on the scientific impact of research. Across disciplines, studies have explored the prevalence of altmetrics outlets and identified positive associations between altmetrics attention and citation counts.14 This information is valuable to authors and increasingly important to funders,15 policy makers,6 and promotion and tenure committees.16,17

In HPE, we currently lack a sense of the altmetrics attention that our research garners. This lack of knowledge limits our understanding of which, if any, altmetrics outlets have good coverage of HPE research versus which do not. Knowledge of popular and underused outlets for HPE research may help investigators (and journal editors) strategically share their research for broader dissemination. Additionally, identifying popular altmetrics outlets may suggest to HPE investigators possible outlets worth monitoring for trending topics in the field. Also, as HPE scholars and institutions begin to consider altmetrics as a complementary means of determining scientific impact, it is important for the field to have a baseline understanding of the attention that HPE research receives from altmetrics outlets.

The aim of the analysis presented here was to describe the attention that HPE articles receive in altmetrics outlets. Specifically, we examined the overall growth in altmetrics attention, the altmetrics outlets used, altmetrics attention by journal, and characteristics of articles receiving the most altmetrics attention.

Method

To identify our dataset, we used Altmetric Explorer (Altmetric, London, England), a search tool that queries Altmetric’s database of altmetrics activity for more than 7.4 million articles.18 In comparison, PubMed contains 27 million citations.19 We selected Explorer because it focuses on altmetrics events for journal articles. Because our study did not include human participants, we did not seek ethical approval.

Articles are continually added to the Altmetric database when they have their first altmetrics event. Although the database primarily includes recent articles (Altmetric began tracking the majority of altmetrics outlets in 2011),20 there is no cutoff for articles based on age. For example, the Altmetric database contains historical articles, including a medical article published in 1809.21

We searched Explorer on July 7, 2016, using the names of PubMed-indexed HPE journals. Our search parameters included HPE articles with at least one altmetrics event that occurred at any time. We selected the included HPE journals using previous HPE citation studies that identified journals in this field by topic,22 expert selection,23 and impact factor.24 We limited the search to PubMed-indexed journals as this enabled us to also retrieve the total number of articles published by each journal annually. We included the following journals: Academic Medicine, Advances in Health Sciences Education, BMC Medical Education, Canadian Medical Education Journal, Clinical Teacher, International Journal of Medical Education, Journal of Advances in Medical Education, Journal of Continuing Education in the Health Professions (JCEHP), Journal of Graduate Medical Education, Medical Education, Medical Education Online, Medical Teacher, Perspectives on Medical Education, and Teaching and Learning in Medicine.

Our Explorer search retrieved 7,350 articles and the related metadata. These metadata included citation information (e.g., article title, journal title, publication year), counts of altmetrics events for each outlet tracked by Altmetric, and the article’s Altmetric Attention Score. We exported the data from Explorer to Excel (Microsoft, Redmond, Washington) for analysis. This spreadsheet is publicly available on Figshare at https://doi.org/10.6084/m9.figshare.4884569.v2. Because of a technical error when we exported the data, the articles from JCEHP were inadvertently excluded. Therefore, we did not include this journal and its 203 articles with altmetrics events in our analysis.
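
As a rough illustration of how the deposited spreadsheet can be explored, the sketch below loads a local copy into pandas. The file name and column headers (e.g., “Journal”) are illustrative assumptions, not the confirmed headers of the Figshare file.

```python
# Minimal sketch (assumes the Figshare spreadsheet has been downloaded locally as
# "hpe_altmetrics.xlsx"; the column names below are guesses at the Explorer
# export format, not confirmed headers).
import pandas as pd

df = pd.read_excel("hpe_altmetrics.xlsx")

print(df.shape)                      # rows = articles, columns = metadata fields
print(df["Journal"].value_counts())  # articles with altmetrics events per journal
```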

To supplement the Explorer data, which did not include information about the journals’ social media presence, we searched Twitter and Facebook to determine whether a journal had a dedicated Twitter or Facebook account. If a journal had either, we noted the number of followers and/or likes. Using PubMed, we identified how many articles were published in each journal annually. We searched Twitter and PubMed on July 19, 2016, and Facebook on October 3, 2016, to gather these data, which we added to the Excel file.

We reviewed the composite Excel file and combined citation counts for journals that had undergone name changes. We excluded entries that were not articles, such as tables of contents, oral abstracts, and journal mastheads (n = 656), as well as entries that were published online ahead of print after 2015 or were not in PubMed (n = 226). After these exclusions and the removal of the JCEHP citations, we were left with 6,265 articles for our analysis. We analyzed these remaining articles and their related altmetrics attention using descriptive statistics.
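
Continuing the hypothetical sketch above, descriptive statistics of this kind can be produced directly from the cleaned DataFrame; the column names remain assumptions.

```python
# Continuation of the sketch: descriptive statistics on the cleaned data.
# Assumes `df` now holds the 6,265 included articles, with an assumed
# "Publication Year" column and one assumed count column per outlet.
articles_per_year = df["Publication Year"].value_counts().sort_index()
outlet_totals = df[["Mendeley", "Twitter", "Facebook"]].sum()

print(articles_per_year.loc[2011:2015])  # articles with altmetrics attention by year
print(outlet_totals)                     # total saves, tweets, and posts
```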

We also identified the 10 articles in the overall sample with the highest Altmetric Attention Scores. The company Altmetric calculates an article’s Altmetric Attention Score by aggregating and weighting the attention that the article receives from the tracked altmetrics outlets,20 with an altmetrics event defined as an activity in any altmetrics outlet (e.g., a tweet or Facebook post). Using PubMed, we then extracted the characteristics of these articles, such as publication type, funding source, and how they were described using major Medical Subject Headings (MeSH). We also searched for these 10 articles in Web of Science to retrieve the number of times each had been cited.
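
In code terms, this top-10 selection reduces to a single ranking step over the same hypothetical DataFrame; again, the column names are assumed.

```python
# Continuation of the sketch: rank articles by Altmetric Attention Score and keep
# the 10 highest. "Title" and "Altmetric Attention Score" are assumed columns.
top10 = df.nlargest(10, "Altmetric Attention Score")
print(top10[["Title", "Journal", "Altmetric Attention Score"]])
```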

Results

We analyzed 6,265 articles from 13 journals that had at least one altmetrics event at any point in time. Articles in this sample were published between 1956 and 2015, with 77.6% (n = 4,860) published between 2011 and 2015, 19.9% (n = 1,247) between 2001 and 2010, and 2.5% (n = 158) between 1956 and 2000.

Between 2011 and 2015, there were 8,431 HPE articles published in the included journals. Of those articles, 4,860 (57.6%) had at least one altmetrics event. Over this time period, the number of HPE articles with at least one altmetrics event increased 145% from 539 articles published in 2011 to 1,321 articles published in 2015 (see Figure 1).
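
As a quick check of the growth figure, the reported counts imply an increase of about 145%:

```python
# Worked check: growth from 539 articles (2011) to 1,321 articles (2015).
percent_increase = (1321 - 539) / 539 * 100
print(round(percent_increase))  # -> 145
```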

Figure 1

Articles received attention in 14 of the 16 outlets tracked by Altmetric. Mendeley was by far the most popular outlet, with 161,470 saves. On average, articles were saved to Mendeley 27 times. Twitter and Facebook were the second and third most popular outlets, with 37,537 tweets (average 6 tweets per article) and 1,650 posts (average 1 post per article), respectively. There was limited attention in the other outlets (see Table 1), and none of the identified HPE articles received attention on tracked book review Web sites or question-and-answer Web sites.

Table 1

Between 2011 and 2015, the number of articles with altmetrics activity generally increased across all journals. However, BMC Medical Education, Journal of Advances in Medical Education, and Medical Teacher each had fewer altmetrics events in 2015 than they did in 2014 (see Table 2).

Table 2

In 2015, 50% or more of the articles in five of the included journals had at least one altmetrics event. In these five journals, articles had a range of Altmetric Attention Scores (see Table 3 for the scores for these five journals and Supplemental Digital Appendix 2 at http://links.lww.com/ACADMED/A474 for the scores for all journals). For articles published in 2015, the highest Altmetric Attention Score for an HPE article was 292, with an average score of 7.11 and a median score of 3.25 For context, an Altmetric Attention Score of 292 is in the top 5% of all research outputs tracked by Altmetric.

Table 3

Five journals had Twitter handles (e.g., @AcadMedJournal), with a range of followers: Academic Medicine (4,664 followers), Clinical Teacher (963 followers), Journal of Graduate Medical Education (1,523 followers), Medical Education Online (1,737 followers), and Perspectives on Medical Education (262 followers). Medical Education Online and Perspectives on Medical Education both had Facebook pages, with 262 and 2,316 likes, respectively.

For the top 10 articles by Altmetric Attention Score, publication dates ranged from 2011 to 2015 (see Table 4).26–35 Nine of the articles had been cited.26–33,35 Citation information was unavailable for one of these articles34 because it was not included in Web of Science. Citation counts ranged from the fifth-ranked article (Altmetric Attention Score of 150), which was published in 2015, with a single citation,32 to the sixth-ranked article (Altmetric Attention Score of 138), which was published in 2011, with 187 citations.33 Two of the articles were described as review articles28,33 and one as an observational study.32 The available metadata for the remaining articles did not include publication type. The three articles with the highest Altmetric Attention Scores were published by Medical Teacher and were all related to social media.26,29,31 Five articles addressed social media or social networking.26,28–31 Additionally, three were tagged as focusing on methods.28,31,35 Based on the article titles, one article was a guide26 and two were “Twelve Tips” articles,29,31 which tend to offer action-oriented strategies for HPE scholars.

Table 4

Discussion

Since 2011, overall altmetrics attention for HPE articles has increased. The reasons for this increase are uncertain; it may reflect multiple factors, such as the rise in the number of HPE articles published and/or the growth in social media use. However, this increasing attention suggests that there is general interest in HPE research and underscores the need for further exploration of altmetrics. It also suggests that HPE scholars can consider altmetrics outlets as potential resources to help promote their research. Notably, the majority of articles with altmetrics attention have been published since 2011. However, articles dating back more than 40 years, though less common, were reintroduced into today’s scientific conversation through altmetrics outlets.

While the majority of tracked altmetrics outlets featured HPE articles, some were more widely used than others, with Mendeley and social media outlets, such as Twitter and Facebook, being the most popular. Overall, Mendeley was the most used altmetrics outlet. Mendeley is a free reference manager that enables academics to save, organize, and share article references; it also includes collaboration tools. Altmetrics attention in Mendeley is measured as number of “saves,” meaning that an individual has saved a reference to an article to her or his Mendeley account. While the level of Mendeley attention is encouraging, this resource is aimed at academics and may not answer calls to expand the reach of research beyond academia.6 From our preliminary investigation, we cannot ascertain whether Mendeley saves were made by HPE researchers or by academics from other disciplines. Looking forward, researchers might further investigate Mendeley saves for their own articles. In doing so, they can identify scholars from within and outside HPE who may be interested in forging collaborative partnerships.

Social media outlets, especially Twitter, were used for sharing HPE articles. This finding aligns with those from research specific to medical education28 and from other scientific disciplines more broadly.36 Twitter’s popularity may be partially due to the low bar for engagement in tweeting or retweeting journal article information, but it also may be due to Twitter’s popularity with scholars as a means to communicate their own and their colleagues’ research and to engage in discussions about that work.26,37 Additionally, several HPE articles, including those with the highest Altmetric Attention Scores, provide guidance on the value of Twitter and how to use it.26,29 Finally, 5 of the 13 journals we assessed had dedicated Twitter accounts, which may also contribute to the relatively high volume of Twitter use.

Several altmetrics outlets were used less frequently, and two were not used at all. This may indicate an opportunity for HPE researchers to more broadly disseminate their research using new channels. For example, there were only 38 mentions of HPE research in Wikipedia. Wikipedia, an interactive online encyclopedia, is a heavily trafficked resource frequented by a wide variety of readers. Wikipedia covers content related to HPE (e.g., topics on motivation, team-based learning, research methods). However, these HPE-relevant topics generally do not reference the HPE literature. For example, the education section of the evidence-based medicine (EBM) Wikipedia entry does not include three recent systematic reviews on teaching EBM. We consider this a missed opportunity, especially considering that this entry received over 18,000 pageviews in March 2017 alone. HPE articles also received limited news media coverage, which is critical for reaching beyond scholarly audiences. Members of the public, including practicing physicians, rely on news media, rather than academic journals, to learn about new research.38

At the journal level, all but 3 journals saw their number of articles with altmetrics attention increase from 2014 to 2015. Of the 10 journals with increased attention, 5 had social media accounts, which may have contributed to this increase. None of the 3 journals that received less altmetrics attention in 2015 had social media accounts. The cause of these decreases is unclear and suggests the need for further exploration to better understand what impacts altmetrics attention for HPE journals. For example, as indicated in traditional citation-based metrics research, certain characteristics, such as article type (e.g., review articles)24 and open access status,39 affect citation rates. Presently, it is unclear whether these factors also influence altmetrics attention, thus suggesting another area for future exploration. Additionally, researchers might consider examining how the quality of an article40 relates to the altmetrics attention that it receives.

Researchers have raised the question of whether altmetrics are “just empty buzz”8 as opposed to valuable measures of scholarly impact. Limited research to date has found moderate positive correlations between altmetrics attention and future citations40; however, no such research has been attempted in HPE. Such information is likely of interest to HPE researchers in pursuit of promotion and tenure, a process that is often informed by a candidate’s traditional citation-based metrics, such as citation counts. With respect to promotion and tenure, there have been concerns that authors can self-generate altmetrics events for their research (e.g., tweeting their own article or asking a friend to blog about it). This form of self-promotion could potentially inflate an article’s Altmetric Attention Score and should be considered by committees reviewing a promotion dossier. Nonetheless, journals, funders, authors, and their institutions are interested in articles being widely disseminated through social media and increasingly are encouraging authors to promote their research through multiple channels.6 Future research is needed to ascertain the potential relationships between altmetrics and traditional citation-based metrics in HPE and the potential role of self-generated altmetrics events.
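
As a sketch of what such future work might look like, the snippet below computes a rank correlation between attention scores and citation counts. It assumes a hypothetical dataset in which each article carries both an Altmetric Attention Score and a Web of Science citation count; no such analysis was performed in this study, and the file and column names are illustrative.

```python
# Hypothetical sketch of a future analysis: rank correlation between altmetrics
# attention and citation counts for HPE articles. File and column names are
# assumptions; this analysis was not part of the present study.
import pandas as pd
from scipy.stats import spearmanr

data = pd.read_excel("hpe_altmetrics_with_citations.xlsx")
rho, p_value = spearmanr(data["Altmetric Attention Score"],
                         data["Web of Science Citations"])
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```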

Another limitation to using altmetrics, which is also a problem for traditional citation counts, is that a large number of altmetrics events or a high Altmetric Attention Score does not necessarily represent a mark of quality. For example, the infamous 1998 article linking vaccines and autism has an Altmetric Attention Score of 2,216.41 This retracted article continues to receive traditional and altmetrics attention, with three news stories referencing it in March 2017. Additionally, altmetrics do not convey the tone of the event. For example, the recent altmetrics events related to the retracted article mentioned above may be critical or positive in nature.

This study is a first attempt to describe altmetrics attention for the HPE literature and should be considered in light of its limitations. To begin, we used data from outlets tracked by Altmetric, which is one of a handful of altmetrics aggregation companies. It is possible that if we had used a different company’s search tool, our results would have varied. Furthermore, we focused on articles in HPE-specific journals that are searchable in PubMed. This may have excluded HPE-relevant articles published in clinical journals. Lastly, we lost data from JCEHP because of a technical error in the export process. However, based on an Explorer search in March 2017, we confirmed that the exclusion of the JCEHP articles from our analysis did not impact our list of the 10 articles with the highest Altmetric Attention Scores.

Conclusion

We have described altmetrics and provided a snapshot of HPE research dissemination via altmetrics channels, which represent potential avenues for future research. Growing attention to HPE scholarship, as measured by the increase in the number of altmetrics events, signals a broad interest in HPE research and the need for further investigation. Knowledge of the popular and underused outlets for sharing HPE research may help investigators strategically promote their work for broader dissemination. HPE researchers might also consider disseminating their work through the less frequently used altmetrics outlets and strategize about how best to benefit from their reach.

Acknowledgments:

The authors would like to thank Altmetric for allowing them to use Explorer. They also greatly appreciate Sebastian Uijtdehaage’s thoughtful feedback on early versions of this article.

References

1. Eysenbach G. Citation advantage of open access articles. PLoS Biol. 2006;4:e157.
2. Galligan F, Dyas-Correia S. Altmetrics: Rethinking the way we measure. Serials Rev. 2013;39:56–61.
3. Fenner M. What can article-level metrics do for you? PLoS Biol. 2013;11:e1001687.
4. van Eck NJ, Waltman L, van Raan AF, Klautz RJ, Peul WC. Citation analysis may severely underestimate the impact of clinical research as compared to basic research. PLoS One. 2013;8:e62395.
5. Bornmann L. Do altmetrics point to the broader impact of research? An overview of benefits and disadvantages of altmetrics. J Informetr. 2014;8:895–903.
6. Meisel ZF, Gollust SE, Grande D. Translating research for health policy decisions: Is it time for researchers to join social media? Acad Med. 2016;91:1341–1343.
7. Piwowar H. Altmetrics: Value all research products. Nature. 2013;493:159.
8. Priem J, Taraborelli D, Groth P, Neylon C. Altmetrics: A manifesto. http://altmetrics.org/manifesto/. Published October 26, 2010. Accessed July 5, 2017.
9. National Information Standards Organization (NISO). Outputs of the NISO Alternative Assessment Metrics Project (NISO RP-25-2016). Baltimore, MD: National Information Standards Organization; 2016.
10. Shema H, Bar-Ilan J, Thelwall M. Do blog citations correlate with a higher number of future citations? Research blogs as a potential source for alternative metrics. J Assoc Info Sci Tech. 2014;65:1018–1027.
11. Meyer H, Artino AR Jr, Maggio LA. Tracking the scholarly conversation in health professions education: An introduction to altmetrics. Acad Med. 2017;92:1501.
12. Azzam A, Bresler D, Leon A, et al. Why medical schools should embrace Wikipedia: Final-year medical student contributions to Wikipedia articles for academic credit at one school. Acad Med. 2017;92:194–200.
13. Altmetric. Top 100 articles 2016. https://www.altmetric.com/top100/2016. Accessed July 5, 2017.
14. Thelwall M, Haustein S, Larivière V, Sugimoto CR. Do altmetrics work? Twitter and ten other social Web services. PLoS One. 2013;8:e64841.
15. Dinsmore A, Allen L, Dolby K. Alternative perspectives on impact: The potential of ALMs and altmetrics to inform funders about research impact. PLoS Biol. 2014;12:e1002003.
16. Konkiel S, Sugimoto C, Williams S. The use of altmetrics in promotion and tenure. Educause Rev. 2016;51:54–55.
17. Cabrera D. Mayo Clinic includes social media scholarship activities in academic advancement. MCSMN Blog. May 25, 2016. https://socialmedia.mayoclinic.org/discussion/mayo-clinic-includes-social-media-scholarship-activities-in-academic-advancement/. Accessed July 5, 2017.
18. Altmetric. Explorer for Institutions. https://www.altmetric.com/products/explorer-for-institutions/. Accessed July 5, 2017.
19. National Library of Medicine. PubMed. https://www.ncbi.nlm.nih.gov/pubmed/. Accessed July 5, 2017.
20. Altmetric. How is the Altmetric Attention Score calculated? https://help.altmetric.com/support/solutions/articles/6000060969-how-is-the-altmetric-attention-score-calculated. Published November 24, 2016. Accessed July 5, 2017.
21. Babington W. A case of exposure to the vapour of burning charcoal. Med Chir Trans. 1809;1:83–98.
22. Lee K, Whelan JS, Tannery NH, Kanter SL, Peters AS. 50 years of publication in the field of medical education. Med Teach. 2013;35:591–598.
23. Reed DA, Cook DA, Beckman TJ, Levine RB, Kern DE, Wright SM. Association between funding and quality of published medical education research. JAMA. 2007;298:1002–1009.
24. Rotgans JI. The themes, institutions, and people of medical education research 1988-2010: Content analysis of abstracts from six journals. Adv Health Sci Educ Theory Pract. 2012;17:515–527.
25. Altmetric. When did Altmetric start tracking attention to each source? https://help.altmetric.com/support/solutions/articles/6000136884-when-did-altmetric-start-tracking-attention-to-each-attention-source. Published March 17, 2017. Accessed July 5, 2017.
26. Choo EK, Ranney ML, Chan TM, et al. Twitter as a tool for communication and knowledge exchange in academic medicine: A guide for skeptics and novices. Med Teach. 2015;37:411–416.
27. Chen C, Petterson S, Phillips RL, Mullan F, Bazemore A, O’Donnell SD. Toward graduate medical education (GME) accountability: Measuring the outcomes of GME institutions. Acad Med. 2013;88:1267–1280.
28. Cheston CC, Flickinger TE, Chisolm MS. Social media use in medical education: A systematic review. Acad Med. 2013;88:893–901.
29. Forgie SE, Duff JP, Ross S. Twelve tips for using Twitter as a learning tool in medical education. Med Teach. 2013;35:8–14.
30. Jalali A, Wood TJ. Tweeting during conferences: Educational or just another distraction? Med Educ. 2013;47:1129–1130.
31. Kind T, Patel PD, Lie D, Chretien KC. Twelve tips for using social media as a medical educator. Med Teach. 2014;36:284–290.
32. McKinley J, Dempster M, Gormley GJ. “Sorry, I meant the patient’s left side”: Impact of distraction on left–right discrimination. Med Educ. 2015;49:427–435.
33. Neumann M, Edelhäuser F, Tauschel D, et al. Empathy decline and its reasons: A systematic review of studies with medical students and residents. Acad Med. 2011;86:996–1009.
34. Sullivan GM, Feinn R. Using effect size—Or why the P value is not enough. J Grad Med Educ. 2012;4:279–282.
35. Ziring D, Danoff D, Grosseman S, et al. How do medical schools identify and remediate professionalism lapses in medical students? A study of U.S. and Canadian medical schools. Acad Med. 2015;90:913–920.
36. Andersen JP, Haustein S. Influence of study type on Twitter activity for medical research papers. https://arxiv.org/ftp/arxiv/papers/1507/1507.00154.pdf. Published 2015. Accessed July 17, 2017.
37. Young JR. 10 high fliers on Twitter. http://www.chronicle.com/article/10-High-Fliers-on-Twitter/16488. Published April 10, 2009. Accessed July 5, 2017.
38. Jensen JD, Krakow M, John KK, Liu M. Against conventional wisdom: When the public, the media, and medical practice collide. BMC Med Inform Decis Mak. 2013;13(suppl 3):S4.
39. Craig ID, Plume AM, McVeigh ME, Pringle J, Amin M. Do open access articles have greater citation impact? A critical review of the literature. J Informetr. 2007;1:239–248.
40. Costas R, Zahedi Z, Wouters P. Do “altmetrics” correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective. J Assoc Info Sci Tech. 2015;66:2003–2019.
41. Wakefield AJ, Murch SH, Anthony A, et al. RETRACTED: Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children. Lancet. 1998;351:637–641.

© 2017 by the Association of American Medical Colleges