
After the Match: The Doximity Dilemma

Cook, Thomas MD

doi: 10.1097/01.EEM.0000472670.79286.29
After the Match

Dr. Cook is the program director of the emergency medicine residency at Palmetto Health Richland in Columbia, SC. He is also the founder of 3rd Rock Ultrasound (http://emergencyultrasound.com). Friend him at www.facebook.com/3rdRockUltrasound, follow him @3rdRockUS, and read his past columns at http://bit.ly/CookCollection.


An interesting drama in emergency medicine graduate medical education is playing out behind the scenes. On one side are residency training programs for emergency medicine. On the other is Doximity, a website that describes itself as an online professional network for physicians. Doximity is trying to rate all medical specialty training programs in an attempt to mimic what U.S. News and World Report does with college rankings (and ESPN does with college football). The implied goal is to give applicants an idea of which programs are the best. Doximity's ultimate goal, however, is something else entirely.

Doximity already ranks residency programs for all specialties based on very suspect metrics, the primary one being reputation. On their website they describe their methodology with the following statement: “The default rank for programs prior to any filters being applied — such as location, preferences, or fellowships — is based on the count of peer nominations, in descending order.”

These peer nominations are collected by literally asking the respondent to rank his top five choices for best program. I know people like to check out rankings for everything from fish tacos to kitten videos, but do we seriously want an applicant to choose a program based on what an emergency medicine residency alumnus entered on his cell phone after a few beers with his classmates? How would any current resident or alumnus have the capacity to create an accurate and valid rank list? I have been a program director for 15 years and visited many programs, and I couldn't do this with any realistic chance of accuracy.

Doximity makes a big deal of the fact that its rank list is derived from peer nominations of more than 17,000 verified physicians. By way of comparison, there are more than 970,000 physicians in the United States. Doximity also does not break down which programs the peer nominators attended or whether they included their own programs in their rankings.

It is unclear how factors such as ABEM board exam pass rates, faculty experience, publications by residents, graduates entering fellowships, ultrasound training, global health, research facilities, and any one of a host of other factors are incorporated into the calculation for top programs. Instead, it seems to be a popularity contest that entices millennials who like social media.

Recently, emergency medicine program directors banded together to say, no, thanks. The Council of Residency Directors for Emergency Medicine (CORDEM) sent a letter to Doximity asking it to stop ranking emergency medicine programs. It is an impressive show of unity from a specialty that often bucks conformity, and it speaks volumes about what is special about emergency medicine.

There's another perspective, though. Should applicants have access to this type of information if it is accurate and available? The simple answer is yes. It is the Information Age, after all, and everyone looks up this type of stuff on the web. The issue is who is providing the information and for what reason. Residency training is one of the most important decisions of a physician's life.

Prior to Doximity, program information was available through professional organizations such as the Society for Academic Emergency Medicine and the Emergency Medicine Residents' Association. None of these organizations directly pursues profit via these services, but they provide only bare-bones statistics and no editorial content. To get that, applicants seek advice from faculty mentors, current and past residents, and blogs.

Doximity, on the other hand, is a for-profit enterprise bartering for your attention and personal data. They are intent on rating programs because it brings in viewers and generates profit. After receiving CORDEM's letter, Doximity put together an EM Advisory Board and solicited program directors to participate. Is this manipulative or smart? Profit-driven enterprises can produce great work. They often have better funding and are incentivized by profits to continuously improve their product. This is often not the case in organizations lacking incentives for production. Does Doximity's effort to involve program directors demonstrate the integrity to produce a reliable source of information?

What evolves from this might be interesting. If a single site provided a way to attractively demonstrate program attributes and compare programs offering certain types of experiences such as global health, air EMS, rural medicine, and ultrasound, it might be an efficient way for applicants to search for information and compare apples to apples. But CORDEM will not tolerate ranking systems that throw educational colleagues under the bus, and my impression is that this will never be negotiable.



Comments? Write to us at emn@lww.com.

Copyright © 2015 Wolters Kluwer Health, Inc. All rights reserved.