Mushlin, Stuart B. MD; Katz, Joel T. MD
Dr. Mushlin is director, Preliminary Medicine Internship, and master clinician, Department of Medicine, Brigham and Women’s Hospital, Harvard Medical School, Boston, Massachusetts.
Dr. Katz is director, Internal Medicine Residency, and vice chair for education, Department of Medicine, Brigham and Women’s Hospital, Harvard Medical School, Boston, Massachusetts.
Other disclosures: None.
Ethical approval: Not applicable.
Correspondence should be addressed to Dr. Katz, 75 Francis St., Boston, MA 02115; telephone: (617) 732-5540; e-mail: firstname.lastname@example.org.
In this issue, Grimm and Maxfield report the results of an analysis of the outcomes of manuscripts listed as “provisionally accepted,” “accepted,” “in press,” and “submitted” on applications to a university radiology residency program. Their surprising finding that one-third of manuscripts listed as “accepted” or “in press” were not published two years after being included on an application raises questions about the reasons for these discrepancies.
The authors of this commentary argue that one explanation for these findings is that some applicants deliberately misrepresented facts in order to be seen as more attractive candidates. After examining the professionalism implications of the study by Grimm and Maxfield, the authors offer recommendations for addressing lapses in students’ professionalism early on. They recommend that medical school admissions and teaching faculty establish clear and unshakable expectations that untruths will not be tolerated regardless of the difficult administrative challenges that may ensue. Further, medical school admissions committees should select entrance criteria that reward collaborative behaviors and honesty in addition to academic achievement. The authors encourage more longitudinal, systematic analyses of potential fabrications in residency applications, with the goal of fostering a culture of trust in medicine.
Editor’s Note: This is a commentary on Grimm LJ, Maxfield CM. Ultimate publication rate of unpublished manuscripts listed on radiology residency applications at one institution. Acad Med. 2013;88:1719–1722.
The study by Grimm and Maxfield1 in this issue raises disturbing issues around ethics and morality in our profession. When they analyzed the publication outcomes of manuscripts included by applicants in their radiology residency applications, the authors found that fully one-third of manuscripts described as “in press” or “accepted” did not appear in print within two years. One explanation for this discrepancy is that some applicants deliberately misrepresented facts in order to be seen as more attractive candidates. How could such a substantial proportion of applicants be swayed to inflate or falsify their credentials, and what can be done to address this issue?
What Do the Data Suggest?
Grimm and Maxfield have approached the problem of falsification in residency applications differently from previous studies. Rather than verifying whether published papers contained false content or attribution, they tabulated the Electronic Residency Application Service (ERAS) application domain for manuscript publications, in which manuscripts can be categorized as “accepted,” “provisionally accepted,” and “in press,” as well as the vaguer category of “submitted.”
The authors explored the publication status of manuscripts two years after the ERAS submissions to validate how many papers were actually published, whether published papers appeared in the journals that were cited on the application (and the impact factor of the publishing journal relative to the information included on the application), and the order of the applicant’s position in the author list compared with what the applicant had submitted on the ERAS application.
They found that 33.5% of applications to a prestigious university radiology training program listed papers in the unpublished domain; two years later, they found that only 53.2% of those were published. A further look at the “in press” versus “accepted” subcategory shows that only 81.3% of “in press” papers were actually published and that only 58.9% of “accepted” papers were actually published. As the authors note, it is hard to fathom how 18.8% of “in press” papers were not published. Certainly, there are occasional delays in publication, but not of the order of magnitude the authors found.
There are some possible explanations for these findings, such as the lack of definitions of the publication categories. Although this section of the ERAS application does not meticulously define the criteria, the categories do not leave too much to the imagination. There should be no confusion that the “provisionally accepted,” “accepted,” and “in press” categories are about favorably reviewed yet unpublished manuscripts. The study’s methodology could have missed some publications that faced a prolonged editing or publishing timeline.
Such possible explanations are sadly unlikely to negate the overall conclusion of widespread fabrication. One would suspect that these findings are representative of applications to most specialties and most institutions, not just the program and institution from which the data arose. Prior studies cited by the authors have examined the listing of fabricated publications among applicants to plastic surgery, psychiatry, otolaryngology, orthopedics, and radiology residency programs, finding rates ranging from 1.8% to as high as 34%. Previous studies have also found high rates of plagiarized personal statements.2 The problem does not end after Match Day; in a recent survey, 14% of interns anonymously admitted to fraudulent behavior.3
Answering the Tough Questions
While we can never definitively know the reasons for these troubling results, the current study raises questions central to professionalism: Why do applicants feel compelled to inflate their credentials? Can educators or the medical education system support a professional climate that mitigates falsification? What do these findings say about our profession, both currently and for the future?
There should be considerable solace in knowing that the majority of applicants do not falsify their credentials. Those who do may be motivated by a combination of fear that such falsification will be necessary for them to achieve their goals and confidence that the system will not catch them. Habits are formed early, and these attitudes may be present before most of these students enter medical school.4 Indeed, true character flaws manifest themselves throughout trainees’ careers.5
Gaining acceptance to medical school in the United States involves demonstration of high performance as an undergraduate, and achievement in medical school requires a prodigious amount of intellectual mastery as well as appropriate time management. In this high-pressure environment, getting the best scores and the best evaluations can, sadly, sometimes make students feel that the end justifies dishonest means.
Medical school admissions and teaching faculty must establish clear and unshakable expectations that untruths will not be tolerated regardless of the difficult administrative challenges that may ensue. As residency program directors, we have seen too many examples of medical schools that choose not to sanction students for false application statements, generally attributing the “error” to “naiveté.” Further, medical school admissions committees should delineate entrance criteria that reward collaborative behaviors and honesty in addition to academic achievement. Students respond to faculty role modeling and are highly attuned to the “hidden curriculum” in their colleagues’ behavior and in our educational institutions. Explicit professionalism curricula are expanding and offer the opportunity to discuss areas vulnerable to moral lapse and to model wholly ethical behavior. Proper citation of manuscripts is but one example. Preparing students to understand that they can be successful in a wide range of residencies, not just the “name brand” programs, may diminish their urge to inflate credentials. On the residency program end, while ERAS is known to be both valuable and reliable, the current study should pique the curiosity of interviewers and program directors when certain applications seem too good to be true.
The persistent and low-level “signal” of lapses in professionalism again demonstrated in this study requires renewed attention to recognizing and reinforcing the profession’s foundation—trust. While physiology, remedies, and communication methods will certainly change, trust is and will remain the key ingredient of the therapeutic alliance between doctor and patient, between the profession and society.
When systematically analyzed, as this study has done, application fabrications are relatively easy to find but are hard to acknowledge, prevent, and correct. Beyond the unfairness introduced into the resident selection process, the outcome of keenest interest—unprofessional behavior—is difficult to prove at later career stages in all but the most egregious situations. Longitudinal studies of potential interventions are thus extremely challenging to perform. However, if we do not address this issue in an open and meaningful way, external forces will do so, to our shame and our peril.
1. Grimm LJ, Maxfield CM. Ultimate publication rate of unpublished manuscripts listed on radiology residency applications at one institution. Acad Med. 2013;88:1719–1722.
2. Segal S, Gelfand BJ, Hurwitz S, et al. Plagiarism in residency application essays. Ann Intern Med. 2010;153:112–120.
3. Arora VM, Wayne DB, Anderson RA, Didwania A, Humphrey HJ. Participation in and perceptions of unprofessional behaviors among incoming internal medicine interns. JAMA. 2008;300:1132–1134.
4. Anderson RE, Obenshain SS. Cheating by students: Findings, reflections, and remedies. Acad Med. 1994;69:323–332.
5. Papadakis MA, Hodgson CS, Teherani A, Kohatsu ND. Unprofessional behavior in medical school is associated with subsequent disciplinary action by a state medical board. Acad Med. 2004;79:244–249.