Unverifiable and Erroneous Publications Reported by Obstetrics and Gynecology Residency Applicants

Simmons, Haley R., BS; Kim, Sara, PhD; Zins, Andrea M., MD; Chiang, Seine, MD; Amies Oelschlager, Anne-Marie E., MD

doi: 10.1097/AOG.0b013e31824605fc
Original Research

OBJECTIVE: To estimate the rate of erroneous and unverifiable publications in applications for an obstetrics and gynecology residency and to determine whether there were associated characteristics that could assist in predicting which applicants are more likely to erroneously cite their publications.

METHODS: This was a review of the Electronic Residency Application Service applications submitted to the University of Washington obstetrics and gynecology residency for the 2008 and 2009 matches. Publications reported as peer-reviewed articles and abstracts were searched for by querying PubMed, Google, and journal archives (first tier), by querying topic-specific databases (second tier), and by e-mailing journal editors (third tier). Errors were categorized as minor, major, or unverified.

RESULTS: Five hundred forty-six (58%) of 937 applicants listed a total of 2,251 publication entries. Three hundred fifty-three applicants (37.7%) listed 1,000 peer-reviewed journal articles and abstracts, of which 751 were reported as published and 249 as submitted or accepted. Seven hundred seventy (77.0%) publications were found by a first-tier search, 51 (5.1%) were found by a second-tier search, 23 (2.3%) were found by a third-tier search, and 156 (15.6%) were unverified. Of the 353 applicants listing peer-reviewed articles or abstracts, 25.5% (90 of 353) committed major errors, 12.5% (44 of 353) committed minor errors, and 24.1% (85 of 353) had articles or abstracts that were unverified.

CONCLUSION: Most applicants reported their publications accurately or with minor errors; however, a concerning number of applicants had major errors in their citations or reported articles that could not be found, despite extensive searching. Reported major and unverified publication errors are common and should cause concern for our specialty, medical schools, and our entire medical profession.

LEVEL OF EVIDENCE: III

Obstetrics and gynecology resident applicants commonly report erroneous or unverifiable peer-reviewed research abstracts and articles on their electronic residency applications.

From the University of Washington School of Medicine, Seattle, Washington; Instructional Design and Technology, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, California; and the Department of Obstetrics and Gynecology, University of Washington School of Medicine, Seattle, Washington.

See related editorial on page 493 and related article on page 504.

Corresponding author: Anne-Marie Amies Oelschlager, MD, University of Washington, Department of Obstetrics and Gynecology, Box 356460, Seattle, WA 98195-6460; e-mail aamies@u.washington.edu.

Financial Disclosure: The authors did not report any potential conflicts of interest.

In 2009, there were 46,307 medical student applicants for 22,427 postgraduate year 1 residency positions. Specifically, in obstetrics and gynecology there were 1,796 applicants for 1,185 postgraduate year 1 obstetrics and gynecology residency positions.1 One way that residency applicants distinguish themselves when competing for a training position is through research experience. The Electronic Residency Application System application requires that all applicants sign, “I certify that the information contained within My Electronic Residency Application System application is complete and accurate to the best of my knowledge. I understand that any false or missing information may disqualify me from consideration for a position; may result in an investigation by the Association of American Medical Colleges; may also result in expulsion from Electronic Residency Application System; or if employed, may constitute cause for termination from the program.”2 Despite this attestation, diligent review of residency and fellowship applications has revealed a concerning number of unverifiable or erroneous publications, ranging from 1.8% to 30.2%.3–5 Higher rates were reported in more competitive specialties, including emergency medicine,6 radiology,7,8 orthopedics,9–11 and neurosurgery.12 A meta-analysis found that the mean percentage of applicants with misrepresented articles was 4.9%.13 There remain several important questions, the answers to which are the purpose of this study: 1) What is the rate of erroneous or unverifiable publications for a university-based obstetrics and gynecology residency? 2) What types of citation errors are applicants making? and 3) Are there any demographic characteristics that could assist obstetrics and gynecology residency directors in understanding which applicant is more likely to report a publication with an unverifiable or major citation error?


MATERIALS AND METHODS

We conducted a retrospective study using the Electronic Residency Application System applications submitted to the University of Washington residency program in obstetrics and gynecology in the autumn of 2007 and 2008 for the 2008 and 2009 matches, respectively. Institutional Review Board approval was granted for this project. A waiver of consent was obtained for this study, and a confidentiality agreement was signed by all the authors.

The Electronic Residency Application System application instructs students to enter research experience under “Experience” and to enter publications under “Publications.” In the Publications section, applicants are required to classify their entries under the following categories: Peer-Reviewed Journal Articles or Abstracts, Peer-Reviewed Journal Articles or Abstracts (Other than Published), Peer-Reviewed Book Chapter, Scientific Monograph, Other Articles, Poster Presentation, Oral Presentation, Peer-Reviewed Online Publication, and Nonpeer-Reviewed Online Publication. We searched for all articles and abstracts that were categorized by the applicants as “Peer-Reviewed Journal Articles/Abstracts” and “Peer-Reviewed Journal Articles/Abstracts (other than published),” which included articles and abstracts that applicants reported as submitted, provisionally accepted, accepted, or in press.

The searches were ranked into tiers agreed on by the authors according to the extent of searching required to locate the publication. Querying PubMed, Google, Google Scholar, and the archives of a journal or meeting was considered a first-tier search. The second-tier search included querying BIOSIS, EMBASE, Web of Science, World Catalogue, and the National Library of Medicine Locator Plus. If publications were still not located after these searches, then topic-specific databases were searched, including the following: PsychInfo (psychiatry), Global Health (global health or international medical journals), SciFinder Scholar (chemistry), INSPEC (physics, electrical engineering, electronics, computing, information technology), Genome (genetics), and National Technical Information Service (government research, development, and information reports). If the publication remained unverified, then for the third-tier search the research team looked up contact information for the journal or meeting and e-mailed to confirm the submission and publication status of the article or abstract. Journals that remained unverified were searched for by a health sciences librarian.
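In outline, this protocol is a fallback chain: each citation is checked against progressively broader sources until it is located or the tiers are exhausted. The Python sketch below only illustrates that logic; the study's searches were performed manually against the databases named above, and the citation strings and stub search functions here are hypothetical.

    from typing import Callable, List, Optional

    # A "search" is anything that takes a citation string and reports whether it was located.
    SearchFn = Callable[[str], bool]


    def verify_citation(citation: str, tiers: List[List[SearchFn]]) -> Optional[int]:
        """Return the 1-based tier at which the citation was located, or None if unverified."""
        for tier_number, searches in enumerate(tiers, start=1):
            if any(search(citation) for search in searches):
                return tier_number
        return None  # unverified; in the study a librarian search would follow


    # Illustrative stubs standing in for the manual queries (PubMed, Google, Google Scholar,
    # and journal archives; BIOSIS, EMBASE, and so on; e-mail to the journal or meeting).
    def pubmed(citation: str) -> bool:
        return "pubmed" in citation.lower()


    def embase(citation: str) -> bool:
        return "embase" in citation.lower()


    def email_editor(citation: str) -> bool:
        return False


    tiers = [[pubmed], [embase], [email_editor]]
    print(verify_citation("Example citation indexed in EMBASE", tiers))  # -> 2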

If the article or abstract was found, then we determined whether the applicant had correctly listed the author list, title of the article, and journal. We also examined whether the applicant listed a PubMed identification number, and whether the publication was actually peer-reviewed. The peer-reviewed status of the publication was determined by its listing in Ulrich's International Periodicals Directory or by the journal's Web site. If we could not find evidence that the journal was peer-reviewed, then it was classified as not peer-reviewed for the study. If the journal itself was still unverified, then a librarian searched for the journal.

The types of errors within citations were stratified into categories that were agreed on by the research team. Errors were categorized as major, minor, or unverified. Major errors included citations of articles that, as confirmed by journal editors, had never been submitted for publication and articles that were incorrectly classified as peer-reviewed. Major errors also included discrepancies in the author list, either when the applicant reported their name in the citation as senior author or higher up the author list than in the actual publication or when the applicant's name was not listed at all. If the applicant was not listed as an author in the actual citation, then we checked the article's acknowledgments to determine whether the applicant was mentioned. Minor errors included deletion of all other authors; a changed title; an incorrect journal name, date, volume, or issue; a wrong PubMed identification number; or failure to list the journal name. Unverified errors were articles or abstracts that could not be located despite the third-tier search combined with a librarian's assistance.
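Stated as a decision rule, the scheme above checks, in order: was the item ever located (otherwise unverified); does it carry a major problem (never submitted, misclassified as peer-reviewed, or an author-list discrepancy); and, failing that, does it contain only citation-detail mistakes (minor). The Python sketch below restates those rules; it is not code used in the study, and the field names are assumptions.

    from dataclasses import dataclass, field
    from typing import List, Optional


    @dataclass
    class CitationComparison:
        """Hypothetical comparison of a reported citation against the located record."""
        located: bool = False                      # found at any tier or with librarian assistance
        never_submitted: bool = False              # confirmed by the journal editor
        misclassified_peer_reviewed: bool = False  # journal not peer-reviewed per Ulrich's or its web site
        author_position_inflated: bool = False     # applicant listed as senior author or higher than in print
        applicant_not_an_author: bool = False      # applicant absent from the actual author list
        minor_discrepancies: List[str] = field(default_factory=list)  # e.g., title, volume, PMID


    def classify(c: CitationComparison) -> Optional[str]:
        """Return 'unverified', 'major', 'minor', or None (no error)."""
        if not c.located and not c.never_submitted:
            return "unverified"
        if (c.never_submitted or c.misclassified_peer_reviewed
                or c.author_position_inflated or c.applicant_not_an_author):
            return "major"
        if c.minor_discrepancies:
            return "minor"
        return None


    print(classify(CitationComparison(located=True, minor_discrepancies=["wrong volume"])))  # minor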

Demographic information collected included the applicant's sex, age (born before 1980 or in 1980 or later), U.S. citizenship, and enrollment in or graduation from a U.S. or international medical school. We calculated the percentage of applicants committing each type of error, or no error, by demographic characteristic. We performed χ2 tests and Fisher exact tests to examine statistical significance using SPSS 19.
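For reference, the same tests can be computed outside SPSS. The following is a minimal sketch in Python using SciPy on a hypothetical 2×2 contingency table (for example, U.S. citizenship by presence of a major or unverified error); the counts are illustrative only and are not data from this study.

    import numpy as np
    from scipy.stats import chi2_contingency, fisher_exact

    # Hypothetical 2x2 table: rows = U.S. citizen (yes, no),
    # columns = major or unverified error (yes, no). Counts are made up for illustration.
    table = np.array([[40, 190],
                      [35, 88]])

    chi2, p_chi2, dof, _expected = chi2_contingency(table)
    odds_ratio, p_fisher = fisher_exact(table)  # exact test, useful when cell counts are small

    print(f"chi-square = {chi2:.2f}, dof = {dof}, P = {p_chi2:.3f}")
    print(f"Fisher exact: odds ratio = {odds_ratio:.2f}, P = {p_fisher:.3f}")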


RESULTS

Nine hundred thirty-seven applicants applied to the University of Washington obstetrics and gynecology residency program for the 2008 and 2009 matches. Of these, 391 applicants did not report any publications of any type. A total of 546 of the 937 applicants (58%) listed a total of 2,251 publication entries of all types (mean 4.12 entries). Of these 546 applicants, 353 listed 1,000 peer-reviewed journal articles or abstracts, of which 751 were reported as published and 249 were reported as unpublished (submitted, provisionally accepted, accepted, or in press). Of the peer-reviewed entries, 77% (770/1,000) of citations were found easily by a first-tier search, 5.1% (51/1,000) were found by a second-tier search, and only 2.3% (23/1,000) were identified by the third-tier search (Fig. 1). A total of 156 of the 1,000 (15.6%) abstracts and articles listed as peer-reviewed were unverified. All of the publications with PubMed identification numbers were found. Comparing the citations listed as published with those listed as unpublished (accepted, provisionally accepted, in press, or submitted), a greater proportion of errors was found in the unpublished citations (Table 1).

Fig. 1

Table 1

Of the 353 applicants reporting published and unpublished peer-reviewed journal articles and abstracts, 90 (25.5%) had major errors, 44 (12.5%) had minor errors, and 202 (57.2%) did not have any errors. Eighty-five of the 353 applicants (24.1%) had articles or abstracts that were unverified. Sixty-two (17.6%) applicants incorrectly classified articles or abstracts as peer-reviewed, and 21 (6.0%) reported themselves higher up on the author list. Seven applicants (2.0%) reported citations in which their names were inserted into the author list, and two applicants reported unpublished articles or abstracts as submitted that the journal editor confirmed had never been submitted.

Excluding the 70 applicants who reported only unpublished articles and abstracts, there were 283 applicants who reported published peer-reviewed articles or abstracts. Of these 283, 205 (72.4%) committed no error at all, and 25 (8.8%) listed unverified articles. Compared with the overall applicant pool, applicants reporting peer-reviewed publications were more likely to have been born before 1980 (46% compared with 38%) and to be male (25% compared with 19%) and were less likely to be U.S. citizens (65% compared with 81%) or to have attended a medical school within the United States (63% compared with 76%). Compared with students who made no errors, a higher proportion of students who committed major or unverified errors were born before 1980, were not U.S. citizens, or had matriculated at or graduated from an international medical school. Table 2 shows demographic trends within each of the error types, ie, no error, major error, minor error, and unverifiable.

Table 2


DISCUSSION

This study reveals that a concerning number of residency applicants have publications listed on their Electronic Residency Application System application that are unverified, despite extensive searching. Furthermore, there are many erroneous citations, particularly for works listed in the application as “unpublished.” The most common major error types were misclassifying an article or abstract as peer-reviewed and discrepancies in the author list. We speculate that the high error rate is mainly attributable to a lack of education and research maturity, because applicants may not be familiar with how to cite a research article properly and therefore may not have entered the title, author list, date, or journal correctly. Perhaps the applicant did not understand the tenets of authorship, the implications of the order of the author list, or what it means to be peer-reviewed. A misunderstanding could have occurred if a student performed work on a research project and felt deserving of credit for the work performed but was not listed as an author when the article was published. The rate we report is as high as or higher than those reported in other specialties, which may be related to study limitations, and it raises the question of why this occurs and what should be done about it.

The limitations of our study are as follows. First, this was a study of a single university-based residency program. It is unclear whether research citations listed by applicants to peer residency programs would include a comparable proportion of erroneous or unverified citations. Second, our study covered a 2-year period. There were 1,828 applicants for postgraduate year 1 positions in obstetrics and gynecology in the 2008 U.S. National Residency Matching Program match and 1,796 applicants in the 2009 match, for a total of 3,624 applicants over both years.1 We reviewed 937 applications to the University of Washington residency; therefore, our study represents approximately 26% of the National Residency Matching Program applicant pool over the 2-year period. Third, this study was retrospective, using past applicants' information after the match was complete. If the study had been prospective, we would have had to obtain consent from all of the applicants, which could have confounded our data; applicants may have been more careful in the way they represented their research if they had known it would be so closely scrutinized. Fourth, it was more difficult to search for articles that were not originally published in English. It was also more difficult to find contact information for international medical journals than for those published in the United States; therefore, some of the unverified publications from international medical journals may have been legitimate. Fifth, the time from submission to publication may be considerable, which may have made a recently submitted, unpublished article harder to find. Most of these applications were completed by November 1, 2008, allowing at least 20 months without evidence of publication, which is less than the typical time lapse from submission to publication. Finally, although our concern is that students with major or unverifiable errors may have knowingly padded their résumés, the applicants' intent cannot be determined from this study.

This research has led us to develop several practical recommendations for residency directors, who may assume that applicants who report peer-reviewed publications have a deeper level of research experience than those who report no publications or nonpeer-reviewed publications. The first-tier search, especially with a PubMed identification number, is not significantly time-consuming and may quickly confirm publications reported as published. Beyond the first-tier search, however, we recommend that the burden of proof be placed on the applicant. We recommend that residency programs consider asking applicants to send a copy of their reported publications or the journal acceptance letters; this would decrease the number of major and unverifiable errors. It may be especially beneficial for international medical graduates whose research is published in journals that are more difficult to access from the United States. If a question arises regarding a paper reported by an applicant who is invited to interview, then we recommend that the residency ask the applicant to bring a copy of the paper or abstract and discuss any discrepancies during the interview. If there is concern that a student has behaved unethically, then the residency director may alert the Electronic Residency Application System Integrity Education and Investigation Program. The Education Program is designed to educate Electronic Residency Application System users about application errors and omissions, such as submission of fraudulent publication citations, which the program targets for investigation. If the investigation uncovers a violation of Electronic Residency Application System policy, then the findings will be sent to the applicant, the designated dean's office, and the residency programs.
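As an illustration of how little effort a first-tier check with a PubMed identification number takes, the sketch below queries the public NCBI E-utilities esummary endpoint for a given PMID and prints the indexed title and journal. It is a sketch only, assuming the JSON interface of E-utilities is acceptable for this purpose; it is not part of the study's methods, and the PMID shown is a placeholder.

    import json
    import urllib.parse
    import urllib.request

    EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi"


    def lookup_pmid(pmid: str) -> dict:
        """Fetch the PubMed summary record for a PMID via NCBI E-utilities (JSON)."""
        params = urllib.parse.urlencode({"db": "pubmed", "id": pmid, "retmode": "json"})
        with urllib.request.urlopen(f"{EUTILS}?{params}", timeout=10) as response:
            data = json.load(response)
        return data.get("result", {}).get(pmid, {})


    record = lookup_pmid("12345678")  # placeholder; substitute the PMID reported by the applicant
    print(record.get("title"), "|", record.get("fulljournalname"))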

Given the high rate of erroneous and unverified citations in our study and in other specialties, we recommend that medical schools consider including formal education for medical students about authorship requirements, peer review, and ethical and honest reporting of research. In addition, we recommend that the Electronic Residency Application System application include a reference to the rules of authorship and a definition of peer review. This may be especially important for applicants whose primary language is not English, to minimize any confusion.

The authors believe that it is the ethical responsibility of residency applicants to represent themselves accurately in all components of the residency application. Residency directors are reluctant to hire those who may be dishonest. If an error is detected in the Electronic Residency Application System application, then the residency director may assume that the applicant does not pay attention to detail or is deliberately misrepresenting their research. Applicants need to be aware that an investigation into an erroneous citation could result in disqualification from consideration for a residency position, expulsion from the Electronic Residency Application System, or termination from residency employment. For this reason, applicants should be exceptionally attentive to detail to prevent any appearance of malfeasance.

Although one study has linked unprofessional behavior in medical school to future disciplinary action by medical boards,14 it is unclear whether students who misrepresent publications in their residency applications are more likely to face disciplinary action in their future careers than students who do not misrepresent their work. A follow-up study is warranted to determine whether applicants who reported major or unverified publications in their Electronic Residency Application System applications have a higher incidence of disciplinary action during residency and throughout their careers than those who did not misrepresent their work. Major and unverified publication errors reported on obstetrics and gynecology residency applications are common and should cause concern for our specialty, medical schools, and our entire medical profession.


REFERENCES

1. National Residency Matching Program. NRMP historical reports. Available at: http://www.nrmp.org/data/historicalreports.html#mainmatch/. Retrieved August 30, 2011.
2. Association of American Medical Colleges. Electronic residency application service. Resources for residency applicants. Available at: https://www.aamc.org/download/139512/data/worksheet.pdf. Retrieved December 4, 2011.
3. Hebert RS, Smith CG, Wright SM. Minimal prevalence of authorship misrepresentation among internal medicine residency applicants: Do previous estimates of “misrepresentation” represent insufficient case finding? Ann Intern Med 2003;138:390–2.
4. Kuo PC, Schroeder RA, Shah A, Shah J, Jacobs DO, Pietrobon R. “Ghost” publications among applicants to a general surgery residency program. J Am Coll Surg 2008;207:485–9.
5. Sekas G, Hutson WR. Misrepresentation of academic accomplishments by applicants for gastroenterology fellowships. Ann Intern Med 1995;123:38–41.
6. Gurudevan SV, Mower WR. Misrepresentation of research publications among emergency medicine residency applicants. Ann Emerg Med 1996;27:327–30.
7. Panicek DM, Schwartz LH, Dershaw DD, Ercolani MC, Castellino RA. Misrepresentation of publications by applicants for radiology fellowships: Is it a problem? Am J Roentgen 1998;170:577–81.
8. Baker DR, Jackson VP. Misrepresentation of publications by radiology residency applicants. Acad Radiol 2000;7:727–9.
9. Dale JA, Schmitt CM, Crosby LA. Misrepresentation of research criteria by orthopaedic residency applicants. J Bone Joint Surg Am 1999;81:1679–81.
10. Konstantakos EK, Laughlin RT, Markert RJ, Crosby LA. Follow-up on misrepresentation of research activity by orthopaedic residency applicants: has anything changed? J Bone Joint Surg Am 2007;89:2084–8.
11. Patel MV, Pradhan BB, Meals RA. Misrepresentation of research publications among orthopedic surgery fellowship applicants: a comparison with documented misrepresentations in other fields. Spine 2003;28:632–6.
12. Cohen-Gadol AA, Koch CA, Raffel C, Spinner RJ. Confirmation of research publications reported by neurological surgery residency applicants. Surg Neurol 2003;60:280–4.
13. Wiggins MN. A meta-analysis of studies of publication misrepresentation by applicants to residency and fellowship programs. Acad Med 2010;85:1470–4.
14. Papadakis MA, Teherani A, Banach MA, Knettler TR, Rattner SL, Stern DT, et al. Disciplinary action by medical boards and prior behavior in medical school. N Engl J Med 2005;353:2673–82.
© 2012 by The American College of Obstetricians and Gynecologists. Published by Wolters Kluwer Health, Inc. All rights reserved.