
Innovation Reports

What Do I Do When Something Goes Wrong? Teaching Medical Students to Identify, Understand, and Engage in Reporting Medical Errors

Ryder, Hilary F. MD, MS; Huntington, Jonathan T. MD, PhD; West, Alan PhD; Ogrinc, Greg MD, MS

Academic Medicine. 94(12):1910–1915, December 2019. DOI: 10.1097/ACM.0000000000002872



Medical error has been one of the leading causes of death in the United States for more than a decade, and some research indicates that it is, in fact, the third leading cause of death in the United States.1 Medical students are often exposed to medical errors during their clinical experiences.2 Students have intense emotional responses to the idea of being involved in medical errors3 and suffer moral distress due to a lack of education or training on how to respond to medical errors that compromise patients’ safety.4

The Association of American Medical Colleges strongly endorses patient safety education in undergraduate medical education (UME),5 and identifying and processing medical errors should be an important part of UME training, yet the literature offers few feasible examples of how this teaching could occur. Some articles detail highly resource-intensive interventions requiring significant faculty time,2,6,7 while other interventions rely on passive learning techniques such as lectures or videos.8,9 Literature indicates that medical trainees learn more readily from medical errors they have personally experienced,3 and these personal experiences may provide fertile ground for exploring their emotional responses to medical error; however, most interventions rely on standardized cases or group work rather than on personally experienced events.6

To provide students with direct experience identifying, analyzing, and reporting medical errors, we developed the interactive patient safety reporting curriculum (PSRC) and embedded it in the required third-year internal medicine clerkship. The curriculum required students to engage both intellectually and emotionally with events they personally experienced in which the safety of one of their patients was compromised. In this report, we have described the development, implementation, and evaluation of the PSRC.



The Geisel School of Medicine at Dartmouth, which admits 92 students per year, follows a traditional medical curriculum: preclinical education for 24 months, followed by 12 months each of clinical clerkships and advanced elective activities. The third-year internal medicine clerkship is an 8-week clinical experience in which students join inpatient care teams composed of 1 intern, 1 resident, and an attending physician. Clinical work occurs at multiple sites; students participate in in-person educational sessions at Geisel at the beginning and end of the internal medicine clerkship.

PSRC design and implementation

Patient safety education occurs as discrete activities throughout the curriculum. In the first year of medical school, students receive a 90-minute preclinical patient safety curriculum. During the session, students learn the basic tenets of patient safety, and they collaborate to analyze a case study, to identify system-level errors and vulnerabilities, and to create an action plan (Chart 1). In groups, they complete a report, modeled after a standard report from the literature.10

Chart 1:
Characteristics of Patient Safety Curricular Interventions and of Cohorts Experiencing These Interventions at the Geisel School of Medicine at Dartmouth

The PSRC was initially implemented in academic year 2015–2016 within the internal medicine clerkship. The core of the PSRC is a structured, focused written report, through which students analyze a medical error they witnessed or otherwise personally experienced as part of a patient’s care team. The report is bookended by a set of 2 interactive, case-based, small-group sessions led by faculty with expertise in patient safety, quality improvement, and medical errors (J.T.H., G.O.). We designed this curriculum, using the reporting and analysis of student-experienced medical error, to accomplish the following objectives: (1) improve student understanding of how and why medical errors occur, (2) increase student comfort with reporting medical errors, and (3) explicitly explore the emotional effects of personally experiencing errors as a member of the health care team.

During the clerkship orientation, which entails a 30-minute didactic presentation on the first day of the internal medicine rotation, faculty review both a method of classifying adverse events developed by Robert Wachter and James Reason’s Swiss cheese model, topics previously covered in the 90-minute, first-year preclinical patient safety curriculum. Students are instructed to look for medical errors or near misses during their clinical experiences and are asked to complete a patient safety report, which uses the same structure as the report they completed in their preclinical curriculum (Supplemental Digital Appendix 1, available at https://links.lww.com/ACADMED/A709). They discuss potential errors with their clerkship or site director, as well as with the residents and attendings on their teams. In the first year of implementation, the report was optional, but starting in academic year 2016–2017, it became a required part of the clerkship (exceptions are made if the clerkship director feels the student needs additional practice with patient write-ups or if the student indicates they have not witnessed a medical error). The report has 5 sections: a case description, a rating of the severity of the outcome or sequelae following the error, an analysis of contributing system factors, suggestions for improvement, and a personal reflection. The reports are read and graded by a clerkship or site director within a week of submission and constitute 30% of the grade earned for written work (which, in turn, constitutes 15% of the student’s total grade for the rotation). The report is assessed based on the clarity of the description; the thoughtfulness of the analysis of the severity of outcome and the contributory factors; the ingenuity of the ideas for preventing error in the future; and the depth of reflection, specifically reflection on the effect of the error on the patient/family, the medical team, and the student.
Exceptional reports provide additional context to the error, either by (1) demonstrating investigative work performed by the student to determine how the error occurred (e.g., interviews of key personnel, investigations of the system in which the error occurred) or (2) contextualizing the error by engaging with the patient safety literature and providing relevant citations. Reports concerning errors deemed by the clerkship or site director to result in high degrees of harm or deemed likely to recur are provided to the site’s quality assurance group or committee for further investigation via existing medical center processes. Reports may also be used as the starting point for resident-led quality improvement projects.

After completing the report, students gather (with the same faculty facilitators who led the orientation) to debrief about their experiences and discuss lessons learned. The clerkship and site directors do not attend the orientation or debrief, and any personal experiences discussed at the debrief session are not shared with any faculty member who has an evaluative capacity. To ensure accurate reporting and to protect student confidentiality, reports are shared only with the relevant institution’s safety office in a deidentified fashion with the student’s permission after grades are submitted.

PSRC evaluation

Students completing the clerkship in academic years 2015–2016 and 2016–2017 were invited to complete a 21-item self-assessment of their knowledge of why errors occur and their comfort with reporting medical error before and after the clerkship. The assessment, derived from Madigosky and colleagues’ survey,2 uses Likert-type scales to appraise students’ attitudes toward patient safety and their comfort in identifying and reporting medical error; it required 2 minutes to complete. In May 2015, members of the graduating class of 2015, a class exposed to neither the preclinical patient safety curriculum nor the PSRC, served as historical controls; they were recruited by email and voluntarily completed the assessment online immediately before their graduation. Participation in the assessment was optional, anonymous, and not part of a student’s grade. The Dartmouth College Committee for the Protection of Human Subjects granted exemption for this project.

We assessed students’ ability to accurately classify the error they described in their reports and their ability to analyze the severity and type of medical error that had occurred by having patient safety experts from the VA National Center for Patient Safety Field Office at the White River Junction VA Medical Center analyze and code the final reports. The independent experts reviewed 120 reports submitted by students between June 2015 and January 2018 and independently determined the severity of the incident using the same interval-based severity of outcome index adapted from Vincent10 that our students used (Supplemental Digital Appendix 1, available at https://links.lww.com/ACADMED/A709). A random sample of 5 reports was independently scored by 2 patient safety experts with complete agreement, so each of these experts individually reviewed and scored the remainder of the reports.

We tracked the number of student reports leading either to further formal investigation (e.g., a root cause analysis) or to system-level changes. While we wished to obtain data on students’ reporting of medical errors through our medical centers’ adverse events tracking systems, we were unable to do so because of the confidential nature of the reporting process.

Data analysis

We compared preexposure and postexposure scores using the chi-square test. We included Cramer’s V to show the strength of the association between condition and response rates. Because of the low number of respondents in the historical control, we compared the historical control and preexposure groups using the Cochran–Mantel–Haenszel test for general association. We assessed the agreement between expert and student assessment of severity of medical error using weighted kappa and considered the severity of outcome scale as an interval scale to compare differences in assessment of severity. We used Stata 13.0 (2013, StataCorp, College Station, Texas) for all analyses.
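The pre/post comparison described above can be sketched in a few lines. The counts below are invented for illustration only (they are not the study’s data); the sketch uses scipy’s `chi2_contingency` and derives Cramér’s V, the effect-size measure named in the text, from the chi-square statistic:

```python
# Hypothetical illustration of the pre- vs postexposure comparison:
# a chi-square test on agree/disagree counts, with Cramer's V as the
# measure of association strength. Counts are invented for demonstration.
import math
from scipy.stats import chi2_contingency

# rows: pre-PSRC, post-PSRC; columns: agree, disagree (hypothetical counts)
table = [[90, 36], [110, 16]]

chi2, p, dof, _expected = chi2_contingency(table)
n = sum(sum(row) for row in table)
k = min(len(table), len(table[0]))           # smaller table dimension
cramers_v = math.sqrt(chi2 / (n * (k - 1)))  # Cramer's V effect size

print(f"chi2 = {chi2:.2f}, p = {p:.4f}, Cramer's V = {cramers_v:.2f}")
```

The Cochran–Mantel–Haenszel test used for the small historical-control comparison is available in Python only through statsmodels’ stratified-table utilities; the two-group chi-square above is the core of the main analysis.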


Of the 171 students enrolled in the clerkship, 131 (76.6%) completed the patient safety assessment; 126 students completed the assessment both before and after the clerkship, and 86 of the 171 students submitted a patient safety report. Of the students who did not submit a patient safety report, 43 were in the group for which a report was optional, 10 were required to focus on clinical documentation to achieve competency, and 32 indicated that they had not seen a patient safety event. Patient safety experts analyzed another 34 reports submitted after the PSRC evaluation was completed (for a total of 120). Sixteen students completed the assessment as historical controls.

Effect of preclinical patient safety curriculum

We compared the pre-PSRC assessment data for the 2 cohorts of students who experienced the curriculum and found no differences, so we pooled the data from the 2 cohorts in the analysis. We then compared the pooled pre-PSRC assessment results with those of the historical controls to determine the effect of the preclinical patient safety curriculum on students entering the clerkship. Students who experienced the preclinical patient safety curriculum reported more positive attitudes toward patient safety and practicing medicine in a complex environment (Table 1). For example, the preclinical curriculum successfully instilled the idea that making errors in medicine is inevitable, as evidenced by the increase in agreement from historical controls to students completing the pre-PSRC assessment (26.7% vs 81.7%; P < .001).

Table 1:
Number of Students in Historical Control, in Pre-PSRC, and in Post-PSRC Groups Agreeing to Items on a Patient Safety Assessment, 2014–2018a

Effect of PSRC

After the PSRC, students self-reported improved attitudes toward medical errors and increased comfort with analyzing and disclosing them. Baseline attitudes remained high and significantly increased relative to historical controls (Table 1). We found that students receiving the PSRC in the second half of their third year reported higher levels of skill acquisition than students receiving training in the first half of their third year.

We are aware of 4 system changes that stemmed from student reports, including the enforcement of isolation for patients with influenza and the creation of a standardized order set of precautions for patients at risk of self-harm. Two additional reports led one of the medical center sites to perform a formal root cause analysis.

Accuracy of student analysis

The patient safety experts’ ratings of severity of outcome aligned closely with the students’ ratings (weighted kappa = 0.76; 95% confidence interval, 0.67–0.86; see Chart 2). Agreement increased with clinical experience: it was highest for students who performed their analysis at the end of the academic year. We reviewed and described any disagreements in severity of outcome. We applied qualitative analysis techniques to the content of the entire report but focused special attention on the discussion of contributory factors and student reflections. We identified 3 common sources of disagreement between experts’ and students’ analyses. First, moral distress on the part of the student or perceived patient or family distress led to increased student assessment of harm, whereas this outcome did not lead to an increased assessment of harm by experts. Second, students and experts assessed an increased length of stay differently. Our experts uprated the harm if the error led to an increased length of stay, whereas students tended not to rate this as a significant harm. Finally, we noted 6 instances in which an expert disagreed that an error had occurred. For example, in one instance, a nurse refused an order because she believed that correct consent had not been obtained, delaying care for an hour. The student rated this as an error leading to a low level of harm, whereas the expert believed that it represented a best safety practice.

Chart 2:
Comparison of Third-Year Medical Students’ Rating of Harm Severity and a Patient Safety Expert’s Rating of Harm Severity Resulting From a Patient Safety Incidenta
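The student-versus-expert agreement analysis can be sketched with linearly weighted kappa, which (like the quadratic variant) penalizes larger severity disagreements more heavily, as is appropriate for the interval-based severity scale. The paired ratings below are invented for illustration, not the study’s data:

```python
# Minimal sketch of the severity-agreement analysis: weighted kappa on
# hypothetical paired student/expert severity ratings (coded 0-5 on an
# interval scale). Ratings are invented for demonstration only.
from sklearn.metrics import cohen_kappa_score

# hypothetical paired severity ratings for 12 reports
student = [0, 1, 1, 2, 3, 3, 4, 2, 1, 0, 5, 3]
expert  = [0, 1, 2, 2, 3, 4, 4, 2, 1, 1, 5, 3]

# weights="linear" penalizes a 2-step disagreement twice as much as a 1-step one
kappa = cohen_kappa_score(student, expert, weights="linear")
print(f"linearly weighted kappa = {kappa:.2f}")
```

Values above roughly 0.6 are conventionally read as substantial agreement, which is the range the study’s kappa of 0.76 falls into.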

Next Steps

We have presented an innovative approach to providing third-year clinical clerks with the skills to identify, understand, and report medical errors they have personally experienced. Students participating in the PSRC were able to analyze the severity and type of medical error and make recommendations leading to ongoing quality improvement work. Our students analyzed cases they witnessed, which led to increased engagement in patient safety initiatives and learning. Several students were successful in improving patient safety at the medical centers in which they practiced. Our approach can be easily adopted by other educators and does not require a home within a clerkship. We now use the approach we have presented here to engage fourth-year medical students in learning about and discussing ethical dilemmas they have personally experienced in their clinical rotations.

Our curriculum is interactive and allows for the application of knowledge to a clinical setting in a way that can directly affect care delivery. It requires only a modest investment of resources and provides students with a safe and educational way to report errors, learn basic patient safety tenets, and process the emotional impact of witnessing adverse events. Having students submit their reports into the adverse event tracking systems at our clinical institutions would add another level of engagement for them. Two students indicated that they entered their report into a tracking system, and 20 indicated that they felt comfortable doing so; formal assessment of student use of tracking systems was hampered by our inability to access student reports. We encourage our students to use existing reporting systems and to report their use back to us. We plan to work with risk management to determine whether some errors should be disclosed to patients and their families.

In reviewing our students’ reports and listening to their comments during debriefing sessions, we have determined that clerkship students develop a realistic understanding of the systems in which they work and of when and where patients are placed at risk. We learned that medical students may respond in varied and unpredictable ways to witnessing patient safety events, especially those in which they deem harm to have occurred. Based on our experience, we recommend that schools provide guidance for reporting patient safety risks or violations, as well as the emotional support and opportunity for reflection that students (and others) require after an error is experienced.


The authors dedicate this report to their friend, colleague, and coauthor Alan West, PhD, who passed away after the report was completed. They will miss his talents, humor, and deep pursuit of knowledge and truth.


The authors wish to thank Bradley Vince Watts, MD, and Julia Neily, RN, MS, MPH, patient safety experts, for their analysis of the severity of medical errors described in the students’ reports. The authors wish to thank Spencer James, MD, for his assistance with the statistical analyses presented in this report and Susan D. Furste, RDN, for her grammatical expertise. Finally, the authors wish to thank Jeffrey Bell, MD, for his early work on the patient safety reporting curriculum.


1. Makary MA, Daniel M. Medical error—The third leading cause of death in the US. BMJ. 2016;353:i2139.
2. Madigosky WS, Headrick LA, Nelson K, Cox KR, Anderson T. Changing and sustaining medical students’ knowledge, skills, and attitudes about patient safety and medical fallibility. Acad Med. 2006;81:94–101.
3. Fischer MA, Mazor KM, Baril J, Alper E, DeMarco D, Pugnaire M. Learning from mistakes. Factors that influence how students and residents learn from medical errors. J Gen Intern Med. 2006;21:419–423.
4. Martinez W, Lo B. Medical students’ experiences with medical errors: An analysis of medical student essays. Med Educ. 2008;42:733–741.
5. Association of American Medical Colleges. Medical School Objectives Project (MSOP). Contemporary Issues in Medicine Reports. https://www.aamc.org/what-we-do/mission-areas/medical-education/msop. Accessed October 10, 2019.
6. Hall LW, Scott SD, Cox KR, et al. Effectiveness of patient safety training in equipping medical students to recognise safety hazards and propose robust interventions. Qual Saf Health Care. 2010;19:3–8.
7. Halbach JL, Sullivan LL. Teaching medical students about medical errors and patient safety: Evaluation of a required curriculum. Acad Med. 2005;80:600–606.
8. Dudas RA, Bundy DG, Miller MR, Barone M. Can teaching medical students to investigate medication errors change their attitudes towards patient safety? BMJ Qual Saf. 2011;20:319–325.
9. Moskowitz E, Veloski JJ, Fields SK, Nash DB. Development and evaluation of a 1-day interclerkship program for medical students on medical errors and patient safety. Am J Med Qual. 2007;22:13–17.
10. Vincent C. Understanding and responding to adverse events. N Engl J Med. 2003;348:1051–1056.

Supplemental Digital Content