Academic Medicine: January 2014 - Volume 89 - Issue 1
doi: 10.1097/ACM.0000000000000046
Research Reports

Teaching Medical Error Disclosure to Residents Using Patient-Centered Simulation Training

Sukalich, Sara MD; Elliott, John O. PhD, MPH; Ruffner, Gina EMT-P

Author Information

Dr. Sukalich is director, Department of Medical Education, OhioHealth Riverside Methodist Hospital, Columbus, Ohio.

Dr. Elliott is research specialist, Department of Medical Education, OhioHealth Riverside Methodist Hospital, Columbus, Ohio.

Ms. Ruffner is simulation center manager, Center for Medical Education and Innovation (CME+I), OhioHealth Riverside Methodist Hospital, Columbus, Ohio.

Funding/Support: A Picker Institute/Gold Foundation Graduate Medical Education Challenge Grant funded this research.

Other disclosures: None reported.

Ethical approval: The OhioHealth institutional review board approved this project as exempt.

Previous presentations: This research was presented in a different form at the 12th International Meeting on Simulation in Healthcare (IMSH) in San Diego, California, February 2012.

Supplemental digital content for this article is available at http://links.lww.com/ACADMED/A173.

Correspondence should be addressed to Dr. Sukalich, Department of Medical Education, 3535 Olentangy River Rd., Columbus, OH 43214-3998; telephone: (614) 566-2426; e-mail: SSUKALI2@OhioHealth.com.

Abstract

Purpose: To determine whether a standardized patient encounter and self-guided tutorial would improve first-year residents’ self-efficacy for disclosing medical errors.

Method: In 2011, 55 first-year residents participated in a simulation in which they disclosed an error to a standardized patient playing the part of a family member. Residents completed the simulation twice, four weeks apart. Before each session, they completed a knowledge test and a self-efficacy assessment (based on the Accreditation Council for Graduate Medical Education [ACGME] core competencies); after each session, they repeated the self-efficacy assessment. Residents reviewed the videos of their encounters either alone (self-debrief) or with a faculty observer (faculty debrief). Between sessions, they completed a self-paced learning tutorial. Two external faculty also rated the residents’ performances using videos of the encounters.

Results: Residents’ self-efficacy significantly increased from a Session 1 pretest mean (standard deviation) score of 119.6 (26.6) to a Session 2 posttest score of 150.3 (24.9) for all ACGME competencies (P < .001, Cohen's d = 1.19). The external reviewers’ ratings provided additional, objective support for residents’ improvement on questions assessing ACGME competencies (P = .001). Comparisons of the self-efficacy of residents in the self-debrief versus faculty debrief groups yielded no significant differences on any ACGME competencies.

Conclusions: Timely, explicit, and empathetic disclosure of medical errors to patients and family is essential to maintaining trust and is an important part of patient-centered medical care. This intervention easily could be replicated in other settings and is applicable to many members of the health care team, not just to residents.

Disclosure, the process of bringing to light an unintended outcome, is essential for maintaining trust between physicians, patients, and their family members. Surveys have revealed that a large majority of patients prefer to be informed immediately of errors, even minor ones.1,2 Patients also expect to be given more information about an unintended injury during treatment than doctors believe should be given.3

Furthermore, medical errors are common; previous research found that 62% of trainees and 88% of faculty physicians reported making medical mistakes.4 One study found that over 90% of physicians and trainees (medical students and residents) reported they would or should disclose a hypothetical error (major, minor, and those causing no harm); however, only 41% reported actually doing so in real instances.5 Not only is the act of disclosing a medical error the “right thing to do,” but some hospital accreditation requirements and state laws also mandate it.6

In an effort to increase health care professionals’ willingness to embrace disclosure, the National Quality Forum put forward an evidence-based safe practice guideline on the disclosure of serious unanticipated outcomes.7 This guideline recommends providing an explanation to the patient and his or her family about what happened, potential implications or consequences of the error, a commitment to investigate what went wrong, feedback regarding the findings of the investigation, and an apology or expression of regret from the physician. However, throughout training, physicians are taught that the goal in providing care is to improve health, not cause harm, which inherently affects their error disclosure behaviors.

Other barriers to fully disclosing an error include fear of potential malpractice litigation in admitting a mistake, the culture of medicine, and the psychological impact of facing mistakes and apologizing for them.8–11 The art of disclosure and the use of effective, open communication are skills not readily taught in most undergraduate medical education environments. Thus, a gap exists in residents’ competency when they enter their graduate medical training, a gap that often continues as they move into medical practice.

The majority of prior research on the disclosure of medical errors has focused on health professionals primarily through the use of surveys of the attitudes and beliefs of health sciences students,12 medical students,13 and residents/physicians.5,14–18 One investigation used written scenarios to assess how residents would disclose medical errors.19

In terms of the methods for teaching error disclosure, previous studies have described the use of didactic educational modules,12 seminars,20 DVDs of patients followed by discussion,4 and role-playing between faculty and students.21 Although the use of standardized patient encounters is a well-established method for assessing clinical skills,22 very few studies have examined a standardized patient method for teaching error disclosure to physicians or residents.12,23,24 Therefore, we argue for creating a robust training and competency assessment program, one that includes a standardized patient encounter and is based on the National Quality Forum’s disclosure guidelines,7 to teach residents how to disclose medical errors and to improve their communication skills.

The main goal of this study was to determine whether a standardized patient encounter scenario, followed by a self-guided tutorial, would improve residents’ self-efficacy for disclosing medical errors. We also were interested in exploring whether a self-debriefing procedure would be as effective as a faculty-led debriefing. To further establish the effectiveness of this training method, we sought to determine whether external faculty members could objectively detect improvements in residents’ error disclosure skills via the review of randomly assigned, video-recorded, standardized patient encounters.

Method

In March 2011, we invited all 55 postgraduate year (PGY) 1 residents at the OhioHealth Riverside Methodist Hospital to participate in our study via announcements at program-specific meetings as well as in a letter from the principal investigator (S.S.) and the hospital’s physician vice president of medical education. The residents received no incentive to participate, and we informed them, as required by the OhioHealth institutional review board, that data collected as part of our project would be considered educational research. We deidentified all data that we collected through the use of a study-specific identification number that we could use to link all assessments. The OhioHealth institutional review board approved our project as exempt.

Study design

First, we developed the scenario to use in our study. Next, we trained the standardized patients and the internal faculty who would be debriefing residents. The residents then completed the simulation scenario twice, approximately four weeks apart. Between sessions, they completed an online, self-paced learning tutorial (15 minutes in length) on the art of disclosure. Before each session, the residents took a medical knowledge test and a self-efficacy assessment; after each session, they repeated the self-efficacy assessment.

The educational material that we compiled for the self-paced learning tutorial included best practice guidelines and references from the National Quality Forum guidelines on disclosure.7 As part of the tutorial, the residents also reviewed the OhioHealth Riverside Methodist Hospital’s standard policy and procedures for the disclosure of unanticipated events. We required all residents to provide a certificate of completion for the tutorial before they started the second session. The first session took place in March 2011 and the second in April 2011. We videotaped all sessions for analysis.

After all residents completed the sessions, two trained external faculty members reviewed and evaluated the residents’ error disclosure skills, using video recordings of the encounters.

Scenario

The research team developed the scenario for this project. It was meant to offer a realistic situation that a PGY 1 resident might be required to address (see Appendix 1). We did not pilot the scenario prior to the study but did ask other faculty involved in simulation education at our institution to review it for face validity.

Residents read the scenario and then immediately entered the simulation with the standardized patient at our in-house simulation center (Center for Medical Education and Innovation [CME+I]). Each encounter lasted roughly 10 minutes.

Training

To ensure that the two standardized patients (each playing the part of a family member) were adequately prepared for the role, they completed a one-hour instruction session, during which we detailed the background of the scenario and reviewed our expectations of them, and a one-hour practice session with a CME+I staff member. As part of the training, we provided them with a list of 15 questions that we expected them to ask during each encounter with a resident (see Supplemental Digital Appendix 1 at http://links.lww.com/ACADMED/A173). This training took place in March 2011.

As part of the training process, we provided the participating internal faculty members with an example of a videotaped scenario focusing on a disclosure conversation for them to review, a copy of our hospital’s policies/procedures for error disclosure, a printed copy of the scenario, a medical debriefing guide used in our simulation lab for other resident training, a copy of the assessment questions on which we would evaluate the residents, and a copy of the 2008 Canadian Medical Protective Association report, titled “Communicating With Your Patient About Harm: Disclosure of Adverse Events.”25 Ten faculty members, representing family medicine, internal medicine, obstetrics–gynecology, and general surgery, participated.

Assessments

Prior to each session, the residents completed an online medical knowledge test via a computer terminal in the CME+I. The test included 21 true/false questions covering information in our hospital’s policies for error disclosure, which are based on best practices.

To quantify the skills needed to provide full disclosure of medical errors, we also created an online resident self-efficacy assessment based on the Accreditation Council for Graduate Medical Education (ACGME) core competencies (medical knowledge, interpersonal and communication skills, patient care/clinical skills, professionalism, systems-based practice, and practice-based learning and improvement). These six core competencies provide a set of standard principles by which residents are to be evaluated as well as a general framework for curriculum development.26,27 They also have been used for teaching risk management to residents.10,28

The self-efficacy assessment included 21 questions (see Appendix 2). The residents rated their confidence in performing aspects of error disclosure on a nine-point Likert scale (1 = not very confident, 9 = very confident). Self-efficacy, a well-established construct in social learning theory,29 has been used to evaluate the high-fidelity training of surgery residents in pediatric trauma30 as well as training in the disclosure of medical errors.20 The residents completed the self-efficacy assessment via a computer terminal in the CME+I.

Debriefing

After each simulation session, each resident completed a video review of the encounter. To assess the impact of the debriefing process, we randomly assigned the residents to review the video alone (n = 29, “self-debrief”) or with a faculty member (n = 26, “faculty debrief”).

We instructed residents who completed the self-debrief to watch their video and reflect on suggested prompts such as “What went well?” “What would I change?” and “How might this impact the way I treat patients?” Residents remained in their assigned debriefing group for both sessions. The self-debrief sessions lasted approximately 10 minutes.

Residents in the faculty debrief group watched their video with one faculty member immediately following each standardized patient encounter. Faculty members provided verbal feedback during the video review. The faculty debrief sessions lasted approximately 15 minutes.

External faculty reviewers

To provide more objective ratings of the residents’ skills, two trained faculty reviewers outside our program watched all of the videotaped encounters during June and July 2011. A computer technician from the CME+I randomly sorted the videotaped encounters so that the reviewers were blinded to which session (first or second) they were viewing. We provided DVD copies of the videotaped encounters with a deidentified study-specific number to the external reviewers. After the external reviewers conducted their assessments, we provided the randomization key to the statistician to properly link each resident’s Session 1 and 2 data.

The paper-based external faculty assessment tool was similar to the residents’ self-efficacy assessment but was revised to include anchors for each number on the Likert scale. We also removed the two questions rating the documentation in the patient’s medical record, because they were not directly observable in the simulation, resulting in a 19-question assessment.

The principal investigator (S.S.) trained the external reviewers in an effort to address potential issues with interrater reliability. As part of this training, each external reviewer watched and rated a sample of 10 videos so that we could gauge initial interrater reliability. The intraclass correlations for these ratings ranged from −0.33 to 0.82, suggesting reasonable overall agreement.

Statistical analysis

We calculated each resident’s knowledge assessment total score by summing the number of correct answers. We used paired t tests to compare knowledge assessment total scores between Sessions 1 and 2 and χ2 tests to compare responses to individual knowledge questions between sessions.
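
Readers wishing to reproduce this kind of analysis could do so in standard statistical software. As a minimal sketch only (written in Python with SciPy rather than the SPSS procedures we used, and run on simulated data with illustrative variable names), the total-score paired t test and a per-question χ2 comparison might look like the following:

```python
import numpy as np
from scipy import stats

# Illustrative data only (not the study data): rows = residents,
# columns = 21 true/false items, coded 1 = correct, 0 = incorrect.
rng = np.random.default_rng(0)
session1_items = rng.integers(0, 2, size=(53, 21))
session2_items = rng.integers(0, 2, size=(53, 21))

# Knowledge total score = number of correct answers per resident.
totals_1 = session1_items.sum(axis=1)
totals_2 = session2_items.sum(axis=1)

# Paired t test comparing total scores between Sessions 1 and 2.
t_stat, p_total = stats.ttest_rel(totals_1, totals_2)

# Chi-square test on the correct/incorrect counts for a single item
# (here item index 5, i.e., Question #6 in 1-based numbering).
item = 5
table = np.array([
    [session1_items[:, item].sum(), 53 - session1_items[:, item].sum()],
    [session2_items[:, item].sum(), 53 - session2_items[:, item].sum()],
])
chi2, p_item, dof, expected = stats.chi2_contingency(table)

print(f"Total-score paired t test: p = {p_total:.3f}")
print(f"Question #6 chi-square test: p = {p_item:.3f}")
```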

We summarized each ACGME core competency domain in the self-efficacy assessment using subscale scores consistent with the ACGME core competency guidelines.31 We summed the ratings of all the questions to create a total score on the self-efficacy assessment. We compared the residents’ responses to the self-efficacy assessments and the external reviewers’ ratings via paired t tests, which examine individual change on the basis of matched data. We used a Bonferroni correction to adjust for multiple comparisons in the Session 1 pretest–posttest, Session 2 pretest–posttest, and Session 1 pretest to Session 2 posttest analyses.
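
A minimal sketch of the subscale scoring and Bonferroni adjustment follows. The item-to-competency groupings mirror Appendix 2, but the data, the variable names, and the exact form of the correction (dividing α by the three comparison families) are illustrative assumptions rather than a reproduction of our analysis syntax:

```python
import numpy as np
from scipy import stats

# ACGME competency subscales, using the 1-based item numbers from Appendix 2.
SUBSCALES = {
    "medical knowledge": [1, 2],
    "interpersonal and communication skills": [3, 4, 5, 6],
    "patient care/clinical skills": [7, 8, 9, 10],
    "professionalism": [11, 12, 13, 14],
    "systems-based practice": [15, 16, 17, 18],
    "practice-based learning and improvement": [19, 20, 21],
}

def subscale_scores(ratings):
    """Sum a residents-by-21-items rating matrix into ACGME subscale scores."""
    return {name: ratings[:, [i - 1 for i in items]].sum(axis=1)
            for name, items in SUBSCALES.items()}

# Illustrative pre/post ratings on the 1-9 confidence scale (simulated data).
rng = np.random.default_rng(1)
pre = rng.integers(1, 10, size=(53, 21))
post = np.clip(pre + rng.integers(0, 3, size=pre.shape), 1, 9)

# Bonferroni-adjusted alpha for the three pre-post comparison families.
alpha = 0.05 / 3

pre_scores, post_scores = subscale_scores(pre), subscale_scores(post)
for name in SUBSCALES:
    t, p = stats.ttest_rel(pre_scores[name], post_scores[name])
    flag = "significant" if p < alpha else "not significant"
    print(f"{name}: p = {p:.4f} ({flag} at the adjusted alpha)")
```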

Residents had to complete all assessments to be included in the data analysis of each session. If a resident did not complete an assessment because he or she had to leave for clinical duties or a computer glitch caused the loss of the assessment data, we deleted the data listwise from the analysis for that session. Thus, we report both the numerator and denominator for all percentages in the Results to reflect that some calculations included a different number of participants.

Because an extensive review of the literature revealed no formal assessments of the skills needed to disclose medical errors, we created our own measurement scale (see Appendix 2). Intraclass correlation analysis demonstrated that this self-efficacy assessment has very high reliability (Cronbach α = 0.98; 95% confidence interval [CI]: 0.97–0.99; P < .001).
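
For readers unfamiliar with this reliability statistic, the following is a small, self-contained sketch of the standard Cronbach’s alpha formula applied to a respondents-by-items matrix; it is not our exact software procedure, and the data shown are simulated:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents-by-items rating matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    """
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Illustrative 1-9 ratings for 53 residents on the 21-item scale (simulated).
rng = np.random.default_rng(2)
ratings = rng.integers(1, 10, size=(53, 21)).astype(float)
print(f"Cronbach's alpha = {cronbach_alpha(ratings):.2f}")
```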

Intraclass correlations also indicated moderate agreement between the two external reviewers (Session 1: α = 0.40; 95% CI: 0.31–0.53; P < .001; Session 2: α = 0.44; 95% CI: 0.34–0.56; P < .001). Because of this level of agreement, we averaged the scores for Reviewer 1 and Reviewer 2 for each session for analyses via paired t tests. Finally, we created a total score on the external reviewers’ assessment by summing the ratings of all the questions. We conducted all analyses using SPSS version 19.0 (IBM Corp., Armonk, New York).

Results

Fifty-five PGY 1 residents participated: 36 (65.5%) men and 19 (34.5%) women. Four specialties were represented: 38 (69.1%) from internal medicine, 7 (12.7%) from general surgery, 5 (9.1%) from family medicine, and 5 (9.1%) from obstetrics–gynecology. Only 2 residents were foreign medical graduates (3.6%). Complete data were available for 53 residents.

Medical knowledge

We found no significant difference between the pretest mean (standard deviation [SD]) score of 18.6 (1.3) and the posttest score of 19.0 (1.2) in residents’ knowledge of institutional policies (P = .118) (see Supplemental Digital Table 1 at http://links.lww.com/ACADMED/A173). An examination of responses to the individual knowledge questions revealed three questions that were more commonly missed. One asked whether time constraints (Question #6) are a commonly cited barrier to physicians’ error disclosure; only 17.0% (9/53) of residents in Session 1 and 11.3% (6/53) in Session 2 correctly identified that time constraints are not a commonly cited barrier (P = .574). In both sessions, only 35.8% (19/53) correctly identified that staying with the patient and family until the chief medical officer or risk management arrived (Question #10) was not one of the immediate steps to be taken after a medical error (P = .239). We found some increase in the percentage of residents who correctly identified that the process of disclosure of unanticipated events does not include naming or listing the person or team involved in the event (Question #12), from Session 1 (40/53; 75.5%) to Session 2 (50/53; 94.3%), but this change was not statistically significant (P = .145).

Responses to only one question (Question #14) improved significantly (P = .011) from pretest to posttest: 88.7% (47/53) of participants before Session 1 versus 96.2% (51/53) of participants before Session 2 correctly identified that disclosure of an unanticipated event should not include speculation regarding the cause of the event.

Self-efficacy for disclosing medical errors

For Session 1, we found a significant increase in residents’ self-efficacy for disclosing medical errors between the pre- and postassessments. We saw this increase for all of the ACGME core competencies (P < .001) (see Table 1). The results also suggest that the increase in residents’ self-efficacy was sustained over the one-month period between the posttest for Session 1 and the pretest for Session 2 (scores differed by one point or less).

For Session 2, we also found a significant improvement in five of the six ACGME core competencies: medical knowledge, interpersonal and communication skills, patient care/clinical skills, professionalism, and practice-based learning and improvement (all P < .001). These results suggest that residents showed additional improvement from participating in a second session and from the self-paced learning tutorial.

In addition, residents’ self-efficacy significantly increased from the Session 1 pretest to the Session 2 posttest for all six of the ACGME competencies (P < .001). The increase is most evident in the total scores, which rose from a baseline mean (SD) of 119.6 (26.6) to a final posttest mean of 150.3 (24.9) (P < .001), demonstrating a large effect size (Cohen's d = 1.19).
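
The reported effect size can be reproduced from these summary statistics. As a hedged assumption about the exact formula, dividing the mean difference by the pooled standard deviation of the two time points yields the published value:

```python
import math

# Total self-efficacy score: Session 1 pretest vs. Session 2 posttest (mean, SD).
m_pre, sd_pre = 119.6, 26.6
m_post, sd_post = 150.3, 24.9

# Cohen's d as the mean difference over the pooled SD of the two time points
# (an assumption about the formula used; it reproduces the reported 1.19).
pooled_sd = math.sqrt((sd_pre ** 2 + sd_post ** 2) / 2)
d = (m_post - m_pre) / pooled_sd
print(f"Cohen's d = {d:.2f}")  # prints 1.19
```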

Self-debrief versus faculty debrief

In comparing the self-efficacy of residents in the self-debrief (n = 28) versus faculty debrief (n = 24) groups using independent-samples t tests, we found no significant differences on any of the ACGME competencies. The self-debrief group’s mean (SD) self-efficacy total score was 119.6 (27.6) at the Session 1 pretest and 145.6 (28.5) at the Session 2 posttest, a change of 25.9 points (95% CI: 15.3–36.6; P < .001). The faculty debrief group’s mean (SD) total score was 120.7 (26.7) at the Session 1 pretest and 156.1 (19.5) at the Session 2 posttest, a change of 35.4 points (95% CI: 26.0–44.9). On the basis of the overlapping 95% CIs, we found no significant difference in self-efficacy gains between the self-debrief and faculty debrief groups.
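
A sketch of this between-group comparison of change scores, with a 95% CI for each group’s mean change, is shown below. The per-resident change scores are simulated around the reported group means, and the standard deviations and variable names are illustrative assumptions:

```python
import numpy as np
from scipy import stats

def mean_change_ci(change, level=0.95):
    """Confidence interval for a group's mean change score."""
    return stats.t.interval(level, len(change) - 1,
                            loc=change.mean(), scale=stats.sem(change))

# Simulated per-resident change scores (Session 2 posttest minus Session 1
# pretest), centered on the reported group means; SDs are illustrative.
rng = np.random.default_rng(3)
self_debrief = rng.normal(25.9, 27.0, size=28)     # n = 28
faculty_debrief = rng.normal(35.4, 22.0, size=24)  # n = 24

t_stat, p_value = stats.ttest_ind(self_debrief, faculty_debrief)
print("self-debrief 95% CI:", mean_change_ci(self_debrief))
print("faculty-debrief 95% CI:", mean_change_ci(faculty_debrief))
print(f"independent-samples t test: p = {p_value:.3f}")
```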

External reviewers

Several videos had technical issues and could not be reviewed; thus, we evaluated the data from 44 residents who completed Session 1 and Session 2. We combined the external reviewers’ ratings (see Supplemental Digital Table 2 at http://links.lww.com/ACADMED/A173 for final intraclass correlation coefficients). We found significant improvement in responses to eight assessment questions related to interpersonal and communication skills, professionalism, systems-based practice, and practice-based learning and improvement (all P values ≤ .003; see Table 2). The external reviewers’ total score mean (SD) also significantly improved from Session 1 to Session 2: 94.6 (17.4) to 106.8 (20.0) (P < .001), demonstrating a medium effect size (Cohen's d = 0.65).

Discussion

Our findings suggest that simulation encounters with standardized patients are an effective tool for improving residents’ self-efficacy for disclosing medical errors. Previous research supports the effectiveness of standardized patients for teaching error disclosure skills to medical students.12 However, most of the research on the disclosure of medical errors has focused on surveying health professionals for their attitudes and beliefs.5,13–18

From previous research, we know that the medical field is aware of the problems surrounding the disclosure of medical errors. Our research takes the next step by providing an easily implemented educational intervention to address these issues, which is especially important because only 20% of trainees and 21% of faculty report receiving adequate training on disclosing medical errors.4 Although we targeted residents with our educational program, it may also be effective for training attending faculty.

In comparison with other recently published studies, our research has several strengths. One recent educational intervention put 15 PGY 4 obstetrics–gynecology residents through a 30-minute small-group didactic seminar followed by three hours of practice in a group setting with a standardized patient.20 Although those researchers also found increases in residents’ self-efficacy, the residents in their study rated their preintervention performance retrospectively, after completing the educational intervention. Our study used a pre–post design that assessed residents prior to exposure to the educational intervention. Our study also included blinded external reviewers who evaluated residents’ skills and behavior. Furthermore, as PGY 1 residents are more likely to face challenges disclosing errors because of their lack of experience, we believe that our intervention is better suited to a first-year program. In contrast, the use of PGY 4 residents is problematic because their varied exposure to adverse events or medical errors over the course of training could affect the outcomes of the intervention.

Another study of internal medicine residents focused on evaluations by residents, standardized patients, and an internal observer.24 That study also did not use a pre–post design; it was primarily descriptive in nature and focused on the residents’ ability to disclose a medical error from the standardized patient and observer perspectives, with no formal reliability evaluation. While focusing on similar content specific to communication (explanation, honesty, and empathy) and practice-based learning and improvement (prevention), our study examined additional outcomes as defined by the ACGME. The ACGME core competencies that we used to evaluate residents provide additional support for responsiveness to medical errors from a systems perspective.

Medical knowledge

Overall, residents scored well on both tests (before Sessions 1 and 2) of their knowledge of our institutional policies regarding medical errors. This finding suggests that the knowledge test was either too easy or the residents were already familiar with our institutional policies. We did find an increase in the percentage of residents who correctly identified that the process of disclosure of unanticipated events does not include naming or listing the person or team involved in the event, which is an important component of error disclosure.32 This finding speaks to the success of our educational program.

Self-efficacy for disclosing medical errors

The most compelling findings of our study were the increases in residents’ self-efficacy. Previous studies reported the emotional toll that medical errors have on residents.8,9 In light of these findings, we feel that practicing the disclosure of an error in a simulated setting allows residents to better manage their emotional and physiological state. Thus, the scenarios we used could play an important role in interventions to build self-efficacy in residents.33 Self-efficacy assessments are predictive of future behavior34 and have been useful for identifying individuals who need remediation.35

Previous research evaluating examinees’ performance on the Step 2 Clinical Skills examination found that those who failed the examination the first time, then encountered the same standardized patient and the same clinical scenario again (and passed the second time), did not score significantly higher because of the repeat exposure to the information.36 That finding might suggest that residents’ confidence in disclosing medical errors would not improve if they participated in a repeated simulation scenario; however, our findings suggest that further practice, even with the same scenario, is still beneficial to residents in terms of their self-efficacy for disclosing medical errors. Because our study also included a self-paced learning tutorial between sessions, we cannot definitively attribute the increase in residents’ self-efficacy solely to participating in the standardized patient encounters. The self-paced tutorial may have provided residents with additional knowledge and prompted reflection on the standardized patient experience.

Self-debrief versus faculty debrief

Our findings suggest that a self-debriefing protocol (reviewing the videotaped encounter without an attending physician) may be as useful as an attending-physician-led debriefing. Self-assessment may provide the psychologically safe environment needed for debriefing while also allowing residents to identify their own strengths and weaknesses, a skill that is critical to lifelong learning in the health professions.37,38 As debriefing can be time consuming for attending physicians, this finding suggests that this aspect of educational training could be streamlined.

Although research on debriefing has not defined a set method for the process,39 our findings are consistent with those of a recent study. A prospective randomized controlled study using a high-fidelity simulation for crisis management found that anesthesiology residents’ nontechnical skills for crisis resource management (task management, situation awareness, and decision making) improved regardless of the type of debriefing they received (self-debriefing versus faculty debriefing).40

External reviewers

Because the external faculty were unfamiliar with the residents and were blinded to the order of the session video recordings, our results provide objective support for residents’ improvement on the questions assessing the ACGME core competencies. These findings also support residents’ self-reported improvement from participating in the simulation scenarios and the debriefing sessions. In addition, our findings are consistent with calls for resident education on error disclosure that include improvements in the ACGME core competencies of professionalism, interpersonal and communication skills, and practice-based learning and improvement.32 Our research was also able to demonstrate improvement in the systems-based practice domain.

Limitations

Our study has a number of limitations. First, the debriefing stage may have been strengthened by the use of a single faculty debriefer, which would have provided a more uniform approach. The involvement of numerous faculty from different medical disciplines may have contributed to wide variability in the quantity and quality of the debriefing that residents received. The overall time commitment, however, would have made it difficult to enlist the participation of only one faculty attending.

Second, the video recordings provided an efficient medium for the external reviewers to assess the residents’ skills and behavior. However, certain communication channels, such as eye contact, may be difficult to assess via video recordings.

Third, on the basis of the timing of assessments, we cannot attribute any individual outcomes to the self-paced learning tutorial. Because the Session 1 posttest scores and the Session 2 pretest scores were essentially unchanged (within one point), we believe that the self-paced learning tutorial had less impact than the experiential aspects of the simulation encounters.

Finally, assessing residents’ behavior in the clinical setting would provide more external validity to our educational program. However, a study examining the disclosure of medical errors by residents in the clinical setting would be difficult to execute. Such research would involve the consent of patients and families, which would likely add undue stress to both the health professional and the patient. Therefore, we will continue to include simulated patients in resident training.

Conclusions

Timely, explicit, and empathetic disclosure of medical errors to patients and family is essential to maintaining trust between physicians and patients and is an important part of patient-centered medical care. However, a gap exists between the ethical imperative to disclose medical errors and providers’ ability to do so. We devised this educational program to close this gap via the use of a simulated encounter. After completing the simulation scenario twice, residents reported significant increases in their self-efficacy for disclosing a medical error. External reviewers also detected these improvements. Finally, we found that a self-debriefing was as efficacious as a faculty debriefing. In conclusion, this intervention, including the simulation encounters, could be easily replicated in other settings and is applicable to many members of the health care team, not just to residents.

Acknowledgments: The authors would like to thank Jennifer Beard, MD, Anthony Casey, MD, and Scott Merryman, MD.

References

1. Witman AB, Park DM, Hardin SB. How do patients want physicians to handle mistakes? A survey of internal medicine patients in an academic setting. Arch Intern Med. 1996;156:2565–2569

2. Hobgood C, Peck CR, Gilbert B, Chappell K, Zou B. Medical errors—what and when: What do patients want to know? Acad Emerg Med. 2002;9:1156–1161

3. Hingorani M, Wong T, Vafidis G. Patients’ and doctors’ attitudes to amount of information given after unintended injury during treatment: Cross sectional, questionnaire survey. BMJ. 1999;318:640–641

4. Bell SK, Moorman DW, Delbanco T. Improving the patient, family, and clinician experience after harmful events: The “when things go wrong” curriculum. Acad Med. 2010;85:1010–1017

5. Kaldjian LC, Jones EW, Wu BJ, Forman-Hoffman VL, Levi BH, Rosenthal GE. Disclosing medical errors to patients: Attitudes and practices of physicians and trainees. J Gen Intern Med. 2007;22:988–996

6. Gallagher TH, Studdert D, Levinson W. Disclosing harmful medical errors to patients. N Engl J Med. 2007;356:2713–2719

7. National Quality Forum. Safe Practices for Better Healthcare—2010 Update. Washington, DC: National Quality Forum; 2010. http://www.qualityforum.org/Publications/2010/04/Safe_Practices_for_Better_Healthcare_%e2%80%93_2010_Update.aspx. Accessed September 19, 2013

8. Hobgood C, Hevia A, Tamayo-Sarver JH, Weiner B, Riviello R. The influence of the causes and contexts of medical errors on emergency medicine residents’ responses to their errors: An exploration. Acad Med. 2005;80:758–764

9. West CP, Huschka MM, Novotny PJ, et al. Association of perceived medical errors with resident distress and empathy: A prospective longitudinal study. JAMA. 2006;296:1071–1078

10. Jacob JA. Principled approach to error disclosure aligns with ACGME competencies, enhances safety and helps institutions improve. ACGME Bull. 2009;3

11. Robbennolt JK. Apologies and medical error. Clin Orthop Relat Res. 2009;467:376–382

12. Gunderson AJ, Smith KM, Mayer DB, McDonald T, Centomani N. Teaching medical students the art of medical error full disclosure: Evaluation of a new curriculum. Teach Learn Med. 2009;21:229–232

13. Muller D, Ornstein K. Perceptions of and attitudes towards medical errors among medical trainees. Med Educ. 2007;41:645–652

14. Gallagher TH, Garbutt JM, Waterman AD, et al. Choosing your words carefully: How physicians would disclose harmful medical errors to patients. Arch Intern Med. 2006;166:1585–1593

15. Gallagher TH, Waterman AD, Garbutt JM, et al. US and Canadian physicians’ attitudes and experiences regarding disclosing errors to patients. Arch Intern Med. 2006;166:1605–1611

16. Garbutt J, Brownstein DR, Klein EJ, et al. Reporting and disclosing medical errors: Pediatricians’ attitudes and behaviors. Arch Pediatr Adolesc Med. 2007;161:179–185

17. White AA, Gallagher TH, Krauss MJ, et al. The attitudes and experiences of trainees regarding disclosing medical errors to patients. Acad Med. 2008;83:250–256

18. Kaldjian LC, Jones EW, Wu BJ, Forman-Hoffman VL, Levi BH, Rosenthal GE. Reporting medical errors to improve patient safety: A survey of physicians in teaching hospitals. Arch Intern Med. 2008;168:40–46

19. White AA, Bell SK, Krauss MJ, et al. How trainees would disclose medical errors: Educational implications for training programmes. Med Educ. 2011;45:372–380

20. Bonnema RA, Gosman GG, Arnold RM. Teaching error disclosure to residents: A curricular innovation and pilot study. J Grad Med Educ. 2009;1:114–118

21. Pichert JW, Hickson GB, Trotter TS. Malpractice and communication skills for difficult situations. Ambul Child Health. 1998;4:213–221

22. van der Vleuten CPM, Swanson DB. Assessment of clinical skills with standardized patients: State of the art. Teach Learn Med. 1990;2:58–76

23. Chan DK, Gallagher TH, Reznick R, Levinson W. How surgeons disclose medical errors to patients: A study using standardized patients. Surgery. 2005;138:851–858

24. Stroud L, McIlroy J, Levinson W. Skills of internal medicine residents in disclosing medical errors: A study using standardized patients. Acad Med. 2009;84:1803–1808

25. Canadian Medical Protective Association. Communicating With Your Patient About Harm: Disclosure of Adverse Events. http://www.cmpa-acpm.ca/cmpapd04/docs/resource_files/ml_guides/disclosure/pdf/com_disclosure_toolkit-e.pdf. Accessed September 19, 2013

26. Holt KD, Miller RS, Nasca TJ. Residency programs’ evaluations of the competencies: Data provided to the ACGME about types of assessments used by programs. J Grad Med Educ. 2010;2:649–655

27. Mainiero MB, Lourenco AP. The ACGME core competencies: Changing the way we educate and evaluate residents. Med Health R I. 2011;94:164–166

28. Nissen K, Angus SV, Miller W, Silverman AR. Teaching risk management: Addressing ACGME core competencies. J Grad Med Educ. 2010;2:589–594

29. Bandura A. Self-Efficacy: The Exercise of Control. New York, NY: W.H. Freeman; 1997

30. Popp J, Yochum L, Spinella PC, Donahue S, Finck C. Simulation training for surgical residents in pediatric trauma scenarios. Conn Med. 2012;76:159–162

31. Swing SR, Clyman SG, Holmboe ES, Williams RG. Advancing resident assessment in graduate medical education. J Grad Med Educ. 2009;1:278–286

32. McDonald T, Smith KM, Mayer D. “Full disclosure” and residency education: Resident learning opportunities within the context of a comprehensive program for responding to adverse patient events. ACGME Bull. 2008:5–9

33. Young HN, Schumacher JB, Moreno MA, et al. Medical student self-efficacy with family-centered care during bedside rounds. Acad Med. 2012;87:767–775

34. Reuter T, Ziegelmann JP, Wiedemann AU, et al. Changes in intentions, planning, and self-efficacy predict changes in behaviors: An application of latent true change modeling. J Health Psychol. 2010;15:935–947

35. Artino AR Jr, Hemmer PA, Durning SJ. Using self-regulated learning theory to understand the beliefs, emotions, and behaviors of struggling medical students. Acad Med. 2011;86(10 suppl):S35–S38

36. Swygert KA, Balog KP, Jobe A. The impact of repeat information on examinee performance for a large-scale standardized-patient examination. Acad Med. 2010;85:1506–1510

37. Rudolph JW, Simon R, Dufresne RL, Raemer DB. There’s no such thing as “nonjudgmental” debriefing: A theory and method for debriefing with good judgment. Simul Healthc. 2006;1:49–55

38. Leach DC. Competence is a habit. JAMA. 2002;287:243–244

39. Raemer D, Anderson M, Cheng A, Fanning R, Nadkarni V, Savoldelli G. Research regarding debriefing as part of the learning process. Simul Healthc. 2011;6(suppl):S52–S57

40. Boet S, Bould MD, Bruppacher HR, Desjardins F, Chandra DB, Naik VN. Looking in the mirror: Self-debriefing versus instructor debriefing for simulated crises. Crit Care Med. 2011;39:1377–1381

Appendix 1 Scenario Used in a Simulation of Medical Error Disclosure to a Standardized Patient by Postgraduate Year 1 Residents From Multiple Specialties, OhioHealth Riverside Methodist Hospital, 2011

You are on call and called to see a patient in the step-down intensive care unit who has become unarousable and has altered mental status. The handoff checkout you received on the patient states that this is a 66-year-old man who recently underwent an abdominal aortic aneurysm repair. He has been recovering well, and this is postoperative day #2. The surgical team has been pleased with his postoperative progress so far. The nurse tells you that he seemed okay all evening, and approximately 15 minutes ago she gave him his pain medication as scheduled (1 mg IV Dilaudid). As you enter and scan the room, you notice an empty vial of Dilaudid on the nursing server and confirm with the nurse that this is what she gave. Upon closer inspection, you notice the vial is Dilaudid HP, 10 mg/1 mL. The nurse confirms with you that this is the medicine she gave, administering the whole vial, and she then recognizes that she may have inadvertently given too much Dilaudid. Suspecting this too, you immediately ask the nurse to administer Narcan. The nurse gives the Narcan, and the patient immediately becomes more arousable but still is bradypneic.

For closer monitoring and to ensure the patient’s hemodynamics, breathing, and mental status remain stable, you transfer the patient back to the ICU. You have notified the surgical team and are now asked by the nurse manager to discuss the situation with the family member who is anxiously waiting in the waiting area.

Appendix 2 Self-Assessment Tool Used to Evaluate the Performance of Postgraduate Year 1 Residents Participating in a Simulation of Disclosing a Medical Error to a Standardized Patient, OhioHealth Riverside Methodist Hospital, 2011

When disclosing a medical error, how confident are you in your ability to:

 (Not very confident 1 2 3 4 5 6 7 8 9 Very confident)

MEDICAL KNOWLEDGE

1. Explain the nature of an unanticipated outcome.

2. Give a factual explanation of what is known to have contributed to an unanticipated outcome.

INTERPERSONAL AND COMMUNICATION SKILLS

3. Communicate a statement that an unexpected event occurred.

4. Provide an expression of regret/empathy to the patient (or family).

5. Answer questions posed by the patient (or family) about a medical error.

6. Maintain eye contact at comfortable intervals throughout the interview.

PATIENT CARE/CLINICAL SKILLS

7. Describe possible consequences of the event, including immediate or long-term effects (if any).

8. Give information regarding any tests, procedures, therapies that may be necessary as a result of the event.

9. Give information regarding change in level of monitoring or change in level of care that may be necessary as a result of the event.

10. Make a plan for managing the patient’s condition and comfort.

PROFESSIONALISM

11. Avoid the use of medical jargon.

12. Respond to patient/family concerns and expectations.

13. Express willingness to be helpful to the patient/family in addressing their concerns.

14. Identify who will communicate with the patient and/or the family on an ongoing basis.

SYSTEMS-BASED PRACTICE

15. Offer to obtain support from appropriate health care workers (e.g., social workers, patient representative, clergy).

16. Notify the correct people in hospital administration that an error has occurred.

17. Provide adequate documentation in the patient’s medical chart regarding the event.

18. Provide appropriate documentation in the patient’s medical chart regarding your discussion.

PRACTICE-BASED LEARNING AND IMPROVEMENT

19. Communicate that the factors that resulted in the event will be investigated.

20. Describe the steps that will be taken to evaluate contributing factors.

21. Describe the steps that will be taken to prevent future recurrences.

© 2014 by the Association of American Medical Colleges
