
Empirical Investigations

Exploring Faculty Approaches to Feedback in the Simulated Setting

Are They Evidence Informed?

Roze des Ordons, Amanda Lee MD, MMEd; Cheng, Adam MD; Gaudet, Jonathan E. MD, MSc; Downar, James MD, MSc; Lockyer, Jocelyn M. PhD; Cumming School of Medicine Feedback and Debriefing Investigators

Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare: June 2018 - Volume 13 - Issue 3 - p 195-200
doi: 10.1097/SIH.0000000000000289


Many faculty preceptors feel ill-prepared to lead feedback conversations, whether in clinical or simulated settings.1,2 Faculty have expressed uncertainty about balancing reinforcing and corrective feedback and have questioned their own ability to respond to variability in resident insight, receptivity, and skill.1 Their uncertainty may stem from inexperience and a lack of training in approaching feedback.3 Although there are descriptions of studies in which faculty are taught generic skills, there is a paucity of literature that critically examines faculty development in how to have effective conversations about performance,4 and how faculty can optimally adapt process and content to the wide range of learner behaviors and attitudes.

Feedback conversations are complex. They involve discussion of specific information about performance relative to a standard.5 When approached effectively, these conversations facilitate reflection; help learners identify strengths, gaps in performance, and alternative strategies; and encourage translation of learning into other contexts.1,6 Helping faculty develop skills in feedback is particularly critical at a time when residency programs have adopted competency-based medical education (CBME) approaches that require more frequent and timely feedback to support residents' progress toward milestones and entrustable professional activities.7,8

Recognizing the importance of feedback for learning and the extensive body of research published on feedback in recent years, Lefroy et al.6 have developed a set of guidelines to offer some clarity in the midst of complexity. These guidelines were derived from a review of the literature and expert consensus and offer compelling evidence to guide feedback in clinical education.6 They provide recommendations for the process and content of feedback and are framed as a series of “do's, don'ts, and don't knows” for individual preceptors and for institutional culture, with the strength of each recommendation specified.6 Although guidelines can help assimilate large amounts of evidence, their generalizations can become oversimplifications; without descriptions of what the recommended behaviors look like in practice, faculty may be mystified as to how to apply the guidelines, leading to considerable variability in practice. Other faculty may not be aware that the guidelines exist. The purpose of our study was to examine how the specific behaviors that faculty adopt in feedback conversations align with the guideline-based recommendations that have the most substantive evidence.


Methods

This qualitative research study9 was conducted using template analysis10 to examine the range of approaches used in simulated feedback encounters. Template analysis was chosen as a way to organize the large amounts of textual data generated by complex feedback conversations in a way that would facilitate further in-depth analysis. We elected to use simulated encounters to approximate the closed-door context in which sensitive feedback conversations between faculty and residents usually take place within the clinical setting. For the purpose of this study, we use the term “feedback” to refer to conversations between faculty and residents and “debriefing” for conversations between facilitators and faculty.


Participants

Faculty preceptors who had volunteered for a series of four feedback simulations, through which they would receive individual feedback from skilled facilitators, were invited to participate in the study. The facilitators who had volunteered to debrief the simulations were also invited. The facilitators were a group of 11 experienced simulation educators with previous training in simulation and debriefing, and 3 to 10 years of experience in leading simulation-based learning and debriefing at local, national, and international workshops and conferences. Two facilitators had a background in nursing, and nine were physicians from various medical specialties.

Data Collection

Each faculty member engaged in four simulated feedback conversations with actors who portrayed residents with “good communication skills,” “insight gaps,” “overconfidence,” and “emotional distress” when discussing goals of care with a patient who had lung cancer. An edited 2-minute video clip provided a rich portrayal of a goals of care discussion between an actor resident and actor patient to set the stage for each of the simulations. The faculty member engaged in feedback with the resident for 8 to 10 minutes, followed by the facilitator debriefing with the faculty member for another 8 to 10 minutes. The feedback and debriefing conversations were video recorded, converted to audio files, and transcribed verbatim to facilitate textual analysis.

Data Analysis

All of the videos were reviewed before conversion by at least one of the researchers to gain a full understanding of the data. Two members of the research team (A.L.R., J.M.L.) analyzed the video transcripts using template analysis.10 We began by developing a coding template, with a priori themes based on the following four Lefroy “do's” recommendations designated as having “strong” evidence (italicized): feedback as a social interaction, feedback tailored to the individual, offering specific feedback, and feedback as actionable.6 Feedback conversations between simulated residents and faculty were the focus of analysis, with debriefing between facilitators and faculty used to understand whether faculty members' approaches were aligned with the guidelines and to better understand the intentions behind their approaches.

Both A.L.R. and J.M.L. read through the transcripts to become familiar with the data. The two researchers independently analyzed the transcripts using inductive coding with constant comparison, whereby codes emerging from successive transcripts were applied to previously analyzed transcripts.9 Meetings were held after analyzing every three to five transcripts to compare findings and seek to understand the data in greater depth, and discrepancies were resolved through discussion. Meetings continued until meaning saturation was achieved,11 after which A.L.R. and J.M.L. continued to code the remaining transcripts. Codes were then mapped to the template of a priori themes, and subthemes that had emerged from the transcripts were identified (Appendix 1). After all transcripts had been coded and mapped, the data were organized to describe behaviors that aligned with Lefroy's guidelines6 and those that did not. Ethics approval for the study was obtained through the University of Calgary Conjoint Health Research Ethics Board.


Results

A total of 21 faculty members participated in the simulations, and 18 consented to be part of the research study. Faculty were physicians from a range of medical (15) and surgical (3) specialties, with a mean of 15.1 years in clinical practice (range = 0.5–35 years). Seven (38.9%) of the 18 faculty had previously taken a workshop on feedback.

There were 55 videos of faculty-resident feedback and 56 videos of facilitator-faculty feedback available for analysis. Qualitative analysis of the video transcripts identified different approaches to the process of feedback and the content of feedback. In examining the approaches that faculty adopted, we found variability in alignment with Lefroy's recommended practices,6 with some faculty more skilled than others. Our findings are summarized in Table 1 and described in detail below. Italics are used to highlight Lefroy's recommended practices.6 Names in the quotations have been changed to preserve anonymity.

Table 1. Alignment Between Faculty Approaches to Feedback and Evidence-Informed Practices

Process Elements of Feedback

Faculty whose approach reflected feedback as a social interaction greeted the resident with a warm welcome, setting the tone for the remaining conversation, rather than a brief “hello” or an immediate start to feedback; for example:

Faculty: Hi. I'm Jane. Nice to meet you. And you are?

Resident: Breanna.

Faculty: Breanna. Breanna, thank you very much for participating. I really appreciate when you volunteer for this type of sessions and we really hope this program will be useful. Um, that was sort of a heavy, difficult conversation, hey there? How did you feel? (Video 23.1)

Those who exemplified the social element of feedback also adopted a fluid conversational approach; this included retaining flexibility and fluency of language within a structured framework, openly sharing their own perspectives rather than keeping the resident guessing, seeking and exploring resident responses to achieve depth of understanding, actively listening and responding in turn, and attending to emotion, as illustrated in the passages below.

Faculty: And you used some not only language but also body language not quite in keeping with the content of the discussion. Things like “awesome,” “perfect,” You know you're really sort of talking about something heavy and I was wondering what your mindset was going in? (Video 51.1)

Faculty: What about that situation you were in? It was a tough situation for him right and then I think from what I saw subsequently for you; what sort of made you forget those things?

Resident: Um well that's just it because being there in that situation and I couldn't even focus on what he was actually saying. Because the whole time I was thinking of um my um…um I don't know if this…

Faculty: It's okay. Take your time.

Resident: Because it's just seeing him reminded me of my grandfather who's also got lung cancer.

Faculty: I'm so sorry to hear that. We have Kleenex in here… It's really important that you remember that no one exists in isolation to the rest of the world. We all have families, we all have loved ones right? I think it makes you a stronger physician if you can relate to your patients. Obviously it makes it tough. There are certainly things that happen in your personal life for the rest of your career that are going to make going to work difficult and are going to make doing things at work difficult. (Video 39.1)

This contrasted with the more scripted, checklist-type approach, focused on behaviors, that other faculty members adopted.

Resident: I, I mean just [pause] I didn't [pause] [exhales while crying].

Faculty: Yeah, yeah, yeah, no, don't be upset. You- this is about how we learn so um what we're going to try and do today is look at um the situation and, and learn how to be more comfortable with it and get so that you feel good about it. (Video 32.1)

Content Elements of Feedback

Faculty skilled in tailoring feedback to the individual flexibly adapted the agenda and focus of feedback to the resident's needs. They achieved a balance between reinforcing and corrective feedback rather than focusing only on strengths or weaknesses, and their questions to the resident were asked in such a way as to facilitate self-reflection.

Faculty: You were sitting down and you were comfortable, but you were still separated.

Resident: Yeah.

Faculty: You could have reached out a little bit more to him. You were empathic but you could have shown a little bit more empathy with your body and your gestures, which I think would have added even more empathy on top of what was already very good. So, I mean that's something to work on for next time; it's just the touching the shoulder or the arm, and you know maybe reaching in a little bit more with your tone of voice and asking more about the family members to kind of create a little bit more connection. (Video 4.1)

Resident: It was an effort to connect with him as I'd never met him before. So yeah, that was just a way to help humanize the encounter.

Faculty: Okay. How did you see him reacting to that or to the discussion? You know, what was your read of his body language? (Video 51.1)

Faculty who integrated specific descriptions of observed and potential future performance into feedback conversations offered specific examples of actions and verbatim phrases to help the resident recall or become aware of their behaviors and the associated impact on the patient, setting the stage for coaching.

Faculty: I thought you did an excellent job of providing statements that made the patient feel that you understood what he was saying, um that you definitely were empathetic toward his situation. You said, “I'm sorry to hear that you're not feeling well today,” for example. And then he elaborated a little bit, I thought in a very helpful way. Another time he said he didn't want to suffer, and then you stopped right then and you listened. You chose that as a point to listen and then he came up with more information, it was very specific about what he meant about not suffering. So I felt that your timing was excellent when you paused. (Video 3.1)

This was in contrast to the nonspecific comments that were incongruent with the approach advocated in the guidelines:

Faculty: I think you showed that you cared and you truly wanted to be there for him and you definitely demonstrated empathy part in your communication skills (Video 52.1)

Faculty skilled in offering specific descriptions discussed corrective feedback and offered concrete examples of alternate strategies, with teachings grounded in personal experience rather than abstract concepts.

Faculty: One thing I noticed, um, to kind of add on what you've been saying and to kind of put it all together is that, it was almost a. a perfect script that you were following, and you really hit on all the points, but you could have put things in your own words, you know, to make them sound a little bit less scripted. (Video 5.1)

Faculty: So one of the things I sometimes find helpful when I'm talking to someone in this situation is to ask him a little bit initially about what he understands about his cancer or what he's already been told about prognosis or the direction things are going and that sort of thing (Video 24.1)

Faculty: So, I might have asked the patient, um, tell me what you know about your illness before mentioning goals of care.

Resident: Right. Okay.

Faculty: So I think I would have perhaps said to him, “Tell me a bit about yourself. How long have you been unwell? What have you been told about your illness?” And then I might have asked him a bit more about his family in a, in a more conversational way. (Video 8.1)

Faculty who were particularly effective in helping residents develop actionable items to improve performance adopted a more goal-oriented approach; a shared understanding of strengths and gaps was achieved earlier in the conversation, allowing more time for problem-solving and for developing a plan to close gaps and refine skills. In helping residents identify these actionable items, faculty scaffolded new ideas to prior knowledge. This involved matching their language to that used by the resident and using residents' responses and ideas as a platform upon which to share their own perspectives and teach.

Faculty: I agree with you in that I think we do need to display the facts and not give them a false sense of reality, it's normal to approach things that are difficult with facts initially. I sometimes find that we're able to have patients relate to us and maybe open up about what they would want done in situations of end of life and share their values with simple gestures such as sitting down, being at their eye level, making eye contact… I wonder how you might be able to translate that to the patient? How do you think that you could have posed [the question] to the patient in such that you were able to give the facts, but also find a way to relate to his concerns? (Video 28.1)

Behaviors not aligned with the guidelines included repeatedly reframing questions to arrive at the response that the faculty member had in mind, as illustrated below, and not spending time on developing a plan for ongoing resident development.

Faculty: So, um, I'd like to get a clearer, a clearer idea of what your intention was in this interaction with Mr. Anderson, like what were you, what were you talking to him, what was your goal in, in, in that session with him?… What did you come away with from your discussion with him about what you felt his understanding of his condition was?… I just want to get it clear in my mind what your goal was, what you were trying to accomplish in that session. (Video 45.1)

Facilitator: I think you asked a couple of times “Tell me if you felt this way,” you asked about insight and then you asked again about his perspective and then you kind of said, “well how would you do it differently,” and it almost seemed like “I've got an idea, I want to see if you're going to come to it. And I'm trying to pull it out of you.” And then [the resident] starts going, “What's up? Like if you've got something to tell me, just tell me.” (Video 53.2)


Discussion

Engaging residents in feedback conversations is challenging for faculty and a skill that develops with experience. How different faculty members approach feedback has not previously been well described. This study provides an examination of feedback practices among faculty within a simulated setting, demonstrating variability in how faculty have adopted recommendations supported by the strongest available evidence.

Applying a qualitative approach, we were able to describe the nuances of structure, focus, and language that faculty adopted in feedback conversations in relation to current guidelines. There was considerable variability in how faculty members' approaches aligned with the recommended practices delineated by Lefroy et al,6 in which optimal feedback is framed as a social interaction that is tailored, specific, and actionable. Faculty demonstrated a range of behaviors within each of the recommended practices. Faculty whose behaviors aligned with the recommendations established rapport with the resident and adopted a goal-focused approach, helping residents understand their strengths and gaps and develop strategies to improve their performance; such behaviors are well aligned with a coaching model.12 Behaviors not aligned with Lefroy's recommendations6 resulted in a transactional and task-focused approach to feedback, without attempts to establish a relationship, in which perspectives were concealed, questions were scripted and answers not explored, and the path forward was either vague or not addressed. Between these two extremes were numerous gradations, and none of the faculty uniformly excelled or performed poorly across all of Lefroy's recommended practices6; performance varied within and between simulations.

Our findings have implications for medical education and faculty development as CBME is adopted, where milestones and entrustable professional activities attest to acceptable levels of performance.8 Faculty will need to be skilled in feedback conversations that guide resident development. While some feedback will occur in the workplace, patient safety concerns have led to an increase in the use of simulated experiences to allow more opportunities for deliberate practice, where residents can attain and demonstrate skills before applying them in the context of patient care.13,14 Faculty will therefore need to develop their skills in discussing feedback in a timely and focused way,15 in both simulated and clinical settings. Given the challenges inherent in feedback conversations, it is unlikely that checklists to guide faculty in these assessments will be sufficient; faculty also require deliberate practice and feedback about these skills,16 which might best take place in simulated settings where they can try out different approaches with actors in a safe learning environment, and receive coaching on their performance.

There are limitations to our research. We selected Lefroy's four strongest “do's” recommendations as our template.6 This was deliberate because these all focused on the individual resident and could be assessed during a relatively brief encounter, whereas the “do's” pertaining to culture could not be assessed in a simulated setting, and Lefroy's “don'ts” had only moderate or tentative levels of evidence for process and content elements of feedback.6 The conversion of video to audio and then to textual data for analysis, while making the analysis feasible, may have reduced our understanding of some of the nuances of feedback conveyed through nonverbal communication. Template analysis has the potential to further constrain findings by using a predetermined lens to inform data analysis; limiting the number of themes in our original template allowed subthemes to emerge through data analysis. We also recognize that the study took place at a single Canadian institution, and our setting may differ from others. Lastly, the findings from a simulation may not reflect what happens within the greater complexity of the clinical environment.


Conclusions

The ability to engage in meaningful feedback conversations with residents is a critical skill for faculty in both clinical and simulated settings. With regular assessment as the cornerstone of CBME frameworks, faculty will need to be skilled in feedback to help residents develop insight into their strengths and performance gaps and develop strategies to achieve their learning goals. Our research describes how faculty approach feedback encounters and the variability in practice relative to evidence-informed recommendations, and it highlights applications of simulation for faculty development and coaching in cultivating feedback skills.


Acknowledgments

The authors thank James Huffman, Jason Lord, Kristin Fraser, Ghazwan Altabbaa, Sue Barnes, Bobbi Johal, and Gord McNeil for their contributions in debriefing the simulated feedback sessions. The authors also thank Helen Catena for acting in the videos and Simon Guienguere for helping to convert the video to audio files.


References

1. Kogan JR, Conforti LN, Bernabeo EC, Durning SJ, Hauer KE, Holmboe ES. Faculty staff perceptions of feedback to residents after direct observation of clinical skills. Med Educ 2012;46:201–215.
2. Rudolph JW, Foldy EG, Robinson T, Kendall S, Taylor SS, Simon R. Helping without harming: the instructor's feedback dilemma in debriefing – a case study. Simul Healthc 2013;8:304–316.
3. Eppich W, Cheng A. Competency-based simulation education: should competency standards apply to simulation educators? BMJ STEL 2015;1:3–4.
4. Cheng A, Grant V, Robinson T, et al. The promoting excellence and reflective learning in simulation (PEARLS) approach to health care debriefing: a faculty development guide. Clin Simul Nurs 2016;12:419–428.
5. Cheng A, Grant V, Dieckmann P, Arora S, Robinson T, Eppich W. Faculty development for simulation programs: five issues for the future of debriefing training. Simul Healthc 2015;10:217–222.
6. Lefroy J, Watling C, Teunissen PW, Brand P. Guidelines: the do's, don'ts and don't knows of feedback for clinical education. Perspect Med Educ 2015;4:284–299.
7. Dath D, Iobst W. The importance of faculty development in the transition to competency-based medical education. Med Teach 2010;32:683–686.
8. Carraccio C, Englander R, Van Melle E, et al. Advancing competency-based medical education: a charter for clinician-educators. Acad Med 2016;91:645–649.
9. Braun V, Clarke V. Successful qualitative research: a practical guide for beginners. London: Sage; 2013.
10. King N. Doing template analysis. In: Symon G, Cassell C, eds. Qualitative Organizational Research: Core Methods and Current Challenges. London: Sage; 2012:426–450.
11. Hennink MM, Kaiser BN, Marconi VC. Code saturation versus meaning saturation: how many interviews are enough? Qual Health Res 2017;27:591–608.
12. van Nieuwerburgh C. Coaching in Education: Getting Better Results for Students, Educators, and Parents. London: Karnac Books; 2012.
13. Aggarwal R, Mytton OT, Derbrew M, et al. Training and simulation for patient safety. Qual Saf Health Care 2010;19:i34–i43.
14. Motola I, Devine LA, Chung HS, Sullivan JE, Issenberg SB. Simulation in healthcare education: a best evidence practical guide. AMEE Guide No. 82. Med Teach 2013;35:e1511–e1530.
15. Holmboe ES, Ward DS, Reznick RK, et al. Faculty development in assessment: the missing link in competency-based medical education. Acad Med 2011;86:460–467.
16. Berendonk C, Stalmeijer RE, Schuwirth LW. Expertise in performance assessment: assessors' perspectives. Adv Health Sci Educ Theory Pract 2013;18:559–571.
Appendix 1: Coding Template Derived from Guideline Recommendations and Video Transcripts

Keywords: Feedback; simulation; faculty development; qualitative

Copyright © 2018 Society for Simulation in Healthcare