The recently published Healthcare Simulationist Code of Ethics addresses a number of key ethical concerns under the domains of integrity, transparency, mutual respect, professionalism, accountability, and results orientation.1 One issue explicitly approached by this code is the use of deceptive educational techniques and manipulations (hereafter “deception”).
Deception has been viewed as an important tool since the inception of simulation.2–5 In the broader psychological and educational community, however, deception has a more controversial history.3,5 Stanley Milgram and Philip Zimbardo are often referenced in this context, as their research involved a significant degree of participant psychological manipulation. Both have been the subject of ethical criticism.6–11 Within healthcare simulation, some educators, ethicists, and psychologists have questioned whether these criticisms might apply, speculating that the use of deception may jeopardize learners' trust in simulation facilitators, negatively affect learning and the educational environment, erode the integrity of the simulation enterprise, and threaten learners' emotional well-being.5,12
Despite these concerns, deception is often used within simulation to reproduce important rare and critical clinical events, including equipment failure, situations in which inappropriate orders are given, situations in which learners must reconcile conflicting data, and situations where learners must engage in complex decision-making.2,13 Given the high patient safety risk that such situations can pose, the use of deception is justified by some as unavoidable.4
At present, there is little empirical work examining deception in simulation, and what exists is largely speculative.3,12 This, however, does not absolve facilitators of the need to carefully consider how deceptive techniques should and should not be used within their own practice. To assist in this process, we offer the following considerations, recommendations, and guidelines regarding the uses and potential pitfalls of deception.12,14
Defining deception is not straightforward. In some critical ways, all simulation is a form of deception as it causes “individuals to accept as true or valid what is false or invalid.”15 Without this “fiction contract” much of what the healthcare simulation community does would be impossible.16 On the other hand, deception also connotes deliberate trickery, which all would wish to avoid. To complicate matters, in the authors' experience individual facilitators disagree significantly on where the line should be drawn between what might be considered the legitimate “deception” needed to conduct simulation and intentional trickery, creating a significant gray area where no consensus seems to exist.
Here Erving Goffman's work can provide a useful touchstone.17,18 As discussed in prior publications, Goffman conceptualizes the fiction contract as an understanding within which a "primary frame" (in the case of healthcare simulation, a patient care environment) is re-enacted as a "modulation" within another primary frame (the simulation center or program).12,16–18 The basic ground rules and boundaries for this modulation (the fact of the modulation's existence, the places where it begins and ends, the reason for its enactment, etc.) are understood by both learners and facilitators. If a modulation is introduced for which no clear agreement or understanding of its presence, boundaries, and/or ground rules exists, however, it becomes a "deception."12,16–18 This deeper understanding includes not only facilitator intent but also learner perception of the event, as well as the unavoidable power differentials created by the facilitators' control over the flow of information in the simulated environment. Based on this, we will use the term "deception" throughout the rest of this article to refer to an element or aspect introduced into a simulation for which there is no clear agreement or knowledge among participants and facilitators regarding its presence, ground rules, or boundaries.
DECEPTION IN SIMULATION PRACTICE: OBSERVATIONS REGARDING APPLICATIONS, BENEFITS, AND PITFALLS
Diversity of Application
Deception seems to be applied in 2 different ways by facilitators: the omission of information or other aspects of the environment (equipment, personnel, etc.), and the deliberate provision of false information and/or incorporation of faulty equipment or “saboteurs.”19 Facilitators also seem to differ in whether they include information about deception in orientation practices, plan to disclose the deception during the debriefing, or intend to keep it concealed from learners.19 This range of practice has implications for how deception is experienced by learners.
Potential Detriments and Benefits
In the authors' experience, potential detriments to deception's use exist, including the generation of negative learner emotions (eg, stress, anxiety, anger, shame) and loss of trust and respect for team members, facilitators, and even simulation as a field.19,20 These observations are supported by studies in psychology suggesting that deception can lead to moral distress, anger, suspicion, distrust, and, in some cases, poor team functioning.9,21–25 Excessive stress and loss of trust have also been shown to be detrimental to learning.26–28 Potential beneficial uses also seem to exist, however, as deception can be used to authentically recreate the complex situational and social realities of medical practice within the simulated environment.19,29 These include key teamwork events, situations where critical problem-solving is required, equipment malfunction, miscommunication, and situations of resource or knowledge limitation.13 Session plans and goals can also be altered (in potentially nontransparent/deceptive ways) in real time by facilitators to tailor simulations to specific learner needs. These benefits seem to depend, however, on facilitator intentions and conduct during and after the event, points discussed later in greater depth.
Several forms of deception seem to be potentially high risk.19 Here, Goffman's understanding of deception is of particular value, as many of these situations center on deliberate breaches of the common understanding of the boundaries and ground rules of the event. These breaches can include disregarding the learning objectives as understood by learners, breaking promises or other guarantees made by facilitators, and/or introducing confusion regarding what is part of the simulation and what is real. The importance of creating and maintaining such boundaries early in the simulation has been emphasized through the concept of the "safe container for learning" advanced by Rudolph et al.30 Because they breach these boundaries, events such as these are likely to be perceived by learners as ill-intended. Another high-risk situation is the use of deceptive techniques to create "gotcha" moments, which are often perceived by learners as tricks intended by facilitators to precipitate failure without educational value.
The use of the learners themselves to cause or perpetuate deception is also potentially high risk, as it can precipitate breakdowns in the shared sense of teamwork and camaraderie among learners. One of the authors (A.C.) vividly recalls an experience early in his training when he was asked in private before a simulation to order an incorrect (and potentially harmful) medication during the session. This deception led to significant concern that other team members would perceive him as untrustworthy or lacking necessary knowledge and has influenced his educational practices to this day. This technique also risks the learners inadvertently acquiring false knowledge and/or performing the wrong action in actual practice if they do not clearly understand that the deception does not represent optimal care. Table 1 provides specific examples of each high-risk situation.
Key Ethical Decision Points
Based on our observations of deception in practice and the issues noted previously, there seem to be a number of key decision points that, if carefully considered, may assist facilitators in the thoughtful use of deception while safeguarding against psychological harm.19 The first is to ask whether deception is necessary at all or whether the learning objectives can be achieved without it. If deception is deemed essential, the next decision point is whether to implement the deception by simple omission or via the provision of false data or defective items. Many recognize a moral distinction between these approaches.31–33 Providing false information may result in learners feeling as if they were “lied to,” and while doing so may be needed to recreate specific situations within the simulation, the cost may be higher.
In addition, certain deceptive approaches may feel "false" to the team despite the facilitator's best intentions and thus threaten the fiction contract if taken too far. A careful decision regarding the extent of the deceptive approach, one that considers any potential unintended consequences for fidelity, is thus needed. This is particularly relevant during simulation-based assessments. Although the authors can envision certain types of "deception" (eg, the initial omission of certain information by standardized patients, unless specifically asked) being used as an aid to authenticity, if taken too far this could result in loss of perceived fidelity and create a situation in which the learner cannot demonstrate their actual knowledge or skill. If the assessment is high-stakes (such as the United States Medical Licensing Examination Step 2 Objective Structured Clinical Examination), this could potentially damage the learner's future career.16,30
Finally, the emotional context of the case (particularly if mannequin death, or some other negative outcome, is a possibility) and the learner's level of clinical experience and knowledge must be considered. This is especially true for novice learners who may not yet possess the needed resilience to emotionally navigate the situation. Successful learner development should always be the guiding principle when making these decisions.
DECEPTION AND THE CODE OF ETHICS: APPLICATIONS FOR EDUCATIONAL PRACTICE
The Healthcare Simulationist Code of Ethics contains a number of key statements that address deception.1 The most direct of these are found within the core value of transparency, but deception is also indirectly addressed within the core values of integrity, mutual respect, and accountability. Table 2 lists all relevant quotes from the Code of Ethics under their respective core values. To summarize, however, the Code of Ethics recommends maximizing clarity about the nature of the simulation while minimizing deception's use. Furthermore, if deception is to be used at all, the potential risks must clearly be disclosed, the values and skills of learners must be respected, and care must be taken to avoid unintended consequences. This accords well with the authors' observations and is of particular relevance for the “high-risk” situations described in Table 1.
In 1967, Kelman proposed a concept that is useful when considering these “high-risk” situations.12,34 Similar to Goffman, Kelman suggested that participants perceive deception that occurs within the understood boundaries and ground rules of a psychological experiment differently than deception that raises doubts about where the boundaries themselves lie. It is not difficult to translate these concepts to the realm of healthcare simulation by suggesting that a deception conducted wholly within the established ground rules, boundaries, and “safe learning container” (as defined by Rudolph et al30) will be perceived differently than a deception that violates those borders. As an example, consider a simulation with learning objectives related to airway maintenance. Kelman's classification suggests that the learners will respond more favorably to a deception in which a confederate hands the team leader faulty airway equipment during the case that they then must troubleshoot (deception within) than they would to the revelation that the “real” reason they were there was for a study on cognitive load (deception about). In the first situation, the deception was conducted wholly inside the learners' understood reason for attendance (ie, within the ground rules), whereas in the second, learners were deceived as to the very purpose of the event. Upholding as much as possible the pre-established ground rules and boundaries in which simulations occur is thus an important way to mitigate deception's negative effects.
Perhaps the most significant boundary in a simulation is the border between what is real and what is not real (ie, simulated). Deceptions that blur this boundary and lead learners to question whether an event is part of the simulation or real thus have high potential for harm. Indeed, we suggest that a clear understanding of this boundary is a key aspect of the “safe container” for learning discussed by Rudolph et al.30 This issue is perhaps most acute when session facilitators introduce the confusion (such as the example described in Table 1 in which a facilitator feigns chest pain during the session). It is easy to see that a mannequin is not a live human being, and standardized patient “confederates” are usually seen as part of the simulated environment as well (although this depends on the ground rules established for the case).35 When a facilitator feigns illness as part of a deception, however, participants' ability to distinguish simulation and reality can be significantly impaired, introducing confusion and unnecessary stress.
For those rare circumstances where the benefits of such an approach are judged to outweigh the risks, it is important to implement a successful mitigating strategy.1 One approach is the establishment of a universally understood emergency phrase that learners can use at any time to clarify what is real and what is simulated. Although care must be taken to assure the phrase is not abused by learners to escape from difficult (but legitimate) learning situations, this approach allows learners to re-establish reality when needed, giving them a measure of control over the experience.
For those deceptions that may be perceived by learners as "gotcha" moments context is key, as it is easy to imagine a simulation that, at first glance, is a straightforward "gotcha" moment but in fact accurately reflects a clinically relevant harm event. As an example, consider a simulation in which the oxygen flow to the room is deliberately disabled. Although this may seem unrealistic, high-risk situations and errors such as this can actually occur in the healthcare environment, and simulations designed to recreate these events can be valuable in assuring patient safety.36 Instead, it is trickery for trickery's sake that should be avoided. Unconscious motivations must also be considered here. Although it is the authors' hope that facilitators would not deliberately introduce "gotcha" moments without solid educational reasons, many facilitators interact with their learners outside the educational environment and may have formed opinions about them (both positive and negative) before the simulation. The possibility thus exists of "gotcha" moments unconsciously creeping in based on these experiences. To combat this, facilitators must maintain awareness of not just their conscious motives but also their potential unconscious motives. Whether consciously or unconsciously introduced by facilitators, deception should never be used in a manner that could "damage the learner's future career."1
Regarding the use of deception during emotionally charged simulations, one protective action that facilitators can take is to predefine learner "success" within the session in terms of the learning objectives rather than mannequin "recovery." Situations in which mannequins die or decompensate as planned aspects of the case are quite legitimate educational exercises, and in actual clinical practice, performing all the correct actions does not guarantee patient recovery.37 In simulation, however, the authors have observed a tendency among learners to equate success with clinical improvement and/or survival. Because of this, learners may experience anger or stress if they perceive that a deceptive technique was used to ensure mannequin death or clinical deterioration, even if in service to understood learning objectives. We thus suggest clearly addressing what "successful" scenario navigation entails during the briefing.
In what situations, then, should deception be used? Here, the American Psychological Association's (APA) recently published Ethical Principles of Psychologists and Code of Conduct offers a valuable perspective.38 Like the Healthcare Simulationist Code of Ethics, the APA document addresses deception and advises its avoidance when possible, especially in those situations likely to “cause physical pain or severe emotional distress.”38 The APA document also recognizes, however, that situations exist in which deception may be “justified by the study's significant prospective scientific, educational, or applied value.”38 Although this APA statement primarily addresses the context of psychological research, there is enough methodological overlap with healthcare simulation to recommend a thorough review of the document by the simulation community.
MITIGATING THE EFFECTS OF DECEPTION: THE ROLE OF BRIEFING AND DEBRIEFING
If it is true that situations, however uncommon, exist in which deception is necessary to achieve needed learning objectives, it then becomes important to consider what guardrails should be implemented for its safe use. The overall orientation, immediate presession briefing, and postsession debriefing represent natural points of intervention, as they are the bookends between which the simulation is conducted.30,39–41 Simulation programs are quite diverse, and in some the orientation and presession briefing are combined as one event. This, however, does not fundamentally change this recommendation.
The program orientation and presession briefings represent the preparation learners are given before entering the simulated environment and provide critical opportunities for facilitators to establish the ground rules and boundaries of the session. By clearly articulating the learners' reason for being there, delineating what may and may not occur within the session, and defining where the simulation ends and reality resumes, learners can be encouraged to engage, take risks, and endure negative emotions and events secure in the understanding that these are intended for their educational benefit.30 One way to accomplish this is the incorporation of a general statement within all presimulation orientations and/or presession briefings defining deception, stating that it may occur within any scenario when deemed necessary to meet the learning objectives, and assuring learners that trickery for its own sake will be avoided. Doing so both acknowledges the practice and places any deception firmly within the boundaries of the "safe container" without drawing undue attention to specific cases.30 This is also an optimal time to define the concept of "successful case management" in a way that explicitly incorporates the possibility of mannequin death or clinical deterioration and to present a preselected "go to" phrase that can be used by learners to clarify what is simulated and what is real. Table 3 presents several ways of approaching these phrases.
Postsimulation debriefing also plays a critical role. A well-conducted debriefing has long been recognized as a powerful tool for defusing negative learner emotions and enhancing learning after emotionally difficult simulations.2,20,39–43 Debriefing thus provides an ideal time to disclose and reflect on any "necessary" deception that may have occurred. Although the parallels are not exact, the literature addressing deception in psychological research provides similar advice, emphasizing that any deceptive techniques used (and the rationale for their use) be disclosed after the experiment has concluded in a transparent, professional manner.44,45 The APA Code of Conduct offers additional useful advice regarding disclosure, stating that deception be explained to participants "as early as feasible," that measures be taken within the context of debriefing to reduce harm, and that, if harm is perceived to exist, "reasonable steps be taken" to address it.38 Given the complexities of such disclosure, we further suggest that novice facilitators refrain from using deception until they have attained sufficient debriefing experience in less emotionally charged situations. Finally, despite optimal orientation/briefing and debriefing conducted by skilled facilitators, some learners may still remain unsettled by the use of deception. If this occurs, it will be necessary to follow up with these learners long term to offer support where needed and assist them in their ongoing reflection on the experience, as they still may be processing the deception days to months later.
CHARTING A COURSE
How, then, should the simulation community proceed with regard to this issue? Here, succinct decision-making heuristics become particularly valuable as guides for whether deception should be used, and, if so, the most judicious and psychologically safe way to implement it. With this in mind, we offer the following practical guidelines regarding deception to the simulation community.
First, we affirm the call of the Healthcare Simulationist Code of Ethics to minimize the use of deception and further suggest that deception only be applied when the subject matter of the simulation is both critical in nature and cannot otherwise be approached with adequate fidelity. Furthermore, the perceived need to use modification/deception should be held in conscious, thoughtful tension with the emotional context of the case and the experience level of the learners. Such an approach must be deliberate in nature, should focus on a disciplined evaluation of potential positive and negative ramifications and should approach possible risks with discernment. Table 4 outlines how these “3 Ds” can be applied.
For those situations in which deception is felt to be necessary, we further recommend that facilitators place the potential for deception firmly within the “safe container for learning” of the session during the overall orientation or presession briefing using general introductory language. At the same time, facilitators should preemptively establish a “go to” phrase that learners can use to clarify what is real and what is simulated if they experience undue distress. Finally, facilitators should prepare to carefully disclose both the deception's existence and the underlying rationale behind it during the debriefing, with an eye toward any potential negative emotions that may require special attention in both the short and long term. Table 3 gives useful tips on how these “3 Ps” can be concretely applied.
The publication of the Healthcare Simulationist Code of Ethics has enhanced the community's awareness of key ethical issues, including the use of deception. Although it is our hope that future research sheds new light on this area, at present it seems best to curb deception's use unless absolutely necessary. For those situations in which deception is felt to be justified, it is vital to recognize that the potential for harm still exists. It has been said that with great power comes great responsibility, and we offer these guidelines to promote the judicious, responsible use of this powerful, yet potentially harmful, technique.
The authors thank the board of directors, research committees, and staff of the Association of Standardized Patient Educators (ASPE), the Society in Europe for Simulation as Applied to Medicine (SESAM), and the Society for Simulation in Healthcare (SSH) for their invaluable assistance and support, without which this work would not be possible.
1. Healthcare Simulationist Code of Ethics. Available at: http://www.ssih.org/Code-of-Ethics. Accessed March 1, 2019.
2. Calhoun AW, Boone MC, Miller KH, Pian-Smith MC. Case and commentary: using simulation to address hierarchy issues during medical crises. Simul Healthc
3. Gaba DM. Simulations that are challenging to the psyche of participants: how much should we worry and about what? Simul Healthc
4. Goldberg A, Katz D, Levine A, Demaria S. The importance of deception in simulation: an imperative to train in realism. Simul Healthc
5. Truog RD, Meyer EC. Deception and death in medical simulation. Simul Healthc
6. Haney C, Banks C, Zimbardo P. A study of prisoners and guards in a simulated prison. Naval Res Rev
7. Milgram S. Behavioral study of obedience. J Abnorm Psychol
8. Milgram S. Obedience to Authority: An Experimental View. New York, NY: Harpercollins; 1974.
9. Perry G. Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments. New York: The New Press; 2012.
10. Baumrind D. Research using intentional deception: ethical issues revisited. Am Psychol
11. Savin HB. Professors and psychological researchers: conflicting values in conflicting roles. Cognition
12. Calhoun AW, Pian-Smith MC, Truog RD, Gaba DM, Meyer EC. Deception and simulation education: issues, concepts, and commentary. Simul Healthc
13. Goldberg A, Silverman E, Samuelson S, et al. Learning through simulated independent practice leads to better future performance in a simulated crisis than learning through simulated supervised practice. Br J Anaesth
14. Calhoun A, Pian-Smith M, Shah A, et al. Modification of information during simulation: an exploratory study. In: Abstracts Presented at the 19th Annual International Meeting on Simulation in Healthcare: 2019. San Antonio, TX: Simulation in Healthcare; 2019;14(4):e1–e62.
15. (n.d.) In Merriam-Webster.com. Available at: https://www.merriam-webster.com. Accessed October 12, 2019.
16. Dieckmann P, Gaba D, Rall M. Deepening the theoretical foundations of patient Simulation as social practice. Simul Healthc
17. Munch R. Sociological Theory: From the 1920s to the 1960s, Volume 2. Chicago: Nelson Hall Publishers; 1994.
18. Goffman E. Frame Analysis: An Essay on the Organization of Experience. Boston: Northeastern University Press; 1974.
19. Calhoun AW, Pian-Smith M, Shah A, et al. Exploring the boundaries of deception in simulation: a mixed-methods study. Clin Simul Nurs
20. Auerbach M, Cheng A, Rudolph JW. Rapport management: opening the door for effective debriefing. Simul Healthc
21. Connelly C, Turel O. Effects of team emotional authenticity on virtual team performance. Front Psychol
22. Epley N, Huff C. Suspicion, affective response, and educational benefit as a result of deception in psychology research. Pers Soc Psychol Bull
23. Fuller C, Marett K, Twitchell D. An examination of deception in virtual teams: effects of deception on task performance, mutuality, and trust. IEEE Transact Prof Commun
24. Sharpe D, Adair JG, Roese N. Twenty years of deception research: a decline in subjects' trust? Pers Soc Psychol Bull
25. Burgoon JK, Buller DB, Ebesu AS, White CH, Rockwell RA. Testing interpersonal deception theory: effects of suspicion on communication behaviors and perceptions. Commun Theory
26. Fraser K, Huffman J, Ma I, et al. The emotional and cognitive impact of unexpected simulated patient death: a randomized controlled trial. Chest
27. Watling C, Driessen E, van der Vleuten CP, Lingard L. Learning from clinical work: the roles of learning cues and credibility judgements. Med Educ
28. Goldberg A, Samuelson S, Khelemsky Y, et al. Exposure to simulated mortality affects resident performance during assessment scenarios. Simul Healthc
29. Tun J, Alinier G, Tang J. Redefining simulation fidelity for healthcare simulation. Simul Gaming
30. Rudolph JW, Raemer DB, Simon R. Establishing a safe container for learning in simulation: the role of the presimulation briefing. Simul Healthc
31. Sokol DK. Can deceiving patients be morally acceptable? BMJ
32. Hey JD. Experimental economics and deception: a comment. J Econ Psychol
33. Dunleavy KN, Chory RM, Goodboy AK. Responses to deception in the workplace: perceptions of credibility, power, and trustworthiness. Commun Stud
34. Kelman H. Human use of human subjects: the problem of deception in social psychological experiments. Psychol Bull
35. Lopreiato JO. Healthcare Simulation Dictionary. Rockville, MD: Agency for Healthcare Research and Quality; 2016. AHRQ Publication No 16(17)-0043
36. In-situ simulation, challenges and results. AHRQ. Available at: http://ahrq.hhs.gov/downloads/pub/advances2/vol3/Advances-Patterson_48.pdf. Accessed March 1, 2009.
37. Leighton K. Death of a simulator. Clin Simul Nurs
38. Ethical Principles of Psychologists and Code of Conduct. Available at: https://www.apa.org/ethics/code/index. Accessed May 3, 2019.
39. Arafeh JM, Hansen SS, Nichols A. Debriefing in simulated-based learning: facilitating a reflective discussion. J Perinat Neonatal Nurs
40. Rudolph JW, Simon R, Rivard P, Dufresne RL, Raemer DB. Debriefing with good judgment: combining rigorous feedback with genuine inquiry. Anesthesiol Clin
41. Smith SS, Richardson D. Amelioration of deception and harm in psychological research: the important role of debriefing. J Pers Soc Psychol
42. Tripathy S, Miller KH, Berkenbosch JW, et al. When the mannequin dies, creation and exploration of a theoretical framework using a mixed methods approach. Simul Healthc
43. Loo ME, Krishnasamy C, Lim WS. Considering face, rights, and goals: a critical review of rapport management in facilitator-guided simulation debriefing approaches. Simul Healthc
44. Mills J. A procedure for explaining experiments involving deception. Pers Soc Psychol Bull
45. Boynton MH, Portnoy DB, Johnson BT. Exploring the ethics and psychological impact of deception in psychological research. IRB