Many instructional design features contribute to the effectiveness of simulation as an educational tool. Evidence now supports the presence of feedback, cognitive interactivity, repetitive practice, and range of difficulty as best practices in simulation-based training.1,2 As high-technology simulation has risen in popularity in medical education, simulator fidelity has emerged as a potentially significant instructional design feature, with the assumption that greater fidelity will result in enhanced learning. Yet previous reviews of simulator fidelity emerging from the aviation industry and military have highlighted that fidelity is multifactorial3,4 and that fidelity requirements vary according to the learning context.5,6 Reviews from health professions education have identified similar concerns.7–9 However, no review to date has considered broadly the intersection between the dimensions of fidelity and the technologies available in simulation-based health professions education. We wrote the present critical commentary to highlight the multiple meanings of the term fidelity, demonstrate that these meanings actually represent different underlying principles of effective learning, and propose a more precise vocabulary for discussing the functional and physical features of simulation training.
Confusion Surrounding the Term Fidelity
We and other colleagues recently conducted a large systematic review and meta-analysis of the literature on technology-enhanced simulation in health professions education.10 From an initial pool of 10,904 articles screened, we identified 985 original comparative studies of simulation-based health professions education. Given the perceived importance of fidelity in simulation training, we planned to code this feature for all included studies. However, we found it impossible to code this feature with high reliability. List 1 illustrates some of our frustrations as we struggled to achieve consensus. Among the most salient points, we realized that the same simulator could be viewed as high or low fidelity depending on which features were emphasized or ignored, that fidelity requirements vary depending on the training task, and that classifying fidelity as high or low is too simplistic. We attempted to define, refine, clarify, subcategorize, and implement this term during our review, but to no avail. Despite several attempts to clarify our definitions, we were never able to code fidelity consistently between or even within raters.
The problem was that fidelity seemed to be a moving target. Different authors described the same simulator as reflecting high or low fidelity depending on whether they emphasized the visual, auditory, tactile, or functional features of the simulator, and also depending on the learners, learning objectives, and learning context. For example, researchers in anesthesia nearly always considered mannequins and virtual reality (VR) systems as high fidelity. However, surgeons typically viewed cadavers and animal models as possessing higher fidelity or realism than mannequins11,12 or VR systems.13–15
The confusion over fidelity is not new in other fields. In their review of simulation for teamwork, Beaubien and Baker7 point out that early in the field of industrial and organizational psychology, fidelity was seen as a bipolar concept—that is, consisting of “high” and “low.” But authors now view this perspective as too simplistic, especially in the sense that it overemphasizes the technology at the expense of “more substantive issues, such as the training’s goals, content, and design.”7 Likewise, in military aviation, “countless dimensions of simulation fidelity have been proposed.”7 In reviewing the literature on these various conceptions of fidelity across many fields, we uncovered a wide variety of definitions, reflecting little underlying consensus.3,5,16–21
Perhaps the most useful starting point for this discussion comes from the review by Allen et al,3 who concluded that two major dimensions underpinned the large variety of definitions and constructs for fidelity—that of structural fidelity (how the simulator appears) and functional fidelity (what the simulator does). This dichotomy is in itself problematic, in that it glosses over the complexities of educational context, but the distinction between structural and functional is useful as a first approximation and will serve as a starting point for our critical commentary.
An Undue Emphasis on Structural Fidelity?
Many studies have now shown that increases in structural fidelity do not necessarily correspond to increases in educational effectiveness,3,5,22–29 and there is also specific evidence showing that the effect of structural fidelity depends on the skill level of the learner.17,30,31 For teamwork and communication, the way in which learners interact with one another is of paramount importance, so it is perhaps not surprising that the simulation environment is more important in training for these skills than is the structural fidelity of the simulator itself.21 The focus on interactions between learners helps to create authenticity for such skills.
For surgical procedures, structural fidelity might be very important for some parts of the task (e.g., where the learner is interacting directly with the tissue), but not others (e.g., decision making or communication). Certain procedures might require live animal tissue for learning, whereas others might be more effectively taught using inanimate animal tissue.32 Cadavers have been called the “gold standard” for training in surgery,33 but even they can vary in structural fidelity, depending on the degree of tissue preservation.8 In any case, it is perhaps more important to match the learners’ needs regarding the dynamic (functional) properties of the system rather than the structural features. Salas et al25 have concluded that “the level of fidelity built into the simulator should be determined by the level needed to support learning on the tasks.”
In separate reviews on the importance of fidelity in simulation,5,34 authors have agreed on the primary importance of learning objectives and task demands for effective transfer of learning (where transfer of learning is defined as the application of knowledge and skills learned in one context to another—e.g., from simulation to patient care). Kneebone16 has emphasized that structural fidelity does not always correspond to educational effectiveness: “All too often it is the surface realism of the simulation that occupies the ingenuity of those who develop it, eclipsing key issues of teaching and learning … lower levels of fidelity may reduce technological limitations and cost without compromising outcomes.”
Why Functional Fidelity Matters
It may be more productive from an educational standpoint if we consider the fidelity of the simulation scenario relative to clinical task demands (functional fidelity) rather than the physical resemblance to the human patient (structural fidelity). If the learner is given a particular task to learn, and oriented properly to the context and physical platform on which to learn it, that learner may actually “project” fidelity onto the simulation scenario. For instance, if one enhances functional fidelity (e.g., using cellophane to represent connective tissue) while degrading structural fidelity, the overall potential for educational effectiveness could be preserved or even improved. In this setting, a low-structural-fidelity simulator with high functional fidelity may lead to more effective transfer.30,*
There are good theoretical reasons why this might be so. In the constructivist framework of education and knowledge acquisition, the learner actively attempts to make the learning context relevant to their objectives.† Thus, the relevance of low structural fidelity depends on the goals of the learner.35–37 Such an active learning framework helps to explain why low structural fidelity can be effective and, consequently, how fidelity is not a static attribute of the simulator. A cognitive capacity model (wherein learners have a finite capacity for holding task-relevant information in working memory)38 has also been used to explain the distracting effect of high structural fidelity.9,39–41 An undue emphasis on high structural fidelity can direct attention toward irrelevant aspects of the simulation platform and away from those elements central to the primary training objective.42 In simulation-based health professions education, close alignment between the clinical task and the simulation task (i.e., functional fidelity) is often more important than structural fidelity for achieving the training goals. For example, inert animal tissue (e.g., a fresh pig’s foot) may be considered to be of low (structural) fidelity until it is used to teach simple suturing techniques, at which time the tissue responsiveness to manipulation confers precisely the level of (functional) fidelity required for the task of basic suturing skills. This highlights the benefits of functional task alignment (i.e., aligning the simulator’s functional properties with the functional requirements of the task).
Matsumoto et al36 used a coffee cup and drinking straws to successfully train ureteroscopic skills. In that study, content experts were briefed on the learning objectives and identified key functional parameters of the target clinical task while designing the simple model. Effective training in this context can be explained by increased attention to functional task alignment (i.e., creating a minimally effective platform for teaching specific technical skills) rather than the simulator’s physical appearance.5 By asking “What are we going to teach?” rather than “How will we use the existing platform to teach this skill?” the instructor effectively shifts focus toward the learner and allows greater opportunity for engaging principles of active learning. Simulators with low structural fidelity are usually simple in their design and devoid of electronics, which renders them generally easy to design, construct, and modify. Their main advantage lies in the ability to design highly specific task demands into the physical platforms to address targeted learning objectives (i.e., functional task alignment). It is this design approach, based on educational principles, which leads to effective simulation-based training. Indeed, the key features of effective simulation identified by Issenberg et al1 and McGaghie et al43 and confirmed in our recent work2 all have firm grounding in accepted theories of instructional design.44
The Concept of Fidelity Is Flawed
Fidelity is generally assumed to be an important factor in simulation-based training, and this assumption is rarely discussed or challenged. However, as noted above, structural fidelity cannot be determined independent of the instructional goals. A simulator that is considered low fidelity in one circumstance might be considered high fidelity in another for legitimate reasons. Moreover, we have explained how the term fidelity is rather imprecise on its own and refers, instead, to many separate concepts. Given this multiplicity of meanings, we question the continued usefulness of the term fidelity as it relates to simulation-based health professions education. It seems that in most cases people use this word to refer to the physical resemblance of the simulator, yet the functional alignment with the learning task, the instructional design, and the instructor likely have far greater impact on immediate learning, retention, and transfer to new settings.
On the basis of these observations, we developed the recommendations summarized in List 2 and discussed in more detail below.
Recommendation #1: Abandon the term fidelity.
The importance of precision in one’s choice of words cannot be overemphasized. Thomas Kuhn45 argued that disagreements in basic terminology represent an early phase in the development of a field. Precise language facilitates accurate and concise communication of thought. Yet the term fidelity as employed in the field of simulation-based health professions education is imprecise, referring, instead, to several distinct concepts. We propose that the field of simulation abandon the term fidelity and, instead, focus on the various underlying principles for effective learning. We propose that the term structural fidelity be replaced by the term physical resemblance. This would include tactile, visual, auditory, and olfactory features of the simulator designed to enhance its physical appearance. We also propose that the term functional fidelity be replaced by the term functional task alignment. This subtle change in terms not only emphasizes the importance of the task but also connotes the need for an active and intentional process to determine the needed alignment.
Recommendation #2: Shift emphasis from physical resemblance to functional task alignment.
There is now plenty of evidence that physical resemblance can be reduced with minimal or no loss of educational effectiveness, provided there is appropriate correspondence between functional aspects of the simulator and the applied context. Such features can include contextual cues such as similar staffing or spatial arrangement of components, or ensuring appropriate orientation to the case during the simulation scenario. As noted above, the physical properties of the simulator are often of secondary importance relative to the functional task alignment and instructional design. Physical resemblance should still be considered in the choice of the simulator, but only after careful consideration of educational need. Hays26 has pointed out that the choice of physical resemblance for maximal training effectiveness depends on a number of factors, including the context within which the simulator is used, the kind of task for which the learner is being trained, the stages of learning involved, learner abilities and capabilities, task difficulty, and the effects of various instructional features. Consistent with recommendations in other fields, such as military aviation training, the field of simulation should shift emphasis away from structural properties of the simulator (i.e., physical resemblance) to functional properties of the entire simulation context that align with learning objectives (i.e., functional task alignment).
Recommendation #3: Focus on methods to enhance transfer of learning.
Transfer of learning involves the application of skills learned in one context to another context and, as such, can be a powerful motivation for using simulators in medical education. The historical emphasis on structural fidelity evolved as a means to enhance transfer of learning through learner engagement.24 Indeed, cognitive engagement is associated with higher learning outcomes.2,46,47 However, physical resemblance is only one way to enhance learner engagement, and learner engagement, in turn, is only one of several means of enhancing transfer of learning. Educational effectiveness results from a complex interaction between the simulator and what the educator and/or the learner does with the simulator, including the provision of appropriate orientation and learning objectives, with the human element most often exerting more influence than the simulator itself.3
Much remains to be done
We believe that much remains to be done to understand how to enhance transfer of learning. Simulation scenarios should be designed to enhance transfer by whatever means necessary, including a mix of physical and functional resemblance, within the context of effective instructional design. This will no doubt include sensory augmentation offered by physical resemblance to promote suspension of disbelief, but also much broader aspects of learner engagement, such as learner orientation and focused learning objectives. Such considerations amount to enhancing functional alignment between the simulation setting and the real patient setting. Whether an educational experience is perceived by the learner to mimic real clinical practice could be considered an educational outcome that should be measured; this perception is typically not a fixed property of the simulator but depends more on effective design in scenario development.
The reconceptualization of the field of simulation-based health professions education we have presented above gives renewed emphasis to deliberate instructional design. Educators and researchers will need to consider case-by-case whether the simulation-based task aligns with the intended learning objectives, whether the learners are engaged in the learning process, and whether details of the simulation-based educational intervention contribute to effective learning.
Implications for education
The ultimate intent of much simulation-based health professions education is to promote transfer of learning to the clinical setting, and this compels us, as educators, to determine the degree to which a given simulator supports such transfer. It is perhaps most helpful to consider physical resemblance as a continuum (rather than as high or low) that can vary on the basis of the learners and the specific learning objectives. Functional task alignment will be greater to the extent that essential functional components of the target task are designed into the simulation. These essential components can be identified by, for example, engaging a panel of experts in an analysis of the task of interest,48 followed by the selection of the best simulator platform or context to address the educational goals. In this way, simulation is no different than any other educational process.16,49,50
Learner engagement can be enhanced by appropriate orientation of the learner and selection of a simulation scenario and physical platform that matches his or her level of training. For example, a senior cardiac surgery resident with plenty of experience in the operating room might not see the value in a session with pig’s heart valves if the tissue properties are not carefully preserved. Educators must remain alert to the motivational value of investing in simulators in correspondence with the level of the learner. This is especially important for more advanced learners, who typically have experience with and access to real patients on a regular basis. For that level of learner, it is perhaps most helpful to focus on teaching around high-acuity, low-frequency events.
Implications for future research
The merits and role of physical resemblance should be systematically studied. Specific research could determine for what tasks physical resemblance is important, and under what conditions novices benefit from or become distracted by enhancing a simulator’s superficial appearance. Questions also remain about why learners engage to varying degrees and how we can enhance engagement in a given simulation scenario. It has been suggested that high-technology models are better at maintaining interest and enthusiasm.51 Whether this is a property of the simulator or a passing fad, and whether functional task alignment and instructional methods can maintain enthusiasm with equal or superior results, remains to be seen. Given that some learners acknowledge the educational value of models with less physical resemblance, why do they still prefer the use of more sophisticated and expensive models?22
What role if any does suspension of disbelief play? For low-tech models such as Matsumoto and colleagues’36 cup and straws, for example, the learner can readily understand that the black dot at the base of the Penrose drain is not a good representation of the verumontanum, but is willing to suspend disbelief for the moment because the important issue in this training context is the location of that indicator. In this way, suspension of disbelief is a mechanism that learners engage in to project the learning objective onto the physical structure; they become willing to ignore features that are irrelevant to the task at hand. Physical resemblance may simply be just another approach for engaging the learner (and not a particularly strong one in isolation).
Perhaps most important, our experience as reviewers of the literature on simulation has made clear that simply exploring whether greater physical resemblance is more effective, or noninferior, is simplistic and will not advance the science of simulation education. It is only in the intersection of these features with objectives, learners, contexts, and other educational principles, including transfer of learning, that such features of the simulator and simulation experience have meaning in research and in educational practice.
Acknowledgments: The authors acknowledge Jason Szostek, MD, Amy Wang, MD, and Pat Erwin, MLS, for their efforts in initial study selection and data abstraction, and thank Glenn Regehr, PhD, and Tim Wood, PhD, for helpful comments and discussion.
* Given that our initial review identified over 10,000 papers focused on technology-enhanced simulation, we chose to confine that meta-analysis to technology-enhanced simulation and forego a review of the entire field of simulation-based health professions education, which would include standardized patients. Because this essay arose from that review, we decided to address only that literature and leave the broader discussion concerning fidelity and other forms of simulation, such as standardized patients, to another forum.
† We are aware that we conducted our original meta-analysis using a positivist approach, while we have used a constructivist paradigm to explain and interpret some of the themes here related to fidelity.
1. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Med Teach. 2005;27:10–28
2. Cook DA, Hamstra SJ, Brydges R, et al. Comparative effectiveness of instructional design features in simulation-based education: Systematic review and meta-analysis. Med Teach. 2013;35:e867–e898
3. Allen J, Buffardi L, Hays R. The Relationship of Simulator Fidelity to Task and Performance Variables. Report no. ARI-91-58. Alexandria, Va: Army Research Institute for the Behavioral and Social Sciences; 1991
4. Lane NE, Alluisi EA. Fidelity and Validity in Distributed Interactive Simulation: Questions and Answers. Report no. IDA-D-1066. Alexandria, Va: Institute for Defense Analysis; 1992
5. Alessi SM. Fidelity in the design of instructional simulations. J Comput Based Instr. 1988;15(2):40–47
6. Caro P. Flight training and simulation. In: Wiener E, Nagel D, eds. Human Factors in Aviation. San Diego, Calif: Academic Press; 1988
7. Beaubien JM, Baker DP. The use of simulation for training teamwork skills in health care: How low can you go? Qual Saf Health Care. 2004;13(suppl 1):i51–i56
8. Tan SS, Sarker SK. Simulation in surgery: A review. Scott Med J. 2011;56:104–109
9. Norman G, Dore K, Grierson L. The minimal relationship between simulation fidelity and transfer of learning. Med Educ. 2012;46:636–647
10. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: A systematic review and meta-analysis. JAMA. 2011;306:978–988
11. Ocel JJ, Natt N, Tiegs RD, Arora AS. Formal procedural skills training using a fresh frozen cadaver model: A pilot study. Clin Anat. 2006;19:142–146
12. Yang JH, Kim YM, Chung HS, et al. Comparison of four manikins and fresh frozen cadaver models for direct laryngoscopic orotracheal intubation training. Emerg Med J. 2010;27:13–16
13. Berry M, Lystig T, Beard J, Klingestierna H, Reznick R, Lönn L. Porcine transfer study: Virtual reality simulator training compared with porcine training in endovascular novices. Cardiovasc Intervent Radiol. 2007;30:455–461
14. Leblanc F, Senagore AJ, Ellis CN, et al; Colorectal Surgery Training Group. Hand-assisted laparoscopic sigmoid colectomy skills acquisition: Augmented reality simulator versus human cadaver training models. J Surg Educ. 2010;67:200–204
15. Mishra S, Kurien A, Ganpule A, Muthu V, Sabnis R, Desai M. Percutaneous renal access training: Content validation comparison between a live porcine and a virtual reality (VR) simulation model. BJU Int. 2010;106:1753–1756
16. Kneebone R. Evaluating clinical simulations for learning procedural skills: A theory-based approach. Acad Med. 2005;80:549–553
17. Miller RB. Psychological Considerations in the Design of Training Equipment. Report no. WADC-TR-54-563, AD 71202. Wright-Patterson Air Force Base, Ohio: Wright Air Development Center; 1953
18. AGARD Working Group. Fidelity of Simulation for Pilot Training. Report no. AGARD-AR-159. Neuilly-sur-Seine, France: NATO Advisory Group for Aerospace Research and Development; 1980
19. Flexman RE, Stark EA. Training simulators. In: Salvendy G, ed. Handbook of Human Factors. New York, NY: Wiley; 1987
20. Rehmann A, Mitman R, Reynolds M. A Handbook of Flight Simulation Fidelity Requirements for Human Factors Research. Technical report no. DOT/FAA/CT-TN95/46. Wright-Patterson AFB, Ohio: Crew Systems Ergonomics Information Analysis Center; 1995
21. Sharma S, Boet S, Kitto S, Reeves S. Interprofessional simulated learning: The need for “sociological fidelity.” J Interprof Care. 2011;25:81–83
22. Grober ED, Hamstra SJ, Wanzel KR, et al. The educational impact of bench model fidelity on the acquisition of technical skill: The use of clinically relevant outcome measures. Ann Surg. 2004;240:374–381
23. Amin Z, Boulet JR, Cook DA, et al. Technology-enabled assessment of health professions education: Consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach. 2011;33:364–369
24. Bradley P. The history of simulation in medical education and possible future directions. Med Educ. 2006;40:254–262
25. Salas E, Bowers CA, Rhodenizer L. It is not how much you have but how you use it: Toward a rational use of simulation to support aviation training. Int J Aviat Psychol. 1998;8:197–208
26. Hays RT. Simulator Fidelity: A Concept Paper. ARI technical report 490. Alexandria, Va: U.S. Army Research Institute for the Behavioral and Social Sciences; 1980
27. Scerbo MW, Bliss JP, Schmidt EA, Thompson SN. The efficacy of a medical virtual reality simulator for training phlebotomy. Hum Factors. 2006;48:72–84
28. Scerbo MW, Schmidt EA, Bliss JP. Comparison of a virtual reality simulator and simulated limbs for phlebotomy training. J Infus Nurs. 2006;29:214–224
29. Chmarra MK, Dankelman J, van den Dobbelsteen JJ, Jansen FW. Force feedback and basic laparoscopic skills. Surg Endosc. 2008;22:2140–2148
30. Boreham NC. Transfer of training in the generation of diagnostic hypotheses: The effect of lowering fidelity of simulation. Br J Educ Psychol. 1985;55:213–223
31. Brydges R, Carnahan H, Rose D, Rose L, Dubrowski A. Coordinating progressive levels of simulation fidelity to maximize educational benefit. Acad Med. 2010;85:806–812
32. Giger U, Frésard I, Häfliger A, Bergmann M, Krähenbühl L. Laparoscopic training on Thiel human cadavers: A model to teach advanced laparoscopic procedures. Surg Endosc. 2008;22:901–906
33. McDougall EM. Validation of surgical simulators. J Endourol. 2007;21:244–247
34. Maran NJ, Glavin RJ. Low- to high-fidelity simulation—a continuum of medical education? Med Educ. 2003;37(suppl 1):22–28
35. Chandrasekera SK, Donohue JF, Orley D, et al. Basic laparoscopic surgical training: Examination of a low cost alternative. Eur Urol. 2006;50:1285–1291
36. Matsumoto ED, Hamstra SJ, Radomski SB, Cusimano MD. The effect of bench model fidelity on endourological skills: A randomized controlled study. J Urol. 2002;167:1243–1247
37. Paul M, Nobel K. Papaya: A simulation model for training in uterine aspiration. Fam Med. 2005;37:242–244
38. van Merriënboer JJ, Sweller J. Cognitive load theory in health professional education: Design principles and strategies. Med Educ. 2010;44:85–93
39. Williges BH, Roscoe SN, Williges RC. Synthetic flight training revisited. Hum Fact. 1973;15:543–560
40. Doerner D. On the difficulties people have in dealing with complexity. Simul Games. 1980;11:87–106
41. Jones NA, Ross H, Lynam T, Perez P, Leitch A. Mental models: An interdisciplinary synthesis of theory and methods. Ecol Soc. 2011;16:46
42. Sterman JD. Learning in and about complex systems. Syst Dynam Rev. 1994;10:291–330
43. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44:50–63
44. Gagne RM, Wager WW, Golas KC, Keller JM. Principles of Instructional Design. Belmont, Calif: Wadsworth Press; 2004
45. Kuhn TS. The Structure of Scientific Revolutions. Chicago, Ill: University of Chicago Press; 1962
46. Cook DA, Erwin PJ, Triola MM. Computerized virtual patients in health professions education: A systematic review and meta-analysis. Acad Med. 2010;85:1589–1602
47. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 suppl):S70–S81
48. Clark RE, Pugh CM, Yates KA, Inaba K, Green DJ, Sullivan ME. The use of cognitive task analysis to improve instructional descriptions of procedures. J Surg Res. 2012;173:e37–e42
49. Hamstra SJ, Dubrowski A, Backstein D. Teaching technical skills to surgical residents: A survey of empirical research. Clin Orthop Relat Res. 2006;449:108–115
50. de Giovanni D, Roberts T, Norman G. Relative effectiveness of high- versus low-fidelity simulation in learning heart sounds. Med Educ. 2009;43:661–668
51. Andreatta PB, Woodrum DT, Birkmeyer JD, et al. Laparoscopic skills are improved with LapMentor training: Results of a randomized, double-blinded study. Ann Surg. 2006;243:854–863