Academic Medicine, June 2005, Volume 80, Issue 6

Evaluating Clinical Simulations for Learning Procedural Skills: A Theory-Based Approach

Kneebone, Roger MB, ChB, PhD, FRCS, FRCSEd, MRCGP


Author Information

Dr. Kneebone is senior lecturer in surgical education, Department of Surgical Oncology and Technology, Faculty of Medicine, Imperial College London, United Kingdom.

Correspondence should be addressed to Dr. Kneebone, Department of Surgical Oncology and Technology, 10th Floor QEQM Wing, St. Mary's Hospital, Praed Street, London W2 1NY, UK; e-mail: r.kneebone@imperial.ac.uk.


Abstract

Simulation-based learning is becoming widely established within medical education. It offers obvious benefits to novices learning invasive procedural skills, especially in a climate of decreasing clinical exposure. However, simulations are often accepted uncritically, with undue emphasis being placed on technological sophistication at the expense of theory-based design.

The author proposes four key areas that underpin simulation-based learning, and summarizes the theoretical grounding for each. These are (1) gaining technical proficiency (psychomotor skills and learning theory, the importance of repeated practice and regular reinforcement), (2) the place of expert assistance (a Vygotskian interpretation of tutor support, where assistance is tailored to each learner's needs), (3) learning within a professional context (situated learning and contemporary apprenticeship theory), and (4) the affective component of learning (the effect of emotion on learning).

The author then offers four criteria for critically evaluating new or existing simulations, based on the theoretical framework outlined above. These are: (1) Simulations should allow for sustained, deliberate practice within a safe environment, ensuring that recently acquired skills are consolidated within a defined curriculum which assures regular reinforcement; (2) simulations should provide access to expert tutors when appropriate, ensuring that such support fades when no longer needed; (3) simulations should map onto real-life clinical experience, ensuring that learning supports the experience gained within communities of actual practice; and (4) simulation-based learning environments should provide a supportive, motivational, and learner-centered milieu that is conducive to learning.

In this article I offer a framework for considering simulation-based learning and suggest a set of criteria for judging the effectiveness of existing and new simulations. The focus of the article is the place of simulation as a milieu for gaining expertise in practical clinical procedures, especially those relating to surgery. Although firmly grounded in relevant theory, the article should not be considered comprehensive or exhaustive. The views I put forward are rooted in my 25 years of professional experience as a surgeon, family physician, and educator, with a longstanding interest in the field of simulation.

Simulation is now well established within health care training and practice. An extensive literature bears on its use for learning clinical procedures, both simple and advanced. Procedures range from venipuncture and tying surgical knots to highly complex surgical operations, while the simulations themselves range from simple physical models to sophisticated virtual reality computer systems.1,2 In this literature there is a heavy emphasis on individual simulations and the technology that underpins them, but a coherent theoretical structure for their use is seldom presented. Making wise judgments about the usefulness or otherwise of simulations can therefore be difficult.

Anesthesia has led the way in developing a broad-based approach to clinical training that addresses teamwork and communication alongside technical and decision-making skills.3–5 Much published work relating to procedural interventions, however, is directed towards task-based training, where clinical practice is artificially broken down into component skills. These are practiced and assessed in skills laboratories, isolated from the clinical reality that they are intended to reflect. The assumption that such learning is directly transferable to a clinical context often goes untested. There seems a danger, indeed, that task-based simulation may become a self-referential universe, divorced from the wider context of actual clinical practice.6

A Conceptual Framework

As technology advances and as the climate of clinical education places more emphasis on simulation as a safe substitute for practicing on real patients, the range of available simulations will inevitably increase. It is therefore timely to examine the characteristics of a successful simulation in terms of learning and clinical outcomes. Based on my experience, my observations, and my reading of the literature, I have highlighted four broad areas, each of which subsumes a range of elements. In making this selection, my aim has been to provide a conceptual framework for judging the usefulness or otherwise of a given simulation.

These key areas are

* gaining and retaining technical proficiency,

* the place of expert assistance in task-based learning,

* learning within a professional context, and

* the affective component of learning.

Each area is briefly described below.

Gaining and retaining technical proficiency

It seems self-evident that clinicians who perform invasive practical procedures must acquire the psychomotor skills to do so safely. Such procedures may be relatively simple (e.g., venipuncture and urinary catheterization) or extremely complex (e.g., specialized surgical operations), but all require a high level of technical mastery.

It is clear from the literature across a wide range of domains that the acquisition of expertise requires sustained deliberate practice over many years.7,8 As Ericsson9 compellingly argues, the primary goal of practice is to improve some specific aspect of performance. Practice should therefore focus on a well-defined area, be supported by detailed immediate feedback, and provide opportunities for gradual improvement of the same or similar tasks.

Such practice should be distributed (broken into smaller units) rather than massed (carried out in a long continuous session) and should build in an element of overlearning (additional training beyond that required for initial proficiency).10,11 Moreover, if skills are not to decay they must be reinforced by regular repetition. Perhaps most important of all, however, is motivation. Simple repetition of a task is not enough; it must be underpinned by a determination to improve.9,12 Such a determination drives the continual striving toward improvement that is a sine qua non for achieving expertise.

Unlike in sports science and the performing arts, on which much of the literature on expertise acquisition is based, practice within a health care environment can pose obvious dangers if it involves real patients. Indeed, much of the thrust towards simulation as an integral part of health care training is driven by a need to protect patients. However, simulation has much more to offer than protection from harm.

Learning to perform a surgical operation illustrates important shortcomings of the traditional clinical approach to learning. In that approach, repeated observation while assisting a master surgeon necessarily means that the learner watches or performs parts of the procedure that are either too easy or too difficult to result in meaningful learning. Opportunities for sustained deliberate practice of just those parts that need to be learned at that time are likely to be few, and immediate feedback may not be provided. Simulation can address such drawbacks, offering the opportunity to practice relevant tasks as often as required and at a time convenient to the learner, yet without placing patients in jeopardy.

Simply providing access to simulators, however, is no guarantee that they will be used effectively or that the conditions for learning will be met. Lack of an effective curriculum all too often prevents learning gains from being consolidated and developed, and the absence of opportunities for regular reinforcement results in the loss of many recently acquired skills. As I point out in the following sections, it is also important to recognize the limitations of simulator-based practice and to avoid uncritically equating skills gained in an artificial setting with those needed for real-life practice.

The place of expert assistance in task-based learning

Having the opportunity to practice procedural skills repeatedly may be a necessary condition for learning, but it is certainly not sufficient. Feedback is a crucial component of the process. It is clear that expert assistance is vitally important, but also clear that such assistance must be judiciously applied if it is not to become counterproductive.

Lev Vygotsky, the early 20th-century Russian psychologist, put forward the concept of a “zone of proximal development” (ZPD), within which a learner can make progress in problem solving “in collaboration with more capable peers,” even if unable to do so unassisted.13,14 Bruner and others subsequently developed the idea of temporary learning support or “scaffolding” by an expert tutor, and Wood's notion of “contingent instruction”—of help that is there when needed but that deliberately “fades” when no longer required—emphasizes the two-way, responsive relationship between learner and teacher.15–17

A later view sees progress through the ZPD as a dynamic, often recursive process where each learner first receives external help, then performs under conscious guidance from the self, and finally internalizes the process to render it automatic.18 It is at this stage that assistance is not only unnecessary but counterproductive, as it interferes with the process of internalization. If this “automatization” is only partially completed, moreover, there may be recursion to an earlier stage, where a tutor's support is needed once more. Recent work on learner-oriented teaching highlights the importance of “constructive friction” in the progress from external to self-regulation of learning.19

It is important to distinguish between core procedural and technical skills (which form part of a clinician's basic armamentarium and which, once mastered, are available “automatically”) and higher levels of expertise (where, as Ericsson points out, “the key challenge for aspiring expert performers is to avoid the arrested development associated with automaticity and to acquire cognitive skills to support their continued learning and improvement”9). In relation to the former, a Vygotskian model provides a useful framework for conceptualizing the acquisition of technical skills within an educational setting, and particularly for examining the learning that takes place within simulations.

Learning within a professional context

The traditional model of clinical skills acquisition, where a learner learns by “sitting at the feet of a master,” is being supplanted by a more contemporary view of apprenticeship based on communities of practice and of learning.20,21 Instead of seeing learning as a process of internalization of individual experience, Lave and Wenger see it as an integral and inseparable aspect of social practice:

Learning viewed as situated activity has as its central defining characteristic a process that we call legitimate peripheral participation. By this we mean to draw attention to the point that learners inevitably participate in communities of practitioners, and that the mastery of knowledge and skill requires newcomers to move toward full participation in the sociocultural practices of a community. A person's intentions to learn are engaged and the meaning of learning is configured through the process of becoming a full participant in a sociocultural practice. This social process includes, indeed it subsumes, the learning of knowledgeable skills.20

In their examination of apprenticeship as a model for learning, Lave and Wenger point out that apprenticeship is about conferring legitimacy, not providing teaching. Much of the learning that takes place, indeed, does so through relations between peers, as part of their engagement in practice. Mastery, they say, resides not in the master but in the organization of the community of practice of which the master is part. Although Lave and Wenger's work is not directly related to health care, a wide view of the context of practice seems essential when considering the role of the clinician as teacher or as facilitator of learning.

Guile and Young, in their contemporary reading of Vygotsky's ZPD, see it as a space:

populated by a range of resources which include physical and cultural tools as well as other people, and that these resources are used, or come together to be used to shape and direct human activity. It follows that, from [Lave and Wenger's] perspective, intelligence and expertise are acquired through a process of accomplishment, rather than being a matter of self-possession.22

This wider view implies that simulation, viewed as one such resource, must reflect the contextual realities of everyday practice if it is to provide an effective adjunct to clinical experience. Practicing tasks on isolated models—however sophisticated—will inevitably present only a partial picture.

Ericsson highlights this point as follows:

Once we conceive of expert performance as mediated by complex integrated systems of representations for the execution, monitoring, planning and analyses of performance, it becomes clear that its acquisition requires an orderly and deliberate approach. Deliberate practice is therefore designed to improve specific aspects of performance in a manner that assures that attained changes can be successfully integrated into representative performance. Hence, practice aimed at improving integrated performance cannot be performed mindlessly or independent of the representative context for the target performance.9

Recent developments in scenario-based procedural skills, using inanimate models attached to simulated patients, provide contextual settings of this kind.23,24

The affective component of learning

The theory of skills acquisition is dominated by cognitive issues, and much less attention is paid to the emotional content of learning experiences. Indeed, this component of learning is often ignored altogether in traditional teaching. However, there is clearly a strong affective element to any learning experience, and this may exert a powerful positive or negative effect.25–27 Most clinicians, for instance, can give examples of inspirational teaching that profoundly affected their professional development. Equally, however, most can also tell of occasions where they were humiliated in front of patients and peers. Such experiences often endure in the memory for decades.

As outlined above, motivation is a key issue whose effect upon learning is often underrecognized. Although the picture is complex, it seems clear that learning must be underpinned by a determination to improve, and that a sense of self-efficacy is crucial to effective development.12,28

In “real” health care settings, learning is, in a sense, a byproduct of care. The clinical needs of the patient must always take priority over the educational needs of the learner. Simulation, however, deliberately places the learner's needs at the center of attention and provides the opportunity to create conditions of best practice for teaching. Although the need for a supportive learning environment is widely recognized in a general sense, the emotional climate within which clinical procedural skills are acquired is seldom acknowledged or explored. Ensuring that simulation-based training addresses these affective issues is a key challenge for future development.

Discussion

Simulation clearly offers enormous potential for safely developing expertise in procedural skills. In order to be effective, however, such activity needs to be part of a broader picture, supporting and meshing with actual clinical practice. All too often, simulation-based training seems dominated by technology, losing its links with the wider worlds of health care and educational theory.

It is clear that the locus of medical learning, especially at a novice level, is shifting from real life to simulation. At a superficial glance this makes eminent sense. Only in the protected environment of skills centers and virtual reality simulators, after all, is it safe to make mistakes and learn from them. Indeed, the opportunity of “testing to destruction,” of exploring the consequences of any given clinical action without risking actual harm, opens up possibilities that would be unthinkable within actual patient care.

Moreover, radical changes in patterns of health care and decreasing exposure to patients are rendering traditional apprenticeship inadequate or unacceptable, especially for gaining procedural expertise. Here, simulation is becoming a necessity rather than an optional extra. It is therefore all the more important to develop a critical approach to what any given simulation has to offer.

The surface appeal of simulation can conceal pitfalls. Skills centers are seductive places, in the sense that training within them has a high subjective value at the time it happens, both for teachers and for learners. Because such training focuses on areas of perceived clinical need, those who take part feel that they are making good progress. Indeed, from a short-term perspective, they are making good progress. The weakness in this argument, however, lies in the isolation within which such interventions commonly take place. The strong effect of face validity creates a perception of usefulness that runs counter to what we know about effective learning of skills, namely the need for sustained deliberate practice.

Because individual episodes of simulator-based training give a high sense of immediate satisfaction, it is easy to forget that such skills, however firmly grasped they appear to be at the time, will rapidly decay if they are not consolidated. Such decay is insidious, because learners do not realize it is happening. Since the responsibility of those who organize simulation-based training frequently ends after a single teaching episode, this essential process of consolidation is often ignored. In consequence, those fragile gains in learning that occur after a brief intervention may be lost through lack of regular practice.

Simulated environments are becoming widespread, and their siren voices exert a seductive appeal. Educational resources are limited, however, and not all simulations are equally effective. All too often it is the surface realism of a simulation that occupies the ingenuity of those who develop it, eclipsing key issues of learning and teaching. Yet the relationship between simulator fidelity and educational effectiveness remains open to discussion, and lower levels of fidelity may reduce technological complexity and cost without compromising learning outcomes.29

Evaluation of simulations is therefore a key challenge. As I have pointed out elsewhere, evaluation inevitably trails behind innovation.1 New simulations are constantly being developed, and by the time a formal evaluation has been carried out, the landscape surrounding the original product has often changed radically. Many evidence-based studies are therefore out of synchrony with currently available equipment.

This highlights a fundamental tension between those who develop simulations and those who use them. For a systems designer, technical advancement and profitability are key drivers, and a finished product represents the end of extensive development. For an evaluator, however, the same product marks the beginning of a prolonged and detailed process, constrained by imperatives of curriculum and course design. This tension can lead to unhelpful confusion about who is leading and who is following, diverting attention from the key tasks of identifying training needs and working collaboratively to satisfy them.

Because simulation is used in many ways by many people, there is no universally accepted approach to evaluation. This lack of consensus is clearly reflected in the literature, where descriptive articles far outnumber outcome-based studies, small sample sizes cause problems with validation, and opportunities for longitudinal investigation are limited. General approaches drawn from other domains, such as Kirkpatrick's30 influential four-level model, have much to offer but do not address the specific issues of clinical simulation.

I therefore propose the following criteria for critically evaluating a new or existing simulation. These build on the four theoretical areas outlined above and emphasize the mutuality of knowledge, skill, professional practice, and clinical context.

▪ Simulations should allow for sustained, deliberate practice within a safe environment, ensuring that recently acquired skills are consolidated within a defined curriculum which assures regular reinforcement.

▪ Simulations should provide access to expert tutors when appropriate, ensuring that such support fades when it is no longer needed.

▪ Simulations should map onto real-life clinical experience, ensuring that learning supports the experience gained within communities of actual practice.

▪ Simulation-based learning environments should provide a supportive, motivational, and learner-centered milieu that is conducive to learning.

Summing Up

In this article I have drawn together theories of education, psychomotor skill, and professional practice to offer a rationale for effective simulation-based learning, especially in relation to invasive procedures. From this I have proposed a framework for evaluating existing and new simulations, which I hope will be a stimulus for discussion and debate.

I propose that the four key criteria outlined above be considered when designing, implementing, evaluating, or purchasing simulation-based training programs for procedural skills.

References

1 Kneebone R. Simulation in surgical training: educational issues and practical implications. Med Educ. 2003;37:267–77.

2 Issenberg SB, Gordon MS, Gordon DL, Safford RE, Hart IR. Simulation and new learning technologies. Med Teach. 2001;23(1):16–23.

3 Holzman RS, Cooper JB, Gaba DM, Philip JH, Small SD, Feinstein D. Anesthesia crisis resource management: real-life simulation training in operating room crises. J Clin Anesth. 1995;7:675–87.

4 Gaba DM, DeAnda A. A comprehensive anesthesia simulation environment: re-creating the operating room for research and training. Anesthesiology. 1988;69:387–94.

5 Devitt JH, Kurrek MM, Cohen MM, Cleave-Hogg D. The validity of performance assessments using simulation. Anesthesiology. 2001;95:36–42.

6 Gordon JA, Oriol NE, Cooper JB. Bringing good teaching cases “to life”: a simulator-based medical education service. Acad Med. 2004;79:23–7.

7 Ericsson KA, Krampe RT, Tesch-Römer C. The role of deliberate practice in the acquisition of expert performance. Psychol Rev. 1993;100:363–406.

8 Ericsson KA, Charness N. Expert performance: its structure and acquisition. Am Psychol. 1994;49:725–47.

9 Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79:S70–81.

10 Donovan J, Radosevich D. A meta-analytic review of the distribution of practice effect: now you see it, now you don't. J Appl Psychol. 1999;84:795–805.

11 Arthur W, Bennett W, Stanush PL, McNelly TL. Factors that influence skill decay and retention: a quantitative review and analysis. Hum Perform. 1998;11(1):57–101.

12 Guest CB, Regehr G, Tiberius RG. The life long challenge of expertise. Med Educ. 2001;35:78–81.

13 Wertsch JV. Vygotsky and the Social Formation of Mind. Cambridge, MA: Harvard University Press; 1985.

14 Vygotsky LS. Thought and Language. Cambridge, MA: MIT Press; 1962.

15 Bruner JS. Toward a Theory of Instruction. Cambridge, MA: Harvard University Press; 1967.

16 Bruner JS. The Process of Education. Cambridge, MA: Harvard University Press; 1960.

17 Wood D. How Children Think and Learn. 2nd ed. Oxford: Blackwell; 1998.

18 Tharp R, Gallimore R. Theories of teaching as assisted performance. In: Light P, Sheldon P, Woodhead M (eds). Learning to Think. Routledge; 1991:42–59.

19 ten Cate O, Snell L, Mann K, Vermunt J. Orienting teaching toward the learning process. Acad Med. 2004;79:219–28.

20 Lave J, Wenger E. Situated Learning: Legitimate Peripheral Participation. Cambridge: Cambridge University Press; 1991.

21 Wenger E. Communities of Practice: Learning, Meaning, and Identity. Cambridge: Cambridge University Press; 1998.

22 Guile D, Young M. Apprenticeship as a conceptual basis for a social theory of learning. In: Paechter C, Preedy M, Scott D, Soler J (eds). Knowledge, Power and Learning. London: Paul Chapman Publishing; 2001:56–73.

23 Kneebone R, Kidd J, Nestel D, Asvall S, Paraskeva P, Darzi A. An innovative model for teaching and learning clinical procedures. Med Educ. 2002;36:628–34.

24 Nestel D, Kneebone R, Kidd J. Teaching and learning about skills in minor surgery. J Clin Nurs. 2003;12:291–6.

25 Ferro TR. The influence of affective processing in education and training. In: Flannery DD (ed). Applying Cognitive Learning Theory to Adult Learning. San Francisco: Jossey-Bass; 1993:25–33.

26 Boud D, Keogh R, Walker D. Promoting reflection in learning: a model. In: Edwards R, Hanson A, Raggatt P (eds). Boundaries of Adult Learning. New York: Routledge; 1996:32–56.

27 Cassar K. Development of an instrument to measure the surgical operating theatre learning environment as perceived by basic surgical trainees. Med Teach. 2004;26:260–4.

28 Mann K. Motivation in medical education: how theory can inform our practice. Acad Med. 1999;74:237–9.

29 Grober ED, Hamstra SJ, Wanzel KR, et al. The educational impact of bench model fidelity on the acquisition of technical skill: the use of clinically relevant outcome measures. Ann Surg. 2004;240:374–81.

30 Kirkpatrick D. Evaluating Training Programs. San Francisco: Berrett-Koehler Publishers; 1994.


© 2005 Association of American Medical Colleges
