Learn, See, Practice, Prove, Do, Maintain: An Evidence-Based Pedagogical Framework for Procedural Skill Training in Medicine

Sawyer, Taylor DO, MEd; White, Marjorie MD, MPPM, MEd; Zaveri, Pavan MD, MEd; Chang, Todd MD; Ades, Anne MD; French, Heather MD; Anderson, JoDee MD, MEd; Auerbach, Marc MD, MSCI; Johnston, Lindsay MD; Kessler, David MD, MSCI

doi: 10.1097/ACM.0000000000000734
Perspectives

Acquisition of competency in procedural skills is a fundamental goal of medical training. In this Perspective, the authors propose an evidence-based pedagogical framework for procedural skill training. The framework was developed based on a review of the literature using a critical synthesis approach and builds on earlier models of procedural skill training in medicine. The authors begin by describing the fundamentals of procedural skill development. Then, a six-step pedagogical framework for procedural skills training is presented: Learn, See, Practice, Prove, Do, and Maintain. In this framework, procedural skill training begins with the learner acquiring requisite cognitive knowledge through didactic education (Learn) and observation of the procedure (See). The learner then progresses to the stage of psychomotor skill acquisition and is allowed to deliberately practice the procedure on a simulator (Practice). Simulation-based mastery learning is employed to allow the trainee to prove competency prior to performing the procedure on a patient (Prove). Once competency is demonstrated on a simulator, the trainee is allowed to perform the procedure on patients with direct supervision, until he or she can be entrusted to perform the procedure independently (Do). Maintenance of the skill is ensured through continued clinical practice, supplemented by simulation-based training as needed (Maintain). Evidence in support of each component of the framework is presented. Implementation of the proposed framework presents a paradigm shift in procedural skill training. However, the authors believe that adoption of the framework will improve procedural skill training and patient safety.

T. Sawyer is assistant professor, Department of Pediatrics, University of Washington School of Medicine, Seattle, Washington.

M. White is assistant professor, Department of Pediatrics, University of Alabama at Birmingham, Birmingham, Alabama.

P. Zaveri is assistant professor, Division of Emergency Medicine, Children’s National Health System, Washington, DC.

T. Chang is assistant professor, Division of Emergency Medicine and Transport, Children’s Hospital Los Angeles, Los Angeles, California.

A. Ades is associate professor, Department of Pediatrics, Children’s Hospital of Philadelphia, Philadelphia, Pennsylvania.

H. French is assistant professor, Department of Pediatrics, Children’s Hospital of Philadelphia, Philadelphia, Pennsylvania.

J. Anderson is associate professor, Department of Pediatrics, Oregon Health & Science University, Portland, Oregon.

M. Auerbach is assistant professor, Department of Pediatrics, Yale School of Medicine, New Haven, Connecticut.

L. Johnston is assistant professor, Department of Pediatrics, Yale School of Medicine, New Haven, Connecticut.

D. Kessler is assistant professor, Department of Pediatrics, Columbia University College of Physicians and Surgeons, New York, New York.

Funding/Support: None reported.

Other disclosures: None reported.

Ethical approval: Reported as not applicable.

Previous presentations: The framework proposed here has been previously taught as part of instructional workshops conducted by the authors at national and international conferences, including the Academic Pediatric Society Meeting, Washington, DC, May 2013; the International Pediatric Simulation Symposia and Workshops, Vienna, Austria, April 2014; and the International Meeting for Simulation in Healthcare, New Orleans, Louisiana, January 2015.

Correspondence should be addressed to Taylor Sawyer, Department of Pediatrics, Division of Neonatology, University of Washington School of Medicine, 1959 NE Pacific St., RR 539, HSB, Seattle, WA 98195; telephone: (206) 221-5716; e-mail: tlsawyer@uw.edu.

Procedures are fundamental to the medical profession. Acquiring competency in procedural skills is a fundamental goal of medical education, requiring specific education, training, and assessment. Once competency is acquired, maintenance of skills is essential to avoid natural skill decay. The well-known Halstedian mantra “see one, do one, teach one” is the traditional paradigm for teaching procedural skills in medicine. In this paradigm, procedural skill training is accomplished through direct patient care, with trainees practicing procedures on patients as part of a medical apprenticeship model. This training method has been brought under scrutiny within the past decade because of patient safety concerns,1 and an end to the “see one, do one, teach one” era, through the use of simulation-based medical education, has been proposed.2

Simulation-based medical education is an instructional technique that enables trainees to safely gain competency in procedural skills without harm to patients. Its use has been associated with better patient care and improved patient safety.3–9 The utility of simulation for psychomotor skills acquisition has been recently reviewed,10 and the use of simulation is advocated by the Accreditation Council for Graduate Medical Education (ACGME).11 Thus, a modern pedagogy for procedural skill education should incorporate instructional design strategies that effectively use simulation as a procedural skills training platform.

In this article we describe an evidence-based pedagogical framework for teaching procedural skills in medicine. We developed our proposed framework—learn, see, practice, prove, do, and maintain—based on a review and critical synthesis of the literature. The proposed framework includes simulation as a key educational modality and incorporates proven instructional design features, such as deliberate practice and mastery learning, as critical components. The framework addresses the development, assessment, and maintenance of procedural skills. The foundation of the framework is rooted in adult learning theory.

We begin by describing the search methodology used to develop the proposed framework. Next we describe the fundamentals of procedural skill development to provide context for the training framework. We then describe each step of the framework, including relevant examples and supportive data from the literature. To conclude, we summarize and discuss the implications of using the proposed framework.

Literature Review and Synthesis

To develop our proposed framework, we followed a nonsystematic, critical synthesis approach.12,13 The process was completed in two phases over the course of two years. Phase I focused on collating evidence in support of a unified procedural skills training framework. Phase II involved a critical synthesis of the literature in relation to the proposed framework.

During Phase I, five reviewers (T.S., M.W., P.Z., D.K., M.A.) performed a broad review of the literature pertaining to psychomotor skill training and procedural skill education in medicine. After individual reviews, the authors met in January 2013 to discuss the results and map a draft procedural skills training framework. During Phase II, after the draft framework was developed, the original five and five additional reviewers (T.C., J.A., H.F., A.A., L.J.) searched the medical literature for empiric evidence to support or refute the framework. The authors met again in January 2014 to review the evidence and formalize the final framework.

In keeping with a nonsystematic critical synthesis approach, articles reviewed in both phases comprised a broad range of materials, including descriptive/narrative reports; qualitative and quantitative studies using both experimental and quasi-experimental methods; literature reviews; systematic reviews; and meta-analyses. We also employed searches of gray literature and hand searches of bibliographies. Given the diversity of materials reviewed, we did not attempt to quantify results, grade the level of evidence of each paper, or perform statistical analysis. Instead, we strove to consider the literature broadly to answer the focal question: What is the best framework for teaching procedural skills in medicine?

Procedural Skill Development

We define procedural skills to include “the mental and motor activities required to execute a manual task.”14 Procedural skills can range from simple tasks, such as drainage of an abscess, to complex tasks, such as endotracheal intubation. However, we believe that learning any procedure follows the same fundamental process, thus allowing all procedural skills training to be based on a common framework.

The developmental stages of learning in medicine have been previously defined by Dreyfus and Dreyfus.15 The “Dreyfus model” details the development of a medical provider’s scope of vision and range of capabilities along a continuum of five stages: novice, advanced beginner, competent, proficient, and expert.15 A five-stage developmental progression has also been defined specifically for psychomotor/procedural skills. Simpson’s16 and Harrow’s17 taxonomies of the psychomotor domain describe a progression of procedural skill development through a continuum of five stages:

1. Guided response indicates the earliest stage in learning a skill, and primarily includes imitation and trial and error.

2. Mechanism is an intermediate stage in skill learning and describes a state wherein learned responses have become habitual and the movements associated with the skill can be performed with some proficiency and confidence.

3. Complex overt response is a stage at which a procedure can be performed competently with quick, accurate, and highly coordinated performance. At this stage the learner would be considered “competent” with the procedure.

4. Adaptation indicates that skills are so well developed that the individual can modify movement patterns to fit difficult situations.

5. Originating, the final step in skill development, defines a phase in which the skill has been mastered to such an extent that new movement patterns can be created to fit a particular situation or unique problem.

Figure 1 shows the developmental progression in procedural skill mastery using Simpson’s and Harrow’s taxonomies and correlates each of the five stages of psychomotor skill development with the Dreyfus and Dreyfus developmental stages lexicon. It is within this context of procedural skill development that our proposed pedagogical framework for procedural skill training is employed.

An Evidence-Based Pedagogical Framework

We identified numerous reports on how to conduct procedural skill training in medicine.18–23 We also identified several practical guides on teaching medical procedures.24–26 Of the available training methodologies, we felt the paradigm proposed by Kovacs18 offered one of the best approaches and possessed a high degree of validity based on its foundation in psychomotor learning theory. According to Kovacs, procedural skill training should encompass four steps:

1. Learn: A trainee should learn about the procedure and acquire the requisite cognitive knowledge.

2. See: The trainee should then see the procedure performed by an instructor or preceptor.

3. Practice: After learning the procedure and observing it being performed, the trainee should practice the procedure.

4. Do: Finally, the trainee should continue to practice the procedure by performing it on patients.

Kovacs briefly discussed the role of simulation in this paradigm, mentioning the use of “artificial settings” and “models,” but his early report did not include modern evidence in support of simulation. Building on Kovacs’s original framework, we identified two additional, vitally important steps: Prove and Maintain. Our proposed framework is Learn, See, Practice, Prove, Do, Maintain. We believe this framework takes into account the best evidence currently available and establishes a modern pedagogy for procedural skills education in medicine. An overview of the pedagogical framework is presented in Figure 2, and we discuss each of its components below.

Learn

Teaching and learning procedural skills can be divided into two phases: the cognitive phase and the psychomotor phase.18 The relative importance of each phase, and the amount of time devoted to each, depends on both the procedure and the learner. The cognitive phase is the period devoted to learning about the procedure and developing an understanding of the indications, contraindications, and motor actions involved. Some complex procedures may require a significant cognitive component, whereas simple procedural skills may require minimal cognition. The cognitive phase comprises two subphases: conceptualization and visualization.18

In our proposed framework, the first phase of procedural skill training involves acquiring the required cognitive knowledge about the procedural skill. This Learn step focuses on conceptualization. Instructional techniques involved in this step could include learning strategies such as assigned reading, didactic sessions, and multimedia Web-based programs.27 The benefits of providing a cognitive component prior to any hands-on training are supported by empiric investigation.28,29 This step can be conducted individually or in a group, through either asynchronous or synchronous modalities. A standardized test, such as a multiple-choice exam, can be used to verify that requisite cognitive knowledge has been gained prior to the initiation of hands-on procedural skill training.

See

After the cognitive phase has been completed, the next phase of procedural skill training involves an instructor demonstrating and modeling the procedure for the learner. The See step focuses on visualization.18 The demonstration of a skill is optimized by including both nonverbal and verbal instruction.26,30 The nonverbal instruction includes a demonstration of the procedure from start to finish without commentary. The verbal instruction, referred to as “deconstruction” by Peyton,30 includes a demonstration of each step in the procedure with accompanying verbal description. These demonstrations can be presented either in person or by video.28,29 A third step may involve the learner explaining each step of the procedure while the teacher follows the learner’s instructions.30 Evidence supports the educational benefits of demonstrating procedural skills prior to hands-on training to enhance clinical skill acquisition.28,29,31–33

A requirement for the proper demonstration of a procedure is for educators and instructors to come to a consensus on the way the procedure is best performed and to identify the key steps of the procedure. This can be accomplished through the development of a validated procedural checklist via a Delphi method.34–40
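
As a schematic illustration (not drawn from the cited studies), the Python sketch below shows how agreement across a Delphi round might be tallied to retain checklist items. The 80% agreement threshold and the item names are illustrative assumptions; the cited studies define their own consensus criteria.

```python
# Illustrative sketch: retain checklist items that enough Delphi panelists
# rate as essential. The threshold and item names are assumptions, not the
# criteria used in the cited checklist-development studies.

RETENTION_THRESHOLD = 0.80  # fraction of panelists rating an item essential

def retained_items(ratings: dict) -> list:
    """ratings maps each candidate item to one 'essential?' vote per panelist."""
    return [
        item
        for item, votes in ratings.items()
        if sum(votes) / len(votes) >= RETENTION_THRESHOLD
    ]

round_one = {
    "Confirm patient identity": [True, True, True, True, True],
    "Position patient appropriately": [True, True, True, True, False],
    "Re-drape between attempts": [False, True, False, False, False],
}
print(retained_items(round_one))
# ['Confirm patient identity', 'Position patient appropriately']
```

In a full Delphi process, items falling below the threshold would be revised or dropped and the survey repeated until the panel converges on the key steps.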

Practice

The psychomotor phase of procedural skills training involves practicing the procedure with correction and reinforcement, as well as completing the procedure on a patient in the clinical arena.18 In our proposed framework, practicing the procedure (Practice) and proving competency through simulation-based assessment (Prove) precede performing the procedure for the first time on a patient (Do). The Practice step is optimized by using deliberate practice.

As defined by Ericsson et al,41–43 deliberate practice describes a regimen of effortful activity designed to optimize improvements in the acquisition of expert performance. The key features of deliberate practice are motivated learners, well-defined learning objectives, focused and repetitive practice, precise measurements of performance, and formative feedback. The goal of formative feedback during practice is to improve performance. The importance of formative feedback in procedural skills training is supported by Adams’s44,45 closed-loop theory (see Figure 3), wherein the feedback improves a learner’s knowledge of results and facilitates the detection and correction of errors.

In the Practice step, the learner is allowed the opportunity for deliberate practice of the procedure in a safe learning environment (e.g., a simulation center or in situ simulation-based training) on a partial-task trainer, mannequin, or virtual reality trainer. Evaluation at this phase is formative in nature and directed at defining areas for improvement and modification to maximize performance. Numerous reports in the medical literature describe the benefits of deliberate practice in improving procedural performance.46–49 Deliberate practice using simulation has been found to be superior to traditional clinical medical education in achieving specific clinical skill acquisition goals.50 Other instructional design features shown to improve skills outcomes in simulation-based practice include a range of difficulty, distributed practice, longer practice time, using multiple learning strategies, introducing clinical variation, individualized learning, and mastery learning.27

Prove

In the Prove step of our proposed framework, the learner undergoes objective skills assessment on a simulator, to ensure that procedural competency has been achieved, prior to performing the procedure on a patient. The Prove step uses simulation-based mastery learning (SBML). The seven key characteristics of SBML include (1) clear learning objectives; (2) baseline skill assessment; (3) a valid assessment tool with a predetermined minimal passing standard (e.g., “mastery-level”); (4) practice that is focused on reaching mastery-level performance; (5) skill testing to assess achievement of mastery-level performance; (6) continued practice, as needed, until the mastery-level performance is achieved; and (7) progression to the next level of training only after achievement of the mastery standard.4 Mastery learning augments deliberate practice through the addition of a clearly delineated level of performance that defines mastery, and the requirement for continuous practice until the learner achieves mastery-level performance.51 This predefined mastery-level performance greatly informs the feedback provided to the learner and may assist with clarifying the knowledge of results, as defined by Adams.44,45 Competency-based assessment using medical simulation, prior to the performance of the procedure on a patient, is one of the most important roles of simulation as a patient safety modality.1,52 This type of “pre-patient training” is currently used in many medical training programs.11,24,52 Multiple reports in the literature demonstrate the benefits of using an SBML model to teach procedural skills.53–57 A recent meta-analysis showed simulation-based medical education incorporating mastery learning to be superior to nonmastery instruction.58 The determination of mastery-level performance on the simulator can be performed prior to the start of clinical rotations,24,32,52 or immediately prior to the performance of a procedure on a patient using a “just-in-time” model of performance assessment.59
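
Viewed operationally, the seven characteristics above amount to a simple training loop. The Python sketch below is a minimal illustration under assumed names: assess and practice_session are hypothetical placeholders for a scored simulator assessment and a deliberate-practice session with feedback, and the 90-point passing standard is an arbitrary stand-in for a defensibly set minimum passing standard.

```python
# Minimal sketch of the simulation-based mastery learning (SBML) loop.
# `assess` and `practice_session` are hypothetical placeholders; the
# passing standard shown is illustrative, not a recommended value.

from typing import Callable

def mastery_learning(
    assess: Callable[[], float],                # scored simulator assessment
    practice_session: Callable[[float], None],  # practice targeted at deficits
    passing_standard: float = 90.0,             # predetermined mastery level
    max_attempts: int = 10,
) -> bool:
    """Return True only when mastery-level performance is demonstrated."""
    score = assess()                            # (2) baseline skill assessment
    attempts = 1
    while score < passing_standard and attempts < max_attempts:
        practice_session(score)                 # (4) practice toward mastery
        score = assess()                        # (5)/(6) retest until standard met
        attempts += 1
    return score >= passing_standard            # (7) gate progression on mastery
```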

The ability to evaluate mastery-level performance requires an assessment tool with a high level of validity and reliability. Assessment tools for procedural skills commonly take the form of either checklists or global rating scales.20 Methods used to determine the validity and reliability of these assessment tools are described elsewhere.60–62 The evidence supporting the psychometric properties of several assessment tools has been recently reviewed.63,64

Both checklists and global rating scales have benefits and drawbacks. Benefits of checklists include their specific and objective nature: they typically present a sequential series of steps in the procedure with a simple “done” or “not done” check box next to each step.65 Drawbacks include that sequential checklists may not differentiate critical from less important steps and that not all steps of a checklist are always required to successfully complete a procedure.20

Global rating scales provide a broader assessment of procedural competency. They typically use a Likert-type scale to provide an overall rating of procedural skill (e.g., 1 = novice, 3 = competent, 5 = expert). Specific behavioral anchors can be added to provide explicit examples of the behaviors indicative of each skill level, yielding a type of global rating scale known as a behaviorally anchored rating scale. A benefit of a global rating scale is the comprehensive impression of competency it provides, without reliance on predefined steps to determine proficiency.20 Limitations include the loss of granularity and the inability to provide specific feedback on incorrect steps.20

Given the benefits and drawbacks of each type of assessment method, we recommend a hybrid assessment tool that includes both a checklist and a global rating scale to mitigate the weaknesses of both methods and accentuate their respective strengths. An example template of such a hybrid procedural skills checklist is provided in Appendix 1.
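
To make the structure of such a hybrid tool concrete, the Python sketch below pairs a done/not-done checklist with a behaviorally anchored global rating. The field names, anchor wording, and scoring are illustrative assumptions and do not reproduce the Appendix 1 template.

```python
# Illustrative data structure for a hybrid assessment tool: a sequential
# done/not-done checklist plus a global rating (1 = novice, 3 = competent,
# 5 = expert). Names and scoring are assumptions, not the Appendix 1 template.

from dataclasses import dataclass, field

@dataclass
class HybridAssessment:
    checklist: dict = field(default_factory=dict)  # step -> done?
    global_rating: int = 1                         # Likert-type overall rating

    def checklist_score(self) -> float:
        """Fraction of checklist steps completed."""
        if not self.checklist:
            return 0.0
        return sum(self.checklist.values()) / len(self.checklist)

    def missed_steps(self) -> list:
        """The step-specific feedback a checklist supports and a global scale cannot."""
        return [step for step, done in self.checklist.items() if not done]

result = HybridAssessment(
    checklist={"Obtain informed consent": True,
               "Maintain sterile technique": True,
               "Confirm placement": False},
    global_rating=3,
)
print(result.checklist_score(), result.missed_steps())
```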

Do

The teaching of procedural skills must eventually move from the simulation realm to the clinical realm. In Miller’s66 well-known hierarchy, assessment begins with “knows,” then progresses to “knows how,” “shows how,” and culminates in “does.” Assessment of procedural skills on a simulator aligns with “shows how,” and assessment of procedural skills on a real patient aligns with “does” in Miller’s pyramid.66 In our proposed framework, after cognitive knowledge of the procedure has been attained (Learn), the procedure has been modeled (See), sufficient practice using simulation has been conducted (Practice), and verification of procedural skill to a predefined mastery level on a simulator has been achieved (Prove), the learner is finally allowed to perform the procedure on a patient (Do). Thus, only after a trainee is deemed competent on a simulator can he or she continue the process of procedural skill development on real patients in the workplace. This translation of the procedural skill from the realm of simulation to a real-world setting represents a key transition point.

Because of the inherent differences between simulation and real-life clinical practice, competency during simulation should never be considered adequate evidence of true clinical competency. Rethans et al67 defined competency-based assessment as a measure of what doctors do in testing situations (e.g., simulation) and performance-based assessment as a measure of what doctors do in actual practice. Performance-based assessment is required to ensure that the learner can be trusted to perform the procedure independently and without direct supervision. The concept of entrusting a trainee to perform in the clinical environment without direct supervision is the core tenet of entrustable professional activities.68 As proposed by ten Cate,69 the levels of graduated supervision leading to entrustment progress from observation of the procedure only, to performing the procedure with direct supervision in the room, to having supervision available within minutes, to performing the procedure unsupervised (i.e., under clinical oversight), and eventually to providing supervision to more junior practitioners. In our proposed framework, graduated supervision occurs during the Do step, leading eventually to entrustment, and then to ongoing skill maintenance in the Maintain step.
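
Because these supervision levels form an ordered progression, they can be encoded directly, for example in a training-record system. The labels in the Python sketch below paraphrase ten Cate’s levels and are our own naming, not official terminology.

```python
# Graduated supervision levels as an ordered scale. Labels paraphrase the
# progression described by ten Cate and are illustrative, not official.

from enum import IntEnum

class SupervisionLevel(IntEnum):
    OBSERVE_ONLY = 1         # trainee observes the procedure only
    DIRECT_SUPERVISION = 2   # supervisor present in the room
    SUPERVISION_NEARBY = 3   # supervision available within minutes
    UNSUPERVISED = 4         # clinical oversight only (entrusted)
    SUPERVISES_JUNIORS = 5   # provides supervision to junior practitioners

def entrusted(level: SupervisionLevel) -> bool:
    """Entrustment corresponds to unsupervised practice or beyond."""
    return level >= SupervisionLevel.UNSUPERVISED
```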

For the Do step to be successful—and safe—the learner must initially be directly supervised during the performance of a procedure on a patient and receive real-time assessment and feedback on technique. This type of direct observation has been referred to as “workplace-based assessment,” “assessment of performance,” or a “supervised learning event.”70,71 These assessments are formative in nature and provide an opportunity for a preceptor to give direct feedback to a trainee to optimize procedural skills and patient outcomes while avoiding harm. Providing a structured environment within which a learner can reliably receive formative assessment of procedural skills can be accomplished through individualized one-on-one training during a clinical rotation, or by rotation on a dedicated medical procedure service.72–74 Supervision in either context is best provided by an attending physician or other expert provider, as opposed to one of the trainee’s peers.74–76

The determination of clinical competency with a procedural skill is challenging, but several methods may help determine clinical competency, including an individualized screening process, tracking the number of procedures performed by a trainee, and statistical analysis of procedural success and failure rates. Each of these methods has benefits and drawbacks. Ideally, some combination of these assessment methods could be used simultaneously to provide optimal evidence of clinical competency.

As described by Rethans et al,67 assessment of procedural competency during clinical care should include a general screening component in which all trainees participate, followed by either a continuous quality improvement cycle for those who pass the screen, or a diagnostic investigation and follow-up for those who perform poorly on the screen. To facilitate the screening process and provide an accurate assessment of procedural competency, the same checklist or assessment tool used in the Prove step can be used in the Do step—this time to evaluate procedural skill on a patient, rather than a simulator. The benefits of this methodology include the one-on-one expert assessment provided to each individual learner. Drawbacks include difficulty facilitating the one-on-one supervision and feedback in a busy clinical environment and the need for faculty training in the use of the assessment tools.

In the United States, several ACGME resident review committees have outlined specific “key index procedures” for their specialty and have published guidelines on the minimum numbers of these procedures that a resident must perform prior to graduation.77–80 The goal of this minimum number is to ensure that each trainee receives adequate exposure to these key index procedures and, as a result, achieves performance-based competency. Benefits of this method include the relative ease with which the assessment can be done using procedure logs, also referred to as case logs. A clear drawback is that performance of a set number of procedures does not provide definitive evidence of competency, because the amount of procedural experience required to achieve competency varies widely; some trainees achieve competency more slowly and require more procedures to do so than others.

Cumulative sum (CUSUM) analysis, a type of statistical control chart, has been explored as a method of obtaining objective information on both individual competency and the average number of procedures required to achieve competence among a given learner group. Using CUSUM analysis, individual learning curves can be created based on predefined acceptable and unacceptable failure rates and reasonable probabilities of type I and type II errors. Early evidence using CUSUM methodology to define the number of procedures needed to achieve competency is promising.81–83 Benefits of CUSUM include its reliance on objective statistical analysis of procedural success. Drawbacks include the need for trainees to diligently record all procedural successes and failures and the inherent difficulty of defining “acceptable” success and failure rates for any given procedure.
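
For readers unfamiliar with the technique, the Python sketch below computes a CUSUM learning curve for a binary procedural outcome using one common sequential-analysis formulation; the failure rates and error probabilities shown are arbitrary examples that each program would need to set for itself.

```python
# Sketch of a CUSUM learning curve for a binary outcome, using one common
# sequential-analysis formulation. All parameter values are illustrative.

import math

def cusum_curve(outcomes, p0=0.05, p1=0.10, alpha=0.10, beta=0.10):
    """outcomes: sequence of booleans, True = procedural success.

    p0 = acceptable failure rate; p1 = unacceptable failure rate;
    alpha, beta = tolerated type I and type II error probabilities.
    """
    P = math.log(p1 / p0)
    Q = math.log((1 - p0) / (1 - p1))
    s = Q / (P + Q)                              # decrement per success
    h0 = math.log((1 - alpha) / beta) / (P + Q)  # "acceptable" decision interval
    h1 = math.log((1 - beta) / alpha) / (P + Q)  # "unacceptable" decision interval
    score, curve = 0.0, []
    for success in outcomes:
        score += -s if success else (1 - s)      # each failure adds 1 - s
        curve.append(score)
    # Interpretation: a cumulative fall of h0 below a prior peak is evidence
    # the true failure rate is at or below p0 (competence); a cumulative rise
    # of h1 above a prior trough signals an unacceptably high failure rate.
    return curve, h0, h1

curve, h0, h1 = cusum_curve([True] * 12 + [False] + [True] * 25)
```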

Maintain

Once achieved, competency with a procedural skill will degrade with time if the procedure is not practiced regularly. The term “de-skilling” has been applied to the gradual loss of skills through infrequent practice.84 In novice providers, this de-skilling will likely occur rapidly; in experienced providers, it may occur more slowly. However, degradation curves for procedural skills, based on learner groups and experience, have yet to be defined. Thus, the frequency and intensity of practice required to maintain procedural skill are unknown. Skill decay, and simulation-based interventions to prevent it, remains an active area of research.

For practitioners who do not perform a specific procedure on a regular basis in their clinical practice, or who have long gaps in clinical time, simulation provides the only feasible method to allow needed practice with the procedure.85 A theoretical representation of the synthesis between simulation and clinical practice on procedural skill development and maintenance is provided in Figure 4. As shown in Figure 4, skill maintenance could include both clinical practice and simulation, with simulation acting as supplemental training for infrequently performed procedures or as refresher training after breaks in clinical practice.85 Tracking of procedures (e.g., with procedure logs) or CUSUM analysis could be included as part of individual continuous quality improvement to provide objective information on the potential need for simulation-based refresher training. Several methods have been used to provide simulation-based maintenance training: “dress rehearsals,” “rolling refreshers,” “just-in-time” training, and “booster” training.9,86–89 Maintenance of competency in procedural skills in one’s area of clinical practice is a critical component of lifelong learning, and key to the ACGME and American Board of Medical Specialties core competencies of Patient Care and Procedural Skills and Practice-based Learning and Improvement. The American Board of Anesthesiology currently uses a simulation-based practice performance assessment and improvement program to satisfy maintenance of certification (MOC) requirements.90 Other medical specialties, including family medicine, are investigating the use of simulation-based training for MOC as well.90
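
As a deliberately simple illustration of how procedure logs might feed such a quality improvement cycle, the Python sketch below flags any procedure not performed within a fixed window as a candidate for simulation-based refresher training. Because degradation curves remain undefined, the 180-day window is an arbitrary assumption, not an evidence-based interval.

```python
# Toy illustration: flag procedures for simulation-based refresher training
# when the log shows none performed recently. The 180-day window is an
# arbitrary assumption; evidence-based decay intervals are not yet defined.

from datetime import date, timedelta
from typing import Dict, List, Optional

def needs_refresher(procedure_log: Dict[str, List[date]],
                    window: timedelta = timedelta(days=180),
                    today: Optional[date] = None) -> List[str]:
    """Return procedures with no logged performance within the window."""
    today = today or date.today()
    return [
        procedure
        for procedure, performed_on in procedure_log.items()
        if not performed_on or today - max(performed_on) > window
    ]

log = {
    "lumbar puncture": [date(2014, 3, 2)],
    "endotracheal intubation": [date(2014, 11, 20), date(2015, 1, 8)],
}
print(needs_refresher(log, today=date(2015, 3, 1)))
# ['lumbar puncture']
```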

Summary

In this article we have described a six-step, evidence-based pedagogical framework for procedural skill training in medicine: Learn, See, Practice, Prove, Do, Maintain. The framework was developed after a review and critical synthesis of the literature and is founded on adult learning theory. The evidence behind each of the key components of the framework is rooted in empiric investigation. We hope that the framework described here will provide a comprehensive conceptual guide to medical educators involved in teaching procedural skills. Implementation of our proposed framework will no doubt be challenging. The formal structure of the training paradigm, with a focus on competency-based assessments through simulation, performance-based assessments during clinical care, and skills maintenance augmented by simulation as needed, presents a paradigm shift in procedural skill training. However, we believe that adoption of the framework by medical educators will improve procedural skill training and will ultimately improve medical care and patient safety.

References

1. Ziv A, Wolpe PR, Small SD, Glick S. Simulation-based medical education: An ethical imperative. Acad Med. 2003;78:783–788
2. Lenchus JD. End of the “see one, do one, teach one” era: The next generation of invasive bedside procedural instruction. J Am Osteopath Assoc. 2010;110:340–346
3. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: A systematic review and meta-analysis. JAMA. 2011;306:978–988
4. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44:50–63
5. Schaefer J, Vanderbilt A, Cason C, et al. Instructional design and pedagogy science in healthcare simulation. Simul Healthc. 2011;6(suppl):S30–S41
6. McGaghie WC, Draycott TJ, Dunn WF, Lopez CM, Stefanidis D. Evaluating the impact of simulation on translational patient outcomes. Simul Healthc. 2011;6(suppl):S42–S47
7. Gunberg J. Simulation and psychomotor skill acquisition: A review of the literature. Clin Simul Nurs. 2012;8:e429–e435
8. Zendejas B, Brydges R, Wang AT, Cook DA. Patient outcomes in simulation-based medical education: A systematic review. J Gen Intern Med. 2013;28:1078–1089
9. Scholtz AK, Monachino AM, Nishisaki A, Nadkarni VM, Lengetti E. Central venous catheter dress rehearsals: Translating simulation training to patient care and outcomes. Simul Healthc. 2013;8:341–349
10. Ross J. Simulation and psychomotor skill acquisition: A review of the literature. Clin Simul Nurs. 2012;8:e429–e435
11. Philibert I. Accreditation Council for Graduate Medical Education (ACGME) Bulletin. Published December 2005. https://tulane.edu/som/sim/faculty/upload/bulletin12_05.pdf. Accessed March 11, 2015
12. Eva KW. On the limits of systematicity. Med Educ. 2008;42:852–853
13. Cook D. Narrowing the focus and broadening horizons: Complementary roles for systematic and nonsystematic reviews. Adv Health Sci Educ. 2008;13:391–395
14. Foley RP, Smilansky J. Teaching Techniques: A Handbook for Health Professionals. New York, NY: McGraw-Hill; 1980:71–91
15. Dreyfus S, Dreyfus H. A Five-Stage Model of the Mental Activities Involved in Directed Skill Acquisition. Published February 1980. http://www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA084551. Accessed March 11, 2015
16. Simpson E. The Classification of Educational Objectives in the Psychomotor Domain. Washington, DC: Gryphon House; 1972
17. Harrow A. A Taxonomy of the Psychomotor Domain. New York, NY: David McKay; 1972:14–31
18. Kovacs G. Procedural skills in medicine: Linking theory with practice. J Emerg Med. 1997;15:387–391
19. Norris TE, Cullison SW, Fihn SD. Teaching procedural skills. J Gen Intern Med. 1997;12(suppl 2):S64–S70
20. Lammers RL, Davenport M, Korley F, et al. Teaching and assessing procedural skills using simulation: Metrics and methodology. Acad Emerg Med. 2008;15:1079–1087
21. Kneebone R. Evaluating clinical simulations for learning procedural skills: A theory-based approach. Acad Med. 2005;80:549–553
22. McLeod PJ, Steinert Y, Trudel J, Gottesman R. Seven principles for teaching procedural and technical skills. Acad Med. 2001;76:1080
23. Wang E, Quinones J, Fitch M, et al. Developing technical expertise in emergency medicine—the role of simulation in procedural skill acquisition. Acad Emerg Med. 2008;15:1046–1057
24. Grantcharov TP, Reznick RK. Teaching procedural skills. BMJ. 2008;336:1129–1131
25. Brodsky D, Smith C. A structured approach to teaching medical procedures. NeoReviews. 2012;13:e635–e641
26. Paulman P. A simple five-step method for teaching clinical skills. Fam Med. 2001;33:577–578
27. Cook DA, Hamstra SJ, Brydges R, et al. Comparative effectiveness of instructional design features in simulation-based education: Systematic review and meta-analysis. Med Teach. 2013;35:e867–e898
28. Srivastava G, Roddy M, Langsam D, Agrawal D. An educational video improves technique in performance of pediatric lumbar punctures. Pediatr Emerg Care. 2012;28:12–16
29. Ventres WB, Senf JH. Introducing a procedure using videotape instruction: The case of the lateral birth position. Fam Med. 1994;26:434–436
30. Peyton JWR. Teaching and Learning in Medical Practice. Rickmansworth, UK: Manticore Publishing House Europe Limited; 1998
31. Bjerrum AS, Hilberg O, van Gog T, Charles P, Eika B. Effects of modelling examples in complex procedural skills training: A randomised study. Med Educ. 2013;47:888–898
32. Kessler DO, Arteaga G, Ching K, et al. Interns’ success with clinical procedures in infants after simulation training. Pediatrics. 2013;131:e811–e820
33. Gaies MG, Morris SA, Hafler JP, et al. Reforming procedural skills training for pediatric residents: A randomized, interventional trial. Pediatrics. 2009;124:610–619
34. Berg D, Berg K, Riesenberg LA, et al. The development of a validated checklist for thoracentesis: Preliminary results. Am J Med Qual. 2013;28:220–226
35. Berg K, Riesenberg LA, Berg D, et al. The development of a validated checklist for radial arterial line placement: Preliminary results. Am J Med Qual. 2014;29:242–246
36. Riesenberg LA, Berg K, Berg D, et al. The development of a validated checklist for femoral venous catheterization: Preliminary results. Am J Med Qual. 2014;29:445–450
37. Riesenberg LA, Berg K, Berg D, et al. The development of a validated checklist for paracentesis: Preliminary results. Am J Med Qual. 2013;28:227–231
38. Berg K, Riesenberg LA, Berg D, et al. The development of a validated checklist for adult lumbar puncture: Preliminary results. Am J Med Qual. 2013;28:330–334
39. Riesenberg LA, Berg K, Berg D, et al. The development of a validated checklist for nasogastric tube insertion: Preliminary results. Am J Med Qual. 2013;28:429–433
40. Huang GC, Newman LR, Schwartzstein RM, et al. Procedural competence in internal medicine residents: Validity of a central venous catheter insertion assessment instrument. Acad Med. 2009;84:1127–1134
41. Ericsson KA, Krampe RT, Tesch-Romer C. The role of deliberate practice in the acquisition of expert performance. Psychol Rev. 1993;100:363–406
42. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 suppl):S70–S81
43. Ericsson KA. Deliberate practice and acquisition of expert performance: A general overview. Acad Emerg Med. 2008;15:988–994
44. Adams JA. A closed-loop theory of motor learning. J Mot Behav. 1971;3:111–149
45. Adams J. A historical review and appraisal of research on the learning, retention and transfer of human motor skills. Psychol Bull. 1987;101:41–47
46. Kessler DO, Auerbach M, Pusic M, Tunik MG, Foltin JC. A randomized trial of simulation-based deliberate practice for infant lumbar puncture skills. Simul Healthc. 2011;6:197–203
47. Sawyer T, Sierocka-Castaneda A, Chan D, Berg B, Lustik M, Thompson M. Deliberate practice using simulation improves neonatal resuscitation performance. Simul Healthc. 2011;6:327–336
48. Clapper T, Kardong-Edgren S. Using deliberate practice and simulation to improve nursing skills. Clin Simul Nurs. 2012;8:e109–e113
49. Issenberg SB, McGaghie WC, Gordon DL, et al. Effectiveness of a cardiology review course for internal medicine residents using simulation technology and deliberate practice. Teach Learn Med. 2002;14:223–228
50. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011;86:706–711
51. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Medical education featuring mastery learning with deliberate practice can lead to better health for individuals and populations. Acad Med. 2011;86:e8–e9
52. Cohen ER, Barsuk JH, Moazed F, et al. Making July safer: Simulation-based mastery learning during intern boot camp. Acad Med. 2013;88:233–239
53. Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med. 2006;21:251–256
54. Barsuk JH, McGaghie WC, Cohen ER, O’Leary KJ, Wayne DB. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med. 2009;37:2697–2701
55. Barsuk JH, McGaghie WC, Cohen ER, Balachandran JS, Wayne DB. Use of simulation-based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit. J Hosp Med. 2009;4:397–403
56. Barsuk JH, Ahya SN, Cohen ER, McGaghie WC, Wayne DB. Mastery learning of temporary hemodialysis catheter insertion by nephrology fellows using simulation technology and deliberate practice. Am J Kidney Dis. 2009;54:70–76
57. Barsuk JH, Cohen ER, Caprio T, McGaghie WC, Simuni T, Wayne DB. Simulation-based education with mastery learning improves residents’ lumbar puncture skills. Neurology. 2012;79:132–137
58. Cook DA, Brydges R, Zendejas B, Hamstra SJ, Hatala R. Mastery learning for health professionals using technology-enhanced simulation: A systematic review and meta-analysis. Acad Med. 2013;88:1178–1186
59. Kamdar G, Kessler DO, Tilt L, et al. Qualitative evaluation of just-in-time simulation-based learning: The learners’ perspective. Simul Healthc. 2013;8:43–48
60. Downing SM. Validity: On meaningful interpretation of assessment data. Med Educ. 2003;37:830–837
61. Downing SM. Reliability: On the reproducibility of assessment data. Med Educ. 2004;38:1006–1012
62. Cook D, Beckman T. Current concepts in validity and reliability for psychomotor instruments: Theory and application. Am J Med. 2006;119:e7–e16
63. Jelovsek JE, Kow N, Diwadkar GB. Tools for the direct observation and assessment of psychomotor skills in medical trainees: A systematic review. Med Educ. 2013;47:650–673
64. Ahmed K, Miskovic D, Darzi A, Athanasiou T, Hanna GB. Observational tools for assessment of procedural skills: A systematic review. Am J Surg. 2011;202:469–480.e6
65. Hales B, Terblanche M, Fowler R, Sibbald W. Development of medical checklists for improved quality of patient care. Int J Qual Health Care. 2008;20:22–30
66. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 suppl):S63–S67
67. Rethans JJ, Norcini JJ, Barón-Maldonado M, et al. The relationship between competence and performance: Implications for assessing practice performance. Med Educ. 2002;36:901–909
68. ten Cate O. Trust, competence, and the supervisor’s role in postgraduate training. BMJ. 2006;333:748–751
69. ten Cate O. Nuts and bolts of entrustable professional activities. J Grad Med Educ. 2013;5:157–158
70. Beard J. Workplace-based assessment: The need for continued evaluation and refinement. Surgeon. 2011;9(suppl 1):S12–S13
71. Ali JM. Getting lost in translation? Workplace based assessments in surgical training. Surgeon. 2013;11:286–289
72. Smith CC, Gordon CE, Feller-Kopman D, et al. Creation of an innovative inpatient medical procedure service and a method to evaluate house staff competency. J Gen Intern Med. 2004;19(5 pt 2):510–513
73. Lenhard A, Moallem M, Marrie RA, Becker J, Garland A. An intervention to improve procedure education for internal medicine residents. J Gen Intern Med. 2008;23:288–293
74. Huang GC, Smith CC, York M, Weingart SN. Asking for help: Internal medicine residents’ use of a medical procedure service. J Hosp Med. 2009;4:404–409
75. Mourad M, Kohlwes J, Maselli J, Auerbach AD; MERN Group. Supervising the supervisors—procedural training and supervision in internal medicine residency. J Gen Intern Med. 2010;25:351–356
76. Huang GC, Smith CC, Gordon CE, et al. Beyond the comfort zone: Residents assess their comfort performing inpatient medical procedures. Am J Med. 2006;119:71.e17–71.e24
77. Review Committee for Otolaryngology, Accreditation Council for Graduate Medical Education. Required minimum number of key indicator procedures for graduating residents. Published March 6, 2013. http://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramResources/280_Required_Minimum_Number_of_Key_Indicator_Procedures.pdf. Accessed February 27, 2015
78. Review Committee for Thoracic Surgery, Accreditation Council for Graduate Medical Education. Case requirements for residents beginning residency. Published 2012. http://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramResources/460_CaseRequirements_ResidentsBeginning_OnorAfter_July012012.pdf. Accessed March 11, 2015
79. Review Committee for Emergency Medicine, Accreditation Council for Graduate Medical Education. Frequently asked questions: Emergency medical services. Published 2014. http://www.acgme.org/acgmeweb/Portals/0/PDFs/FAQ/112_emergency_medical_svcs_FAQs.pdf. Accessed February 27, 2015
80. Program Directors of Pediatric Urology Programs, Review Committee for Urology, Accreditation Council for Graduate Medical Education. Minimum numbers memorandum. Published July 1, 2013. http://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramResources/480_Memo_Ped_Uro_Operative_Minimum_Numbers.pdf. Accessed February 27, 2015
81. Filho G. The construction of learning curves for basic skills in anesthetic procedures: An application for cumulative sum method. Anesth Analg. 2002;95:411–416
82. Dalal P, Dalal G, Pott L, Bezinover D, Prozesky J, Murray WB. Learning curves of novice anesthesiology residents performing simulated fiberoptic upper airway endoscopy. Can J Anesth. 2011;58:802–809
83. Ward ST, Mohammed MA, Walt R, Valori R, Ismail T, Dunckley P. An analysis of the learning curve to achieve competency at colonoscopy using the JETS database. Gut. 2014;63:1746–1754
84. Levitt LK. Use it or lose it: Is de-skilling evidence-based? Rural Remote Health. 2001;1:81
85. Kneebone RL, Scott W, Darzi A, Horrocks M. Simulation and clinical practice: Strengthening the relationship. Med Educ. 2004;38:1095–1102
86. Kovacs G, Bullock G, Ackroyd-Stolarz S, Cain E, Petrie D. A randomized controlled trial on the effect of educational interventions in promoting airway management skill maintenance. Ann Emerg Med. 2000;36:301–309
87. Niles D, Sutton RM, Donoghue A, et al. “Rolling Refreshers”: A novel approach to maintain CPR psychomotor skill competence. Resuscitation. 2009;80:909–912
88. Sam J, Pierse M, Al-Qahtani A, Cheng A. Implementation and evaluation of a simulation curriculum for paediatric residency programs including just-in-time in situ mock codes. Paediatr Child Health. 2012;17:e16–e20
89. Bender J, Kennally K, Shields R, Overly F. Does simulation booster impact retention of resuscitation procedural skills and teamwork? J Perinatol. 2014;34:664–668
90. Steadman RH, Huang YM. Simulation for quality assurance in training, credentialing and maintenance of certification. Best Pract Res Clin Anaesthesiol. 2012;26:3–15

Appendix 1: Example Template for a Procedural Skills Assessment Checklist

© 2015 by the Association of American Medical Colleges