The apprenticeship model of surgical education is being transformed by new technology and by the need to objectively assess and confirm the competence of trainees.1 Medical educators in general, and surgical educators in particular, have become increasingly interested in gaining a better understanding of what constitutes expert judgment, so as to apply teaching methods that enhance the acquisition of this important component of competence and expertise.2–4
Simulation models, techniques, and applications have been adopted by medical educators to enhance and accelerate the learning of clinical skills and to assess proficiency. Although simulation applications have demonstrated their value in technical skill acquisition and team building, they have not yet fulfilled their potential role in the attainment of (expert) surgical judgment.
Therefore, in this article I explore some of the recent findings of educational psychologists on what constitutes expert judgment and review how surgical investigators are applying the principles of cognitive task analysis (CTA) and a form of error analysis referred to as human reliability assessment to uncover characteristics of expert judgment in the performance of surgical procedures. I then examine how existing and future simulation models may be applied to teach, learn, and assess expert judgment in surgeons and surgical trainees.
What Is Expert Judgment?
In the field of cognitive psychology, experts are defined as individuals capable of a variety of advanced skills, including constant, rapid, accurate, effective diagnosis and solving of complex problems.5,6 They are able to quickly apply domain-specific rules about the condition or problem at hand and maintain an awareness of the consequences of the application of problem-solving strategies.7 Novices, in contrast, have severe limitations on conscious processing but develop expertise through a lengthy regimen of motivated, deliberate practice.8 Deliberate practice can lead to expert performance when highly challenging increases in the demands and difficulty of tasks are accompanied by constant corrective feedback. Job experience is not necessarily deliberate practice if it lacks progressive and varied challenges.
Educational psychologists often refer to the Dreyfus and Dreyfus9 model of staged acquisition of expertise, which includes (1) novice, (2) advanced beginner, (3) competence, (4) proficiency, and (5) expertise. The five stages of this progression mirror the five levels of surgical training—that is, (1) student or postgraduate year 1, (2) postgraduate year 2, (3) postgraduate years 3 and 4, (4) chief resident, and (5) fellow or attending—during which progressive phases of learning occur. Skill-guided learning results in knowing how, rule-guided learning results in knowing that, and experience-guided learning results in knowing when to do or not to do something, or when to alter the plan or procedure.3 Furthermore, the development and display of automatic use of a behavior or technique without reflective effort, or automaticity, increases with each stage of development.3
Purposeful, goal-directed behavior incorporates both automatic, nonanalytic behavior and mentally effortful, analytic behavior, in varying proportions.10 Expertise is characterized by effective interfacing between these two modes of behavior.11 Nonanalytic capacity develops as a result of scripting and scenarios related to pattern recognition. When conducting an interventional procedure, the experienced individual performs the routine steps comprising the basic elements of the procedure with some level of attention but without analysis. This automatic processing requires less attentional capacity, which, in the expert, frees up cognitive resources to invest in risk awareness, problem solving, and anticipating the results of the intervention. The individual who makes exclusive use of nonanalytic resources and automatic processes is unlikely to manage novel or unusual situations expertly and has been defined by Bereiter and Scardamalia10 as the “experienced non-expert.”
In Bereiter and Scardamalia’s schema,10 the experienced nonexpert will adapt the present problem to known solutions rather than adapting new solutions to the present problem. Such individuals are otherwise known as technicians “who perform well on routine problems by unreflectively and automatically applying standard assumptions and techniques,” but they do “not display creativity in finding solutions to ill-defined or unusual problems.”3 Experts “identify the subtle complexities of situations”3; through an effortful process, they develop a deeper understanding of the problem and arrive at a better solution that avoids error and a poor outcome.10
Lack of situational awareness leads to errors in judgment: An action decision may be correct for the perceived circumstance but result in error because the actual situation is not as perceived.12,13 (A common example of such an error in general surgery is the bile duct injury that occurs during “routine” laparoscopic cholecystectomy.) Experts may devolve or revert into experienced nonexperts through burnout, disillusionment, or complacency as well as through fatigue, physiologic (e.g., hypoglycemia) or pharmacologic (e.g., alcohol) effects, or excessive anxiety.14–16 In addition, cognitive capacity is limited at any moment in time: The need to perform multiple tasks simultaneously (i.e., technical tasks that demand effortful performance on top of analyzing and being cognitively aware of situational complexities) can result in a failure of performance.17,18
According to Schön,19 experts typically demonstrate three characteristics in action: knowing-in-action (automatic know-how), reflection-in-action (ability to improvise), and reflection-on-action (post hoc reasoning about the experience, which informs future performance). Moulton et al3 state that the expert engages automatic resources in action and uses the cognitive resources that thereby become available to continuously evaluate the situation, assessing how effectively his or her automatic resources are functioning; he or she is able to recognize when it is necessary to “slow down,” that is, when to engage in more effortful, analytic behavior.3 The experienced nonexpert, however, may not invest these additional cognitive resources and may fail to recognize when it is necessary to slow down. In surgery, we teach this conversion from routine technical behavior to analytic, reflective behavior by slowing or stopping the progress of an operation to focus on analysis of the hazard risk when we encounter a critical point. Or, as one professor of surgery once explained to me, “the secret of being a good surgeon is knowing when to speed up, and knowing when to slow down.”20
Experts demonstrate the ability to deal with uncertainty by naming (observing and/or recognizing the unusual or abnormal) and framing an abnormal situation correctly (understanding the significance of the recognized abnormality).19,21 Therefore, an important characteristic of expert judgment is slowing down when it is necessary “to take the time to ensure that the muddy problems of practice will be correctly named and framed.”3 This transition from automatic, nonreflective activities to cognitive awareness and analysis becomes routinized with increasing experience, such that the expert surgeon may be only minimally aware of when he or she shifts from one mode to the other.3
To understand the process of slowing down, Moulton et al22 interviewed 28 surgeons who were regarded as experts. The surgeons were asked to recall moments when an error occurred in the course of an operation and they found themselves saying, “I can’t believe I just did that,” and to reflect on how those moments informed their subsequent approaches to similar problems. The surgeons acknowledged the role that slowing down plays in exercising analytic tools and expertise. Their comments enabled the investigators to identify planned and unplanned initiators, as well as inhibitors, of the transition (slowing down) to a more effortful, analytic mode of behavior. Planned initiators included operative procedure-specific steps and patient-specific abnormalities that were identified in advance, whereas unplanned initiators resulted from situation-specific events, such as complications or events that required a deviation from the operative plan. Factors inhibiting or impairing the transition included physical factors such as fatigue, personality factors such as overconfidence, and situational factors such as difficulties in team management and communication, competing cognitive input (e.g., chatter among team members, loud music), and time constraints (e.g., “I’ve got to be at a meeting with the chairman/dean in 10 minutes”).
An inherent part of expert judgment, therefore, appears to be the transition from routine, automated processes to focused, analytic behavior, characterized by slowing down at critical decision points.3,21 Slowing down allows the expert surgeon to engage in analysis, teaching, and self-reflection, which enhances patient safety. A surgeon’s failure to transition from automatic/routine to attentive/more effortful modes of function (due to lack of situational awareness) can lead to surgical error and patient harm. Over time, and with experience, this transition itself becomes automated or routinized and requires less effort (and time), which frees up more cognitive capacity for expanded situational awareness, self-awareness, reflection, and innovation, leading to improved outcomes.23
To summarize, expert judgment consists of superior problem-solving skills, which the expert invokes at appropriate points of increased risk during a procedure. These skills are observable as transitions from routine, automatic technical behavior to deliberate, analytical, reflective behavior. Do we as educators identify this behavior as characteristic of expert judgment when we see it? And how do we know if a trainee’s (slowing-down) transitions are occurring at appropriate times and for the right reasons?
How Do We Assess Clinical Judgment?
Classically, attending surgeons’ assessments of whether trainee surgeons possess “good judgment” have been made in a subjective fashion. This approach is problematic because there is potential for bias due to perceived overall performance (the “halo effect”), because teachers have poor insight into their own judgmental processes, and because unstructured global assessments do not provide scope or focus for formative feedback to improve performance.24 Furthermore, in the current surgical training environment, it is rare for a senior surgeon to have extensive one-on-one contact with a trainee so as to determine with confidence the level of judgmental skills the trainee has attained.
Various methods have been developed that attempt to provide an objective and standardized approach to assessing clinical skills. These include the objective structured clinical examination,25 the objective structured assessment of technical skills,26 and the global operative assessment of laparoscopic skills (GOALS),27 which can be modified to a particular procedure, such as groin hernia repair28 or incisional hernia repair.29 Performance scores measured by the GOALS tool correlate highly with performance on a simulator and performance in the operating room, but each of these objective assessment tools places more emphasis on evaluating technical skills than on assessing judgmental ability. Wohaibi et al30 have described an operative assessment tool that uses an 11-point scale to assess cognitive skills and organizational ability (defined as “trouble-shooting ability”) in addition to technical ability and that correlates well with trainee experience and expertise. It is completed jointly by the attending and resident surgeon via a Web-based surgical performance rating system (OpRate) at the completion of each procedure. Validation of this tool’s usefulness in a simulated environment, where trainee awareness, analysis, and decision making can be assessed in isolation, is currently in progress.
The oral examination required for certification by the American Board of Surgery is the prototypical structured model for the assessment of a trainee’s judgment and organizational skills. Many residency training programs have adopted this assessment tool in the form of “mock oral exams” administered to middle- and upper-level residents annually. The oral examination format, however, carries its own set of features that may influence examinee performance, and success on such a high-stakes examination can be affected by the test taker’s anxiety and familiarity with the format. Moreover, in my experience, the oral examination model is not a test of intraoperative situational awareness or decision making but, rather, is heavily weighted toward rules-based facts and algorithms.
Surgical trainees’ awareness of predictable or unpredictable situations that contain hazards and their ability to assess strategies to avoid risk or to alter the operative plan are therefore not measured routinely in a structured, transparent, procedure-specific, instructor-independent manner. To create a tool to measure trainees’ judgment, a codified “best practice” set of behaviors needs to be identified.
Methods to Identify the Essential Elements of Expert Judgment
The difference between skilled, experienced workers (experienced nonexperts) and advanced experts is the ability of the latter to recognize and solve problems. Even the most advanced experts, however, are largely unaware of the automated strategies that guide most of their problem solving.31 One approach to modeling a cognitive system is to trace the problem-solving process to identify points where limited knowledge and resources can lead to breakdowns given the demands of a specific problem.32 CTA is a system of assessing and defining the steps involved in expert task performance. It was developed by industrial psychologists in the 1980s and 1990s to address training needs in industrial, transportation, and military sectors so as to create job expertise in certain complex tasks in a short time frame.6 It has proven to be a successful model for expert training; it has been said that “50 hours of training based on CTA is the equivalent of 5 years of advanced job knowledge.”7 The impetus for the development of CTA was the realization that it takes trainees too long to achieve expertise under the apprenticeship model, which is flawed because situational variations requiring expert judgment occur too infrequently, because teaching methods are subjective and imprecise (task teachers do not know how or what to teach), and because learning is instructor dependent and, therefore, nontransferable.6
CTA provides a set of methods and techniques that specify the cognitive structures and processes associated with task performance. Or, as Woods33 explained, “Cognitive task analysis is like learning the secret of a magic trick; once you know the trick and know what to look for, you see things you didn’t notice when you didn’t know what to look for.” The CTA process involves interviewing experts to identify a representative sample of domain-specific problems that need to be solved (List 1). The concepts, principles, and practices that characterize expert performance can then be mapped or scripted for teaching and learning by nonexperts (for an example, see Appendix 1). CTA can be laborious because an investigator must interrogate experts who perform the procedure and may need to interrogate other experts who critique the experts who perform the procedure.34
CTA has been applied to surgical procedures,35–37 surgical simulation development,38,39 and surgical curriculum development.40 Velmahos et al35 used CTA to analyze the task of central line insertion and identified seven critical steps and the decisions associated with successful performance of the procedure. Sullivan et al36 used CTA to deconstruct the automated and thoughtful skills of experts in percutaneous tracheostomy, revealing multiple steps in the procedure, including key decision points and the cues that inform these decisions. They found that residents trained with a CTA-based curriculum performed better than those trained using standard methods, not only immediately but on reassessment six months later. Using CTA, Jacklin et al37 identified 18 key decision points in the management of gallstone disease and then formed a map of care from the point of initial patient evaluation through recovery from cholecystectomy. To do this, a surgeon and a behavioral psychologist interviewed expert surgeons and characterized decision-making steps and (automated) procedural steps. Although the two interviewers agreed on the definition of the majority of the steps, the surgeon assessed some steps as judgment based, whereas the psychologist graded them as procedure based. This suggests that certain crucial procedural steps, such as dissecting the triangle of Calot and securing the cystic duct, require a combination of technical and judgmental skills. These authors concluded that the concept of surgical competency should be broadened beyond the surgeon’s ability to perform the technical steps of the operation to include his or her ability to make appropriate judgments.
CTA is being applied to an expanding list of surgical procedures so as to define a standard expert method for performing them and the critical decision steps embedded in the expert method.39 As Jacklin et al,37 Pugh et al,39 DaRosa et al,40 and others have pointed out, the involvement of expert surgeons as objective investigators in such efforts is essential for the success of the CTA process.
An alternative method for modeling a complex set of activities that requires expert judgment to avoid harm is to identify the steps of a procedure which contain a high risk of error. Joice et al41 analyzed 200 videotaped laparoscopic cholecystectomies to study the factors contributing to intraoperative errors, using a technique called human reliability analysis (HRA) in which they created a predefined set of error modes (e.g., wrong sequence, wrong instrument, too fast). Most errors were seen in the steps related to identifying the cystic artery (dissecting the triangle of Calot) and securing the cystic duct. The authors concluded that HRA is useful in detecting procedural steps in which errors are commonly encountered and in identifying technical and cognitive remediation approaches. Tang et al42,43 used the same format and an expanded version of HRA, called observational clinical human reliability assessment (OCHRA), to analyze differing techniques used at steps with high risk for error to determine the best (safest) and worst (most error-prone) technical methods of performing a task. These investigators concluded that OCHRA systems should be incorporated into technical training to increase awareness of risk and to identify steps in which decreased speed and increased cognitive resources are required to minimize the chance of error. OCHRA is labor-intensive, however, because professional/expert scoring of each recorded procedure is required. As a result, the method has not gained widespread use.
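The tallying at the heart of the HRA approach can be sketched in a few lines of code. The error modes below are those named above, but the step names and the recorded observations are invented for illustration; in practice, each entry would come from expert review of a videotaped procedure.

```python
from collections import Counter

# Predefined error modes, as in the HRA approach described above
ERROR_MODES = {"wrong_sequence", "wrong_instrument", "too_fast"}

# Hypothetical observations: (procedural step, error mode) pairs that an
# expert reviewer might record while scoring a videotaped procedure
observations = [
    ("dissect_triangle_of_calot", "too_fast"),
    ("dissect_triangle_of_calot", "wrong_instrument"),
    ("secure_cystic_duct", "wrong_sequence"),
    ("dissect_triangle_of_calot", "too_fast"),
    ("retract_gallbladder", "too_fast"),
]

def error_counts_by_step(obs):
    """Tally error observations per procedural step, rejecting undefined modes."""
    counts = Counter()
    for step, mode in obs:
        if mode not in ERROR_MODES:
            raise ValueError(f"undefined error mode: {mode}")
        counts[step] += 1
    return counts

counts = error_counts_by_step(observations)
# The step with the most recorded errors is flagged for targeted remediation
highest_risk_step = counts.most_common(1)[0][0]
```

Restricting reviewers to a closed set of predefined error modes is what makes the counts comparable across procedures and reviewers; the output simply ranks procedural steps by observed error frequency.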
How Can Simulation Methods Be Applied to Teach, Learn, and Assess Expert Judgment?
Whole-patient (mannequin) simulation models have been used successfully for learning advanced management skills in anesthesia,44,45 emergency medicine and trauma care,46,47 and obstetrical care48 as well as in perioperative surgical care and team building.49 These simulation models focus more on the acquisition of cognitive skills than technical skills, and they successfully engage both experienced practitioners and trainees to sharpen their decision-making and team management skills.
Although attaining expert-level proficiency on a laparoscopic simulator has been shown to result in fewer intraoperative errors,50,51 the currently available high-fidelity surgical simulators do little to help users refine and expand their surgical judgment. This explains, in part, why novices and advanced beginners use such surgical task simulation applications enthusiastically, whereas experts and experienced nonexperts are less motivated to use them.52
Surgical educators and organizations have begun to address the shortcomings of these simulation models by pairing rules-guided learning with technical skill acquisition in a simulated environment. In a study of simulated laparoscopic ventral hernia repair, Pugh et al53 showed that adding instructor feedback on error prevention significantly improved residents’ subsequent intraperformance decision making. In Kohls-Gatzoulis and colleagues’54 study of arthroplasty training, residents who received cognitive skills training in place of some technical skills practice sessions showed improved ability to detect error, plan next steps, and correctly execute the procedure when compared with control group residents who had received only technical instruction.
In cooperation with the Society of American Gastrointestinal and Endoscopic Surgeons, Fried et al55,56 developed the Fundamentals of Laparoscopic Surgery (FLS) course, which emphasizes the attainment of basic laparoscopic skills and understanding. The FLS course includes technical exercises completed on a box-trainer model as well as a cognitive training curriculum, the mastery of which is assessed by a multiple-choice test. The American Board of Surgery now requires all general surgery residents to complete the FLS course successfully.57 The scope of technical and cognitive demands is such that most third-year residents are able to obtain a passing score.
To teach and assess more advanced clinical management skills, the Society for Surgery of the Alimentary Tract has partnered with a Web-based education company to develop a “virtual surgical patient” evaluation tool (www.discoursellc.com). They have created a series of clinical scenarios that combine text, static and video imaging, and pull-down data menus that allow the user to determine what information is needed for clinical decision making. The tool assesses the user’s management skills by offering options at key steps in the scenario. The “branching” design allows the user to proceed down paths that may not be correct. At the end of the case, the tool generates a score that reflects the user’s overall judgmental proficiency. This system emphasizes cognitive rather than technical skills, and it can be used to assess the expertise of residents as well as practicing surgeons.58 Zendejas et al59 recently reported reductions in patient complications and length of hospital stay following the addition of a simulation-based training protocol that included Web-based cognitive simulation exercises and practice to proficiency on an extraperitoneal hernia simulator.
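The branching design can be illustrated with a minimal sketch: each node in the scenario offers management options, each option carries a score and leads to a next node, and the user's cumulative score falls as incorrect paths are taken. The clinical content, node names, and point values below are hypothetical, not taken from the actual tool.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One decision point in a branching clinical scenario."""
    prompt: str
    # choice -> (score awarded, next node; None ends the case)
    options: dict = field(default_factory=dict)

end = None
duct_injury = Node("Bile leak recognized postoperatively", {})
cholangiogram = Node("Cholangiogram shows normal anatomy",
                     {"complete cholecystectomy": (2, end)})
start = Node("Unclear anatomy in the triangle of Calot",
             {"obtain cholangiogram": (2, cholangiogram),
              "clip and divide presumed cystic duct": (-3, duct_injury)})

def run_case(node, choices):
    """Follow the user's choices through the branch tree, summing scores."""
    total = 0
    for choice in choices:
        score, node = node.options[choice]
        total += score
        if node is None:
            break
    return total

score = run_case(start, ["obtain cholangiogram", "complete cholecystectomy"])
```

Because incorrect branches remain traversable (here, dividing the presumed duct leads to the injury node rather than halting the case), the design can distinguish a user who recognizes and recovers from a wrong turn from one who does not.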
Web-based and computer software applications have become increasingly popular as a means to teach and learn clinical skills. Cognitive simulation applications are one type of Web-based model in which variations in data in simulated scenarios can be accessed to assess clinical problem-solving behavior. This permits the mapping of the cognitive demands imposed by the situation to assess the response of the problem-solving operator. As described by Windsor,60 cognitive simulators are Web- or software-based, combine multiple media (text, Internet, audiovisual), use prerendered simulation (video or virtual reality), allow testing of knowledge and decision making, and provide feedback based on relevant metrics. The Web-based Integrated Cognitive Simulator (www.simtics.com) and the software-based SimPraxis trainers (www.redllamainc.com) are two examples of this developing type of simulation that promotes both cognitive rehearsal and technical rehearsal of operative procedures and can be used anytime, by multiple users, close to the operating room in location and time.
Pugh et al39 have used CTA to adapt box-trainer systems and models of open surgical procedures to assess intraoperative decision making for ventral hernia repair, mediastinal lymphadenectomy, pancreaticojejunostomy, and intestinal stoma creation. By combining direct feedback and instruction by expert observers during simulated procedures that are based on CTA-guided procedure analysis, they found that users significantly improved their simulated intraoperative decision making and more correctly performed the simulated procedure. Marshall et al61 reported a similar application of CTA-based methods to assess and teach the decision-making steps in chest wall tumor resection. Such simulation models allow the instructor to stop the procedure to probe the learner’s cognitive processes and assess his or her situational awareness. Direct interrogation allows the instructor to assess whether a trainee’s slowing down in action is a result of appropriate awareness and analysis or whether it is merely dithering. Immediate, formative feedback from instructors results in improvements in surgical planning, recognition of possible complications, error prevention, and error management.
Cognitive rehearsal of a procedure is used routinely by expert surgeons, but it is more difficult for advanced beginners and inexperienced but technically proficient trainees. Simulation models that promote cognitive rehearsal in addition to technical rehearsal have been shown to improve surgical outcomes.62 Cognitive rehearsal of an event or procedure can take the form of mental visualization, a technique that has been shown to improve the performance of professional athletes.63 Mental practice can also take the form of a structured simulation or depiction of a procedure. Arora et al demonstrated that a mental practice protocol significantly improved surgical residents’ performance of simulated surgery64 and of laparoscopic cholecystectomy.65
Currently, therefore, the surgical simulation methods that are the most effective for teaching, learning, and assessing the essentials of expert surgical judgment consist of box-trainer and low-fidelity, open-surgery models coupled to Web-based applications that incorporate a programmed map or algorithm of the steps and tasks involved in the correct performance of a procedure. Box-trainer and high-fidelity simulation models have been shown to be equally effective for the acquisition of technical skills in novice trainees,66 which raises the issue of justifying the high cost of high-fidelity devices.60 Using box trainers and synthesized open-surgery models to evaluate and teach judgment requires the presence of expert instructors, however, which raises issues of the cost and means to support the instructors’ time commitment. Thus, the challenge for high-fidelity simulator development, as noted by Satava,67 remains the incorporation of training and assessing judgment, so as to provide skills-based, rules-based, and knowledge-based training that does not require the presence of an instructor.
Challenges and Recommendations
As Spencer stated, “An operation calls for 70–75% decision making and 20–25% dexterity.”68 Opportunities to assess trainees’ critical decision making are limited in the operating room, however; such assessments are better performed in a simulated surgical environment. For simulation applications to be effective in learners’ acquisition and instructors’ assessment of judgmental or knowledge-based skills, validated cognitive and technical skills criteria need to be codified for each “core” procedure in a surgical curriculum. This is a challenge, however, because no two senior surgeons perform the same operation in exactly the same way. Therefore, checkpoints (e.g., steps with high error risk, key decision points) must be identifiable, and the surgeon’s level of situational awareness needs to be measurable.69 The response to each critical step must be recorded, scored, and displayed for the instructor to provide corrective feedback and to inform the trainee’s future decision making. It must also be possible to introduce anomalies due to anatomic or pathologic variations so that the challenges and demands on the user can be increased progressively as skill attainment allows.
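The recording-and-scoring loop described above can be given a rough sketch: trainee responses at predefined critical steps are compared against a codified expected behavior, and the resulting report is what the instructor would review to target feedback. The steps and expected responses below are invented examples, not a validated criterion set.

```python
# Hypothetical codified "best practice" responses at critical steps of a
# procedure; a real set would come from CTA-derived, validated criteria
checkpoints = {
    "identify critical view of safety": "confirmed",
    "assess for aberrant anatomy": "checked",
    "decide on conversion threshold": "stated",
}

def score_responses(responses):
    """Compare trainee responses with the expected behavior at each step."""
    report = {}
    for step, expected in checkpoints.items():
        actual = responses.get(step, "missed")
        report[step] = (actual == expected, actual)  # (passed?, what was done)
    return report

# A trainee who handled one step correctly, botched one, and skipped one
report = score_responses({
    "identify critical view of safety": "confirmed",
    "assess for aberrant anatomy": "skipped",
})
passed = sum(ok for ok, _ in report.values())
```

Recording the actual response alongside the pass/fail flag matters for formative feedback: a step that was attempted incorrectly calls for different remediation than one that was never addressed.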
The successes reported by some surgical educators who have incorporated these principles into training exercises paired with “homegrown” box trainers and open surgical models indicate that expert judgment is teachable and transferable to trainees outside the operating room. The incorporation of Web-based clinical training sites that permit technical and cognitive rehearsal of procedures is likely to further enhance the transfer of judgmental expertise in the simulated environment. Yet, until high-fidelity simulators incorporate a validated “best practice” for each core procedure with metrics indicating expert performance and a method to score user performance, dedicated surgical teachers will continue to be needed to guarantee trainees’ attainment of expert surgical judgment in the simulated environment.
Recommended reading: Suggested readings are listed in Supplemental Digital List 1, available at http://links.lww.com/ACADMED/A91.
Acknowledgments: The author is indebted to the following individuals for helpful discussions related to this topic: Richard H. Bell Jr., MD, Debra A. DaRosa, PhD, Andrew J. Duffy, MD, Gerald M. Fried, MD, David Hananel, David M. Mahvi, MD, Carla M. Pugh, MD, PhD, and Neal E. Seymour, MD.
Other disclosures: Dr. Andersen is coeditor of Schwartz’s Principles of Surgery (published by McGraw Hill) and serves as a consultant to foundations and corporations involved in surgical education and medical technology development.
Ethical approval: Not applicable.
Previous presentation: Presented in part at the Surgicon Congress on Surgical Education; Göteborg, Sweden; September 8, 2011.
1. Satava RM, Gallagher AG, Pellegrini CA. Surgical competence and surgical proficiency: Definitions, taxonomy, and metrics. J Am Coll Surg. 2003;196:933–937
2. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 suppl):S63–S67
3. Moulton CE, Regehr G, Mylopoulos M, MacRae HM. Slowing down when you should: A new model of expert judgment. Acad Med. 2007;82(10 suppl):S109–S116
4. Yule S, Flin R, Paterson-Brown S, Maran N. Non-technical skills for surgeons in the operating room: A review of the literature. Surgery. 2006;139:140–149
5. Cooke NJ. The implications of cognitive task analysis for the revision of the Dictionary of Occupational Titles. In: Camar WJ, ed. Implications of Cognitive Psychology and Cognitive Task Analysis for the Revision of the Dictionary of Occupational Titles. 1992 Washington, DC American Psychological Association:1–25
6. Clark RE, Estes F. Cognitive task analysis for training. Int J Educ Res. 1996;25:403–417
7. Means B, Gott S. Cognitive task analysis as a basis for tutor development: Articulating abstract knowledge representations. In: Psotka J, Massey LD, Mutter SA, eds. Intelligent Tutoring Systems: Lessons Learned. 1988 Hillsdale, NJ Lawrence Erlbaum:201–244
8. Ericsson KA, Krampe RT, Tesch-Romer C. The role of deliberate practice in the acquisition of expert performance. Psychol Rev. 1993;100:363–406
9. Dreyfus HL, Dreyfus SE. Mind Over Machine: The Power of Human Intuition. 1986 New York, NY Free Press
10. Bereiter C, Scardamalia M. The need to understand expertise. Surpassing Ourselves: An Inquiry Into the Nature and Implications of Expertise. 1993 Chicago, Ill Open Court:1–24
11. Eva KW. What every teacher needs to know about clinical reasoning. Med Educ. 2005;39:98–106
12. Neisser U. Cognition and Reality: Principles and Implications of Cognitive Psychology. 1976 San Francisco, Calif WH Freeman
13. Simons DJ. Attentional capture and inattentional blindness. Trends Cogn Sci. 2000;4:147–155
14. Blandford A, Wong BLW. Situation awareness in emergency medical dispatch. Int J Hum Comput Stud. 2004;61:421–452
15. Gaba DM, Howard SK, Small SD. Situation awareness in anesthesiology. Hum Factors. 1995;37:20–31
16. Endsley MR. Towards a theory of situation awareness. Hum Factors. 1995;37:32–64
17. Posner MI, Rossman E. Effect of size and location of informational transforms upon short term retention. J Exp Psychol. 1965;70:496–505
18. Kahneman D. Attention and Effort. Englewood Cliffs, NJ: Prentice-Hall; 1973
19. Schön DA. The Reflective Practitioner: How Professionals Think in Action. New York, NY: Basic Books; 1983
20. Siegler HF. Personal communication with DK Andersen. 1979
21. Schön DA. Educating the Reflective Practitioner: Toward a New Design for Teaching and Learning in the Professions. San Francisco, Calif: Jossey-Bass; 1987
22. Moulton CE, Regehr G, Lingard L, Merritt C, MacRae H. “Slowing down when you should”: Initiators and influences of the transition from the routine to the effortful. J Gastrointest Surg. 2010;14:1019–1026
23. Regehr G, Eva K. Self-assessment, self-direction, and the self-regulating professional. Clin Orthop Relat Res. 2006;449:34–38
24. Jacklin R, Sevdalis N, Harries C, Darzi A, Vincent C. Judgment analysis: A method for quantitative evaluation of trainee surgeons’ judgments of surgical risk. Am J Surg. 2008;195:183–188
25. Van der Vleuten CPM, Swanson DB. Assessment of clinical skills with standardized patients: State of the art. Teach Learn Med. 1990;2:58–76
26. Martin JA, Regehr G, Reznick R, et al. Objective structured assessment of technical skills (OSATS) for surgical residents. Br J Surg. 1997;84:273–278
27. Vassiliou MC, Feldman LS, Andrew CG, et al. A global assessment tool for evaluation of intraoperative laparoscopic skills. Am J Surg. 2005;190:107–113
28. Kurashima Y, Feldman LS, Al-Sabah S, et al. A tool for training and evaluation of laparoscopic inguinal hernia repair: The Global Operative Assessment of Laparoscopic Skills–Groin Hernia (GOALS-GH). Am J Surg. 2011;201:54–61
29. Ghaderi I, Vaillancourt M, Sroka G, et al. Performance of simulated laparoscopic incisional hernia repair correlates with operating room performance. Am J Surg. 2011;201:40–45
30. Wohaibi EM, Earle DB, Ansanitis FE, et al. A new web-based operative skills assessment tool effectively tracks progression in surgical resident performance. J Surg Educ. 2007;64:333–341
31. Gagne ED, Yekovich CW, Yekovich FR. The Cognitive Psychology of School Learning. New York, NY: Harper Collins; 1993
32. Woods DD. Coping with complexity: The psychology of human behavior in complex systems. In: Goodstein LP, Andersen HB, Olsen SE, eds. Mental Models, Tasks, and Errors. London, UK: Taylor & Francis; 1988:128–148
33. Woods DD. Discovering how distributed cognitive systems work. In: Hollnagel E, ed. Handbook of Cognitive Task Design. London, UK: Taylor & Francis; 2003:37–53
34. Miller JE, Patterson ES, Woods DD. Elicitation by critiquing as a cognitive task methodology. Cogn Tech Work. 2006:90–102
35. Velmahos GC, Toutouzas KG, Sillin LF, et al. Cognitive task analysis for teaching technical skills in an inanimate surgical skills laboratory. Am J Surg. 2004;187:114–119
36. Sullivan ME, Brown CVR, Peyre SE, et al. The use of cognitive task analysis to improve the learning of percutaneous tracheostomy placement. Am J Surg. 2007;193:96–99
37. Jacklin R, Sevdalis N, Darzi A, Vincent C. Mapping surgical practice decision making: An interview study to evaluate decisions in surgical care. Am J Surg. 2008;195:689–696
38. Grunwald T, Clark D, Fisher SS, McLaughlin M, Narayanan S, Piepol D. Using cognitive task analysis to facilitate collaboration in development of simulator to accelerate surgical training. In: Westwood JD, Haluck RS, Hoffman HM, Mogel GT, Phillips R, Robb RA, eds. Medicine Meets Virtual Reality 12. Amsterdam, the Netherlands: IOS Press; 2004:114–120
39. Pugh CM, DaRosa DA, Santacatarina S, Clark RE. Faculty evaluation of simulation-based modules for assessment of intraoperative decision making. Surgery. 2011;149:534–542
40. DaRosa DA, Rogers DA, Williams RG, et al. Impact of a structured skills laboratory curriculum on surgical residents’ intraoperative decision-making and technical skills. Acad Med. 2008;83(10 suppl):S68–S71
41. Joice P, Hanna GB, Cuschieri A. Errors enacted during endoscopic surgery—A human reliability analysis. Appl Ergon. 1998;29:409–414
42. Tang B, Hanna GB, Joice P, Cuschieri A. Identification and characterization of technical errors by observational clinical human reliability assessment (OCHRA) during laparoscopic cholecystectomy. Arch Surg. 2004;139:1215–1220
43. Tang B, Hanna GB, Carter F, Adamson GD, Martindale JP, Cuschieri A. Competence assessment of laparoscopic operative and cognitive skills: Objective structured clinical examination (OSCE) or observational clinical human reliability assessment (OCHRA). World J Surg. 2006;30:527–534
44. Gaba DM, DeAnda A. A comprehensive anesthesia simulation environment: Re-creating the operating room for research and training. Anesthesiology. 1988;69:387–394
45. Schwid HA, Rooke GA, Carline J, et al. Anesthesia simulator research consortium: Evaluation of anesthesia residents using mannequin-based simulation—A multiinstitutional study. Anesthesiology. 2002;97:1434–1444
46. Reznek M, Smith-Coggins R, Howard SK, et al. Emergency medicine crisis management (EMCM): Pilot study of a simulation-based crisis management course for emergency medicine. Acad Emerg Med. 2002;10:386–389
47. Kizakevich PN, McCartney ML, Nissman DB, et al. Virtual medical trainer: Patient assessment and trauma care simulator. Stud Health Technol Inform. 1998;50:309–315
48. Maslovitz S, Barkai G, Lessing JB, et al. Recurrent obstetric management mistakes identified by simulation. Obstet Gynecol. 2007;119:1295–1300
49. Heinrichs WL, Youngblood P, Harter PM, Dev P. Simulation for team training and assessment: Case studies of online training with virtual worlds. World J Surg. 2008;32:161–170
50. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance. Ann Surg. 2002;236:458–463
51. Sroka G, Feldman LS, Vassiliou MC, et al. Fundamentals of laparoscopic surgery simulator training to proficiency improves laparoscopic performance in the operating room—A randomized controlled trial. Am J Surg. 2010;199:115–120
52. Boyd KB, Olivier J, Salameh JR. Surgical residents’ perception of simulation training. Am Surg. 2006;72:521–524
53. Pugh C, Plachta S, Auyang E, et al. Outcome measures for surgical simulators: Is the focus on technical skills the best approach? Surgery. 2010;147:646–654
54. Kohls-Gatzoulis JA, Regehr G, Hutchinson C. Teaching cognitive skills improves learning in surgical skills courses: A blinded, prospective, randomized study. Can J Surg. 2004;47:277–283
55. Fried GM, Derossis AM, Bothwell J, et al. Comparison of laparoscopic performance in vivo with performance measured in a laparoscopic simulator. Surg Endosc. 1999;13:1077–1081
56. Swanstrom LL, Fried GM, Hoffman KL, et al. Beta test results of a new system assessing competence in laparoscopic surgery. J Am Coll Surg. 2006;202:62–69
57. American Board of Surgery. Booklet of Information: Surgery 2011–2012. www.absurgery.org/xfer/BookletofInfo-Surgery.pdf. Accessed March 30, 2012
58. Bell RH Jr, Discourse LLC. Personal communication with DK Andersen. September 15, 2011
59. Zendejas B, Cook DA, Bingener J, et al. Simulation-based mastery learning improves patient outcomes in laparoscopic inguinal herniorrhaphy: A randomized controlled trial. Ann Surg. 2011;254:502–511
60. Windsor JA. Role of simulation in surgical education and training. ANZ J Surg. 2009;79:127–132
61. Marshall MB, Wilson BM, Carter YM. Thoracic surgery skill proficiency with chest wall tumor simulator [published online ahead of print March 1, 2011]. J Surg Res. doi:10.1016/j.jss.2011.01.055
62. Immenroth M, Burger T, Brenner J, et al. Mental training in surgical education: A randomized controlled trial. Ann Surg. 2007;245:385–391
63. Vadocz E, Hall C, Moritz SE. The relationship between competitive anxiety and imagery use. J Appl Sport Psychol. 1997;9:241–253
64. Arora S, Aggarwal R, Moran A, et al. Mental practice: Effective stress management training for novice surgeons. J Am Coll Surg. 2011;212:225–233
65. Arora S, Hull L, Sevdalis N, et al. Factors compromising safety in surgery: Stressful events in the operating room. Am J Surg. 2010;199:60–65
66. Diesen DL, Erhunmwunsee L, Bennett KM, et al. Effectiveness of laparoscopic computer simulator versus usage of box trainer for endoscopic surgery training of novices. J Surg Educ. 2011;68:282–289
67. Satava RM. Historical review of surgical simulation—A personal perspective. World J Surg. 2008;32:141–148
68. Spencer FC. Teaching and measuring surgical techniques—The technical evaluation of competence. Bull Am Coll Surg. 1978;63:9–12
69. Dankelman J. Surgical simulator design and development. World J Surg. 2008;32:149–155