In their article in this issue of Academic Medicine, Kyser and colleagues1 document the limited learning opportunities for obstetrics–gynecology residents to master certain types of core procedures, such as forceps and vacuum deliveries. They show convincingly that the frequency of these procedures at many teaching hospitals is remarkably low, resulting in insufficient opportunities for residents to learn by doing under the careful supervision of an expert who can teach the procedure and take over if any unanticipated complications occur. Kyser and colleagues also document that these procedures are challenging and even today are associated with the risk of significant injuries to the mother and the newborn infant, so protected training environments are critical to doctors acquiring and maintaining these skills.
Both the limited learning opportunities available in the clinic and the restrictions on resident work hours have created a real problem for the traditional apprenticeship model of training doctors. Fortunately, studies of expert performance in domains such as music, chess, dance, and sports have shown that the accumulation of realistic experience is not the most effective method to improve performance.2 For example, playing more chess games is not associated with attaining higher levels of chess skill for tournament players. Research shows that, instead, studying and practicing by oneself is the most effective method to improve chess skill.2 When engaging in this type of deliberate practice, individuals work on particular tasks (often recommended by their teacher) to gradually improve some aspect of their performance using problem solving and repetitions with immediate feedback. Typically, the teacher assesses the students’ levels of performance and then recommends training activities to improve targeted aspects of their performance. The students then practice by themselves until their targeted goals are attained. Exactly what these training activities are will, of course, differ as a function of each individual’s expertise, skill level, and idiosyncratic weaknesses, but they always require the execution of a training task with immediate feedback on performance.
The difference between deliberate practice and actual experience is also nicely illustrated in chess. When individuals play a chess game, it may be hours before the game is finished and the winner is determined. Even when a player loses, she may find it difficult to identify which moves cost her the game and thus to generate feedback on what she needs to change to improve her performance in the future. Solitary deliberate practice, on the other hand, often relies on chess computer programs, which today are far superior in skill to any human chess player. Most important, computers can provide more accurate feedback on each chess move and are available at any time for practice. Consequently, motivated chess players can play chess games against a computer program and receive feedback on each move, along with suggestions for better moves that they could have made at that particular point in the game. Computer programs, then, provide a vastly superior learning environment compared with playing games against other human players.
My general point is that educators should consider training doctors outside the context of performing actual tasks in their natural context in the clinic, such as residents carrying out core procedures on real patients in their local hospital. In fact, musicians, dancers, and athletes spend most of their time training by themselves to get ready to exhibit their skills for the first time in front of a large audience, such as an athlete giving a particular performance only once during the finals of a given event in the Olympic Games.
In this commentary, I will argue that training doctors outside the constraints of everyday encounters with patients can be more effective in enhancing their performance. Although the case can be made for promoting deliberate practice in training doctors to perform most types of medical procedures, I will focus on emergency procedures like those discussed by Kyser and colleagues.1 Doctors cannot schedule a forceps or vacuum delivery in advance—these procedures are used when the second stage of a delivery is not proceeding as planned. Even if the attending doctor is skilled in both forceps and vacuum deliveries, a resident might have to attend a great many deliveries before she has the opportunity to practice either procedure. If a resident is keen on accumulating experience with the forceps procedure, for example, she may have to spend hours or days in the delivery ward to get a short period of relevant experience. Let us contrast this situation with that of a musician who desires to master a rapid section of a musical piece that he is scheduled to perform in public. The musician can focus solely on the section giving him problems without having to wait for an opportunity to do so. A resident in the current apprenticeship model, however, is more like an orchestra musician forced to play the entire music program over and over for the chance to practice the brief problem section.
Over a century ago, when the apprenticeship model for surgery was first introduced, the only viable option was to have medical students first observe the experts complete the procedures. Then the students would execute the same procedure under the careful monitoring of the same experts until they were able to execute the procedure independently. Following the famous dictum “see one, do one, teach one,” newly trained doctors would then teach the procedure to the next class of medical students. Since then, significant developments have advanced training methods in virtually all domains of expertise. In music, the emergence of recordings, from gramophone records to iPods, and in other domains, such as sports and dance, the availability of videos of performances have provided trainees with the ability to repeatedly review particular aspects of the performances of national and even world-class professionals. This technology also offers opportunities for a musician, for example, to replay videos of his own performances as well as those of masters, and it can produce more accurate immediate feedback, as illustrated in the chess example above.
Given this technology, several alternative options to the traditional apprenticeship model of learning procedures and skills exist. Today, we can video-record the execution of medical procedures in the clinic. Some aspects of the procedure can be captured better on video than by first-person observation in the clinic by simultaneously recording the events from several points of view, such as a bird’s eye view of the entire team as well as the view of the individual performing the procedure. After digitizing the recording and storing it on a server, an unlimited number of residents could watch it and focus on particular parts, run them repeatedly in slow motion, and make copies of selected portions to bring to their supervisors for an explanation of the recorded events. As I mentioned earlier, recordings of infrequent emergency procedures, such as forceps and vacuum deliveries, would be particularly valuable. If the videos were stored on a server and indexed with relevant starting and ending points, a resident could carefully observe and review in a single afternoon the same number of emergency procedures that would take her weeks, months, or even years to experience in the clinic.
Thousands of years ago, the first libraries were established to collect books and writings to capture and safeguard recorded knowledge. In a similar way, I envision that one could collect these videos of medical events and procedures and index rare and infrequent events that happen unpredictably, such as particular complications during emergency procedures. These videos would be relevant not only to residents, who are learning the procedures, but also to experienced doctors, who want to refresh their knowledge after a period of disuse, and to doctors who anticipate complications with a particular patient. As these videos could provide general information about the patient, the relevant experience of the individual executing the procedure, and the patient’s outcome, researchers could use them to analyze the performance of the doctor completing the procedure. It may even be possible to analyze both the precursors of a decision to execute an emergency procedure and the patient’s outcomes in order to develop safer procedures with better patient outcomes.
A library with pictures and/or digitized images of patients also could be used for training and assessing the accuracy of medical decision making. For example, some of the videos could stop at a point where a critical action has to be taken. With the video stopped, trainees would be required to describe their next action under real-time constraints. After the trainees had recorded their responses, they would be asked to predict what would happen if the particular action about to be performed on the video were taken instead. After they stated their predictions, the video would be played to show what actually happened, thus giving the trainees feedback about the accuracy of their predictions and decision-making abilities.3 This methodology also can be applied to the diagnosis of x-rays and other medical images, where trainees are shown patient information and the image and asked to enter their diagnosis. By using images from past patients, trainees could receive accurate feedback immediately after their diagnosis. Others have implemented this effective technique to teach trainees to identify foot fractures.4,5 Similar systems could be developed for the formative assessment of decision making by trainees, certification of residents, and continuing medical education of doctors.
Even observing numerous emergency procedures and knowing what an expert would do at different decision points is not enough to execute a complex procedure safely with good patient outcomes. The learning of every procedure is associated with a learning curve. For most medical procedures, the critical measure is not speed to complete a typical case but, rather, the ability to handle complications and unusual cases. Furthermore, patient outcomes are often not immediately observable, so, for complex procedures, cancer surgeons, for example, should ideally follow cases for several years to determine successful treatments. These long-term evaluation studies suggest that surgeons may need as many as 100 to 500 completed procedures to attain their best patient outcomes.6 Doctors cannot be kept in training until they attain these levels of performance. In addition, taking part in certain training activities in designed training environments improves performance more efficiently than experiencing more procedures with real patients. Simulators, for example, are increasingly being used to give trainees a safe environment in which to learn. However, simulators must provide trainees with deliberate practice opportunities, as computer programs do for chess players, musicians, and athletes, rather than offering mere experiences of completing procedures. In several influential reviews, McGaghie and colleagues7 showed that the key to improving and transferring trainees’ skills is to design the training according to principles of deliberate practice and to use tasks with immediate feedback and opportunities for corrections and refinements after repetitions. Ideally, trainees’ target performance on a simulator task should be determined by having experts perform the same simulator task, thereby establishing a performance standard that has face validity and will be motivating for trainees.
Deliberate practice with simulators offers specific advantages over supervised practice with real patients. First, isolating the difficult aspects of the procedure and repeatedly working on them is possible without running through the whole, often time-consuming procedure every time—much like expert musicians focusing on the challenging sections of musical pieces. Second, performance can be measured and informative feedback given at any point during the procedure, rather than having to wait until the end of the surgery or other active efforts at caring for the patient. Third, a trainee’s performance in the simulator can be slowed or stopped several times to correct a particular error before the trainee continues with the procedure. Finally, using a simulator allows trainees to focus on the fundamentals of a procedure to learn the best positions and basic actions, which in turn allows them to attain a higher skill level and to perform better.8 The importance of fundamental procedures is well established in music and dance—some performers have to relearn the fundamentals to reach the highest levels and/or deal with overuse injuries resulting from faulty fundamentals.
Interest is increasing in systematically measuring residents’ and doctors’ performance on the job, such as their readiness for conducting procedures after completing experience-based training. One such test showed that the vast majority of neurology residents failed a test of lumbar punctures.9 However, after completing a three-hour simulator session on lumbar punctures that featured deliberate practice and feedback, all of the residents passed the test, and more than 80% passed a retest 12 months later. These and other related findings led Nathan and Kincaid10 to argue in an editorial that “old training methods are no longer enough to ensure the best education and thus the best care for patients.”
In medicine, collecting evidence on the reliable effects of simulator training in emergency procedures on the survival of patients has been difficult because of the scarcity and unpredictability of such events. The U.S. Air Force, however, has a long history of training pilots in emergency procedures using simulators, and they repeat this training on an annual basis. In a unique study, McKinney and Davis11 analyzed the outcomes of 150 mechanical malfunctions experienced by Air Force fighter pilots while flying their jets from 1980 to 1990. The authors asked if the outcomes of these crisis situations were related to prior training. They found that higher estimated amounts of time spent in deliberate practice (when the pilots had practiced the actual malfunctions in the simulators) were significantly associated with better decision making and more successful outcomes. They also found that the skills acquired in successfully handling practiced scenarios did not transfer to unpracticed ones.
Making the newly emerging technology for recording procedures available for educational and research purposes offers a new perspective for medical education, including residency training and continuing education. Many exciting opportunities exist for indexing video recordings of different types of events and preparing opportunities for training and assessing students’ medical decision-making skills. In the future, linking these videos directly to simulators should be possible, so trainees could focus on particular aspects of the procedures and be required to respond to prompts with recordable actions. In addition, these learning systems should allow medical students and residents to train when they are most rested rather than when complications from a protracted surgery unexpectedly require the execution of the procedure. Similarly, they would offer the capability of on-demand training for doctors who have not performed a procedure for a specific length of time. Such systems should allow all levels of practitioners to learn from others’ emergencies and mistakes rather than their own mistakes with real patients. I am convinced that recent and future developments in medical training and simulation, using designed learning and simulation environments, will make medical training more effective. These environments also will support continued professional assessment and training to develop the expert performance2 in medicine that attains the best possible patient outcomes.
1. Kyser KL, Lu X, Santillan D, et al. Forceps delivery volumes in teaching and nonteaching hospitals: Are volumes sufficient for physicians to acquire and maintain competence? Acad Med. 2014;89:71–76
2. Ericsson KA. The influence of experience and deliberate practice on the development of superior expert performance. In: Ericsson KA, Charness N, Feltovich P, Hoffman RR, eds. Cambridge Handbook of Expertise and Expert Performance. Cambridge, UK: Cambridge University Press; 2006
3. Ericsson KA. Deliberate practice and acquisition of expert performance: A general overview. Acad Emerg Med. 2008;15:988–994
4. Pusic M, Pecaric M, Boutis K. How much practice is enough? Using learning curves to assess the deliberate practice of radiograph interpretation. Acad Med. 2011;86:731–736
5. Yarris LM, Gruppen LD, Hamstra SJ, Anders Ericsson K, Cook DA. Overcoming barriers to addressing education problems with research design: A panel discussion. Acad Emerg Med. 2012;19:1344–1349
6. Klein EA, Bianco FJ, Serio AM, et al. Surgeon experience is strongly associated with biochemical recurrence after radical prostatectomy for all preoperative risk categories. J Urol. 2008;179:2212–2216
7. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011;86:706–711
8. Crochet P, Aggarwal R, Dubb SS, et al. Deliberate practice on a virtual reality laparoscopic simulator enhances the quality of surgical technical skills. Ann Surg. 2011;253:1216–1222
9. Barsuk JH, Cohen ER, Caprio T, McGaghie WC, Simuni T, Wayne DB. Simulation-based education with mastery learning improves residents’ lumbar puncture skills. Neurology. 2012;79:132–137
10. Nathan BR, Kincaid O. Does experience doing lumbar punctures result in expertise? A medical maxim bites the dust. Neurology. 2012;79:115–116
11. McKinney EH Jr, Davis KJ. Effects of deliberate practice on crisis decision performance. Hum Factors. 2003;45:436–444