Traditionally, the creation of technical skills curricula has relied on the ability of subject matter experts (SMEs) to articulate the cognitive decisions and steps in a procedure. This poses a significant problem in our current educational model because experts tend to unintentionally omit important information when teaching novices.1–3 This omission occurs because practice over time causes the basic elements of a task to become “automated” and performed largely without conscious awareness.1–3 Recent research has shown that experts may omit up to 70% of the cognitive components and decision steps learners need to successfully perform a task.4–6 This “70% rule” often results in incomplete training protocols and curricula that are neither optimally effective nor efficient.
Cognitive task analysis (CTA) has emerged as a method to unpack the cognitive processing of experts and gain access to the decision strategies that underlie task performance.7 Studies have shown that learners who participated in a CTA-informed curriculum outperformed their peers who received traditional teaching8–11 and that the use of CTA techniques has resulted in increased capture of knowledge and skills.5,6
The most widely used methods of CTA involve a series of structured interviews with three to five subject matter experts. During the interviews, the cognitive task analyst asks the experts to describe the steps of a specific task and asks a series of probing questions that assess the expert’s actions, critical cues, potential error identification, and cognitive decision points. At the end of the interviews, the step-by-step procedure and critical decision points are put into a curriculum document that describes the steps of the task and the cognitive decision points that were identified throughout the procedure.
The objective of this study was to compare the percentage of knowledge that experts omitted when teaching a cricothyrotomy (a procedure used to establish a definitive airway in an emergency situation) and validate the 70% rule. In addition, we sought to determine the percentage of knowledge gained during a CTA interview by comparing the amount of knowledge articulated by experts during the initial free-recall (unprompted) portion of the interview with the percentage of knowledge captured using CTA techniques (prompted) against an aggregated “gold standard.” Thus, our purpose was twofold: to determine the amount of clinical knowledge, action steps, and decision steps that are unintentionally omitted by experts when teaching a cricothyrotomy using traditional teaching methods; and to identify the gaps in expert knowledge during a free-recall interview as compared with a CTA interview protocol.
In this study, we used CTA methodology to capture the expertise of three experienced trauma surgeons. CTA emerged in the 1980s as an extension of traditional behavioral task analysis (BTA).12 BTA was developed during the behavioral psychology era, when behavior was explained as a linear series of single stimulus–response units. Thus, BTA focuses solely on the observable execution of the task, with little or no attention to the underlying cognitive processes that accompany each skill.1 Today, many programs are rooted in a culture of BTA and rely on specific objective checklists that outline the steps of a procedure. The problem is that many of these checklists focus primarily on skill execution and not on the cognitive decisions made throughout the procedure. BTA may overlook the hidden cognitive aspects of a task and may oversimplify a complex task,13 making it ill suited to capturing expertise for curricular content. CTA can help us deconstruct the automated skills of experts into concrete, measurable steps that learners understand, and it can provide information about how decisions are made; more important, CTA can illuminate what information or cues the expert uses to make those decisions.
In 2010, we videotaped three expert trauma surgeons (A, B, C) teaching an open cricothyrotomy procedure to second- and third-year postgraduate surgical residents at the University of Southern California. We instructed each of these subject matter experts (SMEs) to be as thorough as possible when describing the steps and decision points of the task. At the end of the procedure, we asked the experts if they wanted to add any information that they may have unintentionally omitted during the teaching session, as well as if they felt that they had given a complete description of the skill. Each videotape was transcribed verbatim and coded by two investigators (M.S. and K.Y.) using a modified CTA coding scheme developed by Clark and colleagues.14 Each item was coded as the clinical knowledge, action steps, or decision steps that each surgeon described during the teaching session. We defined clinical knowledge as the knowledge of facts, or declarative knowledge, needed to understand the implications for performing or assessing the task. Physical actions (procedural steps) that were recalled were coded as action steps. We defined action steps as any physical motor step that was performed. Action steps that were demonstrated but not described were not included. Decisions that were made during the procedure and described were coded as decision steps. We defined decision steps as any decision or judgment that was made during the procedure. The coders achieved a substantial interrater reliability with a kappa coefficient of 0.82.
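The interrater reliability reported here is Cohen’s kappa, which corrects raw percent agreement for agreement expected by chance. A minimal sketch of the computation (the rater labels below are hypothetical, not the study’s actual codes):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater1)
    # Observed proportion of items on which the raters agree.
    po = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected agreement if each rater labeled independently according
    # to their own category frequencies.
    c1, c2 = Counter(rater1), Counter(rater2)
    pe = sum(c1[cat] * c2.get(cat, 0) for cat in c1) / n ** 2
    return (po - pe) / (1 - pe)

# Hypothetical codes: clinical knowledge (K), action step (A), decision step (D).
r1 = ["K", "A", "A", "D", "K", "A", "D", "A"]
r2 = ["K", "A", "A", "D", "A", "A", "D", "A"]
print(round(cohens_kappa(r1, r2), 2))  # → 0.79
```

A kappa above 0.80, as achieved by the coders in this study, is conventionally interpreted as substantial to almost-perfect agreement.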
After the video transcription, each of the three experts (A, B, C) participated in a CTA interview for the same procedure. We recruited an additional three trauma surgeons (D, E, F) to go through a CTA interview for an open cricothyrotomy. The CTA method employed was Clark’s Concepts, Process, and Principles (CPP) approach.14 This method involves a multistage interview technique that uses multiple SMEs to describe the same procedure, followed by cycles of expert self- and peer-review. All interviews were digitally recorded.
All CTA interviews were conducted by one of us (M.E.S.), who was trained extensively by another author (R.E.C.), a recognized CTA expert. All CTA documents in this study were prepared by M.E.S. and reviewed by R.E.C. During the first phase of the interview (unprompted), we asked the SMEs to describe the performance sequence of all key subtasks necessary to perform the skill. The resulting task diagram, which encompassed four tasks, each with subtasks, served as the guide for the rest of the interviews. The tasks, sequentially, were (1) prepare patient, equipment, and self; (2) make incisions and open airway; (3) place tube in airway; and (4) confirm placement and secure the tube. During this initial phase, the SMEs were asked to freely recall and describe the steps, cognitive decisions, and actions for each subtask identified. We then implemented phase two of the interview (prompted) and asked probing questions to clarify actions and decisions, such as “How do you identify the location landmarks?” or “How do you know if your incision location is correct?” These knowledge elicitation techniques allowed us to gain information about the sequence of actions, decisions, alternatives to consider, criteria for deciding between alternatives, equipment and materials, and performance standards for each of the subtasks. Once the interviews were completed, we transcribed them verbatim, and two investigators coded them using a scheme developed by Clark and colleagues.14 Each rater reviewed each line to identify the actions, conditions (indications, contraindications), accuracy, time, equipment and materials, declarative knowledge (concepts, processes, principles), sensory cues, clinical knowledge, and decision steps. Disagreements were resolved through consensus. The coders achieved a kappa statistic of 0.83.
We used the transcribed CTA interviews to create a procedure protocol for each individual surgeon. Once completed, the protocols were sent to the respective experts for review and revisions. We asked each surgeon to carefully review his or her protocol for any errors and to add any additional information that was necessary. Once the experts returned the corrected protocols to the investigators, they were combined into one procedure to create a preliminary “gold standard.” A draft of the gold standard was reviewed by two surgical residents (postgraduate years 2 and 3) for clarity and understandability. We instructed the residents to pose clarifying questions to the experts if there was an area of confusion or if they felt that they needed additional information to understand the task. The gold standard document was then sent back to the experts for review and revisions. Conflicts amongst the surgeons regarding the preliminary gold standard were resolved with e-mail and phone communication. Once all of the preliminary protocols were returned, the final gold standard protocol was developed.
We used the gold standard protocol to create a 46-item task list for cricothyrotomy (Table 1). In an effort to determine the amount of knowledge that was omitted during the teaching sessions (experts A, B, C), we compared the transcriptions from each session with the task list. Omissions of clinical knowledge, action steps, and decision steps were identified. The action steps were coded as either “what to do” steps or “how to do it” steps. We felt this was important because often learners are told “what” to do without the information that they need to know “how” to perform the task successfully.
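The omission analysis amounts to a set difference between the gold standard task list and the items coded from each teaching transcript. A sketch with hypothetical items (the actual list contains 46 items; see Table 1):

```python
# Hypothetical gold standard items: item ID -> (category, description).
gold_standard = {
    1: ("action", "Palpate landmarks"),
    2: ("decision", "Confirm cricothyroid membrane location"),
    3: ("action", "Make vertical skin incision"),
    4: ("knowledge", "Know indications for a surgical airway"),
}

# Item IDs that were coded as described in one expert's teaching session.
described_in_session = {1, 3}

# Anything in the gold standard but absent from the transcript is an omission.
omitted = {i: gold_standard[i] for i in gold_standard if i not in described_in_session}
for i, (kind, text) in sorted(omitted.items()):
    print(f"omitted {kind} step {i}: {text}")
```

All item names above are illustrative placeholders; the study’s coding distinguished the same three categories (clinical knowledge, action steps, decision steps) plus the “what to do” versus “how to do it” split for action steps.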
In addition, we compared the amount of knowledge that we were able to obtain during the CTA interviews against the aggregate gold standard. Specifically, we were interested in comparing the amount of knowledge articulated by each expert without any prompting (on their own recall) versus the amount of knowledge gained by using CTA knowledge elicitation techniques.
This study was determined to be exempt by the institutional review board of the University of Southern California, Health Sciences Campus.
The results from individual teaching sessions can be found in Table 2. Experts omitted describing an average of 71% (10/14) of clinical knowledge steps, 51% (14/27) of action steps, and 73% (3.6/5) of decision steps when observed teaching a cricothyrotomy. On further analysis, we found that when describing the action steps, the experts tended to describe only “what to do” instead of “how to do it” and included information about how to perform the task only 13% of the time (3.6/27 steps). This implies that 87% of the time the learners had to fill in gaps in their knowledge about how to perform an open cricothyrotomy.
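The percentages above can be recomputed from the averaged step counts. Note that the published figures (71%, 51%, 73%) were obtained by averaging each expert’s percentage, so recomputing from the averaged counts yields slightly different rounding:

```python
# Averaged counts from the teaching sessions (Table 2):
# category -> (steps omitted, total steps in the gold standard).
categories = {
    "clinical knowledge": (10, 14),
    "action steps": (14, 27),
    "decision steps": (3.6, 5),
}
for name, (omitted, total) in categories.items():
    print(f"{name}: {100 * omitted / total:.1f}% omitted")

# "How to do it" information accompanied only 3.6 of the 27 action steps.
how_to = 100 * 3.6 / 27
print(f"'how to' described: {how_to:.0f}%; gap left to learners: {100 - how_to:.0f}%")
```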
When looking at the individual CTA interview transcripts, we were interested in determining the amount of knowledge that each surgeon gave during free recall (unprompted) as compared with the percentage of knowledge that we were able to capture once we began to ask probing questions (prompted) and implement knowledge elicitation techniques. As shown in Table 3, we gained an average of 23% of clinical knowledge steps (increase from 6 steps to 9.3/14), 28% of action steps (increase from 14.3 steps to 20/27), 20% of decision steps (increase from 3 to 4/5 steps), and 22% of total steps (increase from 20.3 to 30.6/46) by using CTA methods.
This study supports previous research indicating that experts omit about 70% of the knowledge during teaching that a novice needs to successfully perform a skill.4–6 In a 2012 study, Clark and colleagues6 showed that expert surgeons omitted an average of 69% of the standard procedural steps when asked to provide an unaided free-recall description of a femoral shunt procedure. In another study, Sullivan and colleagues5 showed that experts omitted an average of 65% of the “how to” steps and 65% of the critical decision points when teaching a colonoscopy to second-year postgraduate residents. This “70% rule” can be explained by the expertise literature, which shows that as individuals develop expertise, their skills become automated and the steps of the skill begin to blend together.2,3,15 The SMEs in this study are known at our institution as excellent teachers, and all have won at least one teaching award. Despite this, they all omitted a significant amount of information when teaching this task. This omission can cause problems in medical education because learners are sometimes forced to fill in the blanks by trial-and-error learning. It is important to note that the term “omission” in this context refers to the fact that because decisions are “automated” and nonconscious, they are unintentionally omitted when experts explain how they perform familiar but complex tasks. Decisions are omitted from descriptions but not from expert performance. 
Unfortunately, procedural knowledge becomes inflexible once automated, and experts often develop rigid mental models.16 In the domain of medicine, it has been documented that an inverted U-shape relationship exists between expertise level and recall.17,18 This has been termed the “intermediate effect” and supports the concept that teachers at an intermediate level of expertise are better able to recall and describe details of a procedure than a novice or an expert.18 Thus, although the extensive experience, superior performance, and conceptual knowledge of experts makes them the best practitioners, the automaticity that develops along with expertise leads us to believe that the critical cognitive processes and procedural steps that are essential to teaching procedural skills are not effectively conveyed to learners by experts in the current medical education model. A potential solution to this dilemma is the use of CTA to unpack the mental models and extract implicit and explicit knowledge from experts.
There are many different methods of CTA; although commonalities exist among them, the methods vary in how they elicit expert knowledge, how they represent that knowledge, and how they use tasks to stimulate expert recall. In 2007, Yates19 performed an analysis of all published descriptions of CTA methodology and identified approximately 100 different methods. In 2008, Yates and Feldon20 concluded that only 6 of the 100 published methods are supported by empirical evidence and predict knowledge outcomes. Most of these 6 methods involve a series of structured, unstructured, or group interviews with three to five subject matter experts (for a comprehensive review of CTA methods, see Schraagen and colleagues21).
In this study, we compared the amount of knowledge obtained during SMEs’ initial free recall (unprompted) with the percentage of knowledge we were able to capture by using CTA techniques (prompted). Our findings are consistent with findings by Chao and Salvendy,22 who demonstrated a 28% increase in information captured from SMEs when using CTA techniques, and Clark and Estes,1 who showed that using CTA methods can result in a 40% increase in knowledge gained. Collectively, these findings demonstrate the usefulness of CTA methods to capture expertise where many covert decisions are linked to overt actions.
The use of CTA methodology is promising and has the potential to improve medical education programs in several tangible ways. First, as described above, it allows us to capture the covert cognitive strategies that experts use during tasks and to attend to the complex thought processes that accompany the behavioral execution of a skill. Second, it is a method for deconstructing an expert’s automated procedural skills and dissecting the procedure into concrete, measurable steps that learners understand. With the information acquired from experts, medical educators can create curricular models that outline each step of the procedure and the critical decision points that need to be discussed with learners. This information can then be put into training aids given to junior faculty to assist with teaching procedural skills. This may decrease the need to rely solely on experts for teaching, capitalize on the “intermediate effect,” and allow learners to develop the needed conceptual understanding of the task and begin formulating the procedural steps and decision points. It is important to note that CTA was used in this study on a relatively short and manageable procedure. Some have raised the concern that applying CTA methods to a longer, more complicated procedure may result in an explosion of steps linked to the task. We contend, however, that CTA methodology is even more useful in longer procedures, where faculty may run out of steam teaching every aspect of the task and begin to skip over essential elements. CTA allows us to develop a written protocol to give to learners and faculty so that the segments of the task are documented. The faculty member can then decide whether the procedure is best taught in chunks or sequentially in one learning session. Lastly, CTA allows us to provide learners, up front, with access to all of the knowledge that they need to successfully accomplish a task.
This changes the role of the teacher/coach, who traditionally determines learner readiness and monitors student progress. By giving all needed information up front, the teacher/coach can spend less time on instruction and more time on the critical role of assessing students’ application of a CTA-based protocol as they practice and giving needed corrective feedback. This has been shown to accelerate the learning curve and facilitate the development of expertise.1,23,24 One of the most noteworthy claims was made by Means and Gott,24 who speculated that 50 hours of training based on CTA could transmit the equivalent of five years of job knowledge.
Numerous studies have shown the effectiveness of CTA methodology in eliciting expert knowledge not captured by other means5,6,25 and in developing training materials that are substantially more effective.8–11,26,27 Yet, despite a long history of development and several studies providing evidence of its impact, the limitations associated with CTA have prevented its widespread adoption and use in medical education. The biggest bottleneck is the time associated with completing a CTA, as the manpower needed to design, conduct, and participate in a CTA can be demanding. The time investment will vary depending on the expertise of the CTA analyst, the number of experts used, and the complexity and length of the procedure; however, it has been documented that one hour of focused expertise requires approximately 30 to 35 hours of effort by a CTA analyst.1 The second limitation to applying CTA to medical education is that there are very few trained CTA analysts and very few resources available to learn the process. The current method of gaining experience with CTA is through traditional, mentor-based instruction. Many CTA practitioners have built business ventures around their own proprietary version of CTA and are either not performing research or not sharing the results of their studies. A third limitation is that nearly all CTA methods require human judgments throughout the process regarding the selection of experts and how to capture and format experts’ judgment. This introduces variability that can make the analysis, generalizability, and replication of CTA protocols difficult if not impossible.
To move forward with a CTA research agenda, several things are needed. First, agreement is needed to focus studies on one or more of the six evidence-based CTA methods that have been documented and whose advocates have published studies in peer-reviewed journals.20 However, the first step may be to develop a training tool that can be used to teach individuals CTA techniques. This could be accomplished by performing a CTA on expert CTA practitioners using the same set of tasks and experts. The goal would be to develop an instructional design model that can be used to train an army of CTA collaborators and researchers. The next step would be to organize a library of CTA-informed curricula for medical educators to share, as opposed to working in silos and spending time repeating CTAs on shared procedures. With the development of a standardized CTA protocol and the training of additional CTA analysts, it may be possible to take advantage of the considerable problem-solving expertise developed by top practitioners in the field. This would enable students to learn and apply an achieved solution instead of “filling in the gaps” through trial and error due to incomplete instruction. Although CTA can be time-intensive, the benefit of capturing more accurate and complete descriptions of procedures and transmitting this information more completely to students may be worth the investment.
In an era of increased faculty demands and decreased resident work hours, it is important to use the best and most efficient teaching methods available. Programs that have used CTA methodology have been shown to decrease training time1,23,24 and accelerate the learning curve.1,23,24 As the demands in medical education continue to rise, CTA may prove to be an important tool to improve instruction and accelerate resident training.
Acknowledgments: The authors wish to thank Drs. Ramon Cesteros, Donald Green, David Plurad, and Andrew Tang for serving as subject matter experts and assisting in the development of the cognitive task analysis gold standard protocol.
1. Clark RE, Estes F. Cognitive task analysis. Int J Educ Res. 1996;25:403–417
2. Fitts PM, Posner MI. Human Performance. Monterey, Calif: Brooks/Cole; 1967
3. Dreyfus H, Dreyfus S. Mind Over Machine: The Power of Human Intuition and Expertise in the Era of the Computer. New York, NY: Free Press; 1986
4. Feldon DF, Clark RE. Instructional implications of cognitive task analysis as a method for improving the accuracy of experts’ self-report. In: Clarebout G, Elen J, eds. Avoiding Simplicity, Confronting Complexity: Advances in Studying and Designing (Computer-Based) Powerful Learning Environments. Rotterdam, The Netherlands: Sense Publishers; 2006:109–116
5. Sullivan ME, Ortega A, Wasserberg N, Kaufman H, Nyquist J, Clark R. Assessing the teaching of procedural skills: Can cognitive task analysis add to our traditional teaching methods? Am J Surg. 2008;195:20–23
6. Clark RE, Pugh CM, Yates KA, Inaba K, Green DJ, Sullivan ME. The use of cognitive task analysis to improve instructional descriptions of procedures. J Surg Res. 2012;173:e37–e42
7. Schraagen JM, Chipman SF, Shalin VL. Cognitive Task Analysis. Mahwah, NJ: Lawrence Erlbaum; 2000
8. Velmahos GC, Toutouzas KG, Sillin LF, et al. Cognitive task analysis for teaching technical skills in an inanimate surgical skills laboratory. Am J Surg. 2004;187:114–119
9. Sullivan ME, Brown CV, Peyre SE, et al. The use of cognitive task analysis to improve the learning of percutaneous tracheostomy placement. Am J Surg. 2007;193:96–99
10. Luker KR, Sullivan ME, Peyre SE, Sherman R, Grunwald T. The use of a cognitive task analysis-based multimedia program to teach surgical decision making in flexor tendon repair. Am J Surg. 2008;195:11–15
11. Campbell J, Tirapelle L, Yates K, et al. The effectiveness of a cognitive task analysis informed curriculum to increase self-efficacy and improve performance for an open cricothyrotomy. J Surg Educ. 2011;68:403–407
12. Crandall B, Klein G, Hoffman RR. Working Minds: A Practitioner’s Guide to Cognitive Task Analysis. Cambridge, Mass: MIT Press; 2006
13. Hoffman RR, Militello LG. Perspectives on Cognitive Task Analysis. New York, NY: Psychology Press, Taylor & Francis Group; 2009
14. Clark RE. Cognitive task analysis for expert-based instruction in health care. In: Spector JM, Merrill MD, Elen J, Bishop MJ, eds. Handbook of Research on Educational Communications and Technology. 4th ed. New York, NY: Springer Science+Business; 2014:541–555
15. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 suppl):S70–S81
16. Feldon D. The implications of research on expertise for curriculum and pedagogy. Educ Psychol Rev. 2007;19:91–110
17. Wilson TD, Nisbett RE. The accuracy of verbal reports about the effects of stimuli on evaluations and behavior. Soc Psychol. 1978;41:118–131
18. Rikers RM, Schmidt HG, Boshuizen HP. Knowledge encapsulation and the intermediate effect. Contemp Educ Psychol. 2000;25:150–166
19. Yates KA. Towards a Taxonomy of Cognitive Task Analysis Methods: A Search for Cognition and Task Analysis Interactions. Unpublished doctoral dissertation presented to the faculty of the Rossier School of Education at the University of Southern California; 2007
20. Yates KA, Feldon DF. Towards a taxonomy of cognitive task analysis methods for instructional design: Interactions with cognition. Paper presented at: Annual Meeting of the American Educational Research Association; March 25, 2008; New York, NY
21. Schraagen J, Chipman S, Shalin V. Cognitive Task Analysis. Mahwah, NJ: Lawrence Erlbaum; 2000
22. Chao CJ, Salvendy G. Percentage of procedural knowledge acquired as a function of the number of experts from whom knowledge is acquired for diagnosis, debugging and interpretation tasks. Int J Hum Comput Interaction. 1994;6:221–233
23. Glaser R, Lesgold A, Lajoie S, et al. Cognitive Task Analysis to Enhance Technical Skills Training and Assessment. Final Report to the Air Force Human Resources Laboratory on Contract No. F41689-8V3-C-0029. Pittsburgh, Pa: Learning Research and Development Center, University of Pittsburgh; 1985
24. Means B, Gott SP. Cognitive task analysis as a basis for tutor development: Articulating abstract knowledge representations. In: Psotka J, Massey LD, Mutter SA, eds. Intelligent Tutoring Systems: Lessons Learned. Hillsdale, NJ: Lawrence Erlbaum; 1988:35–58
25. Crandall B, Getchell-Reiter K. Critical decision method: A technique for eliciting concrete assessment indicators from the intuition of NICU nurses. ANS Adv Nurs Sci. 1993;16:42–51
26. Schaafstal A, Schraagen JM, van Berlo M. Cognitive task analysis and innovation of training: The case of structured troubleshooting. Hum Factors. 2000;42:75–86
27. Merrill DM. First principles of instruction. Educ Technol Res Dev. 2002;50:42–59