Teaching Clinical Reasoning: Case-Based and Coached

Kassirer, Jerome P. MD

doi: 10.1097/ACM.0b013e3181d5dd0d
Clinical Reasoning

Optimal medical care is critically dependent on clinicians' ability to make the right diagnosis and to recommend the most appropriate therapy, and acquiring such reasoning skills is a key requirement at every level of medical education. Teaching clinical reasoning is grounded in several fundamental principles of educational theory. Adult learning theory posits that learning is best accomplished by repeated, deliberate exposure to real cases, that case examples should be selected because they reflect multiple aspects of clinical reasoning, and that the participation of a coach augments the value of an educational experience. The theory proposes that memory of clinical medicine and clinical reasoning strategies is enhanced when errors in information, judgment, and reasoning are immediately pointed out and discussed. Real cases are greatly preferred to cases artificially constructed from memory because they often reflect the false leads, the polymorphisms of actual clinical material, and the misleading test results encountered in everyday practice.

These concepts foster the teaching and learning of the diagnostic process, the complex trade-offs between the benefits and risks of diagnostic tests and treatments, and cognitive errors in clinical reasoning. The teaching of clinical reasoning need not and should not be delayed until students gain a full understanding of anatomy and pathophysiology. Concepts such as hypothesis generation, pattern recognition, context formulation, diagnostic test interpretation, differential diagnosis, and diagnostic verification provide both the language and the methods of clinical problem solving. Expertise is attainable even though the precise mechanisms of achieving it are not known.

Dr. Kassirer is distinguished professor, Tufts University School of Medicine, Boston, Massachusetts, and visiting professor, Stanford University, Palo Alto, California.

Correspondence should be addressed to Dr. Kassirer, Tufts University School of Medicine, 136 Harrison Ave., Boston, MA 02111; telephone: (781) 237-1971 or (617) 306-9788 (cell); e-mail:

All teaching methods are of necessity pragmatic and context-dependent. Teaching approaches lack a firm scientific underpinning because of the paucity of scientific evidence about optimal learning. Despite substantial advances in our understanding of human cognition during the last few decades, our teaching methods are still based largely on expert opinion. If these assertions are true for elementary teaching, they are even more compelling when applied to a field as complex as clinical reasoning. Given these modest scientific underpinnings, we might just throw up our hands and give up any hope of imparting reasoning skills to students and residents, yet we know there is much to learn, that many do become expert clinical problem solvers, and that the welfare of patients depends as much on reasoning and problem-solving abilities as it does on the use of the latest technology.

Clinical cognition encompasses the range of strategies that clinicians use to generate, test, and verify diagnoses, to assess the benefits and risks of tests and treatments, and to judge the prognostic significance of the outcomes of these cognitive achievements. Needless to say, clinical medicine consists of much more than clinical cognition, including meticulous gathering of data, careful examination of patients, empathy with the sick, ability to communicate with patients, and professional demeanor, among many others, but this essay is restricted to clinical cognition.

Though we still have much to learn about clinical cognition, several sources can be combined to define a reasonable pragmatic approach that can be subjected to critical evaluation. These sources include commonsense notions of learning from some of the most venerated and respected educators, modern theories of adult learning, research on clinical cognition, and the experience of educators, such as myself, who have been working at it for decades.

Insights From Educational Theory

Seventy years ago, John Dewey, the great educator and pragmatist, outlined criteria for teaching that have stood the test of time. One fundamental principle, which seems almost mundane today, is that experiences are critical determinants that influence the quality of learning, and that the teacher has an obligation to provide optimal experiences. Dewey believed that teaching experiences should arouse curiosity, enhance personal initiative, and allow free expression of learners' ideas. In explaining the importance of individual experiences on the development of expertise, he wrote, “What [the student] has learned in the way of knowledge and skill in one situation becomes an instrument of understanding and dealing effectively with the situations which follow.”1

Modern concepts of “adult learning” supplement these concepts. They hold that the role of the teacher is not to transmit knowledge but to facilitate learning, encourage spontaneity, and engage in mutual inquiry.2 Such a strategy requires that the educator be comfortable when others in a group engage in critical thinking and challenge the educator's opinions and convictions. As in Dewey's formulation, adult learning theory holds that people learn new knowledge and skills most effectively when they are presented in the context of the application of new knowledge to real-life situations.3,4 It proposes that because learning cannot be separated from the context in which it is used, the best time to learn anything is when the material is immediately useful.2,5 Emotional issues count also: Adults learn best, the theory posits, in informal, comfortable, flexible, and nonthreatening settings. Lastly, because experience is the learner's “textbook,” the core method of adult education should be the analysis of experience.2,5,6

Gaining expertise is not easy, and it cannot be achieved passively.3,7,8 Some who have studied expertise expressed the process this way: “The development of genuine expertise requires struggle, sacrifice, and honest, often painful self-assessment. There are no shortcuts … and you will need to invest that time wisely, by engaging in ‘deliberate practice’—practice that focuses on tasks beyond your current level of competence and comfort.”8,9 Presumably, expertise develops as learners mindfully assemble simple concepts into more complex ones.4 Experts just know more, remember more, and perceive more than do novices, but becoming an expert requires persistence, focus, struggle, and rigorous self-assessment.3,10

Research on Clinical Reasoning

Earlier work on clinical reasoning centered not on the mental mechanisms and procedures that expert clinicians claim they use in solving problems, but on what they are observed to do. Cognitive scientists long ago gave up on personal theories of mental processes because such introspective accounts are known to be unreliable.11 Nonetheless, observations on what clinicians do and how they behave can inform both the teaching and learning of reasoning processes. Much of the early work in the field was based on detailed analysis of thinking-aloud transcripts of clinicians solving real clinical diagnostic and therapeutic problems, and some on the recall of physicians viewing videotapes of their interactions with simulated patients.12,13 In one study, for example, authentic clinical material from a patient was made available serially to an experimental subject (a physician) in the same sequence as it became evident to the doctors caring for the patient, and the experimental subject, unaided by external sources of information, responded spontaneously by offering diagnostic or therapeutic opinions.12 The subjects were not asked to explain how or why they reached conclusions because such opinions are considered unreliable. Instead, the subject's utterances were recorded, transcribed, and then analyzed for their reasoning content by another individual who was familiar with the medical domain.12 In another study, physicians "working up" simulated patients were videotaped and debriefed later about what they were considering during the encounter, and transcripts of their responses were analyzed.13 The assumption underlying these methods is that solving problems while speaking is probably not greatly dissimilar to doing so without speaking.13–16

Though analysis of transcripts of physicians thinking aloud probes only some aspects of reasoning,9,17 early studies produced useful insights, including a vocabulary for discussing clinical reasoning concepts, a notion of the sequence of iterative steps in the process, and an approach to both learning and teaching clinical problem solving. These studies and others suggest that diagnostic hypotheses are quickly generated with minimal clinical data and that these hypotheses are then used as a problem representation, a framework for further focused information gathering.12,13,18–20 Only small numbers of hypotheses appear to be active at any one time, consistent with the observation that short-term memory has a limited capacity.3,21,22 Differential diagnosis is envisioned not as a single static list of disorders collected when all of a patient's facts are revealed, but as an evolving, iterative process involving repeated hypothesis generation, deletion, and refinement.23–25 Modification and evolution of hypotheses involves both probabilistic and causal reasoning modes. Working diagnoses, that is, hypotheses used for prognostic or therapeutic recommendations, are evoked only after they are assessed for their adequacy in explaining all positive, negative, and normal clinical findings, and for their pathophysiologic reliability—namely, a check on the reasonableness of causal linkages between clinical events.12,26
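The iterative view of differential diagnosis described above, in which hypotheses are generated early and then repeatedly reweighted, deleted, and refined as new findings arrive, can be caricatured in a few lines of code. This is a toy sketch, not a claim about actual mental mechanisms: the odds-and-likelihood-ratio bookkeeping, the renormalization step, and the pruning cutoff are all illustrative assumptions, and every number in the usage example is invented.

```python
# Toy model of an evolving differential diagnosis: each new finding
# carries a likelihood ratio (LR) for each active hypothesis; odds are
# updated, probabilities renormalized, and weak hypotheses pruned.

def update(hypotheses, finding_lrs, cutoff=0.05):
    """Reweight hypothesis probabilities by one finding's likelihood
    ratios, renormalize, and delete hypotheses below the cutoff."""
    odds = {dx: (p / (1 - p)) * finding_lrs.get(dx, 1.0)
            for dx, p in hypotheses.items()}
    probs = {dx: o / (1 + o) for dx, o in odds.items()}
    total = sum(probs.values())
    probs = {dx: p / total for dx, p in probs.items()}
    return {dx: p for dx, p in probs.items() if p >= cutoff}

# Invented example: a finding that strongly favors hypothesis A and
# argues against hypothesis C.
active = {"A": 0.5, "B": 0.3, "C": 0.2}
active = update(active, {"A": 5.0, "B": 1.0, "C": 0.1})
```

After this single update, the weakly supported hypothesis C drops off the list while A is promoted, mimicking the small, evolving set of active hypotheses that the thinking-aloud studies describe.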

There is little doubt that clinical knowledge is a fundamental requirement of successful clinical reasoning and that repeated exposure to well-selected cases is the ideal way to absorb such knowledge and store it in memory.27–32 Over the past decade, efforts to explain how disease entities (or syndromes) are represented in memory have been a focus of much analysis. A priori, it seems difficult to imagine that there could be any single such representation, given the polyglot way that individuals retain such information, the polymorphism of most diseases, and the complex way diseases evolve in different patients over time. The descriptions are quite varied. Some claim that disease entities are stored in mental representations of disease attributes called “frames” (in the language of artificial intelligence), in “semantic networks” or as “semantic qualifiers,” in “illness scripts,” or in the form of scenarios of actual patients previously encountered.29,33–38 Others have suggested representations analogous to the “if–then” production rules of computer programs, or in neuron combinations according to connectionism theories of brain function.39,40 All such characterizations probably should be considered not as definitive descriptions of mental processes but, rather, as tentative theories of how the mind works, or as metaphors for thought processes. For this reason, despite recommendations in favor of so doing,29,31 it remains to be determined whether there is value in incorporating these notions into active teaching sessions.

Although considerable uncertainty exists about the structure of knowledge in memory, a substantial body of evidence bears on how people process and apply their knowledge. Reasoning, including clinical reasoning, is visualized as a dual-process system, with intuitive (i.e., tacit) and analytical components.17,41–43 (Note that though these two components are described here as discrete entities, in reality their interactions are almost certainly far more integrated and interdependent.) The intuitive components, thought to be a holdover from our evolutionary past, are instinctual and reflexive, require no input from the analytic system, and respond to domain-relevant stimuli. They are characterized by first impressions, quick pattern recognition, and rapid responses to information.28,33,41–45 They seem to be effortless and autonomous, require little or no awareness or active thought, can be influenced by affect and emotions,46 and are activated in conditions of considerable uncertainty. Some aspects of diagnosis, such as hypothesis generation, are presumably an intuitive function. Though intuitive, this heuristic part of the process is also primed to recognize new situations or patterns in its rapid-recall fashion after repeated exposure to the same stimuli or set of events. Stated differently, by repeated practice, what was once an analytic process can become automatized and then can respond autonomously.17,42 Thus, even some decision rules become autonomous42; evidence suggests that "easy cases" are more likely solved by pattern recognition and more difficult cases by analytic strategies.27–29 Intuitive components often produce valuable, accurate responses, but because of their inherent characteristics (namely, their quickness and apparent lack of computation), they can be influenced by the context of the moment, including emotions,17,46 and are sometimes prone to error. Such cognitive errors are considered later.

By contrast, the analytic components are deliberate, studied problem-solving processes that consciously and mindfully consider alternatives and options. They are thought to require considerable cognitive work, are slower than the intuitive component, and are solidly based on science, logic, inference, causality, probabilistic associations, and decision making.42 These components are activated when a pattern is not clear, for example, when a patient's clinical or laboratory findings do not fit an easily recognized clinical picture. Parts of the diagnostic process subsumed by these components include hypothesis testing, differential diagnosis, diagnostic verification, and maintaining a coherent clinical story that explains all the findings. The analytic system creates and manipulates models of reality in working memory and maintains a coherent story, thus facilitating diagnostic reasoning and hypothesis testing.42 The analytic components are less likely than the intuitive component to be error-prone, and they have the special trait of being capable of being a check and an override of the first impressions of rapid recognition.42 Nonetheless, an individual initiates this checking function only when some characteristic of the first impression strikes the problem solver as being out of the ordinary.47,48 Finally, strong first impressions are often correct, of course, and an override should not be invoked without a convincing rationale.

Thanks to formal work on quantitative clinical approaches, namely, Bayesian analysis and decision analysis, there is less mystery in how clinical data can be combined in diagnostic and therapeutic problem solving than in how information is stored and retrieved in memory. Though few argue that people reason according to these formalisms, modeling clinical decision making by these approaches helps put a rigorous, logical framework on these processes. Understanding Bayes' rule makes concepts such as sensitivity and specificity of diagnostic tests comprehensible. Bayes' rule is also a framework for understanding the evolution of a differential diagnosis based on any new clinical information whether or not the data are derived from diagnostic tests.26 It embodies the concept of diagnostic “gold standards.” Though few physicians stop to do the math required of the method, Bayes' rule is the basis of many compiled testing strategies.49,50 Likewise, formal decision trees are not often constructed and their probabilities and utilities of outcomes not often specified, yet the principles of this strategy cement multiple therapeutic concepts. They include the trade-offs between the benefits and risks of tests and treatments, thresholds for testing and treating, and decisional toss-ups or close calls.51–54
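The two formalisms just mentioned can be sketched concretely: Bayes' rule converts a pretest probability and a test's sensitivity and specificity into a post-test probability, and the threshold model of decision analysis expresses the probability of disease above which treating beats withholding treatment as risk/(benefit + risk). The code below is a minimal illustration of those two formulas; the probabilities and benefit/risk magnitudes in the example are invented, not taken from any clinical source.

```python
def post_test_probability(prior, sensitivity, specificity, positive=True):
    """Bayes' rule for a dichotomous diagnostic test result."""
    if positive:
        num = sensitivity * prior                      # true positives
        denom = num + (1 - specificity) * (1 - prior)  # + false positives
    else:
        num = (1 - sensitivity) * prior                # false negatives
        denom = num + specificity * (1 - prior)        # + true negatives
    return num / denom

def treatment_threshold(benefit, risk):
    """Threshold model: probability of disease above which the expected
    benefit of treating exceeds its expected risk."""
    return risk / (benefit + risk)

# Invented example: pretest probability 0.30, test with 90% sensitivity
# and 90% specificity; a positive result raises the probability to ~0.79.
p = post_test_probability(0.30, 0.90, 0.90, positive=True)

# If treating the diseased confers 5x the benefit of the harm it inflicts
# on the nondiseased, the treatment threshold is 1/6.
t = treatment_threshold(10, 2)
```

Comparing the post-test probability with the threshold is the kind of trade-off reasoning, between the benefits and risks of testing and treating, that the formal strategy makes explicit even when no one stops to do the math.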

Reasoning based on causality is another approach to diagnosis that is based not on probabilistic considerations but on pathophysiologic concepts.55–58 Causal reasoning involves forming inferences based on major cause-and-effect relations between clinical variables or events. Because such reasoning often relies on the pathophysiologic aspects of individual disease states, its application is far narrower diagnostically than the other strategies. Nonetheless, causal reasoning is a powerful analytic tool to explain discrepancies in certain diagnoses. Such reasoning may also be useful in unraveling disease polymorphisms, namely, instances in which a patient's clinical manifestations fail to match precisely with the textbook description of a disease state.

Cognitive Errors

But cold logic as exemplified by the analytic approach, including probabilistic and causal reasoning, fails to account for the fact that humans are human, not silicon processors. As noted before, humans often jump to conclusions, using intuitive heuristics and reflexive rules of thumb.59,60 Such conclusions often turn out to be correct, but when they miss the mark in medicine such a miss can be costly in terms of a patient's welfare.61,62 For decades, cognitive psychologists have known, based on laboratory experiments, that people misjudge likelihoods of events based on their recall of salient examples, their vividness, or their resemblance to other examples.26,63,64 In addition, they may misjudge the likelihood of an outcome based on some starting point or initial value.63–66 Physicians occasionally misjudge the a priori likelihood of diseases, suspect rare diseases more often than is appropriate, overemphasize the significance of a positive test, jump to conclusions with little information, and judge prematurely that they have a working (or final) diagnosis.67,68 The existence of a cohesive structure of the diagnostic process, as outlined above, and this laboratory confirmation of cognitive errors, made it possible 20 years ago to identify and classify cognitive diagnostic errors.61 Such errors, many of which lead to life-threatening outcomes, have been identified in every stage of the diagnostic process.27,61,62,69 Despite the early recognition of cognitive errors, attention to them has been only a recent endeavor.62,69–73
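One of the laboratory-confirmed misjudgments noted above, overemphasizing a positive test for a rare disease, can be made concrete with a short calculation. The prevalence and test characteristics below are hypothetical round numbers chosen only to show the effect of a low base rate.

```python
# Hypothetical numbers: a disease with 1-in-1,000 prevalence and a test
# with 99% sensitivity and 95% specificity. Despite the "good" test,
# most positive results are false positives because true cases are rare.
prior = 0.001
sens, spec = 0.99, 0.95

p_positive = sens * prior + (1 - spec) * (1 - prior)  # P(test positive)
posterior = sens * prior / p_positive                 # P(disease | positive)
print(f"P(disease | positive test) = {posterior:.3f}")  # prints 0.019
```

A physician who reads this positive result as near-certain disease has neglected the base rate; the posterior probability is under 2%, which is exactly the kind of error the psychology experiments predict.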

All of the foregoing research, information, and practical experiences have informed our thinking about how best to teach clinical reasoning, but before considering the method described here, first a few caveats. This teaching proposal encompasses only clinical cognition, the apparent mental processes that constitute the diagnostic process. The method also allows for discussion of cognitive aspects of therapeutics, including the trade-offs between the risks and benefits of treatments, treatment thresholds, and therapeutic toss-ups or close calls. Not considered here are critical and often inseparable aspects of patient encounters, namely, personal communication, the importance of extracting useful and accurate information, the need to meticulously review old records, appropriate review of systems, assessment of medical evidence, or performance of the physical examination.74 The reader should not infer that these issues are unimportant but merely separate from the cognitive issues under discussion. One could argue that teaching clinical reasoning in the age of computer-based diagnostic aids, electronic medical records, and massive clinical electronic databases is superfluous. In my judgment, it is more needed than ever: None of these digital modalities can yet substitute for an expert clinician. Lastly, there is plenty of room for disagreement over the principles and practice of teaching any subject, and clinical reasoning is no exception. But dissension is no reason to avoid proposing a method that many besides myself have found useful in their roles as both learners and educators.75,76

Teaching Clinical Reasoning 2010

The approach I describe here is applicable predominantly to case-based teaching conferences, especially with groups of 30 or fewer students or residents, though it has been widely used with larger audiences. As a starting point, even a rudimentary exposure to the components of the clinical reasoning process (List 1) is helpful as a framework or roadmap to guide students as they begin to understand the elements of reasoning in particular cases.4,13,26 At the very least, such an introduction provides students a language for thinking about clinical problem solving. Given the central role of medical knowledge in learning clinical reasoning, some have argued against introducing reasoning strategies until after the second year of medical school, when students are well grounded in pathophysiology. In fact, beginning medical students cannot be considered a tabula rasa, owing to their exposure to medicine in the media and in their personal lives. Thus, I believe that exposure to clinical reasoning using carefully selected case examples can begin during the first year of medical school. Of course, no matter where learners are in their training, some fundamental habits are required. Clinical cognition requires a flexible cast of mind, a power of observation, and a willingness to question, to learn from others, and to compare notes.77

List 1 Diagnostic and Therapeutic Concepts of Clinical Reasoning

Selection of examples

Though it is quite clear that repeated, deliberate experience with real clinical material is an essential component of the learning process, a random selection of cases is not sufficient to teach all the complex elements of reasoning, clinical or otherwise.7,29 To aim for a broad understanding of reasoning principles, a thoughtful selection of examples is critical. Trainees who bring the cases to the teaching session should be encouraged to select cases that, over time, illustrate the full range of aspects of the diagnostic process, as well as those that instantiate the judgmental aspects of the trade-off between the risks and benefits of testing and treating (List 1). Such a selection of cases is available.26 Both recently admitted patients and past cases have value; the latter have special usefulness with respect to understanding how the disease evolved and for connecting prior decision making and patient outcomes.

Examples should be selected according to the level of the learners.29 They should not be synthesized according to someone's memory of former cases but instead should be genuine, active cases, to ensure that the actual uncertainties, inconsistencies, imperfections, complexity, and ambiguity of clinical data are encompassed.4,26 To explain how cognitive errors arise and how “near misses” occur, some examples of defective clinical reasoning should be included among cases that illustrate excellent reasoning. The case examples should be unfamiliar to the learners so they will be forced to confront the clinical material de novo and thus will not be hampered by hindsight (retrospective) bias.78,79 Needless to say, the more cases experienced in this way, the better.9,17

Organization of material

Narratives of cases are time-worn rituals that are created to capture clinical experience. Such narratives should contain not just the facts of the patient's illness but the judgments that were made and the actions that were taken as the patient's condition evolved.77 The case functions not only as an organized template for clinical reasoning but as the basis for learning clinical medicine and clinical reasoning.77 Thus, when possible, clinical material should be organized in the same chronologic sequence as the events unfolded in real life. Although the presentation of material will often start with a patient's age, sex, and chief complaint, it also can begin with the problem for which the patient was referred to a physician or hospital. If a patient's chief complaint is nausea and weakness, for example, and if the learners are inexperienced, and if the goal is to elaborate on the causes of these complaints, then it is appropriate to start with these complaints. But for more-advanced learners, if the presenting symptoms are the same but the principal issue is declining kidney function, the chief complaints can be bypassed in favor of a starting point such as “the patient is being seen for unexplained progressive renal insufficiency.”

My point is that the individual who selects the case should be cognizant of the teaching goal and should tailor the case presentation to achieve the goal. In a classroom setting, learners can be asked to extract information in an iterative fashion from the presenter, or the presenter can provide clinical data in “chunks.” Though the chunks often would follow the traditional sequence of history, physical examination, and laboratory findings, they need not always be structured this way. Effective problem solving can be illustrated by beginning with a physical examination finding, an X-ray, or a set of laboratory studies.

The role of a coach

A coach functions best when he or she is as unfamiliar with the case as the learners and is forced to examine the same information prospectively as they do. The coach fills an essential role, namely, monitoring the learners' questions and responses and commenting on their relevance and accuracy.17 Specific issues about the medical aspects of the patient as well as the reasoning involved in diagnosis and treatment can be dealt with simultaneously. The coach asks the participants why they requested information and then has them explain what they learned when they receive the answer. The point of this interactive give-and-take during the problem-solving session is to provide instant feedback17 by examining the intermediate reasoning as data are collected and as a diagnosis or therapeutic plan is being formulated, rather than holding all discussion until all the information from the case is available.26,33,80 By that time, much of the intermediate reasoning is lost.

The coach must try to engage all participants actively in the problem-solving session, even, if necessary, by calling on some to participate. Adult learning theory stresses that the teacher must try to make the teaching session intellectually challenging, enjoyable, respectful, and nonthreatening.2,6 This does not mean that the session should always be anxiety-free, either for the learners or the coach; sometimes such stress actually renders the memory of a case keener. If the coach is as much "in the dark" about the case as the learners, he or she might also be embarrassed about mistaken facts, wrong judgments, inappropriate hypotheses, and other errors. The coach must encourage spontaneity: What matters is what is learned, not whether every case has a final answer. The coach should not feel compelled to cover all aspects of the case. Because the coach cannot be expected to be a compendium of medical information on all cases presented at such a session, participants should be encouraged at the appropriate time to seek critical evidence from other information sources, including available electronic databases. But the coach should be discouraged from retreating into his or her special interest when befuddled and converting the session into a lecture on his or her research or specialty.

There is no better time to explain the application of probability theory, threshold concepts, the nature of a differential diagnosis, notions of causality, and disease polymorphisms than with real cases in an active teaching session when these issues surface as part of the discussion, namely, at the time of greatest interest. Needless to say, the moderator must be well versed in these concepts to be able to impart them adequately.

Avoiding cognitive errors

Recent essays about clinical reasoning argue that metacognition might be an effective strategy for avoiding cognitive errors.62,70,72,81–83 Metacognition is a method of introspection in which one is supposed to contemplate or reflect on one's own thinking. Because many cognitive errors are the consequence of inappropriate triggering of the intuitive component of cognition, they are, as discussed earlier, susceptible to correction by analytic reasoning.17,42,47,48 Generally, however, some signal must be perceived that could activate this checking process. There is little doubt that individuals can be forced to rethink their instinctive responses, and when they do so, they seem to make fewer errors.84 Nonetheless, how much reassessing and revisiting of intuitive responses occurs in the real world is not known. In theory, there would be great value if individuals could use critical-thinking skills such as emotional detachment, neutral examination of beliefs, perspective switching, and assessment of the current context, but doing so is difficult.42 In medicine, approaches to teach metacognition and thus correct or prevent cognitive errors are not fully tested and so far have produced inconsistent results.82 Checking through long lists of cognitive errors might be another strategy, and though several such lists of errors have been created,62,69,70,83 there is little evidence that their use reduces the chance of subsequent errors.

When summarizing a just-discussed case, however, the information is fresh and the time is ideal for a retrospective analysis and immediate feedback, including a discussion of all kinds of errors if there were any.7,29,85 (This approach has been used effectively for some time by the U.S. Army, which carries out an after-action review of events in training or in combat.86) If cognitive errors were made, this case “wrap-up” presents an opportunity to dissect them and expose them. If the learners have been actively engaged in the problem-solving session, they will be personally invested in understanding how errors occurred.85 In wrapping up a case, a coach can also ask whether the diagnosis satisfies criteria of adequacy (Were all findings explained?) and coherence (Did physiologic linkages make sense?), whether it is a parsimonious explanation of the findings, what the major clues were that led to the correct diagnosis, whether and how a diagnosis could have been arrived at earlier or more efficiently, and whether the therapeutic approaches selected were rational or not. One might hope that these approaches will reduce future errors, but their influence on students' future cognition is only a matter of speculation at present. Note, finally, that this discussion is devoted only to cognitive diagnostic errors, not to those involving system-level dysfunctions. Nothing, however, precludes discussion of such systemic errors when they are discovered as part of the analysis of individual cases.

This Approach in Relation to Other Teaching Modalities

Needless to say, this case-based approach that simulates a real clinical encounter is only one of many methods to teach clinical reasoning, and given its relative inefficiency from a financial standpoint and its requirement for faculty who are willing to expose their reasoning strategies, why should it be preferred? How does it compare with large-group lectures, online interactive case exercises, published clinicopathologic conferences, and similar approaches? It is my view that purposeful case selection, active student participation, immediate feedback, and thoughtful involvement by a seasoned coach promote enhanced learning. I suggest the method described here not as the only approach to teaching clinical reasoning but as one to interdigitate with any others that share the same goals. Whether the human resources required are worth the investment is to some extent a local issue, but it is a hypothesis worth assessing.

Although the practice of medicine is not itself a science, it is based on science and is always striving to become more scientific.39,77 There are no double-blind controlled studies of clinical reasoning, nor of any of the programs designed to teach it. Teaching programs are based principally on pragmatic considerations, educational theory, experience, and (frankly) trial and error. We need not apologize for these attributes: despite these shortcomings, our medical schools and training programs do yield practitioners who excel in clinical problem solving and who effectively navigate the complexities of diagnosis and treatment. We may not know precisely how they become expert problem solvers, but over time they do. Our job as educators is to continue to evolve our teaching methods in the hope that our students become more efficient and more accurate problem solvers and make fewer cognitive errors.



Funding/Support:

Supported in part by a grant from the Josiah Macy Jr. Foundation.


Other disclosures:

Dr. Kassirer is a coauthor of a recently published book, Learning Clinical Reasoning, which is expected to generate modest royalties. He has no other financial conflicts of interest.


Ethical approval:

Not applicable.



References

1 Dewey J. Experience and Education. New York, NY: Simon and Schuster; 1938.
2 Knowles MS, Holton EF, Swanson RA. The Adult Learner. 6th ed. Amsterdam, Netherlands: Elsevier; 2005.
3 van Gog T, Ericsson KA, Rikers RMJP, Paas F. Instructional design for advanced learners: Establishing connections between the theoretical frameworks of cognitive load and deliberate practice. Educ Technol Res Dev. 2005;53:73–81.
4 van Merrienboer JJG, Sweller J. Cognitive load theory and complex learning: Recent developments and future directions. Educ Psychol Rev. 2005;17:147–177.
5 Higgs J, Jones M. Introduction. In: Higgs J, Jones M, eds. Clinical Reasoning in the Health Professions. Oxford, UK: Butterworth; 1995.
6 Refshauge K, Higgs J. Teaching clinical reasoning in health science curricula. In: Higgs J, Jones M, eds. Clinical Reasoning in the Health Professions. Oxford, UK: Butterworth; 1995.
7 Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 suppl):S70–S80.
8 Ericsson KA, Prietula MJ, Cokely ET. The making of an expert. Harv Bus Rev. July–August 2007;85:114–121, 193. Accessed January 15, 2010.
9 Ericsson KA, Charness N. Expert performance: Its structure and acquisition. Am Psychol. 1994;49:725–747.
10 Rikers RMJP, Paas F. Recent advances in expertise research. Appl Cogn Psychol. 2005;19:145–149.
11 Nisbett RE, Wilson TD. Telling more than we can know: Verbal reports on mental processes. Psychol Rev. 1977;84:231–259.
12 Kassirer JP, Gorry GA. Clinical problem solving: A behavioral analysis. Ann Intern Med. 1978;89:245–255.
13 Elstein AS, Shulman LS, Sprafka SA. Medical Problem Solving: An Analysis of Clinical Reasoning. Cambridge, Mass: Harvard University Press; 1978.
14 Anderson JR. Methodologies for studying human knowledge. Behav Brain Sci. 1987;10:467–505.
15 Kuipers BJ, Kassirer JP. Knowledge acquisition by analysis of verbatim protocols. In: Kidd AL, ed. Knowledge Acquisition for Expert Systems: A Practical Handbook. New York, NY: Plenum Press; 1987.
16 Ericsson KA, Simon HA. Protocol Analysis: Verbal Reports as Data. Rev ed. Cambridge, Mass: MIT Press; 1993.
17 Hogarth RM. Educating Intuition. Chicago, Ill: University of Chicago Press; 2001.
18 Nendaz MR, Bordage G. Promoting diagnostic problem representation. Med Educ. 2002;36:760–766.
19 Chang RW, Bordage G, Connell KJ. Cognition, confidence, and clinical skills. The importance of early problem representation in case presentations. Acad Med. 1998;73(10 suppl):S109–S111.
20 Elstein AS. Clinical reasoning in medicine. In: Higgs J, Jones M, eds. Clinical Reasoning in the Health Professions. Oxford, UK: Butterworth; 1995.
21 Miller GA. The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychol Rev. 1994;101:343–352.
22 Waldrop MM. The workings of working memory. Science. 1987;237:1564–1567.
23 Kassirer JP, Kopelman RI. What is a differential diagnosis? Hosp Pract. August 15, 1990;24:19–28.
24 Kassirer JP. Teaching clinical medicine by iterative hypothesis testing. Let's preach what we practice. N Engl J Med. 1983;309:921–923.
25 Leblanc VR, Brooks LR, Norman GR. Believing is seeing: The influence of a diagnostic hypothesis on the interpretation of clinical features. Acad Med. 2002;77(10 suppl):S67–S69.
26 Kassirer JP, Wong JB, Kopelman RI. Learning Clinical Reasoning. Philadelphia, Pa: Wolters Kluwer, Lippincott Williams and Wilkins; 2009.
27 Elstein A, Schwarz A. Evidence base of clinical diagnosis. Clinical problem solving and diagnostic decision making: Selective review of the cognitive literature. BMJ. 2002;324:729–732.
28 Eva KW. What every teacher needs to know about clinical reasoning. Med Educ. 2005;39:98–106.
29 Norman G. Research in clinical reasoning: Past history and current trends. Med Educ. 2005;39:418–427.
30 Higgs J, Jones M. Clinical reasoning. In: Higgs J, Jones M, eds. Clinical Reasoning in the Health Professions. Oxford, UK: Butterworth; 1995.
31 Norman GR. The epistemology of clinical reasoning: Perspectives from philosophy, psychology, and neuroscience. Acad Med. 2000;75(10 suppl):S127–S133.
32 Weber EU, Bockenholt U, Hilton DJ, Wallace B. Determinants of diagnostic hypothesis generation: Effects of information, base rates, and experience. J Exp Psychol Learn Mem Cogn. 1993;19:1151–1164.
33 Bowen JL. Educational strategies to promote clinical diagnostic reasoning. N Engl J Med. 2006;355:2217–2225.
34 Bordage G. Elaborated knowledge: A key to successful diagnostic thinking. Acad Med. 1994;69:883–885.
35 Bordage G. Prototypes and semantic qualifiers: From past to present. Med Educ. 2007;41:1117–1121.
36 Custers EJ, Regehr G, Norman GR. Mental representations of medical diagnostic knowledge: A review. Acad Med. 1996;71(10 suppl):S55–S61.
37 Norman G. Building on experience—The development of clinical reasoning. N Engl J Med. 2006;355:2251–2252.
38 Schmidt HG, Norman GR, Boshuizen HPA. A cognitive perspective on medical expertise: Theory and implications. Acad Med. 1990;65:611–621.
39 Erneling CE, Johnson DM, eds. The Mind as a Scientific Object. Between Brain and Culture. New York, NY: Oxford University Press; 2005.
40 Bechtel W, Abrahamson A. Connectionism and the Mind. Parallel Processing, Dynamics, and Evolution in Networks. 2nd ed. Malden, Mass: Blackwell; 2002.
41 Dawson NV. Physician judgment in clinical settings: Methodological influences and cognitive performance. Clin Chem. 1993;39:1468–1480.
42 Stanovich KE. The Robot's Rebellion. Finding Meaning in the Age of Darwin. Chicago, Ill: University of Chicago Press; 2004.
43 Croskerry P. A universal model of diagnostic reasoning. Acad Med. 2009;84:1022–1028.
44 Coderre S, Mandin H, Harasym PH, Fick GH. Diagnostic reasoning strategies and diagnostic success. Med Educ. 2003;37:695–703.
45 Brooks LR, Allen SW, Norman GR. Role of specific similarity in a medical diagnostic task. J Exp Psychol Gen. 1991;120:278–287.
46 Vohs KD, Baumeister RF, Loewenstein G, eds. Do Emotions Help or Hurt Decision Making? A Hedgefoxian Perspective. New York, NY: Russell Sage Foundation; 2007.
47 Mamede S, Schmidt HG, Rikers R, Penaforte JC, Coelho-Filho JM. Breaking down automaticity: Case ambiguity and the shift to reflective approaches in clinical reasoning. Med Educ. 2007;41:1185–1192.
48 Mamede S, Schmidt HG, Rikers R. Diagnostic errors and reflective practice in medicine. J Eval Clin Pract. 2007;13:138–145.
49 Wells PS, Ginsberg JS, Anderson DR, et al. Use of a clinical model for safe management of patients with suspected pulmonary embolism. Ann Intern Med. 1998;129:997–1005.
50 Joseph J, Badrinath P, Basran GS, Sahn SA. Is the pleural fluid transudate or exudate? A revisit of the diagnostic criteria. Thorax. 2001;56:867–870.
51 Bergus GR, Hamm RM. How physicians make medical decisions and why medical decision making can help. Prim Care. 1995;22:167–179.
52 Pauker SG, Kassirer JP. Therapeutic decision making: A cost–benefit analysis. N Engl J Med. 1975;293:229–234.
53 Pauker SG, Kassirer JP. The threshold approach to clinical decision making. N Engl J Med. 1980;302:1109–1117.
54 Kassirer JP, Pauker SG. The toss-up. N Engl J Med. 1981;305:1467–1469.
55 Patel VL, Arocha JF, Zhang J. Thinking and reasoning in medicine. In: Holyoak K, ed. Cambridge Handbook of Thinking and Reasoning. Cambridge, UK: Cambridge University Press; 2004.
56 Woods NN. Science is fundamental: The role of biomedical knowledge in clinical reasoning. Med Educ. 2007;41:1173–1177.
57 Kuipers B, Kassirer JP. Causal reasoning in medicine: Analysis of a protocol. Cogn Sci. 1984;8:363–385.
58 Kuipers BJ. Commonsense reasoning about causality: Deriving behavior from structure. Artif Intell. 1984;24:169–203.
59 Tversky A, Kahneman D. The framing of decisions and the psychology of choice. Science. 1981;211:453–458.
60 Kahneman D, Slovic P, Tversky A. Judgment Under Uncertainty: Heuristics and Biases. New York, NY: Cambridge University Press; 1982.
61 Kassirer JP, Kopelman RI. Cognitive diagnostic errors: Instantiation, classification, and consequences. Am J Med. 1989;86:433–441.
62 Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med. 2005;165:1493–1499.
63 Patel VL, Kaufman DR, Arocha JF. Emerging paradigms of cognition in medical decision-making. J Biomed Inform. 2002;35:52–75.
64 Redelmeier DA. The cognitive psychology of missed diagnoses. Ann Intern Med. 2005;142:115–120.
65 Hall KH. Reviewing intuitive decision-making and uncertainty: The implications for medical education. Med Educ. 2002;36:216–224.
66 McDonald CJ. Medical heuristics: The silent adjudicators of clinical practice. Ann Intern Med. 1996;124:56–62.
67 Voytovich AE, Rippey RM, Suffredini A. Premature conclusions in diagnostic reasoning. J Med Educ. 1985;60:302–307.
68 McSherry D. Avoiding premature closure in sequential diagnosis. Artif Intell Med. 1997;10:269–283.
69 Bordage G. Why did I miss the diagnosis? Some cognitive explanations and educational implications. Acad Med. 1999;74(10 suppl):S138–S143.
70 Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med. 2003;78:775–780.
71 Graber M, Gordon R, Franklin N. Reducing diagnostic errors in medicine: What's the goal? Acad Med. 2002;77:981–992.
72 Graber M. Metacognitive training to reduce diagnostic errors: Ready for prime time? Acad Med. 2003;78:781.
73 Newman-Toker D, Pronovost PJ. Diagnostic errors—The next frontier for patient safety. JAMA. 2009;301:1060–1062.
74 Verghese A, Horwitz RI. In praise of the physical examination. BMJ. 2009;339:b5448.
75 Kassirer JP. Clinical-problem-solving—A new feature in the Journal. N Engl J Med. 1992;326:60–61.
76 Kassirer JP. Images in clinical medicine. N Engl J Med. 1992;326:829–830.
77 Montgomery K. How Doctors Think: Clinical Judgment and the Practice of Medicine. New York, NY: Oxford University Press; 2006.
78 Fischhoff B. Hindsight not equal to foresight: The effect of outcome knowledge on judgment under uncertainty. 1975. Qual Saf Health Care. 2003;12:304–311.
79 Wood G. The knew-it-all-along effect. J Exp Psychol Hum Percept Perform. 1978;4:345–353.
80 Eva KW, Neville AJ, Norman GR. Exploring the aetiology of content specificity: Factors influencing analogic transfer and problem solving. Acad Med. 1998;73(10 suppl):S1–S5.
81 Dhaliwal G. Clinical decision making: Understanding how clinicians make a diagnosis. In: Saint S, Drazen JM, Solomon CG, eds. New England Journal of Medicine: Clinical Problem Solving. New York, NY: McGraw-Hill Professional; 2006:19–29.
82 Berner ES, Graber ML. Overconfidence as a cause of diagnostic error in medicine. Am J Med. 2008;121(5 suppl):S2–S23.
83 Croskerry P. Achieving quality in clinical decision making: Cognitive strategies and detection of bias. Acad Emerg Med. 2002;9:1184–1204.
84 Mamede S, Schmidt HG, Penaforte JC. Effects of reflective practice on the accuracy of medical diagnoses. Med Educ. 2008;42:468–475.
85 Kuhn GJ. Diagnostic errors. Acad Emerg Med. 2002;9:740–750.
86 Colvin G. Talent Is Overrated. What Really Separates World-Class Performers From Everybody Else. New York, NY: Portfolio; 2008.
© 2010 Association of American Medical Colleges