Journal of Nervous & Mental Disease: September 2012 - Volume 200 - Issue 9
doi: 10.1097/NMD.0b013e318266bcd3
Book Reviews

Thinking, Fast and Slow

Chaiklin, Harris PhD


Author Information

Professor Emeritus, University of Maryland School of Social Work, Baltimore

Daniel Kahneman’s Thinking, Fast and Slow updates his work on the components of decision making in uncertain situations. The advances he made in this area received the 2002 Nobel Prize in economics. This is a remarkable work. It has been on major best-seller lists for some time and yet presents a profound theory.

What accounts for this broad appeal? Part of it lies in the way the ideas are presented. The writing is clear, concise, and free of academic cant. Others’ ideas are not attacked; he simply presents his own. There are 38 short chapters with numerous optical illusions and brief problems. Each chapter ends with short statements that reflect its principles.

Another part relates to Kahneman’s personal style. There is no academic ego, and credit is given to anyone who ever helped him. This especially applies to Amos Tversky, his collaborator for many years. The two key articles he wrote with Tversky, which were the basis of the Nobel Prize, are reprinted at the end of the book. He did not believe lead authorship meant anything, so they alternated authorship in the articles they wrote. Kahneman believes that had Tversky not died, he would have shared the Nobel Prize with him.

There are three core ideas: a) the distinction between System 1, which is intuitive and fast, and System 2, which is logical and slow; these “fictitious characters” help us understand behavior, not predict it, and both thought systems have strengths and weaknesses; b) the question of whether decision making in economics is always rational, addressed with a theory of decision making under conditions of risk; and c) the difference between events as experienced and events as remembered.

The chapters are organized into five sections. Part 1 explicates the similarities and differences between the two systems, with more emphasis on the intuitive. “I attempt to give a sense of the complexity and richness of the often unconscious processes that underlie intuitive thinking and of how these automatic processes explain the heuristics of judgment” (p. 13).

Part 2 presents logical and statistical judgment. It is difficult for many people to switch to logical thinking when it is called for. He repeatedly characterizes System 2 as lazy and says “… a recurrent theme of this book: [is that] our minds are strongly biased toward causal explanations and do not deal well with mere ‘statistics’” (p. 182). To counter this, there is an outstanding explanation of correlation, regression, probability, and cause, without showing how to compute the statistics. This is a boon to those who avoid numbers because they do not like to compute.

Laziness leads many to see cause in what are random events. People often extrapolate judgments or estimate on the basis of a number that is “anchored” in their perception. Move the anchor and the judgments change. Judgments can be influenced by seemingly small things barely in a person’s perception.

There is also “hindsight bias,” which “… leads observers to assess the quality of a decision not by whether the process was sound but by whether the outcome was good or bad” (p. 203). If a low-risk medical decision is properly made but an inexplicable accident occurs, juries judge the outcome and not the decision. Parenthetically, decision makers often do not get enough credit for good decisions that seem obvious after the outcome.

Part 3 picks up on System 2’s laziness. Many people have excessive confidence in what they believe they know, and they want to make decisions only on the basis of the data before them, when they should stop and look around. To guard against this impulsiveness, the acronym WYSIATI is introduced. It stands for What You See Is All There Is and reminds decision makers to look before they leap. The role of chance (luck) in decision outcomes is underestimated, and the illusion that hindsight provides perfect explanations is overestimated. Many people who are supposed to have an intuitive ability to make good estimates do so because of a practice effect. He reassures people such as clinicians, who often face impossible situations of turmoil and no secure facts, that they should not blame themselves for not having good intuition.

To deal with the illusions of understanding and validity, the Apgar score is cited as a model for using short objective checklists to check enthusiastic clinical judgments. People often answer not the question they are asked but the question they are prepared to answer. Objective measures do not have this problem.

Part 4 deals with decision making in economics. The assumption that economic decisions are always rational is questioned, and prospect theory is advanced as a more complete explanation. It concerns the reference points used in making decisions, the types of gain and loss involved, and the aversion to loss. These ideas apply to more than economics. In a study about the choice of surgery or radiation for cancer treatment, physicians at Harvard were split into two groups. One group was given the statement “The one-month survival rate is 90%” and the other “There is a 10% mortality in the first month” (p. 367). Those given the first statement favored surgery 84% of the time; those given the second favored surgery only 50% of the time. Risk aversion explains the difference in choice between the two statements, which describe equal probabilities. This causes physicians to use standard treatments rather than risk using something unconventional that they think might work.

Part 5 concerns the distinction between the experiencing self and the remembering self. Differential pain experiments reflect this. Each person was exposed to two pain conditions: one lasting a long time with easing toward the end, the other lasting a short period but without easing. When asked later which condition they would prefer to repeat, they tended to pick the one with the longer exposure to pain. Evaluation and experience are often different. This is related to the “focusing illusion”: “Nothing in life is as important as you think it is when you are thinking about it” (p. 402).

Chapter 16, Causes Trump Statistics, is outstanding. It deals with the difficulty in teaching psychology. People who were taught only statistical facts may not have increased their understanding of the situations they are in. Surprising individual cases are more effective in teaching because they must be integrated by the person. “That is why this book contains questions that are addressed personally to the reader. You are more likely to learn something by finding surprises in your own behavior than by hearing surprising facts about people in general” (p. 174).

This work will be highly useful to those in practice professions. Readers will differ in which chapters they find easy or difficult, but all of the chapters add something to understanding the way decisions are made. What is always present is the request that readers question the bases of the decisions they make. This will greatly enhance practice awareness. They will also come away with the idea that although they should continually strive to do the right thing, they are human and irrational judgments will continue to be made.

Harris Chaiklin, PhD

Professor Emeritus

University of Maryland

School of Social Work, Baltimore

DISCLOSURE

The author declares no conflict of interest.

© 2012 Lippincott Williams & Wilkins, Inc.
