

A Summary and Critique of
Daniel Kahneman’s
Thinking, Fast and Slow

by I.K. Mullins

Copyright©2015 I.K. Mullins. All Rights Reserved. No part of this book may be reproduced or retransmitted in any form or by any means without the written permission of the author.

Should you have any questions, please contact us at



Daniel Kahneman’s book presents the results of research he has conducted over the past forty years, much of it jointly with Amos Tversky. Its general conclusion is that we are intuitive thinkers and that our imperfect intuitive thinking shapes our choices and judgments.

The principal messages of Kahneman’s book are summarized in the following sections.


I. System 1 and System 2

Kahneman’s book is organized around two modes of thinking: System 1 and System 2. In the terms of the book’s title, “thinking fast” describes System 1 and “thinking slow” describes System 2. System 1 is unconscious, automatic, intuitive and effortless; it uses associations and resemblances to produce quick answers to questions. System 1 is naive, credulous, non-statistical and empirical. System 2 is conscious, effortful, controlled, slow, careful, statistical and suspicious. System 2 is also “lazy,” because engaging it is costly in energy and time.

Kahneman argues that System 1 describes our “normal” decision making, and System 2 becomes engaged in decision making only when circumstances require its involvement. Many of the real decisions that people make in their lives, including some important and far-reaching decisions, are generated by System 1. System 1 can lead to right answers and wonderful inspirations, yet, it can also lead to systematic errors.

Kahneman points out that the domains of System 1 and System 2 are different for different people. They depend on people’s knowledge and experience, and they can change as people acquire more experience and expertise. For example, for many people, calculating the product of 20 × 20 is a very simple task that can be effortlessly performed by System 1. Yet, for some people this task will require activation of System 2. For many people, screwing in a light bulb is a task for System 1, but for some people this task has to be done by the effortful and conscious System 2.

Although System 1 can make mistakes because of its speed and its reliance on intuition, System 2 is not perfect either: people can still solve problems incorrectly when they fail to think about them in the right way.


II. Heuristics and Biases

Kahneman’s book describes research on biases and heuristics conducted by Kahneman and Tversky. A heuristic is a mental shortcut that allows people to solve problems and make judgments quickly and efficiently. Heuristics include intuitive judgments, stereotypes, rules of thumb and educated guesses; they are applied in thinking to shorten the time spent on problem solving and decision making.

Heuristics can be helpful in many circumstances; however, they may lead to biases. Kahneman and Tversky’s research demonstrated that people often make biased predictions and estimates when they use heuristics to solve statistical problems.

Kahneman and Tversky identified two categories of heuristics. The first is related to anchoring: heuristics that let people guess answers to questions when they have no good idea of the correct answer. In the process, people’s answers can be influenced by irrelevant data or conditions.

For example, in one experiment, a wheel of fortune marked from 0 to 100 is rigged to stop only at 10 or 65. After spinning the wheel, participants write down the number at which it stopped and then estimate whether the percentage of African member nations in the UN is greater or less than that number. Participants who saw the wheel stop at 10 guessed 25 percent on average; those who saw it stop at 65 guessed 45 percent on average. In Kahneman’s view, anchoring is an example of System 1 producing these kinds of quick estimates.
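The pull of the anchor can be illustrated with a toy weighted-average model. This is purely an assumption for illustration — Kahneman does not specify a formula — in which each guess blends a private belief with the anchor:

```python
import random

def anchored_guess(anchor, prior_belief=30.0, weight=0.35, noise=5.0):
    """Toy model of anchoring (an illustrative assumption, not Kahneman's):
    the reported guess drifts toward the anchor in proportion to `weight`."""
    guess = (1 - weight) * prior_belief + weight * anchor
    return guess + random.uniform(-noise, noise)  # individual variation

random.seed(0)
low_group = [anchored_guess(10) for _ in range(1000)]   # wheel stopped at 10
high_group = [anchored_guess(65) for _ in range(1000)]  # wheel stopped at 65

# The group that saw the higher number gives higher average estimates.
print(sum(low_group) / len(low_group), sum(high_group) / len(high_group))
```

Even though the wheel's number carries no information about the UN, any nonzero `weight` shifts the group averages apart, which is the signature of the anchoring effect.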

The second category of heuristics concerns statistical problems. These problems arise when people have all the necessary information but use it in the wrong way, failing to weigh the available evidence properly. The “Linda” experiment is a good example. Participants are given Linda’s description: Linda is 31 years old. She majored in philosophy. She is single, opinionated and bright. As a student, she participated in anti-nuclear demonstrations and was deeply concerned with issues of social justice and discrimination. Participants then have to rank several scenarios in order of probability: 1) Linda is an elementary school teacher; 2) Linda is active in the feminist movement; 3) Linda is a bank teller; 4) Linda is an insurance salesperson; 5) Linda is a bank teller who is also active in the feminist movement.

This experiment has been replicated many times, and participants consistently rank the 5th scenario as more likely than the 3rd, even though the 5th scenario is a special case of the 3rd: the conjunction of two events can never be more probable than either event alone, so this judgment violates the most basic laws of probability theory.
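The violation can be checked with one line of probability arithmetic. The numeric probabilities below are hypothetical, chosen only to illustrate the conjunction rule:

```python
# Hypothetical probabilities, chosen only to demonstrate the rule.
p_bank_teller = 0.05             # P(Linda is a bank teller)
p_feminist_given_teller = 0.30   # P(feminist | bank teller)

# Conjunction rule: P(A and B) = P(A) * P(B|A), and since P(B|A) <= 1,
# the conjunction can never exceed P(A) alone.
p_teller_and_feminist = p_bank_teller * p_feminist_given_teller

print(p_teller_and_feminist)  # 0.015, necessarily <= 0.05
```

Whatever values are plugged in, scenario 5 can never be more probable than scenario 3 — which is exactly what the participants' intuitive rankings get wrong.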


III. Prospect Theory

Developed by Kahneman and Tversky, prospect theory rests on four essential assumptions:

  • People evaluate risky choices in terms of their gains and losses relative to a reference point (e.g., their current wealth status).
  • People are loss averse, and they are risk averse for small bets around the reference point. Kahneman notes that losses may be processed by the brain in the same way as threats.
  • People are risk averse with respect to gains, and they are risk loving with respect to losses.
  • People have a tendency to place too much weight on low probability events and not enough weight on high probability events. For example, people put excessive weight on such events as plane crashes or jackpot winnings.
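These assumptions are often formalized with a value function that is concave for gains, convex for losses, and steeper for losses than for gains. The sketch below uses the median parameter estimates (α = β = 0.88, λ = 2.25) reported in Tversky and Kahneman's 1992 cumulative prospect theory paper; it is an illustration of the idea, not the book's own notation:

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value of a gain/loss x relative to the reference point.
    Parameters are the commonly cited 1992 median estimates; treat them
    as illustrative, not definitive."""
    if x >= 0:
        return x ** alpha            # concave for gains -> risk averse
    return -lam * (-x) ** beta       # convex, steeper for losses -> loss averse

# Loss aversion: losing $100 hurts more than winning $100 pleases.
print(value(100), value(-100))
```

Because |value(-100)| is more than twice value(100), a fair 50/50 bet on $100 looks unattractive — the loss-averse behavior described in the second assumption above.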

The presentation of a problem can impact the way people evaluate it. Kahneman identifies the importance of context in forming mental pictures of problems in his discussion of the WYSIATI principle (i.e., “what you see is all there is”).


IV. Hedonic Psychology and Subjective Well-Being

Kahneman’s conceptual work laid the foundation for several empirical findings in hedonic psychology that he describes in the book. For example, although French mothers spend less time with their children than American mothers do, they enjoy that time more; women who live alone appear to enjoy the same level of well-being as women who live with a mate; and headaches cause more unhappiness among the poor than among the better-off.

A person’s actual experience of pleasure or pain can be sampled at a given moment and then summed over time. Kahneman calls this “experienced” well-being, which he contrasts with “remembered” well-being. He finds that these two measures of happiness differ in unexpected ways. For example, “remembered” well-being is largely insensitive to the duration of an experience. In retrospect, people rate an experience by its most intense moment of pleasure or pain and by how it ends.
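The contrast between the two measures can be sketched as follows. The peak-end averaging rule and the episode data here are simplified illustrations, not Kahneman's exact experimental procedure:

```python
def experienced(utilities):
    # "Experienced" well-being: total of moment-by-moment pleasure/pain.
    return sum(utilities)

def remembered(utilities):
    # Simplified peak-end rule: average of the most intense moment and the
    # final moment. Duration plays no role ("duration neglect").
    peak = max(utilities, key=abs)
    return (peak + utilities[-1]) / 2

# Hypothetical pain readings (negative = pain) for two episodes:
short_episode = [-8, -8]            # brief, but ends at its worst
long_episode = [-8, -8, -4, -2]     # more total pain, but a milder ending

print(experienced(short_episode), experienced(long_episode))  # -16 vs -22
print(remembered(short_episode), remembered(long_episode))    # -8 vs -5
```

The longer episode contains strictly more pain in total, yet its gentler ending makes it *remembered* as less unpleasant — the paradoxical dissociation between the two measures that Kahneman describes.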

