Full Book Summary of Thinking, Fast and Slow


Here is my summary of Daniel Kahneman’s book Thinking, Fast and Slow, before you read my full book review. Winner of the Nobel Memorial Prize in Economic Sciences, Kahneman spent decades doing research in cognitive psychology that he applies to economics. If you’re at all interested in how our thinking influences our decisions without us realizing it, check out this book! 

This book explores the complex human mind, our ways of thinking, and our many cognitive biases. As he presents the research, Kahneman also offers practical applications and ways to improve your critical thinking. If this is a topic that interests you, I highly recommend reading the whole book. 


Part 1: Two Systems 

Chapter 1 

There are two systems of thinking. System 1 is quick, operating with little effort or voluntary control (the ‘fast’ thinking). System 2 requires attention and choice, and its operations are slower and highly diverse (the ‘slow’ thinking). 

The main function of the fast-acting System 1 is to constantly supply System 2 with suggestions based on intuition, feelings, and impressions. Based on these, System 2 makes the final decisions, exercises self-control, and ultimately chooses the voluntary action. 

Chapters 2-3

System 2 requires both attention and effort. The more tasks we try to combine at the same time, the more effort System 2 must spend. And if we’re giving full attention to one task, we’re blind to everything else going on.

When System 2 is doing mental work (self-control, cognitive effort, attention), System 1 begins to take over other tasks. And when System 2 is used heavily for self-control, we become depleted physically, mentally, and emotionally. Research also shows that people who score higher on mental control score higher on intelligence tests. 

Chapters 4-5

System 1 is constantly making associations between things without your conscious effort. You make associations between words, events, actions, and feelings in your own mind without thinking about it.

System 1 also monitors cognitive ease: how good, easy, comfortable, or nonthreatening things around you are. System 1 then tells System 2 whether you have to act, change, or redirect attention based on the level of cognitive ease. Research has found that being familiar with something, or being in a good mood, increases cognitive ease and therefore the influence of System 1. 

Chapters 6-7

System 1 is more likely to make associations between things that happen at the same time, even if they’re unrelated. It also makes associations between things that happen more than once: we feel less surprised the second time, even if the second occurrence is equally unlikely. 

In familiar, low-risk situations, System 1 jumps to conclusions to save time and effort. This can lead to errors like confirmation bias, where we only see information that confirms our previous beliefs. It also produces the halo effect: if we like one part of a person or thing, we tend to like everything about them or it. 

Chapters 8-9

At all times, System 1 is making basic assessments: assessments of what’s happening around you, made without specific intentions. System 1 is good at quickly telling friend from foe and at estimating averages. But it’s bad at sums and at drawing conclusions when we receive seemingly conflicting information.

Because System 1 is always trying to answer questions, when faced with too difficult a question, it will substitute a related but easier question to answer. System 1 is also very subject to emotional states, which influence the automatic answers to those questions. 

Part 2: Heuristics and Biases

Chapters 10-11

As humans, we are prone to exaggerate the consistency and coherence of things around us. This leads to the law of small numbers: we believe that what happens in small samples is also true of large samples, but small samples are statistically far more likely to produce extreme results. We are also prone to see patterns and causality where there are none, because our brains don’t like randomness. 
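
To see the statistics behind this, here’s a quick simulation I put together (my own sketch, not from the book): a fair coin produces “lopsided” samples far more often when the sample is small.

```python
import random

# The law of small numbers in action: a fair coin produces "extreme"
# samples (70%+ heads) fairly often when the sample is small, and
# almost never when the sample is large.
random.seed(42)

def extreme_rate(sample_size, trials=10_000, threshold=0.7):
    """Fraction of trials in which the share of heads is >= threshold."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        if heads / sample_size >= threshold:
            extreme += 1
    return extreme / trials

print(extreme_rate(10))     # ~0.17: 7+ heads out of 10 happens often
print(extreme_rate(1_000))  # ~0.0: 700+ heads out of 1,000 essentially never
```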

The anchoring effect occurs when people consider a particular value for an unknown quantity before estimating that quantity. Part of this happens as System 2 starts from the anchor number and adjusts in one direction, stopping as soon as it reaches uncertainty. The other part is a priming effect: the anchor number creates a suggestion for System 1, even if the anchor is totally random. 

Chapters 12-14

We easily fall into the availability bias: we make judgments and estimates based on how easily information comes to mind. Our brains overestimate things that are more salient, dramatic, or personal to us. And we find things easier to recall when the information comes fluently. 

Another common bias that affects us is the affect heuristic: people make judgments based on their emotions. An availability cascade is a self-sustaining chain of events, sparked by a relatively minor incident, in which media coverage and public emotion feed each other until it leads to government action and/or widespread public panic. 

When given little information, we make predictions based on the base rate: how common something is on average. When given more information, we predict based on representativeness, or how closely something fits our preconceived stereotypes or beliefs. 

Chapters 15-16

Research has shown we’ll make predictions based on representativeness even when they go against logic. This leads to the conjunction fallacy, where people judge a conjunction of two events to be more likely than one of those events alone, which violates basic probability. 
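
The underlying rule is simple: the probability of two things happening together can never exceed the probability of either one alone. A toy calculation (my numbers, not Kahneman’s, echoing his famous “Linda” problem) makes the point:

```python
# The conjunction rule: P(A and B) = P(A) * P(B given A), which can
# never exceed P(A). Toy numbers of mine for the "Linda" problem:
p_teller = 0.05                  # P(Linda is a bank teller)
p_feminist_given_teller = 0.30   # P(feminist activist, given teller)

p_both = p_teller * p_feminist_given_teller
print(p_both)               # 0.015
print(p_both <= p_teller)   # True -- and it's True for any numbers you pick
```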

When drawing conclusions, we tend to ignore statistics if we believe there’s a different cause. We also ignore statistics if they go against deeply held beliefs. 
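
Kahneman illustrates what taking the base rate seriously looks like with a classic cab problem. Here’s a sketch of the Bayesian arithmetic behind it, using the standard numbers from that problem:

```python
# The classic cab problem: 85% of a city's cabs are Green, 15% are Blue.
# A witness who is right 80% of the time says the cab was Blue.
# Intuition (ignoring the base rate) says ~80% Blue; Bayes' rule says ~41%.
base_blue, base_green = 0.15, 0.85
witness_accuracy = 0.80

says_blue_if_blue = witness_accuracy        # correct identification
says_blue_if_green = 1 - witness_accuracy   # misidentification

posterior_blue = (base_blue * says_blue_if_blue) / (
    base_blue * says_blue_if_blue + base_green * says_blue_if_green
)
print(f"P(Blue | witness says Blue) = {posterior_blue:.2f}")  # 0.41
```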

Chapters 17-18

One phenomenon we largely ignore is regression to the mean: after one extreme event, the next will tend to move back closer to the mean. For example, when an athlete has a particularly good day or single performance, the next is statistically likely to be worse (closer to the average performance). This is randomness at work, and there’s usually no causal event, even though we like to look for one. 
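
A simulation (again my own sketch) makes the mechanism visible: model each performance as stable skill plus random luck, and the days after the extreme days drift back toward average with no cause to find.

```python
import random

# Regression to the mean: performance = stable skill + random luck.
# After an extreme day, the next day draws fresh luck, so it drifts
# back toward the average all by itself.
random.seed(0)
SKILL = 100
scores = [SKILL + random.gauss(0, 10) for _ in range(10_000)]

great_days = [i for i, s in enumerate(scores[:-1]) if s > 115]
next_days = [scores[i + 1] for i in great_days]

print(sum(scores[i] for i in great_days) / len(great_days))  # ~119: the extremes
print(sum(next_days) / len(next_days))                       # ~100: back to the mean
```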

As humans, we tend to make intuitive predictions (based on ‘gut feeling’) that are both overly extreme and overconfident. We also use our evaluation of the current state to determine our predictions of the future state, without realizing it. 


Part 3: Overconfidence 

Chapters 19-20

Because we construct coherent stories about the past, we overestimate our ability to know what’s going to happen in the future (the narrative fallacy). Along the same lines is hindsight bias: we judge how good a decision was based on its outcome, not based on the information we had when making the decision.

When we feel confident in an answer or judgment, it means our brains have created a coherent story, not that we’re necessarily right. So high confidence should never be taken as a sign of high accuracy. The illusion of validity is our tendency to overestimate our ability to make correct predictions, especially if we are “experts” (people with years of experience).

Chapters 21-22

Research has shown that statistical formulas are much better at predicting outcomes than clinical or expert intuition. Human intuition is very easily influenced while algorithms are not. Overall, the algorithms make more correct decisions and fewer errors. 

Further research has shown that we can trust intuitive judgment more when the environment is sufficiently regular to be predictable, and when the person (or “expert”) has had the opportunity to learn those regularities through prolonged practice. System 1 pulls immediate memories and patterns from these practiced, predictable environments and relays them to System 2, which makes a judgment based on past experience.

Chapters 23-24

When making forecasts, there’s the inside view and the outside view. The inside view, taken by people inside the situation, is based on experience, intuition, or current feelings, and is usually wrong. The outside view is built on statistics, base rates, and historical evidence from similar situations, and is usually much more accurate.

The inside view often leads to the planning fallacy: forecasts made unrealistically close to best-case scenarios, which could be improved by consulting the statistics of similar cases.

We are prone to the optimism bias: consistently overestimating the likelihood of good decisions and outcomes. But optimism comes with some positives, like resilience in the face of setbacks. 

Part 4: Choices

Chapters 25-26

For hundreds of years, researchers accepted that people will always choose a sure thing over a gamble of the same or even slightly higher expected value. But that theory doesn’t take into account that the people choosing may have different reference points. Prospect theory was developed to address this. 

Prospect theory says evaluations of risk are made relative to a reference point. It also takes into account diminishing sensitivity: the same absolute change matters less the further it is from the reference point. And the third piece of the theory says we’re more afraid of losses than we are enticed by gains (loss aversion). 
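
The book itself doesn’t give a formula, but the published prospect-theory value function is easy to sketch. This uses the parameter estimates from Tversky and Kahneman’s 1992 paper (not from this book), and it shows all three pieces at once:

```python
# A sketch of the prospect-theory value function, using the parameter
# estimates from Tversky & Kahneman's 1992 paper: alpha ~ 0.88,
# lambda ~ 2.25. Outcomes are gains/losses relative to a reference point.
ALPHA = 0.88           # diminishing sensitivity to larger amounts
LOSS_AVERSION = 2.25   # losses loom larger than gains

def value(outcome):
    """Subjective value of a gain or loss relative to the reference point."""
    if outcome >= 0:
        return outcome ** ALPHA
    return -LOSS_AVERSION * ((-outcome) ** ALPHA)

print(value(100))   # ~57.5
print(value(-100))  # ~-129.5: the same $100 hurts about 2.25x as much
```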

Chapters 27-28

The endowment effect describes how we assign more value to something simply because we own it. This largely comes from loss aversion. The effect is also much larger for objects we hold ‘for use’ than for objects we hold ‘for exchange’ (like money).

System 1 is designed to give priority to bad news and threats. Again because of loss aversion, we’re much more affected by losses than by gains in all aspects of life, including economics.

Chapters 29-30

System 1 also assigns weights to different aspects of a thing as you form a complete evaluation of it, and these weights are not proportional or entirely logical. In choosing between outcomes, we’re affected first by the possibility effect, where improbable but possible outcomes are overweighted. Second, we’re affected by the certainty effect: outcomes that are almost certain are underweighted relative to actual certainty. 
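
Both effects fall out of the probability-weighting curve Kahneman and Tversky estimated in their 1992 paper (γ ≈ 0.61 for gains; the formula is from that paper, not this book). Here’s a small sketch:

```python
# The probability-weighting curve from Tversky & Kahneman (1992),
# gamma ~ 0.61 for gains. Small probabilities get overweighted
# (possibility effect); near-certain ones get underweighted
# (certainty effect).
GAMMA = 0.61

def decision_weight(p):
    """Weight a choice actually gives to an objective probability p."""
    return p ** GAMMA / (p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA)

for p in (0.01, 0.05, 0.50, 0.95, 0.99):
    print(f"p = {p:.2f} -> weight {decision_weight(p):.3f}")
# p = 0.01 gets weight ~0.055 (overweighted);
# p = 0.99 gets weight ~0.912 (underweighted).
```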

System 1 also overestimates the probability of rare or extreme events. This is even more true of rare events that are fluent, vivid, or easy to imagine.

Chapters 31-32

We tend to be risk averse for potential gains and more risk seeking when faced with losses. But loss aversion is lessened by broad framing: considering multiple separate bets or gambles together as a portfolio. Professional stock-market traders do this all the time.
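
A gamble along the lines of the Samuelson coin flip Kahneman discusses (win $200 or lose $100) shows why the frame matters; this sketch of mine simulates the broad frame:

```python
import random

# Narrow vs. broad framing with a Samuelson-style gamble: a coin flip
# that wins $200 or loses $100. One play feels scary (loss aversion);
# a bundle of 100 plays almost never loses money overall.
random.seed(7)

def play():
    return 200 if random.random() < 0.5 else -100

bundles = [sum(play() for _ in range(100)) for _ in range(10_000)]
chance_of_loss = sum(total < 0 for total in bundles) / len(bundles)

print("Expected value of one play: $50")
print(f"Chance a 100-play bundle loses money: {chance_of_loss:.3%}")  # well under 1%
```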

We constantly use mental accounting for gains and losses, and those mental accounts include emotional costs. This leads to the sunk-cost fallacy: we’re more likely to keep investing in a failing project than to cut our losses, because of the emotional attachment and the mental cost of booking that loss. 

Chapters 33-34

We often make preference reversals when faced with more than one scenario, gamble, or choice. That means if we evaluate each option on its own first and then side by side, we usually change our choice. 

System 1 is not reality-bound, so we are heavily influenced by framing. We make different choices between two equivalent statements based purely on the wording. Because of loss aversion, we avoid options with negative wording or options framed as a loss. 


Part 5: Two Selves

Chapters 35-36

When making decisions, we judge a decision as wrong if it’s inconsistent with our other preferences, not if it’s inconsistent with logic. This connects to the peak-end rule, which states that our retrospective rating of an experience is an average of its peak (most intense) moment and its final moment. Our remembering self usually bases the whole experience on those two points. 

Our brains evaluate experiences (and even people’s whole lives) as stories that follow the peak-end rule and duration neglect. That means the duration of an event has no effect on how we judge it, everything else being equal. 
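
The rule’s arithmetic is simple enough to sketch in a few lines (the pain numbers are mine, echoing the colonoscopy study Kahneman describes):

```python
# The peak-end rule as arithmetic: the remembering self averages the
# most intense moment and the final moment, ignoring duration.
# Toy per-minute pain ratings (0 = none, 10 = severe).
def remembered_pain(ratings):
    """Retrospective rating under the peak-end rule."""
    return (max(ratings) + ratings[-1]) / 2

short_procedure = [2, 6, 8]           # ends at its worst moment
long_procedure = [2, 6, 8, 5, 3, 1]   # same peak, tapers off gently

print(remembered_pain(short_procedure))  # 8.0 -- remembered as worse
print(remembered_pain(long_procedure))   # 4.5 -- longer, yet remembered as better
```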

Chapters 37-38

Our emotional state is based on where our attention is, which is normally our current activity and immediate environment. And research shows that people’s evaluations of their lives and their actual moment-to-moment experiences are related but distinct. 

Researchers have shown a mood heuristic at work when people answer questions about overall life satisfaction: our current mood colors the answer we give in the moment. Life satisfaction and experienced happiness are also both affected by genetic disposition. 

Conclusions 

The conclusion recaps the book and argues that the best way to avoid biased thinking is to recognize the situations where bias is common, slow down, and engage System 2 to make better decisions.

Read My Review

Now that you’re reminded of the main points of the book, read my full review here. This is just a short summary, so if you want more examples and the research behind them, read the whole book. It’s illuminating for human psychology in general, and especially for the cognitive illusions and biases in our decision-making processes. 
