Thinking, Fast and Slow – Book Notes

Book notes by Claudio

Thinking, Fast and Slow
by Daniel Kahneman


What Can We Learn From This Book?

Nobel prize winner Daniel Kahneman helps us understand how our minds work, our potential blind spots, and how we can be easily exploited and influenced. Armed with this knowledge and awareness, we can then make better decisions.

  • “My main aim here is to present a view of how the mind works that draws on recent developments in cognitive and social psychology.”
  • “We can be blind to the obvious, and we are also blind to our blindness.”
Part I: The Two Systems

Our brains utilize two systems.

  • System 1: Fast Thinking
    • Continuously scans our environment
    • Fast but error-prone
    • Works on auto-pilot using shortcuts, impulses, and intuition to minimize effort.
  • System 2: Slow Thinking
    • Used for solving more important problems, but only when necessary
    • Requires effort to analyze, reason, and exercise self-control
    • Slow but more reliable than System 1
  • “One of the tasks of System 2 is to overcome the impulses of System 1. In other words, System 2 is in charge of self-control.”
  • “The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when stakes are high.”
Attention and Effort
The Law of Least Effort

We’re hard-wired to take the path of least resistance. Because System 2 requires more effort and energy, we default automatically to System 1, with System 2 kicking in only for vital tasks that warrant the extra effort. However:

  • System 1 is prone to biases and errors. It can be exploited by others to influence our responses & choices
  • System 2 is supposed to monitor and scrutinize System 1, but often fails when it’s lazy or overloaded
  • “In the economy of action, effort is a cost, and the acquisition of skill is driven by the balance of benefits and costs. Laziness is built deep into our nature.”
  • “System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy.”
Part II: Heuristics (Mental Shortcuts)

System 1 works using mental shortcuts (called heuristic thinking) to solve problems and make judgments quickly and efficiently. However, this can often lead to biases and errors.

  • Associations and Priming: When consciously or subconsciously exposed to an idea, we’re “primed” to think about associated ideas, memories & feelings.
  • Cognitive Ease: When we’re at ease (things seem familiar and effortless), System 1 can lead us to let our guard down. When we experience “cognitive strain”, it signals that a problem exists and System 2 kicks in: we switch to an analytical, problem-solving mode, become more vigilant, and are less susceptible to lies and suggestions.
  • Stories and Causes: To make sense of the world, we connect bits and pieces of information and tell ourselves stories about what’s going on. We make associations between events and circumstances. The more coherent our story, the more confident we feel. But confidence does not equal accuracy.
  • Jumping to Conclusions: We often have to deal with incomplete information. System 1 fills the gaps with interpretations and guesses that fit our stories, jumping to conclusions on causes.
  • Substitution: When faced with a difficult question, System 1 often substitutes an easier question and answers that one instead, usually without our noticing the switch.
  • Emotions: When we emotionally prefer an outcome, we play up its benefits and downplay its costs.
Some Quotes on Heuristics
  • “Anything that makes it easier for the associative machine to run smoothly will also bias beliefs.”
  • “Cognitive strain, whatever its source, mobilizes System 2, which is more likely to reject the intuitive answer suggested by System 1.”
  • “People are prone to apply causal thinking inappropriately, to situations that require statistical reasoning.”
Heuristics Lead To Biases and Errors
  • Confirmation Bias: Our tendency to search for and find confirming evidence for a belief while dismissing counter-evidence.
  • The Halo Effect: Our tendency to like or dislike everything about a person, place, or thing, based on first impressions.
  • What You See Is All There Is (WYSIATI): System 1 ignores whether the information it uses is complete or accurate. It only cares how coherent the story is, based on the available information.
  • Framing Effects: “Different ways of presenting the same information evoke different emotions.”
  • Law of Small Numbers: We base our decisions on inadequate data.
  • Anchoring: When numbers create mental anchors that bias our estimates and decisions.
  • Conjunction Fallacy: We judge a conjunction (A and B) to be more likely than A alone, even though a conjunction can never be more probable than either of its parts. A related “less is more” effect: an expensive set of items is valued higher on its own than the same set with a few cheap items added.
  • Representativeness: When we make judgments about a person, place, or thing based on how much it resembles a stereotype. A more accurate approach is to start from base-rate probabilities and then adjust in light of the evidence.
  • Causes Trump Statistics: When we focus on causal data and ignore statistical data.
  • Regression to the Mean: Random fluctuations tend to return to the average over time, yet we often invent causal explanations for what is merely chance (see the simulation sketch after this list).
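To make regression to the mean concrete, here is a minimal simulation; it is not from the book, and the skill/luck split and all numbers are illustrative assumptions. When observed performance is stable skill plus random luck, the trials that look exceptional today are, on average, unremarkable tomorrow, with no causal story required:

```python
import random

random.seed(42)

def performance(skill):
    # Observed performance = stable skill + random luck.
    return skill + random.gauss(0, 10)

skill = 50
day1 = [performance(skill) for _ in range(10_000)]
day2 = [performance(skill) for _ in range(10_000)]

# Take only the trials that looked exceptional on day 1...
followups = [d2 for d1, d2 in zip(day1, day2) if d1 > 65]

# ...and their day-2 scores regress toward the ordinary mean of ~50.
print(sum(followups) / len(followups))
```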
Additional Quotes on Biases
  • “We have limited information about what happened on a day, and System 1 is adept at finding a coherent causal story that links the fragments of knowledge at its disposal.”
  • “System 1 does not keep track of alternatives that it rejects, or even of the fact that there were alternatives.”
  • “The halo effect increases the weight of first impressions, sometimes to the point that subsequent information is mostly wasted.”
Part III: Heuristics Lead To Overconfidence
Illusion of Understanding
  • The Narrative Fallacy: People understand the past less well than they think they do. Believing that we understand the past and can therefore predict the future gives us a false sense of security; in reality, we often construct flawed stories to explain what happened.
  • The Hindsight Illusion: This is the “I knew it all along” effect. When we change our view of the world, we lose much of our ability to recall what we used to believe, and we end up feeling that our current beliefs are what we believed all along. This is why, in hindsight, it’s easy to blame others for bad decisions and hard to give them enough credit for good ones.
Illusion of Validity
  • Skills Illusion: We are prone to think that skill produces predictable results, but that’s not always true. Studies have shown that stock traders’ educated guesses are no better than blind guesses.
  • Formulas vs Intuitions: Statistical algorithms are better predictors than experts. Experts tend to overrule the formula with clever exceptions; the exceptions sometimes work, but overall they reduce reliability.
  • Trusting Expert Intuition: Expert intuition can still be trustworthy, but only if the environment is regular and predictable, and the expert has learned the regularities through prolonged practice.
The Optimistic Bias
  • While optimism is a positive trait, overconfidence and the optimistic bias can lead to excessive risks. The planning fallacy is a typical expression of this bias; the outside view is its corrective.
  • Planning Fallacy: When people take on risky projects based on best-case scenarios, without taking worst-case scenarios or unknowns into account.
  • Outside View: When planners identify others who have carried out similar projects and use their statistics as a baseline, then adjust the baseline using information specific to their own case (see the sketch after this list).
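A minimal sketch of the outside view as reference-class forecasting; the reference class, the numbers, and the median-plus-adjustment recipe are illustrative assumptions, not the book’s procedure:

```python
def outside_view_estimate(reference_outcomes, case_adjustment=1.0):
    # Outside view: anchor on the statistics of similar past projects
    # (the reference class), then adjust for case-specific information,
    # instead of forecasting from an inside, best-case scenario.
    ordered = sorted(reference_outcomes)
    baseline = ordered[len(ordered) // 2]  # median of the reference class
    return baseline * case_adjustment

# Hypothetical reference class: durations (weeks) of similar renovations.
past_durations = [10, 12, 14, 18, 26]
print(outside_view_estimate(past_durations, case_adjustment=1.1))  # ~15.4
```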
Part IV: Heuristics Impact Our Choices
Prospect Theory
  • Kahneman won the Nobel Prize in Economics with this theory.
  • The absolute amount of money matters less than the subjective experience that comes with changes to your level of wealth. People don’t attach value to wealth as such, but to gains and losses.
  • We experience reduced sensitivity to changes in wealth. Losing $200 hurts more if you start with $400 than if you had started with $2000.
  • Loss aversion: People hate to lose money and remember losses more vividly than gains. Our brains process threats and bad news faster, and work harder to avoid pain than to achieve pleasure. For example, given a 50% chance to win $5,000 or $2,500 for sure, most people choose the sure thing (a sketch of the value function follows this list).
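A minimal sketch of the prospect-theory value function, using the median parameters Tversky and Kahneman published in 1992 (alpha ≈ 0.88, lambda ≈ 2.25); the code ignores probability weighting, so treat it as an illustration of diminishing sensitivity and loss aversion, not a full model:

```python
def pt_value(x, alpha=0.88, lam=2.25):
    # Value is assigned to gains and losses, not wealth levels:
    # concave for gains, convex and ~2.25x steeper for losses.
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# A sure $2,500 versus a 50% chance at $5,000:
sure = pt_value(2500)          # ~977 "value units"
gamble = 0.5 * pt_value(5000)  # ~900: the gamble feels worse
print(sure > gamble)           # True -> risk aversion for gains

# Loss aversion: a $100 loss looms larger than a $100 gain.
print(pt_value(100), pt_value(-100))  # ~57.5 vs ~-129.5
```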
The Endowment Effect
  • An object that we already own and use, or intend to use, is more valuable to us.
  • We give such objects additional value and we are unwilling to part with them.
  • An example would be purchasing a concert ticket for your favourite artist, and then refusing to sell it for ten times what you paid.
The Fourfold Pattern

  • Certainty Effect: We underweight outcomes that are almost certain, so people are typically risk-averse when they have a high chance of a desired outcome (e.g., accepting a court settlement despite a high chance of winning at trial) and risk-seeking when facing a highly probable loss.
    • Quadrant 1: High probability, big gains. “People are risk-averse when they consider prospects with a high chance of achieving a large gain.”
    • Quadrant 2: High probability, big losses. This is the tragic quadrant; terminal-illness treatments are an example. “Many unfortunate human situations unfold in the top right cell. This is where people who face very bad options take desperate gambles, accepting a high probability of making things worse in exchange for a small hope of avoiding a large loss. Risk taking of this kind often turns manageable failures into disasters. The thought of accepting the large sure loss is too painful, and the hope of complete relief too enticing, to make the sensible decision that it is time to cut one’s losses. This is where businesses that are losing ground to superior technology waste their remaining assets in futile attempts to catch up. Because defeat is so difficult to accept, the losing side in wars often fights long past the point at which the victory of the other side is certain, and only a matter of time.”
  • Possibility Effect: We give an irrationally high weight to improbable outcomes, whether desired or feared (the sketch after this list shows the weighting curve behind both effects).
    • Quadrant 3: Low probability, big gains. “A lottery ticket is the ultimate example of the possibility effect. Without a ticket you cannot win, with a ticket you have a chance, and whether the chance is tiny or merely small matters little. Of course, what people acquire with a ticket is more than a chance to win; it is the right to dream pleasantly of winning.”
    • Quadrant 4: Low probability, big loss. “The bottom right cell is where insurance is bought. People are willing to pay much more for insurance than expected value—which is how insurance companies cover their costs and make their profits. Here again, people buy more than protection against an unlikely disaster; they eliminate a worry and purchase peace of mind.”
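Both effects fall out of a probability-weighting function in which decision weights diverge from stated probabilities. Here is a minimal sketch using the functional form and gamma ≈ 0.61 from Tversky and Kahneman’s 1992 paper; the specific parameterization is an assumption borrowed from that paper, not from the book’s text:

```python
def decision_weight(p, gamma=0.61):
    # Small probabilities are overweighted (possibility effect);
    # near-certainties are underweighted (certainty effect).
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

print(decision_weight(0.01))  # ~0.055: a 1% chance "feels" like 5.5%
print(decision_weight(0.99))  # ~0.91: a 99% chance feels far from certain
```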
Rare Events
  • When we start combining some of these heuristics, we can begin to understand why people pay an undue amount of attention to very rare events, like terrorist attacks, that the media helps amplify. “System 2 may ‘know’ that the probability is low, but this knowledge does not eliminate the self-generated discomfort and the wish to avoid it. System 1 cannot be turned off. The emotion is not only disproportionate to the probability, it is also insensitive to the exact level of probability.”
  • “The psychology of high-prize lotteries is similar to the psychology of terrorism.”
  • “The original formulation of prospect theory included the argument that “highly unlikely events are either ignored or overweighted,” but it did not specify the conditions under which one or the other will occur, nor did it propose a psychological interpretation of it.”
  • “My current view of decision weights has been strongly influenced by recent research on the role of emotions and vividness in decision making. Overweighting of unlikely outcomes is rooted in System 1 features that are familiar by now.”
Risk Policies
  • Kahneman contrasts two ways of construing decisions. Narrow framing: “a sequence of two simple decisions, considered separately.” Broad framing: “a single comprehensive decision, with four options.”
  • “Broad framing was obviously superior in this case. Indeed, it will be superior (or at least not inferior) in every case in which several decisions are to be contemplated together.”
  • “Decision makers who are prone to narrow framing construct a preference every time they face a risky choice. They would do better by having a risk policy that they routinely apply whenever a relevant problem arises.”
  • If we adopt the “outside view” mentioned earlier to address planning fallacies from over-optimism and to counter excessive caution from loss aversion, we can come up with “risk policies” to help our routine decision making. An example would be to never purchase extended warranties and to always pay the highest possible deductible when purchasing insurance.
Keeping Score
  • With the exception of the very poor, most people seeking money are not only after economic gains but also the symbolic success and achievement that come with having money.
  • As a result, people keep score of the potential financial gains and losses of a transaction, as well as the emotions, risks vs rewards, and potential regrets of these decisions.
  • Disposition Effect: Our tendency to sell winning investments rather than losers, because selling a winner closes the mental account as a success while selling a loser registers a failure. This is also an example of narrow framing.
  • Sunk-Cost Fallacy: Our tendency to keep investing resources in a losing endeavour instead of redirecting them to better alternatives. Cutting our losses is emotionally difficult; by gambling further, we hope to recoup the original investment or at least postpone the day of reckoning. This fallacy keeps people stuck in bad situations.
  • Fear of Regret: “Regret is an emotion, and it is also a punishment that we administer to ourselves.” It leads us to choose safer, more conventional options. But don’t put too much weight on anticipated regret, because it usually hurts less than we expect.
Joint Comparisons
  • People tend to make different decisions when options are evaluated in isolation than when they are compared side by side, as when shopping for a new car or a piece of furniture.
Frames and Reality
  • How we frame a problem evokes different emotional responses, which can then affect our choices.
  • We intuitively prefer a gamble framed as “a 10% chance to win” over the identical gamble framed as “a 90% chance to lose.” Medicine shows the same effect: a treatment can be framed as having a 90% survival rate or a 10% mortality rate. Doctors prefer the former, yet the options are exactly the same.
Part V: The Two Selves
  • In Kahneman’s research on happiness, he found that we have an “experiencing self” and a “remembering self”, and that our memories override our experiences: we make decisions that will produce better memories, not better experiences.
The Peak-End Rule
  • Kahneman’s research on how humans perceive and remember pain (which this section of the book explores in detail) revealed that we evaluate our lives as a story, remembering the major events and experiences (the peaks) and how they end. The experience itself matters less than the memory of how painful or pleasurable it was; our “remembering self” makes decisions based on the memories we expect to create. (A toy model of the rule follows the quotes in the next section.)
Duration Neglect
  • “The duration of the procedure had no effect whatsoever on the ratings of total pain.”
  • “The peak-end rule predicts a worse memory for the short than for the long trial, and duration neglect predicts that the difference between 90 seconds and 60 seconds of pain will be ignored. We therefore predicted that the participants would have a more favorable (or less aversive) memory of the long trial and choose to repeat it. They did. Fully 80% of the participants who reported that their pain diminished during the final phase of the longer episode opted to repeat it, thereby declaring themselves willing to suffer 30 seconds of needless pain in the anticipated third trial.”
  • “The subjects who preferred the long episode were not masochists and did not deliberately choose to expose themselves to the worse experience; they simply made a mistake. If we had asked them, “Would you prefer a 90-second immersion or only the first part of it?” they would certainly have selected the short option.”
  • “Duration neglect is normal in a story, and the ending often defines its character. The same core features appear in the rules of narratives and in the memories of colonoscopies, vacations, and films. This is how the remembering self works: it composes stories and keeps them for future reference.”
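A toy model of the cold-hand experiment described above; the (peak + end) / 2 average matches how Kahneman summarizes retrospective ratings, while the pain samples themselves are made-up numbers:

```python
def remembered_pain(samples):
    # Peak-end rule: the retrospective rating tracks the average of the
    # worst moment and the final moment; duration is neglected entirely.
    return (max(samples) + samples[-1]) / 2

short_trial = [2, 5, 8, 8]        # 60 seconds, ends at peak pain
long_trial = [2, 5, 8, 8, 6, 4]   # 90 seconds, ends with milder pain

print(remembered_pain(short_trial))  # 8.0
print(remembered_pain(long_trial))   # 6.0 -> remembered as less bad,
                                     # despite 30 extra seconds of pain
```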
Experienced Well-Being
  • When we are having a good time doing something, we resist ending the experience. Therefore, to improve our well-being, we should spend time on the things that we enjoy and pay attention to what we are doing. As a result, we’ll get more pleasure from our activities and create positive memories of these experiences.
  • “Some aspects of life have more effect on the evaluation of one’s life than on the experience of living. Educational attainment is an example. More education is associated with higher evaluation of one’s life, but not with greater experienced well-being. Indeed, at least in the United States, the more educated tend to report higher stress.”
  • “On the other hand, ill health has a much stronger adverse effect on experienced well-being than on life evaluation. Living with children also imposes a significant cost in the currency of daily feelings—reports of stress and anger are common among parents, but the adverse effects on life evaluation are smaller.”
  • “Religious participation also has relatively greater favorable impact on both positive affect and stress reduction than on life evaluation. Surprisingly, however, religion provides no reduction of feelings of depression or worry.”
  • “An analysis of more than 450,000 responses to the Gallup-Healthways Well-Being Index, a daily survey of 1,000 Americans, provides a surprisingly definite answer to the most frequently asked question in well-being research: Can money buy happiness? The conclusion is that being poor makes one miserable, and that being rich may enhance one’s life satisfaction, but does not (on average) improve experienced well-being.”
  • “The satiation level beyond which experienced well-being no longer increases was a household income of about $75,000 in high-cost areas (it could be less in areas where the cost of living is lower). The average increase of experienced well-being associated with incomes beyond that level was precisely zero.”
  • “Why do these added pleasures not show up in reports of emotional experience? A plausible interpretation is that higher income is associated with a reduced ability to enjoy the small pleasures of life.”
  • “There is a clear contrast between the effects of income on experienced well-being and on life satisfaction. Higher income brings with it higher satisfaction, well beyond the point at which it ceases to have any positive effect on experience.”
  • “The general conclusion is as clear for well-being as it was for colonoscopies: people’s evaluations of their lives and their actual experience may be related, but they are also different. Life satisfaction is not a flawed measure of their experienced well-being, as I thought some years ago. It is something else entirely.”
Thinking About Life
  • We are terrible at predicting what will make us happy, a process known as “affective forecasting”. We make decisions based on what we think will make us happy, only to find later that the happiness didn’t last.
  • Our levels of happiness and satisfaction depend on many factors, and when we focus on only one area we suffer from what Kahneman calls the “focusing illusion”.
  • We are prone to acclimatization: what’s initially exciting loses its appeal over time. Examples include getting a new car, or the early phase of a romance and the newlywed period of a marriage.
My Overall Review
What I liked Most
  • The quality of the information, grounded in the years of research that earned Kahneman his Nobel Prize.
  • After reading this, I have a newfound insight into how easily we can be influenced and manipulated by marketing, advertising, and election campaigns. I plan on trying to be more consciously aware of my own decision making.
What I Didn’t Like
  • I wasn’t crazy about the prose and flow of the writing. He’s a great researcher, but not a poet.
Buy the Book – Thinking, Fast and Slow

Print | eBook | Audiobook
