• You have two systems — System 1 for quick, impulsive thought, and System 2 for slow, deliberate thought.

  • Think of Attention as a finite resource that is allocated for various tasks, including regulation of thought.

    • Activities which require more effort require more attention.
    • Multitasking does not ease cognitive load, since task switching itself requires effort; time pressure adds to the load as well.
  • If you are mentally fatigued, you are more prone to making impulsive decisions.

  • Intuition is a fallback mechanism: by the Principle of Least Effort, we default to it rather than spend cognitive effort, for better or worse. The pursuit of more rational thinking comes down to making System 2 less lazy.

    • It is easier to recognize the mistakes of others than our own.

    • Priming is a phenomenon where we perform a behavior unconsciously as a result of a prior stimulus. 1

    • What is familiar to us, or more perceptually fluent and pleasant, will appear to be true even when it is not. Inversely, cognitive strain and discomfort stimulate System 2 and encourage active thinking.

      • From an evolutionary perspective, this follows from the strategy of associating the novel with the potentially dangerous.
    • Repeated exposure yields normalcy, which reduces cognitive load. Conversely, when something does not fit the current context of activated ideas, the system detects an abnormality and we try to resolve the incongruity between expectations (or norms) and reality.

      • We seek to impose on everything a cause or attribution that aligns with our expectations, even when a more refined, statistical approach would be better. We favor certainty over doubt.
      • Hence, we jump to conclusions more easily when the risk is low and thinking things through does not seem worth the effort, despite having an incomplete picture. To System 1, what you see is all there is.
      • We favor using recent events or context to guide our interpretations. When both are unavailable, we turn to past experience. Such is the spirit of confirmation bias.
      • We favor things we like or perceive as likeable. This is the Halo Effect. We favor first impressions.
    • System 1 favors aggregates. System 2 favors details. System 1 favors associations and correlations, even those that make no realistic sense. Most people are bad at statistics.

      • The mental shotgun: System 1 computes more than we need when making judgments.
      • System 1 is prone to substituting a target question with a simpler heuristic question which is convenient to answer but sometimes leads to errors.
      • When emotions become involved, System 2 becomes more of an apologist than a critic. It is biased towards confirmation and affirmation of these emotions and beliefs.
  • To reduce error, decorrelate it: average over independent samples or judgments, since independent errors tend to cancel.
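A small simulation can sketch why independence matters (the numbers here are hypothetical). Suppose each judge's error splits into a component shared across judges and an independent component: averaging cancels only the independent part.

```python
import random

random.seed(0)
TRUTH = 100.0

def mean_abs_error(shared_sd: float, judges: int, trials: int = 2000) -> float:
    """Average absolute error of the mean of `judges` estimates, where each
    estimate = truth + shared (correlated) error + independent noise."""
    total = 0.0
    for _ in range(trials):
        shared = random.gauss(0, shared_sd)  # error common to all judges
        avg = sum(TRUTH + shared + random.gauss(0, 10)
                  for _ in range(judges)) / judges
        total += abs(avg - TRUTH)
    return total / trials

# Independent errors: averaging 25 judges shrinks the error dramatically.
print(mean_abs_error(0, 1) > mean_abs_error(0, 25))    # True
# Correlated errors: the shared component survives any amount of averaging.
print(mean_abs_error(10, 25) > mean_abs_error(0, 25))  # True
```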

  • Intuition is nothing but unconscious memory.

    • Intuition becomes more valid for strongly valid skills, that is, those learned in a predictable environment where one can practice and get feedback (see more)
  • Some common heuristics, biases, and mistakes:

    • The Law of Small Numbers — deriving conclusions from small sample sizes, even when Statistics dictates larger sample sizes give more precise results. In general, System 1 is not good at considering sample size. Small samples give more drastic, extreme, inconsistent results.
      • This is coupled with the inability to accept that life is random.
      • We pay more attention to the content of the messages rather than their reliability.
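The small-sample effect is easy to demonstrate with a quick simulation (illustrative numbers): for a fair coin, "extreme" sample proportions are common in small samples and nearly vanish in large ones.

```python
import random

random.seed(1)

def extreme_rate(n: int, trials: int = 5000) -> float:
    """Fraction of size-n samples from a fair coin whose observed
    proportion of heads deviates from 0.5 by more than 0.15."""
    hits = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(n))
        hits += abs(heads / n - 0.5) > 0.15
    return hits / trials

# Small samples produce extreme results far more often than large ones.
print(extreme_rate(10) > extreme_rate(200))  # True
```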
    • Anchoring - the tendency for estimates of an unknown quantity to be biased toward a value considered beforehand. It takes effortful adjustment to move away from an anchor.
    • Availability - the tendency to judge something based on how readily it comes to mind. It substitutes judgments of frequency with the ease with which example instances can be retrieved (System 1), but switches to weighing the content of what is retrieved rather than its fluency when System 2 is more engaged.
      • A consequence of this is the availability cascade, a self-reinforcing cycle: what the media reports influences our judgments, and our judgments in turn influence what gets reported.
      • The affect heuristic - instead of asking "What do I think about it?", we substitute "What do I feel about it?"; feelings are simple (like or dislike), in contrast to the nuanced, non-binary nature of most stances.
      • A consequence of this is giving more importance to small risks.
    • Base Rate Fallacy - ignoring base rates (the relative sizes of the categories involved) in favor of representativeness, implicitly substituting the assumption that all categories are equally likely.
      • We struggle with dealing with proportions.
      • We mistake what is plausible for what is probable.
      • When a base rate is framed causally, we no longer neglect it: we treat it as a cause acting on the individual case. This is how we treat and use stereotypes.
      • Individuals feel relieved of responsibility when they know that others have heard the same request for help. This is the Bystander effect.
      • Subjects’ unwillingness to deduce the particular from the general was matched only by their willingness to infer the general from the particular.
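Kahneman's cab problem makes base-rate neglect concrete: 15% of a city's cabs are Blue, a witness identifies cab colors correctly 80% of the time, and representativeness tempts us to answer "80% it was Blue". Bayes' rule, combining the base rate with the evidence, gives a much lower number:

```python
def posterior(prior: float, hit_rate: float, false_alarm: float) -> float:
    """P(hypothesis | evidence) by Bayes' rule."""
    p_evidence = prior * hit_rate + (1 - prior) * false_alarm
    return prior * hit_rate / p_evidence

# P(cab is Blue | witness says Blue): the base rate drags 80% down to ~41%.
print(round(posterior(0.15, 0.80, 0.20), 2))  # → 0.41
```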
    • Regression to the mean - when one sample is extreme (e.g., by luck), the next sample is likely to be closer to the mean.
      • This leads to the phenomenon where we are punished for being nice and rewarded for being nasty.
      • Regression and correlation are two perspectives on the same concept: the lower the correlation, the more pronounced the regression to the mean at the extremes of the population.
      • Thus, regression to the mean is not a byproduct of causation; it is a consequence of imperfect (specifically weak) correlation.
      • The implication for prediction is that we should not simply trust our intuition, but temper it by the correlation between the evidence and the outcome: the lower the correlation, the more we should move our estimate toward the mean (intuitively, high correlation means our intuitive estimate was likely close to correct).
        • This is Bayesian in spirit: the baseline mean acts as the prior, the intuitive estimate as the evidence, and the correlation as a measure of that evidence's reliability.
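The recipe above can be written down directly (the numbers below are hypothetical): anchor on the baseline mean, then move toward the intuitive estimate in proportion to the correlation.

```python
def corrected_prediction(baseline: float, intuition: float, r: float) -> float:
    """Regress an intuitive estimate toward the baseline mean; r is the
    correlation between the evidence and the outcome (0 = worthless
    evidence, 1 = perfectly diagnostic)."""
    return baseline + r * (intuition - baseline)

# Baseline GPA 3.0; intuition says 3.8 after a glowing interview.
print(round(corrected_prediction(3.0, 3.8, 0.3), 2))  # → 3.24, near the mean
print(round(corrected_prediction(3.0, 3.8, 0.9), 2))  # → 3.72, near intuition
```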
  • We ignore luck and randomness as factors in everyday life. We ignore that the world is very complex, and even then we are inconsistent when judging complex information because we rely on System 1.

    • Narrative fallacies arise from favoring simple explanations of the world. We are drawn to good stories, which means simple stories, because they make us feel we understand the world.

    • We build the best possible story with the information available; the story becomes better with less information.

    • We believe we understand the past, which implies the future should also be knowable; in fact, we understand the past less than we believe we do.

    • Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight. The worse the consequence, the greater the hindsight bias.

      • Hindsight is worse when you think a little, just enough to tell yourself later, “I almost made a better choice.”
    • Remember: In the presence of randomness, regular patterns can only be mirages.

    • We think we are better than we actually are: ignorance of ignorance. Even experts are prone to this, especially those asked to "predict", and those skilled in low-validity environments, where even a simple formula can outmatch expert intuition.

    • Those who know more forecast only very slightly better than those who know less. But those with the most knowledge are often less reliable, because added knowledge breeds an illusion of skill and overconfident predictions.

    • Often, simple formulas are better (following Occam's razor). Intuition adds value, but only after a disciplined collection of objective information on dimensions that are scored separately and independently of one another.

    • We are biased toward ignoring information we don't believe in. We ignore the outside view: the perspective drawn from other comparable cases or other people.

    • The people who have the greatest influence on the lives of others are likely to be optimistic and overconfident, and to take more risks than they realize. One solution to blind optimism is to legitimize doubt.

    • Once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws.

  • People are risk averse. They prefer a safe, guaranteed outcome to a riskier gamble, even one with higher expected utility.

    • We think not just in terms of the current level of our utility (happiness) but also of relative changes in utility. Utility is non-Markovian.

    • See more on Prospect Theory

    • When both a gain and a loss are possible, loss aversion causes risk-averse choices.

    • In bad choices, where a sure loss is compared to a larger loss that is merely probable, diminishing sensitivity causes risk seeking. (see here).

    • In the context of buying and selling, our reference points include both obtaining the good and giving it up.

    • Prospect Theory cannot deal with disappointment and regret.

      • An abnormal event attracts attention, and it also activates the idea of the event that would have been normal under the same circumstances. Regret is made in comparison to a norm.
    • We weight changes in probability near certainty (~0% or ~100%) far more heavily than equal changes in the middle (~50%), even though the marginal gain in expected value is the same.

    • We tend to overweight small risks and are willing to pay far more than expected value to eliminate them altogether

    • We tend to overestimate the probabilities of unlikely events, and to overweight them when making decisions. However, rare events tend not to be overweighted when we decide from experience (possibly because, being rare, they are underrepresented in our experience).
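These patterns are captured by prospect theory's value and probability-weighting functions. The sketch below uses Tversky and Kahneman's published 1992 parameter estimates (roughly α ≈ 0.88, λ ≈ 2.25, γ ≈ 0.61); treat the exact numbers as illustrative.

```python
def value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Prospect-theory value function: concave for gains, convex and
    steeper for losses (loss aversion)."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def weight(p: float, gamma: float = 0.61) -> float:
    """Decision weight: overweights small probabilities and
    underweights large ones."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

print(weight(0.01) > 0.01)           # True: small risks loom large
print(weight(0.99) < 0.99)           # True: near-certainty is discounted
print(value(100) + value(-100) < 0)  # True: losses outweigh equal gains
```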

    • When estimating risks, a percentage (a 25% chance of X) feels less alarming than an equivalent frequency (250 in 1,000), because frequencies evoke vivid instances. Focal attention and salience contribute to overestimation.

    • Every simple choice formulated in terms of gains and losses can be deconstructed in innumerable ways into a combination of choices, yielding preferences that are likely to be inconsistent.

    • When faced with multiple decisions, the rational approach is broad framing: considering the decisions jointly and the overall utility over time. However, humans tend to do narrow framing, considering each decision in isolation and only its instantaneous utility.

      • Narrow framing manifests in compartmentalization — performing mental accounting to keep track of many simple things.
      • We prefer things that are framed as gains rather than an equivalent thing framed as a loss.
      • Good frames, even if they lead to faulty reasoning, can lead to good outcomes (e.g., by preventing sunk-cost thinking)
    • The ultimate currency that rewards or punishes is often emotional.

      • Hence the sunk cost fallacy.
      • We’d rather sell winners than losers. This is the disposition effect.
      • We have a more emotional reaction to action than inaction.
  • We normally experience life in the between-subjects mode, in which contrasting alternatives that might change your mind are absent. Thus, how things are judged in isolation and how things are judged together can be inconsistent, even though joint judgment is broader and more rational.

  • Two selves: The experiencing self is the one that assesses pain and pleasure moment to moment. The remembering self assesses the overall experience. We only think of the experience from the remembering self.

    • We confuse experience with memory of the experience.
    • The remembering self constructs a narrative that best captures the essence of the experience.
      • Empathy takes the form of concern for the quality of the story of people’s lives rather than their feelings.
      • We all care intensely for the narrative of our own life and very much want it to be a good story, with a decent hero.
    • The memory of an experience is dominated by its peak (its best or worst moment) and by its end; duration is neglected.
    • We cannot fully trust our preferences to reflect our interests, even if they are based on personal experience. Our preference is based on memory, not experience.
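The peak-end idea can be sketched with a toy model (an illustrative simplification, not the book's exact formulation): score a remembered episode as the average of its most intense moment and its final moment, ignoring duration entirely.

```python
def remembered_score(moments: list[float]) -> float:
    """Peak-end sketch: memory keeps the peak (most intense moment) and
    the end; total duration and total pain are neglected."""
    peak = max(moments, key=abs)
    return (peak + moments[-1]) / 2

short = [-8.0, -7.0]                  # brief episode, ends at high pain
extended = [-8.0, -7.0, -5.0, -2.0]   # more total pain, but a milder ending
# The longer episode is remembered as less bad (the cold-hand finding):
print(remembered_score(extended) > remembered_score(short))  # → True
```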
  • A more pleasurable experience can be achieved by switching from passive leisure to active leisure. Control happiness by controlling how time is used.

    • Happiness is a state of mind. The wellness of life is evaluated with respect to one’s current mood.
    • Goals influence what happens to us, where we end up and how satisfied we are.
    • The Focusing Illusion: Nothing in life is as important as you think it is when you are thinking about it.
    • Thoughts of any aspect of life are more likely to be salient if a contrasting alternative is highly available.
    • Adaptation to a new situation, whether good or bad, consists in large part of thinking less and less about it. Thus most long-term circumstances are part-time states that one inhabits when one attends to them.
    • Experiences that will retain their attention value in the long term are appreciated less than they deserve to be.
  • Organizations are better than individuals when it comes to avoiding errors, because they naturally think more slowly and have the power to impose orderly procedures 2

Footnotes

  1. Note: The validity of priming research has been called into question due to failed replications. That said, priming is a fascinating concept.

  2. The quotations at the end of each chapter are good ways to summarize the heart of the chapter.