Thinking, Fast and Slow

by Daniel Kahneman

Read: 2017-03-21, Rating: 5/10.

Three Sentence Summary

The master of all pop-psychology books, full of rich and fascinating studies, dives into how our two systems, the automatic System 1 and the effortful System 2, shape our judgements and decisions. We have excess confidence in what we believe we know, yet our minds are unable to acknowledge the full extent of our ignorance and the uncertainty of the world we live in. A huge, thorough book, but unfortunately my System 2 was too weak to get through it.

Two Systems:

  • System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control.
  • System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.

The attentive System 2 is who we think we are. System 2 articulates judgements and makes choices, but it often endorses or rationalizes ideas and feelings that are generated by System 1. System 2 also prevents many foolish thoughts and inappropriate impulses from overt expression. Its abilities are limited and so is the knowledge to which it has access. We do not always think straight when we reason, and the errors are not always due to intrusive and incorrect intuitions. We often make mistakes because we (our System 2) do not know any better.

Our thoughts and actions are routinely guided by System 1 and are generally on the mark. The acquisition of skills requires a regular environment, an adequate opportunity to practice, and rapid and unequivocal feedback about the correctness of thoughts and actions. When these conditions are fulfilled, skill eventually develops, and the intuitive judgements and choices that quickly come to mind will mostly be accurate. All this is the work of System 1, which means it occurs automatically and fast.

System 1 registers the cognitive ease with which it processes information, but it does not generate a warning signal when it becomes unreliable.

What can be done about biases?

The way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2.
