Thinking, Fast and Slow

Author: Daniel Kahneman
Country: United States
Language: English
Subject: Psychology
Genre: Non-fiction
Publisher: Farrar, Straus and Giroux
Publication date: 2011
Media type: Print (hardcover, paperback)
Pages: 499
ISBN: 978-0374275631
OCLC: 706020998

Thinking, Fast and Slow is a best-selling[1] 2011 book by Nobel Memorial Prize in Economics winner Daniel Kahneman that summarizes research he conducted over decades, often in collaboration with Amos Tversky.[2][3] It covers all three phases of his career: his early days working on cognitive biases, his work on prospect theory, and his later work on happiness.

The book's central thesis is a dichotomy between two modes of thought: "System 1" is fast, instinctive and emotional; "System 2" is slower, more deliberative, and more logical. The book delineates cognitive biases associated with each type of thinking, starting with Kahneman's own research on loss aversion. From framing choices to people's tendency to substitute an easy-to-answer question for one that is harder, the book highlights several decades of academic research to suggest that people place too much confidence in human judgement.

Prospect theory

Kahneman developed prospect theory, the basis for his Nobel prize, to account for departures from Daniel Bernoulli's traditional utility theory that he observed in experiments. Traditional utility theory makes assumptions of economic rationality that do not reflect people's actual choices, and it does not take behavioral biases into account.

One example is that people are loss-averse: they are more likely to act to avert a loss than to achieve a gain. Another example is that the value people place on a change in probability (e.g., of winning something) depends on the reference point: people appear to place greater value on a change from 0% to 10% (going from impossibility to possibility) than from, say, 45% to 55%, and they place the greatest value of all on a change from 90% to 100% (going from possibility to certainty). This occurs despite the fact that all three changes give the same increase in utility. Consistent with loss-aversion, the order of the first and third of those is reversed when the event is presented as losing rather than winning something: there, the greatest value is placed on eliminating the probability of a loss to 0.
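
This pattern of possibility and certainty effects can be made concrete with a probability weighting function. The inverse-S form and the parameter below come from Tversky and Kahneman's 1992 cumulative prospect theory rather than from this book, so the following Python sketch is illustrative only:

    # Sketch of the possibility and certainty effects, assuming the
    # probability weighting function of Tversky & Kahneman (1992):
    # w(p) = p^g / (p^g + (1 - p)^g)^(1/g), with g ~ 0.61 for gains.
    def w(p, g=0.61):
        """Decision weight that an objective probability p actually carries."""
        return p**g / (p**g + (1 - p)**g) ** (1 / g)

    # Three changes of identical objective size (+10 percentage points):
    print(w(0.10) - w(0.00))   # 0% -> 10%: possibility effect, ~0.19
    print(w(0.55) - w(0.45))   # 45% -> 55%: mid-range change, ~0.05
    print(w(1.00) - w(0.90))   # 90% -> 100%: certainty effect, ~0.29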

In 2012, the American Economic Association's Journal of Economic Literature published a review of Thinking, Fast and Slow that discusses Kahneman's account of prospect theory in depth, along with an analysis of the four fundamental factors on which it rests (pages 7–9).[4]

Two systems

In the book's first section, Kahneman describes two different ways the brain forms thoughts:

  • System 1: Fast, automatic, frequent, emotional, stereotypic, subconscious
  • System 2: Slow, effortful, infrequent, logical, calculating, conscious

Kahneman covers a number of experiments that purport to highlight the differences between these two thought systems and how they arrive at different results even given the same inputs. Terms and concepts include coherence, attention, laziness, association, jumping to conclusions, and how one forms judgments. The System 1 vs. System 2 distinction bears on how, and how well, humans reason when making decisions, with implications for fields such as market research.[5]
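
A standard illustration from the book is the bat-and-ball puzzle: a bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball. The sketch below (a minimal illustration, not code from any source) contrasts the System 1 answer that leaps to mind with the System 2 answer that requires deliberate algebra:

    # Bat-and-ball puzzle: bat + ball = 110 cents, bat = ball + 100 cents.
    total, difference = 110, 100   # amounts in cents

    # "System 1": the intuitive answer -- just strip the dollar off.
    intuitive_ball = total - difference           # 10 cents (wrong)

    # "System 2": solve ball + (ball + difference) = total.
    deliberate_ball = (total - difference) // 2   # 5 cents (right)

    # Check the deliberate answer: 5 + 105 = 110 and 105 - 5 = 100.
    assert deliberate_ball + (deliberate_ball + difference) == total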

Heuristics and biases

The second section offers explanations for why humans struggle to think statistically. It begins by documenting a variety of situations in which we either arrive at binary decisions or fail to precisely associate reasonable probabilities with outcomes. Kahneman explains this phenomenon using the theory of heuristics, a topic he and Tversky originally covered in their landmark 1974 article "Judgment under Uncertainty: Heuristics and Biases".[6]

Kahneman uses heuristics to assert that System 1 thinking involves associating new information with existing patterns, or thoughts, rather than creating new patterns for each new experience. For example, a child who has only seen shapes with straight edges would perceive a circle as something like an octagon rather than a triangle when first viewing it: System 1 assimilates the new shape to the closest familiar pattern. In a legal metaphor, a judge limited to heuristic thinking would only be able to think of similar historical cases when presented with a new dispute, rather than seeing the unique aspects of that case. In addition to offering an explanation for the statistical problem, the theory also offers an explanation for human biases.
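
As an analogy only (no such model appears in the book), this kind of assimilation resembles a nearest-prototype matcher: a new stimulus is mapped onto the closest stored pattern instead of receiving a category of its own. A hypothetical Python sketch:

    # Hypothetical analogy for System 1 pattern-matching: a new shape is
    # assimilated to the nearest stored prototype (here, by edge count).
    known_shapes = {"triangle": 3, "square": 4, "octagon": 8}

    def perceive(edge_count):
        """Return the stored pattern closest to the new stimulus."""
        return min(known_shapes, key=lambda s: abs(known_shapes[s] - edge_count))

    # A circle approximated by many short edges lands on the octagon,
    # not the triangle: the closest familiar straight-edged shape wins.
    print(perceive(12))   # -> 'octagon'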

Anchoring

The "anchoring effect" names our tendency to be influenced by irrelevant numbers. Shown higher/lower numbers, experimental subjects gave higher/lower responses.[2]

This is an important concept to have in mind when navigating a negotiation or considering a price. As an illustrative example, you might find it interesting that most people, when asked whether Gandhi was more than 114 years old when he died, will provide a much larger estimate of his age at death than they would if the anchoring question referred to death at 35 years old. The funny and powerful examples shared here show that our behavior is influenced, much more than we know or want, by the environment of the moment.

Availability

The availability heuristic is a mental shortcut that occurs when people make judgments about the probability of events on the basis of how easy it is to think of examples. The availability heuristic operates on the notion that, "if you can think of it, it must be important." The availability of consequences associated with an action is positively related to perceptions of the magnitude of the consequences of that action. In other words, the easier it is to recall the consequences of something, the greater we perceive these consequences to be. Sometimes, this heuristic is beneficial, but the frequencies at which events come to mind are usually not accurate reflections of the probabilities of such events in real life.[7][8]

Substitution

System 1 is prone to substituting a difficult question with a simpler one. In what Kahneman calls their "best-known and most controversial" experiment, "the Linda problem," subjects were told about an imaginary Linda, young, single, outspoken, and very bright, who, as a student, was deeply concerned with discrimination and social justice. Subjects were then asked whether it was more probable that Linda is a bank teller or that she is a bank teller and an active feminist. The overwhelming response was that "feminist bank teller" was more likely than "bank teller," violating the laws of probability. (Every feminist bank teller is a bank teller.) In this case System 1 substituted the easier question, "Is Linda a feminist?", dropping the occupation qualifier. An alternative view is that the subjects added an unstated cultural implicature: that the other answer implied an exclusive or (xor), i.e., that Linda was not a feminist.[2]
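
The violated rule is the conjunction rule, P(A and B) ≤ P(B): a conjunction can never be more probable than either of its conjuncts. A minimal sketch with made-up counts (the numbers are arbitrary; only the inequality matters):

    # Conjunction rule behind the Linda problem: P(A and B) <= P(B).
    # The counts are arbitrary; the inequality holds for any population.
    population = 1000
    bank_tellers = 50               # all bank tellers
    feminist_bank_tellers = 20      # a subset of the bank tellers

    p_teller = bank_tellers / population                     # 0.05
    p_feminist_teller = feminist_bank_tellers / population   # 0.02

    # Every feminist bank teller is a bank teller, so this always holds:
    assert p_feminist_teller <= p_teller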

Optimism and loss aversion

Kahneman writes of a "pervasive optimistic bias", which "may well be the most significant of the cognitive biases." This bias generates the illusion of control: the sense that we have substantial control over our lives. It may be usefully adaptive: optimists are more psychologically resilient and have stronger immune systems than their more reality-based counterparts.[citation needed] Optimists are also widely, but wrongly, thought to have longer lives on average, a common belief that was disproved by the Longevity Project. Optimism protects against loss aversion: people's tendency to fear losses more than they value gains.[2]

A natural experiment reveals the prevalence of one kind of unwarranted optimism. The planning fallacy is the tendency to overestimate benefits and underestimate costs, impelling people to take on risky projects. In 2002, Americans remodeling their kitchens expected the job to cost $18,658 on average; the actual average cost was $38,769, more than double the estimate.[2]

To explain overconfidence, Kahneman introduces the concept he labels What You See Is All There Is (WYSIATI). This theory states that when the mind makes decisions, it deals primarily with Known Knowns, phenomena it has already observed. It rarely considers Known Unknowns, phenomena that it knows to be relevant but about which it has no information. Finally, it appears oblivious to the possibility of Unknown Unknowns, unknown phenomena of unknown relevance.

He explains that humans fail to take into account complexity and that their understanding of the world consists of a small and necessarily unrepresentative set of observations. Furthermore, the mind generally does not account for the role of chance and therefore falsely assumes that a future event will mirror a past event.

Framing

Framing is the context in which choices are presented. In one experiment, some subjects were asked whether they would opt for surgery if told that the "survival" rate is 90 percent, while others were told that the mortality rate is 10 percent. The first framing increased acceptance, even though the two descriptions are equivalent.[9]

Sunk-cost

Rather than consider the odds that an incremental investment would produce a positive return, people tend to "throw good money after bad" and continue investing in projects with poor prospects that have already consumed significant resources. In part this is to avoid feelings of regret.[9]
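
The normative rule implied here can be sketched in a few lines (the function and figures are hypothetical, not from the book): whether to continue should depend only on the incremental cost and its expected payoff; the amount already sunk never enters the comparison.

    # Hypothetical sketch of the rational rule: compare the incremental
    # cost with the expected incremental payoff. The sunk amount is taken
    # as an argument only to show that it is never used.
    def should_continue(incremental_cost, success_prob, payoff, sunk_cost=0):
        # sunk_cost is intentionally ignored: it is gone either way.
        return success_prob * payoff > incremental_cost

    # The decision is the same whether $0 or $10M has already been spent:
    print(should_continue(5_000_000, 0.2, 8_000_000, sunk_cost=0))           # False
    print(should_continue(5_000_000, 0.2, 8_000_000, sunk_cost=10_000_000))  # False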

Choices

In this section Kahneman returns to economics and expands his seminal work on prospect theory. He discusses the tendency for problems to be addressed in isolation and how, when other reference points are considered, the choice of reference point (called a frame) has a disproportionate impact on the outcome. This section also offers advice on how some of the shortcomings of System 1 thinking can be avoided.

Rationality and happiness

Evolution teaches that traits persist and develop because they increase fitness. One possible hypothesis is that our conceptual biases are adaptive, as are our rational faculties. Kahneman offers happiness as one quality that our thinking process nurtures. Kahneman first took up this question in the 1990s. At the time most happiness research relied on polls about life satisfaction.

Two selves

Kahneman proposed an alternate measure that assessed pleasure or pain sampled from moment to moment, and then summed over time. Kahneman called this "experienced" well-being and attached it to a separate "self". He distinguished this from the "remembered" well-being that the polls had attempted to measure. He found that these two measures of happiness diverged. His major discovery was that the remembering self does not care about the duration of a pleasant or unpleasant experience. Instead, it rates an experience retrospectively by its peak (or valley) and by the way it ends. Further, it was the remembering self, not the experiencing self, that dominated patients' ultimate verdicts on their experiences.
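
The divergence is easy to sketch numerically. Experienced well-being tracks the sum of moment-by-moment feelings, while the remembering self follows what Kahneman calls the peak-end rule, averaging the most intense moment with the final one and ignoring duration. The pain ratings below are made up for illustration:

    # Made-up moment-by-moment pain ratings (0-10) for two procedures.
    # 'longer' repeats 'shorter' and then adds a milder tail, so it
    # contains strictly more total pain.
    shorter = [2, 5, 8, 8]
    longer = [2, 5, 8, 8, 4, 3, 1]

    def experienced(moments):
        """Experiencing self: total pain summed over time."""
        return sum(moments)

    def remembered(moments):
        """Remembering self (peak-end rule): mean of peak and final moment."""
        return (max(moments) + moments[-1]) / 2

    print(experienced(shorter), experienced(longer))   # 23 31: longer hurts more as lived
    print(remembered(shorter), remembered(longer))     # 8.0 4.5: but is remembered as better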

"Odd as it may seem," Kahneman writes, "I am my remembering self, and the experiencing self, who does my living, is like a stranger to me."[3][lower-alpha 1]

Footnotes

  1. Brain-scanning experiments by Rafael Malach showed that when subjects are absorbed in an experience, such as watching a movie, the parts of the brain associated with self-consciousness are not merely quiet, they're actually shut down ("inhibited") by the rest of the brain. The self seems simply to disappear. If the self is not participating in the experience, how does the remembering self get its data?[citation needed]

References

  1. "The New York Times Best Sellers: Hardcover Nonfiction". The New York Times. December 25, 2011.
  2. Kahneman, Daniel (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux. ISBN 978-0374275631.
  3. Holt, Jim (November 27, 2011). "Two Brains Running". The New York Times.
  4. Shleifer, Andrei (2012). "Psychologists at the Gate: A Review of Daniel Kahneman's Thinking, Fast and Slow". Journal of Economic Literature 50 (4): 1080–1091.
  5. 
  6. Tversky, Amos; Kahneman, Daniel (1974). "Judgment under Uncertainty: Heuristics and Biases". Science 185 (4157): 1124–1131.
  7. 
  8. (subscription required)
  9. 
