Thinking, fast and slow

Daniel Kahneman, 1934-2024

Book - 2011

Kahneman exposes the extraordinary capabilities, and also the faults and biases, of fast thinking, and the pervasive influence of intuitive impressions on people's thoughts and choices.


Location: 2nd Floor
Call Number: 153.42/Kahneman
Status: 1 of 4 copies available
Published
New York : Farrar, Straus and Giroux 2011.
Language
English
Main Author
Daniel Kahneman, 1934-2024
Edition
First edition
Physical Description
499 pages : illustrations ; 24 cm
Bibliography
Includes bibliographical references (pages 447-481) and index.
ISBN
9780374275631
  • Introduction
  • Part I. Two Systems
  • 1. The Characters of the Story
  • 2. Attention and Effort
  • 3. The Lazy Controller
  • 4. The Associative Machine
  • 5. Cognitive Ease
  • 6. Norms, Surprises, and Causes
  • 7. A Machine for Jumping to Conclusions
  • 8. How Judgments Happen
  • 9. Answering an Easier Question
  • Part II. Heuristics and Biases
  • 10. The Law of Small Numbers
  • 11. Anchors
  • 12. The Science of Availability
  • 13. Availability, Emotion, and Risk
  • 14. Tom W's Specialty
  • 15. Linda: Less is More
  • 16. Causes Trump Statistics
  • 17. Regression to the Mean
  • 18. Taming Intuitive Predictions
  • Part III. Overconfidence
  • 19. The Illusion of Understanding
  • 20. The Illusion of Validity
  • 21. Intuitions vs. Formulas
  • 22. Expert Intuition: When Can We Trust It?
  • 23. The Outside View
  • 24. The Engine of Capitalism
  • Part IV. Choices
  • 25. Bernoulli's Errors
  • 26. Prospect Theory
  • 27. The Endowment Effect
  • 28. Bad Events
  • 29. The Fourfold Pattern
  • 30. Rare Events
  • 31. Risk Policies
  • 32. Keeping Score
  • 33. Reversals
  • 34. Frames and Reality
  • Part V. Two Selves
  • 35. Two Selves
  • 36. Life as a Story
  • 37. Experienced Well-Being
  • 38. Thinking About Life
  • Conclusions
  • Appendix A. Judgment Under Uncertainty
  • Appendix B. Choices, Values, and Frames
  • Notes
  • Acknowledgments
  • Index
Review by Choice Review

This is an important book from an important scholar. Winner of the 2002 Nobel Prize in economics for work on decision making conducted with his late colleague, Amos Tversky, Kahneman (emer., psychology, Princeton; public affairs, Princeton's Woodrow Wilson School of Public and International Affairs) practically invented the discipline of behavioral economics and more generally transformed the entire approach to the psychology of decision making. This book makes, among many other things, two major contributions. First, Kahneman provides a substantial review and synthesis of the body of research he did with Tversky. Second, he explicates, and organizes the work around, his recent model of ways of thinking: System 1--fast, intuitive, emotional; System 2--slower, more deliberative, logical. Kahneman explores the consequences of this distinction in a variety of domains. Summing Up: Essential. Upper-division undergraduates through faculty and professionals; general readers. R. Levine, California State University--Fresno

Copyright American Library Association, used with permission.
Review by New York Times Review

IN 2002, Daniel Kahneman won the Nobel in economic science. What made this unusual - indeed, unique in the history of the prize - is that Kahneman is a psychologist. Specifically, he is one-half of a pair of psychologists who, beginning in the early 1970s, set out to dismantle an entity long dear to economic theorists: that arch-rational decision maker known as Homo economicus. The other half of the dismantling duo, Amos Tversky, died in 1996 at the age of 59. Had Tversky lived, he would certainly have shared the Nobel with Kahneman, his longtime collaborator and dear friend.

Human irrationality is Kahneman's great theme. There are essentially three phases to his career. In the first, he and Tversky did a series of ingenious experiments that revealed twenty or so "cognitive biases" - unconscious errors of reasoning that distort our judgment of the world. Typical of these is the "anchoring effect": our tendency to be influenced by irrelevant numbers that we happen to be exposed to. (In one experiment, for instance, experienced German judges were inclined to give a shoplifter a longer sentence if they had just rolled a pair of dice loaded to give a high number.) In the second phase, Kahneman and Tversky showed that people making decisions under uncertain conditions do not behave in the way that economic models have traditionally assumed; they do not "maximize utility." The two then developed an alternative account of decision making, one more faithful to human psychology, which they called "prospect theory." (It was for this achievement that Kahneman was awarded the Nobel.) In the third phase of his career, mainly after the death of Tversky, Kahneman has delved into "hedonic psychology": the science of happiness, its nature and its causes. His findings in this area have proved disquieting - and not just because one of the key experiments involved a deliberately prolonged colonoscopy. "Thinking, Fast and Slow" spans all three of these phases. 
It is an astonishingly rich book: lucid, profound, full of intellectual surprises and self-help value. It is consistently entertaining and frequently touching, especially when Kahneman is recounting his collaboration with Tversky. ("The pleasure we found in working together made us exceptionally patient; it is much easier to strive for perfection when you are never bored.") So impressive is its vision of flawed human reason that the New York Times columnist David Brooks recently declared that Kahneman and Tversky's work "will be remembered hundreds of years from now," and that it is "a crucial pivot point in the way we see ourselves." They are, Brooks said, "like the Lewis and Clark of the mind." Now, this worries me a bit. A leitmotif of this book is overconfidence. All of us, and especially experts, are prone to an exaggerated sense of how well we understand the world - so Kahneman reminds us. Surely, he himself is alert to the perils of overconfidence. Despite all the cognitive biases, fallacies and illusions that he and Tversky (along with other researchers) purport to have discovered in the last few decades, he fights shy of the bold claim that humans are fundamentally irrational. Or does he? "Most of us are healthy most of the time, and most of our judgments and actions are appropriate most of the time," Kahneman writes in his introduction. Yet, just a few pages later, he observes that the work he did with Tversky "challenged" the idea, orthodox among social scientists in the 1970s, that "people are generally rational." The two psychologists discovered "systematic errors in the thinking of normal people": errors arising not from the corrupting effects of emotion, but built into our evolved cognitive machinery. Although Kahneman draws only modest policy implications (e.g., contracts should be stated in clearer language), others - perhaps overconfidently? - go much further. 
Brooks, for example, has argued that Kahneman and Tversky's work illustrates "the limits of social policy"; in particular, the folly of government action to fight joblessness and turn the economy around. Such sweeping conclusions, even if they are not endorsed by the author, make me frown. And frowning - as one learns on Page 152 of this book - activates the skeptic within us: what Kahneman calls "System 2." Just putting on a frown, experiments show, works to reduce overconfidence; it causes us to be more analytical, more vigilant in our thinking; to question stories that we would otherwise unreflectively accept as true because they are facile and coherent. And that is why I frowningly gave this extraordinarily interesting book the most skeptical reading I could. System 2, in Kahneman's scheme, is our slow, deliberate, analytical and consciously effortful mode of reasoning about the world. System 1, by contrast, is our fast, automatic, intuitive and largely unconscious mode. It is System 1 that detects hostility in a voice and effortlessly completes the phrase "bread and. . . ." It is System 2 that swings into action when we have to fill out a tax form or park a car in a narrow space. (As Kahneman and others have found, there is an easy way to tell how engaged a person's System 2 is during a task: just look into his or her eyes and note how dilated the pupils are.) More generally, System 1 uses association and metaphor to produce a quick and dirty draft of reality, which System 2 draws on to arrive at explicit beliefs and reasoned choices. System 1 proposes, System 2 disposes. So System 2 would seem to be the boss, right? In principle, yes. But System 2, in addition to being more deliberate and rational, is also lazy. And it tires easily. (The vogue term for this is "ego depletion.") Too often, instead of slowing things down and analyzing them, System 2 is content to accept the easy but unreliable story about the world that System 1 feeds to it. 
"Although System 2 believes itself to be where the action is," Kahneman writes, "the automatic System 1 is the hero of this book." System 2 is especially quiescent, it seems, when your mood is a happy one. AT this point, the skeptical reader might wonder how seriously to take all this talk of System 1 and System 2. Are they actually a pair of little agents in our head, each with its distinctive personality? Not really, says Kahneman. Rather, they are "useful fictions" - useful because they help explain the quirks of the human mind. To see how, consider what Kahneman calls the "best-known and most controversial" of the experiments he and Tversky did together: "the Linda problem." Participants in the experiment were told about an imaginary young woman named Linda, who is single, outspoken and very bright, and who, as a student, was deeply concerned with issues of discrimination and social justice. The participants were then asked which was more probable: (1) Linda is a bank teller. Or (2) Linda is a bank teller and is active in the feminist movement. The overwhelming response was that (2) was more probable; in other words, that given the background information furnished, "feminist bank teller" was more likely than "bank teller." This is, of course, a blatant violation of the laws of probability. (Every feminist bank teller is a bank teller; adding a detail can only lower the probability.) Yet even among students in Stanford's Graduate School of Business, who had extensive training in probability, 85 percent flunked the Linda problem. One student, informed that she had committed an elementary logical blunder, responded, "I thought you just asked for my opinion." What has gone wrong here? An easy question (how coherent is the narrative?) is substituted for a more difficult one (how probable is it?). And this, according to Kahneman, is the source of many of the biases that infect our thinking. 
System 1 jumps to an intuitive conclusion based on a "heuristic" - an easy but imperfect way of answering hard questions - and System 2 lazily endorses this heuristic answer without bothering to scrutinize whether it is logical. Kahneman describes dozens of such experimentally demonstrated breakdowns in rationality - "base-rate neglect," "availability cascade," "the illusion of validity" and so on. The cumulative effect is to make the reader despair for human reason. Are we really so hopeless? Think again of the Linda problem. Even the great evolutionary biologist Stephen Jay Gould was troubled by it. As an expert in probability he knew the right answer, yet he wrote that "a little homunculus in my head continues to jump up and down, shouting at me - 'But she can't just be a bank teller; read the description.'" It was Gould's System 1, Kahneman assures us, that kept shouting the wrong answer at him. But perhaps something more subtle is going on. Our everyday conversation takes place against a rich background of unstated expectations - what linguists call "implicatures." Such implicatures can seep into psychological experiments. Given the expectations that facilitate our conversation, it may have been quite reasonable for the participants in the experiment to take "Linda is a bank teller" to imply that she was not in addition a feminist. If so, their answers weren't really fallacious. This might seem a minor point. But it applies to several of the biases that Kahneman and Tversky, along with other investigators, purport to have discovered in formal experiments. In more natural settings - when we are detecting cheaters rather than solving logic puzzles; when we are reasoning about things rather than symbols; when we are assessing raw numbers rather than percentages - people are far less likely to make the same errors. So, at least, much subsequent research suggests. Maybe we are not so irrational after all. 
Some cognitive biases, of course, are flagrantly exhibited even in the most natural of settings. Take what Kahneman calls the "planning fallacy": our tendency to overestimate benefits and underestimate costs, and hence foolishly to take on risky projects. In 2002, Americans remodeling their kitchens, for example, expected the job to cost $18,658 on average, but they ended up paying $38,769. The planning fallacy is "only one of the manifestations of a pervasive optimistic bias," Kahneman writes, which "may well be the most significant of the cognitive biases." Now, in one sense, a bias toward optimism is obviously bad, since it generates false beliefs - like the belief that we are in control, and not the playthings of luck. But without this "illusion of control," would we even be able to get out of bed in the morning? Optimists are more psychologically resilient, have stronger immune systems, and live longer on average than their more reality-based counterparts. Moreover, as Kahneman notes, exaggerated optimism serves to protect both individuals and organizations from the paralyzing effects of another bias, "loss aversion": our tendency to fear losses more than we value gains. It was exaggerated optimism that John Maynard Keynes had in mind when he talked of the "animal spirits" that drive capitalism. Even if we could rid ourselves of the biases and illusions identified in this book - and Kahneman, citing his own lack of progress in overcoming them, doubts that we can - it is by no means clear that this would make our lives go better. And that raises a fundamental question: What is the point of rationality? We are, after all, Darwinian survivors. Our everyday reasoning abilities have evolved to cope efficiently with a complex and dynamic environment. They are thus likely to be adaptive in this environment, even if they can be tripped up in the psychologist's somewhat artificial experiments. 
Where do the norms of rationality come from, if they are not an idealization of the way humans actually reason in their ordinary lives? As a species, we can no more be pervasively biased in our judgments than we can be pervasively ungrammatical in our use of language - or so critics of research like Kahneman and Tversky's contend. Kahneman never grapples philosophically with the nature of rationality. He does, however, supply a fascinating account of what might be taken to be its goal: happiness. What does it mean to be happy? When Kahneman first took up this question, in the mid 1990s, most happiness research relied on asking people how satisfied they were with their life on the whole. But such retrospective assessments depend on memory, which is notoriously unreliable. What if, instead, a person's actual experience of pleasure or pain could be sampled from moment to moment, and then summed up over time? Kahneman calls this "experienced" well-being, as opposed to the "remembered" well-being that researchers had relied upon. And he found that these two measures of happiness diverge in surprising ways. What makes the "experiencing self" happy is not the same as what makes the "remembering self" happy. In particular, the remembering self does not care about duration - how long a pleasant or unpleasant experience lasts. Rather, it retrospectively rates an experience by the peak level of pain or pleasure in the course of the experience, and by the way the experience ends. These two quirks of remembered happiness - "duration neglect" and the "peak-end rule" - were strikingly illustrated in one of Kahneman's more harrowing experiments. Two groups of patients were to undergo painful colonoscopies. The patients in Group A got the normal procedure. So did the patients in Group B, except - without their being told - a few extra minutes of mild discomfort were added after the end of the examination. Which group suffered more? 
Well, Group B endured all the pain that Group A did, and then some. But since the prolonging of Group B's colonoscopies meant that the procedure ended less painfully, the patients in this group retrospectively minded it less. (In an earlier research paper, though not in this book, Kahneman suggested that the extra discomfort Group B was subjected to in the experiment might be ethically justified if it increased their willingness to come back for a follow-up!) As with colonoscopies, so too with life. It is the remembering self that calls the shots, not the experiencing self. Kahneman cites research showing, for example, that a college student's decision whether or not to repeat a spring-break vacation is determined by the peak-end rule applied to the previous vacation, not by how fun (or miserable) it actually was moment by moment. The remembering self exercises a sort of "tyranny" over the voiceless experiencing self. "Odd as it may seem," Kahneman writes, "I am my remembering self, and the experiencing self, who does my living, is like a stranger to me." KAHNEMAN'S conclusion, radical as it sounds, may not go far enough. There may be no experiencing self at all. Brain-scanning experiments by Rafael Malach and his colleagues at the Weizmann Institute in Israel, for instance, have shown that when subjects are absorbed in an experience, like watching "The Good, the Bad, and the Ugly," the parts of the brain associated with self-consciousness are not merely quiet, they're actually shut down ("inhibited") by the rest of the brain. The self seems simply to disappear. Then who exactly is enjoying the film? And why should such egoless pleasures enter into the decision calculus of the remembering self? Clearly, much remains to be done in hedonic psychology. 
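The duration-neglect and peak-end findings can be sketched numerically. Assuming, purely for illustration, minute-by-minute pain scores for the two colonoscopy groups (these traces are my own invented numbers, not Kahneman's data), a remembered rating modeled as the average of the worst moment and the final moment reverses the ordering given by total pain:

```python
# Illustrative pain traces (0-10 scale, one reading per minute).
# These numbers are assumptions for the sketch, not experimental data.
group_a = [4, 7, 8, 5]          # ends abruptly at a fairly painful moment
group_b = [4, 7, 8, 5, 2, 1]    # same procedure plus milder extra minutes

def total_pain(trace):
    """Experienced well-being: pain summed over the whole episode."""
    return sum(trace)

def remembered_pain(trace):
    """Peak-end rule: memory averages the worst moment and the last one,
    ignoring how long the episode lasted (duration neglect)."""
    return (max(trace) + trace[-1]) / 2

# Group B experiences strictly more pain in total...
assert total_pain(group_b) > total_pain(group_a)
# ...yet remembers the episode as less bad, because it ended mildly.
assert remembered_pain(group_b) < remembered_pain(group_a)
```

The divergence comes entirely from the last element of the trace: extending the episode raises the sum but lowers the peak-end average, which is the paradox the colonoscopy experiment demonstrated.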
But Kahneman's conceptual innovations have laid the foundation for many of the empirical findings he reports in this book: that while French mothers spend less time with their children than American mothers, they enjoy it more; that headaches are hedonically harder on the poor; that women who live alone seem to enjoy the same level of well-being as women who live with a mate; and that a household income of about $75,000 in high-cost areas of the country is sufficient to maximize happiness. Policy makers interested in lowering the misery index of society will find much to ponder here. By the time I got to the end of "Thinking, Fast and Slow," my skeptical frown had long since given way to a grin of intellectual satisfaction. Appraising the book by the peak-end rule, I overconfidently urge everyone to buy and read it. But for those who are merely interested in Kahneman's takeaway on the Malcolm Gladwell question, it is this: If you've had 10,000 hours of training in a predictable, rapid-feedback environment - chess, firefighting, anesthesiology - then blink. In all other cases, think. Jim Holt's new book, "Why Does the World Exist?," will be published next spring.

Copyright (c) The New York Times Company [November 27, 2011]
Review by Booklist Review

Decision making tends to be intuitive rather than logical. Kahneman has dedicated his academic research to understanding why that is so. This work distills his and his colleagues' findings about how we make up our minds and how much we can trust intuition. Clinical experiments on psychology's traditional guinea pigs, college students, abound and collectively batter confidence in System 1, as Kahneman calls intuition. All sorts of biases, sporting tags like the halo effect (i.e., unwarranted attribution of positive qualities to a thing or person one likes), bedevil accurate appraisal of reality. According to Kahneman, intuitive feelings often override System 2, or thinking that requires effort, such as simple arithmetic. Exemplifying his points in arenas as diverse as selecting military officers, speculating in stocks, hiring employees, and starting up businesses, Kahneman accords some reliability to intuitive choice, as long as the decision maker is aware of cognitive illusions (the study of which brought Kahneman the 2002 Nobel Prize in Economics). Kahneman's insights will most benefit those in leadership positions, yet they will also help the average reader to become a better car buyer.--Taylor, Gilbert Copyright 2010 Booklist

From Booklist, Copyright (c) American Library Association. Used with permission.
Review by Library Journal Review

Kahneman (psychology, emeritus, Princeton) won the 2002 Nobel Prize in Economics for his work with Amos Tversky on decision making. In this large, readable book, Kahneman presents provocative theories and groundbreaking research and, moreover, clearly explains both. He postulates two systems of thinking that operate simultaneously but are often at odds: intuitive and deliberative, or fast and slow, respectively. Fast judgments dominate to a greater extent than we know, and to our disadvantage. A key discovery that overcame an effect Kahneman terms "theory-induced blindness" (which refers mainly to fast-thinking mistakes but can occur in slow thinking when our assumptions are wrong or simply interfere with seeing) was that outcomes are better defined by gains and losses than by sums of wealth. "Prospect theory," an idea Kahneman developed with Tversky, posits that, when all our options are bad, we tend to take riskier paths. With Kahneman's expert help, readers may understand this mix of psychology and economics better than most accountants, therapists, or elected representatives. VERDICT A stellar accomplishment, a book for everyone who likes to think and wants to do it better. [See Prepub Alert, 5/9/11]-E. James Lieberman, George Washington Univ. Sch. of Medicine, Washington, DC (c) Copyright 2011. Library Journals LLC, a wholly owned subsidiary of Media Source, Inc. No redistribution permitted.

(c) Copyright Library Journals LLC, a wholly owned subsidiary of Media Source, Inc. No redistribution permitted.
Review by Kirkus Book Review

want to think rigorously about something. The author then explores the nuances of our two-system minds, showing how they perform in various situations. Psychological experiments have repeatedly revealed that our intuitions are generally wrong, that our assessments are based on biases and that our System 1 hates doubt and despises ambiguity. Kahneman largely avoids jargon; when he does use some ("heuristics," for example), he argues that such terms really ought to join our everyday vocabulary. He reviews many fundamental concepts in psychology and statistics (regression to the mean, the narrative fallacy, the optimistic bias), showing how they relate to his overall concerns about how we think and why we make the decisions that we do. Some of the later chapters (dealing with risk-taking and statistics and probabilities) are denser than others (some readers may resent such demands on System 2!), but the passages that deal with the economic and political implications of the research are gripping. Striking research showing the immense complexity of ordinary thought and revealing the identities of the gatekeepers in our minds. Copyright Kirkus Reviews, used with permission.

Copyright (c) Kirkus Reviews, used with permission.

THINKING, FAST AND SLOW (Chapter 1): The Characters of the Story

To observe your mind in automatic mode, glance at the image below.

[Figure 1: a photograph of an angry woman's face]

Your experience as you look at the woman's face seamlessly combines what we normally call seeing and intuitive thinking. As surely and quickly as you saw that the young woman's hair is dark, you knew she is angry. Furthermore, what you saw extended into the future. You sensed that this woman is about to say some very unkind words, probably in a loud and strident voice. A premonition of what she was going to do next came to mind automatically and effortlessly. You did not intend to assess her mood or to anticipate what she might do, and your reaction to the picture did not have the feel of something you did. It just happened to you. It was an instance of fast thinking.

Now look at the following problem:

17 × 24

You knew immediately that this is a multiplication problem, and probably knew that you could solve it, with paper and pencil, if not without. You also had some vague intuitive knowledge of the range of possible results. You would be quick to recognize that both 12,609 and 123 are implausible. Without spending some time on the problem, however, you would not be certain that the answer is not 568. A precise solution did not come to mind, and you felt that you could choose whether or not to engage in the computation. If you have not done so yet, you should attempt the multiplication problem now, completing at least part of it. You experienced slow thinking as you proceeded through a sequence of steps. You first retrieved from memory the cognitive program for multiplication that you learned in school, then you implemented it. Carrying out the computation was a strain. You felt the burden of holding much material in memory, as you needed to keep track of where you were and of where you were going, while holding on to the intermediate result. 
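The "sequence of steps" of the schoolbook procedure can be written out explicitly. A minimal sketch of one common way to decompose 17 × 24 into partial products that must be held as intermediate results (the decomposition is one illustrative choice, not the only one):

```python
# Stepwise multiplication of 17 x 24, mirroring the school procedure:
# split one factor by place value, form the partial products, hold the
# intermediate results in mind, then combine them at the end.
a, b = 17, 24

partial_tens = a * (b // 10) * 10   # 17 * 2 * 10 = 340 (intermediate result)
partial_ones = a * (b % 10)         # 17 * 4 = 68 (intermediate result)

answer = partial_tens + partial_ones
assert answer == a * b == 408
print(answer)  # 408
```

Each line corresponds to one effortful step; the strain Kahneman describes comes from keeping 340 in memory while computing 68.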
The process was mental work: deliberate, effortful, and orderly--a prototype of slow thinking. The computation was not only an event in your mind; your body was also involved. Your muscles tensed up, your blood pressure rose, and your heart rate increased. Someone looking closely at your eyes while you tackled this problem would have seen your pupils dilate. Your pupils contracted back to normal size as soon as you ended your work--when you found the answer (which is 408, by the way) or when you gave up.

Two Systems

Psychologists have been intensely interested for several decades in the two modes of thinking evoked by the picture of the angry woman and by the multiplication problem, and have offered many labels for them. I adopt terms originally proposed by the psychologists Keith Stanovich and Richard West, and will refer to two systems in the mind, System 1 and System 2. System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.

The labels of System 1 and System 2 are widely used in psychology, but I go further than most in this book, which you can read as a psychodrama with two characters. When we think of ourselves, we identify with System 2, the conscious, reasoning self that has beliefs, makes choices, and decides what to think about and what to do. Although System 2 believes itself to be where the action is, the automatic System 1 is the hero of the book. I describe System 1 as effortlessly originating impressions and feelings that are the main sources of the explicit beliefs and deliberate choices of System 2. The automatic operations of System 1 generate surprisingly complex patterns of ideas, but only the slower System 2 can construct thoughts in an orderly series of steps. 
I also describe circumstances in which System 2 takes over, overruling the freewheeling impulses and associations of System 1. You will be invited to think of the two systems as agents with their individual abilities, limitations, and functions. In rough order of complexity, here are some examples of the automatic activities that are attributed to System 1:
  • Detect that one object is more distant than another.
  • Orient to the source of a sudden sound.
  • Complete the phrase "bread and..."
  • Make a "disgust face" when shown a horrible picture.
  • Detect hostility in a voice.
  • Answer to 2 + 2 = ?
  • Read words on large billboards.
  • Drive a car on an empty road.
  • Find a strong move in chess (if you are a chess master).
  • Understand simple sentences.
  • Recognize that a "meek and tidy soul with a passion for detail" resembles an occupational stereotype.

All these mental events belong with the angry woman--they occur automatically and require little or no effort. The capabilities of System 1 include innate skills that we share with other animals. We are born prepared to perceive the world around us, recognize objects, orient attention, avoid losses, and fear spiders. Other mental activities become fast and automatic through prolonged practice. System 1 has learned associations between ideas (the capital of France?); it has also learned skills such as reading and understanding nuances of social situations. Some skills, such as finding strong chess moves, are acquired only by specialized experts. Others are widely shared. Detecting the similarity of a personality sketch to an occupational stereotype requires broad knowledge of the language and the culture, which most of us possess. The knowledge is stored in memory and accessed without intention and without effort. Several of the mental actions in the list are completely involuntary. 
You cannot refrain from understanding simple sentences in your own language or from orienting to a loud unexpected sound, nor can you prevent yourself from knowing that 2 + 2 = 4 or from thinking of Paris when the capital of France is mentioned. Other activities, such as chewing, are susceptible to voluntary control but normally run on automatic pilot. The control of attention is shared by the two systems. Orienting to a loud sound is normally an involuntary operation of System 1, which immediately mobilizes the voluntary attention of System 2. You may be able to resist turning toward the source of a loud and offensive comment at a crowded party, but even if your head does not move, your attention is initially directed to it, at least for a while. However, attention can be moved away from an unwanted focus, primarily by focusing intently on another target.

The highly diverse operations of System 2 have one feature in common: they require attention and are disrupted when attention is drawn away. Here are some examples:
  • Brace for the starter gun in a race.
  • Focus attention on the clowns in the circus.
  • Focus on the voice of a particular person in a crowded and noisy room.
  • Look for a woman with white hair.
  • Search memory to identify a surprising sound.
  • Maintain a faster walking speed than is natural for you.
  • Monitor the appropriateness of your behavior in a social situation.
  • Count the occurrences of the letter a in a page of text.
  • Tell someone your phone number.
  • Park in a narrow space (for most people except garage attendants).
  • Compare two washing machines for overall value.
  • Fill out a tax form.
  • Check the validity of a complex logical argument.

In all these situations you must pay attention, and you will perform less well, or not at all, if you are not ready or if your attention is directed inappropriately. System 2 has some ability to change the way System 1 works, by programming the normally automatic functions of attention and memory. 
When waiting for a relative at a busy train station, for example, you can set yourself at will to look for a white-haired woman or a bearded man, and thereby increase the likelihood of detecting your relative from a distance. You can set your memory to search for capital cities that start with N or for French existentialist novels. And when you rent a car at London's Heathrow Airport, the attendant will probably remind you that "we drive on the left side of the road over here." In all these cases, you are asked to do something that does not come naturally, and you will find that the consistent maintenance of a set requires continuous exertion of at least some effort. The often-used phrase "pay attention" is apt: you dispose of a limited budget of attention that you can allocate to activities, and if you try to go beyond your budget, you will fail. It is the mark of effortful activities that they interfere with each other, which is why it is difficult or impossible to conduct several at once. You could not compute the product of 17 × 24 while making a left turn into dense traffic, and you certainly should not try. You can do several things at once, but only if they are easy and undemanding. You are probably safe carrying on a conversation with a passenger while driving on an empty highway, and many parents have discovered, perhaps with some guilt, that they can read a story to a child while thinking of something else. Everyone has some awareness of the limited capacity of attention, and our social behavior makes allowances for these limitations. When the driver of a car is overtaking a truck on a narrow road, for example, adult passengers quite sensibly stop talking. They know that distracting the driver is not a good idea, and they also suspect that he is temporarily deaf and will not hear what they say. Intense focusing on a task can make people effectively blind, even to stimuli that normally attract attention.
The most dramatic demonstration was offered by Christopher Chabris and Daniel Simons in their book The Invisible Gorilla. They constructed a short film of two teams passing basketballs, one team wearing white shirts, the other wearing black. The viewers of the film are instructed to count the number of passes made by the white team, ignoring the black players. This task is difficult and completely absorbing. Halfway through the video, a woman wearing a gorilla suit appears, crosses the court, thumps her chest, and moves on. The gorilla is in view for 9 seconds. Many thousands of people have seen the video, and about half of them do not notice anything unusual. It is the counting task--and especially the instruction to ignore one of the teams--that causes the blindness. No one who watches the video without that task would miss the gorilla. Seeing and orienting are automatic functions of System 1, but they depend on the allocation of some attention to the relevant stimulus. The authors note that the most remarkable observation of their study is that people find its results very surprising. Indeed, the viewers who fail to see the gorilla are initially sure that it was not there--they cannot imagine missing such a striking event. The gorilla study illustrates two important facts about our minds: we can be blind to the obvious, and we are also blind to our blindness.

Plot Synopsis

The interaction of the two systems is a recurrent theme of the book, and a brief synopsis of the plot is in order. In the story I will tell, Systems 1 and 2 are both active whenever we are awake. System 1 runs automatically and System 2 is normally in a comfortable low-effort mode, in which only a fraction of its capacity is engaged. System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions, and feelings. If endorsed by System 2, impressions and intuitions turn into beliefs, and impulses turn into voluntary actions.
When all goes smoothly, which is most of the time, System 2 adopts the suggestions of System 1 with little or no modification. You generally believe your impressions and act on your desires, and that is fine--usually. When System 1 runs into difficulty, it calls on System 2 to support more detailed and specific processing that may solve the problem of the moment. System 2 is mobilized when a question arises for which System 1 does not offer an answer, as probably happened to you when you encountered the multiplication problem 17 × 24. You can also feel a surge of conscious attention whenever you are surprised. System 2 is activated when an event is detected that violates the model of the world that System 1 maintains. In that world, lamps do not jump, cats do not bark, and gorillas do not cross basketball courts. The gorilla experiment demonstrates that some attention is needed for the surprising stimulus to be detected. Surprise then activates and orients your attention: you will stare, and you will search your memory for a story that makes sense of the surprising event. System 2 is also credited with the continuous monitoring of your own behavior--the control that keeps you polite when you are angry, and alert when you are driving at night. System 2 is mobilized to increased effort when it detects an error about to be made. Remember a time when you almost blurted out an offensive remark and note how hard you worked to restore control. In summary, most of what you (your System 2) think and do originates in your System 1, but System 2 takes over when things get difficult, and it normally has the last word. The division of labor between System 1 and System 2 is highly efficient: it minimizes effort and optimizes performance. 
The arrangement works well most of the time because System 1 is generally very good at what it does: its models of familiar situations are accurate, its short-term predictions are usually accurate as well, and its initial reactions to challenges are swift and generally appropriate. System 1 has biases, however, systematic errors that it is prone to make in specified circumstances. As we shall see, it sometimes answers easier questions than the one it was asked, and it has little understanding of logic and statistics. One further limitation of System 1 is that it cannot be turned off. If you are shown a word on the screen in a language you know, you will read it--unless your attention is totally focused elsewhere.

Conflict

Figure 2 is a variant of a classic experiment that produces a conflict between the two systems. You should try the exercise before reading on.

Figure 2

You were almost certainly successful in saying the correct words in both tasks, and you surely discovered that some parts of each task were much easier than others. When you identified upper- and lowercase, the left-hand column was easy and the right-hand column caused you to slow down and perhaps to stammer or stumble. When you named the position of words, the left-hand column was difficult and the right-hand column was much easier. These tasks engage System 2, because saying "upper/lower" or "right/left" is not what you routinely do when looking down a column of words. One of the things you did to set yourself for the task was to program your memory so that the relevant words (upper and lower for the first task) were "on the tip of your tongue." The prioritizing of the chosen words is effective and the mild temptation to read other words was fairly easy to resist when you went through the first column. But the second column was different, because it contained words for which you were set, and you could not ignore them.
You were mostly able to respond correctly, but overcoming the competing response was a strain, and it slowed you down. You experienced a conflict between a task that you intended to carry out and an automatic response that interfered with it. Conflict between an automatic reaction and an intention to control it is common in our lives. We are all familiar with the experience of trying not to stare at the oddly dressed couple at the neighboring table in a restaurant. We also know what it is like to force our attention on a boring book, when we constantly find ourselves returning to the point at which the reading lost its meaning. Where winters are hard, many drivers have memories of their car skidding out of control on the ice and of the struggle to follow well-rehearsed instructions that negate what they would naturally do: "Steer into the skid, and whatever you do, do not touch the brakes!" And every human being has had the experience of not telling someone to go to hell. One of the tasks of System 2 is to overcome the impulses of System 1. In other words, System 2 is in charge of self-control.

Illusions

To appreciate the autonomy of System 1, as well as the distinction between impressions and beliefs, take a good look at figure 3. This picture is unremarkable: two horizontal lines of different lengths, with fins appended, pointing in different directions. The bottom line is obviously longer than the one above it. That is what we all see, and we naturally believe what we see. If you have already encountered this image, however, you recognize it as the famous Müller-Lyer illusion. As you can easily confirm by measuring them with a ruler, the horizontal lines are in fact identical in length.

Figure 3

Now that you have measured the lines, you--your System 2, the conscious being you call "I"--have a new belief: you know that the lines are equally long. If asked about their length, you will say what you know. But you still see the bottom line as longer.
You have chosen to believe the measurement, but you cannot prevent System 1 from doing its thing; you cannot decide to see the lines as equal, although you know they are. To resist the illusion, there is only one thing you can do: you must learn to mistrust your impressions of the length of lines when fins are attached to them. To implement that rule, you must be able to recognize the illusory pattern and recall what you know about it. If you can do this, you will never again be fooled by the Müller-Lyer illusion. But you will still see one line as longer than the other. Not all illusions are visual. There are illusions of thought, which we call cognitive illusions. As a graduate student, I attended some courses on the art and science of psychotherapy. During one of these lectures, our teacher imparted a morsel of clinical wisdom. This is what he told us: "You will from time to time meet a patient who shares a disturbing tale of multiple mistakes in his previous treatment. He has been seen by several clinicians, and all failed him. The patient can lucidly describe how his therapists misunderstood him, but he has quickly perceived that you are different. You share the same feeling, are convinced that you understand him, and will be able to help." At this point my teacher raised his voice as he said, "Do not even think of taking on this patient! Throw him out of the office! He is most likely a psychopath and you will not be able to help him." Many years later I learned that the teacher had warned us against psychopathic charm, and the leading authority in the study of psychopathy confirmed that the teacher's advice was sound. The analogy to the Müller-Lyer illusion is close. What we were being taught was not how to feel about that patient. Our teacher took it for granted that the sympathy we would feel for the patient would not be under our control; it would arise from System 1. 
Furthermore, we were not being taught to be generally suspicious of our feelings about patients. We were told that a strong attraction to a patient with a repeated history of failed treatment is a danger sign--like the fins on the parallel lines. It is an illusion--a cognitive illusion--and I (System 2) was taught how to recognize it and advised not to believe it or act on it. The question that is most often asked about cognitive illusions is whether they can be overcome. The message of these examples is not encouraging. Because System 1 operates automatically and cannot be turned off at will, errors of intuitive thought are often difficult to prevent. Biases cannot always be avoided, because System 2 may have no clue to the error. Even when cues to likely errors are available, errors can be prevented only by the enhanced monitoring and effortful activity of System 2. As a way to live your life, however, continuous vigilance is not necessarily good, and it is certainly impractical. Constantly questioning our own thinking would be impossibly tedious, and System 2 is much too slow and inefficient to serve as a substitute for System 1 in making routine decisions. The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high. The premise of this book is that it is easier to recognize other people's mistakes than our own.

Useful Fictions

You have been invited to think of the two systems as agents within the mind, with their individual personalities, abilities, and limitations. I will often use sentences in which the systems are the subjects, such as, "System 2 calculates products." The use of such language is considered a sin in the professional circles in which I travel, because it seems to explain the thoughts and actions of a person by the thoughts and actions of little people inside the person's head.
Grammatically the sentence about System 2 is similar to "The butler steals the petty cash." My colleagues would point out that the butler's action actually explains the disappearance of the cash, and they rightly question whether the sentence about System 2 explains how products are calculated. My answer is that the brief active sentence that attributes calculation to System 2 is intended as a description, not an explanation. It is meaningful only because of what you already know about System 2. It is shorthand for the following: "Mental arithmetic is a voluntary activity that requires effort, should not be performed while making a left turn, and is associated with dilated pupils and an accelerated heart rate." Similarly, the statement that "highway driving under routine conditions is left to System 1" means that steering the car around a bend is automatic and almost effortless. It also implies that an experienced driver can drive on an empty highway while conducting a conversation. Finally, "System 2 prevented James from reacting foolishly to the insult" means that James would have been more aggressive in his response if his capacity for effortful control had been disrupted (for example, if he had been drunk). System 1 and System 2 are so central to the story I tell in this book that I must make it absolutely clear that they are fictitious characters. Systems 1 and 2 are not systems in the standard sense of entities with interacting aspects or parts. And there is no one part of the brain that either of the systems would call home. You may well ask: What is the point of introducing fictitious characters with ugly names into a serious book? The answer is that the characters are useful because of some quirks of our minds, yours and mine. A sentence is understood more easily if it describes what an agent (System 2) does than if it describes what something is, what properties it has. 
In other words, "System 2" is a better subject for a sentence than "mental arithmetic." The mind--especially System 1--appears to have a special aptitude for the construction and interpretation of stories about active agents, who have personalities, habits, and abilities. You quickly formed a bad opinion of the thieving butler, you expect more bad behavior from him, and you will remember him for a while. This is also my hope for the language of systems. Why call them System 1 and System 2 rather than the more descriptive "automatic system" and "effortful system"? The reason is simple: "Automatic system" takes longer to say than "System 1" and therefore takes more space in your working memory. This matters, because anything that occupies your working memory reduces your ability to think. You should treat "System 1" and "System 2" as nicknames, like Bob and Joe, identifying characters that you will get to know over the course of this book. The fictitious systems make it easier for me to think about judgment and choice, and will make it easier for you to understand what I say.

Speaking of System 1 and System 2

"He had an impression, but some of his impressions are illusions."

"This was a pure System 1 response. She reacted to the threat before she recognized it."

"This is your System 1 talking. Slow down and let your System 2 take control."

Excerpted from Thinking, Fast and Slow by Daniel Kahneman. Copyright (c) 2011 by Daniel Kahneman.