Review by Choice Review
This is an important book from an important scholar. Winner of the 2002 Nobel Prize in economics--for work on decision making done with his late colleague, Amos Tversky--Kahneman (emer., psychology, Princeton; public affairs, Princeton's Woodrow Wilson School of Public and International Affairs) practically invented the discipline of behavioral economics and more generally transformed the entire approach to the psychology of decision making. This book makes, among many other things, two major contributions. First, Kahneman provides a substantial review and synthesis of the body of research he did with Tversky. Second, he explicates, and organizes the work around, his recent model of ways of thinking: system 1--fast, intuitive, emotional; system 2--slower, more deliberative, logical. Kahneman explores the consequences of this distinction in a variety of domains. Summing Up: Essential. Upper-division undergraduates through faculty and professionals; general readers. R. Levine, California State University--Fresno
Copyright American Library Association, used with permission.
Review by New York Times Review
In 2002, Daniel Kahneman won the Nobel in economic science. What made this unusual - indeed, unique in the history of the prize - is that Kahneman is a psychologist. Specifically, he is one-half of a pair of psychologists who, beginning in the early 1970s, set out to dismantle an entity long dear to economic theorists: that arch-rational decision maker known as Homo economicus. The other half of the dismantling duo, Amos Tversky, died in 1996 at the age of 59. Had Tversky lived, he would certainly have shared the Nobel with Kahneman, his longtime collaborator and dear friend.
Human irrationality is Kahneman's great theme. There are essentially three phases to his career. In the first, he and Tversky did a series of ingenious experiments that revealed twenty or so "cognitive biases" - unconscious errors of reasoning that distort our judgment of the world. Typical of these is the "anchoring effect": our tendency to be influenced by irrelevant numbers that we happen to be exposed to. (In one experiment, for instance, experienced German judges were inclined to give a shoplifter a longer sentence if they had just rolled a pair of dice loaded to give a high number.) In the second phase, Kahneman and Tversky showed that people making decisions under uncertain conditions do not behave in the way that economic models have traditionally assumed; they do not "maximize utility." The two then developed an alternative account of decision making, one more faithful to human psychology, which they called "prospect theory." (It was for this achievement that Kahneman was awarded the Nobel.) In the third phase of his career, mainly after the death of Tversky, Kahneman has delved into "hedonic psychology": the science of happiness, its nature and its causes. His findings in this area have proved disquieting - and not just because one of the key experiments involved a deliberately prolonged colonoscopy.
"Thinking, Fast and Slow" spans all three of these phases. It is an astonishingly rich book: lucid, profound, full of intellectual surprises and self-help value. It is consistently entertaining and frequently touching, especially when Kahneman is recounting his collaboration with Tversky. ("The pleasure we found in working together made us exceptionally patient; it is much easier to strive for perfection when you are never bored.") So impressive is its vision of flawed human reason that the New York Times columnist David Brooks recently declared that Kahneman and Tversky's work "will be remembered hundreds of years from now," and that it is "a crucial pivot point in the way we see ourselves." They are, Brooks said, "like the Lewis and Clark of the mind."
Now, this worries me a bit. A leitmotif of this book is overconfidence. All of us, and especially experts, are prone to an exaggerated sense of how well we understand the world - so Kahneman reminds us. Surely, he himself is alert to the perils of overconfidence. Despite all the cognitive biases, fallacies and illusions that he and Tversky (along with other researchers) purport to have discovered in the last few decades, he fights shy of the bold claim that humans are fundamentally irrational. Or does he? "Most of us are healthy most of the time, and most of our judgments and actions are appropriate most of the time," Kahneman writes in his introduction. Yet, just a few pages later, he observes that the work he did with Tversky "challenged" the idea, orthodox among social scientists in the 1970s, that "people are generally rational."
The two psychologists discovered "systematic errors in the thinking of normal people": errors arising not from the corrupting effects of emotion, but built into our evolved cognitive machinery. Although Kahneman draws only modest policy implications (e.g., contracts should be stated in clearer language), others - perhaps overconfidently? - go much further. Brooks, for example, has argued that Kahneman and Tversky's work illustrates "the limits of social policy"; in particular, the folly of government action to fight joblessness and turn the economy around. Such sweeping conclusions, even if they are not endorsed by the author, make me frown.
And frowning - as one learns on Page 152 of this book - activates the skeptic within us: what Kahneman calls "System 2." Just putting on a frown, experiments show, works to reduce overconfidence; it causes us to be more analytical, more vigilant in our thinking; to question stories that we would otherwise unreflectively accept as true because they are facile and coherent. And that is why I frowningly gave this extraordinarily interesting book the most skeptical reading I could.
System 2, in Kahneman's scheme, is our slow, deliberate, analytical and consciously effortful mode of reasoning about the world. System 1, by contrast, is our fast, automatic, intuitive and largely unconscious mode. It is System 1 that detects hostility in a voice and effortlessly completes the phrase "bread and ..." It is System 2 that swings into action when we have to fill out a tax form or park a car in a narrow space. (As Kahneman and others have found, there is an easy way to tell how engaged a person's System 2 is during a task: just look into his or her eyes and note how dilated the pupils are.) More generally, System 1 uses association and metaphor to produce a quick and dirty draft of reality, which System 2 draws on to arrive at explicit beliefs and reasoned choices. System 1 proposes, System 2 disposes.
So System 2 would seem to be the boss, right? In principle, yes. But System 2, in addition to being more deliberate and rational, is also lazy. And it tires easily. (The vogue term for this is "ego depletion.") Too often, instead of slowing things down and analyzing them, System 2 is content to accept the easy but unreliable story about the world that System 1 feeds to it. "Although System 2 believes itself to be where the action is," Kahneman writes, "the automatic System 1 is the hero of this book." System 2 is especially quiescent, it seems, when your mood is a happy one.
At this point, the skeptical reader might wonder how seriously to take all this talk of System 1 and System 2. Are they actually a pair of little agents in our head, each with its distinctive personality? Not really, says Kahneman. Rather, they are "useful fictions" - useful because they help explain the quirks of the human mind.
To see how, consider what Kahneman calls the "best-known and most controversial" of the experiments he and Tversky did together: "the Linda problem." Participants in the experiment were told about an imaginary young woman named Linda, who is single, outspoken and very bright, and who, as a student, was deeply concerned with issues of discrimination and social justice. The participants were then asked which was more probable: (1) Linda is a bank teller. Or (2) Linda is a bank teller and is active in the feminist movement.
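The probability rule at stake here can be written out compactly. In the inequality below the event labels are mine, added only for illustration ("teller" for the event that Linda is a bank teller, "feminist" for the event that she is active in the feminist movement); this is the standard conjunction rule:

\[
P(\text{teller} \wedge \text{feminist}) \;=\; P(\text{teller}) \times P(\text{feminist} \mid \text{teller}) \;\le\; P(\text{teller}),
\]

since no conditional probability can exceed 1. Whatever one believes about Linda, option (2) can never be strictly more probable than option (1).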
The overwhelming response was that (2) was more probable; in other words, that given the background information furnished, "feminist bank teller" was more likely than "bank teller." This is, of course, a blatant violation of the laws of probability. (Every feminist bank teller is a bank teller; adding a detail can only lower the probability.) Yet even among students in Stanford's Graduate School of Business, who had extensive training in probability, 85 percent flunked the Linda problem. One student, informed that she had committed an elementary logical blunder, responded, "I thought you just asked for my opinion."
What has gone wrong here? An easy question (how coherent is the narrative?) is substituted for a more difficult one (how probable is it?). And this, according to Kahneman, is the source of many of the biases that infect our thinking. System 1 jumps to an intuitive conclusion based on a "heuristic" - an easy but imperfect way of answering hard questions - and System 2 lazily endorses this heuristic answer without bothering to scrutinize whether it is logical. Kahneman describes dozens of such experimentally demonstrated breakdowns in rationality - "base-rate neglect," "availability cascade," "the illusion of validity" and so on. The cumulative effect is to make the reader despair for human reason.
Are we really so hopeless? Think again of the Linda problem. Even the great evolutionary biologist Stephen Jay Gould was troubled by it. As an expert in probability he knew the right answer, yet he wrote that "a little homunculus in my head continues to jump up and down, shouting at me - 'But she can't just be a bank teller; read the description.'" It was Gould's System 1, Kahneman assures us, that kept shouting the wrong answer at him. But perhaps something more subtle is going on. Our everyday conversation takes place against a rich background of unstated expectations - what linguists call "implicatures." Such implicatures can seep into psychological experiments. Given the expectations that facilitate our conversation, it may have been quite reasonable for the participants in the experiment to take "Linda is a bank teller" to imply that she was not in addition a feminist. If so, their answers weren't really fallacious.
This might seem a minor point. But it applies to several of the biases that Kahneman and Tversky, along with other investigators, purport to have discovered in formal experiments. In more natural settings - when we are detecting cheaters rather than solving logic puzzles; when we are reasoning about things rather than symbols; when we are assessing raw numbers rather than percentages - people are far less likely to make the same errors. So, at least, much subsequent research suggests. Maybe we are not so irrational after all.
Some cognitive biases, of course, are flagrantly exhibited even in the most natural of settings. Take what Kahneman calls the "planning fallacy": our tendency to overestimate benefits and underestimate costs, and hence foolishly to take on risky projects. In 2002, Americans remodeling their kitchens, for example, expected the job to cost $18,658 on average, but they ended up paying $38,769. The planning fallacy is "only one of the manifestations of a pervasive optimistic bias," Kahneman writes, which "may well be the most significant of the cognitive biases." Now, in one sense, a bias toward optimism is obviously bad, since it generates false beliefs - like the belief that we are in control, and not the playthings of luck.
But without this "illusion of control," would we even be able to get out of bed in the morning? Optimists are more psychologically resilient, have stronger immune systems, and live longer on average than their more reality-based counterparts. Moreover, as Kahneman notes, exaggerated optimism serves to protect both individuals and organizations from the paralyzing effects of another bias, "loss aversion": our tendency to fear losses more than we value gains. It was exaggerated optimism that John Maynard Keynes had in mind when he talked of the "animal spirits" that drive capitalism.
Even if we could rid ourselves of the biases and illusions identified in this book - and Kahneman, citing his own lack of progress in overcoming them, doubts that we can - it is by no means clear that this would make our lives go better. And that raises a fundamental question: What is the point of rationality? We are, after all, Darwinian survivors. Our everyday reasoning abilities have evolved to cope efficiently with a complex and dynamic environment. They are thus likely to be adaptive in this environment, even if they can be tripped up in the psychologist's somewhat artificial experiments. Where do the norms of rationality come from, if they are not an idealization of the way humans actually reason in their ordinary lives? As a species, we can no more be pervasively biased in our judgments than we can be pervasively ungrammatical in our use of language - or so critics of research like Kahneman and Tversky's contend.
Kahneman never grapples philosophically with the nature of rationality. He does, however, supply a fascinating account of what might be taken to be its goal: happiness. What does it mean to be happy? When Kahneman first took up this question, in the mid-1990s, most happiness research relied on asking people how satisfied they were with their life on the whole. But such retrospective assessments depend on memory, which is notoriously unreliable. What if, instead, a person's actual experience of pleasure or pain could be sampled from moment to moment, and then summed up over time? Kahneman calls this "experienced" well-being, as opposed to the "remembered" well-being that researchers had relied upon. And he found that these two measures of happiness diverge in surprising ways. What makes the "experiencing self" happy is not the same as what makes the "remembering self" happy. In particular, the remembering self does not care about duration - how long a pleasant or unpleasant experience lasts. Rather, it retrospectively rates an experience by the peak level of pain or pleasure in the course of the experience, and by the way the experience ends.
These two quirks of remembered happiness - "duration neglect" and the "peak-end rule" - were strikingly illustrated in one of Kahneman's more harrowing experiments. Two groups of patients were to undergo painful colonoscopies. The patients in Group A got the normal procedure. So did the patients in Group B, except - without their being told - a few extra minutes of mild discomfort were added after the end of the examination. Which group suffered more? Well, Group B endured all the pain that Group A did, and then some. But since the prolonging of Group B's colonoscopies meant that the procedure ended less painfully, the patients in this group retrospectively minded it less.
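A rough sketch can make the arithmetic of that result concrete. The short Python snippet below compares a duration-weighted "experienced" total with a peak-end "remembered" rating for the two groups; the per-minute pain scores are invented for illustration and are not data from Kahneman's experiment.

    # Minimal sketch: duration-weighted total versus peak-end rating.
    # Pain scores (0 = none, 10 = worst) are hypothetical.
    group_a = [4, 7, 8, 8, 7]            # normal procedure, ends at a painful moment
    group_b = [4, 7, 8, 8, 7, 3, 2, 1]   # same procedure plus a milder added tail

    def experienced_total(pain):
        # Sum of moment-to-moment pain: more minutes of pain, larger total.
        return sum(pain)

    def remembered_rating(pain):
        # Peak-end rule: memory averages the worst moment and the final moment,
        # ignoring how long the episode lasted (duration neglect).
        return (max(pain) + pain[-1]) / 2

    for name, pain in [("Group A", group_a), ("Group B", group_b)]:
        print(name, experienced_total(pain), remembered_rating(pain))
    # Group A: experienced total 34, remembered rating 7.5
    # Group B: experienced total 40, remembered rating 4.5

On these made-up numbers, Group B suffers more in total yet remembers the episode as less bad - exactly the pattern the experiment found.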
(In an earlier research paper, though not in this book, Kahneman suggested that the extra discomfort Group B was subjected to in the experiment might be ethically justified if it increased their willingness to come back for a follow-up!) As with colonoscopies, so too with life. It is the remembering self that calls the shots, not the experiencing self. Kahneman cites research showing, for example, that a college student's decision whether or not to repeat a spring-break vacation is determined by the peak-end rule applied to the previous vacation, not by how fun (or miserable) it actually was moment by moment. The remembering self exercises a sort of "tyranny" over the voiceless experiencing self. "Odd as it may seem," Kahneman writes, "I am my remembering self, and the experiencing self, who does my living, is like a stranger to me."
Kahneman's conclusion, radical as it sounds, may not go far enough. There may be no experiencing self at all. Brain-scanning experiments by Rafael Malach and his colleagues at the Weizmann Institute in Israel, for instance, have shown that when subjects are absorbed in an experience, like watching "The Good, the Bad, and the Ugly," the parts of the brain associated with self-consciousness are not merely quiet, they're actually shut down ("inhibited") by the rest of the brain. The self seems simply to disappear. Then who exactly is enjoying the film? And why should such egoless pleasures enter into the decision calculus of the remembering self?
Clearly, much remains to be done in hedonic psychology. But Kahneman's conceptual innovations have laid the foundation for many of the empirical findings he reports in this book: that while French mothers spend less time with their children than American mothers, they enjoy it more; that headaches are hedonically harder on the poor; that women who live alone seem to enjoy the same level of well-being as women who live with a mate; and that a household income of about $75,000 in high-cost areas of the country is sufficient to maximize happiness. Policy makers interested in lowering the misery index of society will find much to ponder here.
By the time I got to the end of "Thinking, Fast and Slow," my skeptical frown had long since given way to a grin of intellectual satisfaction. Appraising the book by the peak-end rule, I overconfidently urge everyone to buy and read it. But for those who are merely interested in Kahneman's takeaway on the Malcolm Gladwell question, it is this: If you've had 10,000 hours of training in a predictable, rapid-feedback environment - chess, firefighting, anesthesiology - then blink. In all other cases, think.
Jim Holt's new book, "Why Does the World Exist?," will be published next spring.
Copyright (c) The New York Times Company [November 27, 2011]
Review by Booklist Review
Decision making tends to be intuitive rather than logical. Kahneman has dedicated his academic research to understanding why that is so. This work distills his and colleagues' findings about how we make up our minds and how much we can trust intuition. Clinical experiments on psychology's traditional guinea pigs--college students--abound and collectively batter confidence in System 1, as Kahneman calls intuition. All sorts of biases, sporting tags like the halo effect (i.e., unwarranted attribution of positive qualities to a thing or person one likes), bedevil accurate appraisal of reality. According to Kahneman, intuitive feelings often override System 2, or thinking that requires effort, such as simple arithmetic. Exemplifying his points in arenas as diverse as selecting military officers, speculating in stocks, hiring employees, and starting up businesses, Kahneman accords some reliability to intuitive choice, as long as the decision maker is aware of cognitive illusions (the study of which brought Kahneman the 2002 Nobel Prize in Economics). Kahneman's insights will most benefit those in leadership positions, yet they will also help the average reader to become a better car buyer.--Taylor, Gilbert Copyright 2010 Booklist
From Booklist, Copyright (c) American Library Association. Used with permission.
Review by Library Journal Review
Kahneman (psychology, emeritus, Princeton) won the 2002 Nobel Prize in Economics for his work with Amos Tversky on decision making. In this large, readable book, Kahneman presents provocative theories and groundbreaking research and, moreover, clearly explains both. He postulates two systems of thinking that operate simultaneously but are often at odds: intuitive and deliberative, or fast and slow, respectively. Fast judgments dominate to a greater extent than we know, and to our disadvantage. A key discovery that overcame an effect Kahneman terms "theory-induced blindness" (which refers mainly to fast-thinking mistakes but can occur in slow thinking when our assumptions are wrong or simply interfere with seeing) was that outcomes are better defined by gains and losses than by sums of wealth. "Prospect theory," an idea Kahneman developed with Tversky, posits that, when all our options are bad, we tend to take riskier paths. With Kahneman's expert help, readers may understand this mix of psychology and economics better than most accountants, therapists, or elected representatives. VERDICT A stellar accomplishment, a book for everyone who likes to think and wants to do it better. [See Prepub Alert, 5/9/11.]--E. James Lieberman, George Washington Univ. Sch. of Medicine, Washington, DC (c) Copyright 2011. Library Journals LLC, a wholly owned subsidiary of Media Source, Inc. No redistribution permitted.
Review by Kirkus Book Review
System 2, in Kahneman's account, is the effortful mode we call on when we want to think rigorously about something. The author then explores the nuances of our two-system minds, showing how they perform in various situations. Psychological experiments have repeatedly revealed that our intuitions are generally wrong, that our assessments are based on biases and that our System 1 hates doubt and despises ambiguity. Kahneman largely avoids jargon; when he does use some ("heuristics," for example), he argues that such terms really ought to join our everyday vocabulary. He reviews many fundamental concepts in psychology and statistics (regression to the mean, the narrative fallacy, the optimistic bias), showing how they relate to his overall concerns about how we think and why we make the decisions that we do. Some of the later chapters (dealing with risk-taking and statistics and probabilities) are denser than others (some readers may resent such demands on System 2!), but the passages that deal with the economic and political implications of the research are gripping. Striking research showing the immense complexity of ordinary thought and revealing the identities of the gatekeepers in our minds.
Copyright (c) Kirkus Reviews, used with permission.