Blindspot: Hidden Biases of Good People

Mahzarin R. Banaji

Book - 2013

In this accessible and groundbreaking look at the science of prejudice, Banaji and Greenwald show that prejudice and unconscious biases toward others are a fundamental part of the human psyche.

Location: 2nd Floor
Call Number: 154.2/Banaji
Status: Checked In (1 / 1 copies available)
Published
New York: Delacorte Press, c2013.
Language
English
Main Author
Mahzarin R. Banaji
Other Authors
Anthony G. Greenwald
Edition
1st ed.
Physical Description
xv, 254 p. : ill. ; 25 cm
Bibliography
Includes bibliographical references and index.
ISBN
9780553804645
  • Preface
  • 1. Mindbugs
  • 2. Shades of Truth
  • 3. Into the Blindspot
  • 4. "Not That There's Anything Wrong with That!"
  • 5. Homo Categoricus
  • 6. The Hidden Costs of Stereotypes
  • 7. Us and Them
  • 8. Outsmarting the Machine
  • Appendix 1. Are Americans Racist?
  • Appendix 2. Race, Disadvantage, and Discrimination
  • Acknowledgments
  • Notes
  • References
  • Index
Review by Choice Review

Stereotypes are, alas, alive and well. Social psychological research on attitudes reveals that hidden biases, implicit "bits of knowledge" based on social categories (e.g., race, gender, religion, ethnicity, social class, sexual orientation, disability), can subtly influence how one acts toward others. The larger problem--perhaps tragedy is a better word--is that people can be unaware of, even oblivious to, the behavioral impacts of these biases in daily life. As Banaji (Harvard) and Greenwald (Univ. of Washington), developers of the well-known Implicit Association Test (IAT), explain in this thoughtful, provocative book, even the most open-minded people (the "good people") can harbor prejudicial "mindbugs" that lead them to behave in ways misaligned with their personal beliefs. Thus, the tolerant person is possibly a secret chauvinist--or worse--because his or her perceptions of social groups are outside the realm of awareness or conscious control. Using the book's well-placed paper-and-pencil IATs, readers can learn about their own social blind spots concerning race and gender. Two thoughtful appendixes--one exploring whether Americans are (still) racist, the other examining race, disadvantage, and discrimination--support the main narrative. The authors pull no punches here. This is a powerful book. Summing Up: Essential. All readers. D. S. Dunn Moravian College

Copyright American Library Association, used with permission.
Review by Publisher's Weekly Review

Citing the influence of "mindbugs"--ingrained judgments and biases that unconsciously influence behavior--social psychologists Banaji and Greenwald, professors at Harvard and the University of Washington, respectively, provide an accessible and persuasive account of the causes of stereotyping and discrimination. Using numerous tests and data sets, the authors demonstrate that while most Americans are not overtly racist, a majority show implicit preferences for whites versus African-Americans, which can lead to discriminatory treatment of the latter and economic and social disparities. Similar associations can be seen with regard to gender biases and ageism, to the extent that even members of these groups have internalized stereotypes. Perhaps the most interesting aspect of these results is the degree to which these mindbugs then become self-fulfilling prophecies, to the point where "people... are willing to sacrifice their self-interest for the sake of maintaining the existing social order." What arises as critical is that these behaviors often occur in ways that are subtle and unintentional, having more to do with a favoritism of one's own in-group, rather than actual animosity toward others. Banaji and Greenwald will keep even nonpsychology students engaged with plenty of self-examinations and compelling elucidations of case studies and experiments. Agent: Katinka Matson and John Brockman, Brockman Inc. (Feb. 5) (c) Copyright PWxyz, LLC. All rights reserved.

Review by Kirkus Book Review

An examination of how beliefs are shaped by hidden bias. Banaji (Psychology/Harvard Univ.) and Greenwald (Psychology/Univ. of Washington) argue that the 4 percent divergence between Barack Obama's actual share of the white American vote in 2008 and pre-election polls is an indication of the racial factors involved. In their opinion, had Obama "been obliged to rely only on the white American electorate, he would have lost in a landslide." The authors have collaborated since 1980 and have developed survey methods designed to reveal what they call "unconscious" or implicit cognition. The Implicit Association Test (developed by Greenwald in 1994) is one of these methods, which they and others have used to help understand the role that unconscious bias or prejudice plays in shaping attitudes. (On The Oprah Winfrey Show, Malcolm Gladwell described how he took one of the tests and was shocked at the results: "I was biased--slightly biased--against Black people, toward White people, which horrified me because my mom's Jamaican.") Subjects taking the test are required to make rapid judgments that reveal unconscious associations with race, gender, and age. The authors discuss how, paradoxically, these associative mechanisms also confer cognitive benefits: "Stereotyping achieves the desirable effect of allowing us to rapidly perceive total strangers as distinctive individuals." Their tests have produced a "large body of data" on the relationship between automatic associations and the reflective mind. A stimulating treatment that should help readers deal with irrational biases that they would otherwise consciously reject. Copyright Kirkus Reviews, used with permission.


1. Mindbugs

It is an ordinary day on a college campus. Students and professors of experimental psychology have filed into a lecture hall to listen to a distinguished visiting scientist explain how our minds perceive the physical world. Nothing about his tweed jacket and unkempt hair suggests the challenge he is about to deliver.

A few minutes into the lecture, he says matter-of-factly, "As you can see, the two tabletops are exactly the same in shape and size." Shuffling in their seats, some in the audience frown while others smile in embarrassment because, as anyone can plainly see, he is dead wrong. Some tilt their heads from side to side, to test if a literal shift in perspective will help. Others wonder whether they should bother staying for the lecture if this nonsense is just the start.

The nonbelievers are caught short, though, when the speaker proceeds to show the truth of his audacious claim. Using an overhead projector, he takes a transparent plastic sheet containing only a single red parallelogram, lays it over the tabletop on the left, and shows that it fits perfectly. He then rotates the plastic sheet clockwise and places the parallelogram over the tabletop on the right; it fits perfectly there as well. An audible gasp fills the hall as the speaker moves the red frame back and forth, and the room breaks into laughter. With nothing more than a faint smile, the speaker goes on to complete his lecture on how the eye receives, the brain registers, and the mind interprets visual information.

Unconvinced? You can try the test yourself. Find some paper thin enough to trace the outline of one of the tabletops, and then move the outline over to the other tabletop. If you don't find that the shape of the first tabletop fits identically onto the second, there can be only one explanation--you've botched the tracing job, because the table surfaces are precisely the same. But how can this be?
Visual Mindbugs

You, like us, have just succumbed to a famous visual illusion, one that produces an error in the mind's ability to perceive a pair of objects as they actually are. We will call such errors mindbugs--ingrained habits of thought that lead to errors in how we perceive, remember, reason, and make decisions.1 The psychologist Roger Shepard, a genius who has delighted in the art of confounding, created this illusion, called Turning the Tables.

When we look at the images of the two table surfaces, our retinas do, in fact, receive them as identical in shape and size. In other words, the retina "sees" the tabletops quite accurately. However, when the eye transmits that information to the brain's visual cortex, where depth is perceived, the trouble begins. The incorrect perception that the two tabletops are strikingly different in shape occurs effortlessly, because the brain automatically converts the 2-D image that exists both on the page and on the retina into a 3-D interpretation of the tabletop shapes as they must be in the natural world. The automatic processes of the mind, in other words, impose the third dimension of depth onto this scene. And the conscious, reflective processes of the mind accept the illusion unquestioningly--so much so that when encountering the speaker's assertion that the tabletop outlines are the same, the conscious mind's first reaction is to consider it sheer nonsense.

Natural selection has equipped the minds of humans and other large animals to operate successfully in a three-dimensional world.
Having no experience in a world other than a 3-D one, the brain we have continues to perform its automatic perceptual corrections of the tables' dimensions to make them appear as they would in the familiar 3-D world.2 Contrary to expectation, this error reflects not a weakness of adaptation but rather a triumph, for Shepard's tabletops highlight the success of a visual system that has adapted effectively to the combination of a two-dimensional retina inside the eye and a three-dimensional world outside. The mind's automatic understanding of the data is so confident that, as Shepard puts it, "any knowledge or understanding of the illusion we may gain at the intellectual level remains virtually powerless to diminish the magnitude of the illusion." Take a look at the tables again. The knowledge you now have (that the tables have identical surfaces) has no corrective effect in diminishing the illusion!3

Disconcerting as this experience is, it serves as a vivid illustration of a signal property of the mind--it does a great deal of its work automatically, unconsciously, and unintentionally.

Mention of the mind's unconscious operation may summon up for you a visual memory of the bearded, cigar-smoking Sigmund Freud, who rightly gets credit for having brought the term unconscious into everyday use. However, the understanding of the unconscious workings of the mind has changed greatly in the century since Freud's pathbreaking observations. Freud portrayed an omniscient unconscious with complex motives that shape important aspects of human mind and behavior--from dreams to memories to madness, and ultimately to civilization itself. Today, however, Freud's arguments, detached as they have remained from scientific verification, have a greatly reduced impact on the scientific understanding of unconscious mental life. Instead, the modern conception of the unconscious mind must be credited to another historical figure, one far less known than Freud.
A nineteenth-century German physicist and physiologist, Hermann von Helmholtz, offered the name unbewußter Schluß, or unconscious inference, to describe how an illusion like Shepard's tabletops might work.4 Helmholtz aimed to describe the means by which the mind creates, from physical data, the conscious perceptions that define our ordinary, subjective experiences of "seeing." Our visual system is capable of being tricked by a simple 2-D image because an unconscious mental act replaces the 2-D shape of the retinal image with a consciously perceived 3-D shape of the inferred object it suggests.

Now try this: Read the following sixteen words with sufficiently close attention so that you can expect to be able to recognize them when you see them again a few pages hence:

Ant Spider Feelers Web Fly Poison Slimy Crawl Bee Wing Bug Small Bite Fright Wasp Creepy

In the meantime, here's another striking example of unconscious inference, in the form of a checkerboard and cylinder, to confound us further. When we tell you that the squares marked A and B are exactly the same in their coloring, you will doubtless believe us to be wrong. But take a thick piece of opaque paper, large enough to cover the entire picture, mark the positions of the two squares labeled A and B, and cut a circular hole at each position, just a bit smaller than a checkerboard square. When you look only through the holes, without the rest of the image, you will see that the two squares are indeed identical in color. Again the culprit is an unconscious inference, a mindbug that automatically goes to work on the image.

What causes this remarkable failure of perception? Several features of this checkerboard image are involved, but let us attend to the most obvious ones. First, notice that B is surrounded by several dark squares that make it look lighter than it is, merely by contrast; A, conversely, is surrounded by lighter squares that make it seem darker than it actually is.
Second, notice the shadow being cast by the cylinder. This darkens the squares within the shadow--including the one marked B--but the mind automatically undoes this darkening to correct for the shadow, lightening our conscious experience of B. As with the table illusion, the mechanisms that produce this one also exist to enable us to see and understand the world successfully. Ted Adelson, a vision scientist at MIT and creator of this checkershadow image, writes: "As with many so-called illusions, this effect really demonstrates the success rather than the failure of the visual system. The visual system is not very good at being a physical light meter, but that is not its purpose."5

Such examples force us to ask a more general question: To what extent do our minds possess efficient and accurate methods that fail us so miserably when we put them to use in a slightly revised context?

Memory Mindbugs

Think back to the words you memorized earlier, as you examine the list below. As you review each word, without turning back to the original list, try to recall whether each word you see here also appeared in the list you read earlier. If you have paper and pencil handy, and to avoid any doubt about your answers, copy all the words you recall seeing on the previous list and leave out any word that, by your recollection, did not appear before.

Maple Ant Poison Fly Stem Berry Feelers Slimy Birch Wing Leaves Tree Roots Bite Web Bug Small Oak Crawl Acorn Wasp Branch Insect Bee Willow Fright Spider Pine Creepy

To be correct, you should have left out all twelve tree-related words, starting with maple and ending with pine, for indeed, none of the tree words appeared on the earlier list. You should have also written down all the insect-related words, except one--the word insect itself! That word was not on the original list.
If, as is quite likely, you included the word insect as one you'd seen before, you have demonstrated a powerful but ordinary mindbug that can create false memories. In retrospect, it's easy to see the basis for the false memory for insect. The mind is an automatic association-making machine. When it encounters any information--words, pictures, or even complex ideas--related information automatically comes to mind. In this case, the words in the original list had an insect theme. Unthinkingly, we use that shared theme as we try to remember the past and, in so doing, stumble easily when we come across the word insect itself. Such a memory error is called a false alarm--we mistakenly remember something that actually did not occur.

In a study conducted at Washington University, students remembered seeing words that shared a theme with the original lists--say, insects--but did not appear on them 82 percent of the time. That huge percentage of error is especially remarkable when compared to the 75 percent correct memory for words that were actually on the list! In other words, mindbugs can be powerful enough to produce greater recollection of things that didn't occur than of things that did occur.6

The errors witnessed so far may not seem terribly consequential. What's the harm, after all, in misremembering a word? But imagine being interrogated about a potential suspect in a crime you have witnessed. Could the false-memory mindbug interfere with your accuracy in reporting what you saw? If the suspect bears some resemblance to the criminal--for example, has a similar beard--might a false identification result? If so, with what probability?

Elizabeth Loftus is among psychology's most creative experimentalists. Now at the University of California at Irvine, she has made it her life's work to study memory mindbugs in eyewitnesses by presenting simulated burglaries, car accidents, and other common mishaps and then testing people's memories of them.
She has found not only that errors in these eyewitness memories are disturbingly frequent but also that even slight changes in the way a witness is prompted during questioning can alter the content of what is remembered. In one famous study, Loftus showed witnesses scenes from an automobile accident in which two cars had collided with no personal injury. Later she asked half the witnesses, "How fast was the car going when it hit the other car?" She asked the other half, "How fast was the car going when it smashed into the other car?" Those who were asked the "smashed" question gave higher estimates of the speed of the vehicle than those who were asked the "hit" question, and they were also more likely to mistakenly insert a memory of broken glass at the accident scene, even though there was none in what they saw.7

Psychologists call this mindbug retroactive interference--an influence of after-the-experience information on memory. Loftus gave it a more memorable name: the misinformation effect. Her point is that a small change in language can produce a consequential change in what is remembered, often resulting in mistaken testimony by eyewitnesses.

In recent years it has become clear that the number of wrongful convictions produced by eyewitness errors is substantial.8 Through the efforts of the Innocence Project, an organization dedicated to exonerating the wrongfully convicted through DNA testing, 250 people so far have been exonerated by conclusive tests that confirmed their innocence. Of these, 190 cases had been decided based on a mistaken eyewitness account.
In other words, in nearly 75 percent of these wrongful convictions, the failure of eyewitness memory (assuming no malign intent on the part of the witness) was responsible for tragedies that many societies consider so intolerable that their laws explicitly err on the side of allowing the guilty to walk free.

Availability and Anchoring: Two Famous Mindbugs

Pick the correct answer in each of the three pairs. Each year, do more people in the United States die from cause (a) or cause (b)?

1. (a) murder (b) diabetes
2. (a) murder (b) suicide
3. (a) car accidents (b) abdominal cancer

Most of us give the answer (b) for question 1 and (a) for questions 2 and 3, when in fact the correct answer to each question is (b). In other words, we get the first one right but not the next two. Psychologists Daniel Kahneman and Amos Tversky named and described the generic version of this mindbug, calling it the availability heuristic. When instances of one type of event (such as death by murder rather than suicide) come more easily to mind than those of another type, we tend to assume that the first event must also occur more frequently in the world. Murder is more likely to receive media attention than suicide, not to mention that the stigma of suicide makes it less likely to be information that is shared beyond the family. Car accidents are likewise more likely to be mentioned because of their shocking nature, whereas abdominal cancer, though a common cause of death, is just one of many kinds of cancer and draws far less attention. Because murder and car accidents come to mind more easily, they are wrongly assumed to occur more frequently. The inference from ease of recall to frequency seems reasonable, but greater ease of availability to the mind doesn't mean greater frequency of occurrence in the world.
These kinds of mistakes occur routinely and often carry great decision costs.9

Dan Ariely, a behavioral economist, asked students at MIT to write down the last two digits of their Social Security number on a piece of paper. He then asked them to estimate the price of a keyboard, a trackball, or a design book, items familiar to MIT students. Ariely collected these two numbers from each person and then computed the correlation between them, looking for a possible relation between the last two digits of the Social Security number and the estimated prices. Logically, of course, there is no connection between the two sets of numbers, so the correlation should have been at or close to zero. In fact, Ariely discovered a substantial correlation between them. Those for whom the last two digits of their Social Security number happened to lie between 00 and 19 said they would pay $8.62 on average for the trackball; those with digits between 20 and 39 were willing to pay more, $11.82; those with digits between 40 and 59 offered up even more, $13.45; and the poor souls whose Social Security numbers happened to end in digits from 60 to 79 and from 80 to 99 offered to pay $21.18 and $26.18, respectively--all for the very same object!10

This, the second of the two famous mindbugs, was also identified by Kahneman and Tversky, who called it anchoring, to capture the idea that the mind doesn't search for information in a vacuum.11 Rather, it starts with whatever information is immediately available as a reference point or "anchor" and then adjusts from it. In the case of the random-digit anchor, the result was the self-harming penalty of being willing to pay too much. Those who fall prey to the availability and anchoring heuristics are not more feeble-minded or gullible than others. Each of us is an ever-ready victim. Property values can be altered by manipulated price anchors that inflate or deflate the actual price.
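The strength of the pattern Ariely reported can be illustrated with a minimal sketch. Only the five average bids come from the text; the anchors below are assumed midpoints of each Social-Security-digit band, and Pearson correlation stands in for whatever statistic the study actually used, so this is an illustration rather than a reconstruction of the analysis.

```python
# Illustrative sketch of the anchoring correlation.
# Group-mean bids for the trackball are quoted in the text;
# anchors are assumed band midpoints (00-19 -> 10, 20-39 -> 30, ...).
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

anchors = [10, 30, 50, 70, 90]               # assumed band midpoints
bids = [8.62, 11.82, 13.45, 21.18, 26.18]    # group means from the text

r = pearson(anchors, bids)
# With these numbers r comes out strongly positive (about 0.97),
# far from the zero a logically irrelevant anchor should produce.
print(f"correlation between SSN anchor and bid: r = {r:.2f}")
```

Even this toy version shows why a random two-digit number, once written down, can drag a later price estimate toward it.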
The valuation of stocks can be influenced more by their suggested market price than actual value, perhaps providing some of the explanation for the persistence of financial bubbles.12

Excerpted from Blindspot: Hidden Biases of Good People by Mahzarin R. Banaji and Anthony G. Greenwald. All rights reserved by the original copyright owners. Excerpts are provided for display purposes only and may not be reproduced, reprinted or distributed without the written permission of the publisher.