Rule One: Search Your Feelings

Luke Skywalker: "No . . . No. That's not true. That's impossible!"
Darth Vader: "Search your feelings, you know it to be true!"
The Empire Strikes Back (1980)

Abraham Bredius was nobody's fool. An art critic and collector, he was the world's leading scholar on Dutch painters, and particularly on the seventeenth-century master Johannes Vermeer. As a young man in the 1880s, Bredius had made his name by spotting works wrongly credited to Vermeer. At the age of eighty-two, in 1937, he was enjoying something of a retirement swan song. He had just published a highly respected book in which he had identified two hundred fakes or imitations of Rembrandt.

It was at this moment in Bredius's life that a charming lawyer named Gerard Boon paid a visit to his Monaco villa. Boon wanted to ask Bredius's opinion of a newly rediscovered work, Christ at Emmaus, thought to have been painted by Vermeer himself. The exacting old man was spellbound. He sent Boon away with his verdict: Emmaus was not only a Vermeer, it was the Dutch master's finest work. "We have here, I am inclined to say, the masterpiece of Johannes Vermeer of Delft," wrote Bredius in a magazine article shortly after. "Quite different from all his other paintings and yet every inch a Vermeer."

"When this masterpiece was shown to me I had difficulty controlling my emotion," he added, noting reverently that the work was ongerept, Dutch for "virginally pure" or "untouched."

It was an ironic choice of words: Emmaus could hardly have been more corrupt. It was a rotten fraud of a painting, stiffly applied to an old canvas just a few months before Bredius caught sight of it, and hardened with Bakelite. Yet this crude trickery caught out not only Bredius but the entire Dutch art world. Christ at Emmaus soon sold for 520,000 guilders to the Boijmans Museum in Rotterdam. Compared with the wages of the time, that is well over $10 million today. Bredius himself contributed to help the museum buy the picture.

Emmaus became the centerpiece of the Boijmans Museum, drawing admiring crowds and rave reviews. Several other paintings in a similar style soon emerged. Once the first forgery had been accepted as a Vermeer, it was easier to pass off these other fakes. They didn't fool everyone, but like Emmaus they fooled the people who mattered. Critics certified them; museums exhibited them; collectors paid vast sums for them: a total of more than $100 million in today's money.

In financial terms alone, this was a monumental fraud. But there was more. The Dutch art world revered Vermeer as one of the greatest painters who ever lived. Painting mostly in the 1660s, he had been rediscovered only in the late 1800s. Fewer than forty of his works survive. The apparent emergence of half a dozen Vermeers in just a few years was a major cultural event.

It was also an event that should have strained credulity. But it did not. Why?

Don't look to the paintings themselves for an answer. If you compare a genuine Vermeer with the first forgery, Emmaus, it is hard to understand how anyone was fooled, let alone anyone as discerning as Abraham Bredius. Vermeer was a true master. His most famous work is Girl with a Pearl Earring, a luminous portrait of a young woman: seductive, innocent, adoring, and anxious all at once. The painting inspired a novel, and a movie starring Scarlett Johansson as the unnamed girl.
In The Milkmaid, a simple scene of domesticity is lifted by details such as the rendering of a copper pot, and a display of fresh-baked bread that looks good enough to grab out of the painting.

Then there's Woman Reading a Letter. She stands in the soft light of an unseen window. Is she, perhaps, pregnant? We see her in profile as she holds the letter close to her chest, eyes cast down as she reads. There's a dramatic stillness about the image: we feel that she's holding her breath as she scans the letter for news; we hold our breath, too. A masterpiece.

And Christ at Emmaus? It's a static, awkward image by comparison. Rather than seeming an inferior imitation of Vermeer, it doesn't look like a Vermeer at all. It's not a terrible painting, but it's not a brilliant one either. Set alongside Vermeer's works it seems dour and clumsy. And yet it, and several others, fooled the world, and might have continued to fool the world to this day, had the forger not been caught out by a combination of recklessness and bad luck.

In May 1945, with the war in Europe at an end, two officers from the Allied Art Commission knocked on the door of 321 Keizersgracht, one of Amsterdam's most exclusive addresses. They were met by a charismatic little man called Han van Meegeren. The young van Meegeren had enjoyed some brief success as an artist. In middle age, as his jowls had loosened and his hair had silvered, he had grown rich as an art dealer. But perhaps he had been dealing art with the wrong people, because the officers came with a serious charge: that van Meegeren had sold Johannes Vermeer's newly discovered masterpiece, The Woman Taken in Adultery, to a German Nazi. And not just any Nazi, but Hitler's right-hand man, Hermann Göring.

Van Meegeren was arrested and charged with treason. He responded with furious denials, trying to bluster his way to freedom. His forceful, fast-talking manner was usually enough to get him out of a sticky situation. Not this time. A few days into his incarceration, he cracked. He confessed not to treason but to a crime that caused astonishment across the Netherlands and the art world as a whole. "Fools!" he sneered. "You think I sold a priceless Vermeer to Göring? There was no Vermeer! I painted it myself."

Van Meegeren admitted painting not only the work that had been found in Nazi hands, but Christ at Emmaus and several other supposed Vermeers. The fraud had unraveled not because anyone spotted these flawed forgeries, but because the forger himself confessed. And why wouldn't he? Selling an irreplaceable Vermeer masterpiece to the Nazis would have been a hanging offense, whereas selling a forgery to Hermann Göring wasn't just forgivable, it was admirable.

But the question remains: How could a man as expert as Abraham Bredius have been fooled by so crass a forgery? And why begin a book about statistics with a tale that has nothing at all to do with numbers?

The answer to both questions is the same: when it comes to interpreting the world around us, we need to realize that our feelings can trump our expertise. When Bredius wrote, "I had difficulty controlling my emotion," he was, alas, correct. Nobody had more skill or knowledge than Bredius, but van Meegeren understood how to turn Bredius's skill and knowledge into a disadvantage. Working out how van Meegeren fooled Bredius teaches us much more than a footnote in the history of art; it explains why we buy things we don't need, fall for the wrong kind of romantic partner, and vote for politicians who betray our trust.
In particular, it explains why we so often buy into statistical claims that even a moment's thought would tell us cannot be true. Van Meegeren wasn't an artistic genius, but he intuitively understood something about human nature. Sometimes, we want to be fooled.

We'll return to the cause of Abraham Bredius's error in a short while. For now, it's enough to understand that his deep knowledge of Vermeer's paintings proved to be a liability rather than an asset. When he saw Christ at Emmaus, Bredius was undone by his emotional response. The same trap lies in wait for any of us.

The aim of this book is to help you be wiser about statistics. That means I also need to help you be wiser about yourself. All the statistical expertise in the world will not prevent your believing claims you shouldn't believe and dismissing facts you shouldn't dismiss. That expertise needs to be complemented by control of your own emotional reactions to the statistical claims you see.

In some cases there's no emotional reaction to worry about. Say I tell you Mars is more than 50 million kilometers, or 30 million miles, from Earth. Very few people have a passionately held belief about that claim, so you can start asking sensible questions immediately. For example: Is 30 million miles a long way? (Sort of. It's more than a hundred times farther than the distance between Earth and the moon. Other planets are a lot farther away, though.) Hang on, isn't Mars in a totally different orbit? Doesn't that mean the distance between Earth and Mars varies all the time? (Indeed it does. The minimum distance between the two planets is a bit more than 30 million miles, but sometimes Mars is more than 200 million miles away.) Because there is no emotional response to the claim to trip you up, you can jump straight to trying to understand and evaluate it.

It's much more challenging when emotional reactions are involved, as we've seen with smokers and cancer statistics. The psychologist Ziva Kunda found the same effect in the lab when she showed experimental subjects an article laying out the evidence that coffee or other sources of caffeine could increase the risk to women of developing breast cysts. Most people found the article pretty convincing. Women who drank a lot of coffee did not.

We often find ways to dismiss evidence that we don't like. And the opposite is true, too: when evidence seems to support our preconceptions, we are less likely to look too closely for flaws.

The more extreme the emotional reaction, the harder it is to think straight. What if your doctor told you that you had a rare form of cancer, and advised you not to look it up? What if you ignored that advice, consulted the scientific literature, and discovered that the average survival time was just eight months? Exactly that situation confronted Stephen Jay Gould, a paleontologist and wonderful science writer, at the age of forty. "I sat stunned for about fifteen minutes . . ." he wrote in an essay that has become famous. You can well imagine his emotions. Eight months to live. Eight months to live. Eight months to live. "Then my mind started to work again, thank goodness."

Once his mind did start to work, Gould realized that his situation might not be so desperate. The eight months wasn't an upper limit. It was the median average, which means that half of sufferers live longer than that. Some, possibly, live a great deal longer. Gould had a good chance: he was fairly young; his cancer had been spotted early; he'd get good treatment.
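
It's worth pausing on just how much the word "median" changes the picture; a toy simulation makes the point concrete. The Python sketch below is purely illustrative, not Gould's actual data: it assumes survival times follow a right-skewed log-normal distribution pinned to a median of eight months, with a spread chosen only for the demonstration.

    import math
    import random

    # A minimal sketch, not real patient data: assume survival times (in months)
    # follow a right-skewed log-normal distribution with a median of eight months.
    random.seed(42)

    mu = math.log(8)   # a log-normal's median is exp(mu): pinned here to 8 months
    sigma = 1.5        # spread of the distribution; an assumed value for illustration

    survival = sorted(random.lognormvariate(mu, sigma) for _ in range(100_000))

    median = survival[len(survival) // 2]
    mean = sum(survival) / len(survival)
    ten_years_plus = sum(1 for t in survival if t >= 120) / len(survival)

    print(f"median survival:     {median:.1f} months")   # ~8, as the literature said
    print(f"mean survival:       {mean:.1f} months")     # pulled well above 8 by the long tail
    print(f"surviving 10+ years: {ten_years_plus:.1%}")  # a real fraction of long-term survivors

Half of the simulated patients do die within eight months, but the other half are spread along a long right tail, and which half you land in depends on exactly the factors Gould identified.
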
Gould's doctor was being kind in trying to steer him away from the literature, and many of us will go to some lengths to avoid hearing information we suspect we might not like. In another experiment, students had a blood sample taken and were then shown a frightening presentation about the dangers of herpes; they were told that their blood sample would be tested for the herpes virus. Herpes can't be cured, but it can be managed, and there are precautions a person can take to prevent transmitting the virus to sexual partners, so it would be useful to know whether or not you have it. Nevertheless, a significant minority, one in five, not only preferred not to know whether they were infected but were willing to pay good money to have their blood sample discarded instead. They told researchers they simply didn't want to face the anxiety.

Behavioral economists call this "the ostrich effect." For example, when stock markets are falling, people are less likely to log in to check their investment accounts online. That makes no sense. If you use information about share prices to inform your investment strategy, you should be just as keen to get it in bad times as good. And if you don't, there's little reason to log in at all, so why check your account so frequently when the market is rising?

It is not easy to master our emotions while assessing information that matters to us, not least because our emotions can lead us astray in different directions. Gould realized he hadn't been thinking straight because of the initial shock, but how could he be sure, when he spotted those signs of hope in the statistics, that he wasn't now in a state of denial? He couldn't. With hindsight, he wasn't: he lived for another twenty years, and died of an unrelated condition.

We don't need to become emotionless processors of numerical information; just noticing our emotions and taking them into account may often be enough to improve our judgment. Rather than requiring superhuman control over our emotions, we need simply to develop good habits. Ask yourself: How does this information make me feel? Do I feel vindicated or smug? Anxious, angry, or afraid? Am I in denial, scrambling to find a reason to dismiss the claim?

I've tried to get better at this myself. A few years ago, I shared a graph on social media that showed a rapid increase in support for same-sex marriage. As it happens, I have strong feelings about the matter, and I wanted to share the good news. Pausing just long enough to note that the graph seemed to come from a reputable newspaper, I retweeted it. The first reply was "Tim, have you looked at the axes on that graph?" My heart sank. Five seconds looking at the graph would have told me that it was a mess, with the timescale a jumble that distorted the rate of progress. Approval for marriage equality was increasing, as the graph showed, but I should have clipped it for my "bad data visualization" file rather than eagerly sharing it with the world. My emotions had gotten the better of me.

I still make that sort of mistake, but less often, I hope. I've certainly become more cautious, and more aware of the behavior when I see it in others. It was very much in evidence in the early days of the coronavirus epidemic, as helpful-seeming misinformation spread even faster than the virus itself.
One viral post, circulating on Facebook and in email newsgroups, all too confidently explained how to distinguish between COVID-19 and a cold, reassured people that the virus was destroyed by warm weather, and wrongly advised that ice water should be avoided while warm water kills any virus. The post, sometimes attributed to "my friend's uncle," sometimes to the "Stanford hospital board" or some blameless and uninvolved pediatrician, was occasionally accurate but generally speculative and misleading. Yet people, normally sensible people, shared it again and again and again. Why? Because they wanted to help others. They felt confused, they saw apparently useful advice, and they felt impelled to share. That impulse was only human, and it was well-meaning, but it was not wise.

Excerpted from The Data Detective: Ten Easy Rules to Make Sense of Statistics by Tim Harford.