Quotes & Commentary #74: Kahneman

We are prone to overestimate how much we understand about the world and underestimate the role of chance.

—Daniel Kahneman

Kahneman’s book, Thinking, Fast and Slow, is one of the most subtly disturbing books that I have ever read. This is due to Kahneman’s ability to undermine our delusions. The book is one long demonstration that we are not nearly as clever as we think we are. Not only that, but the architecture of our brains makes us blind to our own blindness. We commit systematic cognitive errors repeatedly without ever suspecting that we do so. We make a travesty of rationality while considering ourselves the most reasonable of creatures.

As Kahneman repeatedly demonstrates, we are particularly bad when it comes to statistical information: distributions, tendencies, randomness. Our brains seem unable to come to terms with chance. Kahneman cites the “hot hand” illusion in basketball—the belief that a player is more likely to make the next shot after making the last one—as an example of our insistence on projecting tendencies onto random data. Another example comes from history. Looking at the bombed-out areas of their city, Londoners began to suspect that the German Luftwaffe was deliberately sparing certain sections of the city for some unknown reason. However, mathematical analysis revealed that the bombing pattern was consistent with randomness.
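Out of curiosity, here is a little Python sketch of that last point (the grid size and number of hits are arbitrary choices of mine, not the actual wartime analysis): scatter impacts uniformly at random over a grid of city blocks and count how often each block is hit. Pure chance still leaves some blocks untouched and others struck several times—exactly the sort of pattern that invites talk of “sparing” and “targeting.”

```python
import random
from collections import Counter

# A toy simulation: scatter hits uniformly at random over a grid of
# city blocks, then tally how many blocks were struck 0, 1, 2, ... times.
random.seed(1)
GRID = 24      # 24 x 24 = 576 blocks (arbitrary)
HITS = 500     # 500 random impacts (arbitrary)

counts = Counter()
for _ in range(HITS):
    block = (random.randrange(GRID), random.randrange(GRID))
    counts[block] += 1

histogram = Counter(counts.values())
histogram[0] = GRID * GRID - len(counts)   # blocks never hit at all
for k in sorted(histogram):
    print(f"blocks hit {k} time(s): {histogram[k]}")
```

Even though every block is equally likely to be hit, the printout shows a wide spread—many blocks escape entirely while a few are struck three or four times—so an uneven map is no evidence of deliberate targeting.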

The fundamental error we humans commit is a refusal to see chance. Chance, for our brains, is not an explanation at all, but rather something to be explained. We want to see a cause—a reason why something is the way it is. We do this automatically. When we see videos of animals, we automatically attribute to them human emotions and motivations. Even when we talk about inanimate things like genes and computers, we cannot help using causal language. We say genes “want” to reproduce, or a computer “is trying” to do such and such. 

A straightforward consequence of this is our tendency to see ourselves as “in control” of random outcomes. The best example of this may be the humble lottery ticket. Even though the chance of winning the lottery is necessarily equal for any given number, people are more likely to buy a ticket when they can pick their own number. This is because of the illusion that some numbers are “luckier” than others, or that there is a skill involved in choosing a number. This is called the “illusion of control,” and it is pervasive. Just as we humans cannot help seeing events as causally connected, we also crave a sense of control: we want to be that cause.

The illusion of control is well-demonstrated, and is likely one of the psychological underpinnings of religious and superstitious behavior. When we are faced with an outcome largely out of our control, we grasp at straws. This is famously true in baseball, especially with batters, who are notoriously prone to superstition. When even the greatest possible skill cannot guarantee regular outcomes, we supplement skill with “luck”—lucky clothes, lucky foods, lucky routines, and so on.

The origins of our belief in gods may have something to do with this search for control. We tend to see natural events like droughts and plagues as impregnated with meaning, as if they were brought about by conscious agents like us. And as our ancestors strove to influence the natural world with ritual, we imagined ourselves as causes, too—as able to control, to some extent, the wrath of the gods by appeasing them.

As with all cognitive illusions, these notions are insulated from negative evidence. Disproving them is all but impossible. Once you are convinced that you are in control of an event, any (random) success will reinforce your idea, and any (random) failure can be attributed to some slight mistake on your part. The logic is thus self-reinforcing. If, for example, you believe that eating carrots will guarantee that you hit a home run, then any failure to hit a home run can be attributed to not having eaten just the right amount of carrots, at the right time, and so on. This sort of logic is so nefarious because, once you begin to think along these lines, experience cannot provide the route out.

I bring up this illusion because I cannot help seeing instances of it in our response to the coronavirus. In the face of danger and uncertainty, everyone naturally wants a sense of control. We ask: What can I do to make myself safe? Solutions take the form of strangely specific directives: stay six feet apart, wash your hands for twenty seconds, sneeze into your elbow. And many people are going further than the health authorities are advising—wearing masks and gloves even when they are not sick, disinfecting all their groceries, obsessively cleaning clothes, and so on. I even saw a man the other day who put little rubber bags on his dog’s paws, presumably so that the dog would not track coronavirus back into the house. 

Now, do not get me wrong: I think we should follow the advice of the relevant specialists. But there does seem to be some uncertainty in the solutions, which does not inspire confidence. For example, here in Europe we are being told to stand one meter apart, while in the United States the distance is six feet—nearly twice as far. Here in Spain I have seen recommendations for handwashing of between 40 and 60 seconds, while in the United States it is 20 seconds. It is difficult to resist the conclusion that these numbers are arbitrary.

If Michael Osterholm is to be believed—and he is one of the United States’ top experts on infectious disease—then many of these measures are not based on hard evidence. According to him, it is quite possible that the virus spreads more than six feet in the air. And he doubts that all of our disinfecting has much effect on the virus’s spread, as he thinks that it is primarily not through surface contact but through breathing that we catch it. Keep in mind that, a week or so ago, we were told that we could stop it through handwashing and avoiding touching our own faces.

Telling people that they are powerless is not, however, a very inspiring message. Perhaps there is good psychology in advocating certain rituals, even if they are not particularly effective, since it can aid compliance in other, more effective, measures like social distancing. Rituals do serve a purpose, both psychological and social. Rituals help to focus people’s attention, to reassure them, and to increase social cohesion. These are not negligible benefits. 

So far, I think that the authorities have only been partially effective in their messaging to the public. They have been particularly bad when it comes to masks. This is because the public was told two contradictory messages: Masks are useless, and doctors and nurses need them. I think people caught on to this dissonance, and thus continued to buy and wear masks in large numbers. Meanwhile, the truth seems to be that masks, even surgical masks, are better than nothing (though Osterholm is very skeptical of that). Thus, if the public were told this truth—that masks might help a little bit, but since we do not have enough of them we ought to let healthcare workers use them—perhaps there would be less hoarding. 

Another failure on the mask front has been due to bad psychology. People were told only to wear masks when they were sick. However, if we follow this measure, masks will become a mark of infection, and will instantly turn wearers into pariahs. (What is more, many people are infectious when they do not know it.) In this case, ritualistic use of masks may be wise, since it will eliminate the shame while perhaps marginally reducing infection rates.

The wisest course, then, may indeed involve a bit of ritual, at least for the time being. In the absence of conclusive evidence for many of these measures, it is likely the best that we can do. I will certainly abide by what the health authorities instruct me to do. But the lessons of psychology do cause a little pinprick in my brain, as I repeatedly wonder if we are just grasping at a sense of control in a situation that is largely beyond our power to control.

I certainly crave a sense of control. Though I have not been obsessively disinfecting everything in my house, I have been obsessively reading about the virus, hoping that knowledge will give me some sort of power. So far it has not, and I suspect this is not going to change.

Review: Thinking, Fast and Slow

Thinking, Fast and Slow by Daniel Kahneman

My rating: 5 of 5 stars

Nothing in life is as important as you think it is when you are thinking about it.

I think this book is mistitled. For years, I assumed that it was some kind of self-help book about when to trust your gut and when to trust your head, and thus I put off reading it. But Thinking, Fast and Slow is nothing of the sort. As I finally discovered when the book was gifted to me (the ecstatic blurbs in the front pages were the first clue), this book is the summary of Daniel Kahneman’s study of cognitive errors. The book should probably be called: Thinking, Just Not Very Well.

Granted, my initial impression had a grain of truth. Kahneman’s main focus is on what we sometimes call our gut. This is the “fast thinking” of the title, otherwise known as our intuition. But unlike many books on the market, which describe the wonders of human intuition and judgment, this one focuses on how our intuition can systematically fail to draw correct conclusions. So you might say that this is a book about all of the reasons you should distrust your gut.

Every researcher of the mind seems to divide it up into different hypothetical entities. For Freud it was the conscious and unconscious, while for Kahneman there are simply System 1 and System 2. The former is responsible for fast thinking—intuition, gut feelings—while the latter is responsible for slow thinking—deliberative thought, using your head. System 2, while admirably thorough and logical, is also effortful and sluggish. Trying any unfamiliar mental task (such as mental arithmetic) can convince you of this. Thus, we must rely on our fast-acting System 1 for most of any given day.

System 1 generates answers to questions without any experience of conscious deliberation. Most often these answers are reasonable, such as when answering the question “Would you like a hamburger?” (Answer: yes). But, as Kahneman demonstrates, there are many situations in which the answer that springs suddenly to mind is demonstrably false. This would not be a problem if our conscious System 2 detected these falsehoods. Yet our default position is to simply go with our intuition unless we have a strong reason to believe our intuition is misleading. Unfortunately, the brain has no warning system to tell you that your gut feeling is apt to be unreliable. You can call these sorts of situations “cognitive illusions.”

A common theme in these cognitive illusions is a failure of our intuition to deal with statistical information. We are good at thinking in terms of causes and comparisons, but situations involving chance throw us off. As an example, imagine a man who is shy, quiet, and orderly. Is he more likely to be a librarian or a farmer? Now consider the answer that springs to mind (librarian, I assume): how was it generated? Your mind compared the description to the stereotype of a librarian, and made the judgment. But this judgment did not take into account the fact that there are many times more farmers than male librarians.
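To put rough numbers on it (the figures below are my own invention, meant only to illustrate the base-rate point), suppose the description fits 40 percent of male librarians but only 5 percent of farmers, and that there are 20 farmers for every male librarian. A quick calculation still favors the farmer:

```python
# Hypothetical numbers, chosen only to illustrate how the base rate
# outweighs the stereotype in the librarian-or-farmer question.
librarians, farmers = 1, 20          # assumed ratio: 20 farmers per male librarian
p_fit_librarian = 0.40               # assumed: description fits 40% of librarians
p_fit_farmer = 0.05                  # assumed: description fits 5% of farmers

matching_librarians = librarians * p_fit_librarian   # 0.4
matching_farmers = farmers * p_fit_farmer            # 1.0

p_librarian = matching_librarians / (matching_librarians + matching_farmers)
print(f"P(librarian | description) = {p_librarian:.2f}")  # 0.29 -- still probably a farmer
```

Even with a stereotype eight times more diagnostic of librarians, the sheer number of farmers wins out.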

Another example of this failure of intuition is the mind’s tendency to generate causal stories to explain random statistical noise. A famous example of this is the “hot hand” in basketball: interpreting a streak of successful shots as due to the player being especially focused, rather than simply as a result of luck. (Although subsequent research has shown that there was something to the idea after all. So maybe we should not lament our intuitions too much!) Another well-known example is the tendency for traders to attribute their success or failure in the stock market to skill, while Kahneman demonstrated that the rankings of a group of traders from year to year had no correlation at all. The basic point is that we are generally hesitant to attribute something to chance, and instead invent causal stories that “explain” the variation.
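The streakiness of pure chance is easy to check for yourself. Here is a small simulation (the shooting percentage and the number of shots are arbitrary) of a player who makes every shot with a flat 50 percent probability:

```python
import random

# Simulate a "shooter" whose every shot is an independent 50/50 event,
# and record the longest run of made shots in each 20-shot game.
random.seed(7)

def longest_streak(shots):
    best = run = 0
    for made in shots:
        run = run + 1 if made else 0
        best = max(best, run)
    return best

games = 10_000
streaks = [longest_streak([random.random() < 0.5 for _ in range(20)])
           for _ in range(games)]

share = sum(s >= 4 for s in streaks) / games
print(f"games with a streak of 4+ made shots: {share:.0%}")
```

In this sketch, nearly half of the simulated games contain a streak of four or more consecutive makes—streaks that, watched live, would look very much like a hot hand.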

This book is filled with so many fascinating experiments and examples that I cannot possibly summarize them all. Suffice to say that the results are convincing, not only because of the weight of evidence, but mainly because Kahneman is usually able to demonstrate the principle at work on the reader. Our intuitive reactions are remarkably similar, apparently, and I found that I normally reacted to his questions in the way that he predicted. If you are apt to believe that you are a rational person (as I am) it can be quite depressing.

After establishing the groundwork, Kahneman sets his sights on the neighboring discipline of economics. Conventional economic theory presupposes rational actors who are able to weigh risks and to act in accordance with their desires. But, as Kahneman found, this does not hold for actual people. Not only do real humans act irrationally, but real humans deviate systematically from the predictions of the rational-agent model. This means that we humans are (to borrow a phrase from another book in this vein) predictably irrational. Our folly is consistent.

One major finding is that people are loss-averse. We will take a bad deal in order to avoid risk, and yet will take a big risk in order to avoid a loss. This behavior seems to be motivated by an intense fear of regret, and it is the cause of a certain amount of conservatism, not only in economics, but in life. If an action turns out badly, we tend to regret it more if it was an exceptional rather than a routine act (picking up a hitchhiker rather than driving to work, for example), and so people shy away from abnormal options that carry uncertainty.

Yet, logically speaking, there is no reason to regret a special action more than a customary one, just as there is no reason to weigh losses so much more heavily than gains. Of course, there is good evolutionary logic for these tendencies. In a dangerous environment, losing a gamble could mean losing your life, so it is best to stick to the tried-and-true. But in an economic context, this strategy is not usually optimal.
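If you want to see the asymmetry in miniature, here is a sketch using a prospect-theory-style value function (the curvature of 0.88 and the loss-aversion factor of 2.25 are the parameter estimates usually quoted from Kahneman and Tversky’s later work; probability weighting is left out for simplicity):

```python
# A prospect-theory-style value function: gains are valued less steeply
# than losses, and losses are weighted about 2.25 times as heavily.
# (Parameters are the commonly cited estimates; this is only a sketch.)
def value(x, alpha=0.88, lam=2.25):
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

# A 50/50 gamble to win $110 or lose $100 has positive expected value,
# yet its prospect-theoretic value is negative -- so it gets turned down.
print(0.5 * value(110) + 0.5 * value(-100))      # about -33: rejected

# Meanwhile a risky loss is preferred to a sure loss of the same expected size.
print(value(-500))                               # sure loss of $500: about -534
print(0.5 * value(-1000) + 0.5 * value(0))       # 50/50 chance of losing $1000: about -491
```

Nothing in these numbers is Kahneman’s own worked example; it is just the simplest way I know of to show a favorable gamble being refused and an unfavorable one being embraced.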

The last section of the book was the most interesting of all, at least from a philosophical perspective. Kahneman investigates how our memories systematically misrepresent our experiences, which can cause a huge divergence between experienced happiness and remembered joy. Basically, when it comes to memory, intensity matters more than duration, and the peaks and ends of experiences matter more than their averages. The same applies to pain: we may remember one experience as less painful than another just because the pain was mild when it ended. And yet, in terms of measured pain per minute, the first experience may actually have included more experiential suffering.
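The pain example can be put into numbers. With made-up per-minute ratings, a procedure that ends mildly can involve more total pain and yet come out ahead on a peak-end summary:

```python
# Made-up per-minute pain ratings (0-10) for two procedures.
short_procedure = [4, 6, 8, 8]            # ends right at its peak
long_procedure  = [4, 6, 8, 8, 5, 3, 2]   # same start, plus a milder tail

def total_pain(ratings):
    return sum(ratings)

def peak_end(ratings):
    # A crude "remembered" score: average of the worst moment and the last moment.
    return (max(ratings) + ratings[-1]) / 2

for name, ratings in [("short", short_procedure), ("long", long_procedure)]:
    print(f"{name}: total pain = {total_pain(ratings)}, peak-end score = {peak_end(ratings)}")
# short: total 26, peak-end 8.0
# long:  total 36, peak-end 5.0  -- more suffering, but a kinder memory
```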

As a result of this, our evaluations of life satisfaction can often have very little to do with our real, experiential well-being. This presents us with something of a paradox, since we often do things, not for how much joy they will bring us in the moment, but for the nice memory they will create. Think about this: How much money would you spend on a vacation if you knew that every trace of the experience would be wiped out as soon as the vacation ended, including photos and even your memories? The answer for most people is not much, if anything at all. This is why so many people (myself included) frantically take photos on their vacations: the vacation is oriented toward a future remembering-self. But perhaps it is just as well that humans were made this way. If I made my decisions based on what was most pleasant to do in the moment, I doubt I would have made my way through Kant.

This is just a short summary of the book, which certainly does not do justice to the richness of Kahneman’s many insights, examples, and arguments. What can I possibly add? Well, I think I should begin with my few criticisms. Now, it is always possible to criticize the details of psychological experiments—they are artificial, they mainly use college students, etc. But considering the logistical constraints of doing research, I thought that Kahneman’s experiments were all quite expertly done, with the relevant variables controlled and additional work performed to check for competing explanations. So I cannot fault this.

What bothered me, rather, was that Kahneman was profuse in diagnosing cognitive errors, but somewhat reticent when it came to the practical ramifications of these conclusions, or to strategies to mitigate these errors. He does offer some consequences and suggestions, but these are few and far between. Of course, doing this is not his job, so perhaps it is unfair to expect anything of the kind from Kahneman. Still, if anyone is equipped to help us deal with our mental quagmires, he is the man.

This is a slight criticism. A more serious shortcoming was that his model of the mind fails to account for a ubiquitous experience: boredom. According to Kahneman’s rough sketch, System 1 is pleased by familiarity, and System 2 is only activated (begrudgingly, and without much relish) for unfamiliar challenges. Yet there are times when familiarity can be crushing and when novel challenges can be wonderfully refreshing. The situation must be more subtle: I would guess that we are most happy with moderately challenging tasks that take place against a familiar background. In any case, I think that Kahneman overstated our intellectual laziness.

Pop psychology—if this book can be put under that category—is a genre I dip into occasionally. Though there is a lot of divergence in emphasis and terminology, the consensus is arguably more striking. Most authors seem to agree that our conscious mind is rather impotent compared to all of the subconscious control exerted by our brains. Kahneman’s work in the realm of judgments closely parallels Jonathan Haidt’s work in morals: that our conscious mind mostly just passively accepts verdicts handed up from our mental netherworld. Indeed, arguably this was Freud’s fundamental message, too. Yet it is so contrary to all of our conscious experience (as, indeed, it must be) that it still manages to be slightly disturbing.

Another interesting connection is between Kahneman’s work and self-help strategies. It struck me that these cognitive errors are quite directly related to Cognitive Behavioral Therapy, which largely consists of getting patients to spot their own mental distortions (most of which are due to our mind’s weakness with statistics) and correct them. And Kahneman’s work on experiential and remembered well-being has obvious relevance to the mindfulness movement—strategies for switching our attention from our remembering to our experiencing “self.” As you can see from these connections, Kahneman’s research is awfully rich.

Though perhaps not as amazing as the blurbs would have you believe, I cannot help but conclude that this is a thoroughly excellent book. Kahneman gathers many different strands of research together into a satisfying whole. Who would have thought that a book about all the ways that I am foolish would make me feel so wise?


