Reading: “Thinking, Fast and Slow”

If you like thinking about thinking, then Daniel Kahneman’s book “Thinking, Fast and Slow” will delight you. Here’s a bit from the introduction:

Much of the discussion in this book is about biases of intuition. However, the focus on error does not denigrate human intelligence, any more than the attention to diseases in medical texts denies good health. Most of us are healthy most of the time, and most of our judgments and actions are appropriate most of the time. As we navigate our lives, we normally allow ourselves to be guided by impressions and feelings, and the confidence we have in our intuitive beliefs and preferences is usually justified. But not always. We are often confident even when we are wrong, and an objective observer is more likely to detect our errors than we are.

So this is my aim for watercooler conversations: improve the ability to identify and understand errors of judgment and choice, in others and eventually in ourselves, by providing a richer and more precise language to discuss them. In at least some cases, an accurate diagnosis may suggest an intervention to limit the damage that bad judgments and choices often cause.

I note that Kahneman stresses the importance of having a vocabulary to discuss the matter of thinking. I have long believed that a rich vocabulary is indispensable for advancing our understanding of the world.

Kahneman received the Nobel Memorial Prize in Economics in 2002. The prize would certainly have been shared with his long-time collaborator, Amos Tversky, but for the unfortunate fact that Tversky died in 1996.

Kahneman writes:

Amos and I spent several years studying and documenting biases of intuitive thinking in various tasks—assigning probabilities to events, forecasting the future, assessing hypotheses, and estimating frequencies.

In the fifth year of our collaboration, we presented our main findings in Science magazine, a publication read by scholars in many disciplines. The article (which is reproduced in full at the end of this book) was titled “Judgment Under Uncertainty: Heuristics and Biases.” It described the simplifying shortcuts of intuitive thinking and explained some 20 biases as manifestations of these heuristics—and also as demonstrations of the role of heuristics in judgment.

Historians of science have often noted that at any given time scholars in a particular field tend to share basic assumptions about their subject. Social scientists are no exception; they rely on a view of human nature that provides the background of most discussions of specific behaviors but is rarely questioned. Social scientists in the 1970s broadly accepted two ideas about human nature. First, people are generally rational, and their thinking is normally sound. Second, emotions such as fear, affection, and hatred explain most of the occasions on which people depart from rationality. Our article challenged both assumptions without discussing them directly. We documented systematic errors in the thinking of normal people, and we traced these errors to the design of the machinery of cognition rather than to the corruption of thought by emotion.

Our article attracted much more attention than we had expected, and it remains one of the most highly cited works in social science (more than three hundred scholarly articles referred to it in 2010). Scholars in other disciplines found it useful, and the ideas of heuristics and biases have been used productively in many fields, including medical diagnosis, legal judgment, intelligence analysis, philosophy, finance, statistics, and military strategy.

For example, students of policy have noted that the availability heuristic helps explain why some issues are highly salient in the public’s mind while others are neglected. People tend to assess the relative importance of issues by the ease with which they are retrieved from memory—and this is largely determined by the extent of coverage in the media. Frequently mentioned topics populate the mind even as others slip away from awareness. In turn, what the media choose to report corresponds to their view of what is currently on the public’s mind. It is no accident that authoritarian regimes exert substantial pressure on independent media. Because public interest is most easily aroused by dramatic events and by celebrities, media feeding frenzies are common. For several weeks after Michael Jackson’s death, for example, it was virtually impossible to find a television channel reporting on another topic. In contrast, there is little coverage of critical but unexciting issues that provide less drama, such as declining educational standards or overinvestment of medical resources in the last year of life.

My interest in the matter of thinking and biases is motivated by my interest in understanding the political system of India. The way Indians think must have a systematic effect on what kind of government India has — and thus has a fundamental bearing on India’s poverty.

This line by Kahneman — “It is no accident that authoritarian regimes exert substantial pressure on independent media” — resonates with me because I am convinced that the Indian media is not independent. The authoritarian nature of the Indian government is not fully appreciated. But let’s move on.

Michael Lewis has a nice article in the December 2011 issue of Vanity Fair on Kahneman and his book — “The King of Human Error”

Kahneman is a professor emeritus at Princeton, but, as it turned out, he lived during the summers with his wife, Anne Treisman, another well-known psychologist, near my house in Berkeley. Four years ago I summoned the nerve to write him an e-mail, and he invited me for a safe date, a cup of coffee. I found his house on the top of our hill. He opened the door wearing hiking shorts and a shirt not tucked into them, we shook hands, and I said something along the lines of what an honor it was to meet him. He just looked at me a little strangely and said, “Ah, you mean the Nobel. This Nobel Prize stuff, don’t take it too seriously.” He then plopped down into a lounge chair in his living room and began to explain to me, albeit indirectly, why he took such an interest in human unreason. His laptop rested on a footstool and a great many papers and books lay scattered around him. He was then 73 years old. It was tempting to describe him as spry, but the truth is that he felt more alert and alive than most 20-year-olds.

He was working on a book, he said. It would be both intellectual memoir and an attempt to teach people how to think. As he was the world’s leading authority on his subject, and a lot of people would pay hard cash to learn how to think, this sounded promising enough to me. He disagreed: he was certain his book would end in miserable failure. He wasn’t even sure that he should be writing a book, and it was probably just a vanity project for a washed-up old man, an unfinished task he would use to convince himself that he still had something to do, right up until the moment he died. Twenty minutes into meeting the world’s most distinguished living psychologist I found myself in the strange position of trying to buck up his spirits. But there was no point: his spirits did not want bucking up. Having spent maybe 15 minutes discussing just how bad his book was going to be, we moved on to a more depressing subject. He was working, equally unhappily, on a paper about human intuition—when people should trust their gut and when they should not—with a fellow scholar of human decision-making named Gary Klein. Klein, as it happened, was the leader of a school of thought that stressed the power of human intuition, and disagreed with the work of Kahneman and Tversky. Kahneman said that he did this as often as he could: seek out people who had attacked or criticized him and persuade them to collaborate with him. He not only tortured himself, in other words, but invited his enemies to help him to do it. “Most people after they win the Nobel Prize just want to go play golf,” said Eldar Shafir, a professor of psychology at Princeton and a disciple of Amos Tversky’s. “Danny’s busy trying to disprove his own theories that led to the prize. It’s beautiful, really.”

And now back to reading TFaS.

Author: Atanu Dey

Economist.

4 thoughts on “Reading: “Thinking, Fast and Slow””

  1. Atanu,

    I was fortunate to be able to attend a discussion with Daniel Kahneman three days ago, here in London. What came across is his utter humility and – as Michael Lewis describes it – his belief that he is prone to the very same errors of judgment that he ascribes to other humans.

    And yes, Danny Kahneman calls himself a pessimist, insisting that he’s probably missed or overlooked something vitally important that will bring all his theories crashing down.

    Had the interviewer not been a bit of a bore towards the end, it would have made for a fascinating conversation.


  2. Kaushal,

    Lucky you. I was lucky to attend a lecture of his at UC Berkeley in 2001 — before he got the prize. Really good guy. I have the mp3 version of a few of his talks as well.

    Regards,
    Atanu


  3. A review in The Economist, which I read a few days ago, had already aroused my interest in Prof Kahneman’s latest book, and your perspective has added to that – am planning to pick up a copy.

    I will arrive at a firm conclusion only after reading it, but going by your feedback (on what you have read so far), this book, probably along with “Nudge” by Richard Thaler and “Fooled by Randomness” by NN Taleb, could complete a trinity of must-read books for anyone desiring a better grip on how people think and what drives their decisions.

    May I add that the manner in which you have shared (on Twitter) what you’re reading (by not just tweeting the book’s/author’s name but actually putting down your initial thoughts as well) is worth emulating. Thanks.

