book: Thinking Fast and Slow, review part 1

A very good book; would have been a great book but for one mid-course breakdown. 400+ pages of (what unfortunately isn’t) common sense and worth the effort, even accelerating from start to finish. An enthralling and varied road trip across the continent of Kahneman’s career at the crossroads of economics and psychology, the career that won him the 2002 Nobel Prize in Economics. And marred only by one breakdown, but it’s a dandy: a mid-trip totaling of the entire vehicle that breaks one’s leg, forcing any alert reader to hop painfully to some distant hospital before being allowed to continue the excellent trip as the distaste fades. (The metaphor isn’t perfect, but that’s how it felt.)

And the trip is mostly good. The author skilfully recounts his witnessing (and often participating in or even leading) the past 40 years’ advances in economic psychology.

I took in some good new ideas, often wondering how in the world I could have missed them myself! And, less forgivably, I came across ideas I had already used in one area but had never thought of applying in another. One example, from page 85:

The principle of independent judgments (and decorrelated errors) has immediate applications for the conduct of meetings…A simple rule can help: before an issue is discussed, all members of the committee should be asked to write a very brief summary of their position…The standard practice of open discussion gives too much weight to the opinions of those who speak early and assertively, causing others to line up behind them.

How could I have missed this? Egad, I who have spent so many years ensuring that whatever measurement errors I can’t outsmart are at least decorrelated, to minimize their end effects. How could I have missed this pure Bayesian admonition to benefit maximally from supposedly independent pieces of evidence by…ensuring they are really independent? Now, this bit of advice is not a complete solution, but it need not be complete to be useful.

I say it’s incomplete because there are three possible (I would say probable) kinds of correlation among the committee members’ thinking: one good and two bad, and the above idea deals with only one of them:

  1. One bad correlation: members influencing each other, right or wrong. Kahneman’s recommended pre-scribble exercise removes this one. Great. (I slap my head yet again.)
  2. But another bad source of correlation: too much similarity among the members themselves, naturally tending to restrict the range of possible solutions and making those solutions correlate more than they should. This is why we invite unlike members onto a committee, or at least show outsiders our work before committing to it. You need not invite giraffes to serve on the committee, and you need not make permanent invitations to habitual troublemakers, though their occasional presence will probably do more good than harm. Even consultants might help–unless they’re too eager to please the committee (in which case: see previous correlation!).
  3. The good correlation: the members agree because, being well chosen, most of them are more often in touch with reality than not. This is what you hope for! And one important way to preserve this good correlation is to not overwhelm it with the two bad ones.
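
To put a rough number on what the bad correlations cost, here is a minimal numerical sketch of my own (not from the book, and every figure in it is an illustrative assumption): nine committee members each estimate the same quantity with errors that share a given pairwise correlation, and we watch how the accuracy of their averaged estimate degrades as that correlation grows.

    import numpy as np

    # Toy model (my sketch, not Kahneman's): nine committee members each
    # estimate the same true value with unit-variance errors sharing a
    # pairwise correlation rho. How noisy is their averaged estimate?
    rng = np.random.default_rng(0)
    n_members, n_trials, true_value = 9, 100_000, 100.0

    def committee_spread(rho):
        """Std. dev. of the committee average for a given pairwise
        correlation rho between members' individual errors."""
        cov = np.full((n_members, n_members), rho)   # off-diagonal: rho
        np.fill_diagonal(cov, 1.0)                   # each member: unit variance
        errors = rng.multivariate_normal(np.zeros(n_members), cov, size=n_trials)
        return (true_value + errors).mean(axis=1).std()

    for rho in (0.0, 0.3, 0.6, 0.9):
        print(f"error correlation {rho:.1f} -> spread of committee average "
              f"{committee_spread(rho):.3f}")

    # With independent errors the spread falls roughly as 1/sqrt(9), about 0.33;
    # as rho approaches 1 the nine opinions collapse into one and averaging
    # buys almost nothing (the spread stays near 1).

The pre-scribble exercise attacks the correlation that comes from members influencing each other; choosing unlike members attacks the correlation that was there before anyone spoke.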

The book’s first third covers the author’s System 1 and System 2, representing two ways you solve problems. Since “the-first-damn-thing-that-pops-into-your-head” sounds a bit inelegant, the author calls this System 1. Thinking Fast. What you use when you reflect and use what you already know, together with a sense of how sure you are, or not: that’s System 2. Thinking Slow. Though I don’t think the author says so, I believe he would consider the two systems of thought equally valid, each in its place.

I don’t agree. At all. Indeed: I could call “System 1 is just as valid as System 2” real fighting words–except that fighting is so System 1.

There are two cases where System 1 works well: (1) when fast response is critical, and (2) when the decision doesn’t matter. Kahneman gives fine examples of each: (1) jerking the steering wheel to avoid X, and (2) deciding which frivolous treat to buy at the counter. But these examples are either unavoidable or trivial. Real intelligence lies elsewhere.

And even in the two cases where System 1 seems to be OK: hey, are we really so helpless in the face of First Reactions? Whatever happened to practicing something–even if only in thought–so that System 2 responses become System 1 responses? Even if everyone’s System 1 responses are similar–a very dubious assertion–what about faster and faster censoring of System 1 by System 2? How would anyone tell the difference? I’ll even assert that this is one’s primary path to achieving intelligence: faster and faster System 2 response, at least to suppress System 1’s damage until System 2 can fully take charge. It’s not magic, and it’s hardly new, as Samuel Johnson advised:

What we hope ever to do with ease, we must learn first to do with diligence.
Rephrasing in Kahneman’s terms: Work as long as you have to, to get your System 2 to dominate your System 1. Always and completely. Without regret, without mercy. You’ll never quite get there, and there may be trivial cases where System 1 has a place, but your general direction should be: Death to System 1.

...life intrudes. In a week or so, in a separate post, I will relate more of Kahneman's ideas, and describe his one car wreck: how the chapter "Bernoulli's Mistake" is really Kahneman's Mistake.
