When people face an uncertain situation, they don't carefully evaluate the information or look up relevant statistics. Instead, their decisions depend on mental shortcuts, which often lead them to make foolish decisions. The shortcuts aren't a faster way of doing the math; they're a way of skipping the math altogether...The biases and blind spots identified by Messrs. Kahneman and Tversky aren't symptoms of stupidity. They're an essential part of our humanity, the inescapable byproducts of a brain that evolution engineered over millions of years.
Consider the overconfidence bias, which drives many of our mistakes in decision-making. The best demonstration of the bias comes from the world of investing. Although many fund managers charge high fees to oversee stock portfolios, they routinely fail a basic test of skill: persistent achievement. As Mr. Kahneman notes, for the vast majority of funds the correlation between performance in one year and the next is barely above zero, which suggests that most successful managers are banking on luck, not talent...This shouldn't be too surprising. The stock market is a case study in randomness, a system so complex that it's impossible to predict. Nevertheless, professional investors routinely believe that they can see what others can't. The end result is that they make far too many trades, with costly consequences.
We like to see ourselves as a Promethean species, uniquely endowed with the gift of reason. But Mr. Kahneman's simple experiments reveal a very different mind, stuffed full of habits that, in most situations, lead us astray. Though overconfidence may encourage us to take necessary risks—Mr. Kahneman calls it the "engine of capitalism"—it's generally a dangerous (and expensive) illusion.
What's even more upsetting is that these habits are virtually impossible to fix. As Mr. Kahneman himself admits, "My intuitive thinking is just as prone to overconfidence, extreme predictions and the planning fallacy as it was before I made a study of these issues."...Even when we know why we stumble, we still find a way to fall.
Wednesday, October 19, 2011
The science of irrationality.
As a follow-up to yesterday's post on how common sense, ideology and intuition lead us astray in our attempts to fix social problems - while social intervention programs that have been validated by true randomized experiments are ignored - I point to Jonah Lehrer's brief review of Daniel Kahneman's new book "Thinking, Fast and Slow," which describes Kahneman's work on evolved blind spots in our rational processes - blind spots that appear to be virtually impossible to fix, even when we understand that they are there.