When people face an uncertain situation, they don't carefully evaluate the information or look up relevant statistics. Instead, their decisions depend on mental shortcuts, which often lead them to make foolish decisions. The shortcuts aren't a faster way of doing the math; they're a way of skipping the math altogether...The biases and blind spots identified by Messrs. Kahneman and Tversky aren't symptoms of stupidity. They're an essential part of our humanity, the inescapable byproducts of a brain that evolution engineered over millions of years.
Consider the overconfidence bias, which drives many of our mistakes in decision-making. The best demonstration of the bias comes from the world of investing. Although many fund managers charge high fees to oversee stock portfolios, they routinely fail a basic test of skill: persistent achievement. As Mr. Kahneman notes, the year-to-year correlation between the performance of the vast majority of funds is barely above zero, which suggests that most successful managers are banking on luck, not talent...This shouldn't be too surprising. The stock market is a case study in randomness, a system so complex that it's impossible to predict. Nevertheless, professional investors routinely believe that they can see what others can't. The end result is that they make far too many trades, with costly consequences.
We like to see ourselves as a Promethean species, uniquely endowed with the gift of reason. But Mr. Kahneman's simple experiments reveal a very different mind, stuffed full of habits that, in most situations, lead us astray. Though overconfidence may encourage us to take necessary risks—Mr. Kahneman calls it the "engine of capitalism"—it's generally a dangerous (and expensive) illusion.
What's even more upsetting is that these habits are virtually impossible to fix. As Mr. Kahneman himself admits, "My intuitive thinking is just as prone to overconfidence, extreme predictions and the planning fallacy as it was before I made a study of these issues."...Even when we know why we stumble, we still find a way to fall.
Wednesday, October 19, 2011
The science of irrationality.
As a follow-up to yesterday's post on how common sense, ideology, and intuition lead us astray in our attempts to fix social problems, while social intervention programs validated by true randomized experiments are ignored, I point to Jonah Lehrer's brief review of Daniel Kahneman's new book "Thinking, Fast and Slow," which describes Kahneman's work on evolved blind spots in our rational processes. These blind spots appear to be virtually impossible to fix, even though we understand that they are there.
Posted by Deric Bownds at 4:30 AM
Blog Categories: culture/politics, evolutionary psychology, psychology
I just have to look back to my history to validate what you say, what a shame.
I have come to the conclusion that the only way to overcome this problem is to transcend human nature, to move ahead, to stop being human, to achieve a new mind/brain structure. Probably that is enlightenment. Poor humans, poor us.
Trivial point: you mean Jonah Lehrer, not Johan.
Typo. Thanks. Changed it.
Human societies go through phases where they value logic over intuition, and vice versa. We are indeed living through a stage where intuition has been placed on a pedestal (despite what you might think), but Kahneman exposes intuition's blind spots and bigoted nature.
But it has never been the dichotomy of reason vs intuition. We have always needed both, at their best, to be at our best.
Irrational fears -- such as the fashionable carbon hysteria of climate fame -- are ample evidence of how the masses allow their intuitions to be driven by outsiders with powerful megaphones.