I want to mention a rambunctious essay by Kevin Simler, "Crony Beliefs," that a MindBlog reader pointed me to recently. It deals with the same issue as the previous post: why facts don't change people's minds. I suggest reading the whole article. Here are a few clips.
I contend that the best way to understand all the crazy beliefs out there — aliens, conspiracies, and all the rest — is to analyze them as crony beliefs. Beliefs that have been "hired" not for the legitimate purpose of accurately modeling the world, but rather for social and political kickbacks.
As Steven Pinker says,
"People are embraced or condemned according to their beliefs, so one function of the mind may be to hold beliefs that bring the belief-holder the greatest number of allies, protectors, or disciples, rather than beliefs that are most likely to be true."
The human brain has to strike an awkward balance between two different reward systems:
- Meritocracy, where we monitor beliefs for accuracy out of fear that we'll stumble by acting on a false belief; and
- Cronyism, where we don't care about accuracy so much as whether our beliefs make the right impressions on others.
And so we can roughly (with some caveats) divide our beliefs into merit beliefs and crony beliefs. Both contribute to our bottom line — survival and reproduction — but they do so in different ways: merit beliefs by helping us navigate the world, crony beliefs by helping us look good.
...our brains are incredibly powerful organs, but their native architecture doesn't care about high-minded ideals like Truth. They're designed to work tirelessly and efficiently — if sometimes subtly and counterintuitively — in our self-interest. So if a brain anticipates that it will be rewarded for adopting a particular belief, it's perfectly happy to do so, and doesn't much care where the reward comes from — whether it's pragmatic (better outcomes resulting from better decisions), social (better treatment from one's peers), or some mix of the two. A brain that didn't adopt a socially-useful (crony) belief would quickly find itself at a disadvantage relative to brains that are more willing to "play ball." In extreme environments, like the French Revolution, a brain that rejects crony beliefs, however spurious, may even find itself forcibly removed from its body and left to rot on a pike. Faced with such incentives, is it any wonder our brains fall in line?
And, the final portion of Simler's essay:
...it's ... clueless (if well-meaning) to focus on beefing up the "meritocracy" within an individual mind. If you give someone the tools to purge their crony beliefs without fixing the ecosystem in which they're embedded, it's a prescription for trouble. They'll either (1) let go of their crony beliefs (and lose out socially), or (2) suffer more cognitive dissonance in an effort to protect the cronies from their now-sharper critical faculties.
The better — but much more difficult — solution is to attack epistemic cronyism at the root, i.e., in the way others judge us for our beliefs. If we could arrange for our peers to judge us solely for the accuracy of our beliefs, then we'd have no incentive to believe anything but the truth.
In other words, we do need to teach rationality and critical thinking skills — not just to ourselves, but to everyone at once. The trick is to see this as a multilateral rather than a unilateral solution. If we raise epistemic standards within an entire population, then we'll all be cajoled into thinking more clearly — making better arguments, weighing evidence more evenhandedly, etc. — lest we be seen as stupid, careless, or biased.
The beauty of Less Wrong, then, is that it's not just a textbook: it's a community. A group of people who have agreed, either tacitly or explicitly, to judge each other for the accuracy of their beliefs — or at least for behaving in ways that correlate with accuracy. And so it's the norms of the community that incentivize us to think and communicate as rationally as we do.
All of which brings us to a strange and (at least to my mind) unsettling conclusion. Earlier I argued that other people are the cause of all our epistemic problems. Now I find myself arguing that they're also our best solution.
A bit of a pity that an article dealing with "motivated reasoning" ends with motivated reasoning:
ReplyDelete"Now I find myself arguing that they're also our best solution." What is has to do with reality?
I guess his argument is that you need to use the same group mechanism that can reinforce irrationality to reinforce rationality. Use irrational means in the pursuit of rationality.