A number of articles are now appearing that suggest that the ascendancy of Donald Trump, the devotion of his supporters, and their indifference to facts (which are derided as "fake news") are explained by our evolutionary psychology. In this vein, a lucid piece by Elizabeth Kolbert in The New Yorker should be required reading for anyone wanting to understand why so many reasonable-seeming people so often behave irrationally. She cites Mercier and Sperber (authors of "The Enigma of Reason"), who
...point out that reason is an evolved trait, like bipedalism or three-color vision. It emerged on the savannas of Africa, and has to be understood in that context...Humans’ biggest advantage over other species is our ability to coöperate. Coöperation is difficult to establish and almost as difficult to sustain. For any individual, freeloading is always the best course of action. Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups...Habits of mind that seem weird or goofy or just plain dumb from an “intellectualist” point of view prove shrewd when seen from a social “interactionist” perspective.
Of the many forms of faulty thinking that have been identified, confirmation bias - the tendency people have to embrace information that supports their beliefs and reject information that contradicts them - is among the best catalogued; it’s the subject of entire textbooks’ worth of experiments...Mercier and Sperber prefer the term “myside bias.” Humans, they point out, aren’t randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own.
This lopsidedness, according to Mercier and Sperber, reflects the task that reason evolved to perform, which is to prevent us from getting screwed by the other members of our group. Living in small bands of hunter-gatherers, our ancestors were primarily concerned with their social standing, and with making sure that they weren’t the ones risking their lives on the hunt while others loafed around in the cave. There was little advantage in reasoning clearly, while much was to be gained from winning arguments.
Kolbert also points to work by Sloman and Fernbach (authors of “The Knowledge Illusion: Why We Never Think Alone”), who describe the importance of the "illusion of explanatory depth."
People believe that they know way more than they actually do. What allows us to persist in this belief is other people...We’ve been relying on one another’s expertise ever since we figured out how to hunt together, which was probably a key development in our evolutionary history. So well do we collaborate, Sloman and Fernbach argue, that we can hardly tell where our own understanding ends and others’ begins...“As a rule, strong feelings about issues do not emerge from deep understanding,” Sloman and Fernbach write. And here our dependence on other minds reinforces the problem. If your position on, say, the Affordable Care Act is baseless and I rely on it, then my opinion is also baseless. When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views. If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration.
Finally, Kolbert notes the work of Gorman and Gorman, authors of “Denying to the Grave: Why We Ignore the Facts That Will Save Us”:
Their concern is with those persistent beliefs which are not just demonstrably false but also potentially deadly, like the conviction that vaccines are hazardous...The Gormans, too, argue that ways of thinking that now seem self-destructive must at some point have been adaptive. And they, too, dedicate many pages to confirmation bias, which, they claim, has a physiological component. They cite research suggesting that people experience genuine pleasure—a rush of dopamine—when processing information that supports their beliefs. “It feels good to ‘stick to our guns’ even if we are wrong,” they observe.
Thanks for this, Deric. It reminds me of the psychological phenomenon known as the backfire effect: the more evidence is presented calling a particular notion or feeling into question, the more we dig in. Sadly, pointing this out only reaffirms that the backfire effect is indeed alive and well in all of us. Thanks again.
I've found that when it comes to politics, facts rarely matter. Donald Trump's behaviour has brought this to the forefront of public discourse. He's not the first, nor the last, politician to capitalize on this phenomenon. Nor is this something unique to the Republican party and its voters.
The same facts apply to Clinton's supporters.