This blog reports new ideas and work on mind, brain, behavior, psychology, and politics - as well as random curious stuff.

Wednesday, June 30, 2010

Dysregulation Nation

Judith Warner does a nice piece in the NYTimes Magazine, in which she notes that problems of self-regulation — of appetite, emotion, impulse and cupidity — may well be the defining social pathology of our time. The ideas of Peter C. Whybrow at UCLA are referenced:

...Under normal circumstances, the emotional, reward-seeking, selfish, “myopic” part of our brain is checked and balanced in its desirous cravings by our powers of cognition — our awareness of the consequences, say, of eating too much or spending too much. But after decades of never-before-seen levels of affluence and endless messages promoting instant gratification...this self-regulatory system has been knocked out of whack. The “orgy of self-indulgence” that spread in our land of no-money-down mortgages, Whybrow wrote in his 2005 book, “American Mania: When More Is Not Enough,” has disturbed the “ancient mechanisms that sustain our physical and mental balance.”...If you put a person in an environment that worships wealth and favors conspicuous consumption, add gross income inequalities that breed envy and competition, mix in stagnant wages, a high cost of living and too-easy credit, you get overspending, high personal debt and a “treadmill-like existence,” as Whybrow calls it: compulsive getting and spending.

The “yawning void, an insatiable hunger, an emptiness waiting to be filled,” that Lasch identified as animating the typical narcissist of the 1970s has grown only deeper with the passage of time. The Great Recession was supposed to portend a scaling back, a recalibration of our lifestyle, and usher in a new era of making more of less. But the pressures that drive the dysregulated American haven’t abated since the fall of 2008. Wall Street is resurgent, and unemployment is still high. For too many people, the cycle of craving and debt that drives our treadmill existence simply can’t be broken.
The classical art of memory
Andrea Becchetti notes a correspondence between modern understanding of memory formation in the hippocampus and techniques in the "Art of Memory" developed by Greek and Roman culture:
Formation and consolidation of declarative memories depend on the physiology of the hippocampal formation. Moreover, this brain structure determines the dynamic representation of the spatial environment, thus also contributing to spatial navigation through specialized cells such as “place” and “grid” cells (1). The most interesting recent results by Jacobs et al. (2) further this research field by showing that the entorhinal cortex contains path cells that represent direction of movement (clockwise or counterclockwise). The authors underscore that neurons in the entorhinal cortex encode multiple features of the environmental and behavioral context that can then be memorized by means of operations carried out by the hippocampus. They conclude by suggesting that a fuller characterization of these neurons’ properties and relation to the hippocampal circuit will be necessary to understand the neural basis of cognition. I fully agree with their conclusion and wish to comment on a further aspect of this complex issue, by considering psychological evidence that traces back to the ancient world and is generally neglected by modern neuroscience.
Greek and Roman culture has handed down to us the so-called “art of memory,” a set of methods aimed at improving one’s memory, described in detail by Cicero, Quintilianus, and others. The history of these concepts and their multifarious cultural meaning was masterfully treated by Rossi (3) and Yates (4). In brief, committing to memory long written pieces, word lists, series of numbers, etc. is greatly facilitated by proceeding as follows. First, one chooses a series of objects or places located in a (preferably familiar) spatial environment, such as the architectural details of a building or the landmarks of a certain route. Subsequently, these objects or places are mentally associated with the items to be remembered. The map of environmental images, which is easy to recall, thus provides a direct hint to the more abstract items. Moreover, proceeding along such a mental path directly provides the proper order of the sequence to be memorized (say a poem or speech). For example, Cicero used to associate the main points of his long speeches to specific buildings or other topographical reference points along the familiar route to the Roman Forum.
This method still constitutes the basis of modern mnemonics, which must be deeply rooted in neurology because it seems to be applied unawares even by mnemonists who have never heard of its existence. A famous example is the one described by Luria (5), who was himself unaware of the art of memory. Such venerable psychological evidence makes the neurophysiological association between orientation in space and declarative memory in the hippocampal formation even more suggestive. It supports the notion that the consolidation of human memory is guided by a partially preconfigured system related to external space representation, which may be the evolutionary basis of memory processing of more abstract entities in complex brains. These considerations may also have heuristic value in suggesting how the entorhinal cortex, the hippocampus, and the neocortex interplay during memory consolidation of complex abstract issues.
1. Moser EI, Kropff E, Moser MB (2008) Place cells, grid cells, and the brain’s spatial representation system. Annu Rev Neurosci 31:69–89.
2. Jacobs J, Kahana MJ, Ekstrom AD, Mollison MV, Fried I (2010) A sense of direction in human entorhinal cortex. Proc Natl Acad Sci USA 107:6487–6492.
3. Rossi P (2006) Logic and the Art of Memory (Continuum International, London).
4. Yates F (1992) The Art of Memory (Pimlico, London).
5. Luria AR (1986) The Mind of a Mnemonist: A Little Book About a Vast Memory (Harvard Univ Press, Cambridge, MA).
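The loci method is concrete enough to caricature in a few lines of code. Here is a minimal Python sketch of the encode-then-walk procedure; the route and the speech points are invented for illustration, not taken from any of the sources above:

```python
# Toy illustration of the classical "art of memory": items to be
# remembered are bound to an ordered series of familiar places, and
# recall proceeds by mentally walking the route in order.

loci = [  # a familiar route, in walking order (invented for illustration)
    "front door", "mailbox", "oak tree", "street corner", "fountain",
]

speech_points = [  # the abstract items to be memorized
    "opening anecdote", "first argument", "counterargument",
    "rebuttal", "closing appeal",
]

# Encoding: associate each item with the next locus along the route.
memory_palace = dict(zip(loci, speech_points))

# Recall: walking the route in order retrieves the items in order.
for place in loci:
    print(f"At the {place}, I recall: {memory_palace[place]}")
```

The point of the sketch is the one Becchetti makes: the spatial map does double duty, cueing each item and supplying the sequence for free.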
Tuesday, June 29, 2010
Binge drinking and the adolescent brain.
Deficits in hippocampus-associated cognitive tasks are observed in alcoholic humans. Taffe et al. show that binge drinking in adolescent macaque monkeys causes long-lasting decreases in hippocampal cell division, turnover, and migration. Their results:
...demonstrate that the hippocampal neurogenic niche during adolescence is highly vulnerable to alcohol and that alcohol decreases neuronal turnover in adolescent nonhuman primate hippocampus by altering the ongoing process of neuronal development. This lasting effect, observed 2 mo after alcohol discontinuation, may underlie the deficits in hippocampus-associated cognitive tasks that are observed in alcoholics.
Being hungry or full influences our risk taking.
Dolan's group does some neat experiments showing that having a full stomach makes us more risk averse in monetary decisions. We act just like other animals, which often express a preference for risky (more variable) food sources when below a metabolic reference point (hungry), and safe (less variable) food sources when sated. We follow an ecological model of feeding behavior, not the behavior predicted by normative economic theory. Thus hormone levels that reflect our metabolic state (ghrelin signaling acute nutrient intake and leptin providing an assay of energy reserves) can, like oxytocin and testosterone, influence our economic choices.
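The ecological logic here is the classic energy-budget rule from risk-sensitive foraging theory: below a survival threshold, the variable option can be the only one that offers any chance of clearing the bar. A minimal Python simulation of that rule (all numbers invented for illustration, not taken from the paper):

```python
import random

# Energy-budget model of risk-sensitive foraging: an animal must end
# the day with reserves above a survival threshold. A "safe" source
# always yields 3 units; a "risky" source yields 0 or 6 with equal
# probability (same mean, higher variance). Numbers are illustrative.

THRESHOLD = 10  # reserves needed to survive the night

def survival_prob(start_reserves, choice, trials=100_000):
    """Estimate survival probability after one foraging bout."""
    alive = 0
    for _ in range(trials):
        gain = 3 if choice == "safe" else random.choice([0, 6])
        alive += (start_reserves + gain) >= THRESHOLD
    return alive / trials

for reserves in (5, 8):  # well below vs. near the threshold
    p_safe = survival_prob(reserves, "safe")
    p_risky = survival_prob(reserves, "risky")
    best = "risky" if p_risky > p_safe else "safe"
    print(f"reserves={reserves}: safe={p_safe:.2f}, "
          f"risky={p_risky:.2f} -> prefer {best}")
```

With reserves of 5, the safe option guarantees death (8 < 10) while the gamble survives half the time; with reserves of 8, the safe option guarantees survival. Hunger rationally flips the preference, which is the pattern the sated-versus-hungry subjects showed.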
Monday, June 28, 2010
MRI can decode subjective, but not objective, memories
In experiments that cast further doubt on claims of lie detection by fMRI measurements, Rissman et al. find that subjective memory states can be decoded accurately under controlled experimental conditions, but that fMRI has uncertain utility for objectively detecting an individual's past experiences. Here is a nice summary of their work from Gilbert Chin:
As a consequence of recent investigations that have used sophisticated methods of analyzing brain activity to propose that objective lie detection may be feasible, it has become apparent that designing a task in which subjects lie whole-heartedly and voluntarily (as opposed to being instructed to do so every fifth answer, for instance) is a nontrivial undertaking. Rissman et al. have approached this challenge by adapting a well-established laboratory paradigm—that of face recognition—to conditions that approximate those of quotidian experience. They asked subjects to study 200 faces and then interrogated them 1 hour later, using a mix of new and old test faces. The menu of responses offered a choice of (i) definitely remembered; (ii–iii) high and low confidence that the face was familiar; and (iv–v) high and low confidence that the face was new.
An analysis of brain activity during the response phase revealed distinctive patterns when old (that is, previously studied) faces were rated by the subject as definitely remembered versus strongly familiar, and also when they were rated as being strongly versus weakly familiar. In contrast, for faces rated as being weakly unfamiliar, it was not possible to tell from the neural activity patterns which were actually new and which had been seen during the study phase, and for weakly familiar faces, the new/old distinction was achievable only some of the time. Furthermore, if subjects were instead told to rate attractiveness during the study phase and then asked to categorize faces by gender during the response phase, it was not possible to diagnose which faces were new and which were not. Taken together, these findings suggest that brain activity reflects subjective, rather than objective, face recognition.
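The "decoding" in studies like this is, at bottom, cross-validated pattern classification of trial-by-trial brain activity. A hedged Python sketch of the idea, with synthetic data standing in for voxel patterns (scikit-learn assumed available; this is a generic illustration, not the authors' pipeline):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Sketch of multivoxel pattern classification of the kind used to
# decode memory states from fMRI. Synthetic data stand in for real
# voxel activity; labels: 1 = "rated as remembered", 0 = "rated new".
rng = np.random.default_rng(0)
n_trials, n_voxels = 200, 50

labels = rng.integers(0, 2, n_trials)
patterns = rng.normal(size=(n_trials, n_voxels))
patterns[labels == 1, :10] += 0.8  # weak signal in a subset of voxels

clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, patterns, labels, cv=5).mean()
print(f"cross-validated decoding accuracy: {acc:.2f}")  # above chance (0.5)
```

The Rissman result, in these terms, is that the classifier succeeds when the labels are the subject's ratings (subjective memory) but not when they are the ground-truth old/new status of weakly familiar faces (objective memory).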
How we read the minds of others.
Tamir et al. do some interesting MRI studies that suggest that understanding the mental states of others starts with self perception as an anchor from which serial adjustments of the perceptions of others are made:
Recent studies have suggested that the medial prefrontal cortex (MPFC) contributes both to understanding the mental states of others and to introspecting about one's own mind. This finding has suggested that perceivers might use their own thoughts and feelings as a starting point for making inferences about others, consistent with “simulation” or “self-projection” views of social cognition. However, perceivers cannot simply assume that others think and feel exactly as they do; social cognition also must include processes that adjust for perceived differences between self and other. Recent cognitive work has suggested that such correction occurs through a process of “anchoring-and-adjustment” by which perceivers serially tune their inferences from an initial starting point based on their own introspections. Here, we used functional MRI to test two predictions derived from this anchoring-and-adjustment view. Participants (n = 64) used a Likert scale to judge the preferences of another person and to indicate their own preferences on the same items, allowing us to calculate the discrepancy between the participant's answers for self and other. Whole-brain parametric analyses identified a region in the MPFC in which activity was related linearly to this self–other discrepancy when inferring the mental states of others. These findings suggest both that the self serves as an important starting point from which to understand others and that perceivers customize such inferences by serially adjusting away from this anchor.
A bit more on the actual experimental design:
Figure - The relation between BOLD response and self–other discrepancy during Other trials was calculated separately for subregions of the MPFC. Although the response of dorsal MPFC (A) increased linearly with increasing self–other discrepancy, the response of ventral MPFC (B) distinguished only between trials on which self–other discrepancy was zero (overlap between self and other) versus greater than zero (discrepancy between self and other). Error bars indicate the SEM.
Although the specific design of the four experiments differed slightly, each required participants to answer a series of questions about their opinions and preferences and to judge how other individuals would answer the same questions. On each trial, participants saw a cue that indicated the target of the judgment (self or another person) and a brief phrase (e.g., “enjoy winter sports such as skiing or snowboarding”; “fear speaking in public”). Participants used a four- or five-point scale either to report how well the statement described themselves or to judge how well it described the other person. Within each experiment, participants considered the same set of statements for self and other.
Before scanning, participants were told that the purpose of the experiment was to examine how people make inferences about target individuals on the basis of minimal or no information. In all studies, targets were college-aged individuals depicted by a photograph downloaded from an internet dating website, although the specific identity of individuals varied across studies.
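The core analysis is parametric: compute each trial's self-other discrepancy and ask whether trial-by-trial MPFC activity scales with it. A toy Python version with simulated ratings and a simulated BOLD response (nothing here comes from the actual dataset):

```python
import numpy as np

# Sketch of the parametric analysis: for each item, take the absolute
# discrepancy between the participant's self rating and their rating
# of the other person, then regress (simulated) trial-by-trial MPFC
# activity on that discrepancy. All data are invented for illustration.
rng = np.random.default_rng(1)
n_items = 60

self_ratings  = rng.integers(1, 6, n_items)   # 5-point Likert scale
other_ratings = rng.integers(1, 6, n_items)
discrepancy = np.abs(self_ratings - other_ratings)

# Simulated dorsal-MPFC response that scales linearly with discrepancy.
bold = 0.5 * discrepancy + rng.normal(scale=0.5, size=n_items)

slope, intercept = np.polyfit(discrepancy, bold, 1)
r = np.corrcoef(discrepancy, bold)[0, 1]
print(f"slope={slope:.2f}, r={r:.2f}")  # positive slope = adjustment signal
```

A reliably positive slope is the signature the authors report in dorsal MPFC: the further a perceiver must adjust away from the self-anchor, the more the region works.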
Friday, June 25, 2010
"The Singularity" - and Singularity University
I've been meaning to point to this interesting New York Times article, on techno-utopian Singularity University (whose sponsors include Google co-founders Sergey Brin and Larry Page), which aims to enhance and prepare us for the arrival of "The Singularity" — "a time, possibly just a couple decades from now, when a superior intelligence will dominate and life will take on an altered form that we can’t predict or comprehend in our current, limited state." The article focuses on Raymond Kurzweil, the inventor and businessman who is the Singularity’s most ubiquitous spokesman:
...who, in August, will begin a cross-country multimedia road show to promote “Transcendent Man,” a documentary about his life and beliefs. Another of his projects, “The Singularity Is Near: A True Story About the Future,” has also started to make its way around the film festival circuit....some Singularitarians aren’t all that fond of Mr. Kurzweil...“I think he’s a genius and has certainly brought a lot of these ideas into the public discourse,” says James J. Hughes, the executive director of the Institute for Ethics and Emerging Technologies, a nonprofit that studies the implications of advancing technology. “But there are plenty of people that say he has hijacked the Singularity term.”
Some of the Singularity’s adherents portray a future where humans break off into two species: the Haves, who have superior intelligence and can live for hundreds of years, and the Have-Nots, who are hampered by their antiquated, corporeal forms and beliefs....“The Singularity is not the great vision for society that Lenin had or Milton Friedman might have,” says Andrew Orlowski, a British journalist who has written extensively on techno-utopianism. “It is rich people building a lifeboat and getting off the ship.”
Despite all of the zeal behind the movement, there are those who look askance at its promises and prospects...Jonathan Huebner, for example, is often held up as Mr. Kurzweil’s foil. A physicist who works at the Naval Air Warfare Center as a weapons designer, he, like Mr. Kurzweil, has compiled his own cathedral of graphs and lists of important inventions. He is unimpressed with the state of progress and, in 2005, published in a scientific journal a paper called “A Possible Declining Trend for Worldwide Innovation.”..Measuring the number of innovations divided by the size of the worldwide population, Dr. Huebner contends that the rate of innovation peaked in 1873. Or, based on the number of patents in the United States weighed against the population, he found a peak around 1916. (Both Dr. Huebner and Mr. Kurzweil are occasionally teased about their faith in graphs.)
Presidential Harrisment
A clip from an article by Steve Mirsky in the June issue of Scientific American:
In early March, Harris Interactive conducted an on-line survey to gauge the attitudes of Americans toward President Barack Obama. The Harris Poll generated some fascinating data. For example, 40 percent of those polled believe Obama is a socialist. (He’s not—ask any socialist.) Thirty-two percent believe he is a Muslim. (I had predicted that a Mormon, Jew, Wiccan, atheist and Quetzalcoatl worshipper would become president before America elected a Muslim, so a third of this country actually may be quite open-minded, in an obtuse way.) Also, 14 percent believe that Obama may be the Antichrist. Of those who identified themselves as Republicans, 24 percent think Obama might be.
Thursday, June 24, 2010
Concert pianists as genius models?
Not likely in my case.... but I will mention an article by Charles Ambrose in the July-August issue of American Scientist, pointed out to me by a friend who is a loyal MindBlog reader. I started this blog post as I plunged into reading the article, assuming I would be passing on some juicy clips, but alas I have to report coming up short of much substance - although the article is worth a link for its review of brain plasticity and its notes on specific brain changes associated with the development of various skilled activities. Ambrose mentions the enlarged areas of the parietal lobe found in Albert Einstein's brain, and then goes on to note other examples of increases in brain areas associated with expertise, as for example in professional musicians, who have enlarged areas in their auditory cortex. The article doesn't even begin to engage the teaser sentence at its beginning: "What accounts for highly intelligent and greatly gifted individuals?" It is disjointed and wandering enough that I'm surprised the editors at American Scientist let it through their filters.
Antipsychotic drug shrinks the brain
It turns out that haloperidol, a commonly prescribed antipsychotic drug, shrinks the brain within hours of administration, specifically diminishing grey-matter volume in the striatum — a region that mediates movement. The effect is reversible. The Meyer-Lindenberg group, which did the study, suggests that by acting on dopamine D2 receptors the drug may downsize synaptic connections, and thus cause the lapses in motor control that affect many patients on antipsychotics.
Wednesday, June 23, 2010
Sense of Wonder
As we age, our brains become so stuffed with our history that we lose the capacity to open to novel experiences, to sense things with the naive freshness of a child. I pass on this brief fable by Richard A. Lovett in the June 3 issue of Nature about a miraculous cure for this condition:
Clay Nadir wanted a book for the beach. Not just any book, but the type that makes you forget the beach, other than as the place where you discovered Jack London or Sherlock Holmes or Norman Mailer. But even on the shelves of his city's largest bookstore, he wasn't finding anything. It was as though everything new had long ago been stuffed into his brain.
Maybe he was jaded. Maybe, once you'd worked your way through Agatha Christie, no manor house would ever again hold your attention. And was The Time Machine really that good, or had it simply been a first, both for Clay and the world?
Nature writing, westerns, mountaineering, ghost stories, dysfunctional families ... literarily, Clay had been there, done that. In the past hour he'd wandered through fantasy, mystery, biography and what a friend called 'Qual. Lit.' — a conceited term if ever there was one. Quality literature, ha! As though any genre had a monopoly. Not to mention that once you'd read Nabokov and Woolf and Joyce, you could get as jaded with that stuff as anything else. Even Shakespeare you could eventually memorize.
Maybe he should try romance. He'd never dabbled in it before, so at least it would be different.
Then, in the occult section, something caught his eye. It was an odd book: black, with a red, spiral vortex on the cover. It made him think of Hitchcock's Vertigo. Now that was a movie: Jimmy Stewart and Kim Novak in a deceptively simple story you had to see several times to fully grasp. But once you did, so many other movies seemed so ... trivial.
The book also made him think of something from his youth. Something to do with an old TV show. What was it called? Oh yes, The Time Tunnel. Each week, they'd spun this thing like a giant pinwheel and run off to some distant era. Probably unbelievably stupid if he watched it today, but at the time it hit him like his first viewing of Doctor Who, another show involving a time vortex, plus a lot of other things he'd never seen before.
There was no author listed, and as he picked up the book, he seemed to be falling into the vortex. On the back was a simple endorsement: 'Guaranteed to restore your sense of wonder'.
Yeah, right. He'd heard that one before.
He opened it but there was no preface, no introduction, no writing at all. Just more spirals, one to a page, these in black-and-white.
He nearly set it back down, but the sense of being sucked in was too strong. It was as though the entire room were spinning: just what Jimmy Stewart's character must have felt as he looked into the depths ... Dizzying enough that Clay no longer wanted to think about Stewart or Hitchcock or old TV shows.
There was a white circle in the centre of the first spiral. Peering into it he saw flickers of motion: barely remembered images of Jimmy Stewart, Kim Novak and maybe The Time Tunnel.
With an effort, he turned the page. Another spiral, again sucking him in. This time, he saw words.
Call me Ishmael.
It was the best of times ...
In the beginning God created the heavens and earth.
Once upon a time there was a Martian named Valentine Michael Smith.
To be, or not to be ...
Rather than simply reading them, he felt as though the words were being pulled from him, faster and faster. He turned another page and another and another. It wasn't just words and videos. There were also stills: a stern-looking couple with a pitchfork; a woman with an odd half-smile. Guitar riffs, symphonies, something about Lucy in the sky with a yellow submarine. Names for these would tickle his memory then be gone, often faster than he could grasp what they had been. Something about whistling and moaning. Something about singing insects.
Then, it was over. Clay had no idea how long he'd been staring at the book. All he knew was that he'd flipped through most of the pages, but not all. He looked at the next few, but they were simply spirals. Still dizzying, but not like before. He flipped back, but there were no longer any words or images. Just paper.
There was no price tag on the book. Clay wondered briefly why it had been so captivating. Maybe he'd merely let his blood sugar dip too low. Maybe the spirals caught him off guard.
He put the book back where he'd found it, on a countertop beside a computer terminal where customers could check the store's inventory. It was as though a prior reader, if that was the proper term for the peruser of such a book, had placed it there, easy to find.
Clay still needed something for the beach.
He wandered the store, more or less at random, until he fetched up in the mystery section. Mysteries were fun, he thought, although he didn't know why. There were a lot of books and he couldn't remember which ones he'd read, so he picked the first that caught his eye.
'It was a dark and stormy night,' he read. Wow, he breathed, and was instantly hooked.
The Devil's grimace.
An interesting fragment by Gisela Telis in ScienceNow:
When 15th-century Europeans first landed on the Bahamas, Cuba, and Hispaniola, they met with the "devil's grimace." That's what these foreigners dubbed the faces with bared teeth that adorned everything from necklaces to ceremonial bowls created by the native Taíno people. European chroniclers interpreted the motif as a ferocious animal's snarl or a skull's grimace, signs of the heathen islanders' aggression. But they were wrong, researchers report in the latest issue of Current Anthropology. By studying teeth-baring in humans, chimps, and rhesus macaques and comparing these to the Taíno depictions, scientists determined that open-lipped, closed-jaw displays show submission, benign intent, and even happiness—but not aggression. So the "fiendish" faces that so troubled Europeans were most likely just smiling, to signal—ironically enough—social cohesion and connection.
Tuesday, June 22, 2010
Oxytocin: out-group aggression, social cues, and amygdalar action
Three recent papers are a reflection of the recent outpouring of work on oxytocin, a peptide hormone containing nine amino acids, which (from Miller's review):
...promotes social bonding in a wide range of animals, including humans. Sold on the Internet in a formulation called "Liquid Trust," the peptide hormone is marketed as a romance enhancer and sure ticket to business success. Australian therapists are trying it alongside counseling for couples with ailing marriages. And police and military forces reportedly are interested in its potential to elicit cooperation from crime suspects or enemy agents.

The hormone is now being found to have a prickly side, and is coming to be regarded as much more than just a touchy-feely "trust hormone." De Dreu et al. have designed experiments to demonstrate that oxytocin drives a "tend and defend" response in that it promotes in-group trust and cooperation, and defensive, but not offensive, aggression toward competing out-groups.
In another study on oxytocin, Gamer et al. add to studies that have shown that oxytocin decreases aversive reactions to negative social stimuli, and find that subjects given oxytocin, relative to subjects given placebo, are more likely to make eye movements toward the eye region when viewing images of human faces. They find that subregions of the amygdala are important in mediating this effect. Oxytocin:
...attenuated activation in lateral and dorsal regions of the anterior amygdala for fearful faces but enhanced activity for happy expressions, thus indicating a shift of the processing focus toward positive social stimuli. On the other hand, oxytocin increased the likelihood of reflexive gaze shifts toward the eye region irrespective of the depicted emotional expression. This gazing pattern was related to an increase of activity in the posterior amygdala and an enhanced functional coupling of this region to the superior colliculi. Thus, different behavioral effects of oxytocin seem to be closely related to its specific modulatory influence on subregions within the human amygdala.

These findings have implications for understanding the role of oxytocin in normal social behavior as well as the possible therapeutic impact of oxytocin in brain disorders characterized by social dysfunction.
The Wayward Mind
Continuing my review of old posts that have popped into my head over the past few days during mulling over this and that, I am reproducing my March 6, 2006 post in its entirety:
I want to mention the excellent book by Guy Claxton - THE WAYWARD MIND, an intimate history of the unconscious (2005, Little, Brown, and Co., available from amazon.com). Here is an excerpt and paraphrase from pp. 348-352:
"What we call our "self " is an agglomeration of both conscious and unconscious ingredients, cans, needs, dos, oughts, thinks - the temptation is to assume that the "I" is the same in all of them - so that instead of having an intricate web of things that make me ME, I have to create a single imaginary hub around which they all revolve, to which they all refer - the attempt to keep this fiction going, to "hold it together" can become quite tiring and bothersome - If "I" am essentially reasonable, if I imagine that my zones of control - over my own feelings for example - are wider and more robust than they are, then I am going to get in a tangle trying to "control myself." If I have decided that who I am is clever, attractive, athletic, stable, creating the hub of "I" locks everything together and prevents it moving. It stops Me expanding to include the unconscious, or graciously shrinking to accommodate old age. I can not enjoy my waywardness, nor see it as an intrinsic part of ME - (note: he gives Ramachandran's two foot nose pinocchio demonstration as evidence of plasticity of self image), and then says - The orthodox sense of self is thrown by such experiences, and tends to suffer a sense-of-humour failure. It sees all waywardness as an affront, and tends to become earnest or myopic in response. In a nutshell: it is bad enough to have a nightmare, without your rattled sense of self telling you that you are going mad. Weird experience can never be just funny (as the pinocchio effect can be) or matter-of fact (as possession is in Bali), or transiently inconvenvient (as a bad dream is), or wonderful (as a mystical experience can be), or just mysterious (as a premonition might be). For the locked-up self they have to be denied, explained or dealt with. All the evidence is that a more relaxed attitude toward the bounds of self makes for a richer, easier and more creative life. Perhaps, after all, waywardness in all its forms is in need not so much of explanation, but of a mystified but friendly welcome. We can explain it if we wish, and the brain is beginning to a reasonable job. But the need to explain, when not motivated by the dispassionate curiosity of the scientist, is surely a sign of anxiety: of the desire to tame with words that which is experienced as unsettling.
Monday, June 21, 2010
Followup on acupuncture
Following the "Acupuncture's secret revealed?" post on 6/17 a reader sent me two interesting links that I want to pass on. Body in Mind reviews a paper in Journal of Pain that finds no significant effect of physician training or expertise on outcome, and notes studies that find no difference between needle placement at classical acupuncture points and randomly placed needles, and also that patient expectations about acupuncture influence outcome. And, Eric Mead briefly lectures on "The Magic of the Placebo."
Is life worth living?
Philosopher Peter Singer, in his "Should this be the last generation?" query, offers philosophical rambling of the sort that drives me up the wall, a prime example of one of the things our brains were definitely not designed to do. My bottom line is that I refuse to get excited about existential issues ("Is life worth living?", etc.) beyond those I think my two Abyssinian cats would find compelling...food, shelter, a place to poop, and sex. I don't think the human overlay on top of that has shown much competence with 'the meaning of it all' questions. In this territory we are like the dog being asked to understand quantum physics. Singer starts by noting 19th-century German philosopher Schopenhauer's pessimism:
...the best life possible for humans is one in which we strive for ends that, once achieved, bring only fleeting satisfaction. New desires then lead us on to further futile struggle and the cycle repeats itself.

He then goes on to note more recent arguments from philosopher David Benatar:
To bring into existence someone who will suffer is, Benatar argues, to harm that person, but to bring into existence someone who will have a good life is not to benefit him or her...Hence continued reproduction will harm some children severely, and benefit none.
Benatar also argues that human lives are, in general, much less good than we think they are....we are, in Benatar’s view, victims of the illusion of pollyannaism. This illusion may have evolved because it helped our ancestors survive, but it is an illusion nonetheless. If we could see our lives objectively, we would see that they are not something we should inflict on anyone.
...the people who will be most severely harmed by climate change have not yet been conceived. If there were to be no future generations, there would be much less for us to feel to guilty about....So why don’t we make ourselves the last generation on earth? If we would all agree to have ourselves sterilized then no sacrifices would be required — we could party our way into extinction!
Singer does end on a more upbeat note:

I do think it would be wrong to choose the non-sentient universe. In my judgment, for most people, life is worth living. Even if that is not yet the case, I am enough of an optimist to believe that, should humans survive for another century or two, we will learn from our past mistakes and bring about a world in which there is far less suffering than there is now. But justifying that choice forces us to reconsider the deep issues with which I began. Is life worth living? Are the interests of a future child a reason for bringing that child into existence? And is the continuance of our species justifiable in the face of our knowledge that it will certainly bring suffering to innocent future human beings?
Are you holding your breath?
While mulling over how I am feeling or acting during the day, a recollection of an old MindBlog post occasionally pops into my head....I sometimes go back and look at that post, find it useful, and think it might be worth repeating. I think I will act on this impulse now, and more frequently in the future. Here is a repeat of the entirety of a post from Jan. 28, 2008:
I notice - if I am maintaining awareness of my breathing - that the breathing frequently stops as I begin a skilled activity such as piano or computer keyboarding. At the same time I can begin to sense an array of unnecessary (and debilitating) pre-tensions in the muscles involved. If I just keep breathing and noticing those tensions, they begin to release. (Continuing to let awareness return to breathing when it drifts is a core technique of mindfulness meditation.) Several sources note that attending to breathing can raise one's general level of restfulness relative to excitation, enhancing parasympathetic (restorative) over sympathetic (arousing) nervous system activities. All of this makes me feel like passing on some excerpts from an essay that reaches similar conclusions: "Breathtaking New Technologies," by Linda Stone, a former Microsoft VP and Co-Founder and Director of Microsoft's Virtual Worlds Group/Social Computing Group. It is a bit simplistic, but does point in a useful direction.
I believe that attention is the most powerful tool of the human spirit and that we can enhance or augment our attention with practices like meditation and exercise, diffuse it with technologies like email and Blackberries, or alter it with pharmaceuticals...but... the way in which many of us interact with our personal technologies makes it impossible to use this extraordinary tool of attention to our advantage...the vast majority of people hold their breath especially when they first begin responding to email. On cell phones, especially when talking and walking, people tend to hyper-ventilate or over-breathe. Either of these breathing patterns disturbs oxygen and carbon dioxide balance...breath holding can contribute significantly to stress-related diseases. The body becomes acidic, the kidneys begin to re-absorb sodium, and as the oxygen and CO2 balance is undermined, our biochemistry is thrown off.
The parasympathetic nervous system governs our sense of hunger and satiety, flow of saliva and digestive enzymes, the relaxation response, and many aspects of healthy organ function. Focusing on diaphragmatic breathing enables us to down regulate the sympathetic nervous system, which then causes the parasympathetic nervous system to become dominant. Shallow breathing, breath holding and hyper-ventilating triggers the sympathetic nervous system, in a "fight or flight" response...Some breathing patterns favor our body's move toward parasympathetic functions and other breathing patterns favor a sympathetic nervous system response. Buteyko (breathing techniques developed by a Russian M.D.), Andy Weil's breathing exercises, diaphragmatic breathing, certain yoga breathing techniques, all have the potential to soothe us, and to help our bodies differentiate when fight or flight is really necessary and when we can rest and digest.
I've changed my mind about how much attention to pay to my breathing patterns and how important it is to remember to breathe when I'm using a computer, PDA or cell phone...I've discovered that the more consistently I tune in to healthy breathing patterns, the clearer it is to me when I'm hungry or not, the more easily I fall asleep and rest peacefully at night, and the more my outlook is consistently positive...I've come to believe that, within the next 5-7 years, breathing exercises will be a significant part of any fitness regime.
Friday, June 18, 2010
Associating a nerve growth factor with positive affect - depression therapy?
Panksepp has made a number of interesting observations on the neurochemistry of affiliative (bonding) and hedonic behavior in animals (the role of dopamine, etc.). Now attention turns to nerve growth factors. Here is the abstract from a recent collaboration:
Positive emotional states have been shown to confer resilience to depression and anxiety in humans, but the molecular mechanisms underlying these effects have not yet been elucidated. In laboratory rats, positive emotional states can be measured by 50-kHz ultrasonic vocalizations (hedonic USVs), which are maximally elicited by juvenile rough-and-tumble play behavior. Using a focused microarray platform, insulin-like growth factor I (IGFI) extracellular signaling genes were found to be upregulated by hedonic rough-and-tumble play but not depressogenic social defeat. Administration of IGFI into the lateral ventricle increased rates of hedonic USVs in an IGFI receptor (IGFIR)-dependent manner. Lateral ventricle infusions of an siRNA specific to the IGFIR decreased rates of hedonic 50-kHz USVs. These results show that IGFI plays a functional role in the generation of positive affective states and that IGFI-dependent signaling is a potential therapeutic target for the treatment of depression and anxiety.
Is that my Mobile ringing? Rapid brain processing
Roye et al. show that top-down frontal-parietal attentional mechanisms prime even the earliest stages of our auditory pathways to be especially sensitive to personally significant sounds:
Anecdotal reports and also empirical observations suggest a preferential processing of personally significant sounds. The utterance of one's own name, the ringing of one's own telephone, or the like appear to be especially effective for capturing attention. However, there is a lack of knowledge about the time course and functional neuroanatomy of the voluntary and the involuntary detection of personally significant sounds. To address this issue, we applied an active and a passive listening paradigm, in which male and female human participants were presented with the SMS ringtone of their own mobile and other's ringtones, respectively. Enhanced evoked oscillatory activity in the 35–75 Hz band for one's own ringtone shows that the brain distinguishes complex personally significant and nonsignificant sounds, starting as early as 40 ms after sound onset. While in animals it has been reported that the primary auditory cortex accounts for acoustic experience-based memory matching processes, results from the present study suggest that in humans these processes are not confined to sensory processing areas. In particular, we found a coactivation of left auditory areas and left frontal gyri during passive listening. Active listening evoked additional involvement of sensory processing areas in the right hemisphere. This supports the idea that top-down mechanisms affect stimulus representations even at the level of sensory cortices. Furthermore, active detection of sounds additionally activated the superior parietal lobe supporting the existence of a frontoparietal network of selective attention.
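The "35–75 Hz evoked oscillatory activity" is, in practice, spectral power in the gamma band of the post-stimulus EEG. A sketch of how such a band-power measure can be computed, using SciPy's Welch estimator on a synthetic one-second epoch (the sampling rate and injected 50 Hz component are illustrative assumptions, not the authors' parameters):

```python
import numpy as np
from scipy.signal import welch

# Sketch of a gamma band-power measure: estimate the power spectrum of
# an EEG epoch and integrate it over 35-75 Hz. The "epoch" here is
# synthetic noise plus a 50 Hz component standing in for a gamma response.
fs = 500                        # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)   # one second of data
rng = np.random.default_rng(2)
epoch = rng.normal(size=t.size) + 0.5 * np.sin(2 * np.pi * 50 * t)

freqs, psd = welch(epoch, fs=fs, nperseg=256)
band = (freqs >= 35) & (freqs <= 75)
gamma_power = np.trapz(psd[band], freqs[band])  # integrate PSD over the band
print(f"35-75 Hz power: {gamma_power:.3f}")
```

In the study, this kind of measure was computed time-resolved and compared between own-ringtone and other-ringtone trials, which is how the 40 ms onset of the difference was established.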
Thursday, June 17, 2010
Acupuncture's secret revealed?
I've tried acupuncture therapy for myself, and found it to be somewhere between ineffective and mildly annoying. This experience, plus reading several convincing studies finding identical pain reduction when acupuncture was compared with sham or placebo manipulations, has made me think it likely that acupuncture is in fact a placebo effect. Now Goldman et al. have found a greater than 20-fold increase in adenosine, a nerve modulator with anti-nociceptive (i.e., anti-pain) properties, in tissue around the point where acupuncture needles are rotated in a mouse's paw. This reduces pain in the paw caused by an injected inflammatory chemical, and the effect is not observed in mice genetically altered to delete the adenosine receptors on pain nerves. An adenosine receptor agonist (enhancer) boosts the effectiveness of the acupuncture treatment.
Now, what one needs to see next is experiments with humans in which adenosine release caused by rotating a needle at the classical acupuncture needle points is measured and compared with the release at randomly placed needles. Also, it would be interesting to see whether other placebo interventions shown to cause endorphin release also caused adenosine release.
Cognitive changes caused by single exposure to a placebo
Given the mention of placebo responses in today's other post on acupuncture, I thought I would pass on this interesting bit in Neuropsychologia from Morton et al., showing that the cognitive effects of a single placebo intervention can persist for six weeks:
Placebo has been shown to be a powerful analgesic with corresponding reduction in the activation of the pain matrix in the brain. However, the response to placebo treatment is highly variable. It is unclear how anticipatory and pain-evoked potentials are affected by the treatment and how reproducible the response is. Laser stimulation was used to induce moderate pain in healthy volunteers. We induced placebo analgesia by conditioning subjects to expect pain reduction by applying a sham anaesthetic cream on one arm in conjunction with a reduced laser stimulus. Pain ratings were assessed before, during and after treatment. Using electroencephalography (EEG) we measured anticipatory neural responses and pain-evoked potentials to laser heat to determine how expectation of analgesia affected the response to a placebo manipulation. This was a reproducibility study and as such the experimental procedure was repeated after a minimum gap of 2 weeks. Significant reductions in pain-evoked potentials were shown after treatment. The anticipatory responses did not change after treatment for the control and sham-treatment groups in the first session but were significantly lower in the repeat session relative to the first session in the sham-treatment group only. A significant correlation was found between the reduction in state anxiety in the repeat session relative to the first and the reduction in the anticipatory response in the sham-treatment group. Receiving a placebo treatment appears to cause a lasting change in the cognitive processing of pain for at least 6 weeks. This cognitive change may be facilitated by a change in state anxiety.
Wednesday, June 16, 2010
Our brains on the internet - smarter, dumber, neither?
As part of my re-entry catching up with accumulated articles, I want to point out some contrasting takes on how gadgets, the internet, our modern pace, multi-tasking and attention span, etc. are influencing our brains:
Richtel describes a number of experiments demonstrating how multitasking can diminish the ability to focus on or switch between tasks:
While many people say multitasking makes them more productive, research shows otherwise. Heavy multitaskers actually have more trouble focusing and shutting out irrelevant information...and they experience more stress...even after the multitasking ends, fractured thinking and lack of focus persist.

Jonah Lehrer reviews "The Shallows," a new book by Nicholas Carr on the internet and the brain. Carr takes a dire view of what the internet is doing to our brains, but Lehrer counters:
There is little doubt that the Internet is changing our brain. Everything changes our brain. What Carr neglects to mention, however, is that the preponderance of scientific evidence suggests that the Internet and related technologies are actually good for the mind. For instance, a comprehensive 2009 review of studies published on the cognitive effects of video games found that gaming led to significant improvements in performance on various cognitive tasks,...Carr's argument also breaks down when it comes to idle Web surfing. A 2009 study by neuroscientists at the University of California, Los Angeles, found that performing Google searches led to increased activity in the dorsolateral prefrontal cortex, at least when compared with reading a "book-like text." Interestingly, this brain area underlies the precise talents, like selective attention and deliberate analysis, that Carr says have vanished in the age of the Internet. Google, in other words, isn't making us stupid -- it's exercising the very mental muscles that make us smarter.

Pinker offers a sanguine and sane assessment:
New forms of media have always caused moral panics...such panics often fail basic reality checks...If electronic media were hazardous to intelligence, the quality of science would be plummeting. Yet discoveries are multiplying like fruit flies, and progress is dizzying. Other activities in the life of the mind, like philosophy, history and cultural criticism, are likewise flourishing.
...the effects of experience are highly specific to the experiences themselves. If you train people to do one thing (recognize shapes, solve math puzzles, find hidden words), they get better at doing that thing, but almost nothing else. Music doesn’t make you better at math, conjugating Latin doesn’t make you more logical, brain-training games don’t make you smarter...The effects of consuming electronic media are also likely to be far more limited than the panic implies. Media critics write as if the brain takes on the qualities of whatever it consumes, the informational equivalent of “you are what you eat.” As with primitive peoples who believe that eating fierce animals will make them fierce, they assume that watching quick cuts in rock videos turns your mental life into quick cuts or that reading bullet points and Twitter postings turns your thoughts into bullet points and Twitter postings.
...to encourage intellectual depth, don’t rail at PowerPoint or Google. It’s not as if habits of deep reflection, thorough research and rigorous reasoning ever came naturally to people. They must be acquired in special institutions, which we call universities, and maintained with constant upkeep, which we call analysis, criticism and debate. They are not granted by propping a heavy encyclopedia on your lap, nor are they taken away by efficient access to information on the Internet.
The new media have caught on for a reason. Knowledge is increasing exponentially; human brainpower and waking hours are not. Fortunately, the Internet and information technologies are helping us manage, search and retrieve our collective intellectual output at different scales, from Twitter and previews to e-books and online encyclopedias. Far from making us stupid, these technologies are the only things that will keep us smart.
Believing is seeing.
Langer et al. do experiments showing that vision can be improved by manipulating our mind-set:
.... In Study 1, participants were primed with the mind-set that pilots have excellent vision. Vision improved for participants who experientially became pilots (by flying a realistic flight simulator) compared with control participants (who performed the same task in an ostensibly broken flight simulator). Participants in an eye-exercise condition (primed with the mind-set that improvement occurs with practice) and a motivation condition (primed with the mind-set “try and you will succeed”) demonstrated visual improvement relative to the control group. In Study 2, participants were primed with the mind-set that athletes have better vision than nonathletes. Controlling for arousal, doing jumping jacks resulted in greater visual acuity than skipping (perceived to be a less athletic activity than jumping jacks). Study 3 took advantage of the mind-set primed by the traditional eye chart: Because letters get progressively smaller on successive lines, people expect that they will be able to read the first few lines only. When participants viewed a reversed chart and a shifted chart, they were able to see letters they could not see before. Thus, mind-set manipulation can counteract physiological limits imposed on vision.
Tuesday, June 15, 2010
MindBlog back in Madison - the anterior insula rests from risk alertness
I'm finally back in my University of Wisconsin office, after a month of mainly being on the road, and I'm very much looking forward to settling in and having more time to read and face the list of accumulated articles that might be the subjects of blog posts. No matter how comfortable I think I am feeling while traveling, I am still surprised, on the return to familiar settings, to watch the body relax and reveal a stored-up tiredness - an indication of how much energy was being put into alertness and vigilance, being poised for the unexpected in foreign settings.
It would appear that my being more alert to risks associated with travel was, according to Mohr et al., strenuously exercising my anterior insula:
In our everyday life, we often have to make decisions with risky consequences, such as choosing a restaurant for dinner or choosing a form of retirement saving. To date, however, little is known about how the brain processes risk. Recent conceptualizations of risky decision making highlight that it is generally associated with emotions but do not specify how emotions are implicated in risk processing. Moreover, little is known about risk processing in non-choice situations and how potential losses influence risk processing. Here we used quantitative meta-analyses of functional magnetic resonance imaging experiments on risk processing in the brain to investigate (1) how risk processing is influenced by emotions, (2) how it differs between choice and non-choice situations, and (3) how it changes when losses are possible. By showing that, over a range of experiments and paradigms, risk is consistently represented in the anterior insula, a brain region known to process aversive emotions such as anxiety, disappointment, or regret, we provide evidence that risk processing is influenced by emotions. Furthermore, our results show risk-related activity in the dorsolateral prefrontal cortex and the parietal cortex in choice situations but not in situations in which no choice is involved or a choice has already been made. The anterior insula was predominantly active in the presence of potential losses, indicating that potential losses modulate risk processing.
Observing disease symptoms causes a more vigorous immune response
We know that social status, positive versus negative affect, etc. can influence immune system robustness. Now Schaller et al. show that the mere observation of photographs depicting symptoms of infectious disease can boost the subsequent elevation of proinflammatory cytokines released by white blood cells in response to a bacterial stimulus. The effect was specific to the perception of disease-connoting social cues; it did not occur in response to a different category of stress-inducing interpersonal threat. The abstract and a few clips:
An experiment (N = 28) tested the hypothesis that the mere visual perception of disease-connoting cues promotes a more aggressive immune response. Participants were exposed either to photographs depicting symptoms of infectious disease or to photographs depicting guns. After incubation with a model bacterial stimulus, participants’ white blood cells produced higher levels of the proinflammatory cytokine interleukin-6 (IL-6) in the infectious-disease condition, compared with the control (guns) condition. These results provide the first empirical evidence that visual perception of other people’s symptoms may cause the immune system to respond more aggressively to infection.
This linkage may have been adaptive in ancestral ecologies, as individuals characterized by perception-facilitated immune responses would have had reduced likelihood of succumbing to pathogenic infections...People are sensitive to visual stimuli connoting the potential presence of infectious pathogens in others. These stimuli include anomalous morphological and behavioral characteristics (e.g., skin discolorations, sneezing) that suggest infection with disease-causing microorganisms. When perceived, these stimuli trigger psychological responses—such as disgust and the activation of aversive cognitions into working memory—that inhibit interpersonal contact.
Monday, June 14, 2010
You are how you eat - fast food and impatience
Here is a gem from the May issue of Psychological Science, offered by Zhong and DeVoe at the University of Toronto:
Based on recent advancements in the behavioral priming literature, three experiments investigated how incidental exposure to fast food can induce impatient behaviors and choices outside of the eating domain. We found that even an unconscious exposure to fast-food symbols can automatically increase participants’ reading speed when they are under no time pressure and that thinking about fast food increases preferences for time-saving products while there are potentially many other product dimensions to consider. More strikingly, we found that mere exposure to fast-food symbols reduced people’s willingness to save and led them to prefer immediate gain over greater future return, ultimately harming their economic interest. Thus, the way people eat has far-reaching (often unconscious) influences on behaviors and choices unrelated to eating.
"Vital Exhaustion"
Benedict Carey discusses how the term "Nervous Breakdown," popular in 1900 before yielding to a number of supposedly more scientific diagnoses, has mutated into what psychiatrists in Europe have been diagnosing as “burnout syndrome,” the signs of which include “vital exhaustion.” A paper published last year defined three types: “frenetic,” “underchallenged,” and “worn out.” "Nervous breakdown" has begun to fade from use, and the same fate may or may not await "burnout syndrome".
Friday, June 11, 2010
Austin pictures
This is my last day in Austin Texas, where, after attending a 50th high school reunion, I have been revisiting scenes of my childhood. One of the most beautiful is Hamilton Pool, formed in a box canyon on a creek that runs into the Pedernales River about 23 miles west of Austin. I have put a few photos from the trip on my Picasa photo page.
Testosterone decreases trust.
Fascinating observations from Bos et al., who tested the effect of testosterone on women’s ratings of the trustworthiness of a series of men’s faces shown in photographs. Testosterone is essentially an antidote to oxytocin, which has been shown to increase judgments of trustworthiness. Their abstract:
Trust plays an important role in the formation and maintenance of human social relationships. But trusting others is associated with a cost, given the prevalence of cheaters and deceivers in human society. Recent research has shown that the peptide hormone oxytocin increases trust in humans. However, oxytocin also makes individuals susceptible to betrayal, because under influence of oxytocin, subjects perseverate in giving trust to others they know are untrustworthy. Testosterone, a steroid hormone associated with competition and dominance, is often viewed as an inhibitor of sociality, and may have antagonistic properties with oxytocin. The following experiment tests this possibility in a placebo-controlled, within-subjects design involving the administration of testosterone to 24 female subjects. We show that compared with the placebo, testosterone significantly decreases interpersonal trust, and, as further analyses established, this effect is determined by those who give trust easily. We suggest that testosterone adaptively increases social vigilance in these trusting individuals to better prepare them for competition over status and valued resources. In conclusion, our data provide unique insights into the hormonal regulation of human sociality by showing that testosterone downregulates interpersonal trust in an adaptive manner.

Also, check out this article by Nicholas Wade in the NYTimes discussing this work and commenting on its relevance to understanding human evolution.
Neural processing of risk.
Mohr et al. show that, over a range of experiments and paradigms, risk is consistently represented in the anterior insula, a brain region known to process aversive emotions such as anxiety, disappointment, or regret. This provides further evidence that risk processing is influenced by emotions.
In our everyday life, we often have to make decisions with risky consequences, such as choosing a restaurant for dinner or choosing a form of retirement saving. To date, however, little is known about how the brain processes risk. Recent conceptualizations of risky decision making highlight that it is generally associated with emotions but do not specify how emotions are implicated in risk processing. Moreover, little is known about risk processing in non-choice situations and how potential losses influence risk processing. Here we used quantitative meta-analyses of functional magnetic resonance imaging experiments on risk processing in the brain to investigate (1) how risk processing is influenced by emotions, (2) how it differs between choice and non-choice situations, and (3) how it changes when losses are possible. By showing that, over a range of experiments and paradigms, risk is consistently represented in the anterior insula, a brain region known to process aversive emotions such as anxiety, disappointment, or regret, we provide evidence that risk processing is influenced by emotions. Furthermore, our results show risk-related activity in the dorsolateral prefrontal cortex and the parietal cortex in choice situations but not in situations in which no choice is involved or a choice has already been made. The anterior insula was predominantly active in the presence of potential losses, indicating that potential losses modulate risk processing.
Thursday, June 10, 2010
More on clever crows and complex cognition.
Taylor et al. have done some beautiful experiments clearly demonstrating complex cognition in New Caledonian crows, showing that these birds are capable of thinking through their actions, not simply learning a series of behaviors through association and combining them. You really should watch the video in the review by Telis. From that review:
...Taylor trained seven wild crows to associate a short stick with ineffectiveness; the crows failed to obtain their out-of-reach food with the stick and eventually began to ignore or reject it. Then they were divided into two groups, an innovation group and a training group. The training group learned six activities—such as how to use a short stick to extract a long stick from a toolbox—that together could help them get meat with long and short tools. The innovation group wasn’t taught how to use a short stick to extract a long stick from the toolbox, but did learn other techniques.
When tasked with reaching a snack in a hole using a short stick on a string and a longer stick trapped in a toolbox, all of the crows pulled it off. The three birds in the training group linked the behaviors they had learned into a new behavior. They freed the short stick from the string, used it to dislodge the long stick, and used the long stick to obtain their food. And all four crows in the innovation group figured out the sequence on their own. One crow in the innovation group stared at the setup for less than 2 minutes and then performed the whole trial correctly on her very first attempt.
The Taylor et al. abstract:
Apes, corvids and parrots all show high rates of behavioural innovation in the wild. However, it is unclear whether this innovative behaviour is underpinned by cognition more complex than simple learning mechanisms. To investigate this question we presented New Caledonian crows with a novel three-stage metatool problem. The task involved three distinct stages: (i) obtaining a short stick by pulling up a string, (ii) using the short stick as a metatool to extract a long stick from a toolbox, and finally (iii) using the long stick to extract food from a hole. Crows with previous experience of the behaviours in stages 1–3 linked them into a novel sequence to solve the problem on the first trial. Crows with experience of only using string and tools to access food also successfully solved the problem. This innovative use of established behaviours in novel contexts was not based on resurgence, chaining and conditional reinforcement. Instead, the performance was consistent with the transfer of an abstract, causal rule: ‘out-of-reach objects can be accessed using a tool’. This suggests that high innovation rates in the wild may reflect complex cognitive abilities that supplement basic learning mechanisms.
Wednesday, June 09, 2010
Older is happier
Many diminutions come with aging, but decreasing happiness is apparently not among them. Bakalar notes studies by Stone et al. showing, to the contrary, that by almost any measure, people get happier as they get older, for reasons that are not clear. Clips from Bakalar's summary:
On the global measure, people start out at age 18 feeling pretty good about themselves, and then, apparently, life begins to throw curve balls. They feel worse and worse until they hit 50. At that point, there is a sharp reversal, and people keep getting happier as they age. By the time they are 85, they are even more satisfied with themselves than they were at 18.
In measuring immediate well-being — yesterday’s emotional state — the researchers found that stress declines from age 22 onward, reaching its lowest point at 85. Worry stays fairly steady until 50, then sharply drops off. Anger decreases steadily from 18 on, and sadness rises to a peak at 50, declines to 73, then rises slightly again to 85. Enjoyment and happiness have similar curves: they both decrease gradually until we hit 50, rise steadily for the next 25 years, and then decline very slightly at the end, but they never again reach the low point of our early 50s.
...we can expect to be happier in our early 80s than we were in our 20s...and it’s not being driven predominantly by things that happen in life. It’s something very deep and quite human that seems to be driving this.
Tuesday, June 08, 2010
Keep Austin Weird
The title of this post is the unofficial motto of Austin Texas, where I am spending this week. One of the things I enjoy most is its funky coffee houses, all with high-speed wireless internet and copious outlets to plug in your laptop.
Washing away postdecisional dissonance
An interesting tidbit from Lee and Schwarz:
Hand washing removes more than dirt—it also removes the guilt of past misdeeds, weakens the urge to engage in compensatory behavior, and attenuates the impact of disgust on moral judgment. These findings are usually conceptualized in terms of a purity-morality metaphor that links physical and moral cleanliness; however, they may also reflect that washing more generally removes traces of the past by metaphorically wiping the slate clean. If so, washing one’s hands may lessen the influence of past behaviors that have no moral implications at all. We test this possibility in a choice situation. Freely choosing between two similarly attractive options (e.g., Paris or Rome for vacation) arouses cognitive dissonance, an aversive psychological state resulting from conflicting cognitions. People reduce dissonance by perceiving the chosen alternative as more attractive and the rejected alternative as less attractive after choice, thereby justifying their decision.

The authors tested whether hand washing reduces this classic postdecisional dissonance effect (the need to justify a recent choice) by designing a ranking and choice experiment in which participants, after making a choice, were subsequently asked to evaluate a soap - some with actual hand washing and some without. Their finding:
...indicate that the psychological impact of physical cleansing extends beyond the moral domain. Much as washing can cleanse us from traces of past immoral behavior, it can also cleanse us from traces of past decisions, reducing the need to justify them. This observation is not captured by the purity-morality metaphor and highlights the need for a better understanding of the processes that mediate the psychological impact of physical cleansing. To further constrain the range of plausible candidate explanations, future research may test whether the observed "clean slate" effect is limited to past acts that may threaten one’s self-view (e.g., moral transgressions and potentially poor choices) or also extends to past behaviors with positive implications.
Monday, June 07, 2010
Brunch at Fonda San Miguel in Austin
A picture from yesterday's birthday brunch for my son, now 36 years old, at my favorite Mexican gourmet restaurant in Austin Texas. Shown from left to right are my partner Len Walker, my son Jon Bownds, Deric (me), daughter-in-law Shana Merlin, and old family friend Martha Leipziger.
Prozac reverses maturation of some brain cells
Here is some intriguing work from Kobayashi et al. showing that fluoxetine (Prozac) induces a dematuration of hippocampal dentate gyrus granule cells that reinstates synaptic plasticity normally reduced with development, thereby potentially conferring beneficial effects on the adult brain. (These cells are key in learning and memory processes.) Their results suggest that the state of neuronal maturation, including aberrant maturation, might be controlled or corrected in adults, a unique approach to treating neuronal dysfunctions associated with neurodevelopmental disorders. Some clips from the abstract:
Serotonergic antidepressant drugs have been commonly used to treat mood and anxiety disorders, and increasing evidence suggests potential use of these drugs beyond current antidepressant therapeutics. Facilitation of adult neurogenesis in the hippocampal dentate gyrus has been suggested to be a candidate mechanism of action of antidepressant drugs, but this mechanism may be only one of the broad effects of antidepressants. Here we show a distinct unique action of the serotonergic antidepressant fluoxetine in transforming the phenotype of mature dentate granule cells. Chronic treatments of adult mice with fluoxetine strongly reduced expression of the mature granule cell marker calbindin. The fluoxetine treatment induced active somatic membrane properties resembling immature granule cells and markedly reduced synaptic facilitation that characterizes the mature dentate-to-CA3 signal transmission. These changes cannot be explained simply by an increase in newly generated immature neurons, but best characterized as “dematuration” of mature granule cells...Our results suggest that serotonergic antidepressants can reverse the established state of neuronal maturation in the adult hippocampus...Such reversal of neuronal maturation could affect proper functioning of the mature hippocampal circuit, but may also cause some beneficial effects by reinstating neuronal functions that are lost during development.
Friday, June 04, 2010
MindBlog in Austin Texas
Having just returned from Istanbul last Friday, I travel again - this time with my partner Len Walker to Austin Texas to attend the 50th reunion of Austin High School Alumni. I will be vacationing here through next week. Being in Austin requires adapting the digestive system to gargantuan servings of TexMex dishes.
Aging brains are less able to recover from the effects of stress
It is well documented that aging reduces the effectiveness of our prefrontal cortex in mediating cognitive processing and decision making, including working memory and flexible use of mental strategies. Bloss, McEwen et al. have now conducted studies suggesting that one reason for this decline may be that aging reduces the ability of the prefrontal cortex to recover from stress-induced damage. Their abstract:
Neuronal networks in the prefrontal cortex mediate the highest levels of cognitive processing and decision making, and the capacity to perform these functions is among the cognitive features most vulnerable to aging. Despite much research, the neurobiological basis of age-related compromised prefrontal function remains elusive. Many investigators have hypothesized that exposure to stress may accelerate cognitive aging, though few studies have directly tested this hypothesis and even fewer have investigated a neuronal basis for such effects. It is known that in young animals, stress causes morphological remodeling of prefrontal pyramidal neurons that is reversible. The present studies sought to determine whether age influences the reversibility of stress-induced morphological plasticity in rat prefrontal neurons. We hypothesized that neocortical structural resilience is compromised in normal aging. To directly test this hypothesis we used a well characterized chronic restraint stress paradigm, with an additional group allowed to recover from the stress paradigm, in 3-, 12-, and 20-month-old male rats. In young animals, stress induced reductions of apical dendritic length and branch number, which were reversed with recovery; in contrast, middle-aged and aged rats failed to show reversible morphological remodeling when subjected to the same stress and recovery paradigm. The data presented here provide evidence that aging is accompanied by selective impairments in long-term neocortical morphological plasticity.
Thursday, June 03, 2010
Overimitation of adults by kids is a cultural universal.
Most studies showing overimitation of adults by children during learning have been conducted on middle- to upper-class kids of Western-educated parents. Nielsen and Tomaselli studied (from Telis's review)
...a culture with a distinctly different parenting style: the Bushmen of the Kalahari Desert. Whereas a Western parent might teach a youngster to use a bow and arrow by standing behind her and guiding her motions, a parent from the indigenous African Bushmen culture would allow the child to come along for a hunt and learn by observation and through trial and error. Nielsen hypothesized that a child taught in this hands-off manner would have less reason to overimitate adults and would do so less often.
The Nielsen and Tomaselli abstract:
Children are surrounded by objects that they must learn to use. One of the most efficient ways children do this is by imitation. Recent work has shown that, in contrast to nonhuman primates, human children focus more on reproducing the specific actions used than on achieving actual outcomes when learning by imitating. From 18 months of age, children will routinely copy even arbitrary and unnecessary actions. This puzzling behavior is called overimitation. By documenting similarities exhibited by children from a large, industrialized city and children from remote Bushman communities in southern Africa, we provide here the first indication that overimitation may be a universal human trait. We also show that overimitation is unaffected by the age of the child, differences in the testing environment, or familiarity with the demonstrating adult. Furthermore, we argue that, although seemingly maladaptive, overimitation reflects an evolutionary adaptation that is fundamental to the development and transmission of human culture.
The Telis review has an interesting video of overimitation in a Bushman child.
Prehistoric makeover
From the 21 May Science Magazine "Random Samples" section:
Care to feel closer to your extinct relatives? The Smithsonian Institution's MEanderthal app for iPhones and Android devices melds your mug shot with features of Homo neanderthalensis, modern humans' closest kin—or, if you prefer, the more distant H. heidelbergensis or H. floresiensis. In the first case, expect to gain a big nose and a puffier face, says Robert Costello, an outreach manager with the Smithsonian. Neandertals needed large sinus cavities to cope with the colder climate in Europe and Asia 28,000 to 200,000 years ago.
Wednesday, June 02, 2010
The cognitive niche
Steven Pinker offers (full text, open access) one of several fascinating papers in a special PNAS supplement issue: In the light of evolution IV: The human condition. All of the papers are open access. Pinker's title is "The cognitive niche: Coevolution of intelligence, sociality, and language." The abstract:
Although Darwin insisted that human intelligence could be fully explained by the theory of evolution, the codiscoverer of natural selection, Alfred Russel Wallace, claimed that abstract intelligence was of no use to ancestral humans and could only be explained by intelligent design. Wallace's apparent paradox can be dissolved with two hypotheses about human cognition. One is that intelligence is an adaptation to a knowledge-using, socially interdependent lifestyle, the “cognitive niche.” This embraces the ability to overcome the evolutionarily fixed defenses of plants and animals by applications of reasoning, including weapons, traps, coordinated driving of game, and detoxification of plants. Such reasoning exploits intuitive theories about different aspects of the world, such as objects, forces, paths, places, states, substances, and other people's beliefs and desires. The theory explains many zoologically unusual traits in Homo sapiens, including our complex toolkit, wide range of habitats and diets, extended childhoods and long lives, hypersociality, complex mating, division into cultures, and language (which multiplies the benefit of knowledge because know-how is useful not only for its practical benefits but as a trade good with others, enhancing the evolution of cooperation). The second hypothesis is that humans possess an ability of metaphorical abstraction, which allows them to coopt faculties that originally evolved for physical problem-solving and social coordination, apply them to abstract subject matter, and combine them productively. These abilities can help explain the emergence of abstract cognition without supernatural or exotic evolutionary forces and are in principle testable by analyses of statistical signs of selection in the human genome.
Tuesday, June 01, 2010
Contra doomsayers, a bright future beckons?
Matt Ridley has a new book out, "The Rational Optimist", reviewed by John Tierney. (Ridley is a very bright polymath; I recall that he did a much better job than I did some 15 years ago, when we were fellow contributors of several chapters to a standard introductory biology textbook, a hack writing-for-pay gig.) Ridley argues in his grand theory that it was the invention of exchanging one object for another, rather than increasingly big brains or cooperation and reciprocity, that started the explosive growth of civilization.
“Adam potentially now had access to objects he did not know how to make or find; and so did Oz,” Dr. Ridley writes. People traded goods, services and, most important, knowledge, creating a collective intelligence: “Ten individuals could know between them ten things, while each understanding one.”
Rulers like to take credit for the advances during their reigns, and scientists like to see their theories as the source of technological progress. But Dr. Ridley argues that they’ve both got it backward: traders’ wealth builds empires, and entrepreneurial tinkerers are more likely to inspire scientists than vice versa. From Stone Age seashells to the steam engine to the personal computer, innovation has mostly been a bottom-up process.
“Forget wars, religions, famines and poems for the moment,” Dr. Ridley writes. “This is history’s greatest theme: the metastasis of exchange, specialization and the invention it has called forth, the ‘creation’ of time.”
Progress this century could be impeded by politics, wars, plagues or climate change, but Dr. Ridley argues that, as usual, the “apocaholics” are overstating the risks and underestimating innovative responses....with new hubs of innovation emerging elsewhere, and with ideas spreading faster than ever on the Internet, Dr. Ridley expects bottom-up innovators to prevail. His prediction for the rest of the century: “Prosperity spreads, technology progresses, poverty declines, disease retreats, fecundity falls, happiness increases, violence atrophies, freedom grows, knowledge flourishes, the environment improves and wilderness expands.”