Wednesday, May 09, 2007

Meditation can alter attentional resource allocation

A striking observation from the Wisconsin group on how meditation can improve performance in discriminating closely spaced stimuli. Here is their summary:
Meditation includes the mental training of attention, which involves the selection of goal-relevant information from the array of inputs that bombard our sensory systems. One of the major limitations of the attentional system concerns the ability to process two temporally close, task-relevant stimuli. When the second of two target stimuli is presented within a half second of the first one in a rapid sequence of events, it is often not detected. This so-called “attentional-blink” deficit is thought to result from competition between stimuli for limited attentional resources. We measured the effects of intense meditation on performance and scalp-recorded brain potentials in an attentional-blink task. We found that three months of intensive meditation reduced brain-resource allocation to the first target, enabling practitioners to more often detect the second target with no compromise in their ability to detect the first target. These findings demonstrate that meditative training can improve performance on a novel task that requires the trained attentional abilities.

Personal space in virtual reality

Check out the fascinating video in this link, showing how rules of personal space and eye contact carry over into the Second Life virtual reality game.

Tuesday, May 08, 2007

Discover Magazine on Mind and Brain

A variety of interesting articles in an online Mind and Brain section of Discover Magazine.

Wisdom and the Amygdala...

The Sunday May 6 New York Times Magazine has an interesting article by Stephen Hall titled "The Older-and-Wiser Hypothesis" describing efforts to define what constitutes wise behavior (PDF here). It describes the "Berlin Paradigm" which in essence defines wisdom as
“an expert knowledge system concerning the fundamental pragmatics of life.” It emphasizes several complementary qualities: expert knowledge of both the “facts” of human nature and the “how” of dealing with decisions and dilemmas; an appreciation of one’s historical, cultural and biological circumstances during the arc of a life span; an understanding of the “relativism” of values and priorities; and an acknowledgment, at the level of both thought and action, of uncertainty.
Central to wisdom is emotion regulation:
...despite the well-documented cognitive declines associated with advancing age, older people seem to have figured out how to manage their emotions in a profoundly important way. Compared with younger people, they experience negative emotions less frequently, exercise better control over their emotions and rely on a complex and nuanced emotional thermostat that allows them to bounce back quickly from adverse moments. Indeed, they typically strive for emotional balance, which in turn seems to affect the ways their brains process information from their environment.
The article quotes Richard Davidson at Wisconsin:
“Those people who are good at regulating negative emotion, inferred by their ability to voluntarily use cognitive strategies to reappraise a stimulus, show reductions in activation in the amygdala,” says Davidson, who added that such regulation probably results from “something that has been at least implicitly trained over the years.” It is difficult (not to say dangerous) to generalize from such a small, focused study, but the implication is that people who learn, or somehow train themselves, to modulate their emotions are better able to manage stress and bounce back from adversity. Although they can register the negative, they have somehow learned not to get bogged down in it. Whether this learning is a form of “wisdom” accumulated over a lifetime of experience, as wisdom researchers see it, or can be acquired through training exercises like meditation, as Davidson’s previous research has shown, the recent message from neuroscience laboratories is that the optimal regulation of emotion can be seen in the brain.
Further clips:
Similarly, several years ago, Carstensen; Mara Mather of the University of California at Santa Cruz; John Gabrieli, a neuroscientist now at the Massachusetts Institute of Technology; and several colleagues performed f.M.R.I. studies of young and old people to see whether the ability to regulate emotions left a trace in the amygdala. The study indicated that the amygdala in young people becomes active when they view both positive and negative images; the amygdala in older people is active only when they view positive images. Put another way, young people tend to cling to the negative information, neurologically speaking, while older people seem better able to shrug it off and focus more on positive images. This neural selectivity, this focus on the positive, is virtually instantaneous, Gabrieli says, and yet probably reflects a kind of emotional knowledge or experience that guides cognitive focus; Carstensen says older people “disattend” negative information. This “disattention” also echoes some very old thoughts on wisdom. In his 1890 book “The Principles of Psychology,” William James observed, “The art of being wise is the art of knowing what to overlook.” In modern neuroscience parlance, Gabrieli says, “you could say that in older people the amygdala is overlooking the negative.”

Much of the research to date has reflected a predominantly Western notion of wisdom, but its definition can be further muddied by cultural vagaries. In one cross-cultural study, researchers found that Americans and Australians essentially equated being wise with being experienced and knowledgeable; being old and discreet were seen as less-than-desirable qualities. People in India and Japan, by contrast, linked wisdom to being discreet, aged and experienced.

Nevertheless, the notion of wisdom is sufficiently universal that it raises other questions: Where does it come from, and how does one acquire it? Surprisingly, a good deal of evidence, both anecdotal and empirical, suggests that the seeds of wisdom are planted earlier in life — certainly earlier than old age, often earlier than middle age and possibly even earlier than young adulthood. And there are strong hints that wisdom is associated with an earlier exposure to adversity or failure. That certainly seems to be the case with emotional regulation and is perfectly consistent with Carstensen’s ideas about shifting time horizons. Karen Parker and her colleagues at Stanford have published several striking animal studies showing that a very early exposure to mild adversity (she calls it a “stress inoculation”) seems to “enhance the development of brain systems that regulate emotional, neuroendocrine and cognitive control” — at least in nonhuman primates. Some researchers are also exploring the genetic basis of resilience.

This week's piano - Suite bergamasque

This week's recording, another Debussy piece: the Prelude from Suite bergamasque...

Monday, May 07, 2007

One Clever Raven

Heinrich and Bugnyar have an interesting article in the April 2007 Scientific American titled "Just How Smart Are Ravens?" It reminds me of this video that I have used in my teaching, made by Weir et al., of a New Caledonian crow that obviously understands a few things about physical forces and causal relations: it takes a straight wire, fashions a hook, and uses it to lift a container of meat out of a tube.

Brain Lessons

Steven Pinker, Oliver Sacks, and others on how learning about their brains changed the way they live. I particularly like the paragraph by Alison Gopnik, author of the critique of the mirror neuron myth that I have posted and co-author of The Scientist in the Crib: Minds, Brains, and How Children Learn.
Consciousness, attention, and brain plasticity all seem to be linked. And attention and plasticity are much more widely distributed in young animals—including human babies—than older ones. For grown-ups, consciousness is like a spotlight; for babies it's like a lantern. I have always loved the childlike moments, however brief, when our minds seem to open to the entire world around us—the experience celebrated by Romantic poets and Zen sages alike. The neuroscience makes me think that these moments aren't just a passing thrill. Cultivating this childlike "lantern consciousness," this broad focus, might help make us almost as good as babies at changing our brains.

Friday, May 04, 2007

Bush's Mistake and Kennedy's Error

This is the title of Michael Shermer's essay in the April 15 issue of Scientific American, on how self-deception can be more powerful than deception.
...most members of Congress from both parties, along with President George W. Bush, believe that we have to "stay the course" and not just "cut and run." ...We all make similarly irrational arguments about decisions in our lives: we hang on to losing stocks, unprofitable investments, failing businesses and unsuccessful relationships. If we were rational, we would just compute the odds of succeeding from this point forward and then decide if the investment warrants the potential payoff. But we are not rational--not in love or war or business--and this particular irrationality is what economists call the "sunk-cost fallacy."

The psychology underneath this and other cognitive fallacies is brilliantly illuminated by psychologist Carol Tavris and University of California, Santa Cruz, psychology professor Elliot Aronson in their book Mistakes Were Made (But Not by Me) (Harcourt, 2007). Tavris and Aronson focus on so-called self-justification, which "allows people to convince themselves that what they did was the best thing they could have done." The passive voice of the telling phrase "mistakes were made" shows the rationalization process at work.

What happens in those rare instances when someone says, "I was wrong"? Surprisingly, forgiveness is granted and respect is elevated. Imagine what would happen if George W. Bush delivered the following speech:

This administration intends to be candid about its errors. For as a wise man once said, "An error does not become a mistake until you refuse to correct it." We intend to accept full responsibility for our errors.... We're not going to have any search for scapegoats ... the final responsibilities of any failure are mine, and mine alone.

Bush's popularity would skyrocket, and respect for his ability as a thoughtful leader willing to change his mind in the teeth of new evidence would soar. That is precisely what happened to President John F. Kennedy after the botched Bay of Pigs invasion of Cuba, when he spoke these very words.
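The sunk-cost fallacy Shermer describes has a simple rational counterpart: compute only the odds and payoff of succeeding from this point forward. A minimal sketch of that decision rule (my own illustration with made-up numbers, not from the essay):

```python
# Illustrative sketch of a sunk-cost-free decision rule.
# The numbers are invented; the point is that money already spent
# never appears anywhere in the calculation.

def should_continue(expected_payoff, p_success, future_cost):
    """Continue only if the expected value of going forward exceeds
    the remaining cost. Sunk costs are deliberately absent."""
    return p_success * expected_payoff - future_cost > 0

# A venture has already consumed $500,000 (irrelevant to the decision).
# Finishing costs $100,000 more, with a 20% chance of a $300,000 payoff:
# 0.2 * 300,000 = 60,000 < 100,000, so the rational move is to stop.
print(should_continue(expected_payoff=300_000, p_success=0.2,
                      future_cost=100_000))  # False
```

The irrational version is the one we actually run: we let the $500,000 already spent argue for spending more.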

Ginkgo Biloba? Forget About It.

Brendan I. Koerner gives a history of the top-selling brain enhancer. Bottom line:
In 2002, a long-anticipated paper appeared in JAMA titled "Ginkgo for memory enhancement: a randomized controlled trial." This Williams College study, sponsored by the National Institute on Aging rather than Schwabe, examined the effects of ginkgo consumption on healthy volunteers older than 60. The conclusion, now cited in the National Institutes of Health's ginkgo fact sheet, said: "When taken following the manufacturer's instructions, ginkgo provides no measurable benefit in memory or related cognitive function to adults with healthy cognitive function."

The impact of this seemingly damning assessment, however, was ameliorated by the almost simultaneous publication of a Schwabe-sponsored study in the less prestigious Human Psychopharmacology. This rival study, conducted at Jerry Falwell's Liberty University, was rejected by JAMA, and came to a very different—if not exactly sweeping—conclusion: There was ample evidence to support "the potential efficacy of Ginkgo biloba EGb 761 in enhancing certain neuropsychological/memory processes of cognitively intact older adults, 60 years of age and over." The two studies canceled each other out in the court of public opinion; ginkgo sales remained strong.

A large-scale, multicenter, multiyear study might clear things up, but no one appears interested in funding such a massive effort. The National Center for Complementary and Alternative Medicine is in the midst of a clinical trial involving 3,000 Alzheimer's patients, but this obviously has no bearing on whether ginkgo can help the healthy.

Thursday, May 03, 2007

Gesture in language evolution - data from Chimps

Pollick and de Waal have observed the association of manual and facial/vocal signals in groups of chimpanzees and bonobos, distinguishing 31 manual gestures and 18 facial/vocal signals. Bonobos, which diverged from chimpanzees about 2.5 million years ago, seem to make special use of hand gestures: a gesture elicits a response from other bonobos much more often when it is included in the mix of sounds and facial expressions.
"...our closest primate relatives use brachiomanual gestures more flexibly across contexts than they do facial expressions and vocalizations. Gestures seem less closely tied to particular emotions, such as aggression or affiliation, hence possess a more adaptable function. Gestures are also evolutionarily younger, as shown by their presence in apes but not monkeys, and likely under greater cortical control than facial/vocal signals .... This observation makes gesture a serious candidate modality to have acquired symbolic meaning in early hominins. As such, the present study supports the gestural origin hypothesis of language."

Train Your Brain

Meghan O'Rourke on the new mania for neuroplasticity.
Neuroplasticity certainly has capacious ramifications, but you could be forgiven for thinking that the mania for harnessing its supposed anti-aging benefits is just our latest form of magical thinking, invoked by baby boomers who've turned away from fussing over their children's brains to ward off their own eventual decline.

...the idea that a little mindful meditation could calm down the forgetful, buzzing frenzy of our brains is still an appealing one. Even if the science is less than solid, maybe the placebo effect will kick in; and in any case, my brain seems to enjoy its crossword-puzzle respites and its Sudoku vacations, the way my muscles enjoy a massage. Or so my mind is telling me. Seven-letter word for "memory loss," anyone?

Wednesday, May 02, 2007

The Way We Age Now

This is the title of one of the best articles on aging that I have read, written by Atul Gawande (Asst. Prof. in the Harvard School of Public Health, and staff writer for the New Yorker Magazine). The article appears in the April 30 issue of the New Yorker.

Some clips:

Even though some genes have been shown to influence longevity in worms, fruit flies, and mice...
...scientists do not believe that our life spans are actually programmed into us. After all, for most of our hundred-thousand-year existence—all but the past couple of hundred years—the average life span of human beings has been thirty years or less...Today, the average life span in developed countries is almost eighty years. If human life spans depend on our genetics, then medicine has got the upper hand. We are, in a way, freaks living well beyond our appointed time. So when we study aging what we are trying to understand is not so much a natural process as an unnatural one...

...complex systems—power plants, say—have to survive and function despite having thousands of critical components. Engineers therefore design these machines with multiple layers of redundancy: with backup systems, and backup systems for the backup systems. The backups may not be as efficient as the first-line components, but they allow the machine to keep going even as damage accumulates...within the parameters established by our genes, that’s exactly how human beings appear to work. We have an extra kidney, an extra lung, an extra gonad, extra teeth. The DNA in our cells is frequently damaged under routine conditions, but our cells have a number of DNA repair systems. If a key gene is permanently damaged, there are usually extra copies of the gene nearby. And, if the entire cell dies, other cells can fill in.

Nonetheless, as the defects in a complex system increase, the time comes when just one more defect is enough to impair the whole, resulting in the condition known as frailty. It happens to power plants, cars, and large organizations. And it happens to us: eventually, one too many joints are damaged, one too many arteries calcify. There are no more backups. We wear down until we can’t wear down anymore.
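Gawande's engineering analogy can be put in simple probability terms. A toy sketch (my illustration, not his; the failure rates are invented and failures are assumed independent):

```python
# Toy model of redundancy and frailty: each critical function has
# redundant backup copies, and the whole system fails only when some
# function loses every copy. All numbers here are invented.

def failure_probability(p_component, n_backups):
    """Chance that one function fails: every one of its copies must fail."""
    return p_component ** n_backups

def system_failure(p_component, n_backups, n_functions):
    """Chance that at least one of many redundant functions fails."""
    one_function_ok = 1 - failure_probability(p_component, n_backups)
    return 1 - one_function_ok ** n_functions

# A part that fails 10% of the time: with three redundant copies, the
# function it serves fails only ~0.1% of the time...
print(round(failure_probability(0.10, 3), 4))
# ...but with a thousand such functions, a fatal defect somewhere is
# still more likely than not -- the arithmetic of frailty.
print(round(system_failure(0.10, 3, 1000), 2))
```

Redundancy makes each component robust, yet with enough components the odds still catch up with the whole, which is Gawande's point about the one-too-many damaged joint or calcified artery.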
Gawande proceeds to a discussion of social and medical consequences of people over 65 becoming 20% of the population.
Improvements in the treatment and prevention of heart disease, respiratory illness, stroke, cancer, and the like mean that the average sixty-five-year-old can expect to live another nineteen years—almost four years longer than was the case in 1970. (By contrast, from the nineteenth century to 1970, sixty-five-year-olds gained just three years of life expectancy.)

The result has been called the “rectangularization” of survival. Throughout most of human history, a society’s population formed a sort of pyramid: young children represented the largest portion—the base—and each successively older cohort represented a smaller and smaller group. In 1950, children under the age of five were eleven per cent of the U.S. population, adults aged forty-five to forty-nine were six per cent, and those over eighty were one per cent. Today, we have as many fifty-year-olds as five-year-olds. In thirty years, there will be as many people over eighty as there are under five.

Americans haven’t come to grips with the new demography. We cling to the notion of retirement at sixty-five—a reasonable notion when those over sixty-five were a tiny percentage of the population, but completely untenable as they approach twenty per cent. People are putting aside less in savings for old age now than they have in any decade since the Great Depression. More than half of the very old now live without a spouse, and we have fewer children than ever before—yet we give virtually no thought to how we will live out our later years alone.

...medicine has been slow to confront the very changes that it has been responsible for—or to apply the knowledge we already have about how to make old age better. Despite a rapidly growing elderly population, the number of certified geriatricians fell by a third between 1998 and 2004.

Spirit Tech

John Horgan on how to wire your brain for religious ecstasy.
Our current mystical technologies are primitive, but one day, neurotheologians may find a technology that gives us permanent, blissful self-transcendence with no side effects. Should we really welcome such a development? Recall that in the 1950s and 1960s, the CIA funded research on psychedelics because of their potential as brainwashing agents and truth serums.

Even setting aside the issue of control, mystical technologies raise troubling philosophical issues. Shulgin, the psychedelic chemist, once wrote that a perfect mystical technology would bring about "the ultimate evolution, and perhaps the end of the human experiment." When I asked Shulgin to elaborate, he said that if we achieve permanent mystical bliss, there would be "no motivation, no urge to change anything, no creativity." Both science and religion aim to eliminate suffering. But if a mystical technology makes us immune to anxiety, grief, and heartache, are we still fully human? Have we gained something or lost something? In short, would a truly effective mystical technology—a God machine that works—save us, or doom us?

Human evolution and migration


I just came across one of the best graphical presentations of human origins and migrations that I have seen, posted by The Bradshaw Foundation, whose website also has other information on human origins. It allows you to click through the various expansions and contractions of human groups as the ice ages came and went.

Tuesday, May 01, 2007

God Is in the Dendrites

George Johnson asks: Can "neurotheology" bridge the gap between religion and science? He gives an excellent summary of relevant experiments that measure or induce brain activity correlated with meditative, religious, or ecstatic states to conclude:
So it goes, round and round. Either the brain naturally or through a malfunction manufactures religious delusions, or some otherworldly presence speaks to homo sapiens through the language of neurological pulses. Hot in pursuit of this undecidable proposition, neurotheology will keep on churning out data—but when it comes to the biggest questions, it will never have much to say.

An interesting effect of Cochlear Implants - better than normal audiovisual integration

Rouger et al. show that deaf people have superior lip-reading abilities and superior audiovisual integration compared with those with normal hearing and that they maintain superior lip-reading performance even after cochlear implantation.

From Shannon's review of this work:
Cochlear implants are sensory prostheses that restore hearing to deafened individuals by electric stimulation of the remaining auditory nerve. Contemporary cochlear implants generally use 16–22 electrodes placed along the tonotopic axis of the cochlea. Each electrode is designed to stimulate a discrete neural region and thereby present a coarse representation of the frequency-specific neural activation in a normal cochlea. However, within each region of stimulated neurons, the fine spectro-temporal structure of neural activation/response is quite different from that of the normal ear. Despite these differences, modern cochlear implants provide high levels of speech understanding, with most recipients capable of telephone conversation.
from Rouger et al.'s abstract:
... recovery goes through long-term adaptative processes to build coherent percepts from the coarse information delivered by the implant.... we analyzed the longitudinal postimplantation evolution of word recognition in a large sample of cochlear implant (CI) users in unisensory (visual or auditory) and bisensory (visuoauditory) conditions. We found that, despite considerable recovery of auditory performance during the first year postimplantation, CI patients maintain a much higher level of word recognition in speechreading conditions compared with normally hearing subjects, even several years after implantation. Consequently, we show that CI users present higher visuoauditory performance when compared with normally hearing subjects with similar auditory stimuli. This better performance is not only due to greater speechreading performance, but, most importantly, also due to a greater capacity to integrate visual input with the distorted speech signal. Our results suggest that these behavioral changes in CI users might be mediated by a reorganization of the cortical network involved in speech recognition that favors a more specific involvement of visual areas. Furthermore, they provide crucial indications to guide the rehabilitation of CI patients by using visually oriented therapeutic strategies.
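Shannon's description of 16-22 electrodes spread along the cochlea's tonotopic axis amounts to a coarse, roughly logarithmic division of the speech-frequency range. A hedged sketch of what that mapping looks like (the electrode count and frequency limits are illustrative assumptions, not taken from the paper):

```python
# Illustrative sketch of a tonotopic electrode map: split a frequency
# range into log-spaced bands, one coarse band per electrode. The
# defaults (16 electrodes, 200-8000 Hz) are assumptions for illustration.

def electrode_bands(n_electrodes=16, f_low=200.0, f_high=8000.0):
    """Return (low, high) Hz edges for each electrode's band, spaced
    logarithmically to mimic the cochlea's tonotopic axis."""
    ratio = f_high / f_low
    edges = [f_low * ratio ** (i / n_electrodes)
             for i in range(n_electrodes + 1)]
    return list(zip(edges[:-1], edges[1:]))

bands = electrode_bands()
print(len(bands))          # 16 -- one coarse band per electrode
print(round(bands[0][1]))  # upper edge of the lowest band, ~250 Hz
```

Each band collapses all the fine spectro-temporal detail within it onto a single stimulation site, which is why the percept is coarse and why, per Rouger et al., visual speech information becomes so valuable for filling in the rest.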

Monday, April 30, 2007

The Myth of Mirror Neurons?

In an article in a special issue of Slate devoted to the brain (well worth checking over...I'll give some links to articles in the Slate issue in subsequent posts), Gopnik argues that excitement over the discovery of mirror neurons in our brains (the subject of a number of blog posts and my lecture posted earlier...) is generating a new scientific myth. Like a traditional myth, it captures intuitions about the human condition through vivid metaphors. Some clips:
It didn't take long for scientists and science writers to speculate that mirror neurons might serve as the physiological basis for a wide range of social behaviors, from altruism to art appreciation. Headlines like "Cells That Read Minds" or "How Brain's 'Mirrors' Aid Our Social Understanding" tapped into our intuitions about connectedness. Maybe this cell, with its mellifluous name, gives us our special capacity to understand one another—to care, to learn, and to communicate. Could mirror neurons be responsible for human language, culture, empathy, and morality?

The evidence for individual mirror neurons comes entirely from studies of macaque monkeys. That's because you can't find these cells without inserting electrodes directly (though painlessly) into individual neurons in the brains of living animals. These studies haven't been done with chimpanzees, let alone humans.

The trouble is that macaque monkeys don't have language, they don't have culture, and they don't understand other animals' minds. In fact, careful experiments show that they don't even systematically imitate the actions of other monkeys—and they certainly don't imitate in the prolific way that the youngest human children do. Even chimpanzees, who are much more cognitively sophisticated than macaques, show only very limited abilities in these areas. The fact that macaques have mirror neurons means that these cells can't by themselves explain our social behavior.

This week's recording - Arabesque

Debussy's first Arabesque, recorded on my Steinway B in Middleton, Wisconsin.

Top-Down/Bottom-Up in Attention Control

An elegant study from Buschman and Miller. Their abstract:
Attention can be focused volitionally by "top-down" signals derived from task demands and automatically by "bottom-up" signals from salient stimuli. The frontal and parietal cortices are involved, but their neural activity has not been directly compared. Therefore, we recorded from them simultaneously in monkeys. Prefrontal neurons reflected the target location first during top-down attention, whereas parietal neurons signaled it earlier during bottom-up attention. Synchrony between frontal and parietal areas was stronger in lower frequencies during top-down attention and in higher frequencies during bottom-up attention. This result indicates that top-down and bottom-up signals arise from the frontal and sensory cortex, respectively, and different modes of attention may emphasize synchrony at different frequencies.

Friday, April 27, 2007

Does Darwinism have to be depressing?

Robert Wright, author of "The Moral Animal," argues no, in spite of the fact that evolutionary explanations boil our loftiest feelings down to genetic self-interest. Morality, along with love and positive emotions, is your genes' way of getting you to serve their agenda. Here is a PDF of his essay.