Monday, March 16, 2009

Neural correlates of religious belief - neuroscience and spirituality

I'm realizing that I have a sufficient number of notes from this area in my queue that I'm not going to get to them separately. So, here I pass on first some links to recent publications and then some work on neural correlates of religious belief.

First, three publications:
"How God changes your brain"

"Contemplative Science: Where Buddhism and Neuroscience Converge"

A conference on Neuroscience and spiritual practices
Next,
Kapogiannis and collaborators attempt to model the complexity of religious belief and then provide brain imaging data correlating their categories with well-known brain networks:
We propose an integrative cognitive neuroscience framework for understanding the cognitive and neural foundations of religious belief. Our analysis reveals 3 principal psychological dimensions of religious belief (God's perceived level of involvement, God's perceived emotion, and doctrinal/experiential religious knowledge), which functional MRI localizes within networks processing Theory of Mind regarding intent and emotion, abstract semantics, and imagery. Our results are unique in demonstrating that specific components of religious belief are mediated by well-known brain networks, and support contemporary psychological theories that ground religious belief within evolutionary adaptive cognitive functions.
In a less ambitious effort, Inzlicht et al. suggest that religious people are more chilled out when they commit errors, as reflected by reduced reactivity of their anterior cingulate cortex:
Many people derive peace of mind and purpose in life from their belief in God. For others, however, religion provides unsatisfying answers. Are there brain differences between believers and nonbelievers? Here we show that religious conviction is marked by reduced reactivity in the anterior cingulate cortex (ACC), a cortical system that is involved in the experience of anxiety and is important for self-regulation. In two studies, we recorded electroencephalographic neural reactivity in the ACC as participants completed a Stroop task. Results showed that stronger religious zeal and greater belief in God were associated with less firing of the ACC in response to error and with commission of fewer errors. These correlations remained strong even after we controlled for personality and cognitive ability. These results suggest that religious conviction provides a framework for understanding and acting within one's environment, thereby acting as a buffer against anxiety and minimizing the experience of error.

Friday, March 13, 2009

To be less helpful to others, watch violent media...

A sobering study from Bushman and Anderson:
Two studies tested the hypothesis that exposure to violent media reduces aid offered to people in pain. In the first study, participants played a violent or nonviolent video game for 20 min. After game play, while completing a lengthy questionnaire, they heard a loud fight, in which one person was injured, outside the lab. Participants who played violent games took longer to help the injured victim, rated the fight as less serious, and were less likely to "hear" the fight in comparison to participants who played nonviolent games. In the second study, violent- and nonviolent-movie attendees witnessed a young woman with an injured ankle struggle to pick up her crutches outside the theater either before or after the movie. Participants who had just watched a violent movie took longer to help than participants in the other three conditions. The findings from both studies suggest that violent media make people numb to the pain and suffering of others.

How to enhance the wisdom of one...

The point of James Surowiecki's engaging book "The Wisdom of Crowds" is that a marketplace - of ideas, goods, services, whatever - can make astonishingly accurate predictions of election outcomes, Oscar winners, etc. Herzog and Hertwig ask how a single individual might improve a best guess for an outcome, and they test a Hegelian process they call "dialectical bootstrapping": after making a first estimate, consider the reasons and assumptions underpinning it (and how they might be off target), and then formulate a new, second estimate that draws on somewhat different knowledge.

They tested the efficacy of this method by asking 101 students at the University of Basel to date a collection of 40 historical events (e.g., the discovery of electricity), 10 each from the 16th, 17th, 18th, and 19th centuries. Each participant was randomly assigned to one of two conditions. In both conditions, participants first generated their estimates without knowing that they would be asked later to generate a second estimate. In the dialectical-bootstrapping condition, participants (n= 50) were then asked to give dialectical estimates (while their first estimates were displayed in front of them) using a technique inspired by the consider-the-opposite strategy: First, assume that your first estimate is off the mark. Second, think about a few reasons why that could be. Which assumptions and considerations could have been wrong? Third, what do these new considerations imply? Was the first estimate rather too high or too low? Fourth, based on this new perspective, make a second, alternative estimate.

They found that the improvement in accuracy (in years) over the first estimate was twice as large for the dialectical average as for the repeat average, although averaging the first estimates from two random individuals worked better still.
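Here is a minimal Python sketch of the arithmetic involved; the event year and the two estimates are invented for illustration:

```python
# Dialectical bootstrapping in miniature: average a person's first estimate
# with a second estimate made after arguing against the first. The repeat
# average (the same guess twice) cannot cancel any error; the dialectical
# average can, because the second guess draws on partly different knowledge.

def dialectical_average(first: float, second: float) -> float:
    return (first + second) / 2.0

true_year = 1752                 # hypothetical target event
first, second = 1770.0, 1745.0   # hypothetical first and reconsidered guesses

for label, guess in [("first estimate alone", first),
                     ("repeat average", (first + first) / 2.0),
                     ("dialectical average", dialectical_average(first, second))]:
    print(f"{label:>22}: off by {abs(guess - true_year):.1f} years")
```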

Thursday, March 12, 2009

Roots of our social glue

Angier does a nice summary of ideas in Hrdy's forthcoming book “Mothers and Others: The Evolutionary Origins of Mutual Understanding” (Harvard Univ. Press):
...the extraordinary social skills of an infant are at the heart of what makes us human. Through its ability to solicit and secure the attentive care not just of its mother but of many others in its sensory purview, a baby promotes many of the behaviors and emotions that we prize in ourselves and that often distinguish us from other animals, including a willingness to share, to cooperate with strangers, to relax one’s guard...Our capacity to cooperate in groups, to empathize with others and to wonder what others are thinking and feeling...probably arose in response to the selective pressures of being in a cooperatively breeding social group, and the need to trust and rely on others and be deemed trustworthy and reliable in turn. Babies became adorable and keen to make connections with every passing adult gaze....mother chimpanzees and gorillas jealously hold on to their infants for the first six months or more of life. Other females may express real interest in the newborn, but the mother does not let go: you never know when one of those females will turn infanticidal, or be unwilling or unable to defend the young ape against an infanticidal male.

Dr. Hrdy wrote her book in part to counter what she sees as the reigning dogma among evolutionary scholars that humans evolved their extreme sociality and cooperative behavior to better compete with other humans. “I’m not comfortable accepting this idea that the origins of hypersociality can be found in warfare, or that in-group amity arose in the interest of out-group enmity,” she said in a telephone interview. Sure, humans have been notably violent and militaristic for the last 12,000 or so years, she said, when hunter-gatherers started settling down and defending territories, and populations started getting seriously dense. But before then? There weren’t enough people around to wage wars. By the latest estimates, the average population size during the hundreds of thousands of years of human evolution that preceded the Neolithic Age may have been around 2,000 breeding adults. “What would humans have been fighting over?” Dr. Hrdy said. “They were too busy trying to keep themselves and their children alive.”

Dr. Hrdy also argues that our human ancestors became emotionally modern long before the human brain had reached its current average volume of 1,300 cubic centimeters, which is about three times the size of a chimpanzee brain — in other words, that we became the nicest apes before becoming the smartest. You don’t need a bulging brain to evolve cooperative breeding. Many species of birds breed cooperatively, as do lions, rats, meerkats, wolves and marmosets, among others. But to become a cooperatively breeding ape, and to persuade a bunch of smart, hot-tempered, suspicious, politically cunning primates to start sharing child care and provisioning, now that took a novel evolutionary development, the advent of this thing called trust.

The Myth of Language Universals

To continue the thread from several previous posts, I pass on the abstract of a draft article by Evans and Levinson titled "The Myth of Language Universals: Language diversity and its importance for cognitive science":
Talk of linguistic universals has given cognitive scientists the impression that languages are all built to a common pattern. In fact, there are vanishingly few universals of language in the direct sense that all languages exhibit them. Instead, diversity can be found at almost every level of linguistic organization. This fundamentally changes the object of enquiry from a cognitive science perspective.

The article summarizes decades of cross-linguistic work by typologists and descriptive linguists, showing just how few and unprofound the universal characteristics of language are, once we honestly confront the diversity offered to us by the world's 6-8000 languages. After surveying the various uses of 'universal', we illustrate the ways languages vary radically in sound, meaning, and syntactic organization, then examine in more detail the core grammatical machinery of recursion, constituency, and grammatical relations. While there are significant recurrent patterns in organization, these are better explained as stable engineering solutions satisfying multiple design constraints, reflecting both cultural-historical factors and the constraints of human cognition.

Linguistic diversity then becomes the crucial datum for cognitive science: we are the only species with a communication system which is fundamentally variable at all levels. Recognising the true extent of structural diversity in human language opens up exciting new research directions for cognitive scientists, offering thousands of different natural experiments given by different languages, with new opportunities for dialogue with biological paradigms concerned with change and diversity, and confronting us with the extraordinary plasticity of the highest human skills.

Wednesday, March 11, 2009

MindBlog's winter office...

Inspired by 'the view from your window' feature of Andrew Sullivan's blog, I thought I would post an iPhone photo of where I am sitting (in Fort Lauderdale, Florida) as I bang out MindBlog's posts. I will be heading back to Madison, Wisconsin in early April.

How to keep from stumbling on steps...

Elliott et al. make the neat observation that a simple visual illusion can lead to safer stepping behavior. The perceived height of a step is manipulated as shown in this figure:

Subjects perceived the step to be higher in the V (vertical stripes on the height dimension) configuration on the right compared to the H (horizontal stripes) configuration on the left, and correspondingly raised their toes higher to clear the step.

Neuroscience and the soul.

A recent letter from Martha Farah to Science Magazine is worth passing on:
Science and religion have had a long relationship, by turns collegial and adversarial. In the 17th century Galileo ran afoul of the Church's geocentrism, and in the 19th century Darwin challenged the biblical account of creation. The breaches that open at such times often close again, as religions determine that the doctrine in question is not an essential part of faith. This is precisely what happened with geocentrism and, outside of certain American fundamentalist Christian sects, evolution.

A new challenge to the science-religion relationship is currently at hand. We hope that, with careful consideration by scientists and theologians, it will not become the latest front in what some have called the "culture war" between science and religion. The challenge comes from neuroscience and concerns our understanding of human nature.

Most religions endorse the idea of a soul (or spirit) that is distinct from the physical body. Yet as neuroscience advances, it increasingly seems that all aspects of a person can be explained by the functioning of a material system. This first became clear in the realms of motor control and perception. Yet, models of perceptual and motor capacities such as color vision and gait do not directly threaten the idea of the soul. You can still believe in what Gilbert Ryle called "the ghost in the machine" and simply conclude that color vision and gait are features of the machine rather than the ghost.

However, as neuroscience begins to reveal the mechanisms underlying personality, love, morality, and spirituality, the idea of a ghost in the machine becomes strained. Brain imaging indicates that all of these traits have physical correlates in brain function. Furthermore, pharmacologic influences on these traits, as well as the effects of localized stimulation or damage, demonstrate that the brain processes in question are not mere correlates but are the physical bases of these central aspects of our personhood. If these aspects of the person are all features of the machine, why have a ghost at all?

By raising questions like this, it seems likely that neuroscience will pose a far more fundamental challenge than evolutionary biology to many religions. Predictably, then, some theologians and even neuroscientists are resisting the implications of modern cognitive and affective neuroscience. "Nonmaterialist neuroscience" has joined "intelligent design" as an alternative interpretation of scientific data. This work is counterproductive, however, in that it ignores what most scholars of the Hebrew and Christian scriptures now understand about biblical views of human nature. These views were physicalist, and body-soul dualism entered Christian thought around a century after Jesus' day.

To be sure, dualism is intuitively compelling. Yet science often requires us to reject otherwise plausible beliefs in the face of evidence to the contrary. A full understanding of why Earth orbits the Sun (as a consequence of the way the solar system was formed) took another century after Galileo's time to develop. It may take even longer to understand why certain material systems give rise to consciousness. In the meantime, just as Galileo's view of Earth in the heavens did not render our world any less precious or beautiful, neither does the physicalism of neuroscience detract from the value or meaning of human life.

Tuesday, March 10, 2009

Saving the world...

A University of Vermont course in the spring of 2008 came up with a magnum opus now published in the Proceedings of the National Academy of Sciences titled "Overcoming systemic roadblocks to sustainability: The evolutionary redesign of worldviews, institutions, and technologies." I now have a mind-numbing headache from reading this ponderous but worthwhile effort, and give you a few clips from their summary of "an integrated set of worldviews, institutions, and technologies to stimulate and seed evolutionary redesign of the current socio-ecological regime to achieve global sustainability":
Redefine Well-Being Metrics.
In any new context, we first have to remember that the goal of an economy is to sustainably improve human well-being and quality of life. Material consumption and GDP are merely means to that end, not ends in themselves. We have to recognize, as both ancient wisdom and new psychological research tell us, that material consumption beyond real need can actually reduce overall well-being.
Ensure the Well-Being of Populations During the Transition.
We must ensure that reductions in economic output and consumption fall on those with the lowest marginal utility of consumption, the wealthy. Presently, the U.S. tax code taxes the third wealthiest man in the world, Warren Buffett, at 17.7%, while his receptionist is taxed at the average rate of 30%....although qualitative development may continue indefinitely... existing levels of physical economic output and consumption are already unsustainable and should be reduced.
Reduce Complexity and Increase Resilience.
Efforts to create new cultural/institutional variants can benefit from the lessons offered by history, particularly cases of successful adaptation...Although environmental factors contribute to decline, equally important are the decisions made during the crises. A society's responses depend on the ability of its political, economic, and social institutions to respond, as well as on its cultural values.
Expand the “Commons Sector.”
Recognizing that we are in a biophysical crisis because of our over-consumption and lack of protection of ecosystem services, we must invest in institutions and the technologies required to reduce the impact of the market economy and to preserve and protect public goods. It is now time to create another major category of institution, the commons sector, which would be responsible for managing existing common assets and for creating new ones. Some assets should be held in common because it is more just; these include resources created by nature or by society as a whole.
Remove Barriers to Improving Knowledge and Technology.
With the invention of television, political advertisements became a critical outlet for candidates to broadcast their message and to sway voters. However, the decentralized nature of the Internet allows citizens to gain knowledge about what is done in their name, just as politicians can find out more about those they claim to represent. As a means of two-way communication, the Internet provides voters the ability to speak out within their government without leaving their homes. For the Internet to transform the idea of electronic democracy, universal access is critical. Currently technological, financial, and social barriers exist to such universal accessibility. Removal of these barriers thus becomes a major goal for replacement of the current plutocracy with real democracy.

Persistent effect of early abuse or deprivation on immune function in humans.

The Wisconsin group that has studied various aspects of early abuse and deprivation in children offers evidence that immune function remains compromised in these children years later, even if they were adopted into nurturing families and more benevolent settings.
It is well known that children need solicitous parenting and a nurturing rearing environment to ensure their normal behavioral development. Early adversity often negatively impacts emotional and mental well-being, but it is less clearly established how much the maturation and regulation of physiological systems is also compromised. The following research investigated the effect of 2 different types of adverse childhood experiences, early deprivation through institutionalization and physical abuse, on a previously unexplored outcome: the containment of herpes simplex virus (HSV). The presence of HSV-specific antibody in salivary specimens was determined in 155 adolescents, including 41 postinstitutionalized, 34 physically-abused, and 80 demographically-similar control youth. Across 4 school and home days, HSV antibody was higher in both postinstitutionalized and physically-abused adolescents when compared with control participants. Because the prevalence of HSV infection was similar across the groups, the elevated antibody was likely indicative of viral recrudescence from latency. Total secretory Ig-A secretion was associated with HSV, but did not account for the group differences in HSV-specific antibody. These findings are likely caused by a failure of cellular immune processes to limit viral reactivation, indicating a persistent effect of early rearing on immune functioning. The fact that antibody profiles were still altered years after adoption into a more benevolent setting with supportive families suggests these results were not caused by contemporaneous factors, but rather reflect a lingering influence of earlier life experiences.

Monday, March 09, 2009

From oral to moral

An interesting synthesis from Chapman et al.
In common parlance, moral transgressions "leave a bad taste in the mouth." This metaphor implies a link between moral disgust and more primitive forms of disgust related to toxicity and disease, yet convincing evidence for this relationship is still lacking. We tested directly the primitive oral origins of moral disgust by searching for similarity in the facial motor activity evoked by gustatory distaste (elicited by unpleasant tastes), basic disgust (elicited by photographs of contaminants), and moral disgust (elicited by unfair treatment in an economic game). We found that all three states evoked activation of the levator labii muscle region of the face, characteristic of an oral-nasal rejection response. These results suggest that immorality elicits the same disgust as disease vectors and bad tastes.
A summary graphic from the review of this work by Rozin et al. and some of their comments:


Domains of disgust. The schematic represents routes by which eliciting situations may trigger the disgust output program. Those that run through the disgust evaluation system--which includes appraisal of the elicitor, feelings, and contamination ideation--trigger the full disgust emotion. Solid lines represent routes through which an elicitor can activate the disgust evaluation-output program. Dashed lines (green) represent direct elicitation of the disgust output program. The dotted line (brown) represents a metaphoric, indirect route.

According to the principle of preadaptation, a system that evolves for one purpose is later used for another purpose. From this viewpoint, disgust originates in the mammalian bitter taste rejection system, which directly activates a disgust output system. This primal route (e.g., bitter and some other tastes) evokes only the output program, without a disgust evaluation phase. During human evolution, the disgust output system was harnessed to a disgust evaluation system that responded not to simple sensory inputs (such as bitter tastes) but to more cognitively elaborated appraisals (e.g., a cockroach). Initially, the evaluation system was a food rejection system that rejected potential foods on the basis of their nature or perceived origin. This was the first "true disgust," because it engaged this evaluation system. Later, through some combination of biological and cultural evolution, the eliciting category was enlarged to include reminders of our animal nature, as well as some people or social groups. This process had adaptive value, because by making things or thoughts disgusting a culture could communicate their negativity and cause withdrawal from them.

Brain activity started by music you think you are going to hear.

Here is an interesting piece of work from Leaver et al.:
Music consists of sound sequences that require integration over time. As we become familiar with music, associations between notes, melodies, and entire symphonic movements become stronger and more complex. These associations can become so tight that, for example, hearing the end of one album track can elicit a robust image of the upcoming track while anticipating it in total silence. Here, we study this predictive "anticipatory imagery" at various stages throughout learning and investigate activity changes in corresponding neural structures using functional magnetic resonance imaging. Anticipatory imagery (in silence) for highly familiar naturalistic music was accompanied by pronounced activity in rostral prefrontal cortex (PFC) and premotor areas. Examining changes in the neural bases of anticipatory imagery during two stages of learning conditional associations between simple melodies, however, demonstrates the importance of fronto-striatal connections, consistent with a role of the basal ganglia in "training" frontal cortex. Another striking change in neural resources during learning was a shift between caudal PFC earlier to rostral PFC later in learning. Our findings regarding musical anticipation and sound sequence learning are highly compatible with studies of motor sequence learning, suggesting common predictive mechanisms in both domains.

Friday, March 06, 2009

A mother's experience can alter her offspring's memory performance.

Here are some fascinating experiments, done in mice to be sure (but likely to be shown for humans soon, as with so many other mouse models). It is known that exposure to an enriched environment enhances learning and memory in mice [which is reflected by an enhancement of nerve-nerve signaling in the hippocampus termed long term potentiation (LTP)]. This new study shows that these effects can be transmitted to the next generation; the authors observed that LTP was enhanced in the offspring of enriched mothers. Moreover, the characteristic defects in LTP and contextual fear conditioning of ras-grf-knockout mice were masked in the offspring of knockout mice exposed to an enriched environment. These data raise the intriguing possibility that a mother's experience can induce epigenetic changes that influence her offspring's memory performance (see Tuesday's post for another maternal effect and for information on epigenetic effects). If a similar phenomenon occurs in humans, the effectiveness of one's memory during adolescence, particularly in those with defective cell signaling mechanisms that control memory, could be influenced by environmental stimulation experienced by one's mother during her youth. A portion of the abstract:
The idea that qualities acquired from experience can be transmitted to future offspring has long been considered incompatible with current understanding of genetics. However, the recent documentation of non-Mendelian transgenerational inheritance makes such a "Lamarckian"-like phenomenon more plausible. Here, we demonstrate that exposure of 15-d-old mice to 2 weeks of an enriched environment (EE), that includes exposure to novel objects, elevated social interactions and voluntary exercise, enhances long-term potentiation (LTP) not only in these enriched mice but also in their future offspring through early adolescence, even if the offspring never experience EE. In both generations, LTP induction is augmented by a newly appearing cAMP/p38 MAP kinase-dependent signaling cascade. Strikingly, defective LTP and contextual fear conditioning memory normally associated with ras-grf knock-out mice are both masked in the offspring of enriched mutant parents. The transgenerational transmission of this effect occurs from the enriched mother to her offspring during embryogenesis.

Why pay university tuition?

...when you can get an array of astounding courses from places like Academic Earth, with the videos of the lectures shown in your web browser. I recommend the introductory Psychology course offered by Paul Bloom at Yale.

Thursday, March 05, 2009

Our genes influence our social networks

Jackson reviews an analysis by Fowler et al. that suggests that genetic traits influence the social behavior of individuals:
...Fowler et al. examined the social network characteristics of 1,110 twins from the Adolescent Health dataset, which is based on interviews of high school students. Presuming that the social environment twins share is not influenced by whether they are monozygotic or dizygotic, if network characteristics are significantly more correlated among monozygotic twins than among dizygotic twins, then there is evidence for a genetic role in network formation.

The network characteristics that Fowler et al. investigate are: in-degree (how many students name a given student as a friend), out-degree (how many students a given student names as friends), transitivity (if A and B are friends, and B and C are friends, what is the likelihood that A and C are friends), and betweenness centrality (the fraction of shortest paths between other pairs of students that a given student lies on). Their statistical analysis assumes that the variation in a network characteristic can be additively separated into a component that is genetic, a component caused by the environment that would be shared with a twin, and a component caused by the environment that would not be shared with a twin. The covariance between monozygotic twins is then the variance caused by the common environment plus the variance caused by genetic factors, whereas the covariance between dizygotic twins is the variance caused by the common environment plus half of the variance caused by genetic factors. This formulation allows one to solve for the percentage of variation in a given network characteristic that is caused by each of the genetic, common environment, and unshared environment components, as sketched below. The figure shows that almost half of the variation in transitivity and in-degree is genetically attributable, and more than a quarter of betweenness centrality is genetically attributable, but the genetic component of the out-degree variation is too small to be statistically significant. The common environment is statistically insignificant in all cases.
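The twin-covariance algebra just described can be solved in a couple of lines. Here is a minimal Python sketch of the standard decomposition; the two correlations are invented for illustration, not taken from Fowler et al.:

```python
# Solve the twin equations described above:
#   r_mz = A + C        (monozygotic twins share all their genes)
#   r_dz = A/2 + C      (dizygotic twins share half, on average)
# A = genetic share, C = shared environment, E = unshared environment.

def ace_decomposition(r_mz: float, r_dz: float) -> tuple[float, float, float]:
    a = 2.0 * (r_mz - r_dz)   # genetic component
    c = 2.0 * r_dz - r_mz     # common (shared) environment
    e = 1.0 - r_mz            # everything twins do not share
    return a, c, e

# Hypothetical twin correlations for some network trait:
a, c, e = ace_decomposition(r_mz=0.55, r_dz=0.32)
print(f"A = {a:.2f}, C = {c:.2f}, E = {e:.2f}")  # A = 0.46, C = 0.09, E = 0.45
```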


Fowler et al. fit a number of network models to the data and found that the only one generating a relationship between genetics and transitivity was an “Attract and Introduce” model built on two assumptions. First, some individuals are inherently more attractive than others, whether physically or otherwise, so they receive more friendship nominations. Second, some individuals are inherently more inclined to introduce new friends to existing friends (and hence such individuals will indirectly enhance their own transitivity).
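Here is a toy Python version of those two assumptions; every parameter (population size, number of nominations, fraction of introducers) is invented purely for illustration, not taken from their model fits:

```python
import random

# Toy "Attract and Introduce" sketch: attractive individuals draw more
# friendship nominations (boosting in-degree), and "introducers" connect
# pairs of their own friends (boosting transitivity around themselves).

random.seed(0)
N = 100
attractiveness = [random.random() for _ in range(N)]
is_introducer = [random.random() < 0.3 for _ in range(N)]
friends = {i: set() for i in range(N)}

# Each person nominates five friends, preferring attractive targets.
for i in range(N):
    for j in random.choices(range(N), weights=attractiveness, k=5):
        if j != i:
            friends[i].add(j)
            friends[j].add(i)

# Introducers close a triangle among their existing friends.
for i in range(N):
    if is_introducer[i] and len(friends[i]) >= 2:
        a, b = random.sample(sorted(friends[i]), 2)
        friends[a].add(b)
        friends[b].add(a)

most = max(range(N), key=lambda i: attractiveness[i])
print("degree of most attractive individual:", len(friends[most]))
```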

Brain correlates of musical improvisation.

Berkowitz and Ansari report an fMRI study of the brains of trained pianists while they are improvising. To get control conditions for comparisons, they designed a series of four activities. In the two general types of task, they had subjects either improvise melodies or play pre-learned patterns. Comparing brain activity in these two situations allowed them to focus on melodic improvisation. Subjects did each of these two general tasks either with or without a metronome. When there was no metronome marking time, subjects improvised their own rhythms. Comparing conditions with and without the metronome allowed them to look at rhythmic improvisation. A key point is that when the subjects played patterns (instead of improvised melodies), they could choose to play them in any order. Thus there was still some spontaneity in decision making, but the choices were more limited than during improvisation.

The authors observed an overlap between melodic improvisation and rhythmic improvisation in three areas of the brain: the dorsal premotor cortex (dPMC), the anterior cingulate (ACC), and the inferior frontal gyrus/ventral premotor cortex (IFG/vPMC). From a summary of the work by Bannatyne:
“The dPMC takes information about where the body is in space, makes a motor plan, and sends it to the motor cortex to execute the plan. The fact [that] it’s involved in improvisation is not surprising, since it is a motor activity. The ACC is a part of the brain that appears to be involved in conflict monitoring — when you’re trying to sort out two conflicting possibilities, like when you try to read the word BLUE when it’s printed in the color red. It’s involved with decision making, which also makes sense — improvisation is decision making, deciding what to play and how to play it.” The IFG/vPMC is perhaps one of the most interesting findings of their study. “This area is known to be involved when people speak and understand language. It’s also active when people hear and understand music. What we’ve shown is that it’s involved when people create music.”

Improvising, from a neurobiological perspective, involves generating, selecting, and executing musical-motor sequences, something that wouldn’t surprise musicians. But in terms of brain research, it’s a new piece of information.

Wednesday, March 04, 2009

Erasing fear responses and preventing the return of fear.

Kindt et al. demonstrate an interesting effect of a beta-blocker that might well become part of clinical practice before long. They found that a conditioned fear response can be weakened by disrupting the reconsolidation of the fear memory with propranolol and that this disruption prevents the return of fear. While propranolol disrupts the reconsolidation of the fear memory, it does not disrupt declarative memory (recall of the facts of the fear-inducing event). The abstract:
Animal studies have shown that fear memories can change when recalled, a process referred to as reconsolidation. We found that oral administration of the beta-adrenergic receptor antagonist propranolol before memory reactivation in humans erased the behavioral expression of the fear memory 24 h later and prevented the return of fear. Disrupting the reconsolidation of fear memory opens up new avenues for providing a long-term cure for patients with emotional disorders.
Some details:
The conditioned fear response was measured as potentiation of the eyeblink startle reflex to a loud noise (40 ms, 104 dB) by electromyography of the right orbicularis oculi muscle. Stronger startle responses to the loud noise during the fear-conditioned stimulus (CS1+) as compared with the control stimulus (CS2-) reflect the fearful state of the participant elicited by CS1+. Startle potentiation taps directly into the amygdala, and fear-conditioning procedures yield highly reliable and robust startle potentiation.


Figure. (a–f) Mean startle potentiation to the fear-conditioned stimulus (CS1), the control stimulus (CS2) and noise alone (NA) trials (left) and mean expectancy scores of the unconditioned stimulus to CS1 and CS2 trials (right) during acquisition (trials 1–8), extinction (trials 1–10) and test (trials 1–5) for the placebo (n = 20, a,b), propranolol reactivation (n = 20, c,d) and propranolol without reactivation (n = 20, e,f) groups. CS1+ refers to the fear-conditioned stimulus during acquisition, CS1- refers to the fear-conditioned stimulus during extinction and test, CS1-R refers to the reactivation of the fear-conditioned stimulus and CS2- refers to the control stimulus during all phases of the experiment. Error bars represent s.e.m.

Transcendence from Neuroscience

Clip from a brief essay by Garreau:
....the new vision of transcendence coming out of neuroscience. It’s long been observed that intelligent organisms require love to develop or even just to survive. Not coincidentally, we can readily identify brain functions that allow and require us to be deeply relational with others. There are also aspects of the brain that can be shown to equip us to experience elevated moments when we transcend boundaries of self. What happens as the implications of all this research start suggesting that particular religions are just cultural artifacts built on top of universal human physical traits?

Genetic determinants of financial risk taking

Here is an interesting bit from Kuhnen and Chiao, although I'm surprised that the reviewers let them get away with using the word 'determinants' rather than 'correlates':
Individuals vary in their willingness to take financial risks. Here we show that variants of two genes that regulate dopamine and serotonin neurotransmission and have been previously linked to emotional behavior, anxiety and addiction (5-HTTLPR and DRD4) are significant determinants of risk taking in investment decisions. We find that the 5-HTTLPR s/s allele carriers take 28% less risk than those carrying the s/l or l/l alleles of the gene. DRD4 7-repeat allele carriers take 25% more risk than individuals without the 7-repeat allele. These findings contribute to the emerging literature on the genetic determinants of economic behavior.

Tuesday, March 03, 2009

The gourmet palate - an exercise in hedonistic psychology

John Bohannon does a humorous piece in the Feb. 20 issue of Science:
What did you do on New Year's Eve? I watched my friends eat dog food. Throughout the last night of 2008, I stood in a makeshift laboratory in the corner of a packed Brooklyn house party. I presented people with bowls of pâté--labeled A through E--and a pile of crackers. I explained that four of the bowls contained human food, including expensive luxury pâtés. One was canned dog food that had been pulsed in a food processor, giving it the same consistency as that of pâté. My open-minded friends looked thoughtfully into the middle distance as they munched on mouthfuls of each, jotted down their assessment on data sheets, and then drifted back into the party. As the data rolled in, my eyes grew wide with amazement. Nobody was guessing correctly which was the dog food.

...The five samples covered a wide price range: two expensive liver pâtés (duck and chicken), two cheap imitation pâtés (puréed liverwurst and Spam), and the ultimate bargain (dog food). My subjects were hopeless at guessing which pâté was dog food. But the answer was literally on the tip of their tongues. Although only one in six people correctly guessed that dish C contained the dog food, almost 75% rated it last in terms of taste. People significantly loathed the dog food (Newell and MacFarlane multiple comparison, p less than 0.1), and that did not correlate with relative sobriety. To cap it off, the average taste rankings of the five spreads exactly matched their relative prices.
Perhaps this result is not surprising, given that numerous blind taste tests involving hundreds of people have shown no correlation between the price of wines costing from $1.50 to $150 and their reported taste.

Thought for the day - the Twitter Bubble

I am incredulous that so many people seem to want to share the ongoing details of their life via Twitter and Facebook. Do I really care to know that friend X is about to brush his teeth and go to bed? Alessandra Stanley writes a humorous piece on this phenomenon. Some clips:
Left alone in a cage with a mountain of cocaine, a lab rat will gorge itself to death. Caught up in a housing bubble, bankers will keep selling mortgage-backed securities — and amassing bonuses — until credit markets seize, companies collapse, and millions of investors lose their jobs and homes....And news anchors and television personalities who have their own shows, Web sites, blogs and pages on Facebook.com and MySpace.com will send Twitter messages until the last follower falls into a coma.

At the height of the subprime folly, there was not enough outside regulation or inner compunction to restrain heedless excess. It’s too late for traders, but that economic mess should be a lesson for those who traffic in information. Like bankers who never feel they’ve earned enough, television anchors and correspondents apparently never feel that they have communicated enough....It’s not just television, of course. Ordinary people, bloggers and even columnists and book authors, who all already have platforms for their views, feel compelled to share their split-second aperçus, no matter how mundane.

Those who say Twitter is a harmless pastime, which skeptics are free to ignore, are ignoring the corrosive secondary effects. We already live in an era of me-first journalism, autobiographical blogs and first-person reportage. Even daytime cable news is clotted with Lou Dobbsian anchors who ooze self-regard and intemperate opinion...On-air meltdowns are the new scoops. The CNBC correspondent Rick Santelli, a former trader, delivered a rant last week on the floor of the Chicago Mercantile Exchange about the Obama administration’s mortgage bailout proposal.

Mr. Santelli, it should be noted, has not lost all restraint: he does not yet have his own Twitter account. Fans created one for him, in case he changes his mind. “Just to let everyone know,” one follower explained. “This is NOT Rick’s account, but it is a place holder for him as soon as WE can convince him to join Twitter. :)”

And that space has, as of 4:20 on Friday afternoon, 158 followers. Twitterers who maintain that their messages must have meaning since they have an audience should keep Mr. Santelli’s void in mind. There are always some people who, given the chance, will respond to anything, even nothing.

How early abuse in humans changes the adult brain.

Studies in rat models have shown that affectionate mothering alters gene expression to dampen physiological responses to stress, while early abuse has the opposite effect. Now these basic results have been extended to humans by McGowan et al., who carried out a study of people who had committed suicide. They found that people who were abused or neglected as children showed epigenetic alterations that likely made them more biologically sensitive to stress. Epigenetic regulation of the glucocorticoid receptor gene NR3C1 was observed in humans who had been abused as children, consistent with predictions derived from a rodent model in which early postnatal experience influences adult responses to stress. (Decreases in the expression of this receptor increase reactivity to stress.) I pass on their abstract, and here is a nice explanation of what epigenetic changes are (see also the review by Benedict Carey).
Maternal care influences hypothalamic-pituitary-adrenal (HPA) function in the rat through epigenetic programming of glucocorticoid receptor expression. In humans, childhood abuse alters HPA stress responses and increases the risk of suicide. We examined epigenetic differences in a neuron-specific glucocorticoid receptor (NR3C1) promoter between postmortem hippocampus obtained from suicide victims with a history of childhood abuse and those from either suicide victims with no childhood abuse or controls. We found decreased levels of glucocorticoid receptor mRNA, as well as mRNA transcripts bearing the glucocorticoid receptor 1F splice variant and increased cytosine methylation of an NR3C1 promoter. Patch-methylated NR3C1 promoter constructs that mimicked the methylation state in samples from abused suicide victims showed decreased NGFI-A transcription factor binding and NGFI-A–inducible gene transcription. These findings translate previous results from rat to humans and suggest a common effect of parental care on the epigenetic regulation of hippocampal glucocorticoid receptor expression.

Monday, March 02, 2009

For a tranquil start to your week, Debussy with flowers

I got an email from the fellow who made this video asking if he could use my YouTube videorecording of the Debussy Reverie. I said 'sure, go ahead'.... I'm not too keen on the electronic 'enhancements' he added to my basic piano track to make the first half of the video, but here it is...

Biased minds make better inferences.

Here is an interesting open access article, "Homo Heuristicus: Why Biased Minds Make Better Inferences," from the first issue of a new journal from Wiley Interscience, Topics in Cognitive Science. (Check out this free online first issue; there are a number of other fascinating articles.) It makes the point that a biased mind can handle uncertainty more efficiently and robustly than an unbiased mind relying on more resource-intensive and general-purpose processing strategies. Its abstract:
Heuristics are efficient cognitive processes that ignore information. In contrast to the widely held view that less processing reduces accuracy, the study of heuristics shows that less information, computation, and time can in fact improve accuracy. We review the major progress made so far: (a) the discovery of less-is-more effects; (b) the study of the ecological rationality of heuristics, which examines in which environments a given strategy succeeds or fails, and why; (c) an advancement from vague labels to computational models of heuristics; (d) the development of a systematic theory of heuristics that identifies their building blocks and the evolved capacities they exploit, and views the cognitive system as relying on an "adaptive toolbox;" and (e) the development of an empirical methodology that accounts for individual differences, conducts competitive tests, and has provided evidence for people's adaptive use of heuristics. Homo heuristicus has a biased mind and ignores part of the available information, yet a biased mind can handle uncertainty more efficiently and robustly than an unbiased mind relying on more resource-intensive and general-purpose processing strategies.
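As a concrete example, here is a minimal Python sketch of take-the-best, one of the best-studied heuristics in this literature; the cities and binary cues below are hypothetical:

```python
# Take-the-best ignores most information: check cues in order of validity
# and decide on the first cue that discriminates between the options.

def take_the_best(a: dict, b: dict, cues_by_validity: list[str]) -> str:
    for cue in cues_by_validity:      # most valid cue first
        if a[cue] != b[cue]:          # first discriminating cue decides
            return "a" if a[cue] else "b"
    return "guess"                    # nothing discriminates: flip a coin

# Which city is larger? (hypothetical binary cues)
city_a = {"has_team": 1, "is_capital": 0, "has_airport": 1}
city_b = {"has_team": 1, "is_capital": 1, "has_airport": 0}
print(take_the_best(city_a, city_b, ["has_team", "is_capital", "has_airport"]))  # "b"
```

In the studies the article reviews, this kind of one-reason decision making can match or beat information-greedy strategies on noisy prediction problems; that is the "less-is-more" effect the abstract mentions.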

A common brain substrate for evaluating physical and social space.

From Yamakawa et al., work that is consonant with models of embodied cognition (cf. George Lakoff and Mark Johnson):
Across cultures, social relationships are often thought of, described, and acted out in terms of physical space (e.g. “close friends” “high lord”). Does this cognitive mapping of social concepts arise from shared brain resources for processing social and physical relationships? Using fMRI, we found that the tasks of evaluating social compatibility and of evaluating physical distances engage a common brain substrate in the parietal cortex. The present study shows the possibility of an analytic brain mechanism to process and represent complex networks of social relationships. Given parietal cortex's known role in constructing egocentric maps of physical space, our present findings may help to explain the linguistic, psychological and behavioural links between social and physical space.

Friday, February 27, 2009

Gesture and language acquisition

Gestures precede speech development and, after speech develops, continue to enrich the communication process. Comparing how young children and their parents used gesture in their communications with analyses of socioeconomic status and of the child's vocabulary at age 54 months, Rowe and Goldin-Meadow find disparities in gesture use that precede vocabulary disparities (children from lower socioeconomic brackets tend to have smaller vocabularies than children from higher brackets). Their abstract:
Children from low–socioeconomic status (SES) families, on average, arrive at school with smaller vocabularies than children from high-SES families. In an effort to identify precursors to, and possible remedies for, this inequality, we videotaped 50 children from families with a range of different SES interacting with parents at 14 months and assessed their vocabulary skills at 54 months. We found that children from high-SES families frequently used gesture to communicate at 14 months, a relation that was explained by parent gesture use (with speech controlled). In turn, the fact that children from high-SES families have large vocabularies at 54 months was explained by children's gesture use at 14 months. Thus, differences in early gesture help to explain the disparities in vocabulary that children bring with them to school.

Followup on genes and language

I wanted to pass on some summary clips from a review by Berwick of the Chater et al. paper featured in a Feb. 12 post ("Language evolved to fit the human brain..."):
Is language more like fashion hemlines or more like the number of fingers on each hand? On the one hand, we know that all normal people, unlike any cats or fish, uniformly grow up speaking some language, just like having 5 fingers on each hand, so language must be part of what is unique to the human genome. However, if one is born in Beijing one winds up speaking a very different language than if one is born in Mumbai, so the number-of-fingers analogy is not quite correct.
The Chater et al. article:
...maintains that the linguistic particulars distinguishing Mandarin from Hindi cannot have arisen as genetically encoded and selected-for adaptations via at least one common route linking evolution and learning, the Baldwin–Simpson effect.

In the Baldwin–Simpson model, rather than direct selection for a trait, in this case a particular external behavior, there is selection for learning it. However, as is well known, this entrainment linking learning to genomic encoding works only if there is a close match between the pace of external change and genetic change, even though gene frequencies change only relatively slowly, plodding generation by generation. Applied to language evolution, the basic idea of Chater et al. is to use computer simulations to show that in general the linguistic regularities learners must acquire, such as whether sentences get packaged into verb–object order, e.g., eat apples, as in Mandarin, or object-verb order, e.g., apples eat, as in Hindi, can fluctuate too rapidly across generations to be captured and then encoded by the human genome as some kind of specialized “language instinct.” This finding runs counter to one popular view that these properties of human language were explicitly selected for, instead pointing to human language as largely adventitious, an exaptation, with many, perhaps most, details driven by culture. If this finding is correct, then the portion of the human genome devoted to language alone becomes correspondingly greatly reduced. There is no need, and more critically no informational space, for the genome to blueprint some intricate set of highly-modular, interrelated components for language, just as the genome does not spell out the precise neuron-to-neuron wiring of the developing brain.
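A toy simulation makes the timescale argument concrete; all numbers here are invented for illustration, and this is only a cartoon of the simulations Chater et al. actually ran:

```python
import random

# Cartoon of the Baldwin-Simpson timescale problem: a cultural convention
# (say, verb-object vs. object-verb order) flips every few generations,
# while an allele favoring the current convention shifts frequency slowly.
# The allele chases a moving target and never approaches fixation.

random.seed(1)
convention = 0          # current word order in the speech community
allele_freq = 0.5       # frequency of the allele matching convention 0
FLIP = 0.2              # chance per generation that the convention flips
SELECTION = 0.02        # per-generation pull of selection on the allele

for generation in range(500):
    if random.random() < FLIP:
        convention = 1 - convention                    # culture moves fast
    target = 1.0 if convention == 0 else 0.0
    allele_freq += SELECTION * (target - allele_freq)  # genes move slowly

print(f"allele frequency after 500 generations: {allele_freq:.2f}")
```

Lower FLIP far enough (a convention stable for hundreds of generations) and the allele does fix, which is why the argument turns on the relative pace of cultural versus genetic change.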
Matters boil down to recursion, which I have mentioned in several previous posts.
Chater et al.'s report also points to a rare convergence between the results from 2 quite different fields and methodologies that have often been at odds: the simulation-based, culturally-oriented approach of the PNAS study and a recent, still controversial trend in one strand of modern theoretical linguistics. Both arrive at the same conclusion: a minimal human genome for language. The purely linguistic effort strips away all of the special properties of language, down to the bare-bones necessities distinguishing us from all other species, relegating such previously linguistic matters such as verb–object order vs. object–verb order to extralinguistic factors, such as a general nonhuman cognitive ability to process ordered sequences aligned like beads on a string. What remains? If this recent linguistic program is on the right track, there is in effect just one component left particular to human language, a special combinatorial competence: the ability to take individual items like 2 words, the and apple, and then “glue” them together, outputting a larger, structured whole, the apple, that itself can be manipulated as if it were a single object. This operation runs beyond mere concatenation, because the new object itself still has 2 parts, like water compounded from hydrogen and oxygen, along with the ability to participate in further chemical combinations. Thus this combinatorial operation can apply over and over again to its own output, recursively, yielding an infinity of ever more structurally complicated objects, ate the apple, John ate the apple, Mary knows John ate the apple, a property we immediately recognize as the hallmark of human language, an infinity of possible meaningful signs integrated with the human conceptual system, the algebraic closure of a recursive operator over our dictionary.
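The combinatorial operation itself fits in a few lines; the tuple representation below is just one convenient stand-in for "a structured whole that can be manipulated as a single object":

```python
# "Merge" in miniature: glue two items into a new object that can itself
# be merged again, recursively, yielding unboundedly nested structures.

def merge(x, y):
    return (x, y)   # one new object, with its two parts still inside

np = merge("the", "apple")               # ('the', 'apple')
vp = merge("ate", np)                    # ('ate', ('the', 'apple'))
s = merge("John", vp)                    # ('John', ('ate', ('the', 'apple')))
print(merge("Mary", merge("knows", s)))  # nesting can continue without limit
```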

This open-ended quality is quite unlike the frozen 10- to 20-word vocalization repertoire that marks the maximum for any other animal species. If it is simply this combinatorial promiscuity that lies at the heart of human language, making “infinite use of finite means,” then Chater et al.'s claim that human language is an exaptation rather than a selected-for adaptation becomes not only much more likely but very nearly inescapable.

Thursday, February 26, 2009

Envy and Schadenfreude in the brain.

Takahashi et al. show that experiencing envy at another person's success activates pain-related neural circuitry, whereas experiencing schadenfreude--delight at someone else's misfortune--activates reward-related neural circuitry. A graphic from the perspectives article by Lieberman and Eisenberger:


The pain and pleasure systems. The pain network consists of the dorsal anterior cingulate cortex (dACC), insula (Ins), somatosensory cortex (SSC), thalamus (Thal), and periaqueductal gray (PAG). This network is implicated in physical and social pain processes. The reward or pleasure network consists of the ventral tegmental area (VTA), ventral striatum (VS), ventromedial prefrontal cortex (VMPFC), and the amygdala (Amyg). This network is implicated in physical and social rewards.

Fetal testosterone predicts male-typical play.

In a study of 212 children (112 male, 100 female), Auyeung et al. have found a significant relationship between fetal testosterone and sexually differentiated play behavior in both boys and girls.
Mammals, including humans, show sex differences in juvenile play behavior. In rodents and nonhuman primates, these behavioral sex differences result, in part, from sex differences in androgens during early development. Girls exposed to high levels of androgen prenatally, because of the genetic disorder congenital adrenal hyperplasia, show increased male-typical play, suggesting similar hormonal influences on human development, at least in females. Here, we report that fetal testosterone measured from amniotic fluid relates positively to male-typical scores on a standardized questionnaire measure of sex-typical play in both boys and girls. These results show, for the first time, a link between fetal testosterone and the development of sex-typical play in children from the general population, and are the first data linking high levels of prenatal testosterone to increased male-typical play behavior in boys.

Wednesday, February 25, 2009

Monoamine oxidase A gene predicts aggression following provocation

From McDermott et al. :
Monoamine oxidase A gene (MAOA) has earned the nickname “warrior gene” because it has been linked to aggression in observational and survey-based studies. However, no controlled experimental studies have tested whether the warrior gene actually drives behavioral manifestations of these tendencies. We report an experiment, synthesizing work in psychology and behavioral economics, which demonstrates that aggression occurs with greater intensity and frequency as provocation is experimentally manipulated upwards, especially among low activity MAOA (MAOA-L) subjects. In this study, subjects paid to punish those they believed had taken money from them by administering varying amounts of unpleasantly hot (spicy) sauce to their opponent. There is some evidence of a main effect for genotype and some evidence for a gene by environment interaction, such that MAOA is less associated with the occurrence of aggression in a low provocation condition, but significantly predicts such behavior in a high provocation situation. This new evidence for genetic influences on aggression and punishment behavior complicates characterizations of humans as “altruistic” punishers and supports theories of cooperation that propose mixed strategies in the population. It also suggests important implications for the role of individual variance in genetic factors contributing to everyday behaviors and decisions.

Musical training enhances linguistic abilities in children

An interesting report from Moreno et al. in the journal Cerebral Cortex. They:
...conducted a longitudinal study with 32 nonmusician children over 9 months to determine 1) whether functional differences between musician and nonmusician children reflect specific predispositions for music or result from musical training and 2) whether musical training improves nonmusical brain functions such as reading and linguistic pitch processing. Event-related brain potentials were recorded while 8-year-old children performed tasks designed to test the hypothesis that musical training improves pitch processing not only in music but also in speech. Following the first testing sessions nonmusician children were pseudorandomly assigned to music or to painting training for 6 months and were tested again after training using the same tests. After musical (but not painting) training, children showed enhanced reading and pitch discrimination abilities in speech. Remarkably, 6 months of musical training thus suffices to significantly improve behavior and to influence the development of neural processes as reflected in specific pattern of brain waves. These results reveal positive transfer from music to speech and highlight the influence of musical training. Finally, they demonstrate brain plasticity in showing that relatively short periods of training have strong consequences on the functional organization of the children's brain.

Tuesday, February 24, 2009

Training your working memory increases your cortical Dopamine D1 receptors

McNab et al. demonstrate training-induced brain changes that indicate an unexpectedly high level of plasticity in our cortical dopamine D1 system and illustrate the mutual interdependence of our behavior and the underlying brain biochemistry. The training included a visuo-spatial working memory task, a backwards digit span task, and a letter span task. These are similar to the n-back tests that I have mentioned in previous posts. The authors had previously shown increased prefrontal and parietal activity after training of working memory. Their abstract:
Working memory is a key function for human cognition, dependent on adequate dopamine neurotransmission. Here we show that the training of working memory, which improves working memory capacity, is associated with changes in the density of cortical dopamine D1 receptors. Fourteen hours of training over 5 weeks was associated with changes in both prefrontal and parietal D1 binding potential. This plasticity of the dopamine D1 receptor system demonstrates a reciprocal interplay between mental activity and brain biochemistry in vivo.
A clip from their methods description:
Participants performed working memory (WM) tasks with a difficulty level close to their individual capacity limit for about 35 min per day over a period of 5 weeks (8–10). Thirteen volunteers (healthy males 20 to 28 years old) performed the 5-week WM training. Five computer-based WM tests (three visuospatial and two verbal) were used to measure each participant's WM capacity before and after training, and they showed a significant improvement of overall WM capacity (paired t test, t = 11.1, P less than 0.001). The binding potential (BP) of D1 and D2 receptors was measured with positron emission tomography (PET) while the participants were resting, before and after training, using the radioligands [11C]SCH23390 and [11C]Raclopride, respectively.
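As an aside for readers who want to see what that statistic amounts to in practice, here is a minimal sketch of a paired t test in Python. The before/after scores are invented placeholders; only the design (13 subjects, overall WM capacity measured before and after training) comes from the description above.

```python
# Minimal sketch of the paired t test described above.
# The scores are invented placeholders, NOT the study's data.
from scipy import stats

# Hypothetical overall WM capacity scores for 13 participants,
# measured before and after the 5-week training.
before = [42, 38, 45, 40, 37, 44, 41, 39, 43, 40, 38, 46, 42]
after  = [51, 47, 52, 49, 45, 55, 50, 48, 53, 49, 46, 56, 51]

t_stat, p_value = stats.ttest_rel(after, before)
print(f"paired t = {t_stat:.1f}, P = {p_value:.2g}")
```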

Malthusian information famine

A view of our information future from Charles Seife:
...Vast amounts of digital memory will change the relationship that humans have with information....For the first time, we as a species have the ability to remember everything that ever happens to us. For millennia, we were starving for information to act as raw material for ideas. Now, we are about to have a surfeit.

Alas, there will be famine in the midst of all that plenty. There are some hundred million blogs, and the number is roughly doubling every year. The vast majority are unreadable. Several hundred billion e-mail messages are sent every day; most of it—current estimates run around 70%—is spam. There seems to be a Malthusian principle at work: information grows exponentially, but useful information grows only linearly. Noise will drown out signal. The moment that we, as a species, finally have the memory to store our every thought, etch our every experience into a digital medium, it will be hard to avoid slipping into a Borgesian nightmare where we are engulfed by our own mental refuse.
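Seife's arithmetic is easy to see in a toy simulation. Here is a minimal sketch, with arbitrary starting values and growth rates of my own choosing, of what happens to the useful fraction when total information doubles yearly while useful information grows by a fixed yearly increment:

```python
# Toy illustration of Seife's principle; all values are arbitrary units.
total, useful = 100.0, 50.0
for year in range(1, 11):
    total *= 2        # total information doubles every year (exponential)
    useful += 50.0    # useful information grows by a constant amount (linear)
    print(f"year {year:2d}: useful fraction = {useful / total:.5f}")
# After a decade the useful fraction has fallen from 1/2 to well under 1%.
```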

Monday, February 23, 2009

Some Chopin for a Monday morning.

This is Chopin's Nocturne Op. 9 No. 1, which I recorded last May. I miss my Steinway B grand piano back in Wisconsin during my current snowbird period in Fort Lauderdale, Florida. I will probably do a burst of pent-up recordings when I get back in April.

How we decide how big a reward is...

Furlong and Opfer do a nice set of experiments showing that we can be lured into making decisions by numbers that seem bigger than they really are. We apparently go with numerical values rather than real economic values. They asked volunteers to take part in the prisoner’s dilemma behavioral test, in which two partners are offered various rewards to either work together or defect. The idea is that in the long term, the participants earn the most money by cooperating. But in any given round of play, they make the most if they decide to turn against their partner while he stays loyal. (The reward is lowest when both partners defect.) When the reward for cooperation was increased to 300 cents from 3 cents, the researchers found, the level of cooperation went up. But when the reward went from 3 cents to $3, it did not. Here is their abstract:
Cooperation often fails to spread in proportion to its potential benefits. This phenomenon is captured by prisoner's dilemma games, in which cooperation rates appear to be determined by the distinctive structure of economic incentives (e.g., $3 for mutual cooperation vs. $5 for unilateral defection). Rather than comparing economic values of cooperating versus not ($3 vs. $5), we tested the hypothesis that players simply compare numeric values (3 vs. 5), such that subjective numbers (mental magnitudes) are logarithmically scaled. Supporting our hypothesis, increasing only numeric values of rewards (from $3 to 300¢) increased cooperation, whereas increasing economic values increased cooperation only when there were also numeric increases. Thus, changing rewards from 3¢ to 300¢ increased cooperation rates, but an economically identical change from 3¢ to $3 elicited no gains. Finally, logarithmically scaled reward values predicted 97% of variation in cooperation, whereas the face value of economic rewards predicted none. We conclude that representations of numeric value constrain how economic rewards affect cooperation.
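A minimal sketch of the numeric-magnitude idea in the abstract: treat subjective magnitude as the logarithm of the face number, ignoring units. The logarithmic scaling is the authors' hypothesis; the toy comparison below is mine.

```python
# The hypothesis: players compare log-scaled face numbers, not economic value.
import math

rewards = [("3 cents", 3), ("300 cents", 300), ("$3", 3)]  # face numbers
for label, n in rewards:
    print(f"{label:>9}: subjective magnitude ~ ln({n}) = {math.log(n):.2f}")
# 300 cents and $3 are economically identical, but the face number 300 has a
# much larger subjective magnitude than 3 -- consistent with cooperation rising
# for the 3-cent-to-300-cent change and not for the 3-cent-to-$3 change.
```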

Similar risk assessment in man and mouse.

In an open access article, Balci et al. devise a simple and clever timing task that captures the essence of the temporal decision making confronting human and nonhuman animals in everyday life, and show that men are no better than mice in assessing a simple kind of uncertainty. This suggests that mechanisms for near-optimal risk assessment in many everyday contexts evolved long ago. Their abstract:
Human and mouse subjects tried to anticipate at which of 2 locations a reward would appear. On a randomly scheduled fraction of the trials, it appeared with a short latency at one location; on the complementary fraction, it appeared after a longer latency at the other location. Subjects of both species accurately assessed the exogenous uncertainty (the probability of a short versus a long trial) and the endogenous uncertainty (from the scalar variability in their estimates of an elapsed duration) to compute the optimal target latency for a switch from the short- to the long-latency location. The optimal latency was arrived at so rapidly that there was no reliably discernible improvement over trials. Under these nonverbal conditions, humans and mice accurately assess risks and behave nearly optimally. That this capacity is well-developed in the mouse opens up the possibility of a genetic approach to the neurobiological mechanisms underlying risk assessment.
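The optimization the subjects implicitly solve can be sketched numerically. Under simple assumptions of my own (scalar timing noise with Weber fraction w, a win on short trials if the subject has not yet switched at the short latency, and a win on long trials if the switch precedes the long latency), a grid search over intended switch times finds the optimal target latency. All parameter values below are illustrative, not the paper's:

```python
# Sketch of computing an optimal switch latency under scalar variability.
# Assumptions (mine, for illustration): the realized switch time for an
# intended time t is Normal(t, (w*t)^2); short trials pay off if the switch
# happens after t_short, long trials if it happens before t_long.
from scipy.stats import norm

t_short, t_long = 3.0, 6.0   # reward latencies in seconds (illustrative)
p_short = 0.5                # scheduled probability of a short trial
w = 0.15                     # Weber fraction (scalar timing noise)

def expected_gain(t):
    sd = w * t
    win_short = 1 - norm.cdf(t_short, loc=t, scale=sd)  # still there at t_short
    win_long = norm.cdf(t_long, loc=t, scale=sd)        # switched by t_long
    return p_short * win_short + (1 - p_short) * win_long

candidates = [t_short + 0.05 * i for i in range(61)]    # 3.0 s to 6.0 s
best = max(candidates, key=expected_gain)
print(f"optimal target switch latency ~ {best:.2f} s "
      f"(expected gain {expected_gain(best):.3f})")
```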

Friday, February 20, 2009

How cute is that baby's face - hormones regulate the answer.

Sprengelmeyer et al. make some interesting observations suggesting that female reproductive hormones increase sensitivity to variations in the cuteness of baby faces. Their abstract:
We used computer image manipulation to develop a test of perception of subtle gradations in cuteness between infant faces. We found that young women (19–26 years old) were more sensitive to differences in infant cuteness than were men (19–26 and 53–60 years old). Women aged 45 to 51 years performed at the level of the young women, whereas cuteness sensitivity in women aged 53 to 60 years was not different from that of men (19–26 and 53–60 years old). Because average age at menopause is 51 years in Britain, these findings suggest the possible involvement of reproductive hormones in cuteness sensitivity. Therefore, we compared cuteness discrimination in pre- and postmenopausal women matched for age and in women taking and not taking oral contraceptives (progestogen and estrogen). Premenopausal women and young women taking oral contraceptives (which raise hormone levels artificially) were more sensitive to variations of cuteness than their respective comparison groups. We suggest that cuteness sensitivity is modulated by female reproductive hormones.

Modulation of the brain's emotion circuits by facial muscle feedback

Several studies have shown that facial muscle contractions associated with various emotions can induce or enhance the correlated emotional feelings, or counter them if the facial movements and central feelings are in opposition (as in forcing a smile while angry). The late Senator Proxmire of Wisconsin wrote a self-help book that included instructions for making a 'happy face' to improve your mood. Hennenlotter et al. now do an interesting bit of work in which they observe that blocking the feedback of frown muscles to the brain lowers the level of amygdala activation during a subject's imitation of an angry facial expression:
Afferent feedback from muscles and skin has been suggested to influence our emotions during the control of facial expressions. Recent imaging studies have shown that imitation of facial expressions is associated with activation in limbic regions such as the amygdala. Yet, the physiological interaction between this limbic activation and facial feedback remains unclear. To study whether facial feedback affects limbic brain responses during intentional imitation of facial expressions, we applied botulinum toxin (BTX)–induced denervation of frown muscles in combination with functional magnetic resonance imaging as a reversible lesion model to minimize the occurrence of afferent muscular and cutaneous input. We show that, during imitation of angry facial expressions, reduced feedback due to BTX treatment attenuates activation of the left amygdala and its functional coupling with brain stem regions implicated in autonomic manifestations of emotional states. These findings demonstrate that facial feedback modulates neural activity within central circuitries of emotion during intentional imitation of facial expressions. Given that people tend to mimic the emotional expressions of others, this could provide a potential physiological basis for the social transfer of emotion.

Thursday, February 19, 2009

The smell of fear modulates our perception of threat in faces

This is kind of neat: Zhou and Chen collected gauze pads that had absorbed sweat from the armpit apocrine glands of men (because they sweat more) watching a horror movie or a happy or neutral movie. Women then sniffed the extracted smells (versus neutral controls) while watching a face morph from happy to frightened (women have a more sensitive sense of smell and greater sensitivity to emotional signals). The chemosignal of fearful sweat biased the women toward interpreting ambiguous expressions as more fearful, but had no effect when the facial emotion was more discernible. This shows that fear-related chemosignals modulate humans' visual emotion perception in an emotion-specific way.

Men tolerate their peers better than women

This study by Benenson et al. was conducted to examine the often-cited conclusion that human females are more sociable than males. Its results certainly correlate with my own university experience. In studying students at a Northeastern university, they concluded that:
Males were more likely than females to be satisfied with their roommates and were less bothered by their roommates' style of social interaction, types of interests, values, and hygiene, regardless of whether or not the roommates were selected for study because they were experiencing conflicts. Furthermore, males were less likely than females to switch roommates over the course of a year at three collegiate institutions. Finally, violation of a friendship norm produced a smaller negative effect on friendship belief in males than in females.
The authors maintain (this surprises me, if true) that their studies are the first to demonstrate that males, compared with females, display higher levels of tolerance for genetically unrelated same-sex individuals.

Wednesday, February 18, 2009

When losing control can be useful.

Apfelbaum and Sommers do a simple experiment that suggests that diminished executive control can facilitate positive outcomes in contentious intergroup interactions. Here is their abstract, followed by a description of how the subject's sense of executive control was manipulated:
Across numerous domains, research has consistently linked decreased capacity for executive control to negative outcomes. Under some conditions, however, this deficit may translate into gains: When individuals' regulatory strategies are maladaptive, depletion of the resource fueling such strategies may facilitate positive outcomes, both intra- and interpersonally. We tested this prediction in the context of contentious intergroup interaction, a domain characterized by regulatory practices of questionable utility. White participants discussed approaches to campus diversity with a White or Black partner immediately after performing a depleting or control computer task. In intergroup encounters, depleted participants enjoyed the interaction more, exhibited less inhibited behavior, and seemed less prejudiced to Black observers than did control participants—converging evidence of beneficial effects. Although executive capacity typically sustains optimal functioning, these results indicate that, in some cases, it also can obstruct positive outcomes, not to mention the potential for open dialogue regarding divisive social issues.
Now, the following dinking with executive control to generate 'depleted' participants sort of makes sense to me, but I'm not sure I really get it...
The Attention Network Test is a computer-based measure of attention. We modified the ANT component typically used to gauge executive control into a manipulation of executive capacity. Across multiple trials, participants were presented with a string of five arrows and instructed to quickly and accurately indicate the direction of the center arrow (i.e., whether the arrow was pointing left or right). The center arrow was either congruent (i.e., ←←←←←, →→→→→) or incongruent (i.e., →→←→→, ←←→←←) with its flankers; correct responses to incongruent trials thus required executive control to override the natural tendency to follow the flankers. Participants in the depletion condition were presented with congruent and incongruent stimuli, whereas participants in the control condition viewed congruent stimuli only.
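To make the manipulation concrete, here is a minimal sketch of how the two trial streams might be generated. Only the congruent/incongruent stimulus structure comes from the description above; the trial counts and the 50/50 mix in the depletion condition are placeholders of mine (ASCII arrows stand in for the arrow glyphs):

```python
# Sketch of the flanker-trial manipulation described above: the depletion
# condition mixes congruent and incongruent arrow strings, the control
# condition shows congruent strings only. The correct response is always
# the direction of the center arrow.
import random

CONGRUENT = ["<<<<<", ">>>>>"]
INCONGRUENT = [">><>>", "<<><<"]

def make_trials(condition, n_trials=100, p_incongruent=0.5):
    trials = []
    for _ in range(n_trials):
        if condition == "depletion" and random.random() < p_incongruent:
            trials.append(random.choice(INCONGRUENT))
        else:
            trials.append(random.choice(CONGRUENT))
    return trials

print(make_trials("depletion", n_trials=5))
print(make_trials("control", n_trials=5))
```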

If it is difficult to pronounce, it must be risky.

Song and Schwartz make the observation that low processing fluency (as with names that are difficult to pronounce) fosters the impression that a stimulus is unfamiliar, which in turn results in perceptions of higher risk. Ostensible food additives were rated as more harmful when their names were difficult to pronounce than when their names were easy to pronounce, and amusement-park rides were rated as more likely to make one sick (an undesirable risk) and also as more exciting and adventurous (a desirable risk) when their names were difficult to pronounce than when their names were easy to pronounce.

Tuesday, February 17, 2009

Brain imaging can reflect expected, rather than actual, nerve activity

Work by Sirotin and Das illustrates how the brain thinks ahead. Electrical signalling among brain cells summons the local delivery of extra blood — the basis of functional brain imaging. And the usual assumption is that an increase in blood flow means an increase in electrical activity. The experiments by Sirotin and Das show that blood can be sent to the brain's visual cortex in the absence of any stimulus, priming the neural tissue in apparent anticipation of future events. (They observed this mismatch in alert rhesus monkeys by simultaneously measuring vascular and neural responses in the same region of the visual cortex. Changes in the blood supply were monitored by a sensitive video camera peering at the surface of the brain through a transparent window in the animal's skull, and local electrical responses of neurons were measured with a microelectrode.) Their results show that cortical blood flow can depart wildly from what is expected on the basis of local neural activity. Blood can be sent in anticipation of neural events that never take place.

Knowledge about how we know changes everything.

The essay by Boroditsky in the Edge series has the following interesting comments:
In the past ten years, research in cognitive science has started uncovering the neural and psychological substrates of abstract thought, tracing the acquisition and consolidation of information from motor movements to abstract notions like mathematics and time. These studies have discovered that human cognition, even in its most abstract and sophisticated form, is deeply embodied, deeply dependent on the processes and representations underlying perception and motor action. We invent all kinds of complex abstract ideas, but we have to do it with old hardware: machinery that evolved for moving around, eating, and mating, not for playing chess, composing symphonies, inventing particle colliders, or engaging in epistemology for that matter. Being able to re-use this old machinery for new purposes has allowed us to build tremendously rich knowledge repertoires. But it also means that the evolutionary adaptations made for basic perception and motor action have inadvertently shaped and constrained even our most sophisticated mental efforts. Understanding how our evolved machinery both helps and constrains us in creating knowledge, will allow us to create new knowledge, either by using our old mental machinery in yet new ways, or by using new and different machinery for knowledge-making, augmenting our normal cognition.

So why will knowing more about how we know change everything? Because everything in our world is based on knowledge. Humans, leaps and bounds beyond any other creatures, acquire, create, share, and pass on vast quantities of knowledge. All scientific advances, inventions, and discoveries are acts of knowledge creation. We owe civilization, culture, science, art, and technology all to our ability to acquire and create knowledge. When we study the mechanics of knowledge building, we are approaching an understanding of what it means to be human—the very nature of the human essence. Understanding the building blocks and the limitations of the normal human knowledge building mechanisms will allow us to get beyond them. And what lies beyond is, well, yet unknown...

Monday, February 16, 2009

Robocop and cello scrotum

I thought these two items in the Random Samples section of the Feb. 6 Science Magazine were a hoot:
FIDDLE WITHOUT FEAR:

Elaine Murphy was just starting her medical career in 1974 when she and her husband, John, pulled a fast one on the editors of the British Medical Journal (BMJ). The joke's long run ended last week when the Murphys confessed that a medical condition, "cello scrotum," they coined in a letter to the journal 35 years ago doesn't exist.

Now a baroness and member of the British House of Lords, Murphy and her partner in crime admitted the hoax in a letter published 27 January in BMJ. The couple came up with the prank after reading a letter to BMJ in April 1974 on "guitar nipple," an alleged chest inflammation that the couple assumed was fake. In the spirit of one-upmanship, the pair wrote a short note on "cello scrotum," an inflammation on a fabricated patient who played the cello for hours each day. "We never expected our spoof letter to be published," Murphy says. "We probably wrote it after a glass of wine or two."

The Murphys came clean after finding a reference to cello scrotum in a December 2008 issue of the journal. Although journal editors disapprove of dishonesty in science, Tony Delamothe, a deputy editor at BMJ, says that the Murphys' joke was harmless. "All of my colleagues, from the editor down, think it's a hoot," Delamothe says. Murphy adds that she's received no negative fallout. "I was worried the House of Lords would think I was bringing them into disrepute," she says, "but so far, everyone wants to enjoy the joke."

NEW SHERIFF IN TOWN:

It's not RoboCop, but Japanese robotmaker Tmsuk believes its T-34 security robot can fight crime by snaring intruders in an entangling net. The 60-centimeter-tall robot sends real-time video of its surroundings to a remote operator's mobile phone over Japan's advanced mobile phone service, eliminating the need for cables or wireless networks. On command, the T-34 fires a weighted net capable of enveloping a human target up to 3.5 meters away, holding the suspected criminal until security officers arrive. Tmsuk, which worked with security service provider Alacom in developing the T-34, says the robot could confront dangerous intruders while keeping human guards at a safe distance. "We think this could serve the needs of the security industry," says company spokesperson Mariko Ishikawa. The company recently demonstrated a working prototype and says a commercial model could be on the market in a few years for about $5000.

Brain correlates of dealing with risk versus ambiguity

Because it is relevant to last Friday's post on the economic situation, I thought I would bring forward this bit of work, which I had been planning to mention soon. It is yet another interesting study from the Wellcome Centre group at University College London associated with Ray Dolan - cognitive neuroscience that is directly relevant to our current economic and political reality:
In economic decision making, outcomes are described in terms of risk (uncertain outcomes with certain probabilities) and ambiguity (uncertain outcomes with uncertain probabilities). Humans are more averse to ambiguity than to risk, with a distinct neural system suggested as mediating this effect. However, there has been no clear disambiguation of activity related to decisions themselves from perceptual processing of ambiguity. In a functional magnetic resonance imaging (fMRI) experiment, we contrasted ambiguity, defined as a lack of information about outcome probabilities, to risk, where outcome probabilities are known, or ignorance, where outcomes are completely unknown and unknowable. We modified previously learned pavlovian CS+ stimuli such that they became an ambiguous cue and contrasted evoked brain activity both with an unmodified predictive CS+ (risky cue), and a cue that conveyed no information about outcome probabilities (ignorance cue). Compared with risk, ambiguous cues elicited activity in posterior inferior frontal gyrus and posterior parietal cortex during outcome anticipation. Furthermore, a similar set of regions was activated when ambiguous cues were compared with ignorance cues. Thus, regions previously shown to be engaged by decisions about ambiguous rewarding outcomes are also engaged by ambiguous outcome prediction in the context of aversive outcomes. Moreover, activation in these regions was seen even when no actual decision is made. Our findings suggest that these regions subserve a general function of contextual analysis when search for hidden information during outcome anticipation is both necessary and meaningful.
The authors also comment on previous work emphasizing the amygdala:
In contrast to the present experiment, a previous fMRI study has suggested that the amygdala and dorsomedial prefrontal and orbitofrontal cortex underlie decision making under ambiguity (Hsu et al., 2005). ... Although it is obvious that the amygdala responds to some kinds of uncertainty [e.g., temporal unpredictability], different forms of uncertainty have not been formally compared with regard to such responses. The kind of outcome uncertainty described in the aforementioned work is likely to be different from the economic definition applied in the present study (e.g., the lack of knowledge about CS–UCS contingencies in fear conditioning paradigms corresponds to the ignorance and not the ambiguity condition in the present study). The study by Hsu et al. (2005), although concerned with an economic definition of ambiguity, in fact collapsed different kinds of "ambiguous" situations for analysis of fMRI data, that is, monetary gambles following a strict economic definition, but also quizzes, and uninformed gambles against an informed opponent. Together, the data indicate that there is no entirely convincing empirical evidence that the amygdala responds to ambiguity as defined in a strict economic sense, an inference upheld by our present findings, although such a role of the amygdala cannot be discounted entirely (Seymour and Dolan, 2008).

Placebos, curing within...

I wanted to pass on two pieces on self-curing and the placebo effect pointed out to me by a mindblog reader. Amanda Schaffer offers a review in Slate of Anne Harrington's new book "The Cure Within," which maps the history of mind-body medicine. Also, a recent study testing pain relief from analgesics shows that merely telling people that a novel form of codeine they were taking (actually a placebo) was worth $2.50 rather than 10 cents increased the proportion of people who reported pain relief from 61% to 85.4%. When the "price" of the placebo was reduced, so was the pain relief.

Friday, February 13, 2009

Run for the hills.....

Three items in today's New York Times are sufficiently pungent to warrant mention. Lohr's article gives a clear exposition of the fact that the nation's banking system is effectively insolvent, its debts being greater than its assets. Krugman again notes the futility of current plans, which avoid shutting down the bad banks (and wiping out their investors) and saving the solvent ones. And Brooks, in an OpEd piece that motivated me to go ahead with this post, paints a pessimistic imagined future scenario for 2010 influenced by his reading of current cognitive neuroscience (here, for example, is a relevant article, more recent than the work Brooks was aware of, showing structures that appear to be more important than the amygdala in dealing with uncertainty). From Brooks' piece:
The problem was this: The policy makers knew how to pull economic levers, but they did not know how to use those levers to affect social psychology.

The crisis was labeled an economic crisis, but it was really a psychological crisis. It was caused by a mood of fear and uncertainty, which led consumers to not spend, bankers to not lend and entrepreneurs to not risk. No amount of federal spending could change this psychology because uncertainty about the future remained acute.

Essentially, Americans had migrated from one society to another — from a society of high trust to a society of low trust, from a society of optimism to a society of foreboding, from a society in which certain financial habits applied to a society in which they did not. In the new world, investors had no basis from which to calculate risk. Families slowly deleveraged. Bankers had no way to measure the future value of assets.

Cognitive scientists distinguish between normal risk-assessment decisions, which activate the reward-prediction regions of the brain, and decisions made amid extreme uncertainty, which generate activity in the amygdala. These are different mental processes using different strategies and producing different results. Americans were suddenly forced to cope with this second category, extreme uncertainty.

Economists and policy makers had no way to peer into this darkness. Their methods were largely based on the assumption that people are rational, predictable and pretty much the same. Their models work best in times of equilibrium. But in this moment of disequilibrium, behavior was nonlinear, unpredictable, emergent and stubbornly resistant to Keynesian rationalism.

...The nation had essentially bet its future on economic models with primitive views of human behavior. The government had tried to change social psychology using the equivalent of leeches and bleeding.

(A friend of mine claims to know a former hedge fund manager who has converted his assets to gold coins, and bought a safe, and a shotgun!)

Faster evolution means more ethnic differences.

Some interesting thoughts from Jonathan Haidt:
...a betting person would have to predict that as we decode the genomes of people around the world, we're going to find deeper differences than most scientists now expect...A wall has long protected respectable evolutionary inquiry from accusations of aiding and abetting racism. That wall is the belief that genetic change happens at such a glacial pace that there simply was not time, in the 50,000 years since humans spread out from Africa, for selection pressures to have altered the genome in anything but the most trivial way (e.g., changes in skin color and nose shape were adaptive responses to cold climates). ...But the writing is on the wall. Russian scientists showed in the 1990s that a strong selection pressure (picking out and breeding only the tamest fox pups in each generation) created what was — in behavior as well as body — essentially a new species in just 30 generations. That would correspond to about 750 years for humans.

Humans may never have experienced such a strong selection pressure for such a long period, but they surely experienced many weaker selection pressures that lasted far longer, and for which some heritable personality traits were more adaptive than others. It stands to reason that local populations (not continent-wide "races") adapted to local circumstances by a process known as "co-evolution" in which genes and cultural elements change over time and mutually influence each other. The best documented example of this process is the co-evolution of genetic mutations that maintain the ability to fully digest lactose in adulthood with the cultural innovation of keeping cattle and drinking their milk. This process has happened several times in the last 10,000 years, not to whole "races" but to tribes or larger groups that domesticated cattle.

...traits that led to Darwinian success in one of the many new niches and occupations of Holocene life — traits such as collectivism, clannishness, aggressiveness, docility, or the ability to delay gratification — are often seen as virtues or vices. Virtues are acquired slowly, by practice within a cultural context, but the discovery that there might be ethnically-linked genetic variations in the ease with which people can acquire specific virtues is — and this is my prediction — going to be a "game changing" scientific event. (By "ethnic" I mean any group of people who believe they share common descent, actually do share common descent, and that descent involved at least 500 years of a sustained selection pressure, such as sheep herding, rice farming, exposure to malaria, or a caste-based social order, which favored some heritable behavioral predispositions and not others.)

I believe that the "Bell Curve" wars of the 1990s, over race differences in intelligence, will seem genteel and short-lived compared to the coming arguments over ethnic differences in moralized traits. I predict that this "war" will break out between 2012 and 2017...There are reasons to hope that we'll ultimately reach a consensus that does not aid and abet racism. I expect that dozens or hundreds of ethnic differences will be found, so that any group — like any person — can be said to have many strengths and a few weaknesses, all of which are context-dependent. Furthermore, these cross-group differences are likely to be small when compared to the enormous variation within ethnic groups and the enormous and obvious effects of cultural learning. But whatever consensus we ultimately reach, the ways in which we now think about genes, groups, evolution and ethnicity will be radically changed by the unstoppable progress of the human genome project.

Caloric restriction improves memory

From Witte et al.:
Animal studies suggest that diets low in calories and rich in unsaturated fatty acids (UFA) are beneficial for cognitive function in age. Here, we tested in a prospective interventional design whether the same effects can be induced in humans. Fifty healthy, normal- to overweight elderly subjects (29 females, mean age 60.5 years, mean body mass index 28 kg/m2) were stratified into 3 groups: (i) caloric restriction (30% reduction), (ii) relative increased intake of UFAs (20% increase, unchanged total fat), and (iii) control. Before and after 3 months of intervention, memory performance was assessed under standardized conditions. We found a significant increase in verbal memory scores after caloric restriction (mean increase 20%; P less than 0.001), which was correlated with decreases in fasting plasma levels of insulin and high sensitive C-reactive protein, most pronounced in subjects with best adherence to the diet (all r values less than −0.8; all P values less than 0.05). Levels of brain-derived neurotrophic factor remained unchanged. No significant memory changes were observed in the other 2 groups. This interventional trial demonstrates beneficial effects of caloric restriction on memory performance in healthy elderly subjects. Mechanisms underlying this improvement might include higher synaptic plasticity and stimulation of neurofacilitatory pathways in the brain because of improved insulin sensitivity and reduced inflammatory activity. Our study may help to generate novel prevention strategies to maintain cognitive functions into old age.