Wednesday, January 16, 2008

Planned Obsolescence? The Four-Year Itch

Helen Fisher, author of "Why We Love" and an anthropology professor at Rutgers, has written a brief essay with the title of this post. She did a cross-cultural survey of when divorces occur and found that divorces regularly peaked during and around the fourth year after the wedding (no evidence for the commonly assumed seven-year itch indicated in the graphic). Divorces peaked among couples in their late twenties. And the more children a couple had, the less likely they were to divorce: some 39% of worldwide divorces occurred among couples with no dependent children; 26% among those with one child; 19% among couples with two children; and 7% among couples with three. In trying to figure out why so many men and women divorce during and around the four-year mark, at the height of their reproductive years, and often with a single child, she had an "a ha" moment:
Women in hunting and gathering societies breastfeed around the clock, eat a low-fat diet and get a lot of exercise — habits that tend to inhibit ovulation. As a result, they regularly space their children about four years apart. Thus, the modern duration of many marriages—about four years—conforms to the traditional period of human birth spacing, four years.

Perhaps human parental bonds originally evolved to last only long enough to raise a single child through infancy, about four years, unless a second infant was conceived. By age five, a youngster could be reared by mother and a host of relatives. Equally important, both parents could choose a new partner and bear more varied young.
Her theory fits with data on other species:
Only about three percent of mammals form a pairbond to rear their young. Take foxes. The vixen's milk is low in fat and protein; she must feed her kits constantly; and she will starve unless the dog fox brings her food. So foxes pair in February and rear their young together. But when the kits leave the den in mid summer, the pairbond breaks up. Among foxes, the partnership lasts only through the breeding season. This pattern is common in birds. Among the more than 8,000 avian species, some 90% form a pairbond to rear their young. But most do not pair for life. A male and female robin, for example, form a bond in the early spring and rear one or more broods together. But when the last of the fledglings fly away, the pairbond breaks up... Like pair-bonding in many other creatures, humans have probably inherited a tendency to love and love again—to create more genetic variety in our young.

The neural control of vigor

An interesting article from Dolan's laboratory on the neural substrates of the motivation and vigor with which we perform actions. Their abstract lays it out clearly:
The vigor with which a participant performs actions that produce valuable outcomes is subject to a complex set of motivational influences. Many of these are believed to involve the amygdala and the nucleus accumbens, which act as an interface between limbic and motor systems. One prominent class of influences is called pavlovian–instrumental transfer (PIT), in which the motivational characteristics of a predictor influence the vigor of an action with respect to which it is formally completely independent. We provide a demonstration of behavioral PIT in humans, with an audiovisual predictor of the noncontingent delivery of money inducing participants to perform more avidly an action involving squeezing a handgrip to earn money. Furthermore, using functional magnetic resonance imaging, we show that this enhanced motivation was associated with a trial-by-trial correlation with the blood oxygenation level-dependent (BOLD) signal in the nucleus accumbens and a subject-by-subject correlation with the BOLD signal in the amygdala. Our data dovetails well with the animal literature and sheds light on the neural control of vigor.


Figure - The PIT paradigm used. Stage 1, In the pavlovian conditioning stage, participants are exposed to repeated pairings of the CS+ (a visual background and a sound) and a US (monetary reward of 20 pence), as well as presentations of a CS– that is not associated with reward. Here participants pressed a key to remove a patch that hid either a coin (CS+) or a coin with a superimposed red X (CS–). During the baseline CS, no patches were present; thus, there was no opportunity for reward. Each CS block lasted 12 s. Stage 2, During instrumental learning, participants were trained to squeeze a handgrip to obtain the same reward. Each block lasted 12 s. Stage 3, The critical PIT test took place under extinction and included presentation of the three CSs in a random order (here only the CS+ block is depicted). The participant was allowed to continue responding instrumentally.

Figure - Amygdala activity associated with PIT. Participants who showed a larger global PIT expressed enhanced bilateral amygdala activation. The bar graph shows, for the right amygdala and NAcc, mean parameter estimates for the correlation, across participants, of global PIT with the parameter estimate in each CS condition. Error bars represent the 90% confidence interval. *p <>

Tuesday, January 15, 2008

There are No Moral Facts - Metzinger

Here is a brief essay from one of my heroes, Thomas Metzinger, that I completely agree with - spiced up by an unrelated and gratuitous graphic on morality.
I have become convinced that it would be of fundamental importance to know what a good state of consciousness is. Are there forms of subjective experience which — in a strictly normative sense — are better than others? Or worse? What states of consciousness should be illegal? What states of consciousness do we want to foster and cultivate and integrate into our societies? What states of consciousness can we force upon animals — for instance, in consciousness research itself? What states of consciousness do we want to show our children? And what state of consciousness do we eventually die in ourselves?

2007 has seen the rise of an important new discipline: "neuroethics". This is not simply a new branch of applied ethics for neuroscience — it raises deeper issues about selfhood, society and the image of man. Neuroscience is now quickly transformed into neurotechnology. I predict that parts of neurotechnology will turn into consciousness technology. In 2002, out-of-body experiences were, for the first time, induced with an electrode in the brain of an epileptic patient. In 2007 we saw the first two studies, published in Science, demonstrating how the conscious self can be transposed to a location outside of the physical body as experienced, non-invasively and in healthy subjects. Cognitive enhancers are on the rise. The conscious experience of will has been experimentally constructed and manipulated in a number of ways. Acute episodes of depression can be caused by direct interventions in the brain, and they have also been successfully blocked in previously treatment-resistant patients. And so on.

Whenever we understand the specific neural dynamics underlying a specific form of conscious content, we can in principle delete, amplify or modulate this content in our minds. So shouldn’t we have a new ethics of consciousness — one that does not ask what a good action is, but that goes directly to the heart of the matter, asks what we want to do with all this new knowledge and what the moral value of states of subjective experience is?

Here is where I have changed my mind. There are no moral facts. Moral sentences have no truth-values. The world itself is silent, it just doesn’t speak to us in normative affairs — nothing in the physical universe tells us what makes an action a good action or a specific brain-state a desirable one. Sure, we all would like to know what a good neurophenomenological configuration really is, and how we should optimize our conscious minds in the future. But it looks like, in a more rigorous and serious sense, there is just no ethical knowledge to be had. We are alone. And if that is true, all we have to go by are the contingent moral intuitions evolution has hard-wired into our emotional self-model. If we choose to simply go by what feels good, then our future is easy to predict: It will be primitive hedonism and organized religion.

Listening with your visual cortex.

We experience our environment through simultaneous stimulation of several sensory channels. Watching a movie is usually a visual and auditory experience. Integrating information from different sensory modalities helps with stimulus detection and discrimination in noisy environments. A traditional view of brain organization postulates a strict parceling into unisensory and then multisensory cortical levels. Romei et al. have now shown in humans that auditory information reaches the primary visual cortex directly, before these higher levels of integration.

When subjects are instructed to detect simple stimuli (a briefly presented pure tone, a small white disk, or a combination of the two), and their reaction times (RTs) are measured, RTs are significantly shorter for the audio-visual (AV) condition than for either unimodal condition, indicating a behavioral facilitation effect for stimuli presented simultaneously in both modalities. Romei et al. gave brief trans-cranial magnetic stimulation (TMS) to the occipital poles of the subjects' heads. TMS effects over visual cortex in a timeframe from 60 to 75 ms after sensory stimulus onset would suggest an interaction with feedforward processes, whereas later effects might be caused by feedback from higher cortical regions. Thus, varying the delay from 30 to 150 ms between TMS and the preceding sensory stimulation in different sensory modalities enabled them to determine the processing type (feedforward or feedback), as well as the critical timeframe of visual cortex involvement in stimulus processing.

Relative to TMS over a control site, reaction times to unisensory visual stimuli were prolonged by TMS at 60–75 ms poststimulus onset (visual suppression effect), confirming stimulation of functional visual cortex. Conversely, RTs to unisensory auditory stimuli were significantly shortened when visual cortex was stimulated by TMS at the same delays (a beneficial interaction effect of auditory stimulation and occipital TMS). No TMS effect on RTs was observed for AV stimulation. A follow-up experiment showed that auditory input enhances excitability within visual cortex itself over a similarly early time window (75–120 ms).

Monday, January 14, 2008

Face perception after no experience of faces

This work really nails down the fact that face processing is a special perceptual process organized as such at birth, rather than originating in a more general-purpose perceptual system that becomes specialized through visual experience. Sugita has studied face perception in monkeys reared with no exposure to faces. Here is his abstract, and one figure from the paper:
Infant monkeys were reared with no exposure to any faces for 6–24 months. Before being allowed to see a face, the monkeys showed a preference for human and monkey faces in photographs, and they discriminated human faces as well as monkey faces. After the deprivation period, the monkeys were exposed first to either human or monkey faces for a month. Soon after, the monkeys selectively discriminated the exposed species of face and showed a marked difficulty in regaining the ability to discriminate the other nonexposed species of face. These results indicate the existence of an experience-independent ability for face processing as well as an apparent sensitive period during which a broad but flexible face prototype develops into a concrete one for efficient processing of familiar faces.

Figure: An infant monkey and her living circumstance. An infant monkey and a caregiver with (A) and without (B) a facemask. Both photos were taken after the face-deprivation period. (C) Toys placed in the monkey's home cage. (D) Decorations provided around the home cage.

We Differ More Than We Thought

This essay by Mark Pagel is worth passing on in its entirety:
The last thirty to forty years of social science has brought an overbearing censorship to the way we are allowed to think and talk about the diversity of people on Earth. People of Siberian descent, New Guinean Highlanders, those from the Indian sub-continent, Caucasians, Australian aborigines, Polynesians, Africans — we are, officially, all the same: there are no races.

Flawed as the old ideas about race are, modern genomic studies reveal a surprising, compelling and different picture of human genetic diversity. We are on average about 99.5% similar to each other genetically. This is a new figure, down from the previous estimate of 99.9%. To put what may seem like minuscule differences in perspective, we are somewhere around 98.5% similar, maybe more, to chimpanzees, our nearest evolutionary relatives.

The new figure for us, then, is significant. It derives from among other things, many small genetic differences that have emerged from studies that compare human populations. Some confer the ability among adults to digest milk, others to withstand equatorial sun, others yet confer differences in body shape or size, resistance to particular diseases, tolerance to hot or cold, how many offspring a female might eventually produce, and even the production of endorphins — those internal opiate-like compounds.

We also differ by surprising amounts in the numbers of copies of some genes we have. Modern humans spread out of Africa only within the last 60-70,000 years, little more than the blink of an eye when stacked against the 6 million or so years that separate us from our Great Ape ancestors. The genetic differences amongst us reveal a species with a propensity to form small and relatively isolated groups on which natural selection has often acted strongly to promote genetic adaptations to particular environments.

We differ genetically more than we thought, but we should have expected this: how else but through isolation can we explain a single species that speaks at least 7,000 mutually unintelligible languages around the World?

What this all means is that, like it or not, there may be many genetic differences among human populations — including differences that may even correspond to old categories of 'race' — that are real differences in the sense of making one group better than another at responding to some particular environmental problem. This in no way says one group is in general 'superior' to another, or that one group should be preferred over another. But it warns us that we must be prepared to discuss genetic differences among human populations.
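To put those percentages in perspective, here is a quick back-of-the-envelope calculation (the ~3.2-billion-base-pair genome size is my addition, not Pagel's) of how many base-pair differences the two similarity figures imply:

```python
# Back-of-the-envelope: how many base-pair differences do the quoted
# similarity figures imply?  The ~3.2-billion-bp genome size is an
# assumption; the essay gives only the percentages.
GENOME_BP = 3_200_000_000

def implied_differences(similarity: float) -> int:
    """Number of differing base pairs implied by a similarity fraction."""
    return round(GENOME_BP * (1 - similarity))

old_estimate = implied_differences(0.999)   # previous 99.9% figure
new_estimate = implied_differences(0.995)   # revised 99.5% figure

print(f"99.9% similar -> ~{old_estimate:,} differing base pairs")
print(f"99.5% similar -> ~{new_estimate:,} differing base pairs")
```

Dropping the similarity estimate from 99.9% to 99.5% multiplies the implied number of differences fivefold, which is why Pagel calls the new figure significant.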

Friday, January 11, 2008

Please Clap, Talk or Shout at Any Time

Bernard Holland reviews Kenneth Hamilton's book, “After the Golden Age,” a detailed reflection on concert behavior in the 19th and early 20th centuries, published recently by Oxford University Press. It offers fascinating bits of information about a bygone era quite unlike our current time, in which
Concertgoers like you and me have become part police officer, part public offender. We prosecute the shuffled foot or rattled program, the errant whisper or misplaced cough. We tense at the end of a movement, fearful that one of the unwashed will begin to clap, bringing shame on us all. How serious we look, and how absurd we are.
A number of fascinating facts:
...the silence at a London performance of Liszt’s “Dante” Symphony represented not rapt attention but audience distaste.
...hardly anybody played more than one movement of a Beethoven sonata at a time.
...Audience participation was taken for granted in the 1840s. The pianist Alexander Dreyschock was criticized for playing “so loud that it made it difficult for the ladies to talk.”

...Concerts were different back then. Liszt could get away with the radical idea of “one man, one recital,” but musical events were usually variety shows in the manner of vaudeville. The star pianist or violinist was just an occasionally recurring act in a parade of singers, orchestra players, quartets and trios. When Liszt did his solo acts, there was none of the march-on, march-off stage ritual of today. Liszt greeted patrons at the door, mingled in the audience and schmoozed with friend and stranger alike.

...Whole recitals also took place between acts of an opera or movements of a symphony. When Chopin played his E minor Piano Concerto in Warsaw in 1830, other pieces were inserted between the first two movements. Perhaps the most celebrated such interruption was at the 1806 premiere of Beethoven’s Violin Concerto in Vienna, where the soloist thrilled listeners by playing his violin upside down and on one string.

Regret

How do we feel about alternative versions of ourselves - lost possible selves, or the person we might have been? Benedict Carey writes a nice piece on this question. A few clips:
...Over the past decade and a half, psychologists have studied how regrets — large and small, recent and distant — affect people’s mental well-being. They have shown, convincingly though not surprisingly, that ruminating on paths not taken is an emotionally corrosive exercise. The common wisdom about regret — that what hurts the most is not what you did but what you didn’t do — also appears to be true, at least in the long run.

...young adults who scored high on measures of psychological well-being tended to think of regretted decisions as all their own — perhaps because they still had time to change course. By contrast, older people who scored highly tended to share blame for their regretted decisions. “I tried to reach out to him, but the effort wasn’t returned.”

...those who are able to talk or write about this lost future without sinking into despair or losing hope tend to have developed another quality, called complexity...an ability to incorporate various points of view into a recollection, to vividly describe the circumstances, context and other dimensions...that this knack for self-evaluation develops over time; it is a learned ability.

...therapists have long known the value of seeing regretted choices in the context of what has been gained as well as lost.

...the perspective from which people remember slights or mistakes can affect the memories’ emotional impact... reimagining painful scenes from a third-person point of view, as if seeing oneself in a movie, blunts their emotional sting and facilitates ... clearheaded self-perception.

Thursday, January 10, 2008

Compensatory neural plasticity in aging human brains.

Recent imaging studies have shown that seniors exhibit stronger brain activation than younger controls during the execution of various motor tasks. Older subjects activate the same regions as their younger counterparts, but to a larger extent, and they also activate additional regions that are not observed in the young subjects.

Heuninckx et al. examine the underlying neural mechanisms of this "overactivation" by determining whether it reflects compensation for various neural/behavioral deficits (e.g., neurodegeneration, attentional problems, reduction in sensory function, etc.) or whether it is due to de-differentiation (a generalized nonfunctional spread of activity attributable to deficits in neurotransmission, which in turn causes a decrease in the signal-to-noise ratio of neural firing and a loss of neural specialization). They compared brain activity in 24 older adults and 11 young controls during the performance of rhythmical hand–foot coordination tasks in which both limbs moved either in the same (iso-directional) or in the opposite (non-isodirectional, NONISODIR in the figure below) direction. Previous behavioral work had shown convincingly that the non-isodirectional pattern is more difficult and is produced with lower accuracy and stability than the iso-directional pattern. Activation in dedicated brain regions was correlated with motor performance in the elderly. The underlying rationale was that, according to the compensation hypothesis, overactivation would be larger in good than in poor motor performers, with the effect being more pronounced in the more demanding (non-isodirectional) than the less demanding (iso-directional) coordination task. Conversely, the de-differentiation hypothesis assumed overactivation to be larger in poor than in successful motor performers because of nonfunctional neural irradiation. Thus, positive correlations between brain activation and motor performance were considered to reflect compensation, and negative correlations were considered to reflect de-differentiation.

They found that coordination resulted in activation of classical motor coordination regions and also of higher-level sensorimotor and frontal regions in the elderly. A positive correlation between activation level in these latter regions and motor performance was observed. This performance-enhancing additional recruitment is consistent with the compensation hypothesis and reflects neuroplasticity at the systems level in the aging brain.


Figure: (Click to enlarge). Statistical parametric maps representing significantly larger activation in the old compared with the young group during the NONISODIR coordination mode, resulting from the following contrast: (NONISODIR – rest)old versus (NONISODIR – rest)young. L, Left hemisphere; R, right hemisphere. White arrows indicate brain regions that exhibit a significant correlation between brain activity level and coordination performance, as identified by a whole-brain multiple regression analysis. The graphics display each subject's BOLD response with respect to the within-cluster peak activation as a function of the inverse of the phase error (1/AE), with the younger subjects in blue and the older subjects in red.

What have you changed your mind about?

The Edge.org Annual Question for 2008, addressed to a select group of the intellectual elite, is "What have you changed your mind about? Why?" (I've done blog postings on the responses to the questions of the two previous years: "What is your dangerous idea?" and "What are you optimistic about? Why?")

I started to do thumbnail summaries of the responses I thought worth passing on to you, but found that most were not very succinct, and were sufficiently diffuse to make brief summary difficult. They also pack much less punch than the 'dangerous idea' responses. I recommend that you scroll through the responses yourself. I may focus on a few of them in subsequent posts.

Wednesday, January 09, 2008

Drunken flies get hypersexual - and gay

Sound familiar? Reminds me of similar behaviors after University of Wisconsin football games, when drunken guys who could not find an appropriate female object would go ahead with what was available - other guys. This news item by Heidi Ledford in Nature describes experiments by Lee et al. that:
...tested the effects of chronic alcohol exposure on sexual behaviour in the fruitfly Drosophila melanogaster. The researchers noted that male flies repeatedly exposed to ethanol vapour became less discriminate in their mate selection. The buzzed flies often courted fellow males, pursuing them around the cage while serenading with a traditional fruitfly courtship song played on vibrating wings.


[Figure: Love Chain, male fruit flies chase each other in a circle] Eventually, the lusty flies devolve into a courting frenzy. “You get a chain of males chasing each other,” says Heberlein, who was not associated with the study but has observed similar behaviour in her own unpublished work. In contrast, alcohol had little effect on mating in female fruitflies, which normally do not court their mates.

The findings suggest that the flies do not fundamentally change their sexual orientation, but rather get over-sexed. “Multiple alcohol exposures makes them essentially hypersexual,” says Heberlein. The mind-dulling effects of alcohol might also make it more of a challenge for male fruitflies to distinguish the gender of other flies in the crowd.
Because of the genetic tools available, fruitflies might be a good model system for probing the idea, suggested for humans, that the neurotransmitter dopamine is a link between sex and alcohol.

Love hangover - the sex peptide

A male, after copulation, has a particular interest in seeing that the female involved ceases further sexual activity that might dilute his genetic contribution. It turns out that male fruitflies don't have to stand by and guard their transferred genetic material — a sex peptide in their semen will do the job. This peptide leads to increased egg-laying by the mated female and behavioural changes that reduce the likelihood of her re-mating. Yapici et al. have now identified the receptor protein for this peptide. It functions in a subset of neurons implicated in other sex-related behaviors. The receptor is highly conserved across insect species, raising the possibility that it could be targeted to disrupt reproduction in insect pests or host-seeking behaviour in disease vectors. (There appears to be no evidence for such a mechanism in primates and humans!).

Tuesday, January 08, 2008

More on laughing rats...and human chanting?

This is a sequel to my March 20 and June 18, 2007, posts on laughing rats. Rats use ultrasonic communication, with 50-kHz vocalizations indicating an animal's positive subjective state. Wöhr and Schwarting now show that 50-kHz signals (either natural 50-kHz calls, which had been previously recorded from other rats, or artificial sine wave stimuli, which were identical to these calls with respect to peak frequency, call length and temporal appearance) can induce approach behaviors. The effect is more pronounced in juvenile rats. It is commonly assumed that humans have lost this mechanism, but I wonder if the powerful bonding emotions induced in groups of humans doing very low frequency vocal chants, which surely have harmonics in the 50-kHz range, might be an evolutionary derivative of this early mammalian behavior. Here are several Tibetan master chants offered by the free sound project. Do they chill you out?
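For what it's worth, the harmonic arithmetic behind my speculation is easy to check (the 100-Hz fundamental is just a hypothetical value for a deep vocal chant, and whether such a distant harmonic carries any real acoustic energy is another question):

```python
# Which harmonic of a low chant fundamental lands in the rats' 50-kHz band?
# The 100-Hz fundamental is a hypothetical value for a deep vocal chant.
fundamental_hz = 100.0
target_hz = 50_000.0

harmonic_number = round(target_hz / fundamental_hz)
harmonic_freq = harmonic_number * fundamental_hz
print(f"Harmonic #{harmonic_number} of {fundamental_hz:.0f} Hz = {harmonic_freq:.0f} Hz")
```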

The value of believing in free will.

Vohs and Schooler report an interesting experiment asking whether believing in free will versus determinism influences moral behavior. I have free access only to the abstract of the article, so cannot spell out the details of the experiments. Here is that abstract:
Does moral behavior draw on a belief in free will? Two experiments examined whether inducing participants to believe that human behavior is predetermined would encourage cheating. In Experiment 1, participants read either text that encouraged a belief in determinism (i.e., that portrayed behavior as the consequence of environmental and genetic factors) or neutral text. Exposure to the deterministic message increased cheating on a task in which participants could passively allow a flawed computer program to reveal answers to mathematical problems that they had been instructed to solve themselves. Moreover, increased cheating behavior was mediated by decreased belief in free will. In Experiment 2, participants who read deterministic statements cheated by overpaying themselves for performance on a cognitive task; participants who read statements endorsing free will did not. These findings suggest that the debate over free will has societal, as well as scientific and theoretical, implications.

A YouTube for ideas......

You might enjoy checking out this article by Tim Arango on a new website, Big Think, which appears to be a sort of combination of YouTube and Facebook for intellectuals.

Monday, January 07, 2008

Why can't we perform perfectly?

Some fascinating experiments by Tumer and Brainard on songbirds inform me on why I am not able to perform a completely learned and exhaustively practiced piano piece the same way each time I bang it out.... from the Nature Editor's review of their article:
Why is it that even the best-trained athletes and musicians cannot perform perfectly? One thought is that residual variability in performance is 'noise' that reflects fundamental limits on our ability to control our movements. Experiments using the exceptionally well-rehearsed songs of adult songbirds as a model point to an alternative explanation. Computerized monitoring of the apparently stereotyped songs of adult Bengalese finches revealed minuscule variations in performance. When the birds were given corrections each time the song varied beyond a certain limit, they rapidly learned to adapt their vocalizations. The implication is that once learned, songs can be maintained despite subtle changes to the vocal system due to factors such as ageing. So behavioural 'noise', rather than simply being a nuisance, may reflect experimentation by the nervous system to refine performance.
The abstract from Tumer and Brainard:
Significant trial-by-trial variation persists even in the most practiced skills. One prevalent view is that such variation is simply 'noise' that the nervous system is unable to control or that remains below threshold for behavioural relevance. An alternative hypothesis is that such variation enables trial-and-error learning, in which the motor system generates variation and differentially retains behaviours that give rise to better outcomes. Here we test the latter possibility for adult Bengalese finch song. Adult birdsong is a complex, learned motor skill that is produced in a highly stereotyped fashion from one rendition to the next. Nevertheless, there is subtle trial-by-trial variation even in stable, 'crystallized' adult song. We used a computerized system to monitor small natural variations in the pitch of targeted song elements and deliver real-time auditory disruption to a subset of those variations. Birds rapidly shifted the pitch of their vocalizations in an adaptive fashion to avoid disruption. These vocal changes were precisely restricted to the targeted features of song. Hence, birds were able to learn effectively by associating small variations in their vocal behaviour with differential outcomes. Such a process could help to maintain stable, learned song despite changes to the vocal control system arising from ageing or injury. More generally, our results suggest that residual variability in well learned skills is not entirely noise but rather reflects meaningful motor exploration that can support continuous learning and optimization of performance.
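The computerized training procedure described in the abstract amounts to threshold-contingent feedback. Here is a minimal sketch of the idea, with the threshold, variability, and adaptive-shift values all invented for illustration (none of them come from the paper):

```python
import random

# Minimal sketch of threshold-contingent feedback: monitor the pitch of
# each song rendition and deliver a disruptive sound whenever it falls on
# one side of a threshold; the bird shifts its mean pitch away from the
# punished zone.  All parameter values here are invented.
random.seed(0)

target_pitch = 2000.0   # Hz, hypothetical mean pitch of the targeted syllable
threshold = 2000.0      # renditions above this pitch trigger disruption
noise_sd = 10.0         # natural trial-by-trial pitch variation
shift_per_hit = 0.5     # Hz the bird lowers its mean after each disruption

mean_pitch = target_pitch
for trial in range(500):
    rendition = random.gauss(mean_pitch, noise_sd)
    if rendition > threshold:          # variation landed in the punished zone
        mean_pitch -= shift_per_hit    # adaptive shift away from disruption

print(f"mean pitch after training: {mean_pitch:.1f} Hz")
```

Running this, the simulated mean pitch drifts below the threshold, mimicking the rapid adaptive shift the authors observed in real birds.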

Hope worse than Hopelessness

This short piece by Marina Krakovsky from the NY Times:
People often display a remarkable ability to adapt to adversity, bouncing back to their usual levels of happiness despite extreme hardships. But people don’t always rebound, and scientists have long wondered what factors might account for the difference. In a talk at Harvard in September, a team of researchers suggested that one obstacle to emotional recovery, oddly enough, is hope — the belief that your current hardship is temporary.

From the beginning, the investigators suspected that hope might sometimes be counterproductive: prisoners with life sentences but with the possibility of parole adapt less well to prison life, for example, than prisoners with life sentences without the possibility of parole. But the researchers sought another empirical test. Their choice: Colostomy patients. The research team, led by Peter Ubel, a physician at the University of Michigan, tracked people who had portions of their colons removed or bypassed, such that the patients couldn’t defecate normally. The condition is extremely unpleasant and leads many people to say they’d rather be dead, Ubel reports. But a colostomy isn’t always permanent. Some patients are likely to heal and have their bowels reconnected. Whether your colostomy is permanent depends on your condition, but were it up to the patient to choose, “almost anybody would choose temporary over permanent,” Ubel says.

So it’s surprising that the permanent-colostomy patients ended up happier six months after the operation than the temporary group, whose members were still holding out hope. Patients with a temporary colostomy experienced a significant drop in life satisfaction versus patients in the permanent group.


It might seem strange that patients who are better off objectively were less satisfied with their lives, yet the finding makes sense: “If your condition is temporary,” Ubel explains, “you’re thinking, I can’t wait until I get rid of this.” Ubel says thoughts like these keep you from moving on with your life and focusing on the many good things that remain.

Friday, January 04, 2008

Clutter - more in the brain than in the house...

Parker-Pope offers a brief essay on the "clutter problem," suggesting that the problem in many cases is not a house problem but a person problem.
Excessive clutter and disorganization are often symptoms of a bigger health problem...At its most extreme, chronic disorganization is called hoarding...David F. Tolin, director of the anxiety disorders center at the Institute of Living in Hartford and an adjunct associate professor of psychiatry at Yale...recently studied compulsive hoarders using brain-scan technology. While in the scanner, hoarders looked at various possessions and made decisions about whether to keep them or throw them away. The items were shredded in front of them, so they knew the decision was irreversible. When a hoarder was making decisions about throwing away items, the researchers saw increased activity in the orbitofrontal cortex, a part of the brain involved in decision-making and planning...people who didn’t hoard showed no extra brain activity.
The article continues with a discussion of holding on to excess 'stuff' and being overweight. In several cases therapists have noted a correlation between reducing clutter and weight loss.

Cultural Influences on Neural Substrates of Attentional Control

The abstract from the Hedden et al. article with the title of this post (Psychological Science, Volume 19, pp. 12-17, 2008). I thought it was interesting enough to mention, though I don't have access to the full text, so I can't determine exactly what is meant by culturally preferred and nonpreferred judgments:
Behavioral research has shown that people from Western cultural contexts perform better on tasks emphasizing independent (absolute) dimensions than on tasks emphasizing interdependent (relative) dimensions, whereas the reverse is true for people from East Asian contexts. We assessed functional magnetic resonance imaging responses during performance of simple visuospatial tasks in which participants made absolute judgments (ignoring visual context) or relative judgments (taking visual context into account). In each group, activation in frontal and parietal brain regions known to be associated with attentional control was greater during culturally nonpreferred judgments than during culturally preferred judgments. Also, within each group, activation differences in these regions correlated strongly with scores on questionnaires measuring individual differences in culture-typical identity. Thus, the cultural background of an individual and the degree to which the individual endorses cultural values moderate activation in brain networks engaged during even simple visual and attentional tasks.

Thursday, January 03, 2008

A science debate for presidential candidates?

A note for this day of the Iowa presidential caucuses...I have signed on with many others at Science Debate 2008 to urge the candidates to debate issues in science and technology, and would urge readers of this blog to do the same. Here are some comments by John Tierney.

Liberals are smarter than conservatives...?

A British study by Deary et al. titled "Bright Children become Enlightened Adults" (Psychological Science, Volume 19 pp. 1-6, January 2008) shows a correlation between general intelligence (g) at age 10 and liberal and anti-traditional social attitudes at age 30...
We examined the prospective association between general intelligence (g) at age 10 and liberal and antitraditional social attitudes at age 30 in a large (N = 7,070), representative sample of the British population born in 1970. Statistical analyses identified a general latent trait underlying attitudes that are antiracist, pro-working women, socially liberal, and trusting in the democratic political system. There was a strong association between higher g at age 10 and more liberal and antitraditional attitudes at age 30; this association was mediated partly via educational qualifications, but not at all via occupational social class. Very similar results were obtained for men and women. People in less professional occupations—and whose parents had been in less professional occupations—were less trusting of the democratic political system. This study confirms social attitudes as a major, novel field of adult human activity that is related to childhood intelligence differences.

Wednesday, January 02, 2008

Alarms and Anxiety in 2008 - guaranteed

John Tierney does a very nice piece in the Jan 1 NY Times Science section describing how activists, journalists and publicity-savvy scientists (called availability entrepreneurs by social scientists) selectively monitor the globe looking for newsworthy evidence of a new form of sinfulness, burning fossil fuels. Some clips from his text:
A year ago, British meteorologists made headlines predicting that the buildup of greenhouse gases would help make 2007 the hottest year on record. At year’s end, even though the British scientists reported the global temperature average was not a new record — it was actually lower than any year since 2001 — the BBC confidently proclaimed, “2007 Data Confirms Warming Trend.”

When the Arctic sea ice last year hit the lowest level ever recorded by satellites, it was big news and heralded as a sign that the whole planet was warming. When the Antarctic sea ice last year reached the highest level ever recorded by satellites, it was pretty much ignored. A large part of Antarctica has been cooling recently, but most coverage of that continent has focused on one small part that has warmed.

When Hurricane Katrina flooded New Orleans in 2005, it was supposed to be a harbinger of the stormier world predicted by some climate modelers. When the next two hurricane seasons were fairly calm — by some measures, last season in the Northern Hemisphere was the calmest in three decades — the availability entrepreneurs changed the subject. Droughts in California and Australia became the new harbingers of climate change (never mind that a warmer planet is projected to have more, not less, precipitation over all).

The most charitable excuse for this bias in weather divination is that the entrepreneurs are trying to offset another bias. The planet has indeed gotten warmer, and it is projected to keep warming because of greenhouse emissions, but this process is too slow to make much impact on the public.

When judging risks, we often go wrong by using what’s called the availability heuristic: we gauge a danger according to how many examples of it are readily available in our minds. Thus we overestimate the odds of dying in a terrorist attack or a plane crash because we’ve seen such dramatic deaths so often on television; we underestimate the risks of dying from a stroke because we don’t have so many vivid images readily available.

Slow warming doesn’t make for memorable images on television or in people’s minds, so activists, journalists and scientists have looked to hurricanes, wild fires and starving polar bears instead. They have used these images to start an “availability cascade,” a term coined by Timur Kuran, a professor of economics and law at the University of Southern California, and Cass R. Sunstein, a law professor at the University of Chicago.

The availability cascade is a self-perpetuating process: the more attention a danger gets, the more worried people become, leading to more news coverage and more fear. Once the images of Sept. 11 made terrorism seem a major threat, the press and the police lavished attention on potential new attacks and supposed plots. After Three Mile Island and “The China Syndrome,” minor malfunctions at nuclear power plants suddenly became newsworthy.

...Once a cascade is under way, it becomes tough to sort out risks because experts become reluctant to dispute the popular wisdom, and are ignored if they do. Now that the melting Arctic has become the symbol of global warming, there’s not much interest in hearing other explanations of why the ice is melting — or why the globe’s other pole isn’t melting, too.

Nature versus Nurture in Ventral Visual Cortex

Polk et al. do functional magnetic resonance imaging of monozygotic and dizygotic twins to show that genetics play a significant role in determining the cortical response to faces and places, more so than to orthographic stimuli (chairs or pseudowords). Here is their abstract, a paragraph from their concluding section and one figure from the paper.
Using functional magnetic resonance imaging, we estimated neural activity in twins to study genetic influences on the cortical response to categories of visual stimuli (faces, places, and pseudowords) that are known to elicit distinct patterns of activity in ventral visual cortex. The neural activity patterns in monozygotic twins were significantly more similar than in dizygotic twins for the face and place stimuli, but there was no effect of zygosity for pseudowords (or chairs, a control category). These results demonstrate that genetics play a significant role in determining the cortical response to faces and places, but play a significantly smaller role (if any) in the response to orthographic stimuli.


Figure legend: Patterns of estimated neural activity when viewing the four stimulus categories (axial slice). Functional activation maps were computed for the four contrasts of interest (faces, houses, pseudowords, and chairs relative to the phase-scrambled control condition), and the similarity measures (r) between these functional maps were computed for each twin pair. (Click on figure to enlarge)

The results of this study demonstrate that genetics play a significant role in determining the cortical response to faces and places. Of course, these findings do not imply that experience plays no role in determining the observed activity. To take just one example, genes that affect social behavior could potentially lead some people to look at faces and places more than other people, and the resulting difference in experience could lead to changes in the neural circuitry (we thank one of the anonymous reviewers for this example). The results simply demonstrate that genetics do play a crucial role. The results also show that genetics play a significantly smaller role in determining the cortical response to visually presented orthographic stimuli. Overall, the findings are consistent with the view that the cortical substrates of face recognition and place recognition are partially innately specified, but that the cortical response to orthographic stimuli is more dependent on experience. Face and place recognition are older than reading on an evolutionary scale, they are shared with other species, and they provide a clearer adaptive advantage. It is therefore plausible that evolution would shape the cortical response to faces and places, but not orthographic stimuli.
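The similarity measure r between a twin pair's functional maps is essentially a Pearson correlation computed across voxel values. A minimal stand-alone sketch (the "maps" here are invented toy numbers; the real analysis correlates full fMRI contrast maps):

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equally sized, flattened activation maps."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy "activation maps" for a twin pair viewing faces (hypothetical voxel values)
twin_a = [0.2, 1.1, 0.9, 0.1, 0.8]
twin_b = [0.3, 1.0, 1.0, 0.2, 0.7]
r = pearson_r(twin_a, twin_b)  # close to 1 for highly similar maps
```

The study's comparison then asks whether r is reliably higher for monozygotic pairs than dizygotic pairs within each stimulus category.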

Tuesday, January 01, 2008

Experimental Philosophy - moving out of the armchair?

The Dec. 9 New York Times Magazine has a fascinating article by Kwame Appiah that I have been meaning to mention, on the new philosophical trend of designing explicit experiments - many to probe the classic philosophical concept of intentionality. The following sort of result is now called the "Knobe effect" after the philosophy graduate student who devised the experiment (and the article describes a number of further experiments):
Suppose the chairman of a company has to decide whether to adopt a new program. It would increase profits and help the environment too. “I don’t care at all about helping the environment,” the chairman says. “I just want to make as much profit as I can. Let’s start the new program.” Would you say that the chairman intended to help the environment?
O.K., same circumstance. Except this time the program would harm the environment. The chairman, who still couldn’t care less about the environment, authorizes the program in order to get those profits. As expected, the bottom line goes up, the environment goes down. Would you say the chairman harmed the environment intentionally?
I don’t know where you ended up, but in one survey, only 23 percent of people said that the chairman in the first situation had intentionally helped the environment. When they had to think about the second situation, though, fully 82 percent thought that the chairman had intentionally harmed the environment. There’s plenty to be said about these interestingly asymmetrical results.
... is it a good thing that we attribute intention in the curious way that we do, and if so, why? (Is the Knobe effect a bug or a feature?) You can conduct more research to try to clarify matters, but you’re left having to interpret the findings; they don’t interpret themselves. There always comes a point where the clipboards and questionnaires and M.R.I. scans have to be put aside. To sort things out, it seems, another powerful instrument is needed. Let’s see — there’s one in the corner, over there. The springs are sagging a bit, and the cushions are worn, but never mind. That armchair will do nicely.

Accelerated Human Evolution in the past 40,000 years.

An interesting article by Nicholas Wade from the Dec. 11 NY Times:
Researchers analyzing variation in the human genome have concluded that human evolution accelerated enormously in the last 40,000 years under the force of natural selection.

The finding contradicts a widely held assumption that human evolution came to a halt 10,000 years ago or even 50,000 years ago. Some evolutionary psychologists, for example, assume that the mind has not evolved since the Ice Age ended 10,000 years ago.

But other experts expressed reservations about the new report, saying it is interesting but more work needs to be done.

The new survey — led by Robert K. Moyzis of the University of California, Irvine, and Henry C. Harpending of the University of Utah — developed a method of spotting human genes that have become more common through being favored by natural selection. They say that some 7 percent of human genes bear the signature of natural selection.

By dating the time that each of the genes came under selection, they have found that the rate of human evolution was fairly steady until about 50,000 years ago and then accelerated up until 10,000 years ago, they report in the current issue of The Proceedings of the National Academy of Sciences. The high rate of selection has probably continued to the present day, Dr. Moyzis said, but current data are not adequate to pick up recent selection.

Click on graphic to enlarge...

The brisk rate of human selection occurred for two reasons, Dr. Moyzis’ team says. One was that the population started to grow, first in Africa and then in the rest of the world after the first modern humans left Africa. The larger size of the population meant that there were more mutations for natural selection to work on. The second reason for the accelerated evolution was that the expanding human populations in Africa and Eurasia were encountering climates and diseases to which they had to adapt genetically. The extra mutations in their growing populations allowed them to do so.

Dr. Moyzis said it was widely assumed that once people developed culture, they protected themselves from the environment and from the forces of natural selection. But people also had to adapt to the environments that their culture created, and the new analysis shows that evolution continued even faster than before.

The researchers took their data from the HapMap project, a survey designed by the National Institutes of Health to look at sites of common variation in the human genome and to help identify the genes responsible for common diseases. The HapMap data, generated by analyzing the genomes of people from Africa, East Asia and Europe, has also been a trove for people studying human evolutionary history.

David Reich, a population geneticist at the Harvard Medical School, said the new report was “a very interesting and exciting hypothesis” but that the authors had not ruled out other explanations of the data. The power of their test for selected genes falls off in looking both at more ancient and more recent events, he said, so the overall picture might not be correct.

Similar reservations were expressed by Jonathan Pritchard, a population geneticist at the University of Chicago.

“My feeling is that they haven’t been cautious enough,” he said. “This paper will probably stimulate others to study this question.”

Monday, December 31, 2007

MindBlog freezes and thaws again





A holiday trip back to Madison Wisconsin, view from front door and back door of house on Twin Valley Road - just before the third time I had to shovel the walk... another 6 inches of snow since I left to return...

...back to Fort Lauderdale, view from front and back of condo on the South branch of the Middle River, a 60-70 degree Fahrenheit temperature increase:


Repressed Memory - A recent cultural invention?

Literary references to depression, hallucinations, anxiety, and dementia can be found throughout history. A fascinating article in Harvard Magazine by Ashley Pettus describes the research of Harrison Pope, who reasoned that if dissociative amnesia were an innate capability of the brain, it should also appear in ancient texts. An extensive search, backed by a $1,000 reward, found no reference earlier than Nina, an opera by Dalayrac and Marsollier performed in Paris in 1786. The absence of dissociative amnesia in works prior to 1800 suggests that the phenomenon is not a natural neurological function, but rather a "culture-bound" syndrome rooted in the nineteenth century. From the article:
What, then, accounts for “repressed memory’s” appearance in the nineteenth century and its endurance today? Pope and his colleagues hope to answer these questions in the future. “Clearly the rise of Romanticism, at the end of the Enlightenment, created fertile soil for the idea that the mind could expunge a trauma from consciousness,” Pope says. He notes that other pseudo-neurological symptoms (such as the female “swoon”) emerged during this era, but faded relatively quickly. He suspects that two major factors helped solidify “repressed memory” in the twentieth-century imagination: psychoanalysis (with its theories of the unconscious) and Hollywood. “Film is a perfect medium for the idea of repressed memory,” he says. “Think of the ‘flashback,’ in which a whole childhood trauma is suddenly recalled. It’s an ideal dramatic device.”

Friday, December 28, 2007

Cognitive Recovery in Socially Deprived Young Children

With elaborate consideration of the ethical issues involved (commented on by Millum and Emanuel), Nelson et al. have tracked the cognitive development of abandoned children who remained in institutions versus those moved to foster care (The Bucharest Early Intervention Project):
In a randomized controlled trial, we compared abandoned children reared in institutions to abandoned children placed in institutions but then moved to foster care. Young children living in institutions were randomly assigned to continued institutional care or to placement in foster care, and their cognitive development was tracked through 54 months of age. The cognitive outcome of children who remained in the institution was markedly below that of never-institutionalized children and children taken out of the institution and placed into foster care. The improved cognitive outcomes we observed at 42 and 54 months were most marked for the youngest children placed in foster care. These results point to the negative sequelae of early institutionalization, suggest a possible sensitive period in cognitive development, and underscore the advantages of family placements for young abandoned children.

Motion perception and production - similar neural coding

Another example of how our brain's representations of motion are tuned to biological actions. Here is the abstract of the open access article from Dayan et al., which contains some very elegant imaging figures:
Behavioral and modeling studies have established that curved and drawing human hand movements obey the 2/3 power law, which dictates a strong coupling between movement curvature and velocity. Human motion perception seems to reflect this constraint. The functional MRI study reported here demonstrates that the brain's response to this law of motion is much stronger and more widespread than to other types of motion. Compliance with this law is reflected in the activation of a large network of brain areas subserving motor production, visual motion processing, and action observation functions. Hence, these results strongly support the notion of similar neural coding for motion perception and production. These findings suggest that cortical motion representations are optimally tuned to the kinematic and geometrical invariants characterizing biological actions.

[Note: The 2/3 power law links path curvature C and angular velocity A along the movement: A = K · C^(2/3), where K is the velocity gain factor, which is piecewise constant during entire movement segments.]
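A few lines of code make the relation concrete (a minimal sketch; the gain K and the curvature values are arbitrary illustrations):

```python
# Sketch of the 2/3 power law: angular velocity A = K * C**(2/3),
# where C is path curvature and K is the (piecewise constant) velocity gain.

def angular_velocity(curvature: float, gain: float = 1.0) -> float:
    return gain * curvature ** (2.0 / 3.0)

# Doubling the curvature multiplies angular velocity by 2**(2/3), about 1.587,
# independent of K -- sharper curves are traversed with higher angular velocity
# but lower tangential speed.
a1 = angular_velocity(1.0)
a2 = angular_velocity(2.0)
ratio = a2 / a1
```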

Thursday, December 27, 2007

Monkeys and college students: similar in non-verbal math

This work from Cantlon and Brannon suggests that humans and nonhuman primates share a cognitive system for nonverbal arithmetic, pointing to an evolutionary link in their cognitive abilities. The full text is in PLoS Biology; here is the abstract:
Adult humans possess mathematical abilities that are unmatched by any other member of the animal kingdom. Yet, there is increasing evidence that the ability to enumerate sets of objects nonverbally is a capacity that humans share with other animal species. That is, like humans, nonhuman animals possess the ability to estimate and compare numerical values nonverbally. We asked whether humans and nonhuman animals also share a capacity for nonverbal arithmetic. We tested monkeys and college students on a nonverbal arithmetic task in which they had to add the numerical values of two sets of dots together and choose a stimulus from two options that reflected the arithmetic sum of the two sets. Our results indicate that monkeys perform approximate mental addition in a manner that is remarkably similar to the performance of the college students. These findings support the argument that humans and nonhuman primates share a cognitive system for nonverbal arithmetic, which likely reflects an evolutionary link in their cognitive abilities.

Human genetic variation - breakthrough of the year

We differ from each other in the number and order of our genes, and in their composition. A few edited clips from E. Pennisi's summary of Science Magazine's breakthrough of the year in the Dec. 21 issue:

There are an estimated 15 million places along our genomes where one base can differ from one person or population to the next. By mid-2007, more than 3 million such locations, known as single-nucleotide polymorphisms (SNPs), had been charted. Called the HapMap, this catalog has made the use of SNPs to track down genes involved in complex diseases--so-called genome-wide association studies--a reality....New gene associations now exist for type I and II diabetes, heart disease, breast cancer, restless leg syndrome, atrial fibrillation, glaucoma, amyotrophic lateral sclerosis, multiple sclerosis, rheumatoid arthritis, colorectal cancer, ankylosing spondylitis, and autoimmune diseases. One study even identified two genes in which particular variants can slow the onset of AIDS, demonstrating the potential of this approach for understanding why people vary in their susceptibility to infectious diseases.

Genomes can differ in many other ways. Bits of DNA ranging from a few to many thousands, even millions, of bases can get lost, added, or turned around in an individual's genome. Such revisions can change the number of copies of a gene or piece of regulatory DNA or jam two genes together, changing the genes' products or shutting them down. This year marked a tipping point, as researchers became aware that these changes, which can alter a genome in just a few generations, affect more bases than SNPs....In one study, geneticists discovered 3600 so-called copy number variants among 95 individuals studied. Quite a few overlapped genes, including some implicated in our individuality--blood type, smell, hearing, taste, and metabolism, for example. Individual genomes differed in size by as many as 9 million bases.


Wednesday, December 26, 2007

Learning from errors - genetic differences between humans

From Holden's brief summary of the work:
"Once burned, twice shy" works for most people. But some people are slow to learn from bad experiences.
This work shows that:
...people with a particular gene variant have more difficulty learning via negative reinforcement.
...demonstrates that a single-base-pair difference in the genome is associated with a remarkably different ability to learn from past mistakes, which is quite an accomplishment.
...combines brain imaging with a task in which participants chose between symbols on a computer screen.
...centers on the A1 variant, or allele, of the gene encoding the D2 receptor, a protein on the surface of brain cells activated by the neurotransmitter dopamine. Earlier studies have hinted that this variant alters the brain's reward pathways and thereby makes people more vulnerable to addictions.
Brain activity was monitored (color) as a subject chose between two symbols (inset) and was rewarded with a smiley or frowny face. In the left panel, the lower colored regions are the hippocampus and the upper one is the posterior medial frontal cortex.

Here is the abstract from Klein et al.
The role of dopamine in monitoring negative action outcomes and feedback-based learning was tested in a neuroimaging study in humans grouped according to the dopamine D2 receptor gene polymorphism DRD2-TAQ-IA. In a probabilistic learning task, A1-allele carriers with reduced dopamine D2 receptor densities learned to avoid actions with negative consequences less efficiently. Their posterior medial frontal cortex (pMFC), involved in feedback monitoring, responded less to negative feedback than others' did. Dynamically changing interactions between pMFC and hippocampus found to underlie feedback-based learning were reduced in A1-allele carriers. This demonstrates that learning from errors requires dopaminergic signaling. Dopamine D2 receptor reduction seems to decrease sensitivity to negative action consequences, which may explain an increased risk of developing addictive behaviors in A1-allele carriers.

Another difference in the brains of musicians...

Being a performing musician myself (cf. the YouTube video below), I'm always fascinated by work of the sort recently done by Chen et al. They show that musicians use the prefrontal cortex to a greater degree than nonmusicians to deconstruct and organize a rhythm's temporal structure. Here is their abstract (I will spare you the MRI images this time), followed by a bit of free music...
Much is known about the motor system and its role in simple movement execution. However, little is understood about the neural systems underlying auditory–motor integration in the context of musical rhythm, or the enhanced ability of musicians to execute precisely timed sequences. Using functional magnetic resonance imaging, we investigated how performance and neural activity were modulated as musicians and nonmusicians tapped in synchrony with progressively more complex and less metrically structured auditory rhythms. A functionally connected network was implicated in extracting higher-order features of a rhythm's temporal structure, with the dorsal premotor cortex mediating these auditory–motor interactions. In contrast to past studies, musicians recruited the prefrontal cortex to a greater degree than nonmusicians, whereas secondary motor regions were recruited to the same extent. We argue that the superior ability of musicians to deconstruct and organize a rhythm's temporal structure relates to the greater involvement of the prefrontal cortex mediating working memory.
Haydn Fantasia:

Monday, December 24, 2007

J. S. Bach - Christmas Oratorio - Schlafe, mein Liebster

John Eliot Gardiner leads the Monteverdi Choir and the English Baroque Soloists, with Bernarda Fink in "Schlafe, mein Liebster," from Bach's Christmas Oratorio (BWV 248).

Schlafe, mein Liebster, genieße der Ruh,
Wache nach diesem vor aller Gedeihen!
Labe die Brust,
Empfinde die Lust,
Wo wir unser Herz erfreuen!

Sleep now, my dearest, enjoy now thy rest,
Wake on the morrow to flourish in splendor!
Lighten thy breast,
With joy be thou blest,
Where we hold our heart's great pleasure!

Neural correlates of trust

Krueger et al. offer an MRI study of brain changes that occur during a reciprocal trust game. They:
...used hyperfunctional magnetic resonance imaging, in which two strangers interacted online with one another in a sequential reciprocal trust game while their brains were simultaneously scanned. By designing a nonanonymous, alternating multiround game, trust became bidirectional, and we were able to quantify partnership building and maintenance...We show that the paracingulate cortex is critically involved in building a trust relationship by inferring another person's intentions to predict subsequent behavior. This more recently evolved brain region can be differently engaged to interact with more primitive neural systems in maintaining conditional and unconditional trust in a partnership. Conditional trust selectively activated the ventral tegmental area, a region linked to the evaluation of expected and realized reward, whereas unconditional trust selectively activated the septal area, a region linked to social attachment behavior. The interplay of these neural systems supports reciprocal exchange that operates beyond the immediate spheres of kinship, one of the distinguishing features of the human species.

Figure - Brain responses for decisions to trust. (a) Trust building. Decisions to trust contrasted with the control condition activated the PcC (Brodmann's areas, BA 9/32). (b) Trust maintenance. Decisions to trust contrasted with the control condition activated the SA (together with the adjoining hypothalamus)

Laws of Nature as resting on faith...

Dennis Overbye does a brief piece in the Dec. 18 NY Times that derives from the small firestorm of commentary ignited by a previous OpEd piece by Paul Davies, an Arizona State Univ. cosmologist, asserting that science, not unlike religion, rests on faith, not in God but in the idea of an orderly universe. (I almost did a post on that OpEd article, but decided not to.) The not-so-minor difference, of course, is that the "laws" of science simply reflect that the order we perceive in nature has been explored and tested for more than 2,000 years by observation and experimentation. The methods of science are well known. What are the methods of faith? Overbye's article proceeds to describe positions held by a number of prominent philosophers, physicists, and cosmologists on the underlying nature of the universe.

I'm with the late Nobel laureate physicist Richard Feynman, whose famous quote is included in the article - “Philosophy of science is about as useful to scientists as ornithology is to birds.”

Friday, December 21, 2007

Children attributing causality - extension to religious and political imitation.

Blog reader Rick Thomas makes a brief comment on the previous post on children attributing causality (a comment I wish I had made) that is sufficiently pungent to bring into a post where more people will note it:
" Fascinating. I guess the effect will extend to adult religious and political imitation as well."

Even though the experiments of Lyons et al. mentioned in the previous post deal with imitation of mechanical sequences, the same tenacious and irrational attribution of causality might explain why people find it so difficult to overcome habits instilled by their early religious and political environment.

The hidden structure of over-imitation

Human children, unlike chimpanzees, will copy unnecessary or arbitrary parts of an action sequence they observe in adults. Lyons et al. term this process overimitation and suggest, in an open access article with the title of this post, that it reveals a hidden structure behind how children learn to attribute causality. Here is their abstract, and a graphic showing one of the three puzzle boxes used in the experiments:
Young children are surprisingly judicious imitators, but there are also times when their reproduction of others' actions appears strikingly illogical. For example, children who observe an adult inefficiently operating a novel object frequently engage in what we term overimitation, persistently reproducing the adult's unnecessary actions. Although children readily overimitate irrelevant actions that even chimpanzees ignore, this curious effect has previously attracted little interest; it has been assumed that children overimitate not for theoretically significant reasons, but rather as a purely social exercise. In this paper, however, we challenge this view, presenting evidence that overimitation reflects a more fundamental cognitive process. We show that children who observe an adult intentionally manipulating a novel object have a strong tendency to encode all of the adult's actions as causally meaningful, implicitly revising their causal understanding of the object accordingly. This automatic causal encoding process allows children to rapidly calibrate their causal beliefs about even the most opaque physical systems, but it also carries a cost. When some of the adult's purposeful actions are unnecessary—even transparently so—children are highly prone to mis-encoding them as causally significant. The resulting distortions in children's causal beliefs are the true cause of overimitation, a fact that makes the effect remarkably resistant to extinction. Despite countervailing task demands, time pressure, and even direct warnings, children are frequently unable to avoid reproducing the adult's irrelevant actions because they have already incorporated them into their representation of the target object's causal structure.

Vegansexuality

Jeff Stryker gives us more from the fringe (Dec. 9 NY Times Magazine):
Forget homo-, bi- or even metro-: the latest prefix in sexuality is vegan-, as in “vegansexual.” In a study released in May, Annie Potts, a researcher at the University of Canterbury and a director of the New Zealand Centre for Human-Animal Studies, surveyed 157 vegans and vegetarians (120 of them women) on the topic of cruelty-free living. The questions ranged from attitudes about eating meat to keeping pets to wearing possum fur to, yes, “cruelty-free sex” — that is, “rejecting meat eaters as intimate partners.”

Some of the survey respondents volunteered their reluctance to kiss meat eaters. “I couldn’t think of kissing lips that allow dead animal pieces to pass between them,” a 49-year-old vegan woman from Auckland said. For some, the resistance is the squeamishness factor. “Nonvegetarian bodies smell different to me,” a 41-year-old Christchurch vegan woman said. “They are, after all, literally sustained through carcasses — the murdered flesh of others.” For some, it is a question of finding a like-minded life partner. An Auckland ovo-vegetarian had tried a relationship with a carnivore, but reported that despite the sexual attraction, the gulf in “shared values and moral codes” was just too wide.

Potts, who coined the term vegansexuality, says the “negative response of omnivores” to her study has surprised her. Even some fellow animal lovers question the wisdom of vegansexuality. A blog for People for the Ethical Treatment of Animals noted that sleeping with only fellow vegans means forgoing the opportunity to turn carnivores into vegans by the most powerful recruiting tool available — sex.

PETA’s founder and president, Ingrid Newkirk, agrees that vegans smell fresher. (“There’s science to prove it,” she says.) But Newkirk is all about the recruiting, even if it means one convert at a time. “When my staff members come to me and say: ‘Guess what? My boyfriend, now he’s a vegan,’ I say, half-jokingly: ‘Well, it is time to ditch him and get another. You’ve done your work; move on.’ ”

Thursday, December 20, 2007

Selling brain science... Neurorealism

Matthew Hutson makes some good points in his brief comments on all those pretty brain imaging graphics you see in this MindBlog as well as in the daily press:
You’ve seen the headlines: This Is Your Brain on Politics. Or God. Or Super Bowl Ads. And they’re always accompanied by pictures of brains dotted with seemingly significant splotches of color. Now some scientists have seen enough. We’re like moths, they say, lured by the flickering lights of neuroimaging — and uncritically accepting of conclusions drawn from it.

A paper published online in September by the journal Cognition shows that assertions about psychology — even implausible ones like “watching television improved math skills” — seem much more believable to laypeople when accompanied by images from brain scans. And a paper accepted for publication by The Journal of Cognitive Neuroscience demonstrates that adding even an extraneous reference to the brain to a bad explanation of human behavior makes the explanation seem much more satisfying to nonexperts.

Eric Racine, a bioethicist at the Montreal Clinical Research Institute, coined the word neurorealism to describe this form of credulousness. In an article called “fMRI in the Public Eye,” he and two colleagues cited a Boston Globe article about how high-fat foods activate reward centers in the brain. The Globe headline: “Fat Really Does Bring Pleasure.” Couldn’t we have proved that with a slice of pie and a piece of paper with a check box on it?

The way conclusions from cognitive neuroscience studies are reported in the popular press, “they don’t necessarily tell us anything we couldn’t have found out without using a brain scanner,” says Deena Weisberg, an author of the Journal of Cognitive Neuroscience paper. “It just looks more believable now that we have the pretty pictures.”

Racine says he is particularly troubled by the thought of crude or unscrupulous applications of this young science to the diagnosis of psychiatric conditions, the evaluation of educational programs and the assessment of defendants in criminal trials. Drawing inferences from the data requires several degrees of analysis and interpretation, he says, and treating neuroimaging as a mind-reading technique “would be adding extra scientific credibility that is not necessarily warranted.”

Race and IQ - a few crisp facts

The debate over race and IQ seems endless and mind-numbing, usually generating more heat than light. A recent Op-Ed piece by Richard Nisbett, brief and to the point, collects several facts:
About 25 percent of the genes in the American black population are European, meaning that the genes of any individual can range from 100 percent African to mostly European. If European intelligence genes are superior, then blacks who have relatively more European genes ought to have higher I.Q.’s than those who have more African genes. But it turns out that skin color and “negroidness” of features — both measures of the degree of a black person’s European ancestry — are only weakly associated with I.Q. (even though we might well expect a moderately high association due to the social advantages of such features).

During World War II, both black and white American soldiers fathered children with German women. Thus some of these children had 100 percent European heritage and some had substantial African heritage. Tested in later childhood, the German children of the white fathers were found to have an average I.Q. of 97, and those of the black fathers had an average of 96.5, a trivial difference.

If European genes conferred an advantage, we would expect that the smartest blacks would have substantial European heritage. But when a group of investigators sought out the very brightest black children in the Chicago school system and asked them about the race of their parents and grandparents, these children were found to have no greater degree of European ancestry than blacks in the population at large.

...a superior adoption study...looked at black and mixed-race children adopted by middle-class families, either black or white, and found no difference in I.Q. between the black and mixed-race children....children adopted by white families had I.Q.’s 13 points higher than those of children adopted by black families. The environments that even middle-class black children grow up in are not as favorable for the development of I.Q. as those of middle-class whites.

James Flynn, a philosopher and I.Q. researcher in New Zealand, has established that in the Western world as a whole, I.Q. increased markedly from 1947 to 2002. In the United States alone, it went up by 18 points. Our genes could not have changed enough over such a brief period to account for the shift; it must have been the result of powerful social factors. And if such factors could produce changes over time for the population as a whole, they could also produce big differences between subpopulations at any given time.

...interventions at every age from infancy to college can reduce racial gaps in both I.Q. and academic achievement, sometimes by substantial amounts in surprisingly little time. This mutability is further evidence that the I.Q. difference has environmental, not genetic, causes.

Video of independent leg movement controllers

Here, as a companion to my Sept. 20 post "Walking the walk" is a video illustrating the independent controllers of our right and left legs during walking.

Wednesday, December 19, 2007

The God Effect

Here I pass on another bit, by Marina Krakovsky, in the NY Times Magazine's Dec. 9 "Ideas" issue. She summarizes work by Canadian psychologists Shariff and Norenzayan published in Psychological Science:
Some anthropologists argue that the idea of God first arose in larger societies, for the purpose of curbing selfishness and promoting cooperation. Outside a tightly knit group, the reasoning goes, nobody can keep an eye on everyone’s behavior, so these cultures invented a supernatural agent who could. But does thinking of an omniscient God actually promote altruism? The University of British Columbia psychologist Ara Norenzayan wanted to find out.

In a pair of studies published in Psychological Science, Norenzayan and his student Azim F. Shariff had participants play the so-called “dictator game,” a common way of measuring generosity toward strangers. The game is simple: you’re offered 10 $1 coins and told to take as many as you want and leave the rest for the player in the other room (who is, unbeknown to you, a research confederate). The fair split, of course, is 50-50, but most anonymous “dictators” play selfishly, leaving little or nothing for the other player.

In the control group of Norenzayan’s study, the vast majority of participants kept everything or nearly everything — whether or not they said they were religious. “Religious leaders always complain that people don’t internalize religion, and they’re right,” Norenzayan observes.

But is there a way to induce generosity? In the experimental condition, the researchers prompted thoughts of God using a well-established “priming” technique: participants, who again included both theists and atheists, first had to unscramble sentences containing words such as God, divine and sacred. That way, going into the dictator game, players had God on their minds without being consciously aware of it. Sure enough, the “God prime” worked like a charm, leading to fairer splits. Without the God prime, only 12 percent of the participants split the money evenly, but when primed with the religious words, 52 percent did.

When news of these findings made headlines, some atheists were appalled by the implication that altruism depends heavily on religion. Apparently, they hadn’t heard the whole story. In a second study, the researchers had participants unscramble sentences containing words like civic, contract and police — meant to evoke secular moral institutions. This prime also increased generosity. And unlike the religious prime, it did so consistently for both believers and nonbelievers. Until he conducts further research, Norenzayan can only speculate about the significance: “We need that common denominator that works for everyone.”
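The reported contrast (12 percent even splits without the prime vs. 52 percent with it) is large enough to show up even with modest samples. As a rough illustration only — the group sizes of 50 below are hypothetical, not taken from the Shariff and Norenzayan paper — a standard two-proportion z-test on those rates looks like this:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test. Returns the z statistic for
    the difference between x2/n2 and x1/n1."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # pooled standard error
    return (p2 - p1) / se

# Hypothetical groups of 50 each: 6/50 = 12% even splits unprimed,
# 26/50 = 52% even splits with the God prime, as reported.
z = two_proportion_z(x1=6, n1=50, x2=26, n2=50)
print(round(z, 2))  # about 4.29, well past the 1.96 cutoff for p < .05
```

Even at these assumed sample sizes, the difference would be highly significant; the point is just that a jump from 12 to 52 percent is not a subtle effect.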

A Mea Culpa - Pinker and his critics

I think in general that Steven Pinker goes way overboard on the nativist angle, and so recently approvingly passed on this Churchland review in the Nov. 1 issue of Nature critical of Pinker's new book, "The Stuff of Thought." - I hadn't actually read the book. These retorts by Marc Hauser and Pinker himself in the Dec. 6 issue make me realize that I should have. I have zapped my original post, and I'm now going to read the book......(one thing about doing a blog is that you read fewer good long books). I admit to a residual grumpiness about Pinker (a brilliant man) from his visit to Wisconsin a number of years ago as a featured speaker. He was dragged through the usual torture of serial 30 minute interviews with local "prominent persons" (I was the Zoology Chair at that time), and during our conversation I found him to be quite remote. At his talk he read from a typescript - word for word - a lecture that I had already heard twice before.

Tuesday, December 18, 2007

Seasonal Affective Disorder - an evolutionary relic?

Friedman offers a succinct summary of information on seasonal affective disorder (SAD), with some interesting facts.
Epidemiological studies estimate that its prevalence in the adult population ranges from 1.4 percent (Florida) to 9.7 percent (New Hampshire).
In one study, patients with SAD
...had a longer duration of nocturnal melatonin secretion in the winter than in the summer, just as with other mammals with seasonal behavior. Why did the normal patients show no seasonal change in melatonin secretion? One possibility is exposure to industrial light, which can suppress melatonin.
...The effects of light therapy are fast, usually four to seven days, compared with antidepressants, which can take four to six weeks to work.
...People are most responsive to light therapy early in the morning, just when melatonin secretion begins to wane, about eight to nine hours after the nighttime surge begins...How can the average person figure that out without a blood test? By a simple questionnaire that assesses “morningness” or “eveningness” and that strongly correlates with plasma melatonin levels. The nonprofit Center for Environmental Therapeutics has a questionnaire on its Web site (www.cet.org).
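The timing rule above (light is most effective eight to nine hours after the nighttime melatonin surge begins) is simple clock arithmetic. A minimal sketch, with the 9 p.m. onset time chosen purely as a hypothetical example and not as clinical guidance:

```python
from datetime import datetime, timedelta

def light_therapy_window(melatonin_onset, offset_hours=(8, 9)):
    """Estimate the morning light-therapy window as 8-9 hours after
    the evening melatonin surge begins, per the rule in the article."""
    return tuple(melatonin_onset + timedelta(hours=h) for h in offset_hours)

# Hypothetical: melatonin surge begins at 9:00 p.m.
onset = datetime(2007, 12, 18, 21, 0)
start, end = light_therapy_window(onset)
print(start.strftime("%H:%M"), end.strftime("%H:%M"))  # 05:00 06:00
```

In practice the onset time is what the morningness/eveningness questionnaire is standing in for, since few people will have their plasma melatonin measured directly.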

"Mental reserves" as antidote to Alzheimer's disease

A fascinating aspect of various kinds of debilitation (back pain, heart attacks, dementia) is that the degenerative changes in anatomy commonly associated with them (disk and vertebral degeneration, cardiac vessel blockage, brain lesions and beta-amyloid plaques, shown in the figure) are often observed on autopsy in physically and mentally robust people who have shown no symptoms of debilitation. What is different about them? Apparently their bodies were able to do a more effective 'work around', or compensation, for the damage. A relevant article by Jane Brody in the December 11 New York Times deals with evidence that cognitive reserves, the brain’s ability to develop and maintain extra neurons and connections between them, may later in life help compensate for the rise in dementia-related brain pathology that accompanies normal aging. Some edited clips:
Cognitive reserve is greater in people who complete higher levels of education. The more intellectual challenges to the brain early in life, the more neurons and connections the brain is likely to develop and perhaps maintain into later years... brain stimulation does not have to stop with the diploma. Better-educated people may go on to choose more intellectually demanding occupations and pursue brain-stimulating hobbies, resulting in a form of lifelong learning...novelty is crucial to providing stimulation for the aging brain...as with muscles, it’s “use it or lose it.” The brain requires continued stresses to maintain or enhance its strength...In 2001, ... a long-term study of cognitively healthy elderly New Yorkers....found, on average, those who pursued the most leisure activities of an intellectual or social nature had a 38 percent lower risk of developing dementia. The more activities, the lower the risk...the most direct route to a fit mind is through a fit body...physical exercise “improves what scientists call ‘executive function,’ the set of abilities that allows you to select behavior that’s appropriate to the situation, inhibit inappropriate behavior and focus on the job at hand in spite of distractions. Executive function includes basic functions like processing speed, response speed and working memory.”
This point about exercise and executive function was the subject of my Nov. 15 post.

Ambiguity Promotes Liking

For the seventh consecutive December, the New York Times magazine (Dec. 9 issue) has looked back on the passing year through the special lens of 'ideas'. Here is one of their brief essays, and I will pass on a few more in subsequent posts:

Ambiguity Promotes Liking

By MARINA KRAKOVSKY

Is it true that familiarity breeds contempt? A psychology study published this year concludes that the answer is yes. It seems we are inclined to interpret ambiguous information about someone optimistically, assuming we will get along. We are usually let down, however, when we learn more.

A team of researchers, led by Michael I. Norton of Harvard Business School, looked at online daters’ opinions of people they were about to meet for the first time and compared those ratings with another group’s post-date impressions. Before the date, based on what little information the daters saw online, most participants rated their prospective dates between a 6 and a 10 on a 10-point scale, with nobody giving a score below a 3. But post-date scores were lower, on average, and lots of people deemed their date a total dud.

Why? For starters, initial information is open to interpretation. “And people are so motivated to find somebody they like that they read things into the profiles,” Norton says. If a man writes that he likes the outdoors, his would-be mate imagines her perfect skiing companion, but when she learns more, she discovers “the outdoors” refers to nude beaches. And “once you see one dissimilarity, everything you learn afterward gets colored by that,” Norton says.

The letdown from getting more information isn’t true just for romance. In one experiment, the researchers showed college students different numbers of randomly selected traits and asked them to rate how much they’d like the person described. For the most part, the more traits participants saw, the less they said they would like the other person. But another group of students, asked beforehand to predict their reactions, had overwhelmingly said they would like people more after learning more about them.

We make this mistake, the researchers say, largely because we can all recall cases of more knowledge leading to more liking. “You forget the people in your third-grade class you didn’t like; you remember the people you’re still friends with,” Norton explains.