Monday, September 26, 2011

Delayed gratification - 40 years later.

Casey et al. (open access) do a follow-up of the famous "marshmallow experiments" that showed young children who are better at delaying gratification to obtain a greater reward do better later in life. They were able to test nearly 60 individuals from the original study, now in their mid-40s, and in a subset of these were able to demonstrate stable differences between low delayers and high delayers in the frontostriatal circuitries that integrate motivational and control processes.
We examined the neural basis of self-regulation in individuals from a cohort of preschoolers who performed the delay-of-gratification task 4 decades ago. Nearly 60 individuals, now in their mid-forties, were tested on “hot” and “cool” versions of a go/nogo task to assess whether delay of gratification in childhood predicts impulse control abilities and sensitivity to alluring cues (happy faces). Individuals who were less able to delay gratification in preschool and consistently showed low self-control abilities in their twenties and thirties performed more poorly than did high delayers when having to suppress a response to a happy face but not to a neutral or fearful face. This finding suggests that sensitivity to environmental hot cues plays a significant role in individuals’ ability to suppress actions toward such stimuli. A subset of these participants (n = 26) underwent functional imaging for the first time to test for biased recruitment of frontostriatal circuitry when required to suppress responses to alluring cues. Whereas the prefrontal cortex differentiated between nogo and go trials to a greater extent in high delayers, the ventral striatum showed exaggerated recruitment in low delayers. Thus, resistance to temptation as measured originally by the delay-of-gratification task is a relatively stable individual difference that predicts reliable biases in frontostriatal circuitries that integrate motivational and control processes.

Friday, September 23, 2011

Nurture affects gender differences in spatial abilities

Here is a fascinating bit from Hoffman et al.:
Women remain significantly underrepresented in the science, engineering, and technology workforce. Some have argued that spatial ability differences, which represent the most persistent gender differences in the cognitive literature, are partly responsible for this gap. The underlying forces at work shaping the observed spatial ability differences revolve naturally around the relative roles of nature and nurture. Although these forces remain among the most hotly debated in all of the sciences, the evidence for nurture is tenuous, because it is difficult to compare gender differences among biologically similar groups with distinct nurture. In this study, we use a large-scale incentivized experiment with nearly 1,300 participants to show that the gender gap in spatial abilities, measured by time to solve a puzzle, disappears when we move from a patrilineal society to an adjoining matrilineal society. We also show that about one-third of the effect can be explained by differences in education. Given that none of our participants have experience with puzzle solving and that villagers from both societies have the same means of subsistence and shared genetic background, we argue that these results show the role of nurture in the gender gap in cognitive abilities.

Thursday, September 22, 2011

Testosterone modulates brain talk in social emotional behavior.

People with higher testosterone levels show more approach-related behavior during short social exchanges, and recent work has shown that this hormone influences activity of the amygdala (central to emotional behavior) and of the ventrolateral (VLPFC) and orbitofrontal (OFC) regions of our prefrontal cortex. I'm passing on this link to an open access article by Volman et al. that details experiments showing that testosterone modulates the effective connectivity between the amygdala and the VLPFC during approach-avoidance behavior. Their results indicate that endogenous testosterone influences local prefrontal activity and interregional connectivity supporting the control of social emotional behavior.

Wednesday, September 21, 2011

Evolutionary rationale for positive illusions.

Johnson and Fowler offer a fascinating explanation for why 70% of us (and 90% of college professors) feel we are above average in physical skills, intelligence, leadership, importance to our groups, driving skills, healthiness of our behavior, etc. etc. The authors make the striking suggestion that biased self-beliefs can actually lead people to make the right decision, whereas unbiased self-images would lead to a suboptimal decision. In their model, overconfident populations are evolutionarily stable over a wider range of environments than realistic populations, and they suggest this "may help to explain why overconfidence remains prevalent today, even if it contributes to hubris, market bubbles, financial collapses, policy failures, disasters and costly wars." Here is their abstract:
Confidence is an essential ingredient of success in a wide range of domains ranging from job performance and mental health to sports, business and combat. Some authors have suggested that not just confidence but overconfidence—believing you are better than you are in reality—is advantageous because it serves to increase ambition, morale, resolve, persistence or the credibility of bluffing, generating a self-fulfilling prophecy in which exaggerated confidence actually increases the probability of success. However, overconfidence also leads to faulty assessments, unrealistic expectations and hazardous decisions, so it remains a puzzle how such a false belief could evolve or remain stable in a population of competing strategies that include accurate, unbiased beliefs. Here we present an evolutionary model showing that, counterintuitively, overconfidence maximizes individual fitness and populations tend to become overconfident, as long as benefits from contested resources are sufficiently large compared with the cost of competition. In contrast, unbiased strategies are only stable under limited conditions. The fact that overconfident populations are evolutionarily stable in a wide range of environments may help to explain why overconfidence remains prevalent today, even if it contributes to hubris, market bubbles, financial collapses, policy failures, disasters and costly wars.

Tuesday, September 20, 2011

Rewriting self-fulfilling prophecies about social rejection

Stinson et al. provide yet another example of how even a very modest intervention to alter self-image can have long-lasting effects:
Chronically insecure individuals often behave in ways that result in the very social rejection that they most fear. We predicted that this typical self-fulfilling prophecy is not immutable. Self-affirmation may improve insecure individuals’ relational security, and this improvement may allow them to express more welcoming social behavior. In a longitudinal experiment, a 15-min self-affirmation improved both the relational security and experimenter-rated social behavior of insecure participants up to 4 weeks after the initial intervention. Moreover, the extent to which self-affirmation improved insecure participants’ relational security at 4 weeks predicted additional improvements in social behavior another 4 weeks after that. Our finding that insecure participants continued to reap the social benefits of self-affirmation up to 8 weeks after the initial intervention demonstrates that it is indeed possible to rewrite the self-fulfilling prophecy of social rejection.
The experiment used the usual gaggle of psychology undergraduates. After answering a relational security questionnaire,
Participants were assigned to one of two conditions, in both of which they ranked 11 values (e.g., academics) according to personal importance. Participants in the self-affirmation condition were instructed to write several paragraphs describing why their top-ranked value was important to them. They then listed the top two reasons why they picked that value as most important and indicated the extent to which their top-ranked value influenced their lives and was an important part of their self-image. Participants in the control condition were also instructed to write several paragraphs and answer similar questions, except that we asked this group to focus on their ninth-ranked value and why it might be important to someone else.

Monday, September 19, 2011

Musical expertise boosts language perception

Perhaps the thousands of hours I have put in on piano practice have made it easier for me to write? A few clips from the introduction of Francois and Schön:
The fact that musicians perceive some sound features more accurately than nonmusicians do is not so surprising. After all, they spend hours and hours of their life focusing on sounds and the way they are generated, paying particular attention to pitch, timbre, duration, and timing. However, what seems less evident to us is whether or not this intensive musical practice can affect nonmusical abilities. Several recent studies seem to confirm this possibility….In this study, we took the challenge of focusing on a rather high cognitive function: word segmentation, namely, the ability to extract words from continuous speech…Participants listened to an artificial sung language (wherein music and language dimensions are highly intertwined) and were then tested with a 2-alternative forced-choice task on pairs of words and melodies (familiar vs. unfamiliar). The main goal of this study was to test whether musical expertise can facilitate word segmentation. With this aim, we compared 2 groups, one group with formal musical training and one without.
And here is their abstract:
Musical training is known to modify auditory perception and related cortical organization. Here, we show that these modifications may extend to higher cognitive functions and generalize to processing of speech. Previous studies have shown that adults and newborns can segment a continuous stream of linguistic and nonlinguistic stimuli based only on probabilities of occurrence between adjacent syllables or tones. In the present experiment, we used an artificial (sung) language learning design coupled with an electrophysiological approach. While behavioral results were not clear cut in showing an effect of expertise, Event-Related Potentials data showed that musicians learned better than did nonmusicians both musical and linguistic structures of the sung language. We discuss these findings in terms of practice-related changes in auditory processing, stream segmentation, and memory processes.

Friday, September 16, 2011

Neural correlates of pain reduction through meditation

Salomons and Kucyi present a nice review of experiments examining meditation and pain reduction (PDF here).
...A cognitive mechanism that is thought to be unique to mindfulness is the combination of increased attention and reduced negative evaluation...the key to reported analgesic effects of meditation training might be the co-occurring reduction in emotional and evaluative responses. Thus it is noteworthy that [several experiments] found activation patterns in regions associated with downregulation of negative affective responses, and functional decoupling of dorso-lateral prefrontal cortex and cingulate...attributed to dissociation between attention to pain and evaluation of pain. Zeidan and colleagues noted an inverse correlation between OFC activation and unpleasantness ratings, which was attributed to altered processing of reward and hedonic experiences. The degree of concordance between these studies suggests that meditative practices may indeed reduce pain through a unique neural mechanism, one corresponding to increased attention and reduced evaluative/emotional responses.

Thursday, September 15, 2011

Why laughing feels so good...

Robin Dunbar, the evolutionary psychologist at Oxford who has correlated social group size with brain size in evolution, and who has also argued for the importance of grooming as a group bonding mechanism, has come up with a simple and fascinating observation: social laughter increases pain resistance, suggesting that moving the muscles involved in a laugh causes the release of endorphins (pain thresholds are taken as a proxy for endorphin release). Here is the abstract from Dunbar et al.:
Although laughter forms an important part of human non-verbal communication, it has received rather less attention than it deserves in both the experimental and the observational literatures. Relaxed social (Duchenne) laughter is associated with feelings of wellbeing and heightened affect, a proximate explanation for which might be the release of endorphins. We tested this hypothesis in a series of six experimental studies in both the laboratory (watching videos) and naturalistic contexts (watching stage performances), using change in pain threshold as an assay for endorphin release. The results show that pain thresholds are significantly higher after laughter than in the control condition. This pain-tolerance effect is due to laughter itself and not simply due to a change in positive affect. We suggest that laughter, through an endorphin-mediated opiate effect, may play a crucial role in social bonding.
In a review James Gorman quotes Dunbar:
“Laughter is very weird stuff, actually,” Dr. Dunbar said. “That’s why we got interested in it.” And the findings fit well with a growing sense that laughter contributes to group bonding and may have been important in the evolution of highly social humans....Social laughter, Dr. Dunbar suggests, relaxed and contagious, is “grooming at a distance,” an activity that fosters closeness in a group the way one-on-one grooming, patting and delousing promote and maintain bonds between individual primates of all sorts.

Wednesday, September 14, 2011

Priming for self-esteem improves performance.

Yet another interesting collaboration involving Ray Dolan from the Wellcome Trust Centre for Neuroimaging.
Social cues have subtle effects on a person, often without them being aware. One explanation for this influence involves implicit priming of trait associations. To study this effect, we activated implicit associations in participants of ‘being Clever’ or ‘being Stupid’ that were task relevant, and studied its behavioural impact on an independent cognitive task (the n-back task). Activating a representation of ‘Clever’ caused participants to slow their reaction times after errors on the working memory task, while the reverse pattern was seen for associations to ‘Stupid’. Critically, these behavioural effects were absent in control conditions. Using functional magnetic resonance imaging, we show that the neural basis of this effect involves the anterior paracingulate cortex (area 32) where activity tracked the observed behavioural pattern, increasing its activity during error monitoring in the ‘Clever’ condition and decreasing in the ‘Stupid’ condition. The data provide a quantitative demonstration of how implicit cues, which specifically target a person’s self-concept, influences the way we react to our own behaviour and point to the anterior paracingulate cortex as a critical cortical locus for mediating these self-concept related behavioural regulations.
(The methods section describes how a scrambled sentence task served as the priming task.)

Tuesday, September 13, 2011

Brain changes following cognitive behavioral therapy for psychosis.

I've always been impressed by work of the sort done by Schwartz and others that shows cognitive behavioral therapy (CBT), when effective for obsessive-compulsive disorder, causes changes in brain activity similar to those caused by drugs that also alleviate symptoms. (One example of a CBT trick: instruct the patient, when symptoms appear, to think "That's not me, that's a part of my brain that is not working.") The journal BRAIN offers an open access article by Kumari et al. that makes further correlations of CBT with brain activity. They observe that in schizophrenia patients whose normal therapy is supplemented with cognitive behavior therapy there is decreased activation in several areas in response to fearful and angry expressions.
The cognitive behaviour therapy for psychosis...showed decreased activation of the inferior frontal, insula, thalamus, putamen and occipital areas to fearful and angry expressions at treatment follow-up compared with baseline. Reduction of functional magnetic resonance imaging response during angry expressions correlated directly with symptom improvement. This study provides the first evidence that cognitive behaviour therapy for psychosis attenuates brain responses to threatening stimuli and suggests that cognitive behaviour therapy for psychosis may mediate symptom reduction by promoting processing of threats in a less distressing way.

Monday, September 12, 2011

Talk about a nasty trick... plus, beta males win

Robert Sapolsky, whose work on stress I've talked about in a number of posts, is a polymath who maintains a number of quirky interests, one of which is describing a bizarre trick the protozoan Toxoplasma uses to reproduce itself, by infecting the brain of a mouse and altering its limbic system so that the poor mouse is sexually attracted to, rather than repelled by, the smell of cat urine (the protozoan requires the cat to sexually reproduce). The abstract:
Cat odors induce rapid, innate and stereotyped defensive behaviors in rats at first exposure, a presumed response to the evolutionary pressures of predation. Bizarrely, rats infected with the brain parasite Toxoplasma gondii approach the cat odors they typically avoid. Since the protozoan Toxoplasma requires the cat to sexually reproduce, this change in host behavior is thought to be a remarkable example of a parasite manipulating a mammalian host for its own benefit. Toxoplasma does not influence host response to non-feline predator odor nor does it alter behavior on olfactory, social, fear or anxiety tests, arguing for specific manipulation in the processing of cat odor. We report that Toxoplasma infection alters neural activity in limbic brain areas necessary for innate defensive behavior in response to cat odor. Moreover, Toxoplasma increases activity in nearby limbic regions of sexual attraction when the rat is exposed to cat urine, compelling evidence that Toxoplasma overwhelms the innate fear response by causing, in its stead, a type of sexual attraction to the normally aversive cat odor.
And, since I'm mentioning Sapolsky, and haven't gotten around to passing on another interesting bit from him, here is his commentary on work by Gesquiere et al. showing that the beta male in a baboon troop can end up winning. Here is the NYTimes review of the work.

Friday, September 09, 2011

Civil conflict correlates with climate change.

A sobering analysis from Hsiang et al.:
It has been proposed that changes in global climate have been responsible for episodes of widespread violence and even the collapse of civilizations. Yet previous studies have not shown that violence can be attributed to the global climate, only that random weather events might be correlated with conflict in some cases. Here we directly associate planetary-scale climate changes with global patterns of civil conflict by examining the dominant interannual mode of the modern climate, the El Niño/Southern Oscillation (ENSO). Historians have argued that ENSO may have driven global patterns of civil conflict in the distant past, a hypothesis that we extend to the modern era and test quantitatively. Using data from 1950 to 2004, we show that the probability of new civil conflicts arising throughout the tropics doubles during El Niño years relative to La Niña years. This result, which indicates that ENSO may have had a role in 21% of all civil conflicts since 1950, is the first demonstration that the stability of modern societies relates strongly to the global climate.
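To make the headline numbers concrete: the "doubling" claim is, at bottom, a comparison of conflict-onset probabilities conditioned on ENSO state. Here is a minimal sketch of that comparison on made-up data (the ENSO classification, country-year panel, and statistical controls in the actual paper are far more elaborate; the variable names and rates below are my own assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical country-year panel: ENSO state (1 = El Nino, 0 = La Nina)
# and whether a new civil conflict began in that country-year.
n = 5000
el_nino = rng.integers(0, 2, size=n)
base_rate = 0.02                                   # assumed onset rate in La Nina years
onset = rng.random(n) < np.where(el_nino == 1, 2 * base_rate, base_rate)

# Conditional onset probabilities and their ratio (the "doubling" statistic).
p_nino = onset[el_nino == 1].mean()
p_nina = onset[el_nino == 0].mean()
print(f"P(onset | El Nino) = {p_nino:.3f}")
print(f"P(onset | La Nina) = {p_nina:.3f}")
print(f"risk ratio ~ {p_nino / p_nina:.2f}")
```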

Thursday, September 08, 2011

Free Will: Neuroscience vs. Philosophy

Kerri Smith offers (PDF here) an update on the perennial debate between neuroscientists and philosophers over free will. It covers findings I've mentioned in previous posts...Haynes and coworkers' finding that brain activity in motor cortex areas can be observed one to seven seconds before a subject is aware of willing an action to occur, and Fried et al.'s even more compelling observations.
Haynes's 2008 study modernized Libet's earlier experiment: where Libet's EEG technique could look at only a limited area of brain activity, Haynes's fMRI set-up could survey the whole brain; and where Libet's participants decided simply on when to move, Haynes's test forced them to decide between two alternatives. But critics still picked holes, pointing out that Haynes and his team could predict a left or right button press with only 60% accuracy at best. Although better than chance, this isn't enough to claim that you can see the brain making its mind up before conscious awareness, argues Adina Roskies, a neuroscientist and philosopher who works on free will at Dartmouth College in Hanover, New Hampshire. Besides, "all it suggests is that there are some physical factors that influence decision-making", which shouldn't be surprising. Philosophers who know about the science, she adds, don't think this sort of study is good evidence for the absence of free will, because the experiments are caricatures of decision-making. Even the seemingly simple decision of whether to have tea or coffee is more complex than deciding whether to push a button with one hand or the other.

Haynes stands by his interpretation, and has replicated and refined his results in two studies. One uses more accurate scanning techniques to confirm the roles of the brain regions implicated in his previous work. In the other, which is yet to be published, Haynes and his team asked subjects to add or subtract two numbers from a series being presented on a screen. Deciding whether to add or subtract reflects a more complex intention than that of whether to push a button, and Haynes argues that it is a more realistic model for everyday decisions. Even in this more abstract task, the researchers detected activity up to four seconds before the subjects were conscious of deciding, Haynes says.

Some researchers have literally gone deeper into the brain. One of those is Itzhak Fried, a neuroscientist and surgeon at the University of California, Los Angeles, and the Tel Aviv Medical Center in Israel. He studied individuals with electrodes implanted in their brains as part of a surgical procedure to treat epilepsy. Recording from single neurons in this way gives scientists a much more precise picture of brain activity than fMRI or EEG. Fried's experiments showed that there was activity in individual neurons of particular brain areas about a second and a half before the subject made a conscious decision to press a button. With about 700 milliseconds to go, the researchers could predict the timing of that decision with more than 80% accuracy. "At some point, things that are predetermined are admitted into consciousness," says Fried. The conscious will might be added on to a decision at a later stage, he suggests.
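The 60% and 80% figures in these studies are cross-validated decoding accuracies: a classifier is trained on brain activity recorded before the conscious decision and tested on held-out trials, with 50% as chance for a two-choice task. Below is a minimal sketch of that logic on synthetic data; the real studies used fMRI pattern classifiers and single-neuron firing rates, so everything here (the toy "activity" matrix, the nearest-class-mean decoder) is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "pre-decision" activity: 200 trials x 50 features, with a weak
# signal separating left vs. right button presses (label 0 or 1).
n_trials, n_feat = 200, 50
labels = rng.integers(0, 2, size=n_trials)
activity = rng.normal(size=(n_trials, n_feat)) + 0.3 * labels[:, None]

# 10-fold cross-validated nearest-class-mean decoding.
folds = np.array_split(rng.permutation(n_trials), 10)
accuracies = []
for test_idx in folds:
    train = np.ones(n_trials, dtype=bool)
    train[test_idx] = False
    mu0 = activity[train & (labels == 0)].mean(axis=0)   # class means from training trials
    mu1 = activity[train & (labels == 1)].mean(axis=0)
    d0 = np.linalg.norm(activity[test_idx] - mu0, axis=1)
    d1 = np.linalg.norm(activity[test_idx] - mu1, axis=1)
    pred = (d1 < d0).astype(int)                          # pick the closer class mean
    accuracies.append((pred == labels[test_idx]).mean())

print(f"decoding accuracy: {np.mean(accuracies):.2f} (chance = 0.50)")
```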

Wednesday, September 07, 2011

Behavioral and Brain Sciences Freebies...

Yet another post in which I pass on some of the goodies that are constantly flowing through my literature scans, rather than losing them as my list of potential posts grows and they are buried forever. Behavioral and Brain Sciences has opened free access to its most cited papers in 2010:
Darwin's mistake: Explaining the discontinuity between human and nonhuman minds
Derek C. Penn, Keith J. Holyoak, Daniel J. Povinelli

Language as shaped by the brain
Morten H. Christiansen, Nick Chater

Emotional responses to music: The need to consider underlying mechanisms
Patrik N. Juslin, Daniel Västfjäll

Deficits in cognitive control correlate with depression and rumination.

Joormann et al. make some observations on 'sticky thoughts.' I've edited their abstract a bit:
Cognitive inflexibility may play an important role in rumination, a risk factor for the onset and maintenance of depressive episodes. In the study reported here, we assessed participants’ ability to either reverse or maintain in working memory the order of three emotional (positive or negative) or three neutral words. Differences (or sorting costs) between response latencies in backward trials, on which participants were asked to reverse the order of the words, and forward trials, on which participants were asked to remember the words in the order in which they were presented, were calculated. [A recognition probe was used to index sorting costs (i.e., differences between response latencies on the forward and the backward trials). The probe word, consisting of one of the three words, was presented until the subject responded. Participants were instructed to press a key (“1,” “2,” or “3”) to indicate as quickly and as accurately as possible whether the probe was the first, second, or third word (counting forward or backward, as appropriate) in the set they had been instructed to remember.] Compared with control participants, depressed participants had higher sorting costs, particularly when presented with negative words. It is important to note that rumination predicted sorting costs for negative words but not for positive or neutral words in the depressed group. These findings indicate that depression and rumination are associated with deficits in cognitive control.
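As a concrete illustration of the sorting-cost measure: for each participant and word valence it is simply the mean backward-trial response latency minus the mean forward-trial latency. A minimal sketch (the toy latencies and variable names are mine, not the paper's):

```python
import numpy as np

# Toy response latencies (ms) for one participant, keyed by (trial type, word valence);
# real values would come from the recognition-probe task described above.
latencies = {
    ("forward", "negative"):  [820, 790, 845, 805],
    ("backward", "negative"): [1130, 1195, 1160, 1210],
    ("forward", "neutral"):   [800, 770, 815, 790],
    ("backward", "neutral"):  [1010, 1045, 990, 1030],
}

def sorting_cost(lat, valence):
    """Mean backward RT minus mean forward RT for one word valence."""
    return np.mean(lat[("backward", valence)]) - np.mean(lat[("forward", valence)])

for valence in ("negative", "neutral"):
    print(f"{valence:8s} sorting cost: {sorting_cost(latencies, valence):.0f} ms")
```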

Tuesday, September 06, 2011

Our economic history

I want to point you to this article by Robert Reich in the Sunday New York Times, which is one of the best descriptions of our current economic mess that I have seen. Here is just a fragment of a great graphic summary the article provides (click to enlarge):

Neuroeconomics and the current financial crisis.

Cell Press does a really remarkable job of assembling and pointing out important articles in seminal areas of research. In this post I want to point you to their Neuroscience Newsletter (which anyone can subscribe to), whose current issue emphasizes neuroeconomics. It has open access links to important articles.

Our brains beat with the music...

From Nozaradan et al.:
Feeling the beat and meter is fundamental to the experience of music. However, how these periodicities are represented in the brain remains largely unknown. Here, we test whether this function emerges from the entrainment of neurons resonating to the beat and meter. We recorded the electroencephalogram while participants listened to a musical beat and imagined a binary or a ternary meter on this beat (i.e., a march or a waltz). We found that the beat elicits a sustained periodic EEG response tuned to the beat frequency. Most importantly, we found that meter imagery elicits an additional frequency tuned to the corresponding metric interpretation of this beat. These results provide compelling evidence that neural entrainment to beat and meter can be captured directly in the electroencephalogram. More generally, our results suggest that music constitutes a unique context to explore entrainment phenomena in dynamic cognitive processing at the level of neural networks.
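The core analysis behind this result is frequency tagging: compute the EEG amplitude spectrum and look for peaks at the beat frequency and, during meter imagery, at the binary or ternary subharmonic of the beat. Here is a minimal sketch on a simulated signal (real steady-state analyses average many epochs and correct for noise in neighboring frequency bins; the frequencies and amplitudes below are assumptions, not the study's values):

```python
import numpy as np

rng = np.random.default_rng(2)

fs = 250.0                      # sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)    # 60 s of simulated EEG
beat_f = 2.4                    # beat frequency (~144 beats per minute)
meter_f = beat_f / 3            # imagined ternary meter (waltz) subharmonic

# Simulated EEG: broadband noise plus small oscillations at beat and meter rates.
eeg = rng.normal(scale=1.0, size=t.size)
eeg += 0.4 * np.sin(2 * np.pi * beat_f * t) + 0.2 * np.sin(2 * np.pi * meter_f * t)

# Amplitude spectrum; check the bins closest to the frequencies of interest.
spectrum = np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
for name, f in [("beat", beat_f), ("meter", meter_f)]:
    idx = np.argmin(np.abs(freqs - f))
    print(f"{name} ({f:.2f} Hz): amplitude {spectrum[idx]:.3f}")
```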

Monday, September 05, 2011

The genetics of cognition

Trends in Cognitive Science has published a special issue on the genetics of cognition that is open access through September.
Twin, family and adoption studies have demonstrated that there is a substantial heritable component to all cognitive functions. The articles in this special issue summarize what is currently known about the genetic underpinnings of these functions and their disorders. At the same time, they highlight just how much there is yet to be discovered in this rapidly advancing field.
Robbins and Kousta provide an overview for the issue.
Review topics include:
-Genetics and criminal responsibility
-Genetics of human episodic memory: dealing with complexity
-The genetics of cognitive ability and cognitive ageing in healthy older people
-Dissecting the genetic architecture of human personality
-Genetics of emotion
-Genetics of autism spectrum disorders
-Understanding risk for psychopathology through imaging gene–environment interactions
-The genetics of cognitive impairment in schizophrenia: a phenomic perspective
-The contribution of imaging genetics to the development of predictive markers for addictions

Women on the make are better at spotting gay men

These observations by Rule et al. sort of make sense: if you're a woman looking for a potential father of your children, you don't want to waste time dating gay men....
People can accurately infer others’ traits and group memberships across several domains. We examined heterosexual women’s accuracy in judging male sexual orientation across the fertility cycle (Study 1) and found that women’s accuracy was significantly greater the nearer they were to peak ovulation. In contrast, women’s accuracy was not related to their fertility when they judged the sexual orientations of other women (Study 2). Increased sexual interest brought about by the increased likelihood of conception near ovulation may therefore influence women’s sensitivity to male sexual orientation. To test this hypothesis, we manipulated women’s interest in mating using an unobtrusive priming task (Study 3). Women primed with romantic thoughts showed significantly greater accuracy in their categorizations of male sexual orientation (but not female sexual orientation) compared with women who were not primed. The accuracy of judgments of male sexual orientation therefore appears to be influenced by both natural variations in female perceivers’ fertility and experimentally manipulated cognitive frames.

Friday, September 02, 2011

Ironic effects of dietary supplements

Chiou et al. suggest that illusory invulnerability created by taking dietary supplements licenses health-risk behaviors. Their abstract (slightly edited):
The use of dietary supplements and the health status of individuals have an asymmetrical relationship: The growing market for dietary supplements appears not to be associated with an improvement in public health. Building on the notion of licensing, or the tendency for positive choices to license subsequent self-indulgent choices, we argue that because dietary supplements are perceived as conferring health advantages, use of such supplements may create an illusory sense of invulnerability that disinhibits unhealthy behaviors. In two experiments, participants who took placebo pills that they believed were dietary supplements, compared with participants who were told the pills were a placebo, exhibited the licensing effect across multiple forms of health-related behavior: In a first experiment they expressed less desire to engage in exercise and more desire to engage in hedonic activities, and expressed greater preference for a buffet over an organic meal. In a second experiment they walked less to benefit their health. A mediational analysis indicated that perceived invulnerability was an underlying mechanism for these effects. Thus, a license associated with the use of dietary supplements may operate within cycles of behaviors that alternately protect and endanger health.

Thursday, September 01, 2011

Microbes run the world

In my continuing scan of edge.org's annual question, I come across this essay by Stewart Brand, which continues the thread started last week on how 'we' (humans) are mostly 'they' (microbes). Some clips:
Microbes make up 80 percent of all biomass, says Carl Woese. In one fifth of a teaspoon of seawater there's a million bacteria (and 10 million viruses), Craig Venter says, adding, "If you don't like bacteria, you're on the wrong planet. This is the planet of the bacteria." That means most of the planet's living metabolism is microbial. When James Lovelock was trying to figure out where the gases come from that make the Earth's atmosphere such an artifact of life (the Gaia Hypothesis), it was microbiologist Lynn Margulis who had the answer for him. Microbes run our atmosphere. They also run much of our body. The human microbiome in our gut, mouth, skin, and elsewhere, harbors 3,000 kinds of bacteria with 3 million distinct genes. (Our own cells struggle by on only 18,000 genes or so.) New research is showing that our microbes-on-board drive our immune systems and important portions of our digestion.

Microbial evolution, which has been going on for over 3.6 billion years, is profoundly different from what we think of as standard Darwinian evolution, where genes have to pass down generations to work slowly through the selection filter. Bacteria swap genes promiscuously within generations. They have three different mechanisms for this "horizontal gene transfer" among wildly different kinds of bacteria, and thus they evolve constantly and rapidly. Since they pass on the opportunistically acquired genes to their offspring, what they do on an hourly basis looks suspiciously Lamarckian — the inheritance of acquired characteristics.

Such routinely transgenic microbes show that there's nothing new, special, or dangerous about engineered GM crops. Field biologists are realizing that the biosphere is looking like what some are calling a pangenome, an interconnected network of continuously circulated genes that is a superset of all the genes in all the strains that form a species. Bioengineers in the new field of synthetic biology are working directly with the conveniently fungible genes of microbes.

This biotech century will be microbe enhanced and maybe microbe inspired. "Social Darwinism" turned out to be a bankrupt idea. The term "cultural evolution" never meant much, because the fluidity of memes and influences in society bears no relation to the turgid conservatism of standard Darwinian evolution. But "social microbialism" might mean something as we continue to explore the fluidity of traits and vast ingenuity of mechanisms among microbes — quorum sensing, biofilms, metabolic bucket brigades, "lifestyle genes," and the like.

Wednesday, August 31, 2011

G-Male

I just had to pass on this dead-on (and scary) parody of Google that my daughter pointed out to me.

Just looking at the American flag can make you a Republican?

Oh-my-gawd, not only do we now have a potential president Rick Perry who will rule by faith over reason, and doesn't believe in science, evolution, or climate change, we also have a drift of the populace towards the Republican base of his support via subtle nudges of the sort documented by Carter et al. Simple exposure to the American flag leads to a shift towards Republican beliefs:
There is scant evidence that incidental cues in the environment significantly alter people’s political judgments and behavior in a durable way. We report that a brief exposure to the American flag led to a shift toward Republican beliefs, attitudes, and voting behavior among both Republican and Democratic participants, despite their overwhelming belief that exposure to the flag would not influence their behavior. In Experiment 1, which was conducted online during the 2008 U.S. presidential election, a single exposure to an American flag resulted in a significant increase in participants’ Republican voting intentions, voting behavior, political beliefs, and implicit and explicit attitudes, with some effects lasting 8 months after the exposure to the prime. In Experiment 2, we replicated the findings more than a year into the current Democratic presidential term. These results constitute the first evidence that nonconscious priming effects from exposure to a national flag can bias the citizenry toward one political party and can have considerable durability.

Tuesday, August 30, 2011

Fat mice live longer with resveratrol.

Nicholas Wade points to a study by de Cabo, Sinclair, and colleagues on the effects of a drug, SRT-1720, which mimics the life-extending effects of a low-calorie diet but acts at much lower (less toxic) concentrations than resveratrol. Benefits of the drug are much easier to demonstrate in mice under physiological stress like obesity than in normal mice. The study found that obese animals taking the drug lived 44 percent longer, on average, than control obese animals. From Wade's comments:
The sirtuins help bring about the 30 percent extension of life span enjoyed by mice and rats that are kept on very low-calorie diets. Since few people can keep to such an unappetizing diet, researchers hoped that doses of resveratrol might secure a painless path to significantly greater health and longevity...But large doses of resveratrol are required to show any effect, so chemical mimics like SRT-1720 were developed to activate sirtuin at much lower doses...Sirtuins have proved to be highly interesting proteins, but the goal of extending life span was set back last year when extensive trials of resveratrol showed it did not prolong mice’s lives, although it seemed to do them no harm. Another blow came in 2009, when biologists at Pfizer reported that SRT-1720 and other resveratrol mimics did not activate sirtuins and did not have any beneficial effects in fat mice...The report by Dr. de Cabo and his colleagues may do much to rescue SRT-1720 from this shadow. They found that SRT-1720 offered substantial benefits to the fat mice, with no signs of toxicity. Unlike the Pfizer study, which was short term, they followed large groups of mice for over three years.
Here is the article abstract:
Sirt1 is an NAD+-dependent deacetylase that extends lifespan in lower organisms and improves metabolism and delays the onset of age-related diseases in mammals. Here we show that SRT1720, a synthetic compound that was identified for its ability to activate Sirt1 in vitro, extends both mean and maximum lifespan of adult mice fed a high-fat diet. This lifespan extension is accompanied by health benefits including reduced liver steatosis, increased insulin sensitivity, enhanced locomotor activity and normalization of gene expression profiles and markers of inflammation and apoptosis, all in the absence of any observable toxicity. Using a conditional SIRT1 knockout mouse and specific gene knockdowns we show SRT1720 affects mitochondrial respiration in a Sirt1- and PGC-1α-dependent manner. These findings indicate that SRT1720 has long-term benefits and demonstrate for the first time the feasibility of designing novel molecules that are safe and effective in promoting longevity and preventing multiple age-related diseases in mammals.

Monday, August 29, 2011

Estimates of social influence - the "unfriending problem"

In the latest issue (Aug 26) of Science Magazine, Barbara Jasny does a nice summary of recent work by Noel and Nyhan:
Studies of social influences on behavior have led to the idea that a range of characteristics from loneliness to obesity might be contagious. A significant problem for the field has been to distinguish effects due to similarities between people (homophily) from social influence. One strategy for doing this has been to look at changes that occur over time. However, such studies have been the subject of considerable debate, and Noel and Nyhan now add a cautionary note. Their analyses of a model used in past social contagion studies suggest that previous investigations have not fully controlled for the possibility that friendship formation and termination are dynamic processes, and friendships between people who are more similar may tend to be more stable over time. Or to put it in Facebook terms, friendships that are between people who are less similar may be less stable, and therefore may result in “unfriending.” Homophily might thus be having a larger effect than appreciated, and under certain conditions could account for most of the contagion effects observed. They conclude that this unfriending problem renders a determination of causality much more complicated in longitudinal social network data.
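A hedged sketch of the confound Noel and Nyhan describe: give everyone a fixed trait (so there is no influence at all), form ties at random, and then preferentially dissolve ties between dissimilar people. Similarity among the surviving friendships rises over time, which a naive longitudinal analysis could misread as contagion. (The model and parameters below are purely illustrative, not their actual specification.)

```python
import numpy as np

rng = np.random.default_rng(3)

# A fixed binary trait (e.g., obese or not) -- no one's trait ever changes.
n = 400
trait = rng.integers(0, 2, size=n)

# Random initial friendships as (i, j) pairs, with no homophily at formation.
ties = [(i, j) for i in range(n) for j in range(i + 1, n) if rng.random() < 0.02]

def friend_agreement(ties, trait):
    """Fraction of ties whose two endpoints share the trait value."""
    return np.mean([trait[i] == trait[j] for i, j in ties])

# Each "year", dissimilar ties are much more likely to dissolve (unfriending).
for year in range(5):
    print(f"year {year}: ties = {len(ties):4d}, "
          f"friend agreement = {friend_agreement(ties, trait):.2f}")
    ties = [(i, j) for i, j in ties
            if rng.random() > (0.30 if trait[i] != trait[j] else 0.05)]
```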

Friday, August 26, 2011

Synthesis of new brain cells and social dysfunction.

In rodent models of depression, antidepressant drugs are effective only if the hippocampus is able to generate new nerve cells (neurogenesis), suggesting an association between adult hippocampal neurogenesis and depression. Snyder et al. have done the direct experiment of using a genetic trick to make mouse hippocampal cells sensitive to the antiviral drug valganciclovir, which inhibits cell proliferation. Valganciclovir treatment of the genetically altered mice almost completely abolished hippocampal neurogenesis. Their results support a direct role for adult neurogenesis in depressive illness. Here is their abstract:
Glucocorticoids are released in response to stressful experiences and serve many beneficial homeostatic functions. However, dysregulation of glucocorticoids is associated with cognitive impairments and depressive illness. In the hippocampus, a brain region densely populated with receptors for stress hormones, stress and glucocorticoids strongly inhibit adult neurogenesis. Decreased neurogenesis has been implicated in the pathogenesis of anxiety and depression, but direct evidence for this role is lacking. Here we show that adult-born hippocampal neurons are required for normal expression of the endocrine and behavioural components of the stress response. Using either transgenic or radiation methods to inhibit adult neurogenesis specifically, we find that glucocorticoid levels are slower to recover after moderate stress and are less suppressed by dexamethasone in neurogenesis-deficient mice than intact mice, consistent with a role for the hippocampus in regulation of the hypothalamic–pituitary–adrenal (HPA) axis. Relative to controls, neurogenesis-deficient mice also showed increased food avoidance in a novel environment after acute stress, increased behavioural despair in the forced swim test, and decreased sucrose preference, a measure of anhedonia. These findings identify a small subset of neurons within the dentate gyrus that are critical for hippocampal negative control of the HPA axis and support a direct role for adult neurogenesis in depressive illness.

Thursday, August 25, 2011

Brain excitation/inhibition balance and social dysfunction

Yates does a review of recent work by Deisseroth and colleagues, who have now shown that in mice, an elevation in the excitation/inhibition ratio in the medial prefrontal cortex (mPFC) impairs cellular information processing and leads to specific behavioral impairments. They genetically inserted different forms of opsin molecules into different excitatory and inhibitory neuronal populations of the mPFC. By flashing the cortex with different wavelengths of light, they could increase levels of either excitation or inhibition. Here is their abstract:
Severe behavioural deficits in psychiatric diseases such as autism and schizophrenia have been hypothesized to arise from elevations in the cellular balance of excitation and inhibition (E/I balance) within neural microcircuitry. This hypothesis could unify diverse streams of pathophysiological and genetic evidence, but has not been susceptible to direct testing. Here we design and use several novel optogenetic tools to causally investigate the cellular E/I balance hypothesis in freely moving mammals, and explore the associated circuit physiology. Elevation, but not reduction, of cellular E/I balance within the mouse medial prefrontal cortex was found to elicit a profound impairment in cellular information processing, associated with specific behavioural impairments and increased high-frequency power in the 30–80 Hz range, which have both been observed in clinical conditions in humans. Consistent with the E/I balance hypothesis, compensatory elevation of inhibitory cell excitability partially rescued social deficits caused by E/I balance elevation. These results provide support for the elevated cellular E/I balance hypothesis of severe neuropsychiatric disease-related symptoms.

Wednesday, August 24, 2011

A unified bottleneck in our brains limits our attention

It has been a common assumption that different tasks requiring our attention, like making perceptual distinctions or making action choices, are limited by the brain areas most associated with those functions. Tombu et al. now find a unified attentional bottleneck, including the inferior frontal junction, superior medial frontal cortex, and bilateral insula.
Human information processing is characterized by bottlenecks that constrain throughput. These bottlenecks limit both what we can perceive and what we can act on in multitask settings. Although perceptual and response limitations are often attributed to independent information processing bottlenecks, it has recently been suggested that a common attentional limitation may be responsible for both. To date, however, evidence supporting the existence of such a “unified” bottleneck has been mixed. Here, we tested the unified bottleneck hypothesis using time-resolved fMRI. The first experiment isolated brain regions involved in the response selection bottleneck that limits speeded dual-task performance. These same brain regions were not only engaged by a perceptual encoding task in a second experiment, their activity also tracked delays to a speeded decision-making task caused by concurrent perceptual encoding in a third experiment. We conclude that a unified attentional bottleneck, including the inferior frontal junction, superior medial frontal cortex, and bilateral insula, temporally limits operations as diverse as perceptual encoding and decision-making.

Tuesday, August 23, 2011

New views on cancer - 99% of the functioning genes in our bodies are not ‘ours’.

They are the genes of bacteria and fungi that have evolved with us in a symbiotic relationship. This fascinating factoid is from an article by George Johnson describing fundamental changes in the way researchers are viewing the cancer process, as the reigning model - that “Through a series of random mutations, genes that encourage cellular division are pushed into overdrive, while genes that normally send growth-restraining signals are taken offline” - is supplemented by a number of subtle variations:
..genes in this microbiome — [of bacteria and fungi] exchanging messages with genes inside human cells — may be involved with cancers of the colon, stomach, esophagus and other organs...The idea that people in different regions of the world have co-evolved with different microbial ecosystems may be a factor — along with diet, lifestyle and other environmental agents — in explaining why they are often subject to different cancers.

...Most DNA…was long considered junk … Only about 2 percent of the human genome carries the code for making enzymes and other proteins…These days “junk” DNA is referred to more respectfully as “noncoding” DNA, and researchers are finding clues that “pseudogenes” lurking within this dark region may play a role in cancer.

...With so much internal machinery, malignant tumors are now being compared to renegade organs sprouting inside the body…[they] contain healthy cells that have been conscripted into the cause. Cells called fibroblasts collaborate by secreting proteins the tumor needs to build its supportive scaffolding and expand into surrounding tissues. Immune system cells, maneuvered into behaving as if they were healing a wound, emit growth factors that embolden the tumor and stimulate angiogenesis, the generation of new blood vessels. Endothelial cells, which form the lining of the circulatory system, are also enlisted in the construction of the tumor’s own blood supply.
The article lists a number of further ideas, involving various classes of small or micro RNAs, here's a great sentence:
...other exotic players: lincRNA, (for large intervening noncoding), siRNA (small interfering), snoRNA (small nucleolar) and piRNA (Piwi-interacting (short for “P-element induced wimpy testis” (a peculiar term that threatens to pull this sentence into a regress of nested parenthetical explanations))).

Monday, August 22, 2011

Trying to live forever - Centenarians have plenty of bad habits

Here are two recent bits on aging:
O'Connor points to a study that
...focused on Ashkenazi Jews, a group that is more genetically homogenous than other populations, making it easier to identify genetic differences that contribute to life span. In the study, the researchers followed 477 Ashkenazi centenarians who were 95 or older and living independently. They asked them about their habits and the ways they lived when they were younger. Using data collected in the 1970s, the researchers compared the long-lived group with another group of 3,000 people in the general population who were born around the same time but who generally did not make it to age 95...They found that the people who lived to 95 and beyond did not seem to exhibit healthier lifestyles than those who died younger.
The article continues to discuss social, personality, and genetic factors influencing longevity. The take home message is that people with the genes for longevity live past age 95 with habits no different from most others, but the average person would probably have to follow a healthy lifestyle to live comfortably past 80.

And, here is a bit of sanity, from Gary Gutting, on trying to live forever. He emphasizes that correlations do not prove causes (lower HDL levels correlate with more heart attacks, but clinical studies show that raising HDL (good) cholesterol with drugs does nothing to protect against heart attacks). He argues against chasing after the latest dietary supplement whose relevance is implied from correlation studies ('It can't hurt, it might help'... of which I'm guilty), and for simply following the humdrum standard advice we've heard all our lives about eating sensibly, exercising regularly, and having recommended medical tests and exams. Apart from that, "how we die is a crap-shoot, and, short of avoiding obvious risks such as smoking and poor diet, there's little we can do to load the dice."

Friday, August 19, 2011

Neoteny - how long does our prefrontal cortex stay young?

When I first looked at the title "Extraordinary neoteny of synaptic spines in the human prefrontal cortex", I excitedly thought "Great, I'm going to learn that my 69-year-old prefrontal cortex is still crafting and pruning synapses." Alas, by extraordinary, the authors mean that they have determined that the 2- to 3-fold decrease in the density of dendritic spines, previously thought to be largely complete by the end of adolescence, continues well into the third decade of life before stabilizing at the adult level.
The major mechanism for generating diversity of neuronal connections beyond their genetic determination is the activity-dependent stabilization and selective elimination of the initially overproduced synapses [Changeux JP, Danchin A (1976) Nature 264:705–712]. The largest number of supranumerary synapses has been recorded in the cerebral cortex of human and nonhuman primates. It is generally accepted that synaptic pruning in the cerebral cortex, including prefrontal areas, occurs at puberty and is completed during early adolescence [Huttenlocher PR, et al. (1979) Brain Res 163:195–205]. In the present study we analyzed synaptic spine density on the dendrites of layer IIIC cortico–cortical and layer V cortico–subcortical projecting pyramidal neurons in a large sample of human prefrontal cortices in subjects ranging in age from newborn to 91 y. We confirm that dendritic spine density in childhood exceeds adult values by two- to threefold and begins to decrease during puberty. However, we also obtained evidence that overproduction and developmental remodeling, including substantial elimination of synaptic spines, continues beyond adolescence and throughout the third decade of life before stabilizing at the adult level. Such an extraordinarily long phase of developmental reorganization of cortical neuronal circuitry has implications for understanding the effect of environmental impact on the development of human cognitive and emotional capacities as well as the late onset of human-specific neuropsychiatric disorders.

Thursday, August 18, 2011

The dark side of emotion in decision making.

A mindblog reader emailed me pointing out this (before MindBlog started up) 2005 publication by Bechara and collaborators on the role of emotion in making decisions in risky situations (rather relevant to our current financial crisis, with investors rushing like lemmings to emotionally drive the market in huge up or down swings). Disabling normal emotional reactivity by either brain lesions or substance abuse leads people to make more advantageous decisions in risky situations. (In a 2009 post I noted Bechara's more recent work on reward processing in different parts of the brain.)
Can dysfunction in neural systems subserving emotion lead, under certain circumstances, to more advantageous decisions? To answer this question, we investigated how individuals with substance dependence (ISD), patients with stable focal lesions in brain regions related to emotion (lesion patients), and normal participants (normal controls) made 20 rounds of investment decisions. Like lesion patients, ISD made more advantageous decisions and ultimately earned more money from their investments than the normal controls. When normal controls either won or lost money on an investment round, they adopted a conservative strategy and became more reluctant to invest on the subsequent round, suggesting that they were more affected than lesion patients and ISD by the outcomes of decisions made in the previous rounds.

Wednesday, August 17, 2011

Why worry? It's good for you.

I've been meaning to point out an interesting piece by Robert Frank in the business section of the NYTimes, on a subject mindblog has touched on in several posts. It's a bit of a gloss, but I pull out a few clips:
…people are particularly inept at predicting how changes in their life circumstances will affect their happiness. Even when the changes are huge — positive or negative — most people adapt much more quickly and completely than they expected…Paradoxically, our prediction errors often lead us to choices that are wisest in hindsight. In such cases, evolutionary biology often provides a clearer guide than cognitive psychology for thinking about why people behave as they do…the brain has evolved not to make us happy, but to motivate actions that help push our DNA into the next round. Much of the time, in fact, the brain accomplishes that by making us unhappy. Anxiety, hunger, fatigue, loneliness, thirst, anger and fear spur action to meet the competitive challenges we face…pleasure is an inherently fleeting emotion, one we experience while escaping from emotionally aversive states. In other words, pleasure is the carrot that provokes us to extricate ourselves from such states, but it almost always fades quickly…The human brain was formed by relentless competition in the natural world, so it should be no surprise that we adapt quickly to changes in circumstances.

Most people would love to have a job with interesting, capable colleagues, a high level of autonomy and ample opportunities for creative expression. But only a limited number of such jobs are available — and it’s our fretting that can motivate us to get them....Within limits, worry about success causes students to study harder to gain admission to better universities. It makes assistant professors work harder to earn tenure. It leads film makers to strive harder to create the perfect scene, and songwriters to dig deeper for the most pleasing melody. In every domain, people who work harder are more likely to succeed professionally, more likely to make a difference...The anxiety we feel about whether we’ll succeed is evolution’s way of motivating us.

Tuesday, August 16, 2011

Class warfare and voting

I thought this cartoon was a nice job, and for days have been debating passing it on in a post... so, here it is (click to enlarge).

And, while I'm at it, I'll also pass on a George Carlin video a friend sent me that is hysterical, but has (be warned) VERY offensive language.

Information and ideas are not the same thing!

Neal Gabler does a terrific opinion piece in this past Sunday's NYTimes on how our culture increasingly follows present-centered and transient flashes of information at the expense of integrative ideas and metaphors. It hit me between the eyes, resonating with my own frustration over feeling that I am constantly awash in streams of information chunks that do not cohere - are not integrated into perceived patterns and overarching ideas. It was a reaffirmation of my recent decision to test the effect of going cold turkey for a while - to shut off my daily cruising of the Huffington Post and several other aggregators and news feeds. To stop watching Jon Stewart's Daily Show, the Colbert Report, and the evening news. Already I can feel a detoxification process settling in, a slightly calmer mind. Gabler starts by noting that The Atlantic's “14 Biggest Ideas of the Year” are not in fact ideas; they are observations (sample: “Wall Street: Same as it Ever Was”). Here are some clips from Gabler's article:
Ideas just aren’t what they used to be. Once upon a time, they could ignite fires of debate, stimulate other thoughts, incite revolutions and fundamentally change the ways we look at and think about the world…They could penetrate the general culture and make celebrities out of thinkers — notably Albert Einstein, but also Reinhold Niebuhr, Daniel Bell, Betty Friedan, Carl Sagan and Stephen Jay Gould, to name a few. The ideas themselves could even be made famous: for instance, for “the end of ideology,” “the medium is the message,” “the feminine mystique,” “the Big Bang theory,” “the end of history.”…we are living in an increasingly post-idea world — a world in which big, thought-provoking ideas that can’t instantly be monetized are of so little intrinsic value that fewer people are generating them and fewer outlets are disseminating them, the Internet notwithstanding.

…especially here in America...we live in a post-Enlightenment age in which rationality, science, evidence, logical argument and debate have lost the battle in many sectors, and perhaps even in society generally, to superstition, faith, opinion and orthodoxy. While we continue to make giant technological advances, we may be the first generation to have turned back the epochal clock — to have gone backward intellectually from advanced modes of thinking into old modes of belief. But post-Enlightenment and post-idea, while related, are not exactly the same...Post-Enlightenment refers to a style of thinking that no longer deploys the techniques of rational thought. Post-idea refers to thinking that is no longer done, regardless of the style.

We live in the much vaunted Age of Information. Courtesy of the Internet, we seem to have immediate access to anything that anyone could ever want to know…In the past, we collected information not simply to know things….We also collected information to convert it into something larger than facts and ultimately more useful — into ideas that made sense of the information..But if information was once grist for ideas, over the last decade it has become competition for them. We are like the farmer who has too much wheat to make flour. We are inundated with so much information that we wouldn’t have time to process it even if we wanted to, and most of us don’t want to…We prefer knowing to thinking because knowing has more immediate value. It keeps us in the loop, keeps us connected to our friends and our cohort. Ideas are too airy, too impractical, too much work for too little reward. Few talk ideas. Everyone talks information, usually personal information. Where are you going? What are you doing? Whom are you seeing? These are today’s big questions.

…social networking sites are the primary form of communication among young people, and they are supplanting print, which is where ideas have typically gestated. …social networking sites engender habits of mind that are inimical to the kind of deliberate discourse that gives rise to ideas. Instead of theories, hypotheses and grand arguments, we get instant 140-character tweets about eating a sandwich or watching a TV show.

…We have become information narcissists, so uninterested in anything outside ourselves and our friendship circles or in any tidbit we cannot share with those friends that if a Marx or a Nietzsche were suddenly to appear, blasting his ideas, no one would pay the slightest attention, certainly not the general media, which have learned to service our narcissism.

Monday, August 15, 2011

How Google affects our memory.

Daniel Wegner can be counted on to always be coming up with interesting stuff.  Here he reports a series of experiments showing how Google is taking a load off our explicit memory storage habits (a shift of the sort that occurred in the transition from the oral tradition to writing). As most of us know from daily experience, Google is replacing books and encyclopedias as our main group or transactive memory, and we are becoming better at remembering where information is stored than at remembering the information itself:
The advent of the Internet, with sophisticated algorithmic search engines, has made accessing information as easy as lifting a finger. No longer do we have to make costly efforts to find the things we want. We can “Google” the old classmate, find articles online, or look up the actor who was on the tip of our tongue. The results of four studies suggest that when faced with difficult questions, people are primed to think about computers and that when people expect to have future access to information, they have lower rates of recall of the information itself and enhanced recall instead for where to access it. The Internet has become a primary form of external or transactive memory, where information is stored collectively outside ourselves.

Friday, August 12, 2011

The art of musical notation - scoring outside the lines

Pat Muchmore writes a fascinating piece on musical scores presented in more exotic forms than the standard clef notation that I and other musicians have spent many thousands of hours with. He uses the term ergodic notations,
...which I derive from the game and literary theorist Espen J. Aarseth’s phrase “ergodic literature.” These are writings that require some amount of effort to read beyond simply moving one’s eyes and flipping pages. There are ancient examples, such as Egyptian texts that span several walls across several rooms or, more recently, Islamic calligrams that render Arabic words like Allah and Bismillah in many different directions and scales.
The article contains numerous modern examples of ergodic scores, and also notes:
Ergodic notation is not new. Baude Cordier, a composer of the ars subtilior style, wrote during the first half of the 15th century. He created many graphic scores, one of the most elegant of which is for a piece called “Belle, bonne, sage.”
“Belle, bonne, sage” by Baude Cordier.


It’s a love song, so it’s rendered in the shape of a heart. The performance is essentially unaffected by the shape, but we needn’t condemn it — it’s a beautiful addition to the artwork. Furthermore, not every visual element is purely decorative: the red notes indicate a rhythmic alteration that was otherwise very difficult to notate at the time.
The article provides further examples of ergodic notations from the modern composers George Crumb, Peter Maxwell Davies, and John Cage. It also includes notation and audio files of a composition by the author.

Thursday, August 11, 2011

Cowboys and Pit Crews

Yesterday's posting on Atul Gawande's writing has reminded me of his more recent essay "Cowboys and Pit Crews," on medical practice, which has been languishing in my list of potential posts:
The core structure of medicine—how health care is organized and practiced—emerged in an era when doctors could hold all the key information patients needed in their heads and manage everything required themselves. One needed only an ethic of hard work, a prescription pad, a secretary, and a hospital willing to serve as one’s workshop, loaning a bed and nurses for a patient’s convalescence, maybe an operating room with a few basic tools. We were craftsmen. We could set the fracture, spin the blood, plate the cultures, administer the antiserum. The nature of the knowledge lent itself to prizing autonomy, independence, and self-sufficiency among our highest values, and to designing medicine accordingly. But you can’t hold all the information in your head any longer, and you can’t master all the skills. No one person can work up a patient’s back pain, run the immunoassay, do the physical therapy, protocol the MRI, and direct the treatment of the unexpected cancer found growing in the spine.

Everyone has just a piece of patient care. We’re all specialists now—even primary-care doctors. A structure that prioritizes the independence of all those specialists will have enormous difficulty achieving great care.

We don’t have to look far for evidence. Two million patients pick up infections in American hospitals, most because someone didn’t follow basic antiseptic precautions. Forty per cent of coronary-disease patients and sixty per cent of asthma patients receive incomplete or inappropriate care. And half of major surgical complications are avoidable with existing knowledge. It’s like no one’s in charge—because no one is. The public’s experience is that we have amazing clinicians and technologies but little consistent sense that they come together to provide an actual system of care, from start to finish, for people. We train, hire, and pay doctors to be cowboys. But it’s pit crews people need.

Recently, you might be interested to know, I met an actual cowboy. He described to me how cowboys do their job today, herding thousands of cattle. They have tightly organized teams, with everyone assigned specific positions and communicating with each other constantly. They have protocols and checklists for bad weather, emergencies, the inoculations they must dispense. Even the cowboys, it turns out, function like pit crews now. It may be time for us to join them.

Wednesday, August 10, 2011

Atul Gawande on aging.

I've been assembling a short list of possible essay/lecture topics, and one of the putative titles is "You're gonna die... get over it." It would be in the spirit of a crisp and clear essay in The New Yorker by Atul Gawande titled "The way we age now," which I have posted before and re-post here:

...one of the best articles on aging that I have read, written by Atul Gawande (Asst. Prof. in the Harvard School of Public Health, and staff writer for the New Yorker Magazine). The article appears in the April 30 issue of the New Yorker.

Some clips:

Even though some genes have been shown to influence longevity in worms, fruit flies, and mice...scientists do not believe that our life spans are actually programmed into us... (Deric note: in the post I just prepared for next Tuesday, this point is contested). After all, for most of our hundred-thousand-year existence—all but the past couple of hundred years—the average life span of human beings has been thirty years or less...Today, the average life span in developed countries is almost eighty years. If human life spans depend on our genetics, then medicine has got the upper hand. We are, in a way, freaks living well beyond our appointed time. So when we study aging what we are trying to understand is not so much a natural process as an unnatural one...

...complex systems—power plants, say—have to survive and function despite having thousands of critical components. Engineers therefore design these machines with multiple layers of redundancy: with backup systems, and backup systems for the backup systems. The backups may not be as efficient as the first-line components, but they allow the machine to keep going even as damage accumulates...within the parameters established by our genes, that’s exactly how human beings appear to work. We have an extra kidney, an extra lung, an extra gonad, extra teeth. The DNA in our cells is frequently damaged under routine conditions, but our cells have a number of DNA repair systems. If a key gene is permanently damaged, there are usually extra copies of the gene nearby. And, if the entire cell dies, other cells can fill in.

Nonetheless, as the defects in a complex system increase, the time comes when just one more defect is enough to impair the whole, resulting in the condition known as frailty. It happens to power plants, cars, and large organizations. And it happens to us: eventually, one too many joints are damaged, one too many arteries calcify. There are no more backups. We wear down until we can’t wear down anymore.
Gawande proceeds to a discussion of the social and medical consequences of people over 65 becoming 20% of the population.
Improvements in the treatment and prevention of heart disease, respiratory illness, stroke, cancer, and the like mean that the average sixty-five-year-old can expect to live another nineteen years—almost four years longer than was the case in 1970. (By contrast, from the nineteenth century to 1970, sixty-five-year-olds gained just three years of life expectancy.)

The result has been called the “rectangularization” of survival. Throughout most of human history, a society’s population formed a sort of pyramid: young children represented the largest portion—the base—and each successively older cohort represented a smaller and smaller group. In 1950, children under the age of five were eleven per cent of the U.S. population, adults aged forty-five to forty-nine were six per cent, and those over eighty were one per cent. Today, we have as many fifty-year-olds as five-year-olds. In thirty years, there will be as many people over eighty as there are under five.

Americans haven’t come to grips with the new demography. We cling to the notion of retirement at sixty-five—a reasonable notion when those over sixty-five were a tiny percentage of the population, but completely untenable as they approach twenty per cent. People are putting aside less in savings for old age now than they have in any decade since the Great Depression. More than half of the very old now live without a spouse, and we have fewer children than ever before—yet we give virtually no thought to how we will live out our later years alone.

...medicine has been slow to confront the very changes that it has been responsible for—or to apply the knowledge we already have about how to make old age better. Despite a rapidly growing elderly population, the number of certified geriatricians fell by a third between 1998 and 2004.

Tuesday, August 09, 2011

Do 18-month-old humans have a theory of mind?

Senju et al., following up on an experiment by Meltzoff and Brooks, use a rather clever experimental design to show that 18-month-old children can attribute false beliefs to others, a capacity previously thought to appear only at 3-4 years of age:
In the research reported here, we investigated whether 18-month-olds would use their own past experience of visual access to attribute perception and consequent beliefs to other people. Infants in this study wore either opaque blindfolds (opaque condition) or trick blindfolds that looked opaque but were actually transparent (trick condition). Then both groups of infants observed an actor wearing one of the same blindfolds that they themselves had experienced, while a puppet removed an object from its location. Anticipatory eye movements revealed that infants who had experienced opaque blindfolds expected the actor to behave in accordance with a false belief about the object’s location, but that infants who had experienced trick blindfolds did not exhibit that expectation. Our results suggest that 18-month-olds used self-experience with the blindfolds to assess the actor’s visual access and to update her belief state accordingly. These data constitute compelling evidence that 18-month-olds infer perceptual access and appreciate its causal role in altering the epistemic states of other people.

Monday, August 08, 2011

In a nutshell....

I have to pass on the cover of the current New Yorker Magazine:


Effects of oxytocin in humans - a critical review

Over the past several years MindBlog has posted examples from the outpouring of work on the "trust hormone" oxytocin. Trends in Cognitive Sciences offers open access to this more critical and balanced review by Bartz et al. Their abstract:
Building on animal research, the past decade has witnessed a surge of interest in the effects of oxytocin on social cognition and prosocial behavior in humans. This work has generated considerable excitement about identifying the neurochemical underpinnings of sociality in humans, and discovering compounds to treat social functioning deficits. Inspection of the literature, however, reveals that the effects of oxytocin in the social domain are often weak and/or inconsistent. We propose that this literature can be informed by an interactionist approach in which the effects of oxytocin are constrained by features of situations and/or individuals. We show how this approach can improve understanding of extant research, suggest novel mechanisms through which oxytocin might operate, and refine predictions about oxytocin pharmacotherapy.
By the way, the same issue of Trends in Cognitive Sciences has a brief note by van Honk et al. on testosterone as a social hormone, also noting the complexity of hormone-behavior relationships (PDF here).

Friday, August 05, 2011

Macho mice make manly melodies.

Susan Reardon points to the work of Pasch et al. at the University of Florida in Gainesville, who compared the songs of castrated male mice (singing mice from Costa Rica) with those of males given a male hormone implant. Females were attracted to speakers playing recordings of the songs of the hormonally enhanced males (audio file here, video file in links above).

Thursday, August 04, 2011

Boredom - a Lively History

Peter Toohey's book with the title of this post is reviewed by Anthony Gottlieb in the NYTimes:
In Oscar Wilde’s play “A Woman of No Importance,” Lord Illingworth says of society: “To be in it is merely a bore. But to be out of it simply a tragedy.” To be a bore oneself is the ultimate failing and makes one the target for a quintessentially English put-down. “Even the grave yawns for him,” the actor and theater manager Sir Herbert Beerbohm Tree once said of an earnest writer. ...it was (and still is) regarded in some quarters as stylish and rather aristocratic to suffer from boredom, so the English ought really to thank their bores for providing them with the occasion to display wit and appear grand.

Toohey...suggests that the unpleasant feeling of simple boredom developed as a warning signal to steer us away from social situations that are “confined, predictable, too samey for one’s sanity.” In other words, it is a useful aversion: the discomfort of boredom is a blessing in disguise...a colleague of his once argued that there isn’t really any such thing as boredom, just a blurring together of a constellation of feelings and moods — frustration, surfeit, apathy and the like. Toohey rejects this idea, and perhaps there is indeed little harm in keeping the word, provided that one is vigilantly aware of the loose, subjective and confusing ways in which it is often used. When the actor George Sanders — the archetypal cad, at least on-screen, and in the title of his autobiography — committed suicide in a Spanish hotel in 1972, he left a note that began: “Dear World, I am leaving because I am bored.” It is worth noting that he was ill, lonely and had sold his beloved house on Majorca. Was boredom really what his death was about? When a man says he is bored — as Oscar Wilde never quite got round to saying — it sometimes means that he cannot be bothered to tell you what really ails him.

Wednesday, August 03, 2011

Collectivism promotes bribery

From Mazar and Aggarwal:
Why are there national differences in the propensity to bribe? To investigate this question, we conducted a correlational study with cross-national data and a laboratory experiment. We found a significant effect of the degree of collectivism versus individualism present in a national culture on the propensity to offer bribes to international business partners. Furthermore, the effect was mediated by individuals’ sense of responsibility for their actions. Together, these results suggest that collectivism promotes bribery through lower perceived responsibility for one’s actions.
Later note: I forgot to include the link to this article; it's now added.

Tuesday, August 02, 2011

Diversity is Universal

Here is an interesting little nugget from Joan Chiao:
At every level in the vast and dynamic world of living things lies diversity. From biomes to biomarkers, the complex array of solutions to the most basic problems regarding survival in a given environment afforded to us by nature is riveting. In the world of humans alone, diversity is apparent in the genome, in the brain and in our behavior.

The mark of multiple populations lies in the fabric of our DNA. The signature of selfhood in the brain holds dual frames, one for thinking about one's self as absolute, the other in context of others. From this biological diversity in humans arises cultural diversity directly observable in nearly every aspect of how people think, feel and behave. From classrooms to conventions across continents, the range and scope of human activities is stunning.

Recent centuries have seen the scientific debate regarding the nature of human nature cast as a dichotomy between diversity on the one hand and universalism on the other. Yet a seemingly paradoxical, but tractable, scientific concept that may enhance our cognitive toolkit over time is the simple notion that diversity is universal.

Monday, August 01, 2011

The sunny side of smut.

Coming across an article with the same title as this post gave me an immediate flashback to my days at Harvard, when as a graduate student and resident tutor in Winthrop House I would invite down various campus notables to have dinner in the dining hall at a table with my students (coats and ties were still required then), after which we retired to the common room for a chat over sherry (sigh....the good old days). The guest I'm remembering was the famous psychologist B.F. Skinner, whose immediate response, when he was asked how he managed to remain so vital at his advanced age, was "I read pornography." Here are a few clips from the Scientific American article on this topic by Moyer:
...Now pornography is just one Google search away, and much of it is free. Age restrictions have become meaningless, too, with the advent of social media—one teenager in five has sent or posted naked pictures of themselves online...Certainly pornography addiction or overconsumption seems to cause relationship problems...But what about the more casual exposure typical of most porn users?...“There’s absolutely no evidence that pornography does anything negative,” says Milton Diamond​, director of the Pacific Center for Sex and Society at the University of Hawaii at Manoa. “It’s a moral issue, not a factual issue.”...Perhaps the most serious accusation against pornography is that it incites sexual aggression. But not only do rape statistics suggest otherwise, some experts believe the consumption of pornography may actually reduce the desire to rape by offering a safe, private outlet for deviant sexual desires...as access to pornography grew in once restrictive Japan, China and Denmark in the past 40 years, rape statistics plummeted. Within the U.S., the states with the least Internet access between 1980 and 2000—and therefore the least access to Internet pornography—experienced a 53 percent increase in rape incidence, whereas the states with the most access experienced a 27 percent drop in the number of reported rapes .

It is important to note that these associations are just that—associations. They do not prove that pornography is the cause of the observed crime reductions. Nevertheless, the trends just don’t fit with the theory that rape and sexual assault are in part influenced by pornography...patients requesting treatment in clinics for sex offenders commonly say that pornography helps them keep their abnormal sexuality within the confines of their imagination. Pornography seems to be protective...perhaps because exposure correlates with lower levels of sexual repression, a potential rape risk factor.

Friday, July 29, 2011

MindBlog retrospective: A new description of our inner lives.

This is another of my old posts that emerged from the retrospective scan of this blog I did recently - another interesting perspective I don't want to lose touch with. It drew a number of comments, and a second post several months later discussed them. Here is a repeat of the original post:

I rarely mention my internal experience and sensations on this blog - first, because I have viewed readers as "wanting the beef," the objective stuff on how minds work; second, and more important, because my experience of noting the flow of my brain products as emotion-laced chunks of sensing/cognition/action - knowing the names of the neurotransmitters and hormones acting during desire, arousal, calming, or affiliation - strikes me as a process that would feel quite alien to most people. Still, if we are materialists who believe that someday we will understand how the brain-body generates our consciousness and sense of a self, we will be able to think in terms like the following (a quote taken from Larissa MacFarquhar's profile of Paul and Patricia Churchland in the Feb. 12 New Yorker Magazine):

"...he and Pat like to speculate about a day when whole chunks of English, especially the bits that consitute folk psychology, are replaced by scientific words that call a thing by its proper name rather than some outworn metaphor... as people learn to speak differently they will learn to experience differently, and sooner or later even their most private introspections will be affected. Already Paul feels pain differently than he used to: when he cut himself shaving now he fells not "pain" but something more complicated - first the sharp, superficial A-delta-fibre pain, and then a couple of seconds later, the sickening, deeper feeling of C-fibre pain that lingers. The new words, far from being reductive or dry, have enhanced his sensations, he feels, as an oenophile's complex vocabulary enhances the taste of wine."

"Paul and Pat, realizing that the revolutionary neuroscience they dream of is still in its infancy, are nonetheless already preparing themselve for this future, making the appropriate adjustments in their everyday conversation. One afternoon recently, Paul says, he was home making dinner when Pat burst in the door, having come straight from a frustrating faculty meeting. "She said, 'Paul, don't speak to me, my serotonin levels have hit bottom, my brain is awash in glucocortocoids, my blood vessels are full of adrenaline, and if it weren't for my endogenous opiates I'd have driven the car into a tree on the way home. My dopamine levels need lifting. Pour me a Chardonnay, and I'll be down in a minute.' " Paul and Pat have noticed that it is not just they who talk this way - their students now talk of psychopharmacology as comfortably as of food."

Thursday, July 28, 2011

The utility of being vague.

I'm just getting to glance at the last few issues of Psychological Science, and find this gem, "In Praise of Vagueness" by Mishra et al., which they introduce as follows:
People are increasingly surrounded by devices that provide highly precise information. For instance, technologically advanced bathroom scales can now give measurements of weight, body fat, and hydration levels within two and even three decimal places. People can find out exactly how many calories they are eating, how much weight they can lift, and how many steps they walk in a typical day. The overarching belief exemplified by the use of such technologies could be summed up by the phrase, “If I can measure it, I can manage it.” In other words, people seem to believe that precise information increases their likelihood of performing better and meeting personal goals (e.g., improving physical strength or losing weight). People generally prefer precise information over vague information because precise information gives them a greater sense of security and confidence in their ability to predict unknown outcomes in their environment. Despite this preference, we have found that vague information sometimes serves people better than precise information does.

Why might individuals perform better when they receive vague information than when they receive precise information? We posit that vague information allows individuals leeway in interpretation so that they form expectancies in accordance with the outcomes that they desire. Further, we posit that these positive expectancies can give rise to favorable performance-related outcomes.
Their experiments examined people's progress toward goals when they were given precise versus vague (an error range rather than an exact value) feedback on that progress. Perhaps the most striking example comes from the weight-loss experiment: participants given precise feedback gained, on average, one pound over the course of the experiment, while those given vague feedback lost nearly four pounds. Here is their abstract:
Is the eternal quest for precise information always worthwhile? Our research suggests that, at times, vagueness has its merits. Previous research has demonstrated that people prefer precise information over vague information because it gives them a sense of security and makes their environments more predictable. However, we show that the fuzzy boundaries afforded by vague information can actually help individuals perform better than can precise information. We document these findings across two laboratory studies and one quasi–field study that involved different performance-related contexts (mental acuity, physical strength, and weight loss). We argue that the malleability of vague information allows people to interpret it in the manner they desire, so that they can generate positive response expectancies and, thereby, perform better. The rigidity of precise information discourages desirable interpretations. Hence, on certain occasions, precise information is not as helpful as vague information in boosting performance.