Tuesday, June 15, 2010

Observing disease symptoms causes a more vigorous immune response

We know that social status, positive versus negative affect, etc. can influence immune system robustness. Now Schaller et al. show that the mere observation of photographs depicting symptoms of infectious disease can boost the subsequent elevation of proinflammatory cytokines released by white blood cells in response to a bacterial stimulus. The effect was specific to the perception of disease-connoting social cues; it did not occur in response to a different category of stress-inducing interpersonal threat. The abstract and a few clips:
An experiment (N = 28) tested the hypothesis that the mere visual perception of disease-connoting cues promotes a more aggressive immune response. Participants were exposed either to photographs depicting symptoms of infectious disease or to photographs depicting guns. After incubation with a model bacterial stimulus, participants’ white blood cells produced higher levels of the proinflammatory cytokine interleukin-6 (IL-6) in the infectious-disease condition, compared with the control (guns) condition. These results provide the first empirical evidence that visual perception of other people’s symptoms may cause the immune system to respond more aggressively to infection.

This linkage may have been adaptive in ancestral ecologies, as individuals characterized by perception-facilitated immune responses would have had reduced likelihood of succumbing to pathogenic infections...People are sensitive to visual stimuli connoting the potential presence of infectious pathogens in others. These stimuli include anomalous morphological and behavioral characteristics (e.g., skin discolorations, sneezing) that suggest infection with disease-causing microorganisms. When perceived, these stimuli trigger psychological responses—such as disgust and the activation of aversive cognitions into working memory—that inhibit interpersonal contact.

Monday, June 14, 2010

You are how you eat - fast food and impatience

Here is a gem from the May issue of Psychological Science, offered by Zhong and DeVoe at the University of Toronto:
Based on recent advancements in the behavioral priming literature, three experiments investigated how incidental exposure to fast food can induce impatient behaviors and choices outside of the eating domain. We found that even an unconscious exposure to fast-food symbols can automatically increase participants’ reading speed when they are under no time pressure and that thinking about fast food increases preferences for time-saving products while there are potentially many other product dimensions to consider. More strikingly, we found that mere exposure to fast-food symbols reduced people’s willingness to save and led them to prefer immediate gain over greater future return, ultimately harming their economic interest. Thus, the way people eat has far-reaching (often unconscious) influences on behaviors and choices unrelated to eating.

"Vital Exhaustion"

Benedict Carey discusses how the term "Nervous Breakdown," popular around 1900 before yielding to a number of supposedly more scientific diagnoses, has mutated into what psychiatrists in Europe have been diagnosing as “burnout syndrome,” the signs of which include “vital exhaustion.” A paper published last year defined three types: “frenetic,” “underchallenged,” and “worn out.” "Nervous breakdown" has begun to fade from use, and the same fate may or may not await "burnout syndrome."

Friday, June 11, 2010

Austin pictures

This is my last day in Austin, Texas, where, after attending a 50th high school reunion, I have been revisiting scenes of my childhood. One of the most beautiful is Hamilton Pool, formed in a box canyon on a creek that runs into the Pedernales River about 23 miles west of Austin. I have put a few photos from the trip on my Picasa photo page.

Testosterone decreases trust.

Fascinating observations from Bos et al., who tested the effect of testosterone on women's ratings of the trustworthiness of a series of men's faces shown in photographs. Testosterone appears to act essentially as an antidote to oxytocin, which has been shown to increase judgments of trustworthiness. Their abstract:
Trust plays an important role in the formation and maintenance of human social relationships. But trusting others is associated with a cost, given the prevalence of cheaters and deceivers in human society. Recent research has shown that the peptide hormone oxytocin increases trust in humans. However, oxytocin also makes individuals susceptible to betrayal, because under influence of oxytocin, subjects perseverate in giving trust to others they know are untrustworthy. Testosterone, a steroid hormone associated with competition and dominance, is often viewed as an inhibitor of sociality, and may have antagonistic properties with oxytocin. The following experiment tests this possibility in a placebo-controlled, within-subjects design involving the administration of testosterone to 24 female subjects. We show that compared with the placebo, testosterone significantly decreases interpersonal trust, and, as further analyses established, this effect is determined by those who give trust easily. We suggest that testosterone adaptively increases social vigilance in these trusting individuals to better prepare them for competition over status and valued resources. In conclusion, our data provide unique insights into the hormonal regulation of human sociality by showing that testosterone downregulates interpersonal trust in an adaptive manner.
Also, check out this article by Nicholas Wade in the NYTimes discussing this work and commenting on its relevance to understanding human evolution.

Neural processing of risk.

Mohr et al. show that, over a range of experiments and paradigms, risk is consistently represented in the anterior insula, a brain region known to process aversive emotions such as anxiety, disappointment, or regret. This provides further evidence that risk processing is influenced by emotions.
In our everyday life, we often have to make decisions with risky consequences, such as choosing a restaurant for dinner or choosing a form of retirement saving. To date, however, little is known about how the brain processes risk. Recent conceptualizations of risky decision making highlight that it is generally associated with emotions but do not specify how emotions are implicated in risk processing. Moreover, little is known about risk processing in non-choice situations and how potential losses influence risk processing. Here we used quantitative meta-analyses of functional magnetic resonance imaging experiments on risk processing in the brain to investigate (1) how risk processing is influenced by emotions, (2) how it differs between choice and non-choice situations, and (3) how it changes when losses are possible. By showing that, over a range of experiments and paradigms, risk is consistently represented in the anterior insula, a brain region known to process aversive emotions such as anxiety, disappointment, or regret, we provide evidence that risk processing is influenced by emotions. Furthermore, our results show risk-related activity in the dorsolateral prefrontal cortex and the parietal cortex in choice situations but not in situations in which no choice is involved or a choice has already been made. The anterior insula was predominantly active in the presence of potential losses, indicating that potential losses modulate risk processing.

Thursday, June 10, 2010

More on clever crows and complex cognition.

Taylor et al. have done some beautiful experiments clearly demonstrating complex cognition in New Caledonian crows, showing that these birds are capable of thinking through their actions, not simply learning a series of behaviors through association and combining them. You really should watch the video in the review by Telis. From that review:
...Taylor trained seven wild crows to associate a short stick with ineffectiveness; the crows failed to obtain their out-of-reach food with the stick and eventually began to ignore or reject it. Then they were divided into two groups, an innovation group and a training group. The training group learned six activities—such as how to use a short stick to extract a long stick from a toolbox—that together could help them get meat with long and short tools. The innovation group wasn’t taught how to use a short stick to extract a long stick from the toolbox, but did learn other techniques.

When tasked with reaching a snack in a hole using a short stick on a string and a longer stick trapped in a toolbox, all of the crows pulled it off. The three birds in the training group linked the behaviors they had learned into a new behavior. They freed the short stick from the string, used it to dislodge the long stick, and used the long stick to obtain their food. And all four crows in the innovation group figured out the sequence on their own. One crow in the innovation group stared at the setup for less than 2 minutes and then performed the whole trial correctly on her very first attempt.
The Taylor et al. abstract:
Apes, corvids and parrots all show high rates of behavioural innovation in the wild. However, it is unclear whether this innovative behaviour is underpinned by cognition more complex than simple learning mechanisms. To investigate this question we presented New Caledonian crows with a novel three-stage metatool problem. The task involved three distinct stages: (i) obtaining a short stick by pulling up a string, (ii) using the short stick as a metatool to extract a long stick from a toolbox, and finally (iii) using the long stick to extract food from a hole. Crows with previous experience of the behaviours in stages 1–3 linked them into a novel sequence to solve the problem on the first trial. Crows with experience of only using string and tools to access food also successfully solved the problem. This innovative use of established behaviours in novel contexts was not based on resurgence, chaining and conditional reinforcement. Instead, the performance was consistent with the transfer of an abstract, causal rule: ‘out-of-reach objects can be accessed using a tool’. This suggests that high innovation rates in the wild may reflect complex cognitive abilities that supplement basic learning mechanisms.

Wednesday, June 09, 2010

Older is happier

Many diminutions come with aging, but decreasing happiness is apparently not among them. Bakalar notes studies by Stone et al. showing, to the contrary, that by almost any measure people get happier as they get older, for reasons that are not clear. Clips from Bakalar's summary:
On the global measure, people start out at age 18 feeling pretty good about themselves, and then, apparently, life begins to throw curve balls. They feel worse and worse until they hit 50. At that point, there is a sharp reversal, and people keep getting happier as they age. By the time they are 85, they are even more satisfied with themselves than they were at 18.

In measuring immediate well-being — yesterday’s emotional state — the researchers found that stress declines from age 22 onward, reaching its lowest point at 85. Worry stays fairly steady until 50, then sharply drops off. Anger decreases steadily from 18 on, and sadness rises to a peak at 50, declines to 73, then rises slightly again to 85. Enjoyment and happiness have similar curves: they both decrease gradually until we hit 50, rise steadily for the next 25 years, and then decline very slightly at the end, but they never again reach the low point of our early 50s.

...we can expect to be happier in our early 80s than we were in our 20s...and it’s not being driven predominantly by things that happen in life. It’s something very deep and quite human that seems to be driving this.

Tuesday, June 08, 2010

Keep Austin Weird

The title of this post is the unofficial motto of Austin, Texas, where I am spending this week. One of the things I enjoy most is its funky coffee houses, all with high-speed wireless internet and copious outlets to plug in your laptop.

Washing away postdecisional dissonance

An interesting tidbit from Lee and Schwarz:
Hand washing removes more than dirt—it also removes the guilt of past misdeeds, weakens the urge to engage in compensatory behavior, and attenuates the impact of disgust on moral judgment. These findings are usually conceptualized in terms of a purity-morality metaphor that links physical and moral cleanliness; however, they may also reflect that washing more generally removes traces of the past by metaphorically wiping the slate clean. If so, washing one’s hands may lessen the influence of past behaviors that have no moral implications at all. We test this possibility in a choice situation. Freely choosing between two similarly attractive options (e.g., Paris or Rome for vacation) arouses cognitive dissonance, an aversive psychological state resulting from conflicting cognitions. People reduce dissonance by perceiving the chosen alternative as more attractive and the rejected alternative as less attractive after choice, thereby justifying their decision.
The authors tested whether hand washing reduces this classic postdecisional dissonance effect (the need to justify a recent choice) by designing a ranking-and-choice experiment in which participants, after making a choice, were asked to evaluate a soap - some with actual hand washing and some without. Their findings:
...indicate that the psychological impact of physical cleansing extends beyond the moral domain. Much as washing can cleanse us from traces of past immoral behavior, it can also cleanse us from traces of past decisions, reducing the need to justify them. This observation is not captured by the purity-morality metaphor and highlights the need for a better understanding of the processes that mediate the psychological impact of physical cleansing. To further constrain the range of plausible candidate explanations, future research may test whether the observed "clean slate" effect is limited to past acts that may threaten one’s self-view (e.g., moral transgressions and potentially poor choices) or also extends to past behaviors with positive implications.

Monday, June 07, 2010

Brunch at Fonda San Miguel in Austin

A picture from yesterday's birthday brunch for my son, now 36 years old, at my favorite gourmet Mexican restaurant in Austin, Texas. Shown from left to right are my partner Len Walker, my son Jon Bownds, Deric (me), daughter-in-law Shana Merlin, and old family friend Martha Leipziger.

Prozac reverses maturation of some brain cells

Here is some intriguing work from Kobayashi et al. showing that fluoxetine (Prozac) induces a dematuration of hippocampal dentate gyrus granule cells that reinstates synaptic plasticity normally reduced with development, thereby potentially producing beneficial effects on the adult brain. (These cells are key in learning and memory processes.) Their results suggest that the state of neuronal maturation, including aberrant maturation, might be controlled or corrected in adults - a unique approach to treating neuronal dysfunctions associated with neurodevelopmental disorders. Some clips from the abstract:
Serotonergic antidepressant drugs have been commonly used to treat mood and anxiety disorders, and increasing evidence suggests potential use of these drugs beyond current antidepressant therapeutics. Facilitation of adult neurogenesis in the hippocampal dentate gyrus has been suggested to be a candidate mechanism of action of antidepressant drugs, but this mechanism may be only one of the broad effects of antidepressants. Here we show a distinct unique action of the serotonergic antidepressant fluoxetine in transforming the phenotype of mature dentate granule cells. Chronic treatments of adult mice with fluoxetine strongly reduced expression of the mature granule cell marker calbindin. The fluoxetine treatment induced active somatic membrane properties resembling immature granule cells and markedly reduced synaptic facilitation that characterizes the mature dentate-to-CA3 signal transmission. These changes cannot be explained simply by an increase in newly generated immature neurons, but best characterized as “dematuration” of mature granule cells...Our results suggest that serotonergic antidepressants can reverse the established state of neuronal maturation in the adult hippocampus...Such reversal of neuronal maturation could affect proper functioning of the mature hippocampal circuit, but may also cause some beneficial effects by reinstating neuronal functions that are lost during development.

Six Famous Geniuses You Didn't Know Were Perverts

Another random curious piece.

Friday, June 04, 2010

MindBlog in Austin, Texas

Having just returned from Istanbul last Friday, I am traveling again - this time with my partner Len Walker to Austin, Texas, to attend the 50th reunion of Austin High School alumni. I will be vacationing here through next week. Being in Austin requires adapting the digestive system to gargantuan servings of Tex-Mex dishes.

Aging brains are less able to recover from the effects of stress

It is well documented that aging reduces the effectiveness of our prefrontal cortex in mediating cognitive processing and decision making, including working memory and the flexible use of mental strategies. Bloss, McEwen, and colleagues have now conducted studies suggesting that one reason for this decline may be that aging reduces the ability of the prefrontal cortex to recover from stress-induced damage. Their abstract:
Neuronal networks in the prefrontal cortex mediate the highest levels of cognitive processing and decision making, and the capacity to perform these functions is among the cognitive features most vulnerable to aging. Despite much research, the neurobiological basis of age-related compromised prefrontal function remains elusive. Many investigators have hypothesized that exposure to stress may accelerate cognitive aging, though few studies have directly tested this hypothesis and even fewer have investigated a neuronal basis for such effects. It is known that in young animals, stress causes morphological remodeling of prefrontal pyramidal neurons that is reversible. The present studies sought to determine whether age influences the reversibility of stress-induced morphological plasticity in rat prefrontal neurons. We hypothesized that neocortical structural resilience is compromised in normal aging. To directly test this hypothesis we used a well characterized chronic restraint stress paradigm, with an additional group allowed to recover from the stress paradigm, in 3-, 12-, and 20-month-old male rats. In young animals, stress induced reductions of apical dendritic length and branch number, which were reversed with recovery; in contrast, middle-aged and aged rats failed to show reversible morphological remodeling when subjected to the same stress and recovery paradigm. The data presented here provide evidence that aging is accompanied by selective impairments in long-term neocortical morphological plasticity.

Michelangelo as brain anatomist

Check out this Andrew Sullivan post.

Thursday, June 03, 2010

Overimitation of adults by kids is a cultural universal.

Most studies showing overimitation of adults by children during learning have been conducted on middle- to upper-class kids of Western-educated parents. Nielsen and Tomaselli studied (from Telis's review)
...a culture with a distinctly different parenting style: the Bushmen of the Kalahari Desert. Whereas a Western parent might teach a youngster to use a bow and arrow by standing behind her and guiding her motions, a parent from the indigenous African Bushmen culture would allow the child to come along for a hunt and learn by observation and through trial and error. Nielsen hypothesized that a child taught in this hands-off manner would have less reason to overimitate adults and would do so less often.
The Nielsen and Tomaselli abstract:
Children are surrounded by objects that they must learn to use. One of the most efficient ways children do this is by imitation. Recent work has shown that, in contrast to nonhuman primates, human children focus more on reproducing the specific actions used than on achieving actual outcomes when learning by imitating. From 18 months of age, children will routinely copy even arbitrary and unnecessary actions. This puzzling behavior is called overimitation. By documenting similarities exhibited by children from a large, industrialized city and children from remote Bushman communities in southern Africa, we provide here the first indication that overimitation may be a universal human trait. We also show that overimitation is unaffected by the age of the child, differences in the testing environment, or familiarity with the demonstrating adult. Furthermore, we argue that, although seemingly maladaptive, overimitation reflects an evolutionary adaptation that is fundamental to the development and transmission of human culture.
The Telis review has an interesting video of overimitation in a Bushman child.

Prehistoric makeover

From the 21 May Science Magazine "Random Samples" section:
Care to feel closer to your extinct relatives? The Smithsonian Institution's MEanderthal app for iPhones and Android devices melds your mug shot with features of Homo neanderthalensis, modern humans' closest kin—or, if you prefer, the more distant H. heidelbergensis or H. floresiensis. In the first case, expect to gain a big nose and a puffier face, says Robert Costello, an outreach manager with the Smithsonian. Neandertals needed large sinus cavities to cope with the colder climate in Europe and Asia 28,000 to 200,000 years ago.

Wednesday, June 02, 2010

The cognitive niche

Steven Pinker offers (full text, open access) one of several fascinating papers in a special PNAS supplement issue: In the Light of Evolution IV: The Human Condition. All of the papers are open access. Pinker's title is "The cognitive niche: Coevolution of intelligence, sociality, and language." The abstract:
Although Darwin insisted that human intelligence could be fully explained by the theory of evolution, the codiscoverer of natural selection, Alfred Russel Wallace, claimed that abstract intelligence was of no use to ancestral humans and could only be explained by intelligent design. Wallace's apparent paradox can be dissolved with two hypotheses about human cognition. One is that intelligence is an adaptation to a knowledge-using, socially interdependent lifestyle, the “cognitive niche.” This embraces the ability to overcome the evolutionary fixed defenses of plants and animals by applications of reasoning, including weapons, traps, coordinated driving of game, and detoxification of plants. Such reasoning exploits intuitive theories about different aspects of the world, such as objects, forces, paths, places, states, substances, and other people's beliefs and desires. The theory explains many zoologically unusual traits in Homo sapiens, including our complex toolkit, wide range of habitats and diets, extended childhoods and long lives, hypersociality, complex mating, division into cultures, and language (which multiplies the benefit of knowledge because know-how is useful not only for its practical benefits but as a trade good with others, enhancing the evolution of cooperation). The second hypothesis is that humans possess an ability of metaphorical abstraction, which allows them to coopt faculties that originally evolved for physical problem-solving and social coordination, apply them to abstract subject matter, and combine them productively. These abilities can help explain the emergence of abstract cognition without supernatural or exotic evolutionary forces and are in principle testable by analyses of statistical signs of selection in the human genome.

Tuesday, June 01, 2010

Contra doomsayers, a bright future beckons?

Matt Ridley has a new book out, "The Rational Optimist," reviewed by John Tierney. (Ridley is a very bright polymath; I recall that he did a much better job than I did some 15 years ago, when we were fellow contributors of several chapters to a standard introductory biology textbook - a hack writing-for-pay gig.) Ridley argues in his grand theory that it was the invention of exchange - trading one object for another - rather than increasingly big brains or cooperation and reciprocity, that started the explosive growth of civilization.
“Adam potentially now had access to objects he did not know how to make or find; and so did Oz,” Dr. Ridley writes. People traded goods, services and, most important, knowledge, creating a collective intelligence: “Ten individuals could know between them ten things, while each understanding one.”

Rulers like to take credit for the advances during their reigns, and scientists like to see their theories as the source of technological progress. But Dr. Ridley argues that they’ve both got it backward: traders’ wealth builds empires, and entrepreneurial tinkerers are more likely to inspire scientists than vice versa. From Stone Age seashells to the steam engine to the personal computer, innovation has mostly been a bottom-up process.
“Forget wars, religions, famines and poems for the moment,” Dr. Ridley writes. “This is history’s greatest theme: the metastasis of exchange, specialization and the invention it has called forth, the ‘creation’ of time.”

Progress this century could be impeded by politics, wars, plagues or climate change, but Dr. Ridley argues that, as usual, the “apocaholics” are overstating the risks and underestimating innovative responses....with new hubs of innovation emerging elsewhere, and with ideas spreading faster than ever on the Internet, Dr. Ridley expects bottom-up innovators to prevail. His prediction for the rest of the century: “Prosperity spreads, technology progresses, poverty declines, disease retreats, fecundity falls, happiness increases, violence atrophies, freedom grows, knowledge flourishes, the environment improves and wilderness expands.”