Friday, December 02, 2011

Why does synesthesia persist in the population?

Brang and Ramachandran do an interesting review of ideas on synesthesia, a condition present in 2%–4% of the population, in which a sensory stimulus presented to one modality elicits concurrent sensations in additional modalities. In two of the most common variants, auditory tones and achromatic (colorless) numbers produce vivid and perceptually salient colors. The authors point out that synesthesia can be associated with a wide variety of conceptual and perceptual benefits, suggesting that the gene(s) involved may have been selected for because of a hidden agenda. They speculate that the increased range of sensory associations may extend to other systems such as creativity and metaphor (increasing the range of association between words). They point to examples of people with prodigious memories based largely on synesthetic associations evoked by the items to be memorized.
In addition to facilitated processing within individual sensory modalities, synesthetes also show increased communication between the senses unrelated to their synesthetic experiences, suggesting that the benefits of synesthesia generalize to other modalities as well, supporting their ability to process multisensory information. Furthermore, others have argued that synesthesia is the direct result of enhanced communication between the senses, a logical outgrowth of the cross-modality interactions present in all individuals. Taken collectively, these data suggest that synesthesia may be associated with enhanced primary sensory processing as well as enhanced integration between the senses...synesthesia is a highly heritable phenomenon that is associated with numerous benefits to cognitive processing, potentially underscoring a basis for why this condition has survived evolutionary pressures.

Thursday, December 01, 2011

Our brain's fusiform "face" area is about holistic processing of any familiar complex visual input.

As a follow-up to an older MindBlog post on Chess expertise, I thought I would pass on this interesting work from Bilalić et al.
The fusiform face area (FFA) is involved in face perception to such an extent that some claim it is a brain module exclusively for faces. The other possibility is that FFA is modulated by experience with individuation in any visual domain, not only faces. Here we test this latter FFA expertise hypothesis using the game of chess as a domain of investigation. We exploited a characteristic of chess: it features multiple objects forming meaningful spatial relations. In three experiments, we show that FFA activity is related to stimulus properties and not to chess skill directly. In all chess and non-chess tasks, experts' FFA was more activated than that of novices only when they dealt with naturalistic full-board chess positions. When the common spatial relationships formed by chess objects in chess positions were randomly disturbed, FFA was again differentially active only in experts, regardless of the actual task. Our experiments show that FFA contributes to the holistic processing of domain-specific multipart stimuli in chess experts. This suggests that FFA may not only mediate human expertise in face recognition but, supporting the expertise hypothesis, may mediate the automatic holistic processing of any highly familiar multipart visual input.

Wednesday, November 30, 2011

Neural (MRI) correlates of effective learning.

Here is a rather fascinating prospective use of MRI technology - to distinguish people who might become the most effective decision makers after further, more extensive training in a specialization such as medical diagnosis. Their basic finding is that high performers' brains achieve better outcomes by attending to informative failures during training, rather than chasing the reward value of successes. From Downar, Bhatt, and Montague:
Accurate associative learning is often hindered by confirmation bias and success-chasing, which together can conspire to produce or solidify false beliefs in the decision-maker. We performed functional magnetic resonance imaging in 35 experienced physicians, while they learned to choose between two treatments in a series of virtual patient encounters. We estimated a learning model for each subject based on their observed behavior, and on this measure subjects divided clearly into high performers and low performers. The high performers showed small, but equal learning rates for both successes (positive outcomes) and failures (no response to the drug). In contrast, low performers showed very large and asymmetric learning rates, learning significantly more from successes than from failures, a tendency that led to sub-optimal treatment choices. Consistent with these behavioral findings, high performers showed larger, more sustained BOLD responses to failed vs. successful outcomes in the dorsolateral prefrontal cortex and inferior parietal lobule, while low performers displayed the opposite response profile. Furthermore, participants' learning asymmetry correlated with anticipatory activation in the nucleus accumbens at trial onset, well before outcome presentation. Subjects with anticipatory activation in the nucleus accumbens showed more success-chasing during learning. These results suggest that high performers' brains achieve better outcomes by attending to informative failures during training, rather than chasing the reward value of successes. The differential brain activations between high and low performers could potentially be developed into biomarkers to identify efficient learners on novel decision tasks, in medical or other contexts.
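The asymmetric learning the authors describe can be illustrated with a simple delta-rule update in which positive and negative prediction errors get different learning rates. The rates and outcome sequence below are illustrative assumptions of mine, not the values the study fitted:

```python
def learned_value(outcomes, alpha_success, alpha_failure, q=0.5):
    """Delta-rule value estimate with separate learning rates for
    positive (success) and negative (failure) prediction errors."""
    for o in outcomes:
        delta = o - q  # prediction error
        rate = alpha_success if delta > 0 else alpha_failure
        q += rate * delta
    return q

# A treatment that actually works only 50% of the time
outcomes = [1, 0] * 50

# "High performer" profile: small, equal learning rates
balanced = learned_value(outcomes, alpha_success=0.1, alpha_failure=0.1)

# "Low performer" profile: large, success-biased learning rates
biased = learned_value(outcomes, alpha_success=0.6, alpha_failure=0.1)

print(round(balanced, 2), round(biased, 2))  # → 0.47 0.84
```

The balanced learner's estimate settles near the true 50% success rate, while the success-chaser ends up badly overvaluing the same mediocre treatment, which is the kind of bias that drives the sub-optimal choices described above.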

Tuesday, November 29, 2011

The speed-accuracy tradeoff in the elderly brain.

Sigh...even more information on my aging brain. The fact that I and other older folks take longer to respond when a task is presented can most charitably be attributed to our being more cautious about making errors, but Forstmann et al. find evidence that this behavior is not entirely voluntary, and can also be related to a decrease in brain connectivity with aging:
Even in the simplest laboratory tasks older adults generally take more time to respond than young adults. One of the reasons for this age-related slowing is that older adults are reluctant to commit errors, a cautious attitude that prompts them to accumulate more information before making a decision. This suggests that age-related slowing may be partly due to unwillingness on the part of elderly participants to adopt a fast-but-careless setting when asked. We investigate the neuroanatomical and neurocognitive basis of age-related slowing in a perceptual decision-making task in which cues instructed young and old participants to respond either quickly or accurately. Mathematical modeling of the behavioral data confirmed that cueing for speed encouraged participants to set low response thresholds, but this was more evident in younger than older participants. Diffusion-weighted structural images suggest that the more cautious threshold settings of older participants may be due to a reduction of white matter integrity in corticostriatal tracts that connect the pre-SMA to the striatum. These results are consistent with the striatal account of the speed-accuracy tradeoff, according to which an increased emphasis on response speed increases the cortical input to the striatum, resulting in global disinhibition of the cortex. Our findings suggest that the unwillingness of older adults to adopt fast speed-accuracy tradeoff settings may not just reflect a strategic choice that is entirely under voluntary control, but may also reflect structural limitations: age-related decrements in brain connectivity.
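The response-threshold idea in the abstract can be sketched with a toy sequential-sampling model: evidence accumulates noisily toward the correct answer, and the height of the decision boundary sets the speed-accuracy tradeoff. The drift and threshold values below are arbitrary illustrations, not parameters fitted in the study:

```python
import random

def decide(threshold, drift=0.55, n_trials=5000, seed=1):
    """Random-walk evidence accumulator: each step adds +1 (toward the
    correct response) with probability `drift`, else -1; a response is
    made when accumulated evidence reaches +threshold (correct) or
    -threshold (error). Returns (accuracy, mean number of steps)."""
    rng = random.Random(seed)
    n_correct = total_steps = 0
    for _ in range(n_trials):
        evidence = steps = 0
        while abs(evidence) < threshold:
            evidence += 1 if rng.random() < drift else -1
            steps += 1
        n_correct += evidence > 0
        total_steps += steps
    return n_correct / n_trials, total_steps / n_trials

fast_acc, fast_rt = decide(threshold=5)   # low threshold: speed emphasis
slow_acc, slow_rt = decide(threshold=15)  # high threshold: accuracy emphasis
```

With these settings the low-threshold decider responds several times faster but makes noticeably more errors; lowering the boundary is exactly the adjustment the speed cue encourages, and the paper's point is that older participants appear structurally less able, not merely less willing, to make it.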

Monday, November 28, 2011

How not to revert to habit under stress...

A stressful situation can have the effect of making us actually less able to flexibly cope with the issue at hand, because under stress we tend to regress to older habitual responses that may be less appropriate. Observations by Schwabe et al. suggest that we might be able to lessen this behavior by popping an old-fashioned pill like propranolol, a β-adrenoceptor antagonist (which has been used for many years by some musicians to quell their performance anxiety). The second abstract below, from Hermans et al., provides a more detailed view of how our brain networks change during stress, and how this change is attenuated by β-adrenoceptor blockade.
Stress modulates instrumental action in favor of habit processes that encode the association between a response and preceding stimuli, and at the expense of goal-directed processes that learn the association between an action and the motivational value of the outcome. Here, we asked whether this stress-induced shift from goal-directed to habit action is dependent on noradrenergic activation and may therefore be blocked by a β-adrenoceptor antagonist. To this end, healthy men and women were administered a placebo or the β-adrenoceptor antagonist propranolol before they underwent a stress or a control procedure. Shortly after the stress or control procedure, participants were trained in two instrumental actions that led to two distinct food outcomes. After training, one of the food outcomes was selectively devalued by feeding participants to satiety with that food. A subsequent extinction test indicated whether instrumental behavior was goal-directed or habitual. As expected, stress after placebo rendered participants' behavior insensitive to the change in the value of the outcome and thus habitual. After propranolol intake, however, stressed participants behaved, like controls, in a goal-directed manner, suggesting that propranolol blocked the stress-induced bias toward habit behavior. Our findings show that the shift from goal-directed to habitual control of instrumental action under stress necessitates noradrenergic activation, and could have important clinical implications, particularly for addictive disorders.
And, more detail from Hermans et al., who find in human studies robust stressor-related changes in functional neuronal activity and connectivity within a network of brain areas, which correlate with increased reports of negative emotionality by the participants, as well as with increases of cortisol and alpha amylase in their saliva:
Acute stress shifts the brain into a state that fosters rapid defense mechanisms. Stress-related neuromodulators are thought to trigger this change by altering properties of large-scale neural populations throughout the brain. We investigated this brain-state shift in humans. During exposure to a fear-related acute stressor, responsiveness and interconnectivity within a network including cortical (frontoinsular, dorsal anterior cingulate, inferotemporal, and temporoparietal) and subcortical (amygdala, thalamus, hypothalamus, and midbrain) regions increased as a function of stress response magnitudes. β-adrenergic receptor blockade, but not cortisol synthesis inhibition, diminished this increase. Thus, our findings reveal that noradrenergic activation during acute stress results in prolonged coupling within a distributed network that integrates information exchange between regions involved in autonomic-neuroendocrine control and vigilant attentional reorienting.

Friday, November 25, 2011

A nap enhances relational memory

Lau et al. make the following interesting observations:
It is increasingly evident that sleep strengthens memory. However, it is not clear whether sleep promotes relational memory, which results from the integration of disparate memory traces into memory networks linked by commonalities. The present study investigates the effect of a daytime nap, immediately after learning or after a delay, on a relational memory task that requires abstraction of a general concept from separately learned items. Specifically, participants learned English meanings of Chinese characters with overlapping semantic components called radicals. They were later tested on new characters sharing the same radicals and on explicitly stating the general concepts represented by the radicals. Regardless of whether the nap occurred immediately after learning or after a delay, the nap participants performed better on both tasks. The results suggest that sleep – even as brief as a nap – facilitates the reorganization of discrete memory traces into flexible relational memory networks.

Thursday, November 24, 2011

Brief musical training in kids enhances other high level cognitive skills.

Articles like this one from Moreno et al. make me think that my lifelong piano practice may be part of the reason I'm still hanging onto a few of my mental marbles as I age. (I realized the other day that my sight-reading of complex musical scores, which requires glancing several measures ahead of the one being played, and remembering them, is essentially working-memory training of the sort that has been shown to enhance general intelligence.) Here is the abstract from Moreno et al.:
Researchers have designed training methods that can be used to improve mental health and to test the efficacy of education programs. However, few studies have demonstrated broad transfer from such training to performance on untrained cognitive activities. Here we report the effects of two interactive computerized training programs developed for preschool children: one for music and one for visual art. After only 20 days of training, only children in the music group exhibited enhanced performance on a measure of verbal intelligence, with 90% of the sample showing this improvement. These improvements in verbal intelligence were positively correlated with changes in functional brain plasticity during an executive-function task. Our findings demonstrate that transfer of a high-level cognitive skill is possible in early childhood.

Wednesday, November 23, 2011

Quantitating how positive emotions increase longevity.

Increasing the general well-being of citizens is usually taken to be the goal of government and public policy, and MindBlog has pointed to numerous studies that link positive affect and other measures of well-being with longer survival and reduced risk of disease in old age. But...how is well-being best measured? Most studies have relied mainly on assessments of recollected emotional states, in which people are asked to rate their feelings of happiness or well-being in general, either without any time frame or over a specific time period. Psychological research has established that recollected affect may diverge from actual experience because it is influenced by errors in recollection, recall biases, focusing illusions, and salient memory heuristics. Steptoe and Wardle note that this “memory–experience gap” between life as it is remembered and life as it is experienced may be important to the processes through which the past impacts on future behavior. They address the issue by looking at data aggregating momentary affect assessments over a single day for a large number of individuals:
Links between positive affect (PA) and health have predominantly been investigated by using measures of recollected emotional states. Ecological momentary assessment is regarded as a more precise measure of experienced well-being. We analyzed data from the English Longitudinal Study of Aging, a representative cohort of older men and women living in England. PA was assessed by aggregating momentary assessments over a single day in 3,853 individuals aged 52 to 79 y who were followed up for an average of 5 y. Respondents in the lowest third of PA had a death rate of 7.3%, compared with 4.6% in the medium-PA group and 3.6% in the high-PA group. Cox proportional-hazards regression showed a hazard ratio of 0.498 (95% confidence interval, 0.345–0.721) in the high-PA compared with the low-PA group, adjusted for age and sex. This was attenuated to 0.646 (95% confidence interval, 0.436–0.958) after controlling for demographic factors, negative affect, depressed mood, health indicators, and health behaviors. Negative affect and depressed mood were not related to survival after adjustment for covariates. These findings indicate that experienced PA, even over a single day, has a graded relationship with survival that is not caused by baseline health status or other covariates. Momentary PA may be causally related to survival, or may be a marker of underlying biological, behavioral, or temperamental factors, although reverse causality cannot be conclusively ruled out. The results endorse the value of assessing experienced affect, and the importance of evaluating interventions that promote happiness in older populations.
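As a rough sanity check on the abstract's numbers (an illustrative back-of-envelope calculation of my own, not the paper's method), the crude ratio of the reported death rates comes out close to the age- and sex-adjusted Cox hazard ratio:

```python
# Death rates over ~5 years of follow-up, as reported in the abstract
low_pa, medium_pa, high_pa = 0.073, 0.046, 0.036

# Crude risk ratio, high- vs low-PA group. Unlike the Cox model this
# ignores censoring, covariates, and event timing, but it lands near
# the reported age/sex-adjusted hazard ratio of 0.498.
crude_ratio = high_pa / low_pa
print(round(crude_ratio, 3))  # → 0.493
```

The fully adjusted estimate of 0.646 is weaker precisely because the covariates (health status, health behaviors, mood) absorb part of this raw difference.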

Tuesday, November 22, 2011

Compatibility of Neuroscience and Free Will? - a further discussion

Because several people have mentioned a recent NYTimes piece on neuroscience and free will to me, I've decided to pass on its basic points here, given that MindBlog has done frequent posts on the issue of free will (most recently, for example, see here, here, and here). The recent article by Nahmias starts with reference to Wegner's book "The Illusion of Conscious Will," whose arguments are a central part of my "I-Illusion" web lecture, which you can find in the MindBlog column to your left. Nahmias argues that the debate is usually mis-framed as a contest between scientific materialism and Cartesian dualism, and further that it does not take account of the more extended time frames involved in deliberating over alternative courses of action.
The sciences of the mind do give us good reasons to think that our minds are made of matter. But to conclude that consciousness or free will is thereby an illusion is too quick. It is like inferring from discoveries in organic chemistry that life is an illusion just because living organisms are made up of non-living stuff. Much of the progress in science comes precisely from understanding wholes in terms of their parts, without this suggesting the disappearance of the wholes. There’s no reason to define the mind or free will in a way that begins by cutting off this possibility for progress.

...people sometimes misunderstand determinism to mean that we are somehow cut out of the causal chain leading to our actions. People are threatened by a possibility I call “bypassing” — the idea that our actions are caused in ways that bypass our conscious deliberations and decisions. So, if people mistakenly take causal determinism to mean that everything that happens is inevitable no matter what you think or try to do, then they conclude that we have no free will.
but,
...discoveries about how our brains work can also explain how free will works rather than explaining it away. But first, we need to define free will in a more reasonable and useful way. Many philosophers, including me, understand free will as a set of capacities for imagining future courses of action, deliberating about one’s reasons for choosing them, planning one’s actions in light of this deliberation and controlling actions in the face of competing desires. We act of our own free will to the extent that we have the opportunity to exercise these capacities, without unreasonable external or internal pressure. We are responsible for our actions roughly to the extent that we possess these capacities and we have opportunities to exercise them.

...As long as people understand that discoveries about how our brains work do not mean that what we think or try to do makes no difference to what happens, then their belief in free will is preserved. What matters to people is that we have the capacities for conscious deliberation and self-control that I’ve suggested we identify with free will.

...None of the evidence marshaled by neuroscientists and psychologists suggests that those neural processes involved in the conscious aspects of such complex, temporally extended decision-making are in fact causal dead ends. It would be almost unbelievable if such evidence turned up. It would mean that whatever processes in the brain are involved in conscious deliberation and self-control — and the substantial energy these processes use — were as useless as our appendix, that they evolved only to observe what we do after the fact, rather than to improve our decision-making and behavior. No doubt these conscious brain processes move too slowly to be involved in each finger flex as I type, but as long as they play their part in what I do down the road — such as considering what ideas to type up — then my conscious self is not a dead end, and it is a mistake to say my free will is bypassed by what my brain does.

So, does neuroscience mean the death of free will? Well, it could if it somehow demonstrated that conscious deliberation and rational self-control did not really exist or that they worked in a sheltered corner of the brain that has no influence on our actions. But neither of these possibilities is likely. True, the mind sciences will continue to show that consciousness does not work in just the ways we thought, and they already suggest significant limitations on the extent of our rationality, self-knowledge, and self-control. Such discoveries suggest that most of us possess less free will than we tend to think, and they may inform debates about our degrees of responsibility. But they do not show that free will is an illusion.

If we put aside the misleading idea that free will depends on supernatural souls rather than our quite miraculous brains, and if we put aside the mistaken idea that our conscious thinking matters most in the milliseconds before movement, then neuroscience does not kill free will. Rather, it can help to explain our capacities to control our actions in such a way that we are responsible for them. It can help us rediscover free will.
In response to numerous comments on his article Nahmias notes:
One point I did not have time to develop, but many comments raise, is that we do not possess as much free will as we tend to think...Psychology indeed suggests that we are often unaware of what motivates us, we often rationalize our actions after we act, and we often are influenced by external factors that we'd prefer not to be influenced by... because I understand free will as a set of naturalistic capacities, I believe that empirical discoveries can illuminate not only how it works, but also limitations to it. This also means we are sometimes less praiseworthy or blameworthy than we tend to think...Conversely, I do not think that free and responsible action always requires conscious or rational deliberation. As Aristotle taught us, we are responsible not only for these sorts of choices but also for our habits and character traits that derive from these choices, though again, it largely remains to be discovered what degrees of freedom and responsibility we possess.

Monday, November 21, 2011

How to qualify as a social partner.

Most theoretical models of human social interaction explore relatively simple scenarios that allow for analytical solutions. Rockenbach and Milinski have now devised a more sophisticated social-dilemma game that comes closer to modeling real-world interactions. Here is how they pose the issue:
What is the benefit of watching someone? Observing a person's behavior in a social dilemma may provide information about her qualities as a social partner for potential collaboration in the future: Does she contribute to a public good? Does she punish free riders? Does she reward contributors? Do I want to collaborate with her? Direct observation is more reliable than trusting gossip. Being watched, however, is not neutral: An individual's behavior may change in the presence of an observer (the “audience effect”), and the observed may be tempted to behave as expected to manage her reputation. Watchful eyes may induce altruistic behavior in the observed. Even a mechanistic origin of recognizing watchful eyes in the brain has been described as cortical orienting circuits that mediate nuanced and context-dependent social attention. However, watching also may induce an “arms race” of signals between observers and the observed. The observer should take into account that the behavior of the observed may change in response to observation and therefore should conceal her watching; the observed should be very alert to faint signals of being watched but should avoid any sign of having recognized that watching is occurring. The interaction between observing and being observed has implications for the large body of recent research on human altruism. Especially when a conflict of interest exists between observers and observed, they may use a rich toolbox of sophisticated strategies both to manipulate signals and to uncover manipulations.
The authors observe that in deciding on social partners observed cooperativeness is decisive, and severe punishment is hidden. Here is their abstract:
Conflicts of interest between the community and its members are at the core of human social dilemmas. If observed selfishness has future costs, individuals may hide selfish acts but display altruistic ones, and peers aim at identifying the most selfish persons to avoid them as future social partners. An interaction involving hiding and seeking information may be inevitable. We staged an experimental social-dilemma game in which actors could pay to conceal information about their contribution, giving, and punishing decisions from an observer who selects her future social partners from the actors. The observer could pay to conceal her observation of the actors. We found sophisticated dynamic strategies on either side. Actors hide their severe punishment and low contributions but display high contributions. Observers select high contributors as social partners; remarkably, punishment behavior seems irrelevant for qualifying as a social partner. That actors nonetheless pay to conceal their severe punishment adds a further puzzle to the role of punishment in human social behavior. Competition between hiding and seeking information about social behavior may be even more relevant and elaborate in the real world but usually is hidden from our eyes.

Friday, November 18, 2011

Improve your motor memory!

Here is a bit of work from Zhang et al. on consolidation of motor memory that rather clearly confirms what I know from my own experience of trying to learn a new piano piece. If I watch a video of myself playing a passage where I have difficulty with the notes, I remember the notes better than if I actually play the passage several more times - the actual movement appears to get in the way of forming a motor memory of it. (The same effect can occur with mentally visualizing the movements, a trick known to many athletes and performers).
Practicing a motor task can induce neuroplastic changes in the human primary motor cortex (M1) that are subsequently consolidated, leading to a stable memory trace. Currently, little is known whether early consolidation, tested several minutes after skill acquisition, can be improved by behavioral interventions. Here we test whether movement observation, known to evoke similar neural responses in M1 as movement execution, can benefit the early consolidation of new motor memories. We show that observing the same type of movement as that previously practiced (congruent movement stimuli) substantially improves performance on a retention test 30 min after training compared with observing either an incongruent movement type or control stimuli not showing biological motion. Differences in retention following observation of congruent, incongruent, and control stimuli were not found when observed 24 h after initial training and neural evidence further confirmed that, unlike motor practice, movement observation alone did not induce plastic changes in the motor cortex. This time-specific effect is critical to conclude that movement observation of congruent stimuli interacts with training-induced neuroplasticity and enhances early consolidation of motor memories. Our findings are not only of theoretical relevance for memory research, but also have great potential for application in clinical settings when neuroplasticity needs to be maximized.

Thursday, November 17, 2011

MindBlog back in Fort Lauderdale, FL

I'm tired from three days driving with my two Abyssinian cats from Madison WI to my condo in Fort Lauderdale, FL (posts have been coming out on autopilot)... The picture is my work pod, just set up in my office here.

Resonating with others: changes in motor cortex

Individual concepts of self, or self-construals, vary across cultures. In collectivist cultures such as Japan, individuals adopt an interdependent self-construal in which relationships with others are central, whereas in individualist cultures like the U.S., a more independent self-construal, with less emphasis on relationships with others, is more likely to be adopted. Obhi et al. note some interesting brain correlates of shifting self-construal from interdependent to independent. Their data suggest that motor resonance mediates nonconscious mimicry in social settings:
“Self-construal” refers to how individuals view and make meaning of the self, and at least two subtypes have been identified. Interdependent self-construal is a view of the self that includes relationships with others, and independent self-construal is a view of the self that does not include relations with others. It has been suggested that priming these two types of self-construal affects the cognitive processing style that an individual adopts, especially with regard to context sensitivity. Specifically, an interdependent self-construal is thought to promote attention to others and social context to a greater degree than an independent self-construal. To investigate this assertion, we elicited motor-evoked potentials with transcranial magnetic stimulation during an action observation task in which human participants were presented with either interdependent or independent self-construal prime words. Priming interdependent self-construal increased motor cortical output whereas priming independent self-construal did not, compared with a no-priming baseline condition. These effects, likely mediated by changes in the mirror system, essentially tune the individual to, or shield the individual from, social input. Interestingly, the pattern of these self-construal-induced changes in the motor system corroborates previously observed self-construal effects on overt behavioral mimicry in social settings, and as such, our results provide strong evidence that motor resonance likely mediates nonconscious mimicry in social settings. Finally, these self-construal effects may lead to the development of interventions for disorders of deficient or excessive social influence, like certain autism spectrum and compulsive imitative disorders.

Wednesday, November 16, 2011

Brain areas that increase in size with social network size.

The number of individuals that a single person can keep close track of is generally taken to be roughly 150 ("Dunbar's number"), which would be the size of a tightly knit social grouping. This estimate derives from a comparative analysis of primate neuroanatomy and behavior, and has led to the corollary that the magnitude of the number is determined by the size of the neocortex. Sallet et al. have now made the interesting observation that the relationship between brain size and social group size can be plastic, finding that housing macaque monkeys in larger groups increases the amount of gray matter in several parts of the brain involved in social cognition. We know from many studies that requiring increased skill in perceptual or motor abilities correlates with increases in sensory and motor areas of the brain, so it makes sense that requiring exercise of "social muscles" increases the gray matter volume in temporal and frontal lobe areas that have been identified as potential contributors to social success in both humans and monkeys. Here is the abstract:
It has been suggested that variation in brain structure correlates with the sizes of individuals’ social networks. Whether variation in social network size causes variation in brain structure, however, is unknown. To address this question, we neuroimaged 23 monkeys that had been living in social groups set to different sizes. Subject comparison revealed that living in larger groups caused increases in gray matter in mid-superior temporal sulcus and rostral prefrontal cortex and increased coupling of activity in frontal and temporal cortex. Social network size, therefore, contributes to changes both in brain structure and function. The changes have potential implications for an animal’s success in a social context; gray matter differences in similar areas were also correlated with each animal’s dominance within its social network.

Tuesday, November 15, 2011

Synaptic switch to social status in medial prefrontal cortex.

Wang et al. have determined the social hierarchy within groups of mice by using multiple behavioral tests and find that the social hierarchical status of an individual correlates with the synaptic strength in medial prefrontal cortical neurons. Furthermore, the hierarchical status of mice can be changed from dominant to subordinate, or vice versa, by manipulating the strength of synapses in the medial prefrontal cortex. Here is the abstract, followed by a figure from the review by Maroteaux and Mameli:
Dominance hierarchy has a profound impact on animals’ survival, health, and reproductive success, but its neural circuit mechanism is virtually unknown. We found that dominance ranking in mice is transitive, relatively stable, and highly correlates among multiple behavior measures. Recording from layer V pyramidal neurons of the medial prefrontal cortex (mPFC) showed higher strength of excitatory synaptic inputs in mice with higher ranking, as compared with their subordinate cage mates. Furthermore, molecular manipulations that resulted in an increase and decrease in the synaptic efficacy in dorsal mPFC neurons caused an upward and downward movement in the social rank, respectively. These results provide direct evidence for mPFC’s involvement in social hierarchy and suggest that social rank is plastic and can be tuned by altering synaptic strength in mPFC pyramidal cells.


Illustration (click to enlarge) - Synapses and rank - Excitatory synaptic drive onto cortical pyramidal neurons in the mouse brain is stronger in dominant individuals than subordinates. Modulating synaptic strength by increasing or decreasing AMPA receptor–mediated transmission switches the initial social ranking.

Monday, November 14, 2011

On understanding our brains...

I thought I would pass on some chunks of the lead editorial in the Nov. 4 Science Magazine, written by two senior establishment neuroscientists, Sidney Brenner and Terrence Sejnowski:
Like most fields in biology, neuroscience is succumbing to an epidemic of data collecting. There are major projects under way to completely characterize the proteomic, metabolomic, genomic, and methylomic signatures for all of the different types of neurons and glial cells in the human brain. In addition, “connectomics” plans to provide the complete network structure of brains, and “synaptomics” aims to uncover all molecules and their interactions at synapses. This is a good time to pause and ask ourselves what we expect to find at the end of this immense omic brainbow.

Linnaeus's catalog of species and the classifications he imposed on them turned data into knowledge, but it did not lead to an understanding of why they were all there. That had to wait for Darwin's theory of evolution and the development of genetics. All the lists that we will accumulate about the brain, although necessary, will be far from sufficient for understanding. The human brain contains an estimated 86 billion neurons and an equal number of glial cells. The complete structure of the enormously simpler 302-neuron network of the nematode worm Caenorhabditis elegans was published in 1986. But without the activities of neurons and their synapses, it was far from a complete “wiring diagram.” Today, with genetically encoded calcium sensors, with better knowledge of the molecules present at synapses, and by integrating the omic catalogs with developmental and dynamical data, we may finally be in sight of completing the worm wiring diagram, as required for a full understanding of this one relatively simple nervous system.

…Since the 1980s, neuroscience has received visionary financial support from private foundations, jump-starting new fields including cognitive neuroscience and computational neuroscience based on new techniques for imaging and modeling the human brain. The global view of human brain activity thus far obtained from imaging experiments has changed the way we think about ourselves. Homo neuroeconomicus has replaced the rational-agent model of human behavior, neuroeducators want to make children better learners, and neuroethicists have been inspired by the discovery of biological links to aggression, trust, and affiliation. However, individual differences often dominate. Although expensive bets are being placed on explaining the diversity of human behavior and mental disorders with genetic polymorphisms, gene mutations, and chromosomal rearrangements, the results so far have been modest.

The Internet is making neuroscience more accessible to the public. The Society for Neuroscience, which convenes its annual meeting next week, will soon launch BrainFacts.org, a reliable source of information about the brain. In-depth interviews with neuroscientists can be found online at thesciencenetwork.org. And a Neuroeducation X Prize is being planned to encourage innovation in online computer games that enhance cognitive skills. Let us celebrate what our brains have discovered and what they can tell us about themselves.

Friday, November 11, 2011

Where in the brain does 'negative surprise' happen?

Egner points to an article by Alexander and Brown that provides an integrated model of how the brain deals with the non-occurrence of an expected event (think of George Bush and the famous Chinese door that failed to open). They suggest that this type of negative surprise drives neural responses in the dorsal anterior cingulate cortex and adjacent medial prefrontal cortex (dACC/mPFC), regions whose functions have been disputed in recent years. Their model provides a common denominator for a wide range of dACC/mPFC responses that have previously been attributed to diverse cognitive computations, making it a promising candidate for an integrative theory of this region's function. Their predicted response–outcome (PRO) model attempts to reconcile diverse findings by suggesting that individual neurons generate signals reflecting a learned prediction of the probability and timing of the various possible outcomes of an action. These prediction signals are inhibited when the corresponding predicted outcome actually occurs. The resulting activity is therefore maximal when an expected outcome fails to occur, which suggests that what mPFC signals, in part, is the unexpected non-occurrence of a predicted outcome. Here is their abstract:
The medial prefrontal cortex (mPFC) and especially anterior cingulate cortex is central to higher cognitive function and many clinical disorders, yet its basic function remains in dispute. Various competing theories of mPFC have treated effects of errors, conflict, error likelihood, volatility and reward, using findings from neuroimaging and neurophysiology in humans and monkeys. No single theory has been able to reconcile and account for the variety of findings. Here we show that a simple model based on standard learning rules can simulate and unify an unprecedented range of known effects in mPFC. The model reinterprets many known effects and suggests a new view of mPFC, as a region concerned with learning and predicting the likely outcomes of actions, whether good or bad. Cognitive control at the neural level is then seen as a result of evaluating the probable and actual outcomes of one's actions.
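The core computational idea - a learned outcome prediction that is suppressed when the outcome actually arrives, leaving the largest residual signal when an expected outcome is omitted - can be sketched in a few lines of Python. This is my own illustrative toy; the class name, delta-rule update, and learning rate are assumptions, not the actual Alexander and Brown model:

```python
# A toy sketch of the prediction/omission idea described above. The class
# name, delta-rule update, and learning rate are illustrative assumptions,
# not details of the actual Alexander & Brown PRO model.

class ProUnit:
    """Learns the probability of an action's outcome and emits a signal
    that is largest when a strongly expected outcome fails to occur."""

    def __init__(self, learning_rate=0.1):
        self.p = 0.5            # learned probability of the outcome
        self.lr = learning_rate

    def step(self, outcome_occurred: bool) -> float:
        # The prediction signal is inhibited when the outcome occurs,
        # so only the omission of a predicted outcome leaves a residue.
        surprise = 0.0 if outcome_occurred else self.p
        # Delta-rule update of the outcome prediction.
        target = 1.0 if outcome_occurred else 0.0
        self.p += self.lr * (target - self.p)
        return surprise

unit = ProUnit()
for _ in range(50):                 # outcome reliably occurs: p rises toward 1
    unit.step(True)
omission_signal = unit.step(False)  # the expected outcome is omitted
print(round(omission_signal, 2))    # → 1.0 (maximal "negative surprise")
```

Note how the same omission produces only a weak signal early in learning, when the outcome is not yet strongly expected - which is exactly the property that lets the model reinterpret "error" and "conflict" effects as prediction effects.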

Thursday, November 10, 2011

Insensitivity to social reputation in autism.

Further characterization of how social cognition is altered in autism - evidence for distinctive brain systems that mediate the effects of social reputation:
People act more prosocially when they know they are watched by others, an everyday observation borne out by studies from behavioral economics, social psychology, and cognitive neuroscience. This effect is thought to be mediated by the incentive to improve one's social reputation, a specific and possibly uniquely human motivation that depends on our ability to represent what other people think of us. Here we tested the hypothesis that social reputation effects are selectively impaired in autism, a developmental disorder characterized in part by impairments in reciprocal social interactions but whose underlying cognitive causes remain elusive. When asked to make real charitable donations in the presence or absence of an observer, matched healthy controls donated significantly more in the observer's presence than absence, replicating prior work. By contrast, people with high-functioning autism were not influenced by the presence of an observer at all in this task. However, both groups performed significantly better on a continuous performance task in the presence of an observer, suggesting intact general social facilitation in autism. The results argue that people with autism lack the ability to take into consideration what others think of them and provide further support for specialized neural systems mediating the effects of social reputation.

Wednesday, November 09, 2011

Our resting brain networks can be formed by multiple architectures.

There has been a lot of interest lately (see Jonah Lehrer's nice summary) in the resting, default, or 'mind wandering' state of our brains. It recruits functional networks with rich endogenous dynamics that are typically distributed across both cerebral hemispheres. An interdisciplinary collaboration involving Ralph Adolphs, with experiments carried out by Michael Tyszka, asked whether these resting states, as one might suppose, require the presence of the corpus callosum, the large bundle of fibers connecting the two hemispheres. What they found is that a normal complement of resting-state networks and intact functional coupling between the hemispheres can emerge in the absence of the corpus callosum, suggesting that resting brain networks can be formed by multiple architectures. Their abstract:
Temporal correlations between different brain regions in the resting-state BOLD signal are thought to reflect intrinsic functional brain connectivity. The functional networks identified are typically bilaterally distributed across the cerebral hemispheres, show similarity to known white matter connections, and are seen even in anesthetized monkeys. Yet it remains unclear how they arise. Here we tested two distinct possibilities: (1) functional networks arise largely from structural connectivity constraints, and generally require direct interactions between functionally coupled regions mediated by white-matter tracts; and (2) functional networks emerge flexibly with the development of normal cognition and behavior and can be realized in multiple structural architectures. We conducted resting-state fMRI in eight adult humans with complete agenesis of the corpus callosum (AgCC) and normal intelligence, and compared their data to those from eight healthy matched controls. We performed three main analyses: anatomical region-of-interest-based correlations to test homotopic functional connectivity, independent component analysis (ICA) to reveal functional networks with a data-driven approach, and ICA-based interhemispheric correlation analysis. Both groups showed equivalently strong homotopic BOLD correlation. Surprisingly, almost all of the group-level independent components identified in controls were observed in AgCC and were predominantly bilaterally symmetric. The results argue that a normal complement of resting-state networks and intact functional coupling between the hemispheres can emerge in the absence of the corpus callosum, favoring the second over the first possibility listed above.

Tuesday, November 08, 2011

When it's an error to mirror...

Mimicry and imitation can facilitate cultural learning, maintenance of culture, and group cohesion, but individuals must competently select the appropriate models and actions to imitate. Mimicry and imitation also play an important role in dyadic social interactions. People mimic their partners’ mannerisms, which increases rapport and the partners’ liking of the mimickers. A collaboration between psychologists and philosophers at the Univ. of California, San Diego asks whether and how mimicry unconsciously influences evaluations made by third-party observers of dyadic interactions. Their results indicate that third-party observers make judgments about individuals’ competence on the basis of their decisions concerning whether and whom to mimic. Contrary to the notion that mimicry is uniformly beneficial to the mimicker, people who mimicked an unfriendly model were rated as less competent than nonmimics. Thus, a positive reputation depends not only on the ability to mimic, but also on the ability to discriminate when not to mimic. Here is their experimental setup (click on figure to enlarge):



Figure: Illustration of the experimental paradigm and experimental results. Subjects watched two videos, in each of which an interviewer (model) interacted with an interviewee. After each video, subjects rated the interviewee’s competence, trustworthiness, and likeability. For each subject, one video showed a mimicking interviewee, and the other showed a nonmimicking interviewee. In Experiment 1, video frames were uncropped, so subjects could see the interviewer; in Experiment 2, video frames were cropped, so subjects could not see the interviewer, and mimicry was obscured. The interviewer’s attitude varied between subjects; some subjects saw videos with a cordial interviewer, and other subjects saw videos with a condescending interviewer. The graph shows the difference in average competence ratings between the cordial- and condescending-model conditions as a function of whether or not the interviewee mimicked the interviewer, separately for Experiments 1 and 2. Error bars represent standard errors of the difference between conditions.

Monday, November 07, 2011

Booze and our brains.

This is all I needed!..... yet another reason to chill on my happy hour Martini. I keep reminding myself that, in addition to yielding a pleasant ‘buzz’, a modest amount of alcohol is supposed to have overall health benefits. There is a growing body of evidence that alcohol triggers rapid changes in the immune system in the brain as well as neuronal changes. This immune response lies behind some of the well-known alcohol-related behavioral changes, such as difficulty controlling the muscles involved in walking and talking. Wu et al. find in experiments on mice that a receptor on immune cells (TLR4) that controls expression of genes related to the inflammatory response to pathogens is involved in alcohol-induced sedation and impaired motor activity. If the receptor’s action is blocked either by a drug (naloxone) or by the receptor’s genetic removal, the effects of alcohol are reduced. This suggests that drugs specifically targeting the TLR4 receptor might be useful in treating alcohol dependence and acute overdoses.

Friday, November 04, 2011

Dynamic Views of MindBlog

Some time ago I put a link you can see at the right top of this MindBlog home page to "DYNAMIC VIEWS OF MINDBLOG."  A reminder email from Google's Blogger News prompted me to click on the link for the first time in quite a while.  They have obviously been refining these views.  Give it a try....you can click through some very interesting ways of presenting posts in various arrays, many quite appealing visually.

Economic inequality is linked to biased self-perception

A common view is that Westerners are more likely to be individualists who seek personal success and uniqueness, and thus self-enhance (i.e., emphasize or exaggerate their desirable qualities relative to other people's) more than do Easterners, who are more likely to be collectivists seeking interpersonal harmony and belonging. An international group of collaborators proposes an alternative explanation that favors socioeconomic differences over cultural dimensions. They suggest that the extent to which people engage in biased self-perception is influenced by the economic structure of their society, specifically its level of economic inequality. They gathered data from 1,625 participants on five continents and in 15 nations: Europe (Belgium, Estonia, Germany, Hungary, Italy, Spain), the Americas (Peru, the United States, Venezuela), Asia (China, Japan, Singapore, South Korea), Africa (South Africa), and Oceania (Australia). Participants completed a standard questionnaire assessing self-enhancement. The bottom line is that people in societies with more income inequality tend to view themselves as superior to others, and people in societies with less income inequality tend to see themselves as more similar to their peers.
Their abstract:
People’s self-perception biases often lead them to see themselves as better than the average person (a phenomenon known as self-enhancement). This bias varies across cultures, and variations are typically explained using cultural variables, such as individualism versus collectivism. We propose that socioeconomic differences among societies—specifically, relative levels of economic inequality—play an important but unrecognized role in how people evaluate themselves. Evidence for self-enhancement was found in 15 diverse nations, but the magnitude of the bias varied. Greater self-enhancement was found in societies with more income inequality, and income inequality predicted cross-cultural differences in self-enhancement better than did individualism/collectivism. These results indicate that macrosocial differences in the distribution of economic goods are linked to microsocial processes of perceiving the self.


Figure: Scatter plot (with best-fitting regression line) showing self-enhancement (as indexed by beta weights from a two-level model) as a function of economic inequality (as indexed by the Gini coefficient) across nations. The data points for Australia and Italy are very close and overlap on the graph.
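For readers unfamiliar with the inequality index plotted on the x-axis: the Gini coefficient is 0 when everyone has the same income and approaches 1 when one person has everything. A minimal sketch of its computation via the standard sorted-rank formula (the function name and sample incomes are my own illustrations, not data from the study):

```python
# A minimal sketch of the Gini coefficient used as the inequality index
# in the figure above. Function name and sample incomes are illustrative.

def gini(incomes):
    """Gini coefficient: 0 = perfect equality, values near 1 = one
    person holds nearly everything. Uses the sorted-rank formula."""
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    # For sorted data with ranks i = 1..n:
    #   G = 2 * sum(i * x_i) / (n * total) - (n + 1) / n
    weighted = sum(rank * x for rank, x in enumerate(xs, start=1))
    return 2 * weighted / (n * total) - (n + 1) / n

print(gini([1, 1, 1, 1]))   # → 0.0  (perfect equality)
print(gini([0, 0, 0, 10]))  # → 0.75 (highly unequal)
```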

A bit from their discussion:
It is unlikely that economic inequality directly leads to biased self-perception. It seems more likely that there are intervening factors that result from socioeconomic differences. One possibility ... is perceived competition. When benefits and costs are polarized by inequality, people may compete for social superiority. One manifestation of this drive may be the presentation of the self as superior through self-enhancement. Thus, it may be the competitiveness triggered by economic inequality that drives biased self-perception. It is interesting to note that competitiveness may be related to differences in individualism as well, with more individualistic societies also fostering greater competition. Both individualism and economic inequality may work in concert to foster a perception of competition that results in cultural differences in levels of self-enhancement. Likewise, both individualism and economic inequality may undermine the norm of modesty. Modesty norms play an important role in reducing self-enhancement, and when they are compromised, self-enhancement increases. In societies with more income equality, people may not only have more-equal incomes, but they may also feel a pressure to seem more similar to others. This may manifest as a modesty norm, whereby people are discouraged from voicing both real and perceived superiority. Understanding the relationship between socioeconomic structure and individual psychology can help bridge the gulf between large-scale sociological studies of societies and individual social and psychological functioning.
An important limitation of the study was that, except for the United States, participants were drawn from university populations, and university students might often find themselves in situations in which their social standing is actually better than the average person’s, an effect which would be more pronounced in societies with more income inequality.

Thursday, November 03, 2011

Climate change and human crisis

In case you need anything further to depress you about impending climate change, Zhang et al. offer a fine-grained, quantitative analysis of past episodes of climate catastrophe, seeking to confirm what scholars have qualitatively noted: that massive social disturbance, societal collapse, and population collapse often coincided with great climate change in America, the Middle East, China, and many other countries in preindustrial times.
Recent studies have shown strong temporal correlations between past climate changes and societal crises. However, the specific causal mechanisms underlying this relation have not been addressed. We explored quantitative responses of 14 fine-grained agro-ecological, socioeconomic, and demographic variables to climate fluctuations from A.D. 1500–1800 in Europe (a period that contained both periods of harmony and times of crisis). Results show that cooling from A.D. 1560–1660 caused successive agro-ecological, socioeconomic, and demographic catastrophes, leading to the General Crisis of the Seventeenth Century. We identified a set of causal linkages between climate change and human crisis. Using temperature data and climate-driven economic variables, we simulated the alternation of defined “golden” and “dark” ages in Europe and the Northern Hemisphere during the past millennium. Our findings indicate that climate change was the ultimate cause, and climate-driven economic downturn was the direct cause, of large-scale human crises in preindustrial Europe and the Northern Hemisphere.
Their data support the causal links shown in this figure:


Figure - Set of causal linkages from climate change to large-scale human crisis in preindustrial Europe. The terms in bold black type are sectors, and terms in red type within parentheses are variables that represent the sector. The thickness of the arrow indicates the degree of average correlation.

Wednesday, November 02, 2011

Narcissistic Leaders and Group Performance

Nevicka et al. point to yet another case where reality is at odds with perceptions:
Although narcissistic individuals are generally perceived as arrogant and overly dominant, they are particularly skilled at radiating an image of a prototypically effective leader. As a result, they tend to emerge as leaders in group settings. Despite people’s positive perceptions of narcissists as leaders, it was previously unknown if and how leaders’ narcissism is related to the performance of the people they lead. In this study, we used a hidden-profile paradigm to investigate this question and found evidence for discordance between the positive image of narcissists as leaders and the reality of group performance. We hypothesized and found that although narcissistic leaders are perceived as effective because of their displays of authority, a leader’s narcissism actually inhibits information exchange between group members and thereby negatively affects group performance. Our findings thus indicate that perceptions and reality can be at odds and have important practical and theoretical implications.

Tuesday, November 01, 2011

The evolution of cognition

It is now commonly recognized that high-level cognitive function is not limited to primate lineages and, like many other traits, is shaped by selection imposed by ecological and environmental demands. MacLean et al. (PDF here) propose that a merger of the fields of comparative psychology and phylogenetics would greatly improve our ability to understand the forces that drive cognitive evolution:
Now more than ever animal studies have the potential to test hypotheses regarding how cognition evolves. Comparative psychologists have developed new techniques to probe the cognitive mechanisms underlying animal behavior, and they have become increasingly skillful at adapting methodologies to test multiple species. Meanwhile, evolutionary biologists have generated quantitative approaches to investigate the phylogenetic distribution and function of phenotypic traits, including cognition. In particular, phylogenetic methods can quantitatively (1) test whether specific cognitive abilities are correlated with life history (e.g., lifespan), morphology (e.g., brain size), or socio-ecological variables (e.g., social system), (2) measure how strongly phylogenetic relatedness predicts the distribution of cognitive skills across species, and (3) estimate the ancestral state of a given cognitive trait using measures of cognitive performance from extant species. Phylogenetic methods can also be used to guide the selection of species comparisons that offer the strongest tests of a priori predictions of cognitive evolutionary hypotheses (i.e., phylogenetic targeting). Here, we explain how an integration of comparative psychology and evolutionary biology will answer a host of questions regarding the phylogenetic distribution and history of cognitive traits, as well as the evolutionary processes that drove their evolution.

Monday, October 31, 2011

Very brief meditation training produces brain changes associated with positive emotions.

It is known that the ratio of left frontal to right frontal lobe activation is relatively higher in individuals with higher positive affect - an effect that can be monitored with EEG electrodes placed on the scalp. (In light of Gilbert's recent observation that "A wandering mind is an unhappy mind," I wonder if resting frontal asymmetry correlates with mind wandering....) Davidson and colleagues have shown that a fairly rigorous 8-week meditation training program can cause a significant increase in the left-sided anterior activation associated with positive affect. Now Moyer et al. at the Univ. of Wisconsin, Stout, claim that a much briefer intervention can be effective. Participants..
...who did not differ in frontal EEG asymmetry before training, were randomly assigned to the meditation training (MT; n = 11) or waiting-list (WL; n = 10) group. MT participants were told that nine 30-min sessions of meditation instruction were available to them and were encouraged to attend as many sessions as possible. A standard protocol was used to measure positive and negative affect before and after 15 min of attempted focused-attention meditation according to provided instructions (“relax with your eyes closed, and focus on the flow of your breath at the tip of your nose; if a random thought arises, acknowledge the thought and then simply let it go by gently bringing your attention back to the flow of your breath”).
Some results from the paper:
MT and WL participants did not differ in frontal EEG asymmetry before training, paired t(13) = 0.16, r = −.01, p = .88, d = 0.06 (see Figure, click to enlarge). During training, MT participants attended an average of 6.73 (SD = 1.35, range = 4−8) instruction sessions and reported engaging in independent 15-min intervals of meditation an average of 2.24 (SD = 1.01, range = 1−5) times per week. MT participants averaged 6 hr 13 min of training (SD = 1 hr 35 min, range = 3 hr 15 min to 9 hr 8 min) across the 5 weeks. After training, MT participants had significantly greater leftward shift in frontal EEG asymmetry than WL participants did across all time points, paired t(13) = 10.80, r = .40, p < .001, d = 3.18 (see Figure).
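For readers wondering what the reported "paired t(13)" statistic actually is: it is the mean of the pre-post difference scores divided by the standard error of those differences. Here is a sketch using only the Python standard library; the asymmetry scores below are made-up illustrations, not the study's data:

```python
# Sketch of the paired t statistic reported above (e.g. "paired
# t(13) = 10.80"). The pre/post scores are hypothetical examples.
import math

def paired_t(before, after):
    """Paired-samples t statistic computed on difference scores.
    Returns (t, degrees of freedom)."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                              # standard error
    return mean / se, n - 1

# Hypothetical frontal-asymmetry scores, pre vs. post training:
pre = [0.10, 0.05, 0.12, 0.08, 0.11]
post = [0.25, 0.22, 0.30, 0.24, 0.28]
t, df = paired_t(pre, post)
print(f"t({df}) = {t:.2f}")
```

A large t with small df, as in the paper, means the pre-post shift was big and consistent relative to the variability of the individual differences.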

Comparison with the control group (WL) seems somewhat shaky. The WL participants were doing nothing at all except knowing they would be offered training after the first group. What about giving them an equal amount of some other kind of 'instruction' (religious, philosophical, whatever) over the same intervals?

Some clips from the discussion:
With training, focused-attention meditation shifts frontal EEG asymmetry toward a pattern associated with positive, approach-oriented emotions. Further, this shift does not require hundreds or even dozens of hours of practice. Individual MT participants in this study averaged only 5 to 16 min of active training (i.e., instruction, independent practice) per day across 5 weeks, but still exhibited a strong change in EEG asymmetry compared with the WL group. Our results suggest that the benefits of meditation may be more accessible than was previously believed. However, this study does not indicate if such asymmetry is pervasive or is limited to the time of meditation and the brief intervals that immediately surround it...We suggest two explanations for the increase in EEG asymmetry that emerged after so little training. First, our MT participants were able to decide when to practice, and for how long; this flexibility allowed them to determine for themselves when they would be most receptive to meditation, and choosing advantageous times may have heightened the efficacy of the meditation. Second, the small amount of active practice participants reported may have enabled a larger amount of passive practice to occur spontaneously, without a conscious decision to meditate; such passive practice may have strengthened the effects of meditation. This latter explanation is consistent with reports from some MT participants that they occasionally found themselves focusing their attention in the way they had been taught, even without having set out to do so.

Friday, October 28, 2011

Adolescent brain changes while viewing media violence

Strenziok et al. (open access) note a habituation and desensitization of adolescent emotional network brain responses to TV violence, which raises the obvious concern that diminishing the linking of the consequences of aggression with an emotional response might promote aggressive attitudes and behavior.
Adolescents spend a significant part of their leisure time watching TV programs and movies that portray violence. It is unknown, however, how the extent of violent media use and the severity of aggression displayed affect adolescents’ brain function. We investigated skin conductance responses, brain activation and functional brain connectivity to media violence in healthy adolescents. In an event-related functional magnetic resonance imaging experiment, subjects repeatedly viewed normed videos that displayed different degrees of aggressive behavior. We found a downward linear adaptation in skin conductance responses with increasing aggression and desensitization towards more aggressive videos. Our results further revealed adaptation in a fronto-parietal network including the left lateral orbitofrontal cortex (lOFC), right precuneus and bilateral inferior parietal lobules, again showing downward linear adaptations and desensitization towards more aggressive videos. Granger causality mapping analyses revealed attenuation in the left lOFC, indicating that activation during viewing aggressive media is driven by input from parietal regions that decreased over time, for more aggressive videos. We conclude that aggressive media activates an emotion–attention network that has the capability to blunt emotional responses through reduced attention with repeated viewing of aggressive media contents, which may restrict the linking of the consequences of aggression with an emotional response, and therefore potentially promotes aggressive attitudes and behavior.

Thursday, October 27, 2011

A Fauré Nocturne - newborn chicks would like it....

I'm finding some of the Gabriel Fauré Nocturnes very pleasant. Here is Nocturne no. 3, op. 33. And, the abstract following the video is relevant to the debate over whether our preference for consonant music of this sort is rooted in acoustic properties important to the auditory system or is acquired through enculturation. Italian researchers find that newly hatched domestic chicks show a spontaneous preference for a visual imprinting object associated with consonant sound intervals over an identical object associated with dissonant sound intervals. This suggests that preference for harmonic relationships between frequency components may be related to the prominence of harmonic spectra in biological sounds in natural environments.



Here is the abstract from Chiandetti and Vallortigara:
The question of whether preference for consonance is rooted in acoustic properties important to the auditory system or is acquired through enculturation has not yet been resolved. Two-month-old infants prefer consonant over dissonant intervals, but it is possible that this preference is rapidly acquired through exposure to music soon after birth or in utero. Controlled-rearing studies with animals can help shed light on this question because such studies allow researchers to distinguish between biological predispositions and learned preferences. In the research reported here, we found that newly hatched domestic chicks show a spontaneous preference for a visual imprinting object associated with consonant sound intervals over an identical object associated with dissonant sound intervals. We propose that preference for harmonic relationships between frequency components may be related to the prominence of harmonic spectra in biological sounds in natural environments.

Wednesday, October 26, 2011

Our "divided brain" - an animated tutorial.

A loyal mindblog reader has pointed me to an animated lecture, by psychiatrist and writer Iain McGilchrist, that is quite fun to watch. It starts by briefly describing and debunking the pop-psychology about our split brains that reached a peak in the 1970s (for imagination and reason you in fact need BOTH hemispheres), and then proceeds through a lucid explanation of the evolution and function of our brain's hemispheric asymmetry, and the advanced functions made possible by the inhibitory functions of our frontal lobes. McGilchrist cites and illustrates Pascal's point that "The end point of rationality is to demonstrate the limits of rationality." The last few minutes of the 11 minute video are notably informative and hysterically funny. At the conclusion McGilchrist cites Einstein's comment that "The intuitive mind is a sacred gift and the rational mind is a faithful servant," noting that we have created a society that honors the servant. (You should check out the RSA organization's website, which points to a number of lectures in this style.)

Tuesday, October 25, 2011

Women's memory enhanced by lower male voice pitch.

Women are attracted to low male voices, and a prevailing idea is that this is relevant to mate selection. Kevin Allen and colleagues now show that women are also more likely to remember what those low voices say to them. (It makes sense that memory should be sensitive to content of adaptive value, helping us act in ways that enhance our reproductive fitness):
From a functionalist perspective, human memory should be attuned to information of adaptive value for one’s survival and reproductive fitness. While evidence of sensitivity to survival-related information is growing, specific links between memory and information that could impact upon reproductive fitness have remained elusive. Here, in two experiments, we showed that memory in women is sensitive to male voice pitch, a sexually dimorphic cue important for mate choice because it not only serves as an indicator of genetic quality, but may also signal behavioural traits undesirable in a long-term partner. In a first experiment, we found that women’s visual object memory is significantly enhanced when an object’s name is spoken during encoding in a masculinised (i.e., lower-pitch) versus feminised (i.e., higher-pitch) male voice, but that no analogous effect occurs when women listen to other women’s voices. A second experiment replicated this pattern of results, additionally showing that lowering and raising male voice pitch enhanced and impaired women’s memory, respectively, relative to a baseline (i.e., unmanipulated) voice condition. The modulatory effect of sexual dimorphism cues in the male voice may reveal a mate-choice adaptation within women’s memory, sculpted by evolution in response to the dilemma posed by the double-edged qualities of male masculinity.

Monday, October 24, 2011

Human genes still evolve rapidly

A Harvard group has found evolution in response to natural selection in a contemporary human population, showing a dramatic decrease in age of reproduction in a defined population since 1720. Context on the location of their study:
Ile aux Coudres is a 34-km2 island located ∼80 km to the northeast of Québec City along the St. Lawrence River (Canada). Thirty families settled on the island between 1720 and 1773 and the population reached 1,585 people by the 1950s. This population is ideal to study the genetic basis of life-history traits (LHTs). First, church registers provide exceptionally detailed records of dates of births, marriages, and deaths. Second, the long-term data and endogamy (marriages within the population) provide a deep and intricate pedigree to facilitate the separation of genetic and environmental influences on LHTs. Third, the population was very homogeneous among families, particularly in traits known to correlate with the timing of reproduction (social class, education, and religion). In addition, the split of resources among families was quite even due to the type of land distribution, and the number of professions was limited. This relative homogeneity should minimize confounding socioeconomic or shared environmental influences within quantitative genetic analyses.
Here is their abstract:
It is often claimed that modern humans have stopped evolving because cultural and technological advancements have annihilated natural selection. In contrast, recent studies show that selection can be strong in contemporary populations. However, detecting a response to selection is particularly challenging; previous evidence from wild animals has been criticized for both applying anticonservative statistical tests and failing to consider random genetic drift. Here we study life-history variation in an insular preindustrial French-Canadian population and apply a recently proposed conservative approach to testing microevolutionary responses to selection. As reported for other such societies, natural selection favored an earlier age at first reproduction (AFR) among women. AFR was also highly heritable and genetically correlated to fitness, predicting a microevolutionary change toward earlier reproduction. In agreement with this prediction, AFR declined from about 26 to 22 y over a 140-y period. Crucially, we uncovered a substantial change in the breeding values for this trait, indicating that the change in AFR largely occurred at the genetic level. Moreover, the genetic trend was higher than expected under the effect of random genetic drift alone. Our results show that microevolution can be detectable over relatively few generations in humans and underscore the need for studies of human demography and reproductive ecology to consider the role of evolutionary processes.

Friday, October 21, 2011

The brain's fountain of youth

Williams points to an article suggesting that Dracula may have gotten it right: young blood can restore an aging body. Giving young blood to older mice is known to boost their immune system and muscle function, and it now turns out that it also stimulates the synthesis of new nerve cells, boosting the number of cells in the hippocampus involved in memory formation. Conversely, serum from older mice decreases the number of these memory cells in younger mice. Wyss-Coray and collaborators identify a blood-borne protein (the cytokine CCL11) that increases with aging and inhibits synthesis of new nerve cells. Factors stimulating neurogenesis are being sought. Here is their abstract:
In the central nervous system, ageing results in a precipitous decline in adult neural stem/progenitor cells and neurogenesis, with concomitant impairments in cognitive functions. Interestingly, such impairments can be ameliorated through systemic perturbations such as exercise. Here, using heterochronic parabiosis we show that blood-borne factors present in the systemic milieu can inhibit or promote adult neurogenesis in an age-dependent fashion in mice. Accordingly, exposing a young mouse to an old systemic environment or to plasma from old mice decreased synaptic plasticity, and impaired contextual fear conditioning and spatial learning and memory. We identify chemokines—including CCL11 (also known as eotaxin)—the plasma levels of which correlate with reduced neurogenesis in heterochronic parabionts and aged mice, and the levels of which are increased in the plasma and cerebrospinal fluid of healthy ageing humans. Lastly, increasing peripheral CCL11 chemokine levels in vivo in young mice decreased adult neurogenesis and impaired learning and memory. Together our data indicate that the decline in neurogenesis and cognitive impairments observed during ageing can be in part attributed to changes in blood-borne factors.

Thursday, October 20, 2011

"Neurotrash"

In The Chronicle of Higher Education Marc Parry notes the crusade of Raymond Tallis to throw out the "Neurotrash." This is the goal of his new book "Aping Mankind: Neuromania, Darwinitis, and the Misrepresentation of Humanity" (McGill-Queen's University Press). Parry describes a Tallis lecture that seeks to demolish
…two "pillars of unwisdom." The first, "neuromania," is the notion that to understand people you must peer into the "intracranial darkness" of their skulls with brain-scanning technology. The second, "Darwinitis," is the idea that Charles Darwin's evolutionary theory can explain not just the origin of the human species—a claim Tallis enthusiastically accepts—but also the nature of human behavior and institutions….Those trends, as Tallis sees them, are like "intellectual illnesses" metastasizing from academic labs into popular culture. He sees the symptoms in neuro-economic thinkers who explain our susceptibility to subprime mortgages by describing how our brains evolved to favor short-term rewards. He sees them in philosophers who claim that our primate minds admire paintings of landscapes that would have supported hunting and gathering. He sees it in neurotheologians who preach that "God is a tingle in the 'God spot' in the brain."
The points Tallis makes are good - much of the press coverage of 'love spots in the brain,' etc., is nonsense - but Tallis does seem to throw out the baby with the bathwater. How could we make sense of the irrational social behaviors described in the previous two MindBlog posts this week outside of an evolutionary framework? And how do we explain that the activities of certain brain areas, when perturbed by strokes, electrical stimulation, or drugs, alter fairly discrete classes of behaviors?

Brief neuroscience video-tutorials

I recently received an email from Aki Nikolaidis, a neuroscience graduate student at the University of Illinois, asking me to have a look at some brief instructional videos he has been making over the past several years. I took a look and found them quite engaging. Here they are:

Fluid Intelligence

Consciousness and Free Will

Non-conscious Information Processing

Electroencephalogram (EEG)

Working Memory Training

Wednesday, October 19, 2011

The science of irrationality.

As a followup to yesterday's post on how common sense, ideology, and intuition lead us astray in our attempts to fix social problems - while social intervention programs validated by true randomized experiments are ignored - I point to Jonah Lehrer's brief review of Daniel Kahneman's new book "Thinking, Fast and Slow," which describes Kahneman's work on evolved blind spots in our rational processes - blind spots that appear to be virtually impossible to fix, even though we understand that they are there.
When people face an uncertain situation, they don't carefully evaluate the information or look up relevant statistics. Instead, their decisions depend on mental short cuts, which often lead them to make foolish decisions. The short cuts aren't a faster way of doing the math; they're a way of skipping the math altogether...The biases and blind-spots identified by Messrs. Kahneman and Tversky aren't symptoms of stupidity. They're an essential part of our humanity, the inescapable byproducts of a brain that evolution engineered over millions of years.

Consider the overconfidence bias, which drives many of our mistakes in decision-making. The best demonstration of the bias comes from the world of investing. Although many fund managers charge high fees to oversee stock portfolios, they routinely fail a basic test of skill: persistent achievement. As Mr. Kahneman notes, the year-to-year correlation between the performance of the vast majority of funds is barely above zero, which suggests that most successful managers are banking on luck, not talent...This shouldn't be too surprising. The stock market is a case study in randomness, a system so complex that it's impossible to predict. Nevertheless, professional investors routinely believe that they can see what others can't. The end result is that they make far too many trades, with costly consequences.

We like to see ourselves as a Promethean species, uniquely endowed with the gift of reason. But Mr. Kahneman's simple experiments reveal a very different mind, stuffed full of habits that, in most situations, lead us astray. Though overconfidence may encourage us to take necessary risks—Mr. Kahneman calls it the "engine of capitalism"—it's generally a dangerous (and expensive) illusion.

What's even more upsetting is that these habits are virtually impossible to fix. As Mr. Kahneman himself admits, "My intuitive thinking is just as prone to overconfidence, extreme predictions and the planning fallacy as it was before I made a study of these issues."...Even when we know why we stumble, we still find a way to fall.
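Kahneman's point about fund managers - that year-to-year performance correlation is barely above zero - is easy to illustrate numerically. Below is a minimal sketch, assuming (as the "case study in randomness" argument does) that each fund's annual return is an independent random draw; the returns are simulated, not real fund data, and the `pearson` helper is just a plain correlation coefficient written out by hand.

```python
import random

random.seed(0)

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Simulate two consecutive years of returns for 500 funds, assuming
# performance is driven entirely by chance (mean 7%, s.d. 15%).
year1 = [random.gauss(0.07, 0.15) for _ in range(500)]
year2 = [random.gauss(0.07, 0.15) for _ in range(500)]

r = pearson(year1, year2)
print(f"year-to-year correlation: {r:.3f}")
```

Under the luck-only assumption, the printed correlation hovers near zero - exactly the "failure of persistent achievement" Kahneman describes. A manager whose skill genuinely carried over from year to year would show up as a substantially positive correlation instead.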

Tuesday, October 18, 2011

The science of psychological change

In the Oct 14 issue of Science Magazine, Geoffrey L. Cohen reviews Timothy Wilson's new book "Redirect: The Surprising New Science of Psychological Change." The book reviews success stories in social psychology, and I thought it would be worthwhile to pass on a few clips from that review:
There are interventions that harness the power of expressive writing and volunteerism to improve happiness and health and to lessen rates of teen pregnancy. There are interventions that reduce student failure and close gaps between minority and nonminority students by inculcating in them core positive beliefs that sustain them through hardship, such as the belief that intelligence is not a fixed entity but rather like a muscle that grows with effort. There are interventions that improve intertribal trust in Rwanda by modeling cooperative intergroup relations through radio soap operas. In the United States, interventions that defuse blacks' and whites' fear of interracial rejection increase their likelihood of becoming friends. And...there are studies that cleverly manipulate social norms to reduce teen alcohol use and encourage energy conservation.
And notes:
What these interventions share is that they are grounded in science, found effective in randomized experiments, have surprisingly large and durable effects—and, by and large, aren't used. Over and over, Wilson writes, schools, government agencies, and workplaces opt for interventions that not only have never been subjected to experimental test but also, when they finally are, often yield null and even negative effects. These interventions are usually based on a combination of intuition, ideology, and good intentions. Wilson critiques several popular but unwise interventions: Drug Abuse Resistance Education (D.A.R.E., implemented in 75% of the school districts in the United States), “scared straight,” certain forms of posttraumatic grief counseling, many commonplace diversity training programs, and the self-help and positive thinking industry in general ["The Secret" book receives sustained criticism]. These are analogous, Wilson writes, to the practices of leeching and blood-letting before the scientific method took hold in medicine.

Wilson uses the thought-provoking metaphor of “story editing” to describe the ingredient common to many of the successful interventions he reviews. They alter the narratives people tell themselves about their world and their place in it: Is it safe or threatening? Do I belong or not? Am I capable or not? During sensitive periods, people's storytelling can be redirected and the change can build on itself over time. Amend the opening sentence of the story of your transition to college, or to a new job, and the arc of your story may be entirely different from what it would have been otherwise. This helps explain why seemingly simple interventions, such as writing about a traumatic experience, or volunteering for a humanitarian cause, improve health and well-being. They give people an organizing narrative that puts their lives in an optimistic context.

Monday, October 17, 2011

Dissolution of the social contract

I've been meaning to point to a very cogent essay by Marmor and Mashaw on why conditions for recovery from the Great Depression of the early 1930s were more propitious than those that prevail in our current recession - which promises to be very prolonged.
...there is a crucial difference between then and now: the words that our political leaders use to talk about our problems have changed. Where politicians once drew on a morally resonant language of people, family and shared social concern, they now deploy the cold technical idiom of budgetary accounting...This is more than a superficial difference in rhetoric. It threatens to deprive us of the intellectual resources needed to address today’s problems.

From the 1930s to the 1960s...American public discourse was filled with references to the social circumstances of average citizens, our common institutions and our common history. Over the last five decades, that discourse has changed in ways that emphasize individual choice, agency and preferences. The language of sociology and common culture has been replaced by the language of economics and individualism.

In 1934, the government was us. We had shared circumstances, shared risks and shared obligations. Today the government is the other — not an institution for the achievement of our common goals, but an alien presence that stands between us and the realization of individual ambitions. Programs of social insurance have become “entitlements,” a word apparently meant to signify not a collectively provided and cherished basis for family-income security, but a sinister threat to our national well-being.

Over the last 50 years we seem to have lost the words — and with them the ideas — to frame our situation appropriately.

Friday, October 14, 2011

Differences in reality monitoring correlate with prefrontal cortex variations

Here is an intriguing bit of work from Buda et al.:
Much recent interest has centered on understanding the relationship between brain structure variability and individual differences in cognition, but there has been little progress in identifying specific neuroanatomical bases of such individual differences. One cognitive ability that exhibits considerable variability in the healthy population is reality monitoring: the cognitive processes used to introspectively judge whether a memory came from an internal or external source (e.g., whether an event was imagined or actually occurred). Neuroimaging research has implicated the medial anterior prefrontal cortex (PFC) in reality monitoring, and here we sought to determine whether morphological variability in a specific anteromedial PFC brain structure, the paracingulate sulcus (PCS), might underlie performance. Fifty-three healthy volunteers were selected on the basis of MRI scans and classified into four groups according to presence or absence of the PCS in their left or right hemisphere. The group with absence of the PCS in both hemispheres showed significantly reduced reality monitoring performance and ability to introspect metacognitively about their performance when compared with other participants. Consistent with the prediction that sulcal absence might mean greater volume in the surrounding frontal gyri, voxel-based morphometry revealed a significant negative correlation between anterior PFC gray matter and reality monitoring performance. The findings provide evidence that individual differences in introspective abilities like reality monitoring may be associated with specific structural variability in the PFC.
Examples of prominent (left) and absent (right) PCS classifications. In the left panel, PCS is indicated by the red arrow.

Thursday, October 13, 2011

Merging emotional information from voice and faces in the brain.

Klasen et al. find the ventral posterior cingulate to be a central structure for the supramodal representation of complex emotional information, with the left amygdala reflecting input of happy stimuli from multiple sensory modalities:
Supramodal representation of emotion and its neural substrates have recently attracted attention as a marker of social cognition. However, the question whether perceptual integration of facial and vocal emotions takes place in primary sensory areas, multimodal cortices, or in affective structures remains unanswered yet. Using novel computer-generated stimuli, we combined emotional faces and voices in congruent and incongruent ways and assessed functional brain data (fMRI) during an emotional classification task. Both congruent and incongruent audiovisual stimuli evoked larger responses in thalamus and superior temporal regions compared with unimodal conditions. Congruent emotions were characterized by activation in amygdala, insula, ventral posterior cingulate (vPCC), temporo-occipital, and auditory cortices; incongruent emotions activated a frontoparietal network and bilateral caudate nucleus, indicating a greater processing load in working memory and emotion-encoding areas. The vPCC alone exhibited differential reactions to congruency and incongruency for all emotion categories and can thus be considered a central structure for supramodal representation of complex emotional information. Moreover, the left amygdala reflected supramodal representation of happy stimuli. These findings document that emotional information does not merge at the perceptual audiovisual integration level in unimodal or multimodal areas, but in vPCC and amygdala.

Wednesday, October 12, 2011

Measuring Zeitgeist from the Tweet stream.

Golder and Macy have analyzed more than 509 million Twitter posts from 2.4 million users over a 2-year period to study collective mood (using Twitter to track the mood of nations, rather like using satellites to track the state of the atmosphere!). They used a freely available protocol provided by Twitter to download tweets originating from 84 countries between February 2008 and January 2010, and searched these messages for roughly 1,000 words on a tried-and-tested list of words associated with positive (agree, fantastic, super) and negative (afraid, mad, panic) emotion. Despite very different cultures, geographies, and religions, the U.S., Canada, the UK, Australia, India, and English-speaking Africa all showed similar mood rhythms:
We identified individual-level diurnal and seasonal mood rhythms in cultures across the globe, using data from millions of public Twitter messages. We found that individuals awaken in a good mood that deteriorates as the day progresses—which is consistent with the effects of sleep and circadian rhythm—and that seasonal change in baseline positive affect varies with change in daylength. People are happier on weekends, but the morning peak in positive affect is delayed by 2 hours, which suggests that people awaken later on weekends.
A graphic from the article:
Hourly changes in individual affect broken down by day of the week (top, positive affect, PA; bottom, negative affect, NA). Each series shows mean affect (black lines) and 95% confidence intervals (colored regions). (Experimental psychologists have repeatedly demonstrated that positive and negative affect are independent dimensions. Positive affect (PA) includes enthusiasm, delight, activeness, and alertness, whereas negative affect (NA) includes distress, fear, anger, guilt, and disgust.)
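The core of the method - scoring each message by the fraction of its words that appear on positive and negative emotion word lists - can be sketched in a few lines. The tiny word sets and sample tweets below are illustrative stand-ins of my own, not the roughly 1,000-word lexicon the study actually used.

```python
# Toy positive/negative word lists, seeded with the examples from the
# post; the real study used a much larger validated lexicon.
POSITIVE = {"agree", "fantastic", "super", "happy", "delight"}
NEGATIVE = {"afraid", "mad", "panic", "fear", "angry"}

def affect_scores(text):
    """Return (PA, NA): fractions of words matching each emotion list."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    if not words:
        return 0.0, 0.0
    pa = sum(w in POSITIVE for w in words) / len(words)
    na = sum(w in NEGATIVE for w in words) / len(words)
    return pa, na

tweets = [
    "fantastic morning, I agree completely!",
    "stuck in traffic and mad about it",
]
for t in tweets:
    pa, na = affect_scores(t)
    print(f"PA={pa:.2f} NA={na:.2f} :: {t}")
```

Averaging such per-message scores by hour of posting, across millions of tweets, yields the diurnal PA and NA curves plotted in the figure above.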

Tuesday, October 11, 2011

Fatherhood decreases testosterone.

Gettler et al. find, in a community-based sample from the Philippines, that men with higher testosterone levels are more likely to marry than men with lower testosterone; that men who marry and become fathers experience declines in testosterone; and that fathers who provide more paternal care have lower testosterone levels than fathers who provide less care:
In species in which males care for young, testosterone (T) is often high during mating periods but then declines to allow for caregiving of resulting offspring. This model may apply to human males, but past human studies of T and fatherhood have been cross-sectional, making it unclear whether fatherhood suppresses T or if men with lower T are more likely to become fathers. Here, we use a large representative study in the Philippines (n = 624) to show that among single nonfathers at baseline (2005) (21.5 ± 0.3 y), men with high waking T were more likely to become partnered fathers by the time of follow-up 4.5 y later (P < 0.05). Men who became partnered fathers then experienced large declines in waking (median: −26%) and evening (median: −34%) T, which were significantly greater than declines in single nonfathers (P < 0.001). Consistent with the hypothesis that child interaction suppresses T, fathers reporting 3 h or more of daily childcare had lower T at follow-up compared with fathers not involved in care (P < 0.05). Using longitudinal data, these findings show that T and reproductive strategy have bidirectional relationships in human males, with high T predicting subsequent mating success but then declining rapidly after men become fathers. Our findings suggest that T mediates tradeoffs between mating and parenting in humans, as seen in other species in which fathers care for young. They also highlight one likely explanation for previously observed health disparities between partnered fathers and single men.

Monday, October 10, 2011

Mimicry can foster both rapport and threat.

Liu does experiments that illustrate two aspects of mimicry. Mimicry can bond people by fostering rapport and liking, but it appears to have the opposite effect if participants in an interaction are primed with reminders of money. Money-primed participants liked a mimicking interaction partner less than they liked a nonmimicking partner, an effect that appears to have been due to enhanced feelings of threat. These observations are consonant with studies showing that reminders of money elicit a self-sufficient state characterized by two tendencies: first, an eagerness to pursue personal goals and freedom, persisting longer than others on difficult tasks and hesitating to ask for help; and second, acting more insensitively toward others, preferring solo over group activities and showing more indifference to social exclusion.

Friday, October 07, 2011

The Jekyll and Hyde of emotional intelligence

Côté et al. make some observations about emotional smarts:
Does emotional intelligence promote behavior that strictly benefits the greater good, or can it also advance interpersonal deviance? In the investigation reported here, we tested the possibility that a core facet of emotional intelligence—emotion-regulation knowledge—can promote both prosocial and interpersonally deviant behavior. Drawing from research on how the effective regulation of emotion promotes goal achievement, we predicted that emotion-regulation knowledge would strengthen the effects of other-oriented and self-oriented personality traits on prosocial behavior and interpersonal deviance, respectively. Among individuals with higher emotion-regulation knowledge, a first study noted that moral identity exhibited a stronger positive association with prosocial behavior in a social dilemma [i.e. the study confirmed the authors' prediction that there is an association between moral identity and prosocial behavior in a social dilemma, this association being stronger among individuals with high emotion-regulation knowledge]. A second study found that the positive relation between Machiavellianism and interpersonal deviance was stronger when emotion-regulation knowledge was high rather than when it was low, thus pointing out the dark side of emotion-regulation knowledge.