Tuesday, August 05, 2014

Anger as the most easily spread emotion.

Teddy Wayne does an essay on how anger is the emotion that spreads the most easily over social media. Some clips:
A 2013 study, from Beihang University in Beijing, of Weibo, a Twitter-like site, found that anger is the emotion that spreads the most easily over social media. Joy came in a distant second. The main difference, said Ryan Martin, a psychology professor at the University of Wisconsin, Green Bay, who studies anger, is that although we tend to share the happiness only of people we are close to, we are willing to join in the rage of strangers. As the study suggests, outrage is lavishly rewarded on social media, whether through supportive comments, retweets or Facebook likes. People prone to Internet outrage are looking for validation, Professor Martin said. “They want to hear that others share it,” he said, “because they feel they’re vindicated and a little less lonely and isolated in their belief.”
...outrage carries a different flavor from pure anger; it suggests an affront to one’s value system as opposed to seething, Hulk-like fury. So whereas a venomous insult from an anonymous commenter simply seeks to tear down another person or institution, an outraged Twitter post from an identified account calls attention to the user’s own probity. By throwing 140-character stones from our Google Glass houses, we preserve our belief (or delusion) that we are morally superior to those who have offended us.
Perhaps the real problem, Professor Martin suggested, isn’t our rage but our rashness, and its relationship to our easily accessible devices. The Internet exacerbates impulse-control problems. You get mad, and you can tell the world about it in moments before you’ve had a chance to calm down and think things through.

Monday, August 04, 2014

Brain noise? Insomnia? Try A.S.M.R.

Fairyington does an interesting piece on a phenomenon called autonomous sensory meridian response (A.S.M.R.), which is felt as a mild, calming, tingling sensation that travels over the scalp or other parts of the body in response to certain kinds of subtle, repetitive visual, auditory, or olfactory stimulation (rustling pages, whispering, tapping, scratching, etc.). The article contains numerous links to YouTube sites devoted to this effect. Some clips:
Carl W. Bazil, a sleep disorders specialist at Columbia University, says A.S.M.R. videos may provide novel ways to switch off our brains...“People who have insomnia are in a hyper state of arousal,” he said. “Behavioral treatments — guided imagery, progressive relaxation, hypnosis and meditation — are meant to try to trick your unconscious into doing what you want it to do. A.S.M.R. videos seem to be a variation on finding ways to shut your brain down.”
Bryson Lochte, a post-baccalaureate fellow at the National Institute on Drug Abuse who looked into A.S.M.R. for his senior thesis as a neuroscience major at Dartmouth College last year, has submitted his paper for publication in a scientific journal. Mr. Lochte said, “We focused on those areas in the brain associated with motivation, emotion and arousal to probe the effect A.S.M.R. has on the ‘reward system’ — the neural structures that trigger a dopamine surge amid pleasing reinforcements, like food or sex.
He compared A.S.M.R. to another idiosyncratic but well-studied sensation called musical frisson, which provokes a thrilling ripple of chills or goose bumps (technically termed piloerection) over one’s body in emotional response to music. Mathias Benedek, a research assistant at the University of Graz in Austria who co-authored two studies on emotion-provoked piloerection, says A.S.M.R. may be a softer, quieter version of the same phenomenon. “Frisson may simply be a stronger, full-blown response,” he said. And like A.S.M.R., the melodies that ignite frisson in one person may not in another.
Robert J. Zatorre, a professor of neuroscience at the Montreal Neurological Institute and Hospital at McGill University who has also studied musical frisson, said that “the upshot of my paper is that pleasurable music elicits dopamine activity in the striatum, which is a key component of the reward system” in the brain. Writing in The New York Times last year, in an article titled “Why Music Makes Our Brain Sing,” he notes, “What may be most interesting here is when this neurotransmitter is released: not only when the music rises to a peak emotional moment, but also several seconds before, during what we might call the anticipation phase.”
Perhaps the everyday experiences that A.S.M.R. videos capture — whispering, crinkling, opening and closing of boxes — evoke similar anticipatory mechanisms, sparking memories of past pleasures that we anticipate and relive each time we watch and listen.

Saturday, August 02, 2014

MindBlog gets married.

Deric Bownds and his partner of 25 years, Len Walker, having brunch at Palmer House in Chicago after getting married at the Cook County Courthouse during a visit with friends Mark Weber and Roy Wesley.


Friday, August 01, 2014

Faith and ideology trump reason...

Sigh...sorry to spread such pessimistic material, but I pass on two items on the persistence of faith or ideology over reason. Nyhan describes a number of studies, including one by Kahan, who finds that the divide over belief in evolution between more and less religious people is wider among people who otherwise show familiarity with math and science, which suggests that the problem isn't a lack of information. And Paul Krugman issues another installment of his railing against the inflation delusions clung to by conservative economists and politicians.

Thursday, July 31, 2014

Brain activity that reflects positive and negative emotion.

Knutson et al. at Stanford note a correlation between self-reported positive and negative arousal and fMRI measurements of brain activity in the nucleus accumbens and anterior insula (if you go to Google Images and enter these terms you can see the locations of these regions in the brain). Their abstract:
Neuroimaging findings are often interpreted in terms of affective experience, but researchers disagree about the advisability or even possibility of such inferences, and few frameworks explicitly link these levels of analysis. Here, we suggest that the spatial and temporal resolution of functional magnetic resonance imaging (fMRI) data could support inferences about affective states. Specifically, we propose that fMRI nucleus accumbens (NAcc) activity is associated with positive arousal, whereas a combination of anterior insula activity and NAcc activity is associated with negative arousal. This framework implies quantifiable and testable inferences about affect from fMRI data, which may ultimately inform predictions about approach and avoidance behavior.
And a figure from their paper:


Meta-analytic results for activity in nucleus accumbens (NAcc; white circles) and anterior insula (black circles) during incentive anticipation. Activation likelihood estimate maps adapted from Bartra et al. (who also present a list of regions correlating with affect), superimposed onto the affective circumplex [from right to left: positive minus negative subjective value (SV), positive subjective value, positive plus negative subjective value, and negative subjective value].
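As a toy illustration of the kind of quantifiable, testable inference the framework proposes (my own sketch on made-up numbers, not the authors' analysis), one could regress self-reported positive and negative arousal on trial-by-trial NAcc and anterior insula activity estimates and check whether the pattern of coefficients matches the proposal:

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 200

# Hypothetical per-trial fMRI activity estimates (arbitrary units).
nacc = rng.normal(size=n_trials)     # nucleus accumbens
insula = rng.normal(size=n_trials)   # anterior insula

# Simulated self-reports following the framework's proposal:
# positive arousal tracks NAcc; negative arousal tracks insula plus NAcc.
pos_arousal = 0.8 * nacc + rng.normal(scale=0.5, size=n_trials)
neg_arousal = 0.6 * insula + 0.3 * nacc + rng.normal(scale=0.5, size=n_trials)

# Test the proposal: regress each self-report on the two brain signals.
X = np.column_stack([np.ones(n_trials), nacc, insula])
beta_pos, *_ = np.linalg.lstsq(X, pos_arousal, rcond=None)
beta_neg, *_ = np.linalg.lstsq(X, neg_arousal, rcond=None)

print("positive arousal ~ intercept, NAcc, insula:", beta_pos.round(2))
print("negative arousal ~ intercept, NAcc, insula:", beta_neg.round(2))
```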

Wednesday, July 30, 2014

Brain correlates of behaviors in market bubbles.

Interesting...from Smith et al., a visualization of the part of our brains that seems to be saying "go for it" during a market bubble (and thereby making less money) and of another region that says "Whoa..." (whose activity is more prominent in successful traders who pull out of the market before the crash).
Groups of humans routinely misassign value to complex future events, especially in settings involving the exchange of resources. If properly structured, experimental markets can act as excellent probes of human group-level valuation mechanisms during pathological overvaluations—price bubbles. The connection between the behavioral and neural underpinnings of such phenomena has been absent, in part due to a lack of enabling technology. We used a multisubject functional MRI paradigm to measure neural activity in human subjects participating in experimental asset markets in which endogenous price bubbles formed and crashed. Although many ideas exist about how and why such bubbles may form and how to identify them, our experiment provided a window on the connection between neural responses and behavioral acts (buying and selling) that created the bubbles. We show that aggregate neural activity in the nucleus accumbens (NAcc) tracks the price bubble and that NAcc activity aggregated within a market predicts future price changes and crashes. Furthermore, the lowest-earning subjects express a stronger tendency to buy as a function of measured NAcc activity. Conversely, we report a signal in the anterior insular cortex in the highest earners that precedes the impending price peak, is associated with a higher propensity to sell in high earners, and that may represent a neural early warning signal in these subjects. Such markets could be a model system to understand neural and behavior mechanisms in other settings where emergent group-level activity exhibits mistaken belief or valuation.
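To make the reported relationship concrete, here is a minimal sketch (made-up series, not the authors' pipeline) of the kind of lagged regression that would test whether market-aggregated NAcc activity in one trading period predicts the price change into the next:

```python
import numpy as np

rng = np.random.default_rng(1)
n_periods = 50

# Hypothetical market-aggregated NAcc signal ("go for it"), simulated here so
# that it leads the next period's price change, as reported in the paper.
nacc = rng.normal(size=n_periods)
price_change = np.empty(n_periods)
price_change[0] = 0.0
price_change[1:] = 0.4 * nacc[:-1] + rng.normal(scale=0.5, size=n_periods - 1)
price = 10 + np.cumsum(price_change)

# Lagged regression: does NAcc at period t predict the change into period t+1?
X = np.column_stack([np.ones(n_periods - 1), nacc[:-1]])
beta, *_ = np.linalg.lstsq(X, price[1:] - price[:-1], rcond=None)
print("future price change ~ intercept + NAcc(t):", beta.round(2))
```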

Tuesday, July 29, 2014

An intriguing take on consciousness as a perceptual construct.

A recent review by Aaron Schurger in Science Magazine pointed me to Michael Graziano's 2013 book "Consciousness and the Social Brain", which I immediately downloaded, read, and abstracted. Very engaging and clear writing (although I am dumbfounded that he makes no reference to Thomas Metzinger's work and 'ego tunnel' model, which has common elements with his own). In Graziano's theory, awareness is information: the brain's simplified, schematic model of the complicated, data-handling process of attention. A brain can use the construct of awareness to model its own attentional state or to model someone else’s attentional state. An extract from Schurger's review:
In Consciousness and the Social Brain, Michael Graziano argues that consciousness is a perceptual construct—the brain attributes it to other people in much the same way that the brain attributes speech to the ventriloquist's puppet. To clarify, imagine being greeted by a very lifelike android version of your best friend with a prerecorded behavioral program that had you genuinely fooled for a few minutes. From your perspective, for those minutes, the android was endowed with consciousness. Thus there need be no truth or falsity to the statement “My friend standing before me is conscious.” Your brain decides that the android–best friend standing in front of you is conscious, and that is what you perceive to be true.
According to Graziano's “attention schema” theory, our own consciousness is also a perceptual construct—a unique one that emerges when the brain applies the same perceptual attribution recursively to itself. We attribute consciousness to others as part of our perceptual model of what they are paying attention to (an inference particularly useful for predicting their behavior). This model describes the process of attention as a mysterious something extra in the brains of beings that are selectively processing information that guides their behavior. When the brain applies the model to itself, “I” become endowed with this extra something as well—although, as with the android, it was never there in the first place.
According to the theory, consciousness is to attention what the body schema is to the body: it is the brain's perceptual description of its own process of attention. The two phenomena are thus locked “in a positive feedback loop,” which explains the tight connection between attention and consciousness. In essence, consciousness is a descriptive story about a real physical phenomenon (attention). The ink in which the story is written (neural activity) is real, and the physical phenomenon that the story is “about” (attention) is real. But, like the talking puppet, the story itself need not be real. We say that we have consciousness, and that it seems irreducible to physical phenomena, because that is how the brain describes the process of attention (in ourselves and in others): as something ineffable.
I'll also give you a clip from my abstracting of the book:
The heart of the theory is that awareness is a schematized, descriptive model of attention. The model is not perfectly accurate, but it is good enough to be useful. It is a rich information set, as rich as a sensory representation. It can be bound to a representation of an object as though it were another sensory attribute like color or motion….the purpose of a model in the brain is to be useful in interacting with the world, not to be accurate.

The body schema and the attention schema may share more than a formal similarity. They may partially overlap. The body schema is an internal model— an organized set of information that represents the shape, structure, and movement of the body, that distinguishes between objects belonging to the body and objects that are foreign.
In the present theory, the attention schema is similar to the body schema. Rather than representing one’s physical body, it models a different aspect of oneself, also a complex dynamical system, the process of attention— the process by which some signals in the brain become enhanced at the expense of others. It is a predictive model of attention, its dynamics, its essential meaning, its potential impact on behavior, what it can and can’t do, what affects it, and how. It is a simulation. The quirky way that attention shifts from place to place, from item to item, its fluctuating intensity, its spatial and temporal dynamics— all of these aspects are incorporated into the model.

Monday, July 28, 2014

Video game puzzle that improves executive function.

Maybe you don't have to pay brainhq.com or lumosity.com a monthly fee for brain exercises to improve your brain's executive functions. An iOS or Android app costing three dollars might do the job. Oei and Patterson make the interesting observation that executive function (making decisions in rapidly changing circumstances) can be improved 30% by a video game (Cut the Rope) that requires physics-based puzzle solving, but not by an action video game, a fast-paced arcade game, or a real-time strategy game. Tests of executive function were administered before and a week after the game training. Their abstract:
Recent research suggests a causal link between action video game playing and enhanced attention and visual-perceptual skills. In contrast, evidence linking action video games and enhanced executive function is equivocal. We investigated whether action and non-action video games enhance executive function. Fifty-five inexperienced video game players played one of four different games: an action video game (Modern Combat), a physics-based puzzle game (Cut the Rope), a real-time strategy game (Starfront Collision), and a fast paced arcade game (Fruit Ninja) for 20 h. Three pre and post training tests of executive function were administered: a random task switching, a flanker, and a response inhibition task (Go/No-go). Only the group that trained on the physics-based puzzle game significantly improved in all three tasks relative to the pre-test. No training-related improvements were seen in other groups. These results suggest that playing a complex puzzle game that demands strategizing, reframing, and planning improves several aspects of executive function.

Friday, July 25, 2014

Life purpose, longevity, and Alzheimer's disease.

From Hill and Turiano:
Having a purpose in life has been cited consistently as an indicator of healthy aging for several reasons, including its potential for reducing mortality risk. In the current study, we sought to extend previous findings by examining whether purpose in life promotes longevity across the adult years, using data from the longitudinal Midlife in the United States (MIDUS) sample. Proportional-hazards models demonstrated that purposeful individuals lived longer than their counterparts did during the 14 years after the baseline assessment, even when controlling for other markers of psychological and affective well-being. Moreover, these longevity benefits did not appear to be conditional on the participants’ age, how long they lived during the follow-up period, or whether they had retired from the workforce. In other words, having a purpose in life appears to widely buffer against mortality risk across the adult years.
(MIDUS refers to a longitudinal study of health and well-being that began in 1994–1995. 7,108 participants were recruited from a nationally representative, random-digit-dialing sample of noninstitutionalized adults between the ages of 20 and 75 (mean age = 46.92 years, SD = 12.94)).
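To make the statistical approach concrete, here is a minimal sketch of a proportional-hazards model of the kind described above, fit with the lifelines package on a made-up data frame (not the MIDUS data); a hazard ratio below 1 for the purpose score would correspond to the reported reduction in mortality risk:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 500

# Hypothetical stand-in for the 14-year follow-up: purpose score, one
# well-being covariate, years survived within the window, and death observed.
df = pd.DataFrame({
    "purpose":    rng.normal(0, 1, n),
    "well_being": rng.normal(0, 1, n),
    "years":      rng.uniform(1, 14, n),
    "died":       rng.integers(0, 2, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died")
cph.print_summary()   # hazard ratios for purpose and well_being
```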
An article by Span points to other studies following almost 1,000 people (age 80, on average) for up to seven years, finding that those with high purpose scores were 2.4 times more likely to remain free of Alzheimer's than those with low scores, and also less likely to develop mild cognitive impairment, often a precursor...In a subset of 246 people who died, autopsies found that many of the purposeful subjects also showed the distinctive markers of Alzheimer's, suggesting that even for people developing the plaques and tangles in their brains, having purpose in life allows them to tolerate the pathology and still maintain their cognition...Another study, of 1,238 people followed for up to five years (average age: 78), found that those with high purpose had roughly half the mortality rate of those with low purpose.

Thursday, July 24, 2014

How do you get to Carnegie Hall?

The standard answer, which I've used to end several of my lectures, is "practice, practice, practice." Macnamara et al. suggest there is a bit more to it than that (like genetics...there's no way my piano sight-reading ability, obvious at age 6, was due to practice):
More than 20 years ago, researchers proposed that individual differences in performance in such domains as music, sports, and games largely reflect individual differences in amount of deliberate practice, which was defined as engagement in structured activities created specifically to improve performance in a domain. This view is a frequent topic of popular-science writing—but is it supported by empirical evidence? To answer this question, we conducted a meta-analysis covering all major domains in which deliberate practice has been investigated. We found that deliberate practice explained 26% of the variance in performance for games, 21% for music, 18% for sports, 4% for education, and less than 1% for professions. We conclude that deliberate practice is important, but not as important as has been argued.
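A quick back-of-the-envelope conversion (my own arithmetic, using only the percentages reported above): variance explained is the square of the underlying correlation, so the reported figures imply roughly the following meta-analytic correlations between deliberate practice and performance:

```python
# Percent of performance variance explained by deliberate practice, per domain,
# as reported in the abstract ("professions" was "less than 1%", entered as 1).
variance_explained = {"games": 26, "music": 21, "sports": 18,
                      "education": 4, "professions": 1}

for domain, pct in variance_explained.items():
    r = (pct / 100) ** 0.5   # implied correlation: r = sqrt(R^2)
    print(f"{domain:12s} R^2 = {pct:2d}%  ->  r ~ {r:.2f}")
```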

Wednesday, July 23, 2014

Why is melody in the high notes and rhythm in the bass?

Hove et al. examine to what extent musical convention might be shaped by basic, evolved human auditory physiology.
Across cultures, polyphonic music most often conveys melody in higher-pitched sounds and rhythm in lower-pitched sounds. They show that, when two streams of tones are presented simultaneously, the brain better detects timing deviations in the lower-pitched than in the higher-pitched stream and that tapping synchronization to the tones is more influenced by the lower-pitched stream. Furthermore, their modeling reveals that, with simultaneous sounds, superior encoding of timing for lower sounds and of pitch for higher sounds arises early in the auditory pathway in the cochlea of the inner ear. Thus, these musical conventions likely arise from very basic auditory physiology.
The abstract:
The auditory environment typically contains several sound sources that overlap in time, and the auditory system parses the complex sound wave into streams or voices that represent the various sound sources. Music is also often polyphonic. Interestingly, the main melody (spectral/pitch information) is most often carried by the highest-pitched voice, and the rhythm (temporal foundation) is most often laid down by the lowest-pitched voice. Previous work using electroencephalography (EEG) demonstrated that the auditory cortex encodes pitch more robustly in the higher of two simultaneous tones or melodies, and modeling work indicated that this high-voice superiority for pitch originates in the sensory periphery. Here, we investigated the neural basis of carrying rhythmic timing information in lower-pitched voices. We presented simultaneous high-pitched and low-pitched tones in an isochronous stream and occasionally presented either the higher or the lower tone 50 ms earlier than expected, while leaving the other tone at the expected time. EEG recordings revealed that mismatch negativity responses were larger for timing deviants of the lower tones, indicating better timing encoding for lower-pitched compared with higher-pitch tones at the level of auditory cortex. A behavioral motor task revealed that tapping synchronization was more influenced by the lower-pitched stream. Results from a biologically plausible model of the auditory periphery suggest that nonlinear cochlear dynamics contribute to the observed effect. The low-voice superiority effect for encoding timing explains the widespread musical practice of carrying rhythm in bass-ranged instruments and complements previously established high-voice superiority effects for pitch and melody.
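To make the stimulus design concrete, here is a rough sketch (my own approximation of the paradigm described in the abstract, not the authors' stimuli) that synthesizes an isochronous stream of simultaneous high and low tones and shifts one low tone 50 ms earlier than expected:

```python
import numpy as np

fs = 44100                      # sample rate (Hz)
ioi = 0.6                       # inter-onset interval between tone pairs (s)
tone_dur = 0.1                  # tone duration (s)
n_tones = 20
deviant_index = 12              # this low tone occurs 50 ms early

def tone(freq, dur):
    t = np.arange(int(dur * fs)) / fs
    return np.sin(2 * np.pi * freq * t) * np.hanning(t.size)

stream = np.zeros(int((n_tones + 1) * ioi * fs))
for i in range(n_tones):
    onset_high = i * ioi
    onset_low = onset_high - (0.05 if i == deviant_index else 0.0)
    for freq, onset in ((800.0, onset_high), (200.0, onset_low)):
        start = int(onset * fs)
        stream[start:start + int(tone_dur * fs)] += tone(freq, tone_dur)

# `stream` can be written to a WAV file (e.g., with scipy.io.wavfile.write)
# for a listenable demo of a timing deviant in the lower voice.
```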

Tuesday, July 22, 2014

Blue is warmer than red?

Red colors are arousing and blue colors calming, so at first the results of Ho et al. seem counterintuitive: a red object at the same physical temperature as a blue object feels colder. They suggest that this is because our prior expectation, from the red color, that the object should be warmer biases our perception by contrast, making it seem cooler than it is.
It is commonly believed that reddish color induces warm feelings while bluish color induces cold feelings. We, however, demonstrate an opposite effect when the temperature information is acquired by direct touch. Experiment 1 found that a red object, relative to a blue object, raises the lowest temperature required for an object to feel warm, indicating that a blue object is more likely to be judged as warm than a red object of the same physical temperature. Experiment 2 showed that hand colour also affects temperature judgment, with the direction of the effect opposite to object colours. This study provides the first demonstration that colour can modulate temperature judgments when the temperature information is acquired by direct touch. The effects apparently oppose the common conception of red-hot/blue-cold association. We interpret this phenomenon in terms of “Anti-Bayesian” integration, which suggests that the brain integrates direct temperature input with prior expectations about temperature relationship between object and hand in a way that emphasizes the contrast between the two.
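A toy way to see the distinction the authors draw (my own illustrative sketch, not their model): a Bayesian combination pulls the felt temperature toward the color-based expectation, whereas the "anti-Bayesian" contrast they report pushes it away, so a red object at the same physical temperature ends up feeling cooler:

```python
def felt_temperature(measured, expected_from_color, weight=0.3, mode="contrast"):
    """Toy model of combining touch input with a color-based expectation."""
    if mode == "bayesian":        # assimilate toward the expectation
        return (1 - weight) * measured + weight * expected_from_color
    elif mode == "contrast":      # "anti-Bayesian": emphasize the discrepancy
        return measured - weight * (expected_from_color - measured)
    raise ValueError(mode)

measured = 33.0  # same physical temperature (degrees C) for both objects
print("red  object feels:", felt_temperature(measured, expected_from_color=36.0))
print("blue object feels:", felt_temperature(measured, expected_from_color=30.0))
```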

Monday, July 21, 2014

Ecstasy (MDMA) and LSD as therapeutic drugs

Kupferschmidt offers two pieces in Science magazine on using two currently banned classes of drugs for therapeutic purposes: the party drug ecstasy (3,4-methylenedioxymethamphetamine, or MDMA), and hallucinogenic compounds derived from fungi or mushrooms (LSD and psilocybin).

MDMA activates brain receptors for dopamine and noradrenaline and releases serotonin from nerve endings, leading to the characteristic feeling of euphoria that made it popular in clubs and at dance events. One study in which 10 out of 12 PTSD patients no longer met the diagnostic criteria for PTSD after two months of taking MDMA has motivated the launching of phase II clinical studies in Israel, Canada, and the United States.

LSD and psilocybin, which bind to serotonin and other brain receptors, are being tested in studies to treat depression, obsessive-compulsive disorder, anxiety, cluster headaches, and nicotine, alcohol, or cocaine addictions.

Friday, July 18, 2014

Feeling the social touch being observed in others.

Interesting work by Bolognini et al. on our mirroring of the emotions of others:
Touch has an emotional and communicative meaning, and it plays a crucial role in social perception and empathy. The intuitive link between others’ somatosensations and our sense of touch becomes ostensible in mirror-touch synesthesia, a condition in which the view of a touch on another person’s body elicits conscious tactile sensations on the observer’s own body. This peculiar phenomenon may implicate normal social mirror mechanisms. Here, we show that mirror-touch interference effects, synesthesia-like sensations, and even phantom touches can be induced in nonsynesthetes by priming the primary somatosensory cortex (SI) directly or indirectly via the posterior parietal cortex. These results were obtained by means of facilitatory paired-pulse transcranial magnetic stimulation (ppTMS) contingent upon the observation of touch. For these vicarious effects, the SI is engaged at 150 ms from the onset of the visual touch. Intriguingly, individual differences in empathic abilities, assessed with the Interpersonal Reactivity Index, drive the activity of the SI when nonsynesthetes witness others’ tactile sensations. This evidence implies that, under normal conditions, touch observation activates the SI below the threshold for perceptual awareness; through the visual-dependent tuning of SI activity by ppTMS, what is seen becomes felt, namely, mirror-touch synesthesia. On a broader perspective, the visual responsivity of the SI may allow an automatic and unconscious transference of the sensation that another person is experiencing onto oneself, and, in turn, the empathic sharing of somatosensations.

Thursday, July 17, 2014

Brain activity can reveal whom someone is thinking about.

A collaboration between five different research centers shows that, in predicting or imagining the behavior of others based on their personality, the brain relies on the same network of regions that supports other forms of mental simulation, such as remembering the past and planning for the future:
The behaviors of other people are often central to envisioning the future. The ability to accurately predict the thoughts and actions of others is essential for successful social interactions, with far-reaching consequences. Despite its importance, little is known about how the brain represents people in order to predict behavior. In this functional magnetic resonance imaging study, participants learned the unique personality of 4 protagonists and imagined how each would behave in different scenarios. The protagonists' personalities were composed of 2 traits: Agreeableness and Extraversion. Which protagonist was being imagined was accurately inferred based solely on activity patterns in the medial prefrontal cortex using multivariate pattern classification, providing novel evidence that brain activity can reveal whom someone is thinking about. Lateral temporal and posterior cingulate cortex discriminated between different degrees of agreeableness and extraversion, respectively. Functional connectivity analysis confirmed that regions associated with trait-processing and individual identities were functionally coupled. Activity during the imagination task, and revealed by functional connectivity, was consistent with the default network. Our results suggest that distinct regions code for personality traits, and that the brain combines these traits to represent individuals. The brain then uses this “personality model” to predict the behavior of others in novel situations.
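A minimal sketch of the kind of multivariate pattern classification described here (scikit-learn on made-up data, not the study's actual pipeline): decode which of four protagonists is being imagined from trial-wise voxel patterns in a region such as medial prefrontal cortex, with cross-validation:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n_trials, n_voxels = 120, 300

# Which of the 4 protagonists is imagined on each trial.
protagonist = rng.integers(0, 4, n_trials)

# Hypothetical voxel patterns: noise plus a weak protagonist-specific signature.
signatures = rng.normal(0, 0.5, (4, n_voxels))
X = rng.normal(size=(n_trials, n_voxels)) + signatures[protagonist]

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
acc = cross_val_score(clf, X, protagonist, cv=5)
print(f"decoding accuracy: {acc.mean():.2f} (chance = 0.25)")
```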

Wednesday, July 16, 2014

Response of large scale brain networks to acute stress.

I pass on this interesting summary and graphic by Hermans et al.
Exposure to acute stress prompts a reallocation of resources to a salience network, promoting fear and vigilance, at the cost of an executive control network. After stress subsides, resource allocation to these two networks reverses, which normalizes emotional reactivity and enhances higher-order cognitive processes important for long-term survival.

Schematic anatomical overview of salience and executive control networks. The sphere sizes illustrate the relative sizes of the clusters that co-activate with the respective networks. Our model proposes that these two neurocognitive systems are regulated in a time-dependent and reciprocal fashion by stress-related neuromodulators. Adapted from. Abbreviations: AI, anterior insula; am, amygdala; DACC, dorsal anterior cingulate cortex; DLPFC, dorsolateral prefrontal cortex; DMPFC, dorsomedial prefrontal cortex; DPPC, dorsal posterior parietal cortex; FEF, frontal eye fields (precentral/superior frontal sulci); hy, hypothalamus; IT, inferotemporal cortex; mb, midbrain; Th, thalamus; TPJ, temporoparietal junction; vs, ventral striatum.

Tuesday, July 15, 2014

Increased self-control without increased willpower

Here is a fascinating bit of work from Magen et al., who show that a simple cognitive reframing of the classic immediate-versus-delayed gratification test makes energy-requiring willpower less necessary.
In our paradigm, instead of presenting choices in a traditional hidden-zero format (e.g., “Would you prefer [A] $5 today OR [B] $10 in a month?”), choices are presented in an explicit-zero format, which references the nonreward consequences of each choice (e.g., “Would you prefer [A] $5 today and $0 in a month OR [B] $0 today and $10 in a month?”). Including future outcomes in all choice options has been argued to reduce the attentional bias toward immediate rewards that contributes to impulsive behavior.
Here, then, is their abstract:
People often exert willpower to choose a more valuable delayed reward over a less valuable immediate reward, but using willpower is taxing and frequently fails. In this research, we demonstrate the ability to enhance self-control (i.e., forgoing smaller immediate rewards in favor of larger delayed rewards) without exerting additional willpower. Using behavioral and neuroimaging data, we show that a reframing of rewards (i) reduced the subjective value of smaller immediate rewards relative to larger delayed rewards, (ii) increased the likelihood of choosing the larger delayed rewards when choosing between two real monetary rewards, (iii) reduced the brain reward responses to immediate rewards in the dorsal and ventral striatum, and (iv) reduced brain activity in the dorsolateral prefrontal cortex (a correlate of willpower) when participants chose the same larger later rewards across the two choice frames. We conclude that reframing can promote self-control while avoiding the need for additional willpower expenditure.
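As a concrete illustration of the reframing manipulation (my own sketch of the wording quoted above, not the authors' stimulus code), the two formats of the same intertemporal choice differ only in whether the zero outcomes are spelled out:

```python
def frame_choice(now_amount, later_amount, delay="a month", explicit_zero=False):
    """Render an intertemporal choice in hidden-zero or explicit-zero format."""
    if explicit_zero:
        a = f"[A] ${now_amount} today and $0 in {delay}"
        b = f"[B] $0 today and ${later_amount} in {delay}"
    else:
        a = f"[A] ${now_amount} today"
        b = f"[B] ${later_amount} in {delay}"
    return f"Would you prefer {a} OR {b}?"

print(frame_choice(5, 10))                      # hidden-zero format
print(frame_choice(5, 10, explicit_zero=True))  # explicit-zero format
```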

Monday, July 14, 2014

Cooperating with the future

You should have a look at this nice Nature Video that very simply illustrates work by Hauser et al. dealing with how we might design policies aimed at preserving shared resources, such as clean air or fish stocks. They show conditions under which individuals will share current resources with future generations who cannot return the favor. Preservation rather than depletion of a resource for future generations can be obtained if a group of people agrees to a binding vote on how much each member should take from the common pool.
Overexploitation of renewable resources today has a high cost on the welfare of future generations. Unlike in other public goods games, however, future generations cannot reciprocate actions made today. What mechanisms can maintain cooperation with the future? To answer this question, we devise a new experimental paradigm, the ‘Intergenerational Goods Game’. A line-up of successive groups (generations) can each either extract a resource to exhaustion or leave something for the next group. Exhausting the resource maximizes the payoff for the present generation, but leaves all future generations empty-handed. Here we show that the resource is almost always destroyed if extraction decisions are made individually. This failure to cooperate with the future is driven primarily by a minority of individuals who extract far more than what is sustainable. In contrast, when extractions are democratically decided by vote, the resource is consistently sustained. Voting is effective for two reasons. First, it allows a majority of cooperators to restrain defectors. Second, it reassures conditional cooperators that their efforts are not futile. Voting, however, only promotes sustainability if it is binding for all involved. Our results have implications for policy interventions designed to sustain intergenerational public goods.
A variation of this procedure has been put to use by the United States’ largest electric utility, PG&E, to get customers to sign up for monitoring which helps prevent summer electrical grid failure and blackouts by slightly reducing air conditioning when the grid is under stress. Enrollment in the program was enhanced by publicly posting the names of those who had signed up.
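Returning to the experiment itself, below is a small simulation sketch of the two conditions (my own toy version of the 'Intergenerational Goods Game', with made-up pool size, group size, and defection rate): when extraction is individual, a single over-extractor per generation is enough to destroy the resource, whereas a binding median vote keeps it sustainable.

```python
import random
random.seed(0)

POOL = 100          # units available to each generation (made-up parameters)
PLAYERS = 5
THRESHOLD = 50      # if more than this is extracted, the resource is destroyed
GENERATIONS = 20

def intended_extraction(is_defector):
    # Cooperators take a sustainable share; defectors take far more.
    return 20 if is_defector else THRESHOLD // PLAYERS

def play(vote=False):
    for g in range(GENERATIONS):
        defectors = [random.random() < 0.2 for _ in range(PLAYERS)]  # ~20% defectors
        intents = [intended_extraction(d) for d in defectors]
        if vote:
            # Binding vote: everyone extracts the median proposed amount.
            extracted = sorted(intents)[PLAYERS // 2] * PLAYERS
        else:
            extracted = sum(intents)
        if extracted > THRESHOLD:
            return g + 1        # resource destroyed; later generations get nothing
    return GENERATIONS          # resource sustained through every generation

print("generations served, individual choice:", play(vote=False))
print("generations served, binding vote:     ", play(vote=True))
```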

Friday, July 11, 2014

Does phase of the moon influence our sleep? Three contradictory studies.

This is an update on a previous MindBlog posting. Vyazovskiy and Foster review three recent studies that give contradictory results on how or whether the phase of the moon influences our sleep. They note that the three studies compared data obtained from different subjects at different lunar phases and were biased and imbalanced in terms of age, gender, and many other factors. They suggest that in future research it should be mandatory to design within-subject experiments, rather than perform further retrospective studies. Here is their statement of the situation:
Whether the moon affects our sleep has intrigued our species since ancient times, but in the last decades only relatively few attempts have been made to address this issue with scientific rigor, and solid conclusions have been elusive [1]. A new cycle of research on the lunar effects on sleep was triggered by a retrospective study which carefully re-analyzed the sleep data collected under laboratory conditions in 33 subjects (age range 20–74 years) and showed clear cut effects of the lunar phase on several subjective and objective sleep parameters [2]. Specifically, EEG slow-wave activity (SWA), total sleep time and subjective sleep quality were reduced around the time of the full moon, while sleep latency and latency to REM sleep were prolonged. This study corroborated an earlier report [5], which found a significant decrease in the amount of subjective sleep around the full moon in 31 subjects (mean age of 50 years). This report triggered two further studies, published in the current issue, which either contradict or report novel effects of lunar phase [3, 4].
One of these studies, a re-analysis of existing large data sets, could not confirm any of the findings made by Cajochen et al. [3]. By contrast, a second retrospective study [4], in which 47 young volunteers were analyzed, confirmed a decreased total sleep time around the full moon, but REM sleep latency was longer around the new moon. This contradicts the Cajochen et al. study as they found that the latency to REM was longest around the full moon [2].
References:
1. R.G. Foster, T. Roenneberg. Human responses to the geophysical daily, annual and lunar cycles. Curr. Biol., 18 (2008), pp. R784–R794.
2. C. Cajochen, S. Altanay-Ekici, M. Munch, S. Frey, V. Knoblauch, A. Wirz-Justice. Evidence that the lunar cycle influences human sleep. Curr. Biol., 23 (2013), pp. 1485–1488.
3. M. Cordi, S. Ackermann, F.W. Bes, F. Hartmann, B.N. Konrad, L. Genzel, M. Pawlowski, A. Steiger, H. Schulz, B. Rasch, M. Dresler. Lunar cycle effects on sleep and the file drawer problem. Curr. Biol., 24 (2014), pp. R549–R550.
4. M. Smith, I. Croy, K.P. Waye. Human sleep and cortical reactivity are influenced by lunar phase. Curr. Biol., 24 (2014), pp. R551–R552.
5. M. Roosli, P. Juni, C. Braun-Fahrlander, M.W. Brinkhof, N. Low, M. Egger. Sleepless night, the moon is bright: longitudinal study of lunar phase and sleep. J. Sleep Res., 15 (2006), pp. 149–153.

Thursday, July 10, 2014

The untutored mind does not like to be alone with itself.

Our mammalian brains evolved to physically engage the world in the interest of our survival and passing on genes. Humans are distinctive among animals in being able to disengage, and some meditation techniques train just such disengagement. A recent collaboration including Daniel Gilbert (see "A wandering mind is an unhappy mind.") makes the interesting observation that not only is disengagement difficult for most people, but some, if asked to just sit in a room and do nothing (with a nine-volt battery the only entertainment provided), prefer to electrically shock themselves rather than be deprived of external sensory stimuli!
In 11 studies, we found that participants typically did not enjoy spending 6 to 15 minutes in a room by themselves with nothing to do but think, that they enjoyed doing mundane external activities much more, and that many preferred to administer electric shocks to themselves instead of being left alone with their thoughts. Most people seem to prefer to be doing something rather than nothing, even if that something is negative.

Wednesday, July 09, 2014

Caring for the future

In a fascinating behavioral economics experiment, Hauser et al. examine the willingness of people in a group to sacrifice personal gains for future generations, and show that whether the majority willing to sacrifice for the future is adequate to the task depends on whether choices are made individually or by group decision. Nature magazine does a nice presentation of this work with an instructive video and a News and Views commentary. Here is the abstract of the article:
Overexploitation of renewable resources today has a high cost on the welfare of future generations. Unlike in other public goods games, however, future generations cannot reciprocate actions made today. What mechanisms can maintain cooperation with the future? To answer this question, we devise a new experimental paradigm, the ‘Intergenerational Goods Game’. A line-up of successive groups (generations) can each either extract a resource to exhaustion or leave something for the next group. Exhausting the resource maximizes the payoff for the present generation, but leaves all future generations empty-handed. Here we show that the resource is almost always destroyed if extraction decisions are made individually. This failure to cooperate with the future is driven primarily by a minority of individuals who extract far more than what is sustainable. In contrast, when extractions are democratically decided by vote, the resource is consistently sustained. Voting is effective for two reasons. First, it allows a majority of cooperators to restrain defectors. Second, it reassures conditional cooperators that their efforts are not futile. Voting, however, only promotes sustainability if it is binding for all involved. Our results have implications for policy interventions designed to sustain intergenerational public goods.
And, by the way, here is a nice piece on "Caring for the present", how peer presence and pressure can help preserve electric grids.

Tuesday, July 08, 2014

Information integration without awareness.

I want to point to an excellent review by Christof Koch and colleagues. It contains some useful summary graphics.
•Empirical data suggest that consciousness is not necessary for integration.
•Unconscious integration is nevertheless limited.
•Consciousness enables integrations over extended spatiotemporal windows.
•Consciousness may be needed for novel and high-level semantic integrations.
Information integration and consciousness are closely related, if not interdependent. But, what exactly is the nature of their relation? Which forms of integration require consciousness? Here, we examine the recent experimental literature with respect to perceptual and cognitive integration of spatiotemporal, multisensory, semantic, and novel information. We suggest that, whereas some integrative processes can occur without awareness, their scope is limited to smaller integration windows, to simpler associations, or to ones that were previously acquired consciously. This challenges previous claims that consciousness of some content is necessary for its integration; yet it also suggests that consciousness holds an enabling role in establishing integrative mechanisms that can later operate unconsciously, and in allowing wider-range integration, over bigger semantic, spatiotemporal, and sensory integration windows.

Monday, July 07, 2014

Couch potato? You should be taking omega-3!

I pass on the abstract from Leckie et al. (DHA refers to docosahexaenoic acid, an omega-3 polyunsaturated fatty acid that is highly concentrated in the brain, and has been associated with better performance on measures of executive function.) I guess the idea is that if you don't want to exercise, you should at least take an omega-3 supplement!
Greater amounts of physical activity (PA) and omega-3 fatty acids have both been independently associated with better cognitive performance. Because of the overlapping biological effects of omega-3 fatty acids and PA, fatty acid intake may modify the effects of PA on neurocognitive function. The present study tested this hypothesis by examining whether the ratio of serum omega-6 to omega-3 fatty acid levels would moderate the association between PA and executive and memory functions in 344 participants (Mean age=44.42 years, SD=6.72). The Paffenbarger Physical Activity Questionnaire (PPAQ), serum fatty acid levels, and performance on a standard neuropsychological battery were acquired on all subjects. A principal component analysis reduced the number of cognitive outcomes to three factors: n-back working memory, Trail Making test, and Logical Memory. We found a significant interaction between PA and the ratio of omega-6 to omega-3 fatty acid serum levels on Trail Making performance and n-back performance, such that higher amounts of omega-3 levels offset the deleterious effects of lower amounts of PA. These effects remained significant in a subsample (n=299) controlling for overall dietary fat consumption. There were no significant additive or multiplicative benefits of higher amounts of both omega-3 and PA on cognitive performance. Our results demonstrate that a diet high in omega-3 fatty acids might mitigate the effect of lower levels of PA on cognitive performance. This study illuminates the importance of understanding dietary and PA factors in tandem when exploring their effects on neurocognitive health.
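A minimal sketch of the kind of moderation analysis described in the abstract (statsmodels on made-up data, not the study's variables or pipeline): regress a cognitive score on physical activity, the omega-6/omega-3 ratio, and their interaction, which is the term that carries the moderation:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 344

df = pd.DataFrame({
    "pa":    rng.normal(0, 1, n),   # physical activity (standardized, hypothetical)
    "ratio": rng.normal(0, 1, n),   # serum omega-6 / omega-3 ratio (standardized)
})
# Roughly simulate the reported pattern: physical activity matters more for
# performance when the omega-6/omega-3 ratio is high (i.e., omega-3 is low).
df["score"] = 0.1 * df.pa - 0.2 * df.ratio + 0.25 * df.pa * df.ratio \
              + rng.normal(0, 1, n)

model = smf.ols("score ~ pa * ratio", data=df).fit()
print(model.params)   # the pa:ratio coefficient captures the moderation
```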

Friday, July 04, 2014

Moral judgements depend on what language we’re speaking.

Costa et al. use the famous trolley problem to offer another example of the incredible power of the tribal or "us versus them" nature of our psychology. Studies have shown that this mentality (fundamental, for example, to the current chaos in the Middle East) emerges spontaneously in previously homogenous groups of young children as well as adults. In the trolley problem, the following scenario is presented to subjects: An approaching trolley is about to kill five people farther down the tracks. The only way to stop it is to push a large man off the footbridge and onto the tracks below. This will save the five people but kill the man. (It will not help if you jump; you are not large enough.) Do you push him? Costa et al. find that when people are presented with the trolley problem in a foreign language, they are more willing to sacrifice one person to save five than when they are presented with the dilemma in their native tongue. Their abstract:
Should you sacrifice one man to save five? Whatever your answer, it should not depend on whether you were asked the question in your native language or a foreign tongue so long as you understood the problem. And yet here we report evidence that people using a foreign language make substantially more utilitarian decisions when faced with such moral dilemmas. We argue that this stems from the reduced emotional response elicited by the foreign language, consequently reducing the impact of intuitive emotional concerns. In general, we suggest that the increased psychological distance of using a foreign language induces utilitarianism. This shows that moral judgments can be heavily affected by an orthogonal property to moral principles, and importantly, one that is relevant to hundreds of millions of individuals on a daily basis.

Thursday, July 03, 2014

Consciousness is constructed through a discrete set of activity spaces.

This fascinating work by Hudson et al. shows that as the brain recovers consciousness from a perturbation such as anesthesia, it does not follow a steady, monotonic path toward consciousness, but rather passes through several discrete activity states. They performed a principal component analysis on local field potentials recorded with electrodes inserted into rat anterior cingulate and retrosplenial cortices and the intralaminar thalamus:
It is not clear how, after a large perturbation, the brain explores the vast space of potential neuronal activity states to recover those compatible with consciousness. Here, we analyze recovery from pharmacologically induced coma to show that neuronal activity en route to consciousness is confined to a low-dimensional subspace. In this subspace, neuronal activity forms discrete metastable states persistent on the scale of minutes. The network of transitions that links these metastable states is structured such that some states form hubs that connect groups of otherwise disconnected states. Although many paths through the network are possible, to ultimately enter the activity state compatible with consciousness, the brain must first pass through these hubs in an orderly fashion. This organization of metastable states, along with dramatic dimensionality reduction, significantly simplifies the task of sampling the parameter space to recover the state consistent with wakefulness on a physiologically relevant timescale.
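A minimal sketch of the analysis logic (made-up data, not the authors' pipeline): reduce multichannel activity to a low-dimensional subspace with PCA, discretize it into candidate metastable states by clustering, then tabulate the transitions between states to look for hub-like structure:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)

# Hypothetical spectral features from multi-site LFP recordings:
# rows are time windows, columns are channel/frequency-band features.
X = rng.normal(size=(2000, 40))

# 1. Project into a low-dimensional subspace.
pcs = PCA(n_components=3).fit_transform(X)

# 2. Discretize into candidate metastable states.
states = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(pcs)

# 3. Count transitions between successive time windows.
n_states = states.max() + 1
transitions = np.zeros((n_states, n_states), dtype=int)
for a, b in zip(states[:-1], states[1:]):
    transitions[a, b] += 1

# States linked to many other states would act as hubs in the transition network.
out_degree = (transitions > 0).sum(axis=1)
print("transition counts:\n", transitions)
print("number of distinct successor states per state:", out_degree)
```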

Wednesday, July 02, 2014

Parenting rewires men's brains.

From Abraham et al., interesting material on a global "parental caregiving" neural network in our brains:
Although contemporary socio-cultural changes dramatically increased fathers' involvement in childrearing, little is known about the brain basis of human fatherhood, its comparability with the maternal brain, and its sensitivity to caregiving experiences. We measured parental brain response to infant stimuli using functional MRI, oxytocin, and parenting behavior in three groups of parents (n = 89) raising their firstborn infant: heterosexual primary-caregiving mothers (PC-Mothers), heterosexual secondary-caregiving fathers (SC-Fathers), and primary-caregiving homosexual fathers (PC-Fathers) rearing infants without maternal involvement. Results revealed that parenting implemented a global “parental caregiving” neural network, mainly consistent across parents, which integrated functioning of two systems: the emotional processing network including subcortical and paralimbic structures associated with vigilance, salience, reward, and motivation, and mentalizing network involving frontopolar-medial-prefrontal and temporo-parietal circuits implicated in social understanding and cognitive empathy. These networks work in concert to imbue infant care with emotional salience, attune with the infant state, and plan adequate parenting. PC-Mothers showed greater activation in emotion processing structures, correlated with oxytocin and parent-infant synchrony, whereas SC-Fathers displayed greater activation in cortical circuits, associated with oxytocin and parenting. PC-Fathers exhibited high amygdala activation similar to PC-Mothers, alongside high activation of superior temporal sulcus (STS) comparable to SC-Fathers, and functional connectivity between amygdala and STS. Among all fathers, time spent in direct childcare was linked with the degree of amygdala-STS connectivity. Findings underscore the common neural basis of maternal and paternal care, chart brain–hormone–behavior pathways that support parenthood, and specify mechanisms of brain malleability with caregiving experiences in human fathers.

Monday, June 30, 2014

Emotional contagion through social networks

I have frequently noticed that simply reading the barrage of negative news in the daily New York Times about the myriad things in our world that aren't working can unconsciously tilt me into a more negative or depressed mood that requires active countermeasures. Kramer et al. now use the newsfeed of a social network to demonstrate and quantify such a phenomenon. (The collaboration of Facebook with researchers who used a random selection of 500,000 Facebook users as lab rats has drawn a storm of comment. I'm posting this earlier than I planned, because now I'm watching the NBC evening news do a segment on the issue. And here is Jaron Lanier weighing in on the debate.) The article's abstract:
Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. Emotional contagion is well established in laboratory experiments, with people transferring positive and negative emotions to others. Data from a large real-world social network, collected over a 20-y period suggests that longer-lasting moods (e.g., depression, happiness) can be transferred through networks [Fowler JH, Christakis NA (2008) BMJ 337:a2338], although the results are controversial. In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. This work also suggests that, in contrast to prevailing assumptions, in-person interaction and nonverbal cues are not strictly necessary for emotional contagion, and that the observation of others’ positive experiences constitutes a positive experience for people.

Mind reading has to be taught.

A review by Heyes and Frith notes how our learning to read minds is like learning to read print, except that it occurs much earlier in our development. It is a slow, effortful process in which a novice adds to innate neurocognitive mechanisms by developing a culture-specific skill through expert tuition.
It is not just a manner of speaking: “Mind reading,” or working out what others are thinking and feeling, is markedly similar to print reading. Both of these distinctly human skills recover meaning from signs, depend on dedicated cortical areas, are subject to genetically heritable disorders, show cultural variation around a universal core, and regulate how people behave. But when it comes to development, the evidence is conflicting. Some studies show that, like learning to read print, learning to read minds is a long, hard process that depends on tuition. Others indicate that even very young, nonliterate infants are already capable of mind reading. Here, we propose a resolution to this conflict. We suggest that infants are equipped with neurocognitive mechanisms that yield accurate expectations about behavior (“automatic” or “implicit” mind reading), whereas “explicit” mind reading, like literacy, is a culturally inherited skill; it is passed from one generation to the next by verbal instruction.

Friday, June 27, 2014

Stress chemistry predicts age-related cognitive decline.

Anderson et al. show, in studies on rats, that elevated hypothalamo-pituitary-adrenal (HPA) activity not only impairs hippocampal function during aging but may also underlie the deterioration of other cognitive functions. In aging (but not younger) animals, spatial working memory deficits were exacerbated by increased HPA activity. They note changes in nerve synapse structure that correlate with these deficits. The abstract offers details:
Cognitive decline in aging is marked by considerable variability, with some individuals experiencing significant impairments and others retaining intact functioning. Whereas previous studies have linked elevated hypothalamo-pituitary-adrenal (HPA) axis activity with impaired hippocampal function during aging, the idea has languished regarding whether such differences may underlie the deterioration of other cognitive functions. Here we investigate whether endogenous differences in HPA activity are predictive of age-related impairments in prefrontal structural and behavioral plasticity. Young and aged rats (4 and 21 months, respectively) were partitioned into low or high HPA activity, based upon averaged values of corticosterone release from each animal obtained from repeated sampling across a 24 h period. Pyramidal neurons in the prelimbic area of medial prefrontal cortex were selected for intracellular dye filling, followed by 3D imaging and analysis of dendritic spine morphometry. Aged animals displayed dendritic spine loss and altered geometric characteristics; however, these decrements were largely accounted for by the subgroup bearing elevated corticosterone. Moreover, high adrenocortical activity in aging was associated with downward shifts in frequency distributions for spine head diameter and length, whereas aged animals with low corticosterone showed an upward shift in these indices. Follow-up behavioral experiments revealed that age-related spatial working memory deficits were exacerbated by increased HPA activity. By contrast, variations in HPA activity in young animals failed to impact structural or behavioral plasticity. These data implicate the cumulative exposure to glucocorticoids as a central underlying process in age-related prefrontal impairment and define synaptic features accounting for different trajectories in age-related cognitive function.

Thursday, June 26, 2014

Megascience efforts and the brain.

Grillner offers a commentary on the current megascience efforts (costing gazillions of dollars) to develop infrastructure for tools, modeling, or neuroinformatics. It seems a bit surprising that they are not instead focused directly on gaining fundamental new insights into brain function. We're putting the cart before the horse...the macro-efforts are not going to get us there if we haven't painstakingly worked out the steps of what goes on between ion channels, nerve cells, simple systems of nerve cells, and increasingly higher levels of integration.  He makes his simple point in a summary graphic which I pass on:

The Interface between the Cellular Level and Global Brain Function Is the Major Challenge for Current Neuroscience.



The two extreme levels of neuroscience that have evolved rapidly: cellular (left) and brain imaging (right). The challenge is to bridge these levels in order to explain behavior in terms of cells and synapses. The figure emphasizes that, to have a solid underpinning of the circuits underlying a specific function, one needs to be able to bridge from genes through the intermediate steps indicated in the figure to, for instance, a cognitive or behavioral function.

Wednesday, June 25, 2014

Executive control training reduces rumination.

Rumination (thinking repetitively and passively about negative emotions, focusing on symptoms of distress) is a maladaptive form of self-focus that is a core factor in depression and other disorders of emotion dysregulation. Cohen et al. show that training individuals to exert executive control when processing negative stimuli (to attenuate the effects of emotional content) can alleviate subsequent ruminative thinking and rumination-related sad mood. The training used an arrow version of the flanker task:
In this task, participants indicate the direction of a middle arrow while ignoring two distracting arrows on either side. The distractors are either congruent or incongruent with the direction of the target arrow. Responding to incongruent trials is slower than responding to congruent trials because they present a cognitive conflict and executive control is recruited to resolve this conflict. In each trial, the flanker stimulus is followed by a negative or a neutral picture. The effect of the flanker stimulus type (congruent vs. incongruent) on emotional interference is assessed using a simple discrimination task in which participants are required to indicate whether a square is blue or green. The discrimination task is performed immediately after the presentation of the picture. Our prior work has shown that prolonged RT on the discrimination task following negative compared with neutral pictures was found following congruent but not incongruent stimuli...Participants were assigned either to an experimental group, in which 90% of the negative pictures were preceded by incongruent stimuli, or to a control group, in which only 10% of the negative pictures were preceded by incongruent stimuli. For both groups, the emotional flanker task contained 50% congruent and 50% incongruent stimuli, which were followed by 50% negative and 50% neutral pictures. Thus, the only difference between the groups was the pairing of congruent and incongruent flanker stimuli with negative or neutral pictures. We predicted that compared with participants in the control group, participants in the experimental group (i.e., where 90% of the negative pictures were preceded by incongruent stimuli) would show reduced ruminative thinking and rumination-related negative mood following the training.
Their data show that processing incongruent flanker stimuli prior to the presentation of emotional stimuli (i.e., exercising executive control) can promote inhibition of irrelevant emotional information and reduce its interference. Here is the abstract:
Rumination, a maladaptive self-reflection, is a risk factor for depression, thought to be maintained by executive control deficits that impair ruminators’ ability to ignore emotional information. The current research examined whether training individuals to exert executive control when exposed to negative stimuli can ease rumination. A total of 85 participants were randomly assigned to one of two training conditions. In the experimental condition activation of executive control was followed predominantly by the presentation of negative pictures, whereas in the control condition it was followed predominantly by neutral pictures. As predicted, participants in the experimental group showed reduced state rumination compared with those in the control group. Furthermore, trait rumination, and particularly its maladaptive subtype brooding, was associated with increased sadness only among participants in the control group, and not in the experimental group. We argue that training individuals to exert executive control when processing negative stimuli can alleviate ruminative thinking and rumination-related sad mood.

Tuesday, June 24, 2014

Distinguishing the 50 united states with a tightness-looseness measure.

Harrington and Gelfand offer a parsimonious mechanism for the striking cultural and political differences among the 50 United States by suggesting that the states differ in tightness (many strongly enforced rules and little tolerance for deviance) versus looseness (few strongly enforced rules and greater tolerance for deviance), with this being a logical outcome of their different circumstances (ecological threats, human threats, etc.). They find that tightness–looseness and collectivism–individualism are distinct constructs. Data from their index and state-level indices of collectivism–individualism demonstrate that there are tight states that are collectivistic (e.g., Alabama, Mississippi, Texas, South Carolina), loose states that are collectivistic (e.g., Hawaii, New Jersey, Maryland, California), loose states that are individualistic (e.g., Oregon, Washington, New Hampshire, Vermont), and tight states that are individualistic (e.g., Wyoming, Kansas, Oklahoma, Ohio). In this tightness-looseness figure, the states are organized into quintiles, with the ten loosest states shown in the lightest color. The map (click to enlarge) was constructed at www.diymaps.net.
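For concreteness, here is a minimal sketch of how 50 state scores could be binned into the quintiles used to shade such a map. It assumes that lower scores mean looser; the function and the scale direction are my own illustration, not the authors' code.

```python
def quintile_bins(scores):
    """Assign each state a quintile from its tightness score: quintile 1 = the loosest
    fifth (lightest color on the map), quintile 5 = the tightest fifth. `scores` would
    map each of the 50 states to its index value, with lower values assumed looser."""
    ranked = sorted(scores, key=scores.get)        # loosest (lowest score) first
    bin_size = max(1, len(ranked) // 5)            # 10 states per quintile for 50 states
    return {state: min(rank // bin_size + 1, 5) for rank, state in enumerate(ranked)}
```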


Here is their abstract:
This research demonstrates wide variation in tightness–looseness (the strength of punishment and degree of latitude/permissiveness) at the state level in the United States, as well as its association with a variety of ecological and historical factors, psychological characteristics, and state-level outcomes. Consistent with theory and past research, ecological and man-made threats—such as a higher incidence of natural disasters, greater disease prevalence, fewer natural resources, and greater degree of external threat—predicted increased tightness at the state level. Tightness is also associated with higher trait conscientiousness and lower trait openness, as well as a wide array of outcomes at the state level. Compared with loose states, tight states have higher levels of social stability, including lowered drug and alcohol use, lower rates of homelessness, and lower social disorganization. However, tight states also have higher incarceration rates, greater discrimination and inequality, lower creativity, and lower happiness relative to loose states. In all, tightness–looseness provides a parsimonious explanation of the wide variation we see across the 50 states of the United States of America.

Monday, June 23, 2014

Early music training enhances cognitive capacities in adults.

Further experiments on the profound effect that early musical training has on executive functioning in adults. The article has a useful introduction that references previous related work. (As a lifelong performing pianist, I enjoy articles like this!)
Executive functions (EF) are cognitive capacities that allow for planned, controlled behavior and strongly correlate with academic abilities. Several extracurricular activities have been shown to improve EF, however, the relationship between musical training and EF remains unclear due to methodological limitations in previous studies. To explore this further, two experiments were performed; one with 30 adults with and without musical training and one with 27 musically trained and untrained children (matched for general cognitive abilities and socioeconomic variables) with a standardized EF battery. Furthermore, the neural correlates of EF skills in musically trained and untrained children were investigated using fMRI. Adult musicians compared to non-musicians showed enhanced performance on measures of cognitive flexibility, working memory, and verbal fluency. Musically trained children showed enhanced performance on measures of verbal fluency and processing speed, and significantly greater activation in pre-SMA/SMA and right VLPFC during rule representation and task-switching compared to musically untrained children. Overall, musicians show enhanced performance on several constructs of EF, and musically trained children further show heightened brain activation in traditional EF regions during task-switching. These results support the working hypothesis that musical training may promote the development and maintenance of certain EF skills, which could mediate the previously reported links between musical training and enhanced cognitive skills and academic achievement.

Functional MRI during mental task switching: Panels A and B show brain activation in musically trained and untrained children, respectively. Panel C shows brain areas that are more active in musically trained than in musically untrained children.

Monday music - Debussy Nocturne

I pass on this Debussy Nocturne I recorded at my Twin Valley Middleton home last week, after playing it for a local music group on Tuesday evening.

Friday, June 20, 2014

On the precipice - a "Majority-Minority" America.

Sigh.... another chilling vision of America's future from Craig and Richeson. Increasing polarization of groups:
The U.S. Census Bureau projects that racial minority groups will make up a majority of the U.S. national population in 2042, effectively creating a so-called majority-minority nation. In four experiments, we explored how salience of such racial demographic shifts affects White Americans’ political-party leanings and expressed political ideology. Study 1 revealed that making California’s majority-minority shift salient led politically unaffiliated White Americans to lean more toward the Republican Party and express greater political conservatism. Studies 2, 3a, and 3b revealed that making the changing national racial demographics salient led White Americans (regardless of political affiliation) to endorse conservative policy positions more strongly. Moreover, the results implicate group-status threat as the mechanism underlying these effects. Taken together, this work suggests that the increasing diversity of the nation may engender a widening partisan divide.

Thursday, June 19, 2014

Dopamine receptor genes and independent versus interdependent social orientation.

Kitayama et al. take yet another stab at finding correlates of the often-cited distinction between European Americans (more independent) and Asians (more interdependent). Their suggested genetic correlate can be compared with the environmental correlate I just noted in a recent post. Here, with the usual 'correlations are not causes' disclaimer, is their abstract:
Prior research suggests that cultural groups vary on an overarching dimension of independent versus interdependent social orientation, with European Americans being more independent, or less interdependent, than Asians. Drawing on recent evidence suggesting that the dopamine D4 receptor gene (DRD4) plays a role in modulating cultural learning, we predicted that carriers of DRD4 polymorphisms linked to increased dopamine signaling (7- or 2-repeat alleles) would show higher levels of culturally dominant social orientations, compared with noncarriers. European Americans and Asian-born Asians (total N = 398) reported their social orientation on multiple scales. They were also genotyped for DRD4. As in earlier work, European Americans were more independent, and Asian-born Asians more interdependent. This cultural difference was significantly more pronounced for carriers of the 7- or 2-repeat alleles than for noncarriers. Indeed, no cultural difference was apparent among the noncarriers. Implications for potential coevolution of genes and culture are discussed.
Given that the independent/interdependent balance is a consequence of gene-culture interaction, some cultural effects may be moderated by specific dopamine receptor gene variants. (Other work has suggested that different alleles of the serotonin transporter gene correlate with susceptibility to stress and depression, and that a serotonin 1A receptor gene polymorphism correlates with cultural differences in holistic attention.)

Wednesday, June 18, 2014

Speed reading apps blow away comprehension.

Schotter et al. demonstrate that being able to glance back during reading (not allowed under speed-reading conditions) significantly enhances comprehension...readers' control over their eye movements is important.
Recent Web apps have spurred excitement around the prospect of achieving speed reading by eliminating eye movements (i.e., with rapid serial visual presentation, or RSVP, in which words are presented briefly one at a time and sequentially). Our experiment using a novel trailing-mask paradigm contradicts these claims. Subjects read normally or while the display of text was manipulated such that each word was masked once the reader’s eyes moved past it. This manipulation created a scenario similar to RSVP: The reader could read each word only once; regressions (i.e., rereadings of words), which are a natural part of the reading process, were functionally eliminated. Crucially, the inability to regress affected comprehension negatively. Furthermore, this effect was not confined to ambiguous sentences. These data suggest that regressions contribute to the ability to understand what one has read and call into question the viability of speed-reading apps that eliminate eye movements (e.g., those that use RSVP).
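For readers who have not seen RSVP, the idea is easy to mimic: words are flashed one at a time at a fixed rate, so there is nothing left on the screen to look back at. Here is a minimal terminal sketch; the presentation rate and the function are my own illustration, not Schotter et al.'s materials.

```python
import sys
import time

def rsvp(text, words_per_minute=400):
    """Present words one at a time in place, RSVP-style: each word overwrites the
    previous one, so regressions (looking back at earlier words) are impossible."""
    delay = 60.0 / words_per_minute
    for word in text.split():
        sys.stdout.write("\r" + " " * 40 + "\r" + word)
        sys.stdout.flush()
        time.sleep(delay)
    print()

rsvp("Rapid serial visual presentation removes the reader's control over eye movements")
```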

Tuesday, June 17, 2014

Watching the physical correlate of memory improvement during sleep.

Euston and Steenland offer a perspective on nice work by Yang et al. that probes the role of sleep in altering mouse brain structures. I pass on their summary figure (click to enlarge) and some context comments:

Figure Legend. Three phenomena that occur during sleep have been linked to memory enhancement: slow-wave oscillations in brain electrical activity, reactivation of recent experiences, and changes in synaptic connectivity. The strength of the evidence for each link (indicated by arrow thickness) varies. As shown in red, Yang et al. link both reactivation and slow-wave sleep to changes in synaptic connectivity that enhance learning.
To address whether synaptic strength increases or decreases during sleep, Yang et al. used a powerful technique to visualize dendritic spines in the motor cortex of live mice. The mice were genetically engineered to express a fluorescent protein in a subset of cortical cells. A small window was created in the skull, allowing microscopic imaging of dendritic spines repeatedly over the course of hours or even days. This technique was previously used to show that training mice to stay atop a rotating rod—an acquired skill—induced the formation of new dendritic spines in the motor cortex. Further, the rate of new spine formation was correlated with the degree of task improvement. These findings provided direct evidence that synaptic change in the mammalian cortex underlies learning. Yang et al. extend these findings, showing that learning-induced spine changes are segregated on specific dendritic branches. After learning, when two branches on the same dendritic arbor were examined, one typically showed many more new spines than the other. If mice were subsequently trained on a different skill (i.e., running backward on the spinning rod), the new spines induced by the second task grew selectively on the previously underproductive branch. Hence, different skills seem to be localized on different dendritic branches.
To test the role of sleep in spine formation, Yang et al. repeated their experiment with and without an 8-hour period of sleep deprivation immediately after training. Sleep deprivation markedly decreased the number of new spines. This effect also was branch-specific in that sleep deprivation reduced spine formation primarily on the dendritic branch with the higher number of new spines. Importantly, sleep had no effect on the rate of spine elimination. The authors also observed that sleep made newly formed spines much more likely to still be present 1 day later, consistent with the idea that consolidated memories are less sensitive to decay. In other words, sleep gives spines staying power.

Monday, June 16, 2014

When being a control-freak doesn't help....

Bocanegra and Hommel note limits to the usefulness of cognitive control, showing, in particular, how overcontrol (induced by task instructions) can prevent the otherwise automatic exploitation of statistical stimulus characteristics needed to optimize behavior. They describe how they set up the experiment:
Participants performed a two-alternative forced-choice task on a foveally presented stimulus that could vary on a subset of binary perceptual features, such as color (red, green), shape (diamond, square), size (large, small), topology (open, closed), and location (up, down). Unbeknownst to the participants, we manipulated the statistical informativeness of an additional feature that was not part of the task, such that this feature always predicted the correct response in one condition (the predictive condition) but not in the other condition (the baseline condition). Because the cognitive system is known to exploit statistical stimulus-response contingencies automatically, performance was expected to be better in the predictive than in the baseline condition.
We embedded these predictive and baseline conditions into two different tasks, which we thought would induce different cognitive-control states. The control task included instructions intended to emphasize the need for top-down control: Participants were instructed to classify the stimulus according to a feature-conjunction rule (e.g., size and topology: left response key for large and open or small and closed shapes, right response key for small and open or large and closed shapes). The automatic task included instructions intended to deemphasize the need for control: Participants were instructed to classify the stimulus according to a single feature (e.g., shape: left response key for a diamond and right response key for a square). In the automatic task, the features were mapped consistently on responses and thus allowed automatic visuomotor translation. In contrast, the stimulus-response mapping in the control task required the attention-demanding integration of two features before the response could be determined.
As expected, the predictive feature improved performance when participants performed the task automatically. Counterintuitively, however, the predictive feature impaired performance when subjects were performing the exact same task in a top-down, controlled manner.
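The crucial design element is an extra, task-irrelevant feature that perfectly predicts the correct response in one condition and is uninformative in the other. Here is a minimal sketch of how such stimuli might be generated; the feature names come from the passage above, but the structure and mapping are my own illustration, not the authors' materials.

```python
import random

FEATURES = {
    "color": ["red", "green"],
    "shape": ["diamond", "square"],
    "size": ["large", "small"],
    "topology": ["open", "closed"],
    "location": ["up", "down"],
}

def make_stimulus(correct_response, predictive_feature="location", predictive=True, rng=random):
    """Build one stimulus as a dict of binary feature values. If `predictive` is True,
    the task-irrelevant predictive feature is yoked to the correct response
    (e.g., 'up' whenever the left key is correct); otherwise it is chosen at random."""
    stimulus = {name: rng.choice(values) for name, values in FEATURES.items()}
    if predictive:
        stimulus[predictive_feature] = FEATURES[predictive_feature][0 if correct_response == "left" else 1]
    return stimulus

# Predictive condition: 'location' always signals the correct response.
predictive_stimulus = make_stimulus("left", predictive=True)
# Baseline condition: 'location' carries no information about the response.
baseline_stimulus = make_stimulus("left", predictive=False)
```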
Their abstract:
In order to engage in goal-directed behavior, cognitive agents have to control the processing of task-relevant features in their environments. Although cognitive control is critical for performance in unpredictable task environments, it is currently unknown how it affects performance in highly structured and predictable environments. In the present study, we showed that, counterintuitively, top-down control can impair and interfere with the otherwise automatic integration of statistical information in a predictable task environment, and it can render behavior less efficient than it would have been without the attempt to control the flow of information. In other words, less can sometimes be more (in terms of cognitive control), especially if the environment provides sufficient information for the cognitive system to behave on autopilot based on automatic processes alone.

Another Poulenc offering: Improvisation No. 13

Another Monday morning post of a recent piano recording I've done.

  


Friday, June 13, 2014

Brain initiative meets physics...Oops!

Scientists leading the much-heralded Obama BRAIN Initiative, which initially provided $100 million (NIH is now seeking $4.5 billion for its part of the project) to craft new tools for measuring brain activity, may have been insufficiently aware that some of their ideas:
“violated either a physical law or some very significant engineering constraint or biological constraint,”
I want to pass on the text of this article noting a meeting sponsored by the National Science Foundation at Cold Spring Harbor Laboratory.

The goal is to have a realistic discussion of what the physical limits are, he says, so “scientists who want to make devices will not make crazy proposals,” or, “if a proposal is crazy, one could recognize it as such” and look for other ways to make the idea work.

One such “fanciful” idea is to build nanosized radios that could snuggle up to individual neurons to record and transmit information about their activity, says physicist Peter Littlewood, director of Argonne National Laboratory in Lemont, Illinois. But any radio small enough to be injected into the brain without causing significant harm would not be able to transmit any information out through tissue and bone, he says. Make the devices any more powerful, he adds, and they'd likely cook the surrounding brain. Another aspiration that is likely doomed is to get microscopes that probe the brain with pulses of light to penetrate much further than they already do, Mitra says. A little more than 1 mm is possible, he adds, but even 1 cm is “out of the question, since the signal to background [noise] ratio decreases exponentially with depth.”

But physicists and engineers shouldn't simply shoot down outlandish proposals—or gripe about the intrinsic messiness of the brain's biology. They should model themselves as “fancy technicians” who can help develop revolutionary tools, Littlewood says. There are precedents for such collaboration, he notes: He, Mitra, and their colleagues at Bell Labs, for example, helped develop functional magnetic resonance imaging in the 1990s.

One area where physical scientists can help today is in fashioning small, strong, conductive wires that can record from many different neurons simultaneously, says neurophysicist David Kleinfeld of the University of California, San Diego. For decades, neuroscientists have relied largely on electrodes fashioned from fragile glass pipettes. But only a small number of these sensors will fit in a given brain region without disrupting connections between cells or killing them outright. Biophysicist Timothy Harris at the Janelia Farm Research Campus in Ashburn, Virginia, and others have had some success at making much smaller ones for fish and fly brains—some, made of silicon, are roughly 3 microns wide, about 25 times thinner than a human hair.

These probes are by no means the tiniest possible—polymer-coated carbon nanotubes, for example, can be 0.1 microns or smaller across and are highly conductive. Such thin wires tend to be very short and too flexible to get into the brain easily, however—when pushed, they simply buckle. One question Harris plans to pose at the meeting is whether the probes could be magnetized, then pulled, rather than pushed, into the brain with a powerful magnet.

Ultimately, researchers hope to measure neural activity inside the brain without poking wires into living tissue, and there, too, physics can help. Harris has his eye on light-sheet microscopy, which shines a plane of light across living brain tissue, illuminating neurons engineered to fluoresce green or red when they are flooded by calcium during neuronal firing. Last year, neuroscientist Misha Ahrens and colleagues at Janelia Farm used this technique to produce the first “real” whole-brain activity map of a zebrafish larva, Harris says.

A larval zebrafish brain is 1000 times smaller than a mouse brain, however. It is also conveniently transparent, while mouse and human brain tissue scatters and blurs light. Using the same optical techniques that astronomers employ to discern very faint or close-together stars with a telescope, researchers such as physicist Na Ji, also at Janelia Farm, have discovered ways to distinguish between hard-to-see neurons in murky brain tissue.

In preparation for the meeting, Mitra has dusted off an old copy of Principles of Optics by Max Born and Emil Wolf, one of the most venerable and difficult physics tomes. Getting back to basics, he hopes, will help him and his BRAIN project colleagues determine which rules must be followed to the letter, and which might be cleverly circumvented.
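Mitra's depth argument quoted above is easy to see with a back-of-the-envelope calculation: if the signal-to-background ratio falls exponentially with imaging depth, every additional millimeter costs a large constant factor. In the sketch below, the attenuation length is an assumed, purely illustrative value, not a number from the article.

```python
import math

def relative_snr(depth_mm, attenuation_length_mm=0.2):
    """Signal-to-background ratio relative to the surface, assuming simple exponential
    decay with depth. The 0.2 mm attenuation length is an illustrative guess, not a
    measured value."""
    return math.exp(-depth_mm / attenuation_length_mm)

for depth in (0.5, 1.0, 10.0):   # 10 mm = 1 cm
    print(f"{depth:5.1f} mm: signal-to-background down by a factor of {1 / relative_snr(depth):.3g}")
```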

Thursday, June 12, 2014

Gratitude reduces economic impatience.

Whenever I come across yet another self-help laundry list of useful tricks for feeling better, and try a few, I repeatedly find that briefly following instructions to practice feeling gratitude has a very salutary, calming effect...taking the edge off any impatience I might be feeling. DeSteno et al. look at this in a more systematic way, distinguishing the effect of gratitude from the more general positive emotion of happiness with respect to impatience, or the desire for short-term gratification. The 75 study participants were split into three groups with different emotion-induction conditions: being asked to write brief essays on experiences of feeling grateful, happy, or neutral. They then made choices between receiving smaller cash amounts (ranging from $11 to $80) immediately and larger cash amounts (ranging from $25 to $85) at a point from 1 week to 6 months in the future. Their results clearly revealed that gratitude reduces excessive economic impatience (the temporal discounting of future versus immediate rewards) compared with the neutral and happy conditions, which were about equal. Here is their abstract:
The human mind tends to excessively discount the value of delayed rewards relative to immediate ones, and it is thought that “hot” affective processes drive desires for short-term gratification. Supporting this view, recent findings demonstrate that sadness exacerbates financial impatience even when the sadness is unrelated to the economic decision at hand. Such findings might reinforce the view that emotions must always be suppressed to combat impatience. But if emotions serve adaptive functions, then certain emotions might be capable of reducing excessive impatience for delayed rewards. We found evidence supporting this alternative view. Specifically, we found that (a) the emotion gratitude reduces impatience even when real money is at stake, and (b) the effects of gratitude are differentiable from those of the more general positive state of happiness. These findings challenge the view that individuals must tamp down affective responses through effortful self-regulation to reach more patient and adaptive economic decisions.
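The choices in the task are the standard smaller-sooner versus larger-later trade-off, and impatience is usually summarized with a hyperbolic discount function. The sketch below uses that textbook form; the parameter values and the specific choice are illustrative, not estimates from the paper.

```python
def discounted_value(amount, delay_days, k):
    """Hyperbolic discounting: subjective value of `amount` received after `delay_days`.
    Larger k means steeper discounting, i.e., more impatience."""
    return amount / (1.0 + k * delay_days)

def prefers_later(immediate, later, delay_days, k):
    """True if the delayed reward's discounted value beats the immediate reward."""
    return discounted_value(later, delay_days, k) > immediate

# A choice within the study's ranges: $54 now versus $80 in 30 days.
for k in (0.005, 0.05):   # relatively patient versus relatively impatient (illustrative values)
    print(k, prefers_later(immediate=54, later=80, delay_days=30, k=k))
```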

Wednesday, June 11, 2014

Childhood bullying predicts adult inflammation.

How is this for a chilling finding? Childhood bullying leaves bullies with lower, and victims with higher, levels of chronic inflammation than those uninvolved in bullying. From Copeland et al.:
Bullying is a common childhood experience that involves repeated mistreatment to improve or maintain one’s status. Victims display long-term social, psychological, and health consequences, whereas bullies display minimal ill effects. The aim of this study is to test how this adverse social experience is biologically embedded to affect short- or long-term levels of C-reactive protein (CRP), a marker of low-grade systemic inflammation. The prospective population-based Great Smoky Mountains Study (n = 1,420), with up to nine waves of data per subject, was used, covering childhood/adolescence (ages 9–16) and young adulthood (ages 19 and 21). Structured interviews were used to assess bullying involvement and relevant covariates at all childhood/adolescent observations. Blood spots were collected at each observation and assayed for CRP levels. During childhood and adolescence, the number of waves at which the child was bullied predicted increasing levels of CRP. Although CRP levels rose for all participants from childhood into adulthood, being bullied predicted greater increases in CRP levels, whereas bullying others predicted lower increases in CRP compared with those uninvolved in bullying. This pattern was robust, controlling for body mass index, substance use, physical and mental health status, and exposures to other childhood psychosocial adversities. A child’s role in bullying may serve as either a risk or a protective factor for adult low-grade inflammation, independent of other factors. Inflammation is a physiological response that mediates the effects of both social adversity and dominance on decreases in health.
Added note: I just came across this related article by Raposa et al. on the developmental pathway from early life stress to inflammation.

Tuesday, June 10, 2014

Tonics for a long life?

I've recently come across two articles relevant to life extension (work done with mice and worms, to be sure, but a human who reads these papers might well be trying to get their hands on some of the stuff described to give it a try!). Dubal et al. report their work on klotho, an aging regulator that, when overexpressed, extends lifespan in mice and nematode worms, and, when disrupted, accelerates aging phenotypes. (A lifespan-extending human variant of the KLOTHO gene, KL-VS, is associated with enhanced cognition in heterozygous carriers.) Here is their summary:
Aging is the primary risk factor for cognitive decline, an emerging health threat to aging societies worldwide. Whether anti-aging factors such as klotho can counteract cognitive decline is unknown. We show that a lifespan-extending variant of the human KLOTHO gene, KL-VS, is associated with enhanced cognition in heterozygous carriers. Because this allele increased klotho levels in serum, we analyzed transgenic mice with systemic overexpression of klotho. They performed better than controls in multiple tests of learning and memory. Elevating klotho in mice also enhanced long-term potentiation, a form of synaptic plasticity, and enriched synaptic GluN2B, an N-methyl-D-aspartate receptor (NMDAR) subunit with key functions in learning and memory. Blockade of GluN2B abolished klotho-mediated effects. Surprisingly, klotho effects were evident also in young mice and did not correlate with age in humans, suggesting independence from the aging process. Augmenting klotho or its effects may enhance cognition and counteract cognitive deficits at different life stages.
And Ye et al. have done a screen, using nematodes, of over 1200 drugs active on human cells, including drugs approved for human use, finding ~60 that increase C. elegans lifespan by up to 43%. These compounds mainly act on proteins that function in signaling pathways between cells relevant to oxidative stress resistance: hormone or neurotransmitter receptors, particularly those for adrenaline and noradrenaline, serotonin, dopamine, and histamine. This narrows down a list of drugs that might be tested for life extension in mammals.
One goal of aging research is to find drugs that delay the onset of age-associated disease. Studies in invertebrates, particularly Caenorhabditis elegans, have uncovered numerous genes involved in aging, many conserved in mammals. However, which of these encode proteins suitable for drug targeting is unknown. To investigate this question, we screened a library of compounds with known mammalian pharmacology for compounds that increase C. elegans lifespan. We identified 60 compounds that increase longevity in C. elegans, 33 of which also increased resistance to oxidative stress. Many of these compounds are drugs approved for human use. Enhanced resistance to oxidative stress was associated primarily with compounds that target receptors for biogenic amines, such as dopamine or serotonin. A pharmacological network constructed with these data reveal that lifespan extension and increased stress resistance cluster together in a few pharmacological classes, most involved in intercellular signaling. These studies identify compounds that can now be explored for beneficial effects on aging in mammals, as well as tools that can be used to further investigate the mechanisms underlying aging in C. elegans.
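The screening logic itself is simple: measure mean lifespan under each compound, compare it with untreated controls, and keep compounds whose extension clears a threshold. The toy sketch below invents the data and the hit threshold purely for illustration; none of the numbers are Ye et al.'s results.

```python
CONTROL_LIFESPAN_DAYS = 17.0   # illustrative mean lifespan of untreated C. elegans

# Made-up screen results: mean lifespan (days) under each compound.
screen_results = {"compound_A": 24.3, "compound_B": 17.2, "compound_C": 19.8}

def lifespan_extension(treated_days, control_days=CONTROL_LIFESPAN_DAYS):
    """Percent lifespan extension relative to untreated controls."""
    return 100.0 * (treated_days - control_days) / control_days

# Keep compounds whose extension clears an arbitrary 10% hit threshold.
hits = {name: round(lifespan_extension(days), 1)
        for name, days in screen_results.items()
        if lifespan_extension(days) >= 10.0}
print(hits)   # {'compound_A': 42.9, 'compound_C': 16.5}
```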

Monday, June 09, 2014

Rapidity of human brain and muscle evolution - the downside of smarts?

Roberts does a summary of fascinating work by Bozek et al. He sets the context:
Somewhat narcissistically, one of the spectacular changes in phenotype that we tend to be most interested in is the enhancement in our own brain power which has occurred over the 6 million years that separate us from our last shared ancestor with chimpanzees. The chimp genome is famously very similar to our own, but the technological, linguistic, and cultural phenotype is clearly profoundly different. Several studies have asked open-ended questions as to what happens between the genotype and phenotype to make us so different from our cousins, finding differences in levels, splicing, and editing of gene transcripts, for example. Now a paper just published in PLOS Biology by Katarzyna Bozek, Philipp Khaitovich, and colleagues looks at another intermediate phenotype—the metabolome—with some intriguing and unexpected answers...The metabolome is the set of small molecules (metabolites) that are found in a given tissue; by “small” we mean those with a molecular weight of less than 1,500 Daltons, which includes fats, amino acids, sugars, nucleotides, and vitamins (vitamin B12, for example, is near the top end of this range).  
...the metabolomes of human prefrontal cortex (and of combined brain regions) have changed four times as rapidly in the last 6 million years as those of chimps. While gratifying, this largely confirms for metabolites what was already known for transcripts. 
...brain is not the most spectacular outlier here. The real surprise is that the human muscle metabolome has experienced more than eight times as much change as its chimp counterpart. Indeed, metabolomically speaking, human muscle has changed more in the last 6 million years than mouse muscle has since we parted company from mice back in the Early Cretaceous.  
...the authors compared the performance of humans, chimps, and macaques in a strength test that involved pulling a handle to raise a weight. Human strength, as measured by this test, was barely half that of the non-human primates. Amazingly, untrained chimps and macaques raised in captivity easily outperformed university-level basketball players and professional mountain climbers. The authors speculate that the fates of human brain and muscle may be inextricably entwined, and that weak muscle may be the price we pay for the metabolic demands of our amazing cognitive powers.

A Monday musical offering - Poulenc Improvisation No. 7

This is recorded on the Steinway B at my Twin Valley Rd. residence in Middleton, WI.  I used to regularly post my piano work on MindBlog, and will try to return to the habit.


Friday, June 06, 2014

First direct evidence for human sex pheromones.

Here is a clever experiment by Zhou et al., who digitally morph the gender of moving point-light displays of walkers from male to female while subjects are exposed to two human steroids that they cannot discriminate. Below are their summary points and abstract:

•Human steroid androstadienone conveys masculinity to straight women and gay men
•Human steroid estratetraenol conveys femininity to straight men
•The effects take place in the absence of awareness
•Human gender perception draws on subconscious chemosensory biological cues

Recent studies have suggested the existence of human sex pheromones, with particular interest in two human steroids: androstadienone (androsta-4,16,-dien-3-one) and estratetraenol (estra-1,3,5(10),16-tetraen-3-ol). The current study takes a critical step to test the qualification of the two steroids as sex pheromones by examining whether they communicate gender information in a sex-specific manner. By using dynamic point-light displays that portray the gaits of walkers whose gender is digitally morphed from male to female, we show that smelling androstadienone systematically biases heterosexual females, but not males, toward perceiving the walkers as more masculine. By contrast, smelling estratetraenol systematically biases heterosexual males, but not females, toward perceiving the walkers as more feminine. Homosexual males exhibit a response pattern akin to that of heterosexual females, whereas bisexual or homosexual females fall in between heterosexual males and females. These effects are obtained despite that the olfactory stimuli are not explicitly discriminable. The results provide the first direct evidence that the two human steroids communicate opposite gender information that is differentially effective to the two sex groups based on their sexual orientation. Moreover, they demonstrate that human visual gender perception draws on subconscious chemosensory biological cues, an effect that has been hitherto unsuspected.
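The "digital morphing" of the walkers can be thought of as interpolation in a gait space between an average male and an average female walker. The sketch below is a plain linear interpolation of point-light coordinates; the authors' actual morphing procedure is not described here, so treat this only as an illustration of the idea.

```python
import numpy as np

def morph_walker(male_frames, female_frames, femaleness):
    """Linearly interpolate between two point-light walkers, given as arrays of shape
    [n_frames, n_points, 2]. femaleness=0 reproduces the male gait, femaleness=1 the
    female gait, and intermediate values give a morphed, more ambiguous walker."""
    male = np.asarray(male_frames, dtype=float)
    female = np.asarray(female_frames, dtype=float)
    return (1.0 - femaleness) * male + femaleness * female

# Toy data: 60 frames, 15 joints, (x, y) coordinates per joint.
male_gait = np.random.rand(60, 15, 2)
female_gait = np.random.rand(60, 15, 2)
ambiguous_walker = morph_walker(male_gait, female_gait, femaleness=0.5)
```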

Thursday, June 05, 2014

Social attention and our ventromedial prefrontal cortex.

Ralph Adolphs points to an interesting article by Wolf et al. showing that bilateral ventromedial prefrontal cortex damage impairs visual attention to the eye regions of faces, particularly for fearful faces. From Adolphs' summary:



Failing to look at the eyes. Shown in each image are the regions of a face at which different groups of subjects look, as measured using eye-tracking. The hottest colours (red regions) denote those regions of the face where people look the most. Whereas this corresponds to the eye region of the face in healthy controls (far left), it is abnormal in certain clinical populations, including individuals with lesions of the vmPFC (top right) or amygdala (bottom right) and individuals with autism spectrum disorder (bottom centre). Top row: from Wolf et al. 2014. Bottom row: data from Michael Spezio, Daniel Kennedy, Ralph Adolphs. All images represent spatially smoothed data averaged across multiple fixations, multiple stimuli and multiple subjects within the indicated group.
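The legend's mention of "spatially smoothed data averaged across multiple fixations" corresponds to a standard fixation heat map: accumulate fixation locations on the image and blur them. Here is a minimal sketch, with parameter values that are illustrative rather than those used in the studies.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fixation_heatmap(fixations, image_shape=(256, 256), sigma=12.0):
    """Turn a list of (x, y) fixation coordinates into a spatially smoothed density map
    of where subjects looked. The image size and smoothing width are illustrative."""
    counts = np.zeros(image_shape)
    for x, y in fixations:
        counts[int(y), int(x)] += 1
    return gaussian_filter(counts, sigma=sigma)

# Toy example: fixations clustered near the eye region of a 256 x 256 face image.
rng = np.random.default_rng(0)
fixations = np.clip(rng.normal(loc=(128, 90), scale=15, size=(200, 2)), 0, 255)
heatmap = fixation_heatmap(fixations)
```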