Tuesday, June 24, 2014

Distinguishing the 50 United States with a tightness-looseness measure.

Harrington and Gelfand offer a parsimonious mechanism for the striking cultural and political differences between the 50 United States by suggesting that the states differ in tightness (many strongly enforced rules and little tolerance for deviance) versus looseness (few strongly enforced rules and greater tolerance for deviance), with this being a logical outcome of their different circumstances (ecological threats, human threats, etc.). They find that tightness–looseness and collectivism–individualism are distinct constructs. Data from their index and state-level indices of collectivism–individualism demonstrate that there are tight states that are collectivistic (e.g., Alabama, Mississippi, Texas, South Carolina), loose states that are collectivistic (e.g., Hawaii, New Jersey, Maryland, California), loose states that are individualistic (e.g., Oregon, Washington, New Hampshire, Vermont), and tight states that are individualistic (e.g., Wyoming, Kansas, Oklahoma, Ohio). In this tightness-looseness figure, the states are organized into quintiles, with the top ten loosest being the lightest color. The map (click to enlarge) was constructed at www.diymaps.net.
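For concreteness, the quintile binning behind such a map can be sketched in a few lines of Python. The scores below are invented placeholders, not Harrington and Gelfand's actual index values:

```python
def quintiles(scores):
    """Map each state to a quintile, 1 = the ten loosest states."""
    ranked = sorted(scores, key=scores.get)  # lowest (loosest) first
    return {state: rank // 10 + 1 for rank, state in enumerate(ranked)}

# Fifty placeholder "states" with arbitrary, distinct tightness scores:
fake_scores = {f"State{i:02d}": (i * 37) % 50 for i in range(50)}
bins = quintiles(fake_scores)
```

Mapping each quintile to one of five shades then reproduces the light-to-dark scheme described above.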


Here is their abstract:
This research demonstrates wide variation in tightness–looseness (the strength of punishment and degree of latitude/permissiveness) at the state level in the United States, as well as its association with a variety of ecological and historical factors, psychological characteristics, and state-level outcomes. Consistent with theory and past research, ecological and man-made threats—such as a higher incidence of natural disasters, greater disease prevalence, fewer natural resources, and greater degree of external threat—predicted increased tightness at the state level. Tightness is also associated with higher trait conscientiousness and lower trait openness, as well as a wide array of outcomes at the state level. Compared with loose states, tight states have higher levels of social stability, including lowered drug and alcohol use, lower rates of homelessness, and lower social disorganization. However, tight states also have higher incarceration rates, greater discrimination and inequality, lower creativity, and lower happiness relative to loose states. In all, tightness–looseness provides a parsimonious explanation of the wide variation we see across the 50 states of the United States of America.

Monday, June 23, 2014

Early music training enhances cognitive capacities in adults.

Further experiments on the profound effect that early musical training has on executive functioning in adults. The article has a useful introduction that references previous related work. (As a lifelong performing pianist, I enjoy articles like this!)
Executive functions (EF) are cognitive capacities that allow for planned, controlled behavior and strongly correlate with academic abilities. Several extracurricular activities have been shown to improve EF, however, the relationship between musical training and EF remains unclear due to methodological limitations in previous studies. To explore this further, two experiments were performed; one with 30 adults with and without musical training and one with 27 musically trained and untrained children (matched for general cognitive abilities and socioeconomic variables) with a standardized EF battery. Furthermore, the neural correlates of EF skills in musically trained and untrained children were investigated using fMRI. Adult musicians compared to non-musicians showed enhanced performance on measures of cognitive flexibility, working memory, and verbal fluency. Musically trained children showed enhanced performance on measures of verbal fluency and processing speed, and significantly greater activation in pre-SMA/SMA and right VLPFC during rule representation and task-switching compared to musically untrained children. Overall, musicians show enhanced performance on several constructs of EF, and musically trained children further show heightened brain activation in traditional EF regions during task-switching. These results support the working hypothesis that musical training may promote the development and maintenance of certain EF skills, which could mediate the previously reported links between musical training and enhanced cognitive skills and academic achievement.

Functional MRI during mental task switching: Panels A and B show brain activation in musically trained and untrained children, respectively. Panel C shows brain areas that are more active in musically trained than in musically untrained children.

Monday music - Debussy Nocturne

I pass on this Debussy Nocturne I recorded at my Twin Valley Middleton home last week, after playing it for a local music group on Tuesday evening.

Friday, June 20, 2014

On the precipice - a "Majority-Minority" America.

Sigh.... another chilling vision of America's future from Craig and Richeson. Increasing polarization of groups:
The U.S. Census Bureau projects that racial minority groups will make up a majority of the U.S. national population in 2042, effectively creating a so-called majority-minority nation. In four experiments, we explored how salience of such racial demographic shifts affects White Americans’ political-party leanings and expressed political ideology. Study 1 revealed that making California’s majority-minority shift salient led politically unaffiliated White Americans to lean more toward the Republican Party and express greater political conservatism. Studies 2, 3a, and 3b revealed that making the changing national racial demographics salient led White Americans (regardless of political affiliation) to endorse conservative policy positions more strongly. Moreover, the results implicate group-status threat as the mechanism underlying these effects. Taken together, this work suggests that the increasing diversity of the nation may engender a widening partisan divide.

Thursday, June 19, 2014

Dopamine receptor genes and independent versus interdependent social orientation.

Kitayama et al. make yet another stab at finding correlates of the often-cited distinction between European Americans (more independent) and Asians (more interdependent). Their suggested genetic correlate can be compared with the environmental correlate I noted in a recent post. Here, with the usual 'correlations are not causes' disclaimer, is their abstract:
Prior research suggests that cultural groups vary on an overarching dimension of independent versus interdependent social orientation, with European Americans being more independent, or less interdependent, than Asians. Drawing on recent evidence suggesting that the dopamine D4 receptor gene (DRD4) plays a role in modulating cultural learning, we predicted that carriers of DRD4 polymorphisms linked to increased dopamine signaling (7- or 2-repeat alleles) would show higher levels of culturally dominant social orientations, compared with noncarriers. European Americans and Asian-born Asians (total N = 398) reported their social orientation on multiple scales. They were also genotyped for DRD4. As in earlier work, European Americans were more independent, and Asian-born Asians more interdependent. This cultural difference was significantly more pronounced for carriers of the 7- or 2-repeat alleles than for noncarriers. Indeed, no cultural difference was apparent among the noncarriers. Implications for potential coevolution of genes and culture are discussed.
Given that independent versus interdependent orientation is a consequence of interactions between genes and the cultural environment, it is possible that some cultural effects are moderated by specific dopamine receptor genetic variants. (Other work has suggested that different alleles of the serotonin transporter gene correlate with susceptibility to stress and depression, and that a serotonin 1A receptor gene polymorphism correlates with cultural differences in holistic attention.)

Wednesday, June 18, 2014

Speed reading apps blow away comprehension.

Schotter et al. demonstrate that being able to glance back during reading (not allowed under speed-reading conditions) significantly enhances comprehension...readers' control over their eye movements is important.
Recent Web apps have spurred excitement around the prospect of achieving speed reading by eliminating eye movements (i.e., with rapid serial visual presentation, or RSVP, in which words are presented briefly one at a time and sequentially). Our experiment using a novel trailing-mask paradigm contradicts these claims. Subjects read normally or while the display of text was manipulated such that each word was masked once the reader’s eyes moved past it. This manipulation created a scenario similar to RSVP: The reader could read each word only once; regressions (i.e., rereadings of words), which are a natural part of the reading process, were functionally eliminated. Crucially, the inability to regress affected comprehension negatively. Furthermore, this effect was not confined to ambiguous sentences. These data suggest that regressions contribute to the ability to understand what one has read and call into question the viability of speed-reading apps that eliminate eye movements (e.g., those that use RSVP).
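RSVP itself is mechanically simple, which is part of its appeal and its problem: each word gets exactly one brief, fixed-duration presentation, so rereading is impossible by construction. A toy sketch (timings are illustrative, not the paper's display parameters):

```python
def rsvp_schedule(text, ms_per_word=250):
    """Give each word a single onset time; once replaced, it is gone,
    so regressions (rereadings) cannot occur."""
    return [(word, i * ms_per_word) for i, word in enumerate(text.split())]

def words_per_minute(ms_per_word):
    return 60_000 / ms_per_word

sched = rsvp_schedule("regressions are a natural part of reading")
rate = words_per_minute(250)  # 250 ms per word = 240 words per minute
```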

Tuesday, June 17, 2014

Watching the physical correlate of memory improvement during sleep.

Euston and Steenland offer a perspective on nice work by Yang et al. that probes the role of sleep in altering mouse brain structures. I pass on their summary figure (click to enlarge) and some context comments:

Figure Legend. Three phenomena that occur during sleep have been linked to memory enhancement: slow-wave oscillations in brain electrical activity, reactivation of recent experiences, and changes in synaptic connectivity. The strength of the evidence for each link (indicated by arrow thickness) varies. As shown in red, Yang et al. link both reactivation and slow-wave sleep to changes in synaptic connectivity that enhance learning.
To address whether synaptic strength increases or decreases during sleep, Yang et al. used a powerful technique to visualize dendritic spines in the motor cortex of live mice. The mice were genetically engineered to express a fluorescent protein in a subset of cortical cells. A small window was created in the skull, allowing microscopic imaging of dendritic spines repeatedly over the course of hours or even days. This technique was previously used to show that training mice to stay atop a rotating rod—an acquired skill—induced the formation of new dendritic spines in the motor cortex. Further, the rate of new spine formation was correlated with the degree of task improvement. These findings provided direct evidence that synaptic change in the mammalian cortex underlies learning. Yang et al. extend these findings, showing that learning-induced spine changes are segregated on specific dendritic branches. After learning, when two branches on the same dendritic arbor were examined, one typically showed many more new spines than the other. If mice were subsequently trained on a different skill (i.e., running backward on the spinning rod), the new spines induced by the second task grew selectively on the previously underproductive branch. Hence, different skills seem to be localized on different dendritic branches.
To test the role of sleep in spine formation, Yang et al. repeated their experiment with and without an 8-hour period of sleep deprivation immediately after training. Sleep deprivation markedly decreased the number of new spines. This effect also was branch-specific in that sleep deprivation reduced spine formation primarily on the dendritic branch with the higher number of new spines. Importantly, sleep had no effect on the rate of spine elimination. The authors also observed that sleep made newly formed spines much more likely to still be present 1 day later, consistent with the idea that consolidated memories are less sensitive to decay. In other words, sleep gives spines staying power.

Monday, June 16, 2014

When being a control-freak doesn't help....

Bocanegra and Hommel note limits to the usefulness of cognitive control, showing, in particular, how overcontrol (induced by task instructions) can prevent the otherwise automatic exploitation of statistical stimulus characteristics needed to optimize behavior. They describe how they set up the experiment:
Participants performed a two-alternative forced-choice task on a foveally presented stimulus that could vary on a subset of binary perceptual features, such as color (red, green), shape (diamond, square), size (large, small), topology (open, closed), and location (up, down). Unbeknownst to the participants, we manipulated the statistical informativeness of an additional feature that was not part of the task, such that this feature always predicted the correct response in one condition (the predictive condition) but not in the other condition (the baseline condition). Because the cognitive system is known to exploit statistical stimulus-response contingencies automatically, performance was expected to be better in the predictive than in the baseline condition.
We embedded these predictive and baseline conditions into two different tasks, which we thought would induce different cognitive-control states. The control task included instructions intended to emphasize the need for top-down control: Participants were instructed to classify the stimulus according to a feature-conjunction rule (e.g., size and topology: left response key for large and open or small and closed shapes, right response key for small and open or large and closed shapes). The automatic task included instructions intended to deemphasize the need for control: Participants were instructed to classify the stimulus according to a single feature (e.g., shape: left response key for a diamond and right response key for a square). In the automatic task, the features were mapped consistently on responses and thus allowed automatic visuomotor translation. In contrast, the stimulus-response mapping in the control task required the attention-demanding integration of two features before the response could be determined.
As expected, the predictive feature improved performance when participants performed the task automatically. Counterintuitively, however, the predictive feature impaired performance when subjects were performing the exact same task in a top-down, controlled manner.
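The two response rules are easy to make concrete. In the automatic task a single feature fixes the response; in the control task the correct key depends on whether two features match (effectively an XNOR), which is what forces attention-demanding integration. A sketch with made-up key assignments:

```python
# Features are booleans; the key names are illustrative, not the
# authors' actual response mapping.

def automatic_task(shape_is_diamond):
    """Single-feature rule: diamond -> left key, square -> right key."""
    return "left" if shape_is_diamond else "right"

def control_task(is_large, is_open):
    """Feature-conjunction rule: large+open or small+closed -> left,
    the two mixed cases -> right (an XNOR over the two features)."""
    return "left" if is_large == is_open else "right"
```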
Their abstract:
In order to engage in goal-directed behavior, cognitive agents have to control the processing of task-relevant features in their environments. Although cognitive control is critical for performance in unpredictable task environments, it is currently unknown how it affects performance in highly structured and predictable environments. In the present study, we showed that, counterintuitively, top-down control can impair and interfere with the otherwise automatic integration of statistical information in a predictable task environment, and it can render behavior less efficient than it would have been without the attempt to control the flow of information. In other words, less can sometimes be more (in terms of cognitive control), especially if the environment provides sufficient information for the cognitive system to behave on autopilot based on automatic processes alone.

Another Poulenc offering: Improvisation No. 13

Another Monday morning post of a recent piano recording I've done.

  


Friday, June 13, 2014

Brain initiative meets physics…Oops!

Scientists leading the much-heralded Obama Brain Initiative, which initially provided $100 million (NIH is now seeking $4.5 billion for its part of the project) to craft new tools for measuring brain activity, may have been insufficiently aware that some of their ideas:
“violated either a physical law or some very significant engineering constraint or biological constraint.”
I want to pass on the text of this article noting a meeting sponsored by the National Science Foundation at Cold Spring Harbor Laboratory.

The goal is to have a realistic discussion of what the physical limits are, he says, so “scientists who want to make devices will not make crazy proposals,” or, “if a proposal is crazy, one could recognize it as such” and look for other ways to make the idea work.

One such “fanciful” idea is to build nanosized radios that could snuggle up to individual neurons to record and transmit information about their activity, says physicist Peter Littlewood, director of Argonne National Laboratory in Lemont, Illinois. But any radio small enough to be injected into the brain without causing significant harm would not be able to transmit any information out through tissue and bone, he says. Make the devices any more powerful, he adds, and they'd likely cook the surrounding brain. Another aspiration that is likely doomed is to get microscopes that probe the brain with pulses of light to penetrate much further than they already do, Mitra says. A little more than 1 mm is possible, he adds, but even 1 cm is “out of the question, since the signal to background [noise] ratio decreases exponentially with depth.”
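Mitra's depth limit is just the arithmetic of exponential attenuation. A toy model (the 0.2 mm scattering length is an illustrative assumption, not a measured value from the article):

```python
import math

def relative_snr(depth_mm, scattering_length_mm=0.2):
    """Toy model: signal-to-background falls as exp(-depth / length)."""
    return math.exp(-depth_mm / scattering_length_mm)

# Going from 1 mm to 1 cm costs a factor of e**45 in relative SNR,
# which is why 1 cm is "out of the question":
penalty = relative_snr(1.0) / relative_snr(10.0)
```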

But physicists and engineers shouldn't simply shoot down outlandish proposals—or gripe about the intrinsic messiness of the brain's biology. They should model themselves as “fancy technicians” who can help develop revolutionary tools, Littlewood says. There are precedents for such collaboration, he notes: He, Mitra, and their colleagues at Bell Labs, for example, helped develop functional magnetic resonance imaging in the 1990s.

One area where physical scientists can help today is in fashioning small, strong, conductive wires that can record from many different neurons simultaneously, says neurophysicist David Kleinfeld of the University of California, San Diego. For decades, neuroscientists have relied largely on electrodes fashioned from fragile glass pipettes. But only a small number of these sensors will fit in a given brain region without disrupting connections between cells or killing them outright. Biophysicist Timothy Harris at the Janelia Farm Research Campus in Ashburn, Virginia, and others have had some success at making much smaller ones for fish and fly brains—some, made of silicon, are roughly 3 microns wide, about 25 times thinner than a human hair.

These probes are by no means the tiniest possible—polymer-coated carbon nanotubes, for example, can be 0.1 microns or smaller across and are highly conductive. Such thin wires tend to be very short and too flexible to get into the brain easily, however—when pushed, they simply buckle. One question Harris plans to pose at the meeting is whether the probes could be magnetized, then pulled, rather than pushed, into the brain with a powerful magnet.

Ultimately, researchers hope to measure neural activity inside the brain without poking wires into living tissue, and there, too, physics can help. Harris has his eye on light-sheet microscopy, which shines a plane of light across living brain tissue, illuminating neurons engineered to fluoresce green or red when they are flooded by calcium during neuronal firing. Last year, neuroscientist Misha Ahrens and colleagues at Janelia Farm used this technique to produce the first “real” whole-brain activity map of a zebrafish larva, Harris says.

A larval zebrafish brain is 1000 times smaller than a mouse brain, however. It is also conveniently transparent, while mouse and human brain tissue scatter and blur light. Using the same optical techniques that astronomers employ to discern very faint or close-together stars with a telescope, researchers such as physicist Na Ji, also at Janelia Farm, have discovered ways to distinguish between hard-to-see neurons in murky brain tissue.

In preparation for the meeting, Mitra has dusted off an old copy of Principles of Optics by Max Born and Emil Wolf, one of the most venerable and difficult physics tomes. Getting back to basics, he hopes, will help him and his BRAIN project colleagues determine which rules must be followed to the letter, and which might be cleverly circumvented.

Thursday, June 12, 2014

Gratitude reduces economic impatience.

Whenever I come across yet another self-help laundry list of tricks for feeling better, and try a few, I repeatedly find that briefly following instructions to practice feeling gratitude has a very salutary, calming effect, taking the edge off any impatience I might be feeling. DeSteno et al. look at this more systematically, distinguishing the effect of gratitude from the more global positive emotion of happiness with respect to impatience for short-term gratification. Seventy-five study participants were split into three groups with different emotion-induction conditions: they were asked to write brief essays on experiences of feeling grateful, happy, or neutral. They then made choices between receiving smaller cash amounts (ranging from $11 to $80) immediately and larger cash amounts (ranging from $25 to $85) at a point from 1 week to 6 months in the future. The results clearly revealed that gratitude reduces excessive economic impatience (the temporal discounting of future relative to immediate rewards) compared with the neutral and happy conditions, which were about equal. Here is their abstract:
The human mind tends to excessively discount the value of delayed rewards relative to immediate ones, and it is thought that “hot” affective processes drive desires for short-term gratification. Supporting this view, recent findings demonstrate that sadness exacerbates financial impatience even when the sadness is unrelated to the economic decision at hand. Such findings might reinforce the view that emotions must always be suppressed to combat impatience. But if emotions serve adaptive functions, then certain emotions might be capable of reducing excessive impatience for delayed rewards. We found evidence supporting this alternative view. Specifically, we found that (a) the emotion gratitude reduces impatience even when real money is at stake, and (b) the effects of gratitude are differentiable from those of the more general positive state of happiness. These findings challenge the view that individuals must tamp down affective responses through effortful self-regulation to reach more patient and adaptive economic decisions.
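The smaller-sooner versus larger-later choices in such studies are commonly modeled with hyperbolic discounting, where a single parameter k captures impatience; the gratitude effect then amounts to behaving as if one's k were lower. A sketch with illustrative values (the paper reports choices, not fitted k values):

```python
def discounted_value(amount, delay_days, k):
    """Hyperbolic discounting: subjective value = amount / (1 + k * delay)."""
    return amount / (1 + k * delay_days)

def prefers_delayed(immediate, delayed, delay_days, k):
    """Choose the delayed reward when its discounted value wins."""
    return discounted_value(delayed, delay_days, k) > immediate

impatient = prefers_delayed(17, 85, 90, k=0.10)  # False: takes $17 now
patient = prefers_delayed(17, 85, 90, k=0.02)    # True: waits for $85
```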

Wednesday, June 11, 2014

Childhood bullying predicts adult inflammation.

How is this for a chilling finding? Childhood bullying leaves bullies with lower, and victims with higher, levels of chronic inflammation than those uninvolved in bullying. From Copeland et al. :
Bullying is a common childhood experience that involves repeated mistreatment to improve or maintain one’s status. Victims display long-term social, psychological, and health consequences, whereas bullies display minimal ill effects. The aim of this study is to test how this adverse social experience is biologically embedded to affect short- or long-term levels of C-reactive protein (CRP), a marker of low-grade systemic inflammation. The prospective population-based Great Smoky Mountains Study (n = 1,420), with up to nine waves of data per subject, was used, covering childhood/adolescence (ages 9–16) and young adulthood (ages 19 and 21). Structured interviews were used to assess bullying involvement and relevant covariates at all childhood/adolescent observations. Blood spots were collected at each observation and assayed for CRP levels. During childhood and adolescence, the number of waves at which the child was bullied predicted increasing levels of CRP. Although CRP levels rose for all participants from childhood into adulthood, being bullied predicted greater increases in CRP levels, whereas bullying others predicted lower increases in CRP compared with those uninvolved in bullying. This pattern was robust, controlling for body mass index, substance use, physical and mental health status, and exposures to other childhood psychosocial adversities. A child’s role in bullying may serve as either a risk or a protective factor for adult low-grade inflammation, independent of other factors. Inflammation is a physiological response that mediates the effects of both social adversity and dominance on decreases in health.
Added note: I just came across this related article by Raposa et al. on the developmental pathway from early life stress to inflammation.

Tuesday, June 10, 2014

Tonics for a long life?

I've recently come across two articles relevant to life extension (work done with mice and worms, to be sure, but a human who reads these papers might well be tempted to get their hands on some of the stuff described and give it a try!). Dubal et al. report their work on klotho, an aging regulator that extends lifespan in mice and nematode worms when overexpressed and accelerates aging phenotypes when disrupted. (A lifespan-extending human variant of the KLOTHO gene, KL-VS, is associated with enhanced cognition in heterozygous carriers.) Here is their summary:
Aging is the primary risk factor for cognitive decline, an emerging health threat to aging societies worldwide. Whether anti-aging factors such as klotho can counteract cognitive decline is unknown. We show that a lifespan-extending variant of the human KLOTHO gene, KL-VS, is associated with enhanced cognition in heterozygous carriers. Because this allele increased klotho levels in serum, we analyzed transgenic mice with systemic overexpression of klotho. They performed better than controls in multiple tests of learning and memory. Elevating klotho in mice also enhanced long-term potentiation, a form of synaptic plasticity, and enriched synaptic GluN2B, an N-methyl-D-aspartate receptor (NMDAR) subunit with key functions in learning and memory. Blockade of GluN2B abolished klotho-mediated effects. Surprisingly, klotho effects were evident also in young mice and did not correlate with age in humans, suggesting independence from the aging process. Augmenting klotho or its effects may enhance cognition and counteract cognitive deficits at different life stages.
And Ye et al. have screened, using nematodes, more than 1,200 drugs active on human cells, including drugs approved for human use, finding ~60 that increase C. elegans lifespan by up to 43%. These compounds mainly act on proteins that function in intercellular signaling pathways relevant to oxidative stress resistance - hormone or neurotransmitter receptors, particularly those for adrenaline and noradrenaline, serotonin, dopamine, and histamine. This suggests and narrows down a list of drugs that might be tested for life extension in mammals.
One goal of aging research is to find drugs that delay the onset of age-associated disease. Studies in invertebrates, particularly Caenorhabditis elegans, have uncovered numerous genes involved in aging, many conserved in mammals. However, which of these encode proteins suitable for drug targeting is unknown. To investigate this question, we screened a library of compounds with known mammalian pharmacology for compounds that increase C. elegans lifespan. We identified 60 compounds that increase longevity in C. elegans, 33 of which also increased resistance to oxidative stress. Many of these compounds are drugs approved for human use. Enhanced resistance to oxidative stress was associated primarily with compounds that target receptors for biogenic amines, such as dopamine or serotonin. A pharmacological network constructed with these data reveals that lifespan extension and increased stress resistance cluster together in a few pharmacological classes, most involved in intercellular signaling. These studies identify compounds that can now be explored for beneficial effects on aging in mammals, as well as tools that can be used to further investigate the mechanisms underlying aging in C. elegans.
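The logic of such a screen is straightforward to sketch: compare each compound's mean lifespan with control and keep the hits above some extension threshold. Compound names and numbers below are invented for illustration (compoundB is rigged to show the paper's 43% maximum):

```python
def percent_extension(treated_days, control_days):
    """Percent lifespan change relative to untreated controls."""
    return 100.0 * (treated_days - control_days) / control_days

def screen(results, control_days, threshold_pct=10.0):
    """Return compounds whose mean lifespan beats control by the threshold."""
    return {name: round(percent_extension(days, control_days), 1)
            for name, days in results.items()
            if percent_extension(days, control_days) >= threshold_pct}

fake_results = {"compoundA": 20.0, "compoundB": 28.6, "compoundC": 19.0}
hits = screen(fake_results, control_days=20.0)  # only compoundB survives
```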

Monday, June 09, 2014

Rapidity of human brain and muscle evolution - the downside of smarts?

Roberts does a summary of fascinating work by Bozek et al. He sets the context:
Somewhat narcissistically, one of the spectacular changes in phenotype that we tend to be most interested in is the enhancement in our own brain power which has occurred over the 6 million years that separate us from our last shared ancestor with chimpanzees. The chimp genome is famously very similar to our own, but the technological, linguistic, and cultural phenotype is clearly profoundly different. Several studies have asked open-ended questions as to what happens between the genotype and phenotype to make us so different from our cousins, finding differences in levels, splicing, and editing of gene transcripts, for example. Now a paper just published in PLOS Biology by Katarzyna Bozek, Philipp Khaitovich, and colleagues looks at another intermediate phenotype—the metabolome—with some intriguing and unexpected answers...The metabolome is the set of small molecules (metabolites) that are found in a given tissue; by “small” we mean those with a molecular weight of less than 1,500 Daltons, which includes fats, amino acids, sugars, nucleotides, and vitamins (vitamin B12, for example, is near the top end of this range).  
...the metabolomes of human prefrontal cortex (and of combined brain regions) have changed four times as rapidly in the last 6 million years as those of chimps. While gratifying, this largely confirms for metabolites what was already known for transcripts. 
...brain is not the most spectacular outlier here. The real surprise is that the human muscle metabolome has experienced more than eight times as much change as its chimp counterpart. Indeed, metabolomically speaking, human muscle has changed more in the last 6 million years than mouse muscle has since we parted company from mice back in the Early Cretaceous.  
...the authors compared the performance of humans, chimps, and macaques in a strength test that involved pulling a handle to raise a weight. Human strength, as measured by this test, was barely half that of the non-human primates. Amazingly, untrained chimps and macaques raised in captivity easily outperformed university-level basketball players and professional mountain climbers. The authors speculate that the fates of human brain and muscle may be inextricably entwined, and that weak muscle may be the price we pay for the metabolic demands of our amazing cognitive powers.

A Monday musical offering - Poulenc Improvisation No. 7

This is recorded on the Steinway B at my Twin Valley Rd. residence in Middleton, WI.  I used to regularly post my piano work on MindBlog, and will try to return to the habit.


Friday, June 06, 2014

First direct evidence for human sex pheromones.

Here is a clever experiment by Zhou et al., who digitally morph the gender of moving point-light displays of human walkers from male to female while subjects are exposed to two human steroids that they cannot discriminate. Here is their summary and abstract:

•Human steroid androstadienone conveys masculinity to straight women and gay men
•Human steroid estratetraenol conveys femininity to straight men
•The effects take place in the absence of awareness
•Human gender perception draws on subconscious chemosensory biological cues

Recent studies have suggested the existence of human sex pheromones, with particular interest in two human steroids: androstadienone (androsta-4,16,-dien-3-one) and estratetraenol (estra-1,3,5(10),16-tetraen-3-ol). The current study takes a critical step to test the qualification of the two steroids as sex pheromones by examining whether they communicate gender information in a sex-specific manner. By using dynamic point-light displays that portray the gaits of walkers whose gender is digitally morphed from male to female, we show that smelling androstadienone systematically biases heterosexual females, but not males, toward perceiving the walkers as more masculine. By contrast, smelling estratetraenol systematically biases heterosexual males, but not females, toward perceiving the walkers as more feminine. Homosexual males exhibit a response pattern akin to that of heterosexual females, whereas bisexual or homosexual females fall in between heterosexual males and females. These effects are obtained despite that the olfactory stimuli are not explicitly discriminable. The results provide the first direct evidence that the two human steroids communicate opposite gender information that is differentially effective to the two sex groups based on their sexual orientation. Moreover, they demonstrate that human visual gender perception draws on subconscious chemosensory biological cues, an effect that has been hitherto unsuspected.
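The gender-morphing manipulation can be thought of as linear interpolation between male and female gait parameters, with the steroid effect showing up as a shift in the morph level at which observers switch to reporting "female" (the point of subjective equality, PSE). A sketch with hypothetical numbers:

```python
def morph(male_params, female_params, t):
    """Linearly interpolate gait parameters; t=0 is male, t=1 is female."""
    return [(1 - t) * m + t * f for m, f in zip(male_params, female_params)]

# Hypothetical single gait parameter (e.g., hip-sway amplitude):
male, female = [0.2], [0.8]
midpoint = morph(male, female, 0.5)

def judged_female(t, pse=0.5):
    """Observer reports 'female' when the morph level exceeds their PSE;
    a pheromone-induced bias corresponds to a shifted PSE."""
    return t > pse

# Estratetraenol biasing men toward 'female' would act like a lower PSE:
biased = judged_female(0.45, pse=0.4)
```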

Thursday, June 05, 2014

Social attention and our ventromedial prefrontal cortex.

Ralph Adolphs points to an interesting article by Wolf et al. showing that bilateral ventromedial prefrontal cortex damage impairs visual attention to the eye regions of faces, particularly fearful faces. From Adolphs's summary:



Failing to look at the eyes. Shown in each image are the regions of a face at which different groups of subjects look, as measured using eye-tracking. The hottest colours (red regions) denote those regions of the face where people look the most. Whereas this corresponds to the eye region of the face in healthy controls (far left), it is abnormal in certain clinical populations, including individuals with lesions of the vmPFC (top right) or amygdala (bottom right) and individuals with autism spectrum disorder (bottom centre). Top row: from Wolf et al. 2014. Bottom row: data from Michael Spezio, Daniel Kennedy, Ralph Adolphs. All images represent spatially smoothed data averaged across multiple fixations, multiple stimuli and multiple subjects within the indicated group.

Wednesday, June 04, 2014

Blue Mind - looking at water improves your health and calm

I just spent three days this past weekend in a guesthouse cabin in Door County, Wisconsin - three days of seeing mainly gorgeous green forests and the blue expanses of Lake Michigan on both sides of the Door County peninsula. Over those days, just looking at the water, I could feel a growing calm that quieted my normally chattering mind. Now, back in my university office, what do I stumble across but an article on stress that mentions this very calming effect of water: a review by Michael Gross titled "Chronic stress means we're always on the hunt". Gross first notes that one remedy described for chronic stress (the permanently engaged state dubbed the "red mind") is to give the stress system some real exercise - doing something like skydiving - to put the more mundane stresses of daily life in perspective. In the second portion of his article, he points to the work of Wallace J. Nichols and others, who use the phrase "blue mind" to describe the interface between our psychology and the natural environment, particularly water, the largest feature of that environment. Gross notes that Nichols has put his ideas
...in a new book, called Blue Mind: How Water Makes You Happier, More Connected, and Better at What You Do, which is due to appear in June. In his book, Nichols discusses a spate of recent psychology papers showing that the proximity of “blue nature” can improve people’s physical and mental health and counterbalance the damaging effects of the chronic stress and the permanent engagement of the red mind. While the opportunity to exercise plays a part, several studies have shown that the positive effect of being near water can be separated from that aspect.

Tuesday, June 03, 2014

Crowdsourcing our brain’s wiring.

You too can be a neuroscientist! I have to join the general chorus of press pointing to the efforts of Sebastian Seung, now moving from MIT to Princeton University. Detailed analysis of serial ultra-thin electron microscope sections of brain neurons is extremely laborious - tracing a single cell takes about 50 hours. Seung has asked the general public ("citizen neuroscientists") for help in doing this with neurons in the mouse retina. The data on how bipolar cells connect to motion-detecting amacrine cells in the retina have suggested a model for motion detection. This animation of their results is a treat to watch:

Monday, June 02, 2014

The science of inequality.

The May 23 issue of Science Magazine has a large section devoted to the origins and analysis of economic inequality. The general gist of virtually all the articles is that inequality is here to stay, predicted by social models and even by simple physical models of exchange. A weekly chaos and complexity discussion group that I attend discussed just this past Tuesday the economic Yard-Sale Model of wealth redistribution, in which power-law-like wealth distributions emerge even when all individuals are identical and play by the same rules. In a previous post, I passed on a simple model illustrated by Clint Sprott, the seminar's organizer.
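The Yard-Sale Model is simple enough to try yourself. In each round, two randomly chosen traders stake a fraction of the poorer trader's wealth on a fair coin flip. The sketch below is my own minimal illustration of that idea, not Sprott's actual model; the stake fraction, population size, and trade count are arbitrary choices for demonstration.

```python
import random

def yard_sale(n_agents=1000, n_trades=200_000, stake=0.1, seed=42):
    """Yard-sale model: identical agents, fair coin flips.
    Each trade moves stake * (poorer trader's wealth) to the winner."""
    rng = random.Random(seed)
    wealth = [1.0] * n_agents  # everyone starts out equal
    for _ in range(n_trades):
        i, j = rng.sample(range(n_agents), 2)   # pick two distinct traders
        amount = stake * min(wealth[i], wealth[j])
        if rng.random() < 0.5:                  # fair coin decides the winner
            wealth[i] += amount
            wealth[j] -= amount
        else:
            wealth[i] -= amount
            wealth[j] += amount
    return wealth

w = sorted(yard_sale(), reverse=True)
top_share = sum(w[:10]) / sum(w)  # share held by the richest 1% of 1000 agents
print(f"richest 1% hold {top_share:.0%} of total wealth")
```

Note that total wealth is conserved and no trader can go negative (a loss is at most 10% of one's own wealth), yet wealth still concentrates: the richest few end up holding far more than their equal 1% share, purely through luck.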