Tuesday, May 03, 2016

Video games for Neuro-Cognitive Optimization

Continuing the MindBlog thread on brain games (cf. here), I pass on the introduction to a brief review by Mishra, Anguera, and Gazzaley on designing the next generation of closed-loop video games (CLVGs) that offer the prospect of enhancing cognition:
Humans of all ages engage deeply in game play. Game-based interactive environments provide a rich source of enjoyment, but also generate powerful experiences that promote learning and behavioral change (Pellegrini, 2009). In the modern era, software-based video games have become ubiquitous. The degree of interactivity and immersion in these video games can now be further enhanced like never before with the advent of consumer-accessible technologies like virtual reality, augmented reality, wearable physiological devices, and motion capture, all of which can be readily integrated using accessible game engines. This technological revolution presents a huge opportunity for neuroscientists to design targeted, novel game-based tools that drive positive neuroplasticity, accelerate learning, and strengthen cognitive function, and thereby promote mental wellbeing in both healthy and impaired brains.
In fact, there is now a burgeoning brain-training industry that already claims to have achieved this goal. However, many commercial claims are unsubstantiated and dismissed by the scientific community (Max Planck Institute for Human Development/Stanford Center on Longevity, 2014, Underwood, 2016). It seems prudent for us to slow down and approach this opportunity with scientific rigor and conservative optimism. Enhancing brain function should not be viewed as a clever, profitable start-up idea that can be conquered with a large marketing budget. If the field continues to be led by overinflated claims, we will jeopardize the careful and iterative process of evidence-based innovations in brain training and thereby risk throwing out the baby with the bathwater.

To strike the right balance, the path to commercialization needs to be accomplished via cutting-edge, neuroscientifically informed video game development tightly coupled with refinement and validation of the software in well-controlled empirical studies. Additionally, to separate the grain from the chaff, these studies and the claims based on them need verification and approval by independent regulatory agencies and the broader scientific community. High-level video game development and rigorous scientific validation need to become the twin pillar foundations of the next generation of closed-loop video games (CLVGs). Here, we define CLVGs as interactive video games that incorporate rapid, real-time, performance-driven, adaptive game challenges and performance feedback. The time is ideal for intensified effort in this important endeavor; CLVGs that are methodically developed and validated have the potential to benefit a broad array of disciplines in need of effective tools to enhance brain function, including education, medicine, and wellness.

Monday, May 02, 2016

Embodied Prediction - perception and mind turned upside down

Andy Clark offers a fascinating discussion and analysis of predictive processing, which turns the traditional picture of perception on its head. The embodied mind model, which seems to me completely compelling, shows the stark inadequacy of most brain-centered models of mind and cognition. I pass on the end of his introduction and the closing paragraph of the essay. (This essay is just one of 39 on the fascinating Open Mind website, edited by Thomas Metzinger and Jennifer Windt, with contributions from both junior and senior members of the academic philosophy of mind field.)
Predictive processing plausibly represents the last and most radical step in a retreat from the passive, input-dominated view of the flow of neural processing. According to this emerging class of models, naturally intelligent systems (humans and other animals) do not passively await sensory stimulation. Instead, they are constantly active, trying to predict the streams of sensory stimulation before they arrive. Before an “input” arrives on the scene, these pro-active cognitive systems are already busy predicting its most probable shape and implications. Systems like this are already (and almost constantly) poised to act, and all they need to process are any sensed deviations from the predicted state. It is these calculated deviations from predicted states (known as prediction errors) that thus bear much of the information-processing burden, informing us of what is salient and newsworthy within the dense sensory barrage. The extensive use of top-down probabilistic prediction here provides an effective means of avoiding the kinds of “representational bottleneck” feared by early opponents of representation-heavy—but feed-forward dominated—forms of processing. Instead, the downward flow of prediction now does most of the computational “heavy-lifting”, allowing moment-by-moment processing to focus only on the newsworthy departures signified by salient prediction errors. Such economy and preparedness is biologically attractive, and neatly sidesteps the many processing bottlenecks associated with more passive models of the flow of information.
Action itself...then needs to be reconceived. Action is not so much a response to an input as a neat and efficient way of selecting the next “input”, and thereby driving a rolling cycle. These hyperactive systems are constantly predicting their own upcoming states, and actively moving so as to bring some of them into being. We thus act so as to bring forth the evolving streams of sensory information that keep us viable (keeping us fed, warm, and watered) and that serve our increasingly recondite ends. PP thus implements a comprehensive reversal of the traditional (bottom-up, forward-flowing) schema. The largest contributor to ongoing neural response, if PP is correct, is the ceaseless anticipatory buzz of downwards-flowing neural prediction that drives both perception and action. Incoming sensory information is just one further factor perturbing those restless pro-active seas. Within those seas, percepts and actions emerge via a recurrent cascade of sub-personal predictions forged from unconscious expectations spanning multiple spatial and temporal scales.
Conceptually, this implies a striking reversal, in that the driving sensory signal is really just providing corrective feedback on the emerging top-down predictions. As ever-active prediction engines, these kinds of minds are not, fundamentally, in the business of solving puzzles given to them as inputs. Rather, they are in the business of keeping us one step ahead of the game, poised to act and actively eliciting the sensory flows that keep us viable and fulfilled. If this is on track, then just about every aspect of the passive forward-flowing model is false. We are not passive cognitive couch potatoes so much as proactive predictavores, forever trying to stay one step ahead of the incoming waves of sensory stimulation.
Conclusion: Towards a mature science of the embodied mind
By self-organizing around prediction error, and by learning a generative rather than a merely discriminative (i.e., pattern-classifying) model, these approaches realize many of the goals of previous work in artificial neural networks, robotics, dynamical systems theory, and classical cognitive science. They self-organize around prediction error signals, perform unsupervised learning using a multi-level architecture, and acquire a satisfying grip—courtesy of the problem decompositions enabled by their hierarchical form—upon structural relations within a domain. They do this, moreover, in ways that are firmly grounded in the patterns of sensorimotor experience that structure learning, using continuous, non-linguaform, inner encodings (probability density functions and probabilistic inference). Precision-based restructuring of patterns of effective connectivity then allow us to nest simplicity within complexity, and to make as much (or as little) use of body and world as task and context dictate. This is encouraging. It might even be that models in this broad ballpark offer us a first glimpse of the shape of a fundamental and unified science of the embodied mind.
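The core prediction-error loop Clark describes can be caricatured in a few lines of code. This is only a toy sketch under my own simplifying assumptions (a single scalar sensory stream, a fixed learning rate), not anything from the essay itself, but it shows how a system that processes only deviations from its own predictions quickly renders a stable input "unsurprising":

```python
# Toy sketch of a predictive-processing loop: the system continuously
# predicts its sensory input and updates only on the prediction error.
# The scalar signal and fixed learning rate are illustrative assumptions.

def predictive_coding(signal, learning_rate=0.1):
    """Track a sensory stream by minimizing prediction error."""
    prediction = 0.0
    errors = []
    for observation in signal:
        error = observation - prediction      # only the deviation is processed
        prediction += learning_rate * error   # the top-down model absorbs the news
        errors.append(error)
    return prediction, errors

# A constant input quickly becomes unsurprising: errors shrink toward zero.
prediction, errors = predictive_coding([5.0] * 50)
print(abs(prediction - 5.0) < 0.1)       # True: the model has locked onto the input
print(abs(errors[-1]) < abs(errors[0]))  # True: later "surprises" are smaller
```

After enough exposure the prediction error shrinks toward zero, so the only signal left to process is whatever departs from expectation.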

Friday, April 29, 2016

The privileged fifth.

I tweeted this well-researched Op-Ed piece by Thomas Edsall the first time I read it, and after my third reading, want to urge you to read it. I pass on two summary graphics that are part of the description of how the privileged top fifth of the U.S. population is becoming a self-perpetuating class that is steadily separating itself by geography, education, and income.

Thursday, April 28, 2016

Sleep deprivation, brain structure, and learning

Saletin et al. find that individual differences in the anatomy of the human hippocampus explain many of the differences in learning impairment after sleep loss. These structural differences also predict the subsequent EEG slow-wave activity during recovery sleep and the restoration of learning after sleep.

Significance statement
Sleep deprivation does not impact all people equally. Some individuals show cognitive resilience to the effects of sleep loss, whereas others express striking vulnerability, the reasons for which remain largely unknown. Here, we demonstrate that structural features of the human brain, specifically those within the hippocampus, accurately predict which individuals are susceptible (or conversely, resilient) to memory impairments caused by sleep deprivation. Moreover, this same structural feature determines the success of memory restoration following subsequent recovery sleep. Therefore, structural properties of the human brain represent a novel biomarker predicting individual vulnerability to (and recovery from) the effects of sleep loss, one with occupational relevance in professions where insufficient sleep is pervasive yet memory function is paramount.
Sleep deprivation impairs the formation of new memories. However, marked interindividual variability exists in the degree to which sleep loss compromises learning, the mechanistic reasons for which are unclear. Furthermore, which physiological sleep processes restore learning ability following sleep deprivation are similarly unknown. Here, we demonstrate that the structural morphology of human hippocampal subfields represents one factor determining vulnerability (and conversely, resilience) to the impact of sleep deprivation on memory formation. Moreover, this same measure of brain morphology was further associated with the quality of nonrapid eye movement slow wave oscillations during recovery sleep, and by way of such activity, determined the success of memory restoration. Such findings provide a novel human biomarker of cognitive susceptibility to, and recovery from, sleep deprivation. Moreover, this metric may be of special predictive utility for professions in which memory function is paramount yet insufficient sleep is pervasive (e.g., aviation, military, and medicine).
For further reading on insomnia, this article covers several other studies, including one reporting lowered connectivity in several right-hemisphere brain regions in people with primary insomnia.

Wednesday, April 27, 2016

Grandiose narcissism and the U.S. presidency

Many of us are scratching our heads about what a Trump presidency might be like, particularly in regard to his most salient personality trait: grandiose narcissism. Watts et al. have looked at the historical record to see how this trait has correlated with both positive and negative leadership behaviors in U.S. presidents through George W. Bush. Their abstract:
Recent research and theorizing suggest that narcissism may predict both positive and negative leadership behaviors. We tested this hypothesis with data on the 42 U.S. presidents up to and including George W. Bush, using (a) expert-derived narcissism estimates, (b) independent historical surveys of presidential performance, and (c) largely or entirely objective indicators of presidential performance. Grandiose, but not vulnerable, narcissism was associated with superior overall greatness in an aggregate poll; it was also positively associated with public persuasiveness, crisis management, agenda setting, and allied behaviors, and with several objective indicators of performance, such as winning the popular vote and initiating legislation. Nevertheless, grandiose narcissism was also associated with several negative outcomes, including congressional impeachment resolutions and unethical behaviors. We found that presidents exhibit elevated levels of grandiose narcissism compared with the general population, and that presidents’ grandiose narcissism has been rising over time. Our findings suggest that grandiose narcissism may be a double-edged sword in the leadership domain.
The two highest scorers on grandiose narcissism were Lyndon B. Johnson and Theodore Roosevelt. Richard M. Nixon scored high on "vulnerable narcissism," a trait associated with being self-absorbed and thin-skinned. From the authors' popular account of their work:
Studies in the Journal of Personality in 2013 and in Personality and Individual Differences in 2009 have shown that narcissistic individuals tend to impress others during brief interactions and to perform well in public, two attributes that lend themselves to political success. They are also willing to take risks, which can be a valuable asset in a leader.
In contrast, the psychologist W. Keith Campbell and others have found that narcissists tend to be overconfident when making decisions, to overestimate their abilities and to portray their ideas as innovative when they are not. Compared with their non-narcissistic counterparts, they are more likely to accumulate resources for themselves at others’ expense.
The psychologists Brad Bushman and Roy F. Baumeister have found that narcissists, but not people with garden-variety high self-esteem, are prone to retaliating harshly against people who have criticized them. If, for example, you present narcissists with negative feedback about essays they’ve written, they’re likely to exact revenge against their presumed essay evaluators by blasting them with loud noises (as one amusing study found).
Still other work by the psychologist Mitja Back and colleagues suggests that narcissists are generally well liked in the short term, often creating positive first impressions. Other research indicates, though, that after a while they are usually more disliked than other individuals. Their charisma tends to wear off.

Tuesday, April 26, 2016

Are we smart enough to know how smart animals are?

I want to pass on some clips from Silk's review of Frans de Waal's recent book, whose title is the title of this post:
Natural selection, he argues, shapes cognitive abilities in the same way as it shapes traits such as wing length. As animals' challenges and habitats differ, so do their cognitive abilities. This idea, which he calls evolutionary cognition, has gained traction in psychology and biology in the past few decades.
For de Waal, evolutionary cognition has two key consequences. First, it is inconsistent with the concept of a 'great chain of being' in which organisms can be ordered from primitive to advanced, simple to complex, stupid to smart. Name a 'unique' human trait, and biologists will find another organism with a similar one. Humans make and use tools; so do wild New Caledonian crows (Corvus moneduloides). Humans develop cultures; so do humpback whales (Megaptera novaeangliae), which socially transmit foraging techniques. We can mentally 'time travel', remembering past events and planning for the future; so can western scrub jays (Aphelocoma californica), which can recall what they had for breakfast on one day, anticipate whether they will be given breakfast the next and selectively cache food when breakfast won't be delivered.
Furthermore, humans do not necessarily outdo other animals in all cognitive domains. Black-capped chickadees (Poecile atricapillus) store seeds in hundreds of locations each day, and can remember what they stored and where, as well as whether items in each location have been eaten, or stolen. Natural selection has favoured those prodigious feats of memory because they spell the difference between surviving winter and starving before spring. Human memory doesn't need to be as good: primates evolved in the tropics. “In the utilitarian view of biology,” de Waal argues, “animals have the brains they need — nothing more, nothing less.”
The second consequence of de Waal's view is that there is continuity across taxa. One source of continuity is based on evolutionary history: natural selection modifies traits to create new ones, producing commonalities among species with a common history. He points out that tool use is found not just in humans and chimpanzees, but also in other apes and monkeys, implying that relevant cognitive building blocks are shared across all primates. Continuity is also generated by convergent evolution, which produces similar traits in distantly related organisms such as New Caledonian crows and capuchin monkeys. De Waal opines that continuity “ought to be the default position for at least all mammals, and perhaps also birds and other vertebrates”.
...researchers are eager to understand what is distinctly human; some are driven by curiosity about how humans came to dominate the planet...Our success presumably has something to do with the emergence of a unique suite of cognitive traits...De Waal recognizes only one such trait: our rich and flexible system of symbolic communication, and our ability to exchange information about past and future. His commitment to the principle of continuity forces him to discount the importance of language for human cognition because of evidence of thinking by non-linguistic creatures. And he ignores compelling findings from linguists and developmental psychologists such as Elizabeth Spelke on the formative role of language in cognition.
A more satisfying book would leave readers with a clearer understanding of why, a few million years after our lineage diverged from the lineage of chimpanzees, we are the ones reading this book, and not them.

Monday, April 25, 2016

Essential role of default mode network in higher cognitive processing.

The respective roles of attentional and default mode networks in our brains have been the subject of numerous MindBlog posts (enter 'default mode' in the search box in the left column). A summary article by Bola and Borchardt notes an important recent contribution by Vatansever et al., whose abstract is shown below, followed by a graphic from the summary article. Their work changes the previous view that the default mode network disengages during goal-directed tasks.

The default mode network (DMN) has been traditionally assumed to hinder behavioral performance in externally focused, goal-directed paradigms and to provide no active contribution to human cognition. However, recent evidence suggests greater DMN activity in an array of tasks, especially those that involve self-referential and memory-based processing. Although data that robustly demonstrate a comprehensive functional role for DMN remains relatively scarce, the global workspace framework, which implicates the DMN in global information integration for conscious processing, can potentially provide an explanation for the broad range of higher-order paradigms that report DMN involvement. We used graph theoretical measures to assess the contribution of the DMN to global functional connectivity dynamics in 22 healthy volunteers during an fMRI-based n-back working-memory paradigm with parametric increases in difficulty. Our predominant finding is that brain modularity decreases with greater task demands, thus adapting a more global workspace configuration, in direct relation to increases in reaction times to correct responses. Flexible default mode regions dynamically switch community memberships and display significant changes in their nodal participation coefficient and strength, which may reflect the observed whole-brain changes in functional connectivity architecture. These findings have important implications for our understanding of healthy brain function, as they suggest a central role for the DMN in higher cognitive processing.
The default mode network (DMN) has been shown to increase its activity during the absence of external stimulation, and hence was historically assumed to disengage during goal-directed tasks. Recent evidence, however, implicates the DMN in self-referential and memory-based processing. We provide robust evidence for this network's active contribution to working memory by revealing dynamic reconfiguration in its interactions with other networks and offer an explanation within the global workspace theoretical framework. These promising findings may help redefine our understanding of the exact DMN role in human cognition.
Graphic from Review

Schematic representation of the main findings of Vatansever et al. Community representation and colors are in the style of Figures 1 and 3 in the article by Vatansever et al. (2015), and the DMN is represented by Community 4. In the low-demanding 0-back condition, the network was highly modular (high Q index) and was divided into four distinct modules. With the increasing cognitive load, the modularity of the network decreased, and three communities merged into one. Thus, while local segregation was prevalent in the low-demanding task, increasing cognitive effort was associated with more pronounced global integration.
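The modularity index Q mentioned above has a standard definition (Newman's modularity): for each community, the fraction of the network's edges that fall inside it, minus the fraction expected by chance given node degrees. A minimal sketch, using a toy graph of my own invention rather than the study's connectivity data, shows why merging communities into one global workspace lowers Q:

```python
def modularity(edges, communities):
    """Newman modularity Q for an undirected, unweighted graph.
    edges: list of (u, v) pairs; communities: list of sets of nodes."""
    m = len(edges)
    degree = {}
    for u, v in edges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1
    q = 0.0
    for community in communities:
        intra = sum(1 for u, v in edges if u in community and v in community)
        total_degree = sum(degree[n] for n in community)
        q += intra / m - (total_degree / (2 * m)) ** 2
    return q

# Two dense 5-node cliques joined by a single bridge edge.
clique1 = [(i, j) for i in range(5) for j in range(i + 1, 5)]
clique2 = [(i, j) for i in range(5, 10) for j in range(i + 1, 10)]
edges = clique1 + clique2 + [(4, 5)]

q_segregated = modularity(edges, [set(range(5)), set(range(5, 10))])
q_merged = modularity(edges, [set(range(10))])
print(q_segregated > q_merged)  # True: merging the modules lowers Q
```

High Q means locally segregated processing; the study's finding is the analogue of the second case, where rising task demands pull formerly separate communities into one integrated whole.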

Friday, April 22, 2016

How to attract others.

Well, duh... Interesting, but talk about showing the obvious! From Vacharkulksemsuk et al.:
Across two field studies of romantic attraction, we demonstrate that postural expansiveness makes humans more romantically appealing. In a field study (n = 144 speed-dates), we coded nonverbal behaviors associated with liking, love, and dominance. Postural expansiveness—expanding the body in physical space—was most predictive of attraction, with each one-unit increase in coded behavior from the video recordings nearly doubling a person’s odds of getting a “yes” response from one’s speed-dating partner. In a subsequent field experiment (n = 3,000), we tested the causality of postural expansion (vs. contraction) on attraction using a popular Global Positioning System-based online-dating application. Mate-seekers rapidly flipped through photographs of potential sexual/date partners, selecting those they desired to meet for a date. Mate-seekers were significantly more likely to select partners displaying an expansive (vs. contractive) nonverbal posture. Mediation analyses demonstrate one plausible mechanism through which expansiveness is appealing: Expansiveness makes the dating candidate appear more dominant. In a dating world in which success sometimes is determined by a split-second decision rendered after a brief interaction or exposure to a static photograph, single persons have very little time to make a good impression. Our research suggests that a nonverbal dominance display increases a person’s chances of being selected as a potential mate.
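To unpack the "nearly doubling a person's odds" claim: in a logistic regression, a coefficient beta on a predictor corresponds to an odds ratio of exp(beta) per one-unit increase. The baseline probability and coefficient below are illustrative assumptions of mine, not the study's actual estimates:

```python
import math

def odds(probability):
    """Convert a probability into odds."""
    return probability / (1 - probability)

# Illustrative numbers: a 20% baseline chance of a "yes", and a coefficient
# chosen so that each one-unit increase in expansiveness doubles the odds.
beta = math.log(2)
intercept = math.log(odds(0.2))

def p_yes(expansiveness):
    """Probability of a "yes" under a simple logistic model."""
    logit = intercept + beta * expansiveness
    return 1 / (1 + math.exp(-logit))

odds_ratio = odds(p_yes(1)) / odds(p_yes(0))
print(round(odds_ratio, 2))  # 2.0: the odds double per unit of expansiveness
```

Note that doubling the odds is not doubling the probability: here a one-unit increase moves the chance of a "yes" from 20% to about 33%.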

Thursday, April 21, 2016

Impulsivity, sensation seeking, and substance use correlate with reduced brain cortical thickness.

From Holmes et al.:
Individuals vary widely in their tendency to seek stimulation and act impulsively, early developing traits with genetic origins. Failures to regulate these behaviors increase risk for maladaptive outcomes including substance abuse. Here, we explored the neuroanatomical correlates of sensation seeking and impulsivity in healthy young adults. Our analyses revealed links between sensation seeking and reduced cortical thickness that were preferentially localized to regions implicated in cognitive control, including anterior cingulate and middle frontal gyrus (n = 1015). These associations generalized to self-reported motor impulsivity, replicated in an independent group (n = 219), and correlated with heightened alcohol, tobacco, and caffeine use. Critically, the relations between sensation seeking and brain structure were evident in participants without a history of alcohol or tobacco use, suggesting that observed associations with anatomy are not solely a consequence of substance use. These results demonstrate that individual differences in the tendency to seek stimulation, act on impulse, and engage in substance use are correlated with the anatomical structure of cognitive control circuitry. Our findings suggest that, in healthy populations, covariation across these complex multidimensional behaviors may in part originate from a common underlying biology.

Wednesday, April 20, 2016

Metaphorical conflict shapes social perception when spatial and ideological collide.

Kleiman et al. do some intriguing experiments. I give you their abstract first, which doesn't actually say how they did the experiments, and then give you some further description from their text. The abstract:
In the present article, we introduce the concept of metaphorical conflict—a conflict between the concrete and abstract aspects of a metaphor. We used the association between the concrete (spatial) and abstract (ideological) components of the political left-right metaphor to demonstrate that metaphorical conflict has marked implications for cognitive processing and social perception. Specifically, we showed that creating conflict between a spatial location and a metaphorically linked concept reduces perceived differences between the attitudes of partisans who are generally viewed as possessing fundamentally different worldviews (Democrats and Republicans). We further demonstrated that metaphorical conflict reduces perceived attitude differences by creating a mind-set in which categories are represented as possessing broader boundaries than when concepts are metaphorically compatible. These results suggest that metaphorical conflict shapes social perception by making members of distinct groups appear more similar than they are generally thought to be. These findings have important implications for research on conflict, embodied cognition, and social perception.
In the first experiment they asked subjects to categorize a series of pictures of Barack Obama and Mitt Romney. One group categorized the Romney pictures with their right hand (the P key) and the Obama pictures with their left hand (the Q key) - compatible with the right-wing, left-wing political metaphor. A second group was asked to identify Obama with their right hand and Romney with their left - in this case the physical action and the candidate's ideology were metaphorically incompatible. The interesting result was that:
...participants in the incompatible condition perceived the difference between the candidates’ ideologies as smaller than did participants in the compatible condition...Additionally, participants in the incompatible condition perceived the difference between the candidates’ stances on specific political issues as smaller than did participants in the compatible condition
A second experiment asked participants to estimate the ideology of the typical Democrat and Republican using a scale of 1 to 9 that was either compatible or incompatible with the metaphorical association linking spatial locations to political ideologies.
Participants assigned to the incompatible condition (n = 194) provided their response on a horizontally displayed scale with the values in the opposite sequence, that is, from 1 (extremely conservative) to 9 (extremely liberal). Note that this scale reversed the traditional spatial assignment and placed liberal views on the right and conservative views on the left, which metaphorically puts the physical location and ideology in conflict... consistent with predictions, participants who rated their perceptions on the incompatible scale perceived the typical Republican’s and typical Democrat’s attitudes as more similar than did participants who rated their perceptions on the compatible scale.
Two further control experiments were done.

Tuesday, April 19, 2016

Political polarization and prejudice.

Yesterday's post dealt with softening prejudicial attitudes toward transgender people. This is relevant to the prejudice arising from the right-versus-left political polarization that continues to increase in this country. From a recent NYTimes Op-Ed piece by Arthur Brooks:
Thirty-eight percent of Democrats have a “very unfavorable” view of Republicans, and 43 percent of Republicans hold that view of Democrats. About half of “consistently liberal” Americans say most of their friends share their views, and about a third say it’s important to live in a place where that is so. For those who are “consistently conservative,” these preferences are even more pronounced.
...the average American is becoming more ideologically predictable. A Pew Research Center study from 2014 shows that the share of Americans with “consistently conservative” or “consistently liberal” views has more than doubled in the last two decades to 21 percent from 10 percent...In 1994, nearly 40 percent of Republicans were more liberal than the median Democrat, and 30 percent of Democrats were more conservative than the median Republican. Today, those numbers have plummeted to 8 percent and 6 percent.
This polarization has led to political discrimination that studies have shown to be stronger than racial discrimination.
...Bigotry’s cousin is contempt...Watch and listen to politically polarized commentary today, and you will see that it is more contemptuous than angry, overflowing with sneering, mockery and disgust.
So what’s the antidote? I asked the Dalai Lama, one of the world’s experts on bringing people together. He made two points. First, the solution starts not with institutions, but with individuals. We look too much to political parties or Congress to make progress, but not nearly enough at our own behavior...You can’t single-handedly change the country, but you can change yourself. By declaring your independence from the bitterness washing over our nation, you can strike a small blow for greater national unity.
Second, each of us must aspire to what the Dalai Lama calls “warmheartedness” toward those with whom we disagree. This might sound squishy, but it is actually tough and practical advice. As he has stated, “I defeat my enemies when I make them my friends.” He is not advocating surrender to the views of those with whom we disagree. Liberals should be liberals and conservatives should be conservatives. But our duty is to be respectful, fair and friendly to all, even those with whom we have great differences.
Yesterday's post on changing prejudice suggests a further technique for reconciliation: active or analogic perspective taking. This is essentially imagining a situation in which you felt contempt from others, and also putting yourself in the shoes of others, imagining their concerns, etc.

Monday, April 18, 2016

How to change prejudice...for real this time

John Bohannon summarizes the interesting story of two researchers who, after finding that a study on reversing homophobia was based on fake data, went on to show that the effect claimed by the fraudulent study was real after all. Broockman and Kalla used a technique developed by the Los Angeles LGBT Center:
...the LGBT Center has its canvassers follow one called “analogic perspective taking.” By inviting someone to discuss an experience in which that person was perceived as different and treated unfairly, a canvasser tries to generate sympathy for the suffering of another group—such as gay or transgender people.
Here is the abstract:
Existing research depicts intergroup prejudices as deeply ingrained, requiring intense intervention to lastingly reduce. Here, we show that a single approximately 10-minute conversation encouraging actively taking the perspective of others can markedly reduce prejudice for at least 3 months. We illustrate this potential with a door-to-door canvassing intervention in South Florida targeting antitransgender prejudice. Despite declines in homophobia, transphobia remains pervasive. For the intervention, 56 canvassers went door to door encouraging active perspective-taking with 501 voters at voters’ doorsteps. A randomized trial found that these conversations substantially reduced transphobia, with decreases greater than Americans’ average decrease in homophobia from 1998 to 2012. These effects persisted for 3 months, and both transgender and nontransgender canvassers were effective. The intervention also increased support for a nondiscrimination law, even after exposing voters to counterarguments.

Friday, April 15, 2016

Brain correlates of how the risk taking of others influences our own risk taking

From Suzuki et al., another upstairs/downstairs story: risk is represented in the caudate nucleus (downstairs), while the risk preferences of others are represented in the dorsolateral prefrontal cortex (upstairs). The strength of the connection between these areas determines how susceptible our behavior is to influence by others.
Our attitude toward risk plays a crucial role in influencing our everyday decision-making. Despite its importance, little is known about how human risk-preference can be modulated by observing risky behavior in other agents at either the behavioral or the neural level. Using fMRI combined with computational modeling of behavioral data, we show that human risk-preference can be systematically altered by the act of observing and learning from others’ risk-related decisions. The contagion is driven specifically by brain regions involved in the assessment of risk: the behavioral shift is implemented via a neural representation of risk in the caudate nucleus, whereas the representations of other decision-related variables such as expected value are not affected. Furthermore, we uncover neural computations underlying learning about others’ risk-preferences and describe how these signals interact with the neural representation of risk in the caudate. Updating of the belief about others’ preferences is associated with neural activity in the dorsolateral prefrontal cortex (dlPFC). Functional coupling between the dlPFC and the caudate correlates with the degree of susceptibility to the contagion effect, suggesting that a frontal–subcortical loop, the so-called dorsolateral prefrontal–striatal circuit, underlies the modulation of risk-preference. Taken together, these findings provide a mechanistic account for how observation of others’ risky behavior can modulate an individual’s own risk-preference.

Thursday, April 14, 2016

Aging brains - more physical activity, more gray matter, less Alzheimer's.

I like to pass on any work I see relevant to exercise, aging, and the brain. The following is from Raji et al.
BACKGROUND: Physical activity (PA) can be neuroprotective and reduce the risk for Alzheimer's disease (AD). In assessing physical activity, caloric expenditure is a proxy marker reflecting the sum total of multiple physical activity types conducted by an individual. 
OBJECTIVE: To assess caloric expenditure, as a proxy marker of PA, as a predictive measure of gray matter (GM) volumes in the normal and cognitively impaired elderly persons. 
METHODS: All subjects in this study were recruited from the Institutional Review Board approved Cardiovascular Health Study (CHS), a multisite population-based longitudinal study in persons aged 65 and older. We analyzed a sub-sample of 876 CHS participants (mean age 78.3, 57.5% F, 42.5% M) who had i) energy output assessed as kilocalories (kcal) per week using the standardized Minnesota Leisure-Time Activities questionnaire, ii) cognitive assessments for clinical classification of normal cognition, mild cognitive impairment (MCI), and AD, and iii) volumetric MR imaging of the brain. Voxel-based morphometry modeled the relationship between kcal/week and GM volumes while accounting for standard covariates including head size, age, sex, white matter hyperintensity lesions, MCI or AD status, and site. Multiple comparisons were controlled using a False Discovery Rate of 5 percent. 
RESULTS: Higher energy output, from a variety of physical activity types, was associated with larger GM volumes in frontal, temporal, and parietal lobes, as well as hippocampus, thalamus, and basal ganglia. High levels of caloric expenditure moderated neurodegeneration-associated volume loss in the precuneus, posterior cingulate, and cerebellar vermis. 
CONCLUSION: Increasing energy output from a variety of physical activities is related to larger gray matter volumes in the elderly, regardless of cognitive status.
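The abstract's note that multiple comparisons were controlled at a False Discovery Rate of 5 percent refers to a standard statistical procedure. As a hedged illustration only (the paper does not say which FDR variant was used, and the p-values below are invented), here is a minimal sketch of the common Benjamini-Hochberg procedure:

```python
# Benjamini-Hochberg FDR control: a minimal sketch of the kind of
# multiple-comparisons correction the abstract describes (q = 0.05).
# The p-values below are made up for illustration.

def benjamini_hochberg(p_values, q=0.05):
    """Return the (sorted) indices of hypotheses rejected at FDR level q."""
    m = len(p_values)
    # Sort p-values ascending, remembering original positions.
    order = sorted(range(m), key=lambda i: p_values[i])
    # Find the largest rank k with p_(k) <= (k/m) * q.
    threshold_rank = 0
    for rank, idx in enumerate(order, start=1):
        if p_values[idx] <= (rank / m) * q:
            threshold_rank = rank
    # Reject every hypothesis at or below that rank.
    return sorted(order[:threshold_rank])

p = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.36]
print(benjamini_hochberg(p, q=0.05))  # → [0, 1]
```

The procedure rejects everything up to the largest rank k whose sorted p-value falls under (k/m)·q, which bounds the expected proportion of false positives among the rejected tests rather than the chance of any single false positive.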

Wednesday, April 13, 2016

Distraction in the digital era...what about since 1710?

I want to pass on some clips from an interesting essay by Frank Furedi, "The Ages of Distraction."
The rise of the internet and the widespread availability of digital technology has surrounded us with endless sources of distraction: texts, emails and Instagrams from friends, streaming music and videos, ever-changing stock quotes, news and more news. To get our work done, we could try to turn off the digital stream, but that’s difficult to do when we’re plagued by FOMO, the modern fear of missing out. Some people think that our willpower is so weak because our brains have been damaged by digital noise. But blaming technology for the rise in inattention is misplaced. History shows that the disquiet is fueled not by the next new thing but by the threat this thing – whatever it might be – poses to the moral authority of the day.
The first time inattention emerged as a social threat was in 18th-century Europe, during the Enlightenment, just as logic and science were pushing against religion and myth. The Oxford English Dictionary cites a 1710 entry from Tatler as its first reference to this word, coupling inattention with indolence; both are represented as moral vices of serious public concern.
The recent decades have seen a dramatic reversal in the conceptualization of inattention. Unlike in the 18th century when it was perceived as abnormal, today inattention is often presented as the normal state. The current era is frequently characterized as the Age of Distraction, and inattention is no longer depicted as a condition that afflicts a few. Nowadays, the erosion of humanity’s capacity for attention is portrayed as an existential problem, linked with the allegedly corrosive effects of digitally driven streams of information relentlessly flowing our way.
The perception of an Age of Distraction is related to our uncertainty about the answer to the question of ‘attention to what or to whom’. The sublimation of anxieties about moral authority through the fetish of technologically driven distraction has acquired pathological proportions in relation to children and young people. Yet as most sensible observers understand, children who are inattentive to their teachers are often obsessively attentive to the text messages that they receive. The constant lament about inattentive youth in the Anglo-American world could be interpreted as a symptom of problems related to the exercise of adult authority.
Often the failure to inspire and capture the imagination of young people is blamed on their inattentive state of minds. Too often educators have responded to this condition by adopting a fatalistic approach of accommodating to the supposed inattentive reading practices of digital natives. This pattern is evident in higher education where the assumption that college students can no longer be expected to read long and challenging texts or pay attention to serious lectures has led to the adaptation of course material to the inattentive mentality of the digital native. Calls to change the educational environment to ‘fit the student’ have become widespread in higher education.
How different from the reaction of moral philosophers such as Dugald Stewart, also concerned with the problem of the inattentive student. Author of Outlines of Moral Philosophy: For the Use of Students in the University of Edinburgh (1793), Stewart believed that the problem of inattention could be overcome through moral education. Unlike some contemporary academics, he regarded the ‘early habit of inattention’ as a problem to be solved rather than an unalterable fact of existence. Helvétius fervently believed that everyone had the potential to acquire ‘continued attention’ and ‘triumph over indolence’.
Regrettably, the optimism of Helvétius has given way to a mood of resignation. Attention is still seen as desirable but almost impossible to achieve. As one alarmist account warns, ‘an epidemic erosion of attention is a sure sign of an impending dark age’. Helvétius would have been distressed by the fatalism expressed in this lament.

Tuesday, April 12, 2016

The evolutionary origins of smiles, laughter, and tears.

Graziano suggests that our smile originated in the defensive reaction of monkeys to other monkeys moving into their personal space. He then offers similar just-so stories about the simian origins of our laughing and crying. To begin, imagine that Monkey B steps into the personal space of Monkey A.
Monkey A squints, protecting his eyes. His upper lip pulls up. This does expose the teeth, but only as a side-effect: in a defensive reaction, the point of the curled lip is not to prepare for a biting attack so much as it is to bunch the facial skin upward, further padding the eyes in folds of skin...The head pulls down and the shoulders pull up to protect the vulnerable throat and jugular....The torso curves forward to protect the abdomen...Monkey B can learn a lot by watching the reaction of Monkey A...And so the stage is set for a social signal to evolve: natural selection will favour monkeys that can read the cringe reactions of their peers and adjust their behaviour accordingly...If Monkey B can glean useful information by watching Monkey A, then it’s useful for Monkey A to manipulate that information and influence Monkey B. Evolution therefore favours monkeys that can, in the right circumstances, pantomime a defensive reaction. It helps to convince others that you’re non-threatening. Finally we see the origin of the smile: a briefly flashed imitation of a defensive stance.
In people, the smile has been pared down to little more than its facial components — the lifting of the upper lip, the upward bunching of the cheeks, the squint. These days we use it mainly to communicate a friendly lack of aggression rather than outright subservience...We can’t help feeling warmer towards someone who beams that Duchenne smile.
On laughing:
...chimps have something like laughter: they open their mouths and make short exhalations during play fights, or if someone tickles them. Gorillas and orangutans do the same. The psychologist Marina Ross compared the noises made by different species of ape and found that it was the sound of bonobos at play that comes closest to human laughter, again, when play-fighting or tickling. All of which makes it seem quite likely that the original type of human laughter also emerged from, yes, play-fighting and tickling.
On crying:
My best guess, strange as it might sound, is that our ancestors were in the habit of punching each other on the nose. Such injuries would have resulted in copious tear production...According to recent analysis by David Carrier and Michael Morgan from Utah University, the shape of human facial bones might well have evolved to withstand the physical trauma of frequent punching. Thickly buttressed facial bones are first seen in fossils of Australopithecus, which appeared following our split with chimpanzees...the reason we weep now may well be that our ancestors discussed their differences by hitting each other in the face. Some of us still do, I suppose.
In any event, the entire behavioural display that we call crying – the tear production, the squinting, the raised upper lip, the repeated alarm calls – makes for a useful signifier. Evolution would have favoured animals that reacted to it with an emotional desire to dispense comfort.
Graziano's speculative summary:
An age-old defensive mechanism, a mechanism that monitors bubbles of space around the body and organises protective movements, suddenly takes flight in the hyper-social world of primates, spinning into smiles and laughter and crying and cringeing. Each one of those behaviours then splits further, branching into a whole codebook of signals for use in different social circumstances. Not all of human expression can be explained in this way, but much of it can. A Duchenne smile, a cold smile, laughter at a joke, laughter that acknowledges a clever witticism, cruel laughter, a cringe to show servility, standing straight to show confidence, the arms-crossed expression of suspicion, the arms-open expression of welcome, tilting your head as a sign of surrender to a lover, the fleeting crinkling of the face that hints at crying as we show sympathy for some sad story, or a full blown sobbing jag: this whole vast range of expression could well have emerged from a protective sensory-motor loop that has nothing to do with communication. Evolution is bizarre.

Monday, April 11, 2016

Another list - "Keys to happiness"

The New York Times has compiled a simple list of pointers to basic articles and research on well-being. I'm passing on a few of the items from a condensed version of that list, rearranged in roughly reverse order to reflect not their importance, but how uncommonly they seem to me to be acted on. So, the keys to happiness:

Don't obsess about it, and don't overdo it.

If all else fails, fake it.

Gratitude helps.

Make friends, family, and weekends a priority.

Be healthy.

Friday, April 08, 2016

A succinct list of some of our common psychological errors.

I want to point to Belsky's article on why we think we are better decision makers under uncertainty than we really are. He summarizes several common errors:

The sunk cost fallacy - hanging on to a decision, or an investment, in an unconscious desire to justify it.

Loss aversion - reacting more strongly to loss of a resource (time, goods, or money) than to a similar gain.

Overconfidence - overrating our abilities, knowledge, and skill (two-thirds of investors rate their financial sophistication as advanced, but barely pass a financial literacy exam).

Optimism bias - which seems to be hard-wired into our brains because it has been evolutionarily useful, driving humans to strive in the face of long odds.

Hindsight bias - rewriting history to make ourselves look good, as in misremembering our forecasts in a way that makes us look smarter.

Attribution bias - attributing good outcomes to our own skills, but bad outcomes to causes over which we had no control.

Confirmation bias - giving too much weight to information that supports our existing beliefs and discounting that which does not.
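Of these, loss aversion has a standard quantitative form in prospect theory. As an illustrative sketch only (the value function below, with its conventional parameter estimates alpha ≈ 0.88 and lambda ≈ 2.25, comes from Tversky and Kahneman's work, not from Belsky's article):

```python
# Prospect-theory value function: losses are weighted roughly twice as
# heavily as equivalent gains (loss aversion, via lam > 1), with
# diminishing sensitivity in both directions (alpha < 1).

def value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain or loss x relative to a reference point."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

# A $100 loss "hurts" more than twice as much as a $100 gain "helps":
print(round(value(100), 1))   # → 57.5
print(round(value(-100), 1))  # → -129.5
```

The asymmetry between those two numbers is the quantitative face of the behavioral observation in the list above: we react more strongly to a loss than to a gain of the same size.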

Thursday, April 07, 2016

Muscle mass and nerve control enhanced in octogenarian athletes.

Power et al. expand their earlier studies of active runners ~65 years old, finding ~14% greater excitable muscle mass and ~28% more functioning motor units in octogenarian masters athletes than in healthy age-matched controls.
Our group has shown a greater number of functioning motor units (MU) in a cohort of highly-active older (~65 y) masters runners relative to age-matched controls. Owing to the precipitous loss in the number of functioning MUs in the 8th and 9th decades of life, it is unknown whether older world class octogenarian masters athletes (MA) would also have greater numbers of functioning MUs (MUNE) compared with age-matched controls. We measured MU numbers and neuromuscular transmission stability in the tibialis anterior of world champion MAs (~80 y), and compared the values to healthy age-matched controls (~80 y). Decomposition-enhanced spike-triggered averaging was used to collect surface and intramuscular electromyography signals during dorsiflexion at ~25% of maximum voluntary isometric contraction (MVC). Near fibre (NF) MU potential analysis was used to assess neuromuscular transmission stability. For the MAs as compared with age-matched controls: the amount of excitable muscle mass (CMAP) was 14% greater (p < 0.05); there was a trend (p = 0.07) towards a 27% smaller surface detected motor unit potential, representative of less collateral reinnervation; and there were 28% more functioning MUs (p < 0.05). Additionally, the MAs had greater MU neuromuscular stability than the controls, as indicated by lower NF jitter and jiggle values (p < 0.05). These results demonstrate that high performing octogenarians better maintain neuromuscular stability of the MU and mitigate the loss of MUs associated with aging well into the later decades of life, during which time the loss of muscle mass and strength becomes functionally relevant. Future studies need to identify the concomitant roles genetics and exercise play in neuroprotection.

Wednesday, April 06, 2016

Why sad music can make us feel good.

As an update to a previous MindBlog post on why we like sad music, I want to note Ojiaku's brief mention of several articles on this subject.
Sad music might make people feel vicarious unpleasant emotions, found a study published last year in Frontiers in Psychology. But this experience can ultimately be pleasurable because it allows a negative emotion to exist indirectly, and at a safe distance. Instead of feeling the depths of despair, people can feel nostalgia for a time when they were in a similar emotional state: a non-threatening way to remember a sadness.
People who are very empathetic are more likely to take pleasure in the emotional experience of sad music, according to another study in Frontiers in Psychology. Others enjoy sad songs because they help them return to an emotionally balanced state, according to a review in Frontiers in Human Neuroscience, published in 2015. And those more open to varied experiences might enjoy the songs because the unique emotions that come up when listening to the music fulfill their need for novelty in thoughts and feelings.
From the Frontiers in Human Neuroscience abstract:
We offer a framework to account for how listening to sad music can lead to positive feelings, contending that this effect hinges on correcting an ongoing homeostatic imbalance. Sadness evoked by music is found pleasurable: (1) when it is perceived as non-threatening; (2) when it is aesthetically pleasing; and (3) when it produces psychological benefits such as mood regulation, and empathic feelings, caused, for example, by recollection of and reflection on past events.