Tuesday, December 11, 2018

Watching memories change the brain - a challenge to the traditional view

I pass on the Science Magazine summary of Brodt et al., the summary graphic from Assaf's review of their article, and finally the Brodt et al. abstract:
How fast do learning-induced anatomical changes occur in the brain? The traditional view postulates that neocortical memory representations reflect reinstatement processes initiated by the hippocampus and that a genuine physical trace develops only through reactivation over extended periods. Brodt et al. combined functional magnetic resonance imaging (MRI) with diffusion-weighted MRI during an associative declarative learning task to examine experience-dependent structural brain plasticity in human subjects (see the Perspective by Assaf). This plasticity was rapidly induced after learning, persisted for more than 12 hours, drove behavior, and was localized in areas displaying memory-related functional brain activity. These plastic changes in the posterior parietal cortex, and their fast temporal dynamics, challenge traditional views of systems memory consolidation.
Models of systems memory consolidation postulate a fast-learning hippocampal store and a slowly developing, stable neocortical store. Accordingly, early neocortical contributions to memory are deemed to reflect a hippocampus-driven online reinstatement of encoding activity. In contrast, we found that learning rapidly engenders an enduring memory engram in the human posterior parietal cortex. We assessed microstructural plasticity via diffusion-weighted magnetic resonance imaging as well as functional brain activity in an object–location learning task. We detected neocortical plasticity as early as 1 hour after learning and found that it was learning specific, enabled correct recall, and overlapped with memory-related functional activity. These microstructural changes persisted over 12 hours. Our results suggest that new traces can be rapidly encoded into the parietal cortex, challenging views of a slow-learning neocortex.

Monday, December 10, 2018

The coding of perception in language is not universal.

From Majid et al.:
Is there a universal hierarchy of the senses, such that some senses (e.g., vision) are more accessible to consciousness and linguistic description than others (e.g., smell)? The long-standing presumption in Western thought has been that vision and audition are more objective than the other senses, serving as the basis of knowledge and understanding, whereas touch, taste, and smell are crude and of little value. This predicts that humans ought to be better at communicating about sight and hearing than the other senses, and decades of work based on English and related languages certainly suggests this is true. However, how well does this reflect the diversity of languages and communities worldwide? To test whether there is a universal hierarchy of the senses, stimuli from the five basic senses were used to elicit descriptions in 20 diverse languages, including 3 unrelated sign languages. We found that languages differ fundamentally in which sensory domains they linguistically code systematically, and how they do so. The tendency for better coding in some domains can be explained in part by cultural preoccupations. Although languages seem free to elaborate specific sensory domains, some general tendencies emerge: for example, with some exceptions, smell is poorly coded. The surprise is that, despite the gradual phylogenetic accumulation of the senses, and the imbalances in the neural tissue dedicated to them, no single hierarchy of the senses imposes itself upon language.

Friday, December 07, 2018

The neuroscience of hugs.

Packheiser et al. observed more than 2,500 hugs at an international airport: hugs with positive emotions at arrival gates and hugs with negative emotions at departure gates. (Hugging triggers the release of oxytocin, a human pair-bonding hormone.) They also looked at neutral situations, in which people offered blindfolded hugs to strangers in the street. Most people showed a preference for right-sided hugs in all three situations, leading with the right hand and arm (the right hand being the one most people use for skilled activities). Left-sided hugs occurred more frequently in emotional situations, whether positive or negative. The left side of the body is controlled by the right side of the brain, which is heavily involved in processing both positive and negative emotions. Thus, the drift to the left side may reflect an interaction between emotional networks and motor preferences. Their abstract:
Humans are highly social animals that show a wide variety of verbal and non-verbal behaviours to communicate social intent. One of the most frequently used non-verbal social behaviours is embracing, commonly used as an expression of love and affection. However, it can also occur in a large variety of social situations entailing negative (fear or sadness) or neutral emotionality (formal greetings). Embracing is also experienced from birth onwards in mother–infant interactions and is thus accompanying human social interaction across the whole lifespan. Despite the importance of embraces for human social interactions, their underlying neurophysiology is unknown. Here, we demonstrated in a well-powered sample of more than 2500 adults that humans show a significant rightward bias during embracing. Additionally, we showed that this general motor preference is strongly modulated by emotional contexts: the induction of positive or negative affect shifted the rightward bias significantly to the left, indicating a stronger involvement of right-hemispheric neural networks during emotional embraces. In a second laboratory study, we were able to replicate both of these findings and furthermore demonstrated that the motor preferences during embracing correlate with handedness. Our studies therefore not only show that embracing is controlled by an interaction of motor and affective networks, they also demonstrate that emotional factors seem to activate right-hemispheric systems in valence-invariant ways.

Thursday, December 06, 2018

Limited prosocial effects of meditation.

Kreplin et al. report a meta-analysis, and Kreplin writes a more general review of studies on the effects of meditation. The Kreplin et al. abstract:
Many individuals believe that meditation has the capacity to not only alleviate mental-illness but to improve prosociality. This article systematically reviewed and meta-analysed the effects of meditation interventions on prosociality in randomized controlled trials of healthy adults. Five types of social behaviours were identified: compassion, empathy, aggression, connectedness and prejudice. Although we found a moderate increase in prosociality following meditation, further analysis indicated that this effect was qualified by two factors: type of prosociality and methodological quality. Meditation interventions had an effect on compassion and empathy, but not on aggression, connectedness or prejudice. We further found that compassion levels only increased under two conditions: when the teacher in the meditation intervention was a co-author in the published study; and when the study employed a passive (waiting list) control group but not an active one. Contrary to popular beliefs that meditation will lead to prosocial changes, the results of this meta-analysis showed that the effects of meditation on prosociality were qualified by the type of prosociality and methodological quality of the study. We conclude by highlighting a number of biases and theoretical problems that need addressing to improve quality of research in this area.

Wednesday, December 05, 2018

How stress changes our brains' blood flow.

From Elbau et al.:
Ample evidence links dysregulation of the stress response to the risk for psychiatric disorders. However, we lack an integrated understanding of mechanisms that are adaptive during the acute stress response but potentially pathogenic when dysregulated. One mechanistic link emerging from rodent studies is the interaction between stress effectors and neurovascular coupling, a process that adjusts cerebral blood flow according to local metabolic demands. Here, using task-related fMRI, we show that acute psychosocial stress rapidly impacts the peak latency of the hemodynamic response function (HRF-PL) in temporal, insular, and prefrontal regions in two independent cohorts of healthy humans. These latency effects occurred in the absence of amplitude effects and were moderated by regulatory genetic variants of KCNJ2, a known mediator of the effect of stress on vascular responsivity. Further, hippocampal HRF-PL correlated with both cortisol response and genetic variants that influence the transcriptional response to stress hormones and are associated with risk for major depression. We conclude that acute stress modulates hemodynamic response properties as part of the physiological stress response and suggest that HRF indices could serve as endophenotype of stress-related disorders.

Tuesday, December 04, 2018

More on the sociopathy of social media.

Languishing in my queue of potential posts have been two articles that I want to mention and pass on to readers.

Max Fisher writes on the unintended consequences of social media, from Myanmar to Germany:
I first went to Myanmar in early 2014, when the country was opening up, and there was no such thing as personal technology. Not even brick phones.
When I went back in late 2017, I could hardly believe it was the same country. Everybody had his or her nose in a smartphone, often logged in to Facebook. You’d meet with the same sources at the same roadside cafe, but now they’d drop a stack of iPhones on the table next to the tea.
It was like the purest possible experiment in what the same society looks like with or without modern consumer technology. Most people loved it, but it also helped drive genocidal violence against the Rohingya minority, empower military hard-liners and spin up riots.
...we’re starting to understand the risks that come from these platforms working exactly as designed. Facebook, YouTube and others use algorithms to identify and promote content that will keep us engaged, which turns out to amplify some of our worst impulses. (Fisher has also written articles on algorithm-driven violence in Germany and Sri Lanka.)
And, Rich Hardy points to further work linking social media use to feelings of depression and loneliness. Work by Hunt et al. suggests that decreasing one's social media use can lead to significant improvements in personal well-being.

Monday, December 03, 2018

Our brains are prediction machines: Friston's free-energy principle

Further reading on the article noted in the previous post has made me realize that I have been seriously remiss in not paying more attention to a revolution in how we view our brains. From a Karl Friston piece in Nature Neuroscience on predictive coding:
In the 20th century we thought the brain extracted knowledge from sensations. The 21st century witnessed a ‘strange inversion’, in which the brain became an organ of inference, actively constructing explanations for what’s going on ‘out there’, beyond its sensory epithelia.
And, key points from a Friston review, "The free-energy principle: a unified brain theory?":
Adaptive agents must occupy a limited repertoire of states and therefore minimize the long-term average of surprise associated with sensory exchanges with the world. Minimizing surprise enables them to resist a natural tendency to disorder.
Surprise rests on predictions about sensations, which depend on an internal generative model of the world. Although surprise cannot be measured directly, a free-energy bound on surprise can be, suggesting that agents minimize free energy by changing their predictions (perception) or by changing the predicted sensory inputs (action).
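The bound mentioned in this point can be sketched in standard variational notation (my gloss, not a quotation from Friston's review). For sensations s, hidden causes ϑ, model m, and a recognition density q(ϑ):

```latex
% Surprise is the negative log evidence for sensations under the model:
%   -\ln p(s \mid m)
% Free energy equals surprise plus the divergence between the recognition
% density q and the true posterior, and is therefore an upper bound on it:
F = \mathbb{E}_{q(\vartheta)}\!\left[-\ln p(s,\vartheta \mid m)\right] - H[q(\vartheta)]
  = -\ln p(s \mid m) + D_{\mathrm{KL}}\!\left[\, q(\vartheta) \,\|\, p(\vartheta \mid s, m) \,\right]
  \;\ge\; -\ln p(s \mid m).
% Perception minimizes F by adjusting q (improving predictions);
% action minimizes F by changing s (sampling the predicted inputs).
```

Because the divergence term is non-negative, driving F down squeezes it toward zero, so minimizing free energy both bounds surprise and pushes q toward the true posterior, which is the link to the Bayesian brain hypothesis mentioned below.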
Perception optimizes predictions by minimizing free energy with respect to synaptic activity (perceptual inference), efficacy (learning and memory) and gain (attention and salience). This furnishes Bayes-optimal (probabilistic) representations of what caused sensations (providing a link to the Bayesian brain hypothesis).
Bayes-optimal perception is mathematically equivalent to predictive coding and maximizing the mutual information between sensations and the representations of their causes. This is a probabilistic generalization of the principle of efficient coding (the infomax principle) or the minimum-redundancy principle.
Learning under the free-energy principle can be formulated in terms of optimizing the connection strengths in hierarchical models of the sensorium. This rests on associative plasticity to encode causal regularities and appeals to the same synaptic mechanisms as those underlying cell assembly formation.
Action under the free-energy principle reduces to suppressing sensory prediction errors that depend on predicted (expected or desired) movement trajectories. This provides a simple account of motor control, in which action is enslaved by perceptual (proprioceptive) predictions.
Perceptual predictions rest on prior expectations about the trajectory or movement through the agent's state space. These priors can be acquired (as empirical priors during hierarchical inference) or they can be innate (epigenetic) and therefore subject to selective pressure.
Predicted motion or state transitions realized by action correspond to policies in optimal control theory and reinforcement learning. In this context, value is inversely proportional to surprise (and implicitly free energy), and rewards correspond to innate priors that constrain policies.
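The perceptual-inference story in the points above can be made concrete with a toy numerical sketch (my illustration, not code from Friston; all names and parameter values are invented for the example). A single hidden cause v generates a sensation s through a linear model s = θ·v + noise, with a Gaussian prior on v; "perception" is then gradient descent on free energy, which here reduces to a precision-weighted sum of squared prediction errors:

```python
# Minimal predictive-coding sketch: infer a hidden cause v from a
# sensation s under the linear generative model s = theta * v + noise,
# with a Gaussian prior v ~ N(v_prior, sigma_p**2).
def infer(s, theta=2.0, v_prior=1.0, sigma_s=1.0, sigma_p=1.0,
          lr=0.05, steps=200):
    v = v_prior  # start inference at the prior expectation
    for _ in range(steps):
        eps_s = (s - theta * v) / sigma_s**2   # sensory prediction error
        eps_p = (v - v_prior) / sigma_p**2     # prior prediction error
        # Gradient descent on free energy, which for this Gaussian model
        # is (up to constants) the precision-weighted squared errors.
        v += lr * (theta * eps_s - eps_p)
    return v

v_hat = infer(s=5.0)
```

For this linear-Gaussian case the fixed point of the error-minimizing dynamics is exactly the Bayesian posterior mean, (θ·s/σ_s² + v_prior/σ_p²)/(θ²/σ_s² + 1/σ_p²) = 2.2 for the values above, which is the sense in which minimizing prediction error yields Bayes-optimal perception.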