Thursday, December 13, 2018

Stop talking about 'male' and 'female' brains.

As a counterpoint to yesterday's post, which invokes the Extreme Male Brain theory of autism, I pass on some clips from a piece by Joel and Fine that contests this categorization...
Consider, for example, Cambridge University psychologist Simon Baron-Cohen’s influential Empathizing-Systemizing theory of brains and the accompanying “extreme male brain” theory of autism. This presupposes there is a particular “systemizing” brain type that we could meaningfully describe as “the male brain,” that drives ways of thinking, feeling, and behaving that distinguish the typical boy and man from the typical “empathizing” girl and woman.
...one of us, Daphna Joel, led an analysis of four large data sets of brain scans, and found that the sex differences you see overall between men’s and women’s brains aren’t neatly and consistently seen in individual brains. In other words, humans generally don’t have brains with mostly or exclusively “female-typical” features or “male-typical” features. Instead, what’s most common in both females and males are brains with “mosaics” of features, some of them more common in males and some more common in females.
...Joel and colleagues then applied the same kind of analysis to large data sets of psychological variables, to ask: Do sex differences in personality characteristics, attitudes, preferences, and behaviors add up in a consistent way to create two types of humans, each with its own set of psychological features? The answer, again, was no: As for brain structure, the differences created mosaics of feminine and masculine personality traits, attitudes, interests, and behaviors...what was typical of both men and women (70 percent of them, to be exact) was a mosaic of feminine and masculine characteristics.
...if autism is indeed more prevalent in males, this may be associated with a difference between the sexes in the odds that a rare combination of brain characteristics makes an appearance, rather than with the typical male brain being a little more “autistic” than the typical female brain. Indeed, a recent study found that males with autism spectrum disorder had an atypical combination of “female-like” and “male-like” brain activity patterns.
The key point here is that although there are sex differences in brain and behavior, when you move away from group-level differences in single features and focus at the level of the individual brain or person, you find that the differences, regardless of their origins, usually “mix up” rather than “add up.” (The reason for this mixing-up of characteristics is that the genetic and hormonal effects of sex on brain and behavior depend on, and interact with, many other factors.) This yields many types of brain and behavior, which neither fall into a “male” and a “female” type, nor line up tidily along a male-female continuum.
The claim that science tells us that the possibility of greater merging of gender roles is unlikely because of “natural” differences between the sexes, focuses on average sex differences in the population — often in combination with the implicit assumption that whatever we think men are “more” of, is what is most valuable for male-dominated roles. (Why else would organizations offer confidence workshops for women, rather than modesty training for men?) But the world is inhabited by individuals whose unique mosaics of characteristics can’t be predicted on the basis of their sex. So let’s keep working on overcoming gender stereotypes, bias, discrimination, and structural barriers before concluding that sex, despite being a poor guide to our brains and psychological characteristics, is a strong determinant of social structure.
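For readers who want the "mix up rather than add up" idea made concrete, here is a toy sketch of the mosaic classification. It is my reconstruction of the general logic, not Joel and colleagues' actual pipeline: each feature's pooled distribution defines a "female-end" and a "male-end" zone, and a person counts as a mosaic when at least one feature falls at each end.

```python
# Toy sketch of a Joel-style "mosaic" analysis (illustrative, not the
# published code). For each feature, scores at or below `lo` fall in
# one sex-typical zone and scores at or above `hi` in the other; a
# person with features at both ends is a "mosaic".

def zone(score, lo, hi):
    """Return 'F', 'M', or None (intermediate) for one feature score."""
    if score <= lo:
        return "F"          # "female-end" zone (convention for this sketch)
    if score >= hi:
        return "M"          # "male-end" zone
    return None

def classify_person(scores, cutoffs):
    """scores: one value per feature; cutoffs: one (lo, hi) pair per feature."""
    ends = {zone(s, lo, hi) for s, (lo, hi) in zip(scores, cutoffs)}
    ends.discard(None)
    if ends == {"F", "M"}:
        return "mosaic"          # features at both extreme ends
    if len(ends) == 1:
        return "consistent"      # all extreme features at one end
    return "intermediate"        # no feature in either extreme zone
```

For example, a person scoring (1, 9, 5) on three features that all share the cutoff pair (3, 7) has one feature at each end and classifies as a mosaic.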

Wednesday, December 12, 2018

Testing theories of sex differences and autism with big data.

From Greenberg et al:

Significance
In the largest study to date of autistic traits, we test 10 predictions from the Empathizing–Systemizing (E-S) theory of sex differences and the Extreme Male Brain (EMB) theory of autism. We confirmed that typical females on average are more empathic, typical males on average are more systems-oriented, and autistic people on average show a “masculinized” profile. The strengths of the study are the inclusion of a replication sample and the use of big data. These two theories can be considered to have strong support. We demonstrate that D-scores (difference between E and S) account for 19 times more of the variance in autistic traits than do other demographic variables, including sex, underscoring the importance of brain types in autism.
Abstract
The Empathizing–Systemizing (E-S) theory of typical sex differences suggests that individuals may be classified based on empathy and systemizing. An extension of the E-S theory, the Extreme Male Brain (EMB) theory suggests that autistic people on average have a shift towards a more masculinized brain along the E-S dimensions. Both theories have been investigated in small sample sizes, limiting their generalizability. Here we leverage two large datasets (discovery n = 671,606, including 36,648 autistic individuals primarily; and validation n = 14,354, including 226 autistic individuals) to investigate 10 predictions of the E-S and the EMB theories. In the discovery dataset, typical females on average showed higher scores on short forms of the Empathy Quotient (EQ) and Sensory Perception Quotient (SPQ), and typical males on average showed higher scores on short forms of the Autism Spectrum Quotient (AQ) and Systemizing Quotient (SQ). Typical sex differences in these measures were attenuated in autistic individuals. Analysis of “brain types” revealed that typical females on average were more likely to be Type E (EQ > SQ) or Extreme Type E and that typical males on average were more likely to be Type S (SQ > EQ) or Extreme Type S. In both datasets, autistic individuals, regardless of their reported sex, on average were “masculinized.” Finally, we demonstrate that D-scores (difference between EQ and SQ) account for 19 times more of the variance in autistic traits (43%) than do other demographic variables including sex. Our results provide robust evidence in support of both the E-S and EMB theories.
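As a footnote for technically inclined readers, the "brain types" logic is simple enough to sketch in a few lines of code. This is my own illustrative reconstruction, not the authors' analysis script: D is taken as the standardized difference between systemizing and empathizing scores, and percentile cutoffs (the fractions below are placeholders I chose, not the published ones) carve the population into the five types.

```python
import statistics

def classify_brain_types(eq_scores, sq_scores,
                         cuts=(0.025, 0.35, 0.65, 0.975)):
    """Classify individuals into five E-S 'brain types' from EQ/SQ scores.

    D is the standardized difference between systemizing and empathizing;
    percentile cutoffs split the population into Extreme Type E, Type E,
    Type B, Type S, and Extreme Type S. The cutoff fractions are
    placeholders for this sketch, not the published values.
    """
    def standardize(xs):
        mu, sd = statistics.mean(xs), statistics.pstdev(xs)
        return [(x - mu) / sd for x in xs]

    e, s = standardize(eq_scores), standardize(sq_scores)
    d = [(si - ei) / 2 for ei, si in zip(e, s)]  # one D-score per person

    ranked = sorted(d)
    n = len(ranked)
    bounds = [ranked[min(int(c * n), n - 1)] for c in cuts]
    labels = ["Extreme Type E", "Type E", "Type B", "Type S", "Extreme Type S"]

    def label(di):
        for b, name in zip(bounds, labels):
            if di < b:
                return name
        return labels[-1]          # above the top cutoff

    return [label(di) for di in d]
```

On this scheme, someone with a low EQ and high SQ lands at the Type S end, the reverse pattern lands at the Type E end, and balanced scores yield Type B.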

Tuesday, December 11, 2018

Watching memories change the brain - a challenge to the traditional view.

I pass on the Science Magazine summary of Brodt et al., as well as the summary graphic from Assaf's review of their article, and finally the Brodt et al. abstract:
How fast do learning-induced anatomical changes occur in the brain? The traditional view postulates that neocortical memory representations reflect reinstatement processes initiated by the hippocampus and that a genuine physical trace develops only through reactivation over extended periods. Brodt et al. combined functional magnetic resonance imaging (MRI) with diffusion-weighted MRI during an associative declarative learning task to examine experience-dependent structural brain plasticity in human subjects (see the Perspective by Assaf). This plasticity was rapidly induced after learning, persisted for more than 12 hours, drove behavior, and was localized in areas displaying memory-related functional brain activity. These plastic changes in the posterior parietal cortex, and their fast temporal dynamics, challenge traditional views of systems memory consolidation.
Models of systems memory consolidation postulate a fast-learning hippocampal store and a slowly developing, stable neocortical store. Accordingly, early neocortical contributions to memory are deemed to reflect a hippocampus-driven online reinstatement of encoding activity. In contrast, we found that learning rapidly engenders an enduring memory engram in the human posterior parietal cortex. We assessed microstructural plasticity via diffusion-weighted magnetic resonance imaging as well as functional brain activity in an object–location learning task. We detected neocortical plasticity as early as 1 hour after learning and found that it was learning specific, enabled correct recall, and overlapped with memory-related functional activity. These microstructural changes persisted over 12 hours. Our results suggest that new traces can be rapidly encoded into the parietal cortex, challenging views of a slow-learning neocortex.


Monday, December 10, 2018

The coding of perception in language is not universal.

From Majid et al.:
Is there a universal hierarchy of the senses, such that some senses (e.g., vision) are more accessible to consciousness and linguistic description than others (e.g., smell)? The long-standing presumption in Western thought has been that vision and audition are more objective than the other senses, serving as the basis of knowledge and understanding, whereas touch, taste, and smell are crude and of little value. This predicts that humans ought to be better at communicating about sight and hearing than the other senses, and decades of work based on English and related languages certainly suggests this is true. However, how well does this reflect the diversity of languages and communities worldwide? To test whether there is a universal hierarchy of the senses, stimuli from the five basic senses were used to elicit descriptions in 20 diverse languages, including 3 unrelated sign languages. We found that languages differ fundamentally in which sensory domains they linguistically code systematically, and how they do so. The tendency for better coding in some domains can be explained in part by cultural preoccupations. Although languages seem free to elaborate specific sensory domains, some general tendencies emerge: for example, with some exceptions, smell is poorly coded. The surprise is that, despite the gradual phylogenetic accumulation of the senses, and the imbalances in the neural tissue dedicated to them, no single hierarchy of the senses imposes itself upon language.

Friday, December 07, 2018

The neuroscience of hugs.

Packheiser et al. observed more than 2,500 hugs at an international airport: hugs with positive emotions at arrival gates and hugs with negative emotions at departure gates. (Hugging causes the release of oxytocin, the human pair-bonding hormone.) They also looked at neutral hugs from people who offered blindfolded hugs to strangers in the street. Most people showed a preference for right-sided hugs in all three situations, leading with the right hand and arm (the hand most people use for skilled activities). Left-sided hugs occurred more frequently in emotional situations, no matter whether they were positive or negative. The left side of the body is controlled by the right side of the brain, which is heavily involved in processing both positive and negative emotions. Thus, this drift to the left side may reflect an interaction between emotional networks and motor preferences. Their abstract:
Humans are highly social animals that show a wide variety of verbal and non-verbal behaviours to communicate social intent. One of the most frequently used non-verbal social behaviours is embracing, commonly used as an expression of love and affection. However, it can also occur in a large variety of social situations entailing negative (fear or sadness) or neutral emotionality (formal greetings). Embracing is also experienced from birth onwards in mother–infant interactions and is thus accompanying human social interaction across the whole lifespan. Despite the importance of embraces for human social interactions, their underlying neurophysiology is unknown. Here, we demonstrated in a well-powered sample of more than 2500 adults that humans show a significant rightward bias during embracing. Additionally, we showed that this general motor preference is strongly modulated by emotional contexts: the induction of positive or negative affect shifted the rightward bias significantly to the left, indicating a stronger involvement of right-hemispheric neural networks during emotional embraces. In a second laboratory study, we were able to replicate both of these findings and furthermore demonstrated that the motor preferences during embracing correlate with handedness. Our studies therefore not only show that embracing is controlled by an interaction of motor and affective networks, they also demonstrate that emotional factors seem to activate right-hemispheric systems in valence-invariant ways.
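For the curious, a side bias like this is tested against the null hypothesis of no preference: under the null, right-sided hugs should occur about half the time. A minimal sketch (the counts in the example below are hypothetical, not the study's data):

```python
import math

def rightward_bias_z(n_right, n_left):
    """Normal-approximation test of H0: p(right) = 0.5.

    Returns (proportion_right, z). With thousands of observations the
    normal approximation to the binomial is very accurate; |z| > 1.96
    rejects the null at the two-sided 5% level.
    """
    n = n_right + n_left
    p_hat = n_right / n
    z = (p_hat - 0.5) / math.sqrt(0.25 / n)  # SE of p_hat under H0
    return p_hat, z
```

With hypothetical counts of 1,500 right-sided and 1,000 left-sided hugs, the proportion is 0.6 and z = 10, far beyond any conventional significance threshold.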

Thursday, December 06, 2018

Limited prosocial effects of meditation.

Kreplin et al. do a meta-analysis, and Kreplin writes a more general review of studies on the effects of meditation. The Kreplin et al. abstract:
Many individuals believe that meditation has the capacity not only to alleviate mental illness but to improve prosociality. This article systematically reviewed and meta-analysed the effects of meditation interventions on prosociality in randomized controlled trials of healthy adults. Five types of social behaviours were identified: compassion, empathy, aggression, connectedness and prejudice. Although we found a moderate increase in prosociality following meditation, further analysis indicated that this effect was qualified by two factors: type of prosociality and methodological quality. Meditation interventions had an effect on compassion and empathy, but not on aggression, connectedness or prejudice. We further found that compassion levels only increased under two conditions: when the teacher in the meditation intervention was a co-author in the published study; and when the study employed a passive (waiting list) control group but not an active one. Contrary to popular beliefs that meditation will lead to prosocial changes, the results of this meta-analysis showed that the effects of meditation on prosociality were qualified by the type of prosociality and methodological quality of the study. We conclude by highlighting a number of biases and theoretical problems that need addressing to improve quality of research in this area.

Wednesday, December 05, 2018

How stress changes our brains' blood flow.

From Elbau et al.:
Ample evidence links dysregulation of the stress response to the risk for psychiatric disorders. However, we lack an integrated understanding of mechanisms that are adaptive during the acute stress response but potentially pathogenic when dysregulated. One mechanistic link emerging from rodent studies is the interaction between stress effectors and neurovascular coupling, a process that adjusts cerebral blood flow according to local metabolic demands. Here, using task-related fMRI, we show that acute psychosocial stress rapidly impacts the peak latency of the hemodynamic response function (HRF-PL) in temporal, insular, and prefrontal regions in two independent cohorts of healthy humans. These latency effects occurred in the absence of amplitude effects and were moderated by regulatory genetic variants of KCNJ2, a known mediator of the effect of stress on vascular responsivity. Further, hippocampal HRF-PL correlated with both cortisol response and genetic variants that influence the transcriptional response to stress hormones and are associated with risk for major depression. We conclude that acute stress modulates hemodynamic response properties as part of the physiological stress response and suggest that HRF indices could serve as endophenotype of stress-related disorders.
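To make the "peak latency of the hemodynamic response function" (HRF-PL) concrete: the HRF is often modeled as a double-gamma curve, and its peak latency is simply the time at which that curve is largest. The sketch below uses the common default double-gamma parameters as an illustration; it is not the model Elbau et al. fit.

```python
import math

def canonical_hrf(t, p1=6.0, p2=16.0, ratio=6.0):
    """Double-gamma HRF: a positive gamma bump minus a scaled, later
    gamma undershoot. The parameters are the commonly used defaults,
    chosen for illustration, not values from Elbau et al."""
    if t <= 0:
        return 0.0
    def g(x, a):  # gamma density with shape a, unit scale
        return x ** (a - 1) * math.exp(-x) / math.gamma(a)
    return g(t, p1) - g(t, p2) / ratio

def peak_latency(hrf, t_max=30.0, dt=0.1):
    """Peak latency: the time (in seconds) at which the response peaks."""
    ts = [i * dt for i in range(int(t_max / dt) + 1)]
    return max(ts, key=hrf)
```

With these defaults the curve peaks around 5 s after stimulus onset; a stress-related latency shift of the kind the study reports would show up as a change in the value `peak_latency` returns.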

Tuesday, December 04, 2018

More on the sociopathy of social media.

Languishing in my queue of potential posts have been two articles that I want to mention and pass on to readers.

Max Fisher writes on the unintended consequences of social media, from Myanmar to Germany:
I first went to Myanmar in early 2014, when the country was opening up, and there was no such thing as personal technology. Not even brick phones.
When I went back in late 2017, I could hardly believe it was the same country. Everybody had his or her nose in a smartphone, often logged in to Facebook. You’d meet with the same sources at the same roadside cafe, but now they’d drop a stack of iPhones on the table next to the tea.
It was like the purest possible experiment in what the same society looks like with or without modern consumer technology. Most people loved it, but it also helped drive genocidal violence against the Rohingya minority, empower military hard-liners and spin up riots.
...we’re starting to understand the risks that come from these platforms working exactly as designed. Facebook, YouTube and others use algorithms to identify and promote content that will keep us engaged, which turns out to amplify some of our worst impulses. (Fisher has also written articles on algorithm-driven violence in Germany and Sri Lanka.)
And, Rich Hardy points to further work linking social media use and feelings of depression and loneliness. Work of Hunt et al. suggests that decreasing one's social media use can lead to significant improvements in personal well-being.

Monday, December 03, 2018

Our brains are prediction machines. Friston's free-energy principle.

Further reading on the article noted in the previous post has made me realize that I have been seriously remiss in not paying more attention to a revolution in how we view our brains. From a Karl Friston piece in Nature Neuroscience on predictive coding:
In the 20th century we thought the brain extracted knowledge from sensations. The 21st century witnessed a ‘strange inversion’, in which the brain became an organ of inference, actively constructing explanations for what’s going on ‘out there’, beyond its sensory epithelia.
And, key points from a Friston review, "The free-energy principle: a unified brain theory?":
Adaptive agents must occupy a limited repertoire of states and therefore minimize the long-term average of surprise associated with sensory exchanges with the world. Minimizing surprise enables them to resist a natural tendency to disorder.
Surprise rests on predictions about sensations, which depend on an internal generative model of the world. Although surprise cannot be measured directly, a free-energy bound on surprise can be, suggesting that agents minimize free energy by changing their predictions (perception) or by changing the predicted sensory inputs (action).
Perception optimizes predictions by minimizing free energy with respect to synaptic activity (perceptual inference), efficacy (learning and memory) and gain (attention and salience). This furnishes Bayes-optimal (probabilistic) representations of what caused sensations (providing a link to the Bayesian brain hypothesis).
Bayes-optimal perception is mathematically equivalent to predictive coding and maximizing the mutual information between sensations and the representations of their causes. This is a probabilistic generalization of the principle of efficient coding (the infomax principle) or the minimum-redundancy principle.
Learning under the free-energy principle can be formulated in terms of optimizing the connection strengths in hierarchical models of the sensorium. This rests on associative plasticity to encode causal regularities and appeals to the same synaptic mechanisms as those underlying cell assembly formation.
Action under the free-energy principle reduces to suppressing sensory prediction errors that depend on predicted (expected or desired) movement trajectories. This provides a simple account of motor control, in which action is enslaved by perceptual (proprioceptive) predictions.
Perceptual predictions rest on prior expectations about the trajectory or movement through the agent's state space. These priors can be acquired (as empirical priors during hierarchical inference) or they can be innate (epigenetic) and therefore subject to selective pressure.
Predicted motion or state transitions realized by action correspond to policies in optimal control theory and reinforcement learning. In this context, value is inversely proportional to surprise (and implicitly free energy), and rewards correspond to innate priors that constrain policies.
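The "free-energy bound on surprise" in the points above can be written out explicitly. With $s$ the sensory data, $\theta$ their hidden causes, and $q(\theta)$ the agent's internal (recognition) density, the variational free energy $F$ bounds the surprise $-\ln p(s)$:

```latex
\begin{align*}
F &= \mathbb{E}_{q(\theta)}\big[\ln q(\theta) - \ln p(s,\theta)\big] \\
  &= \mathbb{E}_{q(\theta)}\big[\ln q(\theta) - \ln p(\theta \mid s)\big] - \ln p(s) \\
  &= D_{\mathrm{KL}}\big[q(\theta)\,\big\|\,p(\theta \mid s)\big] - \ln p(s)
  \;\ge\; -\ln p(s),
\end{align*}
```

since the KL divergence is never negative. Perception minimizes $F$ by improving $q$ (tightening the bound on surprise); action minimizes $F$ by changing $s$ itself, sampling sensations that are less surprising under the agent's model.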

Friday, November 30, 2018

Being a Beast Machine: The Somatic Basis of Selfhood

Seth and Tsakiris offer a review in Trends in Cognitive Sciences, with the title of this post, that immediately caught my eye. I'm working on a lecture now that incorporates some of its themes. Here I pass on the abstract, motivated readers can obtain a copy of the full article from me.

Highlights
We conceptualise experiences of embodied selfhood in terms of control-oriented predictive regulation (allostasis) of physiological states.
We account for distinctive phenomenological aspects of embodied selfhood, including its (partly) non-object-like nature and its subjective stability over time.
We explain predictive perception as a generalisation from a fundamental biological imperative to maintain physiological integrity: to stay alive.
We bring together several cognitive science traditions, including predictive processing, perceptual control theory, cybernetics, the free energy principle, and sensorimotor contingency theory.
We show how perception of the world around us, and of ourselves within it, happens with, through, and because of our living bodies.
We draw implications for developmental psychology and identify open questions in psychiatry and artificial intelligence.
Abstract
Modern psychology has long focused on the body as the basis of the self. Recently, predictive processing accounts of interoception (perception of the body ‘from within’) have become influential in accounting for experiences of body ownership and emotion. Here, we describe embodied selfhood in terms of ‘instrumental interoceptive inference’ that emphasises allostatic regulation and physiological integrity. We apply this approach to the distinctive phenomenology of embodied selfhood, accounting for its non-object-like character and subjective stability over time. Our perspective has implications for the development of selfhood and illuminates longstanding debates about relations between life and mind, implying, contrary to Descartes, that experiences of embodied selfhood arise because of, and not in spite of, our nature as ‘beast machines’.

Thursday, November 29, 2018

A molecular basis for the placebo effect.

Several popular articles point to work I wish I had been more aware of. Gary Greenberg in the NYTimes and Cari Romm in The Atlantic describe work of Kathryn Hall and collaborators showing that placebo responses are strongest in patients with a variant of a gene (COMT, which regulates the amount of dopamine in the brain) that causes higher levels of dopamine, which is linked both to pain and to the good feelings that come with reward. Irritable bowel syndrome patients with the high-dopamine version of the gene were more likely to report that the placebo treatment had relieved their symptoms, an effect that was even stronger in the group that had received their treatment from a caring provider. Variations in the COMT gene locus are unlikely to fully account for a complex behavior like the placebo response, but they contribute to the puzzle. Here are the highlights and abstract from the Hall et al. paper:
• Predisposition to respond to placebo treatment may be in part a stable heritable trait. 
• Candidate placebo response pathways may interact with drugs to modify outcomes in both the placebo and drug treatment arms of clinical trials. 
• Genomic analyses of randomized placebo and no-treatment controlled trials are needed to fully realize the potential of the placebome.
Placebos are indispensable controls in randomized clinical trials (RCTs), and placebo responses significantly contribute to routine clinical outcomes. Recent neurophysiological studies reveal neurotransmitter pathways that mediate placebo effects. Evidence that genetic variations in these pathways can modify placebo effects raises the possibility of using genetic screening to identify placebo responders and thereby increase RCT efficacy and improve therapeutic care. Furthermore, the possibility of interaction between placebo and drug molecular pathways warrants consideration in RCT design. The study of genomic effects on placebo response, ‘the placebome’, is in its infancy. Here, we review evidence from placebo studies and RCTs to identify putative genes in the placebome, examine evidence for placebo–drug interactions, and discuss implications for RCTs and clinical care.

Wednesday, November 28, 2018

Factoids about an ideal gas.

I pass on this neat slide from a lecture by physics professor Clint Sprott ("Ergodicity in Chaotic Oscillators") given to the Nov. 20 session of the Chaos and Complex Systems Seminar at Univ. of Wisc. Madison.


Tuesday, November 27, 2018

Impacts of outdoor artificial light on plant and animal species.

In a perspective article, Gaston describes how the nighttime lighting up of our planet is profoundly disturbing the activities of many animal and plant species. I pass on three paragraphs:
Artificial light at night can usefully be thought of as having two linked components. The first component—direct emissions from outdoor lighting sources, which include streetlights, building and infrastructure lighting, and road vehicle headlamps—is spatially extremely heterogeneous. Ground-level illuminance in the immediate vicinity can vary from less than 10 lux (lx) to more than 100 lx (for context, a full moon on a clear night has an illuminance of up to 0.1 lx). It often declines rapidly over distances of a few meters. However, emissions from unshielded lights can, when unobstructed, carry horizontally over many kilometers, making artificial light at night both an urban and a rural issue.
The second component of artificial light at night is skyglow, the brightening of the nighttime sky caused mainly by upwardly emitted and reflected artificial light that is scattered in the atmosphere by water, dust, and gas molecules. Although absolute illuminance levels are at most about 0.2 to 0.5 lx, much lower than those from direct emissions, these are often sufficiently high to obscure the Milky Way, which is used for orientation by some organisms. In many urban areas, skyglow even obscures lunar light cycles, which are used by many organisms as cues for biological activity.

In the laboratory, organismal responses, such as suppression of melatonin levels and changes to behavioral activity patterns, generally increase with greater intensities of artificial light at night. It is challenging to establish the form of such functional relationships in the field, but experiments and observations have shown that commonplace levels of artificial light at night influence a wide range of biological phenomena across a wide diversity of taxa, including individual physiology and behavior, species abundances and distributions, community structure and dynamics, and ecosystem function and process. Exposure to even dim nighttime lighting (below 1 lx) can drastically change activity patterns of both naturally day-active and night-active species. These effects can be exacerbated by trophic interactions, such that the abundances of species whose activity is not directly altered may nonetheless be severely affected under low levels of nighttime lighting.

Monday, November 26, 2018

Dietary fat: From foe to friend?

The title of the post is the title of one of the articles in a special section of the Nov. 16 issue of Science devoted to Diet and Health. I want to pass on the abstract of this article, as well as the list of points of consensus that emerge from many different studies cited in the article. It emphasizes the importance of which particular fat or carbohydrate sources are consumed:

Abstract
For decades, dietary advice was based on the premise that high intakes of fat cause obesity, diabetes, heart disease, and possibly cancer. Recently, evidence for the adverse metabolic effects of processed carbohydrate has led to a resurgence in interest in lower-carbohydrate and ketogenic diets with high fat content. However, some argue that the relative quantity of dietary fat and carbohydrate has little relevance to health and that focus should instead be placed on which particular fat or carbohydrate sources are consumed. This review, by nutrition scientists with widely varying perspectives, summarizes existing evidence to identify areas of broad consensus amid ongoing controversy regarding macronutrients and chronic disease.



Points of consensus.
1. With a focus on nutrient quality, good health and low chronic disease risk can be achieved for many people on diets with a broad range of carbohydrate-to-fat ratios. 
2. Replacement of saturated fat with naturally occurring unsaturated fats provides health benefits for the general population. Industrially produced trans fats are harmful and should be eliminated. The metabolism of saturated fat may differ on carbohydrate-restricted diets, an issue that requires study. 
3. Replacement of highly processed carbohydrates (including refined grains, potato products, and free sugars) with unprocessed carbohydrates (nonstarchy vegetables, whole fruits, legumes, and whole or minimally processed grains) provides health benefits. 
4. Biological factors appear to influence responses to diets of differing macronutrient composition. People with relatively normal insulin sensitivity and β cell function may do well on diets with a wide range of carbohydrate-to-fat ratios; those with insulin resistance, hypersecretion of insulin, or glucose intolerance may benefit from a lower-carbohydrate, higher-fat diet. 
5. A ketogenic diet may confer particular metabolic benefits for some people with abnormal carbohydrate metabolism, a possibility that requires long-term study. 
6. Well-formulated low-carbohydrate, high-fat diets do not require high intakes of protein or animal products. Reduced carbohydrate consumption can be achieved by substituting grains, starchy vegetables, and sugars with nonhydrogenated plant oils, nuts, seeds, avocado, and other high-fat plant foods. 
7. There is broad agreement regarding the fundamental components of a healthful diet that can serve to inform policy, clinical management, and individual dietary choice. Nonetheless, important questions relevant to the epidemics of diet-related chronic disease remain. Greater investment in nutrition research should assume a high priority.

Friday, November 23, 2018

Social learning circuits in the brain.

Allsop et al. at MIT observe brain circuits that let an animal learn from the experience of others:


Highlights
•Neurons in cortex and amygdala respond to cues that predict shock to another mouse 
•Cortex → amygdala neurons preferentially represent socially derived information 
•Cortical input to amygdala instructs encoding of observationally learned cues 
•Corticoamygdala inhibition impairs observational learning and social interaction 
Summary
Observational learning is a powerful survival tool allowing individuals to learn about threat-predictive stimuli without directly experiencing the pairing of the predictive cue and punishment. This ability has been linked to the anterior cingulate cortex (ACC) and the basolateral amygdala (BLA). To investigate how information is encoded and transmitted through this circuit, we performed electrophysiological recordings in mice observing a demonstrator mouse undergo associative fear conditioning and found that BLA-projecting ACC (ACC→BLA) neurons preferentially encode socially derived aversive cue information. Inhibition of ACC→BLA alters real-time amygdala representation of the aversive cue during observational conditioning. Selective inhibition of the ACC→BLA projection impaired acquisition, but not expression, of observational fear conditioning. We show that information derived from observation about the aversive value of the cue is transmitted from the ACC to the BLA and that this routing of information is critically instructive for observational fear conditioning.

Thursday, November 22, 2018

Conversation with your angry uncle over Thanksgiving - a chat bot.

I have to pass on this gem from this morning's NYTimes. Karin Tamerius, founder of "Smart Politics," offers a chat bot to help train you for a conversation with a relative from a political tribe different from yours. The trick is not to engage their defense mechanisms, but to remain empathetic and interactive, sharing your own experience. The summary points:
1. Ask open-ended, genuinely curious, nonjudgmental questions. 
2. Listen to what people you disagree with say and deepen your understanding with follow-up inquiries. 
3. Reflect back their perspective by summarizing their answers and noting underlying emotions. 
4. Agree before disagreeing by naming ways in which you agree with their point of view.   
5. Share your perspective by telling a story about a personal experience. 
At the heart of the method is a simple idea: People cannot communicate effectively about politics when they feel threatened. Direct attacks – whether in the form of logical argument, evidence, or name-calling – trigger the sympathetic nervous system, limiting our capacity for reason, empathy, and self-reflection. To have productive conversations, we first need to make people feel safe. 
Most political conversations founder because challenges to our beliefs trigger our sympathetic nervous system. The goal is ensuring people feel safe enough during political dialogues to avoid this. That way the rational part of their brains stays in control and they’re better able to hear, absorb and adapt to new information. 
While it’s a powerful approach, it isn’t easy. It takes patience, tolerance and conscious engagement to get through all five steps. The method puts the burden for keeping the conversation calm on you: Not only must you not trigger the other person, but you must not get triggered yourself. 
Given the challenge, it’s tempting to avoid political discussions in mixed company altogether. Why risk provoking your angry uncle when you can chat about pumpkin pie instead? The answer is that when we choose avoidance over engagement, we are sacrificing a critical opportunity and responsibility to facilitate social and political change. 
Throughout American history, important strides were made because people dared to share their political views with relatives. The civil rights movement, the women’s movement, the antiwar movement, the gay rights movement, the struggle for marriage equality – all gained acceptance through difficult conversations among family members who initially disagreed vehemently with one another. 
To improve political discourse, remember your goal isn’t to score points, vent or put people in their place; it’s to make a difference. And that means sharing your message in a way that people who disagree with you – including your angry uncle – can hear.
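Tamerius's five-step method is essentially a small state machine: each turn of the conversation advances to the next step, then cycles back. As a purely illustrative sketch (the prompts and class names below are my own, not taken from the actual NYTimes bot), the loop might look like:

```python
# Hypothetical sketch of a five-step conversation coach, modeled on the
# method summarized above. Step wording is illustrative, not from the bot.

STEPS = [
    ("ask",     "Ask an open-ended, genuinely curious, nonjudgmental question."),
    ("listen",  "Listen, then deepen your understanding with a follow-up inquiry."),
    ("reflect", "Summarize their answer and name the underlying emotions."),
    ("agree",   "Name ways in which you agree with their point of view."),
    ("share",   "Share your perspective by telling a personal story."),
]

class ConversationCoach:
    """Walks a user through the five steps, one conversational turn at a time."""

    def __init__(self):
        self.turn = 0

    def next_prompt(self):
        # Cycle through the five steps, one per turn.
        index = self.turn % len(STEPS)
        name, advice = STEPS[index]
        self.turn += 1
        return f"Step {index + 1} ({name}): {advice}"

coach = ConversationCoach()
for _ in range(len(STEPS)):
    print(coach.next_prompt())
```

A real training bot would of course also need to interpret the user's replies and decide when a step has actually been completed; the sketch only captures the ordered, cyclical structure of the method.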

Top-down and Bottom-up causation in the emergence of complexity.

I want to pass on just the first section of a commentary by George F.R. Ellis on a paper by Aharonov et al., whose evidence and analysis support a top–down structure in quantum mechanics according to which higher-order correlations can always determine lower-order ones, but not vice versa. Ellis puts this in the context of top-down versus bottom-up causation in the emergence of complexity at higher levels of organization.
The nature of emergence of complexity out of the underlying physics is a key issue in understanding the world around us.  Genuine emergence can be claimed to depend on top-down causation, which enables higher emergent levels to direct the outcomes of causation at lower levels to fulfill higher-level causal requirements; for example, the needs of heart physiology at the systems level determine gene expression at the cellular level via gene regulatory networks (see Figure). However, the idea of top-down causation has been denied by a number of commentators. The paper by Aharonov et al. makes a strong contribution to this debate by giving quantum physics examples where top-down causation manifestly occurs. This physics result has strong implications for the philosophical debate about whether strong emergence is possible. Indeed, it gives specific examples where it occurs in a remarkably strong form.
Now, the word “causation” is regarded with suspicion by many philosophers of science, so to characterize what is happening one can perhaps rather use a number of different descriptions such as “whole–part constraint” or “top-down realization.” The key point remains the same, that higher levels can influence lower-level outcomes in many ways, and hence explain how strong emergence is possible. This occurs across science in general, and in physics in particular. The latter point is key because of the alleged causal completeness of physics, which supposition underlies supervenience arguments against strong emergence and the supposed possibility of overdetermination of lower-level outcomes. However, if top-down action occurs in physics in general, and in quantum physics — the bottom level of the hierarchy of emergence (see Figure) — in particular, such claims are undermined.

Wednesday, November 21, 2018

REM sleep in naps and memory consolidation in typical and Down syndrome children.

From Spano et al.:
Sleep is recognized as a physiological state associated with learning, with studies showing that knowledge acquisition improves with naps. Little work has examined sleep-dependent learning in people with developmental disorders, for whom sleep quality is often impaired. We examined the effect of natural, in-home naps on word learning in typical young children and children with Down syndrome (DS). Despite similar immediate memory retention, naps benefitted memory performance in typical children but hindered performance in children with DS, who retained less when tested after a nap, but were more accurate after a wake interval. These effects of napping persisted 24 h later in both groups, even after an intervening overnight period of sleep. During naps in typical children, memory retention for object-label associations correlated positively with percent of time in rapid eye movement (REM) sleep. However, in children with DS, a population with reduced REM, learning was impaired, but only after the nap. This finding shows that a nap can increase memory loss in a subpopulation, highlighting that naps are not universally beneficial. Further, in healthy preschoolers’ naps, processes in REM sleep may benefit learning.
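The analysis described in the abstract (memory retention correlating positively with percent of nap time spent in REM) is a standard Pearson correlation. A minimal sketch of that computation, using synthetic numbers purely for illustration and not the study's actual data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Synthetic illustration only -- not data from Spano et al.:
percent_rem = [5, 10, 15, 20, 25]        # percent of nap spent in REM
retention   = [0.5, 0.6, 0.7, 0.8, 0.9]  # fraction of word-object pairs retained

print(round(pearson_r(percent_rem, retention), 3))  # 1.0 for this perfectly linear toy data
```

A positive r, as in the toy data above, is the pattern the study reports for typical children; the actual study values would of course be noisier and smaller in magnitude.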

Tuesday, November 20, 2018

The ecstasy of speed - or leisure?

The Google Blogger platform used by Deric's MindBlog emails me comments on posts to approve (or delete, or mark as spam). The almost daily comments are usually platitudes unrelated to a post that contain links to a commercial site. Sometimes serendipity strikes as I read the post before rejecting the comment, and I find it so relevant to the present that I think it worth repeating. Here is such a post from September 13, 2016:

Because I so frequently feel overwhelmed by input streams of chunks of information, I wonder how readers of this blog manage to find time to attend to its contents. (I am gratified that so many seem to do so.) Thoughts like this made me pause over Maria Popova's recent essay on our anxiety about time. I want to pass on a few clips, and recommend that you read all of it. She quotes extensively from James Gleick's book published in 2000, "Faster: The Acceleration of Just About Everything," and begins by noting a 1918 Bertrand Russell quote: “both in thought and in feeling, even though time be real, to realise the unimportance of time is the gate of wisdom.”
Half a century after German philosopher Josef Pieper argued that leisure is the basis of culture and the root of human dignity, Gleick writes:
We are in a rush. We are making haste. A compression of time characterizes the life of the century....We have a word for free time: leisure. Leisure is time off the books, off the job, off the clock. If we save time, we commonly believe we are saving it for our leisure. We know that leisure is really a state of mind, but no dictionary can define it without reference to passing time. It is unrestricted time, unemployed time, unoccupied time. Or is it? Unoccupied time is vanishing. The leisure industries (an oxymoron maybe, but no contradiction) fill time, as groundwater fills a sinkhole. The very variety of experience attacks our leisure as it attempts to satiate us. We work for our amusement...Sociologists in several countries have found that increasing wealth and increasing education bring a sense of tension about time. We believe that we possess too little of it: that is a myth we now live by.
To fully appreciate Gleick’s insightful prescience, it behooves us to remember that he is writing long before the social web as we know it, before the conspicuous consumption of “content” became the currency of the BuzzMalnourishment industrial complex, before the timelines of Twitter and Facebook came to dominate our record and experience of time. (Prescience, of course, is a form of time travel — perhaps our only nonfictional way to voyage into the future.) Gleick writes:
We live in the buzz. We wish to live intensely, and we wonder about the consequences — whether, perhaps, we face the biological dilemma of the waterflea, whose heart beats faster as the temperature rises. This creature lives almost four months at 46 degrees Fahrenheit but less than one month at 82 degrees...Yet we have made our choices and are still making them. We humans have chosen speed and we thrive on it — more than we generally admit. Our ability to work fast and play fast gives us power. It thrills us… No wonder we call sudden exhilaration a rush.
Gleick considers what our units of time reveal about our units of thought:
We have reached the epoch of the nanosecond. This is the heyday of speed. “Speed is the form of ecstasy the technical revolution has bestowed on man,” laments the Czech novelist Milan Kundera, suggesting by ecstasy a state of simultaneous freedom and imprisonment… That is our condition, a culmination of millennia of evolution in human societies, technologies, and habits of mind.
The more I experience and read about the winding up and acceleration of our lives (think of the rate and omnipresence of the current presidential campaign!),  the more I realize the importance of rediscovering the sanity of leisure and quiet spaces.

Monday, November 19, 2018

Practicing gratitude, kindness, and compassion - can our i-devices help?

My Apple Watch occasionally, and unexpectedly, prompts me to stop and breathe (does it not like the pulse it is measuring?). Noticing whether you are holding your breath or breathing can be very useful (the title of one of my web lectures is “Are you holding your breath? - Structures of arousal and calm”). My Univ. of Wisconsin colleague Richard Davidson writes a brief piece suggesting that this sort of prompting might be carried a bit further to enhance other beneficial behaviors: as technology permeates our lives, it should be designed to boost our kindness, empathy, and happiness.
...tech giants Apple and Google recently announced new software improvements to empower iPhone and Android smartphone users to be more aware and potentially limit smartphone use. I certainly think it’s a necessary step in the right direction. But is it enough? I see this as one of the first admissions by these companies that their technologies have powerful effects on us as humans—effects we have been discovering as we all participate in this grand experiment that none of us signed up for.
This admission by the technology leaders opens the door to a huge opportunity to start designing the interactions and the actual contents of what we consume to prioritize the well-being of users. For instance, what if artificial intelligence used in virtual assistants like Apple’s Siri or Amazon’s Alexa were designed to detect variations in the tone of voice to determine when someone was struggling with loneliness or depression and to intervene by providing a simple mental exercise to cultivate well-being? Or a mental health resource? This is one idea tech leaders are exploring more seriously, and for good reason.
In our lab at UW–Madison, we’re looking to make video game play a prosocial and entertaining experience for kids. In collaboration with video game experts, our lab created a research video game to train empathy in kids, which has shown potential in changing circuits of the brain that underlie empathy in some middle schoolers.
We’re exploring similar programs in adults that go above and beyond meditation apps for people to participate in bite-sized mental training practices that help them connect with others, as well as deepen their attention and resilience. What if your next smartphone notification were a prompt to reflect on what you’re grateful for or a challenge to take a break from your device and notice the natural environment? We know that activities like cultivating gratitude and spending time in nature or connecting with loved ones can have therapeutic effects. There’s nothing stopping us from integrating these reminders into our digital lives.
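As a toy illustration of the kind of reminder Davidson imagines (the categories and wording below are hypothetical, not from any real app), the content of such a notification could be as simple as a random draw from a small set of well-being prompts:

```python
import random

# Hypothetical well-being prompts of the kind described above;
# categories and wording are illustrative, not from any real app.
PROMPTS = {
    "gratitude":  "Pause and name one thing you are grateful for right now.",
    "nature":     "Take a break from your device and notice the natural environment.",
    "connection": "Reach out to someone you care about with a brief message.",
}

def pick_prompt(rng=random):
    """Return (category, text) for one randomly chosen well-being prompt."""
    category = rng.choice(sorted(PROMPTS))
    return category, PROMPTS[category]

# In a real app this would feed the OS notification scheduler;
# here we just print one prompt.
category, text = pick_prompt()
print(f"[{category}] {text}")
```

The hard part, as Davidson's piece implies, is not generating such prompts but deciding when they help rather than add to the notification noise they are meant to counteract.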
Ultimately, I think it will take soul-searching from companies and consumers to get us closer to technologies that truly help and don’t hinder the nurturing of user well-being.
We have a moral obligation to take what we know about the human mind and harness it in this ever-changing digital frontier to promote well-being. I think we can succeed if we can deliberately design our systems to nurture the basic goodness of people. This is a vision in which human flourishing would be supported, rather than diminished, by the rapidly evolving technology that is shaping our minds.