Wednesday, October 17, 2018

Feeling unsafe in a safe world - the unsafety theory of stress

Brosschot et al. make the point that our body's stress response is chronically turned on unless it is actively inhibited by our upstairs prefrontal perception of safety. Thus the chronic stress many of us are feeling in our immediate current political environment is due as much to a generalized sense of unsafety as to specific stressors. Here is a clip from their introduction, followed by their summary points and abstract:
Current neurobiological evidence and evolutionary reasoning imply that the stress response is a default response of the organism, and that it is the response the organism automatically falls back upon when no other information is available. So, the problem should not be formulated as: “what causes chronic stress responses?” but as “what mechanism allows the default stress response to be turned off?—and when does this ‘switch off’ mode fail to work?” To answer this last question is the chief goal of this article. We hypothesize that the mechanism that explains most chronic stress responses in daily life is the generalized perception of unsafety (GU), that is largely automatic (and as a result mainly unconscious). The argument in a nutshell: GU causes the default stress response to remain activated, whenever our phylogenetically ancient mind-body organism fails to perceive safety in a wide range of situations in modern society that are not intrinsically dangerous. This new explanation forms a radical shift from current stress theory – including our own perseverative cognition (PC) hypothesis – that focuses on stressors and PC. It comprises a completely new theory called, as mentioned, the “Generalized Unsafety Theory of Stress” (GUTS). A key principle of GUTS is that not being able to switch off, or inhibit, the default stress response is not dependent on actual stressors or PC: perceived GU is sufficient; GU is the crucial element here. Due to GU, chronic stress responses occur in an objectively safe world, with no threatening information. The GUTS has a far greater explanatory ability than other current stress theories.
Summary points and abstract:
The stress response is a default response: it is ‘always there’; it is not generated but disinhibited.
This default response is under tonic prefrontal inhibition as long as safety is perceived; and the conditions of safety are learned during an individual organism's lifespan.
This tonic inhibition is reflected by high tonic vagally mediated heart rate variability, and is relatively energy-economic.
Chronic stress responses are due to generalized unsafety (GU) rather than stressors.
GU is present in many other conditions, including obesity, old age and loneliness.
Based on neurobiological and evolutionary arguments, the generalized unsafety theory of stress (GUTS) hypothesizes that the stress response is a default response, and that chronic stress responses are caused by generalized unsafety (GU), independent of stressors or their cognitive representation. Three highly prevalent conditions are particularly vulnerable to becoming ‘compromised’ in terms of GU, and carry considerable health risks:
(1) ‘Compromised bodies’: in conditions with reduced bodily capacity, namely obesity, low aerobic fitness and older age, GU is preserved due to its evolutionary survival value;
(2) ‘Compromised social network’: in loneliness the primary source of safety is lacking, i.e. being part of a cohesive social network;
(3) ‘Compromised contexts’: in case of specific stressors (e.g. work stressors), daily contexts that are neutral by themselves (e.g. office building, email at home) may become unsafe by previously being paired with stressors, via context conditioning.
Thus, GUTS critically revises and expands stress theory, by focusing on safety instead of threat, and by including risk factors that have hitherto not been attributed to stress.
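The tonic prefrontal inhibition in the summary points above is said to be reflected in high vagally mediated heart rate variability. A standard index of that is RMSSD, the root mean square of successive differences between inter-beat (R-R) intervals. A minimal sketch of the computation (the interval values are invented for illustration, not data from the paper):

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between inter-beat
    (R-R) intervals in milliseconds -- a common index of vagally
    mediated heart rate variability."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Higher RMSSD means more beat-to-beat variability, i.e. stronger vagal tone.
steady = [800, 802, 798, 801, 799]     # low variability
variable = [800, 850, 760, 880, 770]   # high variability
print(rmssd(steady) < rmssd(variable))  # True
```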

Tuesday, October 16, 2018

Health of fathers influences the well-being of their progeny.

Watkins et al. show in mice that a paternal low-protein diet during the period of spermatogenesis leads to offspring with disturbed metabolic health:

Parental health and diet at the time of conception determine the development and life-long disease risk of their offspring. While the association between poor maternal diet and offspring health is well established, the underlying mechanisms linking paternal diet with offspring health are poorly defined. Possible programming pathways include changes in testicular and sperm epigenetic regulation and status, seminal plasma composition, and maternal reproductive tract responses regulating early embryo development. In this study, we demonstrate that paternal low-protein diet induces sperm-DNA hypomethylation in conjunction with blunted female reproductive tract embryotrophic, immunological, and vascular remodeling responses. Furthermore, we identify sperm- and seminal plasma-specific programming effects of paternal diet with elevated offspring adiposity, metabolic dysfunction, and altered gut microbiota.
The association between poor paternal diet, perturbed embryonic development, and adult offspring ill health represents a new focus for the Developmental Origins of Health and Disease hypothesis. However, our understanding of the underlying mechanisms remains ill-defined. We have developed a mouse paternal low-protein diet (LPD) model to determine its impact on semen quality, maternal uterine physiology, and adult offspring health. We observed that sperm from LPD-fed male mice displayed global hypomethylation associated with reduced testicular expression of DNA methylation and folate-cycle regulators compared with normal protein diet (NPD) fed males. Furthermore, females mated with LPD males display blunted preimplantation uterine immunological, cell signaling, and vascular remodeling responses compared to controls. These data indicate paternal diet impacts on offspring health through both sperm genomic (epigenetic) and seminal plasma (maternal uterine environment) mechanisms. Extending our model, we defined sperm- and seminal plasma-specific effects on offspring health by combining artificial insemination with vasectomized male mating of dietary-manipulated males. All offspring derived from LPD sperm and/or seminal plasma became heavier with increased adiposity, glucose intolerance, perturbed hepatic gene expression symptomatic of nonalcoholic fatty liver disease, and altered gut bacterial profiles. These data provide insight into programming mechanisms linking poor paternal diet with semen quality and offspring health.

Monday, October 15, 2018

Too much or too little sleep correlates with cognitive deficits.

Wild et al. collected sleep and cognitive performance data from ~10,000 people to find that less than 7 or more than 8 hours of sleep a night diminishes high-level cognitive functioning.
Most people will at some point experience not getting enough sleep over a period of days, weeks, or months. However, the effects of this kind of everyday sleep restriction on high-level cognitive abilities—such as the ability to store and recall information in memory, solve problems, and communicate—remain poorly understood. In a global sample of over 10000 people, we demonstrated that cognitive performance, measured using a set of 12 well-established tests, is impaired in people who reported typically sleeping less, or more, than 7–8 hours per night—which was roughly half the sample. Crucially, performance was not impaired evenly across all cognitive domains. Typical sleep duration had no bearing on short-term memory performance, unlike reasoning and verbal skills, which were impaired by too little, or too much, sleep. In terms of overall cognition, a self-reported typical sleep duration of 4 hours per night was equivalent to aging 8 years. Also, sleeping more than usual the night before testing (closer to the optimal amount) was associated with better performance, suggesting that a single night’s sleep can benefit cognition. The relationship between sleep and cognition was invariant with respect to age, suggesting that the optimal amount of sleep is similar for all adult age groups, and that sleep-related impairments in cognition affect all ages equally. These findings have significant real-world implications, because many people, including those in positions of responsibility, operate on very little sleep and may suffer from impaired reasoning, problem-solving, and communications skills on a daily basis.

Friday, October 12, 2018

A new algorithm for predicting disease risk.

I pass on the text of this piece from Gina Kolata, and then the abstract of the article by Khera et al. she is referencing:
By surveying changes in DNA at 6.6 million places in the human genome, investigators at the Broad Institute and Harvard University were able to identify many more people at risk than do the usual genetic tests, which take into account very few genes.
Of 100 heart attack patients, for example, the standard methods will identify two who have a single genetic mutation that places them at increased risk. But the new tool will find 20 of them...The researchers are now building a website that will allow anyone to upload genetic data from a company like 23andMe...Users will receive risk scores for heart disease, breast cancer, Type 2 diabetes, chronic inflammatory bowel disease and atrial fibrillation...People will not be charged for their scores.
The abstract from Nature Genetics:
A key public health need is to identify individuals at high risk for a given disease to enable enhanced screening or preventive therapies. Because most common diseases have a genetic component, one important approach is to stratify individuals based on inherited DNA variation. Proposed clinical applications have largely focused on finding carriers of rare monogenic mutations at several-fold increased risk. Although most disease risk is polygenic in nature, it has not yet been possible to use polygenic predictors to identify individuals at risk comparable to monogenic mutations. Here, we develop and validate genome-wide polygenic scores for five common diseases. The approach identifies 8.0, 6.1, 3.5, 3.2, and 1.5% of the population at greater than threefold increased risk for coronary artery disease, atrial fibrillation, type 2 diabetes, inflammatory bowel disease, and breast cancer, respectively. For coronary artery disease, this prevalence is 20-fold higher than the carrier frequency of rare monogenic mutations conferring comparable risk. We propose that it is time to contemplate the inclusion of polygenic risk prediction in clinical care, and discuss relevant issues.
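At its core, a genome-wide polygenic score of the kind described in the abstract is just a weighted sum: for each variant, the number of risk alleles carried (0, 1, or 2) multiplied by a GWAS-derived per-allele effect size. A minimal sketch of that arithmetic (the variant ids and weights below are invented for illustration, not drawn from the Khera et al. scores):

```python
def polygenic_score(genotype, weights):
    """Weighted sum of risk-allele counts across variants.

    genotype: dict of variant id -> risk-allele count (0, 1, or 2)
    weights:  dict of variant id -> per-allele effect size (from GWAS)
    Variants absent from the genotype contribute zero.
    """
    return sum(w * genotype.get(variant, 0) for variant, w in weights.items())

# Hypothetical toy example -- not real variants or effect sizes.
weights = {"rs0001": 0.20, "rs0002": -0.10, "rs0003": 0.05}
person = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
print(round(polygenic_score(person, weights), 6))  # 0.3
```

In practice such scores sum over millions of variants and are then compared against a population distribution to flag individuals in the high-risk tail.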

Thursday, October 11, 2018

Digital media and developing minds

I want to point to the Oct. 2 issue of PNAS, which offers free online access to a Sackler Colloquium on Digital Media and Developing Minds. The place to start is the introductory article by David Meyer, "From savannas to blue-phase LCD screens: Prospects and perils for child development in the Post-Modern Digital Information Age." Some clips from his article:
The Sackler Colloquium “Digital Media and Developing Minds” was an interdisciplinary collaborative endeavor to promote joint interests of the National Academy of Sciences, the Arthur M. Sackler Foundation, and the Institute of Digital Media and Child Development.‡‡ At the colloquium, a select group of media-savvy experts in diverse disciplines assembled to pursue several interrelated goals: (i) reporting results from state-of-the art scientific research; (ii) establishing a dialogue between medical researchers, social scientists, communications specialists, policy officials, and other interested parties who study media effects; and (iii) setting a future research agenda to maximize the benefits, curtail the costs, and minimize the risks for children and teens in the Post-Modern Digital Information Age.
Christakis et al. (36) report on “How early media exposure may affect cognitive function: A review of results from observations in humans and experiments in mice,” reviewing relevant results from empirical studies of humans and animal models that concern how intense environmental stimulation influences neural brain development and behavior.
Lytle et al. (37) report on “Two are better than one: Infant language learning from video improves in the presence of peers,” showing that social copresence with other same-aged peers facilitates 9-mo-old infants’ learning of spoken phonemes through interactions with visual touch screens.
Kirkorian and Anderson (38) report on “Effect of sequential video shot comprehensibility on attentional synchrony: A comparison of children and adults,” using temporally extended eye-movement records to investigate how “top-down” cognitive comprehension processes for interpreting video narratives develop over an age-range from early childhood (4-y-old) to adulthood.
Beyens et al. (39) report on “Screen media use and ADHD-related behaviors: Four decades of research,” systematically surveying representative scientific literature that suggests a modest positive correlation—moderated by variables such as gender and chronic aggressive tendencies—between media use and ADHD-related behaviors, thereby helping pave the way toward future detailed theoretical models of these phenomena.
Prescott et al. (40) report on “Metaanalysis of the relationship between violent video game play and physical aggression over time,” applying sophisticated statistical techniques to assess data from a large cross-cultural sample of studies (n = 24; aggregated participant sample size > 17,000) about associations between video game violence and prospective future physical aggression, which has yielded evidence of small but reliable direct relationships that are largest among Whites, intermediate among Asians, and smallest (unreliable) among Hispanics.
Uncapher and Wagner (41) report on “Minds and brains of media multitaskers: Current findings and future directions,” evaluating whether intensive media multitasking (i.e., engaging simultaneously with multiple media streams; for example, texting friends on smart phones while answering email messages on laptop computers and playing video games on other electronic devices) leads to relatively poor performance on various cognitive tests under single-tasking conditions, which might happen because chronic media multitasking diminishes individuals’ powers of sustained goal-directed attention.
Finally, Katz et al. (42) report on “How to play 20 questions with nature and lose: Reflections on 100 years of brain-training research,” analyzing how and why past research based on various laboratory and real-world approaches to training basic mental processes (e.g., selective attention, working memory, and cognitive control)—including contemporary video game playing (also known as “brain training”)—have yet to yield consistently positive, practically significant, outcomes, such as durable long-term enhancements of general fluid intelligence.

Wednesday, October 10, 2018

Where is free will in our brains?

Really fascinating work from Darby et al., identifying the brain areas that make us feel like we have free will, i.e. the perception that we are in control of, and responsible for, our actions (whether or not we actually have free will is another matter; see my "I Illusion" web lecture):

Free will consists of a desire to act (volition) and a sense of responsibility for that action (agency), but the brain regions responsible for these processes remain unknown. We found that brain lesions that disrupt volition occur in many different locations, but fall within a single brain network, defined by connectivity to the anterior cingulate. Lesions that disrupt agency also occur in many different locations, but fall within a separate network, defined by connectivity to the precuneus. Together, these networks may underlie our perception of free will, with implications for neuropsychiatric diseases in which these processes are impaired.
Our perception of free will is composed of a desire to act (volition) and a sense of responsibility for our actions (agency). Brain damage can disrupt these processes, but which regions are most important for free will perception remains unclear. Here, we study focal brain lesions that disrupt volition, causing akinetic mutism (n = 28), or disrupt agency, causing alien limb syndrome (n = 50), to better localize these processes in the human brain. Lesion locations causing either syndrome were highly heterogeneous, occurring in a variety of different brain locations. We next used a recently validated technique termed lesion network mapping to determine whether these heterogeneous lesion locations localized to specific brain networks. Lesion locations causing akinetic mutism all fell within one network, defined by connectivity to the anterior cingulate cortex. Lesion locations causing alien limb fell within a separate network, defined by connectivity to the precuneus. Both findings were specific for these syndromes compared with brain lesions causing similar physical impairments but without disordered free will. Finally, our lesion-based localization matched network localization for brain stimulation locations that disrupt free will and neuroimaging abnormalities in patients with psychiatric disorders of free will without overt brain lesions. Collectively, our results demonstrate that lesions in different locations causing disordered volition and agency localize to unique brain networks, lending insight into the neuroanatomical substrate of free will perception.

Tuesday, October 09, 2018

Sans Forgetica

A fascinating piece from Taylor Telford in The Washington Post describes a new font devised by psychology and design researchers at RMIT Univ. in Melbourne...
...designed to boost information retention for readers. It’s based on a theory called “desirable difficulty,” which suggests that people remember things better when their brains have to overcome minor obstacles while processing information. Sans Forgetica is sleek and back-slanted with intermittent gaps in each letter, which serve as a “simple puzzle” for the reader...The back-slanting in Sans Forgetica would be foreign to most readers...The openings in the letters make the brain pause to identify the shapes.
It may be my imagination, but I feel my brain perking up, working harder, to take in this graphic:

The team tested the font’s efficacy along with other intentionally complicated fonts on 400 students in lab and online experiments and found that “Sans Forgetica broke just enough design principles without becoming too illegible and aided memory retention.”

Monday, October 08, 2018

In praise of mediocrity

Tim Wu has written an engaging essay on how the pursuit of excellence has infiltrated and corrupted the world of leisure.
I’m a little surprised by how many people tell me they have no hobbies...we seem to have forgotten the importance of doing things solely because we enjoy them...Yes, I know: We are all so very busy...But there’s a deeper reason...Our “hobbies,” if that’s even the word for them anymore, have become too serious, too demanding, too much an occasion to become anxious about whether you are really the person you claim to be.
If you’re a jogger, it is no longer enough to cruise around the block; you’re training for the next marathon. If you’re a painter, you are no longer passing a pleasant afternoon, just you, your watercolors and your water lilies; you are trying to land a gallery show or at least garner a respectable social media following.
Lost here is the gentle pursuit of a modest competence, the doing of something just because you enjoy it, not because you are good at it...alien values like “the pursuit of excellence” have crept into and corrupted what was once the realm of leisure, leaving little room for the true amateur...There are depths of experience that come with mastery. But there is also a real and pure joy, a sweet, childlike delight, that comes from just learning and trying to get better. Looking back, you will find that the best years of, say, scuba-diving or doing carpentry were those you spent on the learning curve, when there was exaltation in the mere act of doing.
...the demands of excellence are at war with what we call freedom. For to permit yourself to do only that which you are good at is to be trapped in a cage whose bars are not steel but self-judgment. Especially when it comes to physical pursuits, but also with many other endeavors, most of us will be truly excellent only at whatever we started doing in our teens...What if you decide in your 60s that you want to learn to speak Italian? The expectation of excellence can be stultifying.
The promise of our civilization, the point of all our labor and technological progress, is to free us from the struggle for survival and to make room for higher pursuits. But demanding excellence in all that we do can undermine that; it can threaten and even destroy freedom. It steals from us one of life’s greatest rewards — the simple pleasure of doing something you merely, but truly, enjoy.

Friday, October 05, 2018

Militarized police forces do not enhance safety or reduce crime, but do diminish police reputation.

From Jonathan Mummolo:

National debates over heavy-handed police tactics, including so-called “militarized” policing, are often framed as a trade-off between civil liberties and public safety, but the costs and benefits of controversial police practices remain unclear due to data limitations. Using an array of administrative data sources and original experiments I show that militarized “special weapons and tactics” (SWAT) teams are more often deployed in communities of color, and—contrary to claims by police administrators—provide no detectable benefits in terms of officer safety or violent crime reduction, on average. However, survey experiments suggest that seeing militarized police in news reports erodes opinion toward law enforcement. Taken together, these findings suggest that curtailing militarized policing may be in the interest of both police and citizens.
The increasingly visible presence of heavily armed police units in American communities has stoked widespread concern over the militarization of local law enforcement. Advocates claim militarized policing protects officers and deters violent crime, while critics allege these tactics are targeted at racial minorities and erode trust in law enforcement. Using a rare geocoded census of SWAT team deployments from Maryland, I show that militarized police units are more often deployed in communities with large shares of African American residents, even after controlling for local crime rates. Further, using nationwide panel data on local police militarization, I demonstrate that militarized policing fails to enhance officer safety or reduce local crime. Finally, using survey experiments—one of which includes a large oversample of African American respondents—I show that seeing militarized police in news reports may diminish police reputation in the mass public. In the case of militarized policing, the results suggest that the often-cited trade-off between public safety and civil liberties is a false choice.

Thursday, October 04, 2018

The number of neurons in the amygdala normally increases during development, but not in autism.

Avino et al. point out one possible underlying cause of the characteristic difficulty that people with autism spectrum disorder have in understanding the emotional expressions of others.
Remarkably little is known about the postnatal cellular development of the human amygdala. It plays a central role in mediating emotional behavior and has an unusually protracted development well into adulthood, increasing in size by 40% from youth to adulthood. Variation from this typical neurodevelopmental trajectory could have profound implications on normal emotional development. We report the results of a stereological analysis of the number of neurons in amygdala nuclei of 52 human brains ranging from 2 to 48 years of age [24 neurotypical and 28 autism spectrum disorder (ASD)]. In neurotypical development, the number of mature neurons in the basal and accessory basal nuclei increases from childhood to adulthood, coinciding with a decrease of immature neurons within the paralaminar nucleus. Individuals with ASD, in contrast, show an initial excess of amygdala neurons during childhood, followed by a reduction in adulthood across nuclei. We propose that there is a long-term contribution of mature neurons from the paralaminar nucleus to other nuclei of the neurotypical human amygdala and that this growth trajectory may be altered in ASD, potentially underlying the volumetric changes detected in ASD and other neurodevelopmental or neuropsychiatric disorders.

Wednesday, October 03, 2018

Income inequality drives female sexualization.

Blake et al. present a fascinating analysis suggesting that rising economic inequality promotes status competition among women, expressed through the posting of "sexy selfies." The prevalence of sexy selfies is greatest in environments characterized by highly unequal incomes.

Female sexualization is increasing, and scholars are divided on whether this trend reflects a form of gendered oppression or an expression of female competitiveness. Here, we proxy local status competition with income inequality, showing that female sexualization and physical appearance enhancement are most prevalent in environments that are economically unequal. We found no association with gender oppression. Exploratory analyses show that the association between economic inequality and sexualization is stronger in developed nations. Our findings have important implications: Sexualization manifests in response to economic conditions but does not covary with female subordination. These results raise the possibility that sexualization may be a marker of social climbing among women that tracks the degree of status competition in the local environment.
Publicly displayed, sexualized depictions of women have proliferated, enabled by new communication technologies, including the internet and mobile devices. These depictions are often claimed to be outcomes of a culture of gender inequality and female oppression, but, paradoxically, recent rises in sexualization are most notable in societies that have made strong progress toward gender parity. Few empirical tests of the relation between gender inequality and sexualization exist, and there are even fewer tests of alternative hypotheses. We examined aggregate patterns in 68,562 sexualized self-portrait photographs (“sexy selfies”) shared publicly on Twitter and Instagram and their association with city-, county-, and cross-national indicators of gender inequality. We then investigated the association between sexy-selfie prevalence and income inequality, positing that sexualization—a marker of high female competition—is greater in environments in which incomes are unequal and people are preoccupied with relative social standing. Among 5,567 US cities and 1,622 US counties, areas with relatively more sexy selfies were more economically unequal but not more gender oppressive. A complementary pattern emerged cross-nationally (113 nations): Income inequality positively covaried with sexy-selfie prevalence, particularly within more developed nations. To externally validate our findings, we investigated and confirmed that economically unequal (but not gender-oppressive) areas in the United States also had greater aggregate sales in goods and services related to female physical appearance enhancement (beauty salons and women’s clothing). Here, we provide an empirical understanding of what female sexualization reflects in societies and why it proliferates.

Tuesday, October 02, 2018

Daily fasting can improve health span and life span

Mitchell et al. show (in mice) that caloric restriction (a 30% reduction in daily intake) or single-meal feeding (resulting in fasting during each day but no caloric restriction) increases life span and delays the onset of age-associated liver pathologies in mice, compared with no feeding restrictions. This suggests that daily fasting, even without caloric restriction, may improve health span in humans.
The importance of dietary composition and feeding patterns in aging remains largely unexplored, but was implicated recently in two prominent nonhuman primate studies. Here, we directly compare in mice the two diets used in the primate studies focusing on three paradigms: ad libitum (AL), 30% calorie restriction (CR), and single-meal feeding (MF), which accounts for differences in energy density and caloric intake consumed by the AL mice. MF and CR regimes enhanced longevity regardless of diet composition, which alone had no significant impact within feeding regimens. Like CR animals, MF mice ate quickly, imposing periods of extended daily fasting on themselves that produced significant improvements in morbidity and mortality compared with AL. These health and survival benefits conferred by periods of extended daily fasting, independent of dietary composition, have major implications for human health and clinical applicability.

Monday, October 01, 2018

Constancy of the architecture of shame across cultures is due to biological, not cultural, evolution.

Interesting work from Sznycer and other collaborators of Cosmides and Tooby suggests that shame’s match to audience devaluation is a design feature crafted by selection and not a product of cultural contact or convergent cultural evolution:

This set of experiments shows that in 15 traditional small-scale societies there is an extraordinarily close correspondence between (i) the intensity of shame felt if one exhibited specific acts or traits and (ii) the magnitude of devaluation expressed in response to those acts or traits by local audiences, and even foreign audiences. Three important and widely acknowledged sources of cultural variation between communities—geographic proximity, linguistic similarity, and religious similarity—all failed to account for the strength of between-community correlations in the shame–devaluation link. This supplies a parallel line of evidence that shame is a universal system, part of our species’ cooperative biology, rather than a product of cultural evolution.
Human foragers are obligately group-living, and their high dependence on mutual aid is believed to have characterized our species’ social evolution. It was therefore a central adaptive problem for our ancestors to avoid damaging the willingness of other group members to render them assistance. Cognitively, this requires a predictive map of the degree to which others would devalue the individual based on each of various possible acts. With such a map, an individual can avoid socially costly behaviors by anticipating how much audience devaluation a potential action (e.g., stealing) would cause and weigh this against the action’s direct payoff (e.g., acquiring). The shame system manifests all of the functional properties required to solve this adaptive problem, with the aversive intensity of shame encoding the social cost. Previous data from three Western(ized) societies indicated that the shame evoked when the individual anticipates committing various acts closely tracks the magnitude of devaluation expressed by audiences in response to those acts. Here we report data supporting the broader claim that shame is a basic part of human biology. We conducted an experiment among 899 participants in 15 small-scale communities scattered around the world. Despite widely varying languages, cultures, and subsistence modes, shame in each community closely tracked the devaluation of local audiences (mean r = +0.84). The fact that the same pattern is encountered in such mutually remote communities suggests that shame’s match to audience devaluation is a design feature crafted by selection and not a product of cultural contact or convergent cultural evolution.
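The headline statistic here (mean r = +0.84) is a Pearson correlation between shame ratings and audience-devaluation ratings across scenarios, computed within each community. A quick sketch of that computation (the rating vectors below are invented, not the study's data):

```python
def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-scenario means: anticipated shame vs. audience devaluation.
shame = [4.1, 2.0, 3.5, 1.2, 4.8]
devaluation = [3.9, 2.2, 3.1, 1.0, 4.5]
print(round(pearson_r(shame, devaluation), 2))
```

A value near +1 indicates that the scenarios eliciting the most shame are also the ones audiences devalue most, which is the pattern the authors report across all 15 communities.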