Monday, December 09, 2019

Heritable gaps between chronological age and brain age are increased in common brain disorders.

Kaufmann et al. used machine learning on a large dataset to generate robust estimates of individual biological brain age on the basis of structural brain imaging features. The deviation between brain age and chronological age — termed the brain age gap — appears to be a promising marker of brain health. It was largest in schizophrenia, multiple sclerosis, dementia, and bipolar spectrum disorder. The authors also assessed the overlap between the genetic underpinnings of brain age gap and common brain disorders. The bottom line conclusion (from a very extensive and complex analysis) is that common brain disorders are associated with heritable patterns of apparent aging of the brain. Their abstract:
Common risk factors for psychiatric and other brain disorders are likely to converge on biological pathways influencing the development and maintenance of brain structure and function across life. Using structural MRI data from 45,615 individuals aged 3–96 years, we demonstrate distinct patterns of apparent brain aging in several brain disorders and reveal genetic pleiotropy between apparent brain aging in healthy individuals and common brain disorders.
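The brain age gap computation can be sketched in a few lines. Here is my own toy illustration with synthetic data (not the authors' actual pipeline): fit a regression model that predicts age from imaging-derived features, then take the gap as predicted minus chronological age.

```python
# Illustrative sketch with synthetic data (not the authors' pipeline):
# predict age from imaging-derived features, then define the brain age
# gap as predicted age minus chronological age.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n, p = 500, 20
features = rng.normal(size=(n, p))                    # stand-in for MRI features
age = 45 + 10 * features[:, 0] + rng.normal(0, 5, n)  # synthetic chronological ages

# Cross-validated predictions so each person's "brain age" comes from a
# model that never saw that person during training.
predicted_age = cross_val_predict(Ridge(), features, age, cv=5)
brain_age_gap = predicted_age - age   # positive gap = brain looks "older" than it is
print(brain_age_gap.shape[0])
```

A positive average gap in a patient group relative to healthy controls is the kind of signal the study reports for schizophrenia, multiple sclerosis, and dementia.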

Friday, December 06, 2019

Same-Sex behavior in animals - a new view.

Monk et al. offer a fresh perspective on the "problem" of how same-sex sexual behavior could have evolved. It is a problem only if different-sex sexual behavior is the baseline condition for animals, from which same-sex behavior has evolved. The authors suggest that same-sex behavior is bound up in the very origins of animal sex. It hasn’t had to continually re-evolve: It’s always been there. The arguments of Monk and collaborators are summarized in a review by Elbein:
Instead of wondering why same-sex behavior had independently evolved in so many species, Ms. Monk and her colleagues suggest that it may have been present in the oldest parts of the animal family tree. The earliest sexually reproducing animals may have mated with any other individual they came across, regardless of sex. Such reproductive strategies are still practiced today by hermaphroditic species, like snails, and species that don’t appear to differentiate, like sea urchins.
Over time, Ms. Monk said, sexual signals evolved — different sizes, colors, anatomical features and behaviors — allowing different sexes to more accurately target each other for reproduction. But same-sex behavior continued in some organisms, leading to diverse sexual behaviors and strategies across the animal kingdom. And while same-sex behavior may grant some evolutionary benefits, an ancient origin would mean those benefits weren’t required for it to exist.
But how has same-sex behavior stuck around? The answer may be that such behaviors aren’t as evolutionarily costly as assumed. Traditionally, Ms. Monk said, any mating behavior that doesn’t produce young is seen as a waste. But animal behavior often doesn’t fit neatly into an economic accounting of costs and benefits.
Here is the abstract of Monk et al.:
Same-sex sexual behaviour (SSB) has been recorded in over 1,500 animal species with a widespread distribution across most major clades. Evolutionary biologists have long sought to uncover the adaptive origins of ‘homosexual behaviour’ in an attempt to resolve this apparent Darwinian paradox: how has SSB repeatedly evolved and persisted despite its presumed fitness costs? This question implicitly assumes that ‘heterosexual’ or exclusive different-sex sexual behaviour (DSB) is the baseline condition for animals, from which SSB has evolved. We question the idea that SSB necessarily presents an evolutionary conundrum, and suggest that the literature includes unchecked assumptions regarding the costs, benefits and origins of SSB. Instead, we offer an alternative null hypothesis for the evolutionary origin of SSB that, through a subtle shift in perspective, moves away from the expectation that the origin and maintenance of SSB is a problem in need of a solution. We argue that the frequently implicit assumption of DSB as ancestral has not been rigorously examined, and instead hypothesize an ancestral condition of indiscriminate sexual behaviours directed towards all sexes. By shifting the lens through which we study animal sexual behaviour, we can more fruitfully examine the evolutionary history of diverse sexual strategies.

Wednesday, December 04, 2019

Something in the way we move.

Gretchen Reynolds points to work by Hug et al. suggesting that each of us has a unique muscle activation signature that can be revealed during walking and pedaling. Understanding movement patterns could help in improving and refining robotics, prosthetics, physical therapy and personalized exercise programs. On the darker side, a Chinese company (Watrix) is using computer vision to enhance the recognition of individuals in crowds by their walking postures:
...its gait recognition solution “Shuidi Shenjian” ... will enable security departments to quickly search and recognize identities by their body shape and walking posture. The company notes that this product is highly effective when targets walk from a long distance or in weak light, cover their faces or wear different clothes, and would be a great supplement to current computer vision products.
Here is the complete abstract from Hug et al.:
Although it is known that the muscle activation patterns used to produce even simple movements can vary between individuals, these differences have not been considered to prove the existence of individual muscle activation strategies (or signatures). We used a machine learning approach (support vector machine) to test the hypothesis that each individual has unique muscle activation signatures. Eighty participants performed a series of pedaling and gait tasks, and 53 of these participants performed a second experimental session on a subsequent day. Myoelectrical activity was measured from eight muscles: vastus lateralis and medialis, rectus femoris, gastrocnemius lateralis and medialis, soleus, tibialis anterior, and biceps femoris-long head. The classification task involved separating data into training and testing sets. For the within-day classification, each pedaling/gait cycle was tested using the classifier, which had been trained on the remaining cycles. For the between-day classification, each cycle from day 2 was tested using the classifier, which had been trained on the cycles from day 1. When considering all eight muscles, the activation profiles were assigned to the corresponding individuals with a classification rate of up to 99.28% (2,353/2,370 cycles) and 91.22% (1,341/1,470 cycles) for the within-day and between-day classification, respectively. When considering the within-day classification, a combination of two muscles was sufficient to obtain a classification rate >80% for both pedaling and gait. When considering between-day classification, a combination of four to five muscles was sufficient to obtain a classification rate >80% for pedaling and gait. These results demonstrate that strategies not only vary between individuals, as is often assumed, but are unique to each individual.
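The classification setup is easy to sketch. Here is my own toy version with synthetic data (the subject counts, muscle count, and noise level are stand-ins, not the authors' values): give each "subject" a distinct mean activation profile across eight muscles, then train a support vector machine to assign each movement cycle back to the individual who produced it.

```python
# Toy version of the within-day classification (synthetic data, my
# illustration): an SVM assigns each pedaling/gait cycle to the person
# who produced it, based on an 8-muscle activation profile.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_subjects, cycles_per_subject, n_muscles = 10, 30, 8

# Each subject gets a distinct mean activation profile plus cycle-to-cycle noise.
profiles = rng.normal(size=(n_subjects, n_muscles))
X = np.vstack([profiles[s] + 0.3 * rng.normal(size=(cycles_per_subject, n_muscles))
               for s in range(n_subjects)])
y = np.repeat(np.arange(n_subjects), cycles_per_subject)

# Held-out cycles are classified by a model trained on the remaining cycles,
# analogous to the paper's within-day split.
clf = SVC(kernel="rbf", gamma="scale")
accuracy = cross_val_score(clf, X, y, cv=5).mean()
print(round(float(accuracy), 2))
```

When individual profiles are distinct relative to the noise, as the paper reports for real EMG data, the classifier identifies individuals far above the 1-in-10 chance level.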

Monday, December 02, 2019

Rival theories of consciousness being tested by large project.

In the first phase of a $20 million project, six laboratories are going to run experiments with more than 500 participants to test two of the primary theories of consciousness:
The first two contenders are the global workspace theory (GWT), championed by Stanislas Dehaene of the Collège de France in Paris, and the integrated information theory (IIT), proposed by Giulio Tononi of the University of Wisconsin in Madison. The GWT says the brain’s prefrontal cortex, which controls higher-order cognitive processes like decision-making, acts as a central computer that collects and prioritizes information from sensory input. It then broadcasts the information to other parts of the brain that carry out tasks. Dehaene thinks this selection process is what we perceive as consciousness. By contrast, the IIT proposes that consciousness arises from the interconnectedness of brain networks. The more neurons interact with one another, the more a being feels conscious—even without sensory input. IIT proponents suspect this process occurs in the back of the brain, where neurons connect in a gridlike structure...Tononi and Dehaene have agreed to parameters for the experiments and have registered their predictions. To avoid conflicts of interest, the scientists will neither collect nor interpret the data. If the results appear to disprove one theory, each has agreed to admit he was wrong—at least to some extent.
The labs, in the United States, Germany, the United Kingdom, and China, will use three techniques to record brain activity as volunteers perform consciousness-related tasks: functional magnetic resonance imaging, electroencephalography, and electrocorticography (a form of EEG done during brain surgery, in which electrodes are placed directly on the brain). In one experiment, researchers will measure the brain’s response when a person becomes aware of an image. The GWT predicts the front of the brain will suddenly become active, whereas the IIT says the back of the brain will be consistently active.

Friday, November 29, 2019

The real cost of texting and tweeting.

Agnes Callard, an associate professor of philosophy at the University of Chicago, crystallizes some fascinating points in an NYTimes Op-Ed piece. She wonders why she broadcasts the details of her daily life on twitter...some clips:
To allow others to think about us in whatever way they feel like — perhaps to laugh at us, perhaps to dismiss us — is a huge loss of control. So why do we allow it? What is the attraction of it? I think that it’s the increase in control we get in return. Social media has enabled the Great Control Swap. And it is happening right now, beneath our notice.
The first baby step toward the Great Swap was the shift from phone calls to texts. A phone interaction requires participants to be “on the same time,” which entails negotiations over entrance into and exit from the conversation...A text or email interaction, by contrast, liberates the parties so that each may operate on their own time. But the cost comes in another form of control: data....text-based communication requires stationary words...they leave a trail.
We understood from the start that this form of socializing — like an affair without physical contact — was shallower than the other, more demanding kind. We were prepared to accept that trade-off, but failed to grasp that we were trading away more than depth. We were also trading away a kind of control.
All of us have a desire to connect, to be seen. But we live in a world that is starting to allow us to satisfy that desire without feeling the common-sense moral strictures that have traditionally governed human relationships. We can engage without obligation, without boredom and, most importantly, without subjecting our attention to the command of another. On Twitter, I’m never obligated to listen through to the end of someone’s story.
The immense appeal of this free-form socializing lies in the way it makes one a master of one’s own time — but it cannot happen without a place. All that data has to sit somewhere so that people can freely access it whenever they wish. Data storage is the loss of control by which we secure social control: Facebook is our faithless mistress’s leaky inbox.
When we alienate our identities as text data, and put that data “out there” to be read by anyone who wanders by, we are putting ourselves into the interpretive hands of those who have no bonds or obligations or agreements with us, people with whom we are, quite literally, prevented from seeing “eye to eye.” People we cannot trust.
The Great Control Swap buys us control over the logistics of our interactions at the cost of interpretive control over the content of those interactions. Our words have lost their wings, and fallen to the ground as data.

Wednesday, November 27, 2019

Cognitive and noncognitive predictors of success.

An interesting bit of work from Duckworth et al.
When predicting success, how important are personal attributes other than cognitive ability? To address this question, we capitalized on a full decade of prospective, longitudinal data from n = 11,258 cadets entering training at the US Military Academy at West Point. Prior to training, cognitive ability was negatively correlated with both physical ability and grit. Cognitive ability emerged as the strongest predictor of academic and military grades, but noncognitive attributes were more prognostic of other achievement outcomes, including successful completion of initiation training and 4-y graduation. We conclude that noncognitive aspects of human capital deserve greater attention from both scientists and practitioners interested in predicting real-world success.
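The core comparison can be illustrated schematically. In this toy sketch of mine (synthetic data; the coefficients and the mild negative correlation between cognitive ability and grit are assumptions chosen to mimic the paper's pattern, not estimates from it), the same model is fit with each predictor and compared on a binary completion outcome:

```python
# Schematic comparison (synthetic data, my illustration): how well does a
# cognitive vs. a noncognitive attribute predict a binary completion outcome?
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 2000
cognitive = rng.normal(size=n)
grit = rng.normal(size=n) - 0.2 * cognitive   # mildly negatively correlated, as in the paper

# In this toy setup, completing training is driven mostly by the
# noncognitive attribute.
completed = (0.3 * cognitive + 1.0 * grit + rng.normal(size=n) > 0).astype(int)

aucs = {}
for name, X in [("cognitive", cognitive), ("grit", grit)]:
    aucs[name] = cross_val_score(LogisticRegression(), X.reshape(-1, 1),
                                 completed, cv=5, scoring="roc_auc").mean()
print(aucs["grit"] > aucs["cognitive"])
```

The point of the exercise: which attribute "wins" depends entirely on which outcome you predict, which is the paper's argument for measuring both.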

Monday, November 25, 2019

How trance states might have forged human societies

I want to pass on a series of clips I have made for my own use from an intriguing article by Mark Vernon in Aeon:
With anatomically modern humans comes culture in a way that had never happened before. And from that culture came religion, with various proposals to map the hows and whys of its emergence. Until recently, the proposals fell into two broad groups – ‘big gods’ theories and ‘false agency’ hypotheses. Big gods theories envisage religion as conjuring up punishing deities. These disciplining gods provided social bonding by telling individuals that wrongdoing incurs massive costs. The problem is that big gods are not a universal feature of religions and, if they are present, they seem correlated to big societies not causes of them. False agency hypotheses...assume that our forebears were jumpy and superstitious: they thought that a shrub swayed because of a spirit not the wind; and they were easily fooled, though their mistakes were evolutionarily advantageous because, on occasion, the swaying was caused by a predator. The false agency hypothesis has been tested and disconfirmed across many experiments.
...there is a need for a new idea, and coming to the fore now is an old one revisited...The explanation is resurfacing in what can be called the trance theory of religious origins, which proposes that our paleolithic ancestors hit on effervescence upon finding that they could induce altered states of consciousness...Effervescence is generated when humans come together to make music or perform rituals, an experience that lingers when the ceremonies are over. The suggestion, therefore, is that collective experiences that are religious or religious-like unify groups and create the energy to sustain them.
Research to test and develop this idea is underway in a multidisciplinary team led by Robin Dunbar at the University of Oxford. The approach appeals to him, in part, because it seems to capture a crucial aspect of religious phenomena missing in suggestions about punishing gods or dangerous spirits. It is not about the fine details of theology, but is about the raw feelings of experience...this raw-feelings element has a transcendental mystical component – something that is only fully experienced in trance states...this sense of transcendence and other worlds is present at some level in almost all forms of religious experience.
...there’s evidence that monkeys and apes experience the antecedents to ecstasy because they seem to experience wonder...a few hundred thousand years ago, archaic humans took a step that ramped up this capacity. They started deliberately to make music, dance and sing. When the synchronised and collective nature of these practices became sufficiently intense, individuals likely entered trance states in which they experienced not only this-worldly splendour but otherworldly intrigue... What you might call religiosity was born. It stuck partly because it also helped to ease tensions and bond groups, via the endorphin surges produced in trance states. In other words, altered states proved evolutionarily advantageous: the awoken human desire for ecstasy simultaneously prompted a social revolution because it meant that social groups could grow to much larger sizes via the shared intensity of heightened experiences.
The trance hypothesis...rests on the rituals that produce peak experiences, which means it doesn’t require speculating about what ancient people did or didn’t believe about spirits and gods...Asking when religion evolved is not a good question because religion is more than one thing...asking when the various elements such as supernatural agents and moral obligations started to coalesce together is a better question. And they invariably start to coalesce around rituals.
...when villages and then towns appear...new techniques for managing social pressures are required...religious systems (Doctrinal religions) that include specialists such as priests and impressive constructions we’d call temples and/or domestic house-based shrines...sustain the prosocial effects of earlier types of religiosity for groups that are now growing very large indeed...a tension .. arises when religious experiences are institutionalised....what’s on offer is somewhat thinner than experiences gained in the immersive rites that precipitate altered states. Encountering spirit entities directly in a dance or chase is not the same as the uplift offered by a monumental building.
...religions are caught between the Scylla of socially useful but potentially dreary religious rites and the Charybdis of altered states that are intrinsically exciting but socially disruptive. It’s why they bring bloody conflicts as well as social goods. This way of putting it highlights another feature of the trance theory. It interweaves two levels of explanation: one focused on the allure of spiritual vitality; the other on practical needs.
..science cannot decide whether the claims of any one religion are true. But the new theory still makes quite a strong claim, which brings me back to the role of the supernatural, transcendence and religious gods that today’s secularists seem inclined to sideline. If the science cannot confirm convictions about any divine revelations received, it does lend credence to the reasonableness, even necessity, of having them. Where the big gods and false agency hypotheses seemed inherently sniffy about human religiosity, the trance hypothesis positively values it...The trance hypothesis is neutral about the truth claims of religions whether you believe or don’t, though it does suggest that transcendent states of mind are meaningful to human beings and can evolve into religious systems of belief.
And in this final observation there is, perhaps, some good news for us, whether we’re religious or not. It’s often said that many of today’s troubles, from divisive political debates to spats on social media, are due to our tribal nature. It’s added, somewhat fatalistically, that deep within our evolutionary past is the tendency to identify with one group and demonise another. We are destined to be at war, culturally or otherwise. But if the trance theory is true, it shows that the evolutionary tendency to be tribal rests on an evolutionary taste for that which surpasses tribal experience – the transcendence that humans glimpsed in altered states of mind that enabled them to form tribes to start with.
If we long to belong, we also long to be in touch with ‘the more’, as the great pioneer of the study of religious experiences William James called it. That more will be envisaged in numerous ways. But it might help us by prompting new visions that exceed our herd instincts and binary thinking, and ease social tensions. If it helped our ancestors to survive, why would we think we are any different?

Friday, November 22, 2019

Evidence for premature aging caused by insufficient sleep.

I have come to realize in the past year or so that my physical and mental robustness require getting at least seven, and preferably eight, hours of sleep every night. Thus I was intrigued by finding an extensive and well-documented study by Teo et al. (open access) showing that telomeres, sequences of DNA at the ends of chromosomes taken as a marker of biological aging, are, on average, 356 base pairs shorter in study participants who slept for fewer than five hours per night than in those who slept for seven hours. They found that sleep metrics were reported more accurately by wearable fitness trackers than by self-report. Here is the abstract of their article, titled "Digital phenotyping by consumer wearables identifies sleep-associated markers of cardiovascular disease risk and biological aging."
Sleep is associated with various health outcomes. Despite their growing adoption, the potential for consumer wearables to contribute sleep metrics to sleep-related biomedical research remains largely uncharacterized. Here we analyzed sleep tracking data, along with questionnaire responses and multi-modal phenotypic data generated from 482 normal volunteers. First, we compared wearable-derived and self-reported sleep metrics, particularly total sleep time (TST) and sleep efficiency (SE). We then identified demographic, socioeconomic and lifestyle factors associated with wearable-derived TST; they included age, gender, occupation and alcohol consumption. Multi-modal phenotypic data analysis showed that wearable-derived TST and SE were associated with cardiovascular disease risk markers such as body mass index and waist circumference, whereas self-reported measures were not. Using wearable-derived TST, we showed that insufficient sleep was associated with premature telomere attrition. Our study highlights the potential for sleep metrics from consumer wearables to provide novel insights into data generated from population cohort studies.
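The measurement point is worth unpacking: a noisy, biased measure of sleep can fail to correlate with an outcome even when true sleep does. Here is my own synthetic sketch of that attenuation effect (the noise levels are assumptions for illustration, not the study's estimates):

```python
# Illustrative sketch (synthetic data, my own): a wearable tracks "true"
# total sleep time (TST) more closely than biased, noisy self-report, so
# only the wearable measure preserves the correlation with a downstream marker.
import numpy as np

rng = np.random.default_rng(2)
n = 482                                              # matches the cohort size
true_tst = rng.normal(7.0, 1.0, n)                   # hours of actual sleep
wearable_tst = true_tst + rng.normal(0, 0.3, n)      # small device error
self_report = true_tst + rng.normal(0.5, 1.5, n)     # biased and noisy recall
marker = -0.5 * true_tst + rng.normal(0, 1.0, n)     # stand-in risk marker

r_wear = np.corrcoef(wearable_tst, marker)[0, 1]
r_self = np.corrcoef(self_report, marker)[0, 1]
print(abs(r_wear) > abs(r_self))
```

Measurement error dilutes correlations toward zero, which is one plausible reason the wearable-derived metrics, but not self-reported ones, were associated with the cardiovascular risk markers.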

Wednesday, November 20, 2019

A "Department of the Attention Economy"

Popping up on my daily input stream (in this case the Google News aggregator - which knows more than I do about what I might like to see) is a CNN business perspective titled "Andrew Yang: As president, I will establish a Department of the Attention Economy." It is an idea that I wish some of the more likely Democratic nominees would take up.

The article immediately caught my attention, because faced with the immense array of input text and video streams competing for my attention I feel, as I suspect many MindBlog readers do, like one of the dogs in Martin Seligman's classic learned helplessness experiments whose stress and immune systems eventually are compromised by uncertainty. For entertainment should I be subscribing to Netflix, Hulu, Amazon Prime, Disney+, YouTube+, Apple TV+, CBS All Access, AcornTV, Britbox, Shudder, Facebook Watch, Tubi, etc.? For news, there are too many options to even begin to list them. Apart from my own qualms about using Google as a prosthesis (Blogger, Google Docs, Calendar, Mail, etc.), I look at how my 5- and 7-year-old grandsons' lives are potentially compromised by the amount of free time they spend on digital inputs rather than playing outside with friends.

Clips from Yang's article:
...technology is addictive and damaging the mental health of our children. Research shows that too much time spent on social media increases stress, anxiety, depression and feelings of isolation. Other studies have found that extended screen time can negatively affect sleep...As president, I will establish a Department of the Attention Economy that will work with tech companies and implement regulations that curb the negative effects of smartphones and social media.
A few of his suggestions:
We can start by curbing design features that maximize screen time, such as removing autoplay video and capping recommendations for videos, articles and posts for each user each day. Platforms can also use deep-learning algorithms to determine whether a user is a child, and then explore capping the user's screen hours per day.
Design features that encourage social validation should also be removed. Instagram is leading the way by testing hiding likes on the posts of some users. That's a step in the right direction and it should be implemented as soon as possible. In addition, the number of followers a person has on social media should be hidden too, as it represents a false equivalence with a person's social standing.
Another area that deserves attention is the content our kids consume. When I was growing up, television time meant morning cartoons and after-school specials. Rules and standards should be established to protect kids from graphic content and violent imagery. Subsequently, these regulations would also incentivize the production of high-quality content and positive programming.
It shouldn't stop there. Parents have a major role to play — and they want to — but they could use some help. Companies should be required to provide parents with guidance on kid-healthy content (similar to the rating system for TV or movies), and parents should easily be able to monitor content and screen time for children.

Monday, November 18, 2019

Social class is revealed by brief clips of speech.

Kraus et al. - a collective modern version of Professor Henry Higgins in George Bernard Shaw's play Pygmalion - offer a detailed analytic update on how social class is reproduced through subtle cues expressed in brief speech. Here is their abstract:
Economic inequality is at its highest point on record and is linked to poorer health and well-being across countries. The forces that perpetuate inequality continue to be studied, and here we examine how a person’s position within the economic hierarchy, their social class, is accurately perceived and reproduced by mundane patterns embedded in brief speech. Studies 1 through 4 examined the extent that people accurately perceive social class based on brief speech patterns. We find that brief speech spoken out of context is sufficient to allow respondents to discern the social class of speakers at levels above chance accuracy, that adherence to both digital and subjective standards for English is associated with higher perceived and actual social class of speakers, and that pronunciation cues in speech communicate social class over and above speech content. In study 5, we find that people with prior hiring experience use speech patterns in preinterview conversations to judge the fit, competence, starting salary, and signing bonus of prospective job candidates in ways that bias the process in favor of applicants of higher social class. Overall, this research provides evidence for the stratification of common speech and its role in both shaping perceiver judgments and perpetuating inequality during the briefest interactions.
Here is a sample explanatory clip from their results section:
A total of 229 perceivers were asked to listen to the speech of 27 unique speakers whose utterances were collected as part of a larger sample of 189 speakers through the International Dialects of English Archive (IDEA). These 27 speakers varied in terms of age, race, gender, and social class, which we measured in the present study in terms of high school or college degree attainment. Our sample of perceivers listened to 7 words spoken by each of the speakers presented consecutively and randomly without any other accompanying speech and answered “Yes” or “No” to 4 questions: “Is this person a college graduate/woman/young/white?” Participants answered these 4 questions in a randomized order, and we calculated the proportion of correct responses for each question...
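The "above chance accuracy" claim has simple logic behind it: with yes/no judgments, chance is 50%, so the question is whether the observed proportion correct is reliably higher. A sketch with made-up numbers (the 55% accuracy figure is hypothetical, and I am assuming every perceiver judges every speaker; neither is stated in the clip):

```python
# Sketch of the above-chance check (hypothetical numbers, my illustration):
# a one-sided normal-approximation test of observed accuracy against the
# 50% chance level for yes/no judgments.
import math

n_judgments = 229 * 27                    # assume every perceiver judges every speaker
accuracy = 0.55                           # hypothetical observed proportion correct
se = math.sqrt(0.5 * 0.5 / n_judgments)   # standard error under the chance null
z = (accuracy - 0.5) / se
print(z > 1.645)                          # one-sided test at the 5% level
```

With thousands of judgments, even a modest edge over 50% is statistically very unlikely under chance, which is why "above chance accuracy" from seven spoken words is a striking result.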

Friday, November 15, 2019

Explaining the puzzle of human diversity in the Christian world

Fascinating work by Schulz et al. is reviewed by both Gelfand and Zauzmer. Schulz et al. show how the specific practices of Medieval Christianity can in part explain widespread variation in human psychology around the world.

From Zauzmer:
The story begins with kinship networks — the tribes and clans of densely connected, insular groups of relatives who formed most human societies before medieval times. Catholic Church teachings disrupted those networks, in large part by vehemently prohibiting marriage between relatives (which had been de rigueur), and eventually provoked a wholesale transformation of communities, changing the norm from large clans into small, monogamous nuclear families.
The team analyzed Vatican records to document the extent of a country or region’s exposure to Catholicism before the year 1500, and found that longer exposure to Catholicism correlated with low measures of kinship intensity in the modern era, including low rates of cousins marrying each other. Both measures correlated with psychology, the researchers found by looking at 24 different psychological traits of people in different cultures: Countries exposed to Catholicism early have citizens today who exhibit qualities such as being more individualistic and independent, and being more trusting of strangers.
From Gelfand:
...the authors found that both longer exposure to the Western Church and weaker kinship intensity (which were negatively related, as expected) were associated with greater individualism and independence, less conformity and obedience, and greater prosociality toward strangers—relationships that mostly held when controlling for a range of geographic variables. The results were replicated across 440 regions in 36 European countries: Longer exposure to the Western Church was generally associated with the same WEIRD (Western, Educated, Industrialized, Rich and Democratic) psychological shifts, even when controlling for alternate explanations (e.g., the influence of Roman political institutions, schooling, migration).

Wednesday, November 13, 2019

New work on how and why we sleep.

Finding the quality of my sleep to be central to my robustness and well-being makes me want to pass on four pieces of work from recent issues of Science Magazine, all showing housekeeping changes in our brains that happen while we sleep, changes whose disruption by sleep deprivation has debilitating consequences. Fultz et al. show that deep sleep drives brain fluid oscillations that may facilitate communication between fluid compartments and clearance of waste products. Todorova and Zugaro show that spikes during delta waves of sleep (widespread cortical silence) support memory consolidation. Brüning et al. find in the mouse brain that half of the 2000 synaptic phosphoproteins quantified show changes with daily activity-rest cycles. Sleep deprivation abolishes nearly all (98%) of these phosphorylation cycles at synapses. Noya et al. find a sleep-wake cycle in which transcripts and proteins associated with synaptic signaling accumulate before the active phase (dusk for nocturnal mice), whereas messenger RNAs and proteins associated with metabolism and translation accumulate before the resting phase.

Monday, November 11, 2019

Why we can't tell the truth about aging.

I've enjoyed reading the New Yorker essay by Arthur Krystal titled "Why we can't tell the truth about aging," which points to and discusses numerous recent (as well as a few ancient) books on aging. Here is a selection of rearranged small clips from the article.
Average life expectancy was indeed a sorry number for the greater part of history (for Americans born as late as 1900, it wasn’t even fifty), which may be one reason that people didn’t write books about aging: there weren’t enough old folks around to sample them. But now that more people on the planet are over sixty-five than under five, an army of readers stands waiting to learn what old age has in store.
Now that we’re living longer, we have the time to write books about living longer...The library on old age has grown so voluminous that the fifty million Americans over the age of sixty-five could spend the rest of their lives reading such books, even as lusty retirees and power-lifting septuagenarians turn out new ones.
Our senior years are evidently a time to celebrate ourselves and the wonderful things to come: travelling, volunteering, canoodling, acquiring new skills, and so on. No one, it seems, wants to disparage old age...we get cheerful tidings...chatty accounts meant to reassure us that getting old just means that we have to work harder at staying young...authors aren’t blind to the perils of aging; they just prefer to see the upside. All maintain that seniors are more comfortable in their own skins.
There is, of course, a chance that you may be happier at eighty than you were at twenty or forty, but you’re going to feel much worse. I know this because two recent books provide a sobering look at what happens to the human body as the years pile up. Elizabeth Blackburn and Elissa Epel’s “The Telomere Effect: Living Younger, Healthier, Longer” and Sue Armstrong’s “Borrowed Time: The Science of How and Why We Age” describe what is essentially a messy business.
Basically, most cells divide and replicate some fifty-plus times before becoming senescent. Not nearly as inactive as the name suggests, senescent cells contribute to chronic inflammation and interfere with protective collagens...The so-called epigenetic clock shows our DNA getting gummed up, age-related mitochondrial mutations reducing the cells’ ability to generate energy, and our immune system slowly growing less efficient. Bones weaken, eyes strain, hearts flag. Bladders empty too often, bowels not often enough, and toxic proteins build up in the brain to form the plaque and the spaghetti-like tangles that are associated with Alzheimer’s disease. Not surprisingly, sixty-eight per cent of Medicare beneficiaries today have multiple chronic conditions. Not a lot of grace, force, or fascination in that.
In short, the optimistic narrative of pro-aging writers doesn’t line up with the dark story told by the human body. But maybe that’s not the point. “There is only one solution if old age is not to be an absurd parody of our former life,” Simone de Beauvoir wrote in her expansive 1970 study “The Coming of Age,” “and that is to go on pursuing ends that give our existence a meaning—devotion to individuals, to groups, or to causes—social, political, intellectual, or creative work.”
One would, of course, like to approach old age with grace and fortitude, but old age makes it difficult. Those who feel that it’s a welcome respite from the passions, anxieties, and troubles of youth or middle age are either very lucky or toweringly reasonable. Why rail against the inevitable—what good will it do? None at all. Complaining is both pointless and unseemly. Existence itself may be pointless and unseemly.
We should all make peace with aging. And so my hat is off to Dr. Oliver Sacks, who chose to regard old age as “a time of leisure and freedom, freed from the factitious urgencies of earlier days, free to explore whatever I wish, and to bind the thoughts and feelings of a lifetime together.”

Friday, November 08, 2019

Worldwide movement of people into cities is degrading the human microbiome

From the Oct. 25 issue of Science Magazine:
Sonnenburg and Sonnenburg review how the shift of recent generations from rural, outdoor environments to urbanized and industrialized settings has profoundly affected our biology and health. The signals of change are seen most strikingly in the reduction of commensal microbial taxa and loss of their metabolic functions. The extirpation of human commensals is a result of bombardment by new chemicals, foodstuffs, sanitation, and medical practices. For most people, sanitation and readily available food have been beneficial, but have we now reached a tipping point? How do we “conserve” our beneficial symbionts and keep the pathogens at bay?
Here is their abstract:
The human body is an ecosystem that is home to a complex array of microbes known as the microbiome or microbiota. This ecosystem plays an important role in human health, but as a result of recent lifestyle changes occurring around the planet, whole populations are seeing a major shift in their gut microbiota. Measures meant to kill or limit exposure to pathogenic microbes, such as antibiotics and sanitation, combined with other factors such as processed food, have had unintended consequences for the human microbial ecosystem, including changes that may be difficult to reverse. Microbiota alteration and the accompanying loss of certain functional attributes might result in the microbial communities of people living in industrialized societies being suboptimal for human health. As macroecologists, conservationists, and climate scientists race to document, understand, predict, and delay global changes in our wider environment, microbiota scientists may benefit by using analogous approaches to study and protect our intimate microbial ecosystems.

Wednesday, November 06, 2019

How human breeding has changed dogs’ brains

Hecht et al. have identified brain networks in dogs related to behavioral specializations roughly corresponding to sight hunting, scent hunting, guarding, and companionship. Here is their detailed abstract:
Humans have bred different lineages of domestic dogs for different tasks such as hunting, herding, guarding, or companionship. These behavioral differences must be the result of underlying neural differences, but surprisingly, this topic has gone largely unexplored. The current study examined whether and how selective breeding by humans has altered the gross organization of the brain in dogs. We assessed regional volumetric variation in MRI studies of 62 male and female dogs of 33 breeds. Neuroanatomical variation is plainly visible across breeds. This variation is distributed nonrandomly across the brain. A whole-brain, data-driven independent components analysis established that specific regional subnetworks covary significantly with each other. Variation in these networks is not simply the result of variation in total brain size, total body size, or skull shape. Furthermore, the anatomy of these networks correlates significantly with different behavioral specialization(s) such as sight hunting, scent hunting, guarding, and companionship. Importantly, a phylogenetic analysis revealed that most change has occurred in the terminal branches of the dog phylogenetic tree, indicating strong, recent selection in individual breeds. Together, these results establish that brain anatomy varies significantly in dogs, likely due to human-applied selection for behavior.
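The data-driven independent components analysis (ICA) step the abstract mentions can be sketched on synthetic data. Everything below is invented for illustration (the region count, the number of latent "networks," and the noise level are my assumptions, not the study's); it only shows the mechanics of recovering covarying networks from regional volume measures:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(4)
n_dogs, n_regions, n_networks = 62, 40, 3

# Hypothetical regional gray-matter volumes generated from a few latent
# "networks" whose member regions covary across dogs.
mixing = rng.normal(size=(n_networks, n_regions))
sources = rng.laplace(size=(n_dogs, n_networks))     # non-Gaussian sources
volumes = sources @ mixing + 0.1 * rng.normal(size=(n_dogs, n_regions))

# Data-driven ICA: recover per-dog scores on each covarying subnetwork.
ica = FastICA(n_components=n_networks, random_state=0)
recovered = ica.fit_transform(volumes)
print(recovered.shape)  # (62, 3): one score per dog per network
```

With real morphometric data, the recovered per-dog network scores would then be tested against breed-level behavioral specializations, as the authors describe.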

Monday, November 04, 2019

A triple drug combination increases lifespan by 48%

In Drosophila flies, to be sure, but the nutrient-sensing pathways that are the targets of the drugs are common to all animals. Here is the abstract from the open source article by Castillo-Quan et al.:
Increasing life expectancy is causing the prevalence of age-related diseases to rise, and there is an urgent need for new strategies to improve health at older ages. Reduced activity of insulin/insulin-like growth factor signaling (IIS) and mechanistic target of rapamycin (mTOR) nutrient-sensing signaling network can extend lifespan and improve health during aging in diverse organisms. However, the extensive feedback in this network and adverse side effects of inhibition imply that simultaneous targeting of specific effectors in the network may most effectively combat the effects of aging. We show that the mitogen-activated protein kinase kinase (MEK) inhibitor trametinib, the mTOR complex 1 (mTORC1) inhibitor rapamycin, and the glycogen synthase kinase-3 (GSK-3) inhibitor lithium act additively to increase longevity in Drosophila. Remarkably, the triple drug combination increased lifespan by 48%. Furthermore, the combination of lithium with rapamycin cancelled the latter’s effects on lipid metabolism. In conclusion, a polypharmacology approach of combining established, prolongevity drug inhibitors of specific nodes may be the most effective way to target the nutrient-sensing network to improve late-life health.

Friday, November 01, 2019

Skill development - the intelligence vs. practice debate reframed

Vaci et al. note that what is often overlooked in the nature vs. nurture debate is the fact that both factors interact with each other:
The relative importance of different factors in the development of human skills has been extensively discussed. Research on expertise indicates that focused practice may be the sole determinant of skill, while intelligence researchers underline the relative importance of abilities at even the highest level of skill. There is indeed a large body of research that acknowledges the role of both factors in skill development and retention. It is, however, unknown how intelligence and practice come together to enable the acquisition and retention of complex skills across the life span. Instead of focusing on the 2 factors, intelligence and practice, in isolation, here we look at their interplay throughout development. In a longitudinal study that tracked chess players throughout their careers, we show that both intelligence and practice positively affect the acquisition and retention of chess skill. Importantly, the nonlinear interaction between the 2 factors revealed that more intelligent individuals benefited more from practice. With the same amount of practice, they acquired chess skill more quickly than less intelligent players, reached a higher peak performance, and arrested decline in older age. Our research demonstrates the futility of scrutinizing the relative importance of highly intertwined factors in human development.
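The interplay the authors describe, with intelligence amplifying the returns on practice, corresponds to a product term in a regression model. A minimal sketch with invented numbers (the generative coefficients and data below are hypothetical, not from the study) shows how such an interaction is specified and recovered:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
iq = rng.normal(100, 15, n)        # intelligence scores (invented)
practice = rng.uniform(0, 10, n)   # practice, e.g. thousands of hours

# Hypothetical generative model: skill grows with practice, and the
# per-unit benefit of practice is larger for more intelligent players.
skill = (0.5 * (iq - 100) + 3.0 * practice
         + 0.05 * (iq - 100) * practice + rng.normal(0, 5, n))

# Fit skill ~ intelligence + practice + intelligence:practice by OLS.
X = np.column_stack([np.ones(n), iq - 100, practice, (iq - 100) * practice])
beta, *_ = np.linalg.lstsq(X, skill, rcond=None)
print(beta)  # roughly recovers [0, 0.5, 3.0, 0.05]
```

A positive interaction coefficient (the fourth entry) is the statistical signature of "more intelligent individuals benefited more from practice."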

Wednesday, October 30, 2019

The miracle cure - just move

A summary from the editor of the British Medical Journal of work by Ekelund et al.:
As miracle cures are hard to come by, any claims that a treatment is 100% safe and effective must always be viewed with intense scepticism. There is perhaps one exception. Physical activity has been called a miracle cure by no less a body than the Academy of Medical Sciences (http://bit.ly/2lTqDvc); and, like those who avail themselves of it, the supporting science grows stronger by the day. The BMJ recently published a systematic review showing a clear dose-response relation between physical activity and all cause mortality (doi:10.1136/bmj.l4570). The authors concluded that any level of activity is better than none, and more is better still, a message recently encapsulated in the updated guidelines from the UK’s chief medical officers (doi:10.1136/bmj.l5470).
The statement from the authors on their analysis of numerous studies:
We conducted a harmonised meta-analysis to examine the association between accelerometer measured physical activity and sedentary time and all cause mortality. Specifically, we examined the dose-response relations of total physical activity, different intensities of physical activity (light, low light, high light, moderate to vigorous, and vigorous) and sedentary time and all cause mortality.

Monday, October 28, 2019

Feeling grateful - a shortcut to virtuous behavior.

Psychologist David DeSteno, in an Aeon essay, summarizes his experiments suggesting that moving toward a more virtuous life might be accomplished simply by cultivating gratitude, a simpler route than deep deliberation on noble qualities such as honesty and generosity. Subjects were asked to report the results of a coin toss in which heads yielded a larger financial reward than tails. The coins were rigged to always come up tails.
The percentage of cheaters fell by half (from almost 49 per cent to 27 per cent) among those who had just recalled a time when they felt grateful, compared with those who described a time when they felt happy or no particular emotion at all.
They then did a second experiment that...
...had two key differences. First, the coin flip determined whether any given participant would have to complete an enjoyable 10-minute task or a difficult 45-minute one. Second, we led participants to believe that the next person to come would be assigned to complete whichever task remained.
In deciding to cheat by reporting that the virtual coin flip came up heads, people were giving themselves a much shorter and more enjoyable task, but in so doing, were also unfairly dooming another person to a more onerous task.
As one might imagine, the overall frequency of cheating was lower. Nonetheless, gratitude worked in the exact same way. Whereas 17 per cent of people cheated when feeling neutral or happy, only 2 per cent cheated when feeling grateful.
The empirical literature shows a similar influence of gratitude on other virtues. People feeling grateful are more likely to help others who request assistance, to divide their profits in a more egalitarian way, to be loyal even at cost to themselves, to be less materialistic, and even to exercise as opposed to loafing.

Friday, October 25, 2019

Five myths about aging.

I want to pass on a few clips from this piece by William Mair, who researches the biology of aging at Harvard:
MYTH NO. 1 Biological aging can't be slowed. The growing field of geroscience offers hope... that genetic alterations and drugs such as rapamycin can slow the rate at which animals age.
MYTH NO. 2 Live fast, die young. Recent work shows that regular exercise helps slow key signs of aging, boosting immune function and curbing mental decline. If anything, conserving our batteries as isolated couch potatoes ages us faster.
MYTH NO. 3 Antioxidants slow aging. Free radicals...have captured the public imagination as a source of old age with little scientific evidence. There is in fact more negative data than positive examples.
MYTH NO. 4 Fewer calories mean a longer life. In truth, we just don’t know that the benefit of strict diets lies solely in their calorie content. Increasingly, it seems that many of the positive effects of calorie restriction on aging may be unrelated to caloric intake. Hungry animals and people tend to eat faster, and as a result spend more of their day eating nothing. These extended periods of abstinence are enough to slow aging in mice, whether overall calorie intake is reduced or not. The science uncoupling the effects of fasting and calorie restriction on aging is in its infancy.
MYTH NO. 5 Short telomeres explain aging. Aging is not caused by one event, however, as compelling as fraying telomeres may be. Some of our cells do not divide at all, and they age without shortened telomeres. Many animals have telomeres much longer than ours, yet they age faster than we do. Shortening telomeres may even be useful, protecting against unchecked cell division, which is a hallmark of cancer.

Wednesday, October 23, 2019

The Metamorphosis of the Western Soul

I want to point to an article by Will Storr, "The Metamorphosis of the Western Soul," that has been languishing for over a year in my list of references that might become the basis of a MindBlog post. Storr presents a nice distillation of the story of how, between 1965 and 1985, the Western self was transformed. His basic point is that economic forces are the dominant reason for these changes.
We turned from anti-materialistic, stick-it-to-the-Man hippies into greed-is-good yuppies... While the origins of such changes cannot be reduced to a single source, I believe we can point to a dominant one: the economy. In the early 1980s, President Ronald Reagan and the British Prime Minister Margaret Thatcher rewrote the rules by which we had once lived. And that, with stunning rapidity, changed who we were.
Storr proceeds to review the historical story of how citizens of the individualistic West and the collectivist East have developed fundamental cognitive differences - largely adaptations to different physical landscapes - in how they view the world through collectivist versus individualistic filters. But there is plasticity:
Humans are born incomplete. The brain absorbs huge amounts of essential information throughout childhood and adolescence, which it uses to carry on building who we are. It’s as if the brain asks a single, vital question: Who do I have to be, in this place, to thrive? If it was a boastful hustler in ancient Greece and a humble team-player in ancient China, then who is it in the West today?
The answer is a neoliberal...After the economic chaos of the 1970s, it was decided that the United States and Britain had become too collective. Previous decades had seen the introduction of the New Deal, which included the Social Security Act, strict regulations on banking and business, and the rising power of the unions. This collectively tilted economy sired a collectively tilted people...For Mr. Reagan and Mrs. Thatcher, saving ourselves meant rediscovering our individualist roots.
They cut taxes and regulations; they battled unions; they shrunk the welfare state; they privatized assets and weakened the state’s safety nets. They pursued the neoliberal dream of globalization — one free market that covered the earth. As much of human life as possible was to become a competition of self versus self...In 1981, Margaret Thatcher said “Economics are the method: The object is to change the soul.” And that’s precisely what happened.
Before 2008, it felt as if neoliberalism was basically working for most people. But since the crash, millions have come to see the system as broken...We have seen the neoliberal Hillary Clinton falter and the antiglobalist Donald Trump triumph. Britain’s Brexit was secured by antiglobalist arguments....The perception of a broken, rigged economy has left us angry and increasingly tribal, which might explain this recent trend toward “us” over the narcissistic “me.”
If this is correct, it’s yet more evidence that who we are is powerfully influenced by where we are. Humans want to get along and get ahead and will become whoever they need to be in order to do so. In the 21st century, those rules are no longer set by our physical landscape. Today, the deep and enormously powerful controlling force is the economy.

Monday, October 21, 2019

Shape of your heart is determined by whether you run or sit.

Shave et al. show that endurance runners and farmers have larger, elongated left ventricles with thin walls (traits that help pump large volumes of blood over long periods) compared with football linemen, whose training emphasizes short, high-intensity exercise. Linemen, as well as sedentary people, have shorter, wider ventricles with thicker walls and are more prone to hypertensive heart disease. The experiments used ultrasound imaging to examine the hearts of more than 160 adult men from four groups: long-distance runners, sedentary adults, highly trained football linemen, and the Tarahumara, Native American farmers renowned for their running ability.

Abstract
Chimpanzees and gorillas, when not inactive, engage primarily in short bursts of resistance physical activity (RPA), such as climbing and fighting, that creates pressure stress on the cardiovascular system. In contrast, to initially hunt and gather and later to farm, it is thought that preindustrial human survival was dependent on lifelong moderate-intensity endurance physical activity (EPA), which creates a cardiovascular volume stress. Although derived musculoskeletal and thermoregulatory adaptations for EPA in humans have been documented, it is unknown if selection acted similarly on the heart. To test this hypothesis, we compared left ventricular (LV) structure and function across semiwild sanctuary chimpanzees, gorillas, and a sample of humans exposed to markedly different physical activity patterns. We show the human LV possesses derived features that help augment cardiac output (CO) thereby enabling EPA. However, the human LV also demonstrates phenotypic plasticity and, hence, variability, across a wide range of habitual physical activity. We show that the human LV’s propensity to remodel differentially in response to chronic pressure or volume stimuli associated with intense RPA and EPA as well as physical inactivity represents an evolutionary trade-off with potential implications for contemporary cardiovascular health. Specifically, the human LV trades off pressure adaptations for volume capabilities and converges on a chimpanzee-like phenotype in response to physical inactivity or sustained pressure loading. Consequently, the derived LV and lifelong low blood pressure (BP) appear to be partly sustained by regular moderate-intensity EPA whose decline in postindustrial societies likely contributes to the modern epidemic of hypertensive heart disease.

Friday, October 18, 2019

The default mode network represents esthetic appeal.

Vessel et al. note another role for the default mode network of our brain:

Significance
Despite being highly subjective, aesthetic experiences are powerful moments of interaction with one’s surroundings, shaping behavior, mood, beliefs, and even a sense of self. The default-mode network (DMN), which sits atop the cortical hierarchy and has been implicated in self-referential processing, is typically suppressed when a person engages with the external environment. Yet not only is the DMN surprisingly engaged when one finds a visual artwork aesthetically moving, here we present evidence that the DMN also represents aesthetic appeal in a manner that generalizes across visual aesthetic domains, such as artworks, landscapes, or architecture. This stands in contrast to ventral occipitotemporal cortex (VOT), which represents the content of what we see, but does not contain domain-general information about aesthetic appeal.
Abstract
Visual aesthetic evaluations, which impact decision-making and well-being, recruit the ventral visual pathway, subcortical reward circuitry, and parts of the medial prefrontal cortex overlapping with the default-mode network (DMN). However, it is unknown whether these networks represent aesthetic appeal in a domain-general fashion, independent of domain-specific representations of stimulus content (artworks versus architecture or natural landscapes). Using a classification approach, we tested whether the DMN or ventral occipitotemporal cortex (VOT) contains a domain-general representation of aesthetic appeal. Classifiers were trained on multivoxel functional MRI response patterns collected while observers made aesthetic judgments about images from one aesthetic domain. Classifier performance (high vs. low aesthetic appeal) was then tested on response patterns from held-out trials from the same domain to derive a measure of domain-specific coding, or from a different domain to derive a measure of domain-general coding. Activity patterns in category-selective VOT contained a degree of domain-specific information about aesthetic appeal, but did not generalize across domains. Activity patterns from the DMN, however, were predictive of aesthetic appeal across domains. Importantly, the ability to predict aesthetic appeal varied systematically; predictions were better for observers who gave more extreme ratings to images subsequently labeled as “high” or “low.” These findings support a model of aesthetic appreciation whereby domain-specific representations of the content of visual experiences in VOT feed in to a “core” domain-general representation of visual aesthetic appeal in the DMN. Whole-brain “searchlight” analyses identified additional prefrontal regions containing information relevant for appreciation of cultural artifacts (artwork and architecture) but not landscapes.
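The cross-domain decoding logic at the heart of this study can be sketched with synthetic data: train a classifier on "high vs. low appeal" response patterns from one domain, then test it on patterns from another. All dimensions, noise levels, and the shared "appeal axis" below are invented for illustration, not the study's data or methods:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_vox, n_trials = 200, 120

# Hypothetical voxel pattern coding appeal, shared across domains.
appeal_axis = rng.normal(size=n_vox)

def simulate_domain(seed):
    """Fabricated multivoxel patterns: shared appeal code + noise."""
    r = np.random.default_rng(seed)
    labels = np.repeat([0, 1], n_trials // 2)        # low vs. high appeal
    signal = np.outer(2 * labels - 1, appeal_axis)   # domain-general code
    return signal + r.normal(scale=4, size=(n_trials, n_vox)), labels

X_art, y_art = simulate_domain(10)     # e.g. artworks
X_land, y_land = simulate_domain(20)   # e.g. landscapes

# Train on one domain, test on the other: above-chance transfer is the
# signature of a domain-general representation of appeal.
clf = LogisticRegression(max_iter=1000).fit(X_art, y_art)
print("cross-domain accuracy:", clf.score(X_land, y_land))
```

In the study's terms, a region like VOT would transfer poorly (domain-specific), while the DMN would transfer well (domain-general).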

Wednesday, October 16, 2019

Cross-national negativity bias in reacting to news

There seems to be a worldwide anxiety industry of media outlets that find maximum profit in presenting mostly negative news - much as drug companies have reaped great profits by flooding distressed areas with opioids. An interesting study in this area comes from Soroka et al., who provide more information on our human tendency to react more strongly to negative than to positive information. (See also my post on Pinker's "Enlightenment Now" book, which engages this topic):
What accounts for the prevalence of negative news content? One answer may lie in the tendency for humans to react more strongly to negative than positive information. “Negativity biases” in human cognition and behavior are well documented, but existing research is based on small Anglo-American samples and stimuli that are only tangentially related to our political world. This work accordingly reports results from a 17-country, 6-continent experimental study examining psychophysiological reactions to real video news content. Results offer the most comprehensive cross-national demonstration of negativity biases to date, but they also serve to highlight considerable individual-level variation in responsiveness to news content. Insofar as our results make clear the pervasiveness of negativity biases on average, they help account for the tendency for audience-seeking news around the world to be predominantly negative. Insofar as our results highlight individual-level variation, however, they highlight the potential for more positive content, and suggest that there may be reason to reconsider the conventional journalistic wisdom that “if it bleeds, it leads.”

Monday, October 14, 2019

An update on the science of ‘free will’

I want to point to an excellent article by Gholipour in the Atlantic that describes a reinterpretation of experiments by Libet that had been taken to suggest that our brains 'decide' to initiate a movement before our subjective awareness of intending to initiate that movement. A 'readiness potential' is observed about 500 msec before an action occurs, while a subject reports initiating that action about 150 msec before it occurs. Gholipour points to work of Schurger and colleagues suggesting that the readiness potential is not the mark of a brain's brewing intention, but something much more circumstantial.
...Schurger and his colleagues ... proposed an explanation. Neuroscientists know that for people to make any type of decision, our neurons need to gather evidence for each option. The decision is reached when one group of neurons accumulates evidence past a certain threshold. Sometimes, this evidence comes from sensory information from the outside world: If you’re watching snow fall, your brain will weigh the number of falling snowflakes against the few caught in the wind, and quickly settle on the fact that the snow is moving downward.
Libet’s experiment, Schurger pointed out, provided its subjects with no such external cues. To decide when to tap their fingers, the participants simply acted whenever the moment struck them. Those spontaneous moments, Schurger reasoned, must have coincided with the haphazard ebb and flow of the participants’ brain activity. They would have been more likely to tap their fingers when their motor system happened to be closer to a threshold for movement initiation.
This would not imply, as Libet had thought, that people’s brains “decide” to move their fingers before they know it. Hardly. Rather, it would mean that the noisy activity in people’s brains sometimes happens to tip the scale if there’s nothing else to base a choice on, saving us from endless indecision when faced with an arbitrary task. The readiness potential would be the rising part of the brain fluctuations that tend to coincide with the decisions. This is a highly specific situation, not a general case for all, or even many, choices.
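Schurger's proposal is essentially a leaky stochastic accumulator: weak constant input plus noise drifts toward a threshold, and averaging brain activity time-locked to the threshold crossing produces a slow buildup that looks like a readiness potential, even though nothing "decided" anything early on. A minimal simulation of that idea (all parameter values below are my invented choices, not Schurger's fitted ones):

```python
import numpy as np

rng = np.random.default_rng(2)
dt, leak, drift, sigma, thresh = 0.002, 0.5, 0.1, 0.1, 0.3

def run_trial(max_t=20.0):
    """One trial of a leaky stochastic accumulator; returns the activity
    trace up to the threshold crossing, or None if it never crosses."""
    n = int(max_t / dt)
    noise = sigma * np.sqrt(dt) * rng.normal(size=n)
    x, trace = 0.0, np.empty(n)
    for i in range(n):
        x += (drift - leak * x) * dt + noise[i]   # leak toward baseline
        trace[i] = x
        if x >= thresh:                           # "movement" initiated
            return trace[: i + 1]
    return None

# Average activity time-locked to the crossing: a slow, RP-like buildup
# emerges purely from selecting moments when noise tipped the scale.
window = int(2.0 / dt)                            # the 2 s before crossing
runs = [run_trial() for _ in range(200)]
traces = [r[-window:] for r in runs if r is not None and len(r) >= window]
avg = np.mean(traces, axis=0)
print(len(traces), avg[0], avg[-1])               # buildup toward threshold
```

The averaged trace rises toward the threshold only because trials are aligned to the crossing, which is the circumstantial reading of the readiness potential.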
The name Schurger rang a bell with me, and so I did a MindBlog search, only to discover that I had reported Schurger's work in a 2016 post "A 50 year misunderstanding of how we decide to initiate action - our intuition is valid". I then proceeded to completely forget about it when I was preparing a subsequent 2019 lecture mentioning Libet's work. The conventional dogma that we are 'late to action' was apparently burned into my brain - most embarrassing. (I've now inserted the new perspective into four of my web lectures, dating as far back as 2012). The real clincher is...
In a new study under review for publication in the Proceedings of the National Academy of Sciences, Schurger and two Princeton researchers repeated a version of Libet’s experiment. To avoid unintentionally cherry-picking brain noise, they included a control condition in which people didn’t move at all. An artificial-intelligence classifier allowed them to find at what point brain activity in the two conditions diverged. If Libet was right, that should have happened at 500 milliseconds before the movement. But the algorithm couldn’t tell any difference until about only 150 milliseconds before the movement, the time people reported making decisions in Libet’s original experiment.
In other words, people’s subjective experience of a decision—what Libet’s study seemed to suggest was just an illusion—appeared to match the actual moment their brains showed them making a decision.
Gholipour points out that this does not resolve the question of free will; it only deepens it. The question is the subject of an intensive collaboration between neuroscientists and philosophers, backed by $7 million from two private foundations, the John Templeton Foundation and the Fetzer Institute.

Friday, October 11, 2019

More critique of work linking loving kindness meditation and cellular aging

In my Sept. 23 post, "Loving kindness meditation slows cellular aging?", I gave a few of the reasons that the article triggered my bullshit detector. An email from Harris Friedman, Univ. of Florida, has pointed me to a more extended critique in a letter to the editor of Psychoneuroendocrinology, which I pass on in slightly truncated form:
Extraordinary claims require compelling evidence: Concerns about “loving-kindness meditation slows biological aging in novices”
The recent paper by Le Nguyen et al. (2019) makes the extraordinary claim that loving-kindness meditation (LKM) slows biological aging. Unfortunately, its headline-grabbing title lacks compelling evidence. This paper shows telomere length (TL) decreased considerably in a control group over a very short time period, as compared to a LKM group, while a mindfulness meditation (MM) group was somewhere in between. From this difference, the paper argues that LKM slows biological aging, which is quite a logical leap. Clearly LKM had nothing to do with the extent of TL shrinkage in the control group, and why the control group’s TL decreased so much is ignored in the paper. More generally, there are many problems with using TL as a proxy for biological aging. Even if this paper’s basic logic is accepted, there are many problems with how the paper’s data are handled. The most important problem is the absence of any analyses that provide a direct and straightforward examination of pre-post TL as a function of experimental condition. Consequently, we ran a 2×3 mixed factorial ANOVA using TL measured across two times (pre and post) compared across three conditions (control, MM, and LKM) with the paper’s data. Although a significant repeated measures main effect was found, the interaction with experimental condition was non-significant. One-way ANOVAs examining pre-, post-, and change/difference TL variables as a function of condition also produced non-significant results. Looking at the data using other statistical approaches, however, did show some pattern of mixed results that trend, although most are non-significant, in the direction of the LKM group having less TL shortening compared to the other groups. Regardless of analysis, effect sizes were consistently meager.
In addition, there are several serious confounds compromising any valid comparison among the groups. For example, the data show that six in the control group engaged in some meditation, and one even reported meditating 16 days during the study’s short time-span. This hardly constitutes an adequate control group for a meditation study. Also, the LKM group spent considerably more time meditating than the MM group, so these did not differ only in meditation type. We are unable to address many other problems with this paper due to this journal’s length restrictions in a letter to the editor. We simply conclude that this paper’s extraordinary claim does not have the compelling evidence to back it up, and we urge not making extraordinary claims without such evidence. 
Harris L. Friedman - University of Florida, United States
Douglas A. MacDonald - University of Detroit Mercy, United States
Nicholas J.L. Brown - University of Groningen, the Netherlands
James C. Coyne - University of Pennsylvania, United States
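For the quantitatively inclined: with only two time points, the time-by-condition interaction of the 2×3 mixed ANOVA the letter describes reduces to a one-way ANOVA on the pre-to-post change scores. The sketch below illustrates that equivalence on synthetic data (group sizes and effect sizes are stand-ins, not the paper's):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic pre/post telomere lengths (T/S ratio) for three groups of 35
# (an assumed per-group size). The change score per participant carries
# all the interaction information in a two-time-point design.
n = 35
change_scores = {}
for name, drift in [("control", -0.05), ("MM", -0.03), ("LKM", 0.0)]:
    pre = rng.normal(1.0, 0.15, n)
    post = pre + drift + rng.normal(0.0, 0.10, n)
    change_scores[name] = post - pre

# One-way ANOVA on change scores across the three conditions.
f_stat, p_value = stats.f_oneway(*change_scores.values())
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```

With the paper's real data, the letter writers report this interaction test coming out non-significant.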

Wednesday, October 09, 2019

Psilocybin + Mindfulness Meditation change brain connectivity - lasting positive effects

Dolan points to work by Smigielski et al.:
Both psychedelics and meditation exert profound modulatory effects on consciousness, perception and cognition, but their combined, possibly synergistic effects on neurobiology are unknown. Accordingly, we conducted a randomized, double-blind, placebo-controlled study with 38 participants following a single administration of the psychedelic psilocybin (315 μg/kg p.o.) during a 5-day mindfulness retreat. Brain dynamics were quantified directly pre- and post-intervention by functional magnetic resonance imaging during the resting state and two meditation forms. The analysis of functional connectivity identified psilocybin-related and mental state–dependent alterations in self-referential processing regions of the default mode network (DMN). Notably, decoupling of medial prefrontal and posterior cingulate cortices, which is thought to mediate sense of self, was associated with the subjective ego dissolution effect during the psilocybin-assisted mindfulness session. The extent of ego dissolution and brain connectivity predicted positive changes in psycho-social functioning of participants 4 months later. Psilocybin, combined with meditation, facilitated neurodynamic modulations in self-referential networks, subserving the process of meditation by acting along the anterior–posterior DMN connection. The study highlights the link between altered self-experience and subsequent behavioral changes. Understanding how interventions facilitate transformative experiences may open novel therapeutic perspectives. Insights into the biology of discrete mental states foster our understanding of non-ordinary forms of human self-consciousness and their concomitant brain substrate.

Monday, October 07, 2019

Mindfulness doesn't reduce impulsive behavior.

Korponay, Davidson and colleagues present the results of a study that ran contrary to their expectation that the practice of mindfulness meditation would correlate with a reduction in impulsive behaviors (like having that second dish of ice cream). What they found is that neither short-term nor long-term meditation appears to be effective for reducing impulsivity that derives not from attentional difficulties but from motor control and planning capacities. Here is their detailed abstract:
Interest has grown in using mindfulness meditation to treat conditions featuring excessive impulsivity. However, while prior studies find that mindfulness practice can improve attention, it remains unclear whether it improves other cognitive faculties whose deficiency can contribute to impulsivity. Here, an eight-week mindfulness intervention did not reduce impulsivity on the go/no-go task or Barratt Impulsiveness Scale (BIS-11), nor produce changes in neural correlates of impulsivity (i.e. frontostriatal gray matter, functional connectivity, and dopamine levels) compared to active or wait-list control groups. Separately, long-term meditators (LTMs) did not perform differently than meditation-naïve participants (MNPs) on the go/no-go task. However, LTMs self-reported lower attentional impulsivity, but higher motor and non-planning impulsivity on the BIS-11 than MNPs. LTMs had less striatal gray matter, greater cortico-striatal-thalamic functional connectivity, and lower spontaneous eye-blink rate (a physiological dopamine indicator) than MNPs. LTM total lifetime practice hours (TLPH) did not significantly relate to impulsivity or neurobiological metrics. Findings suggest that neither short nor long-term mindfulness practice may be effective for redressing impulsive behavior derived from inhibitory motor control or planning capacity deficits in healthy adults. Given the absence of TLPH relationships to impulsivity or neurobiological metrics, differences between LTMs and MNPs may be attributable to pre-existing differences.

Friday, October 04, 2019

Gender and race stereotypes - what's in a name...

Eaton et al. give us more data on the application of gender and race stereotypes in academia:
The current study examines how intersecting stereotypes about gender and race influence faculty perceptions of post-doctoral candidates in STEM fields in the United States. Using a fully-crossed, between-subjects experimental design, biology and physics professors (n = 251) from eight large, public, U.S. research universities were asked to read one of eight identical curriculum vitae (CVs) depicting a hypothetical doctoral graduate applying for a post-doctoral position in their field, and rate them for competence, hireability, and likeability. The candidate’s name on the CV was used to manipulate race (Asian, Black, Latinx, and White) and gender (female or male), with all other aspects of the CV held constant across conditions. Faculty in physics exhibited a gender bias favoring the male candidates as more competent and more hirable than the otherwise identical female candidates. Further, physics faculty rated Asian and White candidates as more competent and hirable than Black and Latinx candidates, while those in biology rated Asian candidates as more competent and hirable than Black candidates, and as more hireable than Latinx candidates. An interaction between candidate gender and race emerged for those in physics, whereby Black women and Latinx women and men candidates were rated the lowest in hireability compared to all others. Women were rated more likeable than men candidates across departments. Our results highlight how understanding the underrepresentation of women and racial minorities in STEM requires examining both racial and gender biases as well as how they intersect.

Wednesday, October 02, 2019

Cognitive and Affective Neuroscience

I want to point the subset of MindBlog readers interested in brain-behavior correlations to the journal “Social Cognitive and Affective Neuroscience,” particularly to the June 2019 issue. Most of the articles in this issue are open access, and a scan of the article abstracts gives a good sense of the kind of work being done in this field to identify correlations between behaviors and brain activities in different brain areas. Such work gives us an idea of where to focus our attention when extreme behaviors are at issue, such as (in the first article) being willing to fight and die for a cause. Another interesting article suggests that the medial prefrontal cortex may play a role in maintaining a positively biased self-concept.

Monday, September 30, 2019

People of color - getting quantitative about police-involved deaths

From Edwards et al.:

Significance
Police violence is a leading cause of death for young men in the United States. Over the life course, about 1 in every 1,000 black men can expect to be killed by police. Risk of being killed by police peaks between the ages of 20 y and 35 y for men and women and for all racial and ethnic groups. Black women and men and American Indian and Alaska Native women and men are significantly more likely than white women and men to be killed by police. Latino men are also more likely to be killed by police than are white men.
Abstract
We use data on police-involved deaths to estimate how the risk of being killed by police use of force in the United States varies across social groups. We estimate the lifetime and age-specific risks of being killed by police by race and sex. We also provide estimates of the proportion of all deaths accounted for by police use of force. We find that African American men and women, American Indian/Alaska Native men and women, and Latino men face higher lifetime risk of being killed by police than do their white peers. We find that Latina women and Asian/Pacific Islander men and women face lower risk of being killed by police than do their white peers. Risk is highest for black men, who (at current levels of risk) face about a 1 in 1,000 chance of being killed by police over the life course. The average lifetime odds of being killed by police are about 1 in 2,000 for men and about 1 in 33,000 for women. Risk peaks between the ages of 20 y and 35 y for all groups. For young men of color, police use of force is among the leading causes of death.
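The lifetime-risk figures above come from a standard life-table calculation: the chance of ever experiencing an event is one minus the product, over all ages, of the chances of escaping it at each age. A minimal sketch, with synthetic age-specific rates chosen only to peak between ages 20 and 35 as the paper reports (they are not the paper's estimates, and competing mortality is ignored):

```python
import numpy as np

# Annual age-specific risk of the event, ages 0-99 (synthetic placeholders).
ages = np.arange(0, 100)
rate = np.full(ages.shape, 2e-6)           # baseline annual risk
rate[(ages >= 20) & (ages <= 35)] = 2e-5   # elevated early-adult risk

# Lifetime risk = 1 - P(escaping the event at every age).
lifetime_risk = 1.0 - np.prod(1.0 - rate)
print(f"lifetime risk ≈ 1 in {1 / lifetime_risk:,.0f}")
```

Because the annual rates are tiny, the lifetime risk is approximately just the sum of the age-specific rates; the paper's "1 in 1,000 black men" figure is this kind of cumulative quantity, not an annual one.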

Friday, September 27, 2019

Red and Blue Voters Live in Different Economies

Wow... if you want some graphics that powerfully describe our current political malaise, check out Edsall's Op-Ed piece in the NYTimes. Here is a graph showing how economic output and income for Democratic and Republican House districts diverged over the last decade - two different economies that are diverging rapidly:

Another Edsall piece notes that insecurity caused by rapid cultural change, as well as economic insecurity, has generated a class of voters who feel powerless and angry, and who actually feel a need for the chaos Trump generates, hoping it might disrupt the system that disadvantages them.

Wednesday, September 25, 2019

Mindfulness training and attentional control of mind-wandering

Several studies have suggested that mindfulness training can strengthen the connections between the dorsolateral prefrontal cortex [DLPFC] attentional executive control network and the posterior cingulate cortex [PCC] default mode network that is active in mind-wandering. Davidson and his colleagues have expanded on this to see whether a Mindfulness-Based Stress Reduction (MBSR) course can increase DLPFC-PCC connectivity:
Mindfulness meditation training has been shown to increase resting state functional connectivity between nodes of the frontoparietal executive control network (dorsolateral prefrontal cortex [DLPFC]) and the default mode network (posterior cingulate cortex [PCC]). We investigated whether these effects generalized to a Mindfulness-Based Stress Reduction (MBSR) course, and tested for structural and behaviorally relevant consequences of change in connectivity. Healthy, meditation-naïve adults were randomized to either MBSR (N=48), an active (N=47) or waitlist (N=45) control group. Participants completed behavioral testing, resting state fMRI scans, and diffusion tensor scans at pre-randomization (T1), post-intervention (T2) and approximately 5.5 months later (T3). We found increased T2-T1 PCC–DLPFC resting connectivity for MBSR relative to control groups. Although these effects did not persist through long-term follow-up (T3-T1), MBSR participants showed a significantly stronger relationship between days of practice (T1 to T3) and increased PCC–DLPFC resting connectivity than participants in the active control group. Increased PCC–DLPFC resting connectivity in MBSR participants was associated with increased microstructural connectivity of a white matter tract connecting these regions, and increased self-reported attention. These data show that MBSR increases PCC–DLPFC resting connectivity, which is related to increased practice time, attention, and structural connectivity.

Monday, September 23, 2019

Loving kindness meditation slows cellular aging?

I like to be generous, loving, kind and nice, but a recent PsyPost article on the slowing of cellular aging by loving-kindness meditation triggered my bullshit detector. I first pass on the summary and abstract of the work referenced in the article by Le Nguyen et al., and then get a bit into the details available only to those with journal access (motivated readers can request a PDF of the article from me). It is a noble and complicated effort, with proper double-blind controls, but its statistics are compromised by a small sample size, and the measured changes in DNA telomere length (a proxy for aging) over a short period are much larger than would be expected from longer-term studies. While I would like to believe the conclusions drawn by the authors, I have to remain wary of accepting them.

Highlights
•Over 12 weeks, loving-kindness meditation buffered telomere attrition.
•Telomere length decreased in the mindfulness group and the control group.
•The loving-kindness group showed less telomere attrition than the control group.
Abstract
Combinations of multiple meditation practices have been shown to reduce the attrition of telomeres, the protective caps of chromosomes (Carlson et al., Cancer, 121 (2015), pp. 476-484). Here, we probed the distinct effects on telomere length (TL) of mindfulness meditation (MM) and loving-kindness meditation (LKM). Midlife adults (N = 142) were randomized to be in a waitlist control condition or to learn either MM or LKM in a 6-week workshop. Telomere length was assessed 2 weeks before the start of the workshops and 3 weeks after their termination. After controlling for appropriate demographic covariates and baseline TL, we found TL decreased significantly in the MM group and the control group, but not in the LKM group. There was also significantly less TL attrition in the LKM group than the control group. The MM group showed changes in TL that were intermediate between the LKM and control groups yet not significantly different from either. Self-reported emotions and practice intensity (duration and frequency) did not mediate these observed group differences. This study is the first to disentangle the effects of LKM and MM on TL and suggests that LKM may buffer telomere attrition.


Fig. 1. Descriptive Mean and Standard Errors of Changes in TL per Experimental Conditions. (TL is expressed as the ratio (T/S) of telomeric (T) to single copy (S) gene product for a particular blood sample)
Apart from picking apart their statistics and the "controls for demographic covariates," the discussion paragraph that troubles me most is:
The difference in telomere length between the Control and LKM groups was .048 T/S ratio after demographic covariates were considered. Based on comparison of T/S ratios and telomere length measured by Southern blot in a series of quality control samples from the same lab, we estimated this difference of .048 T/S ratio to be 115 basepairs. This TL decrease over the 12-week period appears to be large compared to studies of TL change with much longer time periods (Müezzinler et al., Obes. Rev., 15 (2014), pp. 192-201). It is possible that this reflects short term dynamic change, or potential systematic differences in the collection and/or assay of baseline and follow-up samples. The fact that DNA extraction and assay were done as one batch (all samples from Time 1 and Time 2) argues against the latter concern, although we cannot completely rule out other potential unaccounted for systematic differences. Although it is unknown whether this effect remains longer than 3 weeks post-intervention, the current study demonstrated proof of concept for the malleability of TL changes, and that certain forms of meditation, in this case loving-kindness meditation, may buffer against telomere erosion.
After several paragraphs of other reservations and cautions on the data, the authors state:
...one should interpret the differences in TL changes here with caution, treating them as evidence for “apparent” rather than true alterations in TL.
On balance, the PsyPost article title, "Study provides evidence that loving-kindness meditation slows cellular aging," is wrong.
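The basepair conversion the authors quote is simple arithmetic once the lab's calibration is known; in the sketch below the calibration factor is back-derived from the two numbers in the quoted passage rather than taken from the lab itself:

```python
# The paper reports a Control-vs-LKM difference of .048 T/S ratio,
# estimated at 115 basepairs; the implied calibration is back-derived
# from those two figures (it is lab-specific, not universal).
ts_difference = 0.048
bp_per_ts_unit = 115 / 0.048                     # ~2396 bp per unit of T/S ratio

bp_difference = ts_difference * bp_per_ts_unit   # 115 bp over ~12 weeks
annualized = bp_difference * (52 / 12)           # ~498 bp/year if sustained
print(f"{bp_difference:.0f} bp over 12 weeks, ~{annualized:.0f} bp/yr annualized")
```

Set against the tens of basepairs per year of attrition typical of the longer-term studies the authors cite, an annualized rate of roughly 500 bp is exactly the implausibly large short-term change flagged above.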

Friday, September 20, 2019

Gender neutral pronouns reduce bias in favor of traditional gender roles

An interesting study by Tavits and Pérez examines the 2015 incorporation of the gender-neutral pronoun hen into the Swedish Academy Glossary, which sets norms for Sweden's language. A majority of Swedes now use hen alongside the explicitly gendered hon (she) and han (he) as part of their grammatical toolkit.

Significance
Evidence from 3 survey experiments traces the effects of gender-neutral pronoun use on mass judgments of gender equality and tolerance toward lesbian, gay, bisexual, and transgender (LGBT) communities. The results establish that individual use of gender-neutral pronouns reduces the mental salience of males. This shift is associated with people expressing less bias in favor of traditional gender roles and categories, as manifested in more positive attitudes toward women and LGBT individuals in public affairs.
Abstract
To improve gender equality and tolerance toward lesbian, gay, bisexual, and transgender (LGBT) communities, several nations have promoted the use of gender-neutral pronouns and words. Do these linguistic devices actually reduce biases that favor men over women, gays, lesbians, and transgender individuals? The current article explores this question with 3 large-scale experiments in Sweden, which formally incorporated a gender-neutral pronoun into its language alongside established gendered pronouns equivalent to he and she. The evidence shows that compared with masculine pronouns, use of gender-neutral pronouns decreases the mental salience of males. This shift is associated with individuals expressing less bias in favor of traditional gender roles and categories, as reflected in more favorable attitudes toward women and LGBT individuals in public life. Additional analyses reveal similar patterns for feminine pronouns. The influence of both pronouns is more automatic than controlled.

Wednesday, September 18, 2019

The power of "cute"

I want to point MindBlog readers to an article by Simon May at aeon.co that encapsulates the contents of his new book "The Power of Cute" (2019), and to a Trends in Cognitive Sciences review article by Kringelbach et al. that summarizes work on the brain activities underlying survival-related cuteness responses. The latter article's introduction notes that the prevailing view of cuteness...
...came from the founding fathers of ethology, Nobel prizewinners Konrad Lorenz and Niko Tinbergen. They proposed that the cute facial features of infants form a ‘Kindchenschema’ (infant schema), a prime example of an ‘innate releasing mechanism’ that unlocks instinctual behaviours...These characteristics contribute to ‘cuteness’ and propel our caregiving behaviours, which is vital because infants need our constant attention to survive and thrive. Infants attract us through all our senses, which helps make cuteness one of the most basic and powerful forces shaping our behaviour.
May considers the increasing popularity of child-like figures in popular culture and asks:
In such uncertain and uneasy times, and with so much injustice, hate and intolerance threatening the world, don’t we have more serious things to focus on than the escapades of that feline girl-figure Hello Kitty? Or Pokémon, the video-game franchise that’s hot again in 2019...The craze for all things cute is motivated, most obviously, by the urge to escape from precisely such a threatening world into a garden of innocence in which childlike qualities arouse deliciously protective feelings, and bestow contentment and solace. Cute cues include behaviours that appear helpless, harmless, charming and yielding, and anatomical features such as outsize heads, protruding foreheads, saucer-like eyes, retreating chins and clumsy gaits.
May suggests that the increasing popularity of cuteness derives not only from the 'sweet' end of the whole spectrum of cuteness but also from movement towards its 'uncanny' and ambiguous end, a....
...faintly menacing subversion of boundaries – between the fragile and the resilient, the reassuring and the unsettling, the innocent and the knowing – when presented in cute’s frivolous, teasing idiom, is central to its immense popularity... ‘unpindownability’, as we might call it, that pervades cute – the erosion of borders between what used to be seen as distinct or discontinuous realms, such as childhood and adulthood – is also reflected in the blurred gender of many cute objects such as Balloon Dog or a lot of Pokémon. It is reflected, too, in their frequent blending of human and nonhuman forms, as in the cat-girl Hello Kitty. And in their often undefinable age...In such ways, cute is attuned to an era that is no longer so wedded to such hallowed dichotomies as masculine and feminine, sexual and nonsexual, adult and child, being and becoming, transient and eternal, body and soul, absolute and contingent, and even good and bad.
Although attraction to such cute objects as the mouthless, fingerless Hello Kitty can express a desire for power, cuteness can also parody and subvert power by playing with the viewer’s sense of her own power, now painting her into a dominant pose, now sowing uncertainty about who is really in charge...

Monday, September 16, 2019

Psychological adaptation to the apocalypse - meditate, or just be happy?

In this post, not exactly an upper, I point first to two in-your-face articles on how we ought to be afraid, very afraid, about humanity's future technological and ecological environment, and then note two pieces of writing on psychological adaptations that might dampen down the full turn on of our brains' fear machinery.

Novelist Jonathan Franzen delivers a screed very effective at scaring the bejesus out of us. His basic argument: “The climate apocalypse is coming. To prepare for it, we need to admit that we can’t prevent it.” A chorus of criticism has greeted Franzen's article: "Franzen is wrong on the science, on the politics, and on the psychology of human behavior as it pertains to climate change." (See also Chrobak.)

And, for alarm about our looming digital environment, the 6,000-word essay by Glenn S. Gerstell, general counsel of the National Security Agency, summarized by Warzel, should do the job. The first nation to crack quantum computing (China or the US) will rule the world!

So, how do we manage to wake up cheerful in the morning? Futurist Yuval Harari offers his approach in Chapter 21 of his book "21 Lessons for the 21st century," by describing his experience of learning to meditate, starting with the initial instructions (to observe your process of breathing) in his first Vipassana meditation course. He now meditates two hours every day.
The point is that meditation is a tool for observing the mind directly...For at least two hours a day I actually observe reality as it is, while for the other twenty-two hours I get overwhelmed by emails and tweets and cute-puppy videos. Without the focus and clarity provided by this practice, I could not have written Sapiens or Homo Deus.
A glimmer of hopefulness can also be obtained by reading books in the vein of Pinker's "Enlightenment Now", which documents again and again, for many areas, how dire predictions about the future have not come to pass. The injunction here would be to be optimistic, not a bad idea, given the recent PNAS article by Lee et al. documenting that the lifespan of optimistic people, on average, is 11 to 15% longer.

Friday, September 13, 2019

Twitter is making us dumber.

Stanley-Becker points to research providing hardly surprising evidence that communicating about complex issues in 280-character chunks of text dumbs down the understanding of Twitter users. Using Twitter to teach literature had an overall negative effect on students’ average achievement, with the effect strongest on students who usually perform better. Numerous schools have started to use Twitter discussion among students on the assumption that it would enhance intellectual attainment, when in fact it undermines it.

Wednesday, September 11, 2019

Can we reverse our biological age? The usual media hysteria...

I must have seen at least 10 of my media inputs hyping a small study by Fahy et al. (9 white men, no controls), pointed to by Abbott, suggesting that the body's epigenetic clock might be reversed. The study actually had the goal of seeing whether human growth hormone could stimulate regeneration of the thymus gland and enhance immune function. Because the hormone can promote diabetes, the trial included two widely used anti-diabetic drugs, dehydroepiandrosterone (DHEA) and metformin, in the treatment cocktail. (Metformin is being evaluated as an anti-aging drug in several large-scale studies.)
Checking the effect of the drugs on the participants’ epigenetic clocks was an afterthought. The clinical study had finished when Fahy approached Horvath to conduct an analysis. (Epigenetic clocks are constructed by selecting sets of DNA-methylation sites across the genome. In the past few years, Horvath — a pioneer in epigenetic-clock research — has developed some of the most accurate ones)...Horvath used four different epigenetic clocks to assess each patient’s biological age, and he found significant reversal for each trial participant in all of the tests. “This told me that the biological effect of the treatment was robust,” he says. What’s more, the effect persisted in the six participants who provided a final blood sample six months after stopping the trial, he says.
The understandable excitement over this result is probably out of proportion to the probability it will be confirmed in larger experiments with proper controls.
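Mechanically, an epigenetic clock of the kind Horvath builds is a linear model over DNA-methylation beta values at a selected set of CpG sites, fit by penalized regression. A toy sketch with made-up weights (the real clocks use published coefficients, and the original Horvath clock adds a nonlinear calibration for young ages, omitted here):

```python
import numpy as np

rng = np.random.default_rng(42)

n_sites = 353                              # the original Horvath clock uses 353 CpGs
weights = rng.normal(0.0, 1.0, n_sites)    # illustrative, not real coefficients
intercept = 50.0                           # illustrative baseline age

def predict_epigenetic_age(betas: np.ndarray) -> float:
    """Linear read-out of methylation beta values (each in 0..1)."""
    return float(intercept + betas @ weights)

betas = rng.uniform(0.0, 1.0, n_sites)     # one synthetic methylation profile
epigenetic_age = predict_epigenetic_age(betas)
age_gap = epigenetic_age - 60.0            # vs. a chronological age of 60
```

The "age reversal" claim in the study amounts to this predicted age dropping, across four such clocks, after the treatment cocktail.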

Monday, September 09, 2019

Training to reduce cognitive biases.

Sellier et al. show that students assigned to solve a business case exercise are less likely to choose an inferior confirmatory solution when they have previously undergone a debiasing-training intervention:
The primary objection to debiasing-training interventions is a lack of evidence that they improve decision making in field settings, where reminders of bias are absent. We gave graduate students in three professional programs (N = 290) a one-shot training intervention that reduces confirmation bias in laboratory experiments. Natural variance in the training schedule assigned participants to receive training before or after solving an unannounced business case modeled on the decision to launch the Space Shuttle Challenger. We used case solutions to surreptitiously measure participants’ susceptibility to confirmation bias. Trained participants were 29% less likely to choose the inferior hypothesis-confirming solution than untrained participants. Analysis of case write-ups suggests that a reduction in confirmatory hypothesis testing accounts for their improved decision making in the case. The results provide promising evidence that debiasing-training effects transfer to field settings and can improve decision making in professional and private life.

Friday, September 06, 2019

How personal and professional conduct relate to one another.

From Griffin et al.:

Significance
The relative importance of personal traits compared with context for predicting behavior is a long-standing issue in psychology. This debate plays out in a practical way every time an employer, voter, or other decision maker has to infer expected professional conduct based on observed personal behavior. Despite its theoretical and practical importance, there is little academic consensus on this question. We fill this void with evidence connecting personal infidelity to professional behavior in 4 different settings.
Abstract
We study the connection between personal and professional behavior by introducing usage of a marital infidelity website as a measure of personal conduct. Police officers and financial advisors who use the infidelity website are significantly more likely to engage in professional misconduct. Results are similar for US Securities and Exchange Commission (SEC) defendants accused of white-collar crimes, and companies with chief executive officers (CEOs) or chief financial officers (CFOs) who use the website are more than twice as likely to engage in corporate misconduct. The relation is not explained by a wide range of regional, firm, executive, and cultural variables. These findings suggest that personal and workplace behavior are closely related.

Wednesday, September 04, 2019

Training wisdom - the Illeist (third person) method.

I think my most sane moments are those when I experience myself as watching, in third-person mode, rather than “being” Deric, the immersed actor. Science journalist David Robson writes an essay on this perspective in Aeon, “Why speaking to yourself in the third person makes you wiser,” noting that this ancient rhetorical method, used by Julius Caesar and termed ‘illeism’ in 1809 by the poet Coleridge (from the Latin ille, ‘he, that’), can clear the emotional fog of simple rumination, shifting perspective to see past biases. Robson notes the work of Igor Grossmann at the University of Waterloo in Canada, whose aim is:
...to build a strong experimental footing for the study of wisdom, which had long been considered too nebulous for scientific enquiry. In one of his earlier experiments, he established that it’s possible to measure wise reasoning and that, as with IQ, people’s scores matter. He did this by asking participants to discuss out-loud a personal or political dilemma, which he then scored on various elements of thinking long-considered crucial to wisdom, including: intellectual humility; taking the perspective of others; recognising uncertainty; and having the capacity to search for a compromise. Grossmann found that these wise-reasoning scores were far better than intelligence tests at predicting emotional wellbeing, and relationship satisfaction – supporting the idea that wisdom, as defined by these qualities, constitutes a unique construct that determines how we navigate life challenges.
The abstract from Grossmann et al.:
We tested the utility of illeism – a practice of referring to oneself in the third person – for the trainability of wisdom-related characteristics in everyday life: i) wise reasoning (intellectual humility, open-mindedness in ways a situation may unfold, perspective-taking, attempts to integrate different viewpoints) and ii) accuracy in emotional forecasts toward close others. In a month-long field experiment, people adopted either the third-person training or first-person control perspective when describing their most significant daily experiences. Assessment of spontaneous wise reasoning before and after the intervention revealed substantial growth in the training (vs. control) condition. At the end of the intervention, people forecasted their feelings toward a close other in challenging situations. A month later, these forecasted feelings were compared against their experienced feelings. Participants in the training (vs. control) condition showed greater alignment of forecasts and experiences, largely due to changes in their emotional experiences. The present research demonstrates a path to evidence-based training of wisdom-related processes via the practice of illeism.
Robson finds this work particularly fascinating,
...considering the fact that illeism is often considered to be infantile. Just think of Elmo in the children’s TV show Sesame Street, or the intensely irritating Jimmy in the sitcom Seinfeld – hardly models of sophisticated thinking. Alternatively, it can be taken to be the sign of a narcissistic personality – the very opposite of personal wisdom. After all, Coleridge believed that it was a ruse to cover up one’s own egotism: just think of the US president’s critics who point out that Donald Trump often refers to himself in the third person. Clearly, politicians might use illeism for purely rhetorical purposes but, when applied to genuine reflection, it appears to be a powerful tool for wiser reasoning.
For an example of third person usage reflecting not wisdom, but a narcissistic personality, look no further than our current president, Donald Trump, as noted in this Washington Post piece by Rieger.