Wednesday, November 27, 2019

Cognitive and noncognitive predictors of success.

An interesting bit of work from Duckworth et al.
When predicting success, how important are personal attributes other than cognitive ability? To address this question, we capitalized on a full decade of prospective, longitudinal data from n = 11,258 cadets entering training at the US Military Academy at West Point. Prior to training, cognitive ability was negatively correlated with both physical ability and grit. Cognitive ability emerged as the strongest predictor of academic and military grades, but noncognitive attributes were more prognostic of other achievement outcomes, including successful completion of initiation training and 4-y graduation. We conclude that noncognitive aspects of human capital deserve greater attention from both scientists and practitioners interested in predicting real-world success.

Monday, November 25, 2019

How trance states might have forged human societies

I want to pass on a series of clips I have made for my own use from an intriguing article by Mark Vernon in Aeon:
With anatomically modern humans comes culture in a way that had never happened before. And from that culture came religion, with various proposals to map the hows and whys of its emergence. Until recently, the proposals fell into two broad groups – ‘big gods’ theories and ‘false agency’ hypotheses. Big gods theories envisage religion as conjuring up punishing deities. These disciplining gods provided social bonding by telling individuals that wrongdoing incurs massive costs. The problem is that big gods are not a universal feature of religions and, if they are present, they seem correlated to big societies not causes of them. False agency hypotheses...assume that our forebears were jumpy and superstitious: they thought that a shrub swayed because of a spirit not the wind; and they were easily fooled, though their mistakes were evolutionarily advantageous because, on occasion, the swaying was caused by a predator. The false agency hypothesis has been tested and disconfirmed across many experiments.
...there is a need for a new idea, and coming to the fore now is an old one revisited...The explanation is resurfacing in what can be called the trance theory of religious origins, which proposes that our paleolithic ancestors hit on effervescence upon finding that they could induce altered states of consciousness...Effervescence is generated when humans come together to make music or perform rituals, an experience that lingers when the ceremonies are over. The suggestion, therefore, is that collective experiences that are religious or religious-like unify groups and create the energy to sustain them.
Research to test and develop this idea is underway in a multidisciplinary team led by Robin Dunbar at the University of Oxford. The approach appeals to him, in part, because it seems to capture a crucial aspect of religious phenomena missing in suggestions about punishing gods or dangerous spirits. It is not about the fine details of theology, but is about the raw feelings of experience...this raw-feelings element has a transcendental mystical component – something that is only fully experienced in trance states...this sense of transcendence and other worlds is present at some level in almost all forms of religious experience.
...there’s evidence that monkeys and apes experience the antecedents to ecstasy because they seem to experience wonder...a few hundred thousand years ago, archaic humans took a step that ramped up this capacity. They started deliberately to make music, dance and sing. When the synchronised and collective nature of these practices became sufficiently intense, individuals likely entered trance states in which they experienced not only this-worldly splendour but otherworldly intrigue... What you might call religiosity was born. It stuck partly because it also helped to ease tensions and bond groups, via the endorphin surges produced in trance states. In other words, altered states proved evolutionarily advantageous: the awoken human desire for ecstasy simultaneously prompted a social revolution because it meant that social groups could grow to much larger sizes via the shared intensity of heightened experiences.
The trance hypothesis...rests on the rituals that produce peak experiences, which means it doesn’t require speculating about what ancient people did or didn’t believe about spirits and gods...Asking when religion evolved is not a good question because religion is more than one thing...asking when the various elements such as supernatural agents and moral obligations started to coalesce together is a better question. And they invariably start to coalesce around rituals.
...when villages and then towns appear...new techniques for managing social pressures are required...religious systems (Doctrinal religions) that include specialists such as priests and impressive constructions we’d call temples and/or domestic house-based shrines...sustain the prosocial effects of earlier types of religiosity for groups that are now growing very large indeed...a tension...arises when religious experiences are institutionalised...what’s on offer is somewhat thinner than experiences gained in the immersive rites that precipitate altered states. Encountering spirit entities directly in a dance or chase is not the same as the uplift offered by a monumental building.
...religions are caught between the Scylla of socially useful but potentially dreary religious rites and the Charybdis of altered states that are intrinsically exciting but socially disruptive. It’s why they bring bloody conflicts as well as social goods. This way of putting it highlights another feature of the trance theory. It interweaves two levels of explanation: one focused on the allure of spiritual vitality; the other on practical needs.
..science cannot decide whether the claims of any one religion are true. But the new theory still makes quite a strong claim, which brings me back to the role of the supernatural, transcendence and religious gods that today’s secularists seem inclined to sideline. If the science cannot confirm convictions about any divine revelations received, it does lend credence to the reasonableness, even necessity, of having them. Where the big gods and false agency hypotheses seemed inherently sniffy about human religiosity, the trance hypothesis positively values it...The trance hypothesis is neutral about the truth claims of religions whether you believe or don’t, though it does suggest that transcendent states of mind are meaningful to human beings and can evolve into religious systems of belief.
And in this final observation there is, perhaps, some good news for us, whether we’re religious or not. It’s often said that many of today’s troubles, from divisive political debates to spats on social media, are due to our tribal nature. It’s added, somewhat fatalistically, that deep within our evolutionary past is the tendency to identify with one group and demonise another. We are destined to be at war, culturally or otherwise. But if the trance theory is true, it shows that the evolutionary tendency to be tribal rests on an evolutionary taste for that which surpasses tribal experience – the transcendence that humans glimpsed in altered states of mind that enabled them to form tribes to start with.
If we long to belong, we also long to be in touch with ‘the more’, as the great pioneer of the study of religious experiences William James called it. That more will be envisaged in numerous ways. But it might help us by prompting new visions that exceed our herd instincts and binary thinking, and ease social tensions. If it helped our ancestors to survive, why would we think we are any different?

Friday, November 22, 2019

Evidence for premature aging caused by insufficient sleep.

I have come to realize in the past year or so that my physical and mental robustness require getting at least seven, and preferably eight, hours of sleep every night. Thus I was intrigued to find an extensive and well-documented study by Teo et al. (open source) showing that telomeres, sequences of DNA at the ends of chromosomes taken as a marker of biological aging, are, on average, 356 base pairs shorter in study participants who slept for fewer than five hours per night than in those who slept for seven hours. They also found that wearable fitness trackers captured sleep metrics more reliably than self-report. Here is the abstract of their article, titled "Digital phenotyping by consumer wearables identifies sleep-associated markers of cardiovascular disease risk and biological aging."
Sleep is associated with various health outcomes. Despite their growing adoption, the potential for consumer wearables to contribute sleep metrics to sleep-related biomedical research remains largely uncharacterized. Here we analyzed sleep tracking data, along with questionnaire responses and multi-modal phenotypic data generated from 482 normal volunteers. First, we compared wearable-derived and self-reported sleep metrics, particularly total sleep time (TST) and sleep efficiency (SE). We then identified demographic, socioeconomic and lifestyle factors associated with wearable-derived TST; they included age, gender, occupation and alcohol consumption. Multi-modal phenotypic data analysis showed that wearable-derived TST and SE were associated with cardiovascular disease risk markers such as body mass index and waist circumference, whereas self-reported measures were not. Using wearable-derived TST, we showed that insufficient sleep was associated with premature telomere attrition. Our study highlights the potential for sleep metrics from consumer wearables to provide novel insights into data generated from population cohort studies.
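For readers who want a concrete sense of the kind of association analysis the abstract describes, here is a minimal Python sketch. It is not the authors' actual pipeline; the file name and column names (wearable_tst, selfreport_tst, telomere_length) are hypothetical.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical cohort table: one row per participant, with wearable-derived and
# self-reported total sleep time (hours/night), telomere length, age, and BMI.
df = pd.read_csv("sleep_cohort.csv")

# Does wearable-derived total sleep time predict telomere length after
# adjusting for age and BMI?
wearable_model = smf.ols("telomere_length ~ wearable_tst + age + bmi", data=df).fit()
print(wearable_model.params["wearable_tst"], wearable_model.pvalues["wearable_tst"])

# The same model using self-reported sleep time, for comparison.
selfreport_model = smf.ols("telomere_length ~ selfreport_tst + age + bmi", data=df).fit()
print(selfreport_model.params["selfreport_tst"], selfreport_model.pvalues["selfreport_tst"])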

Wednesday, November 20, 2019

A "Department of the Attention Economy"

Popping up on my daily input stream (in this case the Google News aggregator - which knows more than I do about what I might like to see) is a CNN business perspective titled "Andrew Yang: As president, I will establish a Department of the Attention Economy." It is an idea that I wish some of the more likely Democratic nominees would take up.

The article immediately caught my attention because, faced with the immense array of text and video streams competing for my attention, I feel, as I suspect many MindBlog readers do, like one of the dogs in Martin Seligman's classic learned helplessness experiments, whose stress and immune systems are eventually compromised by uncertainty. For entertainment, should I be subscribing to Netflix, Hulu, Amazon Prime, Disney+, YouTube+, Apple TV+, CBS All Access, AcornTV, Britbox, Shudder, YouTube, Facebook Watch, Tubi, etc.? For news, there are too many options to even begin to list them. Apart from my own qualms about using Google as a prosthesis (Blogger, Google Docs, Calendar, Mail, etc.), I look at how my 5- and 7-year-old grandsons' lives are potentially compromised by the amount of free time they spend on digital inputs rather than playing outside with friends.

Clips from Yang's article:
...technology is addictive and damaging the mental health of our children. Research shows that too much time spent on social media increases stress, anxiety, depression and feelings of isolation. Other studies have found that extended screen time can negatively affect sleep...As president, I will establish a Department of the Attention Economy that will work with tech companies and implement regulations that curb the negative effects of smartphones and social media.
A few of his suggestions:
We can start by curbing design features that maximize screen time, such as removing autoplay video and capping recommendations for videos, articles and posts for each user each day. Platforms can also use deep-learning algorithms to determine whether a user is a child, and then explore capping the user's screen hours per day.
Design features that encourage social validation should also be removed. Instagram is leading the way by testing hiding likes on the posts of some users. That's a step in the right direction and it should be implemented as soon as possible. In addition, the number of followers a person has on social media should be hidden too, as it represents a false equivalence with a person's social standing.
Another area that deserves attention is the content our kids consume. When I was growing up, television time meant morning cartoons and after-school specials. Rules and standards should be established to protect kids from graphic content and violent imagery. Subsequently, these regulations would also incentivize the production of high-quality content and positive programming.
It shouldn't stop there. Parents have a major role to play — and they want to — but they could use some help. Companies should be required to provide parents with guidance on kid-healthy content (similar to the rating system for TV or movies), and parents should easily be able to monitor content and screen time for children.

Monday, November 18, 2019

Social class is revealed by brief clips of speech.

Kraus et al. - a collective modern version of Professor Henry Higgins in George Bernard Shaw's play Pygmalion - offer a detailed analytic update on how social class is reproduced through subtle cues expressed in brief speech. Here is their abstract:
Economic inequality is at its highest point on record and is linked to poorer health and well-being across countries. The forces that perpetuate inequality continue to be studied, and here we examine how a person’s position within the economic hierarchy, their social class, is accurately perceived and reproduced by mundane patterns embedded in brief speech. Studies 1 through 4 examined the extent to which people accurately perceive social class based on brief speech patterns. We find that brief speech spoken out of context is sufficient to allow respondents to discern the social class of speakers at levels above chance accuracy, that adherence to both digital and subjective standards for English is associated with higher perceived and actual social class of speakers, and that pronunciation cues in speech communicate social class over and above speech content. In study 5, we find that people with prior hiring experience use speech patterns in preinterview conversations to judge the fit, competence, starting salary, and signing bonus of prospective job candidates in ways that bias the process in favor of applicants of higher social class. Overall, this research provides evidence for the stratification of common speech and its role in both shaping perceiver judgments and perpetuating inequality during the briefest interactions.
Here is a sample explanatory clip from their results section:
A total of 229 perceivers were asked to listen to the speech of 27 unique speakers whose utterances were collected as part of a larger sample of 189 speakers through the International Dialects of English Archive (IDEA). These 27 speakers varied in terms of age, race, gender, and social class, which we measured in the present study in terms of high school or college degree attainment. Our sample of perceivers listened to 7 words spoken by each of the speakers presented consecutively and randomly without any other accompanying speech and answered “Yes” or “No” to 4 questions: “Is this person a college graduate/woman/young/white?” Participants answered these 4 questions in a randomized order, and we calculated the proportion of correct responses for each question...
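To make the "above chance accuracy" claim concrete, here is a toy Python sketch of that calculation. The response matrix is simulated, not the study's data, and the variable names are my own.

import numpy as np
from scipy.stats import binomtest

# Simulated yes/no judgments: responses[perceiver, speaker] = 1 if the perceiver
# correctly judged whether the speaker has a college degree from 7 spoken words.
rng = np.random.default_rng(0)
responses = rng.binomial(1, 0.55, size=(229, 27))

accuracy = responses.mean()
n_correct, n_trials = int(responses.sum()), responses.size

# Test whether overall accuracy exceeds the 50% chance level for a yes/no question.
result = binomtest(n_correct, n_trials, p=0.5, alternative="greater")
print(f"accuracy = {accuracy:.3f}, p = {result.pvalue:.2g}")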

Friday, November 15, 2019

Explaining the puzzle of human diversity in the Christian world

Fascinating work by Schulz et al. is reviewed by both Gelfand and Zauzmer. Schulz et al. show how the specific practices of Medieval Christianity can in part explain widespread variation in human psychology around the world.

From Zauzmer:
The story begins with kinship networks — the tribes and clans of densely connected, insular groups of relatives who formed most human societies before medieval times. Catholic Church teachings disrupted those networks, in large part by vehemently prohibiting marriage between relatives (which had been de rigueur), and eventually provoked a wholesale transformation of communities, changing the norm from large clans into small, monogamous nuclear families.
The team analyzed Vatican records to document the extent of a country or region’s exposure to Catholicism before the year 1500, and found that longer exposure to Catholicism correlated with low measures of kinship intensity in the modern era, including low rates of cousins marrying each other. Both measures correlated with psychology, the researchers found by looking at 24 different psychological traits of people in different cultures: Countries exposed to Catholicism early have citizens today who exhibit qualities such as being more individualistic and independent, and being more trusting of strangers.
From Gelfand:
...the authors found that both longer exposure to the Western Church and weaker kinship intensity (which were negatively related, as expected) were associated with greater individualism and independence, less conformity and obedience, and greater prosociality toward strangers—relationships that mostly held when controlling for a range of geographic variables. The results were replicated across 440 regions in 36 European countries: Longer exposure to the Western Church was generally associated with the same WEIRD (Western, Educated, Industrialized, Rich and Democratic) psychological shifts, even when controlling for alternate explanations (e.g., the influence of Roman political institutions, schooling, migration).

Wednesday, November 13, 2019

New work on how and why we sleep.

The fact that I'm finding the quality of my sleep to be central to my robustness and well-being makes me want to pass on descriptions of four pieces of work described in recent issues of Science Magazine, work showing housekeeping changes in our brains happening while we sleep, changes whose disruption by sleep deprivation has debilitating consequences. Fultz et al. show that deep sleep drives brain fluid oscillations that may facilitate communication between fluid compartments and clearance of waste products. Todorova and Zugaro show that spikes during delta waves of sleep (widespread cortical silence) support memory consolidation. Brüning et al. find in the mouse brain that half of the 2000 synaptic phosphoproteins quantified show changes with daily activity-rest cycles. Sleep deprivation abolishes nearly all (98%) of these phosphorylation cycles at synapses. Noya et al. find a sleep-wake cycle in which transcripts and proteins associated with synaptic signaling accumulate before the active phase (dusk for nocturnal mice), whereas messenger RNAs and proteins associated with metabolism and translation accumulate before the resting phase.

Monday, November 11, 2019

Why we can't tell the truth about aging.

I've enjoyed reading the New Yorker essay by Arthur Krystal titled "Why we can't tell the truth about aging," which points to and discusses numerous recent (as well as a few ancient) books on aging. Here is a selection of rearranged small clips from the article.
Average life expectancy was indeed a sorry number for the greater part of history (for Americans born as late as 1900, it wasn’t even fifty), which may be one reason that people didn’t write books about aging: there weren’t enough old folks around to sample them. But now that more people on the planet are over sixty-five than under five, an army of readers stands waiting to learn what old age has in store.
Now that we’re living longer, we have the time to write books about living longer...The library on old age has grown so voluminous that the fifty million Americans over the age of sixty-five could spend the rest of their lives reading such books, even as lusty retirees and power-lifting septuagenarians turn out new ones.
Our senior years are evidently a time to celebrate ourselves and the wonderful things to come: travelling, volunteering, canoodling, acquiring new skills, and so on. No one, it seems, wants to disparage old age...we get cheerful tidings...chatty accounts meant to reassure us that getting old just means that we have to work harder at staying young...authors aren’t blind to the perils of aging; they just prefer to see the upside. All maintain that seniors are more comfortable in their own skins.
There is, of course, a chance that you may be happier at eighty than you were at twenty or forty, but you’re going to feel much worse. I know this because two recent books provide a sobering look at what happens to the human body as the years pile up. Elizabeth Blackburn and Elissa Epel’s “The Telomere Effect: Living Younger, Healthier, Longer” and Sue Armstrong’s “Borrowed Time: The Science of How and Why We Age” describe what is essentially a messy business.
Basically, most cells divide and replicate some fifty-plus times before becoming senescent. Not nearly as inactive as the name suggests, senescent cells contribute to chronic inflammation and interfere with protective collagens...The so-called epigenetic clock shows our DNA getting gummed up, age-related mitochondrial mutations reducing the cells’ ability to generate energy, and our immune system slowly growing less efficient. Bones weaken, eyes strain, hearts flag. Bladders empty too often, bowels not often enough, and toxic proteins build up in the brain to form the plaque and the spaghetti-like tangles that are associated with Alzheimer’s disease. Not surprisingly, sixty-eight per cent of Medicare beneficiaries today have multiple chronic conditions. Not a lot of grace, force, or fascination in that.
In short, the optimistic narrative of pro-aging writers doesn’t line up with the dark story told by the human body. But maybe that’s not the point. “There is only one solution if old age is not to be an absurd parody of our former life,” Simone de Beauvoir wrote in her expansive 1970 study “The Coming of Age,” “and that is to go on pursuing ends that give our existence a meaning—devotion to individuals, to groups, or to causes—social, political, intellectual, or creative work.”
One would, of course, like to approach old age with grace and fortitude, but old age makes it difficult. Those who feel that it’s a welcome respite from the passions, anxieties, and troubles of youth or middle age are either very lucky or toweringly reasonable. Why rail against the inevitable—what good will it do? None at all. Complaining is both pointless and unseemly. Existence itself may be pointless and unseemly.
We should all make peace with aging. And so my hat is off to Dr. Oliver Sacks, who chose to regard old age as “a time of leisure and freedom, freed from the factitious urgencies of earlier days, free to explore whatever I wish, and to bind the thoughts and feelings of a lifetime together.”

Friday, November 08, 2019

Worldwide movement of people into cities is degrading the human microbiome

From the Oct. 25 issue of Science Magazine:
Sonnenburg and Sonnenburg review how the shift of recent generations from rural, outdoor environments to urbanized and industrialized settings has profoundly affected our biology and health. The signals of change are seen most strikingly in the reduction of commensal microbial taxa and loss of their metabolic functions. The extirpation of human commensals is a result of bombardment by new chemicals, foodstuffs, sanitation, and medical practices. For most people, sanitation and readily available food have been beneficial, but have we now reached a tipping point? How do we “conserve” our beneficial symbionts and keep the pathogens at bay?
Here is their abstract:
The human body is an ecosystem that is home to a complex array of microbes known as the microbiome or microbiota. This ecosystem plays an important role in human health, but as a result of recent lifestyle changes occurring around the planet, whole populations are seeing a major shift in their gut microbiota. Measures meant to kill or limit exposure to pathogenic microbes, such as antibiotics and sanitation, combined with other factors such as processed food, have had unintended consequences for the human microbial ecosystem, including changes that may be difficult to reverse. Microbiota alteration and the accompanying loss of certain functional attributes might result in the microbial communities of people living in industrialized societies being suboptimal for human health. As macroecologists, conservationists, and climate scientists race to document, understand, predict, and delay global changes in our wider environment, microbiota scientists may benefit by using analogous approaches to study and protect our intimate microbial ecosystems.

Wednesday, November 06, 2019

How human breeding has changed dogs’ brains

Hecht et al. have identified brain networks in dogs related to behavioral specializations roughly corresponding to sight hunting, scent hunting, guarding, and companionship. Here is their detailed abstract:
Humans have bred different lineages of domestic dogs for different tasks such as hunting, herding, guarding, or companionship. These behavioral differences must be the result of underlying neural differences, but surprisingly, this topic has gone largely unexplored. The current study examined whether and how selective breeding by humans has altered the gross organization of the brain in dogs. We assessed regional volumetric variation in MRI studies of 62 male and female dogs of 33 breeds. Neuroanatomical variation is plainly visible across breeds. This variation is distributed nonrandomly across the brain. A whole-brain, data-driven independent components analysis established that specific regional subnetworks covary significantly with each other. Variation in these networks is not simply the result of variation in total brain size, total body size, or skull shape. Furthermore, the anatomy of these networks correlates significantly with different behavioral specialization(s) such as sight hunting, scent hunting, guarding, and companionship. Importantly, a phylogenetic analysis revealed that most change has occurred in the terminal branches of the dog phylogenetic tree, indicating strong, recent selection in individual breeds. Together, these results establish that brain anatomy varies significantly in dogs, likely due to human-applied selection for behavior.
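As an illustration of what a "whole-brain, data-driven independent components analysis" of regional volumes can look like, here is a rough Python sketch using simulated numbers. The dimensions and variable names are mine, not the authors', and their actual pipeline differs in detail.

import numpy as np
from sklearn.decomposition import FastICA

# Simulated stand-in for the data: 62 dogs x 120 regional gray-matter volumes,
# already corrected for total brain and body size.
rng = np.random.default_rng(1)
volumes = rng.normal(size=(62, 120))

# Decompose regional covariation into a small number of independent components,
# each a candidate "regional subnetwork" whose regions covary across dogs.
ica = FastICA(n_components=6, random_state=0)
dog_scores = ica.fit_transform(volumes)   # each dog's loading on each component
region_weights = ica.mixing_              # each region's weight within each component

print(dog_scores.shape, region_weights.shape)  # (62, 6) and (120, 6)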

Monday, November 04, 2019

A triple drug combination increases lifespan by 48%

In Drosophila flies, to be sure, but the nutrient-sensing pathways that are the target of the drugs are common to all animals. Here is the abstract of the open-source article by Castillo-Quan et al.:
Increasing life expectancy is causing the prevalence of age-related diseases to rise, and there is an urgent need for new strategies to improve health at older ages. Reduced activity of insulin/insulin-like growth factor signaling (IIS) and mechanistic target of rapamycin (mTOR) nutrient-sensing signaling network can extend lifespan and improve health during aging in diverse organisms. However, the extensive feedback in this network and adverse side effects of inhibition imply that simultaneous targeting of specific effectors in the network may most effectively combat the effects of aging. We show that the mitogen-activated protein kinase kinase (MEK) inhibitor trametinib, the mTOR complex 1 (mTORC1) inhibitor rapamycin, and the glycogen synthase kinase-3 (GSK-3) inhibitor lithium act additively to increase longevity in Drosophila. Remarkably, the triple drug combination increased lifespan by 48%. Furthermore, the combination of lithium with rapamycin cancelled the latter’s effects on lipid metabolism. In conclusion, a polypharmacology approach of combining established, prolongevity drug inhibitors of specific nodes may be the most effective way to target the nutrient-sensing network to improve late-life health.

Friday, November 01, 2019

Skill development - the intelligence vs. practice debate reframed

Vaci et al. note that what is often overlooked in the nature vs. nurture debate is the fact that both factors interact with each other:
The relative importance of different factors in the development of human skills has been extensively discussed. Research on expertise indicates that focused practice may be the sole determinant of skill, while intelligence researchers underline the relative importance of abilities at even the highest level of skill. There is indeed a large body of research that acknowledges the role of both factors in skill development and retention. It is, however, unknown how intelligence and practice come together to enable the acquisition and retention of complex skills across the life span. Instead of focusing on the 2 factors, intelligence and practice, in isolation, here we look at their interplay throughout development. In a longitudinal study that tracked chess players throughout their careers, we show that both intelligence and practice positively affect the acquisition and retention of chess skill. Importantly, the nonlinear interaction between the 2 factors revealed that more intelligent individuals benefited more from practice. With the same amount of practice, they acquired chess skill more quickly than less intelligent players, reached a higher peak performance, and arrested decline in older age. Our research demonstrates the futility of scrutinizing the relative importance of highly intertwined factors in human development.
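The key analytic idea - an interaction between intelligence and practice estimated from longitudinal data - can be sketched with a mixed-effects model. What follows is only an illustrative skeleton with made-up column names, not the authors' actual model.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical longitudinal table: one row per player per year, with columns
# player_id, rating, intelligence, practice_hours, and age.
df = pd.read_csv("chess_longitudinal.csv")

# Random intercept per player; the intelligence-by-practice interaction term
# captures whether more intelligent players benefit more from the same practice.
model = smf.mixedlm(
    "rating ~ intelligence * practice_hours + age",
    data=df,
    groups=df["player_id"],
).fit()
print(model.summary())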

Wednesday, October 30, 2019

The miracle cure - just move

A summary from the editor of the British Medical Journal of work by Ekelund et al.:
As miracle cures are hard to come by, any claims that a treatment is 100% safe and effective must always be viewed with intense scepticism. There is perhaps one exception. Physical activity has been called a miracle cure by no less a body than the Academy of Medical Sciences (http://bit.ly/2lTqDvc); and, like those who avail themselves of it, the supporting science grows stronger by the day. The BMJ recently published a systematic review showing a clear dose-response relation between physical activity and all cause mortality (doi:10.1136/bmj.l4570). The authors concluded that any level of activity is better than none, and more is better still, a message recently encapsulated in the updated guidelines from the UK’s chief medical officers (doi:10.1136/bmj.l5470).
The statement from the authors on their analysis of numerous studies:
We conducted a harmonised meta-analysis to examine the association between accelerometer measured physical activity and sedentary time and all cause mortality. Specifically, we examined the dose-response relations of total physical activity, different intensities of physical activity (light, low light, high light, moderate to vigorous, and vigorous) and sedentary time and all cause mortality.
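Within any single cohort, the building block of such a dose-response analysis is a survival model. Here is a minimal Python sketch using the lifelines package, with hypothetical column names, and omitting the harmonisation and pooling steps that make it a meta-analysis.

import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort table with follow-up time, a death indicator, and
# accelerometer-derived activity measures.
df = pd.read_csv("accelerometer_cohort.csv")
cols = ["followup_years", "died", "mvpa_min_day", "sedentary_hours_day", "age"]

# Cox proportional-hazards model of all-cause mortality; a harmonised
# meta-analysis would fit a model like this in each cohort, then pool
# the hazard ratios across cohorts (not shown).
cph = CoxPHFitter()
cph.fit(df[cols], duration_col="followup_years", event_col="died")
cph.print_summary()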

Monday, October 28, 2019

Feeling grateful - a shortcut to virtuous behavior.

Psychologist David DeSteno, in an Aeon essay, summarizes his experiments suggesting that moving toward a more virtuous life might be accomplished simply by cultivating gratitude, a simpler route than deep deliberation on noble qualities such as honesty and generosity. Subjects were asked to report the results of a coin toss in which heads yielded a larger financial reward than tails. The coins were rigged to always come up tails.
The percentage of cheaters fell by half (from almost 49 per cent to 27 per cent) among those who had just recalled a time when they felt grateful, compared with those who described a time when they felt happy or no particular emotion at all.
They then did a second experiment that...
...had two key differences. First, the coin flip determined whether any given participant would have to complete an enjoyable 10-minute task or a difficult 45-minute one. Second, we led participants to believe that the next person to come would be assigned to complete whichever task remained.
In deciding to cheat by reporting that the virtual coin flip came up heads, people were giving themselves a much shorter and more enjoyable task, but in so doing, were also unfairly dooming another person to a more onerous task.
As one might imagine, the overall frequency of cheating was lower. Nonetheless, gratitude worked in the exact same way. Whereas 17 per cent of people cheated when feeling neutral or happy, only 2 per cent cheated when feeling grateful.
The empirical literature shows a similar influence of gratitude on other virtues. People feeling grateful are more likely to help others who request assistance, to divide their profits in a more egalitarian way, to be loyal even at cost to themselves, to be less materialistic, and even to exercise as opposed to loafing.

Friday, October 25, 2019

Five myths about aging.

I want to pass on a few clips from this piece by William Mair, who researches the biology of aging at Harvard:
MYTH NO. 1 Biological aging can't be slowed. The growing field of geroscience offers hope... that genetic alterations and drugs such as rapamycin can slow the rate at which animals age.
MYTH NO. 2 Live fast, die young. Recent work shows that regular exercise helps slow key signs of aging, boosting immune function and curbing mental decline. If anything, conserving our batteries as isolated couch potatoes ages us faster.
MYTH NO. 3 Antioxidants slow aging. Free radicals...have captured the public imagination as a source of old age with little scientific evidence. There is in fact more negative data than positive examples.
MYTH NO. 4 Fewer calories mean a longer life. In truth, we just don’t know that the benefit of strict diets lies solely in their calorie content. Increasingly, it seems that many of the positive effects of calorie restriction on aging may be unrelated to caloric intake. Hungry animals and people tend to eat faster, and as a result spend more of their day eating nothing. These extended periods of abstinence are enough to slow aging in mice, whether overall calorie intake is reduced or not. The science uncoupling the effects of fasting and calorie restriction on aging is in its infancy.
MYTH NO. 5 Short telomeres explain aging. Aging is not caused by one event, however, as compelling as fraying telomeres may be. Some of our cells do not divide at all, and they age without shortened telomeres. Many animals have telomeres much longer than ours, yet they age faster than we do. Shortening telomeres may even be useful, protecting against unchecked cell division, which is a hallmark of cancer.

Wednesday, October 23, 2019

The Metamorphosis of the Western Soul

I want to point to an article by Will Storr, "The Metamorphosis of the Western Soul," that has been languishing for over a year in my list of references that might become the basis of a MindBlog post. Storr presents a nice distillation of how, between 1965 and 1985, the Western self was transformed. Storr's basic point is that economic forces are the dominant reason for these changes.
We turned from anti-materialistic, stick-it-to-the-Man hippies into greed-is-good yuppies... While the origins of such changes cannot be reduced to a single source, I believe we can point to a dominant one: the economy. In the early 1980s, President Ronald Reagan and the British Prime Minister Margaret Thatcher rewrote the rules by which we had once lived. And that, with stunning rapidity, changed who we were.
Storr proceeds to review the history of how citizens of the individualistic West and the collectivist East have developed fundamental cognitive differences - largely adaptations to different physical landscapes - in how they view the world through collectivist versus individualistic filters. But there is plasticity:
Humans are born incomplete. The brain absorbs huge amounts of essential information throughout childhood and adolescence, which it uses to carry on building who we are. It’s as if the brain asks a single, vital question: Who do I have to be, in this place, to thrive? If it was a boastful hustler in ancient Greece and a humble team-player in ancient China, then who is it in the West today?
The answer is a neoliberal...After the economic chaos of the 1970s, it was decided that the United States and Britain had become too collective. Previous decades had seen the introduction of the New Deal, which included the Social Security Act, strict regulations on banking and business, and the rising power of the unions. This collectively tilted economy sired a collectively tilted people...For Mr. Reagan and Mrs. Thatcher, saving ourselves meant rediscovering our individualist roots.
They cut taxes and regulations; they battled unions; they shrunk the welfare state; they privatized assets and weakened the state’s safety nets. They pursued the neoliberal dream of globalization — one free market that covered the earth. As much of human life as possible was to become a competition of self versus self...In 1981, Margaret Thatcher said “Economics are the method: The object is to change the soul.” And that’s precisely what happened.
Before 2008, it felt as if neoliberalism was basically working for most people. But since the crash, millions have come to see the system as broken...We have seen the neoliberal Hillary Clinton falter and the antiglobalist Donald Trump triumph. Britain’s Brexit was secured by antiglobalist arguments...The perception of a broken, rigged economy has left us angry and increasingly tribal, which might explain this recent trend toward “us” over the narcissistic “me.”
If this is correct, it’s yet more evidence that who we are is powerfully influenced by where we are. Humans want to get along and get ahead and will become whoever they need to be in order to do so. In the 21st century, those rules are no longer set by our physical landscape. Today, the deep and enormously powerful controlling force is the economy.

Monday, October 21, 2019

Shape of your heart is determined by whether you run or sit.

Shave et al. show that endurance runners and farmers have larger, elongated left ventricles with thin walls (traits that help pump large volumes of blood for a long time) compared with football linemen, whose training emphasizes short, high-intensity exercise. Linemen, as well as sedentary people, have shorter, wider ventricles with thicker walls and are more prone to hypertensive heart disease. Their experiments used ultrasound imaging to examine the hearts of more than 160 adult men from four groups: long-distance runners, sedentary adults, highly trained football linemen, and the Tarahumara, Native American farmers renowned for their running ability.

Abstract
Chimpanzees and gorillas, when not inactive, engage primarily in short bursts of resistance physical activity (RPA), such as climbing and fighting, that creates pressure stress on the cardiovascular system. In contrast, to initially hunt and gather and later to farm, it is thought that preindustrial human survival was dependent on lifelong moderate-intensity endurance physical activity (EPA), which creates a cardiovascular volume stress. Although derived musculoskeletal and thermoregulatory adaptations for EPA in humans have been documented, it is unknown if selection acted similarly on the heart. To test this hypothesis, we compared left ventricular (LV) structure and function across semiwild sanctuary chimpanzees, gorillas, and a sample of humans exposed to markedly different physical activity patterns. We show the human LV possesses derived features that help augment cardiac output (CO) thereby enabling EPA. However, the human LV also demonstrates phenotypic plasticity and, hence, variability, across a wide range of habitual physical activity. We show that the human LV’s propensity to remodel differentially in response to chronic pressure or volume stimuli associated with intense RPA and EPA as well as physical inactivity represents an evolutionary trade-off with potential implications for contemporary cardiovascular health. Specifically, the human LV trades off pressure adaptations for volume capabilities and converges on a chimpanzee-like phenotype in response to physical inactivity or sustained pressure loading. Consequently, the derived LV and lifelong low blood pressure (BP) appear to be partly sustained by regular moderate-intensity EPA whose decline in postindustrial societies likely contributes to the modern epidemic of hypertensive heart disease.

Friday, October 18, 2019

The default mode network represents esthetic appeal.

Vessel et al. note another role for the default mode network of our brain:

Significance
Despite being highly subjective, aesthetic experiences are powerful moments of interaction with one’s surroundings, shaping behavior, mood, beliefs, and even a sense of self. The default-mode network (DMN), which sits atop the cortical hierarchy and has been implicated in self-referential processing, is typically suppressed when a person engages with the external environment. Yet not only is the DMN surprisingly engaged when one finds a visual artwork aesthetically moving, here we present evidence that the DMN also represents aesthetic appeal in a manner that generalizes across visual aesthetic domains, such as artworks, landscapes, or architecture. This stands in contrast to ventral occipitotemporal cortex (VOT), which represents the content of what we see, but does not contain domain-general information about aesthetic appeal.
Abstract
Visual aesthetic evaluations, which impact decision-making and well-being, recruit the ventral visual pathway, subcortical reward circuitry, and parts of the medial prefrontal cortex overlapping with the default-mode network (DMN). However, it is unknown whether these networks represent aesthetic appeal in a domain-general fashion, independent of domain-specific representations of stimulus content (artworks versus architecture or natural landscapes). Using a classification approach, we tested whether the DMN or ventral occipitotemporal cortex (VOT) contains a domain-general representation of aesthetic appeal. Classifiers were trained on multivoxel functional MRI response patterns collected while observers made aesthetic judgments about images from one aesthetic domain. Classifier performance (high vs. low aesthetic appeal) was then tested on response patterns from held-out trials from the same domain to derive a measure of domain-specific coding, or from a different domain to derive a measure of domain-general coding. Activity patterns in category-selective VOT contained a degree of domain-specific information about aesthetic appeal, but did not generalize across domains. Activity patterns from the DMN, however, were predictive of aesthetic appeal across domains. Importantly, the ability to predict aesthetic appeal varied systematically; predictions were better for observers who gave more extreme ratings to images subsequently labeled as “high” or “low.” These findings support a model of aesthetic appreciation whereby domain-specific representations of the content of visual experiences in VOT feed in to a “core” domain-general representation of visual aesthetic appeal in the DMN. Whole-brain “searchlight” analyses identified additional prefrontal regions containing information relevant for appreciation of cultural artifacts (artwork and architecture) but not landscapes.
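The within-domain versus cross-domain logic of their classification approach can be sketched in a few lines of Python. The data here are simulated stand-ins for multivoxel patterns, and a real analysis would use proper cross-validation over scanner runs rather than a single split.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

# Simulated multivoxel response patterns (trials x voxels) and high/low
# aesthetic-appeal labels for two domains, e.g. artworks and landscapes.
rng = np.random.default_rng(2)
X_art, y_art = rng.normal(size=(80, 500)), rng.integers(0, 2, 80)
X_land, y_land = rng.normal(size=(80, 500)), rng.integers(0, 2, 80)

# Train on artworks, holding out some artwork trials for within-domain testing.
X_tr, X_te, y_tr, y_te = train_test_split(X_art, y_art, test_size=0.25, random_state=0)
clf = LinearSVC(max_iter=5000).fit(X_tr, y_tr)

within = clf.score(X_te, y_te)      # domain-specific decoding of aesthetic appeal
across = clf.score(X_land, y_land)  # domain-general decoding: transfer to landscapes
print(f"within-domain {within:.2f}, cross-domain {across:.2f}")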

Wednesday, October 16, 2019

Cross-national negativity bias in reacting to news

There seems to be a world-wide anxiety industry of media that find maximum profits in presenting mostly negative news - in a way similar to the drug companies that have reaped great profits from flooding distressed population areas with opioids. An interesting study in this area comes from Soroka et al., who provide more information on our human tendency to react more strongly to negative than positive information. (See also my post on Pinker's "Enlightenment Now" book that engages this topic):
What accounts for the prevalence of negative news content? One answer may lie in the tendency for humans to react more strongly to negative than positive information. “Negativity biases” in human cognition and behavior are well documented, but existing research is based on small Anglo-American samples and stimuli that are only tangentially related to our political world. This work accordingly reports results from a 17-country, 6-continent experimental study examining psychophysiological reactions to real video news content. Results offer the most comprehensive cross-national demonstration of negativity biases to date, but they also serve to highlight considerable individual-level variation in responsiveness to news content. Insofar as our results make clear the pervasiveness of negativity biases on average, they help account for the tendency for audience-seeking news around the world to be predominantly negative. Insofar as our results highlight individual-level variation, however, they highlight the potential for more positive content, and suggest that there may be reason to reconsider the conventional journalistic wisdom that “if it bleeds, it leads.”

Monday, October 14, 2019

An update on the science of ‘free will’

I want to point to an excellent article by Gholipour in the Atlantic Magazine that describes a reinterpretation of experiments by Libet that had been taken to suggest that our brains 'decide' to initiate a movement before our subjective awareness of intending to initiate that movement. A 'readiness potential' is observed about 500 msec before an action occurs, while a subject reports initiating that action about 150 msec before it occurs. Gholipour points to work by Schurger and colleagues suggesting that the readiness potential is not the mark of a brain's brewing intention, but something much more circumstantial.
...Schurger and his colleagues ... proposed an explanation. Neuroscientists know that for people to make any type of decision, our neurons need to gather evidence for each option. The decision is reached when one group of neurons accumulates evidence past a certain threshold. Sometimes, this evidence comes from sensory information from the outside world: If you’re watching snow fall, your brain will weigh the number of falling snowflakes against the few caught in the wind, and quickly settle on the fact that the snow is moving downward.
Libet’s experiment, Schurger pointed out, provided its subjects with no such external cues. To decide when to tap their fingers, the participants simply acted whenever the moment struck them. Those spontaneous moments, Schurger reasoned, must have coincided with the haphazard ebb and flow of the participants’ brain activity. They would have been more likely to tap their fingers when their motor system happened to be closer to a threshold for movement initiation.
This would not imply, as Libet had thought, that people’s brains “decide” to move their fingers before they know it. Hardly. Rather, it would mean that the noisy activity in people’s brains sometimes happens to tip the scale if there’s nothing else to base a choice on, saving us from endless indecision when faced with an arbitrary task. The readiness potential would be the rising part of the brain fluctuations that tend to coincide with the decisions. This is a highly specific situation, not a general case for all, or even many, choices.
The name Schurger rang a bell with me, and so I did a MindBlog search, only to discover that I had reported Schurger's work in a 2016 post "A 50 year misunderstanding of how we decide to initiate action - our intuition is valid". I then proceeded to completely forget about it when I was preparing a subsequent 2019 lecture mentioning Libet's work. The conventional dogma that we are 'late to action' was apparently burned into my brain - most embarrassing. (I've now inserted the new perspective into four of my web lectures, dating as far back as 2012). The real clincher is...
In a new study under review for publication in the Proceedings of the National Academy of Sciences, Schurger and two Princeton researchers repeated a version of Libet’s experiment. To avoid unintentionally cherry-picking brain noise, they included a control condition in which people didn’t move at all. An artificial-intelligence classifier allowed them to find at what point brain activity in the two conditions diverged. If Libet was right, that should have happened at 500 milliseconds before the movement. But the algorithm couldn’t tell any difference until only about 150 milliseconds before the movement, the time people reported making decisions in Libet’s original experiment.
In other words, people’s subjective experience of a decision—what Libet’s study seemed to suggest was just an illusion—appeared to match the actual moment their brains showed them making a decision.
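A toy version of that time-resolved classifier analysis - with simulated signals rather than the study's recordings, and every parameter invented for illustration - might look like this:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Simulated trials x timepoints: 200 samples at 5 ms spacing, ending at movement
# onset (time 0). A real divergence is injected ~150 ms before the movement.
rng = np.random.default_rng(3)
n_trials, n_times = 100, 200
move = rng.normal(size=(n_trials, n_times))
move[:, 170:] += 1.0
rest = rng.normal(size=(n_trials, n_times))
labels = np.r_[np.ones(n_trials), np.zeros(n_trials)]

# Classify movement vs. no-movement condition at each timepoint; the earliest
# time with reliably above-chance accuracy estimates when the conditions diverge.
for t in range(0, n_times, 10):
    X = np.r_[move[:, [t]], rest[:, [t]]]
    acc = cross_val_score(LogisticRegression(), X, labels, cv=5).mean()
    print(f"{(n_times - t) * 5:4d} ms before movement: accuracy {acc:.2f}")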
Gholipour points out that this does not resolve the question of free will; it only deepens the question, which is the subject of an intensive collaboration between neuroscientists and philosophers, backed by $7 million from two private foundations, the John Templeton Foundation and the Fetzer Institute.