Monday, December 30, 2019

We’ve just had the best decade (and year) in human history. Seriously

To provide a faintly upbeat end-of-year post, I want to point to a Matt Ridley piece in The Spectator that provides a bit of a tonic for our times, by pointing out facts that haven't made the news, because good news is no news. Bad things capture our attention while the world overall is still getting better. In the same vein, Nicholas Kristof writes a New York Times piece titled "This Has Been the Best Year Ever" which has some nice graphics describing amazing declines in poverty and infant deaths, and gains in literacy. Some clips from the Ridley piece:
Extreme poverty has fallen below 10 per cent of the world’s population for the first time...Global inequality has been plunging as Africa and Asia experience faster economic growth than Europe and North America; child mortality has fallen to record low levels; famine virtually went extinct; malaria, polio and heart disease are all in decline.
...we are getting more sustainable, not less, in the way we use the planet...some nations are beginning to use less stuff: less metal, less water, less land. Not just in proportion to productivity: less stuff overall...what if economic growth means using less stuff, not more? For example, a normal drink can today contains 13 grams of aluminium, much of it recycled. In 1959, it contained 85 grams. Substituting the former for the latter is a contribution to economic growth, but it reduces the resources consumed per drink...The quantity of all resources consumed per person in Britain (domestic extraction of biomass, metals, minerals and fossil fuels, plus imports minus exports) fell by a third between 2000 and 2017, from 12.5 tonnes to 8.5 tonnes. That’s a faster decline than the increase in the number of people, so it means fewer resources consumed overall.
Mobile phones have the computing power of room-sized computers of the 1970s. I use mine instead of a camera, radio, torch, compass, map, calendar, watch, CD player, newspaper and pack of cards. LED light bulbs consume about a quarter as much electricity as incandescent bulbs for the same light...Even in cases when the use of stuff is not falling, it is rising more slowly than expected. For instance, experts in the 1970s forecast how much water the world would consume in the year 2000. In fact, the total usage that year was half as much as predicted. Not because there were fewer humans, but because human inventiveness allowed more efficient irrigation for agriculture, the biggest user of water.
...despite the growing number of people and their demand for more and better food, the productivity of agriculture is rising so fast that human needs can be supplied by a shrinking amount of land...we use 65 per cent less land to produce a given quantity of food compared with 50 years ago. By 2050, it’s estimated that an area the size of India will have been released from the plough and the cow.
Since its inception, the environmental movement has been obsessed by finite resources. The two books that kicked off the green industry in the early 1970s, The Limits to Growth in America and Blueprint for Survival in Britain, both lamented the imminent exhaustion of metals, minerals and fuels. The Limits to Growth predicted that if growth continued, the world would run out of gold, mercury, silver, tin, zinc, copper and lead well before 2000...To this day none of those metals has significantly risen in price or fallen in volume of reserves, let alone run out.
A modern irony is that many green policies advocated now would actually reverse the trend towards using less stuff. A wind farm requires far more concrete and steel than an equivalent system based on gas. Environmental opposition to nuclear power has hindered the generating system that needs the least land, least fuel and least steel or concrete per megawatt.
As we enter the third decade of this century, I’ll make a prediction: by the end of it, we will see less poverty, less child mortality, less land devoted to agriculture in the world. There will be more tigers, whales, forests and nature reserves. Britons will be richer, and each of us will use fewer resources. The global political future may be uncertain, but the environmental and technological trends are pretty clear — and pointing in the right direction.

Friday, December 27, 2019

World in decline? New authoritarian age? - the dangers of declinism.

These are times of high anxiety for both Red and Blue state America. A powerful Op-Ed by Roger Cohen, in the wake of the British election, sees a coming Trump victory in 2020, and David Brooks suggests that voters will pick whichever candidate exhausts them less. Numerous articles outline an emerging world of authoritarian surveillance states that fail to engage the looming disaster of global warming.

Jeremy Adelman, director of the Global History Lab at Princeton University, offers an interesting essay, noting that the idea of decline is one thing the extremes of Left and Right agree upon. Adelman reviews the history of declinism, and describes the fate of several previous dire predictions about humanity's future that did not come to pass, from Malthus' 18th century essay to The Club of Rome's "The Limits to Growth" of 1972. (For a recent rebuttal of declinism, see Matt Ridley's essay in The Spectator: "We've just had the best decade in human history. Seriously.")

Here are some clips from Adelman's piece:
Declinisms share some traits. They have more purchase in times of turmoil and uncertainty. They are also prone to thinking that the circles of hell can be avoided only with a great catharsis or a great charismatic figure.
But most of all: they ignore signs of improvement that point to less drastic ways out of trouble. Declinists have a big blindspot because they are attracted to daring, total, all-encompassing alternatives to the humdrum greyness of modest solutions. Why go for partial and piecemeal when you can overturn the whole system?
One dissenting voice in the 1970s was Albert O Hirschman’s. He worried about the lure of doomsaying. Dire predictions, he warned, can blind big-picture observers to countervailing forces, positive stories and glimmers of solutions. There is a reason why: declinists confuse the growing pains of change with signs of the end of entire systems. Declinism misses the possibility that behind the downsizing old ways there might be new ones poking through.
Why the allure of declinism if history seldom conforms to the predictions? To Hirschman, it was traceable to a prophetic style, one that appealed to intellectuals drawn to ‘fundamentalist’ explanations and who preferred to point to intractable causes of social problems. For revolutionaries, what awaits is a utopian alternative. For reactionaries, what lies in wait is dystopia. The result is an ‘antagonistic’ mode of thinking, a belief that history swings from one big, integrated, all-encompassing system to another. Compared with modest advances, compromises and concessions – how boring! – the magnificent vision of a complete overhaul has so many charms.
The problem with declinism is that it confirms the virtues of our highest, impossible solutions to fundamental problems. It also confirms the disappointments we harbour in the changes we have actually made. This is not to say there aren’t deep-seated problems. But seeing them as evidence of ineluctable demise can impoverish our imaginations by luring us to the sirens of either total change or fatalism.

Wednesday, December 25, 2019

For the holiday season - the gift of self care

To acknowledge that today is a special one for a large fraction of humanity, I want to pass on Tara Parker-Pope's description of suggestions by the Korean Buddhist teacher Haemin Sunim - five simple steps to quiet the mind and soothe stress at any time of the year, all in the spirit of "be good to yourself first - then to others."

Breathe
Start by just taking a deep breath. Become mindful of your breathing. You’ll notice that when you begin, your breathing is shorter and more shallow, but as you continue, your breathing becomes deeper. Take just a few minutes each day to focus on your breathing. “As my breathing becomes much deeper and I’m paying attention to it, I feel much more centered and calm,” Haemin Sunim said. “I feel I can manage whatever is happening right now.”
Accept
Acceptance — of ourselves, our feelings and of life’s imperfections — is a common theme in “Love for Imperfect Things.” The path to self-care starts with acceptance, especially of our struggles. “If we accept the struggling self, our state of mind will soon undergo a change,” Haemin Sunim writes. “When we regard our difficult emotions as a problem and try to overcome them, we only struggle more. In contrast, when we accept them, strangely enough our mind stops struggling and suddenly grows quiet. Rather than trying to change or control difficult emotions from the inside, allow them to be there, and your mind will rest.”
Write
Begin to practice acceptance through a simple writing exercise. Write down the situation you must accept and all that you are feeling. Write down the things in your life that are weighing on you...the goal is to leave it all on the paper. Now go to bed and when you wake up, choose the easiest task on the list to complete. “In the morning, rather than resisting, I will simply do the easiest thing I can do from the list,” Haemin Sunim said. “Once I finish the easiest task, it’s much easier to work on the second.”
Talk
Never underestimate the value of meaningful conversation for your well-being. Make time on a regular basis for a close, nonjudgmental friend...Choose someone who will listen without any kind of judgment...“Once the story is released, you can see it more objectively, and you will know what it is you need to do.”
Walk
One of the easiest ways to care for yourself is to take a walk. Just walking...can distract your mind and create space between you and whatever is causing stress in your life...“If you start walking, your physical energy changes and rather than dwelling on that story, you can pay attention to nature — a tree trunk, a rock. You begin to see things more objectively, and oftentimes that stress within your body will be released simply by walking.”

Monday, December 23, 2019

Is this what my grandson will be doing in a few years?

My seven-year-old grandson Sebastian is a performer, taking piano lessons and reminding me a bit of myself when I was doing the same thing at his age. I think the tenuous similarity in our experiences will soon evaporate, especially in a few years if he joins the world of YouTube "Creators." The following YouTube summary of the most popular videos of 2019 lets me (the 77-year-old retired professor) know I am living on Mars.



Friday, December 20, 2019

Setting back our epigenetic age?

I would like to point to Josh Mitteldorf's blog "Aging Matters", in particular "Pulsed Yamanaka Factors Set Back Epigenetic Age." A clip:
There’s a preprint from David Sinclair’s Harvard laboratory, posted on BioRxiv but not yet published, with very encouraging news for those of us who think that resetting the epigenetic (methylation) clock is a path to anti-aging. They suggest that 3 of the 4 Yamanaka factors, administered in short pulses, can set back the Horvath methylation clock without turning functioning tissues back into stem cells. The same study offers evidence to support the hypothesis that the epigenetic clock is a lethal driver of aging, rather than an adaptive response to damage.

Wednesday, December 18, 2019

MindBlog reviews the HealthyMinds App

The Center for Healthy Minds at the University of Wisconsin, established by my former colleague Richard Davidson, is trying to export the results of its basic research into practical forms - workplace programs, school programs, and now an App for tablets or cell phones.


I've put the App on my iPhone, and want to report my experience of going through its sequence of mini-lectures and exercises, organized under five categories: Foundations, Awareness, Connection, Insight, and Purpose. The basic science presented and the exercises were familiar to me, and I was struck by how effectively the essence of each was presented in its simplest and most accessible form. (I would have liked to be able to see the brief voice lectures also offered in text form.) For example, the instruction most often given for the exercise of paying attention to breathing suggests counting each breath, from one to nine or ten, then repeating, and returning to this practice when one notices having been distracted by other thoughts. Why not just count to three, as the App instructs? It is simpler, and equally effective. For each of the exercises, one can choose a sitting or active version (active meaning doing routine, but not novel, activities), a male or female narrator, and a duration varying from 5 to 30 minutes. It was not until I had finished the available series of lectures and exercises and tried to go back to earlier stages to examine them in more detail that I encountered a requirement to purchase a subscription (~$5/month, but see below). If the ideas and exercises presented were less familiar to me, and I were not already fairly settled in my various practices, I'm sure I would purchase a subscription to the App.

(added note: Please see the comment below. A subscription is not required to repeat the Foundations section I was describing. It is offered for further engagement with the Awareness, Connection, Insight, and Purpose modules.)

Favorite sentences

I have to pass on a few of the favorite sentences collected by book critic Dwight Garner from his 2019 list of books - you can get the citations from the article.
“Watch for the glamorous sentence that appears from nowhere — it might have plans for you.”
“If you don’t know the exact moment when the lights will go out, you might as well read until they do.”
“The one good thing about national anthems is that we’re already on our feet, and therefore ready to run.”
“How’re you doing,” a character asked in Ali Smith’s novel “Spring,” “apart from the end of liberal capitalist democracy?”
“Take a simpleton and give him power and confront him with intelligence — and you have a tyrant.”
In Robert Menasse’s sophisticated novel “The Capital,” set in Brussels, a character watched old nationalist ghosts rise in a tabloid culture, and commented: “He had been prepared for everything, but not everything in caricature.”...also from this novel...“Back in 1914, his grandfather had said, Brussels was the richest and most beautiful city in the world — then they came three times, twice in their boots with rifles, the third time in their trainers with cameras.”
Or, to let it all pass by...
Nabokov told an interviewer in 1974, “I don’t even know who Mr. Watergate is.”
And, I will add a favorite sentence of my own, from Pinker’s guide to writing, “The Sense of Style”:
"The key to good style, far more than obeying any list of commandments, is to have a clear conception of the make-believe world in which you’re pretending to communicate."

Monday, December 16, 2019

Our blood protein profiles change in the fourth, seventh and eighth decades of life

Lehallier et al. find that ~1,380 of the ~3,000 plasma proteins in blood samples from 4,263 people between the ages of 18 and 95 vary significantly with age, with big shifts occurring around the ages of 34, 60, and 78 - in the fourth, seventh and eighth decades of life:
Aging is a predominant risk factor for several chronic diseases that limit healthspan. Mechanisms of aging are thus increasingly recognized as potential therapeutic targets. Blood from young mice reverses aspects of aging and disease across multiple tissues, which supports a hypothesis that age-related molecular changes in blood could provide new insights into age-related disease biology. We measured 2,925 plasma proteins from 4,263 young adults to nonagenarians (18–95 years old) and developed a new bioinformatics approach that uncovered marked non-linear alterations in the human plasma proteome with age. Waves of changes in the proteome in the fourth, seventh and eighth decades of life reflected distinct biological pathways and revealed differential associations with the genome and proteome of age-related diseases and phenotypic traits. This new approach to the study of aging led to the identification of unexpected signatures and pathways that might offer potential targets for age-related diseases.
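Out of curiosity, I tried sketching the abstract's "waves of change" idea in a few lines of Python. This is my own toy illustration, not the authors' bioinformatics approach: simulate proteins whose levels shift abruptly at one of three ages, slide an age window along, and count how many proteins change significantly inside it. Peaks in that count recover the wave ages.

```python
import numpy as np

rng = np.random.default_rng(0)
ages = rng.uniform(18, 95, size=400)          # toy cohort, like 18-95 y

# Toy proteome: 50 proteins that each shift abruptly at one of three
# "wave" ages (34, 60, 78), plus measurement noise.
wave_ages = np.array([34.0, 60.0, 78.0])
shift_age = rng.choice(wave_ages, size=50)
levels = (ages[:, None] > shift_age[None, :]).astype(float)
levels += rng.normal(0, 0.3, size=levels.shape)

def changing_proteins(center, width=5.0, thresh=0.5):
    """Count proteins whose mean level differs across the window center."""
    young = levels[(ages > center - width) & (ages <= center)]
    old = levels[(ages > center) & (ages <= center + width)]
    if len(young) < 5 or len(old) < 5:
        return 0
    return int(np.sum(np.abs(old.mean(axis=0) - young.mean(axis=0)) > thresh))

centers = np.arange(25, 90, 1.0)
counts = [changing_proteins(c) for c in centers]
peak = centers[int(np.argmax(counts))]        # lands near a wave age
```

The real analysis works from measured protein trajectories rather than simulated step changes, but the windowed-counting intuition, looking for ages where many proteins change at once, is similar in spirit.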

Friday, December 13, 2019

Our visual system uses recurrence in its representational dynamics

Fundamental work from Kietzmann et al. shows that recurrence - lateral connections and top-down feedback from higher visual areas to the primary areas that first register visual input - occurs while visual representations are being formed. This recurrence is missing from the feedforward neural network models that dominate both engineering and neuroscience. (Click the link to the article and scroll down to see a fascinating video of their real time magnetoencephalography (MEG) measurements.)


Significance
Understanding the computational principles that underlie human vision is a key challenge for neuroscience and could help improve machine vision. Feedforward neural network models process their input through a deep cascade of computations. These models can recognize objects in images and explain aspects of human rapid recognition. However, the human brain contains recurrent connections within and between stages of the cascade, which are missing from the models that dominate both engineering and neuroscience. Here, we measure and model the dynamics of human brain activity during visual perception. We compare feedforward and recurrent neural network models and find that only recurrent models can account for the dynamic transformations of representations among multiple regions of visual cortex.
Abstract
The human visual system is an intricate network of brain regions that enables us to recognize the world around us. Despite its abundant lateral and feedback connections, object processing is commonly viewed and studied as a feedforward process. Here, we measure and model the rapid representational dynamics across multiple stages of the human ventral stream using time-resolved brain imaging and deep learning. We observe substantial representational transformations during the first 300 ms of processing within and across ventral-stream regions. Categorical divisions emerge in sequence, cascading forward and in reverse across regions, and Granger causality analysis suggests bidirectional information flow between regions. Finally, recurrent deep neural network models clearly outperform parameter-matched feedforward models in terms of their ability to capture the multiregion cortical dynamics. Targeted virtual cooling experiments on the recurrent deep network models further substantiate the importance of their lateral and top-down connections. These results establish that recurrent models are required to understand information processing in the human ventral stream.
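To make the feedforward/recurrent distinction concrete, here is a toy sketch of my own (the architecture and numbers are invented for illustration, not the authors' deep network models): a feedforward layer computes its representation of a fixed input once, while a recurrent layer with lateral connections keeps transforming its representation over time steps. That ongoing transformation is the kind of representational dynamics the MEG recordings revealed.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=8)                         # a fixed "visual input"
W_in = rng.normal(size=(8, 8)) / np.sqrt(8)    # feedforward weights
W_lat = rng.normal(size=(8, 8)) / 8            # lateral (recurrent) weights

relu = lambda v: np.maximum(v, 0.0)

# Feedforward: one shot, the representation is fixed thereafter.
h_ff = relu(W_in @ x)

# Recurrent: the same input, but the representation keeps evolving as
# lateral feedback mixes the layer's own previous state back in.
h = relu(W_in @ x)
trajectory = [h.copy()]
for t in range(10):
    h = relu(W_in @ x + W_lat @ h)
    trajectory.append(h.copy())

# Distance of each state from the initial one: zero for a feedforward
# pass, growing over time steps for the recurrent layer.
drift = [float(np.linalg.norm(s - trajectory[0])) for s in trajectory]
```

The paper's "virtual cooling" of lateral connections corresponds here to zeroing `W_lat`, which collapses the recurrent trajectory back to the single feedforward state.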

Wednesday, December 11, 2019

More insight into metformin's beneficial effects on diabetes, aging, and several diseases.

A group at McMaster University has shown an effect of the diabetes drug metformin beyond its suppression of liver glucose production that might partially explain its beneficial effects on aging and a number of diverse diseases such as cognitive disorders, cancer and cardiovascular disease. (There are currently over 1,500 registered clinical trials to test the effects of metformin in aging and different diseases.) It induces the expression and secretion of growth differentiation factor 15 (GDF15) in mouse liver cells, a protein known to suppress appetite and cause weight loss.

I'm sorely tempted to try to get myself a prescription for the stuff! Here is the technical abstract of the article:
Metformin is the most commonly prescribed medication for type 2 diabetes, owing to its glucose-lowering effects, which are mediated through the suppression of hepatic glucose production (reviewed in refs. 1,2,3). However, in addition to its effects on the liver, metformin reduces appetite and in preclinical models exerts beneficial effects on ageing and a number of diverse diseases (for example, cognitive disorders, cancer, cardiovascular disease) through mechanisms that are not fully understood1,2,3. Given the high concentration of metformin in the liver and its many beneficial effects beyond glycemic control, we reasoned that metformin may increase the secretion of a hepatocyte-derived endocrine factor that communicates with the central nervous system4. Here we show, using unbiased transcriptomics of mouse hepatocytes and analysis of proteins in human serum, that metformin induces expression and secretion of growth differentiation factor 15 (GDF15). In primary mouse hepatocytes, metformin stimulates the secretion of GDF15 by increasing the expression of activating transcription factor 4 (ATF4) and C/EBP homologous protein (CHOP; also known as DDIT3). In wild-type mice fed a high-fat diet, oral administration of metformin increases serum GDF15 and reduces food intake, body mass, fasting insulin and glucose intolerance; these effects are eliminated in GDF15 null mice. An increase in serum GDF15 is also associated with weight loss in patients with type 2 diabetes who take metformin. Although further studies will be required to determine the tissue source(s) of GDF15 produced in response to metformin in vivo, our data indicate that the therapeutic benefits of metformin on appetite, body mass and serum insulin depend on GDF15.

Monday, December 09, 2019

Heritable gaps between chronological age and brain age are increased in common brain disorders.

Kaufmann et al. have used machine learning on a large dataset to derive robust estimates of individual biological brain age on the basis of structural brain imaging features. The deviation between brain age and chronological age — termed the brain age gap — appears to be a promising marker of brain health. It was largest in schizophrenia, multiple sclerosis, dementia, and bipolar spectrum disorder. The authors also assessed the overlap between the genetic underpinnings of the brain age gap and common brain disorders. The bottom line conclusion (from a very extensive and complex analysis) is that common brain disorders are associated with heritable patterns of apparent aging of the brain. Their abstract:
Common risk factors for psychiatric and other brain disorders are likely to converge on biological pathways influencing the development and maintenance of brain structure and function across life. Using structural MRI data from 45,615 individuals aged 3–96 years, we demonstrate distinct patterns of apparent brain aging in several brain disorders and reveal genetic pleiotropy between apparent brain aging in healthy individuals and common brain disorders.
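For readers who want the brain age gap idea in concrete form, here is a minimal sketch of my own, using simulated data and a simple ridge regression rather than the authors' pipeline: fit a model predicting chronological age from imaging features in healthy controls; the gap is then predicted age minus actual age, and a group whose brains "look older" shows a positive mean gap.

```python
import numpy as np

rng = np.random.default_rng(2)
n_train, n_feat = 300, 20
true_w = rng.normal(size=n_feat)

def features(age, extra_aging=0.0):
    """Toy imaging features that track apparent brain age, plus noise."""
    apparent = np.asarray(age) + extra_aging
    return np.outer(apparent, true_w) + rng.normal(0, 5.0, (len(apparent), n_feat))

# Fit ridge regression (closed form, with intercept) on healthy controls.
age_train = rng.uniform(20, 80, n_train)
X = np.hstack([features(age_train), np.ones((n_train, 1))])
w = np.linalg.solve(X.T @ X + 1.0 * np.eye(n_feat + 1), X.T @ age_train)

def brain_age(feats):
    return np.hstack([feats, np.ones((len(feats), 1))]) @ w

# Controls: the gap centers near zero.
age_ctrl = rng.uniform(20, 80, 100)
gap_ctrl = brain_age(features(age_ctrl)) - age_ctrl

# "Patients" whose brains look ~8 years older than their chronological
# age: a clearly positive mean brain age gap.
age_pat = rng.uniform(20, 80, 100)
gap_pat = brain_age(features(age_pat, extra_aging=8.0)) - age_pat
```

The real study adds bias correction and genetic analyses on top of this basic predicted-minus-chronological construction, but the core quantity is the same.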

Friday, December 06, 2019

Same-Sex behavior in animals - a new view.

Monk et al. offer a fresh perspective on the "problem" of how same-sex sexual behavior could have evolved. It is a problem only if different-sex sexual behavior is the baseline condition for animals, from which same-sex behavior has evolved. The authors suggest that same-sex behavior is bound up in the very origins of animal sex. It hasn’t had to continually re-evolve: It’s always been there. The arguments of Monk and collaborators are summarized in a review by Elbein:
Instead of wondering why same-sex behavior had independently evolved in so many species, Ms. Monk and her colleagues suggest that it may have been present in the oldest parts of the animal family tree. The earliest sexually reproducing animals may have mated with any other individual they came across, regardless of sex. Such reproductive strategies are still practiced today by hermaphroditic species, like snails, and species that don’t appear to differentiate, like sea urchins.
Over time, Ms. Monk said, sexual signals evolved — different sizes, colors, anatomical features and behaviors — allowing different sexes to more accurately target each other for reproduction. But same-sex behavior continued in some organisms, leading to diverse sexual behaviors and strategies across the animal kingdom. And while same-sex behavior may grant some evolutionary benefits, an ancient origin would mean those benefits weren’t required for it to exist.
But how has same-sex behavior stuck around? The answer may be that such behaviors aren’t as evolutionarily costly as assumed. Traditionally, Ms. Monk said, any mating behavior that doesn’t produce young is seen as a waste. But animal behavior often doesn’t fit neatly into an economic accounting of costs and benefits.
Here is the abstract of Monk et al.:
Same-sex sexual behaviour (SSB) has been recorded in over 1,500 animal species with a widespread distribution across most major clades. Evolutionary biologists have long sought to uncover the adaptive origins of ‘homosexual behaviour’ in an attempt to resolve this apparent Darwinian paradox: how has SSB repeatedly evolved and persisted despite its presumed fitness costs? This question implicitly assumes that ‘heterosexual’ or exclusive different-sex sexual behaviour (DSB) is the baseline condition for animals, from which SSB has evolved. We question the idea that SSB necessarily presents an evolutionary conundrum, and suggest that the literature includes unchecked assumptions regarding the costs, benefits and origins of SSB. Instead, we offer an alternative null hypothesis for the evolutionary origin of SSB that, through a subtle shift in perspective, moves away from the expectation that the origin and maintenance of SSB is a problem in need of a solution. We argue that the frequently implicit assumption of DSB as ancestral has not been rigorously examined, and instead hypothesize an ancestral condition of indiscriminate sexual behaviours directed towards all sexes. By shifting the lens through which we study animal sexual behaviour, we can more fruitfully examine the evolutionary history of diverse sexual strategies.

Wednesday, December 04, 2019

Something in the way we move.

Gretchen Reynolds points to work by Hug et al. suggesting that each of us has a unique muscle activation signature that can be revealed during walking and pedaling. Understanding movement patterns could help in improving and refining robotics, prosthetics, physical therapy and personalized exercise programs. On the darker side, a Chinese company (Watrix) is using computer vision to enhance the recognition of individuals in crowds by their walking postures:
...its gait recognition solution “Shuidi Shenjian” ... will enable security departments to quickly search and recognize identities by their body shape and walking posture. The company notes that this product is highly effective when targets walk from a long distance or in weak light, cover their faces or wear different clothes, and would be a great supplement to current computer vision products.
Here is the complete abstract from Hug et al.:
Although it is known that the muscle activation patterns used to produce even simple movements can vary between individuals, these differences have not been considered to prove the existence of individual muscle activation strategies (or signatures). We used a machine learning approach (support vector machine) to test the hypothesis that each individual has unique muscle activation signatures. Eighty participants performed a series of pedaling and gait tasks, and 53 of these participants performed a second experimental session on a subsequent day. Myoelectrical activity was measured from eight muscles: vastus lateralis and medialis, rectus femoris, gastrocnemius lateralis and medialis, soleus, tibialis anterior, and biceps femoris-long head. The classification task involved separating data into training and testing sets. For the within-day classification, each pedaling/gait cycle was tested using the classifier, which had been trained on the remaining cycles. For the between-day classification, each cycle from day 2 was tested using the classifier, which had been trained on the cycles from day 1. When considering all eight muscles, the activation profiles were assigned to the corresponding individuals with a classification rate of up to 99.28% (2,353/2,370 cycles) and 91.22% (1,341/1,470 cycles) for the within-day and between-day classification, respectively. When considering the within-day classification, a combination of two muscles was sufficient to obtain a classification rate >80% for both pedaling and gait. When considering between-day classification, a combination of four to five muscles was sufficient to obtain a classification rate >80% for pedaling and gait. These results demonstrate that strategies not only vary between individuals, as is often assumed, but are unique to each individual.
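The paper's within-day leave-one-cycle-out scheme can be sketched in a few lines. This is my own toy version, with simulated data and a nearest-centroid classifier standing in for their support vector machine; all the numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n_subjects, n_cycles, n_feat = 10, 30, 16   # e.g. 8 muscles x 2 summary stats

# Toy data: each subject has a fixed activation "signature" plus
# cycle-to-cycle variability around it.
signatures = rng.normal(size=(n_subjects, n_feat))
cycles = signatures[:, None, :] + rng.normal(0, 0.6, (n_subjects, n_cycles, n_feat))

def classify(test_vec, train):
    """Nearest centroid: assign the cycle to the closest subject mean."""
    centroids = train.mean(axis=1)
    return int(np.argmin(np.linalg.norm(centroids - test_vec, axis=1)))

# Leave-one-cycle-out, mimicking the within-day scheme: each cycle is
# tested against centroids built from the remaining cycles.
correct = 0
for s in range(n_subjects):
    for c in range(n_cycles):
        train = np.delete(cycles, c, axis=1)   # drop cycle c for everyone
        correct += classify(cycles[s, c], train) == s
rate = correct / (n_subjects * n_cycles)
```

With distinct signatures and modest cycle-to-cycle noise, the classification rate comes out high, echoing the paper's finding that activation profiles are individually identifiable; the real study of course uses measured EMG and an SVM rather than this stand-in.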

Monday, December 02, 2019

Rival theories of consciousness being tested by large project.

In the first phase of a $20 million project, six laboratories are going to run experiments with more than 500 participants to test two of the primary theories of consciousness:
The first two contenders are the global workspace theory (GWT), championed by Stanislas Dehaene of the Collège de France in Paris, and the integrated information theory (IIT), proposed by Giulio Tononi of the University of Wisconsin in Madison. The GWT says the brain’s prefrontal cortex, which controls higher order cognitive processes like decision-making, acts as a central computer that collects and prioritizes information from sensory input. It then broadcasts the information to other parts of the brain that carry out tasks. Dehaene thinks this selection process is what we perceive as consciousness. By contrast, the IIT proposes that consciousness arises from the interconnectedness of brain networks. The more neurons interact with one another, the more a being feels conscious—even without sensory input. IIT proponents suspect this process occurs in the back of the brain, where neurons connect in a gridlike structure...Tononi and Dehaene have agreed to parameters for the experiments and have registered their predictions. To avoid conflicts of interest, the scientists will neither collect nor interpret the data. If the results appear to disprove one theory, each has agreed to admit he was wrong—at least to some extent.
The labs, in the United States, Germany, the United Kingdom, and China, will use three techniques to record brain activity as volunteers perform consciousness-related tasks: functional magnetic resonance imaging, electroencephalography, and electrocorticography (a form of EEG done during brain surgery, in which electrodes are placed directly on the brain). In one experiment, researchers will measure the brain’s response when a person becomes aware of an image. The GWT predicts the front of the brain will suddenly become active, whereas the IIT says the back of the brain will be consistently active.

Friday, November 29, 2019

The real cost of texting and tweeting.

Agnes Callard, an associate professor of philosophy at the University of Chicago, crystallizes some fascinating points in a New York Times Op-Ed piece, wondering why she broadcasts the details of her daily life on Twitter. Some clips:
To allow others to think about us in whatever way they feel like — perhaps to laugh at us, perhaps to dismiss us — is a huge loss of control. So why do we allow it? What is the attraction of it? I think that it’s the increase in control we get in return. Social media has enabled the Great Control Swap. And it is happening right now, beneath our notice.
The first baby step toward the Great Swap was the shift from phone calls to texts. A phone interaction requires participants to be “on the same time,” which entails negotiations over entrance into and exit from the conversation...A text or email interaction, by contrast, liberates the parties so that each may operate on their own time. But the cost comes in another form of control: data....text-based communication requires stationary words...they leave a trail.
We understood from the start that this form of socializing — like an affair without physical contact — was shallower than the other, more demanding kind. We were prepared to accept that trade-off, but failed to grasp that we were trading away more than depth. We were also trading away a kind of control.
All of us have a desire to connect, to be seen. But we live in a world that is starting to allow us to satisfy that desire without feeling the common-sense moral strictures that have traditionally governed human relationships. We can engage without obligation, without boredom and, most importantly, without subjecting our attention to the command of another. On Twitter, I’m never obligated to listen through to the end of someone’s story.
The immense appeal of this free-form socializing lies in the way it makes one a master of one’s own time — but it cannot happen without a place. All that data has to sit somewhere so that people can freely access it whenever they wish. Data storage is the loss of control by which we secure social control: Facebook is our faithless mistress’s leaky inbox.
When we alienate our identities as text data, and put that data “out there” to be read by anyone who wanders by, we are putting ourselves into the interpretive hands of those who have no bonds or obligations or agreements with us, people with whom we are, quite literally, prevented from seeing “eye to eye.” People we cannot trust.
The Great Control Swap buys us control over the logistics of our interactions at the cost of interpretive control over the content of those interactions. Our words have lost their wings, and fallen to the ground as data.

Wednesday, November 27, 2019

Cognitive and noncognitive predictors of success.

An interesting bit of work from Duckworth et al.
When predicting success, how important are personal attributes other than cognitive ability? To address this question, we capitalized on a full decade of prospective, longitudinal data from n = 11,258 cadets entering training at the US Military Academy at West Point. Prior to training, cognitive ability was negatively correlated with both physical ability and grit. Cognitive ability emerged as the strongest predictor of academic and military grades, but noncognitive attributes were more prognostic of other achievement outcomes, including successful completion of initiation training and 4-y graduation. We conclude that noncognitive aspects of human capital deserve greater attention from both scientists and practitioners interested in predicting real-world success.

Monday, November 25, 2019

How trance states might have forged human societies

I want to pass on a series of clips I have made for my own use from an intriguing article by Mark Vernon in Aeon:
With anatomically modern humans comes culture in a way that had never happened before. And from that culture came religion, with various proposals to map the hows and whys of its emergence. Until recently, the proposals fell into two broad groups – ‘big gods’ theories and ‘false agency’ hypotheses. Big gods theories envisage religion as conjuring up punishing deities. These disciplining gods provided social bonding by telling individuals that wrongdoing incurs massive costs. The problem is that big gods are not a universal feature of religions and, if they are present, they seem correlated to big societies not causes of them. False agency hypotheses...assume that our forebears were jumpy and superstitious: they thought that a shrub swayed because of a spirit not the wind; and they were easily fooled, though their mistakes were evolutionarily advantageous because, on occasion, the swaying was caused by a predator. The false agency hypothesis has been tested and disconfirmed across many experiments.
...there is a need for a new idea, and coming to the fore now is an old one revisited...The explanation is resurfacing in what can be called the trance theory of religious origins, which proposes that our paleolithic ancestors hit on effervescence upon finding that they could induce altered states of consciousness...Effervescence is generated when humans come together to make music or perform rituals, an experience that lingers when the ceremonies are over. The suggestion, therefore, is that collective experiences that are religious or religious-like unify groups and create the energy to sustain them.
Research to test and develop this idea is underway in a multidisciplinary team led by Robin Dunbar at the University of Oxford. The approach appeals to him, in part, because it seems to capture a crucial aspect of religious phenomena missing in suggestions about punishing gods or dangerous spirits. It is not about the fine details of theology, but is about the raw feelings of experience...this raw-feelings element has a transcendental mystical component – something that is only fully experienced in trance states...this sense of transcendence and other worlds is present at some level in almost all forms of religious experience.
...there’s evidence that monkeys and apes experience the antecedents to ecstasy because they seem to experience wonder...a few hundred thousand years ago, archaic humans took a step that ramped up this capacity. They started deliberately to make music, dance and sing. When the synchronised and collective nature of these practices became sufficiently intense, individuals likely entered trance states in which they experienced not only this-worldly splendour but otherworldly intrigue... What you might call religiosity was born. It stuck partly because it also helped to ease tensions and bond groups, via the endorphin surges produced in trance states. In other words, altered states proved evolutionarily advantageous: the awoken human desire for ecstasy simultaneously prompted a social revolution because it meant that social groups could grow to much larger sizes via the shared intensity of heightened experiences.
The trance hypothesis...rests on the rituals that produce peak experiences, which means it doesn’t require speculating about what ancient people did or didn’t believe about spirits and gods...Asking when religion evolved is not a good question because religion is more than one thing...asking when the various elements such as supernatural agents and moral obligations started to coalesce together is a better question. And they invariably start to coalesce around rituals.
...when villages and then towns appear...new techniques for managing social pressures are required...religious systems (Doctrinal religions) that include specialists such as priests and impressive constructions we’d call temples and/or domestic house-based shrines...sustain the prosocial effects of earlier types of religiosity for groups that are now growing very large indeed...a tension...arises when religious experiences are institutionalised....what’s on offer is somewhat thinner than experiences gained in the immersive rites that precipitate altered states. Encountering spirit entities directly in a dance or chase is not the same as the uplift offered by a monumental building.
...religions are caught between the Scylla of socially useful but potentially dreary religious rites and the Charybdis of altered states that are intrinsically exciting but socially disruptive. It’s why they bring bloody conflicts as well as social goods. This way of putting it highlights another feature of the trance theory. It interweaves two levels of explanation: one focused on the allure of spiritual vitality; the other on practical needs.
..science cannot decide whether the claims of any one religion are true. But the new theory still makes quite a strong claim, which brings me back to the role of the supernatural, transcendence and religious gods that today’s secularists seem inclined to sideline. If the science cannot confirm convictions about any divine revelations received, it does lend credence to the reasonableness, even necessity, of having them. Where the big gods and false agency hypotheses seemed inherently sniffy about human religiosity, the trance hypothesis positively values it...The trance hypothesis is neutral about the truth claims of religions whether you believe or don’t, though it does suggest that transcendent states of mind are meaningful to human beings and can evolve into religious systems of belief.
And in this final observation there is, perhaps, some good news for us, whether we’re religious or not. It’s often said that many of today’s troubles, from divisive political debates to spats on social media, are due to our tribal nature. It’s added, somewhat fatalistically, that deep within our evolutionary past is the tendency to identify with one group and demonise another. We are destined to be at war, culturally or otherwise. But if the trance theory is true, it shows that the evolutionary tendency to be tribal rests on an evolutionary taste for that which surpasses tribal experience – the transcendence that humans glimpsed in altered states of mind that enabled them to form tribes to start with.
If we long to belong, we also long to be in touch with ‘the more’, as the great pioneer of the study of religious experiences William James called it. That more will be envisaged in numerous ways. But it might help us by prompting new visions that exceed our herd instincts and binary thinking, and ease social tensions. If it helped our ancestors to survive, why would we think we are any different?

Friday, November 22, 2019

Evidence for premature aging caused by insufficient sleep.

I have come to realize in the past year or so that my physical and mental robustness require getting at least seven, and preferably eight, hours of sleep every night. Thus I was intrigued by finding an extensive and well-documented study by Teo et al. (open access) showing that telomeres, sequences of DNA on the end of chromosomes taken as a marker of biological aging, are, on average, 356 base pairs shorter in study participants who slept for fewer than five hours per night than in those who slept for seven hours. They found that sleep metrics were reported more accurately by wearable fitness trackers than by self-report. Here is the abstract of their article, titled "Digital phenotyping by consumer wearables identifies sleep-associated markers of cardiovascular disease risk and biological aging."
Sleep is associated with various health outcomes. Despite their growing adoption, the potential for consumer wearables to contribute sleep metrics to sleep-related biomedical research remains largely uncharacterized. Here we analyzed sleep tracking data, along with questionnaire responses and multi-modal phenotypic data generated from 482 normal volunteers. First, we compared wearable-derived and self-reported sleep metrics, particularly total sleep time (TST) and sleep efficiency (SE). We then identified demographic, socioeconomic and lifestyle factors associated with wearable-derived TST; they included age, gender, occupation and alcohol consumption. Multi-modal phenotypic data analysis showed that wearable-derived TST and SE were associated with cardiovascular disease risk markers such as body mass index and waist circumference, whereas self-reported measures were not. Using wearable-derived TST, we showed that insufficient sleep was associated with premature telomere attrition. Our study highlights the potential for sleep metrics from consumer wearables to provide novel insights into data generated from population cohort studies.
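The abstract's key contrast is that wearable-derived total sleep time (TST) tracked cardiovascular risk markers like body mass index while self-reported TST did not. As a toy illustration of that kind of comparison (this is not the authors' code, and all the numbers below are made up), one can correlate each sleep measure with BMI:

```python
# Minimal sketch, not from the paper: why a device-measured sleep metric can
# correlate with a risk marker while a noisier self-reported one does not.

def pearson_r(xs, ys):
    """Pearson correlation coefficient for two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical volunteers: wearable TST varies with BMI; self-reports
# cluster around "about 7 hours" and blur the signal.
wearable_tst = [5.1, 5.8, 6.4, 7.0, 7.6, 8.2]   # hours, device-measured
reported_tst = [7.0, 6.0, 8.0, 7.0, 7.5, 7.0]   # hours, questionnaire
bmi          = [31.0, 29.5, 27.8, 26.0, 24.5, 23.0]

print(round(pearson_r(wearable_tst, bmi), 2))   # strong negative correlation
print(round(pearson_r(reported_tst, bmi), 2))   # much weaker correlation
```

The point of the sketch is only that measurement error in self-report attenuates correlations with downstream health markers, which is one plausible reading of the study's result.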

Wednesday, November 20, 2019

A "Department of the Attention Economy"

Popping up on my daily input stream (in this case the Google News aggregator - which knows more than I do about what I might like to see) is a CNN business perspective titled "Andrew Yang: As president, I will establish a Department of the Attention Economy." It is an idea that I wish some of the more likely Democratic nominees would take up.

The article immediately caught my attention, because faced with the immense array of text and video streams competing for my attention I feel, as I suspect many MindBlog readers do, like one of the dogs in Martin Seligman's classic learned helplessness experiments, whose stress and immune systems eventually were compromised by uncertainty. For entertainment should I be subscribing to Netflix, Hulu, Amazon Prime, Disney+, YouTube TV, Apple TV+, CBS All Access, AcornTV, Britbox, Shudder, YouTube, Facebook Watch, Tubi, etc.? For news, there are too many options to even begin to list them. Apart from my own qualms about using Google as a prosthesis (Blogger, Google Docs, Calendar, Mail, etc.), I look at how my 5- and 7-year-old grandsons' lives are potentially compromised by the amount of free time they spend on digital inputs rather than playing outside with friends.

Clips from Yang's article:
...technology is addictive and damaging the mental health of our children. Research shows that too much time spent on social media increases stress, anxiety, depression and feelings of isolation. Other studies have found that extended screen time can negatively affect sleep...As president, I will establish a Department of the Attention Economy that will work with tech companies and implement regulations that curb the negative effects of smartphones and social media.
A few of his suggestions:
We can start by curbing design features that maximize screen time, such as removing autoplay video and capping recommendations for videos, articles and posts for each user each day. Platforms can also use deep-learning algorithms to determine whether a user is a child, and then explore capping the user's screen hours per day.
Design features that encourage social validation should also be removed. Instagram is leading the way by testing hiding likes on the posts of some users. That's a step in the right direction and it should be implemented as soon as possible. In addition, the number of followers a person has on social media should be hidden too, as it represents a false equivalence with a person's social standing.
Another area that deserves attention is the content our kids consume. When I was growing up, television time meant morning cartoons and after-school specials. Rules and standards should be established to protect kids from graphic content and violent imagery. Subsequently, these regulations would also incentivize the production of high-quality content and positive programming.
It shouldn't stop there. Parents have a major role to play — and they want to — but they could use some help. Companies should be required to provide parents with guidance on kid-healthy content (similar to the rating system for TV or movies), and parents should easily be able to monitor content and screen time for children.

Monday, November 18, 2019

Social class is revealed by brief clips of speech.

Kraus et al. - a collective modern version of Professor Henry Higgins in George Bernard Shaw's play Pygmalion - offer a detailed analytic update on how social class is reproduced through subtle cues expressed in brief speech. Here is their abstract:
Economic inequality is at its highest point on record and is linked to poorer health and well-being across countries. The forces that perpetuate inequality continue to be studied, and here we examine how a person’s position within the economic hierarchy, their social class, is accurately perceived and reproduced by mundane patterns embedded in brief speech. Studies 1 through 4 examined the extent that people accurately perceive social class based on brief speech patterns. We find that brief speech spoken out of context is sufficient to allow respondents to discern the social class of speakers at levels above chance accuracy, that adherence to both digital and subjective standards for English is associated with higher perceived and actual social class of speakers, and that pronunciation cues in speech communicate social class over and above speech content. In study 5, we find that people with prior hiring experience use speech patterns in preinterview conversations to judge the fit, competence, starting salary, and signing bonus of prospective job candidates in ways that bias the process in favor of applicants of higher social class. Overall, this research provides evidence for the stratification of common speech and its role in both shaping perceiver judgments and perpetuating inequality during the briefest interactions.
Here is a sample explanatory clip from their results section:
A total of 229 perceivers were asked to listen to the speech of 27 unique speakers whose utterances were collected as part of a larger sample of 189 speakers through the International Dialects of English Archive (IDEA). These 27 speakers varied in terms of age, race, gender, and social class, which we measured in the present study in terms of high school or college degree attainment. Our sample of perceivers listened to 7 words spoken by each of the speakers presented consecutively and randomly without any other accompanying speech and answered “Yes” or “No” to 4 questions: “Is this person a college graduate/woman/young/white?” Participants answered these 4 questions in a randomized order, and we calculated the proportion of correct responses for each question...
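The accuracy claim in the abstract ("above chance") rests on exactly this proportion-correct procedure. As a toy sketch (not the authors' analysis, and with a hypothetical hit count), an exact binomial test shows how one perceiver's yes/no judgments across the 27 speakers could be compared against the 50% chance rate:

```python
# Minimal sketch, not from the paper: testing whether a listener judged a
# speaker attribute (e.g., "Is this person a college graduate?") above
# chance. The hit count below is hypothetical.
from math import comb

def binom_p_upper(k, n, p=0.5):
    """One-sided exact binomial p-value: P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n_judgments = 27          # one yes/no judgment per speaker
n_correct = 19            # hypothetical number of correct answers
p_value = binom_p_upper(n_correct, n_judgments)
print(f"accuracy = {n_correct / n_judgments:.2f}, one-sided p = {p_value:.4f}")
```

Under these assumed numbers, 19 of 27 correct is significantly better than coin-flipping, which is the sense in which "above chance accuracy" is meant; the actual study aggregated such judgments over 229 perceivers.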