Wednesday, February 01, 2017

Are human-specific plastic cortical synaptic connections what makes us human?

I want to pass on an excellent primer (open access) on the plasticity of a specific synapse between pyramidal neurons and fast-spiking interneurons that has been observed only in the human neocortex.
One outstanding difference between Homo sapiens and other mammals is the ability to perform highly complex cognitive tasks and behaviors, such as language, abstract thinking, and cultural diversity. How is this accomplished? According to one prominent theory, cognitive complexity is proportional to the repetition of specific computational modules over a large surface expansion of the cerebral cortex (neocortex). However, the human neocortex was shown to also possess unique features at the cellular and synaptic levels, raising the possibility that expanding the computational module is not the only mechanism underlying complex thinking. In a study published in PLOS Biology, Szegedi and colleagues analyzed a specific cortical circuit from live postoperative human tissue, showing that human-specific, very powerful excitatory connections between principal pyramidal neurons and inhibitory neurons are highly plastic. This suggests that exclusive plasticity of specific microcircuits might be considered among the mechanisms endowing the human neocortex with the ability to perform highly complex cognitive tasks.

Tuesday, January 31, 2017

An individual's ultimate economic burden can be forecast in childhood

Important work from Caspi et al., who show that about 20% of the population accounts for close to 80% of economic burden on several measures. This group can be predicted with high accuracy from as early as age 3.
Policymakers are interested in early-years interventions to ameliorate childhood risks. They hope for improved adult outcomes in the long run that bring a return on investment. The size of the return that can be expected partly depends on how strongly childhood risks forecast adult outcomes, but there is disagreement about whether childhood determines adulthood. We integrated multiple nationwide administrative databases and electronic medical records with the four-decade-long Dunedin birth cohort study to test child-to-adult prediction in a different way, using a population-segmentation approach. A segment comprising 22% of the cohort accounted for 36% of the cohort’s injury insurance claims; 40% of excess obese kilograms; 54% of cigarettes smoked; 57% of hospital nights; 66% of welfare benefits; 77% of fatherless child-rearing; 78% of prescription fills; and 81% of criminal convictions. Childhood risks, including poor brain health at three years of age, predicted this segment with large effect sizes. Early-years interventions that are effective for this population segment could yield very large returns on investment.
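The population-segmentation arithmetic behind such claims is easy to sketch. Below is a minimal illustration with synthetic data (my construction, not the authors' Dunedin analysis): generate a heavy-tailed per-person cost measure, take the costliest 20% of individuals, and compute the share of the total they account for.

    import numpy as np

    rng = np.random.default_rng(7)
    # Synthetic stand-in for a per-person cost measure (e.g., hospital nights);
    # a lognormal gives the heavy right tail typical of such data.
    cost = rng.lognormal(mean=0.0, sigma=1.5, size=1000)

    order = np.argsort(cost)[::-1]            # costliest individuals first
    segment = order[: int(0.20 * len(cost))]  # top 20% segment
    share = cost[segment].sum() / cost.sum()
    print(f"top 20% of individuals account for {share:.0%} of total cost")

With a tail this heavy, the top fifth typically captures well over half of the total, which is the qualitative pattern Caspi et al. report across several of their administrative measures.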

Monday, January 30, 2017

The uniformity illusion.

Otten et al. investigate a visual illusion in which the accurate and detailed vision in the center of our visual field, accomplished by the fovea, influences our perception of peripheral stimuli, making them seem more similar to the center. The open-access article contains several nice examples of this illusion.
Vision in the fovea, the center of the visual field, is much more accurate and detailed than vision in the periphery. This is not in line with the rich phenomenology of peripheral vision. Here, we investigated a visual illusion that shows that detailed peripheral visual experience is partially based on a reconstruction of reality. Participants fixated on the center of a visual display in which central stimuli differed from peripheral stimuli. Over time, participants perceived that the peripheral stimuli changed to match the central stimuli, so that the display seemed uniform. We showed that a wide range of visual features, including shape, orientation, motion, luminance, pattern, and identity, are susceptible to this uniformity illusion. We argue that the uniformity illusion is the result of a reconstruction of sparse visual information (from the periphery) based on more readily available detailed visual information (from the fovea), which gives rise to a rich, but illusory, experience of peripheral vision.

Friday, January 27, 2017

Regression to the mean - Why we would all be better off if we ignored Trump’s tweets

O’Donnell’s answer to the annual edge.org question "What scientific term or concept ought to be more widely known?":
My candidate is an old, simple, and powerful one: the law of regression to the mean. It’s a concept from the discipline of statistics, but in real life it means that anomalies are anomalies, coincidences happen (all the time, with stunning frequency), and the main thing they tell us is that the next thing to happen is very likely to be a lot more boring, ordinary, and predictable. Put in the simplest human terms, it teaches us not to be so excitable, not to be so worried, not to be so excited: Life really will be, for the most part, boring and predictable.
The ancient and late antique intellectuals whom I spend my life studying wouldn’t talk so much about miracles and portents if they could calm down and think about the numbers. The baseball fans thrilled to see the guy on a hitting streak come to the plate wouldn’t be so disappointed when he struck out. Even people reading election returns would see much more normality lurking inside shocking results than television reporters can admit.
Heeding the law of regression to the mean would help us slow down, calm down, pay attention to the long term and the big picture, and react with a more strategic patience to crises large and small. We’d all be better off.
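Regression to the mean is also easy to demonstrate with a short simulation (my addition, not O'Donnell's): if an observed score is stable skill plus independent luck, the most extreme performers in round one look far more ordinary in round two.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    skill = rng.normal(0, 1, n)            # stable component
    round1 = skill + rng.normal(0, 1, n)   # observed score = skill + luck
    round2 = skill + rng.normal(0, 1, n)   # same skill, fresh luck

    top = round1 > np.quantile(round1, 0.99)  # the round-one "anomalies"
    print(round1[top].mean())  # roughly 3.8: spectacular
    print(round2[top].mean())  # roughly 1.9: halfway back to the mean

Because skill and luck contribute equal variance in this toy setup, the round-one stars are expected to fall exactly halfway back to the mean in round two: the skill persists, the luck does not.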

Thursday, January 26, 2017

Smartphone reprogramming of our brains?

Nicolelis makes some good points as he adds to the genre of literature that predicts a diminution of our brain power caused by dependence on the latest technological advance (abacus, slide rule, electronic calculator, computer, etc.). Here is his statement of alarm:
Could our constant reciprocal interaction with digital logic (through laptops, tablets, smartphones, all the way to highly immersive virtual reality environments), particularly when it leads to powerfully hedonic experiences, result in the slow compromise or even elimination of some of the behaviours and cognitive aptitudes that represent the most exquisite and cherished attributes of the human condition? Attributes such as our multifaceted social skills, empathy, linguistic semantics, aesthetic sense, artistic expression, intuition, creativity and the ability to improvise solutions to novel contingencies, to name just a few. In other words, could opting for the fast lane of the never-ending highway to full digital immersion and automation — an obvious current trend in our modern society — produce a reduction in human cognitive capabilities?
Nicolelis goes on to note that the human brain cannot be reduced to the algorithmic nature of a Turing machine; rather, it is an organic computer in which hardware and software, from the molecular to the organismal level, cannot be dissociated, one that uses a recursive mix of analogue and digital processing.
Even though the brain cannot be reduced to a digital machine, could the human brain simply assimilate and begin to mimic the rigid binary logic and algorithmic mode of operation of digital machines due to the growing overexposure to digital devices and the hedonic response triggered by these interactions, and become a biological digital system?
I would volunteer the notion that passive immersion in the digital systems of modern airplanes (in the case of pilots), digital imaging diagnostics (radiologists) and computer-assisted design (architects) may gradually curtail the range and acuity of some mental functions and cognitive skills, such as creativity, insight and the ability to solve novel problems…when people believe that a series of statements that they have been asked to remember will be stored online, they perform worse than a control group that relies only on their own biological memory to remember the statements. This suggests that subcontracting some simple mental searches to Google may, after all, reduce our own brain’s ability to store and recall memories reliably.
The impact of online social media on our natural social skills is another area in which we may be able to measure the true effects of digital systems on human behaviour…An intense presence on social media and virtual reality environments can produce significant anxiety, a reduction in real social interactions, lack of social skills and human empathy, and difficulties in handling solitude. … symptoms and signs of addiction to virtual life are often reported…I began wondering whether the new ‘always connected’ routine is overtaxing our cerebral cortex by dramatically expanding the number of people with whom we can closely communicate, almost instantaneously, via the multitude of social media outlets available on the internet. Instead of respecting the group size limit (about 150 individuals) afforded by our cortical volume, we are now in continuous contact with a group of people that could far exceed that neurobiological limit. What are the consequences of this cortical overtaxing? Anxiety, attention, cognitive and even memory deficits?
Homo digitalis
Is the above scenario something we should pay attention to? I think so, not only because of the potential impact on the mental health of this and future generations, but also because of the far-reaching consequences of our increasing interaction with digital systems. For example, at the far limit, I can conceive that this staggering expansion in our online social connectivity is capable of providing a completely new type of selective pressure that may, eventually, bias the evolutionary future of our species. One may begin wondering whether the dawn of ‘Homo digitalis’ is upon us or, more surprisingly, whether he/she is already around, texting and tweeting without being noticed.

Wednesday, January 25, 2017

Gender and the conflation of equality and sameness

I want to pass on some clips from a sane brief essay by Helena Cronin, author of "The Ant and the Peacock: Altruism and Sexual Selection from Darwin to Today."
The poet Philip Larkin famously proclaimed that sex began in 1963. He was inaccurate by 800 million years. Moreover, what began in the 1960s was instead a campaign to oust sex—in particular, sex differences—in favor of gender...biological differences were thought to spell genetic determinism, immutability, anti-feminism and, most egregiously, women's oppression. Gender, however, was the realm of societal forces; "male" and "female" were social constructs...
...gender has distorted social policy. This is because the campaign has undergone baleful mission-creep. Its aim has morphed from ending discrimination against women into a deeply misguided quest for sameness of outcome for males and females in all fields—above all, 50:50 across the entire workplace. This stems from a fundamental error: the conflation of equality and sameness. And it's an error all too easily made if your starting point is that the sexes are "really" the same and that apparent differences are mere artifacts of sexist socialization.
Equality is about fair treatment, not about people or outcomes being identical; so fairness does not and should not require sameness. However, when sameness gets confused with equality—and equality is of course to do with fairness—then sameness ends up undeservedly sharing their moral high ground. And male/female discrepancies become a moral crusade. Why so few women CEOs or engineers? It becomes socially suspect to explain this as the result not of discrimination but of differential choice.
Well, it shouldn’t be suspect. Because the sexes do differ—and in ways that, on average, make a notable difference to their distribution in today's workplace.
Here's why the sexes differ. A sexual organism must divide its total reproductive investment into two—competing for mates and caring for offspring. Almost from the dawn of sexual reproduction, one sex specialized slightly more in competing for mates and the other slightly more in caring for offspring...the differences go far beyond reproductive plumbing. They are distinctive adaptations for the different life-strategies of competers and carers. Wherever ancestral males and females faced different adaptive problems, we should expect sex differences—encompassing bodies, brains and behaviour. And we should expect that, reflecting those differences, competers and carers will have correspondingly different life-priorities.
As for different outcomes in the workplace, the causes are above all different interests and temperaments (and not women being "less clever" than men). Women on average have a strong preference for working with people—hence the nurses and teachers; and, compared to men, they care more about family and relationships and have broader interests and priorities—hence little appeal in becoming CEOs. Men have far more interest in "things"—hence the engineers; and they are vastly more competitive: more risk-taking, ambitious, status-seeking, single-minded, opportunistic—hence the CEOs. So men and women have, on average, different conceptions of what constitutes success (despite the gender quest to impose the same—male—conception on all).
And here's some intriguing evidence. "Gender" predicts that, as discrimination diminishes, males and females will increasingly converge. But a study of 55 nations found that it was in the most liberal, democratic, equality-driven countries that divergence was greatest. The less the sexism, the greater the sex differences. Difference, this suggests, is evidence not of oppression but of choice; not socialization, not patriarchy, not false consciousness, not even pink t-shirts or personal pronouns … but female choice.
An evolutionary understanding shows that you can't have sex without sex differences. It is only within that powerful scientific framework—in which ideological questions become empirical answers—that gender can be properly understood. And, as the fluidity of "sexualities" enters public awareness, sex is again crucial for informed, enlightened discussion.
So for the sake of science, society and sense, bring back sex.

Tuesday, January 24, 2017

Knowing how confidently we know

Here is a fascinating piece of work from Miyamoto et al. showing that parallel streams of information in the brain regulate the confidence that a memory is correct, apart from the memory itself. From the journal's description of the work:
Self-monitoring and evaluation of our own memory is a mental process called metamemory. For metamemory, we need access to information about the strength of our own memory traces. The brain structures and neural mechanisms involved in metamemory are completely unknown. Miyamoto et al. devised a test paradigm for metamemory in macaques, in which the monkeys judged their own confidence in remembering past experiences. The authors combined this approach with functional brain imaging to reveal the neural substrates of metamemory for retrospection. A specific region in the prefrontal brain was essential for metamnemonic decision-making. Inactivation of this region caused selective impairment of metamemory, but not of memory itself.
and, the abstract from Miyamoto et al.:
We know how confidently we know: Metacognitive self-monitoring of memory states, so-called “metamemory,” enables strategic and efficient information collection based on past experiences. However, it is unknown how metamemory is implemented in the brain. We explored causal neural mechanism of metamemory in macaque monkeys performing metacognitive confidence judgments on memory. By whole-brain searches via functional magnetic resonance imaging, we discovered a neural correlate of metamemory for temporally remote events in prefrontal area 9 (or 9/46d), along with that for recent events within area 6. Reversible inactivation of each of these identified loci induced doubly dissociated selective impairments in metacognitive judgment performance on remote or recent memory, without impairing recognition performance itself. The findings reveal that parallel metamemory streams supervise recognition networks for remote and recent memory, without contributing to recognition itself.

Monday, January 23, 2017

How our evolutionary psychology elected Donald Trump.

While I feel that in principle our world might be best governed by a multinational meritocratic elite (of the sort that just met in Davos, Switzerland), I can’t even begin to feel the same kind of emotional bonding to this vague impersonal entity that I feel towards my hometown of Austin, Texas, or Madison, Wisconsin, where I spent my adult working life. (And business oligarchies governing the world have shown much more regard for maximizing profits than for the maintenance and quality of local human communities, the entities that most of us care about and can bond to.) Our brains evolved and are hardwired to care most about family and tribe. Brooks makes these points very compellingly in his recent Op-Ed piece, which notes the old German sociological distinction between gemeinschaft and gesellschaft.
All across the world, we have masses of voters who live in a world of gemeinschaft: where relationships are personal, organic and fused by particular affections. These people define their loyalty to community, faith and nation in personal, in-the-gut sort of ways.
But we have a leadership class and an experience of globalization that is from the world of gesellschaft: where systems are impersonal, rule based, abstract, indirect and formal.
Many people in Europe love their particular country with a vestigial affection that is like family — England, Holland or France. But meritocratic elites of Europe gave them an abstract intellectual construct called the European Union.
Many Americans think their families and their neighborhoods are being denuded by the impersonal forces of globalization, finance and technology. All the Republican establishment could offer was abstract paeans to the free market. All the Democrats could offer was Hillary Clinton, the ultimate cautious, remote, calculating, gesellschaft thinker.
It was the right moment for Trump, the ultimate gemeinschaft man. He is all gut instinct, all blood and soil, all about loyalty over detached reason. His business is a pre-modern family clan, not an impersonal corporation, and he is staffing his White House as a pre-modern family monarchy, with his relatives and a few royal retainers. In his business and political dealings, he simply doesn’t acknowledge the difference between private and public, personal and impersonal. Everything is personal, pulsating outward from his needy core.
Brooks goes on to argue that what made Trump right electorally will also make him an incompetent president. The danger is not so much the rise of fascism, a new authoritarian age, but that "everything will become disorganized, chaotic, degenerate, clownish and incompetent." How does the ultimate anti-institutional man sit at the nerve center of a four-million-person institution?

I think a good analogy is to hope that over time these millions of people, like the nerve cells in our brain, will "work around" the focal lesion (Trump) to restore and maintain normal operations of the system.

Friday, January 20, 2017

The deepening of our cultural echo chambers.

Farhad Manjoo does a nice piece in the Tech and Society section of the NY Times, pointing out how much has changed since the 1970s, when TV programs like “All in the Family” had broad cultural reach, being watched by one out of every three households with a television. Norman Lear’s “One Day at a Time” was watched by 17 million viewers every week. A new version of “One Day at a Time” on Netflix will almost certainly fail to replicate such a broad cultural reach. Some clips from Manjoo:
The shows are separated by 40 years of technological advances — a progression from the over-the-air broadcast era in which Mr. Lear made it big, to the cable age of MTV and CNN and HBO, to, finally, the modern era of streaming services like Netflix. Each new technology allowed a leap forward in choice, flexibility and quality; the “Golden Age of TV” offers so much choice that some critics wonder if it’s become overwhelming…Across the entertainment business, from music to movies to video games, technology has flooded us with a profusion of cultural choice.
...we’re returning to the cultural era that predated radio and TV, an era in which entertainment was fragmented and bespoke…It was a really odd moment in history to have so many people watching the same thing at the same time… for a brief while, from the 1950s to the late 1980s, broadcast television served cultural, social and political roles far greater than the banality of its content would suggest. Because it featured little choice, TV offered something else: the raw material for a shared culture.
As the broadcast era changed into one of cable and then streaming, TV was transformed from a wasteland into a bubbling sea of creativity. But it has become a sea in which everyone swims in smaller schools...Only around 12 percent of television households, or about 14 million to 15 million people, regularly tuned into “NCIS” and “The Big Bang Theory,” the two most popular network shows of the 2015-16 season…Netflix’s biggest original drama last year, “Stranger Things,” was seen by about 14 million adults in the month after it first aired…during much of the 1980s, a broadcast show that attracted 14 million to 16 million viewers would have been in danger of cancellation.
A spokesman for Netflix pointed out that even if audiences were smaller than in the past, its shows still had impact. “Making a Murderer” set off a re-examination of a widely criticized murder trial, for instance, while “Orange Is the New Black” was one of the first shows to feature a transgender actor, Laverne Cox….But I suspect the impacts, like the viewership, tend to be restricted along the same social and cultural echo chambers into which we’ve split ourselves in the first place. Those effects do not approach the vast ways that TV once remade the culture.

Thursday, January 19, 2017

The immensity of the vacated present.

I am repeating, as I did with last Thursday's post, a post from several years ago with material that continues to be personally important to me. Here it is:

The title of this post is a phrase from a recent essay by Vivian Gornick, "The cost of daydreaming," describing an experience that very much resonates with my own, and that I think describes her discovery of the distinction between our internal mind-wandering (default mode) and present-centered, outwardly oriented (attentional) brain networks (the subject of many MindBlog posts). On finding that she could sense the start of daydreaming and suppress it:
...the really strange and interesting thing happened. A vast emptiness began to open up behind my eyes as I went about my daily business. The daydreaming, it seemed, had occupied more space than I’d ever imagined. It was as though a majority of my waking time had routinely been taken up with fantasizing, only a narrow portion of consciousness concentrated on the here and now...I began to realize what daydreaming had done for me — and to me.
Turning 60 was like being told I had six months to live. Overnight, retreating into the refuge of a fantasized tomorrow became a thing of the past. Now there was only the immensity of the vacated present...It wasn’t hard to cut short the daydreaming, but how exactly did one manage to occupy the present when for so many years one hadn’t?"
Then, after a period of time:
...I became aware, after a street encounter, that the vacancy within was stirring with movement. A week later another encounter left me feeling curiously enlivened. It was the third one that did it. A hilarious exchange had taken place between me and a pizza deliveryman, and sentences from it now started repeating themselves in my head as I walked on, making me laugh each time anew, and each time with yet deeper satisfaction. Energy — coarse and rich — began to swell inside the cavity of my chest. Time quickened, the air glowed, the colors of the day grew vivid; my mouth felt fresh. A surprising tenderness pressed against my heart with such strength it seemed very nearly like joy; and with unexpected sharpness I became alert not to the meaning but to the astonishment of human existence. It was there on the street, I realized, that I was filling my skin, occupying the present.

Wednesday, January 18, 2017

A fitness downside to statin drugs?

Before passing on this article by Gretchen Reynolds and the work of Chung et al. that it points to, I'll start with a personal account of why it immediately caught my attention. I started taking a statin (10 mg simvastatin) over 20 years ago, not because my lipids were high, but because I read that statins had anti-inflammatory effects. Over the past year I have become increasingly alarmed that my hand muscle mass and grip strength were weakening. For a recital pianist, this can mean the end of performing. Since a known side effect of statins is to do just this, I stopped taking simvastatin, and within days could feel muscle mass and strength returning. (I'm doing a recital on Feb. 19.) It's interesting that this side effect became obvious only after many years; I would guess this is a function of aging (I'm 74).

Now, getting to the work Reynolds notes, a new study in mice suggests that statin drugs make exercise more difficult and less beneficial. Animals on statins lose grip strength, are more easily fatigued, and do not show the normal exercise-induced increase in muscle fiber size. Here is the technical abstract:
HMG-CoA reductase inhibitors (statins) are the most effective pharmacological means of reducing cardiovascular disease risk. The most common side effect of statin use is skeletal muscle myopathy, which may be exacerbated by exercise. Hypercholesterolemia and training status are factors that are rarely considered in the progression of myopathy. The purpose of this study was to determine the extent to which acute and chronic exercise can influence statin-induced myopathy in hypercholesterolemic (ApoE-/-) mice. Mice either received daily injections of saline or simvastatin (20 mg/kg) while: 1) remaining sedentary (Sed), 2) engaging in daily exercise for two weeks (novel, Nov), or 3) engaging in daily exercise for two weeks after a brief period of training (accustomed, Acct) (2x3 design, n = 60). Cholesterol, activity, strength, and indices of myofiber damage and atrophy were assessed. Running wheel activity declined in both exercise groups receiving statins (statin x time interaction, p < 0.05). Cholesterol, grip strength, and maximal isometric force were significantly lower in all groups following statin treatment (statin main effect, p < 0.05). Mitochondrial content and myofiber size were increased and 4-HNE was decreased by exercise (statin x exercise interaction, p < 0.05), and these beneficial effects were abrogated by statin treatment. Exercise (Acct and Nov) increased atrogin-1 mRNA in combination with statin treatment, yet enhanced fiber damage or atrophy was not observed. The results from this study suggest that exercise (Nov, Acct) does not exacerbate statin-induced myopathy in ApoE-/- mice, yet statin treatment reduces activity in a manner that prevents muscle from mounting a beneficial adaptive response to training.

Tuesday, January 17, 2017

Research on consequences of low socioeconomic status becoming a small industry.

It is becoming hard to keep up with research on the biological and behavioral consequences of low socioeconomic status - one of MindBlog's subject threads since its beginning in 2006. I pass on the abstracts of two recent samples of this work. First, Gary Evans on childhood poverty and adult psychological well-being:
Childhood disadvantage has repeatedly been linked to adult physical morbidity and mortality. We show in a prospective, longitudinal design that childhood poverty predicts multimethodological indices of adult (24 y of age) psychological well-being while holding constant similar childhood outcomes assessed at age 9. Adults from low-income families manifest more allostatic load, an index of chronic physiological stress, higher levels of externalizing symptoms (e.g., aggression) but not internalizing symptoms (e.g., depression), and more helplessness behaviors. In addition, childhood poverty predicts deficits in adult short-term spatial memory.
And, Pepper and Nettle in an upcoming target article for Behavioral and Brain Sciences titled "The Behavioural Constellation of Deprivation: Causes and Consequences":
Socioeconomic differences in behaviour are pervasive and well documented, but their causes are not yet well understood. Here, we make the case that there is a cluster of behaviours associated with lower socioeconomic status, which we call the behavioural constellation of deprivation. We propose that the relatively limited control associated with lower socioeconomic status curtails the extent to which people can expect to realise deferred rewards, leading to more present-oriented behaviour in a range of domains. We illustrate this idea using the specific factor of extrinsic mortality risk, an important factor in evolutionary theoretical models. We emphasise the idea that the present-oriented behaviours of the constellation are a contextually appropriate response to structural and ecological factors, rather than pathology or a failure of willpower. We highlight some principles from evolutionary theoretical models that can deepen our understanding of how socioeconomic inequalities can become amplified and embedded. These principles are that: 1) Small initial disparities can lead to larger eventual inequalities, 2) Feed-back loops can operate to embed early life circumstances, 3) Constraints can breed further constraints, and 4) Feed-back loops can operate over generations. We discuss some of the mechanisms by which socioeconomic status may influence behaviour. We then review how the contextually appropriate response perspective that we have outlined fits with other findings about control and temporal discounting. Finally, we discuss the implications of this interpretation for research and policy.

Monday, January 16, 2017

Positivity in older adults is more related to cognitive decline than to emotion regulation.

It is commonly supposed that the more positive outlook characteristic of older people is due to their ability to regulate their emotions more effectively than younger people do. Zebrowitz et al., to the contrary, suggest that a decline in cognitive capacity is responsible, arguing that more cognitive resources are required to process negative stimuli because they are more cognitively elaborated than positive ones:
An older adult positivity effect, i.e., the tendency for older adults to favor positive over negative stimulus information more than do younger adults, has been previously shown in attention, memory, and evaluations. This effect has been attributed to greater emotion regulation in older adults. In the case of attention and memory, this explanation has been supported by some evidence that the older adult positivity effect is most pronounced for negative stimuli, which would motivate emotion regulation, and that it is reduced by cognitive load, which would impede emotion regulation. We investigated whether greater older adult positivity in the case of evaluative responses to faces is also enhanced for negative stimuli and attenuated by cognitive load, as an emotion regulation explanation would predict. In two studies, younger and older adults rated trustworthiness of faces that varied in valence both under low and high cognitive load, with the latter manipulated by a distracting backwards counting task. In Study 1, face valence was manipulated by attractiveness (low/disfigured faces, medium, high/fashion models’ faces). In Study 2, face valence was manipulated by trustworthiness (low, medium, high). Both studies revealed a significant older adult positivity effect. However, contrary to an emotion regulation account, this effect was not stronger for more negative faces, and cognitive load increased rather than decreased the rated trustworthiness of negatively valenced faces. Although inconsistent with emotion regulation, the latter effect is consistent with theory and research arguing that more cognitive resources are required to process negative stimuli, because they are more cognitively elaborated than positive ones. The finding that increased age and increased cognitive load both enhanced the positivity of trustworthy ratings suggests that the older adult positivity effect in evaluative ratings of faces may reflect age-related declines in cognitive capacity rather than increases in the regulation of negative emotions.

Friday, January 13, 2017

Seeing faces of young black boys facilitates identification of threatening stimuli.

From Todd et al. (open access):
Pervasive stereotypes linking Black men with violence and criminality can lead to implicit cognitive biases, including the misidentification of harmless objects as weapons. In four experiments, we investigated whether these biases extend even to young Black boys (5-year-olds). White participants completed sequential priming tasks in which they categorized threatening and nonthreatening objects and words after brief presentations of faces of various races (Black and White) and ages (children and adults). Results consistently revealed that participants had less difficulty (i.e., faster response times, fewer errors) identifying threatening stimuli and more difficulty identifying nonthreatening stimuli after seeing Black faces than after seeing White faces, and this racial bias was equally strong following adult and child faces. Process-dissociation-procedure analyses further revealed that these effects were driven entirely by automatic (i.e., unintentional) racial biases. The collective findings suggest that the perceived threat commonly associated with Black men may generalize even to young Black boys.
Note: There is a correction to the description of experiment 3.
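For readers unfamiliar with the process-dissociation procedure mentioned in the abstract, the standard Payne-style estimating equations are simple. Here is an illustrative calculation with made-up response rates (not numbers from Todd et al.):

    def process_dissociation(p_correct_congruent, p_error_incongruent):
        # Assume a controlled process succeeds with probability C; when it
        # fails, an automatic bias A drives the response. Then:
        #   congruent trials:   P(correct) = C + (1 - C) * A
        #   incongruent trials: P(error)   = (1 - C) * A
        # Solving those two equations gives the estimates below.
        C = p_correct_congruent - p_error_incongruent
        A = p_error_incongruent / (1.0 - C)
        return C, A

    C, A = process_dissociation(0.85, 0.30)  # hypothetical rates
    print(f"controlled estimate C = {C:.2f}, automatic estimate A = {A:.2f}")

A higher automatic estimate following Black faces, with the controlled estimate unchanged, is the kind of pattern the authors describe as an entirely automatic racial bias.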

Thursday, January 12, 2017

The milliseconds of a choice - Watching your mind when it matters.

I'm finding, with increasing frequency, that an article about health or psychology in the New York Times that I find interesting has an attached note that it was first published several years earlier. While working on yesterday's MindBlog post I came across a 2014 post I wrote that I think makes some important points about our self-regulation that are worth repeating. So, I'm going to copy what the Times is doing and repeat it today. I'm tempted to edit it, but won't, beyond mentioning that I would now considerably tone down my positive reference to brain-training games (which I no longer indulge in). Here is the 2014 post:

This is actually a post about mindfulness, in reaction to Dan Hurley's article describing how contemporary applications of the ancient tradition of mindfulness meditation are being engaged in many more contexts than the initial emphasis on chilling out in the 1970s, and being employed for very practical purposes such as mental resilience in a war zone. It seems to me that we are approaching a well-defined technology of brain control whose brain basis is understood in some detail. I've done numerous posts on behavioral and brain correlates of mindfulness meditation (enter 'meditation' or 'mindfulness' in MindBlog's search box in the left column). For example, only four weeks of a mindfulness meditation regime emphasizing relaxation of different body parts correlates with increases in white matter (nerve tract) efficiency. Improvements in cognitive performance, working memory, etc., have been claimed. A special issue of the journal Social Cognitive and Affective Neuroscience discusses issues in this research.

Full-time mindfulness might be a bad idea, suppressing the mind wandering that facilitates bursts of creative insight. (During my vision research career, my most original ideas popped up when I was spacing out, once when I was riding a bike along a lakeshore path.) Many physicists and writers report that their best ideas happen when they are disengaged. It also appears that mindfulness may inhibit implicit learning, in which habits and skills are acquired without conscious awareness.

Obviously, knowing whether we are in an attentional or a mind-wandering (default, narrative) mode is useful (see here, and here), and this is where the title of this post comes in. Noting and distinguishing our mind state is most effectively accomplished with a particular style of alertness or awareness that is functioning very soon (less than 200 milliseconds) after a new thought or sensory perception appears to us. This is a moment of fragility that offers a narrow time window of choice over whether our new brain activity will be enhanced or diminished in favor of a more desired activity. This is precisely what happens in mindfulness meditation, which instructs a central focus of some sort (breathing, body relaxation, or whatever) to which one returns as soon as one notes that any other thoughts or distractions have popped into awareness. The ability to rapidly notice and attend to thoughts and emotions on these short time scales is enhanced by brain-training regimes of the sort offered by BrainHq of positscience.com and others. I have found the exercises on this site, originated by Michael Merzenich, to be the most useful. It offers summaries of changes in brain speed, attention, memory, intelligence, navigation, etc., that result from performing the exercises - changes that can persist for years.

A book title that has been popping into my head for at least the last 15 years is "The 200 Millisecond Manager" (a riff on the title of the popular early-1980s book by Blanchard and Johnson, "The One Minute Manager"). The gist of the argument would be that given in the "Guide" section of some 2005 writing, and actually in Chapter 12 of my book, Figure 12-7.

It might make the strident assertion that what matters most in regulating our thoughts, feelings, and actions is their first 100-200 msec in the brain, which is when the levers and pulleys are actually doing their thing. It would be a nuts-and-bolts approach to altering - or at least inhibiting - self-limiting behaviors. It would suggest that a central trick is to avoid taking on the ‘enormity of it all,’ and instead use a variety of techniques to bring our awareness down to the normally invisible 100-200 msec time interval in which our actions are being programmed. This is the period during which all the limbic and other routines that result from life script, self-image, temperament, etc., actually start up. The suggestion is that you can short-circuit some of this process if you bring awareness to the level of observing the moments during which a reaction or behavior is becoming resident, and can sometimes say “I don’t think so, I think I'll do something else instead.”

"The 200 msec Manager" has gone through the ‘this could be a book’ cycle several times, the actual execution  bogging down as I actually got into description of the underlying science and techniques for expanding awareness. Also, I note the enormous number of books out there on meditation, relaxation, etc. that are all really addressing the same core processes in different ways.

Wednesday, January 11, 2017

People who move more are happier.

No surprises here, but this study, which polled people using a smartphone app designed by the experimenters, quantifies the effect. The use of smartphones to gather large-scale data is becoming a growth industry. A notable earlier study of this sort was Killingsworth and Gilbert's 2010 "A wandering mind is an unhappy mind."
Physical activity, both exercise and non-exercise, has far-reaching benefits to physical health. Although exercise has also been linked to psychological health (e.g., happiness), little research has examined physical activity more broadly, taking into account non-exercise activity as well as exercise. We examined the relationship between physical activity (measured broadly) and happiness using a smartphone application. This app has collected self-reports of happiness and physical activity from over ten thousand participants, while passively gathering information about physical activity from the accelerometers on users' phones. The findings reveal that individuals who are more physically active are happier. Further, individuals are happier in the moments when they are more physically active. These results emerged when assessing activity subjectively, via self-report, or objectively, via participants' smartphone accelerometers. Overall, this research suggests that not only exercise but also non-exercise physical activity is related to happiness. This research further demonstrates how smartphones can be used to collect large-scale data to examine psychological, behavioral, and health-related phenomena as they naturally occur in everyday life.
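The passive-sensing side of such apps is easy to picture. Here is a minimal sketch (my construction; the paper does not describe its pipeline at this level) of how raw accelerometer samples might be reduced to a per-epoch activity index:

    import numpy as np

    def activity_index(samples, gravity=9.81):
        # samples: (n, 3) array of accelerometer readings in m/s^2.
        # Returns the mean deviation of acceleration magnitude from gravity.
        magnitude = np.linalg.norm(samples, axis=1)
        return float(np.mean(np.abs(magnitude - gravity)))

    # Simulated 50 Hz epochs: sitting still vs. a bout of walking.
    rng = np.random.default_rng(1)
    still = rng.normal([0.0, 0.0, 9.81], 0.05, size=(1500, 3))
    walking = rng.normal([0.0, 0.0, 9.81], 2.0, size=(1500, 3))
    print(activity_index(still))    # near zero: a sedentary epoch
    print(activity_index(walking))  # much larger: an active epoch

Each epoch's index can then be paired with the nearest self-report of happiness, which is the momentary comparison the authors describe.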

Tuesday, January 10, 2017

From Power to Inaction.

An interesting little piece from Durso et al. on a consequence of feeling powerful. (The paper appears to be open access, so you can note the details of the two experiments, involving the usual gaggle of undergraduate psychology students serving as subjects and given course credit for their participation.)
Research has shown that people who feel powerful are more likely to act than those who feel powerless, whereas people who feel ambivalent are less likely to act than those whose reactions are univalent (entirely positive or entirely negative). But what happens when powerful people also are ambivalent? On the basis of the self-validation theory of judgment, we hypothesized that power and ambivalence would interact to predict individuals’ action. Because power can validate individuals’ reactions, we reasoned that feeling powerful strengthens whatever reactions people have during a decision. It can strengthen univalent reactions and increase action orientation, as shown in past research. Among people who hold an ambivalent judgment, however, those who feel powerful would be less action oriented than those who feel powerless. Two experiments provide evidence for this hypothesized interactive effect of power and ambivalence on individuals’ action tendencies during both positive decisions (promoting an employee; Experiment 1) and negative decisions (firing an employee; Experiment 2). In summary, when individuals’ reactions are ambivalent, power increases the likelihood of inaction.

Monday, January 09, 2017

The Second Law of Thermodynamics is the First Law of Psychology.

I pass along a clip from Steven Pinker’s contribution to the edge.org annual question “What scientific term or concept ought to be more widely known?” He notes a recent paper by Tooby, Cosmides, and Barrett with the title of this post, and continues:
The Second Law of Thermodynamics states that in an isolated system (one that is not taking in energy), entropy never decreases. (The First Law is that energy is conserved; the Third, that a temperature of absolute zero is unreachable.)
Why the awe for the Second Law? The Second Law defines the ultimate purpose of life, mind, and human striving: to deploy energy and information to fight back the tide of entropy and carve out refuges of beneficial order. An underappreciation of the inherent tendency toward disorder, and a failure to appreciate the precious niches of order we carve out, are a major source of human folly.
To start with, the Second Law implies that misfortune may be no one’s fault. The biggest breakthrough of the scientific revolution was to nullify the intuition that the universe is saturated with purpose: that everything happens for a reason. In this primitive understanding, when bad things happen—accidents, disease, famine—someone or something must have wanted them to happen. This in turn impels people to find a defendant, demon, scapegoat, or witch to punish. Galileo and Newton replaced this cosmic morality play with a clockwork universe in which events are caused by conditions in the present, not goals for the future. The Second Law deepens that discovery: Not only does the universe not care about our desires, but in the natural course of events it will appear to thwart them, because there are so many more ways for things to go wrong than to go right. Houses burn down, ships sink, battles are lost for the want of a horseshoe nail.
Poverty, too, needs no explanation. In a world governed by entropy and evolution, it is the default state of humankind. Matter does not just arrange itself into shelter or clothing, and living things do everything they can not to become our food. What needs to be explained is wealth. Yet most discussions of poverty consist of arguments about whom to blame for it. More generally, an underappreciation of the Second Law lures people into seeing every unsolved social problem as a sign that their country is being driven off a cliff. It’s in the very nature of the universe that life has problems. But it’s better to figure out how to solve them—to apply information and energy to expand our refuge of beneficial order—than to start a conflagration and hope for the best. 
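Pinker's "more ways for things to go wrong than to go right" has a compact formal core, which I'll add here as my own gloss (it is not part of his essay). In standard notation, for an isolated system

    \Delta S \ge 0 ,

and Boltzmann's relation

    S = k_B \ln \Omega

makes the point a counting argument: a disordered macrostate corresponds to astronomically more microstates \Omega than an ordered one, so an isolated system drifts toward disorder simply because there are overwhelmingly more ways to be disordered.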

Friday, January 06, 2017

Dual streams of speech processing.

A large number of studies have documented how visual information in the brain is processed in dual streams: dorsal (where is it?) and ventral (what is it?). Fridriksson et al. have now applied a dual-route model to speech processing that distinguishes form-to-meaning from form-to-articulation processing. I pass on their abstract plus one graphic showing the brain regions they are dealing with:
Several dual route models of human speech processing have been proposed suggesting a large-scale anatomical division between cortical regions that support motor–phonological aspects vs. lexical–semantic aspects of speech processing. However, to date, there is no complete agreement on what areas subserve each route or the nature of interactions across these routes that enables human speech processing. Relying on an extensive behavioral and neuroimaging assessment of a large sample of stroke survivors, we used a data-driven approach using principal components analysis of lesion-symptom mapping to identify brain regions crucial for performance on clusters of behavioral tasks without a priori separation into task types. Distinct anatomical boundaries were revealed between a dorsal frontoparietal stream and a ventral temporal–frontal stream associated with separate components. Collapsing over the tasks primarily supported by these streams, we characterize the dorsal stream as a form-to-articulation pathway and the ventral stream as a form-to-meaning pathway. This characterization of the division in the data reflects both the overlap between tasks supported by the two streams as well as the observation that there is a bias for phonological production tasks supported by the dorsal stream and lexical–semantic comprehension tasks supported by the ventral stream. As such, our findings show a division between two processing routes that underlie human speech processing and provide an empirical foundation for studying potential computational differences that distinguish between the two routes.


Component 1 (form-to-meaning processing necessary for single-word and sentence comprehension, also reversed as meaning-to-form processing to support lexical–semantic aspects of speech production) is shown at left; Component 2 (form-to-articulation processing) is shown in the center (Component 2a); and Component 2 modulated by a lesion component derived from lesion maps is shown at right (Component 2b).
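The "data-driven approach" in the abstract - principal components analysis over behavioral scores, followed by lesion-symptom mapping - can be sketched compactly. The toy version below, on synthetic data, is my own illustration of the general logic, not the authors' pipeline:

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    n_patients, n_tasks, n_voxels = 100, 12, 500

    # Synthetic stand-ins: behavioral task scores and binary lesion maps.
    scores = rng.normal(size=(n_patients, n_tasks))
    lesions = rng.integers(0, 2, size=(n_patients, n_voxels))

    # Step 1: PCA groups the behavioral tasks into components without any
    # a priori separation into task types.
    components = PCA(n_components=2).fit_transform(scores)

    # Step 2: for each component, find voxels whose lesion status tracks
    # the component score (a simple point-biserial correlation).
    for c in range(2):
        y = components[:, c]
        r = np.array([np.corrcoef(lesions[:, v], y)[0, 1]
                      for v in range(n_voxels)])
        print(f"component {c}: most associated voxels {np.argsort(np.abs(r))[-5:]}")

In the real study, the two components separated cleanly into a dorsal form-to-articulation stream and a ventral form-to-meaning stream.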

Thursday, January 05, 2017

The effect of status on stress depends on the stability of the hierarchy.

In most human societies, individuals with higher socioeconomic status live longer, experience increased well-being, and have lower rates of stress-related diseases such as cardiovascular conditions and type 2 diabetes, benefits that may be explained in part by the stress-buffering effects of status. Knight and Mehta provide evidence that this effect depends on how stable the social hierarchy is. They suggest that during times of hierarchical instability, when status could change, high status might boost, not buffer, stress responses. I want to pass on their description of how social status and hierarchy stability were experimentally manipulated in the undergraduate participants in their study, followed by their abstract.
We tested our predictions by experimentally manipulating social status and hierarchy stability in undergraduate participants (n = 118; 57.3% female; age: M = 19.8) who were recruited for course credit. Participants were told that, on the basis of their responses to prelaboratory questionnaires, they had been assigned to complete an upcoming puzzle-building task as either a “manager” (high status) or “builder” (low status), and that another participant (actually a confederate) would perform the unassigned role. Participants were told specifically that the assignment was based on their “leadership skills and experience” to connect the role assignment to prestige. In reality, roles were randomly assigned. Participants were also told that the manager would be in charge of directing subordinates in the building process and would evaluate the “builder” at the end of the task to determine how to split bonus money.
Next, all participants were asked to complete the Trier Social Stress Test (TSST), a 5-min speech about one’s qualification for a job and a 5-min serial subtraction math task in front of a panel of observers. To manipulate hierarchy stability, participants were told that their role (manager/builder) could change based on the speech/math task (unstable hierarchy) or that their performance on the task would not affect their role assignment (stable hierarchy). A 5-min preparation period was completed in the presence of a sex-matched confederate to increase the salience of the manipulations. Panelists and confederates were blind to participants’ assigned conditions. Participants provided informed consent to participate in a group activity and perform a speech task. The University of Oregon’s Institutional Review Board approved all methods.
Hormones were assayed from saliva collected via passive drool ∼10 min after arriving at the laboratory (baseline), as well as 0, 20, and 40 min after the TSST. Participants responded to a prompt asking how “in control” they felt after assignment to status and stability conditions and after the TSST, which was included as a separate item in a broader measure of self-reported affect. Three independent observers rated videos of each participant’s speech for status-relevant behaviors and two items that assessed overall interview performance.
Abstract
High social status reduces stress responses in numerous species, but the stress-buffering effect of status may dissipate or even reverse during times of hierarchical instability. In an experimental test of this hypothesis, 118 participants (57.3% female) were randomly assigned to a high- or low-status position in a stable or unstable hierarchy and were then exposed to a social-evaluative stressor (a mock job interview). High status in a stable hierarchy buffered stress responses and improved interview performance, but high status in an unstable hierarchy boosted stress responses and did not lead to better performance. This general pattern of effects was observed across endocrine (cortisol and testosterone), psychological (feeling in control), and behavioral (competence, dominance, and warmth) responses to the stressor. The joint influence of status and hierarchy stability on interview performance was explained by feelings of control and testosterone reactivity. Greater feelings of control predicted enhanced interview performance, whereas increased testosterone reactivity predicted worse performance. These results provide direct causal evidence that high status confers adaptive benefits for stress reduction and performance only when the social hierarchy is stable. When the hierarchy is unstable, high status actually exacerbates stress responses.

Wednesday, January 04, 2017

What is different about the brains of “superagers”?

Barrett and colleagues have performed fMRI studies on “superagers” aged 60-80 and find that superagers not only perform similarly to young adults on memory testing but also do not show the patterns of brain atrophy typical of aging in “emotional” regions (the midcingulate cortex and the anterior insula) that are major hubs for general communication throughout the brain, serving language, stress, internal organ regulation, and sensory coordination. These regions are key nodes of the default mode network, well known to be involved in episodic memory function, and the salience network, implicated in attention, executive control, and the motivational and inhibitory processes integral to memory encoding and retrieval. The authors suggest that the key to maintaining these areas and their function is strenuous physical and mental athleticism - working hard at difficult tasks, whether physical or mental.

Here is a graphic from the article followed by the abstract:


Superaging signature. The figure shows key nodes of the salience network (blue) and default mode network (yellow) where superagers and young adults are indistinguishable in cortical thickness. Preserved thickness in these regions is what distinguishes superagers from typical older adults.
Abstract
Decline in cognitive skills, especially in memory, is often viewed as part of “normal” aging. Yet some individuals “age better” than others. Building on prior research showing that cortical thickness in one brain region, the anterior midcingulate cortex, is preserved in older adults with memory performance abilities equal to or better than those of people 20–30 years younger (i.e., “superagers”), we examined the structural integrity of two large-scale intrinsic brain networks in superaging: the default mode network, typically engaged during memory encoding and retrieval tasks, and the salience network, typically engaged during attention, motivation, and executive function tasks. We predicted that superagers would have preserved cortical thickness in critical nodes in these networks. We defined superagers (60–80 years old) based on their performance compared to young adults (18–32 years old) on the California Verbal Learning Test Long Delay Free Recall test. We found regions within the networks of interest where the cerebral cortex of superagers was thicker than that of typical older adults, and where superagers were anatomically indistinguishable from young adults; hippocampal volume was also preserved in superagers. Within the full group of older adults, thickness of a number of regions, including the anterior temporal cortex, rostral medial prefrontal cortex, and anterior midcingulate cortex, correlated with memory performance, as did the volume of the hippocampus. These results indicate older adults with youthful memory abilities have youthful brain regions in key paralimbic and limbic nodes of the default mode and salience networks that support attentional, executive, and mnemonic processes subserving memory function.
In the NYTimes piece describing this work, Barrett suggests:
The road to superaging is difficult, though, because these brain regions have another intriguing property: When they increase in activity, you tend to feel pretty bad — tired, stymied, frustrated. Think about the last time you grappled with a math problem or pushed yourself to your physical limits. Hard work makes you feel bad in the moment. The Marine Corps has a motto that embodies this principle: “Pain is weakness leaving the body.” That is, the discomfort of exertion means you’re building muscle and discipline. Superagers are like Marines: They excel at pushing past the temporary unpleasantness of intense effort. Studies suggest that the result is a more youthful brain that helps maintain a sharper memory and a greater ability to pay attention.

Tuesday, January 03, 2017

How to market the reality of climate change more effectively.

Baldwin and Lammers perform several studies showing that conservatives are positively affected by past-focused but not future-focused environmental comparisons. In one of the studies, for example, subjects were shown a set of satellite images of a river basin either full of water or dried up. The authors manipulated temporal comparisons by describing the photographs as reflecting changes in the environment from the past to the present (past-focused condition) or expected changes in the environment from the present to the future (future-focused condition). Participants then reported their proenvironmental attitudes. Conservatives were more proenvironmental after the past-to-present description than after the present-to-future description. Here are their summaries:

Significance
Political polarization on important issues can have dire consequences for society, and divisions regarding the issue of climate change could be particularly catastrophic. Building on research in social cognition and psychology, we show that temporal comparison processes largely explain the political gap in respondents’ attitudes towards and behaviors regarding climate change. We found that conservatives’ proenvironmental attitudes and behaviors improved consistently and drastically when we presented messages that compared the environment today with that of the past. This research shows how ideological differences can arise from basic psychological processes, demonstrates how such differences can be overcome by framing a message consistent with these basic processes, and provides a way to market the science behind climate change more effectively.
Abstract
Conservatives appear more skeptical about climate change and global warming and less willing to act against it than liberals. We propose that this unwillingness could result from fundamental differences in conservatives’ and liberals’ temporal focus. Conservatives tend to focus more on the past than do liberals. Across six studies, we rely on this notion to demonstrate that conservatives are positively affected by past- but not by future-focused environmental comparisons. Past comparisons largely eliminated the political divide that separated liberal and conservative respondents’ attitudes toward and behavior regarding climate change, so that across these studies conservatives and liberals were nearly equally likely to fight climate change. This research demonstrates how psychological processes, such as temporal comparison, underlie the prevalent ideological gap in addressing climate change. It opens up a promising avenue to convince conservatives effectively of the need to address climate change and global warming.

Monday, January 02, 2017

I used to be a human being - how technology almost killed me.

Andrew Sullivan has written a striking piece describing a process that began with his daily immersion in The Daily Dish, an early blog that was a precursor of everything to come. Here are some clips...you should read the whole article.
I was…a very early adopter of what we might now call living-in-the-web. And as the years went by, I realized I was no longer alone. Facebook soon gave everyone the equivalent of their own blog and their own audience. More and more people got a smartphone — connecting them instantly to a deluge of febrile content, forcing them to cull and absorb and assimilate the online torrent as relentlessly as I had once. Twitter emerged as a form of instant blogging of microthoughts. Users were as addicted to the feedback as I had long been — and even more prolific. Then the apps descended, like the rain, to inundate what was left of our free time. It was ubiquitous now, this virtual living, this never-stopping, this always-updating. I remember when I decided to raise the ante on my blog in 2007 and update every half-hour or so, and my editor looked at me as if I were insane. But the insanity was now banality; the once-unimaginable pace of the professional blogger was now the default for everyone.
…the rewards were many: an audience of up to 100,000 people a day…a way to measure success — in big and beautiful data — that was a constant dopamine bath for the writerly ego.
I tried reading books, but that skill now began to elude me. After a couple of pages, my fingers twitched for a keyboard. I tried meditation, but my mind bucked and bridled as I tried to still it…Although I spent hours each day, alone and silent, attached to a laptop, it felt as if I were in a constant cacophonous crowd of words and images, sounds and ideas, emotions and tirades…I’d begun to fear that this new way of living was actually becoming a way of not-living.
…my real life and body were still here. But then I began to realize, as my health and happiness deteriorated, that this was not a both-and kind of situation. It was either-or. Every hour I spent online was not spent in the physical world. Every minute I was engrossed in a virtual interaction I was not involved in a human encounter. Every second absorbed in some trivia was a second less for any form of reflection, or calm, or spirituality. “Multitasking” was a mirage. This was a zero-sum question. I either lived as a voice online or I lived as a human being in the world that humans had lived in since the beginning of time...And so I decided, after 15 years, to live in reality.
Truly being with another person means being experientially with them, picking up countless tiny signals from the eyes and voice and body language and context, and reacting, often unconsciously, to every nuance. These are our deepest social skills, which have been honed through the aeons. They are what make us distinctively human.
By rapidly substituting virtual reality for reality, we are diminishing the scope of this interaction even as we multiply the number of people with whom we interact. We remove or drastically filter all the information we might get by being with another person. We reduce them to some outlines — a Facebook “friend,” an Instagram photo, a text message — in a controlled and sequestered world that exists largely free of the sudden eruptions or encumbrances of actual human interaction. We become each other’s “contacts,” efficient shadows of ourselves...When we enter a coffee shop in which everyone is engrossed in their private online worlds, we respond by creating one of our own. When someone next to you answers the phone and starts talking loudly as if you didn’t exist, you realize that, in her private zone, you don’t. And slowly, the whole concept of a public space — where we meet and engage and learn from our fellow citizens — evaporates.
Has our enslavement to dopamine — to the instant hits of validation that come with a well-crafted tweet or Snapchat streak — made us happier? I suspect it has simply made us less unhappy, or rather less aware of our unhappiness, and that our phones are merely new and powerful antidepressants of a non-pharmaceutical variety...You need to build an ability to just be yourself and not be doing something. That’s what the phones are taking away...Underneath in your life there’s that thing … that forever empty … that knowledge that it’s all for nothing and you’re alone … That’s why we text and drive … because we don’t want to be alone for a second.
...our need for quiet has never fully gone away, because our practical achievements, however spectacular, never quite fulfill us. They are always giving way to new wants and needs, always requiring updating or repairing, always falling short. The mania of our online lives reveals this: We keep swiping and swiping because we are never fully satisfied. The late British philosopher Michael Oakeshott starkly called this truth “the deadliness of doing.” There seems no end to this paradox of practical life, and no way out, just an infinite succession of efforts, all doomed ultimately to fail.
The Judeo-Christian tradition recognized a critical distinction — and tension — between noise and silence, between getting through the day and getting a grip on one’s whole life. The Sabbath — the Jewish institution co-opted by Christianity — was a collective imposition of relative silence, a moment of calm to reflect on our lives under the light of eternity. It helped define much of Western public life once a week for centuries — only to dissipate, with scarcely a passing regret, into the commercial cacophony of the past couple of decades. It reflected a now-battered belief that a sustained spiritual life is simply unfeasible for most mortals without these refuges from noise and work to buffer us and remind us who we really are. But just as modern street lighting has slowly blotted the stars from the visible skies, so too have cars and planes and factories and flickering digital screens combined to rob us of a silence that was previously regarded as integral to the health of the human imagination...This changes us. It slowly removes — without our even noticing it — the very spaces where we can gain a footing in our minds and souls that is not captive to constant pressures or desires or duties. And the smartphone has all but banished them.
I haven’t given up, even as, each day, at various moments, I find myself giving in. There are books to be read; landscapes to be walked; friends to be with; life to be fully lived. And I realize that this is, in some ways, just another tale in the vast book of human frailty. But this new epidemic of distraction is our civilization’s specific weakness. And its threat is not so much to our minds, even as they shape-shift under the pressure. The threat is to our souls. At this rate, if the noise does not relent, we might even forget we have any.

Friday, December 30, 2016

Resetting the clock of aging - at least in mice.

I pass on a few clips from Nicholas Wade's recent discussion of work done by researchers at the Salk Institute in La Jolla, CA.
In the first attempt to reverse aging by reprogramming the genome, they have rejuvenated the organs of mice and lengthened their life spans by 30 percent. The technique, which requires genetic engineering, cannot be applied directly to people, but the achievement points toward better understanding of human aging and the possibility of rejuvenating human tissues by other means.
The aging process is clocklike in the sense that a steady accumulation of changes eventually degrades the efficiency of the body’s cells. In one of the deepest mysteries of biology, the clock’s hands are always set back to zero at conception...Ten years ago, the Japanese biologist Shinya Yamanaka amazed researchers by identifying four critical genes that reset the clock of the fertilized egg. The four genes are so powerful that they will reprogram even the genome of skin or intestinal cells back to the embryonic state.
The Salk Institute researchers, using whole animals, tested the idea:
...that reprogramming is a stepwise process, and that a small dose of the Yamanaka factors might rejuvenate cells without the total reprogramming that converts cells to the embryonic state...The solution his team developed was to genetically engineer mice with extra copies of the four Yamanaka genes, and to have the genes activated only when the mice received a certain drug in their drinking water, applied just two days a week...“What we saw is that the animal has fewer signs of aging, healthier organs, and at the end of the experiment we could see they had lived 30 percent longer than control mice,” Dr. Izpisua Belmonte said.
Dr. Izpisua Belmonte believes these beneficial effects have been obtained by resetting the clock of the aging process. The clock is created by the epigenome, the system of proteins that clads the cell’s DNA and controls which genes are active and which are suppressed...He sees the epigenome as being like a manuscript that is continually edited. “At the end of life there are many marks and it is difficult for the cell to read them,” he said...What the Yamanaka genes are doing in his mice, he believes, is eliminating the extra marks, thus reverting the cell to a more youthful state.
Dr. Izpisua Belmonte said he was testing drugs to see if he could achieve the same rejuvenation as with the Yamanaka factors. The use of chemicals “will be more translatable to human therapies and clinical applications.”

Thursday, December 29, 2016

Killing old cells to stay young.

I want to pass on one of Science Magazine's choices for the top ten scientific breakthroughs of the year.
Pricey plastic surgery won't stop you from getting old. Nor will dietary supplements, testosterone injections, or those wrinkle creams that imply they'll make you look 21 again. But this year, researchers demonstrated one way to postpone some ravages of time—at least in mice. When they selectively weeded out rundown cells, the animals lived longer and remained healthier as they aged.
The infirm cells the scientists targeted had undergone a partial shutdown known as senescence, in which they lose the ability to divide. Researchers think senescence may prevent worn-out, cancer-prone cells from initiating tumors, but it may also promote aging. As we grow older, more and more cells stop reproducing, potentially robbing our tissues of the ability to replace dead or injured cells. Senescent cells also discharge molecules that can cause problems such as abnormal cell growth and inflammation.
The first study showing that eliminating senescent cells can produce health and longevity benefits, at least in middle-aged mice, came out in February. Deterioration of the animals' hearts and kidneys slowed, and they didn't sprout tumors until later in their lives. Some age-related declines, such as in memory and muscle coordination, didn't abate. Nonetheless, the rodents outlived their contemporaries by more than 20%.
In October, the same research team took aim at senescent cells from the immune system that amass in artery-clogging plaques and may drive their formation. Removing these cells from mice that are prone to atherosclerosis reduced the amount of fatty buildup in the animals' arteries by 60%, even though the rodents gorged on fat-laden food.
The multibillion-dollar question: Will taking out senescent cells help humans stay young longer? Both studies used genetically modified mice that clear away their senescent cells in response to a particular compound—a technique that isn't feasible in humans. But researchers have created several so-called senolytic drugs that slay senescent cells without genetic tinkering. Next year, scientists will launch the first clinical trial of one of those drugs in people who have arthritis.
References:
D. J. Baker et al., “Clearance of p16Ink4a-positive senescent cells delays ageing-associated disorders,” Nature 479, 232 (2 November 2011)
D. J. Baker et al., “Naturally occurring p16Ink4a-positive cells shorten healthy lifespan,” Nature 530, 184 (11 February 2016)
B. G. Childs et al., “Senescent intimal foam cells are deleterious at all stages of atherosclerosis,” Science 354, 472 (28 October 2016)

Wednesday, December 28, 2016

Creative versus destructive chaos in Trump-land. Is there a ray of hope?

I am a member of the professional intelligentsia bubble still feeling post-traumatic stress from the presidential election. I grasp at any small reassurances that the sky may not in fact be falling, and so point to this piece by David Ignatius noting the current influence of Robert Gates, who has worked in senior national security positions for the past five presidents. Some clips:
At the top of Gates’s to-do list is striking the right balance between improving relations with Russia and appearing too cooperative with a belligerent President Vladimir Putin...“I think the challenge for any new administration would have been how to thread the needle — between stopping the downward spiral in U.S.-Russian relations, which had real dangers, and pushing back on Putin’s aggressiveness and general thuggery,” Gates said.
Gates has shared the role of informal counselor to the Trump transition team with two other veterans of the Bush administration, former secretary of state Condoleezza Rice, who talks regularly with Vice President-elect Mike Pence, and former national security adviser Stephen Hadley. The three have a consulting firm, RiceHadleyGates, which has proposed candidates for Cabinet and sub-Cabinet jobs, including Rex Tillerson and retired Marine Gen. James N. Mattis, the choices for State and Defense, respectively.
Gates, Hadley and Rice have also talked with foreign governments that are puzzled about how to approach Trump. In an interview this week, Hadley summarized his basic advice:..“We’ve never had a populist movement or political insurgency quite like this — that actually captured the White House. That means there will be more discontinuities in our foreign policy. I’m telling people: ‘Give us some space here and have some strategic patience. And don’t overreact — even to Trump’s tweets.’ ”
One issue that worries Gates is the multiplicity of people surrounding Trump in the White House, seeking to influence an undisciplined chief executive. “What happens when someone tries to get in to see the president with a proposal or initiative and is rebuffed by one gatekeeper — and simply goes through another door? It’s a formula for a disjointed process.”
“There will be a rough break-in period,” Gates predicted. Part of the challenge is that Trump believes his success stems from his freewheeling, undisciplined style, and personal messaging through Twitter — which makes him resist limits.

Tuesday, December 27, 2016

Artificial intelligence ups its game

I pass on this description by John Bohannon in Science Magazine of a recent triumph of A.I.:
This year, artificial intelligence (AI) passed a significant milestone when a computer program called AlphaGo beat the world's No. 2 Go player in a five-game match. It's not the first time that AI has surpassed human mastery of a game. After all, it was 20 years ago that IBM's Deep Blue first beat Garry Kasparov in a game of chess, toppling the world champion the following year in a six-game match. But that is where the similarity ends.
The rules of Go are more straightforward than those of chess: You simply place identical stones on a grid, capturing territory by surrounding your opponent's positions. But that simplicity and openness result in an explosion in the number of possible moves for a player to consider—far more than there are atoms in the known universe. That makes it impossible for AI to beat Go masters with an approach like that used by Deep Blue, which relies on handcoded strategies from chess experts to evaluate each possible move.
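A quick back-of-the-envelope check on that claim (my own illustration, not from Bohannon's article): each of the 361 intersections of a 19×19 board can be empty, black, or white, so 3^361, roughly 10^172, is a crude upper bound on board configurations, dwarfing the roughly 10^80 atoms commonly estimated for the observable universe.

    # Back-of-the-envelope: Go's state space vs. atoms in the observable universe.
    # 3^361 overcounts (most configurations are illegal), but even the known
    # count of legal positions (~2.1e170) vastly exceeds ~1e80 atoms.
    from math import log10

    board_points = 19 * 19                    # 361 intersections
    log_positions = board_points * log10(3)   # log10 of 3^361
    log_atoms = 80                            # order-of-magnitude estimate

    print(f"upper bound on configurations: ~10^{log_positions:.0f}")   # ~10^172
    print(f"excess over atom count: ~10^{log_positions - log_atoms:.0f}")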
Instead, AlphaGo, designed by the London-based Google subsidiary DeepMind, studied hundreds of thousands of online Go games played between humans, using those sequences of moves as data for a machine-learning algorithm. Then AlphaGo played against itself—or, rather, slightly different versions of itself—over and over, finetuning its strategies with a technique called deep reinforcement learning. The final result is AI that wins not just with brute-force calculation, but with something that looks strikingly like human intuition.
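To make the self-play idea concrete, here is a deliberately tiny sketch, entirely my own and not DeepMind's method: a tabular policy for the toy game of Nim improves by playing against itself and reinforcing the winner's moves with a bare-bones REINFORCE update. AlphaGo does the analogous thing with deep networks and tree search at vastly greater scale.

    # Toy self-play reinforcement learning: REINFORCE on Nim (take 1-3 stones,
    # taking the last stone wins). Illustrates the self-play idea only; it is
    # not AlphaGo's actual architecture.
    import math, random

    N = 10                           # stones in the heap
    ACTIONS = (1, 2, 3)
    theta = {s: [0.0, 0.0, 0.0] for s in range(1, N + 1)}   # per-state action logits

    def policy(s):
        """Softmax over legal actions in state s (s stones remaining)."""
        logits = [theta[s][i] if ACTIONS[i] <= s else -1e9 for i in range(3)]
        m = max(logits)
        exps = [math.exp(l - m) for l in logits]
        z = sum(exps)
        return [e / z for e in exps]

    def play_game():
        """Both sides sample from the same policy; last stone taken wins."""
        s, player, logs = N, 0, ([], [])
        while s > 0:
            a = random.choices(range(3), weights=policy(s))[0]
            logs[player].append((s, a))
            s -= ACTIONS[a]
            player ^= 1
        return logs, player ^ 1      # the player who just moved won

    ALPHA = 0.1
    for _ in range(20000):           # self-play training loop
        logs, winner = play_game()
        for player in (0, 1):
            reward = 1.0 if player == winner else -1.0
            for s, a in logs[player]:                 # REINFORCE gradient step
                probs = policy(s)
                for i in range(3):
                    theta[s][i] += ALPHA * reward * ((1.0 if i == a else 0.0) - probs[i])

    # Greedy learned move per heap size (optimal play leaves a multiple of 4)
    print({s: ACTIONS[max(range(3), key=lambda i: policy(s)[i])] for s in range(1, N + 1)})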
Most of the things we want AI to master involve a seemingly unmanageable number of possible decisions—walking a robot safely through a crowded room, routing driverless cars, making small talk with passengers. Because hard-coded rules fail for such tasks, AlphaGo's triumph shows just how powerful deep reinforcement learning can be.
References
D. Mackenzie, “Update: Why this week’s man-versus-Go match doesn’t matter (and what does),” News from Science (15 March 2016)
D. Silver et al., “Mastering the game of Go with deep neural networks and tree search,” Nature 529, 484 (28 January 2016)

Monday, December 26, 2016

Making the world nicer is a tough slog - two organizations trying.

As the end of 2016 approaches, I am thinking about charitable donations I have made or might make in this angry and uncertain time of huge political changes. Angry voters in Europe and America are turning back the clock, and the paradigm of America may be irreversibly changing. A collective trauma is being generated by the severing of ties that previously have bound Americans together. How might we try to be more kind, gentle, and understanding with each other?

I've decided to make year end contributions to two university associated organizations trying to promote the greater good through research, teaching, and understanding - trying to find ways to spread and promote the virtues of altruism, compassion, gratitude, empathy, forgiveness, and mindfulness. One is the Greater Good Science Center at the University of California, Berkeley, founded by Dacher Keltner. The other is the Center for Healthy Minds at the University of Wisconsin, Madison, led by my former colleague Richard Davidson. I would encourage MindBlog readers to check out their websites, and consider donations to both.

Friday, December 23, 2016

Our automated jobless future.

Elizabeth Kolbert offers an interesting review of ideas in several recent books dealing with our automated future. How long will it be before you lose your job to a robot? Here are a few clips:
Imagine a matrix with two axes, manual versus cognitive and routine versus nonroutine. Jobs can then be arranged into four boxes: manual routine, manual nonroutine, and so on…Jobs on an assembly line fall into the manual-routine box, jobs in home health care into the manual-nonroutine box. Keeping track of inventory is in the cognitive-routine box; dreaming up an ad campaign is cognitive nonroutine.
The highest-paid jobs are clustered in the last box; managing a hedge fund, litigating a bankruptcy, and producing a TV show are all cognitive and nonroutine. Manual, nonroutine jobs, meanwhile, tend to be among the lowest paid—emptying bedpans, bussing tables, cleaning hotel rooms (and folding towels). Routine jobs on the factory floor or in payroll or accounting departments tend to fall in between. And it’s these middle-class jobs that robots have the easiest time laying their grippers on.
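For readers who like things explicit, the two-axis taxonomy is easy to encode; this is just my own toy rendering of the four boxes, with example jobs drawn from the passage above.

    # A toy encoding of the manual/cognitive x routine/nonroutine job matrix.
    # The heuristic mirrors the review's claim: routine work is easiest to automate.
    from enum import Enum

    class Skill(Enum):
        MANUAL = "manual"
        COGNITIVE = "cognitive"

    class Routine(Enum):
        ROUTINE = "routine"
        NONROUTINE = "nonroutine"

    JOBS = {
        "assembly-line worker": (Skill.MANUAL, Routine.ROUTINE),
        "home health aide": (Skill.MANUAL, Routine.NONROUTINE),
        "inventory clerk": (Skill.COGNITIVE, Routine.ROUTINE),
        "ad-campaign designer": (Skill.COGNITIVE, Routine.NONROUTINE),
    }

    def automation_exposure(routine):
        return "high" if routine is Routine.ROUTINE else "lower, for now"

    for job, (skill, routine) in JOBS.items():
        print(f"{job}: {skill.value}/{routine.value} -> {automation_exposure(routine)}")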
How much technology has contributed to the widening income gap in the U.S. is a matter of debate; some economists treat it as just one factor, others treat it as the determining factor. In either case, the trend line is ominous. Facebook is worth two hundred and seventy billion dollars and employs just thirteen thousand people. In 2014, Facebook acquired WhatsApp for twenty-two billion dollars. At that point, the messaging firm had a grand total of fifty-five employees. When a twenty-two-billion-dollar company can fit its entire workforce into a Greyhound bus, the concept of surplus labor would seem to have run its course.
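The arithmetic implicit in that comparison is worth spelling out (my own calculation from the figures quoted above):

    # Valuation per employee, using the figures quoted in the review.
    facebook_value, facebook_staff = 270e9, 13_000
    whatsapp_value, whatsapp_staff = 22e9, 55

    print(f"Facebook: ${facebook_value / facebook_staff / 1e6:,.0f}M per employee")   # ~$21M
    print(f"WhatsApp: ${whatsapp_value / whatsapp_staff / 1e6:,.0f}M per employee")   # ~$400M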
Martin Ford (author of “Rise of the Robots: Technology and the Threat of a Jobless Future”) worries that we are headed toward an era of “techno-feudalism.” He imagines a plutocracy shut away “in gated communities or in elite cities, perhaps guarded by autonomous military robots and drones.” Under the old feudalism, the peasants were exploited; under the new arrangement, they’ll merely be superfluous. The best we can hope for, he suggests, is a collective form of semi-retirement. He recommends a guaranteed basic income for all, to be paid for with new taxes, levied, at least in part, on the new gazillionaires.
To one degree or another, just about everyone writing on the topic shares this view. Jerry Kaplan proposes that the federal government create a 401(k)-like account for every ten-year-old in the U.S. Those who ultimately do find jobs could contribute some of their earnings to the accounts; those who don’t could perform volunteer work in return for government contributions.
...if it’s unrealistic to suppose that smart machines can be stopped, it’s probably just as unrealistic to imagine that smart policies will follow. Which brings us ... to Trump. The other day, during his “victory lap” through the Midwest, the President-elect vowed to “usher in a new Industrial Revolution,” apparently unaware that such a revolution is already under way, and that this is precisely the problem. The pain of dislocation he spoke to during the campaign is genuine; the solutions he offers are not.

Thursday, December 22, 2016

A stark graphic - the income gap continues to widen.

The NYTimes piece by Patricia Cohen and graphic summaries by Ashkenas are worth reading. The top 1% and the bottom 50% have swapped their relative shares of the national income. Forty years ago, the top 1 percent of earners took home 10.5 percent of the total national income, and the bottom half earned 20 percent of it. By 2014, those percentages effectively flipped, with the top 1 percent earning a 20 percent share and the bottom half dropping to 12.5 percent.

Wednesday, December 21, 2016

Our Arthropod housemates.

Now I know more about what is in the haze of particles I see illuminated by the horizontal rays of the rising sun flowing through my Fort Lauderdale condo in early morning. I pass on, under the "random curious stuff" MindBlog category, an accounting by Madden et al. that shows the ubiquity of insects detected in settled dust samples collected from inside homes. They used a DNA-based method for investigating the arthropod diversity in homes via high-throughput marker gene sequencing of home dust. Settled dust samples were collected by citizen scientists from both inside and outside more than 700 homes across the United States, yielding the first continental-scale estimates of arthropod diversity associated with our residences. In the study's summary graphic, panel (A) shows the genera detected and panel (B) shows the orders detected in at least 5% of homes; the y-axes indicate the percentage of homes (of the 651 homes with arthropods detected) where those arthropods were found.
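As a concrete illustration of what those percentages mean, here is a minimal sketch, with made-up presence/absence data, of how per-taxon detection prevalence across homes is computed; the study's real input is high-throughput marker-gene sequence data, not this toy table.

    # Toy computation of detection prevalence: the percentage of homes in which
    # each arthropod taxon appears. Data below are invented for illustration.
    toy_homes = [
        {"Araneae", "Ceratopogonidae"},
        {"Araneae"},
        {"Araneae", "Blattodea"},
        {"Araneae", "Lepidoptera"},
    ]

    def prevalence(homes):
        counts = {}
        for taxa in homes:
            for taxon in taxa:
                counts[taxon] = counts.get(taxon, 0) + 1
        return {t: 100.0 * c / len(homes) for t, c in sorted(counts.items())}

    print(prevalence(toy_homes))   # e.g. Araneae (spiders) detected in 100% of toy homes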

Tuesday, December 20, 2016

Reducing future fears by suppressing episodic simulation in the brain.

Benoit et al. offer some findings relevant to understanding the heightened anxiety many are feeling in the Age of Trump.

An edited summary that starts the discussion section of their paper:
Recurrently imagining dreaded future situations potentiates fears and can even support the development and maintenance of anxiety disorders. We tested the hypothesis that such simulations can be suppressed with the opposite effect of down-regulating apprehensiveness. Our data indicate that future suppression is based on a brain mechanism that is remarkably similar to a system implicated in the voluntary suppression of past experiences. This mechanism recruits right dorsolateral prefrontal cortex, which originates an inhibitory signal that down-regulates activation in brain regions supporting both retrieval and episode-construction processes. Paralleling the suppression of recently acquired memories, the regions targeted by future suppression included the hippocampus, a structure that is fundamental for the retrieval of past episodes and the construction of coherent future and fictitious events. Critically, the suppression of recurring fears of the future differs from suppressing past events in that it also involved modulating the vmPFC (ventromedial prefrontal cortex). The mPFC fosters the integration of overlapping memories into a common representation.
A clip describing their procedure:
To examine future suppression, we adapted the “Think/No-Think” procedure, used to study the suppression of past events, to create the new “Imagine/No-Imagine” paradigm. The procedure first asked participants to describe their fears. Importantly, they only provided recurrent future fears—that is, those that they had already worried might happen before entering the experiment. Participants then gave one key detail for each fear that was typical to their recurring imaginings of it. (These typical event details served as a dependent measure; see below.) Afterward, they entered the critical Imagine/No-Imagine phase, which was composed of trials that presented reminders to these fears. For some trials, participants were asked to imagine the feared event as vividly as possible in response to the reminder (Imagine condition); for others, participants were asked to suppress their imagining of the event, upon seeing the reminder (Suppress condition). (A third of the originally provided episodes, the Baseline items, were set aside and were not cued during this phase.) Over the course of the Imagine/No-Imagine phase, participants either imagined or suppressed a feared event 12 times. Following this phase, we gave participants each reminder again and asked them to recall the typical feature of its corresponding fear. Once all typical details were tested, participants were then asked to freely imagine each episode aloud in detail for 2 min. Finally, we assessed the impact of suppression on participants’ apprehensiveness toward these future events.
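To fix the design in mind, here is a schematic of the trial structure as I read the quoted methods; the condition names come from the paper, while the scheduling details (shuffling, equal thirds, a fixed seed) are my simplifying assumptions.

    # Schematic of the Imagine/No-Imagine trial structure, per the quoted methods.
    # Randomization and counterbalancing details here are assumptions.
    import random

    def build_schedule(fears, repetitions=12, seed=0):
        """Split fears into thirds (Imagine / Suppress / Baseline) and build
        a shuffled list of cue trials for the two cued conditions."""
        rng = random.Random(seed)
        fears = list(fears)
        rng.shuffle(fears)
        third = len(fears) // 3
        conditions = {
            "imagine": fears[:third],
            "suppress": fears[third:2 * third],
            "baseline": fears[2 * third:],      # never cued during this phase
        }
        trials = [(cond, fear)
                  for cond in ("imagine", "suppress")
                  for fear in conditions[cond]
                  for _ in range(repetitions)]   # each cued fear appears 12 times
        rng.shuffle(trials)
        return conditions, trials

    conds, trials = build_schedule([f"fear_{i}" for i in range(9)])
    print(len(trials), "cued trials;", {k: len(v) for k, v in conds.items()})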
Here are the significance and abstract sections of their paper:
Significance 
Humans possess the remarkable ability to recombine details of divergent memories into imaginings of future events. Such imaginings are useful, for example, because they foster planning and motivate farsighted decisions. Importantly, recurrently imagining feared situations can also undermine our well-being and may even contribute to the development of anxiety. Here, we demonstrate that fearful imaginings about the future can be inhibited by neural mechanisms that help to suppress the past. Importantly, suppression reduces later apprehensiveness about the feared events, a benefit that was diminished in individuals with greater trait anxiety. This pattern suggests that the observed inhibition mechanism serves to control people’s future fears and its disruption may foster psychological disorders characterized by intrusive prospective thoughts. 
Abstract 
Imagining future events conveys adaptive benefits, yet recurrent simulations of feared situations may help to maintain anxiety. In two studies, we tested the hypothesis that people can attenuate future fears by suppressing anticipatory simulations of dreaded events. Participants repeatedly imagined upsetting episodes that they feared might happen to them and suppressed imaginings of other such events. Suppressing imagination engaged the right dorsolateral prefrontal cortex, which modulated activation in the hippocampus and in the ventromedial prefrontal cortex (vmPFC). Consistent with the role of the vmPFC in providing access to details that are typical for an event, stronger inhibition of this region was associated with greater forgetting of such details. Suppression further hindered participants’ ability to later freely envision suppressed episodes. Critically, it also reduced feelings of apprehensiveness about the feared scenario, and individuals who were particularly successful at down-regulating fears were also less trait-anxious. Attenuating apprehensiveness by suppressing simulations of feared events may thus be an effective coping strategy, suggesting that a deficiency in this mechanism could contribute to the development of anxiety.

Monday, December 19, 2016

Can evolution have a 'higher purpose'?

I want to point to this essay by Robert Wright, author of "The Moral Animal," who makes the point that arguments that life on earth may have some larger purpose need not depart from a scientific worldview, invoke supernatural beings, or abandon the model of evolution by natural selection. Among these arguments are:
... “simulation” scenarios, which hold that our seemingly tangible world is actually a kind of projection emanating from some sort of mind-blowingly powerful computer; and the history of our universe, including evolution on this planet, is the unfolding of a computer algorithm...When an argument for higher purpose is put this way — that is, when it doesn’t involve the phrase “higher purpose” and, further, is cast more as a technological scenario than a metaphysical one — it is considered intellectually respectable. I don’t mean there aren’t plenty of people who dismiss it. I’m talking about how people dismiss it. [Neil deGrasse Tyson and Elon Musk find this view plausible.]
Wright quotes from a conversation with William Hamilton, who says:
“There’s one theory of the universe that I rather like — I accept it in an almost joking spirit — and that is that Planet Earth in our solar system is a kind of zoo for extraterrestrial beings who dwell out there somewhere. And this is the best, the most interesting experiment they could set up: to set up the evolution on Planet Earth going in such a way that it would produce these really interesting characters — humans who go around doing things — and they watch their experiment, interfering hardly at all so that almost everything we do comes out according to the laws of nature. But every now and then they see something which doesn’t look quite right — this zoo is going to kill itself off if they let you do this or that.” So, he continued, these extraterrestrials “insert a finger and just change some little thing. And maybe those are the miracles which the religious people like to so emphasize.” He reiterated: “I put it forward in an almost joking spirit. But I think it…”
Another scenario:
...emerges from one version of physicist Lee Smolin’s theory of “cosmological natural selection.” Smolin thinks our universe may itself be a product of a kind of evolution: maybe universes can replicate themselves via black holes, so over time — over a lot of time — you get universes whose physical laws are more and more conducive to replication. (So that’s why our universe is so good at black-hole making!) In some variants of Smolin’s theory — such as those developed by the late cosmologist Edward Harrison and the mathematician Louis Crane — intelligent beings can play a role in this replication once their technology reaches a point where they can produce black holes. So through cosmological natural selection you’d get universes whose physical properties were more and more conducive to the evolution of intelligent life. This might explain the much-discussed observation that the physical constants of this universe seem “fine-tuned” to permit the emergence of life.
Wright's ending points:
I think discussion of higher purpose should be respectable even in a scientific age. I don’t mean I buy the simulation scenario in particular, or the space alien scenario, or the cosmological natural selection scenario. But I do think there’s reason to suspect that there’s some point to this exercise we Earthlings are engaged in, some purpose imbued by something — and that, even if identifying that something is for now hopeless, there are grounds for speculating about what the point of the exercise is.
I won’t elaborate much on this, since I’ve done that elsewhere, arguing that higher purpose can be framed as a hypothesis, and that evidence for or against the hypothesis can be marshaled. But I will say that the evidence I see for purpose includes not just the direction of biological evolution, but the direction of technological evolution and of the broader social and cultural evolution it drives — the evolution that has carried us from hunter-gatherer bands to the brink of a cohesive global community. And if the purpose involves sustaining this direction — becoming a true global community — then it would seem to include moral progress. In particular, our purpose would involve transcending the psychology of tribalism that can otherwise divide people along ethnic, national, religious and ideological lines. Which would mean — in light of recent political and social developments in the United States and abroad — that our work is cut out for us.