Wednesday, May 25, 2016

A model for aggression and violence around the world.

I want to pass on the abstract of a forthcoming article in Behavioral and Brain Sciences for which reviewers' comments are being solicited. (I'm on the mailing list of potential reviewers because I authored an article in the journal in the 1990s.) Its model of climate, aggression, and self-control makes total sense in terms of my experience of living both in Madison, Wisconsin (May through September) and Fort Lauderdale, Florida (October through April). (I returned to Madison two weeks ago and, as usual, have been struck by how much less defensiveness and aggression is exhibited by strangers in public in the more northern Madison location. Strangers at grocery stores are more benign, pleasant, and occasionally even make eye contact!)
Target Article: Aggression and Violence Around the World: A Model of Climate, Aggression, and Self-control in Humans (CLASH)
Authors: Paul A. M. Van Lange, Maria I. Rinderu, and Brad J. Bushman
Deadline for Commentary Proposals: Thursday June 9, 2016
Abstract: Worldwide there are substantial differences within and between countries in aggression and violence. Although there are various exceptions, a general rule is that aggression and violence increase as one moves closer to the equator, which suggests the important role of climate differences. While this pattern is robust, theoretical explanations for these large differences in aggression and violence within countries and around the world are lacking. Most extant explanations focus on the influence of average temperature as a factor that triggers aggression (The General Aggression Model), or the notion that warm temperature allows for more social interaction situations (Routine Activity Theory) in which aggression is likely to unfold. We propose a new model of CLimate, Aggression, and Self-control in Humans (CLASH) that seeks to understand differences within and between countries in aggression and violence in terms of differences in climate. Lower temperatures, and especially larger degrees of seasonal variation in climate, call for individuals and groups to adopt a slower life history strategy, a greater focus on the future (versus the present), and a stronger focus on self-control. The CLASH model further outlines that slow life strategy, future orientation, and strong self-control are important determinants of inhibiting aggression and violence. We also discuss how CLASH is different from other recently developed models that emphasize climate differences for understanding conflict. We conclude by discussing the theoretical and societal importance of climate in shaping individual and societal differences in aggression and violence.

Tuesday, May 24, 2016

The Dalai Lama’s Atlas of Emotions

You might have a look at this curious website pointed to by Kevin Randall, an atlas of emotions developed by Paul Ekman and collaborators commissioned by the Dalai Lama (who paid ~$750,000 for the project). After surveying 248 of the most active emotion researchers in the world, Ekman chose to divide emotions into five broad categories (anger, fear, disgust, sadness and enjoyment), each having an elaborate subset of emotional states, triggers, actions and moods. A cartography and data visualization firm was engaged to help depict them in a visual, and hopefully useful, way.


I'm really at a bit of a loss to figure out how the byzantine complexity of the beautiful graphic displays is supposed to be useful. They don't quite do it for me. Maybe this is supposed to be a lookup guide for an emotion one is feeling but not quite categorizing? A sort of bestiary of emotions? A well-intentioned effort, surely, but many of the descriptions seem quite banal and obvious.

Monday, May 23, 2016

When philosophy lost its way.

Frodeman and Briggle offer a lament over the irreversible passing of the practice of philosophy as a moral endeavor, one that might offer a view of the good society apart from the prescriptions of religion. Some clips from their essay:
Before its migration to the university, philosophy had never had a central home. Philosophers could be found anywhere — serving as diplomats, living off pensions, grinding lenses, as well as within a university. Afterward, if they were “serious” thinkers, the expectation was that philosophers would inhabit the research university…This purification occurred in response to at least two events. The first was the development of the natural sciences, as a field of study clearly distinct from philosophy, circa 1870, and the appearance of the social sciences in the decade thereafter…The second event was the placing of philosophy as one more discipline alongside these sciences within the modern research university. A result was that philosophy, previously the queen of the disciplines, was displaced, as the natural and social sciences divided the world between them.
Philosophers needed to embrace the structure of the modern research university, which consists of various specialties demarcated from one another. That was the only way to secure the survival of their newly demarcated, newly purified discipline. “Real” or “serious” philosophers had to be identified, trained and credentialed. Disciplinary philosophy became the reigning standard for what would count as proper philosophy.
Having adopted the same structural form as the sciences, it’s no wonder philosophy fell prey to physics envy and feelings of inadequacy. Philosophy adopted the scientific modus operandi of knowledge production, but failed to match the sciences in terms of making progress in describing the world. Much has been made of this inability of philosophy to match the cognitive success of the sciences. But what has passed unnoticed is philosophy’s all-too-successful aping of the institutional form of the sciences. We, too, produce research articles. We, too, are judged by the same coin of the realm: peer-reviewed products. We, too, develop sub-specializations far from the comprehension of the person on the street. In all of these ways we are so very “scientific.”
The act of purification accompanying the creation of the modern research university was not just about differentiating realms of knowledge. It was also about divorcing knowledge from virtue. Though it seems foreign to us now, before purification the philosopher (and natural philosopher) was assumed to be morally superior to other sorts of people…The study of philosophy elevated those who pursued it. Knowing and being good were intimately linked. It was widely understood that the point of philosophy was to become good rather than simply to collect or produce knowledge…The purification made it no longer sensible to speak of nature, including human nature, in terms of purposes and functions…By the late 19th century, Kierkegaard and Nietzsche had proved the failure of philosophy to establish any shared standard for choosing one way of life over another…There was a brief window when philosophy could have replaced religion as the glue of society; but the moment passed. People stopped listening as philosophers focused on debates among themselves.
Once knowledge and goodness were divorced, scientists could be regarded as experts, but there are no morals or lessons to be drawn from their work. Science derives its authority from impersonal structures and methods, not the superior character of the scientist. The individual scientist is no different from the average Joe, with no special authority to pronounce on what ought to be done…philosophy has aped the sciences by fostering a culture that might be called “the genius contest.” Philosophic activity devolved into a contest to prove just how clever one can be in creating or destroying arguments. Today, a hyperactive productivist churn of scholarship keeps philosophers chained to their computers. Like the sciences, philosophy has largely become a technical enterprise, the only difference being that we manipulate words rather than genes or chemicals. Lost is the once common-sense notion that philosophers are seeking the good life — that we ought to be (in spite of our failings) model citizens and human beings. Having become specialists, we have lost sight of the whole. The point of philosophy now is to be smart, not good. It has been the heart of our undoing.

Friday, May 20, 2016

This is how fascism comes to America

I pass on a few clips from a must-read article in the Washington Post by Robert Kagan, on Donald Trump:
Republican politicians marvel at how he has “tapped into” a hitherto unknown swath of the voting public. But what he has tapped into is what the founders most feared when they established the democratic republic: the popular passions unleashed, the “mobocracy.” Conservatives have been warning for decades about government suffocating liberty. But here is the other threat to liberty that Alexis de Tocqueville and the ancient philosophers warned about: that the people in a democracy, excited, angry and unconstrained, might run roughshod over even the institutions created to preserve their freedoms. As Alexander Hamilton watched the French Revolution unfold, he feared in America what he saw play out in France — that the unleashing of popular passions would lead not to greater democracy but to the arrival of a tyrant, riding to power on the shoulders of the people.
This phenomenon has arisen in other democratic and quasi-democratic countries over the past century, and it has generally been called “fascism.” Fascist movements, too, had no coherent ideology, no clear set of prescriptions for what ailed society. “National socialism” was a bundle of contradictions, united chiefly by what, and who, it opposed; fascism in Italy was anti-liberal, anti-democratic, anti-Marxist, anti-capitalist and anti-clerical. Successful fascism was not about policies but about the strongman, the leader (Il Duce, Der Fuhrer), in whom could be entrusted the fate of the nation. Whatever the problem, he could fix it. Whatever the threat, internal or external, he could vanquish it, and it was unnecessary for him to explain how. Today, there is Putinism, which also has nothing to do with belief or policy but is about the tough man who singlehandedly defends his people against all threats, foreign and domestic.
To understand how such movements take over a democracy, one only has to watch the Republican Party today. These movements play on all the fears, vanities, ambitions and insecurities that make up the human psyche. In democracies, at least for politicians, the only thing that matters is what the voters say they want — vox populi vox dei. A mass political movement is thus a powerful and, to those who would oppose it, frightening weapon. When controlled and directed by a single leader, it can be aimed at whomever the leader chooses. If someone criticizes or opposes the leader, it doesn’t matter how popular or admired that person has been. He might be a famous war hero, but if the leader derides and ridicules his heroism, the followers laugh and jeer. He might be the highest-ranking elected guardian of the party’s most cherished principles. But if he hesitates to support the leader, he faces political death.
This is how fascism comes to America, not with jackboots and salutes (although there have been salutes, and a whiff of violence) but with a television huckster, a phony billionaire, a textbook egomaniac “tapping into” popular resentments and insecurities, and with an entire national political party — out of ambition or blind party loyalty, or simply out of fear — falling into line behind him.

Thursday, May 19, 2016

Brain modules that process human consensus decision-making

Suzuki et al. offer a study identifying brain areas important in consensus decision-making, with different decision variables associated with activity in different brain areas that are integrated by distributed neural activity (see Network hubs in the human brain for an overall review of domains of cognitive function, with some great summary graphics). The summary and abstract:

Highlights
•A task is used to study how the brain implements consensus decision-making 
•Consensus decision-making depends on three distinct computational processes 
•These different signals are encoded in distinct brain regions 
•Integration of these signals occurs in the dorsal anterior cingulate cortex
Summary
Consensus building in a group is a hallmark of animal societies, yet little is known about its underlying computational and neural mechanisms. Here, we applied a computational framework to behavioral and fMRI data from human participants performing a consensus decision-making task with up to five other participants. We found that participants reached consensus decisions through integrating their own preferences with information about the majority group members’ prior choices, as well as inferences about how much each option was stuck to by the other people. These distinct decision variables were separately encoded in distinct brain areas—the ventromedial prefrontal cortex, posterior superior temporal sulcus/temporoparietal junction, and intraparietal sulcus—and were integrated in the dorsal anterior cingulate cortex. Our findings provide support for a theoretical account in which collective decisions are made through integrating multiple types of inference about oneself, others, and environments, processed in distinct brain modules.
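The computational framing in the abstract, in which a decision combines one's own preference with information about the majority's prior choices, can be illustrated with a toy model. To be clear, this is not the authors' actual model: the linear weighting and softmax choice rule below are my own simplifying assumptions, chosen only to make the idea of integrating self and group signals concrete.

```python
import numpy as np

def consensus_choice_prob(own_pref, others_prior_choices, w_self=1.0, w_group=1.0):
    """Probability of choosing each option, combining one's own preference
    with the fraction of group members who previously chose each option,
    via a softmax over the weighted sum of the two signals."""
    own_pref = np.asarray(own_pref, dtype=float)
    group_counts = np.asarray(others_prior_choices, dtype=float)
    group_signal = group_counts / group_counts.sum()  # fraction choosing each option
    v = w_self * own_pref + w_group * group_signal    # integrated decision value
    e = np.exp(v - v.max())                           # numerically stable softmax
    return e / e.sum()

# Two options: the agent mildly prefers option 0, but 4 of 5 others chose option 1.
p = consensus_choice_prob(own_pref=[0.6, 0.4], others_prior_choices=[1, 4])
print(p)  # group information pulls choice probability toward option 1
```

Raising `w_group` relative to `w_self` shifts the agent further toward the majority, which is the kind of trade-off the distinct neural signals in the study are proposed to encode.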

Wednesday, May 18, 2016

Aerobic fitness: one minute of all-out effort = 45 min. of moderate effort

Reynolds has written a series of articles describing experiments showing the benefits of high-intensity interval training. She now points to a study by Gillen et al. showing that high-intensity effort periods of only one minute can have a big effect. Twelve weeks of a regime of three cycling sessions per week, each lasting 10 minutes with only one minute of that time being strenuous, produced the same 20% increase in aerobic fitness as sessions of 45 minutes of cycling at a moderate pace.

Tuesday, May 17, 2016

America in decline?

I pass on a few clips from Easterbrook's article, on the prevailing negative depiction (especially by Republican candidates) of America's current state and direction:
...most American social indicators have been positive at least for years, in many cases for decades. The country is, on the whole, in the best shape it’s ever been in. So what explains all the bad vibes?…the core reason for the disconnect between the nation’s pretty-good condition and the gloomy conventional wisdom is that optimism itself has stopped being respectable. Pessimism is now the mainstream, with optimists viewed as Pollyannas. If you don’t think everything is awful, you don’t understand the situation!
Objectively, the glass looks significantly more than half full.
Job growth has been strong for five years, with unemployment now below where it was for most of the 1990s, a period some extol as the “good old days.” The American economy is No. 1 by a huge margin, larger than Nos. 2 and 3 (China and Japan) combined. Americans are seven times as productive, per capita, as Chinese citizens. The dollar is the currency the world craves — which means other countries perceive America’s long-term prospects as very good.
Pollution, discrimination, crime and most diseases are in an extended decline; living standards, longevity and education levels continue to rise. The American military is not only the world’s strongest, it is the strongest ever. The United States leads the world in science and engineering, in business innovation, in every aspect of creativity, including the arts. Terrorism is a serious concern, but in the last 15 years, even taking into account Sept. 11, an American is five times more likely to be hit by lightning than to be killed by a terrorist.
Easterbrook continues with a discussion of the dire straits of the middle class, changes in manufacturing jobs ("Manufacturing jobs described by Mr. Trump and Mr. Sanders as “lost” to China cannot be found there, or anywhere."), etc.
...developing the postindustrial economy — while addressing issues such as inequality, greenhouse emissions and the condition of public schools — will require optimism. Pessimists think in terms of rear-guard actions to turn back the clock. Optimists understand that where the nation has faults, it’s time to roll up our sleeves and get to work.
That’s why the lack of progressive optimism is so keenly felt. In recent decades, progressives drank too deeply of instant-doomsday claims. If their predictions had come true, today petroleum would be exhausted, huge numbers of major animal species would be extinct, crop failures would be causing mass starvation, developing-world poverty would be getting worse instead of declining fast. (In 1990, 37 percent of humanity lived in what the World Bank defines as extreme poverty; today it’s 10 percent.)

Monday, May 16, 2016

Downsides of diversity.

I want to thank the anonymous commentator on the “Diversity makes you brighter” post, who sent links to interesting articles by Jonas and by Dinesen and Sønderskov. I pass on just some clips from Jonas, noting work by Putnam and Page:
...a fascinating new portrait of diversity emerging from recent scholarship. Diversity, it shows, makes us uncomfortable -- but discomfort, it turns out, isn't always a bad thing. Unease with differences helps explain why teams of engineers from different cultures may be ideally suited to solve a vexing problem. Culture clashes can produce a dynamic give-and-take, generating a solution that may have eluded a group of people with more similar backgrounds and approaches. At the same time, though, Putnam's work adds to a growing body of research indicating that more diverse populations seem to extend themselves less on behalf of collective needs and goals.
In more diverse communities, he says, there were neither great bonds formed across group lines nor heightened ethnic tensions, but a general civic malaise. And in perhaps the most surprising result of all, levels of trust were not only lower between groups in more diverse settings, but even among members of the same group...
So, there is a diversity paradox:
...those in more diverse communities may do more bowling alone, but the creative tensions unleashed by those differences in the workplace may vault those same places to the cutting edge of the economy and of creative culture.

Friday, May 13, 2016

Two ways to be satisfied.

Anna North points to an article by Helzer and Jayawickreme that examines two different control strategies for obtaining short- and long-term life satisfaction: “primary control” — the ability to directly affect one's circumstances — and “secondary control” — the ability to affect how one responds to those circumstances.
How does a sense of control relate to well-being? We consider two distinguishable control strategies, primary and secondary control, and their relationships with two facets of subjective well-being, daily positive/negative affective experience and global life satisfaction. Using undergraduate and online samples, the results suggest that these different control strategies are associated uniquely with distinct facets of well-being. After controlling for shared variance among constructs, primary control (the tendency to achieve mastery over circumstances via goal striving) was associated more consistently with daily affective experience than was secondary control, and secondary control (the tendency to achieve mastery over circumstances via sense-making) was associated more strongly with life satisfaction than primary control, but only within the student sample and community members not in a committed relationship. The results highlight the importance of both control strategies to everyday health and provide insights into the mechanisms underlying the relationship between control and well-being.
It is not clear why relationship status makes a difference. Helzer suggests that having a partner may help people deal with adversity in the same way secondary control does, so secondary control may have less of an effect.

Thursday, May 12, 2016

John Oliver on "Scientific Studies show...."

I have to pass on this great bit from John Oliver, on the vacuity of most scientific reporting.


Diversity makes you brighter.

Providing some data relevant to debates over affirmative action, Levine et al. show that ethnic diversity can improve collective judgment: misfits between market prices and the true values of assets (market bubbles) are more likely in ethnically homogeneous markets than in diverse ones.
Markets are central to modern society, so their failures can be devastating. Here, we examine a prominent failure: price bubbles. Bubbles emerge when traders err collectively in pricing, causing misfit between market prices and the true values of assets. The causes of such collective errors remain elusive. We propose that bubbles are affected by ethnic homogeneity in the market and can be thwarted by diversity. In homogenous markets, traders place undue confidence in the decisions of others. Less likely to scrutinize others’ decisions, traders are more likely to accept prices that deviate from true values. To test this, we constructed experimental markets in Southeast Asia and North America, where participants traded stocks to earn money. We randomly assigned participants to ethnically homogeneous or diverse markets. We find a marked difference: Across markets and locations, market prices fit true values 58% better in diverse markets. The effect is similar across sites, despite sizeable differences in culture and ethnic composition. Specifically, in homogenous markets, overpricing is higher as traders are more likely to accept speculative prices. Their pricing errors are more correlated than in diverse markets. In addition, when bubbles burst, homogenous markets crash more severely. The findings suggest that price bubbles arise not only from individual errors or financial conditions, but also from the social context of decision making. The evidence may inform public discussion on ethnic diversity: it may be beneficial not only for providing variety in perspectives and skills, but also because diversity facilitates friction that enhances deliberation and upends conformity.
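One way to see why the abstract's finding that pricing errors "are more correlated" in homogeneous markets matters: if traders scrutinize each other less and share assumptions, their errors share a common component, and averaging across traders no longer cancels those errors out. The simulation below is a toy sketch of that statistical point, not a reconstruction of the authors' experiment; the correlation values are arbitrary assumptions of mine.

```python
import numpy as np

rng = np.random.default_rng(0)

def market_price_error(n_traders=6, error_corr=0.0, n_rounds=10_000):
    """Mean absolute deviation of the average traded price from true value,
    when individual traders' pricing errors share a common component."""
    # Each trader's error = shared component + idiosyncratic component,
    # scaled so every trader's error has unit variance and pairwise
    # correlation equal to error_corr.
    shared = rng.normal(0.0, 1.0, size=(n_rounds, 1))
    idio = rng.normal(0.0, 1.0, size=(n_rounds, n_traders))
    errors = np.sqrt(error_corr) * shared + np.sqrt(1.0 - error_corr) * idio
    market_error = errors.mean(axis=1)  # consensus price minus true value
    return np.abs(market_error).mean()

diverse = market_price_error(error_corr=0.1)      # mostly independent errors
homogeneous = market_price_error(error_corr=0.6)  # strongly correlated errors
print(f"diverse markets:     mean |price error| = {diverse:.3f}")
print(f"homogeneous markets: mean |price error| = {homogeneous:.3f}")
```

With correlated errors the variance of the market's mean error stays high no matter how many traders participate, so the homogeneous market's consensus price drifts further from true value.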

Wednesday, May 11, 2016

What art unveils

I pass on some initial and final clips from an essay by Alva Noë that is worth reading in its entirety.
Is there a way of thinking about art that will get us closer to an understanding of its essential nature, and our own?...the trend is to try to answer these questions in the key of neuroscience. I recommend a different approach, but not because I don’t think it is crucial to explore the links between art and our biological nature. The problem is that neuroscience has yet to frame an adequate conception of our nature. You look in vain in the writings of neuroscientists for satisfying accounts of experience or consciousness. For this reason, I believe, we can’t use neuroscience to explain art and its place in our lives. Indeed, if I am right, the order of explanation may go in the other direction: Art can help us frame a better picture of our human nature.
...Design, the work of technology, stops, and art begins, when we are unable to take the background of our familiar technologies and activities for granted, and when we can no longer take for granted what is, in fact, a precondition of the very natural-seeming intelligibility of such things as doorknobs and pictures, words and sounds. When you and I are talking, I don’t pay attention to the noises you are making; your language is a transparency through which I encounter you. Design, at least when it is optimal, is transparent in just this way; it disappears from view and gets absorbed in application. You study the digital image of the shirt on the website, you don’t contemplate its image.
Art, in contrast, makes things strange. You do contemplate the image, when you examine Leonardo’s depiction of the lady with the ermine. You are likely, for example, to notice her jarringly oversized and masculine hand and to wonder why Leonardo draws our attention to that feature of this otherwise beautiful young person. Art disrupts plain looking and it does so on purpose. By doing so it discloses just what plain looking conceals.
Art unveils us ourselves. Art is a making activity because we are by nature and culture organized by making activities. A work of art is a strange tool. It is an alien implement that affords us the opportunity to bring into view everything that was hidden in the background.
If I am right, art isn’t a phenomenon to be explained. Not by neuroscience, and not by philosophy. Art is itself a research practice, a way of investigating the world and ourselves. Art displays us to ourselves, and in a way makes us anew, by disrupting our habitual activities of doing and making.

Tuesday, May 10, 2016

Our brain activity at rest predicts our performance on tasks.

The Science magazine précis of Tavor et al.:
We all differ in how we perceive, think, and act. What drives individual differences in evoked brain activity? Tavor et al. applied computational models to functional magnetic resonance imaging (fMRI) data from the Human Connectome Project. Brain activity in the “resting” state when subjects were not performing any explicit task predicted differences in fMRI activation across a range of cognitive paradigms. This suggests that individual differences in many cognitive tasks are a stable trait marker. Resting-state functional connectivity thus already contains the repertoire that is then expressed during task-based fMRI.
And the article abstract:
When asked to perform the same task, different individuals exhibit markedly different patterns of brain activity. This variability is often attributed to volatile factors, such as task strategy or compliance. We propose that individual differences in brain responses are, to a large degree, inherent to the brain and can be predicted from task-independent measurements collected at rest. Using a large set of task conditions, spanning several behavioral domains, we train a simple model that relates task-independent measurements to task activity and evaluate the model by predicting task activation maps for unseen subjects using magnetic resonance imaging. Our model can accurately predict individual differences in brain activity and highlights a coupling between brain connectivity and function that can be captured at the level of individual subjects.
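The modeling approach described, training a simple model that maps task-independent (resting-state) features to task activation and testing it on held-out subjects, can be sketched with synthetic data. This is only an illustrative stand-in using closed-form ridge regression on fabricated numbers, not the authors' actual pipeline, features, or data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data: for each subject, resting-state "connectivity
# features" at each brain location, plus a task activation value per location.
n_subjects, n_locations, n_features = 20, 500, 10
w_true = rng.normal(0.0, 1.0, size=n_features)
X = rng.normal(0.0, 1.0, size=(n_subjects, n_locations, n_features))
noise = rng.normal(0.0, 0.5, size=(n_subjects, n_locations))
Y = X @ w_true + noise  # task activation maps

def fit_ridge(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X'X + lam*I)^(-1) X'y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Train on all but the last subject; predict the held-out subject's map.
X_train = X[:-1].reshape(-1, n_features)
y_train = Y[:-1].reshape(-1)
w = fit_ridge(X_train, y_train)

y_pred = X[-1] @ w
r = np.corrcoef(y_pred, Y[-1])[0, 1]
print(f"correlation between predicted and actual activation map: {r:.2f}")
```

The key logic mirrors the paper's claim: a model fit entirely on other subjects' resting-state features can predict an unseen individual's task activation pattern, because the individual variation is carried by the task-independent measurements.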

Monday, May 09, 2016

The key to political persuasion

I pass on clips from this interesting piece, which has been languishing in my queue of potential posts for some time, in which Willer and Feinberg give a more accessible account of their work reported in the Personality and Social Psychology Bulletin.
In business, everyone knows that if you want to persuade people to make a deal with you, you have to focus on what they value, not what you do. If you’re trying to sell your car, you emphasize the features of the sale that appeal to the buyer (the reliability and reasonable price of the vehicle), not the ones that appeal to you (the influx of cash).
This rule of salesmanship also applies in political debate — i.e., you should frame your position in terms of the moral values of the person you’re trying to convince. But when it comes to politics, this turns out to be hard to do. We found that people struggled to set aside their reasons for taking a political position and failed to consider how someone with different values might come to support that same position.
In one study, we presented liberals and conservatives with one of two messages in support of same-sex marriage. One message emphasized the need for equal rights for same-sex couples. This is the sort of fairness-based message that liberals typically advance for same-sex marriage. It is framed in terms of a value — equality — that research has shown resonates more strongly among liberals than conservatives. The other message was designed to appeal to values of patriotism and group loyalty, which have been shown to resonate more with conservatives. (It argued that “same-sex couples are proud and patriotic Americans” who “contribute to the American economy and society.”)
Liberals showed the same support for same-sex marriage regardless of which message they encountered. But conservatives supported same-sex marriage significantly more if they read the patriotism message rather than the fairness one.
In a parallel experiment, we targeted liberals for persuasion. We presented a group of liberals and conservatives with one of two messages in support of increased military spending. One message argued that we should “take pride in our military,” which “unifies us both at home and abroad.” The other argued that military spending is necessary because, through the military, the poor and disadvantaged “can achieve equal standing,” by ensuring they have “a reliable salary and a future apart from the challenges of poverty and inequality.”
For conservatives, it didn’t matter which message they read; their support for military spending was the same. However, liberals expressed significantly greater support for increasing military spending if they read the fairness message rather than the patriotism one.
If you’re thinking that these reframed arguments don’t sound like ones that conservatives and liberals would naturally be inclined to make, you’re right. In an additional study, we asked liberals to write a persuasive argument in favor of same-sex marriage aimed at convincing conservatives — and we offered a cash prize to the participant who wrote the most persuasive message. Despite the financial incentive, just 9 percent of liberals made arguments that appealed to more conservative notions of morality, while 69 percent made arguments based on more liberal values.
Conservatives were not much better. When asked to write an argument in favor of making English the official language of the United States that would be persuasive to liberals (with the same cash incentive), just 8 percent of conservatives appealed to liberal values, while 59 percent drew upon conservative values.
Why do we find moral reframing so challenging? There are a number of reasons. You might find it off-putting to endorse values that you don’t hold yourself. You might not see a link between your political positions and your audience’s values. And you might not even know that your audience endorses different values from your own. But whatever the source of the gulf, it can be bridged with effort and consideration.
Maybe reframing political arguments in terms of your audience’s morality should be viewed less as an exercise in targeted, strategic persuasion, and more as an exercise in real, substantive perspective taking. To do it, you have to get into the heads of the people you’d like to persuade, think about what they care about and make arguments that embrace their principles. If you can do that, it will show that you view those with whom you disagree not as enemies, but as people whose values are worth your consideration.
Even if the arguments that you wind up making aren’t those that you would find most appealing, you will have dignified the morality of your political rivals with your attention, which, if you think about it, is the least that we owe our fellow citizens.

Friday, May 06, 2016

Our perception of our body shape is very malleable - making your finger feel shorter.

Here is a neat trick. It works! (I tried it.) Ekroll et al. show that illusory visual completion of an object's invisible backside can make your finger feel shorter. Here is their summary and the central graphic from the article.

Highlights
•The experience of the hidden backsides of things acts as a real percept 
•These percepts have causal powers, although they do not correspond to real objects 
•They can evoke a bizarre illusion in which the observer’s own finger feels shrunken 
•The perceptual representation of body shape is highly malleable
Summary
In a well-known magic trick known as multiplying balls, conjurers fool their audience with the use of a semi-spherical shell, which the audience perceives as a complete ball. Here, we report that this illusion persists even when observers touch the inside of the shell with their own finger. Even more intriguingly, this also produces an illusion of bodily self-awareness in which the finger feels shorter, as if to make space for the purely illusory volume of the visually completed ball. This observation provides strong evidence for the controversial and counterintuitive idea that our experience of the hidden backsides of objects is shaped by genuine perceptual representations rather than mere cognitive guesswork or imagery.
Figure


A Well-Known Magic Trick and the Shrunken Finger Illusion
(A and B) The multiplying balls routine. The magician first holds what seems to be a single ball between his fingers (A). After a quick flick of the wrist, a second ball seems to materialize (B). In reality, the lower “ball” is a hollow semi-spherical shell, from which the real ball is pulled out.
(C and D) Schematic illustration of the shrunken finger illusion. When a semi-spherical shell is balanced on the observer’s finger as shown in (C) and viewed from above, the observer often reports perceiving the shell as a complete ball (D), while his or her finger is felt to be unusually short, as if to make space for the illusory volume of the complete ball. Note that this drawing is an exaggerated caricature of the perceptual experience. In particular, the real effect may be smaller than depicted here. In the experiments, only the middle finger was extended, while the other fingers were closed to a fist (see Figure below).

Thursday, May 05, 2016

What happens if we all live to 100?

I want to mention an interesting article by Easterbrook that has been languishing in my queue of potential posts for more than a year. It surveys numerous studies on aging and life extension, and asks how long the eerily linear rise in life expectancy since 1840 (from the mid-40s to the low 80s) can continue. Two clips:
No specific development or discovery has caused the rise: improvements in nutrition, public health, sanitation, and medical knowledge all have helped, but the operative impetus has been the “stream of continuing progress.”
One view is that increases will continue at least until life expectancy at birth surpasses 100. Jay Olshansky, a professor of public health at the University of Illinois at Chicago, disagrees, saying:
...the rise in life expectancy will “hit a wall soon, if it hasn’t already....Most of the 20th-century gains in longevity came from reduced infant mortality, and those were one time gains.” Infant mortality in the United States trails some other nations’, but has dropped so much—down to one in 170—that little room for improvement remains. “There’s tremendous statistical impact on life expectancy when the young are saved,” Olshansky says. “A reduction in infant mortality saves the entire span of a person’s life. Avoiding mortality in a young person—say, by vaccine—saves most of the person’s life. Changes in medicine or lifestyle that extend the lives of the old don’t add much to the numbers.” Olshansky calculates that if cancer were eliminated, American life expectancy would rise by only three years, because a host of other chronic fatal diseases are waiting to take its place. He thinks the 21st century will see the average life span extend “another 10 years or so,” with a bonus of more health span. Then the increase will slow noticeably, or stop.
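Olshansky's point about the outsized leverage of infant mortality can be made concrete with a toy life-table calculation. All the numbers below are invented for illustration; they are not from the article:

```python
# Toy life-table arithmetic (hypothetical numbers): reducing infant
# mortality moves life expectancy at birth far more than extending
# the lives of the old, because it saves a person's entire span.

def life_expectancy(cohort):
    """cohort: list of (age_at_death, fraction_of_cohort); fractions sum to 1."""
    return sum(age * frac for age, frac in cohort)

# Early-1900s-style cohort: 20% die in infancy, survivors average age 70.
historical = life_expectancy([(0, 0.20), (70, 0.80)])        # 56.0

# Nearly eliminating infant mortality saves each infant's whole span.
infants_saved = life_expectancy([(0, 0.005), (70, 0.995)])   # 69.65

# By contrast, adding 3 years to every survivor (curing one chronic
# disease, say) nudges the average up only modestly.
elders_extended = life_expectancy([(0, 0.20), (73, 0.80)])   # 58.4
```

With these made-up figures, saving the infants adds about 14 years of life expectancy at birth, while adding 3 years to every elderly survivor adds under 3 — the "one-time gains" Olshansky describes.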
Easterbrook's discussion of the social, economic, and political aspects of our graying future is well worth reading. The number of Americans 65 or older could reach 108 million by 2050, like adding three more Floridas inhabited entirely by seniors.
The nonpartisan think tank Third Way has calculated that at the beginning of the Kennedy presidency, the federal government spent $2.50 on public investments—infrastructure, education, and research—for every $1 it spent on entitlements. By 2022, Third Way predicts, the government will spend $5 on entitlements for every $1 on public investments. Infrastructure, education, and research lead to economic growth; entitlement subsidies merely allow the nation to tread water.

Wednesday, May 04, 2016

Semantic maps in our brains - and some interactive graphics

Huth et al. have performed functional MRI on subjects listening to hours of narrative stories to find semantic domains that seem to be consistent across individuals. This interactive 3D viewer (a preliminary version with limited data that takes a while to download and requires a fairly fast computer) shows a color coding of areas with different semantic selectivities (body part, person, place, time, outdoor, visual, tactile, violence, etc.) Here is their Nature abstract:
The meaning of language is represented in regions of the cerebral cortex collectively known as the ‘semantic system’. However, little of the semantic system has been mapped comprehensively, and the semantic selectivity of most regions is unknown. Here we systematically map semantic selectivity across the cortex using voxel-wise modelling of functional MRI (fMRI) data collected while subjects listened to hours of narrative stories. We show that the semantic system is organized into intricate patterns that seem to be consistent across individuals. We then use a novel generative model to create a detailed semantic atlas. Our results suggest that most areas within the semantic system represent information about specific semantic domains, or groups of related concepts, and our atlas shows which domains are represented in each area. This study demonstrates that data-driven methods—commonplace in studies of human neuroanatomy and functional connectivity—provide a powerful and efficient means for mapping functional representations in the brain.

Tuesday, May 03, 2016

Video games for Neuro-Cognitive Optimization

Continuing the MindBlog thread on brain games (cf. here), I pass on the introduction to a brief review by Mishra, Anguera, and Gazzaley on designing the next generation of closed-loop video games (CLVGs) that offer the prospect of enhancing cognition:
Humans of all ages engage deeply in game play. Game-based interactive environments provide a rich source of enjoyment, but also generate powerful experiences that promote learning and behavioral change (Pellegrini, 2009). In the modern era, software-based video games have become ubiquitous. The degree of interactivity and immersion in these video games can now be further enhanced like never before with the advent of consumer-accessible technologies like virtual reality, augmented reality, wearable physiological devices, and motion capture, all of which can be readily integrated using accessible game engines. This technological revolution presents a huge opportunity for neuroscientists to design targeted, novel game-based tools that drive positive neuroplasticity, accelerate learning, and strengthen cognitive function, and thereby promote mental wellbeing in both healthy and impaired brains.
In fact, there is now a burgeoning brain-training industry that already claims to have achieved this goal. However, many commercial claims are unsubstantiated and dismissed by the scientific community (Max Planck Institute for Human Development/Stanford Center on Longevity, 2014, Underwood, 2016). It seems prudent for us to slow down and approach this opportunity with scientific rigor and conservative optimism. Enhancing brain function should not be viewed as a clever, profitable start-up idea that can be conquered with a large marketing budget. If the field continues to be led by overinflated claims, we will jeopardize the careful and iterative process of evidence-based innovations in brain training and thereby risk throwing out the baby with the bathwater.

To strike the right balance, the path to commercialization needs to be accomplished via cutting-edge, neuroscientifically informed video game development tightly coupled with refinement and validation of the software in well-controlled empirical studies. Additionally, to separate the grain from the chaff, these studies and the claims based on them need verification and approval by independent regulatory agencies and the broader scientific community. High-level video game development and rigorous scientific validation need to become the twin pillar foundations of the next generation of closed-loop video games (CLVGs). Here, we define CLVGs as interactive video games that incorporate rapid, real-time, performance-driven, adaptive game challenges and performance feedback. The time is ideal for intensified effort in this important endeavor; CLVGs that are methodically developed and validated have the potential to benefit a broad array of disciplines in need of effective tools to enhance brain function, including education, medicine, and wellness.

Monday, May 02, 2016

Embodied Prediction - perception and mind turned upside down

Andy Clark offers a fascinating discussion and analysis of predictive processing, which turns the traditional picture of perception on its head. The embodied mind model, which seems to me completely compelling, shows the stark inadequacy of most brain-centered models of mind and cognition. I pass on the end of his introduction and the closing paragraph of the essay. (This essay is just one of many on a fascinating website, Open Mind, which has posted 39 essays, edited by Thomas Metzinger and Jennifer Windt, by both junior and senior members of the academic philosophy-of-mind field.)
Predictive processing plausibly represents the last and most radical step in a retreat from the passive, input-dominated view of the flow of neural processing. According to this emerging class of models, naturally intelligent systems (humans and other animals) do not passively await sensory stimulation. Instead, they are constantly active, trying to predict the streams of sensory stimulation before they arrive. Before an “input” arrives on the scene, these pro-active cognitive systems are already busy predicting its most probable shape and implications. Systems like this are already (and almost constantly) poised to act, and all they need to process are any sensed deviations from the predicted state. It is these calculated deviations from predicted states (known as prediction errors) that thus bear much of the information-processing burden, informing us of what is salient and newsworthy within the dense sensory barrage. The extensive use of top-down probabilistic prediction here provides an effective means of avoiding the kinds of “representational bottleneck” feared by early opponents of representation-heavy—but feed-forward dominated—forms of processing. Instead, the downward flow of prediction now does most of the computational “heavy-lifting”, allowing moment-by-moment processing to focus only on the newsworthy departures signified by salient prediction errors. Such economy and preparedness is biologically attractive, and neatly sidesteps the many processing bottlenecks associated with more passive models of the flow of information.
Action itself...then needs to be reconceived. Action is not so much a response to an input as a neat and efficient way of selecting the next “input”, and thereby driving a rolling cycle. These hyperactive systems are constantly predicting their own upcoming states, and actively moving so as to bring some of them into being. We thus act so as to bring forth the evolving streams of sensory information that keep us viable (keeping us fed, warm, and watered) and that serve our increasingly recondite ends. PP thus implements a comprehensive reversal of the traditional (bottom-up, forward-flowing) schema. The largest contributor to ongoing neural response, if PP is correct, is the ceaseless anticipatory buzz of downwards-flowing neural prediction that drives both perception and action. Incoming sensory information is just one further factor perturbing those restless pro-active seas. Within those seas, percepts and actions emerge via a recurrent cascade of sub-personal predictions forged from unconscious expectations spanning multiple spatial and temporal scales.
Conceptually, this implies a striking reversal, in that the driving sensory signal is really just providing corrective feedback on the emerging top-down predictions. As ever-active prediction engines, these kinds of minds are not, fundamentally, in the business of solving puzzles given to them as inputs. Rather, they are in the business of keeping us one step ahead of the game, poised to act and actively eliciting the sensory flows that keep us viable and fulfilled. If this is on track, then just about every aspect of the passive forward-flowing model is false. We are not passive cognitive couch potatoes so much as proactive predictavores, forever trying to stay one step ahead of the incoming waves of sensory stimulation.
Conclusion: Towards a mature science of the embodied mind
By self-organizing around prediction error, and by learning a generative rather than a merely discriminative (i.e., pattern-classifying) model, these approaches realize many of the goals of previous work in artificial neural networks, robotics, dynamical systems theory, and classical cognitive science. They self-organize around prediction error signals, perform unsupervised learning using a multi-level architecture, and acquire a satisfying grip—courtesy of the problem decompositions enabled by their hierarchical form—upon structural relations within a domain. They do this, moreover, in ways that are firmly grounded in the patterns of sensorimotor experience that structure learning, using continuous, non-linguaform, inner encodings (probability density functions and probabilistic inference). Precision-based restructuring of patterns of effective connectivity then allow us to nest simplicity within complexity, and to make as much (or as little) use of body and world as task and context dictate. This is encouraging. It might even be that models in this broad ballpark offer us a first glimpse of the shape of a fundamental and unified science of the embodied mind.
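The core dynamic Clark describes — a top-down prediction that is corrected only by "newsworthy" departures from it — can be caricatured in a few lines of code. This is a minimal one-variable sketch with invented signals and learning rate, nothing like a full hierarchical predictive-processing model:

```python
# A one-variable caricature of the predictive-processing loop: the system
# maintains a running prediction of its input, and only the prediction
# error (the "newsworthy departure") drives any updating.

def predictive_loop(signal, learning_rate=0.3):
    prediction = 0.0
    errors = []
    for observed in signal:
        error = observed - prediction        # prediction error: the surprise
        prediction += learning_rate * error  # top-down model absorbs it
        errors.append(error)
    return prediction, errors

# A steady input quickly becomes unsurprising: the error shrinks toward
# zero and the downward-flowing prediction carries the load.
prediction, errors = predictive_loop([5.0] * 20)
```

The first sample produces a large error; by the end the prediction has converged on the input and almost nothing remains to process — the "economy and preparedness" the excerpt emphasizes.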

Friday, April 29, 2016

The privileged fifth.

I tweeted this well-researched Op-Ed piece by Thomas Edsall the first time I read it, and after my third reading I want to urge you to read it. I pass on two summary graphics that are part of its description of how the privileged top fifth of the U.S. population is becoming a self-perpetuating class that is steadily separating itself by geography, education, and income.


Thursday, April 28, 2016

Sleep deprivation, brain structure, and learning

Saletin et al. find that individual differences in the anatomy of the human hippocampus explain many of the differences in learning impairment after sleep loss. These structural differences also predict the subsequent EEG slow-wave activity during recovery sleep and the restoration of learning after sleep.

Significance statement
Sleep deprivation does not impact all people equally. Some individuals show cognitive resilience to the effects of sleep loss, whereas others express striking vulnerability, the reasons for which remain largely unknown. Here, we demonstrate that structural features of the human brain, specifically those within the hippocampus, accurately predict which individuals are susceptible (or conversely, resilient) to memory impairments caused by sleep deprivation. Moreover, this same structural feature determines the success of memory restoration following subsequent recovery sleep. Therefore, structural properties of the human brain represent a novel biomarker predicting individual vulnerability to (and recovery from) the effects of sleep loss, one with occupational relevance in professions where insufficient sleep is pervasive yet memory function is paramount.
Abstract
Sleep deprivation impairs the formation of new memories. However, marked interindividual variability exists in the degree to which sleep loss compromises learning, the mechanistic reasons for which are unclear. Furthermore, which physiological sleep processes restore learning ability following sleep deprivation are similarly unknown. Here, we demonstrate that the structural morphology of human hippocampal subfields represents one factor determining vulnerability (and conversely, resilience) to the impact of sleep deprivation on memory formation. Moreover, this same measure of brain morphology was further associated with the quality of nonrapid eye movement slow wave oscillations during recovery sleep, and by way of such activity, determined the success of memory restoration. Such findings provide a novel human biomarker of cognitive susceptibility to, and recovery from, sleep deprivation. Moreover, this metric may be of special predictive utility for professions in which memory function is paramount yet insufficient sleep is pervasive (e.g., aviation, military, and medicine).
For further reading on insomnia, this article notes several other studies, one of which reports lowered connectivity in several right-hemisphere brain regions in people with primary insomnia.

Wednesday, April 27, 2016

Grandiose narcissism and the U.S. presidency

Many of us are scratching our heads about what a Trump presidency might be like, particularly in regard to his most prominent personality trait: grandiose narcissism. Watts et al. have looked at the historical record to note how this trait has correlated with both positive and negative leadership behaviors in U.S. presidents up until Obama. Their abstract:
Recent research and theorizing suggest that narcissism may predict both positive and negative leadership behaviors. We tested this hypothesis with data on the 42 U.S. presidents up to and including George W. Bush, using (a) expert-derived narcissism estimates, (b) independent historical surveys of presidential performance, and (c) largely or entirely objective indicators of presidential performance. Grandiose, but not vulnerable, narcissism was associated with superior overall greatness in an aggregate poll; it was also positively associated with public persuasiveness, crisis management, agenda setting, and allied behaviors, and with several objective indicators of performance, such as winning the popular vote and initiating legislation. Nevertheless, grandiose narcissism was also associated with several negative outcomes, including congressional impeachment resolutions and unethical behaviors. We found that presidents exhibit elevated levels of grandiose narcissism compared with the general population, and that presidents’ grandiose narcissism has been rising over time. Our findings suggest that grandiose narcissism may be a double-edged sword in the leadership domain.
The two highest scorers on grandiose narcissism were Lyndon B. Johnson and Theodore Roosevelt. Richard M. Nixon scored high on "vulnerable narcissism," a trait associated with being self-absorbed and thin-skinned. From the authors' popular account of their work:
Studies in the Journal of Personality in 2013 and in Personality and Individual Differences in 2009 have shown that narcissistic individuals tend to impress others during brief interactions and to perform well in public, two attributes that lend themselves to political success. They are also willing to take risks, which can be a valuable asset in a leader.
In contrast, the psychologist W. Keith Campbell and others have found that narcissists tend to be overconfident when making decisions, to overestimate their abilities and to portray their ideas as innovative when they are not. Compared with their non-narcissistic counterparts, they are more likely to accumulate resources for themselves at others’ expense.
The psychologists Brad Bushman and Roy F. Baumeister have found that narcissists, but not people with garden-variety high self-esteem, are prone to retaliating harshly against people who have criticized them. If, for example, you present narcissists with negative feedback about essays they’ve written, they’re likely to exact revenge against their presumed essay evaluators by blasting them with loud noises (as one amusing study found).
Still other work by the psychologist Mitja Back and colleagues suggests that narcissists are generally well liked in the short term, often creating positive first impressions. Other research indicates, though, that after a while they are usually more disliked than other individuals. Their charisma tends to wear off.

Tuesday, April 26, 2016

Are we smart enough to know how smart animals are?

I want to pass on some clips from Silk's review of Frans de Waal's recent book, whose title is the title of this post:
Natural selection, he argues, shapes cognitive abilities in the same way as it shapes traits such as wing length. As animals' challenges and habitats differ, so do their cognitive abilities. This idea, which he calls evolutionary cognition, has gained traction in psychology and biology in the past few decades.
For de Waal, evolutionary cognition has two key consequences. First, it is inconsistent with the concept of a 'great chain of being' in which organisms can be ordered from primitive to advanced, simple to complex, stupid to smart. Name a 'unique' human trait, and biologists will find another organism with a similar one. Humans make and use tools; so do wild New Caledonian crows (Corvus moneduloides). Humans develop cultures; so do humpback whales (Megaptera novaeangliae), which socially transmit foraging techniques. We can mentally 'time travel', remembering past events and planning for the future; so can western scrub jays (Aphelocoma californica), which can recall what they had for breakfast on one day, anticipate whether they will be given breakfast the next and selectively cache food when breakfast won't be delivered.
Furthermore, humans do not necessarily outdo other animals in all cognitive domains. Black-capped chickadees (Poecile atricapillus) store seeds in hundreds of locations each day, and can remember what they stored and where, as well as whether items in each location have been eaten, or stolen. Natural selection has favoured those prodigious feats of memory because they spell the difference between surviving winter and starving before spring. Human memory doesn't need to be as good: primates evolved in the tropics. “In the utilitarian view of biology,” de Waal argues, “animals have the brains they need — nothing more, nothing less.”
The second consequence of de Waal's view is that there is continuity across taxa. One source of continuity is based on evolutionary history: natural selection modifies traits to create new ones, producing commonalities among species with a common history. He points out that tool use is found not just in humans and chimpanzees, but also in other apes and monkeys, implying that relevant cognitive building blocks are shared across all primates. Continuity is also generated by convergent evolution, which produces similar traits in distantly related organisms such as New Caledonian crows and capuchin monkeys. De Waal opines that continuity “ought to be the default position for at least all mammals, and perhaps also birds and other vertebrates”.
...researchers are eager to understand what is distinctly human; some are driven by curiosity about how humans came to dominate the planet...Our success presumably has something to do with the emergence of a unique suite of cognitive traits...De Waal recognizes only one such trait: our rich and flexible system of symbolic communication, and our ability to exchange information about past and future. His commitment to the principle of continuity forces him to discount the importance of language for human cognition because of evidence of thinking by non-linguistic creatures. And he ignores compelling findings from linguists and developmental psychologists such as Elizabeth Spelke on the formative role of language in cognition.
A more satisfying book would leave readers with a clearer understanding of why, a few million years after our lineage diverged from the lineage of chimpanzees, we are the ones reading this book, and not them.

Monday, April 25, 2016

Essential role of default mode network in higher cognitive processing.

The respective roles of attentional and default mode networks in our brains have been the subject of numerous MindBlog posts (enter 'default mode' in the search box in the left column). A summary article by Bola and Borchardt notes an important recent contribution by Vatansever et al., whose abstract is shown below, followed by a graphic from the summary article. Their work changes the previous view that the default mode network disengages during goal-directed tasks.

ABSTRACT
The default mode network (DMN) has been traditionally assumed to hinder behavioral performance in externally focused, goal-directed paradigms and to provide no active contribution to human cognition. However, recent evidence suggests greater DMN activity in an array of tasks, especially those that involve self-referential and memory-based processing. Although data that robustly demonstrate a comprehensive functional role for DMN remains relatively scarce, the global workspace framework, which implicates the DMN in global information integration for conscious processing, can potentially provide an explanation for the broad range of higher-order paradigms that report DMN involvement. We used graph theoretical measures to assess the contribution of the DMN to global functional connectivity dynamics in 22 healthy volunteers during an fMRI-based n-back working-memory paradigm with parametric increases in difficulty. Our predominant finding is that brain modularity decreases with greater task demands, thus adapting a more global workspace configuration, in direct relation to increases in reaction times to correct responses. Flexible default mode regions dynamically switch community memberships and display significant changes in their nodal participation coefficient and strength, which may reflect the observed whole-brain changes in functional connectivity architecture. These findings have important implications for our understanding of healthy brain function, as they suggest a central role for the DMN in higher cognitive processing.
SIGNIFICANCE STATEMENT
The default mode network (DMN) has been shown to increase its activity during the absence of external stimulation, and hence was historically assumed to disengage during goal-directed tasks. Recent evidence, however, implicates the DMN in self-referential and memory-based processing. We provide robust evidence for this network's active contribution to working memory by revealing dynamic reconfiguration in its interactions with other networks and offer an explanation within the global workspace theoretical framework. These promising findings may help redefine our understanding of the exact DMN role in human cognition.
Graphic from Review

Schematic representation of the main findings of Vatansever et al. Community representation and colors are in the style of Figures 1 and 3 in the article by Vatansever et al. (2015), and the DMN is represented by Community 4. In the low-demanding 0-back condition, the network was highly modular (high Q index) and was divided into four distinct modules. With the increasing cognitive load, the modularity of the network decreased, and three communities merged into one. Thus, while local segregation was prevalent in the low-demanding task, increasing cognitive effort was associated with more pronounced global integration.
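The modularity index Q at the heart of this finding has a simple definition: the fraction of edges falling within communities, minus the fraction expected if edges were wired at random. A hand-rolled sketch on a toy network (the graph and community labels below are my own invention, not the authors' data) shows why fusing communities drives Q down:

```python
# Newman's modularity Q for an undirected, unweighted graph, computed
# directly from its definition. Toy data only, not the fMRI networks
# of Vatansever et al.

def modularity(edges, community):
    m = len(edges)
    degree = {}
    for u, v in edges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1
    # observed fraction of edges falling within communities...
    q = sum(1 for u, v in edges if community[u] == community[v]) / m
    # ...minus the fraction expected under random rewiring
    for u in degree:
        for v in degree:
            if community[u] == community[v]:
                q -= degree[u] * degree[v] / (4.0 * m * m)
    return q

# Two triangles joined by a single bridge: a crisply modular toy network.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
q_modular = modularity(edges, {0: 'A', 1: 'A', 2: 'A', 3: 'B', 4: 'B', 5: 'B'})
q_merged = modularity(edges, {n: 'A' for n in range(6)})  # communities fused
```

Here q_modular is about 0.36 while q_merged is 0, mirroring the direction of change in the schematic: as communities merge into one, modularity falls and the network shifts from local segregation toward global integration.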

Friday, April 22, 2016

How to attract others.

Well, duh... interesting, but talk about demonstrating the obvious! From Vacharkulksemsuk et al.:
Across two field studies of romantic attraction, we demonstrate that postural expansiveness makes humans more romantically appealing. In a field study (n = 144 speed-dates), we coded nonverbal behaviors associated with liking, love, and dominance. Postural expansiveness—expanding the body in physical space—was most predictive of attraction, with each one-unit increase in coded behavior from the video recordings nearly doubling a person’s odds of getting a “yes” response from one’s speed-dating partner. In a subsequent field experiment (n = 3,000), we tested the causality of postural expansion (vs. contraction) on attraction using a popular Global Positioning System-based online-dating application. Mate-seekers rapidly flipped through photographs of potential sexual/date partners, selecting those they desired to meet for a date. Mate-seekers were significantly more likely to select partners displaying an expansive (vs. contractive) nonverbal posture. Mediation analyses demonstrate one plausible mechanism through which expansiveness is appealing: Expansiveness makes the dating candidate appear more dominant. In a dating world in which success sometimes is determined by a split-second decision rendered after a brief interaction or exposure to a static photograph, single persons have very little time to make a good impression. Our research suggests that a nonverbal dominance display increases a person’s chances of being selected as a potential mate.
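The claim that each one-unit increase in expansiveness "nearly doubles the odds" is the standard odds-ratio reading of a logistic model. A small sketch with invented coefficients (not the study's actual estimates) makes the interpretation concrete:

```python
import math

# Invented logistic-model coefficients for illustration only: an odds
# ratio of 2 per unit of expansiveness means log-odds rise by ln(2).
beta0, beta1 = -2.0, math.log(2)

def p_yes(expansiveness):
    """Probability of a 'yes' under the toy logistic model."""
    log_odds = beta0 + beta1 * expansiveness
    return 1 / (1 + math.exp(-log_odds))

def odds(p):
    return p / (1 - p)

# Each one-unit increase multiplies the odds of a "yes" by exactly 2,
# even though the probability itself rises nonlinearly.
ratios = [odds(p_yes(x + 1)) / odds(p_yes(x)) for x in range(3)]
```

Note the distinction the sketch exposes: doubled odds is not doubled probability. At low base rates they are close, but as the probability climbs toward 1, further doublings of the odds move it less and less.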

Thursday, April 21, 2016

Impulsivity, sensation seeking, and substance use correlate with reduced brain cortical thickness.

From Holmes et al.:
Individuals vary widely in their tendency to seek stimulation and act impulsively, early developing traits with genetic origins. Failures to regulate these behaviors increase risk for maladaptive outcomes including substance abuse. Here, we explored the neuroanatomical correlates of sensation seeking and impulsivity in healthy young adults. Our analyses revealed links between sensation seeking and reduced cortical thickness that were preferentially localized to regions implicated in cognitive control, including anterior cingulate and middle frontal gyrus (n = 1015). These associations generalized to self-reported motor impulsivity, replicated in an independent group (n = 219), and correlated with heightened alcohol, tobacco, and caffeine use. Critically, the relations between sensation seeking and brain structure were evident in participants without a history of alcohol or tobacco use, suggesting that observed associations with anatomy are not solely a consequence of substance use. These results demonstrate that individual differences in the tendency to seek stimulation, act on impulse, and engage in substance use are correlated with the anatomical structure of cognitive control circuitry. Our findings suggest that, in healthy populations, covariation across these complex multidimensional behaviors may in part originate from a common underlying biology.

Wednesday, April 20, 2016

Metaphorical conflict shapes social perception when spatial and ideological collide.

Kleiman et al. report some intriguing experiments. I give you their abstract first, which doesn't actually say how they did the experiments, and then some further description from their text. The abstract:
In the present article, we introduce the concept of metaphorical conflict—a conflict between the concrete and abstract aspects of a metaphor. We used the association between the concrete (spatial) and abstract (ideological) components of the political left-right metaphor to demonstrate that metaphorical conflict has marked implications for cognitive processing and social perception. Specifically, we showed that creating conflict between a spatial location and a metaphorically linked concept reduces perceived differences between the attitudes of partisans who are generally viewed as possessing fundamentally different worldviews (Democrats and Republicans). We further demonstrated that metaphorical conflict reduces perceived attitude differences by creating a mind-set in which categories are represented as possessing broader boundaries than when concepts are metaphorically compatible. These results suggest that metaphorical conflict shapes social perception by making members of distinct groups appear more similar than they are generally thought to be. These findings have important implications for research on conflict, embodied cognition, and social perception.
In the first experiment they asked subjects to categorize a series of pictures of Barack Obama and Mitt Romney. One group categorized the Romney pictures with their right hand (the P key) and the Obama pictures with their left hand (the Q key) - compatible with the right-wing/left-wing political metaphor. A second group was asked to identify Obama with their right hand and Romney with their left - in this case the physical action and the candidate's ideology were metaphorically incompatible. The interesting result was that:
...participants in the incompatible condition perceived the difference between the candidates’ ideologies as smaller than did participants in the compatible condition...Additionally, participants in the incompatible condition perceived the difference between the candidates’ stances on specific political issues as smaller than did participants in the compatible condition
A second experiment asked participants to estimate the ideology of the typical Democrat and Republican using a scale of 1 to 9 that was either compatible or incompatible with the metaphorical association linking spatial locations to political ideologies.
Participants assigned to the incompatible condition (n = 194) provided their response on a horizontally displayed scale with the values in the opposite sequence, that is, from 1 (extremely conservative) to 9 (extremely liberal). Note that this scale reversed the traditional spatial assignment and placed liberal views on the right and conservative views on the left, which metaphorically puts the physical location and ideology in conflict... consistent with predictions, participants who rated their perceptions on the incompatible scale perceived the typical Republican’s and typical Democrat’s attitudes as more similar than did participants who rated their perceptions on the compatible scale.
Two further control experiments were done.

Tuesday, April 19, 2016

Political polarization and prejudice.

Yesterday's post dealt with softening prejudicial attitudes towards transgender people. This is relevant to the prejudice arising from the right versus left political polarization that continues to increase in this country. From a recent NYTimes OpEd piece by Arthur Brooks:
Thirty-eight percent of Democrats have a “very unfavorable” view of Republicans, and 43 percent of Republicans hold that view of Democrats. About half of “consistently liberal” Americans say most of their friends share their views, and about a third say it’s important to live in a place where that is so. For those who are “consistently conservative,” these preferences are even more pronounced.
...the average American is becoming more ideologically predictable. A Pew Research Center study from 2014 shows that the share of Americans with “consistently conservative” or “consistently liberal” views has more than doubled in the last two decades to 21 percent from 10 percent...In 1994, nearly 40 percent of Republicans were more liberal than the median Democrat, and 30 percent of Democrats were more conservative than the median Republican. Today, those numbers have plummeted to 8 percent and 6 percent.
This polarization has led to political discrimination that studies have shown to be stronger than racial discrimination.
...Bigotry’s cousin is contempt...Watch and listen to politically polarized commentary today, and you will see that it is more contemptuous than angry, overflowing with sneering, mockery and disgust.
So what’s the antidote? I asked the Dalai Lama, one of the world’s experts on bringing people together. He made two points. First, the solution starts not with institutions, but with individuals. We look too much to political parties or Congress to make progress, but not nearly enough at our own behavior...You can’t single-handedly change the country, but you can change yourself. By declaring your independence from the bitterness washing over our nation, you can strike a small blow for greater national unity.
Second, each of us must aspire to what the Dalai Lama calls “warmheartedness” toward those with whom we disagree. This might sound squishy, but it is actually tough and practical advice. As he has stated, “I defeat my enemies when I make them my friends.” He is not advocating surrender to the views of those with whom we disagree. Liberals should be liberals and conservatives should be conservatives. But our duty is to be respectful, fair and friendly to all, even those with whom we have great differences.
Yesterday's post on changing prejudice suggests a further technique for reconciliation: active or analogic perspective taking. This is essentially imagining a situation in which you felt contempt from others, and also putting yourself in the shoes of others, imagining their concerns, etc.

Monday, April 18, 2016

How to change prejudice...for real this time

John Bohannon summarizes the interesting story of two researchers who, after finding that a study on reversing homophobia was based on fake data, went on to show that the effect claimed by the fraudulent study was real after all. Broockman and Kalla used a technique developed by the Los Angeles LGBT Center:
...the LGBT Center has its canvassers follow one called “analogic perspective taking.” By inviting someone to discuss an experience in which that person was perceived as different and treated unfairly, a canvasser tries to generate sympathy for the suffering of another group—such as gay or transgender people.
Here is the abstract:
Existing research depicts intergroup prejudices as deeply ingrained, requiring intense intervention to lastingly reduce. Here, we show that a single approximately 10-minute conversation encouraging actively taking the perspective of others can markedly reduce prejudice for at least 3 months. We illustrate this potential with a door-to-door canvassing intervention in South Florida targeting antitransgender prejudice. Despite declines in homophobia, transphobia remains pervasive. For the intervention, 56 canvassers went door to door encouraging active perspective-taking with 501 voters at voters’ doorsteps. A randomized trial found that these conversations substantially reduced transphobia, with decreases greater than Americans’ average decrease in homophobia from 1998 to 2012. These effects persisted for 3 months, and both transgender and nontransgender canvassers were effective. The intervention also increased support for a nondiscrimination law, even after exposing voters to counterarguments.

Friday, April 15, 2016

Brain correlates of how the risk taking of others influences our own risk taking

From Suzuki et al., another upstairs/downstairs story. Risk is represented in the caudate nucleus (downstairs), while beliefs about the risk preferences of others are represented in the dorsolateral prefrontal cortex (upstairs). The strength of the coupling between these areas determines how susceptible our behavior is to influence by others.
Our attitude toward risk plays a crucial role in influencing our everyday decision-making. Despite its importance, little is known about how human risk-preference can be modulated by observing risky behavior in other agents at either the behavioral or the neural level. Using fMRI combined with computational modeling of behavioral data, we show that human risk-preference can be systematically altered by the act of observing and learning from others’ risk-related decisions. The contagion is driven specifically by brain regions involved in the assessment of risk: the behavioral shift is implemented via a neural representation of risk in the caudate nucleus, whereas the representations of other decision-related variables such as expected value are not affected. Furthermore, we uncover neural computations underlying learning about others’ risk-preferences and describe how these signals interact with the neural representation of risk in the caudate. Updating of the belief about others’ preferences is associated with neural activity in the dorsolateral prefrontal cortex (dlPFC). Functional coupling between the dlPFC and the caudate correlates with the degree of susceptibility to the contagion effect, suggesting that a frontal–subcortical loop, the so-called dorsolateral prefrontal–striatal circuit, underlies the modulation of risk-preference. Taken together, these findings provide a mechanistic account for how observation of others’ risky behavior can modulate an individual’s own risk-preference.
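The abstract refers to "computational modeling of behavioral data" without spelling the model out. As a rough illustration only (my own toy construction, not the authors' model), the contagion effect can be sketched as an agent that values gambles with a simple mean-variance utility and drifts its risk-preference parameter a fraction of the way toward an observed partner's preference on each observation:

```python
def utility(ev, risk, rho):
    """Mean-variance utility: rho > 0 is risk-seeking, rho < 0 is risk-averse."""
    return ev + rho * risk

def choose(safe_value, gamble, rho):
    """Pick whichever option has higher utility; gamble = (expected value, risk)."""
    ev, risk = gamble
    return "gamble" if utility(ev, risk, rho) > utility(safe_value, 0.0, rho) else "safe"

def observe_other(rho_self, rho_other, alpha=0.3):
    """Belief-update step: shift own preference a fraction alpha toward the other's."""
    return rho_self + alpha * (rho_other - rho_self)

rho = -0.5                                  # start out risk-averse
before = choose(1.0, (1.0, 1.0), rho)       # prefers the safe option
for _ in range(10):                         # repeatedly watch a risk-seeking partner
    rho = observe_other(rho, rho_other=0.8)
after = choose(1.0, (1.0, 1.0), rho)        # preference has been "caught"
print(before, after, round(rho, 3))
```

In the paper's terms, the valuation of risk itself is the caudate-like quantity, the running estimate of the other's preference is the dlPFC-like quantity, and `alpha` loosely stands in for the frontal-striatal coupling strength that correlated with susceptibility to contagion.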

Thursday, April 14, 2016

Aging brains - more physical activity, more gray matter, less Alzheimer's.

I like to pass on any work I see relevant to exercise, aging, and the brain. The following is from Raji et al.
BACKGROUND: Physical activity (PA) can be neuroprotective and reduce the risk for Alzheimer's disease (AD). In assessing physical activity, caloric expenditure is a proxy marker reflecting the sum total of multiple physical activity types conducted by an individual. 
OBJECTIVE: To assess caloric expenditure, as a proxy marker of PA, as a predictive measure of gray matter (GM) volumes in the normal and cognitively impaired elderly persons. 
METHODS: All subjects in this study were recruited from the Institutional Review Board approved Cardiovascular Health Study (CHS), a multisite population-based longitudinal study in persons aged 65 and older. We analyzed a sub-sample of 876 CHS participants (mean age 78.3, 57.5% F, 42.5% M) who had i) energy output assessed as kilocalories (kcal) per week using the standardized Minnesota Leisure-Time Activities questionnaire, ii) cognitive assessments for clinical classification of normal cognition, mild cognitive impairment (MCI), and AD, and iii) volumetric MR imaging of the brain. Voxel-based morphometry modeled the relationship between kcal/week and GM volumes while accounting for standard covariates including head size, age, sex, white matter hyperintensity lesions, MCI or AD status, and site. Multiple comparisons were controlled using a False Discovery Rate of 5 percent. 
RESULTS: Higher energy output, from a variety of physical activity types, was associated with larger GM volumes in frontal, temporal, and parietal lobes, as well as hippocampus, thalamus, and basal ganglia. High levels of caloric expenditure moderated neurodegeneration-associated volume loss in the precuneus, posterior cingulate, and cerebellar vermis. 
CONCLUSION: Increasing energy output from a variety of physical activities is related to larger gray matter volumes in the elderly, regardless of cognitive status.
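The "False Discovery Rate of 5 percent" in the methods refers to the standard approach for handling the huge number of voxelwise tests in morphometry. As a sketch, here is the classic Benjamini-Hochberg step-up procedure (the usual FDR control; the study's exact voxelwise implementation may differ):

```python
def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up: return indices (into pvals) rejected at FDR q."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices sorted by p-value
    # find the largest rank k with p_(k) <= (k/m) * q, then reject the k smallest
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * q:
            k = rank
    return sorted(order[:k])

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.216]
print(benjamini_hochberg(pvals))  # → [0, 1]
```

Note that only the two smallest p-values survive here, even though several others are below the nominal 0.05: FDR control trades a few individually "significant" results for a guarantee that at most ~5% of the reported discoveries are false.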

Wednesday, April 13, 2016

Distraction in the digital era…what about since 1710?

I want to pass on some clips from an interesting essay by Frank Furedi, "The Ages of Distraction."
The rise of the internet and the widespread availability of digital technology has surrounded us with endless sources of distraction: texts, emails and Instagrams from friends, streaming music and videos, ever-changing stock quotes, news and more news. To get our work done, we could try to turn off the digital stream, but that’s difficult to do when we’re plagued by FOMO, the modern fear of missing out. Some people think that our willpower is so weak because our brains have been damaged by digital noise. But blaming technology for the rise in inattention is misplaced. History shows that the disquiet is fueled not by the next new thing but by the threat this thing – whatever it might be – poses to the moral authority of the day.
The first time inattention emerged as a social threat was in 18th-century Europe, during the Enlightenment, just as logic and science were pushing against religion and myth. The Oxford English Dictionary cites a 1710 entry from Tatler as its first reference to this word, coupling inattention with indolence; both are represented as moral vices of serious public concern.
The recent decades have seen a dramatic reversal in the conceptualization of inattention. Unlike in the 18th century when it was perceived as abnormal, today inattention is often presented as the normal state. The current era is frequently characterized as the Age of Distraction, and inattention is no longer depicted as a condition that afflicts a few. Nowadays, the erosion of humanity’s capacity for attention is portrayed as an existential problem, linked with the allegedly corrosive effects of digitally driven streams of information relentlessly flowing our way.
The perception of an Age of Distraction is related to our uncertainty about the answer to the question of ‘attention to what or to whom’. The sublimation of anxieties about moral authority through the fetish of technologically driven distraction has acquired pathological proportions in relation to children and young people. Yet as most sensible observers understand, children who are inattentive to their teachers are often obsessively attentive to the text messages that they receive. The constant lament about inattentive youth in the Anglo-American world could be interpreted as a symptom of problems related to the exercise of adult authority.
Often the failure to inspire and capture the imagination of young people is blamed on their inattentive states of mind. Too often educators have responded to this condition by adopting a fatalistic approach of accommodating to the supposed inattentive reading practices of digital natives. This pattern is evident in higher education where the assumption that college students can no longer be expected to read long and challenging texts or pay attention to serious lectures has led to the adaptation of course material to the inattentive mentality of the digital native. Calls to change the educational environment to ‘fit the student’ have become widespread in higher education.
How different from the reaction of moral philosophers such as Dugald Stewart, also concerned with the problem of the inattentive student. Author of Outlines of Moral Philosophy: For the Use of Students in the University of Edinburgh (1793), Stewart believed that the problem of inattention could be overcome through moral education. Unlike some contemporary academics, he regarded the ‘early habit of inattention’ a problem to be solved rather than an unalterable fact of existence. Helvétius fervently believed that everyone had the potential to acquire ‘continued attention’ and ‘triumph over indolence’.
Regrettably, the optimism of Helvétius has given way to a mood of resignation. Attention is still seen as desirable but almost impossible to achieve. As one alarmist account warns, ‘an epidemic erosion of attention is a sure sign of an impending dark age’. Helvétius would have been distressed by the fatalism expressed in this lament.

Tuesday, April 12, 2016

The evolutionary origins of smiles, laughter, and tears.

Graziano suggests that our smile originated in the defensive reaction of monkeys to other monkeys moving into their personal space. He then proceeds to offer just-so stories about the simian origins of our laughing and crying. To begin, imagine Monkey B steps into the personal space of Monkey A.
Monkey A squints, protecting his eyes. His upper lip pulls up. This does expose the teeth, but only as a side-effect: in a defensive reaction, the point of the curled lip is not to prepare for a biting attack so much as it is to bunch the facial skin upward, further padding the eyes in folds of skin...The head pulls down and the shoulders pull up to protect the vulnerable throat and jugular....The torso curves forward to protect the abdomen...Monkey B can learn a lot by watching the reaction of Monkey A...And so the stage is set for a social signal to evolve: natural selection will favour monkeys that can read the cringe reactions of their peers and adjust their behaviour accordingly...If Monkey B can glean useful information by watching Monkey A, then it’s useful for Monkey A to manipulate that information and influence Monkey B. Evolution therefore favours monkeys that can, in the right circumstances, pantomime a defensive reaction. It helps to convince others that you’re non-threatening. Finally we see the origin of the smile: a briefly flashed imitation of a defensive stance.
In people, the smile has been pared down to little more than its facial components — the lifting of the upper lip, the upward bunching of the cheeks, the squint. These days we use it mainly to communicate a friendly lack of aggression rather than outright subservience...We can’t help feeling warmer towards someone who beams that Duchenne smile.
On laughing:
...chimps have something like laughter: they open their mouths and make short exhalations during play fights, or if someone tickles them. Gorillas and orangutans do the same. The psychologist Marina Ross compared the noises made by different species of ape and found that it was the sound of bonobos at play that comes closest to human laughter, again, when play-fighting or tickling. All of which makes it seem quite likely that the original type of human laughter also emerged from, yes, play-fighting and tickling.
On crying:
My best guess, strange as it might sound, is that our ancestors were in the habit of punching each other on the nose. Such injuries would have resulted in copious tear production...According to recent analysis by David Carrier and Michael Morgan from the University of Utah, the shape of human facial bones might well have evolved to withstand the physical trauma of frequent punching. Thickly buttressed facial bones are first seen in fossils of Australopithecus, which appeared following our split with chimpanzees...the reason we weep now may well be that our ancestors discussed their differences by hitting each other in the face. Some of us still do, I suppose.
In any event, the entire behavioural display that we call crying – the tear production, the squinting, the raised upper lip, the repeated alarm calls – makes for a useful signifier. Evolution would have favoured animals that reacted to it with an emotional desire to dispense comfort.
Graziano's speculative summary:
An age-old defensive mechanism, a mechanism that monitors bubbles of space around the body and organises protective movements, suddenly takes flight in the hyper-social world of primates, spinning into smiles and laughter and crying and cringeing. Each one of those behaviours then splits further, branching into a whole codebook of signals for use in different social circumstances. Not all of human expression can be explained in this way, but much of it can. A Duchenne smile, a cold smile, laughter at a joke, laughter that acknowledges a clever witticism, cruel laughter, a cringe to show servility, standing straight to show confidence, the arms-crossed expression of suspicion, the arms-open expression of welcome, tilting your head as a sign of surrender to a lover, the fleeting crinkling of the face that hints at crying as we show sympathy for some sad story, or a full blown sobbing jag: this whole vast range of expression could well have emerged from a protective sensory-motor loop that has nothing to do with communication. Evolution is bizarre.

Monday, April 11, 2016

Another list - "Keys to happiness"

The New York Times has put together a simple list of pointers to basic articles and research on well-being. I'm passing on a few of the items from a condensed version of that list, rearranging it in almost reverse order to reflect not importance, but the items that seem to me less commonly acted on. So, keys to happiness:

Don't obsess about it, and don't overdo it.

If all else fails, fake it.

Gratitude helps.

Make friends, family, and weekends a priority.

Be healthy.
Friday, April 08, 2016

A succinct list of some of our common psychological errors.

I want to point to Belsky's article on why we think we are better decision makers under uncertainty than we really are. He summarizes several common errors:

The sunk cost fallacy - hanging on to a decision, or an investment, in an unconscious desire to justify it.

Loss aversion - reacting more strongly to loss of a resource (time, goods, or money) than to a similar gain.

Overconfidence - overrating our abilities, knowledge, and skill (two thirds of investors rate their financial sophistication as advanced, yet barely pass a financial literacy exam).

Optimism bias - which seems to be hard-wired into our brains because it is evolutionarily useful, driving humans to strive in the face of long odds.

Hindsight bias - rewriting history to make ourselves look good, as in misremembering our forecasts in a way that makes us look smarter.

Attribution bias - attributing good outcomes to our own skills, but bad outcomes to causes over which we had no control.

Confirmation bias - giving too much weight to information that supports our existing beliefs and discounting that which does not.

Thursday, April 07, 2016

Muscle mass and nerve control enhanced in octogenarian athletes.

Power et al. expand their earlier studies on active runners ~65 years old to find ~14% greater muscle mass and ~28% more functioning motor nerve units in octogenarian masters athletes than in healthy age-matched controls.
Our group has shown a greater number of functioning motor units (MU) in a cohort of highly active older (~65 y) masters runners relative to age-matched controls. Owing to the precipitous loss in the number of functioning MUs in the 8th and 9th decades of life, it is unknown whether older world-class octogenarian masters athletes (MA) would also have greater numbers of functioning MUs (MUNE) compared with age-matched controls. We measured MU numbers and neuromuscular transmission stability in the tibialis anterior of world champion MAs (~80 y), and compared the values to healthy age-matched controls (~80 y). Decomposition-enhanced spike-triggered averaging was used to collect surface and intramuscular electromyography signals during dorsiflexion at ~25% of maximum voluntary isometric contraction (MVC). Near fibre (NF) MU potential analysis was used to assess neuromuscular transmission stability. For the MAs as compared with age-matched controls, the amount of excitable muscle mass (CMAP) was 14% greater (p < 0.05), there was a trend (p = 0.07) towards a 27% smaller surface-detected motor unit potential - representative of less collateral reinnervation - and 28% more functioning MUs (p < 0.05). Additionally, the MAs had greater MU neuromuscular stability than the controls as indicated by lower NF jitter and jiggle values (p < 0.05). These results demonstrate that high-performing octogenarians better maintain neuromuscular stability of the MU and mitigate the loss of MUs associated with aging well into the later decades of life, during which time the loss of muscle mass and strength become functionally relevant. Future studies need to identify the concomitant roles genetics and exercise play in neuroprotection.

Wednesday, April 06, 2016

Why sad music can make us feel good.

As an update to a previous MindBlog post on why we like sad music, I want to note Ojiaku's brief mention of several articles on this subject.
Sad music might make people feel vicarious unpleasant emotions, found a study published last year in Frontiers in Psychology. But this experience can ultimately be pleasurable because it allows a negative emotion to exist indirectly, and at a safe distance. Instead of feeling the depths of despair, people can feel nostalgia for a time when they were in a similar emotional state: a non-threatening way to remember a sadness.
People who are very empathetic are more likely to take pleasure in the emotional experience of sad music, according to another study in Frontiers in Psychology. Others enjoy sad songs because they help them return to an emotionally balanced state, according to a review in Frontiers in Human Neuroscience, published in 2015. And those more open to varied experiences might enjoy the songs because the unique emotions that come up when listening to the music fulfill their need for novelty in thoughts and feelings.
From the Frontiers in Human Neuroscience abstract:
We offer a framework to account for how listening to sad music can lead to positive feelings, contending that this effect hinges on correcting an ongoing homeostatic imbalance. Sadness evoked by music is found pleasurable: (1) when it is perceived as non-threatening; (2) when it is aesthetically pleasing; and (3) when it produces psychological benefits such as mood regulation, and empathic feelings, caused, for example, by recollection of and reflection on past events.

Tuesday, April 05, 2016

The Social Gene

I want to pass on some clips from Joseph Swift's review of a book, "The Society of Genes" by Yanai and Lercher, that updates Richard Dawkins's classic "The Selfish Gene," published 40 years ago. (Their title reminds me of "Society of Mind," a classic book published in 1986 by Marvin Minsky, who recently died at age 88.)
Genetic research has moved rapidly since the publication of Richard Dawkins's The Selfish Gene 40 years ago. In the intervening years, we have come to realize that many of the most interesting and important phenomena in human biology are not caused by any single gene. Processes like the immune system's ability to recognize infection, or the timing of our sleep-wake cycle, for example, are the product of many genes working together in a highly integrated way. Citing a wealth of recent research that explores the ways genes work together to produce complex biological processes, Itai Yanai and Martin Lercher argue that it is time to embrace a new, more holistic, metaphor in their book, The Society of Genes.
Rather than focus on any one gene, Yanai and Lercher invite the reader to step back and observe how genes assemble together to make a global genetic system, or genome. From here, one can see that the labor within the genome is not divided equally. Whereas many genes encode for proteins that perform a single monotonous task, such as breaking down a certain type of sugar or producing a specific skin pigment, there are others that serve such fundamental roles that their removal would lead to the crumbling of the genomic society altogether. Among the latter group are genes that manage the behavior of a host of other genes.
When genes are mismanaged by their masters, organisms can be transformed in dramatic ways. For example, in humans, when SOX9 fails to direct its wide range of subordinates succinctly, sex reversal and skeletal malformations can occur.
Given that catastrophic things tend to happen when genes don't work together properly, changes to how the genomic society is run are a rare occurrence. When genes with new abilities evolve, Darwinian selection determines whether they will join the ranks as productive members of society. Our ancestors obtained genes that could interpret light as color and a gene for a more efficient oxygen-carrying hemoglobin in this very way.
And then there are the genes that don't contribute to society at all. Instead, they secure their position by hijacking the system. The LINE1 gene, for example, encodes only for its own dispersal, copying and pasting itself throughout our genome while providing the society with no clear benefit. The “bad behavior” of genes amounts to scandal in the genomic society, and learning about their exploits is one of the most enjoyable elements of reading the book.
There are even genes that work to ensure the survival of individual cells within an organism by wreaking havoc on others. In fruit flies, for example, a pair of genes involved in sperm production work in concert to produce both a poison and its antidote. The toxic compound is released from the cell, while the antidote is retained. In this way, surrounding sperm cells without the gene pair are killed. On reading about such systems, one begins to realize that it's not quite right to imagine our genome as some idealized republic. This is a society that is easily compromised from within its own ranks.
In the years since The Selfish Gene was published, the human genome has been sequenced, along with the genomes of many other species. Indeed, probing one's own genes is beginning to become routine. Thus, The Society of Genes represents a timely and welcome handbook for navigating this postgenomic era.