Monday, May 16, 2016

Downsides of diversity.

I want to thank the anonymous commenter on the “Diversity makes you brighter” post, who sent links to interesting articles by Jonas and by Dinesen and Sønderskov. I pass on just a few clips from Jonas, noting work by Putnam and Page:
...a fascinating new portrait of diversity emerging from recent scholarship. Diversity, it shows, makes us uncomfortable -- but discomfort, it turns out, isn't always a bad thing. Unease with differences helps explain why teams of engineers from different cultures may be ideally suited to solve a vexing problem. Culture clashes can produce a dynamic give-and-take, generating a solution that may have eluded a group of people with more similar backgrounds and approaches. At the same time, though, Putnam's work adds to a growing body of research indicating that more diverse populations seem to extend themselves less on behalf of collective needs and goals.
In more diverse communities, he says, there were neither great bonds formed across group lines nor heightened ethnic tensions, but a general civic malaise. And in perhaps the most surprising result of all, levels of trust were not only lower between groups in more diverse settings, but even among members of the same group...
So, there is a diversity paradox:
...those in more diverse communities may do more bowling alone, but the creative tensions unleashed by those differences in the workplace may vault those same places to the cutting edge of the economy and of creative culture.

Friday, May 13, 2016

Two ways to be satisfied.

Anna North points to an article by Helzer and Jayawickreme that examines two different control strategies for obtaining short- and long-term life satisfaction: “primary control,” the ability to directly affect one's circumstances, and “secondary control,” the ability to affect how one responds to those circumstances.
How does a sense of control relate to well-being? We consider two distinguishable control strategies, primary and secondary control, and their relationships with two facets of subjective well-being, daily positive/negative affective experience and global life satisfaction. Using undergraduate and online samples, the results suggest that these different control strategies are associated uniquely with distinct facets of well-being. After controlling for shared variance among constructs, primary control (the tendency to achieve mastery over circumstances via goal striving) was associated more consistently with daily affective experience than was secondary control, and secondary control (the tendency to achieve mastery over circumstances via sense-making) was associated more strongly with life satisfaction than primary control, but only within the student sample and community members not in a committed relationship. The results highlight the importance of both control strategies to everyday health and provide insights into the mechanisms underlying the relationship between control and well-being.
It is not clear why relationship status makes a difference. Helzer suggests that having a partner may help people deal with adversity in much the same way that secondary control does, so secondary control may have less of an effect.

Thursday, May 12, 2016

John Oliver on "Scientific Studies show...."

I have to pass on this great bit from John Oliver, on the vacuity of most scientific reporting.


Diversity makes you brighter.

Providing some data relevant to debates over affirmative action, Levine et al. show that ethnic diversity can increase intelligent behavior in markets: misfits between market prices and the true value of assets (market bubbles) are more likely in ethnically homogeneous markets than in diverse ones.
Markets are central to modern society, so their failures can be devastating. Here, we examine a prominent failure: price bubbles. Bubbles emerge when traders err collectively in pricing, causing misfit between market prices and the true values of assets. The causes of such collective errors remain elusive. We propose that bubbles are affected by ethnic homogeneity in the market and can be thwarted by diversity. In homogenous markets, traders place undue confidence in the decisions of others. Less likely to scrutinize others’ decisions, traders are more likely to accept prices that deviate from true values. To test this, we constructed experimental markets in Southeast Asia and North America, where participants traded stocks to earn money. We randomly assigned participants to ethnically homogeneous or diverse markets. We find a marked difference: Across markets and locations, market prices fit true values 58% better in diverse markets. The effect is similar across sites, despite sizeable differences in culture and ethnic composition. Specifically, in homogenous markets, overpricing is higher as traders are more likely to accept speculative prices. Their pricing errors are more correlated than in diverse markets. In addition, when bubbles burst, homogenous markets crash more severely. The findings suggest that price bubbles arise not only from individual errors or financial conditions, but also from the social context of decision making. The evidence may inform public discussion on ethnic diversity: it may be beneficial not only for providing variety in perspectives and skills, but also because diversity facilitates friction that enhances deliberation and upends conformity.
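A rough sketch of how one might quantify the “fit” between market prices and true values described above, using mean absolute percentage deviation of prices from fundamental value. The metric and the numbers below are illustrative assumptions, not the measure or data used by Levine et al.:

```python
import numpy as np

def price_fit_error(prices, fundamental_values):
    """Mean absolute percentage deviation of traded prices from true values.

    Lower values mean prices track fundamentals more closely
    (i.e., less bubble-like mispricing).
    """
    prices = np.asarray(prices, dtype=float)
    fundamentals = np.asarray(fundamental_values, dtype=float)
    return np.mean(np.abs(prices - fundamentals) / fundamentals)

# Illustrative (made-up) data: the same declining-value asset traded in two markets.
fundamentals = np.array([50, 48, 46, 44, 42, 40], dtype=float)   # true values over rounds
homogeneous  = np.array([55, 58, 60, 57, 50, 41], dtype=float)   # prices inflate, then crash
diverse      = np.array([52, 50, 47, 45, 43, 40], dtype=float)   # prices track true values

err_homog = price_fit_error(homogeneous, fundamentals)
err_div = price_fit_error(diverse, fundamentals)
print(f"mispricing, homogeneous market: {err_homog:.1%}")
print(f"mispricing, diverse market:     {err_div:.1%}")
print(f"diverse prices fit true values {1 - err_div / err_homog:.0%} better")
```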

Wednesday, May 11, 2016

What art unveils

I pass on some initial and final clips from an essay by Alva Noë that is worth reading in its entirety.
Is there a way of thinking about art that will get us closer to an understanding of its essential nature, and our own?...the trend is to try to answer these questions in the key of neuroscience. I recommend a different approach, but not because I don’t think it is crucial to explore the links between art and our biological nature. The problem is that neuroscience has yet to frame an adequate conception of our nature. You look in vain in the writings of neuroscientists for satisfying accounts of experience or consciousness. For this reason, I believe, we can’t use neuroscience to explain art and its place in our lives. Indeed, if I am right, the order of explanation may go in the other direction: Art can help us frame a better picture of our human nature.
...Design, the work of technology, stops, and art begins, when we are unable to take the background of our familiar technologies and activities for granted, and when we can no longer take for granted what is, in fact, a precondition of the very natural-seeming intelligibility of such things as doorknobs and pictures, words and sounds. When you and I are talking, I don’t pay attention to the noises you are making; your language is a transparency through which I encounter you. Design, at least when it is optimal, is transparent in just this way; it disappears from view and gets absorbed in application. You study the shirt on the website; you don’t contemplate its image.
Art, in contrast, makes things strange. You do contemplate the image, when you examine Leonardo’s depiction of the lady with the ermine. You are likely, for example, to notice her jarringly oversized and masculine hand and to wonder why Leonardo draws our attention to that feature of this otherwise beautiful young person. Art disrupts plain looking and it does so on purpose. By doing so it discloses just what plain looking conceals.
Art unveils us ourselves. Art is a making activity because we are by nature and culture organized by making activities. A work of art is a strange tool. It is an alien implement that affords us the opportunity to bring into view everything that was hidden in the background.
If I am right, art isn’t a phenomenon to be explained. Not by neuroscience, and not by philosophy. Art is itself a research practice, a way of investigating the world and ourselves. Art displays us to ourselves, and in a way makes us anew, by disrupting our habitual activities of doing and making.

Tuesday, May 10, 2016

Our brain activity at rest predicts our brain activity during tasks.

The Science Magazine précis of Tavor et al.:
We all differ in how we perceive, think, and act. What drives individual differences in evoked brain activity? Tavor et al. applied computational models to functional magnetic resonance imaging (fMRI) data from the Human Connectome Project. Brain activity in the “resting” state when subjects were not performing any explicit task predicted differences in fMRI activation across a range of cognitive paradigms. This suggests that individual differences in many cognitive tasks are a stable trait marker. Resting-state functional connectivity thus already contains the repertoire that is then expressed during task-based fMRI.
And the article abstract:
When asked to perform the same task, different individuals exhibit markedly different patterns of brain activity. This variability is often attributed to volatile factors, such as task strategy or compliance. We propose that individual differences in brain responses are, to a large degree, inherent to the brain and can be predicted from task-independent measurements collected at rest. Using a large set of task conditions, spanning several behavioral domains, we train a simple model that relates task-independent measurements to task activity and evaluate the model by predicting task activation maps for unseen subjects using magnetic resonance imaging. Our model can accurately predict individual differences in brain activity and highlights a coupling between brain connectivity and function that can be captured at the level of individual subjects.
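To make the idea of “predicting task activation maps for unseen subjects” concrete, here is a minimal leave-one-subject-out regression sketch. The synthetic data and the simple ridge model are assumptions for illustration only, not the authors' actual features or pipeline:

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Synthetic stand-in data: for each subject, a resting-state feature vector per vertex
# (e.g., a connectivity fingerprint) and a task activation value per vertex.
n_subjects, n_vertices, n_features = 20, 500, 10
true_w = rng.normal(size=n_features)
rest_features = rng.normal(size=(n_subjects, n_vertices, n_features))
task_maps = rest_features @ true_w + 0.5 * rng.normal(size=(n_subjects, n_vertices))

# Leave-one-subject-out: fit a linear map from rest features to task activation
# on all other subjects, then predict the held-out subject's activation map.
scores = []
for s in range(n_subjects):
    train = [i for i in range(n_subjects) if i != s]
    X_train = rest_features[train].reshape(-1, n_features)
    y_train = task_maps[train].reshape(-1)
    model = Ridge(alpha=1.0).fit(X_train, y_train)
    y_pred = model.predict(rest_features[s])
    scores.append(np.corrcoef(y_pred, task_maps[s])[0, 1])

print(f"mean predicted-vs-actual map correlation: {np.mean(scores):.2f}")
```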

Monday, May 09, 2016

The key to political persuasion

I pass on clips from this interesting piece, which has been languishing in my queue of potential posts for some time, in which Willer and Feinberg give a more accessible account of their work reported in the Personality and Social Psychology Bulletin.
In business, everyone knows that if you want to persuade people to make a deal with you, you have to focus on what they value, not what you do. If you’re trying to sell your car, you emphasize the features of the sale that appeal to the buyer (the reliability and reasonable price of the vehicle), not the ones that appeal to you (the influx of cash).
This rule of salesmanship also applies in political debate — i.e., you should frame your position in terms of the moral values of the person you’re trying to convince. But when it comes to politics, this turns out to be hard to do. We found that people struggled to set aside their reasons for taking a political position and failed to consider how someone with different values might come to support that same position.
In one study, we presented liberals and conservatives with one of two messages in support of same-sex marriage. One message emphasized the need for equal rights for same-sex couples. This is the sort of fairness-based message that liberals typically advance for same-sex marriage. It is framed in terms of a value — equality — that research has shown resonates more strongly among liberals than conservatives. The other message was designed to appeal to values of patriotism and group loyalty, which have been shown to resonate more with conservatives. (It argued that “same-sex couples are proud and patriotic Americans” who “contribute to the American economy and society.”)
Liberals showed the same support for same-sex marriage regardless of which message they encountered. But conservatives supported same-sex marriage significantly more if they read the patriotism message rather than the fairness one.
In a parallel experiment, we targeted liberals for persuasion. We presented a group of liberals and conservatives with one of two messages in support of increased military spending. One message argued that we should “take pride in our military,” which “unifies us both at home and abroad.” The other argued that military spending is necessary because, through the military, the poor and disadvantaged “can achieve equal standing,” by ensuring they have “a reliable salary and a future apart from the challenges of poverty and inequality.”
For conservatives, it didn’t matter which message they read; their support for military spending was the same. However, liberals expressed significantly greater support for increasing military spending if they read the fairness message rather than the patriotism one.
If you’re thinking that these reframed arguments don’t sound like ones that conservatives and liberals would naturally be inclined to make, you’re right. In an additional study, we asked liberals to write a persuasive argument in favor of same-sex marriage aimed at convincing conservatives — and we offered a cash prize to the participant who wrote the most persuasive message. Despite the financial incentive, just 9 percent of liberals made arguments that appealed to more conservative notions of morality, while 69 percent made arguments based on more liberal values.
Conservatives were not much better. When asked to write an argument in favor of making English the official language of the United States that would be persuasive to liberals (with the same cash incentive), just 8 percent of conservatives appealed to liberal values, while 59 percent drew upon conservative values.
Why do we find moral reframing so challenging? There are a number of reasons. You might find it off-putting to endorse values that you don’t hold yourself. You might not see a link between your political positions and your audience’s values. And you might not even know that your audience endorses different values from your own. But whatever the source of the gulf, it can be bridged with effort and consideration.
Maybe reframing political arguments in terms of your audience’s morality should be viewed less as an exercise in targeted, strategic persuasion, and more as an exercise in real, substantive perspective taking. To do it, you have to get into the heads of the people you’d like to persuade, think about what they care about and make arguments that embrace their principles. If you can do that, it will show that you view those with whom you disagree not as enemies, but as people whose values are worth your consideration.
Even if the arguments that you wind up making aren’t those that you would find most appealing, you will have dignified the morality of your political rivals with your attention, which, if you think about it, is the least that we owe our fellow citizens.

Friday, May 06, 2016

Our perception of our body shape is very malleable - making your finger feel shorter.

Here is a neat trick. It works! (I tried it.) Ekroll et al. show that illusory visual completion of an object's invisible backside can make your finger feel shorter. Here is their summary and the central graphic from the article.

Highlights
•The experience of the hidden backsides of things acts as a real percept 
•These percepts have causal powers, although they do not correspond to real objects 
•They can evoke a bizarre illusion in which the observer’s own finger feels shrunken 
•The perceptual representation of body shape is highly malleable
Summary
In a well-known magic trick known as multiplying balls, conjurers fool their audience with the use of a semi-spherical shell, which the audience perceives as a complete ball. Here, we report that this illusion persists even when observers touch the inside of the shell with their own finger. Even more intriguingly, this also produces an illusion of bodily self-awareness in which the finger feels shorter, as if to make space for the purely illusory volume of the visually completed ball. This observation provides strong evidence for the controversial and counterintuitive idea that our experience of the hidden backsides of objects is shaped by genuine perceptual representations rather than mere cognitive guesswork or imagery.
Figure


A Well-Known Magic Trick and the Shrunken Finger Illusion
(A and B) The multiplying balls routine. The magician first holds what seems to be a single ball between his fingers (A). After a quick flick of the wrist, a second ball seems to materialize (B). In reality, the lower “ball” is a hollow semi-spherical shell, from which the real ball is pulled out.
(C and D) Schematic illustration of the shrunken finger illusion. When a semi-spherical shell is balanced on the observer’s finger as shown in (C) and viewed from above, the observer often reports perceiving the shell as a complete ball (D), while his or her finger is felt to be unusually short, as if to make space for the illusory volume of the complete ball. Note that this drawing is an exaggerated caricature of the perceptual experience. In particular, the real effect may be smaller than depicted here. In the experiments, only the middle finger was extended, while the other fingers were closed to a fist (see Figure below).

Thursday, May 05, 2016

What happens if we all live to 100?

I want to mention an interesting article by Easterbrook that has been languishing in my queue of potential posts for more than a year. It surveys numerous studies on aging and life extension and asks how long the eerily linear rise in life expectancy since 1840 (from the 40s to the 80s) can continue. Two clips:
No specific development or discovery has caused the rise: improvements in nutrition, public health, sanitation, and medical knowledge all have helped, but the operative impetus has been the “stream of continuing progress.”
One view is that increases will continue at least until life expectancy at birth surpasses 100. Jay Olshansky, a professor of public health at the University of Illinois at Chicago, disagrees, saying:
...the rise in life expectancy will “hit a wall soon, if it hasn’t already....Most of the 20th-century gains in longevity came from reduced infant mortality, and those were one time gains.” Infant mortality in the United States trails some other nations’, but has dropped so much—down to one in 170—that little room for improvement remains. “There’s tremendous statistical impact on life expectancy when the young are saved,” Olshansky says. “A reduction in infant mortality saves the entire span of a person’s life. Avoiding mortality in a young person—say, by vaccine—saves most of the person’s life. Changes in medicine or lifestyle that extend the lives of the old don’t add much to the numbers.” Olshansky calculates that if cancer were eliminated, American life expectancy would rise by only three years, because a host of other chronic fatal diseases are waiting to take its place. He thinks the 21st century will see the average life span extend “another 10 years or so,” with a bonus of more health span. Then the increase will slow noticeably, or stop.
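Olshansky's point about where life-expectancy gains come from is easy to see with back-of-the-envelope arithmetic. The toy calculation below uses illustrative numbers (not actual life tables) to compare the payoff of eliminating infant deaths at a historical rate, at today's roughly one-in-170 rate, and of adding a few years to those who survive to old age:

```python
# Toy period life-expectancy arithmetic (illustrative numbers, not real life tables).
def life_expectancy(infant_mortality, adult_lifespan, infant_death_age=0.5):
    """Mean age at death for a cohort with a single infant-mortality hazard."""
    return infant_mortality * infant_death_age + (1 - infant_mortality) * adult_lifespan

# Early-20th-century-like scenario: ~15% infant mortality, survivors living to ~70.
gain_then = life_expectancy(0.0, 70) - life_expectancy(0.15, 70)
print(f"gain from eliminating 15% infant mortality: {gain_then:.1f} years")       # ~10.4

# Modern scenario: ~1-in-170 infant mortality, survivors living to ~80.
gain_now = life_expectancy(0.0, 80) - life_expectancy(1/170, 80)
print(f"gain from eliminating 1-in-170 infant mortality: {gain_now:.2f} years")   # ~0.47

# Extending old age instead: three extra years for everyone who survives infancy
# adds roughly three years to life expectancy, about the size of Olshansky's
# eliminate-cancer estimate.
gain_old = life_expectancy(1/170, 83) - life_expectancy(1/170, 80)
print(f"gain from three extra years of old age: {gain_old:.2f} years")            # ~2.98
```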
Easterbrook's discussion of the social, economic, and political aspects of our graying future is well worth reading. The number of Americans 65 or older could reach 108 million by 2050, like adding three more Floridas inhabited entirely by seniors.
The nonpartisan think tank Third Way has calculated that at the beginning of the Kennedy presidency, the federal government spent $2.50 on public investments—infrastructure, education, and research—for every $1 it spent on entitlements. By 2022, Third Way predicts, the government will spend $5 on entitlements for every $1 on public investments. Infrastructure, education, and research lead to economic growth; entitlement subsidies merely allow the nation to tread water.

Wednesday, May 04, 2016

Semantic maps in our brains - and some interactive graphics

Huth et al. have performed functional MRI on subjects listening to hours of narrative stories to find semantic domains that seem to be consistent across individuals. This interactive 3D viewer (a preliminary version with limited data that takes a while to download and requires a fairly fast computer) shows a color coding of areas with different semantic selectivities (body part, person, place, time, outdoor, visual, tactile, violence, etc.). Here is their Nature abstract:
The meaning of language is represented in regions of the cerebral cortex collectively known as the ‘semantic system’. However, little of the semantic system has been mapped comprehensively, and the semantic selectivity of most regions is unknown. Here we systematically map semantic selectivity across the cortex using voxel-wise modelling of functional MRI (fMRI) data collected while subjects listened to hours of narrative stories. We show that the semantic system is organized into intricate patterns that seem to be consistent across individuals. We then use a novel generative model to create a detailed semantic atlas. Our results suggest that most areas within the semantic system represent information about specific semantic domains, or groups of related concepts, and our atlas shows which domains are represented in each area. This study demonstrates that data-driven methods—commonplace in studies of human neuroanatomy and functional connectivity—provide a powerful and efficient means for mapping functional representations in the brain.
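The “voxel-wise modelling” in the abstract is, at its core, a regularized regression from stimulus features to each voxel's response. The sketch below illustrates that core idea with synthetic data; the feature construction and model settings are simplifying assumptions, not the authors' actual semantic features or fitting procedure:

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)

# Stimulus features: one row per fMRI time point, one column per semantic feature
# (e.g., embedding dimensions of the words heard around that time point).
n_timepoints, n_features, n_voxels = 1000, 50, 200
stim_features = rng.normal(size=(n_timepoints, n_features))

# Synthetic voxel responses: each voxel has its own semantic tuning (weight vector).
true_tuning = rng.normal(size=(n_features, n_voxels))
bold = stim_features @ true_tuning + rng.normal(scale=2.0, size=(n_timepoints, n_voxels))

# Fit one ridge-regression model per voxel (sklearn handles all voxels at once here);
# each voxel's fitted weights describe its estimated semantic selectivity.
train, test = slice(0, 800), slice(800, None)
model = Ridge(alpha=10.0).fit(stim_features[train], bold[train])
pred = model.predict(stim_features[test])

# Prediction accuracy per voxel on held-out time points.
r = [np.corrcoef(pred[:, v], bold[test, v])[0, 1] for v in range(n_voxels)]
print(f"median held-out prediction correlation across voxels: {np.median(r):.2f}")
```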

Tuesday, May 03, 2016

Video games for Neuro-Cognitive Optimization

Continuing the MindBlog thread on brain games (cf. here), I pass on the introduction to a brief review by Mishra, Anguera, and Gazzaley on designing the next generation of closed-loop video games (CLVGs) that offer the prospect of enhancing cognition:
Humans of all ages engage deeply in game play. Game-based interactive environments provide a rich source of enjoyment, but also generate powerful experiences that promote learning and behavioral change (Pellegrini, 2009). In the modern era, software-based video games have become ubiquitous. The degree of interactivity and immersion in these video games can now be further enhanced like never before with the advent of consumer-accessible technologies like virtual reality, augmented reality, wearable physiological devices, and motion capture, all of which can be readily integrated using accessible game engines. This technological revolution presents a huge opportunity for neuroscientists to design targeted, novel game-based tools that drive positive neuroplasticity, accelerate learning, and strengthen cognitive function, and thereby promote mental wellbeing in both healthy and impaired brains.
In fact, there is now a burgeoning brain-training industry that already claims to have achieved this goal. However, many commercial claims are unsubstantiated and dismissed by the scientific community (Max Planck Institute for Human Development/Stanford Center on Longevity, 2014, Underwood, 2016). It seems prudent for us to slow down and approach this opportunity with scientific rigor and conservative optimism. Enhancing brain function should not be viewed as a clever, profitable start-up idea that can be conquered with a large marketing budget. If the field continues to be led by overinflated claims, we will jeopardize the careful and iterative process of evidence-based innovations in brain training and thereby risk throwing out the baby with the bathwater.

To strike the right balance, the path to commercialization needs to be accomplished via cutting-edge, neuroscientifically informed video game development tightly coupled with refinement and validation of the software in well-controlled empirical studies. Additionally, to separate the grain from the chaff, these studies and the claims based on them need verification and approval by independent regulatory agencies and the broader scientific community. High-level video game development and rigorous scientific validation need to become the twin pillar foundations of the next generation of closed-loop video games (CLVGs). Here, we define CLVGs as interactive video games that incorporate rapid, real-time, performance-driven, adaptive game challenges and performance feedback. The time is ideal for intensified effort in this important endeavor; CLVGs that are methodically developed and validated have the potential to benefit a broad array of disciplines in need of effective tools to enhance brain function, including education, medicine, and wellness.

Monday, May 02, 2016

Embodied Prediction - perception and mind turned upside down

Andy Clark offers a fascinating discussion and analysis of predictive processing, which turns the traditional picture of perception on its head. The embodied-mind model, which seems to me completely compelling, shows the stark inadequacy of most brain-centered models of mind and cognition. I pass on the end of his introduction and the closing paragraph of the essay. (This essay is just one of many on a fascinating website, Open Mind, which has posted 39 essays (edited by Thomas Metzinger and Jennifer Windt) by contributors who are both junior and senior members of the academic philosophy of mind field.)
Predictive processing plausibly represents the last and most radical step in a retreat from the passive, input-dominated view of the flow of neural processing. According to this emerging class of models, naturally intelligent systems (humans and other animals) do not passively await sensory stimulation. Instead, they are constantly active, trying to predict the streams of sensory stimulation before they arrive. Before an “input” arrives on the scene, these pro-active cognitive systems are already busy predicting its most probable shape and implications. Systems like this are already (and almost constantly) poised to act, and all they need to process are any sensed deviations from the predicted state. It is these calculated deviations from predicted states (known as prediction errors) that thus bear much of the information-processing burden, informing us of what is salient and newsworthy within the dense sensory barrage. The extensive use of top-down probabilistic prediction here provides an effective means of avoiding the kinds of “representational bottleneck” feared by early opponents of representation-heavy—but feed-forward dominated—forms of processing. Instead, the downward flow of prediction now does most of the computational “heavy-lifting”, allowing moment-by-moment processing to focus only on the newsworthy departures signified by salient prediction errors. Such economy and preparedness is biologically attractive, and neatly sidesteps the many processing bottlenecks associated with more passive models of the flow of information.
Action itself...then needs to be reconceived. Action is not so much a response to an input as a neat and efficient way of selecting the next “input”, and thereby driving a rolling cycle. These hyperactive systems are constantly predicting their own upcoming states, and actively moving so as to bring some of them into being. We thus act so as to bring forth the evolving streams of sensory information that keep us viable (keeping us fed, warm, and watered) and that serve our increasingly recondite ends. PP thus implements a comprehensive reversal of the traditional (bottom-up, forward-flowing) schema. The largest contributor to ongoing neural response, if PP is correct, is the ceaseless anticipatory buzz of downwards-flowing neural prediction that drives both perception and action. Incoming sensory information is just one further factor perturbing those restless pro-active seas. Within those seas, percepts and actions emerge via a recurrent cascade of sub-personal predictions forged from unconscious expectations spanning multiple spatial and temporal scales.
Conceptually, this implies a striking reversal, in that the driving sensory signal is really just providing corrective feedback on the emerging top-down predictions. As ever-active prediction engines, these kinds of minds are not, fundamentally, in the business of solving puzzles given to them as inputs. Rather, they are in the business of keeping us one step ahead of the game, poised to act and actively eliciting the sensory flows that keep us viable and fulfilled. If this is on track, then just about every aspect of the passive forward-flowing model is false. We are not passive cognitive couch potatoes so much as proactive predictavores, forever trying to stay one step ahead of the incoming waves of sensory stimulation.
Conclusion: Towards a mature science of the embodied mind
By self-organizing around prediction error, and by learning a generative rather than a merely discriminative (i.e., pattern-classifying) model, these approaches realize many of the goals of previous work in artificial neural networks, robotics, dynamical systems theory, and classical cognitive science. They self-organize around prediction error signals, perform unsupervised learning using a multi-level architecture, and acquire a satisfying grip—courtesy of the problem decompositions enabled by their hierarchical form—upon structural relations within a domain. They do this, moreover, in ways that are firmly grounded in the patterns of sensorimotor experience that structure learning, using continuous, non-linguaform, inner encodings (probability density functions and probabilistic inference). Precision-based restructuring of patterns of effective connectivity then allow us to nest simplicity within complexity, and to make as much (or as little) use of body and world as task and context dictate. This is encouraging. It might even be that models in this broad ballpark offer us a first glimpse of the shape of a fundamental and unified science of the embodied mind.
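The computational move at the heart of Clark's account (top-down predictions, with only the precision-weighted prediction errors used to update the model) can be made concrete in a few lines. This is a minimal single-level sketch under simple Gaussian assumptions, meant only to illustrate the idea rather than any particular model in the predictive-processing literature:

```python
import numpy as np

rng = np.random.default_rng(2)

# A single hidden cause generates noisy sensory samples.
true_cause = 3.0
sensory_noise = 0.5

# The agent's generative model: sensation = prediction(mu) + noise.
mu = 0.0                               # current best estimate of the hidden cause
precision = 1.0 / sensory_noise**2     # confidence assigned to sensory errors
learning_rate = 0.05

for t in range(200):
    sensation = true_cause + sensory_noise * rng.normal()
    prediction = mu                            # top-down prediction of the input
    error = sensation - prediction             # only the mismatch is "newsworthy"
    mu += learning_rate * precision * error    # update the model to explain away the error

print(f"estimate after 200 samples: {mu:.2f} (true cause = {true_cause})")
```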

Friday, April 29, 2016

The privileged fifth.

I tweeted this well-researched Op-Ed piece by Thomas Edsall the first time I read it, and after my third reading, I want to urge you to read it. I pass on two summary graphics that are part of the description of how the privileged top fifth of the U.S. population is becoming a self-perpetuating class that is steadily separating itself by geography, education, and income.


Thursday, April 28, 2016

Sleep deprivation, brain structure, and learning

Saletin et al. find that individual differences in the anatomy of the human hippocampus explain many of the differences in learning impairment after sleep loss. These structural differences also predict the subsequent EEG slow-wave activity during recovery sleep and the restoration of learning after sleep.

Significance statement
Sleep deprivation does not impact all people equally. Some individuals show cognitive resilience to the effects of sleep loss, whereas others express striking vulnerability, the reasons for which remain largely unknown. Here, we demonstrate that structural features of the human brain, specifically those within the hippocampus, accurately predict which individuals are susceptible (or conversely, resilient) to memory impairments caused by sleep deprivation. Moreover, this same structural feature determines the success of memory restoration following subsequent recovery sleep. Therefore, structural properties of the human brain represent a novel biomarker predicting individual vulnerability to (and recovery from) the effects of sleep loss, one with occupational relevance in professions where insufficient sleep is pervasive yet memory function is paramount.
Abstract
Sleep deprivation impairs the formation of new memories. However, marked interindividual variability exists in the degree to which sleep loss compromises learning, the mechanistic reasons for which are unclear. Furthermore, which physiological sleep processes restore learning ability following sleep deprivation are similarly unknown. Here, we demonstrate that the structural morphology of human hippocampal subfields represents one factor determining vulnerability (and conversely, resilience) to the impact of sleep deprivation on memory formation. Moreover, this same measure of brain morphology was further associated with the quality of nonrapid eye movement slow wave oscillations during recovery sleep, and by way of such activity, determined the success of memory restoration. Such findings provide a novel human biomarker of cognitive susceptibility to, and recovery from, sleep deprivation. Moreover, this metric may be of special predictive utility for professions in which memory function is paramount yet insufficient sleep is pervasive (e.g., aviation, military, and medicine).
For further reading on insomnia, this article notes several other studies, one of which reports lowered connectivity in several right-hemisphere brain regions in people with primary insomnia.

Wednesday, April 27, 2016

Grandiose narcissism and the U.S. presidency

Many of us are scratching our heads about what a Trump presidency might be like, particularly in regard to his most prominent personality trait: grandiose narcissism. Watts et al. have looked at the historical record to see how this trait has correlated with both positive and negative leadership behaviors in U.S. presidents up until Obama. Their abstract:
Recent research and theorizing suggest that narcissism may predict both positive and negative leadership behaviors. We tested this hypothesis with data on the 42 U.S. presidents up to and including George W. Bush, using (a) expert-derived narcissism estimates, (b) independent historical surveys of presidential performance, and (c) largely or entirely objective indicators of presidential performance. Grandiose, but not vulnerable, narcissism was associated with superior overall greatness in an aggregate poll; it was also positively associated with public persuasiveness, crisis management, agenda setting, and allied behaviors, and with several objective indicators of performance, such as winning the popular vote and initiating legislation. Nevertheless, grandiose narcissism was also associated with several negative outcomes, including congressional impeachment resolutions and unethical behaviors. We found that presidents exhibit elevated levels of grandiose narcissism compared with the general population, and that presidents’ grandiose narcissism has been rising over time. Our findings suggest that grandiose narcissism may be a double-edged sword in the leadership domain.
The two highest scorers on grandiose narcissism were Lyndon B. Johnson and Theodore Roosevelt. Richard M. Nixon scored high on "vulnerable narcissism," a trait associated with being self-absorbed and thin-skinned. From the authors' popular account of their work:
Studies in the Journal of Personality in 2013 and in Personality and Individual Differences in 2009 have shown that narcissistic individuals tend to impress others during brief interactions and to perform well in public, two attributes that lend themselves to political success. They are also willing to take risks, which can be a valuable asset in a leader.
In contrast, the psychologist W. Keith Campbell and others have found that narcissists tend to be overconfident when making decisions, to overestimate their abilities and to portray their ideas as innovative when they are not. Compared with their non-narcissistic counterparts, they are more likely to accumulate resources for themselves at others’ expense.
The psychologists Brad Bushman and Roy F. Baumeister have found that narcissists, but not people with garden-variety high self-esteem, are prone to retaliating harshly against people who have criticized them. If, for example, you present narcissists with negative feedback about essays they’ve written, they’re likely to exact revenge against their presumed essay evaluators by blasting them with loud noises (as one amusing study found).
Still other work by the psychologist Mitja Back and colleagues suggests that narcissists are generally well liked in the short term, often creating positive first impressions. Other research indicates, though, that after a while they are usually more disliked than other individuals. Their charisma tends to wear off.

Tuesday, April 26, 2016

Are we smart enough to know how smart animals are?

I want to pass on some clips from Silk's review of Frans de Waal's recent book, whose title is the title of this post:
Natural selection, he argues, shapes cognitive abilities in the same way as it shapes traits such as wing length. As animals' challenges and habitats differ, so do their cognitive abilities. This idea, which he calls evolutionary cognition, has gained traction in psychology and biology in the past few decades.
For de Waal, evolutionary cognition has two key consequences. First, it is inconsistent with the concept of a 'great chain of being' in which organisms can be ordered from primitive to advanced, simple to complex, stupid to smart. Name a 'unique' human trait, and biologists will find another organism with a similar one. Humans make and use tools; so do wild New Caledonian crows (Corvus moneduloides). Humans develop cultures; so do humpback whales (Megaptera novaeangliae), which socially transmit foraging techniques. We can mentally 'time travel', remembering past events and planning for the future; so can western scrub jays (Aphelocoma californica), which can recall what they had for breakfast on one day, anticipate whether they will be given breakfast the next and selectively cache food when breakfast won't be delivered.
Furthermore, humans do not necessarily outdo other animals in all cognitive domains. Black-capped chickadees (Poecile atricapillus) store seeds in hundreds of locations each day, and can remember what they stored and where, as well as whether items in each location have been eaten, or stolen. Natural selection has favoured those prodigious feats of memory because they spell the difference between surviving winter and starving before spring. Human memory doesn't need to be as good: primates evolved in the tropics. “In the utilitarian view of biology,” de Waal argues, “animals have the brains they need — nothing more, nothing less.”
The second consequence of de Waal's view is that there is continuity across taxa. One source of continuity is based on evolutionary history: natural selection modifies traits to create new ones, producing commonalities among species with a common history. He points out that tool use is found not just in humans and chimpanzees, but also in other apes and monkeys, implying that relevant cognitive building blocks are shared across all primates. Continuity is also generated by convergent evolution, which produces similar traits in distantly related organisms such as New Caledonian crows and capuchin monkeys. De Waal opines that continuity “ought to be the default position for at least all mammals, and perhaps also birds and other vertebrates”.
...researchers are eager to understand what is distinctly human; some are driven by curiosity about how humans came to dominate the planet...Our success presumably has something to do with the emergence of a unique suite of cognitive traits...De Waal recognizes only one such trait: our rich and flexible system of symbolic communication, and our ability to exchange information about past and future. His commitment to the principle of continuity forces him to discount the importance of language for human cognition because of evidence of thinking by non-linguistic creatures. And he ignores compelling findings from linguists and developmental psychologists such as Elizabeth Spelke on the formative role of language in cognition.
A more satisfying book would leave readers with a clearer understanding of why, a few million years after our lineage diverged from the lineage of chimpanzees, we are the ones reading this book, and not them.

Monday, April 25, 2016

Essential role of default mode network in higher cognitive processing.

The respective roles of attentional and default mode networks in our brains have been the subject of numerous MindBlog posts (enter 'default mode' in the search box in the left column). A summary article by Bola and Borchardt notes an important recent contribution by Vatansever et al., whose abstract is shown below, followed by a graphic from the summary article. Their work changes the previous view that the default mode network disengages during goal-directed tasks.

ABSTRACT
The default mode network (DMN) has been traditionally assumed to hinder behavioral performance in externally focused, goal-directed paradigms and to provide no active contribution to human cognition. However, recent evidence suggests greater DMN activity in an array of tasks, especially those that involve self-referential and memory-based processing. Although data that robustly demonstrate a comprehensive functional role for DMN remains relatively scarce, the global workspace framework, which implicates the DMN in global information integration for conscious processing, can potentially provide an explanation for the broad range of higher-order paradigms that report DMN involvement. We used graph theoretical measures to assess the contribution of the DMN to global functional connectivity dynamics in 22 healthy volunteers during an fMRI-based n-back working-memory paradigm with parametric increases in difficulty. Our predominant finding is that brain modularity decreases with greater task demands, thus adapting a more global workspace configuration, in direct relation to increases in reaction times to correct responses. Flexible default mode regions dynamically switch community memberships and display significant changes in their nodal participation coefficient and strength, which may reflect the observed whole-brain changes in functional connectivity architecture. These findings have important implications for our understanding of healthy brain function, as they suggest a central role for the DMN in higher cognitive processing.
SIGNIFICANCE STATEMENT
The default mode network (DMN) has been shown to increase its activity during the absence of external stimulation, and hence was historically assumed to disengage during goal-directed tasks. Recent evidence, however, implicates the DMN in self-referential and memory-based processing. We provide robust evidence for this network's active contribution to working memory by revealing dynamic reconfiguration in its interactions with other networks and offer an explanation within the global workspace theoretical framework. These promising findings may help redefine our understanding of the exact DMN role in human cognition.
Graphic from Review

Schematic representation of the main findings of Vatansever et al. Community representation and colors are in the style of Figures 1 and 3 in the article by Vatansever et al. (2015), and the DMN is represented by Community 4. In the low-demanding 0-back condition, the network was highly modular (high Q index) and was divided into four distinct modules. With the increasing cognitive load, the modularity of the network decreased, and three communities merged into one. Thus, while local segregation was prevalent in the low-demanding task, increasing cognitive effort was associated with more pronounced global integration.
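The two graph measures the abstract relies on, modularity and the nodal participation coefficient, have simple closed forms. The sketch below computes both from an adjacency matrix; the toy network and community labels are made up for illustration:

```python
import numpy as np

def modularity(A, communities):
    """Newman modularity Q for an undirected, weighted adjacency matrix A."""
    k = A.sum(axis=1)                     # node strengths
    two_m = A.sum()                       # twice the total edge weight
    same = np.equal.outer(communities, communities)
    return ((A - np.outer(k, k) / two_m) * same).sum() / two_m

def participation_coefficient(A, communities):
    """P_i = 1 - sum over modules s of (k_is / k_i)^2; high P marks connector nodes."""
    k = A.sum(axis=1)
    p = np.ones(len(A))
    for s in np.unique(communities):
        k_is = A[:, communities == s].sum(axis=1)
        p -= (k_is / k) ** 2
    return p

# Toy network: two 3-node cliques joined by a single bridging edge (node 2 <-> node 3).
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
communities = np.array([0, 0, 0, 1, 1, 1])

print(f"Q = {modularity(A, communities):.2f}")          # segregated modules give higher Q
print("participation:", np.round(participation_coefficient(A, communities), 2))
# The bridging nodes 2 and 3 have the highest participation coefficients.
```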

Friday, April 22, 2016

How to attract others.

Well, duh... Interesting, but talk about demonstrating the obvious! From Vacharkulksemsuk et al.:
Across two field studies of romantic attraction, we demonstrate that postural expansiveness makes humans more romantically appealing. In a field study (n = 144 speed-dates), we coded nonverbal behaviors associated with liking, love, and dominance. Postural expansiveness—expanding the body in physical space—was most predictive of attraction, with each one-unit increase in coded behavior from the video recordings nearly doubling a person’s odds of getting a “yes” response from one’s speed-dating partner. In a subsequent field experiment (n = 3,000), we tested the causality of postural expansion (vs. contraction) on attraction using a popular Global Positioning System-based online-dating application. Mate-seekers rapidly flipped through photographs of potential sexual/date partners, selecting those they desired to meet for a date. Mate-seekers were significantly more likely to select partners displaying an expansive (vs. contractive) nonverbal posture. Mediation analyses demonstrate one plausible mechanism through which expansiveness is appealing: Expansiveness makes the dating candidate appear more dominant. In a dating world in which success sometimes is determined by a split-second decision rendered after a brief interaction or exposure to a static photograph, single persons have very little time to make a good impression. Our research suggests that a nonverbal dominance display increases a person’s chances of being selected as a potential mate.
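The phrase "nearly doubling a person's odds" is an odds ratio from a logistic model. A quick illustration of how an odds ratio of about 2 per unit of expansiveness translates into "yes" probabilities, using a made-up baseline rather than the study's data:

```python
# An odds ratio near 2 per one-unit increase corresponds to a logistic coefficient
# of about log(2) ~= 0.69. Illustrative baseline: a 20% chance of a "yes".
baseline_p = 0.20
odds_ratio = 2.0

baseline_odds = baseline_p / (1 - baseline_p)
for units in range(4):
    odds = baseline_odds * odds_ratio ** units   # odds double with each unit of expansiveness
    p = odds / (1 + odds)                        # convert odds back to a probability
    print(f"expansiveness +{units}: odds = {odds:.2f}, P(yes) = {p:.0%}")
```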

Thursday, April 21, 2016

Impulsivity, sensation seeking, and substance use correlate with reduced brain cortical thickness.

From Holmes et al.:
Individuals vary widely in their tendency to seek stimulation and act impulsively, early developing traits with genetic origins. Failures to regulate these behaviors increase risk for maladaptive outcomes including substance abuse. Here, we explored the neuroanatomical correlates of sensation seeking and impulsivity in healthy young adults. Our analyses revealed links between sensation seeking and reduced cortical thickness that were preferentially localized to regions implicated in cognitive control, including anterior cingulate and middle frontal gyrus (n = 1015). These associations generalized to self-reported motor impulsivity, replicated in an independent group (n = 219), and correlated with heightened alcohol, tobacco, and caffeine use. Critically, the relations between sensation seeking and brain structure were evident in participants without a history of alcohol or tobacco use, suggesting that observed associations with anatomy are not solely a consequence of substance use. These results demonstrate that individual differences in the tendency to seek stimulation, act on impulse, and engage in substance use are correlated with the anatomical structure of cognitive control circuitry. Our findings suggest that, in healthy populations, covariation across these complex multidimensional behaviors may in part originate from a common underlying biology.

Wednesday, April 20, 2016

Metaphorical conflict shapes social perception when the spatial and the ideological collide.

Kleiman et al. do some intriguing experiments. I give you their abstract first, which doesn't actually say how they did the experiments, and then give you some further description from their text. The abstract:
In the present article, we introduce the concept of metaphorical conflict—a conflict between the concrete and abstract aspects of a metaphor. We used the association between the concrete (spatial) and abstract (ideological) components of the political left-right metaphor to demonstrate that metaphorical conflict has marked implications for cognitive processing and social perception. Specifically, we showed that creating conflict between a spatial location and a metaphorically linked concept reduces perceived differences between the attitudes of partisans who are generally viewed as possessing fundamentally different worldviews (Democrats and Republicans). We further demonstrated that metaphorical conflict reduces perceived attitude differences by creating a mind-set in which categories are represented as possessing broader boundaries than when concepts are metaphorically compatible. These results suggest that metaphorical conflict shapes social perception by making members of distinct groups appear more similar than they are generally thought to be. These findings have important implications for research on conflict, embodied cognition, and social perception.
In the first experiment they asked subjects to categorize a series of pictures of Barack Obama and Mitt Romney. One group categorized the Romney pictures with their right hand (the P key) and the Obama pictures with their left hand (the Q key), compatible with the right-wing/left-wing political metaphor. A second group was asked to identify Obama with their right hand and Romney with their left; in this case the physical action and the candidate's ideology were metaphorically incompatible. The interesting result was that:
...participants in the incompatible condition perceived the difference between the candidates’ ideologies as smaller than did participants in the compatible condition...Additionally, participants in the incompatible condition perceived the difference between the candidates’ stances on specific political issues as smaller than did participants in the compatible condition
A second experiment asked participants to estimate the ideology of the typical Democrat and Republican using a scale of 1 to 9 that was either compatible or incompatible with the metaphorical association linking spatial locations to political ideologies.
Participants assigned to the incompatible condition (n = 194) provided their response on a horizontally displayed scale with the values in the opposite sequence, that is, from 1 (extremely conservative) to 9 (extremely liberal). Note that this scale reversed the traditional spatial assignment and placed liberal views on the right and conservative views on the left, which metaphorically puts the physical location and ideology in conflict... consistent with predictions, participants who rated their perceptions on the incompatible scale perceived the typical Republican’s and typical Democrat’s attitudes as more similar than did participants who rated their perceptions on the compatible scale.
Two further control experiments were done.

Tuesday, April 19, 2016

Political polarization and prejudice.

Yesterday's post dealt with softening prejudicial attitudes toward transgender people. This is relevant to prejudice arising from the right-versus-left political polarization that continues to increase in this country. From a recent NYTimes Op-Ed piece by Arthur Brooks:
Thirty-eight percent of Democrats have a “very unfavorable” view of Republicans, and 43 percent of Republicans hold that view of Democrats. About half of “consistently liberal” Americans say most of their friends share their views, and about a third say it’s important to live in a place where that is so. For those who are “consistently conservative,” these preferences are even more pronounced.
...the average American is becoming more ideologically predictable. A Pew Research Center study from 2014 shows that the share of Americans with “consistently conservative” or “consistently liberal” views has more than doubled in the last two decades to 21 percent from 10 percent...In 1994, nearly 40 percent of Republicans were more liberal than the median Democrat, and 30 percent of Democrats were more conservative than the median Republican. Today, those numbers have plummeted to 8 percent and 6 percent.
This polarization has led to political discrimination that studies have shown to be stronger than racial discrimination.
...Bigotry’s cousin is contempt...Watch and listen to politically polarized commentary today, and you will see that it is more contemptuous than angry, overflowing with sneering, mockery and disgust.
So what’s the antidote? I asked the Dalai Lama, one of the world’s experts on bringing people together. He made two points. First, the solution starts not with institutions, but with individuals. We look too much to political parties or Congress to make progress, but not nearly enough at our own behavior...You can’t single-handedly change the country, but you can change yourself. By declaring your independence from the bitterness washing over our nation, you can strike a small blow for greater national unity.
Second, each of us must aspire to what the Dalai Lama calls “warmheartedness” toward those with whom we disagree. This might sound squishy, but it is actually tough and practical advice. As he has stated, “I defeat my enemies when I make them my friends.” He is not advocating surrender to the views of those with whom we disagree. Liberals should be liberals and conservatives should be conservatives. But our duty is to be respectful, fair and friendly to all, even those with whom we have great differences.
Yesterday's post on changing prejudice suggests a further technique for reconciliation: active or analogic perspective taking. This is essentially imagining a situation in which you felt contempt from others, and also putting yourself in the shoes of others, imagining their concerns, etc.

Monday, April 18, 2016

How to change prejudice...for real this time

John Bohannon summarizes the interesting story of two researchers who, after finding that a study on reversing homophobia was based on fake data, went on to show that the effect claimed by the fraudulent study was real after all. Broockman and Kalla used a technique developed by the Los Angeles LGBT Center:
...the LGBT Center has its canvassers follow one called “analogic perspective taking.” By inviting someone to discuss an experience in which that person was perceived as different and treated unfairly, a canvasser tries to generate sympathy for the suffering of another group—such as gay or transgender people.
Here is the abstract:
Existing research depicts intergroup prejudices as deeply ingrained, requiring intense intervention to lastingly reduce. Here, we show that a single approximately 10-minute conversation encouraging actively taking the perspective of others can markedly reduce prejudice for at least 3 months. We illustrate this potential with a door-to-door canvassing intervention in South Florida targeting antitransgender prejudice. Despite declines in homophobia, transphobia remains pervasive. For the intervention, 56 canvassers went door to door encouraging active perspective-taking with 501 voters at voters’ doorsteps. A randomized trial found that these conversations substantially reduced transphobia, with decreases greater than Americans’ average decrease in homophobia from 1998 to 2012. These effects persisted for 3 months, and both transgender and nontransgender canvassers were effective. The intervention also increased support for a nondiscrimination law, even after exposing voters to counterarguments.

Friday, April 15, 2016

Brain correlates of how the risk taking of others influences our own risk taking

From Suzuki et al., another upstairs/downstairs story. Risk is represented in the caudate nucleus (downstairs), while learning about the risk preferences of others engages the dorsolateral prefrontal cortex (upstairs). The strength of the functional coupling between these areas determines how susceptible our own risk taking is to the influence of others.
Our attitude toward risk plays a crucial role in influencing our everyday decision-making. Despite its importance, little is known about how human risk-preference can be modulated by observing risky behavior in other agents at either the behavioral or the neural level. Using fMRI combined with computational modeling of behavioral data, we show that human risk-preference can be systematically altered by the act of observing and learning from others’ risk-related decisions. The contagion is driven specifically by brain regions involved in the assessment of risk: the behavioral shift is implemented via a neural representation of risk in the caudate nucleus, whereas the representations of other decision-related variables such as expected value are not affected. Furthermore, we uncover neural computations underlying learning about others’ risk-preferences and describe how these signals interact with the neural representation of risk in the caudate. Updating of the belief about others’ preferences is associated with neural activity in the dorsolateral prefrontal cortex (dlPFC). Functional coupling between the dlPFC and the caudate correlates with the degree of susceptibility to the contagion effect, suggesting that a frontal–subcortical loop, the so-called dorsolateral prefrontal–striatal circuit, underlies the modulation of risk-preference. Taken together, these findings provide a mechanistic account for how observation of others’ risky behavior can modulate an individual’s own risk-preference.
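As a hedged sketch of the kind of "computational modeling of behavioral data" referred to here: risk preference is often formalized as a utility that weights the variance of a gamble, with choices generated by a softmax, and contagion modeled as the observer's risk parameter drifting toward a learned estimate of the other agent's. The functional forms and numbers below are generic illustrative assumptions, not the model actually fitted by Suzuki et al.:

```python
import numpy as np

rng = np.random.default_rng(3)

def utility(ev, variance, rho):
    """Mean-variance utility: rho > 0 means risk-seeking, rho < 0 risk-averse."""
    return ev + rho * variance

def p_choose_gamble(ev_g, var_g, sure_amount, rho, temperature=1.0):
    """Softmax probability of picking a gamble over a sure payoff."""
    u_gamble = utility(ev_g, var_g, rho)
    u_sure = utility(sure_amount, 0.0, rho)
    return 1.0 / (1.0 + np.exp(-(u_gamble - u_sure) / temperature))

# The observer starts mildly risk-averse; the observed trader is risk-seeking.
rho_self, rho_other_est = -0.05, 0.0
rho_other_true = 0.10
learning_rate, contagion_rate = 0.2, 0.003

for trial in range(100):
    ev_g, var_g, sure = 10.0, rng.uniform(5, 20), 9.0
    # Watch the other agent choose, and update the estimate of their risk preference
    # in proportion to the prediction error about their choice.
    other_chose_gamble = rng.random() < p_choose_gamble(ev_g, var_g, sure, rho_other_true)
    predicted = p_choose_gamble(ev_g, var_g, sure, rho_other_est)
    rho_other_est += learning_rate * (other_chose_gamble - predicted) * var_g * 0.01
    # Contagion: the observer's own preference drifts toward the estimated preference.
    rho_self += contagion_rate * (rho_other_est - rho_self)

print(f"estimated other's risk preference: {rho_other_est:.3f}")
print(f"observer's own risk preference after observation: {rho_self:.3f} (started at -0.05)")
```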

Thursday, April 14, 2016

Aging brains - more physical activity, more gray matter, less Alzheimer's.

I like to pass on any work I see relevant to exercise, aging, and the brain. The following is from Raji et al.
BACKGROUND: Physical activity (PA) can be neuroprotective and reduce the risk for Alzheimer's disease (AD). In assessing physical activity, caloric expenditure is a proxy marker reflecting the sum total of multiple physical activity types conducted by an individual. 
OBJECTIVE: To assess caloric expenditure, as a proxy marker of PA, as a predictive measure of gray matter (GM) volumes in the normal and cognitively impaired elderly persons. 
METHODS: All subjects in this study were recruited from the Institutional Review Board approved Cardiovascular Health Study (CHS), a multisite population-based longitudinal study in persons aged 65 and older. We analyzed a sub-sample of CHS participants 876 subjects (mean age 78.3, 57.5% F, 42.5% M) who had i) energy output assessed as kilocalories (kcal) per week using the standardized Minnesota Leisure-Time Activities questionnaire, ii) cognitive assessments for clinical classification of normal cognition, mild cognitive impairment (MCI), and AD, and iii) volumetric MR imaging of the brain. Voxel-based morphometry modeled the relationship between kcal/week and GM volumes while accounting for standard covariates including head size, age, sex, white matter hyperintensity lesions, MCI or AD status, and site. Multiple comparisons were controlled using a False Discovery Rate of 5 percent. 
RESULTS: Higher energy output, from a variety of physical activity types, was associated with larger GM volumes in frontal, temporal, and parietal lobes, as well as hippocampus, thalamus, and basal ganglia. High levels of caloric expenditure moderated neurodegeneration-associated volume loss in the precuneus, posterior cingulate, and cerebellar vermis. 
CONCLUSION: Increasing energy output from a variety of physical activities is related to larger gray matter volumes in the elderly, regardless of cognitive status.
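The "False Discovery Rate of 5 percent" in the methods refers to controlling the expected proportion of false positives across the many voxel-wise tests, typically via the Benjamini-Hochberg procedure. A minimal sketch of that procedure (illustrative only, run on simulated p-values rather than the study's data):
```python
# Minimal sketch of False Discovery Rate control via the Benjamini-Hochberg
# procedure, the usual way an "FDR of 5 percent" is enforced across many
# simultaneous tests (here, one p-value per voxel). The p-values are simulated.
import numpy as np

def benjamini_hochberg(p_values, q=0.05):
    """Boolean mask of which tests survive FDR control at level q."""
    p = np.asarray(p_values)
    m = p.size
    order = np.argsort(p)
    ranked = p[order]
    # Find the largest rank k with p_(k) <= (k/m) * q; reject everything up to k.
    below = ranked <= (np.arange(1, m + 1) / m) * q
    survivors = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()
        survivors[order[:k + 1]] = True
    return survivors

# Hypothetical example: 10,000 "voxels", 500 of which carry a true effect.
rng = np.random.default_rng(1)
p_null = rng.uniform(size=9500)            # pure-noise voxels
p_signal = rng.beta(0.5, 20, size=500)     # effect voxels, skewed toward small p
mask = benjamini_hochberg(np.concatenate([p_null, p_signal]), q=0.05)
print(mask.sum(), "voxels declared significant at an FDR of 5 percent")
```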

Wednesday, April 13, 2016

Distraction in the digital era…what about since 1710?

I want to pass on some clips from an interesting essay by Frank Furedi, "The Ages of Distraction."
The rise of the internet and the widespread availability of digital technology have surrounded us with endless sources of distraction: texts, emails and Instagrams from friends, streaming music and videos, ever-changing stock quotes, news and more news. To get our work done, we could try to turn off the digital stream, but that’s difficult to do when we’re plagued by FOMO, the modern fear of missing out. Some people think that our willpower is so weak because our brains have been damaged by digital noise. But blaming technology for the rise in inattention is misplaced. History shows that the disquiet is fueled not by the next new thing but by the threat this thing – whatever it might be – poses to the moral authority of the day.
The first time inattention emerged as a social threat was in 18th-century Europe, during the Enlightenment, just as logic and science were pushing against religion and myth. The Oxford English Dictionary cites a 1710 entry from Tatler as its first reference to this word, coupling inattention with indolence; both are represented as moral vices of serious public concern.
The recent decades have seen a dramatic reversal in the conceptualization of inattention. Unlike in the 18th century when it was perceived as abnormal, today inattention is often presented as the normal state. The current era is frequently characterized as the Age of Distraction, and inattention is no longer depicted as a condition that afflicts a few. Nowadays, the erosion of humanity’s capacity for attention is portrayed as an existential problem, linked with the allegedly corrosive effects of digitally driven streams of information relentlessly flowing our way.
The perception of an Age of Distraction is related to our uncertainty about the answer to the question of ‘attention to what or to whom’. The sublimation of anxieties about moral authority through the fetish of technologically driven distraction has acquired pathological proportions in relation to children and young people. Yet as most sensible observers understand, children who are inattentive to their teachers are often obsessively attentive to the text messages that they receive. The constant lament about inattentive youth in the Anglo-American world could be interpreted as a symptom of problems related to the exercise of adult authority.
Often the failure to inspire and capture the imagination of young people is blamed on their inattentive state of mind. Too often educators have responded to this condition by adopting a fatalistic approach of accommodating to the supposed inattentive reading practices of digital natives. This pattern is evident in higher education, where the assumption that college students can no longer be expected to read long and challenging texts or pay attention to serious lectures has led to the adaptation of course material to the inattentive mentality of the digital native. Calls to change the educational environment to ‘fit the student’ have become widespread in higher education.
How different from the reaction of moral philosophers such as Dugald Stewart, also concerned with the problem of the inattentive student. Author of Outlines of Moral Philosophy: For the Use of Students in the University of Edinburgh (1793), Stewart believed that the problem of inattention could be overcome through moral education. Unlike some contemporary academics, he regarded the ‘early habit of inattention’ as a problem to be solved rather than an unalterable fact of existence. The French philosopher Helvétius fervently believed that everyone had the potential to acquire ‘continued attention’ and ‘triumph over indolence’.
Regrettably, the optimism of Helvétius has given way to a mood of resignation. Attention is still seen as desirable but almost impossible to achieve. As one alarmist account warns, ‘an epidemic erosion of attention is a sure sign of an impending dark age’. Helvétius would have been distressed by the fatalism expressed in this lament.

Tuesday, April 12, 2016

The evolutionary origins of smiles, laughter, and tears.

Graziano suggests that our smile originated in the defensive reaction of monkeys to other monkeys moving into their personal space. He then proceeds to make just-so stories about simian origins of our laughing and crying. To begin, imagine Monkey B steps into the personal space of Monkey A.
Monkey A squints, protecting his eyes. His upper lip pulls up. This does expose the teeth, but only as a side-effect: in a defensive reaction, the point of the curled lip is not to prepare for a biting attack so much as it is to bunch the facial skin upward, further padding the eyes in folds of skin...The head pulls down and the shoulders pull up to protect the vulnerable throat and jugular....The torso curves forward to protect the abdomen...Monkey B can learn a lot by watching the reaction of Monkey A...And so the stage is set for a social signal to evolve: natural selection will favour monkeys that can read the cringe reactions of their peers and adjust their behaviour accordingly...If Monkey B can glean useful information by watching Monkey A, then it’s useful for Monkey A to manipulate that information and influence Monkey B. Evolution therefore favours monkeys that can, in the right circumstances, pantomime a defensive reaction. It helps to convince others that you’re non-threatening. Finally we see the origin of the smile: a briefly flashed imitation of a defensive stance.
In people, the smile has been pared down to little more than its facial components — the lifting of the upper lip, the upward bunching of the cheeks, the squint. These days we use it mainly to communicate a friendly lack of aggression rather than outright subservience...We can’t help feeling warmer towards someone who beams that Duchenne smile.
On laughing:
...chimps have something like laughter: they open their mouths and make short exhalations during play fights, or if someone tickles them. Gorillas and orangutans do the same. The psychologist Marina Ross compared the noises made by different species of ape and found that it was the sound of bonobos at play that comes closest to human laughter, again, when play-fighting or tickling. All of which makes it seem quite likely that the original type of human laughter also emerged from, yes, play-fighting and tickling.
On crying:
My best guess, strange as it might sound, is that our ancestors were in the habit of punching each other on the nose. Such injuries would have resulted in copious tear production...According to a recent analysis by David Carrier and Michael Morgan of the University of Utah, the shape of human facial bones might well have evolved to withstand the physical trauma of frequent punching. Thickly buttressed facial bones are first seen in fossils of Australopithecus, which appeared following our split with chimpanzees...the reason we weep now may well be that our ancestors discussed their differences by hitting each other in the face. Some of us still do, I suppose.
In any event, the entire behavioural display that we call crying – the tear production, the squinting, the raised upper lip, the repeated alarm calls – makes for a useful signifier. Evolution would have favoured animals that reacted to it with an emotional desire to dispense comfort.
Graziano's speculative summary:
An age-old defensive mechanism, a mechanism that monitors bubbles of space around the body and organises protective movements, suddenly takes flight in the hyper-social world of primates, spinning into smiles and laughter and crying and cringeing. Each one of those behaviours then splits further, branching into a whole codebook of signals for use in different social circumstances. Not all of human expression can be explained in this way, but much of it can. A Duchenne smile, a cold smile, laughter at a joke, laughter that acknowledges a clever witticism, cruel laughter, a cringe to show servility, standing straight to show confidence, the arms-crossed expression of suspicion, the arms-open expression of welcome, tilting your head as a sign of surrender to a lover, the fleeting crinkling of the face that hints at crying as we show sympathy for some sad story, or a full blown sobbing jag: this whole vast range of expression could well have emerged from a protective sensory-motor loop that has nothing to do with communication. Evolution is bizarre.

Monday, April 11, 2016

Another list - "Keys to happiness"

The New York Times has compiled a simple list of pointers to basic articles and research on well-being. I'm passing on a few of the items from a condensed version of that list, rearranged in roughly reverse order to reflect not their importance but how seldom they seem to be acted on. So, keys to happiness:

Don't obsess about it, and don't overdo it.

If all else fails, fake it.

Gratitude helps.

Make friends, family, and weekends a priority.

Be healthy.

Friday, April 08, 2016

A succinct list of some of our common psychological errors.

I want to point to Belsky's article on why we think we are better decision makers under uncertainty than we really are. He summarizes several common errors:

The sunk cost fallacy - hanging on to a decision, or an investment, in an unconscious desire to justify it.

Loss aversion - reacting more strongly to loss of a resource (time, goods, or money) than to a similar gain.

Overconfidence - overrating our abilities, knowledge, and skill (two thirds of investors rate their financial sophistication as advanced, but barely pass a financial literacy exam.)

Optimism bias - seemingly hard-wired into our brains because it is evolutionarily useful, driving humans to strive in the face of long odds.

Hindsight bias - rewriting history to make ourselves look good, as in misremembering our forecasts in a way that makes us look smarter.

Attribution bias - attributing good outcomes to our own skills, but bad outcomes to causes over which we had no control.

Confirmation bias - giving too much weight to information that supports our existing beliefs and discounting that which does not.

Thursday, April 07, 2016

Muscle mass and nerve control enhanced in octogenarian athletes.

Power et al. expand their earlier studies of active runners around age 65, finding ~14% more excitable muscle mass and ~28% more functioning motor units in octogenarian masters athletes than in healthy age-matched controls.
Our group has shown a greater number of functioning motor units (MUs) in a cohort of highly active older (~65 y) masters runners relative to age-matched controls. Owing to the precipitous loss in the number of functioning MUs in the 8th and 9th decades of life, it is unknown whether world-class octogenarian masters athletes (MAs) would also have greater motor unit number estimates (MUNE) compared with age-matched controls. We measured MU numbers and neuromuscular transmission stability in the tibialis anterior of world champion MAs (~80 y) and compared the values to healthy age-matched controls (~80 y). Decomposition-enhanced spike-triggered averaging was used to collect surface and intramuscular electromyography signals during dorsiflexion at ~25% of maximum voluntary isometric contraction (MVC). Near fibre (NF) MU potential analysis was used to assess neuromuscular transmission stability. For the MAs as compared with age-matched controls, the amount of excitable muscle mass (CMAP) was 14% greater (p < 0.05), there was a trend (p = 0.07) towards a 27% smaller surface-detected motor unit potential - representative of less collateral reinnervation - and there were 28% more functioning MUs (p < 0.05). Additionally, the MAs had greater MU neuromuscular stability than the controls, as indicated by lower NF jitter and jiggle values (p < 0.05). These results demonstrate that high-performing octogenarians better maintain neuromuscular stability of the MU and mitigate the loss of MUs associated with aging well into the later decades of life, during which time the loss of muscle mass and strength becomes functionally relevant. Future studies need to identify the concomitant roles genetics and exercise play in neuroprotection.
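For readers unfamiliar with the jargon, the motor unit number estimate (MUNE) is, at its core, a simple ratio: the size of the maximal whole-muscle response (CMAP) divided by the mean size of the individual surface-detected motor unit potentials that the decomposition samples. A tiny sketch of that arithmetic (all amplitude values are invented; real protocols use negative-peak amplitude or area and many methodological safeguards):
```python
# Minimal sketch of the arithmetic behind a motor unit number estimate (MUNE):
# the maximal whole-muscle response (CMAP) divided by the mean size of the
# surface-detected motor unit potentials (S-MUPs) sampled from that muscle.
# All amplitudes are invented for illustration.
import numpy as np

def mune(cmap_amplitude_mv, smup_amplitudes_uv):
    """MUNE = CMAP size / mean S-MUP size, with both in the same units."""
    mean_smup_mv = np.mean(smup_amplitudes_uv) / 1000.0
    return cmap_amplitude_mv / mean_smup_mv

# Hypothetical athlete vs. control: a larger CMAP and a smaller average S-MUP
# (less collateral reinnervation) both push the estimated MU count upward.
athlete = mune(cmap_amplitude_mv=6.8, smup_amplitudes_uv=[55, 48, 62, 50, 58])
control = mune(cmap_amplitude_mv=6.0, smup_amplitudes_uv=[70, 85, 66, 90, 74])
print(round(athlete), round(control))
```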

Wednesday, April 06, 2016

Why sad music can make us feel good.

As an update to a previous MindBlog post on why we like sad music, I want to note Ojiaku's brief mention of several articles on this subject.
Sad music might make people feel vicarious unpleasant emotions, found a study published last year in Frontiers in Psychology. But this experience can ultimately be pleasurable because it allows a negative emotion to exist indirectly, and at a safe distance. Instead of feeling the depths of despair, people can feel nostalgia for a time when they were in a similar emotional state: a non-threatening way to remember a sadness.
People who are very empathetic are more likely to take pleasure in the emotional experience of sad music, according to another study in Frontiers in Psychology. Others enjoy sad songs because they help them return to an emotionally balanced state, according to a review in Frontiers in Human Neuroscience, published in 2015. And those more open to varied experiences might enjoy the songs because the unique emotions that come up when listening to the music fulfill their need for novelty in thoughts and feelings.
From the Frontiers in Human Neuroscience abstract:
We offer a framework to account for how listening to sad music can lead to positive feelings, contending that this effect hinges on correcting an ongoing homeostatic imbalance. Sadness evoked by music is found pleasurable: (1) when it is perceived as non-threatening; (2) when it is aesthetically pleasing; and (3) when it produces psychological benefits such as mood regulation, and empathic feelings, caused, for example, by recollection of and reflection on past events.

Tuesday, April 05, 2016

The Social Gene

I want to pass on some clips from Joseph Swift's review of a book, "The Society of Genes" by Yanai and Lercher, which updates Richard Dawkins's classic "The Selfish Gene," published 40 years ago. (Their title reminds me of "Society of Mind," a classic book published in 1986 by Marvin Minsky, who recently died at age 88.)
Genetic research has moved rapidly since the publication of Richard Dawkins's The Selfish Gene 40 years ago. In the intervening years, we have come to realize that many of the most interesting and important phenomena in human biology are not caused by any single gene. Processes like the immune system's ability to recognize infection, or the timing of our sleep-wake cycle, for example, are the product of many genes working together in a highly integrated way. Citing a wealth of recent research that explores the ways genes work together to produce complex biological processes, Itai Yanai and Martin Lercher argue that it is time to embrace a new, more holistic, metaphor in their book, The Society of Genes.
Rather than focus on any one gene, Yanai and Lercher invite the reader to step back and observe how genes assemble together to make a global genetic system, or genome. From here, one can see that the labor within the genome is not divided equally. Whereas many genes encode for proteins that perform a single monotonous task, such as breaking down a certain type of sugar or producing a specific skin pigment, there are others that serve such fundamental roles that their removal would lead to the crumbling of the genomic society altogether. Among the latter group are genes that manage the behavior of a host of other genes.
When genes are mismanaged by their masters, organisms can be transformed in dramatic ways. For example, in humans, when SOX9 fails to direct its wide range of subordinates succinctly, sex reversal and skeletal malformations can occur.
Given that catastrophic things tend to happen when genes don't work together properly, changes to how the genomic society is run are a rare occurrence. When genes with new abilities evolve, Darwinian selection determines whether they will join the ranks as productive members of society. Our ancestors obtained genes that could interpret light as color and a gene for a more efficient oxygen-carrying hemoglobin in this very way.
And then there are the genes that don't contribute to society at all. Instead, they secure their position by hijacking the system. The LINE1 gene, for example, encodes only for its own dispersal, copying and pasting itself throughout our genome while providing the society with no clear benefit. The “bad behavior” of genes amounts to scandal in the genomic society, and learning about their exploits is one of the most enjoyable elements of reading the book.
There are even genes that work to ensure the survival of individual cells within an organism by wreaking havoc on others. In fruit flies, for example, a pair of genes involved in sperm production work in concert to produce both a poison and its antidote. The toxic compound is released from the cell, while the antidote is retained. In this way, surrounding sperm cells without the gene pair are killed. On reading about such systems, one begins to realize that it's not quite right to imagine our genome as some idealized republic. This is a society that is easily compromised from within its own ranks.
In the years since The Selfish Gene was published, the human genome has been sequenced, along with the genomes of many other species. Indeed, probing one's own genes is beginning to become routine. Thus, The Society of Genes represents a timely and welcome handbook for navigating this postgenomic era.

Monday, April 04, 2016

Pterostilbene anti-aging supplement - undesirable side effects

In a January 15 MindBlog post, I noted the start of my most recent foray into supplements meant to have salutary effects on energy, mind, body, longevity, etc., giving some references to work on pterostilbene, a resveratrol cousin whose added methyl groups allow more rapid absorption after ingestion and slow its removal by the liver. I want to report now on my experience with Elysium’s pills containing 125 mg pterostilbene and 25 mg nicotinamide riboside (one per day, instead of the two recommended). I emphasize that this is a single report; people doubtless vary in their sensitivity.

I used the half dose because my previous 2008 experiment with resveratrol was terminated after 19 days because of increasing arthritic symptoms, especially in my hands, which disappeared within a week after stopping the supplement. The MindBlog post reporting this result received 33 comments noting side effects such as arthritic symptoms, foot and finger soreness and stiffness, sleep disturbance, and joint pain. All of these effects are consistent with possible immune system activation and inflammation.

On starting one pill a day, I thought I noticed after a few days a subtle increase in body energy, a slightly more benign and positive temperament (there are a few reports of anxiolytic effects of low doses of pterostilbene), and, most interesting to me, a decrease in rumination and mind wandering relative to focused attention. By day 30, increasing stiffness in my fingers and body movements was obvious. I stopped the pills, and the stiffness disappeared over the next few days. After seven days off, I started one pill a day again. After four days, stiffness and arthritic symptoms had clearly increased in my hands. On stopping the pills again, the stiffness disappeared over the next few days.

So, I guess that’s it for me on the resveratrol or pterostilbene trip. Pity… chemical studies noting their desirable effects are quite compelling. I do think Elysium and other vendors of these products should state their possible side effects.

Friday, April 01, 2016

Are we all sexists and racists?

I want to point to Miller's review of work by Levanon et al., as well as other studies, showing that when women have moved into occupations in large numbers, those jobs have begun paying less even after controlling for education, work experience, skills, race and geography.
A striking example is to be found in the field of recreation — working in parks or leading camps — which went from predominantly male to female from 1950 to 2000. Median hourly wages in this field declined 57 percentage points, accounting for the change in the value of the dollar, according to a complex formula used by Professor Levanon. The job of ticket agent also went from mainly male to female during this period, and wages dropped 43 percentage points...The same thing happened when women in large numbers became designers (wages fell 34 percentage points), housekeepers (wages fell 21 percentage points) and biologists (wages fell 18 percentage points). The reverse was true when a job attracted more men. Computer programming, for instance, used to be a relatively menial role done by women. But when male programmers began to outnumber female ones, the job began paying more and gained prestige.
In the vein of work described in a previous MindBlog post, an excellent article by Ojiaku, a former Neuroscience Graduate Student at the University of Wisconsin, gives extensive references to work demonstrating that our unconscious racism starts early, and creates a deadly empathy gap. The studies Ojiaku cites:
...showed racially biased differences in cognitive and emotion-related brain regions, including the anterior cingulate cortex (ACC). One of the ACC’s functions is registering when you experience your own pain or empathy for another person’s pain. In the study, Chinese and Caucasian college students were shown video clips of both Chinese and Caucasian faces either in pain or not in pain as scientists conducted brain scans. The researchers measured increased ACC activity in the brains of those viewing painful expressions on the faces belonging to their own race, but decreased ACC activity when viewing pain in another race, uncovering a racially biased difference in empathetic response to pain in the brain.
Another study:
...asked participants to view video images of white, black, and violet-illuminated (for racially neutral) hands being pricked with a needle. While watching the prick, the volunteers were tested for their empathetic response via transcranial magnetic stimulation (TMS); the greater the reaction to the stimulation, the higher the empathetic response to the pain...Interestingly enough, both black and white participants had an adequately empathetic response to seeing the violet hand being pricked. However, all of the participants – both black and white – failed to react as strongly to the pain of someone who was outside their racial group. The study also found that people who scored higher in racial bias on the IAT (implicit association test) – meaning that they showed more implicit preferences for faces belonging to their own race – also showed less reactivity to pain experienced by someone from another race.
The last straw:
...is a study from the University of Iowa published in Psychological Science in February 2016. Incredibly, the researchers found that when white test subjects were primed with photos of five-year-old black boys, they were far more likely to mistake objects such as toys for guns – or even to claim to see guns when there were none. In sharp contrast, when subjects were primed with photos of five-year-old white boys before seeing the objects, the effect reversed: they were more likely to mistake guns for toys. These findings are ominous for black children, because they show that youth does not mitigate their potential to become targets of racist events, as in the case of 12-year-old Tamir Rice, a young black boy carrying a toy ‘BB gun’ in a park in Cleveland, Ohio, whom police shot less than two seconds after arriving on the scene.

Thursday, March 31, 2016

Another social science, Economics, looks at itself.

MindBlog recently noted a large study that tested the replicability of studies in psychology journals; only 36% of the findings were repeated. The quality of results has also been questioned in fields such as medicine, neuroscience, and genetics. Camerer et al. have now tested the replicability of experiments published in top-tier economics journals, finding that 11 of the 18 studies examined (61%) yielded a significant effect in the same direction as the original, with replicated effect sizes averaging about two-thirds of the originals. Their abstract:
The replicability of some scientific findings has recently been called into question. To contribute data about replicability in economics, we replicated 18 studies published in the American Economic Review and the Quarterly Journal of Economics between 2011 and 2014. All of these replications followed predefined analysis plans that were made publicly available beforehand, and they all have a statistical power of at least 90% to detect the original effect size at the 5% significance level. We found a significant effect in the same direction as in the original study for 11 replications (61%); on average, the replicated effect size is 66% of the original. The replicability rate varies between 67% and 78% for four additional replicability indicators, including a prediction market measure of peer beliefs.
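The requirement of "a statistical power of at least 90% to detect the original effect size at the 5% significance level" translates into a sample-size calculation made before each replication is run. A rough sketch of that calculation for a two-group comparison, using the standard normal approximation (the effect size below is a hypothetical example, not one of the 18 studies):
```python
# Rough sketch of the power calculation behind "90% power to detect the
# original effect size at the 5% significance level" for a two-group design,
# using the normal approximation. The effect size is a hypothetical example.
from scipy.stats import norm

def n_per_group(effect_size_d, alpha=0.05, power=0.90):
    """Approximate sample size per group for a two-sided two-sample test."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_power = norm.ppf(power)
    return 2 * ((z_alpha + z_power) / effect_size_d) ** 2

# If an original study reported a standardized effect of d = 0.4, replicating
# it with 90% power needs roughly 131 subjects per group:
print(round(n_per_group(0.4)))
```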

Wednesday, March 30, 2016

Overkill in techno-aids for 'Mens Sana in Corpore Sano'

None of us would argue with the 'sound mind in a sound body' injunction from Juvenal’s Latin satires (~100 AD), a goal that can be accomplished by diligent pursuit of a few simple activities. Two NYTimes articles note how modern technology manages, for a profit, to vastly encumber that pursuit.

With regard to 'sound mind,' Gelles notes:
The other morning, I woke up and brewed a cup of Mindful Lotus tea ($6 for 20 bags). On the subway, I loaded the Headspace app on my iPhone and followed a guided mindfulness exercise ($13 a month for premium content). Later in the day, I dropped by Mndfl, a meditation studio in Greenwich Village ($20 for a 30-minute class)...There are more than two dozen mindfulness apps for smartphones, some offering $400 lifetime subscriptions. The Great Courses has two mindfulness packages, each with a couple of dozen DVDs for $250. For an enterprising contemplative, it’s never been easier to make a buck...On a recent trip to Whole Foods, near the kombucha, I came across a new product from the health food maker Earth Balance: a dairy-free mayonnaise substitute called Mindful Mayo ($4.50 a jar). Then, in line, I picked up a copy of Mindful magazine ($6)....With so many mindful goods and services for sale, it can be easy to forget that mindfulness is a quality of being, not a piece of merchandise
...with so many cashing in on the meditation craze, it’s hard not to wonder whether something essential is being lost...Increasingly, mindfulness is being packaged as a one-minute reprieve, an interlude between checking Instagram and starting the next episode of “House of Cards.” One company proclaims it has found the “minimum effective dose” of meditation that will change your life. On Amazon, you can pick up “One-Minute Mindfulness: 50 Simple Ways to Find Peace, Clarity, and New Possibilities in a Stressed-Out World.” Dubious courses promise to help people “master mindfulness” in a few weeks.
More often than not, however, the people I know who take time to meditate — carefully observing thoughts, emotions and sensations — are sincere in their aspirations to become less stressed, more accepting and at least a little happier.
Hutchinson discusses the more-than-billion-dollar market for body-fitness aids (which more than half of buyers stop using within six months of purchase), suggesting:
...a more fundamental question about our rapid adoption of wearable fitness tech: Is the data we collect with these devices actually useful?...Last September, in The British Journal of Sports Medicine, Australian researchers published a review of studies that compared subjective and objective measures of “athlete well-being” during training. The objective measures included state-of-the-art monitoring of heart rate, blood, hormones and more; the subjective measure boiled down to asking the athletes how they felt. The results were striking: The researchers found that as the athletes worked out, their own perception registered changes in training stress with “superior sensitivity and consistency” to the high tech measures...running with a GPS watch “slackens the bond between perception and action.” In other words, when you’re running, instead of speeding up or slowing down based on immediate and intuitive feedback from your body and environment, you’re inserting an unwieldy extra cognitive step that relies on checking your device as you go.
On the positive side:
Health researchers also want to use your tracked data to figure out what works in the real world to improve health and fitness, rather than testing theories in the artificial conditions of the lab. An analysis of in-the-wild data from 4.2 million MyFitnessPal users, for example, recently yielded unexpected insights into the habits of successful weight-losers compared with unsuccessful ones: They ate nearly a third more fiber, and 11 percent less meat. And the dietary changes the successful dieters made between 2014 and 2015 bucked broader trends: They consumed more grains, cereal and raw fruit, but fewer eggs.
As prosaic as it sounds, this is the greatest promise of the wearables revolution. Once the novelty of tracking your exercise habits wears off, knowing how many steps you took today or what your resting heart rate was yesterday soon loses its interest. But together, 100 million of us wearing wristbands could uncover some truly valuable insights into what works to make us healthier and fitter.
Perhaps the most effective and simple way to increase aerobic fitness: use a jump rope!

Tuesday, March 29, 2016

Storing long term emotional memories.

A Journal of Neuroscience precis of an article by Cambiaghi et al., slightly edited:
When we remember events, we often also remember what we were feeling at the time. Cambiaghi et al. asked where in the brain we store such connections. To answer this, they conditioned rats to associate a tone with an unpleasant experience. They then simultaneously recorded from two brain regions, the higher-order auditory cortex and the amygdala, 1 day and 1 month after the conditioning. Animals displayed fearful behavior at both time points, and both areas showed learning-evoked changes. However, the two brain regions only interacted significantly after 1 month had passed (the cue increased the synchrony of their firing). The degree of interaction predicted the animals' ability to recognize the tone as unpleasant.

Monday, March 28, 2016

Screenagers - brain executive function immediately diminished by television

An article by Jolly points to interesting work by Lillard and Peterson. Their summary:
The goal of this research was to study whether a fast-paced television show immediately influences preschool-aged children's executive function (e.g., self-regulation, working memory). Sixty 4-year-olds were randomly assigned to watch a fast-paced television cartoon or an educational cartoon, or to draw, for 9 minutes. They were then given 4 tasks tapping executive function, including the classic delay-of-gratification and Tower of Hanoi tasks. Parents completed surveys regarding television viewing and the child's attention. Children who watched the fast-paced television cartoon performed significantly worse on the executive function tasks than children in the other 2 groups when controlling for child attention, age, and television exposure. Just 9 minutes of viewing a fast-paced television cartoon had immediate negative effects on 4-year-olds' executive function. Parents should be aware that fast-paced television shows could at least temporarily impair young children's executive function.
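The phrase "when controlling for child attention, age, and television exposure" describes an ANCOVA-style analysis: executive-function scores are compared across the three randomized conditions with the covariates included in the model. A sketch of what that analysis looks like (the data frame below is simulated stand-in data, not the authors' dataset):
```python
# Sketch of the kind of analysis described above: compare executive-function
# scores across the three randomized conditions while adjusting for the listed
# covariates (an ANCOVA-style linear model). The data are simulated stand-ins.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(3)
n = 60
df = pd.DataFrame({
    "condition": rng.choice(["fast_cartoon", "educational_cartoon", "drawing"], size=n),
    "attention": rng.normal(0, 1, size=n),       # parent-rated attention
    "age_months": rng.integers(48, 60, size=n),  # 4-year-olds
    "tv_hours_week": rng.uniform(0, 20, size=n),
})
# Simulated composite executive-function score, with a deficit in the fast-paced group.
df["ef_score"] = rng.normal(0, 1, size=n) - 0.8 * (df["condition"] == "fast_cartoon")

model = smf.ols("ef_score ~ C(condition) + attention + age_months + tv_hours_week",
                data=df).fit()
print(anova_lm(model, typ=2))  # does condition predict EF after adjusting for covariates?
```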

Friday, March 25, 2016

Our progression towards a nation of rich and poor

A fascinating and foreboding piece by Rank and Hirschl describes their development of an "economic risk calculator," available at riskcalculator.org.
The idea behind our approach is similar to the idea behind a doctor’s ability to predict your risk of heart disease. Using several pieces of information (blood pressure, cholesterol, etc.), your doctor can make a reasonable estimate of your chances of having a heart attack in the next 10 years. These numbers are based on statistical patterns derived from a very large sample of families that make up the Framingham Heart Study, the longitudinal study of cardiovascular health that began in 1948.
Our predictions of economic risk work in a similar way. Using hundreds of thousands of case records taken from a longitudinal study of Americans that began in 1968, we estimate the likelihood — based on factors like race, education, marital status and age — of an individual’s falling below the official poverty line during the next five, 10 or 15 years. (The poverty line for a family of four in 2015 was approximately $24,000.)
Take someone ... who is in his or her later 30s, white, not married, with an education beyond high school. It turns out that the 15-year risk of poverty for such a person is actually 32 percent. In other words, one-third of such individuals will experience at least one year below the poverty line in the not-so-distant future...between the ages of 20 and 75, nearly 60 percent of Americans will spend at least one year below the official poverty line, and three-quarters will experience a year below 150 percent of the poverty line.
We are in danger of becoming an economically polarized society in which a small percentage of the population is free from economic risk, while a vast majority of Americans will encounter poverty as a normal part of life.
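The quoted passages describe the general logic of a Framingham-style risk score: fit a statistical model, on longitudinal records, that maps a handful of demographic predictors to the probability of falling below the poverty line within some horizon, then query it for a given profile. A toy sketch of that logic (this is not Rank and Hirschl's model or data; the synthetic records, predictors, and coefficients exist only to show the shape of the computation):
```python
# Toy illustration of a Framingham-style risk calculator: a logistic model
# mapping a few demographic predictors to the probability of falling below
# the poverty line within 15 years. This is NOT Rank and Hirschl's model or
# data; the synthetic records and coefficients exist only to show the idea.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 50_000

# Synthetic "longitudinal" records: age, years of education, married (0/1),
# plus a made-up true risk process used to simulate outcomes.
age = rng.integers(20, 76, size=n)
education_years = rng.integers(8, 21, size=n)
married = rng.integers(0, 2, size=n)
true_logit = 1.5 - 0.03 * age - 0.20 * (education_years - 12) - 0.8 * married
fell_below_poverty_15y = rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))

X = np.column_stack([age, education_years, married])
model = LogisticRegression(max_iter=1000).fit(X, fell_below_poverty_15y)

# Query the fitted model the way the online calculator is queried,
# e.g. for a 38-year-old, unmarried person with 14 years of education.
prob = model.predict_proba([[38, 14, 0]])[0, 1]
print(f"Estimated 15-year poverty risk: {prob:.0%}")
```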