Wednesday, June 15, 2016

Our (Bare) Shelves, Our Selves

As is the case with many people moving through their 70s, I am having to downsize my surroundings. The 1861 stone schoolhouse converted to a residence that has been my Madison, WI home for the past 26 years goes on the market next week as my husband and I contract into a smaller condo near the university for the 4-5 summer months we spend away from Fort Lauderdale. Old record, tape, CD, and book collections that have been a part of my extended ego are being discarded or massively downsized. It feels like a series of amputations, even though for years my reading and music listening have not required any of these objects; the content is instead downloaded (Amazon Kindle, iPad) or streamed from the internet (Apple Music, Pandora, Google Play, etc.). My valued music CDs have been transferred to iTunes. The visual richness and emotions evoked by the history of my filled book shelves are hardly matched by the two devices that now perform their functions, an iPad and a wireless speaker. This feeling of loss is why a recent Op-Ed piece by Teddy Wayne, with the same title as this post, resonated with me. The transition I am describing is occurring in the homes of children growing up with parents who have moved from books and CDs to Kindle and streaming. In such settings there are fewer random walks through a book, record, or CD collection that turn up novel material; you look instead for what you think you already want. The final paragraphs of Wayne's essay:
Poking through physical artifacts, as I did with those Beatles records, is archival and curatorial; it forces you to examine each object slowly, perhaps sample it and come across a serendipitous discovery.
Scrolling through file names on a device, on the other hand, is what we do all day long, often mindlessly, in our quest to find whatever it is we’re already looking for as rapidly as possible. To see “The Beatles” in a list of hundreds of artists in an iTunes database is not nearly as arresting as holding the album cover for “Sgt. Pepper’s Lonely Hearts Club Band.”
Consider the difference between listening to music digitally versus on a record player or CD. On the former, you’re more likely to download or stream only the singles you want to hear from an album. The latter requires enough of an investment — of acquiring it, but also of energy in playing it — that you stand a better chance of committing and listening to the entire album.
If I’d merely clicked on the first MP3 track of “Sgt. Pepper’s” rather than removed the record from its sleeve, placed it in the phonograph and carefully set the needle over it, I may have become distracted and clicked elsewhere long before the B-side “Lovely Rita” played.
And what of sentiment? Jeff Bezos himself would have a hard time defending the nostalgic capacity of a Kindle .azw file over that of a tattered paperback. Data files can’t replicate the lived-in feel of a piece of beloved art. To a child, a parent’s dog-eared book is a sign of a mind at work and of the personal significance of that volume.
A crisp JPEG of the cover design on a virtual shelf, however, looks the same whether it’s been reread 10 times or not at all. If, that is, it’s ever even seen.

Tuesday, June 14, 2016

Vision reconstructs causal history from static shapes.

From Chen and Scholl:
The perception of shape, it has been argued, also often entails the perception of time. A cookie missing a bite, for example, is seen as a whole cookie that was subsequently bitten. It has never been clear, however, whether such observations truly reflect visual processing. To explore this possibility, we tested whether the perception of history in static shapes could actually induce illusory motion perception. Observers watched a square change to a truncated form, with a “piece” of it missing, and they reported whether this change was sudden or gradual. When the contours of the missing piece suggested a type of historical “intrusion” (as when one pokes a finger into a lump of clay), observers actually saw that intrusion occur: The change appeared to be gradual even when it was actually sudden, in a type of transformational apparent motion. This provides striking phenomenological evidence that vision involves reconstructing causal history from static shapes.

Monday, June 13, 2016

A postdictive illusion of choice.

Bear and Bloom report a simple experiment showing how we can feel as if we make a choice before the time at which this choice is actually made.
Do people know when, or whether, they have made a conscious choice? Here, we explore the possibility that choices can seem to occur before they are actually made. In two studies, participants were asked to quickly choose from a set of options before a randomly selected option was made salient. Even when they believed that they had made their decision prior to this event, participants were significantly more likely than chance to report choosing the salient option when this option was made salient soon after the perceived time of choice. Thus, without participants’ awareness, a seemingly later event influenced choices that were experienced as occurring at an earlier time. These findings suggest that, like certain low-level perceptual experiences, the experience of choice is susceptible to “postdictive” influence and that people may systematically overestimate the role that consciousness plays in their chosen behavior.
From their text:
In the first experiment participants viewed five white circles that appeared in random positions on a computer screen and were asked to try to quickly choose one of these circles “in their head” before one of the circles turned red. After a circle turned red, participants indicated whether they had chosen the red circle, had chosen a circle that did not turn red, or had not had enough time to choose a circle before one of them turned red.
Because the red circle is selected randomly on all trials, people performing this task should choose the red circle on approximately 20% of the trials in which they claim to have had time to make a choice if they are, in fact, making their choices before a circle turns red (and they are not biased to report choosing the red circle for some other reason). In contrast, a postdictive model predicts that people could consciously experience having made a choice before a circle turned red even though this choice did, in fact, occur after a circle turned red and was influenced by that event. Specifically, this could happen if a circle turns red soon enough to bias a person’s choice unconsciously (e.g., by subliminally capturing visual attention..), but this person completes the choice before becoming conscious of the circle’s turning red. On the other hand, if there is a relatively long delay until a circle turns red, a person would be more likely to have finished making a choice before even unconsciously processing a circle’s turning red; hence, this event would be less likely to bias the choice.
In a second experiment, we explored whether postdiction could occur in a slightly different paradigm, in which participants chose one of two different-colored circles. We used two, rather than five, choice options in this experiment to control for a worry that the time-dependent bias we observed in Experiment 1 could have been driven by low-confidence responding. If participants were less confident in choices they made more quickly, they might have been prone to choose relatively randomly between the “y” and “n” response options in short-delay trials. Such a random pattern of responding would have biased participants’ reports of choosing the red circle toward .50 (because there were only two response options), and would have resulted in greater-than-chance reports of choosing the red circle for these shorter delays (because chance was .20 in Experiment 1). By making chance .50 in this experiment, we eliminated any concern that random responding could yield the time-dependent pattern of bias that we observed in Experiment 1.
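A toy simulation (my own sketch, not the authors' analysis) makes the chance-baseline logic concrete. With five circles, a choice genuinely completed before the color change should match the red circle about 20% of the time; a postdictive account, in which the color change sometimes captures a still-unfinished choice on short-delay trials, pushes reports above that baseline. The capture probability below is an arbitrary illustrative value.

```python
import random

def simulate_trials(n_trials=100_000, n_circles=5, p_capture=0.3, seed=1):
    """Fraction of trials on which the participant reports choosing the circle
    that turned red, under a genuinely pre-made choice vs. a postdictive one."""
    rng = random.Random(seed)
    premade_hits = postdictive_hits = 0
    for _ in range(n_trials):
        red = rng.randrange(n_circles)                   # red circle picked at random
        premade_hits += rng.randrange(n_circles) == red  # choice made before the change
        # Postdictive model (short delay): with probability p_capture the color
        # change unconsciously captures a choice that is not yet finished.
        choice = red if rng.random() < p_capture else rng.randrange(n_circles)
        postdictive_hits += choice == red
    print(f"pre-made choices match the red circle: {premade_hits / n_trials:.1%} (chance = 1/{n_circles})")
    print(f"postdictive model at short delays:     {postdictive_hits / n_trials:.1%}")

simulate_trials()
```

Anything reliably above the 20% baseline on short-delay trials, as the authors report, is the signature of the later event leaking into the "earlier" choice.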

Friday, June 10, 2016

Reducing stress-induced inflammatory disease with bacteria.

Interesting work from Reber et al.:

Significance
The hygiene, or “old friends,” hypothesis proposes that lack of exposure to immunoregulatory microorganisms in modern urban societies is resulting in an epidemic of inflammatory disease, as well as psychiatric disorders in which chronic, low-level inflammation is a risk factor. An important determinant of immunoregulation is the microbial community occupying the host organism, collectively referred to as the microbiota. Here we show that stress disrupts the homeostatic relationship between the microbiota and the host, resulting in exaggerated inflammation. Treatment of mice with a heat-killed preparation of an immunoregulatory environmental microorganism, Mycobacterium vaccae, prevents stress-induced pathology. These data support a strategy of “reintroducing” humans to their old friends to promote optimal health and wellness.
Abstract
The prevalence of inflammatory diseases is increasing in modern urban societies. Inflammation increases risk of stress-related pathology; consequently, immunoregulatory or antiinflammatory approaches may protect against negative stress-related outcomes. We show that stress disrupts the homeostatic relationship between the microbiota and the host, resulting in exaggerated inflammation. Repeated immunization with a heat-killed preparation of Mycobacterium vaccae, an immunoregulatory environmental microorganism, reduced subordinate, flight, and avoiding behavioral responses to a dominant aggressor in a murine model of chronic psychosocial stress when tested 1–2 wk following the final immunization. Furthermore, immunization with M. vaccae prevented stress-induced spontaneous colitis and, in stressed mice, induced anxiolytic or fear-reducing effects as measured on the elevated plus-maze, despite stress-induced gut microbiota changes characteristic of gut infection and colitis. Immunization with M. vaccae also prevented stress-induced aggravation of colitis in a model of inflammatory bowel disease. Depletion of regulatory T cells negated protective effects of immunization with M. vaccae on stress-induced colitis and anxiety-like or fear behaviors. These data provide a framework for developing microbiome- and immunoregulation-based strategies for prevention of stress-related pathologies.

Thursday, June 09, 2016

Unethical amnesia

From Kouchaki and Gino:
Despite our optimistic belief that we would behave honestly when facing the temptation to act unethically, we often cross ethical boundaries. This paper explores one possibility of why people engage in unethical behavior over time by suggesting that their memory for their past unethical actions is impaired. We propose that, after engaging in unethical behavior, individuals’ memories of their actions become more obfuscated over time because of the psychological distress and discomfort such misdeeds cause. In nine studies (n = 2,109), we show that engaging in unethical behavior produces changes in memory so that memories of unethical actions gradually become less clear and vivid than memories of ethical actions or other types of actions that are either positive or negative in valence. We term this memory obfuscation of one’s unethical acts over time “unethical amnesia.” Because of unethical amnesia, people are more likely to act dishonestly repeatedly over time.

Wednesday, June 08, 2016

The attention economy

I pass on some clips from an essay by Tom Chatfield:
How many other things are you doing right now while you’re reading this piece? Are you also checking your email, glancing at your Twitter feed, and updating your Facebook page? What five years ago David Foster Wallace labelled ‘Total Noise’ — ‘the seething static of every particular thing and experience, and one’s total freedom of infinite choice about what to choose to attend to’ — is today just part of the texture of living on a planet that will, by next year, boast one mobile phone for each of its seven billion inhabitants. We are all amateur attention economists, hoarding and bartering our moments…
Much as corporations incrementally improve the taste, texture and sheer enticement of food and drink by measuring how hard it is to stop eating and drinking them, the actions of every individual online are fed back into measures where more inexorably means better: more readers, more viewers, more exposure, more influence, more ads, more opportunities to unfurl the integrated apparatus of gathering and selling data. Attention, thus conceived, is an inert and finite resource, like oil or gold: a tradable asset that the wise manipulator auctions off to the highest bidder, or speculates upon to lucrative effect. There has even been talk of the world reaching ‘peak attention’, by analogy to peak oil production, meaning the moment at which there is no more spare attention left to spend.
There’s a reductive exaltation in defining attention as the contents of a global reservoir, slopping interchangeably between the brains of every human being alive. Where is the space, here, for the idea of attention as a mutual construction more akin to empathy than budgetary expenditure — or for those unregistered moments in which we attend to ourselves, to the space around us, or to nothing at all?
From the loftiest perspective of all, information itself is pulling the strings: free-ranging memes whose ‘purposes’ are pure self-propagation, and whose frantic evolution outstrips all retrospective accounts…consider yourself as interchangeable as the button you’re clicking, as automated as the systems in which you’re implicated. Seen from such a height, you signify nothing beyond your recorded actions…in making our attentiveness a fungible asset, we’re not so much conjuring currency out of thin air as chronically undervaluing our time.
We watch a 30-second ad in exchange for a video; we solicit a friend’s endorsement; we freely pour sentence after sentence, hour after hour, into status updates and stock responses. None of this depletes our bank balances. Yet its cumulative cost, while hard to quantify, affects many of those things we hope to put at the heart of a happy life: rich relationships, rewarding leisure, meaningful work, peace of mind.
What kind of attention do we deserve from those around us, or owe to them in return? What kind of attention do we ourselves deserve, or need, if we are to be ‘us’ in the fullest possible sense? These aren’t questions that even the most finely tuned popularity contest can resolve. Yet, if contentment and a sense of control are partial measures of success, many of us are selling ourselves far too cheap.

Tuesday, June 07, 2016

A redefinition of health and well-being for older adults

McClintock et al. take a more comprehensive approach to defining health and find some interesting new categories. The healthiest people are obese and robust!

Significance
Health has long been conceived as not just the absence of disease but also the presence of physical, psychological, and social well-being. Nonetheless, the traditional medical model focuses on specific organ system diseases. This representative study of US older adults living in their homes amassed not only comprehensive medical information but also psychological and social data and measured sensory function and mobility, all key factors for independent living and a gratifying life. This comprehensive model revealed six unique health classes, predicting mortality/incapacity. The healthiest people were obese and robust; two new classes, with twice the mortality/incapacity, were people with healed broken bones or poor mental health. This approach provides an empirical method for broadly reconceptualizing health, which may inform health policy.
Abstract
The World Health Organization (WHO) defines health as a “state of complete physical, mental and social well-being and not merely the absence of disease or infirmity.” Despite general acceptance of this comprehensive definition, there has been little rigorous scientific attempt to use it to measure and assess population health. Instead, the dominant model of health is a disease-centered Medical Model (MM), which actively ignores many relevant domains. In contrast to the MM, we approach this issue through a Comprehensive Model (CM) of health consistent with the WHO definition, giving statistically equal consideration to multiple health domains, including medical, physical, psychological, functional, and sensory measures. We apply a data-driven latent class analysis (LCA) to model 54 specific health variables from the National Social Life, Health, and Aging Project (NSHAP), a nationally representative sample of US community-dwelling older adults. We first apply the LCA to the MM, identifying five health classes differentiated primarily by having diabetes and hypertension. The CM identifies a broader range of six health classes, including two “emergent” classes completely obscured by the MM. We find that specific medical diagnoses (cancer and hypertension) and health behaviors (smoking) are far less important than mental health (loneliness), sensory function (hearing), mobility, and bone fractures in defining vulnerable health classes. Although the MM places two-thirds of the US population into “robust health” classes, the CM reveals that one-half belong to less healthy classes, independently associated with higher mortality. This reconceptualization has important implications for medical care delivery, preventive health practices, and resource allocation.

Monday, June 06, 2016

Why do we feel awe?

I want to point to an article by Dacher Keltner on the functions of awe that appeared on the Slate website, alongside other articles sponsored by the John Templeton Foundation. Here are clips describing a few of the experiments he mentions.
A new science is now asking “Why awe?” This is a question we can approach in two ways. First we can consider the long, evolutionary view: Why did awe become part of our species’ emotional repertoire during seven million years of hominid evolution? A preliminary answer is that awe binds us to social collectives and enables us to act in more collaborative ways that enable strong groups, thus improving our odds for survival.
For example, in one study from our Berkeley lab, my colleague Michelle Shiota had participants fill in the blank of the following phrase: “I AM ____.” They did so 20 times, either while standing before an awe-inspiring replica of a T. rex skeleton in UC Berkeley’s Museum of Paleontology or in the exact same place but oriented to look down a hallway, away from the T. rex. Those looking at the dinosaur were more likely to define their individual selves in collectivist terms—as a member of a culture, a species, a university, a moral cause. Awe embeds the individual self in a social identity.
Near Berkeley’s Museum of Paleontology stands a grove of eucalyptus trees, the tallest in North America. When you gaze up at these trees, with their peeling bark and surrounding nimbus of grayish green light, goosebumps may ripple down your neck, a sure sign of awe...my colleague Paul Piff staged a minor accident near that grove to see if awe would prompt greater kindness...Participants first either looked up into the tall trees for one minute—long enough for them to report being filled with awe—or oriented 90 degrees away to look up at the facade of a large science building. They then encountered a person who stumbled, dropping a handful of pens into the dirt. Sure enough, the participants who had been gazing up at the awe-inspiring trees picked up more pens. Experiencing awe seemed to make them more inclined to help someone in need. They also reported feeling less entitled and self-important than the other study participants did.

Friday, June 03, 2016

Is humanity getting better?

I want to pass on a few clips from a stimulating essay by Leif Wenar, who suggests "The real trick to understanding our world is to see it with both eyes at once. The world now is a thoroughly awful place — compared with what it should be. But not compared with what it was. Keeping both eyes open gives depth to our perception of our own time in history, and makes us better able to see where paths to more progress may be open.":
The 20th century marked an inflection point — the beginning of humanity’s transition from its ancient crises of ignorance to its modern crises of invention. Our science is now so penetrating, our systems are so robust, that we are mostly endangered by our own creations...Our transportation networks are now so fast and far-flung that they transmit diseases worldwide before cures can catch up. The next epidemics will play on our strengths, not our weaknesses — fighting them will mean canceling flights, not killing fleas. This Horseman of the Apocalypse has dismounted and now travels coach.
Indeed, our machines have multiplied so much that a new crisis looms because of the smoke coming off them as they combust. Future food crises, if they come, will be driven by anthropogenic climate change. Famine will descend not from the wrath of God but from the growth of gross domestic product. We ourselves are outfitting the Horsemen of the future, or perhaps it’s better to say that we are creating them...Whether humans can overcome their coming crises of invention will turn on the philosopher’s old question of whether individuals are essentially good or evil, which is a hard question — but recent news will tempt many thumbs to turn downward.
A more positive answer emerges if we switch to a systems perspective, evaluating humanity as a whole as we would an ecosystem or a complex machine. What happens when humanity “adds” energy to itself ... as it’s done massively in the transition from wood and muscle power to fossil fuels and alternatives?..Something is happening to our species, and especially over the last 70 years. The years since 1945 have seen many horrors...Yet this has also been the most prosperous time in human history by far. And by a long way the time with the greatest increase in democracy around the world. It has also been the most peaceful era in recorded human history. As Joshua Goldstein puts it in “Winning the War on War,” “We have avoided nuclear wars, left behind world war, nearly extinguished interstate war, and reduced civil wars to fewer countries with fewer casualties.” Goldstein continues:
In the first half of the twentieth century, world wars killed tens of millions and left whole continents in ruins. In the second half of that century, during the Cold War, proxy wars killed millions, and the world feared a nuclear war that could have wiped out our species. Now, in the early twenty-first century, the worst wars, such as Iraq, kill hundreds of thousands. We fear terrorist attacks that could destroy a city, but not life on the planet. The fatalities still represent a large number and the impacts of wars are still catastrophic for those caught in them, but overall, war has diminished dramatically.
...the big picture of postwar history shows significant improvements in nearly all indicators of lived human experience. The average life span of humans is today longer than it has ever been. A smaller proportion of women die in childbirth than ever before. Child malnutrition is at its lowest level ever, while literacy rates worldwide have never been higher. Most impressive has been the recent reduction in severe poverty — the reduction in the percentage of humans living each day on what a tall Starbucks coffee costs in America. During a recent 20-year stretch the mainstream estimate is that the percentage of the developing world living in such extreme poverty shrank by more than half, from 43 to 21 percent.
Humanity does learn, painfully and often only after thousands or even millions have died ...humanity learns as identities alter to become less aggressive and more open, so that networks can connect individual capacities more effectively and join our resources together.
What we take for granted frames the size of our concerns. We’ve come to expect that mayors and police chiefs will not endorse, much less order, the lynching of minorities. Within that frame, racial profiling and deaths in police custody are top priorities. After decades, we’ve come to expect enduring peace among the great powers. Within that frame any military action by a major power, or a civil war in a resource-rich state, rightly becomes top news.
We can’t relax; the upward trends in time’s graphs may crest at any point. Yet batting away the positive facts is lazy, and requires only a lower form of intelligence.

Thursday, June 02, 2016

Telephone metadata does reveal personal details.

Susan Landau reviews work by Mayer et al. showing that President Obama's statement:
When it comes to telephone calls, nobody is listening to your telephone calls. That’s not what this program is about. As was indicated, what the intelligence community is doing is looking at phone numbers and durations of calls. They are not looking at people’s names, and they’re not looking at content.
...is misleading. Mayer et al. convincingly show that call detail records (CDRs) are personally revelatory...
...it can be determined that someone is suffering from a multiple sclerosis relapse, having cardiac arrhythmia problems, seeking to buy an automatic rifle, intending to start a marijuana-growing venture, or having an abortion.
Needless to say, such insight puts communications surveillance law even more in flux. Here is the abstract from Mayer et al.:
Since 2013, a stream of disclosures has prompted reconsideration of surveillance law and policy. One of the most controversial principles, both in the United States and abroad, is that communications metadata receives substantially less protection than communications content. Several nations currently collect telephone metadata in bulk, including on their own citizens. In this paper, we attempt to shed light on the privacy properties of telephone metadata. Using a crowdsourcing methodology, we demonstrate that telephone metadata is densely interconnected, can trivially be reidentified, and can be used to draw sensitive inferences.

Wednesday, June 01, 2016

A science of consciousness carnival in Tucson.

I started teaching my Biology of Mind course at the University of Wisconsin in the early 1990s, and was inspired and stimulated by attending the first of the "Towards a Science of Consciousness" meetings in Tucson, AZ. There I met Daniel Dennett, who was very encouraging about my starting my book "The Biology of Mind," which was published in 1999. I attended the following two meetings but then dropped out, as pseudoscience, wild speculation about quantum mechanics and consciousness, and New Age gibble-gabble continued to make up a substantial fraction of the program, and fewer serious scientists were attending. George Johnson's review of the most recent meeting (now called "The Science of Consciousness," with the original organizer Stuart Hameroff presiding) makes me feel secure in my decision to have stayed away. Some clips from his review:
...wild speculations and carnivalesque pseudoscience were juxtaposed with sober sessions like “Agency and Mental Causation” and data-filled talks about probing conscious brain states with PET scans and EEGs...
...I found myself sitting, late one afternoon, in “Vibrations, Scale, and Topology,” where a musician from Tulsa, Okla., who called himself Timbre Wolf...played a recording of an eerie composition called “Brain Dance,” derived from vibrations generated by tiny molecular structures called microtubules, which are part of the scaffolding of brain cells. The music, to his ear, was reminiscent of Philip Glass, Steve Reich, Cuban rumba, Gustav Holst’s “The Planets,” and the visual rhythms of strange mathematical objects called Penrose tiles...All of this, he suspected, had something to do with quantum mechanics and consciousness, an idea that Dr. Hameroff has long been pursuing.
More disconcerting was the starring role given to the New Age entrepreneur Deepak Chopra. Dr. Chopra believes that human consciousness (through epigenetic feedback) directs the unfolding of human evolution.
No one seemed to object as Dr. Chopra, whose Chopra Foundation was one of the sponsors, shared the stage with prominent professors who engaged with his ideas as if he were another esteemed colleague....Also included in the lineup were presentations hypothesizing that dark energy could explain consciousness and that homeopathic medicine might work through nanoparticles and quantum entanglement — as if homeopathy worked at all.

Tuesday, May 31, 2016

Sentiments on cynicism and hope.

I want to pass on these clips from a commencement talk given by Maria Popova:
…in every environment densely populated by peers — self-comparison becomes inevitable… here’s the thing about self-comparison: In addition to making you vacate your own experience, your own soul, your own life, in its extreme it breeds resignation. If we constantly feel that there is something more to be had — something that’s available to those with a certain advantage in life, but which remains out of reach for us — we come to feel helpless. And the most toxic byproduct of this helpless resignation is cynicism — that terrible habit of mind and orientation of spirit in which, out of hopelessness for our own situation, we grow embittered about how things are and about what’s possible in the world. Cynicism is a poverty of curiosity and imagination and ambition.
Today, the soul is in dire need of stewardship and protection from cynicism. The best defense against it is vigorous, intelligent, sincere hope — not blind optimism, because that too is a form of resignation, to believe that everything will work out just fine and we need not apply ourselves. I mean hope bolstered by critical thinking that is clear-headed in identifying what is lacking, in ourselves or the world, but then envisions ways to create it and endeavors to do that.
In its passivity and resignation, cynicism is a hardening, a calcification of the soul. Hope is a stretching of its ligaments, a limber reach for something greater.

Monday, May 30, 2016

Exercise and intermittent fasting improve brain plasticity and health

I have had numerous requests for a PDF of the article referenced in a Dec. 29, 2014 post - on how exercise and fasting stimulate brain plasticity and resilience - with the same title as this post. It turns out that the reference pointed to by the link is open access, so readers should be able to download the article for themselves. Here is the text of the original post:

I thought it might be useful to point to this brief review by Praag et al., which references several pieces of work presented at a recent Society for Neuroscience meeting symposium. The experiments indicate that exercise and intermittent energy restriction/fasting may optimize brain function and forestall metabolic and neurodegenerative diseases by enhancing neurogenesis, synaptic plasticity and neuronal stress robustness. (Motivated readers can obtain the article from me.) Here is their central summary figure:


Exercise and IER/fasting exert complex integrated adaptive responses in the brain and peripheral tissues involved in energy metabolism. As described in the text, both exercise and IER enhance neuroplasticity and resistance of the brain to injury and disease. Some of the effects of exercise and IER on peripheral organs are mediated by the brain, including increased parasympathetic regulation of heart rate and increased insulin sensitivity of liver and muscle cells. In turn, peripheral tissues may respond to exercise and IER by producing factors that bolster neuronal bioenergetics and brain function. Examples include the following: mobilization of fatty acids in adipose cells and production of ketone bodies in the liver; production of muscle-derived neuroactive factors, such as irisin; and production of as yet unidentified neuroprotective “preconditioning factors.” Suppression of local inflammation in tissues throughout the body and the nervous system likely contributes to prevention and reversal of many different chronic disease processes.

Friday, May 27, 2016

Are electric vehicles really better for the planet?

Some time ago I read a careful analysis of the energy required to manufacture a Toyota Prius, which reported that more energy was expended over the life cycle of the typical vehicle (manufacturing the batteries being very energy intensive) than by a high-efficiency gasoline-burning car. I then lost the reference, and so was pleased to come upon the report below in Science Magazine by Wigginton. Work like this makes me feel a bit less guilty about staying with my cheap Honda Civic, and more able to resist the subtle aura of superiority that I imagine is being emitted by friends smugly driving about in their Toyota Priuses:
Shifting to electric passenger vehicles ideally will reduce the carbon footprint of the transportation sector. Two recent studies, however, show that the greenhouse gas emissions produced over the life cycle of electric vehicles, from production through use, may not always be less than those of gasoline-burning vehicles. Ellingsen et al. reveal that vehicle and battery size prohibit some larger electric vehicles from ever overcoming the high greenhouse gas emissions generated during production. Yuksel et al. show that regional factors in the United States such as electrical grid mix, temperature, and driving conditions strongly limit the potential of plug-in electric vehicles to out-perform high-efficiency gas vehicles. Blanket policies directed at the adoption of electric vehicles therefore could potentially fail to reduce the transportation sector's large carbon footprint.

Thursday, May 26, 2016

Culture shapes the evolution of cognition.

From Thompson et al.:
A central debate in cognitive science concerns the nativist hypothesis, the proposal that universal features of behavior reflect a biologically determined cognitive substrate: For example, linguistic nativism proposes a domain-specific faculty of language that strongly constrains which languages can be learned. An evolutionary stance appears to provide support for linguistic nativism, because coordinated constraints on variation may facilitate communication and therefore be adaptive. However, language, like many other human behaviors, is underpinned by social learning and cultural transmission alongside biological evolution. We set out two models of these interactions, which show how culture can facilitate rapid biological adaptation yet rule out strong nativization. The amplifying effects of culture can allow weak cognitive biases to have significant population-level consequences, radically increasing the evolvability of weak, defeasible inductive biases; however, the emergence of a strong cultural universal does not imply, nor lead to, nor require, strong innate constraints. From this we must conclude, on evolutionary grounds, that the strong nativist hypothesis for language is false. More generally, because such reciprocal interactions between cultural and biological evolution are not limited to language, nativist explanations for many behaviors should be reconsidered: Evolutionary reasoning shows how we can have cognitively driven behavioral universals and yet extreme plasticity at the level of the individual—if, and only if, we account for the human capacity to transmit knowledge culturally. Wherever culture is involved, weak cognitive biases rather than strong innate constraints should be the default assumption.
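The "amplifying effects of culture" claim can be made concrete with a toy iterated-learning simulation, offered here in the spirit of Bayesian iterated-learning work rather than as either of the paper's actual models; all parameters are invented for illustration. A chain of learners each infers which of two variants its "teacher" used from a couple of noisy utterances and then transmits its own usage to the next learner. Learners who pick the most probable hypothesis (a MAP rule) turn a weak prior bias of 0.55 toward one variant into a population using it roughly 85% of the time, while learners who sample from their posterior merely mirror the prior — a population-level universal without any strong innate constraint.

```python
import random

def posterior_A(k, n, prior_A=0.55, noise=0.3):
    """Posterior probability that the teacher used variant A, given that
    k of n observed utterances were A (each utterance flips with prob `noise`)."""
    like_A = (1 - noise) ** k * noise ** (n - k)
    like_B = noise ** k * (1 - noise) ** (n - k)
    return like_A * prior_A / (like_A * prior_A + like_B * (1 - prior_A))

def chain(generations=200_000, n=2, prior_A=0.55, noise=0.3, learner="map", seed=0):
    """Fraction of generations using variant A in an iterated-learning chain."""
    rng = random.Random(seed)
    variant, count_A = "A", 0
    for _ in range(generations):
        # Teacher produces n utterances; each comes out as the other variant with prob `noise`.
        k = sum((rng.random() > noise) if variant == "A" else (rng.random() < noise)
                for _ in range(n))
        p = posterior_A(k, n, prior_A, noise)
        if learner == "map":
            variant = "A" if p >= 0.5 else "B"          # choose the most probable hypothesis
        else:
            variant = "A" if rng.random() < p else "B"  # sample from the posterior
        count_A += variant == "A"
    return count_A / generations

print("weak prior bias toward A:         0.55")
print("MAP learners end up using A:     ", round(chain(learner="map"), 3))      # ~0.85
print("Sampling learners end up using A:", round(chain(learner="sampler"), 3))  # ~0.55
```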

Wednesday, May 25, 2016

A model for aggression and violence around the world.

I want to pass on the abstract of a forthcoming article in Behavioral and Brain Sciences for which commentary proposals are being solicited. (I'm on the mailing list of potential commentators because I authored an article in the journal in the 1990s.) Its model of climate, aggression, and self-control makes total sense in terms of my experience of living both in Madison, Wisconsin (May-September) and Fort Lauderdale, Florida (October-April). (I returned to Madison two weeks ago and, as usual, have been struck by how much less defensiveness and aggression strangers exhibit in public in the more northern Madison location. Strangers at grocery stores are more benign and pleasant, and occasionally even make eye contact!)
Target Article: Aggression and Violence Around the World: A Model of Climate, Aggression, and Self-control in Humans (CLASH)
Authors: Paul A. M. Van Lange, Maria I. Rinderu, and Brad J. Bushman
Deadline for Commentary Proposals: Thursday June 9, 2016
Abstract: Worldwide there are substantial differences within and between countries in aggression and violence. Although there are various exceptions, a general rule is that aggression and violence increase as one moves closer to the equator, which suggests the important role of climate differences. While this pattern is robust, theoretical explanations for these large differences in aggression and violence within countries and around the world are lacking. Most extant explanations focus on the influence of average temperature as a factor that triggers aggression (The General Aggression Model), or the notion that warm temperature allows for more social interaction situations (Routine Activity Theory) in which aggression is likely to unfold. We propose a new model of CLimate, Aggression, and Self-control in Humans (CLASH) that seeks to understand differences within and between countries in aggression and violence in terms of differences in climate. Lower temperatures, and especially larger degrees of seasonal variation in climate, calls for individuals and groups to adopt a slower life history strategy, and exert more focus on the future (versus present), and a stronger focus on self-control. The CLASH model further outlines that slow life strategy, future orientation, and strong self-control are important determinants of inhibiting aggression and violence. We also discuss how CLASH is different from other recently developed models that emphasize climate differences for understanding conflict. We conclude by discussing the theoretical and societal importance of climate in shaping individual and societal differences in aggression and violence.

Tuesday, May 24, 2016

The Dalai Lama’s Atlas of emotions

You might have a look at this curious website pointed to by Kevin Randall: an atlas of emotions commissioned by the Dalai Lama (who paid ~$750,000 for the project) and developed by Paul Ekman and collaborators. After surveying 248 of the most active emotion researchers in the world, Ekman chose to divide emotions into five broad categories (anger, fear, disgust, sadness and enjoyment), each having an elaborate subset of emotional states, triggers, actions and moods. A cartography and data visualization firm was engaged to help depict them in a visual, and hopefully useful, way.


I'm really at a bit of a loss to figure out how the byzantine complexity of the beautiful graphic displays is supposed to be useful. They don't quite do it for me. Maybe this is meant as a lookup guide for an emotion one is feeling but not quite able to categorize? Sort of a bestiary of emotions? A well-intentioned effort, surely, but many of the descriptions seem quite banal and obvious.

Monday, May 23, 2016

When philosophy lost its way.

Frodeman and Briggle offer a lament over the irreversible passing of the practice of philosophy as a moral endeavor, one that might offer a view of the good society apart from the prescriptions of religion. Some clips from their essay:
Before its migration to the university, philosophy had never had a central home. Philosophers could be found anywhere — serving as diplomats, living off pensions, grinding lenses, as well as within a university. Afterward, if they were “serious” thinkers, the expectation was that philosophers would inhabit the research university…This purification occurred in response to at least two events. The first was the development of the natural sciences, as a field of study clearly distinct from philosophy, circa 1870, and the appearance of the social sciences in the decade thereafter…The second event was the placing of philosophy as one more discipline alongside these sciences within the modern research university. A result was that philosophy, previously the queen of the disciplines, was displaced, as the natural and social sciences divided the world between them.
Philosophers needed to embrace the structure of the modern research university, which consists of various specialties demarcated from one another. That was the only way to secure the survival of their newly demarcated, newly purified discipline. “Real” or “serious” philosophers had to be identified, trained and credentialed. Disciplinary philosophy became the reigning standard for what would count as proper philosophy.
Having adopted the same structural form as the sciences, it’s no wonder philosophy fell prey to physics envy and feelings of inadequacy. Philosophy adopted the scientific modus operandi of knowledge production, but failed to match the sciences in terms of making progress in describing the world. Much has been made of this inability of philosophy to match the cognitive success of the sciences. But what has passed unnoticed is philosophy’s all-too-successful aping of the institutional form of the sciences. We, too, produce research articles. We, too, are judged by the same coin of the realm: peer-reviewed products. We, too, develop sub-specializations far from the comprehension of the person on the street. In all of these ways we are so very “scientific.”
The act of purification accompanying the creation of the modern research university was not just about differentiating realms of knowledge. It was also about divorcing knowledge from virtue. Though it seems foreign to us now, before purification the philosopher (and natural philosopher) was assumed to be morally superior to other sorts of people…The study of philosophy elevated those who pursued it. Knowing and being good were intimately linked. It was widely understood that the point of philosophy was to become good rather than simply to collect or produce knowledge…The purification made it no longer sensible to speak of nature, including human nature, in terms of purposes and functions…By the late 19th century, Kierkegaard and Nietzsche had proved the failure of philosophy to establish any shared standard for choosing one way of life over another…There was a brief window when philosophy could have replaced religion as the glue of society; but the moment passed. People stopped listening as philosophers focused on debates among themselves.
Once knowledge and goodness were divorced, scientists could be regarded as experts, but there are no morals or lessons to be drawn from their work. Science derives its authority from impersonal structures and methods, not the superior character of the scientist. The individual scientist is no different from the average Joe, with no special authority to pronounce on what ought to be done…philosophy has aped the sciences by fostering a culture that might be called “the genius contest.” Philosophic activity devolved into a contest to prove just how clever one can be in creating or destroying arguments. Today, a hyperactive productivist churn of scholarship keeps philosophers chained to their computers. Like the sciences, philosophy has largely become a technical enterprise, the only difference being that we manipulate words rather than genes or chemicals. Lost is the once common-sense notion that philosophers are seeking the good life — that we ought to be (in spite of our failings) model citizens and human beings. Having become specialists, we have lost sight of the whole. The point of philosophy now is to be smart, not good. It has been the heart of our undoing.

Friday, May 20, 2016

This is how fascism comes to America

I pass on a few clips from a must-read article in the Washington Post by Robert Kagan, on Donald Trump:
Republican politicians marvel at how he has “tapped into” a hitherto unknown swath of the voting public. But what he has tapped into is what the founders most feared when they established the democratic republic: the popular passions unleashed, the “mobocracy.” Conservatives have been warning for decades about government suffocating liberty. But here is the other threat to liberty that Alexis de Tocqueville and the ancient philosophers warned about: that the people in a democracy, excited, angry and unconstrained, might run roughshod over even the institutions created to preserve their freedoms. As Alexander Hamilton watched the French Revolution unfold, he feared in America what he saw play out in France — that the unleashing of popular passions would lead not to greater democracy but to the arrival of a tyrant, riding to power on the shoulders of the people.
This phenomenon has arisen in other democratic and quasi-democratic countries over the past century, and it has generally been called “fascism.” Fascist movements, too, had no coherent ideology, no clear set of prescriptions for what ailed society. “National socialism” was a bundle of contradictions, united chiefly by what, and who, it opposed; fascism in Italy was anti-liberal, anti-democratic, anti-Marxist, anti-capitalist and anti-clerical. Successful fascism was not about policies but about the strongman, the leader (Il Duce, Der Fuhrer), in whom could be entrusted the fate of the nation. Whatever the problem, he could fix it. Whatever the threat, internal or external, he could vanquish it, and it was unnecessary for him to explain how. Today, there is Putinism, which also has nothing to do with belief or policy but is about the tough man who singlehandedly defends his people against all threats, foreign and domestic.
To understand how such movements take over a democracy, one only has to watch the Republican Party today. These movements play on all the fears, vanities, ambitions and insecurities that make up the human psyche. In democracies, at least for politicians, the only thing that matters is what the voters say they want — vox populi vox dei. A mass political movement is thus a powerful and, to those who would oppose it, frightening weapon. When controlled and directed by a single leader, it can be aimed at whomever the leader chooses. If someone criticizes or opposes the leader, it doesn’t matter how popular or admired that person has been. He might be a famous war hero, but if the leader derides and ridicules his heroism, the followers laugh and jeer. He might be the highest-ranking elected guardian of the party’s most cherished principles. But if he hesitates to support the leader, he faces political death.
This is how fascism comes to America, not with jackboots and salutes (although there have been salutes, and a whiff of violence) but with a television huckster, a phony billionaire, a textbook egomaniac “tapping into” popular resentments and insecurities, and with an entire national political party — out of ambition or blind party loyalty, or simply out of fear — falling into line behind him.

Thursday, May 19, 2016

Brain modules that process human consensus decision-making

Suzuki et al. offer a study identifying brain areas important in consensus decision-making, with different decision variables associated with activity in different brain areas that are then integrated by distributed neural activity (see Network hubs in the human brain for an overall review of domains of cognitive function, with some great summary graphics). The highlights and summary:

Highlights
•A task is used to study how the brain implements consensus decision-making 
•Consensus decision-making depends on three distinct computational processes 
•These different signals are encoded in distinct brain regions 
•Integration of these signals occurs in the dorsal anterior cingulate cortex
Summary
Consensus building in a group is a hallmark of animal societies, yet little is known about its underlying computational and neural mechanisms. Here, we applied a computational framework to behavioral and fMRI data from human participants performing a consensus decision-making task with up to five other participants. We found that participants reached consensus decisions through integrating their own preferences with information about the majority group members’ prior choices, as well as inferences about how much each option was stuck to by the other people. These distinct decision variables were separately encoded in distinct brain areas—the ventromedial prefrontal cortex, posterior superior temporal sulcus/temporoparietal junction, and intraparietal sulcus—and were integrated in the dorsal anterior cingulate cortex. Our findings provide support for a theoretical account in which collective decisions are made through integrating multiple types of inference about oneself, others, and environments, processed in distinct brain modules.
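As a rough illustration of what integrating these three decision variables might look like computationally, here is a toy value-integration sketch of my own; it is not the authors' fitted model, and the weights and inputs are invented. Each option's value is a weighted sum of one's own preference, the fraction of other group members who previously chose it, and an inferred "stickiness" of the others to it, with a softmax turning values into a choice.

```python
import math
import random

def consensus_choice(own_pref, majority_frac, stickiness,
                     w=(1.0, 1.0, 1.0), temperature=1.0, rng=random):
    """Toy integration of three decision variables into a single choice."""
    values = [w[0] * p + w[1] * m + w[2] * s
              for p, m, s in zip(own_pref, majority_frac, stickiness)]
    exps = [math.exp(v / temperature) for v in values]   # softmax over option values
    probs = [e / sum(exps) for e in exps]
    r, cum = rng.random(), 0.0
    for option, prob in enumerate(probs):
        cum += prob
        if r < cum:
            return option, probs
    return len(probs) - 1, probs

# Two options: I slightly prefer option 0, but most of the group chose option 1
# earlier and seems unlikely to switch, so option 1 usually wins.
choice, probs = consensus_choice(own_pref=[0.6, 0.4],
                                 majority_frac=[0.2, 0.8],
                                 stickiness=[0.3, 0.7])
print(choice, [round(p, 2) for p in probs])
```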

Wednesday, May 18, 2016

Aerobic fitness: one minute of all-out effort = 45 min. of moderate effort

Reynolds has written a series of articles describing experiments showing the benefits of high-intensity interval training. She now points to a study by Gillen et al. showing that high-intensity effort periods of only one minute can have a big effect. Twelve weeks of a regimen of three cycling sessions per week lasting 10 minutes each, with only one minute of that time being strenuous, caused the same 20% increase in aerobic fitness as sessions of 45 minutes of cycling at a moderate pace.
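The arithmetic behind the time saving is worth spelling out (numbers as reported above; I am assuming three weekly sessions for the moderate protocol as well, a detail not stated in the post):

```python
interval_sessions, interval_minutes, strenuous_minutes = 3, 10, 1
moderate_sessions, moderate_minutes = 3, 45   # assumed weekly frequency

interval_weekly = interval_sessions * interval_minutes      # 30 min/week
strenuous_weekly = interval_sessions * strenuous_minutes    # 3 min/week truly hard
moderate_weekly = moderate_sessions * moderate_minutes      # 135 min/week

print(f"interval protocol: {interval_weekly} min/week ({strenuous_weekly} strenuous)")
print(f"moderate protocol: {moderate_weekly} min/week")
print(f"time saved: {1 - interval_weekly / moderate_weekly:.0%}")  # ~78%
```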

Tuesday, May 17, 2016

America in decline?

I pass on a few clips from Easterbrook's article, on the prevailing negative depiction (especially by Republican candidates) of America's current state and direction:
...most American social indicators have been positive at least for years, in many cases for decades. The country is, on the whole, in the best shape it’s ever been in. So what explains all the bad vibes?..the core reason for the disconnect between the nation’s pretty-good condition and the gloomy conventional wisdom is that optimism itself has stopped being respectable. Pessimism is now the mainstream, with optimists viewed as Pollyannas. If you don’t think everything is awful, you don’t understand the situation!
Objectively, the glass looks significantly more than half full.
Job growth has been strong for five years, with unemployment now below where it was for most of the 1990s, a period some extol as the “good old days.” The American economy is No. 1 by a huge margin, larger than Nos. 2 and 3 (China and Japan) combined. Americans are seven times as productive, per capita, as Chinese citizens. The dollar is the currency the world craves — which means other countries perceive America’s long-term prospects as very good.
Pollution, discrimination, crime and most diseases are in an extended decline; living standards, longevity and education levels continue to rise. The American military is not only the world’s strongest, it is the strongest ever. The United States leads the world in science and engineering, in business innovation, in every aspect of creativity, including the arts. Terrorism is a serious concern, but in the last 15 years, even taking into account Sept. 11, an American is five times more likely to be hit by lightning than to be killed by a terrorist.
Easterbrook continues with a discussion of the dire straits of the middle class, changes in manufacturing jobs ("Manufacturing jobs described by Mr. Trump and Mr. Sanders as “lost” to China cannot be found there, or anywhere."), etc.
...developing the postindustrial economy — while addressing issues such as inequality, greenhouse emissions and the condition of public schools — will require optimism. Pessimists think in terms of rear-guard actions to turn back the clock. Optimists understand that where the nation has faults, it’s time to roll up our sleeves and get to work.
That’s why the lack of progressive optimism is so keenly felt. In recent decades, progressives drank too deeply of instant-doomsday claims. If their predictions had come true, today petroleum would be exhausted, huge numbers of major animal species would be extinct, crop failures would be causing mass starvation, developing-world poverty would be getting worse instead of declining fast. (In 1990, 37 percent of humanity lived in what the World Bank defines as extreme poverty; today it’s 10 percent.)

Monday, May 16, 2016

Downsides of diversity.

I want to thank the anonymous commentator on the “Diversity makes you brighter” post, who sent links to interesting articles by Jonas and by Dinesen and Sønderskov. I pass on just some clips from Jonas, noting work by Putnam and Page:
...a fascinating new portrait of diversity emerging from recent scholarship. Diversity, it shows, makes us uncomfortable -- but discomfort, it turns out, isn't always a bad thing. Unease with differences helps explain why teams of engineers from different cultures may be ideally suited to solve a vexing problem. Culture clashes can produce a dynamic give-and-take, generating a solution that may have eluded a group of people with more similar backgrounds and approaches. At the same time, though, Putnam's work adds to a growing body of research indicating that more diverse populations seem to extend themselves less on behalf of collective needs and goals.
In more diverse communities, he says, there were neither great bonds formed across group lines nor heightened ethnic tensions, but a general civic malaise. And in perhaps the most surprising result of all, levels of trust were not only lower between groups in more diverse settings, but even among members of the same group...
So, there is a diversity paradox:
...those in more diverse communities may do more bowling alone, but the creative tensions unleashed by those differences in the workplace may vault those same places to the cutting edge of the economy and of creative culture.

Friday, May 13, 2016

Two ways to be satisfied.

Anna North points to an article by Helzer and Jayawickreme that examines two different control strategies for obtaining short- and long-term life satisfaction: “primary control” — the ability to directly affect one's circumstances — and “secondary control” — the ability to affect how one responds to those circumstances.
How does a sense of control relate to well-being? We consider two distinguishable control strategies, primary and secondary control, and their relationships with two facets of subjective well-being, daily positive/negative affective experience and global life satisfaction. Using undergraduate and online samples, the results suggest that these different control strategies are associated uniquely with distinct facets of well-being. After controlling for shared variance among constructs, primary control (the tendency to achieve mastery over circumstances via goal striving) was associated more consistently with daily affective experience than was secondary control, and secondary control (the tendency to achieve mastery over circumstances via sense-making) was associated more strongly with life satisfaction than primary control, but only within the student sample and community members not in a committed relationship. The results highlight the importance of both control strategies to everyday health and provide insights into the mechanisms underlying the relationship between control and well-being.
It is not clear why relationship status makes a difference. Helzer suggests that having a partner may help people deal with adversity the same way secondary control does, so secondary control may have less of an effect.

Thursday, May 12, 2016

John Oliver on "Scientific Studies show...."

I have to pass on this great bit from John Oliver, on the vacuity of most scientific reporting.


Diversity makes you brighter.

Providing some data relevant to debates over affirmative action, Levine et al. show that ethnic diversity can increase intelligent behaviors. Misfits between market prices and the true value of assets (market bubbles) are more likely in ethnically homogeneous than in diverse markets.
Markets are central to modern society, so their failures can be devastating. Here, we examine a prominent failure: price bubbles. Bubbles emerge when traders err collectively in pricing, causing misfit between market prices and the true values of assets. The causes of such collective errors remain elusive. We propose that bubbles are affected by ethnic homogeneity in the market and can be thwarted by diversity. In homogenous markets, traders place undue confidence in the decisions of others. Less likely to scrutinize others’ decisions, traders are more likely to accept prices that deviate from true values. To test this, we constructed experimental markets in Southeast Asia and North America, where participants traded stocks to earn money. We randomly assigned participants to ethnically homogeneous or diverse markets. We find a marked difference: Across markets and locations, market prices fit true values 58% better in diverse markets. The effect is similar across sites, despite sizeable differences in culture and ethnic composition. Specifically, in homogenous markets, overpricing is higher as traders are more likely to accept speculative prices. Their pricing errors are more correlated than in diverse markets. In addition, when bubbles burst, homogenous markets crash more severely. The findings suggest that price bubbles arise not only from individual errors or financial conditions, but also from the social context of decision making. The evidence may inform public discussion on ethnic diversity: it may be beneficial not only for providing variety in perspectives and skills, but also because diversity facilitates friction that enhances deliberation and upends conformity.

Wednesday, May 11, 2016

What art unveils

I pass on some initial and final clips from an essay by Alva Noë that is worth reading in its entirety.
Is there a way of thinking about art that will get us closer to an understanding of its essential nature, and our own?...the trend is to try to answer these questions in the key of neuroscience. I recommend a different approach, but not because I don’t think it is crucial to explore the links between art and our biological nature. The problem is that neuroscience has yet to frame an adequate conception of our nature. You look in vain in the writings of neuroscientists for satisfying accounts of experience or consciousness. For this reason, I believe, we can’t use neuroscience to explain art and its place in our lives. Indeed, if I am right, the order of explanation may go in the other direction: Art can help us frame a better picture of our human nature.
...Design, the work of technology, stops, and art begins, when we are unable to take the background of our familiar technologies and activities for granted, and when we can no longer take for granted what is, in fact, a precondition of the very natural-seeming intelligibility of such things as doorknobs and pictures, words and sounds. When you and I are talking, I don’t pay attention to the noises you are making; your language is a transparency through which I encounter you. Design, at least when it is optimal, is transparent in just this way; it disappears from view and gets absorbed in application. You study the shirt on the website; you don’t contemplate its image.
Art, in contrast, makes things strange. You do contemplate the image, when you examine Leonardo’s depiction of the lady with the ermine. You are likely, for example, to notice her jarringly oversized and masculine hand and to wonder why Leonardo draws our attention to that feature of this otherwise beautiful young person. Art disrupts plain looking and it does so on purpose. By doing so it discloses just what plain looking conceals.
Art unveils us ourselves. Art is a making activity because we are by nature and culture organized by making activities. A work of art is a strange tool. It is an alien implement that affords us the opportunity to bring into view everything that was hidden in the background.
If I am right, art isn’t a phenomenon to be explained. Not by neuroscience, and not by philosophy. Art is itself a research practice, a way of investigating the world and ourselves. Art displays us to ourselves, and in a way makes us anew, by disrupting our habitual activities of doing and making.

Tuesday, May 10, 2016

Our brain activity at rest predicts our brain activity during tasks.

The Science Magazine précis of Tavor et al.:
We all differ in how we perceive, think, and act. What drives individual differences in evoked brain activity? Tavor et al. applied computational models to functional magnetic resonance imaging (fMRI) data from the Human Connectome Project. Brain activity in the “resting” state when subjects were not performing any explicit task predicted differences in fMRI activation across a range of cognitive paradigms. This suggests that individual differences in many cognitive tasks are a stable trait marker. Resting-state functional connectivity thus already contains the repertoire that is then expressed during task-based fMRI.
And the article abstract:
When asked to perform the same task, different individuals exhibit markedly different patterns of brain activity. This variability is often attributed to volatile factors, such as task strategy or compliance. We propose that individual differences in brain responses are, to a large degree, inherent to the brain and can be predicted from task-independent measurements collected at rest. Using a large set of task conditions, spanning several behavioral domains, we train a simple model that relates task-independent measurements to task activity and evaluate the model by predicting task activation maps for unseen subjects using magnetic resonance imaging. Our model can accurately predict individual differences in brain activity and highlights a coupling between brain connectivity and function that can be captured at the level of individual subjects.
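For readers who want a feel for what “predicting task activation from rest” could look like in practice, here is a minimal sketch of the general idea (my own toy illustration, not the authors' actual pipeline): fit a simple linear model relating resting-state features to task activation across voxels in a training group, then apply it to a held-out subject. The data shapes, features, and model choice below are assumptions made for the example.
```python
# Toy sketch: predict a held-out subject's task activation map from resting-state
# features with a simple linear model (leave-one-subject-out). All data are
# synthetic stand-ins; shapes, features, and the model are assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_subjects, n_voxels, n_features = 10, 2000, 50

# Hypothetical per-voxel resting-state features (e.g., connectivity profiles)
# and a simulated task activation value per voxel for each subject.
rest_features = rng.normal(size=(n_subjects, n_voxels, n_features))
true_weights = rng.normal(size=n_features)
task_maps = rest_features @ true_weights + 0.5 * rng.normal(size=(n_subjects, n_voxels))

correlations = []
for test_subj in range(n_subjects):
    train = [s for s in range(n_subjects) if s != test_subj]
    # Pool voxels from the training subjects: resting features -> task activation.
    X_train = rest_features[train].reshape(-1, n_features)
    y_train = task_maps[train].reshape(-1)
    model = LinearRegression().fit(X_train, y_train)
    # Predict the held-out subject's map from their resting-state data alone.
    y_pred = model.predict(rest_features[test_subj])
    correlations.append(np.corrcoef(y_pred, task_maps[test_subj])[0, 1])

print(f"mean correlation between predicted and actual maps: {np.mean(correlations):.2f}")
```
In the actual study the resting-state features and model fitting are considerably more elaborate, but the leave-one-subject-out logic of predicting an unseen individual's activation maps is the same.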

Monday, May 09, 2016

The key to political persuasion

I pass on clips from this interesting piece, which has been languishing in my queue of potential posts for some time, in which Willer and Feinberg give a more accessible account of their work reported in the Personality and Social Psychology Bulletin.
In business, everyone knows that if you want to persuade people to make a deal with you, you have to focus on what they value, not what you do. If you’re trying to sell your car, you emphasize the features of the sale that appeal to the buyer (the reliability and reasonable price of the vehicle), not the ones that appeal to you (the influx of cash).
This rule of salesmanship also applies in political debate — i.e., you should frame your position in terms of the moral values of the person you’re trying to convince. But when it comes to politics, this turns out to be hard to do. We found that people struggled to set aside their reasons for taking a political position and failed to consider how someone with different values might come to support that same position.
In one study, we presented liberals and conservatives with one of two messages in support of same-sex marriage. One message emphasized the need for equal rights for same-sex couples. This is the sort of fairness-based message that liberals typically advance for same-sex marriage. It is framed in terms of a value — equality — that research has shown resonates more strongly among liberals than conservatives. The other message was designed to appeal to values of patriotism and group loyalty, which have been shown to resonate more with conservatives. (It argued that “same-sex couples are proud and patriotic Americans” who “contribute to the American economy and society.”)
Liberals showed the same support for same-sex marriage regardless of which message they encountered. But conservatives supported same-sex marriage significantly more if they read the patriotism message rather than the fairness one.
In a parallel experiment, we targeted liberals for persuasion. We presented a group of liberals and conservatives with one of two messages in support of increased military spending. One message argued that we should “take pride in our military,” which “unifies us both at home and abroad.” The other argued that military spending is necessary because, through the military, the poor and disadvantaged “can achieve equal standing,” by ensuring they have “a reliable salary and a future apart from the challenges of poverty and inequality.”
For conservatives, it didn’t matter which message they read; their support for military spending was the same. However, liberals expressed significantly greater support for increasing military spending if they read the fairness message rather than the patriotism one.
If you’re thinking that these reframed arguments don’t sound like ones that conservatives and liberals would naturally be inclined to make, you’re right. In an additional study, we asked liberals to write a persuasive argument in favor of same-sex marriage aimed at convincing conservatives — and we offered a cash prize to the participant who wrote the most persuasive message. Despite the financial incentive, just 9 percent of liberals made arguments that appealed to more conservative notions of morality, while 69 percent made arguments based on more liberal values.
Conservatives were not much better. When asked to write an argument in favor of making English the official language of the United States that would be persuasive to liberals (with the same cash incentive), just 8 percent of conservatives appealed to liberal values, while 59 percent drew upon conservative values.
Why do we find moral reframing so challenging? There are a number of reasons. You might find it off-putting to endorse values that you don’t hold yourself. You might not see a link between your political positions and your audience’s values. And you might not even know that your audience endorses different values from your own. But whatever the source of the gulf, it can be bridged with effort and consideration.
Maybe reframing political arguments in terms of your audience’s morality should be viewed less as an exercise in targeted, strategic persuasion, and more as an exercise in real, substantive perspective taking. To do it, you have to get into the heads of the people you’d like to persuade, think about what they care about and make arguments that embrace their principles. If you can do that, it will show that you view those with whom you disagree not as enemies, but as people whose values are worth your consideration.
Even if the arguments that you wind up making aren’t those that you would find most appealing, you will have dignified the morality of your political rivals with your attention, which, if you think about it, is the least that we owe our fellow citizens.

Friday, May 06, 2016

Our perception of our body shape is very malleable - making a finger feel shorter.

Here is a neat trick. It works! (I tried it.) Ekroll et al. show that illusory visual completion of an object's invisible backside can make your finger feel shorter. Here are their summary and the central graphic from the article.

Highlights
•The experience of the hidden backsides of things acts as a real percept 
•These percepts have causal powers, although they do not correspond to real objects 
•They can evoke a bizarre illusion in which the observer’s own finger feels shrunken 
•The perceptual representation of body shape is highly malleable
Summary
In a well-known magic trick known as multiplying balls, conjurers fool their audience with the use of a semi-spherical shell, which the audience perceives as a complete ball. Here, we report that this illusion persists even when observers touch the inside of the shell with their own finger. Even more intriguingly, this also produces an illusion of bodily self-awareness in which the finger feels shorter, as if to make space for the purely illusory volume of the visually completed ball. This observation provides strong evidence for the controversial and counterintuitive idea that our experience of the hidden backsides of objects is shaped by genuine perceptual representations rather than mere cognitive guesswork or imagery.
Figure


A Well-Known Magic Trick and the Shrunken Finger Illusion
(A and B) The multiplying balls routine. The magician first holds what seems to be a single ball between his fingers (A). After a quick flick of the wrist, a second ball seems to materialize (B). In reality, the lower “ball” is a hollow semi-spherical shell, from which the real ball is pulled out.
(C and D) Schematic illustration of the shrunken finger illusion. When a semi-spherical shell is balanced on the observer’s finger as shown in (C) and viewed from above, the observer often reports perceiving the shell as a complete ball (D), while his or her finger is felt to be unusually short, as if to make space for the illusory volume of the complete ball. Note that this drawing is an exaggerated caricature of the perceptual experience. In particular, the real effect may be smaller than depicted here. In the experiments, only the middle finger was extended, while the other fingers were closed to a fist (see Figure below).

Thursday, May 05, 2016

What happens if we all live to 100?

I want to mention an interesting article by Easterbrook that has been languishing in my queue of potential posts for more than a year. It notes numerous studies on aging and life extension, and asks how long the eerily linear rise in life expectancy since 1840 (from roughly 40 years to roughly 80 years) can continue. Two clips:
No specific development or discovery has caused the rise: improvements in nutrition, public health, sanitation, and medical knowledge all have helped, but the operative impetus has been the “stream of continuing progress.”
One view is that increases will continue at least until life expectancy at birth surpasses 100. Jay Olshansky, a professor of public health at the University of Illinois at Chicago, disagrees, saying:
...the rise in life expectancy will “hit a wall soon, if it hasn’t already....Most of the 20th-century gains in longevity came from reduced infant mortality, and those were one time gains.” Infant mortality in the United States trails some other nations’, but has dropped so much—down to one in 170—that little room for improvement remains. “There’s tremendous statistical impact on life expectancy when the young are saved,” Olshansky says. “A reduction in infant mortality saves the entire span of a person’s life. Avoiding mortality in a young person—say, by vaccine—saves most of the person’s life. Changes in medicine or lifestyle that extend the lives of the old don’t add much to the numbers.” Olshansky calculates that if cancer were eliminated, American life expectancy would rise by only three years, because a host of other chronic fatal diseases are waiting to take its place. He thinks the 21st century will see the average life span extend “another 10 years or so,” with a bonus of more health span. Then the increase will slow noticeably, or stop.
Easterbrook's discussion of the social, economic, and political aspects of our graying future is well worth reading. The number of Americans 65 or older could reach 108 million by 2050, like adding three more Floridas inhabited entirely by seniors.
The nonpartisan think tank Third Way has calculated that at the beginning of the Kennedy presidency, the federal government spent $2.50 on public investments—infrastructure, education, and research—for every $1 it spent on entitlements. By 2022, Third Way predicts, the government will spend $5 on entitlements for every $1 on public investments. Infrastructure, education, and research lead to economic growth; entitlement subsidies merely allow the nation to tread water.

Wednesday, May 04, 2016

Semantic maps in our brains - and some interactive graphics

Huth et al. have performed functional MRI on subjects listening to hours of narrative stories to find semantic domains that seem to be consistent across individuals. This interactive 3D viewer (a preliminary version with limited data that takes a while to download and requires a fairly fast computer) shows a color coding of areas with different semantic selectivities (body part, person, place, time, outdoor, visual, tactile, violence, etc.). Here is their Nature abstract:
The meaning of language is represented in regions of the cerebral cortex collectively known as the ‘semantic system’. However, little of the semantic system has been mapped comprehensively, and the semantic selectivity of most regions is unknown. Here we systematically map semantic selectivity across the cortex using voxel-wise modelling of functional MRI (fMRI) data collected while subjects listened to hours of narrative stories. We show that the semantic system is organized into intricate patterns that seem to be consistent across individuals. We then use a novel generative model to create a detailed semantic atlas. Our results suggest that most areas within the semantic system represent information about specific semantic domains, or groups of related concepts, and our atlas shows which domains are represented in each area. This study demonstrates that data-driven methods—commonplace in studies of human neuroanatomy and functional connectivity—provide a powerful and efficient means for mapping functional representations in the brain.
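To make the “voxel-wise modelling” more concrete, here is a toy sketch in the spirit of that approach (not the authors' code): each voxel's time course is regressed on semantic features of the stimulus, and a voxel's semantic selectivity is then read off its fitted weights. The domain labels, data shapes, and ridge penalty below are illustrative assumptions.
```python
# Toy sketch of a voxel-wise encoding model: regress each voxel's time course on
# semantic features of the stimulus, then read each voxel's preferred semantic
# domain off its fitted weights. Labels, shapes, and the ridge penalty are
# illustrative assumptions, not the authors' actual features or code.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
semantic_domains = ["body part", "person", "place", "time", "violence"]
n_timepoints, n_voxels, n_features = 1200, 300, len(semantic_domains)

# Hypothetical stimulus features (one value per domain per fMRI time point)
# and simulated voxel responses generated from a hidden selectivity pattern.
X = rng.normal(size=(n_timepoints, n_features))
hidden_selectivity = rng.normal(size=(n_features, n_voxels))
Y = X @ hidden_selectivity + rng.normal(scale=2.0, size=(n_timepoints, n_voxels))

# Ridge regression fit jointly for all voxels (scikit-learn handles multi-output).
model = Ridge(alpha=10.0).fit(X, Y)
weights = model.coef_.T  # shape: (n_features, n_voxels)

# Each voxel's "preferred" semantic domain = the feature with the largest weight.
preferred = np.array(semantic_domains)[np.argmax(weights, axis=0)]
print(preferred[:10])
```
A semantic atlas, in this cartoon version, is just the map from each voxel to its preferred domain; the published work uses far richer word-embedding features and a generative model to pool results across subjects.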

Tuesday, May 03, 2016

Video games for Neuro-Cognitive Optimization

Continuing the MindBlog thread on brain games (cf. here), I pass on the introduction to a brief review by Mishra, Anguera, and Gazzaley on designing the next generation of closed-loop video games (CLVGs) that offer the prospect of enhancing cognition:
Humans of all ages engage deeply in game play. Game-based interactive environments provide a rich source of enjoyment, but also generate powerful experiences that promote learning and behavioral change (Pellegrini, 2009). In the modern era, software-based video games have become ubiquitous. The degree of interactivity and immersion in these video games can now be further enhanced like never before with the advent of consumer-accessible technologies like virtual reality, augmented reality, wearable physiological devices, and motion capture, all of which can be readily integrated using accessible game engines. This technological revolution presents a huge opportunity for neuroscientists to design targeted, novel game-based tools that drive positive neuroplasticity, accelerate learning, and strengthen cognitive function, and thereby promote mental wellbeing in both healthy and impaired brains.
In fact, there is now a burgeoning brain-training industry that already claims to have achieved this goal. However, many commercial claims are unsubstantiated and dismissed by the scientific community (Max Planck Institute for Human Development/Stanford Center on Longevity, 2014, Underwood, 2016). It seems prudent for us to slow down and approach this opportunity with scientific rigor and conservative optimism. Enhancing brain function should not be viewed as a clever, profitable start-up idea that can be conquered with a large marketing budget. If the field continues to be led by overinflated claims, we will jeopardize the careful and iterative process of evidence-based innovations in brain training and thereby risk throwing out the baby with the bathwater.

To strike the right balance, the path to commercialization needs to be accomplished via cutting-edge, neuroscientifically informed video game development tightly coupled with refinement and validation of the software in well-controlled empirical studies. Additionally, to separate the grain from the chaff, these studies and the claims based on them need verification and approval by independent regulatory agencies and the broader scientific community. High-level video game development and rigorous scientific validation need to become the twin pillar foundations of the next generation of closed-loop video games (CLVGs). Here, we define CLVGs as interactive video games that incorporate rapid, real-time, performance-driven, adaptive game challenges and performance feedback. The time is ideal for intensified effort in this important endeavor; CLVGs that are methodically developed and validated have the potential to benefit a broad array of disciplines in need of effective tools to enhance brain function, including education, medicine, and wellness.
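The defining feature of a CLVG, as the authors describe it, is that game challenge adapts in real time to measured performance, with immediate feedback to the player. Below is a toy illustration of that closed loop (my own, not from the review); the step size and target accuracy are arbitrary assumptions.
```python
# Toy illustration of a closed loop: difficulty adapts trial by trial to measured
# performance so that the player hovers near a target accuracy, with immediate
# feedback each trial. Step size and target accuracy are arbitrary assumptions.
import random

def simulated_player(difficulty: float) -> bool:
    """Stand-in for a real player: success gets less likely as difficulty rises."""
    return random.random() < max(0.05, 1.0 - difficulty)

def closed_loop_session(n_trials: int = 200, target_accuracy: float = 0.8,
                        step: float = 0.05) -> float:
    difficulty = 0.5
    for _ in range(n_trials):
        success = simulated_player(difficulty)
        # The "closed loop": performance on this trial sets the next challenge.
        # Weighted up/down steps make accuracy converge toward the target.
        if success:
            difficulty += step * (1.0 - target_accuracy)
        else:
            difficulty -= step * target_accuracy
        difficulty = min(max(difficulty, 0.0), 1.0)
    return difficulty

print(f"difficulty level after adaptive session: {closed_loop_session():.2f}")
```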

Monday, May 02, 2016

Embodied Prediction - perception and mind turned upside down

Andy Clark offers a fascinating discussion and analysis of predictive processing, which turns the traditional picture of perception on its head. The embodied mind model, which seems to me completely compelling, shows the stark inadequacy of most brain-centered models of mind and cognition. I pass on the end of his introduction and the closing paragraph of the essay. (This essay is just one of many on a fascinating website, Open Mind, which has posted 39 essays, edited by Thomas Metzinger and Jennifer Windt, by contributors who are both junior and senior members of the academic philosophy of mind field.)
Predictive processing plausibly represents the last and most radical step in a retreat from the passive, input-dominated view of the flow of neural processing. According to this emerging class of models, naturally intelligent systems (humans and other animals) do not passively await sensory stimulation. Instead, they are constantly active, trying to predict the streams of sensory stimulation before they arrive. Before an “input” arrives on the scene, these pro-active cognitive systems are already busy predicting its most probable shape and implications. Systems like this are already (and almost constantly) poised to act, and all they need to process are any sensed deviations from the predicted state. It is these calculated deviations from predicted states (known as prediction errors) that thus bear much of the information-processing burden, informing us of what is salient and newsworthy within the dense sensory barrage. The extensive use of top-down probabilistic prediction here provides an effective means of avoiding the kinds of “representational bottleneck” feared by early opponents of representation-heavy—but feed-forward dominated—forms of processing. Instead, the downward flow of prediction now does most of the computational “heavy-lifting”, allowing moment-by-moment processing to focus only on the newsworthy departures signified by salient prediction errors. Such economy and preparedness is biologically attractive, and neatly sidesteps the many processing bottlenecks associated with more passive models of the flow of information.
Action itself...then needs to be reconceived. Action is not so much a response to an input as a neat and efficient way of selecting the next “input”, and thereby driving a rolling cycle. These hyperactive systems are constantly predicting their own upcoming states, and actively moving so as to bring some of them into being. We thus act so as to bring forth the evolving streams of sensory information that keep us viable (keeping us fed, warm, and watered) and that serve our increasingly recondite ends. PP thus implements a comprehensive reversal of the traditional (bottom-up, forward-flowing) schema. The largest contributor to ongoing neural response, if PP is correct, is the ceaseless anticipatory buzz of downwards-flowing neural prediction that drives both perception and action. Incoming sensory information is just one further factor perturbing those restless pro-active seas. Within those seas, percepts and actions emerge via a recurrent cascade of sub-personal predictions forged from unconscious expectations spanning multiple spatial and temporal scales.
Conceptually, this implies a striking reversal, in that the driving sensory signal is really just providing corrective feedback on the emerging top-down predictions. As ever-active prediction engines, these kinds of minds are not, fundamentally, in the business of solving puzzles given to them as inputs. Rather, they are in the business of keeping us one step ahead of the game, poised to act and actively eliciting the sensory flows that keep us viable and fulfilled. If this is on track, then just about every aspect of the passive forward-flowing model is false. We are not passive cognitive couch potatoes so much as proactive predictavores, forever trying to stay one step ahead of the incoming waves of sensory stimulation.
Conclusion: Towards a mature science of the embodied mind
By self-organizing around prediction error, and by learning a generative rather than a merely discriminative (i.e., pattern-classifying) model, these approaches realize many of the goals of previous work in artificial neural networks, robotics, dynamical systems theory, and classical cognitive science. They self-organize around prediction error signals, perform unsupervised learning using a multi-level architecture, and acquire a satisfying grip—courtesy of the problem decompositions enabled by their hierarchical form—upon structural relations within a domain. They do this, moreover, in ways that are firmly grounded in the patterns of sensorimotor experience that structure learning, using continuous, non-linguaform, inner encodings (probability density functions and probabilistic inference). Precision-based restructuring of patterns of effective connectivity then allow us to nest simplicity within complexity, and to make as much (or as little) use of body and world as task and context dictate. This is encouraging. It might even be that models in this broad ballpark offer us a first glimpse of the shape of a fundamental and unified science of the embodied mind.
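For readers who like to see the core idea in numbers, here is a toy sketch (my own gloss, not Clark's) of the prediction-error loop described above: the system carries a top-down estimate of the hidden cause of its sensory stream, predicts each incoming sample, and updates only by the precision-weighted prediction error. The learning rate, precision, and signal values are assumptions for illustration.
```python
# Toy numerical gloss of the prediction-error loop: a top-down estimate of the
# hidden cause of the sensory stream is updated only by the precision-weighted
# mismatch between predicted and actual input. All values are assumptions.
import numpy as np

rng = np.random.default_rng(2)
hidden_cause = 5.0    # the state of the world generating the sensory stream
precision = 0.3       # weight given to sensory prediction errors
estimate = 0.0        # the system's current top-down prediction

for t in range(100):
    sensory_input = hidden_cause + rng.normal(scale=1.0)   # noisy incoming sample
    prediction_error = sensory_input - estimate            # the "newsworthy" residual
    estimate += precision * prediction_error               # belief moves only via weighted error

print(f"final top-down estimate: {estimate:.2f} (hidden cause: {hidden_cause})")
```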

Friday, April 29, 2016

The privileged fifth.

I tweeted this well-researched Op-Ed piece by Thomas Edsall the first time I read it, and after my third reading, want to urge you to read it. I pass on two summary graphics that are part of the description of how the privileged top fifth of the U.S. population is becoming a self-perpetuating class that is steadily separating itself by geography, education, and income.


Thursday, April 28, 2016

Sleep deprivation, brain structure, and learning

Saletin et al. find that individual differences in the anatomy of the human hippocampus explain many of the differences in learning impairment after sleep loss. These structural differences also predict the subsequent EEG slow-wave activity during recovery sleep and the restoration of learning after sleep.

Significance statement
Sleep deprivation does not impact all people equally. Some individuals show cognitive resilience to the effects of sleep loss, whereas others express striking vulnerability, the reasons for which remain largely unknown. Here, we demonstrate that structural features of the human brain, specifically those within the hippocampus, accurately predict which individuals are susceptible (or conversely, resilient) to memory impairments caused by sleep deprivation. Moreover, this same structural feature determines the success of memory restoration following subsequent recovery sleep. Therefore, structural properties of the human brain represent a novel biomarker predicting individual vulnerability to (and recovery from) the effects of sleep loss, one with occupational relevance in professions where insufficient sleep is pervasive yet memory function is paramount.
Abstract
Sleep deprivation impairs the formation of new memories. However, marked interindividual variability exists in the degree to which sleep loss compromises learning, the mechanistic reasons for which are unclear. Furthermore, which physiological sleep processes restore learning ability following sleep deprivation are similarly unknown. Here, we demonstrate that the structural morphology of human hippocampal subfields represents one factor determining vulnerability (and conversely, resilience) to the impact of sleep deprivation on memory formation. Moreover, this same measure of brain morphology was further associated with the quality of nonrapid eye movement slow wave oscillations during recovery sleep, and by way of such activity, determined the success of memory restoration. Such findings provide a novel human biomarker of cognitive susceptibility to, and recovery from, sleep deprivation. Moreover, this metric may be of special predictive utility for professions in which memory function is paramount yet insufficient sleep is pervasive (e.g., aviation, military, and medicine).
For further reading on insomnia, this article notes several other studies, one of which reports lowered connectivity in several right-hemisphere brain regions in people with primary insomnia.

Wednesday, April 27, 2016

Grandiose narcissism and the U.S. presidency

Many of us are scratching our heads about what a Trump presidency might be like, particularly with regard to his most prominent personality trait: grandiose narcissism. Watts et al. have looked at the historical record to see how this trait has correlated with both positive and negative leadership behaviors in U.S. presidents up until Obama. Their abstract:
Recent research and theorizing suggest that narcissism may predict both positive and negative leadership behaviors. We tested this hypothesis with data on the 42 U.S. presidents up to and including George W. Bush, using (a) expert-derived narcissism estimates, (b) independent historical surveys of presidential performance, and (c) largely or entirely objective indicators of presidential performance. Grandiose, but not vulnerable, narcissism was associated with superior overall greatness in an aggregate poll; it was also positively associated with public persuasiveness, crisis management, agenda setting, and allied behaviors, and with several objective indicators of performance, such as winning the popular vote and initiating legislation. Nevertheless, grandiose narcissism was also associated with several negative outcomes, including congressional impeachment resolutions and unethical behaviors. We found that presidents exhibit elevated levels of grandiose narcissism compared with the general population, and that presidents’ grandiose narcissism has been rising over time. Our findings suggest that grandiose narcissism may be a double-edged sword in the leadership domain.
The two highest scorers on grandiose narcissism were Lyndon B. Johnson and Theodore Roosevelt. Richard M. Nixon scored high on "vulnerable narcissism," a trait associated with being self-absorbed and thin-skinned. From the authors' popular account of their work:
Studies in the Journal of Personality in 2013 and in Personality and Individual Differences in 2009 have shown that narcissistic individuals tend to impress others during brief interactions and to perform well in public, two attributes that lend themselves to political success. They are also willing to take risks, which can be a valuable asset in a leader.
In contrast, the psychologist W. Keith Campbell and others have found that narcissists tend to be overconfident when making decisions, to overestimate their abilities and to portray their ideas as innovative when they are not. Compared with their non-narcissistic counterparts, they are more likely to accumulate resources for themselves at others’ expense.
The psychologists Brad Bushman and Roy F. Baumeister have found that narcissists, but not people with garden-variety high self-esteem, are prone to retaliating harshly against people who have criticized them. If, for example, you present narcissists with negative feedback about essays they’ve written, they’re likely to exact revenge against their presumed essay evaluators by blasting them with loud noises (as one amusing study found).
Still other work by the psychologist Mitja Back and colleagues suggests that narcissists are generally well liked in the short term, often creating positive first impressions. Other research indicates, though, that after a while they are usually more disliked than other individuals. Their charisma tends to wear off.

Tuesday, April 26, 2016

Are we smart enough to know how smart animals are?

I want to pass on some clips from Silk's review of Frans de Waal's recent book, whose title is the title of this post:
Natural selection, he argues, shapes cognitive abilities in the same way as it shapes traits such as wing length. As animals' challenges and habitats differ, so do their cognitive abilities. This idea, which he calls evolutionary cognition, has gained traction in psychology and biology in the past few decades.
For de Waal, evolutionary cognition has two key consequences. First, it is inconsistent with the concept of a 'great chain of being' in which organisms can be ordered from primitive to advanced, simple to complex, stupid to smart. Name a 'unique' human trait, and biologists will find another organism with a similar one. Humans make and use tools; so do wild New Caledonian crows (Corvus moneduloides). Humans develop cultures; so do humpback whales (Megaptera novaeangliae), which socially transmit foraging techniques. We can mentally 'time travel', remembering past events and planning for the future; so can western scrub jays (Aphelocoma californica), which can recall what they had for breakfast on one day, anticipate whether they will be given breakfast the next and selectively cache food when breakfast won't be delivered.
Furthermore, humans do not necessarily outdo other animals in all cognitive domains. Black-capped chickadees (Poecile atricapillus) store seeds in hundreds of locations each day, and can remember what they stored and where, as well as whether items in each location have been eaten, or stolen. Natural selection has favoured those prodigious feats of memory because they spell the difference between surviving winter and starving before spring. Human memory doesn't need to be as good: primates evolved in the tropics. “In the utilitarian view of biology,” de Waal argues, “animals have the brains they need — nothing more, nothing less.”
The second consequence of de Waal's view is that there is continuity across taxa. One source of continuity is based on evolutionary history: natural selection modifies traits to create new ones, producing commonalities among species with a common history. He points out that tool use is found not just in humans and chimpanzees, but also in other apes and monkeys, implying that relevant cognitive building blocks are shared across all primates. Continuity is also generated by convergent evolution, which produces similar traits in distantly related organisms such as New Caledonian crows and capuchin monkeys. De Waal opines that continuity “ought to be the default position for at least all mammals, and perhaps also birds and other vertebrates”.
...researchers are eager to understand what is distinctly human; some are driven by curiosity about how humans came to dominate the planet...Our success presumably has something to do with the emergence of a unique suite of cognitive traits...De Waal recognizes only one such trait: our rich and flexible system of symbolic communication, and our ability to exchange information about past and future. His commitment to the principle of continuity forces him to discount the importance of language for human cognition because of evidence of thinking by non-linguistic creatures. And he ignores compelling findings from linguists and developmental psychologists such as Elizabeth Spelke on the formative role of language in cognition.
A more satisfying book would leave readers with a clearer understanding of why, a few million years after our lineage diverged from the lineage of chimpanzees, we are the ones reading this book, and not them.