Tuesday, January 31, 2012

Our Thrifty Brains.

Andy Clark has written a piece that is well worth reading in The Stone, a New York Times forum for contemporary philosophers. (And check out the video below):
Might the miserly use of neural resources be one of the essential keys to understanding how brains make sense of the world? Some recent work in computational and cognitive neuroscience suggests that it is indeed the frugal use of our native neural capacity (the inventive use of restricted “neural bandwidth,” if you will) that explains how brains like ours so elegantly make sense of noisy and ambiguous sensory input. That same story suggests, intriguingly, that perception, understanding and imagination, which we might intuitively consider to be three distinct chunks of our mental machinery, are inextricably tied together as simultaneous results of a single underlying strategy known as “predictive coding.” This strategy saves on bandwidth using (who would have guessed it?) one of the many technical wheezes that enable us to economically store and transmit pictures, sounds and videos using formats such as JPEG and MP3.

...perception may best be seen as what has sometimes been described as a process of “controlled hallucination” ...in which we (or rather, various parts of our brains) try to predict what is out there, using the incoming signal more as a means of tuning and nuancing the predictions rather than as a rich (and bandwidth-costly) encoding of the state of the world.
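
For readers who wonder what the JPEG/MP3 analogy actually amounts to: the trick is predictive (differential) coding, in which the sender transmits only the difference between a prediction and the actual signal. Below is a minimal toy sketch of that idea in Python (my own illustration, not anything from Clark's piece, and certainly not a model of the brain). When the predictor is good, the residuals are tiny and cheap to encode; that is the "bandwidth saving" Clark is pointing to.

```python
# Toy illustration of predictive (differential) coding on a 1-D signal.
# Hypothetical example: predict each sample from the previous one and
# transmit only the residual ("the surprise"); a good predictor leaves
# small residuals, which are cheaper to encode than the raw samples.

import math

def encode(signal):
    """Return residuals: actual sample minus the prediction (previous sample)."""
    residuals = []
    prediction = 0.0
    for sample in signal:
        residuals.append(sample - prediction)  # send only the prediction error
        prediction = sample                    # predictor: "next = current"
    return residuals

def decode(residuals):
    """Reconstruct the signal exactly by adding residuals back to predictions."""
    signal = []
    prediction = 0.0
    for r in residuals:
        sample = prediction + r
        signal.append(sample)
        prediction = sample
    return signal

if __name__ == "__main__":
    # A smooth, predictable signal: residuals are much smaller than the samples.
    signal = [math.sin(0.1 * t) * 100 for t in range(200)]
    residuals = encode(signal)
    assert all(abs(a - b) < 1e-9 for a, b in zip(signal, decode(residuals)))
    print("mean |sample|   =", sum(abs(s) for s in signal) / len(signal))
    print("mean |residual| =", sum(abs(r) for r in residuals) / len(residuals))
```

The decoder runs the same predictor, so adding the residuals back reconstructs the signal exactly; the better the internal model, the less "news" has to travel, which is the sense in which the incoming signal merely tunes and nuances the predictions.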

The basic effect hereabouts is neatly illustrated by a simple but striking demonstration (used by the neuroscientist Richard Gregory back in the 1970’s to make this very point) known as “the hollow face illusion.” This is a well-known illusion in which an ordinary face mask viewed from the back can appear strikingly convex. That is, it looks (from the back) to be shaped like a real face, with the nose sticking outward rather than having a concave nose cavity. Just about any hollow face mask will produce some version of this powerful illusion, and there are many examples on the Web, like this one:




Monday, January 30, 2012

A simple way to attenuate emotional arousal?

I just came across these interesting observations from Herwig et al. They show that simply using self-referential reflection (i.e., mindfulness) to become aware of one's current emotional state can attenuate amygdala activation and emotional arousal:
The regulation of emotions is an ongoing internal process and often a challenge. Current related neural models concern the intended control of reactions towards external events, mediated by prefrontal cortex regions upon basal emotion processing as in the amygdala. Cognitive strategies to regulate emotions in the context of affective disorders or stress reduction, increasingly applied in clinical practice, are also related to mindfulness techniques. We questioned their effects on neural emotion processing and investigated brain activity during purely internal mental self-referential processes of making current emotions and self-related cognitions aware. Thirty healthy subjects performed a task comprising periods of cognitive self-reflection, of introspection for actual own emotions and feelings, and of a neutral condition, while they were scanned with functional magnetic resonance imaging. Brain activations of twenty-seven subjects during emotion-introspection and self-reflection, and also a conjunction of both, were compared with the neutral condition. The conditions of self-reflection and emotion-introspection showed distinguishable activations in medial and ventrolateral prefrontal areas, in parietal regions and in the amygdala. Notably, amygdala activity decreased during emotion-introspection and increased compared to ‘neutral’ during self-reflection. The results indicate that already the self-referential mental state of making the actual emotional state aware is capable of attenuating emotional arousal. This extends current theories of emotion regulation and has implications for the application of mindfulness techniques as a component of psychotherapeutic strategies in affective disorders and also for possible everyday emotion regulation.

Friday, January 27, 2012

You think, therefore I am.

I pass on this contribution from Rose and Markus as their answer to this year's annual question from Edge.org (What is your favorite deep, elegant, or beautiful explanation?):
"I think, therefore I am." Cogito ergo sum. Remember this elegant and deep idea from RenĂ© Descartes' Principles of Philosophy? The fact that a person is contemplating whether she exists, Descartes argued, is proof that she, indeed, actually does exist. With this single statement, Descartes knit together two central ideas of Western philosophy: 1) thinking is powerful, and 2) individuals play a big role in creating their own I's—that is, their psyches, minds, souls, or selves.

Most of us learn "the cogito" at some point during our formal education. Yet far fewer of us study an equally deep and elegant idea from social psychology: Other people's thinking likewise powerfully shapes the I's that we are. Indeed, in many situations, other people's thinking has a bigger impact on our own thoughts, feelings, and actions than do the thoughts we conjure while philosophizing alone.

In other words, much of the time, "You think, therefore I am." For better and for worse.

An everyday instance of how your thinking affects other people's being is the Pygmalion effect. Psychologists Robert Rosenthal and Lenore Jacobson captured this effect in a classic 1963 study. After giving an IQ test to elementary school students, the researchers told the teachers which students would be "academic spurters" because of their allegedly high IQs. In reality, these students' IQs were no higher than those of the "normal" students. At the end of the school year, the researchers found that the "spurters" had attained better grades and higher IQs than the "normals." The reason? Teachers had expected more from the spurters, and thus given them more time, attention, and care. And the conclusion? Expect more from students, and get better results.

A less sanguine example of how much our thoughts affect other people's I's is stereotype threat. Stereotypes are clouds of attitudes, beliefs, and expectations that follow around a group of people. A stereotype in the air over African Americans is that they are bad at school. Women labor under the stereotype that they suck at math.

As social psychologist Claude Steele and others have demonstrated in hundreds of studies, when researchers conjure these stereotypes—even subtly, by, say, asking people to write down their race or gender before taking a test—students from the stereotyped groups score lower than the stereotype-free group. But when researchers do not mention other people's negative views, the stereotyped groups meet or even exceed their competition. The researchers show that students under stereotype threat are so anxious about confirming the stereotype that they choke on the test. With repeated failures, they seek their fortunes in other domains. In this tragic way, other people's thoughts deform the I's of promising students.

As the planet gets smaller and hotter, knowing that "You think, therefore I am" could help us more readily understand how we affect our neighbours and how our neighbours affect us. Not acknowledging how much we impact each other, in contrast, could lead us to repeat the same mistakes.

Thursday, January 26, 2012

Cellular 'self-eating' accounts for some beneficial effects of exercise.

Population studies suggest that exercise protects against diabetes, cancer, and age-related diseases such as Alzheimer's. Work by Congcong He et al. has now shown that at least part of this effect is due to the increased "self-eating" (autophagy) that cells must do to meet the energy demands of exercise. Autophagy recycles used or flawed membranes and internal cell structures by encircling the target material and dumping it into a compartment that digests it. In animal models it has been shown to reduce diabetes, cancer, and neurodegenerative diseases. The He et al. work documents that exercise induces autophagy in the skeletal muscles of mice, which in turn lowers glucose and insulin in the bloodstream. Mutant mice that cannot increase autophagy during exercise did not show this effect. Further, the exercise-induced reversal of the glucose intolerance produced by a high-fat diet was observed only in mice that showed an exercise-induced increase in autophagy. Here is the abstract with more details:
Exercise has beneficial effects on human health, including protection against metabolic disorders such as diabetes. However, the cellular mechanisms underlying these effects are incompletely understood. The lysosomal degradation pathway, autophagy, is an intracellular recycling system that functions during basal conditions in organelle and protein quality control. During stress, increased levels of autophagy permit cells to adapt to changing nutritional and energy demands through protein catabolism. Moreover, in animal models, autophagy protects against diseases such as cancer, neurodegenerative disorders, infections, inflammatory diseases, ageing and insulin resistance. Here we show that acute exercise induces autophagy in skeletal and cardiac muscle of fed mice. To investigate the role of exercise-mediated autophagy in vivo, we generated mutant mice that show normal levels of basal autophagy but are deficient in stimulus (exercise- or starvation)-induced autophagy. These mice (termed BCL2 AAA mice) contain knock-in mutations in BCL2 phosphorylation sites (Thr69Ala, Ser70Ala and Ser84Ala) that prevent stimulus-induced disruption of the BCL2–beclin-1 complex and autophagy activation. BCL2 AAA mice show decreased endurance and altered glucose metabolism during acute exercise, as well as impaired chronic exercise-mediated protection against high-fat-diet-induced glucose intolerance. Thus, exercise induces autophagy, BCL2 is a crucial regulator of exercise- (and starvation)-induced autophagy in vivo, and autophagy induction may contribute to the beneficial metabolic effects of exercise.

Wednesday, January 25, 2012

The psychology of perceived wealth.

Studies have shown that not every dollar contributes equally to perceived wealth, that people’s standing relative to those around them often predicts well-being better than net worth does, and that increasing income trends are preferred over decreasing ones. Sussman and Shafir (at Princeton, where Kahneman has carried out his behavioral economics studies) show several factors that can influence the perception of wealth:
We studied the perception of wealth as a function of varying levels of assets and debt. We found that with total wealth held constant, people with positive net worth feel and are seen as wealthier when they have lower debt (despite having fewer assets). In contrast, people with equal but negative net worth feel and are considered wealthier when they have greater assets (despite having larger debt). This pattern persists in the perception of both the self and others.
In their concluding discussion,
…people have a robust preference for higher assets in cases of negative net worth and for lower debt in cases of positive net worth…debt appears relatively salient in contexts of positive wealth, whereas assets loom relatively large in contexts of negative wealth, and this differential salience has a corresponding impact on financial judgments and decisions.

…the present findings show how the appeal of a loan may depend on one’s perceived financial state. For a person who is in the red, a loan may provide an appealing infusion of cash, whereas for a person in the black, it might present an aversive incursion into debt. Conversely, people who are in the black may be tempted to diminish their debt, whereas it may prove unappealing for those in the red to lower their debt at the expense of their assets.

Remarkably, the same striving for financial wealth and stability can trigger opposing behaviors: preference for greater assets in some circumstances, and for lower debt in others. Such impulses may not always be aligned with what is best financially. People who are in the red and eager to borrow will sometimes have access only to high-interest loans. And people who are eager to clear their debt will sometimes do so even when their debt (e.g., tax-incentivized mortgages) is financially beneficial. Such psychology may be of great consequence. A remarkable 25% of U.S. households had zero or negative net worth in 2009 (for Black households, the figure was about 40%). Better insight into the determinants of perceived financial wealth and financial decision making could help shape behaviorally informed policy.

Tuesday, January 24, 2012

Bounded rationality.

I thought I would pass on clips from Mahzarin Banaji's response to the Edge.org annual question "What is your favorite deep, elegant, or beautiful explanation?":
…my candidate for the most deeply satisfying explanation of recent decades is the idea of bounded rationality…Herbert Simon put one stake in the ground through the study of information processing and AI, showing that both people and organizations follow principles of behavior such as "satisficing" that constrain them to decent but not the best decisions. The second stake was placed by Kahneman and Tversky, who showed the stunning ways in which even experts are error-prone—with consequences for not only their own health and happiness but that of their societies broadly.

Together the view of human nature that evolved over the past four decades has systematically changed the explanation for who we are and why we do what we do. We are error-prone in the unique ways in which we are, the explanation goes, not because we have malign intent, but because of the evolutionary basis of our mental architecture, the manner in which we remember and learn information, the way in which we are affected by those around us and so on. The reason we are boundedly rational is because the information space in which we must do our work is large compared to the capacities we have, including severe limits on conscious awareness, the ability to be able to control behavior, and to act in line even with our own intentions.

The idea that bad outcomes result from limited minds that cannot store, compute and adapt to the demands of the environment is a radically different explanation of our capacities and thereby our nature. Its elegance and beauty come from placing the emphasis on the ordinary and the invisible rather than on specialness and malign motives. This seems not so dissimilar from another shift in explanation, from god to natural selection, and it is likely to be equally resisted.
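
Satisficing, the Simon principle Banaji mentions, has a simple algorithmic reading: accept the first option that clears an aspiration level rather than searching every option for the best one. Here is a minimal sketch (a hypothetical example of my own, not anything from Simon's writings):

```python
# Toy contrast between satisficing and exhaustive optimization.
# Hypothetical example: options are random utilities in [0, 1] and the
# aspiration level is an arbitrary "good enough" threshold.

import random

def satisfice(options, utility, aspiration):
    """Return the first option whose utility clears the aspiration level,
    plus the number of options examined before stopping."""
    for examined, option in enumerate(options, start=1):
        if utility(option) >= aspiration:
            return option, examined
    return None, len(options)  # nothing was good enough

if __name__ == "__main__":
    random.seed(1)
    options = [random.random() for _ in range(10_000)]   # candidate "choices"
    chosen, examined = satisfice(options, utility=lambda x: x, aspiration=0.95)
    best = max(options)                                   # what full optimization finds
    if chosen is None:
        print(f"nothing met the aspiration after examining all {examined} options")
    else:
        print(f"satisficed at {chosen:.3f} after examining {examined} of "
              f"{len(options)} options; the optimum was {best:.3f}")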

Monday, January 23, 2012

The age of anxiety

Daniel Smith has written an interesting piece asking whether it is appropriate to consider our current times an "age of anxiety." Some clips:
...it is undeniable that ours is an age in which an enormous and growing number of people suffer from anxiety. According to the National Institute of Mental Health, anxiety disorders now affect 18 percent of the adult population of the United States, or about 40 million people. By comparison, mood disorders — depression and bipolar illness, primarily — affect 9.5 percent…anti-anxiety drug alprazolam — better known by its brand name, Xanax — was the top psychiatric drug on the list, clocking in at 46.3 million prescriptions in 2010.

Just because our anxiety is heavily diagnosed and medicated, however, doesn’t mean that we are more anxious than our forebears. It might simply mean that we are better treated — that we are, as individuals and a culture, more cognizant of the mind’s tendency to spin out of control.

Earlier eras might have been even more jittery than ours. Fourteenth-century Europe, for example, experienced devastating famines, waves of pillaging mercenaries, peasant revolts, religious turmoil and a plague that wiped out as much as half the population in four years. The evidence suggests that all this resulted in mass convulsions of anxiety, a period of psychic torment in which, as one historian has put it, “the more one knew, the less sense the world made.”

It’s hard to imagine that we have it even close to as bad as that. Yet there is an aspect of anxiety that we clearly have more of than ever before: self-awareness…Anxiety didn’t emerge as a cohesive psychiatric concept until the early 20th century…By 1977, the psychoanalyst Rollo May was noting an explosion in papers, books and studies on the subject.

...we shouldn’t be possessive about our uncertainties, particularly as one of the dominant features of anxiety is its recursiveness. Anxiety begins with a single worry, and the more you concentrate on that worry, the more powerful it gets, and the more you worry. One of the best things you can do is learn to let go: to disempower the worry altogether. If you start to believe that anxiety is a foregone conclusion — if you start to believe the hype about the times we live in — then you risk surrendering the battle before it’s begun.

Friday, January 20, 2012

On Solitude.

Reading a recent New York Times Op-Ed piece by Susan Cain ("The Rise of the New Groupthink") transported me back over 20 years to what I then experienced as a transformative reading of British psychotherapist Anthony Storr's book "Solitude: A Return to the Self." Reading it provided much-needed validation of my own solitary and introspective nature (preferring to do my work and thinking by myself, even while serving and respecting social groups, such as the laboratory I ran). Storr's book was a reaction against the popular psychotherapies of the 1980s, which emphasized intimate interpersonal relationships as the chief, if not the only, source of human happiness. He made a strong case that the life of an average person, not just that of a familiar list of brilliant scholars and artists such as Beethoven, Kant, and Newton, could be greatly enriched by more time spent alone.

In a similar vein, Cain writes against the current assumption that creativity, particularly in business, requires the collaboration of a group of people addressing the problem at hand. Her central illustration describes the origins of the Apple computer. Its creation required the support of a creative group of engineers and Steve Jobs' business sense, but the creative kernel of work and insight that put together the core of the actual hardware, and the code that ran it, was Wozniak's solitary effort. Cain notes:
...brainstorming sessions are one of the worst possible ways to stimulate creativity...People in groups tend to sit back and let others do the work; they instinctively mimic others’ opinions and lose sight of their own; and often succumb to peer pressure... fear of rejection activates the brain's amygdala.

The one important exception to this dismal record is electronic brainstorming, where large groups outperform individuals; and the larger the group the better. The protection of the screen mitigates many problems of group work. This is why the Internet has yielded such wondrous collective creations. Marcel Proust called reading a “miracle of communication in the midst of solitude,” and that’s what the Internet is, too. It’s a place where we can be alone together — and this is precisely what gives it power.

...most humans have two contradictory impulses: we love and need one another, yet we crave privacy and autonomy....To harness the energy that fuels both these drives, we need to move beyond the New Groupthink and embrace a more nuanced approach to creativity and learning. Our offices should encourage casual, cafe-style interactions, but allow people to disappear into personalized, private spaces when they want to be alone. Our schools should teach children to work with others, but also to work on their own for sustained periods of time. And we must recognize that introverts like Steve Wozniak need extra quiet and privacy to do their best work.

Before Mr. Wozniak started Apple, he designed calculators at Hewlett-Packard, a job he loved partly because HP made it easy to chat with his colleagues. Every day at 10 a.m. and 2 p.m., management wheeled in doughnuts and coffee, and people could socialize and swap ideas. What distinguished these interactions was how low-key they were. For Mr. Wozniak, collaboration meant the ability to share a doughnut and a brainwave with his laid-back, poorly dressed colleagues — who minded not a whit when he disappeared into his cubicle to get the real work done.

Thursday, January 19, 2012

Chill-out architecture - The use of tree metaphors

I gravitate towards forests and trees (typing right now at a desk that looks out at a large tree canopy on the opposite riverbank) because the vision of green trees under a blue sky is vastly more calming than having to look at the browner and redder tints of modern city structures. (My current Fort Lauderdale location is an extended strip mall that only occasionally permits small bits of nature to intrude.) Old pine forests give me the same sheltered feeling as the great cathedrals of Europe.

Thus I am very sympathetic to efforts to argue for an evolutionary or biological basis for these feelings, which appear to be common to most human cultures. E.O. Wilson, the father of "Sociobiology" and evolutionary psychology, has written a book, "Biophilia," that essentially argues that our preference for natural scenes is innate, the product of a psychology that evolved in paleolithic times. I would like this to be the correct view, but alas, it is, like most of evolutionary psychology, more like Rudyard Kipling's "Just So Stories" than hard science.

It is one thing to simply note trees as a metaphor for shelter, and thus to find it natural that architectural designs that incorporate the tree metaphor (such as the Metropol Parasol in Seville, shown in the picture) would be pleasing to us. It is quite another to hang all of this on the supposed cognitive neuroscience of embodied cognition, as Sarah Williams Goldhagen, the architecture critic for The New Republic, has done in a rather confused piece. A recent post by Voytek, and the discussion following it, point out a number of reservations and relevant points.

Wednesday, January 18, 2012

Living large - how the powerful overestimate.

From Duguid and Goncalo, their abstract, slightly edited:
In three experiments, we tested the prediction that individuals’ experience of power influences their perceptions of their own height. In the first experiment, high power, relative to low power, was associated with smaller estimates of a pole’s height relative to the self; in a second experiment, with larger estimates of one’s own height; and in a third experiment, with choice of a taller avatar to represent the self in a Second Life game. These results emerged regardless of whether power was experientially primed (in the first and third experiments) or manipulated through assigned roles (in the second experiment). Although a great deal of research has shown that more physically imposing individuals are more likely to acquire power, this work is the first to show that powerful people feel taller than they are. The discussion considers the implications for existing and future research on the physical experience of power.

Tuesday, January 17, 2012

My pushing back against our diffusion into “the cloud”

My son visits over the New Year's holiday every year, which gives me the chance to have a "techie" conference with him to see what I've been missing. One of the web applications he mentioned led me to Ghostery, a browser add-on with a cute little Pac-Man-like ghost that shows you who is tracking your web movements and what cookies have been placed on your browser (I was rather taken aback to see that I'm tracked by 759 'bugs' and have 412 cookies). Ghostery allows you to inactivate them individually or as a group. Even though most of the monitoring of our movements on the web is supposedly for benign marketing purposes, I'm more than happy to turn it all off.

A storm of controversy has arisen over Google's recent effort to conflate supposedly neutral web searches with its Google Plus social network, so that a search for information on some idea or item might now yield results that include posts, photos, profiles and conversations from Google Plus that are public or were shared privately with the person searching. I go to Google for links to expert information, and don't want my search results to be cluttered with friends’ postings. Since I use Google for practically everything I do on the web (this blog, mail, calendar, contacts, Google+, Google Voice, etc.), this cross-linking of my search results and my Google+ account is in fact happening. Fortunately, you can turn off this Google+ feature by going to the gear-shaped options icon at the top right of Google search results, selecting "Search settings," scrolling down until you see "Personal results," and ticking the box next to "Do not use personal results."

Monday, January 16, 2012

Remembering a rosy future.

Here is a fascinating tidbit from Dan Schacter's laboratory. When we imagine events in the future, our subsequent recall of negative simulations fades more rapidly than our recall of positive ones:
Mental simulations of future experiences are often concerned with emotionally arousing events. Although it is widely believed that mental simulations enhance future behavior, virtually nothing is known about how memory for these simulations changes over time or whether simulations of emotional experiences are especially well remembered. We used a novel paradigm that combined recently developed methods for generating simulations of future events and well-established procedures for testing memory to examine the retention of positive, negative, and neutral simulations over delays of 10 min and 1 day. We found that at the longer delay, details associated with negative simulations were more difficult to remember than details associated with positive or neutral simulations. We suggest that these effects reflect the influence of the fading-affect bias, whereby negative reactions fade more quickly than positive reactions, and that this influence results in a tendency to remember a rosy simulated future. We discuss implications of our findings for individuals with affective disorders, such as depression and anxiety.
(Schacter, in the Harvard psychology department, is a prolific memory researcher and the author of such popular books as "The Seven Sins of Memory: How the Mind Forgets and Remembers," as well as coauthor, along with Gilbert and Wegner, of a really excellent introductory college psychology text.)

Friday, January 13, 2012

Our bias against creativity

In principle we are all for creativity, but when faced with the prospect of actually altering our behavior or opinions, we falter. Mueller et al. suggest that this is a covert, largely unconscious process regulated by how uncertain we feel. Their results show that regardless of the degree to which people are open-minded, when they feel motivated to reduce uncertainty (either because they have an immediate goal of reducing uncertainty or because they feel uncertain generally), they may experience more negative associations with creativity, which results in lower evaluations of a creative idea. Their findings imply an irony: other research has shown that uncertainty spurs the search for and generation of creative ideas, yet these findings reveal that uncertainty also makes people less able to recognize creativity, perhaps when they need it most. Here is the abstract:
People often reject creative ideas, even when espousing creativity as a desired goal. To explain this paradox, we propose that people can hold a bias against creativity that is not necessarily overt and that is activated when people experience a motivation to reduce uncertainty. In two experiments, we manipulated uncertainty using different methods, including an uncertainty-reduction prime. The results of both experiments demonstrated the existence of a negative bias against creativity (relative to practicality) when participants experienced uncertainty. Furthermore, this bias against creativity interfered with participants’ ability to recognize a creative idea. These results reveal a concealed barrier that creative actors may face as they attempt to gain acceptance for their novel ideas.

Thursday, January 12, 2012

IQ scores are malleable.

Brinch and Galloway offer a rather clean demonstration that contests the common notion that education has little effect on IQ. Here is the abstract and one figure from the paper:
Although some scholars maintain that education has little effect on intelligence quotient (IQ) scores, others claim that IQ scores are indeed malleable, primarily through intervention in early childhood. The causal effect of education on IQ at later ages is often difficult to uncover because analyses based on observational data are plagued by problems of reverse causation and self-selection into further education. We exploit a reform that increased compulsory schooling from 7 to 9 y in Norway in the 1960s to estimate the effect of education on IQ. We find that this schooling reform, which primarily affected education in the middle teenage years, had a substantial effect on IQ scores measured at the age of 19 y.

(Figure from the paper: average IQ and education by time to reform.)

Wednesday, January 11, 2012

BioDigitalHuman - You've got to check out this site!

I've just spent the last two hours marveling at the incredibly elegant 3-D human anatomy website developed by BioDigital Systems (pointed to by Natasha Singer's article). (I'm finding the 3-D graphics work on either Firefox or Chrome, but not both, depending on which of my MacBook Pro laptops I'm using. Go figure. I use Apple computers, so can't comment on Microsoft's Internet Explorer.) You can view gross to detailed levels of the skeletal, muscular, nervous, endocrine, cardiovascular, digestive, and other systems. A click on a structure brings up a detailed description along with relevant clinical issues. I focused first on the brain (finding it helps if you first toggle off viewing the skull of the skeletal system that covers it. Duh!). You can zoom in and out, performing 3-D rotations to see precisely where structures are. Asking for a smaller internal structure like the pituitary, or the left or right amygdala, takes you to its internal location, and you can zoom in and out to appreciate how to get there. The transitions from external to internal brain structures are crude and jerky at this point, and I hope the developers will be adding more fine structure and smoother transitions during zooming. (It will take a massive amount of work to do this.)

I moved next to the muscular system, particularly around the knee joints (whose malfunctions over the past year have convinced me I may no longer be a teenager, and in fact might be "old"). I found a clearer view of the muscles and their insertions that might underlie the pain than I've been able to get from several doctors' appointments.

Happy hunting!

Tuesday, January 10, 2012

Classic versus modern violins: beauty in the eye of the beholder?

It is a truism among musicians that no modern violin can, or ever will, approach the perfection of the instruments crafted by the Italian masters such as Stradivari or Guarneri del Gesù in the late 17th and early 18th centuries. Nicholas Wade describes an interesting test published in PNAS that aims to determine if this is in fact the case. Violinists attending an international competition were asked to wear goggles (so they could not identify the instruments they were playing) and play three classic (a Guarneri and two Stradivari instruments) and three high-quality modern violins. (It has been a pastime of physicists for years to analyze the sound qualities of old violins and devise construction techniques that could reproduce them in modern instruments.)
...participants in Dr. Fritz’s test could not reliably distinguish the old instruments from modern violins. Only 8 of the 21 subjects chose an old violin as the one they’d like to take home. In the old-to-new comparison, a Stradivarius came in last and a new violin as the most preferred.
The results are clear, even though there was grumbling from other players that the test was performed in a hotel room rather than a concert hall, so projection qualities of the instruments might not have been appreciated.

This reminds me of the numerous blind taste tests, involving hundreds of people, that have shown no correlation between the price of wines costing from $1.50 to $150 and their reported taste. In fact, I've done a posting on the neural correlates of this effect (see also this related posting).

Monday, January 09, 2012

Are the Humanities becoming the “Animal Sciences”?

A theme of my "Biology of Mind" course at the University of Wisconsin, and of the book of that title that I generated from my lecture notes, was that our understanding of almost any aspect of our culture and literature can be enhanced by knowledge of its biological underpinnings. As more and more of the cognitive faculties once assumed to be unique to humans are found in animals (aspects of math, language, tool use, the roots of morality), the citadel of the Humanities has increasingly taken note, and an article by James Gorman points to the consequences of this: an array of courses that bridge animal studies and human-animal interactions.
This spring, freshmen at Harvard can take “Human, Animals and Cyborgs.” Last year Dartmouth offered “Animals and Women in Western Literature: Nags, Bitches and Shrews.” New York University offers “Animals, People and Those in Between.”
The existence of an emerging scholarly community is reflected in the recent formation of the Animals and Society Institute, which lists more than 100 college-level courses that fit under the broad banner of animal studies. Previously ignored ethical issues in the treatment of animals are being scrutinized. The human-animal divide is being eroded as humans increasingly realize that they too are animals, subject to the same natural forces. Any cultural trend that injects just a bit more humility into us humans has to be a good thing.

Friday, January 06, 2012

Structural changes in adult brains caused by acquiring knowledge

A number of reports have appeared over the past 20 years suggesting that the hippocampus, a region of the brain involved in place memory, is larger than normal in London taxi drivers (who must pass a memory test of London streets to become licensed). Woollett and Maguire have now examined this more carefully. Their summary points:
-Trainee taxi drivers in London spend 3–4 years learning the city's layout
-We assessed the brain and memory of trainees before and after this long training
-Those who qualified experienced increased gray matter in posterior hippocampus
-Successful qualification was also associated with changes in memory profile

The last decade has seen a burgeoning of reports associating brain structure with specific skills and traits. Although these cross-sectional studies are informative, cause and effect are impossible to establish without longitudinal investigation of the same individuals before and after an intervention. Several longitudinal studies have been conducted; some involved children or young adults, potentially conflating brain development with learning, most were restricted to the motor domain, and all concerned relatively short timescales (weeks or months). Here, by contrast, we utilized a unique opportunity to study average-IQ adults operating in the real world as they learned, over four years, the complex layout of London's streets while training to become licensed taxi drivers. In those who qualified, acquisition of an internal spatial representation of London was associated with a selective increase in gray matter (GM) volume in their posterior hippocampi and concomitant changes to their memory profile. No structural brain changes were observed in trainees who failed to qualify or control participants. We conclude that specific, enduring, structural brain changes in adult humans can be induced by biologically relevant behaviors engaging higher cognitive functions such as spatial memory, with significance for the “nature versus nurture” debate.

Thursday, January 05, 2012

Can ignorance promote democracy?

It is easy to despair over the continuing decay in the intelligence and rationality of American voters, and to worry about their susceptibility to manipulation by loud voices offering simplistic solutions. Past work has suggested that when many individuals (human voters, flocks of birds, schools of fish) must come together to make a single collective decision, a strongly opinionated minority (tea party, anyone?) might be able to exert disproportionate pressure on the decision-making process. Couzin et al. develop a theoretical model in which uninformed individuals inhibit the influence of a strongly opinionated minority, returning control to the numerical majority, and in experiments on the shiner, a schooling fish, show the utility of their model. In the presence of an intransigent (and not proselytizing) minority, uninformed individuals tend to adopt the opinions of those around them, amplifying the majority opinion and preventing erosion by the intransigent minority. Thus, adding uninformed individuals to a group can facilitate fair representation during the process of information integration. Here is the abstract:
Conflicting interests among group members are common when making collective decisions, yet failure to achieve consensus can be costly. Under these circumstances individuals may be susceptible to manipulation by a strongly opinionated, or extremist, minority. It has previously been argued, for humans and animals, that social groups containing individuals who are uninformed, or exhibit weak preferences, are particularly vulnerable to such manipulative agents. Here, we use theory and experiment to demonstrate that, for a wide range of conditions, a strongly opinionated minority can dictate group choice, but the presence of uninformed individuals spontaneously inhibits this process, returning control to the numerical majority. Our results emphasize the role of uninformed individuals in achieving democratic consensus amid internal group conflict and informational constraints.
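
The mechanism can be played with in a toy opinion-dynamics simulation. The sketch below is not the authors' spatial model; it is a deliberately crude, voter-style caricature with made-up parameters, in which a weakly committed majority faces a strongly committed minority and uninformed agents simply copy the current plurality of a random sample. It is meant only as a scaffold for exploring how the number of uninformed agents shifts which side the group tends to converge on.

```python
# Toy caricature inspired by Couzin et al. (NOT their spatial model; all
# parameters and update rules here are hypothetical): a weakly committed
# majority prefers +1, a strongly committed minority prefers -1, and
# "uninformed" agents have no preference and just copy the plurality of a
# random sample of other agents.

import random

def run_trial(n_majority=6, n_minority=5, n_uninformed=0,
              w_majority=0.4, w_minority=0.9, sample_size=5,
              rounds=2000, rng=None):
    rng = rng or random.Random()
    # informed agents: (preference, conviction); uninformed: (0, 0.0)
    agents = ([(+1, w_majority)] * n_majority +
              [(-1, w_minority)] * n_minority +
              [(0, 0.0)] * n_uninformed)
    votes = [pref if pref != 0 else rng.choice([+1, -1]) for pref, _ in agents]

    for _ in range(rounds):
        i = rng.randrange(len(agents))
        pref, conviction = agents[i]
        if pref != 0 and rng.random() < conviction:
            votes[i] = pref                  # stick with own preference
        else:                                # otherwise conform to sampled plurality
            others = [j for j in range(len(agents)) if j != i]
            sample = rng.sample(others, min(sample_size, len(others)))
            tally = sum(votes[j] for j in sample)
            if tally != 0:
                votes[i] = 1 if tally > 0 else -1
    return 1 if sum(votes) > 0 else -1       # the group's final decision

def majority_win_rate(n_uninformed, trials=300, seed=0):
    rng = random.Random(seed)
    wins = sum(run_trial(n_uninformed=n_uninformed, rng=rng) == 1
               for _ in range(trials))
    return wins / trials

if __name__ == "__main__":
    for n in (0, 5, 10, 20):
        print(f"{n:2d} uninformed agents -> weak majority wins "
              f"{majority_win_rate(n):.0%} of trials")
```

Whether and how strongly the majority regains control in this caricature depends on the conviction gap, the sample size, and the number of uninformed agents, which is in the spirit of the "wide range of conditions" qualifier in the abstract.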

Wednesday, January 04, 2012

Dynamics of improvising together.

In a previous life (when I was a 30-something) I frequently participated in dance improvisation sessions sponsored by either the Univ. of Wisc. Dance Department or local dance groups. One of the basic exercises was 'mirroring,' in which two dancers generate novel movements by spontaneously matching each other. This worked much better when the participants were equal, rather than one being designated the leader. Here is an interesting bit of work by Noy et al. describing why that was the case:
Joint improvisation is the creative action of two or more people without a script or designated leader. Examples include improvisational theater and music, and day-to-day activities such as conversations. In joint improvisation, novel action is created, emerging from the interaction between people. Although central to creative processes and social interaction, joint improvisation remains largely unexplored due to the lack of experimental paradigms. Here we introduce a paradigm based on a theater practice called the mirror game. We measured the hand motions of two people mirroring each other at high temporal and spatial resolution. We focused on expert actors and musicians skilled in joint improvisation. We found that players can jointly create novel complex motion without a designated leader, synchronized to less than 40 ms. In contrast, we found that designating one player as leader deteriorated performance: The follower showed 2–3 Hz oscillation around the leader's smooth trajectory, decreasing synchrony and reducing the range of velocities reached. A mathematical model suggests a mechanism for these observations based on mutual agreement on future motion in mirrored reactive–predictive controllers. This is a step toward understanding the human ability to create novelty by improvising together.
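
The "reactive versus predictive" contrast in the abstract maps onto a standard control-theory picture: a follower that corrects toward where the leader was a fraction of a second ago tends to oscillate around a smooth target once the feedback gain is high enough, while a follower that predicts the present state tracks smoothly. The sketch below is a generic textbook-style illustration of that point with toy parameters of my own choosing; it is not the model from the Noy et al. paper.

```python
# Generic control-theory illustration of why a purely reactive follower
# (acting on delayed error information) jitters around a smooth leader,
# while a follower that predicts the current state tracks smoothly.
# Toy parameters; NOT the model from Noy et al.

import math

DT = 0.001        # integration step (s)
DELAY = 0.12      # reaction delay (s)
GAIN = 18.0       # proportional feedback gain (1/s)
VMAX = 3.0        # human-like hand-speed limit
DURATION = 6.0    # simulated seconds

def leader(t):
    """Smooth leader trajectory: a slow 0.25 Hz sinusoid."""
    return math.sin(2 * math.pi * 0.25 * t)

def leader_velocity(t):
    return 2 * math.pi * 0.25 * math.cos(2 * math.pi * 0.25 * t)

def jitter(predictive):
    """Simulate the follower and return the roughness of its velocity."""
    steps = int(DURATION / DT)
    delay_steps = int(DELAY / DT)
    xs = [0.0]                         # follower position history
    velocities = []
    for k in range(steps):
        t = k * DT
        if predictive:
            # forward model: estimate where the leader (and self) are *now*
            est_leader = leader(t - DELAY) + DELAY * leader_velocity(t - DELAY)
            est_self = xs[-1]
        else:
            # reactive: correct toward where the error was DELAY seconds ago
            est_leader = leader(t - DELAY)
            est_self = xs[max(0, k - delay_steps)]
        v = GAIN * (est_leader - est_self)
        v = max(-VMAX, min(VMAX, v))   # saturate at a plausible hand speed
        xs.append(xs[-1] + v * DT)
        velocities.append(v)
    dv = [b - a for a, b in zip(velocities, velocities[1:])]
    mean = sum(dv) / len(dv)
    return (sum((d - mean) ** 2 for d in dv) / len(dv)) ** 0.5

if __name__ == "__main__":
    print("reactive follower velocity roughness:  ", round(jitter(False), 4))
    print("predictive follower velocity roughness:", round(jitter(True), 4))
```

With a reaction delay around a tenth of a second, the reactive loop's oscillation falls in the low single-digit Hz range, qualitatively like the follower jitter the authors report, while adding the prediction step smooths the tracking out.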

Tuesday, January 03, 2012

Have a scary memory? Erase it with prozac plus psychotherapy.

Numerous clinical studies by now have shown that a combination of antidepressant medication and psychological treatment works better for mood disorders than either therapy on its own. Karpova et al. have now ferreted out mechanisms that might underlie this fact by investigating the effect of fluoxetine (Prozac) on fear-conditioned memories in mice. Fluoxetine accelerated the extinction of fear responses and, together with extinction training, disrupted fear renewal and fear reinstatement; neither treatment by itself produced long-term fear extinction. Their results suggest that fluoxetine reactivates plasticity within the amygdala, which, in combination with extinction training, can lead to the erasure of conditioned fear responses. Here is their abstract:
Antidepressant drugs and psychotherapy combined are more effective in treating mood disorders than either treatment alone, but the neurobiological basis of this interaction is unknown. To investigate how antidepressants influence the response of mood-related systems to behavioral experience, we used a fear-conditioning and extinction paradigm in mice. Combining extinction training with chronic fluoxetine, but neither treatment alone, induced an enduring loss of conditioned fear memory in adult animals. Fluoxetine treatment increased synaptic plasticity, converted the fear memory circuitry to a more immature state, and acted through local brain-derived neurotrophic factor. Fluoxetine-induced plasticity may allow fear erasure by extinction-guided remodeling of the memory circuitry. Thus, the pharmacological effects of antidepressants need to be combined with psychological rehabilitation to reorganize networks rendered more plastic by the drug treatment.

Monday, January 02, 2012

Our genes and our behavior, a paradigm shift in our understanding

I thought it would be interesting to pass on two recent items I've come across. The first is the paper by Schultz et al. (also see the commentary by Wade) that challenges some of the leading theories of social behavior (which stress environment, larger group sizes forcing larger and more intelligent brains, stepwise progression to complexity, etc.) to argue that genetic determinants force primate species, including ours, into whatever social structures they inherit.

Compare this with the proposed article from Behavioral and Brain Sciences, "Behavior genetics and post genomics," by Charney (PDF download here), which points to the much more tortuous road from genotype to phenotype. Here is his abstract:
The science of genetics is undergoing a paradigm shift. Recent discoveries, including the activity of retrotransposons, the extent of copy number variations, somatic and chromosomal mosaicism, and the nature of the epigenome as a regulator of DNA expressivity, are challenging a series of dogmas concerning the nature of the genome and the relationship between genotype and phenotype. DNA, once held to be the unchanging template of heredity, now appears subject to a good deal of environmental change; considered to be identical in all cells and tissues of the body, there is growing evidence that somatic mosaicism is the normal human condition; and treated as the sole biological agent of heritability, we now know that the epigenome, which regulates gene expressivity, can be inherited via the germline. These developments are particularly significant for behavior genetics for at least three reasons: First, these phenomena appear to be particularly prevalent in the human brain, and likely are involved in much of human behavior; second, they have important implications for the validity of heritability and gene association studies, the methodologies that largely define the discipline of behavior genetics; and third, they appear to play a critical role in development during the perinatal period, and in enabling phenotypic plasticity in offspring in particular. I examine one of the central claims to emerge from the use of heritability studies in the behavioral sciences, the principle of "minimal shared maternal effects," in light of the growing awareness that the maternal perinatal environment is a critical venue for the exercise of adaptive phenotypic plasticity. This consideration has important implications for both developmental and evolutionary biology.