Friday, December 31, 2010

The wolfpack effect.

An interesting piece of work from Gao et al. showing how the perception of animacy influences our interactive behavior.

Imagine a pack of predators stalking their prey. The predators may not always move directly toward their target (e.g., when circling around it), but they may be consistently facing toward it. The human visual system appears to be extremely sensitive to such situations, even in displays involving simple shapes. We demonstrate this by introducing the wolfpack effect, which is found when several randomly moving, oriented shapes (darts, or discs with “eyes”) consistently point toward a moving disc. Despite the randomness of the shapes’ movement, they seem to interact with the disc—as if they are collectively pursuing it. This impairs performance in interactive tasks (including detection of actual pursuit), and observers selectively avoid such shapes when moving a disc through the display themselves. These and other results reveal that the wolfpack effect is a novel “social” cue to perceived animacy. And, whereas previous work has focused on the causes of perceived animacy, these results demonstrate its effects, showing how it irresistibly and implicitly shapes visual performance and interactive behavior.


Sample display (a) and manipulations (b–e) from the first experiment. The task was to detect whether one shape (the wolf) was chasing another (the sheep). Arrows indicate motion and were not present in the displays. In the wolfpack condition (a, b), all darts stayed oriented toward the task-irrelevant green square, regardless of their motion directions. This condition generated the wolfpack effect. In the perpendicular condition (c), each dart was always oriented orthogonally to the square. In the match condition (d), each dart was always oriented in the direction in which it was moving at that moment. And in the disc condition (e), the objects had no visible orientation.

A contrarian view of energy prospects.

John Tierney describes a wager he made in 2005 with Matthew Simmons, who bet $5,000 that the price of oil, then about $65 a barrel, would more than triple in the next five years, so that the average price of oil over the course of 2010 would be at least $200 a barrel in 2005 dollars....The average for 2010 has been just under $80, which is the equivalent of about $71 in 2005 dollars — a little higher than the $65 at the time of the bet, but far below the $200 threshold set by Mr. Simmons. (Tierney's mentor was the economist Julian L. Simon, a leader of the Cornucopians, optimists who believed there would always be abundant supplies of energy and other resources. Simon won a bet in the 1980s with Paul Ehrlich and two natural resources experts over the prices of five metals.) What happened to the grim predictions of declining oil reserves and rising prices? Perhaps there has been a temporary respite (which unfortunately will not help alternative energy efforts):

Giant new oil fields have been discovered off the coasts of Africa and Brazil. The new oil sands projects in Canada now supply more oil to the United States than Saudi Arabia does. Oil production in the United States increased last year, and the Department of Energy projects further increases over the next two decades...The really good news is the discovery of vast quantities of natural gas. It’s now selling for less than half of what it was five years ago. There’s so much available that the Energy Department is predicting low prices for gas and electricity for the next quarter-century. Lobbyists for wind farms, once again, have been telling Washington that the “sustainable energy” industry can’t sustain itself without further subsidies...As gas replaces dirtier fossil fuels, the rise in greenhouse gas emissions will be tempered, according to the Department of Energy. It projects that no new coal power plants will be built, and that the level of carbon dioxide emissions in the United States will remain below the rate of 2005 for the next 15 years even if no new restrictions are imposed.

Maybe something unexpected will change these happy trends, but for now I’d say that Julian Simon’s advice remains as good as ever. You can always make news with doomsday predictions, but you can usually make money betting against them.
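The inflation adjustment behind Tierney's numbers is easy to check. A minimal sketch, using approximate CPI-U annual averages (the exact price index Tierney used isn't stated, so these values are assumptions):

```python
# Deflate the 2010 nominal oil price into 2005 dollars via the CPI ratio.
# CPI-U annual averages (approximate): 2005 ~195.3, 2010 ~218.1.
cpi_2005 = 195.3
cpi_2010 = 218.1

nominal_2010 = 80.0  # average 2010 price, dollars per barrel
real_in_2005_dollars = nominal_2010 * cpi_2005 / cpi_2010

print(round(real_in_2005_dollars, 2))  # roughly $71 -- far below the $200 threshold
```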

Thursday, December 30, 2010

New brain circuits form online during rapid learning

Shtyrov et al. show that after just 14 minutes of learning exposure to a new word, presentations of this word cause increased responses in the language cortex, reflecting rapid mapping of new word forms onto neural representations.


Humans are unique in developing large lexicons as their communication tool. To achieve this, they are able to learn new words rapidly. However, neural bases of this rapid learning, which may be an expression of a more general cognitive mechanism, are not yet understood. To address this, we exposed our subjects to familiar words and novel spoken stimuli in a short passive perceptual learning session and compared automatic brain responses to these items throughout the learning exposure. Initially, we found enhanced activity for known words, indexing the ignition of their underlying memory traces. However, just after 14 min of learning exposure, the novel items exhibited a significant increase in response magnitude matching in size with that to real words. This activation increase, as we would like to propose, reflects rapid mapping of new word forms onto neural representations. Similar to familiar words, the neural activity subserving rapid learning of new word forms was generated in the left-perisylvian language cortex, especially anterior superior-temporal areas. This first report of a neural correlate of rapid learning suggests that our brain may effectively form new neuronal circuits online as it gets exposed to novel patterns in the sensory input. Understanding such fast learning is key to the neurobiological explanation of the human language faculty and learning mechanisms in general.

Paying the price for a longer life.

A brief article by Bakalar notes a study by Crimmins et al.

...people live longer not because they are less likely to get sick, but because they survive longer with disease....As a result, a 20-year-old man today can expect to live about a year longer than a 20-year-old in 1998, but will spend 1.2 years more with a disease, and 2 more years unable to function normally.

Wednesday, December 29, 2010

Placebo pills work without deception

Kaptchuk et al. show that placebos administered without deception may be an effective treatment for irritable bowel syndrome. From Bakalar's summary:

They explained to all that a placebo was an inert substance, like a sugar pill, that had been found to “produce significant improvement in I.B.S. symptoms through mind-body self-healing processes.” The patients, all treated with the same attention, warmth and empathy by the researchers, were then randomly assigned to get the pill or not...At the end of three weeks, they tested all the patients with questionnaires assessing the level of their pain and other symptoms. The patients given the sugar pill — in a bottle clearly marked “placebo” — reported significantly better pain relief and greater reduction in the severity of other symptoms than those who got no pill.
A weakness of the study is that because the outcome measure is so subjective, placebo patients may have exaggerated their improvement to please the researchers.

English and Mandarin speakers think about time differently.

I pass on this abstract from Boroditsky et al., and a few clips from the article:

Time is a fundamental domain of experience. In this paper we ask whether aspects of language and culture affect how people think about this domain. Specifically, we consider whether English and Mandarin speakers think about time differently. We review all of the available evidence both for and against this hypothesis, and report new data that further support and refine it. The results demonstrate that English and Mandarin speakers do think about time differently. As predicted by patterns in language, Mandarin speakers are more likely than English speakers to think about time vertically (with earlier time-points above and later time-points below).
From their text:
Both English and Mandarin use horizontal front/back spatial terms to talk about time. In English, we can look forward to the good times ahead, or think back to travails past and be glad they are behind us. In Mandarin, front/back spatial metaphors for time are also common. For example,  Mandarin speakers use the spatial morphemes qián (‘‘front”) and hòu (‘‘back”) to talk about time...Unlike English speakers, Mandarin speakers also systematically and frequently use vertical metaphors. The spatial morphemes shàng (‘‘up”) and xià (‘‘down”) are used to talk about the order of events, weeks, months, semesters, and more. Earlier events are said to be shàng or ‘‘up”, and later events are said to be xià or ‘‘down”.  For example, “shàng ge yuè” is last (or previous) month, and “xià ge yuè” is next (or following) month...This difference between the two languages offers the prediction that Mandarin speakers would be more likely to conceive of time vertically than would English speakers.

In the experimental paradigm, participants made temporal judgments following horizontal or vertical spatial primes. On each trial, participants first answered several questions about the spatial relationship between two objects (arranged either horizontally or vertically on a computer screen), and then answered a question about time (e.g., March comes earlier than April; TRUE or FALSE). Participants’ response times to the target question about time following either the horizontal or vertical primes were the measure of interest.

The basic finding was that both groups organize time on the left-to-right axis with earlier events on the left, a pattern consistent with writing direction. But, Mandarin speakers also show evidence of vertical representations of time, with earlier events represented further up. English speakers showed no evidence of such a representation.

Tuesday, December 28, 2010

You've got to have (150) friends...

The post title is the title of a brief essay by Robin Dunbar, who is best known for his work documenting, for a large number of animal species, the relationship between brain size and social group size (they get larger together). His curve predicts that the optimal group size for humans is about 150. That is what is observed across a large number of hunter-gatherer and aboriginal societies worldwide, and (his article contends) it is an evolved biological/psychological limit that operates even in a world of Facebook that permits thousands of "friends." Some clips:

The developers at Facebook overlooked one of the crucial components in the complicated business of how we create relationships: our minds...Put simply, our minds are not designed to allow us to have more than a very limited number of people in our social world. The emotional and psychological investments that a close relationship requires are considerable, and the emotional capital we have available is limited...Indeed, no matter what Facebook allows us to do, I have found that most of us can maintain only around 150 meaningful relationships, online and off — what has become known as Dunbar’s number. Yes, you can “friend” 500, 1,000, even 5,000 people with your Facebook page, but all save the core 150 are mere voyeurs looking into your daily life — a fact incorporated into the new social networking site Path, which limits the number of friends you can have to 50.

Until relatively recently, almost everyone on earth lived in small, rural, densely interconnected communities, where our 150 friends all knew one another...the social and economic mobility of the past century has worn away at that interconnectedness. As we move around the country and across continents, we collect disparate pockets of friends, so that our list of 150 consists of a half-dozen subsets of people who barely know of one another’s existence...as we move around, though, we can lose touch with even our closest friends. Emotional closeness declines by around 15 percent a year in the absence of face-to-face contact, so that in five years someone can go from being an intimate acquaintance to the most distant outer layer of your 150 friends.

Facebook and other social networking sites allow us to keep up with friendships that would otherwise rapidly wither away. And they do something else that’s probably more important, if much less obvious: they allow us to reintegrate our networks so that, rather than having several disconnected subsets of friends, we can rebuild, albeit virtually, the kind of old rural communities where everyone knew everyone else. Welcome to the electronic village.
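Dunbar's 15-percent-a-year decay figure compounds quickly, which is why five years without contact is enough to push an intimate to the outermost layer. A minimal sketch of the arithmetic:

```python
# Emotional closeness falling ~15% per year in the absence of
# face-to-face contact (an illustrative compounding, not Dunbar's data).
closeness = 1.0
for year in range(5):
    closeness *= 0.85          # retain 85% each year

print(round(closeness, 2))     # about 0.44 of the original closeness after 5 years
```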

The science of cities.

I've been meaning to point out an interesting article by Jonah Lehrer that focuses on the work of 70-year-old physicist Geoffrey West, who decided to turn his attention to discerning whether the cities that contain an ever-increasing fraction of the world's population (82% of the people in the U.S. live in cities) follow discernible patterns and laws. Some edited clips:

Knowing the population of a metropolitan area in a given country allows one to estimate, with approximately 85 percent accuracy, its average income and the dimensions of its sewer system. These are the laws, they say, that automatically emerge whenever people “agglomerate,” cramming themselves into apartment buildings and subway cars. It doesn’t matter if the place is Manhattan or Manhattan, Kan.: the urban patterns remain the same...the real purpose of cities, and the reason cities keep on growing, is their ability to create massive economies of scale, just as big animals do. After analyzing the first sets of city data — the physicists began with infrastructure and consumption statistics — they concluded that cities looked a lot like elephants. In city after city, the indicators of urban “metabolism,” like the number of gas stations or the total surface area of roads, showed that when a city doubles in size, it requires an increase in resources of only 85 percent...This straightforward observation has some surprising implications. It suggests, for instance, that modern cities are the real centers of sustainability. According to the data, people who live in densely populated places require less heat in the winter and need fewer miles of asphalt per capita.
People, however, do not go to cities because cities are more efficient; they go because there are more social and commercial interactions. West and colleagues were able to quantify Jane Jacobs's points in her famous book “The Death and Life of Great American Cities.”
...whenever a city doubles in size, every measure of economic activity, from construction spending to the amount of bank deposits, increases by approximately 15 percent per capita (It also experiences a 15 percent per capita increase in violent crimes, traffic and AIDS cases). It doesn’t matter how big the city is; the law remains the same...everything that’s related to the social network goes up by the same percentage.
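The scaling West describes is a power law, Y = Y0·N^β, with β below 1 for infrastructure and above 1 for socioeconomic outputs. A minimal sketch using the commonly quoted exponents (~0.85 and ~1.15, of which the article's "85 percent" and "15 percent" figures are rounded reflections):

```python
def growth_on_doubling(beta):
    """Factor by which a quantity grows when population doubles, given Y ~ N**beta."""
    return 2 ** beta

infra = growth_on_doubling(0.85)  # sublinear: gas stations, road surface, sewers
econ = growth_on_doubling(1.15)   # superlinear: wages, bank deposits, crime

print(round(infra, 2))  # doubling the city needs only ~80% more infrastructure
print(round(econ, 2))   # while economic output more than doubles
```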

Monday, December 27, 2010

Listening to your heart

Dunn et al. try to evaluate how sensing feedback from the body influences thought and feeling. Some edited clips of background, and their results:

Some metaphorical expressions that are used daily, such as “brokenhearted” or “gut feelings,” reflect the common belief that feelings and cognitions are partly grounded in bodily responses. This idea is reflected in early philosophical writings about embodiment (e.g., Descartes, 1649/1989) and was introduced to experimental psychology by William James (1884), who asserted that perception of changes in the body “as they occur is the emotion” (pp. 189–190). Since then, there has been considerable debate about the extent to which feelings and cognitions are in fact embodied. Much of this discussion has focused on emotion experience and decision making. Schachter and Singer modified Jamesian theory to argue that emotion experience is a product of the cognitive appraisal of bodily arousal. The somatic marker hypothesis of Damasio proposes that emotional biasing signals emerging from the body influence intuitive decision making. These models remain controversial, and critics argue that bodily responses occur relatively late in the information-processing chain and are therefore best viewed as a consequence, rather than the cause, of cognitive-affective activity.

...A central but untested prediction of many of these proposals is that how well individuals can perceive subtle bodily changes (interoception) determines the strength of the relationship between bodily reactions and cognitive-affective processing. In a first study we demonstrated that the more accurately participants could track their heartbeat, the stronger the observed link between their heart rate reactions and their subjective arousal (but not valence) ratings of emotional images. (In other words, the more strongly these autonomic changes are felt, the more they are associated with arousal experience.) These results offer strong support for Jamesian bodily feedback theories.

In a second study, we found that increasing interoception ability either helped or hindered adaptive intuitive decision making, depending on whether the anticipatory bodily signals generated favored advantageous or disadvantageous choices. These findings identify both the generation and the perception of bodily responses as pivotal sources of variability in emotion experience and intuition, and offer strong supporting evidence for bodily feedback theories, suggesting that cognitive-affective processing does in significant part relate to “following the heart.” Our findings agree with those of other studies showing that an absence of emotion following frontal head injury can in some circumstances lead to superior decision making and with claims that elevated interoceptive awareness may maintain conditions such as anxiety.

Deric's MindBlog for smartphones

I use Google's Blogger to publish MindBlog, and they have just added a nice new tweak.

We realize that more and more users are accessing the web on smartphones, and we want to make sure that blogs still look nice when viewed on these smaller screens. We’ve put a lot of work into creating a mobile version of BlogSpot, which will automatically detect if a blog is accessed on a smartphone and then display a mobile-optimized version.
I have enabled this feature, so now if you go to mindblog.dericbownds.net using your iPhone or other smartphone,  you see this new format. 

Friday, December 24, 2010

The dark side of inflammation

Couzin-Frankel does an interesting piece in the News section of the Dec. 17 issue of Science, consonant with my opinion that inflammatory processes are one of the main issues in aging. The abstract:

Not long ago, inflammation had a clear role: It was a sidekick to the body's healers, briefly setting in as immune cells rebuilt tissue damaged by trauma or infection. Today, that's an afterthought. Inflammation has hit the big time. Over the past decade, it has become widely accepted that inflammation is a driving force behind chronic diseases that will kill nearly all of us. Cancer. Diabetes and obesity. Alzheimer's disease. Atherosclerosis. Here, inflammation wears a grim mask, shedding its redeeming features and making sick people sicker.
A growing body of evidence suggests that C-reactive protein (CRP), a molecular marker for inflammation, may be as crucial as cholesterol in assessing risk of heart attack. Macrophages, the white blood cells that are a hallmark of inflammation, appear around fatty plaques that build up in the arteries in atherosclerosis, infiltrate fat tissue in obesity, surround cancer cells to stimulate circulation and coax them along, help to kill neurons in neurodegenerative diseases such as Alzheimer's and Parkinson's, and promote two components of type 2 diabetes: insulin resistance and the death of the pancreatic beta cells that produce insulin. Anti-inflammatory drugs (which suppress the action of proinflammatory cytokines released by immune cells) have been shown in several cases to retard disease progression.

Google's body browser

I've been enjoying playing a bit with Google's new 3-D body browser, which lets you proceed through body layers (muscles, organs, bones, circulatory and nervous systems), clicking on a part or area to see it outlined and identified. It is still very much under development, with the nervous system and brain yet to be fully implemented. When it's further along, it should be a tremendously convenient look-up tool. You need a web browser that supports WebGL, such as Chrome or Firefox 4 Beta. You can share the exact scene you are viewing by copying and pasting the URL (the sciatic nerve, for example).

Thursday, December 23, 2010

Why diets fail.

Pankevich et al. offer observations that might explain why weight lost during an effective diet is usually regained - dieting makes the brain more sensitive to stress and the rewards of high-fat, high-calorie treats. These brain changes last long after the diet is over and prod otherwise healthy individuals to binge eat under pressure. Part of the abstract:

Long-term weight management by dieting has a high failure rate. Pharmacological targets have focused on appetite reduction, although less is understood as to the potential contributions of the stress state during dieting in long-term behavioral modification. In a mouse model of moderate caloric restriction in which a 10–15% weight loss similar to human dieting is produced, we examined physiological and behavioral stress measures. After 3 weeks of restriction, mice showed significant increases in immobile time in a tail suspension test and stress-induced corticosterone levels. Increased stress was associated with brain region-specific alterations of corticotropin-releasing factor expression and promoter methylation, changes that were not normalized with refeeding. Similar outcomes were produced by high-fat diet withdrawal, an additional component of human dieting. In examination of long-term behavioral consequences, previously restricted mice showed a significant increase in binge eating of a palatable high-fat food during stress exposure...In humans, such changes would be expected to reduce treatment success by promoting behaviors resulting in weight regain, and suggest that management of stress during dieting may be beneficial in long-term maintenance.

Sigh....

Coming upon cheerful news like the following makes me want to dig a hole, get in it, and pull it in after myself....

40 Percent Of Americans Still Believe In Creationism
Creationists And Climate Deniers Take On Teaching Climate Science In Schools

Wednesday, December 22, 2010

Regulation of distress better with thicker prefrontal cortex.

Ventral prefrontal cortex activity correlates with suppression of amygdala reactivity to emotionally challenging stimuli, presumably reflecting higher-order cognitive evaluation of negative stimuli being brought into play. (The amygdala influences a broad range of physiological and behavioral responses associated with emotion, with the left amygdala being particularly responsive to negative facial expressions.) Foland-Ross et al. have now shown that prefrontal grey matter (the cortical layer containing nerve cell bodies) thickness inversely correlates with amygdala reactivity: greater ventromedial prefrontal cortical gray matter thickness was associated with greater reduction of activation in the left amygdala during affect labeling, a cognitive task that had previously been shown to dampen amygdala response.

In other words, if you have a thicker layer of prefrontal nerve cells, you might be less prone to emotional upset from unpleasant stimuli.

Out of our brains - extended mind continues

As a followup to my Nov. 3 post on critiques of Andy Clark's extended mind ideas (which drew 20 comments) I wanted to pass on this further Clark commentary and a sequel, pointed out to me by a loyal MindBlog reader, in which Clark tries to clarify his ideas.

Tuesday, December 21, 2010

Reducing depression with light stimulation of medial prefrontal cortex

In mice, to be sure.....Brain imaging and direct brain stimulation have implicated the prefrontal cortex in clinical depression in humans and mice, and now a study by Covington et al. confirms in both that suppression of gene activities associated with nerve activity in the medial prefrontal cortex is associated with depressive behavior. In mice they used a genetic trick to introduce light-activated channel proteins into nerve cell membranes in this area, and found that light stimulation that enhanced nerve activity had potent antidepressant-like effects. Here is their abstract:

Brain stimulation and imaging studies in humans have highlighted a key role for the prefrontal cortex in clinical depression; however, it remains unknown whether excitation or inhibition of prefrontal cortical neuronal activity is associated with antidepressant responses. Here, we examined cellular indicators of functional activity, including the immediate early genes (IEGs) zif268 (egr1), c-fos, and arc, in the prefrontal cortex of clinically depressed humans obtained postmortem. We also examined these genes in the ventral portion of the medial prefrontal cortex (mPFC) of mice after chronic social defeat stress, a mouse model of depression. In addition, we used viral vectors to overexpress channel rhodopsin 2 (a light-activated cation channel) in mouse mPFC to optogenetically drive "burst" patterns of cortical firing in vivo and examine the behavioral consequences. Prefrontal cortical tissue derived from clinically depressed humans displayed significant reductions in IEG expression, consistent with a deficit in neuronal activity within this brain region. Mice subjected to chronic social defeat stress exhibited similar reductions in levels of IEG expression in mPFC. Interestingly, some of these changes were not observed in defeated mice that escape the deleterious consequences of the stress, i.e., resilient animals. In those mice that expressed a strong depressive-like phenotype, i.e., susceptible animals, optogenetic stimulation of mPFC exerted potent antidepressant-like effects, without affecting general locomotor activity, anxiety-like behaviors, or social memory. These results indicate that the activity of the mPFC is a key determinant of depression-like behavior, as well as antidepressant responses.

Time, space, and number - evolved brain computations

Dehaene and Brannon introduce a special (and open access) issue of Trends in Cognitive Science on Time, space, and number.

What do the representations of space, time and number share that might justify their joint presence in a special issue of TICS? In his Critique of Pure Reason, Immanuel Kant famously argued that they provide ‘a priori intuitions’ that precede and structure how humans experience the environment. Indeed, these concepts are so basic to any understanding of the external world that it is hard to imagine how any animal species could survive without having mechanisms for spatial navigation, temporal orienting (e.g. time-stamped memories) and elementary numerical computations (e.g. choosing the food patch with the largest expected return). In the course of their evolution, humans and many other animal species might have internalized basic codes and operations that are isomorphic to the physical and arithmetic laws that govern the interaction of objects in the external world. The articles in this special issue all support this point of view: from grid cells to number neurons, the richness and variety of mechanisms by which animals and humans, including infants, can represent the dimensions of space, time and number is bewildering and suggests evolutionary processes and neural mechanisms by which Kantian intuitions might universally arise.

Monday, December 20, 2010

Culturomics

This is a bit mind-blowing. Here is the New York Times article, here is the Science summary by Bohannon, and here is the abstract and the article PDF of the collective effort by Google and academic researchers (including Steven Pinker, Martin Nowak, etc.), and here is the PDF of their supplement giving the details.  The abstract:

We constructed a corpus of digitized texts containing about 4% of all books (5,195,769 digitized books) ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of "culturomics", focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. "Culturomics" extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.
Clips from the Bohannon review:
The researchers have revealed 500,000 English words missed by all dictionaries, tracked the rise and fall of ideologies and famous people, and, perhaps most provocatively, identified possible cases of political suppression unknown to historians...tracking the ebb and flow of “Sigmund Freud” and “Charles Darwin” reveals an ongoing intellectual shift: Freud has been losing ground, and Darwin finally overtook him in 2005...the amount of data that Google Books offers...currently includes 2 trillion words from 15 million books, about 12% of every book in every language published since the Gutenberg Bible in 1450. By comparison, the human genome is a mere 3-billion-letter poem...the size of the English language has nearly doubled over the past century, to more than 1 million words. And vocabulary seems to be growing faster now than ever before.
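Comparisons like "Darwin finally overtook Freud in 2005" reduce to yearly match counts normalized by the total words printed that year. A toy sketch of that computation (the counts below are invented for illustration, not drawn from the actual corpus):

```python
# Relative frequency of a 1-gram = its match count / total words printed that year.
match_counts = {"Darwin": {1900: 80_000, 2005: 150_000}}
totals = {1900: 2_000_000_000, 2005: 12_000_000_000}

def rel_freq(word, year):
    """Occurrences of `word` per word printed in `year`."""
    return match_counts[word][year] / totals[year]

for year in (1900, 2005):
    print(year, rel_freq("Darwin", year))
```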

"Me-We" - Obama and the passions

A great essay by Mark Lilla (Humanities Professor at Columbia University) in the NYTimes Sunday magazine. The conclusion:

...the shape of American politics over the past half-century has been determined by two great waves of passion: the first running from the Kennedy and Johnson administrations through the ’70s, the second running from the Reagan administration to the departure of George W. Bush. What dominated during the first wave was excitement about a New Frontier, hope for a just and Great Society, fear of nuclear war, a desire for greater social freedom — and confidence that government could accomplish much. In the next era the same passions, nearly as intense, would be successfully redirected by Ronald Reagan. Now the excitement was about privatization, hope was invested in economic growth, fears centered on the family and the greatest desire was for freedom from government.

The Great Recession and the Tea Party’s ire, directed at Democrats and Republicans alike, suggest that this second political dispensation is coming to an end and that Americans’ passions are ready to be redirected once again. Having been dealt a bad hand, President Obama may have only a slim chance of doing that, but he has absolutely none if he limits himself to appealing to people’s interests. That’s not been the American experience of change. In our politics, history doesn’t happen when a leader makes an argument, or even strikes a pose. It happens when he strikes a chord. And you don’t need charts and figures to do that; in fact they get in the way. You only need two words.

George Plimpton used to tell the story of Muhammad Ali going to Harvard one year to give an address. At the end of his speech, someone called out to him, “Give us a poem!” He paused, stretched out his arms to the audience and delivered what Plimpton said was the shortest poem in the English language:

ME [pause]

WE!

Friday, December 17, 2010

Female genes respond to winners versus losers

In a cichlid fish, that is, but I'll bet it's happening in humans too. When the male a female has chosen as a mate wins a male-male competition, genes in her brain's reproductive centers are activated; when he loses, genes in anxiety-like response centers are activated instead. From Russ Fernald's group:

Females should be choosier than males about prospective mates because of the high costs of inappropriate mating decisions. Both theoretical and empirical studies have identified factors likely to influence female mate choices. However, male–male social interactions also can affect mating decisions, because information about a potential mate can trigger changes in female reproductive physiology. We asked how social information about a preferred male influenced neural activity in females, using immediate early gene (IEG) expression as a proxy for brain activity. A gravid female cichlid fish (Astatotilapia burtoni) chose between two socially equivalent males and then saw fights between these two males in which her preferred male either won or lost. We measured IEG expression levels in several brain nuclei including those in the vertebrate social behavior network (SBN), a collection of brain nuclei known to be important in social behavior. When the female saw her preferred male win a fight, SBN nuclei associated with reproduction were activated, but when she saw her preferred male lose a fight, the lateral septum, a nucleus associated with anxiety, was activated instead. Thus social information alone, independent of actual social interactions, activates specific brain regions that differ significantly depending on what the female sees. In female brains, reproductive centers are activated when she chooses a winner, and anxiety-like response centers are activated when she chooses a loser. These experiments assessing the role of mate-choice information on the brain using a paradigm of successive presentations of mate information suggest ways to understand the consequences of social information on animals using IEG expression.

Thursday, December 16, 2010

Why women apologize more than men.

Schumann and Ross note that men apologize less often not because they have more fragile egos, but because they have a higher threshold for what constitutes offensive behavior.

Despite wide acceptance of the stereotype that women apologize more readily than men, there is little systematic evidence to support this stereotype or its supposed bases (e.g., men’s fragile egos). We designed two studies to examine whether gender differences in apology behavior exist and, if so, why. In Study 1, participants reported in daily diaries all offenses they committed or experienced and whether an apology had been offered. Women reported offering more apologies than men, but they also reported committing more offenses. There was no gender difference in the proportion of offenses that prompted apologies. This finding suggests that men apologize less frequently than women because they have a higher threshold for what constitutes offensive behavior. In Study 2, we tested this threshold hypothesis by asking participants to evaluate both imaginary and recalled offenses. As predicted, men rated the offenses as less severe than women did. These different ratings of severity predicted both judgments of whether an apology was deserved and actual apology behavior.

Wednesday, December 15, 2010

Amazing moving graphic - history of well-being

This was pointed to in David Brooks's NY Times column yesterday.

Turning back aging - The 91-year-old athlete

I've been meaning to pass on some nuggets from an NYTimes Magazine article by Bruce Grierson, which tells the story of Olga Kotelko, a remarkable 91-year-old woman who has shattered many world records in her Masters Competition age group. Grierson references a number of studies and observations on aging that I was unaware of, particularly the work of muscle physiologist Tanja Taivassalo. This first quote below gave me a bit of pause (since I am 68 years old, and in extremely good shape)...

We start losing wind in our 40s and muscle tone in our 50s. Things go downhill slowly until around age 75, when something alarming tends to happen...“There’s a slide I show in my physical-activity-and-aging class,” Taivassalo says. “You see a shirtless fellow holding barbells, but I cover his face. I ask the students how old they think he is. I mean, he could be 25. He’s just ripped. Turns out he’s 67. And then in the next slide there’s the same man at 78, in the same pose. It’s very clear he’s lost almost half of his muscle mass, even though he’s continued to work out. So there’s something going on.” But no one knows exactly what. Muscle fibers ought in theory to keep responding to training. But they don’t. Something is applying the brakes.
This seems not to be happening in Olga Kotelko, and a number of studies are looking at processes that seem to stall the natural processes of aging.
Exercise has been shown to add between six and seven years to a life span...Two recent studies involving middle-aged runners suggest that the serious mileage they were putting in, over years and years, had protected them at the chromosomal level. It appears that exercise may stimulate the production of telomerase, an enzyme that maintains and repairs the little caps on the ends of chromosomes that keep genetic information intact when cells divide. That may explain why older athletes aren’t just more cardiovascularly fit than their sedentary counterparts — they are more free of age-related illness in general.

Mark Tarnopolsky (professor of pediatrics and medicine at McMaster University in Hamilton) maintains that exercise in particular seems to activate a muscle stem cell called a satellite cell. With the infusion of these squeaky-clean cells into the system, the mitochondria seem to rejuvenate. (The phenomenon has been called “gene shifting.”) If this is right, exercise in older adults can roll back the odometer. Tarnopolsky has shown that after six months of twice-weekly strength training, the biochemical, physiological and genetic signature of older muscle is “turned back” nearly 15 or 20 years.

Tuesday, December 14, 2010

The truth wears off...

Jonah Lehrer has a fascinating article in the recent New Yorker which describes in detail a disturbing trend:

...all sorts of well-established, multiply confirmed findings have started to look increasingly uncertain. It's as if our facts were losing their truth: claims that have been enshrined in textbooks are suddenly unprovable. This phenomenon doesn't yet have an official name, but it's occurring across a wide range of fields, from psychology to ecology. In the field of medicine the phenomenon seems extremely widespread, affecting not only antipsychotics but also therapies from cardiac stents to Vitamin E and antidepressants... a forthcoming analysis demonstrates that the efficacy of antidepressants has gone down as much as three-fold in recent decades.
Lehrer tells the story of a number of serious scientists who have reported statistically significant effects with appropriate controls, only to find them disappear over time - seemingly iron-clad results that on repetition faded away. One example is "verbal overshadowing": subjects shown a face and asked to describe it were much less likely to recognize the face when shown it later than those who had simply looked at it. Another theory that has fallen apart is the claim that females use symmetry as a proxy for the reproductive fitness of males. A 2005 study found that of the 50 most cited clinical research studies (with randomized controlled trials), almost half were subsequently not replicated or had their effects significantly downgraded, even though these studies had guided clinical practice (hormone replacement therapy for menopausal women, low-dose aspirin to prevent heart attacks and strokes).

It is not entirely clear why this is happening, and several possibilities are mentioned:
-statistical regression to the mean, an early statistical fluke gets canceled out.
-publication bias on the part of journals, who prefer positive data over null results
-selective reporting, or significance chasing.  A review found that over 90% of psychological studies reported statistically significant data (i.e., results whose odds of arising by chance are less than 5%) confirming the effect they were looking for. (One classic example of selective reporting concerns testing acupuncture in Asian countries - largely positive data - versus Western countries - less than half confirming. See today's other posting on MindBlog).

The problem of selective reporting doesn't necessarily derive from dishonesty, but from the fundamental cognitive flaw that we like proving ourselves right and hate being wrong.  The decline effect may actually be a decline of illusion.
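The interplay of publication bias and regression to the mean is easy to see in a toy simulation (a sketch of my own, not from Lehrer's article). Suppose a true effect is small, journals only publish "original" studies that cross the p < .05 threshold, and replications get reported regardless of outcome. The published literature then overstates the effect, and replications "decline" back toward the truth - all the numbers below (effect size, sample size) are arbitrary choices for illustration:

```python
import math
import random
import statistics

random.seed(1)

TRUE_EFFECT = 0.2       # small true standardized effect
N = 20                  # per-group sample size
SE = math.sqrt(2 / N)   # std. error of the group difference (sigma = 1)
Z_CRIT = 1.96           # two-sided p < .05 threshold

def run_study():
    """Observed effect = true effect plus sampling noise."""
    return random.gauss(TRUE_EFFECT, SE)

# "Original" literature: only studies crossing p < .05 get published.
published = [d for d in (run_study() for _ in range(20000)) if d / SE > Z_CRIT]

# Replications of those findings are reported whatever their outcome.
replications = [run_study() for _ in published]

print(f"true effect:             {TRUE_EFFECT:.2f}")
print(f"mean published effect:   {statistics.mean(published):.2f}")
print(f"mean replication effect: {statistics.mean(replications):.2f}")
```

No dishonesty is required: the filter at the journal's door does all the work, and the "decline" is just the inflated original estimates regressing to the mean.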

We shouldn't throw out the baby with the bath water, as Lehrer notes in a subsequent blog posting. These problems don't mean we shouldn't believe in evolution or climate change:
One of the sad ironies of scientific denialism is that we tend to be skeptical of precisely the wrong kind of scientific claims. In poll after poll, Americans have dismissed two of the most robust and widely tested theories of modern science: evolution by natural selection and climate change. These are theories that have been verified in thousands of different ways by thousands of different scientists working in many different fields. (This doesn’t mean, of course, that such theories won’t change or get modified – the strength of science is that nothing is settled.) Instead of wasting public debate on creationism or the rhetoric of Senator Inhofe, I wish we’d spend more time considering the value of spinal fusion surgery, or second generation antipsychotics, or the verity of the latest gene association study.

More on efficacy/mechanism of acupuncture...

A loyal MindBlog reader has pointed me to a rather thorough review of numerous studies of traditional Chinese versus sham acupuncture, which on balance suggest that the two are equally effective in relieving musculoskeletal pain and osteoarthritis. The review then thoroughly discusses whether acupuncture is a placebo effect, and concludes that most of the benefits of acupuncture for pain syndromes result from the treatment ritual and patient–provider interaction - which meets the definition of a placebo effect.

Monday, December 13, 2010

The weather - why I am in Florida

A personal note.... The photos below are iPhone camera shots of the patio of my Wisconsin home, where the temperature is 1 degree Fahrenheit (taken by my partner there), and the shot I've just taken from my work desk in the Fort Lauderdale condo I use as an office while here November through March.  People here are complaining about the unusual cold (high of 62 today)!

Imagining eating reduces actual eating

We are just like Pavlov's dogs, in that thinking about a treat like chocolate enhances our desire for it and motivation to get it. After we have eaten some, our desire wanes, or habituates. Morewedge et al. make the fascinating observation that imagining the repetitive consumption of the treat reduces the amount we actually eat:

The consumption of a food typically leads to a decrease in its subsequent intake through habituation—a decrease in one’s responsiveness to the food and motivation to obtain it. We demonstrated that habituation to a food item can occur even when its consumption is merely imagined. Five experiments showed that people who repeatedly imagined eating a food (such as cheese) many times subsequently consumed less of the imagined food than did people who repeatedly imagined eating that food fewer times, imagined eating a different food (such as candy), or did not imagine eating a food. They did so because they desired to eat it less, not because they considered it less palatable. These results suggest that mental representation alone can engender habituation to a stimulus.

Sleights of Mind

I attended the annual meeting of the Society for the Scientific Study of Consciousness in 2007, sending a few MindBlog dispatches from the event, and several subsequent posts.  It was organized by Stephen Macknik and Susana Martinez-Conde, both neuroscientists at the Barrow Neurological Institute in Phoenix.  Over a number of years they have studied how the tricks of magicians can be explained by classic and recent studies in cognitive neuroscience.  They organized a fascinating session at the meeting in which several scientists and four prominent magicians showed and discussed their craft.

Macknik and Martinez-Conde have now joined with science journalist Sandra Blakeslee (whose book "The Body Has a Mind of Its Own" I reviewed in a 2007 MindBlog post) to offer an engaging book: "Sleights of Mind."  I've just finished the advance copy I was sent, found it a very interesting and enjoyable read, and plan to make it my seasonal gift to a number of friends.  They describe a large number of magic tricks and illusions, following each with an explanatory section (prefaced by "spoiler alert") that lists the visual (and other sensory) afterimages, adaptations, habituations, and cognitive and sensory shortcuts that explain why we can so easily be tricked.

Friday, December 10, 2010

The one night stand gene?

An amusing article in a recent PLoS One by Garcia et al. makes me wonder whether we may soon be requiring prospective mates to reveal not only their HIV status but also the number of tandem repeats in their dopamine receptor gene. Genetic tweaking of the receptor for the "feel good" neurotransmitter dopamine may be all it takes to ramp up sexual promiscuity and infidelity (usual disclaimer: this does NOT mean we are talking about a 'gene' for promiscuity, in spite of the title of this post). They rounded up 181 college students and asked them to answer a questionnaire about their sexual habits along with other proclivities, such as cigarette smoking and the tendency to take risks.  They also measured the variable number tandem repeats (VNTR) polymorphism in exon III of the subjects' dopamine D4 receptor gene (DRD4), which has been correlated with an array of behavioral phenotypes, particularly promiscuity and infidelity. They found that subjects having at least one 7-repeat allele (7R+) report a greater categorical rate of promiscuous sexual behavior (i.e., having ever had a “one-night stand”) and report a more than 50% increase in instances of sexual infidelity. (Genotypes were grouped as 7R+ (at least one allele of 7 repeats or longer) or 7R- (both alleles shorter than 7 repeats); the 7R+ genotype was present in 24% of the sample.)

Thursday, December 09, 2010

Complete heresy: life based on arsenic instead of phosphorus??

I had a wrenching gut reaction to first glancing at the headlines suggesting that a bacterium had been found which could live on arsenic instead of phosphorus...  My university degrees were in biochemistry, and if one thing was certain in this world, it was that the basic recipe for life anywhere would have to contain carbon, nitrogen, oxygen, and phosphorus. Phosphorus forms the backbone of strands of DNA and RNA, as well as of ATP and NAD, two molecules key to energy transfer in a cell. Arsenic is one row down in the periodic table from phosphorus and so has similar chemical properties. It is a poison for us because it inserts into proteins and nucleic acids where phosphorus should be, and screws up their action. A look at the article by Wolfe-Simon et al., however, made me breathe a bit easier, because what they had actually done was to take a bacterium that lives under extreme conditions in Mono Lake, located in eastern California, a hypersaline and alkaline water body with high dissolved arsenic concentrations. They grew the bacteria at increasingly high levels of (radioactively labeled) arsenic while decreasing phosphorus levels, and found arsenic incorporation into the protein, lipid, nucleic acid, and metabolite fractions of the cells. So... these creatures are certainly different from us; they have evolved to be able to deal with arsenic. From Pennisi's review of this work:

Wolfe-Simon speculates that organisms like GFAJ-1 could have thrived in the arsenic-laden hydrothermal vent–like environments of early Earth, where some researchers think life first arose, and that later organisms may have adapted to using phosphorus. Others say they'll refrain from such speculation until they see more evidence of GFAJ-1's taste for arsenic and understand how the DNA and other biomolecules can still function with the element incorporated. “As in this type of game changer, some people will rightly want more proof,” says microbiologist Robert Gunsalus of the University of California, Los Angeles. “There is much to do in order to firmly put this microbe on the biological map.”

Wednesday, December 08, 2010

Narcissists - an endangered species?

You should have a look at two interesting NYTimes articles by Zanor and Carey on proposed changes to the fifth edition of the psychologist's and psychiatrist's bible, the Diagnostic and Statistical Manual of Mental Disorders (due out in 2013, and known as DSM-5), which would eliminate five of the 10 personality disorders listed in the current edition: narcissistic, dependent, histrionic, schizoid and paranoid. Rather than defining a syndrome by a cluster of related traits, with the clinician matching patients to that profile, the newly proposed approach chooses from a long list of personality traits those that best describe a particular patient. The older approach treats the categories as if we know them to be scientifically accurate (which we don't), and while fitting with common sense and folk psychology, it can have the nature of a self-fulfilling prophecy... not to mention making life easier for insurance companies and the courts. Zanor quotes psychologist Jonathan Shedler:

Clinicians are accustomed to thinking in terms of syndromes, not deconstructed trait ratings. Researchers think in terms of variables, and there’s just a huge schism.... the committee was stacked with a lot of academic researchers who really don’t do a lot of clinical work. We’re seeing yet another manifestation of what’s called in psychology the science-practice schism.

Tuesday, December 07, 2010

How reading rewires the brain.

Dehaene et al. have done an interesting study of how our brains deal with written language, which appeared only about 5,000 years ago and thus must use brain circuits evolved for other purposes. Not surprisingly, areas that originally evolved to process vision and spoken language respond more strongly to written words in literate than in illiterate subjects. This repurposing may have involved a tradeoff: in people who learned to read early in life, a smaller region of the left occipital-temporal cortex responded to images of faces than in the illiterate volunteers. (The figure shows brain regions that respond more strongly to text in people who can read.):
Does literacy improve brain function? Does it also entail losses? Using functional magnetic resonance imaging, we measured brain responses to spoken and written language, visual faces, houses, tools, and checkers in adults of variable literacy (10 were illiterate, 22 became literate as adults, and 31 became literate in childhood). As literacy enhanced the left fusiform activation evoked by writing, it induced a small competition with faces at this location but also broadly enhanced visual responses in fusiform and occipital cortex, extending to area V1. Literacy also enhanced phonological activation to speech in the planum temporale and afforded a top-down activation of orthography from spoken inputs. Most changes occurred even when literacy was acquired in adulthood, emphasizing that both childhood and adult education can profoundly refine cortical organization.

Monday, December 06, 2010

Advanced human achievement - simple reinforcement learning?

Sejnowski writes an interesting review of work by Desrochers et al., which examines whether basic principles of reinforcement learning, coupled with a complex environment and a large memory, might account for more complex behaviors. They show that reinforcement learning can explain not only behavioral choice in a complex environment, but also the evolution toward optimal behavior over a long time. They studied, in the monkey, the sort of eye movements we make several times a second when scanning a complex image (the scan path is dramatically influenced by what we are thinking). Here is their abstract, followed by Sejnowski's summation.

Habits and rituals are expressed universally across animal species. These behaviors are advantageous in allowing sequential behaviors to be performed without cognitive overload, and appear to rely on neural circuits that are relatively benign but vulnerable to takeover by extreme contexts, neuropsychiatric sequelae, and processes leading to addiction. Reinforcement learning (RL) is thought to underlie the formation of optimal habits. However, this theoretic formulation has principally been tested experimentally in simple stimulus-response tasks with relatively few available responses. We asked whether RL could also account for the emergence of habitual action sequences in realistically complex situations in which no repetitive stimulus-response links were present and in which many response options were present. We exposed naïve macaque monkeys to such experimental conditions by introducing a unique free saccade scan task. Despite the highly uncertain conditions and no instruction, the monkeys developed a succession of stereotypical, self-chosen saccade sequence patterns. Remarkably, these continued to morph for months, long after session-averaged reward and cost (eye movement distance) reached asymptote. Prima facie, these continued behavioral changes appeared to challenge RL. However, trial-by-trial analysis showed that pattern changes on adjacent trials were predicted by lowered cost, and RL simulations that reduced the cost reproduced the monkeys’ behavior. Ultimately, the patterns settled into stereotypical saccade sequences that minimized the cost of obtaining the reward on average. These findings suggest that brain mechanisms underlying the emergence of habits, and perhaps unwanted repetitive behaviors in clinical disorders, could follow RL algorithms capturing extremely local explore/exploit tradeoffs.
Sejnowski's review gives several other examples of reinforcement learning solving difficult problems (such as learning how to play backgammon), and concludes:
...the jury is still out on whether reinforcement learning can explain the highest levels of human achievement. Rather than add a radically new piece of machinery to the brain, such as a language module, nature may have tinkered with the existing brain machinery to make it more efficient. Children have a remarkable ability to learn through imitation and shared attention, which might greatly speed up reinforcement learning by focusing learning on important stimuli. We are also exceptional at waiting for rewards farther into the future than other species, in some cases delaying gratification to an imagined afterlife made concrete by words. Supercharged with a larger cerebral cortex, faster learning, and a longer time horizon, is it possible that we solve complex problems in mathematics the same way that monkeys find optimal scan paths?
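The "local explore/exploit tradeoff" the abstract invokes can be sketched with a minimal example (my own toy illustration, not the paper's model): an ε-greedy learner choosing among action sequences that all earn the same reward but carry different costs gradually settles on the cheapest one, much as the monkeys' scan paths kept drifting toward lower eye-movement cost long after reward had plateaued. The sequence names and numbers below are invented for the sketch:

```python
import random

random.seed(0)

# Three hypothetical scan sequences: identical reward, different "cost"
# (for the monkeys, total eye-movement distance). Net payoff = reward - cost.
REWARD = 1.0
COSTS = {"path_A": 0.8, "path_B": 0.5, "path_C": 0.2}

values = {a: 0.0 for a in COSTS}   # running value estimate per sequence
counts = {a: 0 for a in COSTS}
EPSILON = 0.1                      # small chance to explore

def choose():
    if random.random() < EPSILON:          # explore: try anything
        return random.choice(list(COSTS))
    return max(values, key=values.get)     # exploit: current best guess

for _ in range(5000):
    a = choose()
    payoff = REWARD - COSTS[a] + random.gauss(0, 0.1)   # noisy net payoff
    counts[a] += 1
    values[a] += (payoff - values[a]) / counts[a]       # incremental mean update

best = max(values, key=values.get)
print("learned values:", {a: round(v, 2) for a, v in values.items()})
print("settled on:", best)  # the lowest-cost sequence
```

Nothing here is specific to eye movements; the point is that a purely local update rule, with occasional exploration, is enough to keep refining behavior toward the cost-minimizing option even when every option is rewarded.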

Friday, December 03, 2010

Political Leapfrogging

From the 'Editor's Choice' section of the Nov. 26 Science Magazine:

Although there have been many discussions of the polarized nature of American politics, do the views of elected officials match the preferences of their electorate? Bafumi and Herron sought to answer this question by comparing a national opinion survey of American voters (the Cooperative Congressional Election Study; CCES) with legislator voting records of the 109th (2005–2006) and 110th (2007–2008) Congresses. In many cases, the CCES questions were similar to (or the same as) actual congressional roll call votes, which allowed for better comparison. By developing a linear scale bounded by representatives (or CCES respondents) who had taken consistently liberal or conservative positions, the authors found that members of Congress were more extreme than the voters they represented. The median member of the 109th House of Representatives was more conservative than the median American voter, but the median member of the 110th House of Representatives was more liberal. Thus, voting out one extremist usually led to replacement by someone equally extreme, but of the opposite party. The authors refer to this as “leapfrogging” because the moderate views of the median American voter are leapfrogged during the turnover. Although the turnover was similar in the Senate, overall it appeared to be more moderate.

The Article: Amer. Polit. Sci. Rev. 104, 519 (2010).

Thursday, December 02, 2010

What makes the human brain special...

Our human brains are bigger than those of our ape relatives, in particular the frontal lobes that are required for advanced cognitive functions. Semendeferi et al. have focused on a particular area of the frontal lobes: Brodmann area 10 (BA 10), which sits at the pole of the frontal lobes just above the eyes and is thought to be involved in abstract thinking and other sophisticated cognition. They find not only that this area is relatively larger in humans, but also that there is more space between nerve cell bodies in human brains than in the brains of apes, allowing room for connections between neurons. (In contrast, there were only subtle differences in cell body density among humans, chimpanzees, bonobos, gorillas, orangutans, and gibbons in the visual, somatosensory, and motor cortices.) Their analysis looked at the cells in layer three of the cortex, which communicates with other areas of the brain. BA 10 in humans also contains a higher concentration of so-called Von Economo neurons, which are generally thought to be high-performance neurons specialized for rapidly transmitting information from one brain region to another.


More space between neurons in the human brain (right) compared with the chimp brain (left) could allow more complex neural wiring.

The authors suggest that human brain evolution was likely characterized by an increase in the number and width of cortical minicolumns and the space available for interconnectivity between neurons in the frontal lobe, especially the prefrontal cortex.

Wednesday, December 01, 2010

How comfort foods reduce stress.

Interesting work from Ulrich-Lai et al.  Apparently sweet tastes (and sex) reduce stress behaviors by chilling down the parts of the amygdala that cause them:

Individuals often eat calorically dense, highly palatable “comfort” foods during stress for stress relief. This article demonstrates that palatable food intake (limited intake of sucrose drink) reduces neuroendocrine, cardiovascular, and behavioral responses to stress in rats. Artificially sweetened (saccharin) drink reproduces the stress dampening, whereas oral intragastric gavage of sucrose is without effect. Together, these results suggest that the palatable/rewarding properties of sucrose are necessary and sufficient for stress dampening. In support of this finding, another type of natural reward (sexual activity) similarly reduces stress responses. Ibotenate lesions of the basolateral amygdala (BLA) prevent stress dampening by sucrose, suggesting that neural activity in the BLA is necessary for the effect. Moreover, sucrose intake increases mRNA and protein expression in the BLA for numerous genes linked with functional and/or structural plasticity. Lastly, stress dampening by sucrose is persistent, which is consistent with long-term changes in neural activity after synaptic remodeling. Thus, natural rewards, such as palatable foods, provide a general means of stress reduction, likely via structural and/or functional plasticity in the BLA. These findings provide a clearer understanding of the motivation for consuming palatable foods during times of stress and influence therapeutic strategies for the prevention and/or treatment of obesity and other stress-related disorders.