Friday, December 31, 2010

The wolfpack effect.

An interesting piece of work from Gao et al. showing how the perception of animacy influences our interactive behavior.
Imagine a pack of predators stalking their prey. The predators may not always move directly toward their target (e.g., when circling around it), but they may be consistently facing toward it. The human visual system appears to be extremely sensitive to such situations, even in displays involving simple shapes. We demonstrate this by introducing the wolfpack effect, which is found when several randomly moving, oriented shapes (darts, or discs with “eyes”) consistently point toward a moving disc. Despite the randomness of the shapes’ movement, they seem to interact with the disc—as if they are collectively pursuing it. This impairs performance in interactive tasks (including detection of actual pursuit), and observers selectively avoid such shapes when moving a disc through the display themselves. These and other results reveal that the wolfpack effect is a novel “social” cue to perceived animacy. And, whereas previous work has focused on the causes of perceived animacy, these results demonstrate its effects, showing how it irresistibly and implicitly shapes visual performance and interactive behavior.


Sample display (a) and manipulations (b–e) from the first experiment. The task was to detect whether one shape (the wolf) was chasing another (the sheep). Arrows indicate motion and were not present in the displays. In the wolfpack condition (a, b), all darts stayed oriented toward the task-irrelevant green square, regardless of their motion directions. This condition generated the wolfpack effect. In the perpendicular condition (c), each dart was always oriented orthogonally to the square. In the match condition (d), each dart was always oriented in the direction in which it was moving at that moment. And in the disc condition (e), the objects had no visible orientation.

A contrarian view of energy prospects.

John Tierney describes a wager he made in 2005 with Matthew Simmons, who bet $5,000 that the price of oil, then about $65 a barrel, would more than triple in the next five years, so that the average price of oil over the course of 2010 would be at least $200 a barrel in 2005 dollars....The average for 2010 has been just under $80, which is the equivalent of about $71 in 2005 dollars — a little higher than the $65 at the time of the bet, but far below the $200 threshold set by Mr. Simmons. (Tierney's mentor was the economist Julian L. Simon, a leader of the Cornucopians, optimists who believed there would always be abundant supplies of energy and other resources. Simon won a bet in the 1980s with Paul Ehrlich and two natural resources experts over the prices of five metals.) What happened to the grim predictions of declining oil reserves and rising prices? Perhaps there has been a temporary respite (which unfortunately will not help alternative energy efforts):
Giant new oil fields have been discovered off the coasts of Africa and Brazil. The new oil sands projects in Canada now supply more oil to the United States than Saudi Arabia does. Oil production in the United States increased last year, and the Department of Energy projects further increases over the next two decades...The really good news is the discovery of vast quantities of natural gas. It’s now selling for less than half of what it was five years ago. There’s so much available that the Energy Department is predicting low prices for gas and electricity for the next quarter-century. Lobbyists for wind farms, once again, have been telling Washington that the “sustainable energy” industry can’t sustain itself without further subsidies...As gas replaces dirtier fossil fuels, the rise in greenhouse gas emissions will be tempered, according to the Department of Energy. It projects that no new coal power plants will be built, and that the level of carbon dioxide emissions in the United States will remain below the rate of 2005 for the next 15 years even if no new restrictions are imposed.

Maybe something unexpected will change these happy trends, but for now I’d say that Julian Simon’s advice remains as good as ever. You can always make news with doomsday predictions, but you can usually make money betting against them.
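The arithmetic behind settling the bet can be sketched in a few lines. The cumulative inflation factor below is back-calculated from the figures quoted above ($80 nominal in 2010 equating to about $71 in 2005 dollars), not an official index value:

```python
# Deflating the 2010 average oil price into 2005 dollars, per the bet's
# terms. The inflation factor is an assumption implied by the post's own
# figures rather than a published CPI number.
CPI_FACTOR_2005_TO_2010 = 80.0 / 71.0

def to_2005_dollars(nominal_2010_price):
    """Deflate a nominal 2010 price into 2005 dollars."""
    return nominal_2010_price / CPI_FACTOR_2005_TO_2010

avg_2010 = 80.0    # average nominal price over 2010, dollars per barrel
threshold = 200.0  # Simmons's threshold, in 2005 dollars
print(round(to_2005_dollars(avg_2010)))  # 71 -- far below the threshold
```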

Thursday, December 30, 2010

New brain circuits form online during rapid learning

Shtyrov et al. show that after just 14 minutes of learning exposure to a new word, presentations of this word cause increased responses in the language cortex, reflecting rapid mapping of new word forms onto neural representations.
Humans are unique in developing large lexicons as their communication tool. To achieve this, they are able to learn new words rapidly. However, neural bases of this rapid learning, which may be an expression of a more general cognitive mechanism, are not yet understood. To address this, we exposed our subjects to familiar words and novel spoken stimuli in a short passive perceptual learning session and compared automatic brain responses to these items throughout the learning exposure. Initially, we found enhanced activity for known words, indexing the ignition of their underlying memory traces. However, just after 14 min of learning exposure, the novel items exhibited a significant increase in response magnitude matching in size with that to real words. This activation increase, as we would like to propose, reflects rapid mapping of new word forms onto neural representations. Similar to familiar words, the neural activity subserving rapid learning of new word forms was generated in the left-perisylvian language cortex, especially anterior superior-temporal areas. This first report of a neural correlate of rapid learning suggests that our brain may effectively form new neuronal circuits online as it gets exposed to novel patterns in the sensory input. Understanding such fast learning is key to the neurobiological explanation of the human language faculty and learning mechanisms in general.

Paying the price for a longer life.

A brief article by Bakalar notes a study by Crimmins et al.
...people live longer not because they are less likely to get sick, but because they survive longer with disease....As a result, a 20-year-old man today can expect to live about a year longer than a 20-year-old in 1998, but will spend 1.2 years more with a disease, and 2 more years unable to function normally.

Wednesday, December 29, 2010

Placebo pills work without deception

Kaptchuk et al. show that placebos administered without deception may be an effective treatment for irritable bowel syndrome. From Bakalar's summary:
They explained to all that a placebo was an inert substance, like a sugar pill, that had been found to “produce significant improvement in I.B.S. symptoms through mind-body self-healing processes.” The patients, all treated with the same attention, warmth and empathy by the researchers, were then randomly assigned to get the pill or not...At the end of three weeks, they tested all the patients with questionnaires assessing the level of their pain and other symptoms. The patients given the sugar pill — in a bottle clearly marked “placebo” — reported significantly better pain relief and greater reduction in the severity of other symptoms than those who got no pill.
A weakness of the study is that because the outcome measure is so subjective, placebo patients may have exaggerated their improvement to please the researchers.

English and Mandarin speakers think about time differently.

I pass on this abstract from Boroditsky et al., and a few clips from the article:
Time is a fundamental domain of experience. In this paper we ask whether aspects of language and culture affect how people think about this domain. Specifically, we consider whether English and Mandarin speakers think about time differently. We review all of the available evidence both for and against this hypothesis, and report new data that further support and refine it. The results demonstrate that English and Mandarin speakers do think about time differently. As predicted by patterns in language, Mandarin speakers are more likely than English speakers to think about time vertically (with earlier time-points above and later time-points below).
From their text:
Both English and Mandarin use horizontal front/back spatial terms to talk about time. In English, we can look forward to the good times ahead, or think back to travails past and be glad they are behind us. In Mandarin, front/back spatial metaphors for time are also common. For example, Mandarin speakers use the spatial morphemes qián (“front”) and hòu (“back”) to talk about time...Unlike English speakers, Mandarin speakers also systematically and frequently use vertical metaphors. The spatial morphemes shàng (“up”) and xià (“down”) are used to talk about the order of events, weeks, months, semesters, and more. Earlier events are said to be shàng or “up”, and later events are said to be xià or “down”. For example, “shàng ge yuè” is last (or previous) month, and “xià ge yuè” is next (or following) month...This difference between the two languages offers the prediction that Mandarin speakers would be more likely to conceive of time vertically than would English speakers.

In the experimental paradigm, participants made temporal judgments following horizontal or vertical spatial primes. On each trial, participants first answered several questions about the spatial relationship between two objects (arranged either horizontally or vertically on a computer screen), and then answered a question about time (e.g., March comes earlier than April; TRUE or FALSE). Participants’ response times to the target question about time following either the horizontal or vertical primes were the measure of interest.

The basic finding was that both groups organize time on the left-to-right axis with earlier events on the left, a pattern consistent with writing direction. But, Mandarin speakers also show evidence of vertical representations of time, with earlier events represented further up. English speakers showed no evidence of such a representation.

Tuesday, December 28, 2010

You've got to have (150) friends...

The post title is the title of a brief essay by Robin Dunbar, who is best known for his work documenting, for a large number of animal species, the relationship between brain size and social group size (they get larger together). His curve predicts that the optimal group size for humans is about 150. That is what is observed in a large number of hunter-gatherer and aboriginal human societies across the world, and (his article contends) it is an evolved biological/psychological limit that operates even in a world of Facebook that permits thousands of "friends." Some clips:
The developers at Facebook overlooked one of the crucial components in the complicated business of how we create relationships: our minds...Put simply, our minds are not designed to allow us to have more than a very limited number of people in our social world. The emotional and psychological investments that a close relationship requires are considerable, and the emotional capital we have available is limited...Indeed, no matter what Facebook allows us to do, I have found that most of us can maintain only around 150 meaningful relationships, online and off — what has become known as Dunbar’s number. Yes, you can “friend” 500, 1,000, even 5,000 people with your Facebook page, but all save the core 150 are mere voyeurs looking into your daily life — a fact incorporated into the new social networking site Path, which limits the number of friends you can have to 50.

Until relatively recently, almost everyone on earth lived in small, rural, densely interconnected communities, where our 150 friends all knew one another...the social and economic mobility of the past century has worn away at that interconnectedness. As we move around the country and across continents, we collect disparate pockets of friends, so that our list of 150 consists of a half-dozen subsets of people who barely know of one another’s existence...as we move around, though, we can lose touch with even our closest friends. Emotional closeness declines by around 15 percent a year in the absence of face-to-face contact, so that in five years someone can go from being an intimate acquaintance to the most distant outer layer of your 150 friends.

Facebook and other social networking sites allow us to keep up with friendships that would otherwise rapidly wither away. And they do something else that’s probably more important, if much less obvious: they allow us to reintegrate our networks so that, rather than having several disconnected subsets of friends, we can rebuild, albeit virtually, the kind of old rural communities where everyone knew everyone else. Welcome to the electronic village.
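Dunbar's 15-percent-a-year decay figure compounds the way interest does; a minimal sketch (the starting closeness of 100 is an arbitrary illustrative scale, not from the essay) shows how five contact-free years move someone toward the outer layers:

```python
# Compound decay of emotional closeness at 15% per year in the absence of
# face-to-face contact, per Dunbar's figure. The initial value of 100 is
# an invented illustrative scale.
ANNUAL_DECAY = 0.15

def closeness_after(years, initial=100.0):
    """Closeness remaining after the given number of contact-free years."""
    return initial * (1.0 - ANNUAL_DECAY) ** years

print(round(closeness_after(5), 1))  # 44.4 -- less than half the original bond
```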

The science of cities.

I've been meaning to point out an interesting article by Jonah Lehrer that focuses on the work of the 70-year-old physicist Geoffrey West, who decided to turn his attention to discerning whether the cities that contain an ever-increasing fraction of the world's population (82% of the people in the U.S. live in cities) follow discernible patterns and laws. Some edited clips:
Knowing the population of a metropolitan area in a given country allows one to estimate, with approximately 85 percent accuracy, its average income and the dimensions of its sewer system. These are the laws, they say, that automatically emerge whenever people “agglomerate,” cramming themselves into apartment buildings and subway cars. It doesn’t matter if the place is Manhattan or Manhattan, Kan.: the urban patterns remain the same...the real purpose of cities, and the reason cities keep on growing, is their ability to create massive economies of scale, just as big animals do. After analyzing the first sets of city data — the physicists began with infrastructure and consumption statistics — they concluded that cities looked a lot like elephants. In city after city, the indicators of urban “metabolism,” like the number of gas stations or the total surface area of roads, showed that when a city doubles in size, it requires an increase in resources of only 85 percent...This straightforward observation has some surprising implications. It suggests, for instance, that modern cities are the real centers of sustainability. According to the data, people who live in densely populated places require less heat in the winter and need fewer miles of asphalt per capita.
People, however, do not go to cities because they are more efficient; they go because there are more social and commercial interactions. West and colleagues were able to quantify Jane Jacobs's points in her famous book “The Death and Life of Great American Cities.”
...whenever a city doubles in size, every measure of economic activity, from construction spending to the amount of bank deposits, increases by approximately 15 percent per capita (It also experiences a 15 percent per capita increase in violent crimes, traffic and AIDS cases). It doesn’t matter how big the city is; the law remains the same...everything that’s related to the social network goes up by the same percentage.
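The two regularities in these clips are power laws, and the doubling factors quoted (85 percent more resources, 15 percent more per capita) pin down their exponents. A sketch, using a hypothetical city rather than any figure from the article:

```python
import math

# Exponents implied by the doubling rules in the text: doubling population
# multiplies infrastructure by 1.85, and multiplies per-capita socioeconomic
# output by 1.15 (so total output by 2 * 1.15 = 2.3).
SUBLINEAR_EXP = math.log2(1.85)        # infrastructure / "urban metabolism"
SUPERLINEAR_EXP = math.log2(2 * 1.15)  # wages, bank deposits, crime, etc.

def scaled(y0, n0, n, exponent):
    """Power-law scaling Y(N) = Y0 * (N / N0) ** exponent."""
    return y0 * (n / n0) ** exponent

# An invented city of 1 million people with 500 km of roads, doubled in size:
print(round(scaled(500, 1_000_000, 2_000_000, SUBLINEAR_EXP)))  # 925 km, +85%
```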

Monday, December 27, 2010

Listening to your heart

Dunn et al. try to evaluate how sensing feedback from the body influences thought and feeling. Some edited clips of background, and their results:
Some metaphorical expressions that are used daily, such as “brokenhearted” or “gut feelings,” reflect the common belief that feelings and cognitions are partly grounded in bodily responses. This idea is reflected in early philosophical writings about embodiment (e.g., Descartes, 1649/1989) and was introduced to experimental psychology by William James (1884), who asserted that perception of changes in the body “as they occur is the emotion” (pp. 189–190). Since then, there has been considerable debate about the extent to which feelings and cognitions are in fact embodied. Much of this discussion has focused on emotion experience and decision making. Schachter and Singer modified Jamesian theory to argue that emotion experience is a product of the cognitive appraisal of bodily arousal. The somatic marker hypothesis of Damasio proposes that emotional biasing signals emerging from the body influence intuitive decision making. These models remain controversial, and critics argue that bodily responses occur relatively late in the information-processing chain and are therefore best viewed as a consequence, rather than the cause, of cognitive-affective activity.

...A central but untested prediction of many of these proposals is that how well individuals can perceive subtle bodily changes (interoception) determines the strength of the relationship between bodily reactions and cognitive-affective processing. In a first study we demonstrated that the more accurately participants could track their heartbeat, the stronger the observed link between their heart rate reactions and their subjective arousal (but not valence) ratings of emotional images. (In other words, the more strongly these autonomic changes are felt, the more they are associated with arousal experience.) These results offer strong support for Jamesian bodily feedback theories.

In a second study, we found that increasing interoception ability either helped or hindered adaptive intuitive decision making, depending on whether the anticipatory bodily signals generated favored advantageous or disadvantageous choices. These findings identify both the generation and the perception of bodily responses as pivotal sources of variability in emotion experience and intuition, and offer strong supporting evidence for bodily feedback theories, suggesting that cognitive-affective processing does in significant part relate to “following the heart.” Our findings agree with those of other studies showing that an absence of emotion following frontal head injury can in some circumstances lead to superior decision making and with claims that elevated interoceptive awareness may maintain conditions such as anxiety.

Deric's MindBlog for smartphones

I use Google's Blogger to publish MindBlog, and they have just added a nice new tweak.
We realize that more and more users are accessing the web on smartphones, and we want to make sure that blogs still look nice when viewed on these smaller screens. We’ve put a lot of work into creating a mobile version of BlogSpot, which will automatically detect if a blog is accessed on a smartphone and then display a mobile-optimized version.
I have enabled this feature, so now if you go to mindblog.dericbownds.net using your iPhone or other smartphone, you will see this new format.

Friday, December 24, 2010

The dark side of inflammation

Couzin-Frankel does an interesting piece in the News section of the Dec. 17 issue of Science, consonant with my opinion that inflammatory processes are one of the main issues in aging. The abstract:
Not long ago, inflammation had a clear role: It was a sidekick to the body's healers, briefly setting in as immune cells rebuilt tissue damaged by trauma or infection. Today, that's an afterthought. Inflammation has hit the big time. Over the past decade, it has become widely accepted that inflammation is a driving force behind chronic diseases that will kill nearly all of us. Cancer. Diabetes and obesity. Alzheimer's disease. Atherosclerosis. Here, inflammation wears a grim mask, shedding its redeeming features and making sick people sicker.
A growing body of evidence suggests that C-reactive protein (CRP), a molecular marker for inflammation, may be as crucial as cholesterol in assessing risk of heart attack. Macrophages, the white blood cells that are a hallmark of inflammation, appear around fatty plaques that build up in the arteries in atherosclerosis, infiltrate fat tissue in obesity, surround cancer cells to stimulate circulation and coax them along, help to kill neurons in neurodegenerative diseases such as Alzheimer's and Parkinson's, and promote two components of type 2 diabetes: insulin resistance and the death of pancreatic beta cells that produce insulin. Anti-inflammatory drugs (which suppress the action of proinflammatory cytokines released by immune cells) have been shown in several cases to retard disease progression.

Google's body browser

I've been enjoying playing a bit with Google's new 3-D body browser that lets you proceed through body layers (muscles, organs, bones, circulatory and nervous systems), clicking on a part or area to see it outlined and identified. It is still very much under development, with the nervous system and brain yet to be fully engaged. When it's further along, it should be a tremendously convenient look-up tool. You need a web browser that supports WebGL, such as Chrome or Firefox 4 Beta. You can share the exact scene you are viewing by copying and pasting the URL (the sciatic nerve, for example).

Thursday, December 23, 2010

Why diets fail.

Pankevich et al. offer observations that might explain why weight lost during an effective diet is usually regained: dieting makes the brain more sensitive to stress and to the rewards of high-fat, high-calorie treats. These brain changes last long after the diet is over and prod otherwise healthy individuals to binge eat under pressure. Part of the abstract:
Long-term weight management by dieting has a high failure rate. Pharmacological targets have focused on appetite reduction, although less is understood as to the potential contributions of the stress state during dieting in long-term behavioral modification. In a mouse model of moderate caloric restriction in which a 10–15% weight loss similar to human dieting is produced, we examined physiological and behavioral stress measures. After 3 weeks of restriction, mice showed significant increases in immobile time in a tail suspension test and stress-induced corticosterone levels. Increased stress was associated with brain region-specific alterations of corticotropin-releasing factor expression and promoter methylation, changes that were not normalized with refeeding. Similar outcomes were produced by high-fat diet withdrawal, an additional component of human dieting. In examination of long-term behavioral consequences, previously restricted mice showed a significant increase in binge eating of a palatable high-fat food during stress exposure...In humans, such changes would be expected to reduce treatment success by promoting behaviors resulting in weight regain, and suggest that management of stress during dieting may be beneficial in long-term maintenance.

Sigh....

Coming upon cheerful news like the following makes me want to dig a hole, get in it, and pull it in after myself....

40 Percent Of Americans Still Believe In Creationism
Creationists And Climate Deniers Take On Teaching Climate Science In Schools

Wednesday, December 22, 2010

Regulation of distress better with thicker prefrontal cortex.

Ventral prefrontal cortex activity correlates with suppression of amygdala reactivity to emotionally challenging stimuli, presumably reflecting higher-order cognitive evaluation of negative stimuli being brought into play. (The amygdala influences a broad range of physiological and behavioral responses associated with emotion, with the left amygdala being particularly responsive to negative facial expressions.) Foland-Ross et al. have now shown that prefrontal grey matter (the nerve-cell-containing cortical layer) thickness inversely correlates with amygdala reactivity. Greater ventromedial prefrontal cortical gray matter thickness was associated with greater reduction of activation in the left amygdala during affect labeling, a cognitive task that had previously been shown to dampen amygdala response.

In other words, if you have a thicker layer of prefrontal nerve cells, you might be less prone to emotional upset from unpleasant stimuli.

Out of our brains - extended mind continues

As a follow-up to my Nov. 3 post on critiques of Andy Clark's extended mind ideas (which drew 20 comments), I wanted to pass on this further Clark commentary and a sequel, pointed out to me by a loyal MindBlog reader, in which Clark tries to clarify his ideas.

Tuesday, December 21, 2010

Reducing depression with light stimulation of medial prefrontal cortex

In mice, to be sure... Brain imaging and direct brain stimulation have implicated the prefrontal cortex in clinical depression in humans and mice, and now a study by Covington et al. confirms in both that suppression of gene activities associated with nerve activity in the medial prefrontal cortex is associated with depressive behavior. In mice they used a genetic trick to introduce light-activated channel proteins into nerve cell membranes in this area, and found that light stimulation that enhanced nerve activity had potent antidepressant-like effects. Here is their abstract:
Brain stimulation and imaging studies in humans have highlighted a key role for the prefrontal cortex in clinical depression; however, it remains unknown whether excitation or inhibition of prefrontal cortical neuronal activity is associated with antidepressant responses. Here, we examined cellular indicators of functional activity, including the immediate early genes (IEGs) zif268 (egr1), c-fos, and arc, in the prefrontal cortex of clinically depressed humans obtained postmortem. We also examined these genes in the ventral portion of the medial prefrontal cortex (mPFC) of mice after chronic social defeat stress, a mouse model of depression. In addition, we used viral vectors to overexpress channel rhodopsin 2 (a light-activated cation channel) in mouse mPFC to optogenetically drive "burst" patterns of cortical firing in vivo and examine the behavioral consequences. Prefrontal cortical tissue derived from clinically depressed humans displayed significant reductions in IEG expression, consistent with a deficit in neuronal activity within this brain region. Mice subjected to chronic social defeat stress exhibited similar reductions in levels of IEG expression in mPFC. Interestingly, some of these changes were not observed in defeated mice that escape the deleterious consequences of the stress, i.e., resilient animals. In those mice that expressed a strong depressive-like phenotype, i.e., susceptible animals, optogenetic stimulation of mPFC exerted potent antidepressant-like effects, without affecting general locomotor activity, anxiety-like behaviors, or social memory. These results indicate that the activity of the mPFC is a key determinant of depression-like behavior, as well as antidepressant responses.

Time, space, and number - evolved brain computations

Dehaene and Brannon introduce a special (and open access) issue of Trends in Cognitive Science on Time, space, and number.
What do the representations of space, time and number share that might justify their joint presence in a special issue of TICS? In his Critique of Pure Reason, Immanuel Kant famously argued that they provide ‘a priori intuitions’ that precede and structure how humans experience the environment. Indeed, these concepts are so basic to any understanding of the external world that it is hard to imagine how any animal species could survive without having mechanisms for spatial navigation, temporal orienting (e.g. time-stamped memories) and elementary numerical computations (e.g. choosing the food patch with the largest expected return). In the course of their evolution, humans and many other animal species might have internalized basic codes and operations that are isomorphic to the physical and arithmetic laws that govern the interaction of objects in the external world. The articles in this special issue all support this point of view: from grid cells to number neurons, the richness and variety of mechanisms by which animals and humans, including infants, can represent the dimensions of space, time and number is bewildering and suggests evolutionary processes and neural mechanisms by which Kantian intuitions might universally arise.

Monday, December 20, 2010

Culturomics

This is a bit mind-blowing. Here is the New York Times article, here is the Science summary by Bohannon, and here is the abstract and the article PDF of the collective effort by Google and academic researchers (including Steven Pinker, Martin Nowak, etc.), and here is the PDF of their supplement giving the details.  The abstract:

We constructed a corpus of digitized texts containing about 4% of all books (5,195,769 digitized books) ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of "culturomics", focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. "Culturomics" extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.
Clips from the Bohannon review:
The researchers have revealed 500,000 English words missed by all dictionaries, tracked the rise and fall of ideologies and famous people, and, perhaps most provocatively, identified possible cases of political suppression unknown to historians...tracking the ebb and flow of “Sigmund Freud” and “Charles Darwin” reveals an ongoing intellectual shift: Freud has been losing ground, and Darwin finally overtook him in 2005...the amount of data that Google Books offers...currently includes 2 trillion words from 15 million books, about 12% of every book in every language published since the Gutenberg Bible in 1450. By comparison, the human genome is a mere 3-billion-letter poem...the size of the English language has nearly doubled over the past century, to more than 1 million words. And vocabulary seems to be growing faster now than ever before.
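The Freud-versus-Darwin comparison is, at bottom, a normalized frequency calculation. A toy sketch with invented yearly counts (not real Google Books data) shows the shape of it:

```python
# Toy n-gram trend comparison. All counts below are invented for
# illustration; real culturomics analyses normalize raw mentions by the
# total number of words the corpus contains in each year.
freud  = {2000: 120, 2005: 100, 2010: 90}
darwin = {2000: 100, 2005: 105, 2010: 130}
totals = {2000: 1_000_000, 2005: 1_100_000, 2010: 1_200_000}

def relative_frequency(counts, year):
    """Mentions per corpus word in the given year."""
    return counts[year] / totals[year]

# Find the first sampled year in which Darwin overtakes Freud:
for year in sorted(totals):
    if relative_frequency(darwin, year) > relative_frequency(freud, year):
        print(year)  # 2005 in this toy data
        break
```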

"Me-We" - Obama and the passions

A great essay by Mark Lilla (Humanities Professor at Columbia University) in the NYTimes Sunday magazine. The conclusion:
...the shape of American politics over the past half-century has been determined by two great waves of passion: the first running from the Kennedy and Johnson administrations through the ’70s, the second running from the Reagan administration to the departure of George W. Bush. What dominated during the first wave was excitement about a New Frontier, hope for a just and Great Society, fear of nuclear war, a desire for greater social freedom — and confidence that government could accomplish much. In the next era the same passions, nearly as intense, would be successfully redirected by Ronald Reagan. Now the excitement was about privatization, hope was invested in economic growth, fears centered on the family and the greatest desire was for freedom from government.

The Great Recession and the Tea Party’s ire, directed at Democrats and Republicans alike, suggest that this second political dispensation is coming to an end and that Americans’ passions are ready to be redirected once again. Having been dealt a bad hand, President Obama may have only a slim chance of doing that, but he has absolutely none if he limits himself to appealing to people’s interests. That’s not been the American experience of change. In our politics, history doesn’t happen when a leader makes an argument, or even strikes a pose. It happens when he strikes a chord. And you don’t need charts and figures to do that; in fact they get in the way. You only need two words.

George Plimpton used to tell the story of Muhammad Ali going to Harvard one year to give an address. At the end of his speech, someone called out to him, “Give us a poem!” He paused, stretched out his arms to the audience and delivered what Plimpton said was the shortest poem in the English language:

ME [pause]

WE!

Friday, December 17, 2010

Female genes respond to winners versus losers

In a cichlid fish, that is, but I'll bet it's happening in humans too. When the male a female has chosen as a mate wins a male-male competition, genes in her brain's reproductive centers are activated; when he loses, genes in anxiety-like response centers are activated instead. From Russ Fernald's group:
Females should be choosier than males about prospective mates because of the high costs of inappropriate mating decisions. Both theoretical and empirical studies have identified factors likely to influence female mate choices. However, male–male social interactions also can affect mating decisions, because information about a potential mate can trigger changes in female reproductive physiology. We asked how social information about a preferred male influenced neural activity in females, using immediate early gene (IEG) expression as a proxy for brain activity. A gravid female cichlid fish (Astatotilapia burtoni) chose between two socially equivalent males and then saw fights between these two males in which her preferred male either won or lost. We measured IEG expression levels in several brain nuclei including those in the vertebrate social behavior network (SBN), a collection of brain nuclei known to be important in social behavior. When the female saw her preferred male win a fight, SBN nuclei associated with reproduction were activated, but when she saw her preferred male lose a fight, the lateral septum, a nucleus associated with anxiety, was activated instead. Thus social information alone, independent of actual social interactions, activates specific brain regions that differ significantly depending on what the female sees. In female brains, reproductive centers are activated when she chooses a winner, and anxiety-like response centers are activated when she chooses a loser. These experiments assessing the role of mate-choice information on the brain using a paradigm of successive presentations of mate information suggest ways to understand the consequences of social information on animals using IEG expression.

Thursday, December 16, 2010

Why women apologize more than men.

Schumann and Ross note that it is not because men have more fragile egos, but because they have a higher threshold for what constitutes offensive behavior.
Despite wide acceptance of the stereotype that women apologize more readily than men, there is little systematic evidence to support this stereotype or its supposed bases (e.g., men’s fragile egos). We designed two studies to examine whether gender differences in apology behavior exist and, if so, why. In Study 1, participants reported in daily diaries all offenses they committed or experienced and whether an apology had been offered. Women reported offering more apologies than men, but they also reported committing more offenses. There was no gender difference in the proportion of offenses that prompted apologies. This finding suggests that men apologize less frequently than women because they have a higher threshold for what constitutes offensive behavior. In Study 2, we tested this threshold hypothesis by asking participants to evaluate both imaginary and recalled offenses. As predicted, men rated the offenses as less severe than women did. These different ratings of severity predicted both judgments of whether an apology was deserved and actual apology behavior.

Wednesday, December 15, 2010

Amazing moving graphic - history of well-being

This was pointed to in David Brooks's NY Times column yesterday.

Turning back aging - The 91-year-old athlete

I've been meaning to pass on some nuggets from an NYTimes Magazine article by Bruce Grierson, which tells the story of Olga Kotelko, a remarkable 91-year-old woman who has shattered many world records in her Masters Competition age group. Grierson references a number of studies and observations on aging that I was unaware of, particularly mentioning muscle physiologist Tanja Taivassalo. This first quote below gave me a bit of pause (since I am 68 years old and in extremely good shape)...
We start losing wind in our 40s and muscle tone in our 50s. Things go downhill slowly until around age 75, when something alarming tends to happen...“There’s a slide I show in my physical-activity-and-aging class,” Taivassalo says. “You see a shirtless fellow holding barbells, but I cover his face. I ask the students how old they think he is. I mean, he could be 25. He’s just ripped. Turns out he’s 67. And then in the next slide there’s the same man at 78, in the same pose. It’s very clear he’s lost almost half of his muscle mass, even though he’s continued to work out. So there’s something going on.” But no one knows exactly what. Muscle fibers ought in theory to keep responding to training. But they don’t. Something is applying the brakes.
This seems not to be happening in Olga Kotelko, and a number of studies are looking at what seems to stall the natural course of aging.
Exercise has been shown to add between six and seven years to a life span...Two recent studies involving middle-aged runners suggest that the serious mileage they were putting in, over years and years, had protected them at the chromosomal level. It appears that exercise may stimulate the production of telomerase, an enzyme that maintains and repairs the little caps on the ends of chromosomes that keep genetic information intact when cells divide. That may explain why older athletes aren’t just more cardiovascularly fit than their sedentary counterparts — they are more free of age-related illness in general.

Mark Tarnopolsky (professor of pediatrics and medicine at McMaster University in Hamilton) maintains that exercise in particular seems to activate a muscle stem cell called a satellite cell. With the infusion of these squeaky-clean cells into the system, the mitochondria seem to rejuvenate. (The phenomenon has been called “gene shifting.”) If this is right, exercise in older adults can roll back the odometer. Tarnopolsky has shown that after six months of twice-weekly strength training, the biochemical, physiological and genetic signature of older muscle is “turned back” nearly 15 or 20 years.

Tuesday, December 14, 2010

The truth wears off...

Jonah Lehrer has a fascinating article in the recent New Yorker which describes in detail a disturbing trend:
..all sorts of well-established, multiply confirmed findings have started to look increasingly uncertain. It's as if our facts were losing their truth: claims that have been enshrined in textbooks are suddenly unprovable. This phenomenon doesn't yet have an official name, but it's occurring across a wide range of fields, from psychology to ecology. In the field of medicine the phenomenon seems extremely widespread, affecting not only antipsychotics but also therapies from cardiac stents to Vitamin E and antidepressants...a forthcoming analysis demonstrates that the efficacy of antidepressants has gone down as much as three-fold in recent decades.
Lehrer tells the story of a number of serious scientists who have reported statistically significant effects with appropriate controls, only to find them disappear over time: seemingly iron-clad results that on repetition seemed to fade away. One example is "verbal overshadowing," in which subjects who were shown a face and asked to describe it were much less likely to recognize the face when shown it later than those who had simply looked at it. Another theory that has fallen apart is the claim that females use symmetry as a proxy for the reproductive fitness of males. A 2005 study found that of the 50 most cited clinical research studies (with randomized controlled trials), almost half were subsequently not replicated or had their effects significantly downgraded, even though these studies had guided clinical practice (hormone replacement therapy for menopausal women, low-dose aspirin to prevent heart attacks and strokes).

It is not entirely clear why this is happening, and several possibilities are mentioned:
-statistical regression to the mean, an early statistical fluke gets canceled out.
-publication bias on the part of journals, who prefer positive data over null results
-selective reporting, or significance chasing. One review found that over 90% of psychological studies reported finding the effect they were looking for at a statistically significant level (i.e., with odds of less than 5% that the result arose by chance). (One classic example of selective reporting concerns testing acupuncture in Asian countries - largely positive data - versus Western countries - less than half confirming. See today's other posting on MindBlog.)

The problem of selective reporting doesn't necessarily derive from dishonesty, but from the fundamental cognitive flaw that we like proving ourselves right and hate being wrong.  The decline effect may actually be a decline of illusion.
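To see how a publication filter alone can manufacture a "decline effect," here is a toy simulation (entirely my own sketch; every number in it is invented for illustration). A modest true effect is measured noisily; the early literature keeps only the striking results, while later replications report everything:

```python
import random
import statistics

random.seed(0)

TRUE_EFFECT = 0.2   # the real (modest) effect size, in SD units
NOISE_SD = 1.0      # measurement noise
N_PER_STUDY = 20    # subjects per study
THRESHOLD = 0.5     # observed effect needed to look striking enough to publish

def run_study():
    """Mean observed effect across one study's noisy measurements."""
    return statistics.mean(
        random.gauss(TRUE_EFFECT, NOISE_SD) for _ in range(N_PER_STUDY)
    )

# Early literature: only studies with above-threshold results get published.
early = [e for e in (run_study() for _ in range(2000)) if e > THRESHOLD]

# Later replications: every result gets reported.
later = [run_study() for _ in range(2000)]

print(f"published early estimate: {statistics.mean(early):.2f}")
print(f"replication estimate:     {statistics.mean(later):.2f}")
```

With these made-up numbers the filtered early literature overstates the effect roughly threefold, and the subsequent "decline" is pure statistics: regression to the mean plus a publication filter, no dishonesty required.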

We shouldn't throw out the baby with the bath water, as Lehrer notes in a subsequent blog posting. These problems don't mean we shouldn't believe in evolution or climate change:
One of the sad ironies of scientific denialism is that we tend to be skeptical of precisely the wrong kind of scientific claims. In poll after poll, Americans have dismissed two of the most robust and widely tested theories of modern science: evolution by natural selection and climate change. These are theories that have been verified in thousands of different ways by thousands of different scientists working in many different fields. (This doesn’t mean, of course, that such theories won’t change or get modified – the strength of science is that nothing is settled.) Instead of wasting public debate on creationism or the rhetoric of Senator Inhofe, I wish we’d spend more time considering the value of spinal fusion surgery, or second generation antipsychotics, or the verity of the latest gene association study.

More on efficacy/mechanism of acupuncture...

A loyal MindBlog reader has pointed me to a rather thorough review of numerous studies of traditional Chinese versus sham acupuncture that on balance suggest the two are equally effective in relieving musculoskeletal pain and osteoarthritis. The review then offers a thorough discussion of whether acupuncture is a placebo effect and concludes that most of the benefits of acupuncture for pain syndromes result from the treatment ritual and patient–provider interaction - which meets the definition of a placebo effect.

Monday, December 13, 2010

The weather - why I am in Florida

A personal note.... The photos below are iPhone camera shots of the patio of my Wisconsin home, where the temperature is 1 degree Fahrenheit (taken by my partner there), and the shot I've just taken from my work desk in the Fort Lauderdale condo I use as an office while here November through March.  People here are complaining about the unusual cold (high of 62 today)!

Imagining eating reduces actual eating

We are just like Pavlov's dogs, in that thinking about a treat like chocolate enhances our desire for it and our motivation to get it. After we have eaten some, our desire wanes, or habituates. Morewedge et al. make the fascinating observation that imagining repetitive consumption of the treat reduces the amount we actually eat:
The consumption of a food typically leads to a decrease in its subsequent intake through habituation—a decrease in one’s responsiveness to the food and motivation to obtain it. We demonstrated that habituation to a food item can occur even when its consumption is merely imagined. Five experiments showed that people who repeatedly imagined eating a food (such as cheese) many times subsequently consumed less of the imagined food than did people who repeatedly imagined eating that food fewer times, imagined eating a different food (such as candy), or did not imagine eating a food. They did so because they desired to eat it less, not because they considered it less palatable. These results suggest that mental representation alone can engender habituation to a stimulus.

Sleights of Mind

I attended the annual meeting of the Society for the Scientific Study of Consciousness in 2007, sending a few MindBlog dispatches from the event and writing several subsequent posts.  It was organized by Stephen Macknik and Susana Martinez-Conde, both neuroscientists at the Barrow Neurological Institute in Phoenix.  Over a number of years they have studied how the tricks of magicians can be explained by classic and recent studies in cognitive neuroscience.  They organized a fascinating session at the meeting in which several scientists and four prominent magicians showed and discussed their craft.

Macknik and Martinez-Conde have now joined with science journalist Sandra Blakeslee (whose book "The Body Has a Mind of Its Own" I reviewed in a 2007 MindBlog post) to offer an engaging book: "Sleights of Mind."  I've just finished the advance copy I was sent, found it a very interesting and enjoyable read, and plan to make it my seasonal gift to a number of friends.  They describe a large number of magic tricks and illusions, following each with an explanatory section (prefaced by "spoiler alert") that lists the visual (and other sensory) afterimages, adaptations, habituations, and cognitive and sensory shortcuts that explain why we can so easily be tricked.

Friday, December 10, 2010

The one night stand gene?

An amusing article in a recent PLoS One by Garcia et al. makes me wonder whether we may soon be requiring prospective mates to reveal not only their HIV status but also the number of tandem repeats in their dopamine receptor gene. Genetic tweaking of the receptor for the "feel good" neurotransmitter dopamine may be all it takes to ramp up sexual promiscuity and infidelity (usual disclaimer: this does NOT mean we are talking about a 'gene' for promiscuity, in spite of the title of this post). They rounded up 181 college students and asked them to answer a questionnaire about their sexual habits along with other proclivities, such as cigarette smoking and the tendency to take risks.  They also measured the variable number tandem repeats (VNTR) polymorphism in exon III of the subjects' dopamine D4 receptor gene (DRD4), which has been correlated with an array of behavioral phenotypes, particularly promiscuity and infidelity. They found that subjects having at least one 7-repeat allele (7R+) report a greater categorical rate of promiscuous sexual behavior (i.e., having ever had a “one-night stand”) and report a more than 50% increase in instances of sexual infidelity. (Genotypes were grouped as 7R+ (at least one allele 7-repeats or longer) or 7R- (both alleles less than 7-repeats); the 7R+ genotype was present in 24% of the sample.)
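The genotype grouping is simple enough to sketch in a few lines of Python (the subject data below are invented; only the 7R+/7R- rule comes from the paper):

```python
# DRD4 exon III genotypes: each subject carries two repeat-count alleles.
# Hypothetical subjects for illustration only.
subjects = {
    "s1": (4, 4),
    "s2": (4, 7),
    "s3": (7, 8),
    "s4": (2, 4),
}

def classify(alleles):
    """7R+ if at least one allele is 7 repeats or longer, else 7R-."""
    return "7R+" if max(alleles) >= 7 else "7R-"

groups = {sid: classify(alleles) for sid, alleles in subjects.items()}
print(groups)  # s2 and s3 carry the 7R+ genotype

share = sum(g == "7R+" for g in groups.values()) / len(groups)
print(f"7R+ share of sample: {share:.0%}")
```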

Thursday, December 09, 2010

Complete heresy: life based on arsenic instead of phosphorus??

I had a wrenching gut reaction on first glancing at the headlines suggesting that a bacterium had been found which could live on arsenic instead of phosphorus...  My university degrees were in biochemistry, and if one thing was certain in this world, it was that the basic recipe for life anywhere would have to contain carbon, nitrogen, oxygen, and phosphorus. Phosphorus forms the backbone of strands of DNA and RNA, as well as of ATP and NAD, two molecules key to energy transfer in a cell. Arsenic is one row down in the periodic table from phosphorus and so has similar chemical properties. It is a poison for us because it inserts into proteins and nucleic acids where phosphorus should be, and screws up their action. A look at the article by Wolfe-Simon et al., however, made me breathe a bit easier, because what they had actually done was to take a bacterium that lives under extreme conditions in Mono Lake, located in eastern California, a hypersaline and alkaline water body with high dissolved arsenic concentrations. They grew the bacteria in increasingly high levels of (radioactively labeled) arsenic while decreasing phosphorus levels, and found arsenic incorporation into the protein, lipid, nucleic acid, and metabolite fractions of the cells. So... these creatures are certainly different from us; they have evolved to be able to deal with arsenic. From Pennisi's review of this work:
Wolfe-Simon speculates that organisms like GFAJ-1 could have thrived in the arsenic-laden hydrothermal vent–like environments of early Earth, where some researchers think life first arose, and that later organisms may have adapted to using phosphorus. Others say they'll refrain from such speculation until they see more evidence of GFAJ-1's taste for arsenic and understand how the DNA and other biomolecules can still function with the element incorporated. “As in this type of game changer, some people will rightly want more proof,” says microbiologist Robert Gunsalus of the University of California, Los Angeles. “There is much to do in order to firmly put this microbe on the biological map.”

Wednesday, December 08, 2010

Narcissists - an endangered species?

You should have a look at two interesting NYTimes articles by Zanor and Carey on proposed changes to the fifth edition of the psychologist's and psychiatrist's bible, the Diagnostic and Statistical Manual of Mental Disorders (due out in 2013 and known as DSM-5), which would eliminate five of the 10 personality disorders listed in the current edition: narcissistic, dependent, histrionic, schizoid and paranoid. Rather than defining a syndrome by a cluster of related traits, with the clinician matching patients to that profile, the proposed approach chooses from a long list of personality traits those that best describe a particular patient. The older approach treats the categories as if we know them to be scientifically accurate (which we don't), and while fitting with common sense and folk psychology, it can have the nature of a self-fulfilling prophecy...not to mention making life easier for insurance companies and the courts. Zanor quotes psychologist Jonathan Shedler:
Clinicians are accustomed to thinking in terms of syndromes, not deconstructed trait ratings. Researchers think in terms of variables, and there’s just a huge schism.... the committee was stacked with a lot of academic researchers who really don’t do a lot of clinical work. We’re seeing yet another manifestation of what’s called in psychology the science-practice schism.

Tuesday, December 07, 2010

How reading rewires the brain.

Dehaene et al. have done an interesting study of how our brains deal with written language, which appeared only about 5,000 years ago and thus must use brain circuits evolved for other purposes. Not surprisingly, areas that originally evolved to process vision and spoken language respond more strongly to written words in literate than in illiterate subjects. This repurposing may have involved a tradeoff: for people who learned to read early in life, a smaller region of the left occipital-temporal cortex responded to images of faces than in the illiterate volunteers. (The figure shows brain regions that respond more strongly to text in people who can read.):
Does literacy improve brain function? Does it also entail losses? Using functional magnetic resonance imaging, we measured brain responses to spoken and written language, visual faces, houses, tools, and checkers in adults of variable literacy (10 were illiterate, 22 became literate as adults, and 31 became literate in childhood). As literacy enhanced the left fusiform activation evoked by writing, it induced a small competition with faces at this location but also broadly enhanced visual responses in fusiform and occipital cortex, extending to area V1. Literacy also enhanced phonological activation to speech in the planum temporale and afforded a top-down activation of orthography from spoken inputs. Most changes occurred even when literacy was acquired in adulthood, emphasizing that both childhood and adult education can profoundly refine cortical organization.

Monday, December 06, 2010

Advanced human achievement - simple reinforcement learning?

Sejnowski writes an interesting review of work by Desrochers et al., which examines whether basic principles of reinforcement learning, coupled with a complex environment and a large memory, might account for more complex behaviors. They show that reinforcement learning can explain not only behavioral choice in a complex environment, but also the evolution toward optimal behavior over a long time. They studied, in the monkey, the sort of eye movements we make several times a second when scanning a complex image (the scan path is dramatically influenced by what we are thinking). Here is their abstract, followed by Sejnowski's summation.
Habits and rituals are expressed universally across animal species. These behaviors are advantageous in allowing sequential behaviors to be performed without cognitive overload, and appear to rely on neural circuits that are relatively benign but vulnerable to takeover by extreme contexts, neuropsychiatric sequelae, and processes leading to addiction. Reinforcement learning (RL) is thought to underlie the formation of optimal habits. However, this theoretic formulation has principally been tested experimentally in simple stimulus-response tasks with relatively few available responses. We asked whether RL could also account for the emergence of habitual action sequences in realistically complex situations in which no repetitive stimulus-response links were present and in which many response options were present. We exposed naïve macaque monkeys to such experimental conditions by introducing a unique free saccade scan task. Despite the highly uncertain conditions and no instruction, the monkeys developed a succession of stereotypical, self-chosen saccade sequence patterns. Remarkably, these continued to morph for months, long after session-averaged reward and cost (eye movement distance) reached asymptote. Prima facie, these continued behavioral changes appeared to challenge RL. However, trial-by-trial analysis showed that pattern changes on adjacent trials were predicted by lowered cost, and RL simulations that reduced the cost reproduced the monkeys’ behavior. Ultimately, the patterns settled into stereotypical saccade sequences that minimized the cost of obtaining the reward on average. These findings suggest that brain mechanisms underlying the emergence of habits, and perhaps unwanted repetitive behaviors in clinical disorders, could follow RL algorithms capturing extremely local explore/exploit tradeoffs.
Sejnowski's review gives several other examples of reinforcement learning solving difficult problems (such as learning how to play backgammon), and concludes:
...the jury is still out on whether reinforcement learning can explain the highest levels of human achievement. Rather than add a radically new piece of machinery to the brain, such as a language module, nature may have tinkered with the existing brain machinery to make it more efficient. Children have a remarkable ability to learn through imitation and shared attention, which might greatly speed up reinforcement learning by focusing learning on important stimuli. We are also exceptional at waiting for rewards farther into the future than other species, in some cases delaying gratification to an imagined afterlife made concrete by words. Supercharged with a larger cerebral cortex, faster learning, and a longer time horizon, is it possible that we solve complex problems in mathematics the same way that monkeys find optimal scan paths?
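The local explore/exploit logic in the monkey study can be made concrete with a toy sketch. This is entirely my own construction, not the authors' model: the four-target layout, fixed reward, and simple Monte Carlo update rule are invented for illustration. An ε-greedy agent repeatedly "scans" the targets, earns a fixed reward for covering them all, and gradually settles on the scan path with the lowest movement cost:

```python
import math
import random
import statistics

random.seed(1)

# Four saccade targets, a stand-in for the grid in the free-scan task.
TARGETS = [(0, 0), (0, 1), (1, 0), (1, 1)]
REWARD = 2.0                      # fixed payoff for fixating every target once
ALPHA, EPSILON, EPISODES = 0.1, 0.2, 5000

def dist(a, b):
    return math.dist(TARGETS[a], TARGETS[b])

Q = {}  # Q[(current, frozenset(visited))][next_target] -> estimated return

def q(state, action):
    return Q.setdefault(state, {}).setdefault(action, 0.0)

def run_episode(greedy=False):
    """Scan all targets once; return the total eye-movement cost."""
    current, visited, traj = 0, {0}, []
    while len(visited) < len(TARGETS):
        state = (current, frozenset(visited))
        options = [a for a in range(len(TARGETS)) if a not in visited]
        if not greedy and random.random() < EPSILON:
            action = random.choice(options)                   # explore
        else:
            action = max(options, key=lambda a: q(state, a))  # exploit
        traj.append((state, action, dist(current, action)))
        visited.add(action)
        current = action
    # Monte Carlo update: return at each step = reward minus cost still to come.
    remaining = 0.0
    for state, action, cost in reversed(traj):
        remaining += cost
        g = REWARD - remaining
        Q[state][action] = q(state, action) + ALPHA * (g - q(state, action))
    return sum(c for _, _, c in traj)

costs = [run_episode() for _ in range(EPISODES)]
print(f"mean cost, first 100 episodes: {statistics.mean(costs[:100]):.3f}")
print(f"greedy cost after training:    {run_episode(greedy=True):.3f}")
```

Early episodes wander; after training, the greedy scan settles on the minimum-cost tour even though the reward never changes, echoing how the monkeys' saccade patterns kept morphing toward lower cost long after reward had reached asymptote.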

Friday, December 03, 2010

Political Leapfrogging

From the 'Editor's Choice' section of the Nov. 26 Science Magazine:
Although there have been many discussions of the polarized nature of American politics, do the views of elected officials match the preferences of their electorate? Bafumi and Herron sought to answer this question by comparing a national opinion survey of American voters (the Cooperative Congressional Election Study; CCES) with legislator voting records of the 109th (2005–2006) and 110th (2007–2008) Congresses. In many cases, the CCES questions were similar to (or the same as) actual congressional roll call votes, which allowed for better comparison. By developing a linear scale bounded by representatives (or CCES respondents) who had taken consistently liberal or conservative positions, the authors found that members of Congress were more extreme than the voters they represented. The median member of the 109th House of Representatives was more conservative than the median American voter, but the median member of the 110th House of Representatives was more liberal. Thus, voting out one extremist usually led to replacement by someone equally extreme, but of the opposite party. The authors refer to this as “leapfrogging” because the moderate views of the median American voter are leapfrogged during the turnover. Although the turnover was similar in the Senate, overall it appeared to be more moderate.

The Article: Amer. Polit. Sci. Rev. 104, 519 (2010).

Thursday, December 02, 2010

What makes the human brain special...

Our human brains are bigger than those of our ape relatives, in particular the frontal lobes that are required for advanced cognitive functions. Semendeferi et al. have focused on a particular area of the frontal lobes: Brodmann area 10 (BA 10), which sits at the pole of the frontal lobes just above the eyes and is thought to be involved in abstract thinking and other sophisticated cognition. They find not only that this area is relatively larger in humans, but that there is more space between nerve cell bodies in human brains than in the brains of apes, allowing room for connections between neurons. (In contrast, there were only subtle differences in cell body density among humans, chimpanzees, bonobos, gorillas, orangutans, and gibbons in the visual, somatosensory, and motor cortices.) Their analysis looked at the cells in layer three of the cortex, which communicates with other areas of the brain. BA 10 in humans also contains a higher concentration of so-called von Economo neurons, which are generally thought to be high-performance neurons specialized for rapidly transmitting information from one brain region to another.


More space between neurons in the human brain (right) compared with the chimp brain (left) could allow more complex neural wiring.

The authors suggest that human brain evolution was likely characterized by an increase in the number and width of cortical minicolumns and the space available for interconnectivity between neurons in the frontal lobe, especially the prefrontal cortex.

Wednesday, December 01, 2010

How comfort foods reduce stress.

Interesting work from Ulrich-Lai et al.  Apparently sweet tastes (and sex) reduce stress responses, an effect that depends on activity in the basolateral amygdala:
Individuals often eat calorically dense, highly palatable “comfort” foods during stress for stress relief. This article demonstrates that palatable food intake (limited intake of sucrose drink) reduces neuroendocrine, cardiovascular, and behavioral responses to stress in rats. Artificially sweetened (saccharin) drink reproduces the stress dampening, whereas oral intragastric gavage of sucrose is without effect. Together, these results suggest that the palatable/rewarding properties of sucrose are necessary and sufficient for stress dampening. In support of this finding, another type of natural reward (sexual activity) similarly reduces stress responses. Ibotenate lesions of the basolateral amygdala (BLA) prevent stress dampening by sucrose, suggesting that neural activity in the BLA is necessary for the effect. Moreover, sucrose intake increases mRNA and protein expression in the BLA for numerous genes linked with functional and/or structural plasticity. Lastly, stress dampening by sucrose is persistent, which is consistent with long-term changes in neural activity after synaptic remodeling. Thus, natural rewards, such as palatable foods, provide a general means of stress reduction, likely via structural and/or functional plasticity in the BLA. These findings provide a clearer understanding of the motivation for consuming palatable foods during times of stress and influence therapeutic strategies for the prevention and/or treatment of obesity and other stress-related disorders.

Tuesday, November 30, 2010

This is your brain on metaphors

A loyal MindBlog reader has pointed me to an essay by one of my heroes, Robert Sapolsky, written for The Stone, a blog hosted by The New York Times that serves as a forum for contemporary philosophers. He discusses how the brain has evolved to link the literal and the metaphorical by duct-taping metaphors and symbols to whichever pre-existing brain areas provided the closest fit. The insula, for example, registers gustatory disgust.
Not only does the insula “do” sensory disgust; it does moral disgust as well. Because the two are so viscerally similar. When we evolved the capacity to be disgusted by moral failures, we didn’t evolve a new brain region to handle it. Instead, the insula expanded its portfolio.

...there’s a fancier, more recently evolved brain region in the frontal cortex called the anterior cingulate that’s involved in the subjective, evaluative response to pain...When humans evolved the ability to be wrenched with feeling the pain of others, where was it going to process it? It got crammed into the anterior cingulate. And thus it “does” both physical and psychic pain.
Sapolsky reviews a range of other studies showing how the brain links the literal and metaphorical, several of which have been the subjects of previous posts on this blog (cleanliness influencing moral judgements, holding a hot versus cold liquid influencing personality judgements, the weight of a resume influencing the judged gravity of a job applicant, etc.).
The viscera that can influence moral decision making and the brain’s confusion about the literalness of symbols can have enormous consequences. Part of the emotional contagion of the genocide of Tutsis in Rwanda arose from the fact that when militant Hutu propagandists called for the eradication of the Tutsi, they iconically referred to them as “cockroaches.” Get someone to the point where his insula activates at the mention of an entire people, and he’s primed to join the bloodletting.
And, an example of the sort in my recent post on resolving conflict:
But if the brain confusing reality and literalness with metaphor and symbol can have adverse consequences, the opposite can occur as well. At one juncture just before the birth of a free South Africa, Nelson Mandela entered secret negotiations with an Afrikaans general with death squad blood all over his hands, a man critical to the peace process because he led a large, well-armed Afrikaans resistance group. They met in Mandela’s house, the general anticipating tense negotiations across a conference table. Instead, Mandela led him to the warm, homey living room, sat beside him on a comfy couch, and spoke to him in Afrikaans. And the resistance melted away.
...Nelson Mandela was wrong when he advised, “Don’t talk to their minds; talk to their hearts.” He meant talk to their insulas and cingulate cortices and all those other confused brain regions, because that confusion could help make for a better world.

Monday, November 29, 2010

Brain clutter - what's left undone lingers on

In the 'Editor's Choice' section of the Nov. 19 Science, Gilbert Chin summarizes recent work by Masicampo and Baumeister showing that unconscious unfulfilled goals can compromise our fluid intelligence.
...They demonstrate that humans suffer from a hangover due to unfulfilled goals: When people were primed to strive for honesty as a goal and then required to write about an episode in which they had acted dishonestly, the induced sense of incompleteness negatively affected their ability to solve anagrams, a task that relies on fluid intelligence. Neither the prime alone nor the recounting of the episode sufficed, and people who had been primed but then wrote about someone else's dishonesty were not similarly afflicted. Furthermore, the unfulfilled goal, though detectable with implicit measures of activation, did not rise to the level of reportable or conscious awareness.
Here is the Masicampo and Baumeister abstract:
Even after one stops actively pursuing a goal, many mental processes remain focused on the goal (e.g., the Zeigarnik effect), potentially occupying limited attentional and working memory resources. Five studies examined whether the processes associated with unfulfilled goals would interfere with tasks that require the executive function, which has a limited focal capacity and can pursue only one goal at a time. In Studies 1 and 2, activating a goal nonconsciously and then manipulating unfulfillment caused impairments on later tasks requiring fluid intelligence (solving anagrams; Study 1) and impulse control (dieting; Study 2). Study 3 showed that impairments were specific to executive functioning tasks: an unfulfilled goal impaired performance on logic problems but not on a test of general knowledge (only the former requires executive functions). Study 4 found that the effect was moderated by individual differences; participants who reported a tendency to shift readily amongst their various pursuits showed no task interference. Study 5 found that returning to fulfill a previously frustrated goal eliminated the interference effect. These findings provide converging evidence that unfulfilled goals can interfere with later tasks, insofar as they require executive functions.

Friday, November 26, 2010

Social cognition in reptiles

A MindBlog reader referred me to this interesting post on a blog, "The Thoughtful Animal," that I had been unaware of and have now added to the BlogRoll in the right column of MindBlog.
If several others are all directing their attention at a specific point in space, there might be something important there. We're naturally aware of where others are looking. And so are lots of other animals.

Gaze-following is the ability of an animal to orient its gaze to match that of another animal, and though this ability has been observed in mammals and birds, the phylogeny of gaze-following is still uncertain...gaze-sensitivity - the ability of an animal to avoid the gaze of another animal - seems to be somewhat more common in the animal kingdom, having been observed in mammals and birds, and some reptiles and fish. Gaze-sensitivity may have evolved as an anti-predator defense; a theory known as the "evil eye hypothesis" suggests that the awareness of the gaze direction of a predator would help an animal know when it was safe to move about or come out of a hiding spot. Gaze-following requires gaze-sensitivity; indeed, gaze-following develops in human children after gaze-sensitivity. It therefore follows that gaze-following is cognitively more complex than gaze-sensitivity.

Are these abilities also present in reptiles? If so, it could suggest that all amniotic species (birds, mammals, and reptiles) share them, and that it emerged quite a long time ago, in evolutionary terms...Eight captive-bred red-footed tortoises were socially housed for six months prior to this experiment. One tortoise, the demonstrator (the same individual was always used as demonstrator), was placed on one side of a tank, and a second tortoise, the observer, was placed on the opposite side of the tank. They were separated by transparent screens. Above, a small opaque partition separated the two sides of the tank. The investigators directed a small laser beam towards the opaque partition on the side of the demonstrator. Once the demonstrator noticed the light, she invariably looked up at it. The experimenters varied the color of the light to maintain her interest, such that she would not habituate to it. When the demonstrator looked up, would the observer direct his or her gaze up as well? If so, it would suggest that red-footed tortoises, despite their solitary existence, are sensitive to the gaze direction of their conspecifics.

There was a clear difference between the conditions, with the observer tortoises looking up in the experimental condition significantly more than in either of the control conditions. This was the first study to demonstrate that reptiles are able to follow the gaze of conspecifics, suggesting that gaze following may occur more often in the animal kingdom than previously thought...It is possible that the common ancestor of the three amniotic classes - birds, mammals, and reptiles - possessed the ability to co-orient and follow the gaze of others, rather than gaze-following having evolved two or three separate times. There was theoretically little selective pressure for such an ability to have emerged in this particular species, given their solitary lifestyle. Another possibility, however, is that gaze-sensitivity may be innate, and that gaze-following builds on this innate mechanism through associative learning. This could also explain the results of this experiment, as the tortoises had six months of social experiences prior to the beginning of the study.

Thursday, November 25, 2010

Becoming a GPS zombie - eroding your brain.

Almost every day I get an "I just came across your blog, and thought you might be interested in....." email, which is basically a request that I link to a site to increase its web traffic. I've started to reflexively delete such emails, but paused at one from the health editor of msnbc.com, Melissa Dahl, pointing me to their piece on recent work done at McGill University, with comments from one of the collaborators, Veronica Bohbot (a co-author of no fewer than 14 papers presented at the recent annual meeting of the Society for Neuroscience, which I used to loyally attend). All deal with the two major strategies we use to navigate our world: a spatial strategy, depending on our hippocampus, which builds cognitive maps using landmarks as visual cues; and a stimulus-response strategy, depending on the caudate nucleus, in which we follow 'turn left,' 'turn right' instructions of the sort given by a GPS device. During aging we shift increasingly from the spatial to the response strategy as our hippocampal function declines. The McGill workers found a greater volume of grey matter in the hippocampus of older adults who used spatial strategies, and these adults scored higher on a standardized cognition test used to help diagnose mild cognitive impairment, which is often a precursor to Alzheimer's disease. These findings suggest that using spatial memory may maintain the function of the hippocampus and improve our quality of life as we age. Another example of "use it or lose it": a GPS device spares us the work of exercising our hippocampal spatial navigation circuits, and thus could easily hasten their decay.

Wednesday, November 24, 2010

Trouble with numbers? Try zapping your brain.

Cohen Kadosh et al., in the Nov. 4 issue of Current Biology (noted by ScienceNow), report that administering a small electrical current (transcranial direct current stimulation) to stimulate a center implicated in math operations located on the right side of the parietal lobe (beneath the crown of the head) can enhance a person's ability to process numbers for up to 6 months. The mild stimulation is said to be harmless, and might be used to restore numerical skills in people suffering from degenerative diseases or stroke. Here is their abstract:
Highlights
* Brain stimulation to the parietal cortex can enhance or impair numerical abilities
* The effects were specific to the polarity of the current
* The improvement in numerical abilities lasts up to 6 months
* The brain stimulation affected specifically the material that was recently learned
Summary
Around 20% of the population exhibits moderate to severe numerical disabilities, and a further percentage loses its numerical competence during the lifespan as a result of stroke or degenerative diseases. In this work, we investigated the feasibility of using noninvasive stimulation to the parietal lobe during numerical learning to selectively improve numerical abilities. We used transcranial direct current stimulation (TDCS), a method that can selectively inhibit or excite neuronal populations by modulating GABAergic (anodal stimulation) and glutamatergic (cathodal stimulation) activity. We trained subjects for 6 days with artificial numerical symbols, during which we applied concurrent TDCS to the parietal lobes. The polarity of the brain stimulation specifically enhanced or impaired the acquisition of automatic number processing and the mapping of number into space, both important indices of numerical proficiency. The improvement was still present 6 months after the training. Control tasks revealed that the effect of brain stimulation was specific to the representation of artificial numerical symbols. The specificity and longevity of TDCS on numerical abilities establishes TDCS as a realistic tool for intervention in cases of atypical numerical development or loss of numerical abilities because of stroke or degenerative illnesses.

Tuesday, November 23, 2010

Predicting the future with web search queries

Goel et al. find that online activity at any moment in time not only provides a snapshot of the instantaneous interests, concerns, and intentions of the global population, but it is also predictive of what people will do in the near future:
Recent work has demonstrated that Web search volume can “predict the present,” meaning that it can be used to accurately track outcomes such as unemployment levels, auto and home sales, and disease prevalence in near real time. Here we show that what consumers are searching for online can also predict their collective future behavior days or even weeks in advance. Specifically we use search query volume to forecast the opening weekend box-office revenue for feature films, first-month sales of video games, and the rank of songs on the Billboard Hot 100 chart, finding in all cases that search counts are highly predictive of future outcomes. We also find that search counts generally boost the performance of baseline models fit on other publicly available data, where the boost varies from modest to dramatic, depending on the application in question... We conclude that in the absence of other data sources, or where small improvements in predictive performance are material, search queries provide a useful guide to the near future.
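The abstract above describes augmenting a baseline forecasting model with search-query volume. A minimal sketch of that idea, using invented numbers (not Goel et al.'s data) and simple ordinary least squares, might look like this — the point is only that adding search counts as a regressor can lift the fit of a baseline model:

```python
import numpy as np

# Hypothetical illustration (numbers are invented, not from Goel et al.):
# for a handful of films, pre-release search volume (arbitrary units), a
# baseline predictor (number of opening screens), and opening-weekend
# revenue in millions of dollars.
searches = np.array([120.0, 340.0, 90.0, 500.0, 210.0, 60.0])
screens = np.array([2500.0, 3600.0, 1800.0, 4000.0, 3000.0, 1200.0])
revenue = np.array([18.0, 55.0, 12.0, 80.0, 33.0, 7.0])

def r_squared(X, y):
    """Fit ordinary least squares and return the R^2 of the in-sample fit."""
    coef, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return 1.0 - np.sum(resid**2) / np.sum((y - y.mean())**2)

ones = np.ones_like(revenue)
baseline = r_squared(np.column_stack([ones, screens]), revenue)
augmented = r_squared(np.column_stack([ones, screens, searches]), revenue)

print(f"baseline R^2:    {baseline:.3f}")
print(f"with search R^2: {augmented:.3f}")
# In-sample R^2 can only rise when a regressor is added; the interesting
# empirical question in the paper is how large that boost is out of sample.
```

A real analysis would, as the authors note, evaluate the boost on held-out data rather than in-sample fit.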
And, in a similar vein, Preis et al. find a strong correlation between queries submitted to Google and weekly fluctuations in stock trading. They introduce a method for quantifying complex correlations in time series and find a clear tendency for search volume and transaction volume time series to show recurring patterns. From the ScienceNow summary:
The Google data could not predict the weekly fluctuations in stock prices. However, the team found a strong correlation between Internet searches for a company's name and its trade volume, the total number of times the stock changed hands over a given week. So, for example, if lots of people were searching for computer manufacturer IBM one week, there would be a lot of trading of IBM stock the following week. But the Google data couldn't predict its price, which is determined by the ratio of shares that are bought and sold.

At least not yet. Neil Johnson, a physicist at the University of Miami in Florida, says that if researchers could drill down even farther into the Google Trends data—so that they could view changes in search terms on a daily or even an hourly basis—they might be able to predict a rise or fall in stock prices. They might even be able to forecast financial crises. It would be an opportunity for Google "to really collaborate with an academic group in a new area," he says. Then again, if the hourly stream of search queries really can predict stock price changes, Google might want to keep those data to itself.

Monday, November 22, 2010

Attention span and focus - problem/not a problem?

I have done several posts on how heavy computer and internet use might nudge our brain processes (in either a positive or detrimental way), so I was entertained by reading somewhat contrasting takes on this issue in yesterday's Sunday NY Times, Virginia Heffernan writing in the Sunday Magazine on "The Attention-Span Myth," and Matt Richtel's "Growing Up Digital, Wired for Distraction."
Clips from Heffernan:
...attention spans...have become the digital-age equivalent of souls...which might be measured by the psychologist’s equivalent of a tailor’s tape?...isn’t there something just unconvincing about the idea that an occult “span” in the brain makes certain cultural objects more compelling than others? So a kid loves the drums but can hardly get through a chapter of “The Sun Also Rises”; and another aces algebra tests but can’t even understand how Call of Duty is played.

In other eras, distractibility wasn’t considered shameful. It was regularly praised, in fact — as autonomy, exuberance and versatility. To be brooding, morbid, obsessive or easily mesmerized was thought much worse than being distractible. In “Moby-Dick,” Starbuck tries to distract Ahab from his monomania with evocations of family life in Nantucket...sitting silently without fidgeting: that’s essentially what we want of children with bum attention spans, isn’t it? The first sign that a distractible child is doing “better” — with age or Adderall, say — is that he sits still...At some point, we stopped calling Tom Sawyer-style distractibility either animal spirits or a discipline problem. We started to call it sick...the problem with the attention-span discourse is that it’s founded on the phantom idea of an attention span. A healthy “attention span” becomes just another ineffable quality to remember having, to believe you’ve lost, to worry about your kids lacking, to blame the culture for destroying. Who needs it?
The Richtel article tells stories about students at Woodside High School in Silicon Valley's Redwood City, California.  "Here, as elsewhere, it is not uncommon for students to send hundreds of text messages a day or spend hours playing video games, and virtually everyone is on Facebook." It is in environments like these that a generation of kids is being raised whose brains might be wired differently: habituated to distraction and to switching tasks, not to focusing.  Many of Richtel's stories deal with the contest between the immediate gratifications of distractibility and the homework and reading that build a self, and a future. Richtel also describes several of the relevant academic studies.

Greedy Geezers

Apparently my demographic group (seniors on Medicare) radically changed its voting behavior in the recent midterm elections and is strongly opposed to the new health care legislation, saying in effect, “I’ve got mine—good luck getting yours.” In the Nov. 21 New Yorker, James Surowiecki offers a nice commentary:
In the 2006 midterm election, seniors split their vote evenly between House Democrats and Republicans. This time, they went for Republicans by a twenty-one-point margin...The election has been termed the “revolt of the middle class.” But it might more accurately be called the revolt of the retired...The real sticking point was health-care reform, which the elderly didn’t like from the start...the very people who currently enjoy the benefits of a subsidized, government-run insurance system are intent on keeping others from getting the same treatment...seniors today get far more out of Medicare than they ever put in, which means that their medical care is paid for by current taxpayers...the subsidies that seniors get aren’t fundamentally different from the ones that the Affordable Care Act will offer some thirty million Americans who don’t have insurance.

Current sentiment among seniors seems like a classic example of an effect that the economist Benjamin Friedman identified in his magisterial book “The Moral Consequences of Economic Growth”: in hard times voters get more selfish. Historically, Friedman notes, times of stagnation have been times of reaction, with voters bent on protecting their own interests, hostile to outsiders, and less interested in social welfare...the Democrats’ loss of support among the elderly was more a matter of economic fundamentals than of political framing. If the economy were growing briskly, it’s unlikely that the health-care bill would have become so politically toxic.

Friday, November 19, 2010

Using invisible visual signals to see things.

Di Luca et al. have done an ingenious experiment that demonstrates that an invisible signal can be recruited as a cue for perceptual appearance. Regularities between the 'invisible' (below perceptual threshold) signal and a perceived signal can be unconsciously learned - perception can rapidly undergo “structure learning” by automatically picking up novel contingencies between sensory signals, thus automatically recruiting signals for novel uses during the construction of a percept. It is worthwhile to step through their description of how the experiment works:
To convincingly show that new perceptual meanings for sensory signals can be learned automatically, one needs an “invisible visual signal,” that is, a signal that is sensed but that has no effect on visual appearance. The gradient of vertical binocular disparity, created by 2% vertical magnification of one eye's image (the eye of vertical magnification [EVM]), can be such a signal. In several control experiments, we ensured that EVM could not be seen by the participants.

The stimulus we used was a horizontal cylinder rotating either front side up or front side down. In its basic form, the cylinder was defined by horizontal lines with fading edges. The lines moved up and down on the screen, thereby creating the impression of a rotating cylinder with ambiguous rotation direction, so participants perceived it rotating sometimes as front side up and sometimes as front side down.

We tested whether the signal created by 2% vertical magnification could be recruited to control the perceived rotation direction of this ambiguously rotating cylinder. To do so, we exposed participants to a new contingency. We used a disambiguated version of the cylinder that contained additional depth cues: dots provided horizontal disparity, and a rectangle occluded part of the farther surface of the cylinder. These cues disambiguated the perceived rotation direction of the cylinder. In training trials, we exposed participants to cylinder stimuli in which EVM and the unambiguously perceived rotation direction were contingent upon one another. To test whether EVM had an effect on the perceived rotation direction of the cylinder, we interleaved these training trials with probe trials that had ambiguous rotation direction. If participants recruited EVM to the new use, then perceived rotation direction on probe trials would come to depend on EVM. If participants did not recruit EVM, then perceived rotation direction would be independent of EVM.

Importantly, after exposure to the new contingency, all participants saw a majority of probe trials consistent with the rotation direction contingent with EVM during exposure—that is, the learning effect was highly significant.

Thursday, November 18, 2010

How life experiences alter what our genes do.

It has been a frustration that we are unable to pinpoint causative genetic effects in many complex diseases and behavioral abnormalities. Many think the missing information resides in our nongenetic cellular memory, which records developmental and environmental cues. "Epigenetics" has become the catch-all phrase for many environmentally influenced genetic regulatory systems involving DNA methylation, histone modification, nucleosome location, or noncoding RNA. The basic requirement for an epigenetic system is that it be heritable, self-perpetuating, and reversible. Benedict Carey has done a nice non-technical article on epigenetics, how people’s experience and environment affect the function of their genes. Some clips:
Genes are far more than protein machines, pumping out their product like a popcorn maker. Many carry what are, in effect, chemical attachments: compounds acting on the DNA molecule that regulate when, where or how much protein is made, without altering the recipe itself. Studies suggest that such add-on, or epigenetic, markers develop as an animal adapts to its environment, whether in the womb or out in the world — and the markers can profoundly affect behavior.

...researchers have shown that affectionate mothering alters the expression of genes, allowing them to dampen their physiological response to stress. These biological buffers are then passed on to the next generation: rodents and nonhuman primates biologically primed to handle stress tend to be more nurturing to their own offspring.

...Epigenetic markers may likewise hinder normal development: the offspring of parents who experience famine are at heightened risk for developing schizophrenia, some research suggests — perhaps because of the chemical signatures on the genes that parents pass on. Another recent study found evidence that, in some people with autism, epigenetic markers had silenced the gene which makes the receptor for the hormone oxytocin. Oxytocin oils the brain’s social circuits, and is critical in cementing relationships.

...The National Institutes of Health is sponsoring about 100 studies looking at the relationship between epigenetic markers and behavior problems, including drug abuse, post-traumatic stress, bipolar disorder and schizophrenia, compared with just a handful of such studies a decade ago.

Wednesday, November 17, 2010

Tiny touches of the tongue - the elegance of cats.

I've learned something about my constant companions, two Abyssinian cats named Marvin and Melvin.  I've always wondered how the rapid, petite tongue flickers they use while drinking could get much water into their mouths.  Now two MIT physicists have the simple answer: their tongues perform a complex maneuver that pits gravity against inertia in a delicate balance. Using high-speed photography, they found that:
...cats rest the tips of their tongues on the liquid's surface without penetrating it. The water sticks to the cat's tongue and is pulled upward as the cat draws its tongue into its mouth. When the cat closes its mouth, it breaks the liquid column but still keeps its chin and whiskers dry. Here is the full text of their article.
From Nicholas Wade's description:
What happens is that the cat darts its tongue, curving the upper side downward so that the tip lightly touches the surface of the water...The tongue is then pulled upward at high speed, drawing a column of water behind it...Just at the moment that gravity finally overcomes the rush of the water and starts to pull the column down — snap! The cat’s jaws have closed over the jet of water and swallowed it...The cat laps four times a second — too fast for the human eye to see anything but a blur — and its tongue moves at a speed of one meter per second.
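The gravity-versus-inertia balance described above invites a back-of-envelope check. This is my own rough dimensional argument, not the authors' full analysis: a column drawn up at tongue speed v is pulled back by gravity g, so the two forces balance on a timescale of roughly t ~ v/g, and the optimal lap rate should be of that order.

```python
# Rough dimensional estimate (my own sketch, not the authors' derivation):
# a water column rising at tongue speed v decelerates under gravity g, so
# inertia and gravity balance on a timescale of roughly t ~ v / g. The cat
# should snap its jaws shut on that timescale, giving a lap rate ~ 1/t.
v = 1.0   # tongue speed in m/s, as reported in the article
g = 9.8   # gravitational acceleration, m/s^2

t = v / g            # ~0.1 s before gravity overcomes the column's rise
lap_rate = 1.0 / t   # ~10 laps per second

print(f"timescale: {t:.2f} s, predicted lap rate ~ {lap_rate:.0f} per second")
```

The estimate lands within a factor of a few of the observed four laps per second, which is about all a dimensional argument can promise.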

Tuesday, November 16, 2010

A wandering mind is an unhappy mind.

Killingsworth and Gilbert report a fascinating study in the Nov. 12 issue of Science Magazine. They developed a smartphone technology to sample people’s ongoing thoughts, feelings, and actions and found that people are thinking about what is not happening almost as often as they are thinking about what is, and that this typically makes them unhappy. Here are some excerpts:
Unlike other animals, human beings spend a lot of time thinking about what is not going on around them, contemplating events that happened in the past, might happen in the future, or will never happen at all. Indeed, "stimulus-independent thought" or "mind wandering" appears to be the brain’s default mode of operation...although this ability is a remarkable evolutionary achievement that allows people to learn, reason, and plan, it may have an emotional cost.
To measure the emotional consequences of mind wandering, the authors developed a Web application for the iPhone for collecting real-time reports from large numbers of people.
The application contacts participants through their iPhones at random moments during their waking hours, presents them with questions, and records their answers to a database at www.trackyourhappiness.org. The database currently contains nearly a quarter of a million samples from about 5000 people from 83 different countries who range in age from 18 to 88 and who collectively represent every one of 86 major occupational categories.

To find out how often people’s minds wander, what topics they wander to, and how those wanderings affect their happiness, we analyzed samples from 2250 adults (58.8% male, 73.9% residing in the United States, mean age of 34 years) who were randomly assigned to answer a happiness question ("How are you feeling right now?") answered on a continuous sliding scale from very bad (0) to very good (100), an activity question ("What are you doing right now?") answered by endorsing one or more of 22 activities adapted from the day reconstruction method (10, 11), and a mind-wandering question ("Are you thinking about something other than what you’re currently doing?") answered with one of four options: no; yes, something pleasant; yes, something neutral; or yes, something unpleasant. Our analyses revealed three facts.

First, people’s minds wandered frequently, regardless of what they were doing. Mind wandering occurred in 46.9% of the samples and in at least 30% of the samples taken during every activity except making love. The frequency of mind wandering in our real-world sample was considerably higher than is typically seen in laboratory experiments. Surprisingly, the nature of people’s activities had only a modest impact on whether their minds wandered and had almost no impact on the pleasantness of the topics to which their minds wandered.

Second, multilevel regression revealed that people were less happy when their minds were wandering than when they were not..., and this was true during all activities, including the least enjoyable. Although people’s minds were more likely to wander to pleasant topics (42.5% of samples) than to unpleasant topics (26.5% of samples) or neutral topics (31% of samples), people were no happier when thinking about pleasant topics than about their current activity...and were considerably unhappier when thinking about neutral topics ... or unpleasant topics... than about their current activity (Figure, bottom). Although negative moods are known to cause mind wandering, time-lag analyses strongly suggested that mind wandering in our sample was generally the cause, and not merely the consequence, of unhappiness.

Third, what people were thinking was a better predictor of their happiness than was what they were doing. The nature of people’s activities explained 4.6% of the within-person variance in happiness and 3.2% of the between-person variance in happiness, but mind wandering explained 10.8% of within-person variance in happiness and 17.7% of between-person variance in happiness. The variance explained by mind wandering was largely independent of the variance explained by the nature of activities, suggesting that the two were independent influences on happiness.
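The time-lag analysis mentioned above has a simple logic: if wandering causes unhappiness, wandering now should predict lower happiness at the next sample, while low happiness now should not predict wandering next. A toy sketch of that logic on simulated data (invented, not the authors' dataset or statistical model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated experience-sampling series built so that mind wandering *causes*
# a happiness dip at the next sample. This illustrates the logic of a
# time-lag analysis; the numbers are invented.
n = 2000
wander = rng.random(n) < 0.47            # wandering in ~47% of samples, as in the study
happy = 70.0 + rng.normal(0, 5, n)       # baseline happiness on a 0-100 scale
happy[1:] -= 8.0 * wander[:-1]           # wandering now lowers happiness next time

def lagged_corr(x, y):
    """Correlation of x at time t with y at time t+1."""
    return np.corrcoef(x[:-1], y[1:])[0, 1]

forward = lagged_corr(wander.astype(float), happy)  # wandering -> later happiness
reverse = lagged_corr(happy, wander.astype(float))  # happiness -> later wandering

print(f"wandering -> later happiness: r = {forward:.2f}")
print(f"happiness -> later wandering: r = {reverse:.2f}")
# The forward correlation comes out strongly negative while the reverse is
# near zero: the asymmetry that suggests wandering is a cause, not merely a
# consequence, of unhappiness.
```

By construction the simulation produces the asymmetric pattern; the empirical finding is that real samples showed the same asymmetry.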


Figure - Mean happiness reported during each activity (top) and while mind wandering to unpleasant topics, neutral topics, pleasant topics or not mind wandering (bottom). Dashed line indicates mean of happiness across all samples. Bubble area indicates the frequency of occurrence. The largest bubble ("not mind wandering") corresponds to 53.1% of the samples, and the smallest bubble ("praying/worshipping/meditating") corresponds to 0.1% of the samples.
ADDED NOTE: I just opened my New York Times this morning and found a piece by John Tierney on this work.