Friday, December 10, 2010

The one night stand gene?

An amusing article in a recent PLoS One by Garcia et al. makes me wonder whether we may soon be requiring prospective mates to reveal not only their HIV status but also the number of tandem repeats in their dopamine receptor gene. A genetic tweak of the receptor for the "feel good" neurotransmitter dopamine may be all it takes to ramp up sexual promiscuity and infidelity (usual disclaimer: this does NOT mean we are talking about a 'gene' for promiscuity, in spite of the title of this post). The authors recruited 181 college students and asked them to answer a questionnaire about their sexual habits along with other proclivities, such as cigarette smoking and the tendency to take risks. They also genotyped the variable number tandem repeat (VNTR) polymorphism in exon III of the subjects' dopamine D4 receptor gene (DRD4), which has been correlated with an array of behavioral phenotypes, particularly promiscuity and infidelity. They found that subjects having at least one 7-repeat allele (7R+) report a greater categorical rate of promiscuous sexual behavior (i.e., having ever had a “one-night stand”) and report a more than 50% increase in instances of sexual infidelity. (Genotypes were grouped as 7R+ (at least one allele of 7 repeats or longer) or 7R- (both alleles shorter than 7 repeats); the 7R+ genotype was present in 24% of the sample.)
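For concreteness, here is a minimal sketch of the kind of 2x2 genotype-by-behavior comparison that underlies a claim like this. The counts are invented placeholders, not Garcia et al.'s data:

```python
# Hypothetical 2x2 table: genotype group vs. "ever had a one-night stand".
# These counts are invented for illustration; they are NOT Garcia et al.'s data.
table = {
    "7R+": {"yes": 30, "no": 14},
    "7R-": {"yes": 60, "no": 77},
}

def rate(group):
    g = table[group]
    return g["yes"] / (g["yes"] + g["no"])

odds_ratio = (table["7R+"]["yes"] * table["7R-"]["no"]) / \
             (table["7R+"]["no"] * table["7R-"]["yes"])

print(f"7R+ rate: {rate('7R+'):.2f}, 7R- rate: {rate('7R-'):.2f}, odds ratio: {odds_ratio:.2f}")
```

A higher rate (and an odds ratio above 1) in the 7R+ row is the sort of pattern the paper reports; the real analysis would of course also test for statistical significance.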

Thursday, December 09, 2010

Complete heresy: life based on arsenic instead of phosphorus??

I had a wrenching gut reaction on first glancing at the headlines suggesting that a bacterium had been found which could live on arsenic instead of phosphorus... My university degrees were in biochemistry, and if one thing was certain in this world, it was that the basic recipe for life anywhere would have to contain carbon, nitrogen, oxygen, and phosphorus. Phosphorus forms the backbone of strands of DNA and RNA, as well as of ATP and NAD, two molecules key to energy transfer in a cell. Arsenic sits one row below phosphorus in the periodic table and so has similar chemical properties. It is a poison for us because it inserts into proteins and nucleic acids where phosphorus should go, and screws up their action. A look at the article by Wolfe-Simon et al., however, made me breathe a bit easier, because what they had actually done was to take a bacterium that lives under extreme conditions in Mono Lake, a hypersaline and alkaline water body in eastern California with high dissolved arsenic concentrations. They grew the bacteria in increasingly high levels of arsenic (radioactively labeled), while decreasing phosphorus levels, and found arsenic incorporated into the protein, lipid, nucleic acid, and metabolite fractions of the cells. So... these creatures are certainly different from us; they have evolved to be able to deal with arsenic. From Pennisi's review of this work:
Wolfe-Simon speculates that organisms like GFAJ-1 could have thrived in the arsenic-laden hydrothermal vent–like environments of early Earth, where some researchers think life first arose, and that later organisms may have adapted to using phosphorus. Others say they'll refrain from such speculation until they see more evidence of GFAJ-1's taste for arsenic and understand how the DNA and other biomolecules can still function with the element incorporated. “As in this type of game changer, some people will rightly want more proof,” says microbiologist Robert Gunsalus of the University of California, Los Angeles. “There is much to do in order to firmly put this microbe on the biological map.”

Wednesday, December 08, 2010

Narcissists - an endangered species?

You should have a look at two interesting NYTimes articles by Zanor and Carey on proposed changes to the fifth edition of the psychologists' and psychiatrists' bible, the Diagnostic and Statistical Manual of Mental Disorders (due out in 2013, and known as DSM-5), which eliminate five of the 10 personality disorders listed in the current edition: narcissistic, dependent, histrionic, schizoid and paranoid. Rather than defining a syndrome by a cluster of related traits, with the clinician matching patients to that profile, the newly proposed approach chooses from a long list of personality traits those that best describe a particular patient. The older approach treats the categories as if we know them to be scientifically accurate (which we don't), and while it fits with common sense and folk psychology, it can take on the character of a self-fulfilling prophecy...not to mention making life easier for insurance companies and the courts. Zanor quotes psychologist Jonathan Shedler:
Clinicians are accustomed to thinking in terms of syndromes, not deconstructed trait ratings. Researchers think in terms of variables, and there’s just a huge schism.... the committee was stacked with a lot of academic researchers who really don’t do a lot of clinical work. We’re seeing yet another manifestation of what’s called in psychology the science-practice schism.”

Tuesday, December 07, 2010

How reading rewires the brain.

Dehaene et al. have done an interesting study of how our brains deal with written language, which appeared only about 5,000 years ago and thus must use brain circuits evolved for other purposes. Not surprisingly, areas that originally evolved to process vision and spoken language respond more strongly to written words in literate than in illiterate subjects. This repurposing may have involved a tradeoff: in people who learned to read early in life, a smaller region of the left occipital-temporal cortex responded to images of faces than in the illiterate volunteers. (The figure shows brain regions that respond more strongly to text in people who can read.):
Does literacy improve brain function? Does it also entail losses? Using functional magnetic resonance imaging, we measured brain responses to spoken and written language, visual faces, houses, tools, and checkers in adults of variable literacy (10 were illiterate, 22 became literate as adults, and 31 became literate in childhood). As literacy enhanced the left fusiform activation evoked by writing, it induced a small competition with faces at this location but also broadly enhanced visual responses in fusiform and occipital cortex, extending to area V1. Literacy also enhanced phonological activation to speech in the planum temporale and afforded a top-down activation of orthography from spoken inputs. Most changes occurred even when literacy was acquired in adulthood, emphasizing that both childhood and adult education can profoundly refine cortical organization.

Monday, December 06, 2010

Advanced human achievement - simple reinforcement learning?

Sejnowski writes an interesting review of work by Desrochers et al., which examines whether basic principles of reinforcement learning, coupled with a complex environment and a large memory, might account for more complex behaviors. They show that reinforcement learning can explain not only behavioral choice in a complex environment, but also the evolution toward optimal behavior over a long time. They studied, in the monkey, the sort of eye movements we make several times a second when scanning a complex image (the scan path is dramatically influenced by what we are thinking). Here is their abstract, followed by Sejnowski's summation.
Habits and rituals are expressed universally across animal species. These behaviors are advantageous in allowing sequential behaviors to be performed without cognitive overload, and appear to rely on neural circuits that are relatively benign but vulnerable to takeover by extreme contexts, neuropsychiatric sequelae, and processes leading to addiction. Reinforcement learning (RL) is thought to underlie the formation of optimal habits. However, this theoretic formulation has principally been tested experimentally in simple stimulus-response tasks with relatively few available responses. We asked whether RL could also account for the emergence of habitual action sequences in realistically complex situations in which no repetitive stimulus-response links were present and in which many response options were present. We exposed naïve macaque monkeys to such experimental conditions by introducing a unique free saccade scan task. Despite the highly uncertain conditions and no instruction, the monkeys developed a succession of stereotypical, self-chosen saccade sequence patterns. Remarkably, these continued to morph for months, long after session-averaged reward and cost (eye movement distance) reached asymptote. Prima facie, these continued behavioral changes appeared to challenge RL. However, trial-by-trial analysis showed that pattern changes on adjacent trials were predicted by lowered cost, and RL simulations that reduced the cost reproduced the monkeys’ behavior. Ultimately, the patterns settled into stereotypical saccade sequences that minimized the cost of obtaining the reward on average. These findings suggest that brain mechanisms underlying the emergence of habits, and perhaps unwanted repetitive behaviors in clinical disorders, could follow RL algorithms capturing extremely local explore/exploit tradeoffs.
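To make the reward-minus-cost idea in the abstract concrete, here is a minimal reinforcement-learning sketch: an agent choosing among a few scan sequences that all pay the same reward but differ in eye-movement cost. The sequences, costs, and learning parameters are invented for illustration; this is not the authors' simulation.

```python
import random

random.seed(1)
# Invented scan sequences, not the monkeys' actual patterns: each yields the same
# reward but requires a different total eye-movement distance (the cost).
sequence_cost = {"A": 9.0, "B": 7.5, "C": 6.0, "D": 5.2}
reward, cost_weight = 1.0, 0.1
alpha, epsilon = 0.1, 0.1
value = {s: 0.0 for s in sequence_cost}

for trial in range(5000):
    if random.random() < epsilon:                        # occasionally explore
        choice = random.choice(list(sequence_cost))
    else:                                                # otherwise exploit current estimates
        choice = max(value, key=value.get)
    payoff = reward - cost_weight * sequence_cost[choice]   # reward minus movement cost
    value[choice] += alpha * (payoff - value[choice])       # incremental value update

print(max(value, key=value.get))   # settles on the lowest-cost sequence, "D"
```

Because every sequence earns the same reward, the learned values converge toward reward minus cost, so the agent slowly drifts toward the cheapest pattern, which mirrors the months-long morphing of the monkeys' saccade sequences described above.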
Sejnowski's review gives several other examples of reinforcement learning solving difficult problems (such as learning how to play backgammon), and concludes:
...the jury is still out on whether reinforcement learning can explain the highest levels of human achievement. Rather than add a radically new piece of machinery to the brain, such as a language module, nature may have tinkered with the existing brain machinery to make it more efficient. Children have a remarkable ability to learn through imitation and shared attention, which might greatly speed up reinforcement learning by focusing learning on important stimuli. We are also exceptional at waiting for rewards farther into the future than other species, in some cases delaying gratification to an imagined afterlife made concrete by words. Supercharged with a larger cerebral cortex, faster learning, and a longer time horizon, is it possible that we solve complex problems in mathematics the same way that monkeys find optimal scan paths?

Friday, December 03, 2010

Political Leapfrogging

From the 'Editor's Choice' section of the Nov. 26 Science Magazine:
Although there have been many discussions of the polarized nature of American politics, do the views of elected officials match the preferences of their electorate? Bafumi and Herron sought to answer this question by comparing a national opinion survey of American voters (the Cooperative Congressional Election Study; CCES) with legislator voting records of the 109th (2005–2006) and 110th (2007–2008) Congresses. In many cases, the CCES questions were similar to (or the same as) actual congressional roll call votes, which allowed for better comparison. By developing a linear scale bounded by representatives (or CCES respondents) who had taken consistently liberal or conservative positions, the authors found that members of Congress were more extreme than the voters they represented. The median member of the 109th House of Representatives was more conservative than the median American voter, but the median member of the 110th House of Representatives was more liberal. Thus, voting out one extremist usually led to replacement by someone equally extreme, but of the opposite party. The authors refer to this as “leapfrogging” because the moderate views of the median American voter are leapfrogged during the turnover. Although the turnover was similar in the Senate, overall it appeared to be more moderate.

The Article: Amer. Polit. Sci. Rev. 104, 519 (2010).
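A toy numerical illustration of the leapfrogging pattern described above. The ideal points are invented, not the CCES or roll-call estimates:

```python
import statistics

# Toy ideal points on a liberal (-1) to conservative (+1) scale.
# These numbers are invented for illustration, not Bafumi and Herron's estimates.
voters    = [-0.6, -0.3, -0.1, 0.0, 0.1, 0.3, 0.6]
house_109 = [-0.7, -0.5, 0.6, 0.7, 0.8]    # conservative median
house_110 = [-0.8, -0.7, -0.6, 0.5, 0.7]   # extremists replaced by opposite extremists

print(statistics.median(voters))      # 0.0  -> the moderate voter median
print(statistics.median(house_109))   # 0.6  -> to the right of the voters
print(statistics.median(house_110))   # -0.6 -> "leapfrogs" past the voters to the left
```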

Thursday, December 02, 2010

What makes the human brain special...

Our human brains are bigger than those of our ape relatives, in particular the frontal lobes that are required for advanced cognitive functions. Semendeferi et al. have focused on a particular area of the frontal lobes: Brodmann area 10 (BA 10), which sits at the pole of the frontal lobes just above the eyes and is thought to be involved in abstract thinking and other sophisticated cognition. They find not only that this area is relatively larger in humans, but also that there is more space between nerve cell bodies in human brains than in the brains of apes, allowing room for connections between neurons. (In contrast, there were only subtle differences in cell body density among humans, chimpanzees, bonobos, gorillas, orangutans, and gibbons in the visual, somatosensory, and motor cortices.) Their analysis looked at the cells in layer three of the cortex, which communicates with other areas of the brain. BA 10 in humans also contains a higher concentration of so-called Von Economo neurons, which are generally thought to be high-performance neurons specialized for rapidly transmitting information from one brain region to another.


More space between neurons in the human brain (right) compared with the chimp brain (left) could allow more complex neural wiring.

The authors suggest that human brain evolution was likely characterized by an increase in the number and width of cortical minicolumns and the space available for interconnectivity between neurons in the frontal lobe, especially the prefrontal cortex.

Wednesday, December 01, 2010

How comfort foods reduce stress.

Interesting work from Ulrich-Lai et al. Apparently sweet tastes (and sex) reduce stress responses by acting on the amygdala circuits that shape them:
Individuals often eat calorically dense, highly palatable “comfort” foods during stress for stress relief. This article demonstrates that palatable food intake (limited intake of sucrose drink) reduces neuroendocrine, cardiovascular, and behavioral responses to stress in rats. Artificially sweetened (saccharin) drink reproduces the stress dampening, whereas oral intragastric gavage of sucrose is without effect. Together, these results suggest that the palatable/rewarding properties of sucrose are necessary and sufficient for stress dampening. In support of this finding, another type of natural reward (sexual activity) similarly reduces stress responses. Ibotenate lesions of the basolateral amygdala (BLA) prevent stress dampening by sucrose, suggesting that neural activity in the BLA is necessary for the effect. Moreover, sucrose intake increases mRNA and protein expression in the BLA for numerous genes linked with functional and/or structural plasticity. Lastly, stress dampening by sucrose is persistent, which is consistent with long-term changes in neural activity after synaptic remodeling. Thus, natural rewards, such as palatable foods, provide a general means of stress reduction, likely via structural and/or functional plasticity in the BLA. These findings provide a clearer understanding of the motivation for consuming palatable foods during times of stress and influence therapeutic strategies for the prevention and/or treatment of obesity and other stress-related disorders.

Tuesday, November 30, 2010

This is your brain on metaphors

A loyal MindBlog reader has pointed me to an essay by one of my heroes, Robert Sapolsky, written for The Stone, a blog hosted by The New York Times that serves as a forum for contemporary philosophers. He discusses how the brain has evolved to link the literal and the metaphorical by duct-taping metaphors and symbols to whichever pre-existing brain areas provided the closest fit. The insula, for example, registers gustatory disgust.
Not only does the insula “do” sensory disgust; it does moral disgust as well. Because the two are so viscerally similar. When we evolved the capacity to be disgusted by moral failures, we didn’t evolve a new brain region to handle it. Instead, the insula expanded its portfolio.

...there’s a fancier, more recently evolved brain region in the frontal cortex called the anterior cingulate that’s involved in the subjective, evaluative response to pain...When humans evolved the ability to be wrenched with feeling the pain of others, where was it going to process it? It got crammed into the anterior cingulate. And thus it “does” both physical and psychic pain.
Sapolsky reviews a range of other studies showing how the brain links the literal and metaphorical, several of which have been the subjects of previous posts on this blog (cleanliness influencing moral judgements, holding a hot versus cold liquid influencing personality judgements, the weight of a resume influencing the judged gravity of a job applicant, etc.).
The viscera that can influence moral decision making and the brain’s confusion about the literalness of symbols can have enormous consequences. Part of the emotional contagion of the genocide of Tutsis in Rwanda arose from the fact that when militant Hutu propagandists called for the eradication of the Tutsi, they iconically referred to them as “cockroaches.” Get someone to the point where his insula activates at the mention of an entire people, and he’s primed to join the bloodletting.
And, an example of the sort in my recent post on resolving conflict:
But if the brain confusing reality and literalness with metaphor and symbol can have adverse consequences, the opposite can occur as well. At one juncture just before the birth of a free South Africa, Nelson Mandela entered secret negotiations with an Afrikaans general with death squad blood all over his hands, a man critical to the peace process because he led a large, well-armed Afrikaans resistance group. They met in Mandela’s house, the general anticipating tense negotiations across a conference table. Instead, Mandela led him to the warm, homey living room, sat beside him on a comfy couch, and spoke to him in Afrikaans. And the resistance melted away.
...Nelson Mandela was wrong when he advised, “Don’t talk to their minds; talk to their hearts.” He meant talk to their insulas and cingulate cortices and all those other confused brain regions, because that confusion could help make for a better world.

Monday, November 29, 2010

Brain clutter - what's left undone lingers on

In the 'Editor's Choice' section of the Nov. 19 Science, Gilbert Chin summarizes recent work by Masicampo and Baumeister showing that unconscious unfulfilled goals can compromise our fluid intelligence.
...They demonstrate that humans suffer from a hangover due to unfulfilled goals: When people were primed to strive for honesty as a goal and then required to write about an episode in which they had acted dishonestly, the induced sense of incompleteness negatively affected their ability to solve anagrams, a task that relies on fluid intelligence. Neither the prime alone nor the recounting of the episode sufficed, and people who had been primed but then wrote about someone else's dishonesty were not similarly afflicted. Furthermore, the unfulfilled goal, though detectable with implicit measures of activation, did not rise to the level of reportable or conscious awareness.
Here is the Masicampo and Baumeister abstract:
Even after one stops actively pursuing a goal, many mental processes remain focused on the goal (e.g., the Zeigarnik effect), potentially occupying limited attentional and working memory resources. Five studies examined whether the processes associated with unfulfilled goals would interfere with tasks that require the executive function, which has a limited focal capacity and can pursue only one goal at a time. In Studies [Study 1] and [Study 2], activating a goal nonconsciously and then manipulating unfulfillment caused impairments on later tasks requiring fluid intelligence (solving anagrams; Study 1) and impulse control (dieting; Study 2). Study 3 showed that impairments were specific to executive functioning tasks: an unfulfilled goal impaired performance on logic problems but not on a test of general knowledge (only the former requires executive functions). Study 4 found that the effect was moderated by individual differences; participants who reported a tendency to shift readily amongst their various pursuits showed no task interference. Study 5 found that returning to fulfill a previously frustrated goal eliminated the interference effect. These findings provide converging evidence that unfulfilled goals can interfere with later tasks, insofar as they require executive functions.

Friday, November 26, 2010

Social cognition in reptiles

A MindBlog reader referred me to this interesting post on a blog, "The Thoughtful Animal," that I had been unaware of and have now added to the BlogRoll in the right column of MindBlog.
If several others are all directing their attention at a specific point in space, there might be something important there. We're naturally aware of where others are looking. And so are lots of other animals.

Gaze-following is the ability of an animal to orient its gaze to match that of another animal, and though this ability has been observed in mammals and birds, the phylogeny of gaze-following is still uncertain...gaze-sensitivity - the ability of an animal to avoid the gaze of another animal - seems to be somewhat more common in the animal kingdom, having been observed in mammals and birds, and some reptiles and fish. Gaze-sensitivity may have evolved as an anti-predator defense; a theory known as the "evil eye hypothesis" suggests that the awareness of the gaze direction of a predator would help an animal know when it was safe to move about or come out of a hiding spot. Gaze-following requires gaze-sensitivity; indeed, gaze-following develops in human children after gaze-sensitivity. It therefore follows that gaze-following is cognitively more complex than gaze-sensitivity.

Are these abilities also present in reptiles? If so, it could suggest that all amniotic species (birds, mammals, and reptiles) share them, and that it emerged quite a long time ago, in evolutionary terms...Eight captive-bred red-footed tortoises were socially housed for six months prior to this experiment. One tortoise, the demonstrator (the same individual was always used as demonstrator), was placed on one side of a tank, and a second tortoise, the observer, was placed on the opposite side of the tank. They were separated by transparent screens. Above, a small opaque partition separated the two sides of the tank. The investigators directed a small laser beam towards the opaque partition on the side of the demonstrator. Once the demonstrator noticed the light, she invariably looked up at it. The experimenters varied the color of the light to maintain her interest, such that she would not habituate to it. When the demonstrator looked up, would the observer direct his or her gaze up as well? If so, it would suggest that red-footed tortoises, despite their solitary existence, are sensitive to the gaze direction of their conspecifics.

There was a clear difference between the conditions, with the observer tortoises looking up in the experimental condition significantly more than in either of the control conditions. This was the first study to demonstrate that reptiles are able to follow the gaze of conspecifics, suggesting that gaze following may occur more often in the animal kingdom than previously thought...It is possible that the common ancestor of the three amniotic classes - birds, mammals, and reptiles - possessed the ability to co-orient and follow the gaze of others, rather than gaze-following having evolved two or three separate times. There was theoretically little selective pressure for such an ability to have emerged in this particular species, given their solitary lifestyle. Another possibility, however, is that gaze-sensitivity may be innate, and that gaze-following builds on this innate mechanism through associative learning. This could also explain the results of this experiment, as the tortoises had six months of social experiences prior to the beginning of the study.

Thursday, November 25, 2010

Becoming a GPS zombie - eroding your brain.

Almost every day I get an "I just came across your blog, and thought you might be interested in....." email, which is basically a request that I link to a site to increase its web traffic. I've started to reflexively delete such emails, but paused with one from the health editor of msnbc.com, Melissa Dahl, pointing me to their piece on recent work done at McGill University, noting comments from one of the collaborators, Veronica Bohbot (who is a co-author of no fewer than 14 papers presented at the recent annual meeting of the Society for Neuroscience, which I used to attend loyally). These papers deal with the two major options we use to navigate our world: a spatial strategy, depending on our hippocampus, in which we build cognitive maps using landmarks as visual cues; and a stimulus-response strategy, depending on the caudate nucleus, in which we follow 'turn left', 'turn right' instructions of the sort given by a GPS device. During aging we shift increasingly from the spatial to the response strategy as our hippocampal function declines. The McGill workers found a greater volume of grey matter in the hippocampus of older adults who used spatial strategies, and these adults scored higher on a standardized cognition test used to help diagnose mild cognitive impairment, which is often a precursor to Alzheimer's disease. These findings suggest that using spatial memory may increase the function of the hippocampus and improve our quality of life as we age. Another example of "use it or lose it": a GPS device spares us the work of exercising our hippocampal spatial navigation circuits, and thus could well hasten their decay.

Wednesday, November 24, 2010

Trouble with numbers? Try zapping your brain.

Cohen Kadosh et al. in the Nov. 4 issue of Current Biology (noted by ScienceNow) report that administering a small electrical current (transcranial direct current stimulation) to stimulate a center implicated in math operations, located on the right side of the parietal lobe (beneath the crown of the head), can enhance a person's ability to process numbers for up to 6 months. The mild stimulation is said to be harmless, and might be tried to restore numerical skills in people suffering from degenerative diseases or stroke. Here is their abstract:
Highlights
* Brain stimulation to the parietal cortex can enhance or impair numerical abilities
* The effects were specific to the polarity of the current
* The improvement in numerical abilities lasts up to 6 months
* The brain stimulation affected specifically the material that was recently learned
Summary
Around 20% of the population exhibits moderate to severe numerical disabilities  and a further percentage loses its numerical competence during the lifespan as a result of stroke or degenerative diseases. In this work, we investigated the feasibility of using noninvasive stimulation to the parietal lobe during numerical learning to selectively improve numerical abilities. We used transcranial direct current stimulation (TDCS), a method that can selectively inhibit or excitate neuronal populations by modulating GABAergic (anodal stimulation) and glutamatergic (cathodal stimulation) activity. We trained subjects for 6 days with artificial numerical symbols, during which we applied concurrent TDCS to the parietal lobes. The polarity of the brain stimulation specifically enhanced or impaired the acquisition of automatic number processing and the mapping of number into space, both important indices of numerical proficiency. The improvement was still present 6 months after the training. Control tasks revealed that the effect of brain stimulation was specific to the representation of artificial numerical symbols. The specificity and longevity of TDCS on numerical abilities establishes TDCS as a realistic tool for intervention in cases of atypical numerical development or loss of numerical abilities because of stroke or degenerative illnesses.

Tuesday, November 23, 2010

Predicting the future with web search queries

Goel et al. find that online activity at any moment in time not only provides a snapshot of the instantaneous interests, concerns, and intentions of the global population, but is also predictive of what people will do in the near future:
Recent work has demonstrated that Web search volume can “predict the present,” meaning that it can be used to accurately track outcomes such as unemployment levels, auto and home sales, and disease prevalence in near real time. Here we show that what consumers are searching for online can also predict their collective future behavior days or even weeks in advance. Specifically we use search query volume to forecast the opening weekend box-office revenue for feature films, first-month sales of video games, and the rank of songs on the Billboard Hot 100 chart, finding in all cases that search counts are highly predictive of future outcomes. We also find that search counts generally boost the performance of baseline models fit on other publicly available data, where the boost varies from modest to dramatic, depending on the application in question... We conclude that in the absence of other data sources, or where small improvements in predictive performance are material, search queries provide a useful guide to the near future.
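As a minimal sketch of the "search counts boost a baseline model" claim, here is a toy regression on synthetic data. The variables and effect sizes are invented, not the authors' models or data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Invented data: a conventional baseline predictor (e.g., budget or past sales)
# plus pre-release search volume that carries extra signal about the outcome.
baseline = rng.normal(size=n)
search   = 0.6 * baseline + rng.normal(size=n)
revenue  = 2.0 * baseline + 1.5 * search + rng.normal(size=n)

def r_squared(features, y):
    X = np.column_stack([np.ones(len(y))] + list(features))  # add an intercept
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ beta
    return 1 - residuals.var() / y.var()

print(r_squared([baseline], revenue))          # baseline model alone
print(r_squared([baseline, search], revenue))  # baseline boosted by search counts
```

The second R-squared comes out higher because the search series carries information the baseline predictor does not, which is the shape of the boost the abstract describes.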
And, in a similar vein, Preis et al. find a strong correlation between queries submitted to Google and weekly fluctuations in stock trading. They introduce a method for quantifying complex correlations in time series and find a clear tendency for search-volume and transaction-volume time series to show recurring patterns. From the ScienceNow summary:
The Google data could not predict the weekly fluctuations in stock prices. However, the team found a strong correlation between Internet searches for a company's name and its trade volume, the total number of times the stock changed hands over a given week. So, for example, if lots of people were searching for computer manufacturer IBM one week, there would be a lot of trading of IBM stock the following week. But the Google data couldn't predict its price, which is determined by the ratio of shares that are bought and sold.

At least not yet. Neil Johnson, a physicist at the University of Miami in Florida, says that if researchers could drill down even farther into the Google Trends data—so that they could view changes in search terms on a daily or even an hourly basis—they might be able to predict a rise or fall in stock prices. They might even be able to forecast financial crises. It would be an opportunity for Google "to really collaborate with an academic group in a new area," he says. Then again, if the hourly stream of search queries really can predict stock price changes, Google might want to keep those data to itself.
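A minimal sketch of the lagged relationship described above, using synthetic weekly series (again, invented data, not the Google Trends or trading records):

```python
import numpy as np

rng = np.random.default_rng(1)
weeks = 120
# Invented weekly series: trading volume in week t+1 tracks search volume in week t.
search = rng.normal(size=weeks)
trade  = search[:-1] + rng.normal(scale=0.5, size=weeks - 1)   # volume for weeks 1..119

same_week    = np.corrcoef(search[1:], trade)[0, 1]    # searches and trading in the same week
one_week_lag = np.corrcoef(search[:-1], trade)[0, 1]   # searches leading trading by a week
print(f"same-week r = {same_week:.2f}, one-week-lead r = {one_week_lag:.2f}")
```

In this toy setup the same-week correlation hovers near zero while the one-week-lead correlation is strong, which is the kind of pattern reported for company-name searches and subsequent trade volume.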

Monday, November 22, 2010

Attention span and focus - problem/not a problem?

I have done several posts on how heavy computer and internet use might nudge our brain processes (in either a positive or detrimental way), so I was entertained by reading somewhat contrasting takes on this issue in yesterday's Sunday NY Times, Virginia Heffernan writing in the Sunday Magazine on "The Attention-Span Myth," and Matt Richtel's "Growing Up Digital, Wired for Distraction."
Clips from Heffernan:
...attention spans...have become the digital-age equivalent of souls...which might be measured by the psychologist’s equivalent of a tailor’s tape? ..isn’t there something just unconvincing about the idea that an occult “span” in the brain makes certain cultural objects more compelling than others? So a kid loves the drums but can hardly get through a chapter of “The Sun Also Rises”; and another aces algebra tests but can’t even understand how Call of Duty is played.

In other eras, distractibility wasn’t considered shameful. It was regularly praised, in fact — as autonomy, exuberance and versatility. To be brooding, morbid, obsessive or easily mesmerized was thought much worse than being distractible. In “Moby-Dick,” Starbuck tries to distract Ahab from his monomania with evocations of family life in Nantucket...sitting silently without fidgeting: that’s essentially what we want of children with bum attention spans, isn’t it? The first sign that a distractible child is doing “better” — with age or Adderall, say — is that he sits still...At some point, we stopped calling Tom Sawyer-style distractibility either animal spirits or a discipline problem. We started to call it sick..the problem with the attention-span discourse is that it’s founded on the phantom idea of an attention span. A healthy “attention span” becomes just another ineffable quality to remember having, to believe you’ve lost, to worry about your kids lacking, to blame the culture for destroying. Who needs it?
The Richtel article tells stories about students at Woodside High School in Redwood City, California, in Silicon Valley. "Here, as elsewhere, it is not uncommon for students to send hundreds of text messages a day or spend hours playing video games, and virtually everyone is on Facebook." It is in environments like these that a generation of kids is being raised whose brains might be wired differently, habituated to distraction and to switching tasks rather than to focusing. Many of Richtel's stories deal with the contest between the immediate gratifications of distractibility and the homework and reading that build a self, and a future. Richtel also describes several academic studies.

Greedy Geezers

Apparently my demographic group (seniors on Medicare) radically changed its voting behavior in the recent midterm elections, and is very opposed to the new health care legislation, saying in effect, “I’ve got mine—good luck getting yours.” In the Nov. 21 New Yorker, Surowiecki offers a nice commentary:
In the 2006 midterm election, seniors split their vote evenly between House Democrats and Republicans. This time, they went for Republicans by a twenty-one-point margin...The election has been termed the “revolt of the middle class.” But it might more accurately be called the revolt of the retired...The real sticking point was health-care reform, which the elderly didn’t like from the start...the very people who currently enjoy the benefits of a subsidized, government-run insurance system are intent on keeping others from getting the same treatment...seniors today get far more out of Medicare than they ever put in, which means that their medical care is paid for by current taxpayers...the subsidies that seniors get aren’t fundamentally different from the ones that the Affordable Care Act will offer some thirty million Americans who don’t have insurance.

Current sentiment among seniors seems like a classic example of an effect that the economist Benjamin Friedman identified in his magisterial book “The Moral Consequences of Economic Growth”: in hard times voters get more selfish. Historically, Friedman notes, times of stagnation have been times of reaction, with voters bent on protecting their own interests, hostile to outsiders, and less interested in social welfare...the Democrats’ loss of support among the elderly was more a matter of economic fundamentals than of political framing. If the economy were growing briskly, it’s unlikely that the health-care bill would have become so politically toxic.

Friday, November 19, 2010

Using invisible visual signals to see things.

Di Luca et al. have done an ingenious experiment demonstrating that an invisible signal can be recruited as a cue for perceptual appearance. Regularities between the 'invisible' (below perceptual threshold) signal and a perceived signal can be unconsciously learned; perception can rapidly undergo “structure learning” by automatically picking up novel contingencies between sensory signals, thus recruiting signals for novel uses during the construction of a percept. It is worthwhile to step through their description of how the experiment works:
To convincingly show that new perceptual meanings for sensory signals can be learned automatically, one needs an “invisible visual signal,” that is, a signal that is sensed but that has no effect on visual appearance. The gradient of vertical binocular disparity, created by 2% vertical magnification of one eye's image (the eye of vertical magnification [EVM]), can be such a signal. In several control experiments, we ensured that EVM could not be seen by the participants.

The stimulus we used was a horizontal cylinder rotating either front side up or front side down. In its basic form, the cylinder was defined by horizontal lines with fading edges. The lines moved up and down on the screen, thereby creating the impression of a rotating cylinder with ambiguous rotation direction, so participants perceived it rotating sometimes as front side up and sometimes as front side down.

We tested whether the signal created by 2% vertical magnification could be recruited to control the perceived rotation direction of this ambiguously rotating cylinder. To do so, we exposed participants to a new contingency. We used a disambiguated version of the cylinder that contained additional depth cues: dots provided horizontal disparity, and a rectangle occluded part of the farther surface of the cylinder. These cues disambiguated the perceived rotation direction of the cylinder. In training trials, we exposed participants to cylinder stimuli in which EVM and the unambiguously perceived rotation direction were contingent upon one another. To test whether EVM had an effect on the perceived rotation direction of the cylinder, we interleaved these training trials with probe trials that had ambiguous rotation direction. If participants recruited EVM to the new use, then perceived rotation direction on probe trials would come to depend on EVM. If participants did not recruit EVM, then perceived rotation direction would be independent of EVM.

Importantly, after exposure to the new contingency, all participants saw a majority of probe trials consistent with the rotation direction contingent with EVM during exposure—that is, the learning effect was highly significant.
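Here is a toy sketch of the logic of the training/probe design: a learner that simply tallies the contingency between the invisible cue and the disambiguated percept during training, then reports the learned pairing on ambiguous probes. The trial counts and coding are invented for illustration, not the authors' stimulus or model:

```python
import random
from collections import defaultdict

random.seed(0)
# Invented trial structure: in training trials the eye of vertical magnification (EVM)
# is perfectly contingent on the disambiguated rotation direction.
counts = defaultdict(lambda: {"up": 0, "down": 0})

for _ in range(200):                                  # training trials
    evm = random.choice(["left", "right"])
    rotation = "up" if evm == "left" else "down"      # the trained contingency
    counts[evm][rotation] += 1

def predicted_percept(evm):
    """On an ambiguous probe trial, report the rotation most often paired with this EVM."""
    c = counts[evm]
    return "up" if c["up"] >= c["down"] else "down"

print(predicted_percept("left"), predicted_percept("right"))   # -> up down
```

The experimental result is analogous: after exposure, probe-trial percepts came to track the EVM, even though the EVM itself remained invisible to the participants.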

Thursday, November 18, 2010

How life experiences alter what our genes do.

It has been a frustration that we are unable to pinpoint causative genetic effects in many complex diseases and behavioral abnormalities. Many think the missing information resides in our nongenetic cellular memory, which records developmental and environmental cues. "Epigenetics" has become the catch-all phrase for many environmentally influenced genetic regulatory systems involving DNA methylation, histone modification, nucleosome location, or noncoding RNA. The basic requirement for an epigenetic system is that it be heritable, self-perpetuating, and reversible. Benedict Carey has written a nice non-technical article on epigenetics, the study of how people's experience and environment affect the function of their genes. Some clips:
Genes are far more than protein machines, pumping out their product like a popcorn maker. Many carry what are, in effect, chemical attachments: compounds acting on the DNA molecule that regulate when, where or how much protein is made, without altering the recipe itself. Studies suggest that such add-on, or epigenetic, markers develop as an animal adapts to its environment, whether in the womb or out in the world — and the markers can profoundly affect behavior.

...researchers have shown that affectionate mothering alters the expression of genes, allowing them to dampen their physiological response to stress. These biological buffers are then passed on to the next generation: rodents and nonhuman primates biologically primed to handle stress tend to be more nurturing to their own offspring.

...Epigenetic markers may likewise hinder normal development: the offspring of parents who experience famine are at heightened risk for developing schizophrenia, some research suggests — perhaps because of the chemical signatures on the genes that parents pass on. Another recent study found evidence that, in some people with autism, epigenetic markers had silenced the gene which makes the receptor for the hormone oxytocin. Oxytocin oils the brain’s social circuits, and is critical in cementing relationships.

...The National Institutes of Health is sponsoring about 100 studies looking at the relationship between epigenetic markers and behavior problems, including drug abuse, post-traumatic stress, bipolar disorder and schizophrenia, compared with just a handful of such studies a decade ago.

Wednesday, November 17, 2010

Tiny touches of the tongue - the elegance of cats.

I've learned something about my constant companions, two Abyssinian cats named Marvin and Melvin. I've always wondered how the rapid, petite tongue flickers they use while drinking could be getting much water into their mouths. Now two MIT physicists have the simple answer: their tongues perform a maneuver that pits gravity against inertia in a delicate balance. Using high speed photography they found that:
...cats rest the tips of their tongues on the liquid's surface without penetrating it. The water sticks to the cat's tongue and is pulled upward as the cat draws its tongue into its mouth. When the cat closes its mouth, it breaks the liquid column but still keeps its chin and whiskers dry. Here is the full text of their article.
From Nicholas Wade's description:
What happens is that the cat darts its tongue, curving the upper side downward so that the tip lightly touches the surface of the water...The tongue is then pulled upward at high speed, drawing a column of water behind it...Just at the moment that gravity finally overcomes the rush of the water and starts to pull the column down — snap! The cat’s jaws have closed over the jet of water and swallowed it...The cat laps four times a second — too fast for the human eye to see anything but a blur — and its tongue moves at a speed of one meter per second.
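A back-of-envelope check of that timing, assuming a drawn-up water column of roughly 3 cm (the column height is my assumption; the 1 m/s tongue speed and 4 laps per second come from the description above):

```latex
% Rough timing, assuming a column height H of about 3 cm (an assumed value);
% the 1 m/s tongue speed U and 4 laps per second come from the article.
\begin{align*}
t_{\mathrm{rise}}     &\approx H/U = 0.03\,\mathrm{m} / (1\,\mathrm{m/s}) = 0.03\,\mathrm{s},\\
t_{\mathrm{collapse}} &\approx \sqrt{2H/g} = \sqrt{2(0.03\,\mathrm{m})/(9.8\,\mathrm{m/s^2})} \approx 0.08\,\mathrm{s},\\
t_{\mathrm{lap}}      &= 1/(4\,\mathrm{s^{-1}}) = 0.25\,\mathrm{s}.
\end{align*}
```

On these rough numbers the column is drawn up in about 30 ms and would collapse under gravity within about 80 ms, well inside the 250 ms lap cycle, so the jaw has to snap shut only a small fraction of a cycle after the tongue leaves the surface.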

Tuesday, November 16, 2010

A wandering mind is an unhappy mind.

Killingsworth and Gilbert report a fascinating study in the Nov. 12 issue of Science Magazine. They developed a smartphone technology to sample people’s ongoing thoughts, feelings, and actions and found that people are thinking about what is not happening almost as often as they are thinking about what is, and that this typically makes them unhappy. Here are some excerpts:
Unlike other animals, human beings spend a lot of time thinking about what is not going on around them, contemplating events that happened in the past, might happen in the future, or will never happen at all. Indeed, "stimulus-independent thought" or "mind wandering" appears to be the brain’s default mode of operation...this ability is a remarkable evolutionary achievement that allows people to learn, reason, and plan, it may have an emotional cost.
To measure the emotional consequences of mind-wandering, the authors developed a Web application for the iPhone for collecting real-time reports from large numbers of people.
The application contacts participants through their iPhones at random moments during their waking hours, presents them with questions, and records their answers to a database at www.trackyourhappiness.org. The database currently contains nearly a quarter of a million samples from about 5000 people from 83 different countries who range in age from 18 to 88 and who collectively represent every one of 86 major occupational categories.

To find out how often people’s minds wander, what topics they wander to, and how those wanderings affect their happiness, we analyzed samples from 2250 adults (58.8% male, 73.9% residing in the United States, mean age of 34 years) who were randomly assigned to answer a happiness question ("How are you feeling right now?") answered on a continuous sliding scale from very bad (0) to very good (100), an activity question ("What are you doing right now?") answered by endorsing one or more of 22 activities adapted from the day reconstruction method (10, 11), and a mind-wandering question ("Are you thinking about something other than what you’re currently doing?") answered with one of four options: no; yes, something pleasant; yes, something neutral; or yes, something unpleasant. Our analyses revealed three facts.

First, people’s minds wandered frequently, regardless of what they were doing. Mind wandering occurred in 46.9% of the samples and in at least 30% of the samples taken during every activity except making love. The frequency of mind wandering in our real-world sample was considerably higher than is typically seen in laboratory experiments. Surprisingly, the nature of people’s activities had only a modest impact on whether their minds wandered and had almost no impact on the pleasantness of the topics to which their minds wandered.

Second, multilevel regression revealed that people were less happy when their minds were wandering than when they were not..., and this was true during all activities, including the least enjoyable. Although people’s minds were more likely to wander to pleasant topics (42.5% of samples) than to unpleasant topics (26.5% of samples) or neutral topics (31% of samples), people were no happier when thinking about pleasant topics than about their current activity...and were considerably unhappier when thinking about neutral topics ... or unpleasant topics... than about their current activity (Figure, bottom). Although negative moods are known to cause mind wandering, time-lag analyses strongly suggested that mind wandering in our sample was generally the cause, and not merely the consequence, of unhappiness.

Third, what people were thinking was a better predictor of their happiness than was what they were doing. The nature of people’s activities explained 4.6% of the within-person variance in happiness and 3.2% of the between-person variance in happiness, but mind wandering explained 10.8% of within-person variance in happiness and 17.7% of between-person variance in happiness. The variance explained by mind wandering was largely independent of the variance explained by the nature of activities, suggesting that the two were independent influences on happiness.


Figure - Mean happiness reported during each activity (top) and while mind wandering to unpleasant topics, neutral topics, pleasant topics or not mind wandering (bottom). Dashed line indicates mean of happiness across all samples. Bubble area indicates the frequency of occurrence. The largest bubble ("not mind wandering") corresponds to 53.1% of the samples, and the smallest bubble ("praying/worshipping/meditating") corresponds to 0.1% of the samples.
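For readers who like to see the shape of such an analysis, here is a minimal sketch on simulated experience-sampling data. The sample sizes, category frequencies, and effect sizes are invented to roughly echo the reported pattern; this is not the trackyourhappiness.org dataset:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 10_000
# Invented experience-sampling records: each row is one phone ping with a
# mind-wandering report and a happiness rating (0-100 scale).
df = pd.DataFrame({
    "person": rng.integers(0, 500, size=n),
    "wandering": rng.choice(["not wandering", "pleasant", "neutral", "unpleasant"],
                            size=n, p=[0.53, 0.20, 0.15, 0.12]),
})
assumed_effect = {"not wandering": 0.0, "pleasant": -1.0, "neutral": -7.0, "unpleasant": -15.0}
df["happiness"] = 65 + df["wandering"].map(assumed_effect) + rng.normal(scale=15, size=n)

# Mean happiness by mind-wandering category, echoing the figure's bottom panel.
print(df.groupby("wandering")["happiness"].mean().round(1).sort_values(ascending=False))
```

The real analysis used multilevel regression and time-lag comparisons to separate within-person from between-person variance, but the basic contrast is the one computed here: happiness when the mind is on the current activity versus when it has wandered to pleasant, neutral, or unpleasant topics.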
 ADDED NOTE: I just opened my New York Times this morning and found a piece by John Tierney on this work.