This blog reports new ideas and work on mind, brain, behavior, psychology, and politics - as well as random curious stuff. (Try the Dynamic Views at top of right column.)
Wednesday, November 24, 2010
Trouble with numbers? Try zapping your brain.
Cohen Kadosh et al., in the Nov. 4 issue of Current Biology (noted by ScienceNow), report that administering a small electrical current (transcranial direct current stimulation) to stimulate a center implicated in math operations located on the right side of the parietal lobe (beneath the crown of the head) can enhance a person's ability to process numbers for up to 6 months. The mild stimulation is said to be harmless, and might be tried to restore numerical skills in people suffering from degenerative diseases or stroke. Here is their abstract:
Highlights
* Brain stimulation to the parietal cortex can enhance or impair numerical abilities
* The effects were specific to the polarity of the current
* The improvement in numerical abilities lasts up to 6 months
* The brain stimulation affected specifically the material that was recently learned
Summary
Around 20% of the population exhibits moderate to severe numerical disabilities, and a further percentage loses its numerical competence during the lifespan as a result of stroke or degenerative diseases. In this work, we investigated the feasibility of using noninvasive stimulation to the parietal lobe during numerical learning to selectively improve numerical abilities. We used transcranial direct current stimulation (TDCS), a method that can selectively inhibit or excite neuronal populations by modulating GABAergic (anodal stimulation) and glutamatergic (cathodal stimulation) activity. We trained subjects for 6 days with artificial numerical symbols, during which we applied concurrent TDCS to the parietal lobes. The polarity of the brain stimulation specifically enhanced or impaired the acquisition of automatic number processing and the mapping of number into space, both important indices of numerical proficiency. The improvement was still present 6 months after the training. Control tasks revealed that the effect of brain stimulation was specific to the representation of artificial numerical symbols. The specificity and longevity of TDCS on numerical abilities establish TDCS as a realistic tool for intervention in cases of atypical numerical development or loss of numerical abilities because of stroke or degenerative illnesses.
Tuesday, November 23, 2010
Predicting the future with web search queries
Goel et al. find that online activity at any moment in time not only provides a snapshot of the instantaneous interests, concerns, and intentions of the global population but is also predictive of what people will do in the near future:
Recent work has demonstrated that Web search volume can “predict the present,” meaning that it can be used to accurately track outcomes such as unemployment levels, auto and home sales, and disease prevalence in near real time. Here we show that what consumers are searching for online can also predict their collective future behavior days or even weeks in advance. Specifically we use search query volume to forecast the opening weekend box-office revenue for feature films, first-month sales of video games, and the rank of songs on the Billboard Hot 100 chart, finding in all cases that search counts are highly predictive of future outcomes. We also find that search counts generally boost the performance of baseline models fit on other publicly available data, where the boost varies from modest to dramatic, depending on the application in question... We conclude that in the absence of other data sources, or where small improvements in predictive performance are material, search queries provide a useful guide to the near future.

And, in a similar vein, Preis et al. find a strong correlation between queries submitted to Google and weekly fluctuations in stock trading. They introduce a method for quantifying complex correlations in time series, with which they find a clear tendency for search volume and transaction volume time series to show recurring patterns. From the ScienceNow summary:
The Google data could not predict the weekly fluctuations in stock prices. However, the team found a strong correlation between Internet searches for a company's name and its trade volume, the total number of times the stock changed hands over a given week. So, for example, if lots of people were searching for computer manufacturer IBM one week, there would be a lot of trading of IBM stock the following week. But the Google data couldn't predict its price, which is determined by the ratio of shares that are bought and sold.
At least not yet. Neil Johnson, a physicist at the University of Miami in Florida, says that if researchers could drill down even farther into the Google Trends data—so that they could view changes in search terms on a daily or even an hourly basis—they might be able to predict a rise or fall in stock prices. They might even be able to forecast financial crises. It would be an opportunity for Google "to really collaborate with an academic group in a new area," he says. Then again, if the hourly stream of search queries really can predict stock price changes, Google might want to keep those data to itself.
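For readers who want to play with this lead-lag idea themselves, here is a minimal Python sketch (my own illustration, not the authors' pipeline; the numbers and column names are made up) that pairs each week's search counts with the following week's trading volume and computes the correlation:

```python
import pandas as pd

# Hypothetical weekly data: search counts for a company name (e.g., exported
# from Google Trends) and the number of shares of its stock traded that week.
df = pd.DataFrame({
    "search_volume": [120, 135, 160, 150, 210, 190, 250, 230],
    "trade_volume": [1.1e6, 1.2e6, 1.5e6, 1.4e6, 1.9e6, 1.8e6, 2.4e6, 2.2e6],
})

# Shift trading volume back one row so each week's searches are paired with
# the NEXT week's trading activity, mirroring the reported lead-lag pattern.
df["next_week_trades"] = df["trade_volume"].shift(-1)

same_week = df["search_volume"].corr(df["trade_volume"])         # Pearson r
one_week_lead = df["search_volume"].corr(df["next_week_trades"])

print(f"same-week correlation:     {same_week:.2f}")
print(f"one-week-lead correlation: {one_week_lead:.2f}")
```

A serious analysis would of course use years of data and guard against spurious correlation from shared trends, as the published studies do.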
Monday, November 22, 2010
Attention span and focus - problem/not a problem?
I have done several posts on how heavy computer and internet use might nudge our brain processes (in either a positive or detrimental way), so I was entertained by reading somewhat contrasting takes on this issue in yesterday's Sunday NY Times: Virginia Heffernan's "The Attention-Span Myth" in the Sunday Magazine, and Matt Richtel's "Growing Up Digital, Wired for Distraction."
Clips from Heffernan:
...attention spans...have become the digital-age equivalent of souls...which might be measured by the psychologist’s equivalent of a tailor’s tape? ...isn’t there something just unconvincing about the idea that an occult “span” in the brain makes certain cultural objects more compelling than others? So a kid loves the drums but can hardly get through a chapter of “The Sun Also Rises”; and another aces algebra tests but can’t even understand how Call of Duty is played.

In other eras, distractibility wasn’t considered shameful. It was regularly praised, in fact — as autonomy, exuberance and versatility. To be brooding, morbid, obsessive or easily mesmerized was thought much worse than being distractible. In “Moby-Dick,” Starbuck tries to distract Ahab from his monomania with evocations of family life in Nantucket...sitting silently without fidgeting: that’s essentially what we want of children with bum attention spans, isn’t it? The first sign that a distractible child is doing “better” — with age or Adderall, say — is that he sits still...At some point, we stopped calling Tom Sawyer-style distractibility either animal spirits or a discipline problem. We started to call it sick...the problem with the attention-span discourse is that it’s founded on the phantom idea of an attention span. A healthy “attention span” becomes just another ineffable quality to remember having, to believe you’ve lost, to worry about your kids lacking, to blame the culture for destroying. Who needs it?

The Richtel article tells stories about students at Woodside High School in Silicon Valley's Redwood City, California. "Here, as elsewhere, it is not uncommon for students to send hundreds of text messages a day or spend hours playing video games, and virtually everyone is on Facebook." It is in environments like these that a generation of kids is being raised whose brains might be wired differently, habituated to distraction and to switching tasks rather than to focus. Many of Richtel's stories deal with the contest between the immediate gratifications of distractibility and the homework and reading that build a self, and a future. Richtel also provides descriptions of several academic studies.
Greedy Geezers
Apparently my demographic group (seniors on Medicare) radically changed its voting behavior in the recent midterm elections, and is very opposed to the new health care legislation, saying in effect, “I’ve got mine—good luck getting yours.” In the Nov. 21 New Yorker, Surowiecki does a nice commentary:
In the 2006 midterm election, seniors split their vote evenly between House Democrats and Republicans. This time, they went for Republicans by a twenty-one-point margin...The election has been termed the “revolt of the middle class.” But it might more accurately be called the revolt of the retired...The real sticking point was health-care reform, which the elderly didn’t like from the start...the very people who currently enjoy the benefits of a subsidized, government-run insurance system are intent on keeping others from getting the same treatment...seniors today get far more out of Medicare than they ever put in, which means that their medical care is paid for by current taxpayers...the subsidies that seniors get aren’t fundamentally different from the ones that the Affordable Care Act will offer some thirty million Americans who don’t have insurance.
Current sentiment among seniors seems like a classic example of an effect that the economist Benjamin Friedman identified in his magisterial book “The Moral Consequences of Economic Growth”: in hard times voters get more selfish. Historically, Friedman notes, times of stagnation have been times of reaction, with voters bent on protecting their own interests, hostile to outsiders, and less interested in social welfare...the Democrats’ loss of support among the elderly was more a matter of economic fundamentals than of political framing. If the economy were growing briskly, it’s unlikely that the health-care bill would have become so politically toxic.
Friday, November 19, 2010
Using invisible visual signals to see things.
Di Luca et al. have done an ingenious experiment demonstrating that an invisible signal can be recruited as a cue for perceptual appearance. Regularities between the 'invisible' (below perceptual threshold) signal and a perceived signal can be learned unconsciously: perception can rapidly undergo “structure learning,” automatically picking up novel contingencies between sensory signals and recruiting them for novel uses during the construction of a percept. It is worthwhile to step through their description of how the experiment works:
To convincingly show that new perceptual meanings for sensory signals can be learned automatically, one needs an “invisible visual signal,” that is, a signal that is sensed but that has no effect on visual appearance. The gradient of vertical binocular disparity, created by 2% vertical magnification of one eye's image (the eye of vertical magnification [EVM]), can be such a signal. In several control experiments, we ensured that EVM could not be seen by the participants.
The stimulus we used was a horizontal cylinder rotating either front side up or front side down. In its basic form, the cylinder was defined by horizontal lines with fading edges. The lines moved up and down on the screen, thereby creating the impression of a rotating cylinder with ambiguous rotation direction, so participants perceived it rotating sometimes as front side up and sometimes as front side down.
We tested whether the signal created by 2% vertical magnification could be recruited to control the perceived rotation direction of this ambiguously rotating cylinder. To do so, we exposed participants to a new contingency. We used a disambiguated version of the cylinder that contained additional depth cues: dots provided horizontal disparity, and a rectangle occluded part of the farther surface of the cylinder. These cues disambiguated the perceived rotation direction of the cylinder. In training trials, we exposed participants to cylinder stimuli in which EVM and the unambiguously perceived rotation direction were contingent upon one another. To test whether EVM had an effect on the perceived rotation direction of the cylinder, we interleaved these training trials with probe trials that had ambiguous rotation direction. If participants recruited EVM to the new use, then perceived rotation direction on probe trials would come to depend on EVM. If participants did not recruit EVM, then perceived rotation direction would be independent of EVM.
Importantly, after exposure to the new contingency, all participants saw a majority of probe trials consistent with the rotation direction contingent with EVM during exposure—that is, the learning effect was highly significant.
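To make the design concrete, here is a small Python sketch (my own illustration, not the authors' experiment code) of how contingent training trials and ambiguous probe trials might be interleaved, and how probe responses would be scored for learning of the EVM contingency:

```python
import random

# Assumed mapping used during training: which eye carries the 2% vertical
# magnification (EVM) determines the disambiguated rotation direction.
EVM_TO_ROTATION = {"left": "front-up", "right": "front-down"}

def make_trials(n_training=80, n_probe=20):
    """Build a shuffled list of training trials (extra depth cues contingent
    on EVM) and probe trials (ambiguous rotation, EVM only)."""
    trials = []
    for _ in range(n_training):
        evm = random.choice(["left", "right"])
        trials.append({"type": "training", "evm": evm,
                       "shown_rotation": EVM_TO_ROTATION[evm]})
    for _ in range(n_probe):
        trials.append({"type": "probe", "evm": random.choice(["left", "right"]),
                       "shown_rotation": None})  # ambiguous cylinder
    random.shuffle(trials)
    return trials

def probe_consistency(trials, responses):
    """Fraction of probe responses that match the trained EVM contingency;
    values well above 0.5 suggest the invisible signal was recruited."""
    probes = [(t, r) for t, r in zip(trials, responses) if t["type"] == "probe"]
    hits = sum(r == EVM_TO_ROTATION[t["evm"]] for t, r in probes)
    return hits / len(probes)
```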
Thursday, November 18, 2010
How life experiences alter what our genes do.
It has been a frustration that we are unable to pinpoint causative genetic effects in many complex diseases and behavioral abnormalities. Many think the missing information resides in our nongenetic cellular memory, which records developmental and environmental cues. "Epigenetics" has become the catch-all phrase for many environmentally influenced genetic regulatory systems involving DNA methylation, histone modification, nucleosome location, or noncoding RNA. The basic requirement for an epigenetic system is that it be heritable, self-perpetuating, and reversible. Benedict Carey has done a nice non-technical article on epigenetics, how people’s experience and environment affect the function of their genes. Some clips:
Genes are far more than protein machines, pumping out their product like a popcorn maker. Many carry what are, in effect, chemical attachments: compounds acting on the DNA molecule that regulate when, where or how much protein is made, without altering the recipe itself. Studies suggest that such add-on, or epigenetic, markers develop as an animal adapts to its environment, whether in the womb or out in the world — and the markers can profoundly affect behavior.
...researchers have shown that affectionate mothering alters the expression of genes, allowing them to dampen their physiological response to stress. These biological buffers are then passed on to the next generation: rodents and nonhuman primates biologically primed to handle stress tend to be more nurturing to their own offspring.
...Epigenetic markers may likewise hinder normal development: the offspring of parents who experience famine are at heightened risk for developing schizophrenia, some research suggests — perhaps because of the chemical signatures on the genes that parents pass on. Another recent study found evidence that, in some people with autism, epigenetic markers had silenced the gene which makes the receptor for the hormone oxytocin. Oxytocin oils the brain’s social circuits, and is critical in cementing relationships.
...The National Institutes of Health is sponsoring about 100 studies looking at the relationship between epigenetic markers and behavior problems, including drug abuse, post-traumatic stress, bipolar disorder and schizophrenia, compared with just a handful of such studies a decade ago.
Wednesday, November 17, 2010
Tiny touches of the tongue - the elegance of cats.
I've learned something about my constant companions, two Abyssinian cats named Marvin and Melvin. I've always wondered how the rapid, petite tongue flickers they use while drinking could be getting much water into their mouths. Now two MIT physicists have the simple answer: their tongues perform a complex maneuver that pits gravity against inertia in a delicate balance. Using high-speed photography, they found that:
...cats rest the tips of their tongues on the liquid's surface without penetrating it. The water sticks to the cat's tongue and is pulled upward as the cat draws its tongue into its mouth. When the cat closes its mouth, it breaks the liquid column but still keeps its chin and whiskers dry. Here is the full text of their article.

From Nicholas Wade's description:
What happens is that the cat darts its tongue, curving the upper side downward so that the tip lightly touches the surface of the water...The tongue is then pulled upward at high speed, drawing a column of water behind it...Just at the moment that gravity finally overcomes the rush of the water and starts to pull the column down — snap! The cat’s jaws have closed over the jet of water and swallowed it...The cat laps four times a second — too fast for the human eye to see anything but a blur — and its tongue moves at a speed of one meter per second.
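Those two numbers let you check the gravity-versus-inertia balance with a back-of-the-envelope estimate (my own arithmetic, not from the paper): a column of water drawn up at about 1 m/s is halted by gravity in roughly v/g, about a tenth of a second, which fits comfortably inside a quarter-second lap cycle.

```python
g = 9.8                  # gravitational acceleration, m/s^2
v = 1.0                  # reported tongue/column speed, m/s
laps_per_second = 4      # reported lapping rate

t_column = v / g             # time for gravity to halt the rising column (s)
h_column = v**2 / (2 * g)    # maximum height the column could reach (m)
t_lap = 1 / laps_per_second  # duration of one lap cycle (s)

print(f"column rises for ~{t_column:.2f} s, reaching at most ~{100*h_column:.0f} cm")
print(f"one lap cycle lasts {t_lap:.2f} s")
```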
Tuesday, November 16, 2010
A wandering mind is an unhappy mind.
Killingsworth and Gilbert report a fascinating study in the Nov. 12 issue of Science Magazine. They developed a smartphone technology to sample people’s ongoing thoughts, feelings, and actions and found that people are thinking about what is not happening almost as often as they are thinking about what is, and that this typically makes them unhappy. Here are some excerpts:
Unlike other animals, human beings spend a lot of time thinking about what is not going on around them, contemplating events that happened in the past, might happen in the future, or will never happen at all. Indeed, "stimulus-independent thought" or "mind wandering" appears to be the brain’s default mode of operation...although this ability is a remarkable evolutionary achievement that allows people to learn, reason, and plan, it may have an emotional cost.

To measure the emotional consequences of mind-wandering, the authors developed a Web application for the iPhone for collecting real-time reports from large numbers of people.
The application contacts participants through their iPhones at random moments during their waking hours, presents them with questions, and records their answers to a database at www.trackyourhappiness.org. The database currently contains nearly a quarter of a million samples from about 5000 people from 83 different countries who range in age from 18 to 88 and who collectively represent every one of 86 major occupational categories.

ADDED NOTE: I just opened my New York Times this morning and found a piece by John Tierney on this work.
To find out how often people’s minds wander, what topics they wander to, and how those wanderings affect their happiness, we analyzed samples from 2250 adults (58.8% male, 73.9% residing in the United States, mean age of 34 years) who were randomly assigned to answer a happiness question ("How are you feeling right now?") answered on a continuous sliding scale from very bad (0) to very good (100), an activity question ("What are you doing right now?") answered by endorsing one or more of 22 activities adapted from the day reconstruction method (10, 11), and a mind-wandering question ("Are you thinking about something other than what you’re currently doing?") answered with one of four options: no; yes, something pleasant; yes, something neutral; or yes, something unpleasant. Our analyses revealed three facts.
First, people’s minds wandered frequently, regardless of what they were doing. Mind wandering occurred in 46.9% of the samples and in at least 30% of the samples taken during every activity except making love. The frequency of mind wandering in our real-world sample was considerably higher than is typically seen in laboratory experiments. Surprisingly, the nature of people’s activities had only a modest impact on whether their minds wandered and had almost no impact on the pleasantness of the topics to which their minds wandered.
Second, multilevel regression revealed that people were less happy when their minds were wandering than when they were not..., and this was true during all activities, including the least enjoyable. Although people’s minds were more likely to wander to pleasant topics (42.5% of samples) than to unpleasant topics (26.5% of samples) or neutral topics (31% of samples), people were no happier when thinking about pleasant topics than about their current activity...and were considerably unhappier when thinking about neutral topics ... or unpleasant topics... than about their current activity (Figure, bottom). Although negative moods are known to cause mind wandering, time-lag analyses strongly suggested that mind wandering in our sample was generally the cause, and not merely the consequence, of unhappiness.
Third, what people were thinking was a better predictor of their happiness than was what they were doing. The nature of people’s activities explained 4.6% of the within-person variance in happiness and 3.2% of the between-person variance in happiness, but mind wandering explained 10.8% of within-person variance in happiness and 17.7% of between-person variance in happiness. The variance explained by mind wandering was largely independent of the variance explained by the nature of activities, suggesting that the two were independent influences on happiness.
Figure - Mean happiness reported during each activity (top) and while mind wandering to unpleasant topics, neutral topics, pleasant topics or not mind wandering (bottom). Dashed line indicates mean of happiness across all samples. Bubble area indicates the frequency of occurrence. The largest bubble ("not mind wandering") corresponds to 53.1% of the samples, and the smallest bubble ("praying/worshipping/meditating") corresponds to 0.1% of the samples.
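For readers who want to see what the multilevel regression described above looks like in practice, here is a minimal Python sketch (an illustrative template with made-up file and column names, not the authors' code or data): a mixed-effects model with a random intercept per person separates within-person from between-person variation in happiness.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical experience-sampling data: one row per prompt, with columns
# person_id, happiness (0-100), activity, and mind_wandering (0/1).
df = pd.read_csv("sampling_data.csv")

# Random intercept per person; fixed effects for mind wandering and activity.
model = smf.mixedlm("happiness ~ mind_wandering + C(activity)",
                    data=df, groups=df["person_id"])
result = model.fit()
print(result.summary())
```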
Monday, November 15, 2010
Great pianists of the 20th century
I have to pass on this wonderful video on great pianists of the 20th century. Horowitz doing the Carmen variations is astounding.
Color of ambient light directly influences our brain's emotional processing
When I started my first research laboratory 42 years ago, one of my first actions was to replace all of the standard fluorescent light fixtures with more natural daylight-spectrum bulbs that contained more blue wavelengths. My experience, and that of research students in my laboratory, was that this made the work environment more calm and tranquil. By now we have learned that blue light is the best stimulus for a visual pathway that lies outside of the classical (red/green/blue) rod and cone photoreceptor cells of our retinas. It is driven by the blue-sensitive visual pigment melanopsin, which is found in some newly discovered inner (ganglion) cells of the retina. Ambient light input from both this and the classical photoreceptors significantly modulates ongoing cognitive brain function, including attention, working memory, updating, and sensory processing, within a few tens of seconds. The amygdala, a central component of our emotional brain, receives sparse direct projections from the newly discovered light-sensitive retinal ganglion cells and is one of the brain areas acutely affected by changes in ambient light. Vandewalle et al. have now shown that ambient, particularly blue, light directly influences emotional brain processing. Their abstract:
Light therapy can be an effective treatment for mood disorders, suggesting that light is able to affect mood state in the long term. As a first step to understand this effect, we hypothesized that light might also acutely influence emotion and tested whether short exposures to light modulate emotional brain responses. During functional magnetic resonance imaging, 17 healthy volunteers listened to emotional and neutral vocal stimuli while being exposed to alternating 40-s periods of blue or green ambient light. Blue (relative to green) light increased responses to emotional stimuli in the voice area of the temporal cortex and in the hippocampus. During emotional processing, the functional connectivity between the voice area, the amygdala, and the hypothalamus was selectively enhanced in the context of blue illumination, which shows that responses to emotional stimulation in the hypothalamus and amygdala are influenced by both the decoding of vocal information in the voice area and the spectral quality of ambient light. These results demonstrate the acute influence of light and its spectral quality on emotional brain processing and identify a unique network merging affective and ambient light information.
Friday, November 12, 2010
Reducing pain by touching ourselves
Interesting observations from Kammers et al. The abstract:
Acute peripheral pain is reduced by multisensory interactions at the spinal level. Central pain is reduced by reorganization of cortical body representations. We show here that acute pain can also be reduced by multisensory integration through self-touch, which provides proprioceptive, thermal, and tactile input forming a coherent body representation. We combined self-touch with the thermal grill illusion (TGI). In the traditional TGI, participants press their fingers on two warm objects surrounding one cool object. The warm surround unmasks pain pathways, which paradoxically causes the cool object to feel painfully hot. Here, we warmed the index and ring fingers of each hand while cooling the middle fingers. Immediately after, these three fingers of the right hand were touched against the same three fingers on the left hand. This self-touch caused a dramatic 64% reduction in perceived heat. We show that this paradoxical release from paradoxical heat cannot be explained by low-level touch-temperature interactions alone. To reduce pain, we often clutch a painful hand with the other hand. We show here that self-touch not only gates pain signals reaching the brain but also, via multisensory integration, increases coherence of cognitive body representations to which pain afferents project.
How the leopard got its spots...
I pass on this interesting bit from the "Editors' Choice" section of the Nov. 5 issue of Science Magazine, describing work by Allen et al.:
The evolution of color patterns in animal coats has long been of interest to evolutionary biologists. From stripes on tigers to leopard spots and even the lion's plain coat, members of the cat family (Felidae) display some of the most striking patterns and variation in the degree of patterning across species. Camouflaging may be especially important in felids due to their stalking predatory behavior; however, the degree to which this shapes patterning across the family is unresolved. Allen et al. now compare mathematical model–generated categories of pattern complexity and variation to the phylogenetic history of the family and find that coat patterning is a highly changeable trait, which is largely related to felids' ecology. For instance, spots occur in species that live in closed environments, such as forests, and particularly complex patterns are found in arboreal and nocturnal species. In contrast, most species that live in open habitats, such as savannahs and mountains, have plain coats. These findings imply that spots provide camouflage in the spotted light found in forest canopies, whereas nonpatterned animals do better in the flat light of an open habitat. Thus, strong selection for background matching has rapidly generated tremendous diversity in coat patterning among felids.
Thursday, November 11, 2010
Another reason exercise inhibits aging - boost of muscle stem cells.
Work by Shefer et al. suggests why exercisers have better muscle function than nonexercisers as they age. It turns out that endurance exercise doesn't just tone muscles; it also increases the number of muscle stem cells that regenerate muscles after injury or illness. Enhanced stem cell numbers might also delay sarcopenia, the decline in muscle mass that occurs with aging. The experiments on rats showed that the number of muscle stem cells (called satellite cells) increased after rats spent 13 weeks running on a treadmill for 20 minutes a day. Younger rats showed a 20% to 35% increase in the mean number of stem cells per muscle fiber, while older rats showed a 33% to 47% increase.
Wednesday, November 10, 2010
Dancing Scientists
In a previous life, I was a dancer, taking classes in modern dance technique and improvisation, and was actually in a few performances. This was in the middle 1970's, a period when I was also traveling to the Esalen Institute on California's Big Sur coast to commune with Monarch butterflies and migrating whales, and take classes in Gestalt therapy, Alexander Technique, Feldenkrais technique, and massage...those were the days! This history explains why I am particularly seduced by the recent piece in Science Magazine by John Bohannon on dancing scientists, which covers the "Dance your Ph.D." contest run by Science Magazine for the past three years. The dancing scientists actually originated in the hippie 1970's, down the California coast from where I was watching whales, in Paul Berg's lab at Stanford. The tedious filming of those days has been replaced by digital camera outputs sent straight to YouTube. Berg got hundreds of requests for the original film, which then became a DVD, and is now on YouTube. It is a hoot to watch as a historical piece, and drives me to spasms of nostalgia.
Robust exchanges on MindBlog
Just as I have been following the commentaries (19 when I last looked) on the topic of last Wednesday's post, "Our minds extend beyond our heads," another comment came in on an older post, "Neuroscience and the soul," which also triggered a robust exchange, so I thought I would just repeat that link here.
Tuesday, November 09, 2010
Neural signature in the brain of skilled negotiators
Read Montague's group studies brain correlates of how we manipulate other people's beliefs about ourselves for gain. A review by Bhanoo describes the effort:
Researchers created a game in which players were given the true value of an object on a scale of 1 to 10. The players used this information to make a bid to the seller of the object, who did not know the true value...The buyers fell into three groups. One group consisted of players who were honest in their price suggestions, making low bids directly related to the true value. A second group, called “conservatives,” made bids only weakly related to the true price. The last and most interesting group, known as “strategic deceivers,” bid high when the true price was low; then, when the true price was high, they bid low and collected large gains...strategic deceivers had unique brain activity in regions connected to complex decision-making, goal maintenance and understanding another person’s belief system. Though the game was abstract, there are real-life advantages to being a strategic deceiver...It’s used to bargain in a marketplace or in a store but also to recruit someone for a job, or to negotiate a higher salary.

Here is the abstract of the paper:
The management and manipulation of our own social image in the minds of others requires difficult and poorly understood computations. One computation useful in social image management is strategic deception: our ability and willingness to manipulate other people's beliefs about ourselves for gain. We used an interpersonal bargaining game to probe the capacity of players to manage their partner's beliefs about them. This probe parsed the group of subjects into three behavioral types according to their revealed level of strategic deception; these types were also distinguished by neural data measured during the game. The most deceptive subjects emitted behavioral signals that mimicked a more benign behavioral type, and their brains showed differential activation in right dorsolateral prefrontal cortex and left Brodmann area 10 at the time of this deception. In addition, strategic types showed a significant correlation between activation in the right temporoparietal junction and expected payoff that was absent in the other groups. The neurobehavioral types identified by the game raise the possibility of identifying quantitative biomarkers for the capacity to manipulate and maintain a social image in another person's mind.
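As a toy illustration of the three behavioral types (my own sketch, not the task code from the study), the following Python snippet maps a privately known value between 1 and 10 onto a bid suggestion for each strategy; the "strategic deceiver" suggests high when the value is low and low when the value is high:

```python
import random

def honest_bid(value):
    """Suggestion tracks the true value closely."""
    return 0.6 * value

def conservative_bid(value):
    """Suggestion only weakly related to the true value."""
    return 3.0 + 0.1 * value

def strategic_bid(value):
    """Suggest high when the value is low (building credibility), low when
    the value is high (capturing a large gain)."""
    return 8.0 - 0.6 * value

strategies = {"honest": honest_bid,
              "conservative": conservative_bid,
              "strategic deceiver": strategic_bid}

random.seed(0)
for name, bid_fn in strategies.items():
    for _ in range(3):
        value = random.randint(1, 10)   # privately known true value
        print(f"{name:>18}: value={value:2d} -> suggested bid={bid_fn(value):.1f}")
```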
Monday, November 08, 2010
The recent election - gridlock or compromise?
I find it hard to see much of a glimmer of hope for the prospect that some important issues will actually be faced in the next several years, but this piece by Benedict Carey does note some research suggesting conditions that can lead to a softening of strongly held positions. A few clips:
...people tend to exaggerate their differences with opponents to begin with, research suggests, especially in the company of fellow partisans. In small groups organized around a cause, for instance, members are prone to one-up one another; the most extreme tend to rise the most quickly, making the group look more radical than it is.
...recent studies demonstrate how quickly large differences can be put aside, under some circumstances. In one, a team of psychologists had a group of college students who scored very high on measures of patriotism read and critique an essay titled “Beyond the Rhetoric: Understanding the Recent Terrorist Attacks in Context,” which argued that the 9/11 attacks were partly a response to American policy in the Middle East.
The students judged the report harshly — unless, prompted by the researchers, they had first described a memory that they were proud of. This group, flush with the image of having acted with grace or courage, was significantly more open to at least considering the case spelled out in the essay than those who had recounted a memory of having failed to exhibit their most prized personal quality.
Confronting an opposing political view is a threat to identity, but “if you remind people of what they value in some other domain of their life, it lessens the pain,” said the lead author, Geoffrey L. Cohen, a social psychologist at Stanford. “It opens them up to information that they might not otherwise consider.”
The effect of such affirmations seems especially pronounced in people who boast strong convictions. In a follow-up experiment, the research team had supporters of abortion-rights act out a negotiation with an opponent on an abortion bill. Again, participants who were prompted to recall a treasured memory beforehand were more open to seeking areas of agreement and more respectful of their opposite’s position than those not so prompted.
Friday, November 05, 2010
Why our brains go for market bubbles.
Jonah Lehrer has a nice piece in last Sunday's New York Times Magazine which discusses Read Montague's work suggesting that financial manias seem to take advantage of deep-seated human flaws; the market fails only because the brain fails first.
At first, Montague’s data confirmed the obvious: our brains crave reward. He watched as a cluster of dopamine neurons acted like greedy information processors, firing rapidly as the subjects tried to maximize their profits during the early phases of the bubble. When share prices kept going up, these brain cells poured dopamine into the caudate nucleus, which increased the subjects’ excitement and led them to pour more money into the market. The bubble was building.
But then Montague discovered something strange. As the market continued to rise, these same neurons significantly reduced their rate of firing. “It’s as if the cells were getting anxious,” Montague says. “They knew something wasn’t right.” And then, just before the bubble burst, these neurons typically stopped firing altogether. In many respects, these dopamine neurons seem to be acting like an internal thermostat, shutting off when the market starts to overheat. Unfortunately, the rest of the brain is too captivated by the profits to care: instead of heeding the warning, the brain obeys the urges of so-called higher regions, like the prefrontal cortex, which are busy coming up with all sorts of reasons that the market will never decline. In other words, our primal emotions are acting rationally, while those rational circuits are contributing to the mass irrationality.
Thursday, November 04, 2010
Calculate your endurance
I thought I would pass on this fascinating item from the Random Samples section of the Oct. 29th issue of Science Magazine. Now you can calculate exactly what carbohydrate loading you need to run a marathon in a desired amount of time:
Just about all serious marathon runners have experienced it. In the last half of a marathon, usually at about mile 21, their energy suddenly plummets. Their legs slow down, and it's almost impossible to make them go faster. Nutritionists blame carbohydrate loss: When the supply runs out, runners "hit the wall."
Now a model published this month in PLoS Computational Biology tells runners when they'll hit the wall, helping them to plan their carb-loading or refueling strategies accordingly. Benjamin Rapoport, a dual M.D.-Ph.D. student at Harvard Medical School and the Massachusetts Institute of Technology, says the idea began 5 years ago, when a class conflicted with his running in the Boston Marathon. His professor let him skip, provided he give a talk on the physiology of endurance running afterward. The talk became an annual tradition, and now he's quantified his ideas.
Rapoport's model looks at multiple factors, such as a runner's desired pace, muscle mass, and aerobic capacity, the amount of oxygen the body can deliver to its muscles. "It's a real tour de force," says physiologist Michael Joyner of the Mayo Clinic in Rochester, Minnesota, although he adds it is hard to account for all individual differences.
Which is better, chowing down days before or grabbing some sugar during the race? "Both," says sports nutritionist Edward Coyle of the University of Texas, Austin. "The carb loading will raise the glycogen levels in your muscles, and taking in carbs during the race will keep your blood glucose levels up." And now Rapoport even has an app for that: athletes can calculate their thresholds at http://endurancecalculator.com/.
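Rapoport's actual model is in the PLoS Computational Biology paper and the online calculator, but the flavor of the arithmetic can be conveyed with a deliberately simplified carbohydrate budget (rule-of-thumb constants of my own choosing, not his equations), comparing the carbohydrate energy a marathon demands with what a runner can plausibly store:

```python
# Deliberately simplified carbohydrate budget -- rough rule-of-thumb numbers,
# not the Rapoport model.
body_mass_kg   = 70
distance_km    = 42.2
kcal_per_kg_km = 1.0    # approximate energy cost of running
carb_fraction  = 0.75   # share of energy from carbohydrate at marathon pace
glycogen_kcal  = 2000   # approximate leg-muscle plus liver glycogen stores

total_kcal = body_mass_kg * distance_km * kcal_per_kg_km
carb_kcal_needed = total_kcal * carb_fraction
shortfall_kcal = carb_kcal_needed - glycogen_kcal

print(f"total energy cost   ~{total_kcal:.0f} kcal")
print(f"carbohydrate needed ~{carb_kcal_needed:.0f} kcal")
if shortfall_kcal > 0:
    grams = shortfall_kcal / 4  # ~4 kcal per gram of carbohydrate
    print(f"shortfall ~{shortfall_kcal:.0f} kcal: hit the wall unless you "
          f"carb-load more or take in ~{grams:.0f} g of carbs during the race")
else:
    print("stored glycogen covers the race at this pace")
```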
Wednesday, November 03, 2010
Our minds extend beyond our heads.
I've always liked the idea, lucidly presented by Andy Clark over many years, that our minds are impossible to distinguish from our environment, because they really can't exist in the absence of a cognitive coupling between the two. I am relaying below the entire text of an instructive and interesting book review by Erik Myin of a book of commentaries on an influential 1998 paper by Andy Clark and David Chalmers titled "The extended mind." (very much worth reading, PDF here).
Where is the mind? "In the head" or "in the brain," most people might respond. The philosopher Gilbert Ryle gave a different answer:
The statement "the mind is in its own place," as theorists might construe it, is not true, for the mind is not even a metaphorical "place." On the contrary, the chessboard, the platform, the scholar's desk, the judge's bench, the lorry, the driver's seat, the studio and the football field are among its places. (1)
Recently, this idea of the mind not being confined to the head has been reinvigorated by philosophers and cognitive scientists, who see the mind as "spreading out" or "extending" into the world. "How do you know the way to San José?" philosopher John Haugeland has famously asked (2). Chances are you don't have some inner analog of a printed map. Rather, you know where you should enter the highway, and then you get there by following the road signs. Your knowledge seems to be partially "implemented" in the environment. There is now a blooming field of research into "situated cognition," which explores how cognitive or mental phenomena such as problem solving or remembering can be strongly dependent on interactions between subjects and their environments.
The possible far-reaching implications of a situated view of cognition were brought into sharp focus by Andy Clark and David Chalmers in their 1998 paper "The extended mind" (3). There they defend the idea that the mind "extends" into the environment in cases in which a human organism and the environment become cognitively coupled systems. Their by now iconic illustration of cognitive coupling involves "Otto," a "slightly amnesic" person, who uses a notebook to write down important facts that he is otherwise likely to forget. Unlike a person who remembers the address of the Museum of Modern Art by relying on natural memory, Otto recalls it by accessing his notebook. If one supposes that the notebook is constantly available to Otto and that what is written in it is endorsed by Otto, it becomes plausible—so Clark and Chalmers argue—that Otto's memory extends to include the notebook. After all, they notice, Otto's notes seem to play exactly the same role as memory traces in other people. Wouldn't it be chauvinistic to restrict the mind's extent to what's natural and inner?
Clark and Chalmers's paper has triggered a vigorous and continuing debate. Nonbelievers concede that numerous tight causal couplings between minds and environments exist, but they deny that it therefore makes sense to speak of an extended mind instead of a mind in a person that closely interacts with an environment. All things considered, they argue, thoughts remain in persons—never in objects like notebooks, however closely dependent a person could become on them.
Enthusiasts for the extended mind thesis insist that a close causal coupling between persons and environments can license the conclusion that the mind spreads into the environment. Some follow the argument in Clark and Chalmers that infers extendedness from the fact that external elements can play a role that would be considered as cognitive if played by something internal to a person.
Other supporters of the idea are suspicious of this argument from parity. They note that the most interesting cases of causal coupling are those in which the environment does not simply function as some ersatz internal milieu—when the involvement of external means makes possible forms of cognition that were not possible without them. For example, when pen and paper, symbolic systems, or computers make possible calculations, computations, and, ultimately, scientific theories. Those taking this position hold that it is when the environment becomes a necessary factor in enabling novel cognitive processes that the mind extends.
In The Extended Mind, philosopher Richard Menary (University of Wollongong) brings together the Clark and Chalmers paper and several responses to it. The collection, lucidly introduced by Menary, will neither definitively prove nor deal the deathblow to the idea that "the place of the mind" is the world—nor even establish that there really is such a question about "the place of the mind" that needs to be answered. Rather, the volume provides carefully drawn arguments for and against different interpretations of the extended mind thesis, often with extensive reference to empirical material. Several of the papers in the collection are excellent.
To take one fascinating idea, consider Susan Hurley on "variable neural correlates." We are comfortable with the correlation between types of experience and types of brain states, and undoubtedly such variation is one important source for the idea that the mind is in the head. Hurley notes, however, that there is also a dependence of experience on type of interaction with the environment, one not aligned to strictly neural properties. For example, when blind people haptically read Braille text, activity in the visual cortex seems to correlate with tactile experience. In people who are not blind, tactile experience correlates with activity in the tactile cortex. What explains the common enabling of tactile experience by the different kinds of cortex seems to be tactile causal coupling with the environment, rather than strictly neural type. According to Hurley, and others, the same kind of correlation-tracking reasoning that convinces us, in standard cases, that the mind is in the brain should here lead to the conclusion that the mind is not in the head.
References
* 1. G. Ryle, The Concept of Mind (Hutchinson, London, 1949).
* 2. J. Haugeland, Having Thought: Essays in the Metaphysics of Mind (Harvard Univ. Press, Cambridge, MA, 1998).
* 3. A. Clark, D. J. Chalmers, Analysis 58, 7 (1998).