This blog reports new ideas and work on mind, brain, behavior, psychology, and politics - as well as random curious stuff.
Thursday, January 04, 2007
Alcohol and Arthritis
Jonsson et al. show (at least in mice) that "low but persistent ethanol consumption delays the onset and halts the progression of collagen-induced arthritis by interaction with innate immune responsiveness."
An Autistic Savant - The Living Camera
Steven is an autistic savant living in London who did not speak until he was five and now has great difficulty with language as an adult. When he was eleven, he drew a perfect aerial view of London after flying over it only once. Here is a Windows Media Player movie describing his Rome flyover and drawing.
The Free Will debate...
Any of you who have read my "I-Illusion" piece or followed this blog will know that I have a continuing interest in the issue of free will. The science section in the Jan. 2 issue of The New York Times has a beautifully written essay by Dennis Overbye - "Free Will: Now You Have It, Now You Don't" - which gives the views of Dennett, Wegner, Libet, Silberstein and others. I'm tempted to give you huge chunks of the article, but will restrain myself to just a few clips:
Overbye: A bevy of experiments in recent years suggest that the conscious mind is like a monkey riding a tiger of subconscious decisions and actions in progress, frantically making up stories about being in control.
Silberstein: If people freak at evolution, etc., how much more will they freak if scientists and philosophers tell them they are nothing more than sophisticated meat machines, and is that conclusion now clearly warranted or is it premature?
Dennett: When we consider whether free will is an illusion or reality, we are looking into an abyss. What seems to confront us is a plunge into nihilism and despair.
Overbye: Dennett...is one of many who have tried to redefine free will in a way that involves no escape from the materialist world while still offering enough autonomy for moral responsibility, which seems to be what everyone cares about. ... Dennett argues, it is precisely our immersion in causality and the material world that frees us. Evolution, history and culture, he explains, have endowed us with feedback systems that give us the unique ability to reflect and think things over and to imagine the future. Free will and determinism can co-exist.
Dennett: All the varieties of free will worth having, we have...We have the power of imagination, to see and imagine futures...That’s what makes us moral agents...You don’t need a miracle to have responsibility.
Overbye also reviews the idea of freedom as a possible emergent phenomenon - one that, like stock markets, brains, or the rules of democracy, grows naturally in accordance with the laws of physics yet plays by new rules once it is here.
Wednesday, January 03, 2007
The distinction between sincerity and authenticity
I want to pass on clips from an essay by Orlando Patterson in the Dec. 26 New York Times. He cites Lionel Trilling, the cultural critic, as having in the 1970s "encouraged us to take seriously the distinction between sincerity and authenticity. Sincerity, he said, requires us to act and really be the way that we present ourselves to others. Authenticity involves finding and expressing the true inner self and judging all relationships in terms of it."
Patterson suggests that "Authenticity now dominates our way of viewing ourselves and our relationships, with baleful consequences. Within sensitive individuals it breeds doubt; between people it promotes distrust; within groups it enhances group-think in the endless quest to be one with the group’s true soul; and between groups it is the inner source of identity politics...the primacy of the self has penetrated major areas of government: emotivist arguments trump reasoned discourse in Congressional hearings and criminal justice; and in public education."
"Social scientists and pollsters routinely belittle results showing growing tolerance; they argue that Americans have simply learned how to conceal their deeply...Harvard social psychologist Mahzarin Banaji and her collaborators claim to have evidence, based on more than three million self-administered Web-based tests, that nearly all of us are authentically bigoted to the core with hidden “implicit prejudices” — about race, gender, age, homosexuality and appearance — that we deny, sometimes with consciously tolerant views ingrained prejudices."
"I couldn’t care less whether my neighbors and co-workers are authentically sexist, racist or ageist. What matters is that they behave with civility and tolerance, obey the rules of social interaction and are sincere about it. The criteria of sincerity are unambiguous: Will they keep their promises? Will they honor the meanings and understandings we tacitly negotiate? Are their gestures of cordiality offered in conscious good faith?...Sincerity rests in reconciling our performance of tolerance with the people we become. And what it means for us today is that the best way of living in our diverse and contentiously free society is neither to obsess about the hidden depths of our prejudices nor to deny them, but to behave as if we had none."
Tuesday, January 02, 2007
Explaining away the supernatural as brain misfirings?
Several of my posts have mentioned work suggesting that phenomena like out-of-body experiences, or sensing the presence of phantom others, can arise from temporary perturbation of the brain processes that normally organize our perception of the external world and the others in it. These perturbations have been observed during epileptic seizures and during electrical or magnetic stimulation of particular brain regions. Deborah Blum, author of “Ghost Hunters: William James and the Scientific Search for Life After Death,” weighs in on this issue in an Op-Ed piece in the Dec. 30 New York Times. She cites work from Blanke's laboratory that I mentioned in my Oct. 3 post. She seems critical of scientists who "concluded that ghosts are mere “bodily delusions,” electrical misfirings and nothing more" and cites work done on psychic phenomena by respected scientists in the late 19th century. Blum says "Dr. Blanke believes that even this one subject’s experience serves as an example of how we may mistake errant signals in the brain for something more. Humans tend, he points out, to seek explanation, to impose meaning on events that may have none. The pure rationalists among us suggest that our need to add meaning to a basic, biological existence easily accounts for the way we organize religions and find evidence of otherworldly powers in the stuff of everyday life."
Blum then continues: "The nonpurists suggest a different conclusion: willful scientific blindness. And there’s no reason Dr. Blanke’s study can’t support their theories of the paranormal. Perhaps his experimental electric current simply mimics the work of an equally powerful spirit. Much of the psychical research done today applies similar principles: brain-imaging machines highlight parts of the brain that respond to psychic phenomena, while other devices are used to search for infrared radiation or increased electrical activity in haunted houses."
Wait a minute... Equally powerful spirit? Will someone please measure this spirit with a physical instrument, because it is altering physical processes in the brain! Or, "parts of the brain that respond to psychic phenomena?" What is cause and what is effect here? Are we presupposing the existence of psychic phenomena as causes? Then please measure them. I'm sorry, but I can't give up my skepticism: how can a non-physical process (spirit, ectoplasm, soul, whatever) alter material physical processes in the brain? We're back to Descartes putting the soul in the pineal gland.
Monday, January 01, 2007
"Web 2.0"...both feel-good and wicked
I want to pass on two articles on "Web 2.0", the first by Celeste Biever in the Dec. 23 issue of New Scientist, and the second by David Pogue in the Dec. 31 issue of the New York Times.
First, Biever:
"USER participation is crucial to the survival of popular websites like YouTube and Flickr. But how do these sites ensure that new videos, photos and comments keep flooding in?
It all comes down to persuasion strategies, says B. J. Fogg at Stanford University in California, who is analysing the techniques employed by websites that rely on their users for content, known collectively as Web 2.0. The secret is to tie the acquisition of friends, compliments and status – spoils that humans will work hard for – to activities that enhance the site, such as inviting new users and contributing photos, he says. “You offer someone a context for gaining status, and they are going to work for that status.”
Fogg and his colleagues analysed hundreds of such sites and identified three stages to their success, which they called discovery, superficial involvement and true commitment.
They found that the first two stages are easily achieved, for example by making it simple for existing users to email their friends with something they have posted online. In this way other people discover the site and become superficially involved through activities such as rating a posted video or photo. What separates successful from unsuccessful websites is the ability to get these people to create content of their own, involve yet more friends, and remain active and loyal (see “Watch yourself”).
By studying over 150 videos of people using successful sites, Fogg identified key strategies that persuade users to get involved. One incentive is to give people the opportunity to increase their status. For example, the photo-sharing website Flickr assigns images an “interestingness” score depending on how many people view them and whether they comment. This encourages users to email their friends with links to their photos. This is good for the site as it improves the quality of Flickr's search engine by ensuring the most interesting photos are ranked most highly.
Sites also keep people involved by giving them the chance to earn rewards. For bloggers these could come in the form of comments from other users, while on the business networking site Linked-In they might be endorsements that potential contacts can read. Again, these benefit the websites by engaging other users.
The effects of both status and rewards are increased because they are doled out unpredictably – new people joining your friendship group on MySpace say, or a new comment on your blog. This ensures users frequently return to the site to check for changes.
Fogg hopes that by studying how well these strategies work, he will be able to quantify them and discover new ways in which people are open to persuasion. “The web is a huge lab for studying human psychology,” he says. “I think what we are seeing with Web 2.0 is which persuasion technologies work and which do not.”
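For readers who like to see mechanisms made concrete, here is a purely hypothetical Python sketch of the kind of engagement-based scoring Fogg describes - passive views and active comments folded into a ranking that surfaces "interesting" items first. This is not Flickr's actual algorithm; the Photo class, the weights, and the log damping are invented for illustration.

```python
# Hypothetical "interestingness"-style ranking: engagement signals are folded
# into a score that decides which items surface first. Weights are invented.
import math
from dataclasses import dataclass

@dataclass
class Photo:
    title: str
    views: int
    comments: int
    favorites: int

def interestingness(p: Photo) -> float:
    # Log-damp raw counts so a handful of early comments matters more than the
    # thousandth view, and weight active engagement above passive viewing.
    return (1.0 * math.log1p(p.views)
            + 4.0 * math.log1p(p.comments)
            + 6.0 * math.log1p(p.favorites))

photos = [
    Photo("sunset over the bay", views=5400, comments=2, favorites=1),
    Photo("my cat asleep",       views=300,  comments=25, favorites=12),
    Photo("blurry concert shot", views=80,   comments=0,  favorites=0),
]

for p in sorted(photos, key=interestingness, reverse=True):
    print(f"{interestingness(p):6.2f}  {p.title}")
```

The design point the toy makes is the one in the article: a little active engagement (comments, favorites) outranks a lot of passive viewing, which is exactly the kind of status signal a contributor can work to influence.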
And the next article, by Pogue:
"IN 2006, the big Internet news was “Web 2.0” — that is, participatory Web sites, like YouTube, MySpace, Wikipedia, Digg and Flickr, which relied on material supplied by the audience itself. On these explosively popular sites, the Web is not so much a publication as a global conversation.
In 2007, the challenge may be keeping that conversation from descending into the muck.
As a Web 2.0 site or a blog becomes more popular, a growing percentage of its reader contributions devolve into vitriol, backstabbing and name-calling (not to mention Neanderthal spelling and grammar). Participants address each other as “idiot” and “moron” (and worse) the way correspondents of old might have used “sir” or “madam.”
The New Nastiness may be no different from the incivility people can show each other in everyday life. It may be inspired by the political insultfests on TV and radio. Or it may be that anonymity online removes whatever self-control they might have exhibited when confronting their subjects in person.
Internet veterans scoff at the notion that there’s any increase in hostility online. They point to similar “flame wars” dating back to the earliest days of the Internet, even before there was a Web.
Instead, these observers note that rudeness increases disproportionately with a site’s popularity. That is, the decline of comment quality on YouTube doesn’t reflect a decline on the Internet in general, only of YouTube’s wider appeal.
One thing is clear, however: the uncivil participants are driving away the civil ones. The result is an acceleration of the cycle, and an increasing proportion of hostile remarks.
Requiring commenters to use their real names might work to add some civility, but such a radical change might drive away a big chunk of the audience. It’s more likely that the citizens of the Internet will simply learn to accept the poison on the comment boards as an unfortunate side effect of free speech online, much the way they grumblingly tolerate spam in their e-mail in-box."
Friday, December 29, 2006
Challenging the link between the human microcephalin gene, evolution, and cognition.
Michael Balter writes an interesting account in Science Magazine of the controversy over the interpretation of data on the microcephalin gene, a gene that regulates brain size. (Microcephaly is the congenital or developmental disorder in which the circumference of the head is smaller than normal because the brain has not developed properly or has stopped growing.)
"...in two papers in Science last year, Lahn reported that variants of the two genes appear to have been strongly favored by recent natural selection (Science, 9 September 2005, pp. 1717 and 1720). That implies that the variants conferred a survival or reproductive benefit, perhaps a cognitive one. In media interviews, Lahn conceded that there was no real evidence natural selection had acted on cognition or intelligence. But both papers pointed out that the mutations arose when key events in human cultural development occurred: The microcephalin variant was dated to about 37,000 years ago, when the first art and symbolism showed up in Europe, and the ASPM variant to 5800 years ago, when the first cities arose.
Lahn's papers also reported the skewed geographic distribution of the genetic variants. Variants in microcephalin turned up in 75% or more of some Europeans and Asians Lahn studied, but in less than 10% of some African groups. The ASPM variant was also much less frequent in Africa.
Bloggers jumped on the news, trumpeting the papers as support for the idea that African Americans have lower intelligence than whites. Two months later, in the conservative National Review Online, columnist John Derbyshire wrote that the research implied that "our cherished national dream of a well-mixed and harmonious meritocracy … may be unattainable."
"Soon after the Science papers were published, Lahn set out to see whether the variants give a cognitive advantage. In one study, Lahn helped controversial psychologist Philippe Rushton of the University of Western Ontario in London, Canada, test whether people who carry the favored variants have higher IQs. Rushton is well known for his claims that African Americans have lower intelligence than whites, and Lahn had found that some genetic variants are common in Europeans and Asians but less frequent among sub-Saharan Africans. But Rushton reported last week at the annual meeting of the International Society for Intelligence Research in San Francisco, California, that he had struck out: The variants conferred no advantage on IQ tests. "[We] had no luck," Rushton told Science, "no matter which way we analyzed the data." Lahn was not a co-author, but his group genotyped the 644 adults of differing ethnicity in the study."
Among some geneticists, there was consternation. "There was no evidence whatsoever that these [genetic variants] have any effect" on differences between people, Altshuler says, adding that the controversy over the work was "easily anticipated." Harvard geneticist Richard Lewontin goes further, criticizing both Lahn and Science for publishing such speculative links to cultural advances. "These two papers are particularly egregious examples of going well beyond the data to try to make a splash," he says. And archaeologist Scott MacEachern of Bowdoin College in Brunswick, Maine, says the archaeological links in the papers are simplistic and outdated. The symbolic revolution, agriculture, and urbanism developed "over many thousands of years, and none was restricted to Europe and the Middle East," he says."
"...in two papers in Science last year, Lahn reported that variants of the two genes appear to have been strongly favored by recent natural selection (Science, 9 September 2005, pp. 1717 and 1720). That implies that the variants conferred a survival or reproductive benefit, perhaps a cognitive one. In media interviews, Lahn conceded that there was no real evidence natural selection had acted on cognition or intelligence. But both papers pointed out that the mutations arose when key events in human cultural development occurred: The microcephalin variant was dated to about 37,000 years ago, when the first art and symbolism showed up in Europe, and the ASPM variant to 5800 years ago, when the first cities arose.
Lahn's papers also reported the skewed geographic distribution of the genetic variants. Variants in microcephalin turned up in 75% or more of some Europeans and Asians Lahn studied, but in less than 10% of some African groups. The ASPM variant was also much less frequent in Africa. (click on graphic to enlarge).
Bloggers jumped on the news, trumpeting the papers as support for the idea that African Americans have lower intelligence than whites. Two months later, in the conservative National Review Online, columnist John Derbyshire wrote that the research implied that "our cherished national dream of a well-mixed and harmonious meritocracy … may be unattainable."
"Soon after the Science papers were published, Lahn set out to see whether the variants give a cognitive advantage. In one study, Lahn helped controversial psychologist Philippe Rushton of the University of Western Ontario in London, Canada, test whether people who carry the favored variants have higher IQs. Rushton is well known for his claims that African Americans have lower intelligence than whites, and Lahn had found that some genetic variants are common in Europeans and Asians but less frequent among sub-Saharan Africans. But Rushton reported last week at the annual meeting of the International Society for Intelligence Research in San Francisco, California, that he had struck out: The variants conferred no advantage on IQ tests. "[We] had no luck," Rushton told Science, "no matter which way we analyzed the data." Lahn was not a co-author, but his group genotyped the 644 adults of differing ethnicity in the study."
Among some geneticists, there was consternation. "There was no evidence whatsoever that these [genetic variants] have any effect" on differences between people, Altshuler says, adding that the controversy over the work was "easily anticipated." Harvard geneticist Richard Lewontin goes further, criticizing both Lahn and Science for publishing such speculative links to cultural advances. "These two papers are particularly egregious examples of going well beyond the data to try to make a splash," he says. And archaeologist Scott MacEachern of Bowdoin College in Brunswick, Maine, says the archaeological links in the papers are simplistic and outdated. The symbolic revolution, agriculture, and urbanism developed "over many thousands of years, and none was restricted to Europe and the Middle East," he says."
Thursday, December 28, 2006
Actions speak louder than brain images.
There is growing concern over the pseudoscientific use of brain imaging to predict behaviors or assign a character type. Apoorva Mandavilli writes a short essay on this topic in the Dec. 7 issue of Nature Magazine.
"Can brain scans of a racist, liar or psychopath accurately tell whether that person will persecute, fib or kill? No, say experts in the ethics of neuroscience, who are increasingly concerned that such images will be used to make dangerous legal or social judgements about people's behaviour. They say it is time for scientists, lawyers and philosophers to speak up about the limitations of such techniques....interpreting brain scans, and correlating them to actions, is inaccurate at best. All we can really gain from such studies is a more nuanced understanding of behaviour...studies of behavioural or physical responses — for example, a person's reaction to different races in real life — should trump imaging every time...The legal and moral claims being made[ from imaging studies involving very few people] are far too extensive."
" In a landmark case in the US Supreme Court in March 2005, several leading scientific groups, including the American Medical Association, the American Psychiatric Association and the National Mental Health Association, filed briefs to support the premise that teenagers are less rational than adults.
The data included a brain-imaging study showing that the prefrontal cortex, which governs impulse control and reasoning, develops late in adolescence (see Nature 442, 865–867; 2006), and could explain some irrational aspects of teenage behaviour."
"Many groups thought this study could help rule against the death penalty. But although the court ruled against the death penalty for those younger than 18, it chose not to cite the brain-imaging study, relying instead on behavioural studies that showed adolescents are more impulsive, more vulnerable to peer pressure and more affected by stress."
"Can brain scans of a racist, liar or psychopath accurately tell whether that person will persecute, fib or kill? No, say experts in the ethics of neuroscience, who are increasingly concerned that such images will be used to make dangerous legal or social judgements about people's behaviour. They say it is time for scientists, lawyers and philosophers to speak up about the limitations of such techniques....interpreting brain scans, and correlating them to actions, is inaccurate at best. All we can really gain from such studies is a more nuanced understanding of behaviour...studies of behavioural or physical responses — for example, a person's reaction to different races in real life — should trump imaging every time...The legal and moral claims being made[ from imaging studies involving very few people] are far too extensive."
" In a landmark case in the US Supreme Court in March 2005, several leading scientific groups, including the American Medical Association, the American Psychiatric Association and the National Mental Health Association, filed briefs to support the premise that teenagers are less rational than adults.
The data included a brain-imaging study showing that the prefrontal cortex, which governs impulse control and reasoning, develops late in adolescence (see Nature 442, 865–867; 2006), and could explain some irrational aspects of teenage behaviour."
"Many groups thought this study could help rule against the death penalty. But although the court ruled against the death penalty for those younger than 18, it chose not to cite the brain-imaging study, relying instead on behavioural studies that showed adolescents are more impulsive, more vulnerable to peer pressure and more affected by stress."
Blog Categories:
culture/politics,
morality,
technology
Brain self repair
Jan's laboratory at UCSF (Cell Volume 127, Issue 6, 15 December 2006) has looked at an area of the brain known as the subventricular zone (SVZ). They showed that a gene called Numb regulates how stem cells from the SVZ become neurons, and instructs these cells to maintain the walls of the lateral ventricles, the brain's central cavities.
With Numb knocked out, mice developed large holes in these walls. But, rather than worsening over time, the holes were repaired within 6 weeks. The team suggests that stem cells that escaped the knockout were able to shore up the walls. Such capacity for do-it-yourself repair might be harnessed to treat brain damage.
Wednesday, December 27, 2006
A bull market in Brain Fitness and Calisthenics
I've been meaning for some time to do a post on the avalanche of interest in aging baby boomers not losing their marbles any faster than absolutely necessary. An article on this topic by Pam Belluck in the 12/27/06 New York Times prompts me to go ahead. We are seeing a blooming of blogs and start-up companies that focus on techniques for preserving memory and mental acuity (Posit Science, Third Age, Vigorous Mind, Rocky Mountain Learning, Sharp Brains, Happy Neuron, My Brain Trainer, to mention just a few). The Developing Intelligence blog has a post that discusses the Sharp Brains company, and the Sharp Brains Blog and the Brain Reserves Blog are among several that focus on brain fitness.
There are positive individual testimonials to the effectiveness of brain exercises, and a number of group studies are underway, but we still lack hard data showing that brain exercises bring a benefit distinguishable from that of general cardiovascular exercise. Belluck notes "human studies have generally relied on observations of people with healthier brains, but have not tested whether a particular behavior improves brain health. Perhaps people with healthier brains are more likely to do brain-stimulating activities, not the reverse." She also makes the point: "Certainly most brain-healthy recommendations are not considered bad for people. They do not have the potential risks of drugs or herbal supplements... The challenge we have is it’s going to be a lot like the anti-aging industry: how much science is there behind this?"
Synchronies to bind our brains... check out the movie
A commentary by Sporns and Honey and an article by Bassett et al. in PNAS delve into (quoting Sporns and Honey) "explaining how functional brain states emerge from the interactions of dozens, perhaps hundreds, of brain regions, each containing millions of neurons. Much evidence supports the view that highly evolved nervous systems are capable of rapid, real-time integration of information across segregated sensory channels and brain regions. This integration happens without the need for a central controller or executive: It is the functional outcome of dynamic interactions within and between the complex structural networks of the brain... the study by Bassett et al. reveals the existence of large-scale functional networks in magnetoencephalographic (MEG) recordings with attributes that are preserved across multiple frequency bands and that flexibly adapt to task demands. These networks exhibit "small-world" structure, i.e., high levels of clustering and short path lengths. The authors' analysis reveals that the small-world topology of brain functional networks is largely preserved across multiple frequency bands and behavioral tasks."
From Bassett et al: "Coherent or correlated oscillation of large-scale, distributed neural networks is widely regarded as an important physiological substrate for motor, perceptual and cognitive representations in the brain...The topology of networks can range from entirely random to fully ordered (a lattice). In this spectrum, small-world topology is characteristic of complex networks that demonstrate both clustered or cliquish interconnectivity within groups of nodes sharing many nearest neighbors in common (like regular lattices), and a short path length between any two nodes in the network (like random graphs). This is an attractive configuration, in principle, for the anatomical and functional architecture of the brain, because small-world networks are known to optimize information transfer, increase the rate of learning, and support both segregated and distributed information processing."
"Magnetoencephalographic data were acquired from 22 subjects, half of whom performed a finger-tapping task, whereas the other half were studied at rest. Signals were recorded from a set of 275 points overlying the scalp surface, to provide a time-frequency decomposition of human brain activity... brain functional networks were characterized by small-world properties at all six wavelet scales considered, corresponding approximately to classical {delta} (low and high), {theta}, {alpha}, beta, and {gamma} frequency bands. Global topological parameters (path length, clustering) were conserved across scales, most consistently in the frequency range 2–37 Hz, implying a scale-invariant or fractal small-world organization. Dynamical analysis showed that networks were located close to the threshold of order/disorder transition in all frequency bands. The highest-frequency {gamma} network had greater synchronizability, greater clustering of connections, and shorter path length than networks in the scaling regime of (lower) frequencies. Behavioral state did not strongly influence global topology or synchronizability; however, motor task performance was associated with emergence of long-range connections in both beta and {gamma} networks. Long-range connectivity, e.g., between frontal and parietal cortex, at high frequencies during a motor task may facilitate sensorimotor binding. Human brain functional networks demonstrate a fractal small-world architecture that supports critical dynamics and task-related spatial reconfiguration while preserving global topological parameters."
The above figure is a demonstration model by Sporns and Honey of the relationship of structural to functional connectivity networks consisting of a set of 1,600 modeled neural mean field units arranged on a sphere and engaging in noise-driven spontaneous activity. (A) The anatomical connection pattern, shown only for a few randomly selected neural units, consists of a mix of mostly local (clustered) connections and a few connections made over longer distances. (B) A snapshot and an EEG-like recording trace of the dynamical neuronal activity pattern. Neuronal dynamics is characterized by complex spatial and temporal structure across multiple scales [Click here to see a supporting movie]. (C) A functional connectivity network obtained from a thresholded correlation matrix calculated from the dynamics shown in B. In this example, both structural and functional connectivity patterns exhibit small-world attributes.
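For readers who want a more concrete feel for these measures, here is a minimal Python sketch (using numpy and networkx) of the general recipe described above: derive a "functional connectivity" network by thresholding a correlation matrix of simulated signals, then compare its clustering and characteristic path length with lattice and random benchmarks. The coupling model and the 0.4 correlation cutoff are illustrative assumptions of mine, not the Bassett et al. or Sporns/Honey analysis.

```python
# Minimal sketch: structural small-world graph -> simulated signals ->
# thresholded correlation matrix -> functional network -> small-world metrics.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n, k, p, n_samples = 60, 4, 0.1, 5000

# 1) Structural connectivity: a Watts-Strogatz small-world graph
#    (mostly local ring connections plus a few rewired long-range links).
structural = nx.watts_strogatz_graph(n, k, p, seed=0)
A = nx.to_numpy_array(structural)

# 2) Stand-in for noise-driven dynamics: each node's signal is its own noise
#    plus a contribution from the noise of its structural neighbors.
noise = rng.standard_normal((n, n_samples))
signals = noise + 0.5 * A @ noise

# 3) Functional connectivity: threshold the pairwise correlation matrix.
corr = np.corrcoef(signals)
np.fill_diagonal(corr, 0.0)
functional = nx.from_numpy_array((corr > 0.4).astype(int))  # 0.4 is an arbitrary cutoff

# 4) Small-world attributes: high clustering (like a lattice) together with
#    short path length (like a random graph).
def report(name, g):
    giant = g.subgraph(max(nx.connected_components(g), key=len))
    print(f"{name:11s} clustering = {nx.average_clustering(g):.3f}   "
          f"path length = {nx.average_shortest_path_length(giant):.2f}")

report("structural", structural)
report("functional", functional)
report("lattice",    nx.watts_strogatz_graph(n, k, 0.0, seed=0))  # fully ordered
report("random",     nx.watts_strogatz_graph(n, k, 1.0, seed=0))  # fully rewired
```

Run as is, the recovered functional network inherits the small-world signature of the structural graph that generated the signals - clustering well above the random benchmark, path length well below the lattice's - which is the qualitative point of the Sporns and Honey demonstration.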
A review of MindBlog
A colleague recently pointed out this review of MindBlog, which I had completely missed. And to lay on another bit of self-promotion, let me remind you that "The Biology of Mind" has received good reviews and is a friendly read.
Tuesday, December 26, 2006
Thought without language - metacognition in animals
The Dec. 15 issue of The New Scientist has an interesting article by Helen Philips, "The Known Unknown," about game playing in monkeys and dolphins that sheds light on their 'thinking about thinking' - knowing what they don't know - which appears to be a key step in the transition to full consciousness. Here is a nice graphic from that article.
Blog Categories:
animal behavior,
consciousness,
unconscious
In defense of order....
Check out this response by Jessica Duquette to the NY Times article on disorder that was the subject of yesterday's post. She argues that "Neat is fluid and dynamic, not prissy and stuck," and cites another response to Green's NY Times essay: "There is a difference in a stagnate mess and an active mess. A desk with papers and notes changing daily shows activity. Mess that accumulates and stagnates is a sign of incompletion and unwillingness to go back over something, like cleaning up desk at the end of the day. I have found that things need to get moved around but also need a place to be so a person can find them when needed. That is a time saver, not a time waster. Not being able to find a tool to fix the light switch or hang a coat rack only adds to problem. Then you end up looking for your coat in dark when you’re in a hurry."
Monday, December 25, 2006
Disorder as the detritus of a creative mind...
This is the subtitle of a recent essay in the New York Times, "Saying Yes to Mess", by Penelope Green. Being a tidy control freak (while my partner generates entropy and piles), I can't resist passing on some clips:
In the face of a booming home-organizing market ($5.9 billion last year), "An anti-anticlutter movement is afoot, one that says yes to mess and urges you to embrace your disorder. Studies are piling up that show that messy desks are the vivid signatures of people with creative, limber minds (who reap higher salaries than those with neat “office landscapes”) and that messy closet owners are probably better parents and nicer and cooler than their tidier counterparts. It’s a movement that confirms what you have known, deep down, all along: really neat people are not avatars of the good life; they are humorless and inflexible prigs, and have way too much time on their hands."
David Freedman and Eric Abrahamson, in their forthcoming book "A Perfect Mess: The Hidden Benefits of Disorder," "describe the properties of mess in loving terms. Mess has resonance, they write, which means it can vibrate beyond its own confines and connect to the larger world. It was the overall scumminess of Alexander Fleming’s laboratory that led to his discovery of penicillin, from a moldy bloom in a petri dish he had forgotten on his desk....The book is a meandering, engaging tour of beneficial mess and the systems and individuals reaping those benefits, like Gov. Arnold Schwarzenegger, whose mess-for-success tips include never making a daily schedule."
"In the semiotics of mess, desks may be the richest texts. Messy-desk research borrows from cognitive ergonomics, a field of study dealing with how a work environment supports productivity. Consider that desks, our work landscapes, are stand-ins for our brains, and so the piles we array on them are “cognitive artifacts,” or data cues, of our thoughts as we work.
To a professional organizer brandishing colored files and stackable trays, cluttered horizontal surfaces are a horror; to cognitive psychologists like Jay Brand, who works in the Ideation Group of Haworth Inc., the huge office furniture company, their peaks and valleys glow with intellectual intent and showcase a mind whirring away: sorting, linking, producing. (By extension, a clean desk can be seen as a dormant area, an indication that no thought or work is being undertaken.)
His studies and others, like a survey conducted last year by Ajilon Professional Staffing, in Saddle Brook, N.J., which linked messy desks to higher salaries (and neat ones to salaries under $35,000), answer Einstein’s oft-quoted remark, “If a cluttered desk is a sign of a cluttered mind, of what, then, is an empty desk?”
Friday, December 22, 2006
The giving season: food and money - an evolutionary link?
This short note from the Editors' Choice section in the Dec. 22 issue of Science:
"Although the giving of gifts is a common activity at this time of year, giving a gift certificate has become an allowable substitute for giving money, which is generally regarded as unseemly. In order to explore whether money can serve not only as a useful instrument (for the purchase of material goods) but also as a valued resource, Briers et al. (Psychol. Sci. 17, 939 (2006) have carried out a series of experiments to see whether an unfulfilled desire for food (or money) might make one more tight-fisted (or more voracious). People who were hungry behaved less generously toward a charity (Médecins Sans Frontières) and in public goods games than those who had just eaten cake; conversely, people who were told to imagine being desirous of a substantial payoff (being in such a state was confirmed by how much their estimates of the size of a coin were skewed to be larger than actual) consumed more M&M's than those who were focused on a modest windfall. These results linking the rewarding character of food to that of money dovetail neatly with a recent study (Vohs et al., Science, Reports, p. 1154, 17 November 2006) that demonstrated money's value as a means of enhancing one's self-sufficiency and social independence.
Here is the abstract from Briers et al.:
This report attempts to provide an evolutionary explanation for humans' motivation to strive for money in present-day societies. We propose that people's desire for money is a modern derivate of their desire for food. In three studies, we show the reciprocal association between the incentive value of food and of money. In Study 1, hungry participants were less likely than satiated participants to donate to charity. In Study 2, participants in a room with an olfactory food cue, known to increase the desire to eat, offered less money in a give-some game compared with participants in a room free of scent. In Study 3, participants' desire for money affected the amount of M&M's® they ate in a subsequent taste test, but only among participants who were not restricting their food intake in order to manage their weight.
"Although the giving of gifts is a common activity at this time of year, giving a gift certificate has become an allowable substitute for giving money, which is generally regarded as unseemly. In order to explore whether money can serve not only as a useful instrument (for the purchase of material goods) but also as a valued resource, Briers et al. (Psychol. Sci. 17, 939 (2006) have carried out a series of experiments to see whether an unfulfilled desire for food (or money) might make one more tight-fisted (or more voracious). People who were hungry behaved less generously toward a charity (Médecins Sans Frontières) and in public goods games than those who had just eaten cake; conversely, people who were told to imagine being desirous of a substantial payoff (being in such a state was confirmed by how much their estimates of the size of a coin were skewed to be larger than actual) consumed more M&M's than those who were focused on a modest windfall. These results linking the rewarding character of food to that of money dovetail neatly with a recent study (Vohs et al., Science, Reports, p. 1154, 17 November 2006) that demonstrated money's value as a means of enhancing one's self-sufficiency and social independence.
Here is the abstract from Briers et al.:
This report attempts to provide an evolutionary explanation for humans' motivation to strive for money in present-day societies. We propose that people's desire for money is a modern derivate of their desire for food. In three studies, we show the reciprocal association between the incentive value of food and of money. In Study 1, hungry participants were less likely than satiated participants to donate to charity. In Study 2, participants in a room with an olfactory food cue, known to increase the desire to eat, offered less money in a give-some game compared with participants in a room free of scent. In Study 3, participants' desire for money affected the amount of M&M's® they ate in a subsequent taste test, but only among participants who were not restricting their food intake in order to manage their weight.
During sleep: a brain memory dialogue
It has been known for some time that specific patterns of nerve firing in "place cells" of the rat hippocampus occur while a rat learns to run a maze, and that these patterns are replayed during sleep, apparently as part of memory consolidation. Wilson's laboratory at M.I.T. (reporting in Nature Neuroscience) has now studied multicell spiking patterns in both the visual cortex and hippocampus during slow-wave sleep in rats. As Nicholas Wade notes in the NYTimes, the recordings capture a dialogue between the hippocampus, where initial memories of the day's events are formed, and the neocortex, the sheet of neurons on the outer surface of the brain that mediates conscious thought and contains long-term memories.
Ji and Wilson found that spiking patterns not only in the visual cortex but also in the hippocampus were organized into frames, defined as periods of stepwise increase in neuronal population activity. The multicell firing sequences evoked by awake experience were replayed during these frames in both regions. Furthermore, replay events in the sensory cortex and hippocampus were coordinated to reflect the same experience. These results imply simultaneous reactivation of coherent memory traces in the cortex and hippocampus during sleep that may contribute to or reflect the result of the memory consolidation process. Because the fast rewinds in the neocortex tended to occur fractionally sooner than their counterparts in the hippocampus, Wilson thinks the dialogue is probably being initiated by the neocortex, and reflects a querying of the hippocampus's raw memory data.
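As a rough illustration of the two analysis ideas in that paragraph - detecting "frames" as periods of elevated population spiking, and asking whether the firing order inside a frame matches the order seen during awake behavior - here is a minimal Python sketch. The binning, threshold, and rank-correlation measure are my own simplifying assumptions, not the pipeline Ji and Wilson actually used.

```python
# A minimal sketch, not the authors' actual pipeline: (1) find "frames" as
# contiguous stretches of elevated summed population spiking, and (2) score
# "replay" by rank-correlating each cell's peak-firing time inside a frame
# with that cell's position in the awake firing sequence.
import numpy as np
from scipy.stats import spearmanr

def detect_frames(spike_counts, threshold):
    """spike_counts: (n_cells, n_bins) binned spike counts during sleep.
    Returns (start, end) bin indices of runs where the summed population
    activity exceeds `threshold`."""
    population = spike_counts.sum(axis=0)
    active = population > threshold
    frames, start = [], None
    for i, is_active in enumerate(active):
        if is_active and start is None:
            start = i
        elif not is_active and start is not None:
            frames.append((start, i))
            start = None
    if start is not None:
        frames.append((start, len(active)))
    return frames

def replay_score(spike_counts, frame, awake_order):
    """awake_order: cell indices listed in the order they fired during the
    awake experience. Returns Spearman rho between that order and the order
    in which cells reach peak firing within the sleep frame."""
    start, end = frame
    peak_bins = spike_counts[:, start:end].argmax(axis=1)  # when each cell peaks
    awake_rank = np.argsort(np.asarray(awake_order))       # each cell's awake position
    rho, p = spearmanr(peak_bins, awake_rank)
    return rho, p

# Toy usage: 20 cells, 1000 sleep bins of Poisson background activity.
rng = np.random.default_rng(0)
sleep_counts = rng.poisson(0.5, size=(20, 1000))
frames = detect_frames(sleep_counts, threshold=14)
awake_order = list(range(20))  # e.g., order of place-field traversal
if frames:
    longest = max(frames, key=lambda f: f[1] - f[0])
    rho, p = replay_score(sleep_counts, longest, awake_order)
    print(f"{len(frames)} frames; longest-frame replay rho = {rho:.2f}")
```

The real analysis is of course richer - it compares coordinated frames across cortex and hippocampus and their relative timing - but this captures the basic template-matching intuition behind "replay."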
Wade's review quotes comments from Wilson:
“The neocortex is essentially asking the hippocampus to replay events that contain a certain image, place or sound...The neocortex is trying to make sense of what is going on in the hippocampus and to build models of the world, to understand how and why things happen...These models are presumably used to direct behavior...They are able to generate expectations about the world and plausibly fill in blanks in memory.
Though the neocortex learns from the hippocampus, the raw memory traces, from childhood onward, are not transferred and are probably retained in the hippocampus... If so, the forgetfulness of age would arise because of problems in accessing the hippocampus, not because the data has vanished.
The subject matter of the neocortex-hippocampus dialogue in rats seems mostly to concern recent events. This is consistent with what people report when awoken from nondreaming sleep — usually small snatches of information about recent events. Dr. Wilson also said that the new findings, by showing activity in the visual neocortex, confirmed that rats had humanlike dreams with visual imagery, a possibility some researchers had doubted."
Thursday, December 21, 2006
Mind Wars
I would like to recommend to you an interesting, authoritative, and well-written book on the massive amount of brain research being conducted by the United States defense establishment, relevant to:
- "bulding better humans" (for war purposes)
- controlling human behaviors through chemical or other means (DARPA funded early LSD experiments, and its ARPANET project was the precursor of the internet)
- "mind-reading" using imaging techniques
- brain-machine interfaces, 'borgs' (machine-human hybrids)
- improving battlefield survivability, making "sleepless" soldiers, etc.
The book is "Mind Wars: Brain Research and National Defense" by Jonathan D. Moreno, who holds a chair professorship and is Director of the Center for Biomedical Ethics at the University of Virginia. He does not argue for a separation of the academic research world and the national security establishment, but thinks that much more effort should go into formulating an "ethics of neurosecurity and neurodefense."
- "bulding better humans" (for war purposes)
- controlling human behaviors through chemical or other means (DARPA funded early LSD experiments and Darpanet was the first name for the internet)
- "mind-reading" using imaging techniques
- brain-machine interfaces, 'borgs' (machine-human hybrids)
- improving battefield survivabiliy, making "sleepless" soliders, etc. etc.
The book is "Mind Wars: Brain Research and National Defense" by Jonathan D. Moreno, who holds a chair professorship and is Director of the Center for Biomedical Ethics at the University of Virginia. He does not argue for a separation of the academic research world and the national security establishment, but thinks that much more effort should go into formulating an "ethics of neurosecurity and neurodefense."
Wednesday, December 20, 2006
Abolishing pain with a single sodium channel mutation
Even though this blog is mainly about nervous systems, brains, and behaviors, I'm occasionally drawn back to my roots as a molecular biologist by a particularly outstanding example of how a single molecule can determine what we take to be a very complex experience - in this case the experience of pain. Cox et al. have found that a mutation in the gene for a particular sodium channel subunit (SCN9A, which encodes the Nav1.7 channel), strongly expressed in the pain-sensitive endings of nociceptive (pain-sensing and transmitting) neurons, can abolish the ability to feel pain. The rare mutation was found in several individuals from related families in northern Pakistan who were unable to experience pain. This work should stimulate the search for novel analgesics that selectively target this sodium channel subunit.
The unusual human situation that permitted this work is described: "The index case for the present study was a ten-year-old child, well known to the medical service after regularly performing 'street theatre'. He placed knives through his arms and walked on burning coals, but experienced no pain. He died before being seen on his fourteenth birthday, after jumping off a house roof. Subsequently, we studied three further consanguineous families in which there were individuals with similar histories of a lack of pain appreciation, each originating from northern Pakistan and part of the Qureshi birdari/clan. All six affected individuals had never felt any pain, at any time, in any part of their body...All had injuries to their lips (some requiring later plastic surgery) and/or tongue (with loss of the distal third in two cases), caused by biting themselves in the first 4 yr of life. All had frequent bruises and cuts, and most had suffered fractures or osteomyelitis, which were only diagnosed in retrospect because of painless limping or lack of use of a limb. The children were considered of normal intelligence by their parents and teachers, and by the caring physicians."
Tuesday, December 19, 2006
When the "why?" isn't crucial...
I would like to point you to a brief article by Sally Satel in today's New York Times Science Section that mirrors my own skepticism about the usefulness of insight into how a maladaptive behavior, such as drug use or over-eating, might originally have started. Insisting on finding a cause can become an excuse for not working on changing a maladaptive behavior, and knowing a cause doesn't guarantee that the behavior will change. There is no convincing data for the effectiveness of insight therapy, while there is such data for cognitive therapy, which trains one to notice when a maladaptive pattern starts up and to choose to do something else. Satel says "It is time to retire the myth that insight is a prerequisite for change," and she offers two case studies:
"...the grail-like search for insight can backfire when it becomes a way for patients to avoid the hard work of change. This was my experience with Joe, a 24-year-old heroin addict. At every session, Joe would talk about his childhood relationship with his father, seeking new clues for how it damaged him and drove him to heroin...When I tried to change the topic to on-the-job stresses, which he linked to heroin craving, he said he’d rather “do psychotherapy.” Joe was forestalling the need to make practical changes. The many-layered drama with his dad doubled as an excuse for using heroin, absolving him of the responsibility to quit. When I proposed that possibility to him, he said, “Maybe you’re right.” But nothing really changed. He died of an accidental overdose a few months later."
"..insight has no guaranteed relationship to change. A colleague of mine treated a 45-year-old woman, Joan, who came for therapy because she hated her chunky body. Joan firmly believed that once she discovered The Reason for her overeating she would stop...After a few months, Joan told my colleague that her father had developed cancer the year she went off to college...“You know, I never made the connection until now,” she announced triumphantly, “but I started overeating when he began to waste away. It’s like I was trying to nourish him through myself.” ..A poignant metaphor, yes, but months later she hasn’t lost a pound."
"...the grail-like search for insight can backfire when it becomes a way for patients to avoid the hard work of change. This was my experience with Joe, a 24-year-old heroin addict. At every session, Joe would talk about his childhood relationship with his father, seeking new clues for how it damaged him and drove him to heroin...When I tried to change the topic to on-the-job stresses, which he linked to heroin craving, he said he’d rather “do psychotherapy.” Joe was forestalling the need to make practical changes. The many-layered drama with his dad doubled as an excuse for using heroin, absolving him of the responsibility to quit. When I proposed that possibility to him, he said, “Maybe you’re right.” But nothing really changed. He died of an accidental overdose a few months later."
"..insight has no guaranteed relationship to change. A colleague of mine treated a 45-year-old woman, Joan, who came for therapy because she hated her chunky body. Joan firmly believed that once she discovered The Reason for her overeating she would stop...After a few months, Joan told my colleague that her father had developed cancer the year she went off to college...“You know, I never made the connection until now,” she announced triumphantly, “but I started overeating when he began to waste away. It’s like I was trying to nourish him through myself.” ..A poignant metaphor, yes, but months later she hasn’t lost a pound."