Friday, October 24, 2008

How we know our own minds...

For those of you who might be more heavily into philosophy of mind and introspective psychology, I want to pass along this PDF of a draft article that is to appear in Behavioral and Brain Sciences, "How we know our own minds: the relationship between mindreading and metacognition." Carruthers defends the idea that our knowledge of our own attitudes results from turning our mindreading capacities upon ourselves, not from introspection for propositional attitudes. Here is his abstract, showing the organization of his arguments:
Four different accounts of the relationship between third-person mindreading and first-person metacognition are compared and evaluated. While three of them endorse the existence of introspection for propositional attitudes, the fourth (defended here) claims that our knowledge of our own attitudes results from turning our mindreading capacities upon ourselves. Section 1 introduces the four accounts. Section 2 develops the “mindreading is prior” model in more detail, showing how it predicts introspection for perceptual and quasi-perceptual (e.g. imagistic) mental events while claiming that metacognitive access to our own attitudes always results from swift unconscious self-interpretation. It also considers the model’s relationship to the expression of attitudes in speech. Section 3 argues that the commonsense belief in the existence of introspection should be given no weight. Section 4 argues briefly that data from childhood development are of no help in resolving this debate. Section 5 considers the evolutionary claims to which the different accounts are committed, and argues that the three introspective views make predictions that aren’t borne out by the data. Section 6 examines the extensive evidence that people often confabulate when self-attributing attitudes. Section 7 considers “two systems” accounts of human thinking and reasoning, arguing that although there are introspectable events within System 2, there are no introspectable attitudes. Section 8 examines alleged evidence of “unsymbolized thinking”. Section 9 considers the claim that schizophrenia exhibits a dissociation between mindreading and metacognition. Finally, Section 10 evaluates the claim that autism presents a dissociation in the opposite direction, of metacognition without mindreading.

Thursday, October 23, 2008

Different aspects of human intelligence correlate with cortical thickness versus neural activation.

Choi et al. report an interesting study in the Journal of Neuroscience. I'm passing on the abstract and a bit of explanation of crystallized versus fluid intelligence, but not the usual flashy fMRI graphics:
We hypothesized that individual differences in intelligence (Spearman's g) are supported by multiple brain regions, and in particular that fluid (gF) and crystallized (gC) components of intelligence are related to brain function and structure with a distinct profile of association across brain regions. In 225 healthy young adults scanned with structural and functional magnetic resonance imaging sequences, regions of interest (ROIs) were defined on the basis of a correlation between g and either brain structure or brain function. In these ROIs, gC was more strongly related to structure (cortical thickness) than function, whereas gF was more strongly related to function (blood oxygenation level-dependent signal during reasoning) than structure. We further validated this finding by generating a neurometric prediction model of intelligence quotient (IQ) that explained 50% of variance in IQ in an independent sample. The data compel a nuanced view of the neurobiology of intelligence, providing the most persuasive evidence to date for theories emphasizing multiple distributed brain regions differing in function.
As background:
gC, sometimes described as verbal ability, is more dependent on accumulated knowledge in long-term storage, including semantic memory. gF refers to reasoning ability, and is known to depend on working memory. Although gC and gF are typically correlated and can be considered subfactors of g (Jensen), they are conceptually and empirically separable. For instance, gC continues to increase over the lifespan, but gF peaks in early adulthood and then declines. Furthermore, at the neural level, lesion studies demonstrated that patients with anterior temporal damage perform poorly on tests of semantic knowledge, whereas prefrontal patients typically show profound deficits in solving diverse reasoning tasks.
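For a concrete sense of what the "neurometric prediction model" mentioned in the abstract involves, here is a minimal Python sketch of the general train-then-validate logic: fit a regression from per-ROI brain measures to IQ in one sample, then test it on an independent sample. The synthetic data, the number of ROIs, and the plain linear regression are my illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch of a "neurometric" prediction model: train on one
# sample, validate on an independent one. All data are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n_rois = 8                            # hypothetical g-correlated ROIs
w = rng.normal(size=n_rois)           # latent ROI-to-IQ weights (toy)

def simulate(n):
    """Hypothetical per-subject ROI measures (thickness / BOLD) and IQ."""
    X = rng.normal(size=(n, n_rois))
    iq = 100 + 3 * X @ w + rng.normal(scale=5, size=n)
    return X, iq

X_train, iq_train = simulate(225)     # discovery sample, n=225 as in the study
X_test, iq_test = simulate(100)       # independent validation sample (toy size)

model = LinearRegression().fit(X_train, iq_train)
print(f"out-of-sample variance in IQ explained: "
      f"{r2_score(iq_test, model.predict(X_test)):.2f}")
```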

Sleep accelerates improvement in working memory.

I've mentioned the n-back task for improving working memory and intelligence in several previous posts. Kuriyama et al. now use this task to show that post-training sleep significantly enhances this improvement:
Working memory (WM) performance, which is an important factor for determining problem-solving and reasoning ability, has been firmly believed to be constant. However, recent findings have demonstrated that WM performance has the potential to be improved by repetitive training. Although various skills are reported to be improved by sleep, the beneficial effect of sleep on WM performance has not been clarified. Here, we show that improvement in WM performance is facilitated by posttraining naturalistic sleep. A spatial variant of the n-back WM task was performed by 29 healthy young adults who were assigned randomly to three different experimental groups that had different time schedules of repetitive n-back WM task sessions, with or without intervening sleep. Intergroup and intersession comparisons of WM performance (accuracy and response time) profiles showed that n-back accuracy after posttraining sleep was significantly improved compared with that after the same period of wakefulness, independent of sleep timing, subject's vigilance level, or circadian influences. On the other hand, response time was not influenced by sleep or repetitive training schedules. The present study indicates that improvement in n-back accuracy, which could reflect WM capacity, essentially benefits from posttraining sleep.
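To make the task concrete, here is a minimal sketch of how a spatial n-back sequence can be generated and accuracy scored. The number of grid positions, target rate, and trial count are illustrative assumptions, not Kuriyama et al.'s exact protocol.

```python
# Toy spatial n-back: a stimulus is a "target" when its grid position
# matches the position shown n trials earlier.
import random

def make_nback_sequence(n_trials=30, n=2, n_positions=8, p_target=0.3):
    """Sequence of grid positions; some deliberately repeat the one n back."""
    seq = []
    for i in range(n_trials):
        if i >= n and random.random() < p_target:
            seq.append(seq[i - n])               # planted n-back target
        else:
            seq.append(random.randrange(n_positions))
    return seq

def accuracy(seq, responses, n=2):
    """responses[i] is True if the subject pressed 'match' on trial i."""
    scored = [(responses[i] == (seq[i] == seq[i - n]))
              for i in range(n, len(seq))]
    return sum(scored) / len(scored)

seq = make_nback_sequence()
perfect = [i >= 2 and seq[i] == seq[i - 2] for i in range(len(seq))]
print(f"accuracy of a perfect responder: {accuracy(seq, perfect):.2f}")  # 1.00
```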

Wednesday, October 22, 2008

Cognitive therapy versus medication for depression

DeRubeis et al. offer an interesting review article in Nature Reviews Neuroscience on treatment outcomes and neural mechanisms, from which I pass on part of the abstract and some summary graphs:
Studies have shown that cognitive therapy is as efficacious as antidepressant medication at treating depression, and it seems to reduce the risk of relapse even after its discontinuation. Cognitive therapy and antidepressant medication probably engage some similar neural mechanisms, as well as mechanisms that are distinctive to each.


Cognitive therapy and antidepressant medication have comparable effects. This graph shows the response of outpatients who had moderate-to-severe depression to cognitive therapy (CT), antidepressant medication (ADM) or placebo. Patients who were assigned to ADM or to CT showed a significantly higher response rate after 8 weeks of treatment than those who were assigned to placebo. After 16 weeks of treatment the response rates of ADM and CT were almost identical.


Less relapse after cognitive therapy compared with antidepressant medication. The second phase of the parent antidepressant medication (ADM) versus cognitive therapy (CT) study followed patients who had responded to ADM or to CT. Patients who had responded to ADM were randomly assigned either to continue ADM treatment for 1 year (beige and red lines) or to change to placebo treatment for 1 year (green line). Patients who responded to CT were allowed three sessions of CT during the 1-year continuation period. In the follow-up period, none of the patients received any treatment. The figure shows that prior treatment with CT protected against relapse of depression at least as well as the continued provision of ADM, and better than ADM treatment that was subsequently discontinued. Note that the patient group that was given ADM in the continuation year contained a number of patients who did not adhere to the medication regimen. The red line indicates the response of the ADM-continuation group including these non-compliant patients, whereas the beige line shows the response of the patients in this group after the non-compliant patients had been removed from the analysis.

After a graphic showing changes in blood-oxygen-level-dependent (BOLD) signal in response to cognitive and emotional tasks associated with cognitive therapy, the authors offer a summary hypothetical time course of the changes to amygdala and prefrontal function that are associated with antidepressant medication and cognitive therapy.


a | During acute depression, amygdala activity is increased (red) and prefrontal activity is decreased (blue) relative to activity in these regions in healthy individuals. b | Cognitive therapy (CT) effectively exercises the prefrontal cortex (PFC), yielding increased inhibitory function of this region. c | Antidepressant medication (ADM) targets amygdala function more directly, decreasing its activity. d | After ADM or CT, amygdala function is decreased and prefrontal function is increased. The double-headed arrow between the amygdala and the PFC represents the bidirectional homeostatic influences that are believed to operate in healthy individuals.

Our somatosensory cortex embodies the facial expressions of others

The Editors' Choice section of Science describes an interesting bit of work by Pitcher et al. showing the embodiment of our social cognition:
Humans are especially interested in faces, as a means of sending signals--witness the sizeable arc of somatosensory cortex devoted to representation of one's own face--and as a substrate for social cognition. Pitcher et al. describe results supporting theories of embodied cognition and emotion, which posit cognition and emotion as being shaped by our bodily movements and perceptions. They used repetitive transcranial magnetic stimulation (rTMS) to interfere with neural activity in the face areas of the somatosensory cortex while people discriminated the emotional expressions of faces (happy, sad, surprised, fearful, angry, and disgusted) and found that accuracy dropped significantly, as it also did when the occipital face area was similarly stimulated. The temporal sequence of neural processing was then delineated using double-pulse TMS, showing that the occipital area acted in the time window from 60 to 100 ms after the face stimulus was shown, whereas the somatosensory area was active a bit later, between 100 and 170 ms.

Tuesday, October 21, 2008

Botnets

Here is a scary article.

Evolution of Religious Prosociality

Norenzayan and Shariff offer an interesting review article on empirical evidence for religious prosociality. Here is one clip and two figures from the article.
Agreement is emerging that selective pressures over the course of human evolution can explain the wide cross-cultural re-occurrence, historical persistence, and predictable cognitive structure of religious beliefs and behaviors. The tendency to detect agency in nature likely supplied the cognitive template that supports the pervasive belief in supernatural agents. These agents are widely believed to transcend physical, biological, and psychological limitations. However, other important details are subject to cultural variation. Although in many societies supernatural agents are not directly concerned with human morality, in many others, morally concerned agents use their supernatural powers to observe and, in some cases, to punish and reward human social interactions. Examples include the God of Abrahamic religions and Viracocha, the Incan supreme God, but also many morally concerned deities found in traditional societies, such as the adalo, ancestral spirits of the Kwaio Solomon islanders. These beliefs are likely to spread culturally to the extent that they facilitate ingroup cooperation. This could occur by conforming to individual psychology that favors reputation-sensitive prosocial tendencies, as the by-product account holds; by competition among social groups, as the cultural group selection account would suggest; or possibly by some combination of the two. Religious behaviors and rituals, if more costly to cooperating group members than to freeloaders, may have reliably signaled the presence of devotion and, therefore, cooperative intention toward ingroup members, in turn, buffering religious groups against defection from freeloaders and reinforcing cooperative norms. Religious prosociality, thus, may have softened the limitations that kinship-based and (direct or indirect) reciprocity-based altruism place on group size. In this way, the cultural spread of religious prosociality may have facilitated the rise of stable, large, cooperative communities of genetically unrelated individuals.


Figure - Implicit activation of God concepts, relative to a neutral prime, increased offers in the one-shot, anonymous Dictator Game. Priming secular concepts indicating moral authority had a similar effect. The results showed not only a quantitative increase in generosity, but also a qualitative shift in social norms. In the control group, the modal response was selfishness: a plurality of players pocketed all $10. In the God group, the mode shifted to fairness: a plurality of players split the money evenly (N = 75). It remains to be seen, however, whether these effects would occur if the recipient was clearly marked as an outgroup member.


Figure - Life expectancy of religious versus secular communes. An analysis of 200 religious and secular communes in 19th-century America (29) found that, for every year of their life course, religious communes were about four times as likely to survive as their secular counterparts. This difference remained after statistically controlling for type of commune movement, year founded, and year at risk of dissolution (the last control assesses major historical trends that may independently impact commune dissolution).
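For the statistically inclined, the kind of analysis behind this figure can be sketched with a Cox proportional-hazards model, which estimates dissolution risk while adjusting for covariates. The snippet below uses the Python lifelines library on synthetic stand-in data; the column names and the built-in fourfold hazard difference are assumptions for illustration, not the study's actual dataset.

```python
# Sketch: Cox model relating religiosity to commune dissolution risk
# while controlling for founding year. Data are synthetic stand-ins.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 200
religious = rng.integers(0, 2, n)
year_founded = rng.integers(1800, 1890, n)
# Toy lifetimes: religious communes dissolve at ~1/4 the yearly rate.
rate = np.where(religious == 1, 0.02, 0.08)
duration = rng.exponential(1 / rate)

df = pd.DataFrame({
    "duration": duration,            # years until dissolution
    "dissolved": 1,                  # all communes eventually dissolved
    "religious": religious,
    "year_founded": year_founded,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="dissolved")
cph.print_summary()                  # hazard ratio for 'religious' ~ 0.25
```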

Redefining Depression as Mere Sadness

An article by Pies with the title of this post is worth reading. It deals with the criticism that modern psychiatric practice, in collusion with pill-pushing pharmaceutical companies, has medicalized “normal sadness” brought on by external circumstances. (Added note: Pies has emailed this link to a more detailed discussion posted on the PsychCentral website.) Here are some clips from the NYTimes article:
In their recent book “The Loss of Sadness” (Oxford, 2007), Allan V. Horwitz and Jerome C. Wakefield assert that for thousands of years, symptoms of sadness that were “with cause” were separated from those that were “without cause.” Only the latter were viewed as mental disorders.

With the advent of modern diagnostic criteria, these authors argue, doctors were directed to ignore the context of the patient’s complaints and focus only on symptoms — poor appetite, insomnia, low energy, hopelessness and so on. The current criteria for major depression, they say, largely fail to distinguish between “abnormal” reactions caused by “internal dysfunction” and “normal sadness” brought on by external circumstances. And they blame vested interests — doctors, researchers, pharmaceutical companies — for fostering this bloated concept of depression.

But while this increasingly popular thesis contains a kernel of truth, it conceals a bushel basket of conceptual and scientific problems.

For one thing, if modern diagnostic criteria were converting mere sadness into clinical depression, we would expect the number of new cases of depression to be skyrocketing compared with rates in a period like the 1950s to the 1970s. But several new studies in the United States and Canada find that the incidence of serious depression has held relatively steady in recent decades.

Second, it may seem easy to determine that someone with depressive complaints is reacting to a loss that touched off the depression. Experienced clinicians know this is rarely the case.

Third, and perhaps most troubling, is the implication that a recent major loss makes it more likely that the person’s depressive symptoms will follow a benign and limited course, and therefore do not need medical treatment. This has never been demonstrated, to my knowledge, in any well-designed studies. And what has been demonstrated, in a study by Dr. Sidney Zisook, is that antidepressants may help patients with major depressive symptoms occurring just after the death of a loved one.

Monday, October 20, 2008

The iPathology of your iBrain

Small and Vorgan offer an engaging article in Scientific American Mind on how daily exposure to television, computers, smart phones, video games, search engines and web browsers is rewiring our brains. A modern generation is rising with brains that are very different from the brains of those of us whose basic brain wiring was laid down in a time when direct social interactions were more the norm. One of the authors (Small) compared brain activities in computer-savvy versus computer-naive 50-60 year olds while searching for accurate information on a topic using Google, subtracting activity associated with just reading a text to determine activity specific to the searching function (which was the same in the two groups). In the baseline scanning session during searching on Google, the computer-savvy subjects engaged their dorsolateral prefrontal cortex while the Internet-naive subjects showed minimal activation in this region. After just five days of practice, the exact same neural circuitry in the front part of the brain became active in the Internet-naive subjects. Five hours on the Internet, and these participants had already rewired their brains.
Our high-tech revolution has plunged us into a state of “continuous partial attention,” which software executive Linda Stone, who coined the term in 1998, describes as continually staying busy—keeping tabs on everything while never truly focusing on anything. Continuous partial attention differs from multitasking, wherein we have a purpose for each task and we are trying to improve efficiency and productivity. Instead, when our minds partially attend, and do so continuously, we scan for an opportunity for any type of contact at every given moment. We virtually chat as our text messages flow, and we keep tabs on active buddy lists (friends and other screen names in an instant message program); everything, everywhere, is connected through our peripheral attention. Although having all our pals online from moment to moment seems intimate, we risk losing personal touch with our real-life relationships and may experience an artificial sense of intimacy as compared with when we shut down our devices and devote our attention to one individual at a time.

When paying continuous partial attention, people may place their brain in a heightened state of stress. They no longer have time to reflect, contemplate or make thoughtful decisions. Instead they exist in a sense of constant crisis—on alert for a new contact or bit of exciting news or information at any moment. Once people get used to this state, they tend to thrive on the perpetual connectivity. It feeds their ego and sense of self-worth, and it becomes irresistible. Neuroimaging studies suggest that this sense of self-worth may protect the size of the hippocampus—the horseshoe-shaped brain region in the medial (inward-facing) temporal lobe, which allows us to learn and remember new information. Psychiatry professor Sonia J. Lupien and her associates at McGill University studied hippocampal size in healthy younger and older adult volunteers. Measures of self-esteem correlated significantly with hippocampal size, regardless of age. They also found that the more people felt in control of their lives, the larger the hippocampus.

But at some point, the sense of control and self-worth we feel when we maintain continuous partial attention tends to break down—our brains were not built to sustain such monitoring for extended periods. Eventually the hours of unrelenting digital connectivity can create a unique type of brain strain. Many people who have been working on the Internet for several hours without a break report making frequent errors in their work. On signing off, they notice feeling spaced out, fatigued, irritable and distracted, as if they are in a “digital fog.” This new form of mental stress, what Small terms “techno-brain burnout,” is threatening to become an epidemic. Under this kind of stress, our brains instinctively signal the adrenal gland to secrete cortisol and adrenaline. In the short run, these stress hormones boost energy levels and augment memory, but over time they actually impair cognition, lead to depression, and alter the neural circuitry in the hippocampus, amygdala and prefrontal cortex—the brain regions that control mood and thought. Chronic and prolonged techno-brain burnout can even reshape the underlying brain structure.

While the brains of today’s digital natives are wiring up for rapid-fire cyber searches, however, the neural circuits that control the more traditional learning methods are neglected and gradually diminished. The pathways for human interaction and communication weaken as customary one-on-one people skills atrophy. Our U.C.L.A. research team and other scientists have shown that we can intentionally alter brain wiring and reinvigorate some of these dwindling neural pathways, even while the newly evolved technology circuits bring our brains to extraordinary levels of potential.

Bayesian estimation on the presidential election.

From the "Random Samples" section of the Oct. 17 Science Magazine:
In the winner-take-all world of politics, candidates know that even a modest lead in the polls can spell almost certain victory. Sheldon Jacobson, an operations research specialist at the University of Illinois, Urbana-Champaign, and colleagues, including a group of students, have attempted to quantify that insight for the current United States presidential election, putting their predictions for the Electoral College on a Web site, election08.cs.uiuc.edu. Using a statistical method known as Bayesian estimation, they combined an analysis of results from the 2004 Bush-versus-Kerry contest with current state-by-state polls for Obama versus McCain to produce probabilities for each candidate of carrying each state. They then converted the estimates into a probability distribution for the total number of Electoral College votes a candidate might receive. In Indiana, for example, polls as of 4 October gave McCain a slight 2.5% lead. But given that Bush carried Indiana in 2004 by 20.7%, a Bayesian calculation puts McCain's chance of winning the state's 11 Electoral College votes at about 87%. Most states are now in the bag for one candidate or the other; only a handful are truly in Bayesian play. Current calculations give McCain no chance of victory. "However," Jacobson cautions, "if the polls move, then so will our forecasts."
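A toy version of the calculation makes the logic concrete: treat each state's vote margin as Gaussian, combine a 2004-based prior with the current poll (a standard normal-normal Bayesian update), convert the posterior into a win probability, and then combine independent state outcomes into a distribution over Electoral College totals. The prior and poll standard deviations below are my assumptions, and the state list is a toy, so the output only approximates the published 87% figure; the Illinois group's actual model is more elaborate.

```python
# Back-of-the-envelope Bayesian election forecast.
import numpy as np
from scipy.stats import norm

def state_win_prob(prior_margin, prior_sd, poll_margin, poll_sd):
    """Posterior P(margin > 0) under a normal-normal Bayesian update."""
    prec = 1 / prior_sd**2 + 1 / poll_sd**2
    post_mean = (prior_margin / prior_sd**2 + poll_margin / poll_sd**2) / prec
    post_sd = prec ** -0.5
    return 1 - norm.cdf(0, loc=post_mean, scale=post_sd)

# Indiana example from the clip: +20.7 Bush margin in 2004 as the prior,
# +2.5 McCain poll lead now (the two SDs are assumptions).
p_indiana = state_win_prob(20.7, 10.0, 2.5, 3.0)
print(f"P(McCain wins Indiana): {p_indiana:.2f}")

# Electoral College total: combine independent per-state outcomes.
states = [(11, p_indiana), (55, 0.01), (34, 0.95)]   # (EV, win prob), toy
dist = np.array([1.0])
for ev, p in states:
    new = np.zeros(len(dist) + ev)
    new[:len(dist)] += dist * (1 - p)     # lose the state: +0 EV
    new[ev:] += dist * p                  # win the state: +ev EV
    dist = new
print("P(total EV = k):", {k: round(v, 3) for k, v in enumerate(dist) if v > 0})
```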

Friday, October 17, 2008

The rise of the machines

Richard Dooling writes on how a physicist, a wizard and a serial killer warned us of the current financial crisis.
We are living, we have long been told, in the Information Age. Yet now we are faced with the sickening suspicion that technology has run ahead of us. Man is a fire-stealing animal, and we can’t help building machines and machine intelligences, even if, from time to time, we use them not only to outsmart ourselves but to bring us right up to the doorstep of Doom.

We are still fearful, superstitious and all-too-human creatures. At times, we forget the magnitude of the havoc we can wreak by off-loading our minds onto super-intelligent machines, that is, until they run away from us, like mad sorcerers’ apprentices, and drag us up to the precipice for a look down into the abyss.

As the financial experts all over the world use machines to unwind Gordian knots of financial arrangements so complex that only machines can make — “derive” — and trade them, we have to wonder: Are we living in a bad sci-fi movie? Is the Matrix made of credit default swaps?

When Treasury Secretary Paulson (looking very much like a frightened primate) came to Congress seeking an emergency loan, Senator Jon Tester of Montana, a Democrat still living on his family homestead, asked him: “I’m a dirt farmer. Why do we have one week to determine that $700 billion has to be appropriated or this country’s financial system goes down the pipes?”

“Well, sir,” Mr. Paulson could well have responded, “the computers have demanded it.”

Men and women - different gene expression changes in brain on aging

Berchtold et al. pack quite a lot of interesting information into their abstract describing work on sexually dimorphic gene expression changes during human aging. I fall right in the middle of the age range flagged in this sentence: "Prominent change occurred in the sixth to seventh decades across cortical regions, suggesting that this period is a critical transition point in brain aging, particularly in males." Here is the abstract:
Gene expression profiles were assessed in the hippocampus, entorhinal cortex, superior-frontal gyrus, and postcentral gyrus across the lifespan of 55 cognitively intact individuals aged 20–99 years. Perspectives on global gene changes that are associated with brain aging emerged, revealing two overarching concepts. First, different regions of the forebrain exhibited substantially different gene profile changes with age. For example, comparing equally powered groups, 5,029 probe sets were significantly altered with age in the superior-frontal gyrus, compared with 1,110 in the entorhinal cortex. Prominent change occurred in the sixth to seventh decades across cortical regions, suggesting that this period is a critical transition point in brain aging, particularly in males. Second, clear gender differences in brain aging were evident, suggesting that the brain undergoes sexually dimorphic changes in gene expression not only in development but also in later life. Globally across all brain regions, males showed more gene change than females. Further, Gene Ontology analysis revealed that different categories of genes were predominantly affected in males vs. females. Notably, the male brain was characterized by global decreased catabolic and anabolic capacity with aging, with down-regulated genes heavily enriched in energy production and protein synthesis/transport categories. Increased immune activation was a prominent feature of aging in both sexes, with proportionally greater activation in the female brain. These data open opportunities to explore age-dependent changes in gene expression that set the balance between neurodegeneration and compensatory mechanisms in the brain and suggest that this balance is set differently in males and females, an intriguing idea.
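The core statistical move behind a count like "5,029 probe sets significantly altered with age" is a per-probe test followed by multiple-comparison correction. Here is a minimal sketch on synthetic data using a t-test and Benjamini-Hochberg false-discovery-rate control; the group sizes and effect sizes are illustrative assumptions, not the study's design.

```python
# Toy differential-expression screen: test every probe set for an age
# effect, then control the false discovery rate.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(2)
n_probes, n_young, n_old = 5000, 12, 12
young = rng.normal(size=(n_probes, n_young))
old = rng.normal(size=(n_probes, n_old))
old[:400] += 1.0                      # 400 probes truly change with age

t, p = stats.ttest_ind(young, old, axis=1)
reject, p_adj, _, _ = multipletests(p, alpha=0.05, method="fdr_bh")
print(f"probe sets significantly altered with age: {reject.sum()}")
```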

Thursday, October 16, 2008

Embodiment and Art

I just received an email from Andrew Werth regarding the previous post. Andrew is a former software engineer turned artist whose paintings are about perception and embodiment. I thought I would pass on this link to his website, which lets one view paintings in his Embodiment series.

Arguing for Embodied Consciousness

I thought I would pass along portions of Harold Fromm's review in Science, which shares its title with this post, of Edward Slingerland's new book, "What Science Offers the Humanities: Integrating Body and Culture."
...his overall task is to address the befuddled dualism that still dominates most of our intellectual disciplines...Slingerland's central theme is that everything human has evolved in the interests of the materiality of the body. He identifies objectivist realism and postmodern relativity, both insufficiently attentive to the body, as the major epistemologies to be swept away, followed by the dualism of body and soul. For Slingerland, the presiding genii behind such a cleansing are George Lakoff and Mark Johnson, with heavier debts to Johnson [whose terse summary of embodiment in (1) appeared too late for Slingerland to reference]. They view all thought and human behavior as generated by the body and expressed as conceptual metaphors that translate physical categories (such as forward, backward, up, and down) into abstract categories (such as progress, benightedness, divinity, immorality). These body-driven metaphors, Slingerland writes, are a "set of limitations on human cognition, constraining human conceptions of entities, categories, causation, physics, psychology, biology, and other humanly relevant domains."

The supposedly objective world is not "some preexisting object out there in the world, with a set of invariant and observer-independent properties, simply waiting to be found the way one finds a lost sock under the bed." All we can ever see or understand is what our own bodily faculties permit via the current structure of the brain.

In opposition to objective realism, postmodern relativity regards language and culture as constituting the only "real" world possible for us. It posits an endless hall of mirrors with no access to outside--epitomized by Derrida's notorious remark that there is nothing (at least for humans) outside of texts (i.e., culture). This view, which dominated the humanities for several decades, is mercifully beginning to fade as the cognitive sciences have matured and are increasingly promulgated.

Even though the knowing human subject is itself just a thing and not an immaterial locus of reason, the universe it experiences is as real and functional for us as any "thing" could possibly be. We do get some things "right," even if we can never know the noumenal genesis behind our knowledge. And the very concept of noumena (things in themselves independent of any observer) now seems somewhat obsolete, given that the intuition of discrete, self-bounded "things" is as built-in to the human psyche as the Kantian intuitions of space and time, grounding all experience.

Our million billion synapses produce a "person" with the illusion of a self. Slingerland holds that "we are robots designed to be constitutionally incapable of experiencing ourselves and other conspecifics as robots." Our innate and overactive theory of mind (that other people, like ourselves, have "intentions") projects agency onto everything--in the past, even onto stones and trees. The "hard problem" for philosophy of consciousness (to use David Chalmers's phrase) remains: what are thoughts, cogitations, thinkers, qualia? Chalmers's solution, alas, swept away Cartesian dualism only to sneak his own magic spook, conscious experience (for him, on par with mass, charge, and space-time), in through the back door (2, 3).

Slingerland starts with Darwin and eventually follows Daniel Dennett so far as to agree that consciousness can be done full justice through third-person descriptions that require no mysterious, unaccounted-for, nonmaterial, first-person entity as substrate. Thus the famous "Mary," who intellectually knows everything there is to know about color despite having been sequestered for life in a color-free lab, will recognize red the first time she steps outside (4). And Thomas Nagel's famous bats don't know anything about bathood that we can't figure out for ourselves from observation (5). No first-person construct, no locus of consciousness, need be invoked.

The next step, if you want to go so far (the jury is out), is to eliminate consciousness altogether, because there's nothing for it to do that can't be done without it. And with it, you need a spook to keep the show on the road. Choose your insoluble problem: eliminate consciousness altogether as superfluous or explain it (if there's really a you who makes such choices). Slingerland prefers the first option.

His conclusion, which I can hardly do justice to here, is relatively satisfying. He notes that although we don't have great difficulty knowing that Earth revolves around the Sun while feeling that the Sun is rising and setting (Dennett's favorite example of folk psychology), "no cognitively undamaged human being can help acting like and at some level really feeling that he or she is free"--however nonsensical the notion of agencyless free will (i.e., "choices" without a self to make them). Still, once the corrosive acid of Darwinism [to use Dennett's figure from (6)] has resolved the body-mind dualism into body alone, some but not most of us are able "to view human beings simultaneously under two descriptions: as physical systems and as persons."

References

1. M. Johnson, The Meaning of the Body: Aesthetics of Human Understanding (Univ. of Chicago Press, Chicago, 2007).
2. D. J. Chalmers, J. Consciousness Stud. 2, 200 (1995).
3. D. J. Chalmers, The Conscious Mind: In Search of a Fundamental Theory (Oxford Univ. Press, Oxford, 1996).
4. F. Jackson, Philos. Q. 32, 127 (1982).
5. T. Nagel, Philos. Rev. 83, 435 (1974).
6. D. C. Dennett, Darwin's Dangerous Idea: Evolution and the Meaning of Life (Simon and Schuster, New York, 1995).

Your bladder and your brain.

Overactive bladder, usually caused by bladder obstruction in males, apparently affects ~17% of the population, the target audience for those awful pharmaceutical television ads. Signals arising from bladder or colonic pathology are processed by the cortex and can potentially be expressed as central symptoms (e.g., hyperarousal, attention disorders, anxiety) that occur alongside the visceral pathology. Rickenbacher et al. now show, in a rat model, that bladder obstruction disrupts not only the bladder but also brain regions involved in its regulation.
Neural circuits that allow for reciprocal communication between the brain and viscera are critical for coordinating behavior with visceral activity. At the same time, these circuits are positioned to convey signals from pathologic events occurring in viscera to the brain, thereby providing a structural basis for comorbid central and peripheral symptoms. In the pons, Barrington's nucleus and the norepinephrine (NE) nucleus, locus coeruleus (LC), are integral to a circuit that links the pelvic viscera with the forebrain and coordinates pelvic visceral activity with arousal and behavior. Here, we demonstrate that a prevalent bladder dysfunction, produced by partial obstruction in rat, has an enduring disruptive impact on cortical activity through this circuit. Within 2 weeks of partial bladder obstruction, the activity of LC neurons was tonically elevated. LC hyperactivity was associated with cortical electroencephalographic activation that was characterized by decreased low-frequency (1–3 Hz) activity and prominent theta oscillations (6–8 Hz) that persisted for 4 weeks. Selective lesion of the LC–NE system significantly attenuated the cortical effects. The findings underscore the potential for significant neurobehavioral consequences of bladder disorders, including hyperarousal, sleep disturbances, and disruption of sensorimotor integration, as a result of central noradrenergic hyperactivity. The results further imply that pharmacological manipulation of central NE function may alleviate central sequelae of these visceral disorders.
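For those curious how "decreased low-frequency (1-3 Hz) activity and prominent theta oscillations (6-8 Hz)" get quantified, here is a minimal sketch: estimate the EEG power spectrum with Welch's method and integrate it over each frequency band. The synthetic signal and sampling rate are assumptions for illustration, not the study's recordings.

```python
# Band power from a (synthetic) EEG trace via Welch's method.
import numpy as np
from scipy.signal import welch

fs = 250                                  # sampling rate, Hz (assumed)
t = np.arange(0, 60, 1 / fs)
# Toy "obstructed" EEG: strong 7 Hz theta, weak 2 Hz component, noise.
eeg = 2.0 * np.sin(2 * np.pi * 7 * t) + 0.5 * np.sin(2 * np.pi * 2 * t)
eeg += np.random.default_rng(3).normal(scale=1.0, size=t.size)

f, psd = welch(eeg, fs=fs, nperseg=4 * fs)

def band_power(f, psd, lo, hi):
    mask = (f >= lo) & (f <= hi)
    return np.trapz(psd[mask], f[mask])   # integrate PSD over the band

print(f"1-3 Hz power: {band_power(f, psd, 1, 3):.2f}")
print(f"6-8 Hz power: {band_power(f, psd, 6, 8):.2f}")
```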

Wednesday, October 15, 2008

Applied neuroeconomics - the fear of loss

Bajaj offers an interesting write-up of the well-known crowd psychology behind recent "irrational" drops in the stock market. Fear is a more powerful force than greed: our aversive reaction to losing $1000 is greater than our pleasure at earning the same amount...
fear now seems to rule, with investors often exhibiting a Wall Street version of the fight-or-flight mechanism — selling first, and asking questions later...some analysts are starting to suggest the markets are showing signs of “capitulation” — what happens when even the bullish holdouts, the unflagging optimists, throw up their hands and join the stampede out of the market...To some, signs of capitulation can be read as an indicator that the bottom may be near.
The opposite swing of the cycle is buying at the top of a bubble. I remember during my winter stays in Ft. Lauderdale in 2005 and 2006, every fourth person I chatted with seemed to be a realtor and dinner conversations were dominated by stories about fast profits on flipped condominiums.

Wordwatchers

An article by Wapner points to the work of James Pennebaker and his Wordwatchers blog, which is tracking the candidates' use of words during the 2008 election. The blog makes fascinating reading. Here is just one clip:
Predicting how they will govern. Most language dimensions that we study are probably better markers of how people will lead than who will vote for them. Some dimensions that are relevant include:

Cognitive complexity. A particularly reliable marker of cognitive complexity is the exclusive word dimension. Exclusive words such as but, except, without, exclude, signal that the speaker is making an effort to distinguish what is in a category and not in a category. Those who use more exclusive words make better grades in college, are more honest in lab studies, and have more nuanced understanding of events and people. Through the primaries until now, Obama has consistently been the highest in exclusive word use and McCain the lowest.

Categorical versus fluid thinking. Some people naturally approach problems by assigning them to categories. Categorical thinking involves the use of articles (a, an, the) and concrete nouns. Men, for example, use articles at much higher rates than women. Fluid thinking involves describing actions and changes, often in more abstract ways. A crude measure of fluid thinking is the use of verbs. Women use verbs more than men.

McCain and Obama could not be more different in their use of articles and verbs. McCain uses verbs at an extremely low rate and articles at a fairly high rate. Obama, on the other hand, is remarkably high in his use of verbs and low in his use of articles. These patterns suggest that McCain’s natural way of understanding the world is to first label the problem and find a way to put it into a pre-existing category. Obama is more likely to define the world as ongoing actions or processes.

Personal and socially connected. Individuals who think about and try to connect with others tend to use more personal pronouns (I, we, you, she, they) than those who are more socially detached. Bush was higher than Kerry or Gore. McCain has consistently been much higher than any other candidate in this election cycle. His use of 1st person singular (I, me, my) is particularly high which often signals an openness and honesty. Obama uses personal pronouns at moderate levels - similar to Hillary Clinton and most other primary candidates of both parties.

Restrained versus impulsive. People vary in the degree to which they act quickly or shoot from the hip versus stand back and consider their options. Over the last few years, some have argued that the use of negations (e.g., no, not, never) indicates inhibition or constraint. Low use of negations may be linked to impulsiveness. Bush was low in negations whereas Kerry was quite high. Across the election cycle, Obama has consistently been the highest user of negations - suggesting a restrained approach - whereas McCain has been the lowest - a more impulsive way of dealing with the world.
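The underlying method is simple word counting against category dictionaries. Here is a minimal sketch of computing category rates per 100 words; the tiny word lists are illustrative samples drawn from the clip above, not the full dictionaries Pennebaker's software uses.

```python
# Pennebaker-style function-word counting (toy category lists).
import re

CATEGORIES = {
    "exclusive": {"but", "except", "without", "exclude"},
    "articles": {"a", "an", "the"},
    "negations": {"no", "not", "never"},
    "pronouns_1sg": {"i", "me", "my", "mine"},
}

def category_rates(text):
    words = re.findall(r"[a-z']+", text.lower())
    return {name: 100 * sum(w in wordset for w in words) / len(words)
            for name, wordset in CATEGORIES.items()}

speech = "I will not raise taxes, but the plan is never without cost."
for name, rate in category_rates(speech).items():
    print(f"{name}: {rate:.1f} per 100 words")
```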

Tuesday, October 14, 2008

MRI of moral emotions while causing harm.

Kédia et al. perform brain imaging of subjects as they imagine harm in several different combinations of agent and victim:
The statement "An agent harms a victim" depicts a situation that triggers moral emotions. Depending on whether the agent and the victim are the self or someone else, it can lead to four different moral emotions: self-anger ("I harm myself"), guilt ("I harm someone"), other-anger ("someone harms me"), and compassion ("someone harms someone"). In order to investigate the neural correlates of these emotions, we examined brain activation patterns elicited by variations in the agent (self vs. other) and the victim (self vs. other) of a harmful action. Twenty-nine healthy participants underwent functional magnetic resonance imaging while imagining being in situations in which they or someone else harmed themselves or someone else. Results indicated that the three emotional conditions associated with the involvement of other, either as agent or victim (guilt, other-anger, and compassion conditions), all activated structures that have been previously associated with the Theory of Mind (ToM, the attribution of mental states to others), namely, the dorsal medial prefrontal cortex, the precuneus, and the bilateral temporo-parietal junction. Moreover, the two conditions in which both the self and other were concerned by the harmful action (guilt and other-anger conditions) recruited emotional structures (i.e., the bilateral amygdala, anterior cingulate, and basal ganglia). These results suggest that specific moral emotions induce different neural activity depending on the extent to which they involve the self and other.
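The design is a clean 2 x 2: crossing who does the harming with who is harmed yields the four emotions. Expressed as a lookup table (the labels come straight from the abstract):

```python
# The study's 2x2 design in code form: agent x victim -> moral emotion.
conditions = {
    ("self", "self"): "self-anger",    # "I harm myself"
    ("self", "other"): "guilt",        # "I harm someone"
    ("other", "self"): "other-anger",  # "someone harms me"
    ("other", "other"): "compassion",  # "someone harms someone"
}
for (agent, victim), emotion in conditions.items():
    print(f"agent={agent:5s} victim={victim:5s} -> {emotion}")
```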

This year's Ig Nobel Prize in cognitive science goes to a slime mold

A clip from Steve Nadis' write-up in Nature News of this year's Ig Nobel Prizes:
Slime moulds exhibit the kind of "contemplative behaviour" that Hamlet is famous for, muses Toshiyuki Nakagaki of Hokkaido University in Japan. ...The slime mold's puzzle-solving ability — Shakespearean or otherwise — is a discovery that is unlikely to change the world, but it won Nakagaki and his colleagues an Ig Nobel Prize for cognitive science last week at the annual event held at Harvard University in Cambridge, Massachusetts. Their research... showed that slime molds looking for food have "the ability to find the minimum-length solution between two points in a labyrinth".

Subsequently, the team has found that molds can find the shortest path between 30–50 points, which is something even supercomputers cannot yet work out. "We can't even check the mold's solution," notes Nakagaki, "but it looks good."
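For comparison, the "minimum-length solution between two points in a labyrinth" is exactly the shortest-path problem that a breadth-first search solves on a grid maze. A minimal sketch, with a toy maze of my own invention ('#' marks walls):

```python
# Breadth-first search: the computational version of the mold's feat.
from collections import deque

MAZE = ["S.#....",
        ".##.##.",
        "...#...",
        ".#...#G"]

def shortest_path_length(maze):
    rows, cols = len(maze), len(maze[0])
    start = next((r, c) for r in range(rows) for c in range(cols)
                 if maze[r][c] == "S")
    queue, seen = deque([(start, 0)]), {start}
    while queue:
        (r, c), d = queue.popleft()
        if maze[r][c] == "G":
            return d                          # minimum number of steps
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and maze[nr][nc] != "#" and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), d + 1))
    return None                               # no path exists

print(f"shortest path length: {shortest_path_length(MAZE)} steps")
```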

Monday, October 13, 2008

How context can set our emotional reaction to a smell.

Rolls et al. show how selective attention to affective value can alter the way our brains process olfactory stimuli:
How does selective attention to affect influence sensory processing? In a functional magnetic resonance imaging investigation, when subjects were instructed to remember and rate the pleasantness of a jasmin odor, activations were greater in the medial orbitofrontal and pregenual cingulate cortex than when subjects were instructed to remember and rate the intensity of the odor. When the subjects were instructed to remember and rate the intensity, activations were greater in the inferior frontal gyrus. These top–down effects occurred not only during odor delivery but started in a preparation period after the instruction before odor delivery, and continued after termination of the odor in a short-term memory period. Thus, depending on the context in which odors are presented and whether affect is relevant, the brain prepares itself, responds to, and remembers an odor differently. These findings show that when attention is paid to affective value, the brain systems engaged to prepare for, represent, and remember a sensory stimulus are different from those engaged when attention is directed to the physical properties of a stimulus such as its intensity. This differential biasing of brain regions engaged in processing a sensory stimulus depending on whether the cognitive demand is for affect-related versus more sensory-related processing may be an important aspect of cognition and attention. This has many implications for understanding the effects not only of olfactory but also of other sensory stimuli.