Tuesday, July 15, 2008

Persistence of anxious temperament - brain correlates

The temperament we display in early childhood (introversion versus extroversion, high versus low reactivity, anxiety in unfamiliar versus familiar situations, etc.) is largely genetically determined and persists through life. The work of Kagan and others has shown that children classed as highly reactive as babies are more likely to be subdued in unfamiliar situations and to report a dour mood and anxiety over the future. Anxious temperament is an early predictor of the later risk of developing anxiety, depression, and drug abuse related to self-medicating. It is becoming increasingly clear that people with anxious temperaments come wired that way; telling them to calm down just doesn't work. Kalin and his colleagues here at Wisconsin have produced an interesting study of the relevant brain correlates of this behavior by looking at brain activity, anxious behavior, and stress hormones in adolescent rhesus monkeys, which have been used in numerous studies as models for understanding anxious temperament in human children. They found that individuals with the most anxious temperaments showed higher activity in the amygdala, which regulates emotion and triggers reactions to anxiety, such as the fight-or-flight response. These anxious monkeys had more metabolic activity in the amygdala in both secure and threatening situations, and the differences persisted over several years of testing. From their abstract:
Regardless of context, results demonstrated a trait-like pattern of brain activity (amygdala, bed nucleus of stria terminalis, hippocampus, and periaqueductal gray) that is predictive of individual phenotypic differences. Importantly, individuals with extreme anxious temperament also displayed increased activity of this circuit when assessed in the security of their home environment. These findings suggest that increased activity of this circuit early in life mediates the childhood temperamental risk to develop anxiety and depression. In addition, the findings provide an explanation for why individuals with anxious temperament have difficulty relaxing in environments that others perceive as non-stressful.

Self interest versus 'moral sentiment' in economic policy

A review by Bowles in Science considers:
...a shortcoming in the conventional economic approach to policy design: It overlooks the possibility that economic incentives that appeal to self interest may diminish ethical or other reasons for complying with social norms and contributing to the common good.
The review cites one simple example of this happening:

In Haifa, at six day care centers, a fine was imposed on parents who were late picking up their children at the end of the day. Parents responded to the fine by doubling the fraction of time they arrived late. When after 12 weeks the fine was revoked, their enhanced tardiness persisted unabated. While other interpretations are possible, the counterproductive imposition of the fines illustrates a kind of negative synergy between economic incentives and moral behavior. The fine seems to have undermined the parents' sense of ethical obligation to avoid inconveniencing the teachers and led them to think of lateness as just another commodity they could purchase.
A clip from Bowles' discussion:
Although standard in economics, reliance solely on self-interest in the design of policies has never won universal assent. Until recently, however, dissenting views, like Titmuss' celebrated claim that paying for blood donations degrades the willingness to contribute, were thought to lack either empirical support or a coherent account of why separability might fail. But a recent experiment suggests that Titmuss may have been right, at least for women. Other experiments surveyed in this review provide additional evidence that material interests and moral sentiments are not separable in the sense required by the conventional economic approach to policy-making.

Economists, psychologists, and others, in part stimulated by these new empirical data, are well on their way to constructing an economic psychology of the interplay of self-regarding and other-regarding motivation that may eventually enlighten mechanism design and public policy....Good policies and constitutions are those that support socially valued ends not only by harnessing selfish preferences to public ends but also by evoking, cultivating, and empowering public-spirited motives. The modest tax on plastic grocery bags enacted in Ireland in 2002 that resulted in a 94 per cent decline in their use appears to have had just this effect : Carrying a plastic bag joined wearing a fur coat in the gallery of antisocial anachronisms.

Monday, July 14, 2008

Mendelssohn piano trios....

This is the first of what will be several weekly postings of portions of the house concert at Twin Valley on 6/29/08. The piece is the Andante con moto tranquillo from Mendelssohn's 1st piano trio. I am playing on the Steinway B, with Sonny Enslen (cello) and Daphne Tsao (violin).

The choices you make - indirect social influence.

I thought I would pass on this brief perspectives essay from Science by Jerker Denrell:
To what extent are the opinions you hold simply a reflection of the opinions of those you associate with? Most people like to think that their opinions are based on their own deliberations. Of course, there are exceptions. You may take into account the opinions of others if you believe they are better informed. You may even conform to the majority opinion in order to avoid being seen as deviant (1, 2). Studies of how norms and beliefs vary between groups, and how they are transmitted from peers or parents, testify to the importance of such social influence (3).

Explanations of social influence usually focus on why people are persuaded by or conform to the opinions of others (4). Although important, this research has neglected the role of information collection in belief formation and how biased beliefs, as well as social influence, can emerge from biased search processes (5).

For example, suppose you are deciding which of two cars to buy. If your neighbor buys one of the cars, you can observe it more closely and will thus learn more about its attributes. This opportunity to observe the car can bias your decision toward buying the same car, even if you do not care about whether you have the same car as your neighbor. This is especially true if acquiring information about cars other than your neighbor's is costly (6). If the information you learn about your neighbor's car is strongly positive, it makes sense to buy this car and discontinue the search. In this case you will not find out whether the other car is superior. If the information you learn is not very positive, however, it then makes sense to examine the other car. Only in this case will you find out how the two cars compare. Because the comparison process is asymmetric, you are overall more likely to buy the same car as your neighbor even if the information you learn is equally likely to be positive or negative.
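The asymmetry in this search process is easy to verify with a small simulation. In this sketch (my own toy model, not from Denrell's essay or ref. 6), both cars draw quality signals from the same symmetric distribution, so neither is actually better; the only asymmetry is that the neighbor's car is observed for free and the other car is inspected only if the first signal disappoints:

```python
import random

def choose_car(threshold=0.0, rng=random):
    """One shopper deciding between the neighbor's car A and an unseen car B.

    Both quality signals come from the same symmetric distribution, so
    neither car is better on average.  The Gaussian signals and the
    zero threshold are illustrative assumptions.
    """
    signal_a = rng.gauss(0, 1)      # free observation of the neighbor's car
    if signal_a > threshold:        # good enough: stop searching, buy A
        return "A"
    signal_b = rng.gauss(0, 1)      # only now pay the cost to inspect B
    return "A" if signal_a > signal_b else "B"

def buy_rate(n=100_000, seed=1):
    """Fraction of simulated shoppers who end up buying the neighbor's car."""
    rng = random.Random(seed)
    return sum(choose_car(rng=rng) == "A" for _ in range(n)) / n

# Even though the cars are statistically identical, truncating the search
# after a good first impression biases purchases toward the neighbor's car
# (expected rate 0.625 in this model, versus 0.5 for a full comparison).
print(round(buy_rate(), 3))
```

The 0.625 figure follows from the model: half the time the first signal is positive and A is bought outright, and in an eighth of cases both signals are negative with A's the larger.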

The attitudes and behavior of others can also influence our learning processes by leading us to revisit objects and events that we had previously avoided because of poor past experiences (7). Suppose Bob likes a restaurant while Alice does not. By herself Alice might not visit the restaurant again, and her attitude would remain negative. But Alice might join Bob if he wants to go to the restaurant. By visiting the restaurant again, Alice gets a chance to change her opinion. Alice's attitude will depend on Bob's, but only because he influenced the probability of her revisiting the restaurant.

Finally, the number of your friends who engage in some activity can also influence your estimate of the value of this activity. If you have many friends who start firms, for example, your estimate of the chances of success will be based on a large sample size. A large sample size may lead you to have a higher estimate of the success rate than you would if the sample size were small. Experiments show that a large sample size leads to a more optimistic view when the outcome distribution is skewed (8). If only 10% succeed, you may only observe failures in a small sample, and will then underestimate the success rate.
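The sample-size effect can also be illustrated with a quick simulation (the 10% success rate and the friend-group sizes here are my own illustrative assumptions, not data from ref. 8). With rare successes, most small-sample observers see nothing but failure and so would put the success rate at zero:

```python
import random

def fraction_seeing_no_successes(n_friends, p_success=0.10,
                                 n_observers=50_000, seed=2):
    """Fraction of observers who, watching n_friends attempt a venture
    with true success rate p_success, see zero successes and would
    therefore estimate the rate as 0%.  (Toy model for illustration.)
    """
    rng = random.Random(seed)
    zero = 0
    for _ in range(n_observers):
        successes = sum(rng.random() < p_success for _ in range(n_friends))
        if successes == 0:
            zero += 1
    return zero / n_observers

# With 3 friends, roughly 73% of observers see only failures
# (0.9**3 = 0.729) and underestimate the 10% success rate; with 30
# friends, almost everyone sees at least one success (0.9**30 = 0.042).
for n in (3, 30):
    print(n, round(fraction_seeing_no_successes(n), 3))
```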

These mechanisms produce behavior that looks like conformity: You are more likely to evaluate an activity positively if others do so. But in these examples your attitude is not directly influenced by hearing about the attitudes of others. Your attitude is only indirectly influenced by others because their behavior exposes you to additional samples of the activity.

Such indirect mechanisms of social influence are important, because even individuals who try to be impartial and make the best decision given the available information may fail to recognize that the available information is influenced by others (9). For example, a manager who tries to avoid discrimination may nevertheless come to believe that individuals who belong to the same social networks as the manager does are superior to those the manager seldom interacts with and has less information about. To learn more about these mechanisms, we need to broaden studies of social influence and belief formation to include the phases of learning and information collection that precede decision-making and judgment (10).

References and Notes

1. S. E. Asch, Sci. Am. 193, 31 (November, 1955).
2. M. Deutsch, H. Gerard, J. Abnorm. Soc. Psychol. 51, 629 (1955).
3. P. J. Richerson, R. Boyd, Not by Genes Alone: How Culture Transformed Human Evolution (Univ. of Chicago Press, Chicago, 2005).
4. R. B. Cialdini, N. J. Goldstein, Annu. Rev. Psychol. 55, 591 (2004).
5. Y. Trope, A. Liberman, in Social Psychology: Handbook of Basic Principles, E. T. Higgins, A. W. Kruglanski, Eds. (Guilford, New York, 1996), pp. 239-270.
6. N. V. Moshkin, R. Shachar, Market. Sci. 21, 435 (2002).
7. J. Denrell, G. Le Mens, Psychol. Rev. 114, 398 (2007).
8. R. Hertwig et al., Psychol. Sci. 15, 534 (2004).
9. J. Denrell, Psychol. Rev. 112, 951 (2005).
10. For recent research on the effect of sampling on judgment, see K. Fiedler, P. Juslin, Eds., Information Sampling and Adaptive Cognition (Cambridge Univ. Press, Cambridge, 2006).

Friday, July 11, 2008

Resveratrol - protection from ravages of aging.

In mice, at least....An article in Wired Magazine points to a multi-authored study in Cell Metabolism:
A small molecule that safely mimics the ability of dietary restriction (DR) to delay age-related diseases in laboratory animals is greatly sought after. We and others have shown that resveratrol mimics effects of DR in lower organisms. In mice, we find that resveratrol induces gene expression patterns in multiple tissues that parallel those induced by DR and every-other-day feeding. Moreover, resveratrol-fed elderly mice show a marked reduction in signs of aging, including reduced albuminuria, decreased inflammation, and apoptosis in the vascular endothelium, increased aortic elasticity, greater motor coordination, reduced cataract formation, and preserved bone mineral density. However, mice fed a standard diet did not live longer when treated with resveratrol beginning at 12 months of age. Our findings indicate that resveratrol treatment has a range of beneficial effects in mice but does not increase the longevity of ad libitum-fed animals when started midlife.

Where Ritalin acts in the brain to focus attention.

An interesting piece of work from Berridge's lab here at the University of Wisconsin shows that the cognition and attention enhancing drug Ritalin (methylphenidate, MPH) fine-tunes the functioning of neurons in the prefrontal cortex (PFC), which is involved in attention, decision-making and impulse control. While it enhances the efflux of the neurotransmitters norepinephrine and dopamine in PFC, it appears to have minimal effects elsewhere.

Only working memory–enhancing doses of MPH increased the responsivity of individual PFC neurons and altered neuronal ensemble responses within the PFC. The effects were not observed outside the PFC (i.e., within somatosensory cortex). In contrast, high-dose MPH profoundly suppressed evoked discharge of PFC neurons. These observations suggest that preferential enhancement of signal processing within the PFC, including alterations in the discharge properties of individual PFC neurons and PFC neuronal ensembles, underlie the behavioral/cognitive actions of low-dose psychostimulants.

Thursday, July 10, 2008

A new kind of science, as the data deluge makes the scientific method obsolete...

An article by Chris Anderson in Wired Magazine, pointed out to me by my son Jon, argues that science as we have known it has ended. The argument is that the quest for knowledge that used to begin with grand theories now, in the petabyte age, begins with massive amounts of data. Google has set the new model for science. I show some clips here, followed by the contra argument from John Timmer:
Google conquered the advertising world with nothing more than applied mathematics. It didn't pretend to know anything about the culture and conventions of advertising — it just assumed that better data, with better analytical tools, would win the day. And Google was right...Google's founding philosophy is that we don't know why this page is better than that one: If the statistics of incoming links say it is, that's good enough. No semantic or causal analysis is required. That's why Google can translate languages without actually "knowing" them (given equal corpus data, Google can translate Klingon into Farsi as easily as it can translate French into German). And why it can match ads to content without any knowledge or assumptions about the ads or the content.

The hypothesize-model-test model of science is becoming obsolete...The models we were taught in school about "dominant" and "recessive" genes steering a strictly Mendelian process have turned out to be an even greater simplification of reality than Newton's laws. The discovery of gene-protein interactions and other aspects of epigenetics has challenged the view of DNA as destiny and even introduced evidence that environment can influence inheritable traits, something once considered a genetic impossibility...the more we learn about biology, the further we find ourselves from a model that can explain it...There is now a better way. Petabytes allow us to say: "Correlation is enough." We can stop looking for models. We can analyze the data without hypotheses about what it might show. We can throw the numbers into the biggest computing clusters the world has ever seen and let statistical algorithms find patterns where science cannot.

The best practical example of this is the shotgun gene sequencing by J. Craig Venter. Enabled by high-speed sequencers and supercomputers that statistically analyze the data they produce, Venter went from sequencing individual organisms to sequencing entire ecosystems. In 2003, he started sequencing much of the ocean, retracing the voyage of Captain Cook. And in 2005 he started sequencing the air. In the process, he discovered thousands of previously unknown species of bacteria and other life-forms.

Venter can make some guesses about the animals — that they convert sunlight into energy in a particular way, or that they descended from a common ancestor. But besides that, he has no better model of this species than Google has of your MySpace page. It's just data. By analyzing it with Google-quality computing resources, though, Venter has advanced biology more than anyone else of his generation.

This kind of thinking is poised to go mainstream. In February, the National Science Foundation announced the Cluster Exploratory, a program that funds research designed to run on a large-scale distributed computing platform developed by Google and IBM in conjunction with six pilot universities. The cluster will consist of 1,600 processors, several terabytes of memory, and hundreds of terabytes of storage, along with the software, including Google File System, IBM's Tivoli, and an open source version of Google's MapReduce. Early CluE projects will include simulations of the brain and the nervous system and other biological research that lies somewhere between wetware and software.
Here is the immediate rejoinder to this article from John Timmer at Ars Technica.
Every so often, someone (generally not a practicing scientist) suggests that it's time to replace science with something better. The desire often seems to be a product of either an exaggerated sense of the potential of new approaches, or a lack of understanding of what's actually going on in the world of science. This week's version, which comes courtesy of Chris Anderson, the Editor-in-Chief of Wired, manages to combine both of these features in suggesting that the advent of a cloud of scientific data may free us from the need to use the standard scientific method.

It's easy to see what has Anderson enthused. Modern scientific data sets are increasingly large, comprehensive, and electronic. Things like genome sequences tell us all there is to know about the DNA present in an organism's cells, while DNA chip experiments can determine every gene that's expressed by that cell. That data's also publicly available—out in the cloud, in the current parlance—and it's being mined successfully. That mining extends beyond traditional biological data, too, as projects like WikiProteins are also drawing on text-mining of the electronic scientific literature to suggest connections among biological activities.

There is a lot to like about these trends, and little reason not to be enthused about them. They hold the potential to suggest new avenues of research that scientists wouldn't have identified based on their own analysis of the data. But Anderson appears to take the position that the new research part of the equation has become superfluous; simply having a good algorithm that recognizes the correlation is enough.

The source of this flight of fancy was apparently a quote by Google's research director, who repurposed a cliché that most scientists are aware of: "All models are wrong, and increasingly you can succeed without them." And Google clearly has. It doesn't need to develop a theory as to why a given pattern of links can serve as an indication of valuable information; all it needs to know is that an algorithm that recognizes specific link patterns satisfies its users. Anderson's argument distills down to the suggestion that science can operate on the same level—mechanisms, models, and theories are all dispensable as long as something can pick the correlations out of masses of data.

I can't possibly imagine how he comes to that conclusion. Correlations are a way of catching a scientist's attention, but the models and mechanisms that explain them are how we make the predictions that not only advance science, but generate practical applications. One only needs to look at a promising field that lacks a strong theoretical foundation—high-temperature superconductivity springs to mind—to see how badly the lack of a theory can impact progress. Put in more practical terms, would Anderson be willing to help test a drug that was based on a poorly understood correlation pulled out of a datamine? These days, we like our drugs to have known targets and mechanisms of action and, to get there, we need standard science.

Anderson does provide two examples that he feels support his position, but they actually appear to undercut it. He notes that we know quantum mechanics is wrong on some level, but have been unable to craft a replacement theory after decades of work. But he neglects to mention two key things: without the testable predictions made by the theory, we'll never be able to tell how precisely it is wrong and, in those decades where we've failed to find a replacement, the predictions of quantum mechanics have been used to create the modern electronics industry, with the data cloud being a consequence of that.

If anything, his second example is worse. We can now perform large-scale genetic surveys of the life present in remote environments, such as the far reaches of the Pacific. Doing so has informed us that there's a lot of unexplored biodiversity on the bacterial level; fragments of sequence hint at organisms we've never encountered under a microscope. But as Anderson himself notes, the only thing we can do is make a few guesses as to the properties of the organisms based on who their relatives are, an activity that actually requires a working scientific theory, namely evolution. To do more than that, we need to deploy models of metabolism and ecology against the bacteria themselves.

Overall, the foundation of the argument for a replacement for science is correct: the data cloud is changing science, and leaving us in many cases with a Google-level understanding of the connections between things. Where Anderson stumbles is in his conclusions about what this means for science. The fact is that we couldn't have even reached this Google-level understanding without the models and mechanisms that he suggests are doomed to irrelevance. But, more importantly, nobody, including Anderson himself if he had thought about it, should be happy with stopping at this level of understanding of the natural world.

Meditation and executive function - untraining the brain.

A MindBlog reader passes on this link to a reposting of an interesting article by Chris Chatham on how easily normal conflicts in making decisions can be lessened by changes in attention. My May 1 post references other work on this topic.

Wednesday, July 09, 2008

Worried sick...achieving wellness?

I always enjoy it when a good curmudgeonly antidote comes along to temper bright-eyed optimism. Such a contrast is provided by Zuger's review of very different books by Snyderman and Hadler. Snyderman:
With chirpy, can-do optimism...recapitulates the standard wisdom. Watch your diet, exercise, lose weight, stop smoking, be screened regularly for a variety of dire illnesses, rein in cholesterol and blood sugar, stay in touch with your doctor and be sure to check out those aches and pains pronto, just in case. So speaks the medical establishment.
While Hadler:
...who is a longtime debunker of much the establishment holds dear...reminds us...we are all going to die...holding every dire illness at bay forever is simply not an option. The real goal is to reach a venerable age — say 85 — more or less intact. And the statistics tell Dr. Hadler that ignoring most of the advice Dr. Snyderman offers is the way to do it.
An excerpt from Hadler's book:
Daily, we are offered the image of the baby-boom generation going on forever, making impossible demands on successive generations to provide pensions, health care, and community. That, too, is fatuous. However, more of us are living longer than did our parents. Clearly, the likelihood that we will enjoy life as an octogenarian has increased over the course of the twentieth century. Far less clear is whether the likelihood of becoming a nonagenarian has increased similarly. It has certainly not done so at anything like the same rate as the likelihood of being an octogenarian. The effect is so striking that it has caused many of us to wonder if there is not a fixed longevity for our species, set around eighty-five years of age. Some have likened this to a warranty: you are off warranty at eighty-five, beyond is a bonus, and well beyond is a statistical oddity. This projected demographic is consistent with current population trends. With one caveat, these hard facts seem unlikely to change. It is possible that molecular biology can alter the fixed longevity of our species. But don't hold your breath. None of us will live to see that — and maybe no one ever will.

Eighty-five (+/- a little bit) appears to be the programmed life expectancy for our species. I grant that the science is imperfect. But eighty-five is a linchpin of my personal philosophy of life. I, for one, do not care how many diseases I harbor on my eighty-fifth birthday, though I prefer not to know that they are creeping up on me. I, for one, do not care which of these diseases carries me off as long as the leaving is gentle and the legacy meaningful. Perhaps the best we can reasonably hope for is eighty-five years of life free of morbidities that overwhelm our wherewithal to cope, then to die in our sleep on our eighty-fifth birthday.

Ecocultural basis of cognition

Farmers and fishermen are more holistic than herders. Uskul et al. offer a fascinating study on factors influencing holistic versus more focused perception:
It has been proposed that social interdependence fosters holistic cognition, that is, a tendency to attend to the broad perceptual and cognitive field, rather than to a focal object and its properties, and a tendency to reason in terms of relationships and similarities, rather than rules and categories. This hypothesis has been supported mostly by demonstrations showing that East Asians, who are relatively interdependent, reason and perceive in a more holistic fashion than do Westerners. We examined holistic cognitive tendencies in attention, categorization, and reasoning in three types of communities that belong to the same national, geographic, ethnic, and linguistic regions and yet vary in their degree of social interdependence: farming, fishing, and herding communities in Turkey's eastern Black Sea region. As predicted, members of farming and fishing communities, which emphasize harmonious social interdependence, exhibited greater holistic tendencies than members of herding communities, which emphasize individual decision making and foster social independence. Our findings have implications for how ecocultural factors may have lasting consequences on important aspects of cognition.

Tuesday, July 08, 2008

Brain regions active during different economic decisions.

The Editors' Choice section of Science magazine spotlights an interesting paper in J. Neurosci.:
When we make economic decisions, for example the purchase of a good or a service, our brain has to perform at least three computations. First, it has to assess the goal value of the good: in economic terms, our maximal willingness to pay. Second, it has to assess the decision value of the good: the goal value minus the unavoidable costs. Third, there is a prediction error, which indicates the deviation from one's expectations of reward; the prediction error is positive when something better than expected happens and negative when the opposite occurs. Unfortunately, these three related quantities are intermingled and are often highly correlated, making it challenging to isolate the neural regions performing these computations.

Hare et al. have attempted to measure goal value, decision value, and prediction error in a single neuroimaging task so that they could dissociate these parameters. They found that ventral striatum activation reflected prediction error and not goal or decision value. However, activity in the medial orbitofrontal cortex and the central orbitofrontal cortex correlated with goal value and decision value, respectively.
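The three computations distinguished above can be written out as a tiny worked example (the dollar figures below are invented purely for illustration, not taken from the paper):

```python
def economic_signals(willingness_to_pay, cost, expected_value, outcome):
    """Compute the three quantities the review distinguishes.

    goal value       = maximal willingness to pay for the good
    decision value   = goal value minus the unavoidable cost
    prediction error = actual outcome minus what was expected
    """
    goal_value = willingness_to_pay
    decision_value = goal_value - cost
    prediction_error = outcome - expected_value
    return goal_value, decision_value, prediction_error

# Hypothetical purchase: willing to pay $20 for a meal that costs $12,
# expected it to be worth $15, and it turned out to be worth $18.
print(economic_signals(20, 12, 15, 18))  # → (20, 8, 3)
```

As the example shows, the three quantities move together whenever costs and outcomes co-vary, which is why the paper needed a task design that decorrelates them before the fMRI signals could be attributed to separate regions.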
Here is a summary figure from the paper:

Figure - Combined activation maps for goal values (GVs), decision values (DVs), and prediction errors (PEs). Activity correlated with GVs in the mOFC is shown in red, activity correlated with DVs in the cOFC is shown in yellow, and activity correlated with PEs in the ventral striatum is shown in green.

Another Happiness Survey

A University of Michigan press release describes work from the World Values Survey, based at the University of Michigan Institute for Social Research: Denmark is the happiest nation in the world and Zimbabwe the unhappiest. The United States ranks 16th on the list, immediately after New Zealand.

Monday, July 07, 2008

Piazzolla - Otoño Porteño

Here is the second Piazzolla tango we did at the 6/29/08 Sunday musical at Twin Valley.

Brain Foods...

Gómez-Pinilla contributes a review article to the latest issue of Nature Reviews Neuroscience on how various dietary factors, in addition to some gut and brain hormones, increase the resistance of neurons to insults and promote mental fitness. I pass on one figure dealing with dietary omega-3 fatty acids, followed by a summary table.



The omega-3 fatty acid docosahexaenoic acid (DHA), which humans mostly attain from dietary fish, can affect synaptic function and cognitive abilities by providing plasma membrane fluidity at synaptic regions. DHA constitutes more than 30% of the total phospholipid composition of plasma membranes in the brain, and thus it is crucial for maintaining membrane integrity and, consequently, neuronal excitability and synaptic function. Dietary DHA is indispensable for maintaining membrane ionic permeability and the function of transmembrane receptors that support synaptic transmission and cognitive abilities. Omega-3 fatty acids also activate energy-generating metabolic pathways that subsequently affect molecules such as brain-derived neurotrophic factor (BDNF) and insulin-like growth factor 1 (IGF1). IGF1 can be produced in the liver and in skeletal muscle, as well as in the brain, and so it can convey peripheral messages to the brain in the context of diet and exercise. BDNF and IGF1 acting at presynaptic and postsynaptic receptors can activate signalling systems, such as the mitogen-activated protein kinase (MAPK) and calcium/calmodulin-dependent protein kinase II (CaMKII) systems, which facilitate synaptic transmission and support long-term potentiation that is associated with learning and memory.



Friday, July 04, 2008

MRI of mental time travel.

Arzy et al. make the interesting observation that one's imagined self location influences the neural activity related to mental time travel. Slightly edited clips from the article:
A fundamental characteristic of human conscious experience is the ability to not only experience the present moment but also to recall the past and predict the future, or to "travel" back and forth in time, a facility that is called "mental time travel" (MTT)...Converging evidence from recent memory research suggests that re-experiencing and pre-experiencing an event rely on similar neural mechanisms. Similar strategies and the same brain regions are found to be used in imagining past and future events, as future predictions may be based on past memories... when changing the location of one's self in time to past or future, one does not only recall and predict, but one also changes one's mental egocentric perspective on life events. Moreover, from these new self-locations in time, other life events might be regarded differently with respect to their relations to past or future. Thus, when imagining oneself as 10 years younger, last year's events are in the future (relative future) in relation to the initially imagined self-location in time, and vice versa (relative past).
Since earlier studies had shown behavioral and electrophysiological differences between judgments about one's own body while taken from one's actual spatial self-location versus different imagined self-locations, and given evidence that shared mechanisms process time and space in the brain, the authors developed a behavioral paradigm to determine if differences are found not only between different self-locations in time (past, now, and future), but also while imagining events in the relative past or the relative future. They followed neural correlates of MTT using behavioral measures, evoked potential (EP) mapping, and electrical neuroimaging in healthy adult participants.


Stimuli and procedure. The three different self-locations in time (past, now, and future) are shown. Participants were asked to mentally imagine themselves in one of these self-locations, and from these self-locations to judge whether different self or nonself events (e.g., top row) already happened (relative past, darker colors) or are yet to happen (relative future, lighter colors).
Their work confirmed:
...that MTT is composed of two different cognitive processes: absolute MTT, which is the location of the self to different points in time (past, present, or future), and relative MTT, which is the location of one's self with respect to the experienced event (relative past and relative future). These processes recruit a network of brain areas in distinct time periods including the occipitotemporal, temporoparietal, and anteromedial temporal cortices. Our findings suggest that in addition to autobiographical memory processes, the cognitive mechanisms of MTT also involve mental imagery and self-location, and that relative MTT, but not absolute MTT, is more strongly directed to future prediction than to past recollection.

Generators of MTT map are localized to the right temporoparietal, occipitotemporal, and left anteromedial temporal cortices.

When your brain lies to you

Even when a lie is presented with a disclaimer, people often later remember it as true. A brief review in the Op-Ed section of the NYTimes shows how a well-documented feature of our memory, source amnesia, might lead 10 percent of us to think that Barack Obama is a Muslim.

Thursday, July 03, 2008

Brain markers that predict vulnerability to psychosis.

Honey et al. offer an interesting study in the Journal of Neuroscience. As indicated in these slightly edited clips from the text and abstract:
They used a drug model of psychosis to relate presymptomatic physiology to symptom outcome. Ketamine induces transient psychotic symptoms in healthy volunteers and exacerbates existing symptoms in patients. They assessed brain responses, separately under placebo and ketamine treatments, in healthy volunteers across four cognitive challenges, each theoretically related to a symptom of psychosis. Two of the tasks (verbal working memory and attention) are associated with negative symptoms, which may result from social and cognitive disengagement attributable to reduced processing capacity of prefrontal cortex, leading to difficulties in concentration and maintaining task set. They predicted that prefrontal activity during the attention and working memory tasks would be associated with vulnerability to negative symptoms under ketamine.

A failure to monitor "inner speech" may provide a mechanism leading to auditory hallucinations, whereby self-generated speech is misattributed externally. Comparing verbal self-monitoring (imagining speech spoken by another person) with inner speech (minimal self-monitoring) increases prefrontal and temporal cortex activation in patients with auditory hallucinations. Ketamine produces auditory illusory experiences similar to the heightened auditory and visual awareness described by patients during the prodromal phase, and it has been suggested that these contribute to the development of hallucinations. The authors predicted that prefrontal and temporal cortex activation during a self-monitoring task would be associated with vulnerability to the auditory illusory experiences under ketamine.

Finally, a sentence completion task was used to engage brain regions associated with semantic processing. Thought disorder involves difficulty in constraining semantic threads of language, making speech disjointed and chaotic, as also observed under ketamine. In patients, the requirement to generate an appropriate semantic response to complete a sentence is associated with increased activation of left frontal and temporal cortex. They predicted that frontotemporal responses to a sentence completion task would predict vulnerability to thought disorder induced by ketamine.

They in fact found that brain responses to cognitive task demands under placebo predict the expression of psychotic phenomena after drug administration. Frontothalamic responses to a working memory task were associated with the tendency of subjects to experience negative symptoms under ketamine. Bilateral frontal responses to an attention task were also predictive of negative symptoms. Frontotemporal activations during language processing tasks were predictive of thought disorder and auditory illusory experiences. A subpsychotic dose of ketamine administered during a second scanning session resulted in increased basal ganglia and thalamic activation during the working memory task, paralleling previous reports in patients with schizophrenia. These results demonstrate precise and predictive brain markers for individual profiles of vulnerability to drug-induced psychosis.

Bias at the ballot box.

Berger et al. provide an interesting demonstration of how susceptible a voter's choice is to environmental cues. The two studies are described in Tim Lincoln's review of this work in Nature:
The first was an analysis of results from a general election held in Arizona in 2000, the ballot for which included a proposition to raise state sales tax from 5.0% to 5.6%, to increase education spending. Polling stations included churches, schools, community centres and government buildings.

Berger et al. predicted that voting in a school would produce more support for the proposition than voting in other places. Indeed it did, but not by much compared with other documented effects on voter choice such as order on the ballot paper. Nonetheless, the effect persisted through tests for various other confounding factors (for example, the possibility of a consistently different level of voter turnout at school polling locations).

The second study was a carefully run online experiment that also involved a proposed tax increase to fund schools. The 'voting environment' was manipulated by exposing participants to typical images of schools or control images. The upshot was the same, with the school images prompting greater (and apparently unconscious) support for the initiative than, for example, an image of an office.

All in all, the authors conclude that what they call contextual priming of polling location affects how people vote. They reasonably wonder whether such factors could, for example, influence voting in a church on such matters as gay marriage and stem-cell research.

But here's a thought. In the event of science spending being on the political agenda, why not offer the lab as a polling station? But maybe dim that fluorescent lighting, and persuade all those bearded fellows in white coats to take the day off — or not, as the case may be.

Wednesday, July 02, 2008

A Piazzolla Tango

I'm working up the videos of the Sunday musical at Twin Valley mentioned in Monday's post. Here is Astor Piazzolla's Invierno Porteño.

Why are musical chords cheerful or melancholy?

In the current issue of American Scientist, Cook and Hayashi offer a fascinating article on the psychoacoustics of harmony perception (PDF here). Major and minor chords entered Western music during the Renaissance, when two-part harmonies were supplanted by three-tone chords. The authors argue that human responses to these chords have a biological basis, rather than being learned (the opinion of most musical theorists). Their acoustical model explains harmony in terms of the relative positions of the three notes in a triad and how their complex higher harmonics, or upper partials, interact with them. Those of you interested in science and music should check out the special Nature series of essays on this topic.

Jean-Philippe Rameau, a French composer and author, wrote his Treatise on Harmony in 1722, one of the first and most influential studies of harmony in Western music. His book noted the profound emotional difference between major and minor chords: “The major mode is suitable for songs of mirth and rejoicing,” he wrote, while the minor mode was suitable for “plaints, and mournful songs.”

From their conclusions, after the analysis section of the paper:
Now that we have a model of how listeners identify a chord as major or minor, we may take the final step and speculate as to why the acoustical valence carries an emotional valence as well. We contend that the emotional symbolism of major and minor chords has a biological basis. Across the animal kingdom, vocalizations with a descending pitch are used to signal social strength, aggression or dominance. Similarly, vocalizations with a rising pitch connote social weakness, defeat or submission. Of course, animals convey these messages in other ways as well, with facial expressions, body posture and so on—but all else being equal, changes in the fundamental frequency of the voice have intrinsic meaning.

This same frequency code has been absorbed, though attenuated, in human speech patterns: A rising inflection is commonly used to denote questions, politeness or deference, whereas a falling inflection signals commands, statements or dominance. How might this translate to a musical context? If we start with a tense, ambiguous chord—for example, the augmented chord containing two 4-semitone intervals— and decrease any one of the three fundamentals by one semitone, the chord will resolve into a major key. It will then have a 5–4, 3–5, or 4–3 semitone structure. Conversely, if we resolve the ambiguous chord by raising any one of the three fundamentals by a semitone, we will obtain a minor chord. The universal emotional response to these chords stems, we believe, directly from an instinctive, preverbal understanding of the frequency code in nature. One of us (Cook) has explored this in more detail (see the bibliography).

Individual tastes and musical styles vary widely. In the West, music has changed over the centuries from styles that employed predominantly the resolved major and minor chords to styles that include more and more dissonant intervals and unresolved chords. Inevitably, some composers have taken this historical trend to its logical extreme, and produced music that fanatically avoids all indications of consonance or harmonic resolution. Such surprisingly colorless “chromatic” music is intellectually interesting, but notably lacking in the ebb and flow of tension and resolution that most popular music employs, and that most listeners crave. Whatever one’s own personal preferences may be for dissonance and unresolved harmonies, some kind of balance between consonance and dissonance, and between harmonic tension and resolution, seems to be essential—genre by genre, and individual by individual—to assure the emotional ups and downs that make music satisfying.
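The semitone bookkeeping in the passage above is easy to verify. The following short Python sketch (my own illustration, not from the article) takes an augmented triad, built from two stacked 4-semitone intervals, and shows that lowering any one of its three notes by a semitone yields one of the major-triad interval patterns (4–3, 3–5, or 5–4), while raising any one note by a semitone yields a minor-triad pattern (3–4, 4–5, or 5–3):

```python
# Interval patterns (in semitones) for a major triad in root position
# and its two inversions, and likewise for a minor triad.
MAJOR_PATTERNS = {(4, 3), (3, 5), (5, 4)}
MINOR_PATTERNS = {(3, 4), (4, 5), (5, 3)}

def intervals(chord):
    """Return the two stacked intervals of a three-note chord,
    given its pitches in semitones."""
    a, b, c = sorted(chord)
    return (b - a, c - b)

# An augmented triad, e.g. C, E, G-sharp, as semitone offsets.
augmented = [0, 4, 8]

for i in range(3):
    lowered = augmented.copy()
    lowered[i] -= 1          # drop one note a semitone -> major
    raised = augmented.copy()
    raised[i] += 1           # raise one note a semitone -> minor
    assert intervals(lowered) in MAJOR_PATTERNS
    assert intervals(raised) in MINOR_PATTERNS
    print(i, intervals(lowered), intervals(raised))
```

Lowering the root, third, or fifth gives the patterns (5, 4), (3, 5), and (4, 3) respectively, i.e. a major triad in some inversion; raising them gives (3, 4), (5, 3), and (4, 5), the minor-triad patterns, exactly as the authors describe.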