Friday, July 02, 2010

More on enhanced brain processes in musicians.

From Pallesen et al.:
Musical competence may confer cognitive advantages that extend beyond processing of familiar musical sounds. Behavioural evidence indicates a general enhancement of both working memory and attention in musicians. It is possible that musicians, due to their training, are better able to maintain focus on task-relevant stimuli, a skill which is crucial to working memory. We measured the blood oxygenation-level dependent (BOLD) activation signal in musicians and non-musicians during working memory of musical sounds to determine the relation among performance, musical competence and generally enhanced cognition. All participants easily distinguished the stimuli. We tested the hypothesis that musicians nonetheless would perform better, and that differential brain activity would mainly be present in cortical areas involved in cognitive control such as the lateral prefrontal cortex. The musicians performed better as reflected in reaction times and error rates. Musicians also had larger BOLD responses than non-musicians in neuronal networks that sustain attention and cognitive control, including regions of the lateral prefrontal cortex, lateral parietal cortex, insula, and putamen in the right hemisphere, and bilaterally in the posterior dorsal prefrontal cortex and anterior cingulate gyrus. The relationship between the task performance and the magnitude of the BOLD response was more positive in musicians than in non-musicians, particularly during the most difficult working memory task. The results confirm previous findings that neural activity increases during enhanced working memory performance. The results also suggest that superior working memory task performance in musicians relies on an enhanced ability to exert sustained cognitive control. This cognitive benefit in musicians may be a consequence of focused musical training.

How we learn skilled motor sequences

Because I am a pianist, I found the following bit from Steele and Penhune to be interesting and relevant. They found that:
Performance was separated into two components: accuracy (the more explicit, rapidly learned, stimulus–response association component) and synchronization (the more procedural, slowly learned component).
Here is their whole abstract:
Our capacity to learn movement sequences is fundamental to our ability to interact with the environment. Although different brain networks have been linked with different stages of learning, there is little evidence for how these networks change across learning. We used functional magnetic resonance imaging to identify the specific contributions of the cerebellum and primary motor cortex (M1) during early learning, consolidation, and retention of a motor sequence task. Performance was separated into two components: accuracy (the more explicit, rapidly learned, stimulus–response association component) and synchronization (the more procedural, slowly learned component). The network of brain regions active during early learning was dominated by the cerebellum, premotor cortex, basal ganglia, presupplementary motor area, and supplementary motor area as predicted by existing models. Across days of learning, as performance improved, global decreases were found in the majority of these regions. Importantly, within the context of these global decreases, we found specific regions of the left M1 and right cerebellar VIIIA/VIIB that were positively correlated with improvements in synchronization performance. Improvements in accuracy were correlated with increases in hippocampus, BA 9/10, and the putamen. Thus, the two behavioral measures, accuracy and synchrony, were found to be related to two different sets of brain regions—suggesting that these networks optimize different components of learning. In addition, M1 activity early on day 1 was shown to be predictive of the degree of consolidation on day 2. Finally, functional connectivity between M1 and cerebellum in late learning points to their interaction as a mechanism underlying the long-term representation and expression of a well learned skill.

Thursday, July 01, 2010

More on testosterone and human trust.

Johnson and Breedlove offer a commentary on work by Bos et al. mentioned in my June 11 post. They note that:
Women who were already skeptical in their judgment of trustworthy faces did not change their judgment under the influence of testosterone (T). Rather, it was the 12 women who gave the highest ratings of trust under placebo who became significantly more skeptical after T treatment...Because endogenously produced T levels normally vary across time, these findings ... raise the question of whether fluctuating androgen secretion may normally modulate a person’s judgment of whether to trust people. There are circadian rhythms in T secretion, in both men and women, so is there also a circadian rhythm in how they judge trustworthiness in faces? There is also variation in circulating T in women across the menstrual cycle, with a modest peak in circulating T just a few days before ovulation, the very period during which copulation is most likely to result in pregnancy. What’s more, androgens such as T have been reported to boost women’s libido in several studies, including one study using the same sublingual dose of T, which increased sexual arousal. If androgens normally boost female libido, a peak in T before ovulation makes sense to evolutionary psychologists who might expect women to be most interested in sex when they are most fertile. What the present findings suggest is that women might also reach their peak in skepticism about the trustworthiness of other people, presumably including potential mates, at about this same point in the ovulatory cycle. Heightened skepticism about a potential mate’s trustworthiness also makes evolutionary sense in scenarios where a father’s ongoing support is crucial for survival of the infant.
The review also speculates on where T may be acting in the brain:
...the amygdala has been implicated in many studies of social judgment, including making judgments about other people’s faces, and it is also a hotspot for neurons expressing the androgen receptors that T acts upon to regulate gene expression (14, 15). Thus, it is possible that T may alter social judgments by acting directly on the amygdala, perhaps, the authors suggest, by regulating the strength of signaling between the amygdala and other brain regions implicated in social evaluation, such as the orbitofrontal cortex.

Figure - Potential model for hormonal effects on interpersonal trust. The amygdala (center) is active during fearful responses or detecting threat in faces, and many neurons there possess androgen receptors, enabling them to respond to T. Bos et al. (4) suggest that T may reduce interpersonal trust by acting on vasopressinergic neurons in the amygdala to increase communication to brainstem systems that activate fearful responses, while reducing communication to orbitofrontal cortex. Oxytocin boosts interpersonal trust, perhaps by exerting opposing effects on these same systems.

Prefrontal Reward Prediction Errors in Alcohol Dependence

Park et al. (in a collaboration involving, once again, Ray Dolan at University College London) find abnormal functional connectivity between striatum and dorsolateral prefrontal cortex in alcohol-dependent patients.
Patients suffering from addiction persist in consuming substances of abuse, despite negative consequences or absence of positive consequences. One potential explanation is that these patients are impaired at flexibly adapting their behavior to changes in reward contingencies. A key aspect of adaptive decision-making involves updating the value of behavioral options. This is thought to be mediated via a teaching signal expressed as a reward prediction error (PE) in the striatum. However, to exert control over adaptive behavior, value signals need to be broadcast to higher executive regions, such as prefrontal cortex. Here we used functional MRI and a reinforcement learning task to investigate the neural mechanisms underlying maladaptive behavior in human male alcohol-dependent patients. We show that in alcohol-dependent patients the expression of striatal PEs is intact. However, abnormal functional connectivity between striatum and dorsolateral prefrontal cortex (dlPFC) predicted impairments in learning and the magnitude of alcohol craving. These results are in line with reports of dlPFC structural abnormalities in substance dependence and highlight the importance of frontostriatal connectivity in addiction, and its pivotal role in adaptive updating of action values and behavioral regulation. Furthermore, they extend the scope of neurobiological deficits underlying addiction beyond the focus on the striatum.
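The striatal "teaching signal" the abstract refers to is the classic reward prediction error of reinforcement-learning models: the difference between the reward obtained and the reward expected, used to nudge the value estimate. A minimal delta-rule sketch (the learning rate and reward values here are invented for illustration, not taken from the paper):

```python
def update_value(value, reward, alpha=0.1):
    """Delta-rule update: move the value estimate toward the obtained
    reward by a fraction (alpha) of the prediction error."""
    prediction_error = reward - value  # the "teaching signal" (PE)
    return value + alpha * prediction_error, prediction_error

# Learn the value of an option that reliably pays 1.0
value = 0.0
for _ in range(200):
    value, pe = update_value(value, reward=1.0)
print(round(value, 3))  # converges toward 1.0
```

The patients' pattern, in these terms, would be a correctly computed error term that fails to propagate to the prefrontal circuitry that should act on the updated value.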

Wednesday, June 30, 2010

Dysregulation Nation

Judith Warner writes a nice piece in the NYTimes Magazine, in which she notes that problems of self-regulation — of appetite, emotion, impulse and cupidity — may well be the defining social pathology of our time. The ideas of Peter C. Whybrow at UCLA are referenced:
...Under normal circumstances, the emotional, reward-seeking, selfish, “myopic” part of our brain is checked and balanced in its desirous cravings by our powers of cognition — our awareness of the consequences, say, of eating too much or spending too much. But after decades of never-before-seen levels of affluence and endless messages promoting instant gratification...this self-regulatory system has been knocked out of whack. The “orgy of self-indulgence” that spread in our land of no-money-down mortgages, Whybrow wrote in his 2005 book, “American Mania: When More Is Not Enough,” has disturbed the “ancient mechanisms that sustain our physical and mental balance.”...If you put a person in an environment that worships wealth and favors conspicuous consumption, add gross income inequalities that breed envy and competition, mix in stagnant wages, a high cost of living and too-easy credit, you get overspending, high personal debt and a “treadmill-like existence,” as Whybrow calls it: compulsive getting and spending.

The “yawning void, an insatiable hunger, an emptiness waiting to be filled,” that Lasch identified as animating the typical narcissist of the 1970s has grown only deeper with the passage of time. The Great Recession was supposed to portend a scaling back, a recalibration of our lifestyle, and usher in a new era of making more of less. But the pressures that drive the dysregulated American haven’t abated any since the fall of 2008. Wall Street is resurgent, and unemployment is still high. For too many people, the cycle of craving and debt that drives our treadmill existence simply can’t be broken.

The classical art of memory

Andrea Becchetti notes a correspondence between modern understanding of memory formation in the hippocampus and techniques in the "Art of Memory" developed by Greek and Roman culture:
Formation and consolidation of declarative memories depend on the physiology of the hippocampal formation. Moreover, this brain structure determines the dynamic representation of the spatial environment, thus also contributing to spatial navigation through specialized cells such as “place” and “grid” cells (1). The most interesting recent results by Jacobs et al. (2) further this research field by showing that the entorhinal cortex contains path cells that represent direction of movement (clockwise or counterclockwise). The authors underscore that neurons in the entorhinal cortex encode multiple features of the environmental and behavioral context that can then be memorized by means of operations carried out by the hippocampus. They conclude by suggesting that a fuller characterization of these neurons’ properties and relation to the hippocampal circuit will be necessary to understand the neural basis of cognition. I fully agree with their conclusion and wish to comment on a further aspect of this complex issue, by considering psychological evidence that traces back to the ancient world and is generally neglected by modern neuroscience.

Greek and Roman culture has handed down to us the so-called “art of memory,” a set of methods aimed at improving one’s memory, described in detail by Cicero, Quintilianus, and others. The history of these concepts and their multifarious cultural meaning was masterfully treated by Rossi (3) and Yates (4). In brief, committing to memory long written pieces, word lists, series of numbers, etc. is greatly facilitated by proceeding as follows. First, one chooses a series of objects or places located in a (preferably familiar) spatial environment, such as the architecture details of a building or the landmarks of a certain route. Subsequently, these objects or places are mentally associated to the items to be remembered. The map of environmental images, which is easy to recall, thus provides a direct hint to the more abstract items. Moreover, proceeding along such a mental path directly provides the proper order of the sequence to be memorized (say a poem or speech). For example, Cicero used to associate the main points of his long speeches to specific buildings or other topographical reference points along the familiar route to the Roman Forum.
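The procedure described above amounts to a simple associative lookup: bind each item to a landmark on a well-known route, then traverse the route to recover the items in their proper order. A toy sketch (the route and the speech points are invented for illustration):

```python
# A familiar route, in walking order (the "memory palace").
route = ["front door", "fountain", "bakery", "temple steps", "forum"]

# Points of a speech, in the order they must be delivered.
points = ["greeting", "the charge", "the evidence", "the rebuttal", "the plea"]

# Encoding: associate each point with the next landmark on the route.
palace = dict(zip(route, points))

# Recall: mentally walk the route; each landmark gives back its point,
# and the route's order restores the speech's order.
recalled = [palace[landmark] for landmark in route]
print(recalled == points)  # True
```

The spatial map does the work the abstract items cannot: it is easy to recall and it carries the sequence for free.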

This method still constitutes the basis of modern mnemonics, which must be deeply rooted in neurology because it seems to be applied unawares even by mnemonists who have never heard of its existence. A famous example is the one described by Luria (5), who was himself unaware of the art of memory. Such venerable psychological evidence makes the neurophysiological association between orientation in space and declarative memory in the hippocampal formation even more suggestive. It supports the notion that the consolidation of human memory is guided by a partially preconfigured system related to external space representation, which may be the evolutionary basis of memory processing of more abstract entities in complex brains. These considerations may also have heuristic value in suggesting how the entorhinal cortex, the hippocampus, and the neocortex interplay during memory consolidation of complex abstract issues.

1. Moser EI, Kropff E, Moser MB (2008) Place cells, grid cells, and the brain’s spatial representation system. Annu Rev Neurosci 31:69–89.
2. Jacobs J, Kahana MJ, Ekstrom AD, Mollison MV, Fried I (2010) A sense of direction in human entorhinal cortex. Proc Natl Acad Sci USA 107:6487–6492.
3. Rossi P (2006) Logic and the Art of Memory (Continuum International, London).
4. Yates F (1992) The Art of Memory (Pimlico, London).
5. Luria AR (1986) The Mind of a Mnemonist: A Little Book About a Vast Memory (Harvard Univ Press, Cambridge, MA).

Tuesday, June 29, 2010

Binge drinking and the adolescent brain.

Deficits in hippocampus-associated cognitive tasks are observed in alcoholic humans. Taffe et al. show that binge drinking in adolescent macaque monkeys causes long-lasting decreases in hippocampal cell division, turnover, and migration. Their results:
...demonstrate that the hippocampal neurogenic niche during adolescence is highly vulnerable to alcohol and that alcohol decreases neuronal turnover in adolescent nonhuman primate hippocampus by altering the ongoing process of neuronal development. This lasting effect, observed 2 mo after alcohol discontinuation, may underlie the deficits in hippocampus-associated cognitive tasks that are observed in alcoholics.

Being hungry or full influences our risk taking.

Dolan's group has done some neat experiments showing that having a full stomach makes us more risk averse in monetary decisions. We act just like other animals, which often prefer risky (more variable) food sources when below a metabolic reference point (hungry) and safe (less variable) food sources when sated. We follow an ecological model of feeding behavior, not the behavior predicted by normative economic theory. Thus hormone levels that reflect our metabolic state (ghrelin, signaling acute nutrient intake, and leptin, providing an assay of energy reserves) can, like oxytocin and testosterone, influence our economic choices.
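The ecological pattern referred to here is often summarized by the "energy-budget rule" of risk-sensitive foraging theory: take the safe option when it covers your requirement, gamble on the variable one when it doesn't. A schematic sketch (the numbers are invented, not drawn from the study):

```python
def prefer_risky(safe_gain, requirement):
    """Energy-budget rule (schematic): if the safe option meets the
    animal's requirement, take it; otherwise only a lucky draw from
    the variable option can cover the shortfall, so gamble."""
    return safe_gain < requirement

# Sated: low requirement, the safe source suffices -> risk averse
print(prefer_risky(safe_gain=5, requirement=3))  # False

# Hungry: requirement exceeds the safe payoff -> risk prone
print(prefer_risky(safe_gain=5, requirement=8))  # True
```

The hormonal finding slots into this rule by letting ghrelin and leptin stand in for where the body currently sits relative to its reference point.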

Monday, June 28, 2010

MRI can decode subjective, but not objective, memories

In experiments that cast further doubt on claims of lie detection by fMRI measurements, Rissman et al. find that subjective memory states can be decoded accurately under controlled experimental conditions, but that fMRI has uncertain utility for objectively detecting an individual's past experiences. Here is a nice summary of their work from Gilbert Chin:
As a consequence of recent investigations that have used sophisticated methods of analyzing brain activity to propose that objective lie detection may be feasible, it has become apparent that designing a task in which subjects lie whole-heartedly and voluntarily (as opposed to being instructed to do so every fifth answer, for instance) is a nontrivial undertaking. Rissman et al. have approached this challenge by adapting a well-established laboratory paradigm—that of face recognition—to conditions that approximate those of quotidian experience. They asked subjects to study 200 faces and then interrogated them 1 hour later, using a mix of new and old test faces. The menu of responses offered a choice of (i) definitely remembered; (ii–iii) high and low confidence that the face was familiar; and (iv–v) high and low confidence that the face was new.
An analysis of brain activity during the response phase revealed distinctive patterns when old (that is, previously studied) faces were rated by the subject as definitely remembered versus strongly familiar, and also when they were rated as being strongly versus weakly familiar. In contrast, for faces rated as being weakly unfamiliar, it was not possible to tell from the neural activity patterns which were actually new and which had been seen during the study phase, and for weakly familiar faces, the new/old distinction was achievable only some of the time. Furthermore, if subjects were instead told to rate attractiveness during the study phase and then asked to categorize faces by gender during the response phase, it was not possible to diagnose which faces were new and which were not. Taken together, these findings suggest that brain activity reflects subjective, rather than objective, face recognition.
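The "distinctive patterns" analysis described above is, at bottom, multivariate pattern classification: label each trial's activity pattern by the subject's response, then ask whether held-out patterns are separable by label. A toy nearest-centroid sketch on fabricated "voxel" patterns (nothing here is from the paper's actual pipeline):

```python
def centroid(patterns):
    """Mean activity pattern across trials (elementwise average)."""
    n = len(patterns)
    return [sum(vals) / n for vals in zip(*patterns)]

def classify(pattern, centroids):
    """Assign the pattern to the label of the nearest centroid
    (squared Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(pattern, centroids[label]))

# Fabricated 3-voxel "patterns" for two subjective memory states
remembered = [[1.0, 0.2, 0.1], [0.9, 0.3, 0.2]]
familiar = [[0.2, 0.9, 0.8], [0.1, 1.0, 0.9]]
centroids = {"remembered": centroid(remembered), "familiar": centroid(familiar)}

print(classify([0.95, 0.25, 0.15], centroids))  # "remembered"
```

The study's point maps onto this picture directly: patterns separate well when labeled by the subject's own ratings, and poorly when labeled by the objective old/new status of the face.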

How we read the minds of others.

Tamir et al. report some interesting MRI studies suggesting that understanding the mental states of others starts from self-perception as an anchor, from which serial adjustments toward the other person are made:
Recent studies have suggested that the medial prefrontal cortex (MPFC) contributes both to understanding the mental states of others and to introspecting about one's own mind. This finding has suggested that perceivers might use their own thoughts and feelings as a starting point for making inferences about others, consistent with “simulation” or “self-projection” views of social cognition. However, perceivers cannot simply assume that others think and feel exactly as they do; social cognition also must include processes that adjust for perceived differences between self and other. Recent cognitive work has suggested that such correction occurs through a process of “anchoring-and-adjustment” by which perceivers serially tune their inferences from an initial starting point based on their own introspections. Here, we used functional MRI to test two predictions derived from this anchoring-and-adjustment view. Participants (n = 64) used a Likert scale to judge the preferences of another person and to indicate their own preferences on the same items, allowing us to calculate the discrepancy between the participant's answers for self and other. Whole-brain parametric analyses identified a region in the MPFC in which activity was related linearly to this self–other discrepancy when inferring the mental states of others. These findings suggest both that the self serves as an important starting point from which to understand others and that perceivers customize such inferences by serially adjusting away from this anchor.


Figure - The relation between BOLD response and self–other discrepancy during Other trials was calculated separately for subregions of the MPFC. Although the response of dorsal MPFC (A) increased linearly with increasing self–other discrepancy, the response of ventral MPFC (B) distinguished only between trials on which self–other discrepancy was zero (overlap between self and other) versus greater than zero (discrepancy between self and other). Error bars indicate the SEM.
A bit more on the actual experimental design:
Although the specific design of the four experiments differed slightly, each required participants to answer a series of questions about their opinions and preferences and to judge how other individuals would answer the same questions. On each trial, participants saw a cue that indicated the target of the judgment (self or another person) and a brief phrase (e.g., “enjoy winter sports such as skiing or snowboarding”; “fear speaking in public”). Participants used either a four- or five-point scale either to report how well the statement described themselves or to judge how well it described the other person. Within each experiment, participants considered the same set of statements for self and other.

Before scanning, participants were told that the purpose of the experiment was to examine how people make inferences about target individuals on the basis of minimal or no information. In all studies, targets were college-aged individuals depicted by a photograph downloaded from an internet dating website, although the specific identity of individuals varied across studies.
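The key quantity in the whole-brain parametric analysis is simple: a trialwise self–other discrepancy computed from the two Likert ratings of the same item. A sketch with invented ratings (the schematic "dorsal" vs. "ventral" coding mirrors the figure above; none of these numbers come from the paper):

```python
def discrepancies(self_ratings, other_ratings):
    """Trialwise |self - other| discrepancy on a shared Likert scale."""
    return [abs(s - o) for s, o in zip(self_ratings, other_ratings)]

# Invented 1-5 Likert ratings for the same five items
self_ratings = [5, 2, 4, 1, 3]
other_ratings = [5, 4, 1, 1, 5]
d = discrepancies(self_ratings, other_ratings)
print(d)  # [0, 2, 3, 0, 2]

# The two response profiles, schematically:
dorsal_mpfc = d                        # scales linearly with discrepancy
ventral_mpfc = [int(x > 0) for x in d]  # only codes overlap vs. no overlap
print(ventral_mpfc)  # [0, 1, 1, 0, 1]
```

The linear regressor is what lets the authors say dorsal MPFC activity tracks *how far* the perceiver has to adjust away from the self-anchor, not merely *whether* an adjustment is needed.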

Friday, June 25, 2010

"The Singularity" - and Singularity University

I've been meaning to point to this interesting New York Times article, on techno-utopian Singularity University (whose sponsors include Google co-founders Sergey Brin and Larry Page), which aims to enhance and prepare us for the arrival of "The Singularity" — "a time, possibly just a couple decades from now, when a superior intelligence will dominate and life will take on an altered form that we can’t predict or comprehend in our current, limited state." The article focuses on Raymond Kurzweil, the inventor and businessman who is the Singularity’s most ubiquitous spokesman:
...(who, in August)..will begin a cross-country multimedia road show to promote “Transcendent Man,” a documentary about his life and beliefs. Another of his projects, “The Singularity Is Near: A True Story About the Future,” has also started to make its way around the film festival circuit....some Singularitarians aren’t all that fond of Mr. Kurzweil...“I think he’s a genius and has certainly brought a lot of these ideas into the public discourse,” says James J. Hughes, the executive director of the Institute for Ethics and Emerging Technologies, a nonprofit that studies the implications of advancing technology. “But there are plenty of people that say he has hijacked the Singularity term.”

Some of the Singularity’s adherents portray a future where humans break off into two species: the Haves, who have superior intelligence and can live for hundreds of years, and the Have-Nots, who are hampered by their antiquated, corporeal forms and beliefs....“The Singularity is not the great vision for society that Lenin had or Milton Friedman might have,” says Andrew Orlowski, a British journalist who has written extensively on techno-utopianism. “It is rich people building a lifeboat and getting off the ship.”

Despite all of the zeal behind the movement, there are those who look askance at its promises and prospects...Jonathan Huebner, for example, is often held up as Mr. Kurzweil’s foil. A physicist who works at the Naval Air Warfare Center as a weapons designer, he, like Mr. Kurzweil, has compiled his own cathedral of graphs and lists of important inventions. He is unimpressed with the state of progress and, in 2005, published in a scientific journal a paper called “A Possible Declining Trend for Worldwide Innovation.”..Measuring the number of innovations divided by the size of the worldwide population, Dr. Huebner contends that the rate of innovation peaked in 1873. Or, based on the number of patents in the United States weighed against the population, he found a peak around 1916. (Both Dr. Huebner and Mr. Kurzweil are occasionally teased about their faith in graphs.)

Presidential Harrisment

A clip from an article by Steve Mirsky in the June issue of Scientific American:
In early March, Harris Interactive conducted an on-line survey to gauge the attitudes of Americans toward President Barack Obama. The Harris Poll generated some fascinating data. For example, 40 percent of those polled believe Obama is a socialist. (He’s not—ask any socialist.) Thirty-two percent believe he is a Muslim. (I had predicted that a Mormon, Jew, Wiccan, atheist and Quetzalcoatl worshipper would become president before America elected a Muslim, so a third of this country actually may be quite open-minded, in an obtuse way.) Also, 14 percent believe that Obama may be the Antichrist. Of those who identified themselves as Republicans, 24 percent think Obama might be.

Thursday, June 24, 2010

Concert pianists as genius models?

Not likely in my case.... but I will mention an article by Charles Ambrose in the July-August issue of American Scientist, pointed out to me by a friend who is a loyal MindBlog reader. I started this blog post as I plunged into reading the article, assuming I would be passing on some juicy clips, but alas I have to report coming up short of much substance - although the article is worth a link for its review of brain plasticity and its notes on specific brain changes associated with the development of various skilled activities. Ambrose mentions the enlarged areas in the parietal lobe found in Albert Einstein's brain, and then goes on to note other examples of brain areas that increase with expertise, as for example in professional musicians, who have enlarged areas in their auditory cortex. The article doesn't even begin to engage the teaser sentence at its beginning: "What accounts for highly intelligent and greatly gifted individuals?" and is disjointed and wandering enough that I'm surprised the editors at American Scientist let it through their filters.

Antipsychotic drug shrinks the brain

It turns out that haloperidol, a commonly prescribed antipsychotic drug, shrinks the brain within hours of administration, specifically diminishing grey-matter volume in the striatum — a region that mediates movement. The effect is reversible. The Meyer-Lindenberg group that did the study suggests that, by acting on dopamine D2 receptors, the drug may downsize synaptic connections and thus cause the lapses in motor control that affect many patients on antipsychotics.

Wednesday, June 23, 2010

Sense of Wonder

As we age our brains become so stuffed with our history that we lose the capacity to open to novel experiences, to sense things with the naive freshness of a child. I pass on this brief fable by Richard A. Lovett in the June 3 issue of Nature about a miraculous cure for this condition:
Clay Nadir wanted a book for the beach. Not just any book, but the type that makes you forget the beach, other than as the place where you discovered Jack London or Sherlock Holmes or Norman Mailer. But even on the shelves of his city's largest bookstore, he wasn't finding anything. It was as though everything new had long ago been stuffed into his brain.

Maybe he was jaded. Maybe, once you'd worked your way through Agatha Christie, no manor house would ever again hold your attention. And was The Time Machine really that good, or had it simply been a first, both for Clay and the world?

Nature writing, westerns, mountaineering, ghost stories, dysfunctional families ... literarily, Clay had been there, done that. In the past hour he'd wandered through fantasy, mystery, biography and what a friend called 'Qual. Lit.' — a conceited term if ever there was one. Quality literature, ha! As though any genre had a monopoly. Not to mention that once you'd read Nabokov and Woolf and Joyce, you could get as jaded with that stuff as anything else. Even Shakespeare you could eventually memorize.

Maybe he should try romance. He'd never dabbled in it before, so at least it would be different.

Then, in the occult section, something caught his eye. It was an odd book: black, with a red, spiral vortex on the cover. It made him think of Hitchcock's Vertigo. Now that was a movie: Jimmy Stewart and Kim Novak in a deceptively simple story you had to see several times to fully grasp. But once you did, so many other movies seemed so ... trivial.

The book also made him think of something from his youth. Something to do with an old TV show. What was it called? Oh yes, The Time Tunnel. Each week, they'd spun this thing like a giant pinwheel and run off to some distant era. Probably unbelievably stupid if he watched it today, but at the time it hit him like his first viewing of Doctor Who, another show involving a time vortex, plus a lot of other things he'd never seen before.

There was no author listed, and as he picked up the book, he seemed to be falling into the vortex. On the back was a simple endorsement: 'Guaranteed to restore your sense of wonder'.

Yeah, right. He'd heard that one before.

He opened it but there was no preface, no introduction, no writing at all. Just more spirals, one to a page, these in black-and-white.

He nearly set it back down, but the sense of being sucked in was too strong. It was as though the entire room were spinning: just what Jimmy Stewart's character must have felt as he looked into the depths ... Dizzying enough that Clay no longer wanted to think about Stewart or Hitchcock or old TV shows.

There was a white circle in the centre of the first spiral. Peering into it he saw flickers of motion: barely remembered images of Jimmy Stewart, Kim Novak and maybe The Time Tunnel.

With an effort, he turned the page. Another spiral, again sucking him in. This time, he saw words.

Call me Ishmael.

It was the best of times ...

In the beginning God created the heavens and earth.

Once upon a time there was a Martian named Valentine Michael Smith.

To be, or not to be ...

Rather than simply reading them, he felt as though the words were being pulled from him, faster and faster. He turned another page and another and another. It wasn't just words and videos. There were also stills: a stern-looking couple with a pitchfork; a woman with an odd half-smile. Guitar riffs, symphonies, something about Lucy in the sky with a yellow submarine. Names for these would tickle his memory then be gone, often faster than he could grasp what they had been. Something about whistling and moaning. Something about singing insects.

Then, it was over. Clay had no idea how long he'd been staring at the book. All he knew was that he'd flipped through most of the pages, but not all. He looked at the next few, but they were simply spirals. Still dizzying, but not like before. He flipped back, but there were no longer any words or images. Just paper.

There was no price tag on the book. Clay wondered briefly why it had been so captivating. Maybe he'd merely let his blood sugar dip too low. Maybe the spirals caught him off guard.

He put the book back where he'd found it, on a countertop beside a computer terminal where customers could check the store's inventory. It was as though a prior reader, if that was the proper term for the peruser of such a book, had placed it there, easy to find.

Clay still needed something for the beach.

He wandered the store, more or less at random, until he fetched up in the mystery section. Mysteries were fun, he thought, although he didn't know why. There were a lot of books and he couldn't remember which ones he'd read, so he picked the first that caught his eye.

'It was a dark and stormy night,' he read. Wow, he breathed, and was instantly hooked.

The Devil's grimace.

An interesting fragment by Gisela Telis in ScienceNow:
When 15th-century Europeans first landed on the Bahamas, Cuba, and Hispaniola, they met with the "devil's grimace." That's what these foreigners dubbed the faces with bared teeth that adorned everything from necklaces to ceremonial bowls created by the native Taíno people. European chroniclers interpreted the motif as a ferocious animal's snarl or a skull's grimace, signs of the heathen islanders' aggression. But they were wrong, researchers report in the latest issue of Current Anthropology. By studying teeth-baring in humans, chimps, and rhesus macaques and comparing these to the Taíno depictions, scientists determined that open-lipped, closed-jaw displays show submission, benign intent, and even happiness—but not aggression. So the "fiendish" faces that so troubled Europeans were most likely just smiling, to signal—ironically enough—social cohesion and connection.

Tuesday, June 22, 2010

Oxytocin: out-group aggression, social cues, and amygdalar action

Three recent papers are a reflection of the recent outpouring of work on oxytocin (a peptide hormone containing the 9 amino acids shown in the figure), which (from Miller's review):
...promotes social bonding in a wide range of animals, including humans. Sold on the Internet in a formulation called "Liquid Trust," the peptide hormone is marketed as a romance enhancer and sure ticket to business success. Australian therapists are trying it alongside counseling for couples with ailing marriages. And police and military forces reportedly are interested in its potential to elicit cooperation from crime suspects or enemy agents.
The hormone is now being found to have a prickly side, and is coming to be regarded as much more than just a touchy-feely "trust hormone." De Dreu et al. have designed experiments to demonstrate that oxytocin drives a "tend and defend" response in that it promotes in-group trust and cooperation, and defensive, but not offensive, aggression toward competing out-groups.

In another study on oxytocin, Gamer et al. add to studies that have shown that oxytocin decreases aversive reactions to negative social stimuli, and find that subjects given oxytocin, relative to subjects given placebo, are more likely to make eye movements toward the eye region when viewing images of human faces. They find that subregions of the amygdala are important in mediating this effect. Oxytocin:
...attenuated activation in lateral and dorsal regions of the anterior amygdala for fearful faces but enhanced activity for happy expressions, thus indicating a shift of the processing focus toward positive social stimuli. On the other hand, oxytocin increased the likelihood of reflexive gaze shifts toward the eye region irrespective of the depicted emotional expression. This gazing pattern was related to an increase of activity in the posterior amygdala and an enhanced functional coupling of this region to the superior colliculi. Thus, different behavioral effects of oxytocin seem to be closely related to its specific modulatory influence on subregions within the human amygdala.
These findings have implications for understanding the role of oxytocin in normal social behavior as well as the possible therapeutic impact of oxytocin in brain disorders characterized by social dysfunction.

The Wayward Mind

Continuing my review of old posts that have popped into my head over the past few days during mulling over this and that, I am reproducing my March 6, 2006 post in its entirety: 

I want to mention the excellent book by Guy Claxton - THE WAYWARD MIND, an intimate history of the unconscious (2005, Little, Brown, and Co., available from amazon.com). Here is an excerpt and paraphrase from pp. 348-252:
"What we call our "self" is an agglomeration of both conscious and unconscious ingredients, cans, needs, dos, oughts, thinks - the temptation is to assume that the "I" is the same in all of them - so that instead of having an intricate web of things that make me ME, I have to create a single imaginary hub around which they all revolve, to which they all refer - the attempt to keep this fiction going, to "hold it together" can become quite tiring and bothersome - If "I" am essentially reasonable, if I imagine that my zones of control - over my own feelings for example - are wider and more robust than they are, then I am going to get in a tangle trying to "control myself." If I have decided that who I am is clever, attractive, athletic, stable, creating the hub of "I" locks everything together and prevents it moving. It stops Me expanding to include the unconscious, or graciously shrinking to accommodate old age. I cannot enjoy my waywardness, nor see it as an intrinsic part of ME - (note: he gives Ramachandran's two foot nose pinocchio demonstration as evidence of plasticity of self image), and then says - The orthodox sense of self is thrown by such experiences, and tends to suffer a sense-of-humour failure. It sees all waywardness as an affront, and tends to become earnest or myopic in response. In a nutshell: it is bad enough to have a nightmare, without your rattled sense of self telling you that you are going mad. Weird experience can never be just funny (as the pinocchio effect can be) or matter-of-fact (as possession is in Bali), or transiently inconvenient (as a bad dream is), or wonderful (as a mystical experience can be), or just mysterious (as a premonition might be). For the locked-up self they have to be denied, explained or dealt with. All the evidence is that a more relaxed attitude toward the bounds of self makes for a richer, easier and more creative life.
Perhaps, after all, waywardness in all its forms is in need not so much of explanation, but of a mystified but friendly welcome. We can explain it if we wish, and the brain is beginning to do a reasonable job. But the need to explain, when not motivated by the dispassionate curiosity of the scientist, is surely a sign of anxiety: of the desire to tame with words that which is experienced as unsettling.

Monday, June 21, 2010

Followup on acupuncture

Following the "Acupuncture's secret revealed?" post on 6/17 a reader sent me two interesting links that I want to pass on.  Body in Mind reviews a paper in Journal of Pain that finds no significant effect of physician training or expertise on outcome, and notes studies that find no difference between needle placement at classical acupuncture points and randomly placed needles, and also that patient expectations about acupuncture influence outcome. And, Eric Mead briefly lectures on "The Magic of the Placebo."

Is life worth living?

Philosopher Peter Singer, in his "Should this be the last generation" query, offers philosophical rambling of the sort that drives me up the wall, a prime example of one of the things our brains were definitely not designed to do. My bottom line is that I refuse to get excited about existential issues ("Is life worth living", etc.) beyond those I think my two Abyssinian cats would find compelling...food, shelter, a place to poop, and sex. I don't think the human overlay on top of that has shown much competence with 'the meaning of it all' questions. In this territory we are like the dog being asked to understand quantum physics. Singer starts by noting 19th century German philosopher Schopenhauer's pessimism:
...the best life possible for humans is one in which we strive for ends that, once achieved, bring only fleeting satisfaction. New desires then lead us on to further futile struggle and the cycle repeats itself.
He then goes on to note more recent arguments from philosopher David Benatar:
To bring into existence someone who will suffer is, Benatar argues, to harm that person, but to bring into existence someone who will have a good life is not to benefit him or her...Hence continued reproduction will harm some children severely, and benefit none.

Benatar also argues that human lives are, in general, much less good than we think they are....we are, in Benatar’s view, victims of the illusion of pollyannaism. This illusion may have evolved because it helped our ancestors survive, but it is an illusion nonetheless. If we could see our lives objectively, we would see that they are not something we should inflict on anyone.

...the people who will be most severely harmed by climate change have not yet been conceived. If there were to be no future generations, there would be much less for us to feel guilty about....So why don't we make ourselves the last generation on earth? If we would all agree to have ourselves sterilized then no sacrifices would be required — we could party our way into extinction!
Singer does end on a more upbeat note:
I do think it would be wrong to choose the non-sentient universe. In my judgment, for most people, life is worth living. Even if that is not yet the case, I am enough of an optimist to believe that, should humans survive for another century or two, we will learn from our past mistakes and bring about a world in which there is far less suffering than there is now. But justifying that choice forces us to reconsider the deep issues with which I began. Is life worth living? Are the interests of a future child a reason for bringing that child into existence? And is the continuance of our species justifiable in the face of our knowledge that it will certainly bring suffering to innocent future human beings?