Friday, January 29, 2010

A clever treatment for tinnitus

Okamoto et al. find that they can reduce brain activity related to tinnitus by exposing chronic tinnitus patients to self-chosen, enjoyable music, which has been modified (“notched”) to contain no energy in the frequency range surrounding the individual tinnitus frequency:

Maladaptive auditory cortex reorganization may contribute to the generation and maintenance of tinnitus. Because cortical organization can be modified by behavioral training, we attempted to reduce tinnitus loudness by exposing chronic tinnitus patients to self-chosen, enjoyable music, which was modified (“notched”) to contain no energy in the frequency range surrounding the individual tinnitus frequency. After 12 months of regular listening, the target patient group (n = 8) showed significantly reduced subjective tinnitus loudness and concomitantly exhibited reduced evoked activity in auditory cortex areas corresponding to the tinnitus frequency compared to patients who had received an analogous placebo notched music treatment (n = 8). These findings indicate that tinnitus loudness can be significantly diminished by an enjoyable, low-cost, custom-tailored notched music treatment, potentially via reversing maladaptive auditory cortex reorganization.
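The core signal-processing step here, removing all energy in a band around the patient's tinnitus frequency, can be sketched with a simple FFT-based filter. This is only an illustration of the idea, not the authors' actual procedure: the study tailored the notch to each patient, and the `half_width_hz` parameter below is an assumption for the sketch.

```python
import numpy as np

def notch_audio(samples, sample_rate, tinnitus_hz, half_width_hz=500.0):
    """Zero out spectral energy in a band around the tinnitus frequency.

    samples: 1-D float array of mono audio.
    half_width_hz: half the width of the notch, in Hz (a fixed band is
    assumed here for simplicity; the study matched it to each patient).
    """
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    band = (freqs >= tinnitus_hz - half_width_hz) & (freqs <= tinnitus_hz + half_width_hz)
    spectrum[band] = 0.0                       # remove all energy in the notch
    return np.fft.irfft(spectrum, n=len(samples))
```

For a long music recording one would apply this in windowed, overlap-added chunks rather than to the whole file at once, but the principle is the same.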

Thursday, January 28, 2010

Social structure influences language structure.

Lupyan and Dale do a statistical analysis of over 2,000 languages to show that

...languages spoken by large groups have simpler inflectional morphology than languages spoken by smaller groups, as measured on a variety of factors such as case systems and complexity of conjugations. Additionally, languages spoken by large groups are much more likely to use lexical strategies in place of inflectional morphology to encode evidentiality, negation, aspect, and possession.
This suggests that
...just as biological organisms are shaped by ecological niches, language structures appear to adapt to the environment (niche) in which they are being learned and used.

Wednesday, January 27, 2010

Resting brain default mode activity under genetic control

From Glahn et al:

The default-mode network, a coherent resting-state brain network, is thought to characterize basal neural activity. Aberrant default-mode connectivity has been reported in a host of neurological and psychiatric illnesses and in persons at genetic risk for such illnesses. Whereas the neurophysiologic mechanisms that regulate default-mode connectivity are unclear, there is growing evidence that genetic factors play a role. In this report, we estimate the importance of genetic effects on the default-mode network by examining covariation patterns in functional connectivity among 333 individuals from 29 randomly selected extended pedigrees. Heritability for default-mode functional connectivity was 0.424 ± 0.17 (P = 0.0046). Although neuroanatomic variation in this network was also heritable, the genetic factors that influence default-mode functional connectivity and gray-matter density seem to be distinct, suggesting that unique genes influence the structure and function of the network. In contrast, significant genetic correlations between regions within the network provide evidence that the same genetic factors contribute to variation in functional connectivity throughout the default mode. Specifically, the left parahippocampal region was genetically correlated with all other network regions. In addition, the posterior cingulate/precuneus region, medial prefrontal cortex, and right cerebellum seem to form a subnetwork. Default-mode functional connectivity is influenced by genetic factors that cannot be attributed to anatomic variation or a single region within the network. By establishing the heritability of default-mode functional connectivity, this experiment provides the obligatory evidence required before these measures can be considered as endophenotypes for psychiatric or neurological illnesses or to identify genes influencing intrinsic brain function.



Fig. 1 (A) Group-ICA map of the default-mode network derived from resting state scans of 333 individuals from large extended pedigrees. (B) Significant genetic correlations for functional connectivity between heritable regions in the default-mode network. The left parahippocampal gyrus (green) was genetically correlated with the posterior cingulate/precuneus (yellow), medial prefrontal (blue), right cerebellar (red), and right temporal-parietal (pink) regions. In addition, the posterior cingulate/precuneus, medial prefrontal, and right cerebellar regions form a circuit influenced by the same genetic factors. (C) Significant environmental correlations between these same regions.
The pattern of significant environmental correlations differed dramatically from those of the genetic correlations (Fig. 1C). Environmental correlations typically result from unmeasured aspects of the environment or correlated measurement errors. The right temporal–parietal region was significantly correlated with the posterior cingulate/precuneus and medial prefrontal cortex. Neither of these regions showed significant genetic correlations. In addition, the right cerebellum and medial prefrontal cortex had a significant environmental correlation.

Distinguishing Democrats and Republicans from their faces.

Here is a quirky item... it turns out that we can guess someone's political affiliation more accurately than chance by looking at a photograph. Faces perceived as more powerful are more likely to be judged Republican, and faces perceived as warmer are more likely to be judged Democrat!

Tuesday, January 26, 2010

The secret life of chaos.

The BBC has produced a beautiful program on chaos theory. The mathematics of chaos can explain how and why the universe creates exquisite order and pattern, transforming simplicity into complexity. A YouTube version can be viewed here; the BBC web version plays only in the UK. The clip below is the first installment in the six-part series.

Thinking of God moves attention

Here is an interesting tidbit:

The concepts of God and Devil are well known across many cultures and religions, and often involve spatial metaphors, but it is not well known if our mental representations of these concepts affect visual cognition. To examine if exposure to divine concepts produces shifts of attention, participants completed a target detection task in which they were first presented with God- and Devil-related words. We found faster RTs when targets appeared at compatible locations with the concepts of God (up/right locations) or Devil (down/left locations), and also found that these results do not vary by participants’ religiosity. These results indicate that metaphors associated with the divine have strong spatial components that can produce shifts of attention, and add to the growing evidence for an extremely robust connection between internal spatial representations and where attention is allocated in the external environment.

Monday, January 25, 2010

Speech perception requires motor system activation.

Yuen et al. find that specific articulatory commands are activated automatically and involuntarily during speech perception, and suggest, in a broader framework, that perception of action entails activation of the motor system. Their behavioral evidence backs up functional MRI studies demonstrating that the brain regions involved in perceiving speech overlap with those involved in producing it. They reasoned that if articulatory information is activated during speech perception, it should interfere with articulation when participants are asked to produce a target syllable while listening to a different auditory distractor, introducing particular distortions of the target syllable that reflect the articulatory properties of the distractor. Here is their abstract:

Emerging neurophysiologic evidence indicates that motor systems are activated during the perception of speech, but whether this activity reflects basic processes underlying speech perception remains a matter of considerable debate. Our contribution to this debate is to report direct behavioral evidence that specific articulatory commands are activated automatically and involuntarily during speech perception. We used electropalatography to measure whether motor information activated from spoken distractors would yield specific distortions on the articulation of printed target syllables. Participants produced target syllables beginning with /k/ or /s/ while listening to the same syllables or to incongruent rhyming syllables beginning with /t/. Tongue–palate contact for target productions was measured during the articulatory closure of /k/ and during the frication of /s/. Results revealed “traces” of the incongruent distractors on target productions, with the incongruent /t/-initial distractors inducing greater alveolar contact in the articulation of /k/ and /s/ than the congruent distractors. Two further experiments established that (i) the nature of this interference effect is dependent specifically on the articulatory properties of the spoken distractors; and (ii) this interference effect is unique to spoken distractors and does not arise when distractors are presented in printed form. Results are discussed in terms of a broader emerging framework concerning the relationship between perception and action, whereby the perception of action entails activation of the motor system.

Friday, January 22, 2010

Rats can learn a cooperation game.

Yet another supposed barrier between human and animal smarts has fallen: the assumption that only humans have the cognitive capabilities to play the famous 'Prisoner's Dilemma' game, i.e., to engage in reciprocity, which requires numerical discrimination, memory, and control of temporal discounting. Viana et al.:

We use an iterated PD game to test rats (Rattus norvegicus) for the presence of such cognitive abilities by manipulating the strategy of the opponent, Tit-for-Tat and Pseudo-Random, or the relative size of the temptation to defect. We found that rats shape their behaviour according to the opponent's strategy and the relative outcome resulting from cooperative or defective moves. Finally, we show that the behaviour of rats is contingent upon their motivational state (hungry versus sated).

Thursday, January 21, 2010

Watching our brain decide when it has enough information

Once we think we have sufficient data for a decision, our brains constrain the accumulation of additional information. de Lange et al. actually view this process using magnetoencephalography (MEG, which records the weak magnetic signals generated by brain activity):

In the last decade, great progress has been made in characterizing the accumulation of neural information during simple unitary perceptual decisions. However, much less is known about how sequentially presented evidence is integrated over time for successful decision making. The aim of this study was to study the mechanisms of sequential decision making in humans. In a magnetoencephalography (MEG) study, we presented healthy volunteers with sequences of centrally presented arrows. Sequence length varied between one and five arrows, and the accumulated directions of the arrows informed the subject about which hand to use for a button press at the end of the sequence (e.g., LRLRR should result in a right-hand press). Mathematical modeling suggested that nonlinear accumulation was the rational strategy for performing this task in the presence of no or little noise, whereas quasilinear accumulation was optimal in the presence of substantial noise. MEG recordings showed a correlate of evidence integration over parietal and central cortex that was inversely related to the amount of accumulated evidence (i.e., when more evidence was accumulated, neural activity for new stimuli was attenuated). This modulation of activity likely reflects a top–down influence on sensory processing, effectively constraining the influence of sensory information on the decision variable over time. The results indicate that, when making decisions on the basis of sequential information, the human nervous system integrates evidence in a nonlinear manner, using the amount of previously accumulated information to constrain the accumulation of additional evidence.
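The linear-versus-nonlinear distinction can be made concrete with a toy accumulator. In the sketch below (my illustration, not the authors' model), each arrow nudges a decision variable up or down; capping the variable at a bound makes additional congruent evidence count progressively less, a crude stand-in for the attenuation seen in the MEG signal.

```python
def decide(arrows, bound=None):
    """Toy evidence accumulator for a sequence of 'L'/'R' arrows.

    Each arrow adds +1 (R) or -1 (L) to a decision variable.  With
    bound=None accumulation is linear (a simple running sum); with a
    finite bound the variable saturates, so further congruent evidence
    has progressively less effect -- a crude nonlinear accumulation.
    Returns 'R' or 'L' from the sign of the final decision variable.
    """
    dv = 0.0
    for a in arrows:
        dv += 1.0 if a == 'R' else -1.0
        if bound is not None:                  # saturate: attenuate further input
            dv = max(-bound, min(bound, dv))
    return 'R' if dv > 0 else 'L'
```

Both modes map the abstract's example sequence LRLRR to a right-hand response; the two strategies diverge only for longer, noisier sequences.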

Wednesday, January 20, 2010

One thing that doesn't deteriorate as we age!

Kadota and Gomi find that our speed of detecting visual stimuli in our peripheral visual field during reaching movements shows little decay with aging, in contrast to other visual functions.

It is well established that humans can react more quickly to a visual stimulus in the visual field center than to one in the visual periphery and that the reaction to a stimulus in the visual periphery markedly deteriorates with aging. These tendencies are true in conventional discrimination-reaction tasks. Surprisingly, however, we found that they are entirely different when reactions are induced by the same visual stimuli during reaching movements. The reaction time for a stimulus in the visual periphery was significantly faster than in the central vision, and age-related slowing of reactions to the stimulus in the visual periphery was quite small, compared to that observed in the conventional reaction tasks. This inconsistent slowing of reactions in different motor conditions underscores a distinctive visuomotor pathway for online control, which is more robust against age-related deterioration.

Tuesday, January 19, 2010

Why I am a snowbird...

Scenes from the yard in my Wisconsin home. Beautiful, but chilly. 



A thought-to-speech machine.

When I was a post-doctoral fellow in the Neurobiology Dept. at Harvard Medical School in 1967-68, I regularly attended tea-time discussions in the Hubel and Wiesel laboratory (these two later received a Nobel Prize for their work on how the visual cortex works). I recall being astounded by their discussions of experiments with microelectrodes implanted in a monkey brain, which were finding that almost any nerve cell monitored by an electrode could be trained by operant conditioning to fire on demand (for example, a cell learning that a certain pattern of its activity would produce a reward such as fruit juice). This memory came back to me when Mindblog reader Tristan emailed me, excited about work (which turns out to have been in my queue of potential post topics) that is a logical extension of those experiments of over 40 years ago.

In this work, Guenther et al. implanted in the motor cortex a long-term cone electrode that records from neurites that grow onto its recording surface. The subject was a 26-year-old male suffering from locked-in syndrome due to a brain stem stroke incurred at age 16, which left the brain areas responsible for consciousness, cognition, and higher-level aspects of movement control intact while eliminating nearly all voluntary movement. They were able to effect some speech restoration by decoding continuous auditory parameters for a real-time speech synthesizer from neuronal activity recorded by the implanted motor cortex electrode during attempted speech. The paper has interesting figures and a video. Here is a central summary figure:



Black circles and curved arrows represent neurons and axonal projections, respectively, in the neural circuitry for speech motor output. The volunteer's stroke-induced lesion in the efferent motor pathways (red X) disconnects motor plans represented in the cerebral cortex from the speech motoneurons, thus disabling speech output while sparing somatic, auditory, and visual sensation as well as speech motor planning centers in cerebral cortex. Signals collected from an electrode implanted in the subject's speech motor cortex are amplified and sent wirelessly across the scalp as FM radio signals. The signals are then routed to an electrophysiology recording system for further amplification, analog-to-digital conversion, and spike sorting. The sorted spikes are sent to a Neural Decoder which translates them into commands for a Speech Synthesizer. Audio signals from the synthesizer are fed back to the subject in real time. [Abbreviation: PrCG = precentral gyrus.]
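The Neural Decoder stage, which turns sorted spike counts into continuous commands for the synthesizer, can be caricatured as a linear readout. The actual study used a more elaborate real-time decoder; the weight matrix, baseline values, and two-parameter output below are all assumptions made for illustration.

```python
import numpy as np

def decode_speech_params(spike_counts, weights, baseline):
    """Hypothetical linear readout from sorted spike counts to two
    continuous auditory parameters (Hz) driving a speech synthesizer.

    spike_counts: per-unit spike counts in the current time bin.
    weights: (2, n_units) matrix, assumed learned during an
    attempted-speech calibration session.
    baseline: (2,) resting values of the two synthesizer parameters.
    Only illustrates the spikes -> decoder -> synthesizer-command step.
    """
    return baseline + weights @ spike_counts
```

In a real-time loop this function would run once per bin, with its output streamed to the synthesizer and the audio fed back to the subject, closing the loop shown in the figure.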

Monday, January 18, 2010

Distinguishing a new evolutionary track?

This interesting article considers society as an evolutionary track distinct from culture and genes, noting that social and cultural units relate to different informational categories (roles versus beliefs); they are learned in different ways (experience versus interpretation), and have different accuracy and consistency requirements (necessary versus unnecessary). The authors consider cultural exaptation in footbinding, marriage form, and religious practices in early 20th-century Taiwan. (Exaptation refers to a trait evolving because it serves one particular function, but subsequently having the unintended consequence of serving another.) Their analysis notes changes in religious and grave-site rituals that were an unintended and unintuitive consequence of legally imposed changes in footbinding in southwestern Taiwan early in the century. This provides a demonstration of how associations across distinct social and cultural inheritance tracks - having different evolutionary dynamics - affect behavior. Here is the abstract:

Social theorists have long recognized that changes in social order have cultural consequences but have not been able to provide an individual-level mechanism of such effects. Explanations of human behavior have only just begun to explore the different evolutionary dynamics of social and cultural inheritance. Here we provide ethnographic evidence of how cultural evolution, at the level of individuals, can be influenced by social evolution. Sociocultural epistasis—association of cultural ideas with the hierarchical structure of social roles—influences cultural change in unexpected ways. We document the existence of cultural exaptation, where a custom's origin was not due to acceptance of the later associated ideas. A cultural exaptation can develop in the absence of a cultural idea favoring it, or even in the presence of a cultural idea against it. Such associations indicate a potentially larger role for social evolutionary dynamics in explaining individual human behavior than previously anticipated.

Friday, January 15, 2010

Internet hive mind - the madness of crowds

John Tierney does an interesting review of computer guru Jaron Lanier's new book "You Are Not a Gadget," which is a manifesto against “hive thinking” and “digital Maoism” - by which he means the glorification of open-source software, free information, and collective work at the expense of individual creativity. Lanier (slightly edited):

...blames the Web’s tradition of “drive-by anonymity” for fostering vicious pack behavior on blogs, forums and social networks. He acknowledges the examples of generous collaboration, like Wikipedia, but argues that the mantras of “open culture” and “information wants to be free” have produced a destructive new social contract....“The basic idea of this contract,” he writes, “is that authors, journalists, musicians and artists are encouraged to treat the fruits of their intellects and imaginations as fragments to be given without pay to the hive mind. Reciprocity takes the form of self-promotion. Culture is to become precisely nothing but advertising.” ...masses of “digital peasants” are forced to provide free material to a few “lords of the clouds” like Google and YouTube.

Mr. Lanier was once an advocate himself for piracy, arguing that his fellow musicians would make up for the lost revenue in other ways. Sure enough, some musicians have done well selling T-shirts and concert tickets, but it is striking how many of the top-grossing acts began in the predigital era, and how much of today’s music is a mash-up of the old....“It’s as if culture froze just before it became digitally open, and all we can do now is mine the past like salvagers picking over a garbage dump,” Mr. Lanier writes. Or, to use another of his grim metaphors: “Creative people — the new peasants — come to resemble animals converging on shrinking oases of old media in a depleted desert.”...To save those endangered species, Mr. Lanier proposes rethinking the Web’s ideology, revising its software structure and introducing innovations like a universal system of micropayments.

Thursday, January 14, 2010

Category errors in politics (the rage of the left) and mind science

Hendrik Hertzberg, in an interesting piece in the Jan. 11 issue of The New Yorker's 'Talk of the Town' section, comments on the alienation and disappointment of the liberal left with the health care reform bill in Congress. He cites this as an example of John Ruskin's 'Pathetic Fallacy' (violent feelings producing in us a falseness in all our impressions of external things, as in anthropomorphic treatment of inanimate objects as if they had human feelings, thoughts, or sensations). A more recent description would be philosopher Gilbert Ryle's 'category error': ascribing a property to a thing that could not possibly have that property. (We do this with our minds, taking our 'selves' to be 'real' rather than an illusory model generated by our brain hardware; cf. The I-Illusion.) Anyway, from Hertzberg's article:

...It's the false attribution of human feelings, thoughts, or intentions to inanimate objects, or to living entities that cannot possibly have such feelings, thoughts, or intentions...The American government has its human aspects - it is staffed by human beings, mostly - but its atomized, at-odds-with-itself legislative structure (House and Senate, each with its arcane rules, its semi-feudal committee chairs, and its independently elected members, none of whom are accountable or fully responsible for outcomes) makes it more like an inanimate object. In our sclerotic lawmaking process, it is not enough that the President, a majority of both Houses of Congress, and a majority of the voters at the last election favor extending health care to all citizens.

The left-wing critics are right about the conspicuous flaws of the pending health-care reform - its lack of even a weak "public option," its too-meager subsidies, its windfalls for Big Pharma, etc. But it is nonsense to attribute the less than fully satisfactory result to the alleged perfidy of the President or "the Democrats." The critics' indignation would be better directed at what an earlier generation of malcontents called "the system" - starting, perhaps, with the Senate's filibuster rule, an inanimate object if there ever was one.
Hertzberg goes on to point out that the Senate defeat of John F. Kennedy's health care reform attempts in 1962 was reversed only after his assassination, and that establishing Medicare required both Lyndon Johnson's landslide election and his legendary legislative talents.
The health-care bill now being kicked and prodded and bribed toward passage will not "do the job," either - only part of it. Are Barack Obama and the Democrats in Congress doing enough? No. But they are doing what's possible. That may be pathetic, but it's no fallacy.

Wednesday, January 13, 2010

High-Tech Sex... and gestural interactions with electronics

Would RealTouch have saved Eliot Spitzer or Tiger Woods? I doubt it. An article on the Adult Entertainment Expo that follows the Consumer Electronics Show discusses several new approaches to mechanically providing our titillation. I was particularly struck by the RealTouch site, which has a promotional video, as well as the YouTube clip below. I wonder if their library of on-demand movies that synch with the device includes any gay porn? Again, I doubt it... By the way, some fascinating new technology for interacting with television and computers using body gestures was also noted at the Consumer Electronics Show.

Tuesday, January 12, 2010

Attention alters appearance

An interesting article by Störmer et al. addresses the neural basis of our phenomenological experience, probing a central question in perception: Does attention alter our subjective experience of the world? The authors found that attention increases the perceived contrast of visual stimuli (sine wave gratings) by boosting early sensory processing in the visual cortex. Here are some slightly edited clips from a review by Carrasco in the same journal:

Voluntary attention refers to the sustained, endogenous directing of attention to a location in the visual field. Involuntary attention is the transient, exogenous capture of attention to a location, brought about by a sudden change in the environment. (Visual attention can be covertly deployed, without eye movements. We use covert attention routinely in everyday situations, when we search for objects, drive a car, cross the street, play sports, or dance, as well as in social situations—for example, when moving the eyes would provide a cue to intentions that we wish to conceal.)... Störmer et al. modified an existing experimental paradigm in two insightful and exciting ways to investigate the effect of attention on appearance with concurrent electrophysiological and behavioral measures. First, to eliminate any possibility of intramodal sensory interactions between the cue and the stimulus, they used a lateralized auditory cue rather than a visual cue. This modification enabled the authors to study the effects of cross-modal attention on appearance. Second, they recorded evoked electrical fields on the scalp—event-related potentials (ERPs)—from visual cortex in response to the cued target as observers judged the relative contrast of visual stimuli presented to the right and left visual fields. ERPs are electrophysiological responses that arise during sensory, cognitive, and motor processing, which provide precise information about the time course of information processing. In this study, they help pinpoint the level of processing at which attention exerts its effect on judgments of contrast appearance. Short-latency evoked responses, P1 (90–150 ms) and N1 (180–240 ms), reflect early sensory processes that can be modulated by selective attention; longer-latency components (250–500 ms) arise from multiple cortical generators and reflect postperceptual processing, including decision-making, working memory encoding, and response selection.

Here is the abstract from the article:
The question of whether attention makes sensory impressions appear more intense has been a matter of debate for over a century. Recent psychophysical studies have reported that attention increases apparent contrast of visual stimuli, but the issue continues to be debated. We obtained converging neurophysiological evidence from human observers as they judged the relative contrast of visual stimuli presented to the left and right visual fields following a lateralized auditory cue. Cross-modal cueing of attention boosted the apparent contrast of the visual target in association with an enlarged neural response in the contralateral visual cortex that began within 100 ms after target onset. The magnitude of the enhanced neural response was positively correlated with perceptual reports of the cued target being higher in contrast. The results suggest that attention increases the perceived contrast of visual stimuli by boosting early sensory processing in the visual cortex.

Monday, January 11, 2010

The Messiah Complex


Yes, of course I went to see Avatar in 3-D IMAX. Loved it. (The picture shows Deric using a chemical depressant to recover from the sensory overload, at a dinner with friends afterwards.) I thought David Brooks's column on the movie was a treat, and I have to agree with his critical points:
...would it be totally annoying to point out that the whole White Messiah fable, especially as Cameron applies it, is kind of offensive?...It rests on the stereotype that white people are rationalist and technocratic while colonial victims are spiritual and athletic. It rests on the assumption that nonwhites need the White Messiah to lead their crusades. It rests on the assumption that illiteracy is the path to grace. It also creates a sort of two-edged cultural imperialism. Natives can either have their history shaped by cruel imperialists or benevolent ones, but either way, they are going to be supporting actors in our journey to self-admiration...It’s just escapism, obviously, but benevolent romanticism can be just as condescending as the malevolent kind — even when you surround it with pop-up ferns and floating mountains.

Friday, January 08, 2010

Breaking the addiction to connectivity and networking.


This post is a personal note... As the holidays have drawn to a close and I am overloaded and behind on almost everything, I realize yet again that I spend an inordinate amount of time 'just checking' a continuous incoming flux of emails, listservs, Google alerts, tweets, and SMS messages. This time spent thinking in multiple small (140-character, in Twitter's case) chunks is time subtracted from thinking in more depth (or at least in paragraph-size chunks!). The post I did recently on this issue obviously hasn't influenced my behavior. So, one of my new year's resolutions has been to cancel subscriptions to virtually everything and check email only twice a day. Removing my Google alerts and hitting the 'unsubscribe' link on many incoming emails has reduced my email volume by half. Ignoring Facebook, YouTube, LinkedIn and all their 'friend' requests has made life a bit more manageable (I regret being 'unfriendly,' but it can't be helped). I'm finding that the connectivity of email exchanges related to mindblog.dericbownds.net is almost more than I can handle. I haven't really gotten into either Facebook or LinkedIn (interesting experiments, but no thank you), so I've decided to condense down to spending time only on this blog and an occasional 'tweet' as my venues for reaching out to others. Anyone who wishes to chat with me can easily find my email address, and the email exchanges I have had with mindblog readers have been engaging and worthwhile. It begins to feel like a shroud is slowly lifting... is it possible that I am beginning to smash my mind's "Twittering Machine"? (The figure is the classic Paul Klee painting of that title.)

Distinguishing conscious from unconscious brain activity.

Schurger et al. nudge a bit further towards finding one holy grail of neuroscience - identifying the neuronal correlates of conscious awareness.


What qualifies a neural representation for a role in subjective experience? Previous evidence suggests that the duration and intensity of the neural response to a sensory stimulus are factors. We introduce another attribute—the reproducibility of a pattern of neural activity across different episodes—that predicts specific and measurable differences between conscious and nonconscious neural representations independently of duration and intensity. We found that conscious neural activation patterns are relatively reproducible when compared with nonconscious neural activation patterns corresponding to the same perceptual content. This is not adequately explained by a difference in signal-to-noise ratio.
Clips from their account:
Functional magnetic resonance imaging (fMRI) was used to measure brain activity while subjects performed a simple visual category-discrimination task. The stimuli were simple line drawings of faces and houses (12 of each), rendered in two opposing but isoluminant colors (see the figure and legend). Visibility of the stimuli was manipulated by using dichoptic color masking. Subjects were asked to identify the category of the stimulus (face or house) on each trial, guessing if necessary, and to wager ("high" or "low" for monetary rewards) on the accuracy of each of their perceptual decisions. Wagering was used as a collateral index of subjects' awareness of the object.



Dichoptic-color masking. This method of manipulating awareness relies on the phenomenon of dichoptic color fusion. The "same color" mode corresponds to the visible condition, and the "opposite color" mode corresponds to the invisible condition. In order to achieve disappearance of the image in the opposite color mode, the two colors must be approximately isoluminant and the object boundaries slightly blurred. Before the experiment, subjects were trained to maintain steady fixation and were cued to do so during each trial with the appearance of the fixation point (500 ms before stimulus onset). Stimuli were presented stereoscopically in the fMRI scanner by using a cardboard divider and prism lenses.

Thursday, January 07, 2010

Brain correlates of strategic and experiential emotional intelligence.

From Krueger et al., evidence that two central components of emotional intelligence are most strongly associated with different regions of the prefrontal cortex:

Emotional intelligence (EI) refers to a set of competencies that are essential features of human social life. Although the neural substrates of EI are virtually unknown, it is well established that the prefrontal cortex (PFC) plays a crucial role in human social-emotional behavior. We studied a unique sample of combat veterans from the Vietnam Head Injury Study, which is a prospective, long-term follow-up study of veterans with focal penetrating head injuries. We administered the Mayer-Salovey-Caruso Emotional Intelligence Test as a valid standardized psychometric measure of EI behavior to examine two key competencies of EI: (i) Strategic EI as the competency to understand emotional information and to apply it for the management of the self and of others and (ii) Experiential EI as the competency to perceive emotional information and to apply it for the integration into thinking. The results revealed that key competencies underlying EI depend on distinct neural PFC substrates. First, ventromedial PFC damage diminishes Strategic EI, and therefore, hinders the understanding and managing of emotional information. Second, dorsolateral PFC damage diminishes Experiential EI, and therefore, hinders the perception and integration of emotional information. In conclusion, EI should be viewed as complementary to cognitive intelligence and, when considered together, provide a more complete understanding of human intelligence.

Where did the time go?


Benedict Carey offers a nice piece on our sense of time. The article relates a number of interesting experiments on the variety of ways in which our brains expand or contract our sense of time:
...emotional events — a breakup, a promotion, a transformative trip abroad — tend to be perceived as more recent than they actually are, by months or even years...the findings support the philosopher Martin Heidegger’s observation that time “persists merely as a consequence of the events taking place in it.”...the reverse may also be true: if very few events come to mind, then the perception of time does not persist; the brain telescopes the interval that has passed.

Wednesday, January 06, 2010

Have fun.....NOW!

I've been meaning to pass on this nice article by John Tierney.  Postponed pleasures are much less likely to happen at all....

Anhedonia - frontal lobe correlates

An interesting open access article from Heller et al. suggesting that diminished pleasure from previously rewarding contexts is due to an inability to sustain positive affect, not to a reduction in the capacity to experience pleasure:

Anhedonia, the loss of pleasure or interest in previously rewarding stimuli, is a core feature of major depression. While theorists have argued that anhedonia reflects a reduced capacity to experience pleasure, evidence is mixed as to whether anhedonia is caused by a reduction in hedonic capacity. An alternative explanation is that anhedonia is due to the inability to sustain positive affect across time. Using positive images, we used an emotion regulation task to test whether individuals with depression are unable to sustain activation in neural circuits underlying positive affect and reward. While up-regulating positive affect, depressed individuals failed to sustain nucleus accumbens activity over time compared with controls. This decreased capacity was related to individual differences in self-reported positive affect. Connectivity analyses further implicated the fronto-striatal network in anhedonia. These findings support the hypothesis that anhedonia in depressed patients reflects the inability to sustain engagement of structures involved in positive affect and reward.

Tuesday, January 05, 2010

We are what our ancestors did or didn't eat.

Ann Gibbons does a nice summary of our human ancestral diet and how it has changed to give us a modern array of diseases. Some slightly edited clips:

By the time hunter-gatherer modern humans swept into Europe about 40,000 years ago, they were adept at hunting large game and had also expanded their palates to dine regularly on small animals and freshwater fish....By studying the ratios of carbon and nitrogen isotopes from collagen in bones, ...the main sources of dietary protein of 27 early Europeans and Neandertals are known; fish eaters, for example, have more nitrogen-15 in their bones than meat eaters...the oldest known modern human in Europe—the 35,000-year-old jawbone from Pestera cu Oase cave in Romania—got much of his protein from fish. By 30,000 years ago, other modern humans got as much as 20% of their protein from fish.




The next big dietary shift came about 10,000 years ago, when humans began to domesticate plants and, later, animals. The move to agriculture introduced staples of the Western diet: cereal grains, sugars, and milk after weaning...The agricultural revolution favored people lucky enough to have gene variants that helped them digest milk, alcohol, and starch...when ethnic groups abandon traditional lifestyles and rapidly adopt Western diets, they often suffer. Researchers have known for more than a decade that the Pima of the southwestern United States have "thrifty phenotypes": sluggish metabolisms that store fat efficiently and boost survival on low-calorie diets. That's probably because their ancestors in Mexico underwent frequent famine. When they eat the calorie-rich Western diet, the Pima develop high rates of obesity, diabetes, and high cholesterol, although their blood pressure stays relatively low...the Evenki reindeer herders and other indigenous peoples of Siberia have very high metabolisms, an adaptation to the cold that allows them to convert fat into energy efficiently. When the Soviet Union collapsed in the 1990s, many Siberians abandoned traditional lifestyles and diets. They too became obese and developed heart disease but in a different way from the Pima: The Evenki retained low levels of cholesterol and diabetes but developed high blood pressure.

Although we are what our ancestors ate, we are also what they didn't eat. In India, for example, more than 66% of the population in some regions experienced famine during British colonialism a century ago. Women who survived tended to have low-birth-weight babies, whose bodies were small and efficient at storing fat. It's as though these babies took cues during fetal and early development about their mothers' lifelong nutritional experience and adjusted their growth and body and organ size accordingly. Human stature often tracks the nutritional status of mothers, and it can take generations for descendants to recover. In India, average height in males dropped at a rate of almost 2 centimeters per century in the decades following colonialism...When these small babies gain weight in childhood, though, it stresses their smaller organs, such as the pancreas and heart, making them more susceptible to obesity, diabetes, and heart disease. This is the case in south India today, where many people have thrifty phenotypes with less muscle and more fat per body size. Yet they are shifting rapidly to a high-fat, high-sugar diet. As a result, India risks becoming the diabetes capital of the world.

Similarity breeds connection

Aral et al. examine the huge dataset available on the spread (contagion) of a new mobile service product (Yahoo! Go) among 27.4 million users of Yahoo.com. The dataset comprehensively captures the diffusion of this mobile service product over a social network for 5 months after its launch date. They note that a key challenge in identifying true contagions in such data is to distinguish peer-to-peer influence, in which a node influences or causes outcomes in its neighbors, from homophily, in which dyadic similarities between nodes create correlated outcome patterns among neighbors that merely mimic viral contagions without direct causal influence. Here is their abstract:

Node characteristics and behaviors are often correlated with the structure of social networks over time. While evidence of this type of assortative mixing and temporal clustering of behaviors among linked nodes is used to support claims of peer influence and social contagion in networks, homophily may also explain such evidence. Here we develop a dynamic matched sample estimation framework to distinguish influence and homophily effects in dynamic networks, and we apply this framework to a global instant messaging network of 27.4 million users, using data on the day-by-day adoption of a mobile service application and users' longitudinal behavioral, demographic, and geographic data. We find that previous methods overestimate peer influence in product adoption decisions in this network by 300–700%, and that homophily explains >50% of the perceived behavioral contagion. These findings and methods are essential to both our understanding of the mechanisms that drive contagions in networks and our knowledge of how to propagate or combat them in domains as diverse as epidemiology, marketing, development economics, and public health.
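Aral et al.'s point—that homophily alone can masquerade as contagion, and that matching treated users to similar untreated users corrects for it—can be illustrated with a toy simulation. This is my own sketch, not the authors' actual dynamic matched sample estimation framework, and every name and number in it is invented for illustration:

```python
# Toy sketch: homophily mimicking contagion. Users with an adopting friend
# ("treated") tend to share that friend's tastes, so they adopt more often
# even when the friend exerts NO causal influence. A naive treated-vs-control
# comparison shows a big "influence" effect; comparing within matched taste
# strata makes it largely vanish.
import random

random.seed(0)
N, BINS = 5000, 10

def make_user():
    taste = random.random()  # latent preference in [0, 1]
    return {"taste": taste, "adopted": random.random() < taste * 0.5}

# Treated users: a friend adopted. Homophily: their tastes skew toward the
# (high-taste) adopter. Crucially, their own adoption still depends only on
# taste -- there is no true peer-influence effect in this simulation.
treated = [make_user() for _ in range(N)]
for u in treated:
    u["taste"] = min(1.0, u["taste"] + 0.3)
    u["adopted"] = random.random() < u["taste"] * 0.5

control = [make_user() for _ in range(N)]

def rate(users):
    return sum(u["adopted"] for u in users) / len(users)

naive_lift = rate(treated) - rate(control)  # inflated by homophily

# Matched estimate: compare treated and control users only within strata of
# the observable attribute (taste) that drives homophily.
def bin_counts(users):
    counts = [[0, 0] for _ in range(BINS)]  # [adopters, total] per bin
    for u in users:
        b = min(int(u["taste"] * BINS), BINS - 1)
        counts[b][0] += u["adopted"]
        counts[b][1] += 1
    return counts

t, c = bin_counts(treated), bin_counts(control)
num, den = 0.0, 0
for b in range(BINS):
    if t[b][1] and c[b][1]:
        diff = t[b][0] / t[b][1] - c[b][0] / c[b][1]
        num += diff * t[b][1]
        den += t[b][1]
matched_lift = num / den  # close to zero: no real influence

print(f"naive lift:   {naive_lift:.3f}")
print(f"matched lift: {matched_lift:.3f}")
```

In this toy world the naive comparison suggests a sizable contagion effect while the matched comparison correctly finds almost none, which is the flavor of the 300–700% overestimation the authors report; their actual framework matches dynamically on longitudinal behavioral, demographic, and geographic covariates rather than a single simulated attribute.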

Monday, January 04, 2010

God's beliefs as what we want them to be.

Epley et al., in an open access article, find that what we believe about God's views is more egocentric than what we believe about the views of other humans:

People often reason egocentrically about others' beliefs, using their own beliefs as an inductive guide. Correlational, experimental, and neuroimaging evidence suggests that people may be even more egocentric when reasoning about a religious agent's beliefs (e.g., God). In both nationally representative and more local samples, people's own beliefs on important social and ethical issues were consistently correlated more strongly with estimates of God's beliefs than with estimates of other people's beliefs (Studies 1–4). Manipulating people's beliefs similarly influenced estimates of God's beliefs but did not as consistently influence estimates of other people's beliefs (Studies 5 and 6). A final neuroimaging study demonstrated a clear convergence in neural activity when reasoning about one's own beliefs and God's beliefs, but clear divergences when reasoning about another person's beliefs (Study 7). In particular, reasoning about God's beliefs activated areas associated with self-referential thinking more so than did reasoning about another person's beliefs. Believers commonly use inferences about God's beliefs as a moral compass, but that compass appears especially dependent on one's own existing beliefs.

The faith instinct

Shulevitz reviews Nicholas Wade's new book "The Faith Instinct - How it evolved and how it endures."

According to Wade, a New York Times science writer, religions are machines for manufacturing social solidarity. They bind us into groups. Long ago, codes requiring altruistic behavior, and the gods who enforced them, helped human society expand from families to bands of people who were not necessarily related. We didn’t become religious creatures because we became social; we became social creatures because we became religious. Or, to put it in Darwinian terms, being willing to live and die for their coreligionists gave our ancestors an advantage in the struggle for resources.
...Rituals take time; sacrifices take money or its equivalent. Individuals willing to lavish time and money on a particular group signal their commitment to it, and a high level of commitment makes each coreligionist less loath to ignore short-term self-interest and to act for the benefit of the whole. What are gods for? They’re the enforcers. Supernatural beings scare away cheaters and freeloaders and cow everyone into loyal, unselfish, dutiful and, when appropriate, warlike behavior.

...our innate piety has adapted to our changing needs. Hunter-gatherers were egalitarian and, shamans aside, had direct access to the divine. But when humans began to farm and to settle in cities and states, religion became hierarchical. Priests emerged, turning unwritten rules and chummy gods into opaque instruments of surveillance and power. Church bureaucracies created crucial social institutions but also suppressed the more ecstatic aspects of worship, especially music, dance and trance. Wade advances the delightfully explosive thesis that the periodic rise of exuberant mystery cults represents human nature rebelling against the institutionalization of worship: "A propensity to follow the ecstatic behaviors of dance and trance was built into people's minds and provided consistently fertile ground for revolts against established religion."

Friday, January 01, 2010

A gentle start to the new year...some Debussy

I offer the mellow romantic energy of this Debussy piece "Valse Romantique" to start the new year. I continue to be amazed at how many people listen to and comment on some of these piano pieces I have posted on YouTube.

Reward regions in the adolescent brain.

Van Leijenhorst et al. examine brain activations associated with anticipation, receipt, and omission of reward in several different age groups:

The relation between brain development across adolescence and adolescent risky behavior has attracted increasing interest in recent years. It has been proposed that adolescents are hypersensitive to reward because of an imbalance in the developmental pattern followed by the striatum and prefrontal cortex. To date, it is unclear if adolescents engage in risky behavior because they overestimate potential rewards or respond more to received rewards and whether these effects occur in the absence of decisions. In this study, we used a functional magnetic resonance imaging paradigm that allowed us to dissociate effects of the anticipation, receipt, and omission of reward in 10- to 12-year-old, 14- to 15-year-old, and 18- to 23-year-old participants. We show that in anticipation of uncertain outcomes, the anterior insula is more active in adolescents compared with young adults and that the ventral striatum shows a reward-related peak in middle adolescence, whereas young adults show orbitofrontal cortex activation to omitted reward. These regions show distinct developmental trajectories. This study supports the hypothesis that adolescents are hypersensitive to reward and adds to the current literature in demonstrating that neural activation differs in adolescents even for small rewards in the absence of choice. These findings may have important implications for understanding adolescent risk-taking behavior.

Unconscious strategic recruitment of resources measured with pupil diameter.

Recent research suggests that reward cues, in the absence of awareness, can enhance people's investment of physical resources (for an example, see this previous post). Bijleveld et al. make the interesting observation that pupil dilation can reveal strategic recruitment of resources when subliminal reward cues are presented. They make use of the fact that our pupils dilate with sympathetic activity and constrict with parasympathetic activity so that their size can be an unobtrusive measure of the resources invested in a task. Their rationale:

If subliminal reward cues input into the strategic processes involved in resource recruitment, the effects of rewards on pupil dilation should occur when the task is demanding (here, recall of five digits), but not when the task is undemanding (recall of three digits), as undemanding tasks can be completed routinely and do not require many resources. It is important to note that this interactive effect of reward and demand on recruitment of resources is expected to occur regardless of whether the reward is processed consciously or nonconsciously.
Their results:
Pupil-dilation data indicated that valuable (compared with nonvaluable) rewards led to recruitment of more resources, but only when obtaining the reward required considerable mental effort. This pattern was identical for supraliminal and subliminal reward cues. This indicates that awareness of a reward is not a necessary condition for strategic resource recruitment to take place. These findings are in line with recent research suggesting that the unconscious has flexible and adaptive capabilities (Hassin, Uleman, & Bargh, 2005; Wilson, 2002). More generally, whereas analyses of costs (required effort) and benefits (value of rewards) are usually thought to require consciousness, our findings suggest that such strategic processes can occur outside of awareness—and these processes show in the eyes.