Thursday, January 14, 2010

Category errors in politics (the rage of the left) and mind science

Hendrik Hertzberg, in an interesting piece in the Jan. 11 issue of The New Yorker 'Talk of the Town' section, comments on the alienation and disappointment of the liberal left with the health care reform bill in Congress. He cites this as an example of John Ruskin's 'Pathetic Fallacy' (violent feelings producing in us a falseness in all our impressions of external things, as in anthropomorphic treatment of inanimate objects as if they had human feelings, thoughts, or sensations). A more recent description would be philosopher Gilbert Ryle's 'Category Error' - ascribing a property to a thing that could not possibly have that property. (We do this with our minds, taking our 'selves' to be 'real,' rather than being an illusory model generated by our brain hardware, cf. The I-Illusion). Anyway, from Hertzberg's article:
...It's the false attribution of human feelings, thoughts, or intentions to inanimate objects, or to living entities that cannot possibly have such feelings, thoughts, or intentions...The American government has its human aspects - it is staffed by human beings, mostly - but its atomized, at-odds-with-itself legislative structure (House and Senate, each with its arcane rules, its semi-feudal committee chairs, and its independently elected members, none of whom are accountable or fully responsible for outcomes) makes it more like an inanimate object. In our sclerotic lawmaking process, it is not enough that the President, a majority of both Houses of Congress, and a majority of the voters at the last election favor extending health care to all citizens.

The left-wing critics are right about the conspicuous flaws of the pending health-care reform - its lack of even a weak "public option," its too-meager subsidies, its windfalls for Big Pharma...etc. But it is nonsense to attribute the less than fully satisfactory result to the alleged perfidy of the President or "the Democrats." The critics' indignation would be better directed at what an earlier generation of malcontents called "the system" - starting, perhaps, with the Senate's filibuster rule, an inanimate object if there ever was one.
Hertzberg goes on to point out that the Senate defeat of John F. Kennedy's health care reform attempts in 1962 was reversed only after his assassination, and that establishing Medicare required both Lyndon Johnson's landslide election and his legendary legislative talents.
The health-care bill now being kicked and prodded and bribed toward passage will not "do the job," either - only part of it. Are Barack Obama and the Democrats in Congress doing enough? No. But they are doing what's possible. That may be pathetic, but it's no fallacy.

Wednesday, January 13, 2010

High-Tech Sex... and gestural interactions with electronics

Would RealTouch have saved Eliot Spitzer or Tiger Woods? I doubt it. An article on the Adult Entertainment Expo that follows the Consumer Electronics Show discusses several new approaches to mechanically providing our titillation. I was particularly struck by the RealTouch site, which has a promotional video, as well as the YouTube clip below. I wonder if their library of on-demand movies that synch with the device includes any gay porn? Again, I doubt it... By the way, some fascinating new technology for interacting with television and computers using body gestures was noted at the Consumer Electronics Show.

Tuesday, January 12, 2010

Attention alters appearance

An interesting article by Störmer et al. addresses the neural basis of our phenomenological experience, probing a central question in perception: Does attention alter our subjective experience of the world? The authors found that attention increases the perceived contrast of visual stimuli (sine wave gratings) by boosting early sensory processing in the visual cortex. Here are some slightly edited clips from a review by Carrasco in the same journal:
Voluntary attention refers to the sustained, endogenous directing of attention to a location in the visual field. Involuntary attention is the transient, exogenous capture of attention to a location, brought about by a sudden change in the environment. (Visual attention can be covertly deployed, without eye movements. We use covert attention routinely in everyday situations, when we search for objects, drive a car, cross the street, play sports, or dance, as well as in social situations—for example, when moving the eyes would provide a cue to intentions that we wish to conceal.)... Störmer et al. modified an existing experimental paradigm in two insightful and exciting ways to investigate the effect of attention on appearance with concurrent electrophysiological and behavioral measures. First, to eliminate any possibility of intramodal sensory interactions between the cue and the stimulus, they used a lateralized auditory cue rather than a visual cue. This modification enabled the authors to study the effects of cross-modal attention on appearance. Second, they recorded evoked electrical fields on the scalp—event-related potentials (ERPs)—from visual cortex in response to the cued target as observers judged the relative contrast of visual stimuli presented to the right and left visual fields. ERPs are electrophysiological responses that arise during sensory, cognitive, and motor processing, which provide precise information about the time course of information processing. In this study, they help pinpoint the level of processing at which attention exerts its effect on judgments of contrast appearance. Short-latency evoked responses, P1 (90–150 ms) and N1 (180–240 ms), reflect early sensory processes that can be modulated by selective attention; longer-latency components (250–500 ms) arise from multiple cortical generators and reflect postperceptual processing, including decision-making, working memory encoding, and response selection.

Here is the abstract from the article:
The question of whether attention makes sensory impressions appear more intense has been a matter of debate for over a century. Recent psychophysical studies have reported that attention increases apparent contrast of visual stimuli, but the issue continues to be debated. We obtained converging neurophysiological evidence from human observers as they judged the relative contrast of visual stimuli presented to the left and right visual fields following a lateralized auditory cue. Cross-modal cueing of attention boosted the apparent contrast of the visual target in association with an enlarged neural response in the contralateral visual cortex that began within 100 ms after target onset. The magnitude of the enhanced neural response was positively correlated with perceptual reports of the cued target being higher in contrast. The results suggest that attention increases the perceived contrast of visual stimuli by boosting early sensory processing in the visual cortex.
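A technical aside: the ERP components named in Carrasco's review (P1, N1) are conventionally quantified as mean amplitudes within their latency windows, after baseline correction and averaging across trials. Below is a minimal Python sketch of that generic procedure - the data are random stand-ins and the sampling rate is an assumption; only the window boundaries come from the review quoted above. This is an illustration of the standard approach, not the authors' actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical epoched EEG: 200 trials x timepoints at an assumed 500 Hz,
# time zero at stimulus onset, 100 ms pre-stimulus baseline.
sfreq = 500.0
times = np.arange(-0.1, 0.5, 1.0 / sfreq)
epochs = rng.standard_normal((200, times.size))  # stand-in for real data

# Baseline-correct each trial, then average across trials to get the ERP.
epochs -= epochs[:, times < 0].mean(axis=1, keepdims=True)
erp = epochs.mean(axis=0)

def mean_amplitude(erp, times, t0, t1):
    """Mean ERP amplitude within a latency window (in seconds)."""
    return erp[(times >= t0) & (times <= t1)].mean()

# Windows follow the component latencies quoted in the review.
p1 = mean_amplitude(erp, times, 0.090, 0.150)   # P1: 90-150 ms
n1 = mean_amplitude(erp, times, 0.180, 0.240)   # N1: 180-240 ms
print(f"P1: {p1:.3f}  N1: {n1:.3f}  (arbitrary units)")
```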

Monday, January 11, 2010

The Messiah Complex


Yes, of course I went to see Avatar in 3-D IMAX. Loved it. (The picture shows Deric using a chemical depressant to recover from the sensory overload, at a dinner with friends afterwards.) I thought David Brooks's column on the movie was a treat, and have to agree with his critical points:
...would it be totally annoying to point out that the whole White Messiah fable, especially as Cameron applies it, is kind of offensive?...It rests on the stereotype that white people are rationalist and technocratic while colonial victims are spiritual and athletic. It rests on the assumption that nonwhites need the White Messiah to lead their crusades. It rests on the assumption that illiteracy is the path to grace. It also creates a sort of two-edged cultural imperialism. Natives can either have their history shaped by cruel imperialists or benevolent ones, but either way, they are going to be supporting actors in our journey to self-admiration...It’s just escapism, obviously, but benevolent romanticism can be just as condescending as the malevolent kind — even when you surround it with pop-up ferns and floating mountains.

Friday, January 08, 2010

Breaking the addiction to connectivity and networking.


This post is a personal note... As the holidays have drawn to a close and I am overloaded and behind on almost everything, I realize yet again that I spend an inordinate amount of time 'just checking' a continuous incoming flux of emails, listservs, Google Alerts, tweets, and SMS messages. This time spent thinking in multiple small (Twitter is 140 characters) chunks is time subtracted from thinking in more depth (or at least in paragraph-size chunks!). The post I did recently on this issue obviously hasn't influenced my behavior. So, one of my new year's resolutions has been to cancel subscriptions to virtually everything, and check email only twice a day. Removing my Google Alerts and hitting the 'unsubscribe' link on many incoming emails has reduced my email volume by half. Ignoring Facebook, YouTube, LinkedIn and all their 'friend' requests has made life a bit more manageable (I regret being 'unfriendly,' but it can't be helped). I'm finding that the connectivity of email exchanges related to mindblog.dericbownds.net is almost more than I can handle. I haven't really gotten into either Facebook or LinkedIn (interesting experiments, but no thank you), so I've decided to condense down to spending time only on this blog and an occasional 'tweet' as my venues for reaching out to others. Anyone who wishes to chat with me can easily find my email address, and the email exchanges I have had with mindblog readers have been engaging and worthwhile. It begins to feel like a shroud is slowly lifting..... Is it possible that I am beginning to smash my mind's "Twittering Machine"? (The figure is the classic Paul Klee painting of that title.)

Distinguishing conscious from unconscious brain activity.

Schurger et al. nudge a bit further towards finding one holy grail of neuroscience - identifying the neuronal correlates of conscious awareness.

What qualifies a neural representation for a role in subjective experience? Previous evidence suggests that the duration and intensity of the neural response to a sensory stimulus are factors. We introduce another attribute—the reproducibility of a pattern of neural activity across different episodes—that predicts specific and measurable differences between conscious and nonconscious neural representations independently of duration and intensity. We found that conscious neural activation patterns are relatively reproducible when compared with nonconscious neural activation patterns corresponding to the same perceptual content. This is not adequately explained by a difference in signal-to-noise ratio.
Clips from their account:
Functional magnetic resonance imaging (fMRI) was used to measure brain activity while subjects performed a simple visual category-discrimination task. The stimuli were simple line drawings of faces and houses (12 of each), rendered in two opposing but isoluminant colors (see the figure and legend). Visibility of the stimuli was manipulated by using dichoptic color masking. Subjects were asked to identify the category of the stimulus (face or house) on each trial, guessing if necessary, and to wager ("high" or "low" for monetary rewards) on the accuracy of each of their perceptual decisions. Wagering was used as a collateral index of subjects' awareness of the object.



Dichoptic-color masking. This method of manipulating awareness... relies on the phenomenon of dichoptic color fusion. The "same color" mode corresponds to the visible condition, and the "opposite color" mode corresponds to the invisible condition. In order to achieve disappearance of the image in the opposite color mode, the two colors must be approximately isoluminant and the object boundaries slightly blurred. Before the experiment, subjects were trained to maintain steady fixation and were cued to do so during each trial with the appearance of the fixation point (500 ms before stimulus onset). Stimuli were presented stereoscopically in the fMRI scanner by using a cardboard divider and prism lenses.
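A technical aside on the "reproducibility" attribute in the abstract: it can be made concrete as the average pairwise correlation between trial-by-trial activation patterns evoked by the same perceptual content. Here is a toy Python sketch of that idea; the voxel counts, trial counts, and noise levels are invented for illustration. (Note the paper argues the real conscious/nonconscious difference is not reducible to signal-to-noise ratio; noise is simply the easiest knob to turn in a simulation.)

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

def reproducibility(patterns):
    """Mean Pearson correlation across all pairs of trial patterns
    (rows = trials, columns = voxels)."""
    return float(np.mean([np.corrcoef(patterns[i], patterns[j])[0, 1]
                          for i, j in combinations(range(len(patterns)), 2)]))

# Simulated data: a shared stimulus-driven pattern plus trial-by-trial
# noise; the "conscious" condition is given less noise, so its patterns
# recur more faithfully from one episode to the next.
template = rng.standard_normal(500)                       # 500 voxels
conscious = template + 0.8 * rng.standard_normal((40, 500))
nonconscious = template + 2.0 * rng.standard_normal((40, 500))

print("conscious reproducibility:   ", reproducibility(conscious))
print("nonconscious reproducibility:", reproducibility(nonconscious))
```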

Thursday, January 07, 2010

Brain correlates of strategic and experiential emotional intelligence.

From Krueger et al., evidence that two central components of emotional intelligence are most strongly associated with different regions of the prefrontal cortex:
Emotional intelligence (EI) refers to a set of competencies that are essential features of human social life. Although the neural substrates of EI are virtually unknown, it is well established that the prefrontal cortex (PFC) plays a crucial role in human social-emotional behavior. We studied a unique sample of combat veterans from the Vietnam Head Injury Study, which is a prospective, long-term follow-up study of veterans with focal penetrating head injuries. We administered the Mayer-Salovey-Caruso Emotional Intelligence Test as a valid standardized psychometric measure of EI behavior to examine two key competencies of EI: (i) Strategic EI as the competency to understand emotional information and to apply it for the management of the self and of others and (ii) Experiential EI as the competency to perceive emotional information and to apply it for the integration into thinking. The results revealed that key competencies underlying EI depend on distinct neural PFC substrates. First, ventromedial PFC damage diminishes Strategic EI, and therefore, hinders the understanding and managing of emotional information. Second, dorsolateral PFC damage diminishes Experiential EI, and therefore, hinders the perception and integration of emotional information. In conclusion, EI should be viewed as complementary to cognitive intelligence and, when considered together, provide a more complete understanding of human intelligence.

Where did the time go?


Benedict Carey offers a nice piece on our sense of time. The article relates a number of interesting experiments on the variety of ways in which our brains expand or contract our sense of time:
...emotional events — a breakup, a promotion, a transformative trip abroad — tend to be perceived as more recent than they actually are, by months or even years...the findings support the philosopher Martin Heidegger’s observation that time “persists merely as a consequence of the events taking place in it.”...the reverse may also be true: if very few events come to mind, then the perception of time does not persist; the brain telescopes the interval that has passed.

Wednesday, January 06, 2010

Have fun.....NOW!

I've been meaning to pass on this nice article by John Tierney.  Postponed pleasures are much less likely to happen at all....

Anhedonia - frontal lobe correlates

An interesting open access article from Heller et al. suggests that diminished pleasure from previously rewarding contexts is due to an inability to sustain positive affect, not to a reduction in the capacity to experience pleasure:
Anhedonia, the loss of pleasure or interest in previously rewarding stimuli, is a core feature of major depression. While theorists have argued that anhedonia reflects a reduced capacity to experience pleasure, evidence is mixed as to whether anhedonia is caused by a reduction in hedonic capacity. An alternative explanation is that anhedonia is due to the inability to sustain positive affect across time. Using positive images, we used an emotion regulation task to test whether individuals with depression are unable to sustain activation in neural circuits underlying positive affect and reward. While up-regulating positive affect, depressed individuals failed to sustain nucleus accumbens activity over time compared with controls. This decreased capacity was related to individual differences in self-reported positive affect. Connectivity analyses further implicated the fronto-striatal network in anhedonia. These findings support the hypothesis that anhedonia in depressed patients reflects the inability to sustain engagement of structures involved in positive affect and reward.

Tuesday, January 05, 2010

We are what our ancestors did or didn't eat.

Ann Gibbons does a nice summary of our human ancestral diet and how it has changed to give us a modern array of diseases. Some slightly edited clips:
By the time hunter-gatherer modern humans swept into Europe about 40,000 years ago, they were adept at hunting large game and had also expanded their palates to dine regularly on small animals and freshwater fish....By studying the ratios of carbon and nitrogen isotopes from collagen in bones, ...the main sources of dietary protein of 27 early Europeans and Neandertals are known; fish eaters, for example, have more nitrogen-15 in their bones than meat eaters...the oldest known modern human in Europe—the 35,000-year-old jawbone from Pestera cu Oase cave in Romania—got much of his protein from fish. By 30,000 years ago, other modern humans got as much as 20% of their protein from fish.




The next big dietary shift came about 10,000 years ago, when humans began to domesticate plants and, later, animals. The move to agriculture introduced staples of the Western diet: cereal grains, sugars, and milk after weaning...The agricultural revolution favored people lucky enough to have gene variants that helped them digest milk, alcohol, and starch...when ethnic groups abandon traditional lifestyles and rapidly adopt Western diets, they often suffer. Researchers have known for more than a decade that the Pima of the southwestern United States have "thrifty phenotypes": sluggish metabolisms that store fat efficiently and boost survival on low-calorie diets. That's probably because their ancestors in Mexico underwent frequent famine. When they eat the calorie-rich Western diet, the Pima develop high rates of obesity, diabetes, and high cholesterol, although their blood pressure stays relatively low...the Evenki reindeer herders and other indigenous peoples of Siberia have very high metabolisms, an adaptation to the cold that allows them to convert fat into energy efficiently. When the Soviet Union collapsed in the 1990s, many Siberians abandoned traditional lifestyles and diets. They too became obese and developed heart disease but in a different way from the Pima: The Evenki retained low levels of cholesterol and diabetes but developed high blood pressure.

Although we are what our ancestors ate, we are also what they didn't eat. In India, for example, more than 66% of the population in some regions experienced famine during British colonialism a century ago. Women who survived tended to have low-birth-weight babies, whose bodies were small and efficient at storing fat. It's as though these babies took cues during fetal and early development about their mothers' lifelong nutritional experience and adjusted their growth and body and organ size accordingly. Human stature often tracks the nutritional status of mothers, and it can take generations for descendants to recover. In India, average height in males dropped at a rate of almost 2 centimeters per century in the decades following colonialism...When these small babies gain weight in childhood, though, it stresses their smaller organs, such as the pancreas and heart, making them more susceptible to obesity, diabetes, and heart disease. This is the case in south India today, where many people have thrifty phenotypes with less muscle and more fat per body size. Yet they are shifting rapidly to a high-fat, high-sugar diet. As a result, India risks becoming the diabetes capital of the world.

Similarity breeds connection

Aral et al. examine the huge dataset available on the spread (contagion) of a new mobile service product (Yahoo! Go) among 27.4 million users of Yahoo.com. The dataset comprehensively captures the diffusion of this mobile service product over a social network for 5 months after its launch date. They note that a key challenge in identifying true contagions in such data is to distinguish peer-to-peer influence, in which a node influences or causes outcomes in its neighbors, from homophily, in which dyadic similarities between nodes create correlated outcome patterns among neighbors that merely mimic viral contagions without direct causal influence. Here is their abstract:
Node characteristics and behaviors are often correlated with the structure of social networks over time. While evidence of this type of assortative mixing and temporal clustering of behaviors among linked nodes is used to support claims of peer influence and social contagion in networks, homophily may also explain such evidence. Here we develop a dynamic matched sample estimation framework to distinguish influence and homophily effects in dynamic networks, and we apply this framework to a global instant messaging network of 27.4 million users, using data on the day-by-day adoption of a mobile service application and users' longitudinal behavioral, demographic, and geographic data. We find that previous methods overestimate peer influence in product adoption decisions in this network by 300–700%, and that homophily explains >50% of the perceived behavioral contagion. These findings and methods are essential to both our understanding of the mechanisms that drive contagions in networks and our knowledge of how to propagate or combat them in domains as diverse as epidemiology, marketing, development economics, and public health.
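The heart of their matched sample estimation is easy to illustrate: pair each user exposed to an adopter friend with an unexposed user who has similar attributes, then compare adoption rates within the matched pairs. Below is a toy Python sketch under loudly invented assumptions - two made-up covariates, a built-in "true" exposure effect of 3%, and nearest-neighbor matching - nothing like the authors' actual 27.4-million-user dynamic framework.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Hypothetical user covariates; "exposed" = has a friend who adopted.
age = rng.integers(18, 70, n)
activity = rng.poisson(5, n)                 # messages per day
# Homophily: active users are more likely to have adopter friends...
exposed = rng.random(n) < 0.05 + 0.02 * activity
# ...and also more likely to adopt on their own; true exposure effect = 0.03.
adopted = rng.random(n) < 0.02 + 0.01 * activity + 0.03 * exposed

# Naive estimate confounds peer influence with homophily.
naive = adopted[exposed].mean() - adopted[~exposed].mean()

# Matched estimate: pair each exposed user with the nearest unexposed
# user on normalized (age, activity), then compare within pairs.
X = np.column_stack([age, activity]).astype(float)
X = (X - X.mean(0)) / X.std(0)
exp_idx, ctl_idx = np.where(exposed)[0], np.where(~exposed)[0]
matches = [ctl_idx[np.argmin(((X[ctl_idx] - X[i]) ** 2).sum(1))]
           for i in exp_idx]
matched = adopted[exp_idx].mean() - adopted[matches].mean()

print(f"naive: {naive:.3f}  matched: {matched:.3f}  (true effect: 0.030)")
```

In this simulation, adoption depends mostly on user attributes, so the naive contrast overstates the exposure effect while the matched contrast recovers something close to the true 3% bump - the same direction of correction the authors report at scale.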

Monday, January 04, 2010

God's beliefs as what we want them to be.

Epley et al., in an open access article, find that what we believe about God's views is more egocentric than what we believe about the views of other humans:
People often reason egocentrically about others' beliefs, using their own beliefs as an inductive guide. Correlational, experimental, and neuroimaging evidence suggests that people may be even more egocentric when reasoning about a religious agent's beliefs (e.g., God). In both nationally representative and more local samples, people's own beliefs on important social and ethical issues were consistently correlated more strongly with estimates of God's beliefs than with estimates of other people's beliefs (Studies 1–4). Manipulating people's beliefs similarly influenced estimates of God's beliefs but did not as consistently influence estimates of other people's beliefs (Studies 5 and 6). A final neuroimaging study demonstrated a clear convergence in neural activity when reasoning about one's own beliefs and God's beliefs, but clear divergences when reasoning about another person's beliefs (Study 7). In particular, reasoning about God's beliefs activated areas associated with self-referential thinking more so than did reasoning about another person's beliefs. Believers commonly use inferences about God's beliefs as a moral compass, but that compass appears especially dependent on one's own existing beliefs.

The faith instinct

Shulevitz reviews Nicholas Wade's new book "The Faith Instinct: How Religion Evolved and Why It Endures."
According to Wade, a New York Times science writer, religions are machines for manufacturing social solidarity. They bind us into groups. Long ago, codes requiring altruistic behavior, and the gods who enforced them, helped human society expand from families to bands of people who were not necessarily related. We didn’t become religious creatures because we became social; we became social creatures because we became religious. Or, to put it in Darwinian terms, being willing to live and die for their coreligionists gave our ancestors an advantage in the struggle for resources.
...Rituals take time; sacrifices take money or its equivalent. Individuals willing to lavish time and money on a particular group signal their commitment to it, and a high level of commitment makes each coreligionist less loath to ignore short-term self-interest and to act for the benefit of the whole. What are gods for? They’re the enforcers. Supernatural beings scare away cheaters and freeloaders and cow everyone into loyal, unselfish, dutiful and, when appropriate, warlike behavior.

...our innate piety has adapted to our changing needs. Hunter-gatherers were egalitarian and, shamans aside, had direct access to the divine. But when humans began to farm and to settle in cities and states, religion became hierarchical. Priests emerged, turning unwritten rules and chummy gods into opaque instruments of surveillance and power. Church bureaucracies created crucial social institutions but also suppressed the more ecstatic aspects of worship, especially music, dance and trance. Wade advances the delightfully explosive thesis that the periodic rise of exuberant mystery cults represents human nature rebelling against the institutionalization of worship: "A propensity to follow the ecstatic behaviors of dance and trance was built into people's minds and provided consistently fertile ground for revolts against established religion."

Friday, January 01, 2010

A gentle start to the new year...some Debussy

I offer the mellow romantic energy of this Debussy piece "Valse Romantique" to start the new year. I continue to be amazed at how many people listen to and comment on some of these piano pieces I have posted on YouTube.

Reward regions in the adolescent brain.

Van Leijenhorst et al. examine brain activations associated with the anticipation, receipt, and omission of reward in several different age groups:
The relation between brain development across adolescence and adolescent risky behavior has attracted increasing interest in recent years. It has been proposed that adolescents are hypersensitive to reward because of an imbalance in the developmental pattern followed by the striatum and prefrontal cortex. To date, it is unclear if adolescents engage in risky behavior because they overestimate potential rewards or respond more to received rewards and whether these effects occur in the absence of decisions. In this study, we used a functional magnetic resonance imaging paradigm that allowed us to dissociate effects of the anticipation, receipt, and omission of reward in 10- to 12-year-old, 14- to 15-year-old, and 18- to 23-year-old participants. We show that in anticipation of uncertain outcomes, the anterior insula is more active in adolescents compared with young adults and that the ventral striatum shows a reward-related peak in middle adolescence, whereas young adults show orbitofrontal cortex activation to omitted reward. These regions show distinct developmental trajectories. This study supports the hypothesis that adolescents are hypersensitive to reward and adds to the current literature in demonstrating that neural activation differs in adolescents even for small rewards in the absence of choice. These findings may have important implications for understanding adolescent risk-taking behavior.

Unconscious strategic recruitment of resources measured with pupil diameter.

Recent research suggests that reward cues, in the absence of awareness, can enhance people's investment of physical resources (for an example, see this previous post). Bijleveld et al. make the interesting observation that pupil dilation can reveal strategic recruitment of resources when subliminal reward cues are presented. They make use of the fact that our pupils dilate with sympathetic activity and constrict with parasympathetic activity so that their size can be an unobtrusive measure of the resources invested in a task. Their rationale:
If subliminal reward cues input into the strategic processes involved in resource recruitment, the effects of rewards on pupil dilation should occur when the task is demanding (here, recall of five digits), but not when the task is undemanding (recall of three digits), as undemanding tasks can be completed routinely and do not require many resources. It is important to note that this interactive effect of reward and demand on recruitment of resources is expected to occur regardless of whether the reward is processed consciously or nonconsciously.
Their results:
Pupil-dilation data indicated that valuable (compared with nonvaluable) rewards led to recruitment of more resources, but only when obtaining the reward required considerable mental effort. This pattern was identical for supraliminal and subliminal reward cues. This indicates that awareness of a reward is not a necessary condition for strategic resource recruitment to take place. These findings are in line with recent research suggesting that the unconscious has flexible and adaptive capabilities (Hassin, Uleman, & Bargh, 2005; Wilson, 2002). More generally, whereas analyses of costs (required effort) and benefits (value of rewards) are usually thought to require consciousness, our findings suggest that such strategic processes can occur outside of awareness—and these processes show in the eyes.
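The prediction being tested is a 2 x 2 interaction: reward value should boost pupil dilation only under high demand. Here is a minimal Python sketch of that contrast with simulated dilation scores - the cell means, trial counts, and the simple between-trials t-test are my illustrative assumptions, not the authors' analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 50  # simulated trials per cell

# Simulated peak pupil dilation (mm); only the demanding, high-reward
# cell gets a boost, as the resource-recruitment hypothesis predicts.
cells = {
    ("3 digits", "low"):  rng.normal(0.20, 0.05, n),
    ("3 digits", "high"): rng.normal(0.20, 0.05, n),
    ("5 digits", "low"):  rng.normal(0.35, 0.05, n),
    ("5 digits", "high"): rng.normal(0.45, 0.05, n),
}

# Interaction contrast: the reward effect under high demand should
# exceed the reward effect under low demand.
effect_hard = cells[("5 digits", "high")] - cells[("5 digits", "low")]
effect_easy = cells[("3 digits", "high")] - cells[("3 digits", "low")]
t, p = stats.ttest_ind(effect_hard, effect_easy)
print(f"reward x demand interaction: t = {t:.2f}, p = {p:.4g}")
```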

Thursday, December 31, 2009

Inhibited behavior and our right frontal cortex.

Another interesting piece of work from Richie Davidson and colleagues at the University of Wisconsin. Continuing their general catalog of correlations of left versus right frontal lobe activation with outgoing versus protective behaviors, they find that individuals with greater tonic (resting) activity in right-posterior dorsolateral prefrontal cortex rate themselves as more behaviorally inhibited. (The authors point out some limitations of the study: it is done with only female subjects and self-reports of inhibition, so this clearly needs to be expanded. Their conclusions rest on a model, not a direct measurement, of the cerebral sources underlying the EEG, and the study did not address the degree to which individual differences in behavioral inhibition reflect altered functional connectivity between right-posterior DLPFC and other structures thought to underlie the behavioral inhibition system - e.g., amygdala, PAG, or ACC.) Here is their abstract followed by a graphic from the article:
Individuals show marked variation in their responses to threat. Such individual differences in behavioral inhibition play a profound role in mental and physical well-being. Behavioral inhibition is thought to reflect variation in the sensitivity of a distributed neural system responsible for generating anxiety and organizing defensive responses to threat and punishment. Although progress has been made in identifying the key constituents of this behavioral inhibition system in humans, the involvement of dorsolateral prefrontal cortex (DLPFC) remains unclear. Here, we acquired self-reported Behavioral Inhibition System Sensitivity scores and high-resolution electroencephalography from a large sample (n= 51). Using the enhanced spatial resolution afforded by source modeling techniques, we show that individuals with greater tonic (resting) activity in right-posterior DLPFC rate themselves as more behaviorally inhibited. This observation provides novel support for recent conceptualizations of behavioral inhibition and clues to the mechanisms that might underlie variation in threat-induced negative affect.



Figure - Relations between individual differences in behavioral inhibition and tonic activity in right-posterior dorsolateral prefrontal cortex (DLPFC). The images in (a) depict the results of the electroencephalography source modeling analyses. The cluster lies at the intersection of the precentral and inferior frontal sulci, encompassing the right-posterior midfrontal and inferior-frontal gyri and including the inferior frontal junction (cluster-corrected p = .02). The crosshair shows the location in right-posterior DLPFC of the peak correlation in the sagittal (green outline), coronal (cyan outline), and axial (yellow outline) planes. The magnitude of voxel-wise correlations is depicted using a red-yellow scale; lighter shades of yellow indicate stronger correlations. "L" and "R" indicate the left and right hemispheres, respectively. The scatter plot (b) depicts the peak correlation between scores on the Behavioral Inhibition System (BIS) scale and standardized activity in right-posterior DLPFC (area 9/46v), r(48) = −.37, uncorrected p = .003. Standardized activity is in units of z-transformed cortical current density, log10(A/m2). Lines depict the regression line and 95% confidence envelope.
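For the curious, the statistics in that legend - a Pearson correlation with 48 degrees of freedom, a regression line, and its 95% confidence envelope - are straightforward to compute. A quick Python sketch on simulated stand-in data; the variable values and effect size are invented, only the sample size follows from r(48):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 50  # r(48) implies n - 2 = 48 degrees of freedom, i.e., 50 subjects

# Simulated stand-ins for BIS scores and z-scored DLPFC current density.
bis = rng.normal(20, 4, n)
dlpfc = -0.05 * bis + rng.normal(0, 0.5, n)

res = stats.linregress(bis, dlpfc)
print(f"r({n - 2}) = {res.rvalue:.2f}, p = {res.pvalue:.4f}")

# 95% confidence envelope for the regression line, as in the figure.
x = np.linspace(bis.min(), bis.max(), 100)
y_hat = res.intercept + res.slope * x
resid = dlpfc - (res.intercept + res.slope * bis)
s = np.sqrt((resid ** 2).sum() / (n - 2))           # residual std. error
se = s * np.sqrt(1 / n + (x - bis.mean()) ** 2
                 / ((bis - bis.mean()) ** 2).sum())
band = stats.t.ppf(0.975, n - 2) * se               # half-width of envelope
lower, upper = y_hat - band, y_hat + band           # plot these around y_hat
```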


The power of music

North and Hargreaves introduce a special issue of The Psychologist which looks at musical ability, how and why people let music into their lives, and the impact of musical proficiency. They focus on the power of music to do harm (rock music and self-injurious behavior), its effects on animal welfare, and its ability to influence pain, stress, and immunity. Some clips regarding the latter:
The most convincing evidence comes from Standley’s (1995) meta-analysis of 55 studies concerning the effect of music on 129 medically related variables. Podiatric pain, paediatric respiration, pulse, blood pressure and use of analgesia (in dental patients), pain, medication in paediatric surgery patients and EMG all showed effect sizes over 2, and the mean effect size over all 129 variables was .88, meaning that the impact of music was almost one standard deviation greater than without music.

The largest single body of literature concerns the impact of music on chronic pain, pain experienced during and after treatment, and pain experienced specifically by cancer patients and those undergoing palliative care. Research suggests that music can mediate pain in these cases by distracting the patient’s attention from it and/or by increasing their perceived control over the pain (since if patients believe that they have access to music as a means of pain control, then this belief itself decreases the aversiveness of pain). Similar research on stress has yielded the not entirely unsurprising conclusion that it may be reduced by music; but also that the amount of stress reduction varies according to age, the stressor, the listener’s musical preference, and their prior level of musical experience. More interestingly still, this reduction in stress manifests itself through physical measures, such as reduced levels of cortisol, and this has a very provocative further implication. Lower levels of stress are associated with greater immunity to illness of course, and several studies have indicated effects of music listening on physical measures of immune system strength, such as salivary immunoglobulin A. Although the mechanism by which this occurs is not well understood, the implication is clear: music contributes directly to physical health.
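A note on the metric in the first clip: an effect size of .88 means the music condition's mean outcome sat .88 pooled standard deviations away from the no-music mean. Here is a quick Python illustration of Cohen's d, the standard version of this statistic, computed on made-up pain ratings (the numbers are invented, not from Standley's meta-analysis):

```python
import numpy as np

def cohens_d(a, b):
    """Standardized mean difference using the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * np.var(a, ddof=1) +
                  (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2)
    return (np.mean(a) - np.mean(b)) / np.sqrt(pooled_var)

# Made-up pain ratings (0-10 scale), with and without music.
no_music = np.array([5.2, 4.8, 5.9, 5.1, 4.5, 5.6, 4.9, 5.4])
music = np.array([3.1, 2.8, 4.0, 3.5, 2.9, 3.3, 3.8, 2.6])
print(f"Cohen's d: {cohens_d(no_music, music):.2f}")  # pain reduction in SD units
```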

Wednesday, December 30, 2009

Changes in our brain connectivity as a function of age and sex

From Gong et al., one of those studies I take personally (i.e., describing my aging male brain):
Neuroanatomical differences attributable to aging and gender have been well documented, and these differences may be associated with differences in behaviors and cognitive performance. However, little is known about the dynamic organization of anatomical connectivity within the cerebral cortex, which may underlie population differences in brain function. In this study, we investigated age and sex effects on the anatomical connectivity patterns of 95 normal subjects ranging in age from 19 to 85 years. Using the connectivity probability derived from diffusion magnetic resonance imaging tractography, we characterized the cerebral cortex as a weighted network of connected regions. This approach captures the underlying organization of anatomical connectivity for each subject at a regional level. Advanced graph theoretical analysis revealed that the resulting cortical networks exhibited "small-world" character (i.e., efficient information transfer both at local and global scale). In particular, the precuneus and posterior cingulate gyrus were consistently observed as centrally connected regions, independent of age and sex. Additional analysis revealed a reduction in overall cortical connectivity with age. There were also changes in the underlying network organization that resulted in decreased local efficiency, and also a shift of regional efficiency from the parietal and occipital to frontal and temporal neocortex in older brains. In addition, women showed greater overall cortical connectivity and the underlying organization of their cortical networks was more efficient, both locally and globally. There were also distributed regional differences in efficiency between sexes. Our results provide new insights into the substrates that underlie behavioral and cognitive differences in aging and sex.



Figure - The spatial distribution of cortical regions showing a significant age effect (p < 0.05, corrected) on integrated regional efficiency. The color represents the t statistic of the age effect, calculated from the general linear model. Each identified region is marked. Notably, the negative age effect was mainly distributed in the parietal and occipital cortex, whereas the positive age effect was localized only in the frontal and temporal cortex.
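The "small-world" character and the local/global efficiency measures in the abstract have standard graph-theoretic definitions and can be computed for any connectivity matrix. A minimal Python sketch using networkx on a toy graph - the 78-node size echoes a common cortical parcellation, but the graph itself is random, not the authors' tractography data:

```python
import networkx as nx

# Toy stand-in for a cortical network: a Watts-Strogatz small-world graph
# (78 nodes, each initially wired to 6 neighbors, 10% of edges rewired).
G = nx.watts_strogatz_graph(n=78, k=6, p=0.1, seed=42)

# Global efficiency: average inverse shortest-path length over all node
# pairs; local efficiency: the same quantity within each node's neighborhood.
print("global efficiency:", nx.global_efficiency(G))
print("local efficiency: ", nx.local_efficiency(G))

# Compare with a fully rewired (random) graph: a small-world network keeps
# local efficiency high while approaching a random graph's global efficiency.
R = nx.watts_strogatz_graph(n=78, k=6, p=1.0, seed=42)
print("random global efficiency:", nx.global_efficiency(R))
print("random local efficiency: ", nx.local_efficiency(R))
```

The paper's aging result corresponds, in these terms, to a drop in overall connectivity and local efficiency with age, plus a regional redistribution of efficiency from parietal/occipital toward frontal/temporal cortex.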