This blog reports new ideas and work on mind, brain, behavior, psychology, and politics - as well as random curious stuff.
Thursday, April 10, 2008
Rationalization of our choices - statistics rather than psychology?
Tierney has done it again - a really, really kewl article on what appears to be an error in some classical psychological experiments on cognitive dissonance and rationalization. He provides online exercises you can do. Those early experiments suggested choice rationalization: once we reject something, we tell ourselves we never liked it anyway (and thereby spare ourselves the painfully dissonant thought that we made the wrong choice). It turns out that in the free-choice paradigm used to test our tendency to rationalize decisions, any slight initial preference for one of the options can produce the classic pattern in the subsequent ratings and choices through simple statistics alone, with no psychological explanation required. The article is worth a careful read...
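To make the statistical point concrete, here is a minimal sketch - my own toy simulation, not the analysis in the article - of the free-choice paradigm. True preferences never change; each rating is just a noisy measurement of them, and the choice between two equally rated items follows the true preference. The "spreading" of ratings toward the chosen item emerges anyway:

import random

random.seed(1)
N_SUBJECTS = 5000
spread_sum = 0.0

for _ in range(N_SUBJECTS):
    # Draw two items until their noisy round-1 ratings happen to tie.
    while True:
        true_a, true_b = random.gauss(5, 1), random.gauss(5, 1)
        rate1_a = round(true_a + random.gauss(0, 1))
        rate1_b = round(true_b + random.gauss(0, 1))
        if rate1_a == rate1_b:
            break
    # The choice itself is driven by the stable underlying preference.
    chosen_true, rejected_true = (true_a, true_b) if true_a >= true_b else (true_b, true_a)
    # Round-2 ratings: same true preferences, fresh measurement noise.
    rate2_chosen = chosen_true + random.gauss(0, 1)
    rate2_rejected = rejected_true + random.gauss(0, 1)
    spread_sum += rate2_chosen - rate2_rejected

print("mean spread (chosen minus rejected) with unchanged preferences:",
      round(spread_sum / N_SUBJECTS, 2))

The mean spread comes out reliably positive, mimicking "rationalization" even though no preference ever changed.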
Wednesday, April 09, 2008
Episodic-like memory in rats - not like humans
Until recent experiments showing that scrub jays remember where and when they cached or discovered foods of differing palatability, it had been thought that episodic memory - defined as the ability to remember an event (what) as well as where and when it happened - was confined to humans. The memory for 'when' observed in scrub jays has been taken to suggest that animals can mentally travel in time, or locate a past event within a temporal framework of hours and days. Roberts et al. point out that:
An alternative possibility is that, instead of remembering when an event happened within a framework of past time, animals are keeping track of how much time has elapsed since caching or encountering a particular food item at a particular place and are using elapsed time to indicate return to or avoidance of that location. The cues of when and how long ago are typically confounded in studies of episodic-like memory. Thus, animals might be remembering how long ago an event occurred by keeping track of elapsed time using accumulators, circadian timers, their own behavior, or the strength of a decaying memory trace. If this is the case, then episodic-like memory in animals may be quite different from human episodic memory in which people can reconstruct past experiences within an absolute temporal dimension.
Their experiments show that this is the case.
Three groups of Long-Evans hooded rats were tested for memory of previously encountered food. The different groups could use only the cues of when, how long ago, or when + how long ago. Only the cue of how long ago food was encountered was used successfully. These results suggest that episodic-like memory in rats is qualitatively different from human episodic memory.
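For illustration only, here is a minimal sketch - my own, with arbitrary parameters, not the Roberts et al. model - of how a "return or avoid" decision could be driven entirely by elapsed time read out from a decaying memory trace, with no representation of when the caching event happened:

import math

TAU_HOURS = 12.0          # assumed decay constant of the memory trace
FRESH_THRESHOLD = 0.25    # trace strength below which the cached food is treated as spoiled

def trace_strength(hours_since_caching):
    # Strength of a memory trace that decays exponentially with elapsed time.
    return math.exp(-hours_since_caching / TAU_HOURS)

def should_return(hours_since_caching):
    # Return to the cache only while the trace is still strong (food likely still good).
    return trace_strength(hours_since_caching) >= FRESH_THRESHOLD

for h in (1, 6, 24, 48):
    print(f"{h:>2} h after caching: trace = {trace_strength(h):.2f}, return = {should_return(h)}")

Such a strategy succeeds whenever only the how-long-ago cue is available, which is exactly the pattern the rats showed.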
Creating Musical Variation
Here is a clip from a very interesting perspectives piece on approaches to creating musical variation, by Diana S. Dabby in the April 4 issue of Science:
In the 21 letters that Mozart wrote to his friend Michael Puchberg between 1788 and 1791, there exist at least 24 variants of the supplication "Brother, can you spare a dime?" Mozart ornaments his language to cajole, flatter, and play on Puchberg's sympathies. He varies his theme of "cash needed now" in much the same way an 18th-century composer might dress a melody in new attire by weaving additional notes around its thematic tones in order to create a variation. Such ornamentation could enliven and elaborate one or more musical entities, as can be heard in the Haydn F Minor Variations (1793) (mp3 file of theme, mp3 file of variation). The Haydn represents one of the most popular forms of the 18th and 19th centuries--variations on original or borrowed themes. Yet myriad variation techniques existed besides ornamentation, including permutation and combination, as advocated by a number of 18th-century treatises. More recently, fields such as chaos theory have allowed composers to create new kinds of variations, some of which are reminiscent of earlier combinatorial techniques.
In a broad context, variation refers to the technique of altering musical material to create something related, yet new. Recognizing its importance to composers, the 20th-century composer and teacher Arnold Schoenberg defined variation as "repetition in which some features are changed and the rest preserved". He wrote numerous examples showing how a group of four notes, each having the same duration, can be varied by making rhythmic alterations, adding neighboring notes, changing the order of the notes, and so on (see the figure, panels A to C). Changing the order of the notes reflects the 18th-century practice of ars combinatoria. Joseph Riepel advocated a similar approach (see the figure, panel D).
Figure - Idea and variations. Variation techniques illustrated by Schoenberg, Riepel, and a chaotic mapping example. Schoenberg offers numerous ways to vary a given four-note group, shown in the first measure of each line. (A) Rhythmic changes. (B) Addition of neighboring notes. (C) Changing the original order. (D) One of many examples given by Riepel of ars permutatoria, a branch of ars combinatoria, where six permutations of the notes A B C are given (15). Note that Riepel writes above the staff the German musical spelling of the notes so that "B" translates to B-flat. (E) The first measure of a Bach prelude (pitches only) followed by the first measure of a variation generated by the chaotic mapping.
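As a small illustration of the ars combinatoria idea in panel (D) - not Dabby's chaotic mapping, which is considerably more involved - the following sketch simply enumerates every ordering of a three-note group as candidate variations:

from itertools import permutations

theme = ["A", "Bb", "C"]   # Riepel's German "B" is our B-flat
for i, variation in enumerate(permutations(theme), start=1):
    print(f"variation {i}: {' '.join(variation)}")

Six orderings of three notes; combinatorial variation enumerates reorderings rather than ornamenting the theme.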
Tuesday, April 08, 2008
Simple curves can influence whether we see happy or sad faces.
Here is an interesting bit of work from Xu et al. showing that adaptation to simple stimuli (like the shape of a mouth) that are processed early in the visual hierarchy can influence higher-level percepts (i.e., of faces) that are analyzed further up the visual hierarchy. Thus, after adapting to a concave (sad) cartoon mouth shape, subjects are more likely to report a subsequently viewed face as happy, and vice versa. Their abstract:
Adaptation is ubiquitous in sensory processing. Although sensory processing is hierarchical, with neurons at higher levels exhibiting greater degrees of tuning complexity and invariance than those at lower levels, few experimental or theoretical studies address how adaptation at one hierarchical level affects processing at others. Nevertheless, this issue is critical for understanding cortical coding and computation. Therefore, we examined whether perception of high-level facial expressions can be affected by adaptation to low-level curves (i.e., the shape of a mouth). After adapting to a concave curve, subjects more frequently perceived faces as happy, and after adapting to a convex curve, subjects more frequently perceived faces as sad. We observed this multilevel aftereffect with both cartoon and real test faces when the adapting curve and the mouths of the test faces had the same location. However, when we placed the adapting curve 0.2° below the test faces, the effect disappeared. Surprisingly, this positional specificity held even when real faces, instead of curves, were the adapting stimuli, suggesting that it is a general property for facial-expression aftereffects. We also studied the converse question of whether face adaptation affects curvature judgments, and found such effects after adapting to a cartoon face, but not a real face. Our results suggest that there is a local component in facial-expression representation, in addition to holistic representations emphasized in previous studies. By showing that adaptation can propagate up the cortical hierarchy, our findings also challenge existing functional accounts of adaptation.
Here are some examples of the face stimuli used in the studies, in which both the experimenters and naive observers served as subjects:
Figure - Examples of the face stimuli used in this study. a, Cartoon faces used in experiment 1, generated with our anti-aliasing program. The mouth curvature varied from concave to convex to produce sad to happy expressions. b, Ekman faces used in experiment 2. The first (sad) and last (happy) images were taken from the Ekman PoFA database, and the intermediate ones were generated with MorphMan 4.0. c, MMI faces used in experiments 3 and 4. The first (sad), middle (neutral), and last (happy) images were taken from the MMI face database, and the other images were generated with MorphMan 4.0.
Our Racist, Sexist Selves
Kristof has a great Op-Ed piece in the Sunday NYTimes with the title of this post. You should check out the psychological experiments that you can do online. You may think you are not prejudiced, but these "implicit attitude tests" might show otherwise.
Monday, April 07, 2008
The Amazing Aging Brain
Blog Categories:
aging,
brain plasticity,
human development
The social cognitive neuroscience of business organizations
Jumping on the bandwagon of getting cognitive neuroscience into business and marketing, a special issue of the Annals of the New York Academy of Sciences offers one open access article by Klein and D'Esposito, "Neurocognitive Inefficacy of the Strategy Process." Their abstract (written in business-speak gobbledegook, but the content can be extracted):
The most widely used (and taught) protocols for strategic analysis—Strengths, Weaknesses, Opportunities, and Threats (SWOT) and Porter's (1980) Five Force Framework for industry analysis—have been found to be insufficient as stimuli for strategy creation or even as a basis for further strategy development. We approach this problem from a neurocognitive perspective. We see profound incompatibilities between the cognitive process—deductive reasoning—channeled into the collective mind of strategists within the formal planning process through its tools of strategic analysis (i.e., rational technologies) and the essentially inductive reasoning process actually needed to address ill-defined, complex strategic situations. Thus, strategic analysis protocols that may appear to be and, indeed, are entirely rational and logical are not interpretable as such at the neuronal substrate level where thinking takes place. The analytical structure (or propositional representation) of these tools results in a mental dead end, the phenomenon known in cognitive psychology as functional fixedness. The difficulty lies with the inability of the brain to make out meaningful (i.e., strategy-provoking) stimuli from the mental images (or depictive representations) generated by strategic analysis tools. We propose decreasing dependence on these tools and conducting further research employing brain imaging technology to explore complex data handling protocols with richer mental representation and greater potential for strategy creation.
The spiritual side of atheism
This from Andrew Sullivan's blog.
Friday, April 04, 2008
Report from the road - Central Texas Bluebonnets
On yesterday's drive into Austin, Texas, the roadsides (seeded with wildflowers by Lady Bird Johnson) were a riot of spring flowers at their peak. (Click to enlarge.)
The social brain in adolescence - a review
In a recent Nature Reviews Neuroscience, Sarah-Jayne Blakemore reviews changes in the social brain during adolescence. Here is her slightly edited capsule summary, along with one figure reviewing the relevant brain structures:
The 'social brain', the network of brain regions involved in understanding other people, includes the medial prefrontal cortex (mPFC) and the posterior superior temporal sulcus (pSTS). These regions are key to the process of mentalizing — that is, the attribution of mental states to oneself and to other people...Recent functional neuroimaging studies have shown that activity in parts of the social brain during social cognitive tasks changes during adolescence... activity in the PFC during face-processing tasks increases from childhood to adolescence and then decreases from adolescence to adulthood. Consistent with this, activity in the mPFC during mentalizing tasks decreases between adolescence and adulthood.
The prefrontal cortex is one of the brain regions that undergo structural development, including synaptic reorganization, during adolescence. Synaptic density, reflected in grey-matter volume in MRI scans, decreases during adolescence...Synaptic reorganization in the PFC might underlie the functional changes that are seen in the social brain during adolescence, as well as the social cognitive changes during this period.
Figure - Regions that are involved in social cognition include the medial prefrontal cortex (mPFC) and the temporoparietal junction (TPJ), which are involved in thinking about mental states, and the posterior superior temporal sulcus (pSTS), which is activated by observing faces and biological motion. Other regions of the social brain on the lateral surface are the inferior frontal gyrus (IFG) and the intraparietal sulcus (IPS). Regions on the medial surface that are involved in social cognition include the amygdala, the anterior cingulate cortex (ACC) and the anterior insula (AI).
Preschool children's narratives predict later math performance
In a Nature journal club note, Devlin points out work by O'Neill and colleagues, who examined whether language development in preschool children might be a predictor of later math ability, given that early aptitude for arithmetic is not a terribly good indicator of future math performance.
O'Neill and her team showed three- and four-year-old children a picture book and asked them to tell a story about what they saw...narrative measures of conjunction use, event content, perspective shift, and mental state reference were significantly predictive of later Math scores. The sophistication with which the children told their stories was important. The most significant feature of this sophistication was children's ability to switch perspectives as they related the stories. Crucially, this correlation pertained not to later performance in reading, spelling or general knowledge, but to future mathematical ability.
Thursday, April 03, 2008
Report from the road...
My first day on the road to Austin, TX (see April 1 post) ended at Wakulla Springs State Park in the Florida panhandle, staying overnight in the Park Lodge. The second night has been at the L'Auberge du Lac Casino in Lake Charles, Louisiana (the room sans gambling)...heading out for the Creole Nature Trail along the coast today.
Runner's High - endorphin release finally demonstrated
It has long been assumed that strenuous exercise causes chemical changes in the brain, particularly the release of endorphins, the brain’s naturally occurring opiates. The problem with this idea, from Kolata's review, has been:
...that it was not feasible to do a spinal tap before and after someone exercised to look for a flood of endorphins in the brain. Researchers could detect endorphins in people’s blood after a run, but those endorphins were part of the body’s stress response and could not travel from the blood to the brain. They were not responsible for elevating one’s mood. So for more than 30 years, the runner’s high remained an unproved hypothesis.
Boecker et al. used a synthetic opioid labelled with a fluorine isotope ([18F]FDPN), visible in positron emission tomography (PET) brain scans, which binds to brain opioid receptors. After running, less of this compound was found bound to several brain sites important in mood control, because those sites had become occupied by endogenous opioids released during the run. Affective state before and after running, as well as before the resting PET scan, was evaluated with visual analog scales (VAS) - subjects rated items such as sadness, tension, fear, anger, confusion, fatigue, happiness, and energy - yielding the VAS euphoria scale referenced in the figure. VAS ratings of euphoria were inversely correlated with [18F]FDPN binding. Here is that figure, followed by the full abstract from the article.
Figure - Correlation of opioidergic binding in runners with VAS ratings of euphoria. Statistical parametric maps of the regression analysis (regions where VAS ratings of euphoria are inversely correlated with [18F]FDPN binding) in standard stereotactic space (Montreal Neurological Institute [MNI] space) are overlaid in color on axial slices of a skull-stripped normalized brain.
The abstract:
The runner's high describes a euphoric state resulting from long-distance running. The cerebral neurochemical correlates of exercise-induced mood changes have been barely investigated so far. We aimed to unravel the opioidergic mechanisms of the runner's high in the human brain and to identify the relationship to perceived euphoria. We performed a positron emission tomography "ligand activation" study with the nonselective opioidergic ligand 6-O-(2-[18F]fluoroethyl)-6-O-desmethyldiprenorphine ([18F]FDPN). Ten athletes were scanned at 2 separate occasions in random order, at rest and after 2 h of endurance running (21.5 ± 4.7 km). Binding kinetics of [18F]FDPN were quantified by basis pursuit denoising (DEPICT software). Statistical parametric mapping (SPM2) was used for voxelwise analyses to determine relative changes in ligand binding after running and correlations of opioid binding with euphoria ratings. Reductions in opioid receptor availability were identified preferentially in prefrontal and limbic/paralimbic brain structures. The level of euphoria was significantly increased after running and was inversely correlated with opioid binding in prefrontal/orbitofrontal cortices, the anterior cingulate cortex, bilateral insula, parainsular cortex, and temporoparietal regions. These findings support the "opioid theory" of the runner's high and suggest region-specific effects in frontolimbic brain areas that are involved in the processing of affective states and mood.
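As a back-of-the-envelope illustration of the correlation logic (synthetic numbers, not the study's measurements - the inverse relationship is built in purely for demonstration): lower post-run binding means more receptors already occupied by endogenous opioids, which should track higher euphoria.

import numpy as np

rng = np.random.default_rng(3)
n_subjects = 10                                    # the study scanned ten athletes

# Hypothetical values with the hypothesized inverse relationship built in:
binding = rng.uniform(0.8, 1.2, n_subjects)        # post-run [18F]FDPN binding (arbitrary units)
euphoria = 10 - 6 * binding + rng.normal(0, 0.3, n_subjects)   # VAS euphoria ratings

r = np.corrcoef(euphoria, binding)[0, 1]
print(f"Pearson r between euphoria and opioid binding: {r:.2f}")   # negative by construction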
From infants to adults, color perception switches from right to left hemisphere
An interesting article by Franklin et al. shows that categorical perception of color (CP) starts in the right hemisphere, but then switches to the left hemisphere as it develops the lexical color codes of language. They suggest that language-driven CP in adults may not build on prelinguistic CP, but that language instead imposes its categories on a left hemisphere that is not categorically prepartitioned.
Blog Categories:
attention/perception,
human development,
language
Wednesday, April 02, 2008
The 'size' of an odor can influence our reaching to grasp an object.
A nice example of multisensory integration from Tubaldi et al. They find that an odor carries information detailed enough to elicit the planning of a reach-to-grasp movement suited to interacting with the evoked object. From their paper:
The size of the object evoked by the odour has the potential to modulate hand shaping. Importantly, the fact that ‘size’ olfactory information modulates the hand at the level of individual digits (and not only the thumb-index distance as previously reported) leads to two important considerations in terms of sensorimotor transformation. First, from a perceptual perspective, the representation evoked by the odour seems to contain highly detailed information about the object (i.e., volumetric features rather than a linear dimension such as the thumb-index distance). If olfaction had provided a blurred and holistic object's representation (i.e., a low spatial-resolution of the object's image), then the odour would have not affected the hand in its entirety. Second, from a motor perspective, the olfactory representation seems to be mapped into the action vocabulary with a certain degree of reliability. The elicited motor plan embodies specific and selective commands for handling the ‘smelled’ object, and it is fully manageable by the motor system. Therefore, it is not an incomplete primal sketch which only provides a preliminary descriptive in the terms of motor execution.
Some of the details:
When the odour was ‘large’ and the visual target was small, only one finger joint (i.e., the mcp joint of the ring finger) was affected by the olfactory stimulus. In contrast, the influence of the ‘small’ odour on the kinematics of a reach-to-grasp movement towards a large target was much more evident and a greater number of joints were mobilized. This seems to suggest that planning for a reach-to-grasp movement on the basis of a ‘small’ odour when the target is large poses more constraints than when the odour is ‘large’ and the movement is directed towards a small target. Our proposal is that the motor plan elicited by the odour has to be modified according to the visual target. However such reorganization could be more easily managed without compromising object grasp when the odour is ‘large’ and the visual target is small than vice versa.
When a preceding odour elicits a motor plan which is congruent with the motor plan subsequently established for the visual target, the kinematic patterning is magnified. Therefore, the grasp plan triggered by the olfactory stimulus primed the grasp plan established for the visual target. This effect was evident at the very beginning of the movement, fading away during the second phase of the movement. For both the incongruent conditions the conflict between the ‘olfactory’ and the ‘visual’ grasp plans lasted for the entire movement duration. Importantly, and again in contrast with what reported for the incongruent conditions, an odour of a similar ‘size’ than the visual target, does not alter hand synergies with respect to when no-odour is presented. This indicates that when the ‘size’ of the odour and the size of the visual target match, the integration of the two modalities reinforces the grasp plan, the established synergic pattern is more ‘protected’ and it does not change. Having two sources carrying similar information leads to a more stable and coherent action.
Antidepressant effects of eating less.
I notice that when I get paranoid about my weight creeping up and suddenly eat less for several days, my general mood improves considerably.... I wonder if the chemistry described in these (admittedly more extreme) experiments on rodents done by Lutter et al. is what is going on. The experiments deal with the orexin neuropeptides, which can stimulate food-seeking activity in mice and decrease anxiety-like behaviors in helplessness and social-defeat models of stress. (Decreased levels of orexin-A have been reported in the CSF of suicidal patients with major depressive disorder, supporting chronic social defeat stress as a model of major depression.) The title of the article is "Orexin Signaling Mediates the Antidepressant-Like Effect of Calorie Restriction." Here is the abstract:
During periods of reduced food availability, animals must respond with behavioral adaptations that promote survival. Despite the fact that many psychiatric syndromes include disordered eating patterns as a component of the illness, little is known about the neurobiology underlying behavioral changes induced by short-term calorie restriction. Presently, we demonstrate that 10 d of calorie restriction, corresponding to a 20–25% weight loss, causes a marked antidepressant-like response in two rodent models of depression and that this response is dependent on the hypothalamic neuropeptide orexin (hypocretin). Wild-type mice, but not mice lacking orexin, show longer latency to immobility and less total immobility in the forced swim test after calorie restriction. In the social defeat model of chronic stress, calorie restriction reverses the behavioral deficits seen in wild-type mice but not in orexin knock-out mice. Additionally, chronic social defeat stress induces a prolonged reduction in the expression of prepro-orexin mRNA via epigenetic modification of the orexin gene promoter, whereas calorie restriction enhances the activation of orexin cells after social defeat. Together, these data indicate that orexin plays an essential role in mediating reduced depression-like symptoms induced by calorie restriction.
Tuesday, April 01, 2008
MindBlog hits the road....
I'm loading boxes into my Honda Civic, leaving my condo in paradise (Fort Lauderdale) to return to Madison, Wisconsin via Austin, Texas - where I visit my son and his wife who live in the family house in which I grew up. It is a week or two early to return to Wisconsin, but I've decided I should symbolically share the suffering by arriving for the last gasp of a winter that has deposited 107 inches of snow on my Twin Valley home.
I've decided to take a leisurely tourist drive, tonight staying in Wakulla Springs State Park in the Florida panhandle, at the Wakulla Springs Lodge, an example of Mediterranean Revival architecture built in 1937 by Edward Ball, who established the Wakulla Springs wildlife preserve in 1934. After driving along Florida's Gulf coast Wednesday I'm heading on to Lake Charles, Louisiana, and crashing at the L'Auberge du Lac casino. Thursday morning I will take the Creole Nature Trail along the Louisiana Gulf coast into Texas, and then head on to Austin. I'm not sure what my internet status will be. I have asked a friend to post some blog drafts I've prepared ahead. It would be therapeutic for me to be off the grid for a few days.......
Mind Reading with fMRI
From the Nature Editor's summary:
Recent functional magnetic resonance imaging (fMRI) studies have shown that, based on patterns of activity evoked by different categories of visual images, it is possible to deduce simple features in the visual scene, or to which category it belongs. Kay et al. take this approach a tantalizing step further. Their newly developed decoding method, based on quantitative receptive field models that characterize the relationship between visual stimuli and fMRI activity in early visual areas, can identify with high accuracy which specific natural image an observer saw, even for an image chosen at random from 1,000 distinct images. This prompts the thought that it may soon be possible to decode subjective perceptual experiences such as visual imagery and dreams, an idea previously restricted to the realm of science fiction.
The abstract from Kay et al., followed by one figure:
A challenging goal in neuroscience is to be able to read out, or decode, mental content from brain activity. Recent functional magnetic resonance imaging (fMRI) studies have decoded orientation, position, and object category from activity in visual cortex. However, these studies typically used relatively simple stimuli (for example, gratings) or images drawn from fixed categories (for example, faces, houses), and decoding was based on previous measurements of brain activity evoked by those same stimuli or categories. To overcome these limitations, here we develop a decoding method based on quantitative receptive-field models that characterize the relationship between visual stimuli and fMRI activity in early visual areas. These models describe the tuning of individual voxels for space, orientation and spatial frequency, and are estimated directly from responses evoked by natural images. We show that these receptive-field models make it possible to identify, from a large set of completely novel natural images, which specific image was seen by an observer. Identification is not a mere consequence of the retinotopic organization of visual areas; simpler receptive-field models that describe only spatial tuning yield much poorer identification performance. Our results suggest that it may soon be possible to reconstruct a picture of a person's visual experience from measurements of brain activity alone.
Figure Legend - The experiment consisted of two stages. In the first stage, model estimation, fMRI data were recorded while each subject viewed a large collection of natural images. These data were used to estimate a quantitative receptive-field model for each voxel. In the second stage, image identification, fMRI data were recorded while each subject viewed a collection of novel natural images. For each measurement of brain activity, we attempted to identify which specific image had been seen. This was accomplished by using the estimated receptive-field models to predict brain activity for a set of potential images and then selecting the image whose predicted activity most closely matches the measured activity.
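Here is a minimal sketch of that identification step - my own simplified illustration, not the authors' code; the linear feature model and all array sizes are assumptions standing in for the fitted space/orientation/spatial-frequency receptive-field models:

import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_features, n_candidates = 200, 50, 1000

# Assumed: each voxel's response is a linear function of image features.
voxel_weights = rng.normal(size=(n_voxels, n_features))            # "fitted" receptive-field models
candidate_features = rng.normal(size=(n_candidates, n_features))   # features of the 1,000 candidate images

# Predicted activity pattern for every candidate image.
predicted = candidate_features @ voxel_weights.T                   # shape (n_candidates, n_voxels)

# Simulate one measurement: the observer saw candidate #42, plus scanner noise.
true_index = 42
measured = predicted[true_index] + rng.normal(scale=2.0, size=n_voxels)

def identify(measured_pattern, predicted_patterns):
    # Pick the candidate whose predicted pattern correlates best with the measurement.
    z_meas = (measured_pattern - measured_pattern.mean()) / measured_pattern.std()
    z_pred = (predicted_patterns - predicted_patterns.mean(axis=1, keepdims=True)) \
             / predicted_patterns.std(axis=1, keepdims=True)
    return int(np.argmax(z_pred @ z_meas))

print("identified image:", identify(measured, predicted), "(true image was", true_index, ")")

The key design choice, as in the paper, is that identification uses only predicted responses to novel images; the models were never fit to those images.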
Blog Categories:
attention/perception,
consciousness,
technology
Monday, March 31, 2008
Regulating the brain circuits of compassion
Here is yet more compelling evidence that you are what you spend your time imagining. In a recent study in PLoS ONE, Lutz, Davidson and colleagues extend their previous work on correlations between brain states and meditation to show that one particular kind of Buddhist meditation, which emphasizes empathetic and loving thoughts towards others and self, changes the brain's reactivity to emotional sounds. In experienced practitioners of the 'loving-kindness-compassion' meditation technique, such sounds evoked larger responses in the insular and anterior cingulate cortices than were observed in novices. Here is their abstract and one figure from the paper.
Recent brain imaging studies using functional magnetic resonance imaging (fMRI) have implicated insula and anterior cingulate cortices in the empathic response to another's pain. However, virtually nothing is known about the impact of the voluntary generation of compassion on this network. To investigate these questions we assessed brain activity using fMRI while novice and expert meditation practitioners generated a loving-kindness-compassion meditation state. To probe affective reactivity, we presented emotional and neutral sounds during the meditation and comparison periods. Our main hypothesis was that the concern for others cultivated during this form of meditation enhances affective processing, in particular in response to sounds of distress, and that this response to emotional sounds is modulated by the degree of meditation training. The presentation of the emotional sounds was associated with increased pupil diameter and activation of limbic regions (insula and cingulate cortices) during meditation (versus rest). During meditation, activation in insula was greater during presentation of negative sounds than positive or neutral sounds in expert than it was in novice meditators. The strength of activation in insula was also associated with self-reported intensity of the meditation for both groups. These results support the role of the limbic circuitry in emotion sharing. The comparison between meditation vs. rest states between experts and novices also showed increased activation in amygdala, right temporo-parietal junction (TPJ), and right posterior superior temporal sulcus (pSTS) in response to all sounds, suggesting, greater detection of the emotional sounds, and enhanced mentation in response to emotional human vocalizations for experts than novices during meditation. Together these data indicate that the mental expertise to cultivate positive emotion alters the activation of circuitries previously linked to empathy and theory of mind in response to emotional stimuli.
Figure - (AI) and (Ins.) stand for anterior insula and insula, respectively (z = 12 and z = 19; 15 experts and 15 novices; color codes: orange, p < 5 x 10^-2; yellow, p < 2 x 10^-2). B, C. Impulse response from rest to compassion in response to emotional sounds in AI (B) and Ins. (C). D–E. Responses in AI (D) and Ins. (E) during poor and good blocks of compassion, as verbally reported, for 12 experts (red) and 10 novices (blue).
Blog Categories:
happiness,
meditation,
mirror neurons,
morality,
social cognition
Sunday, March 30, 2008