This blog reports new ideas and work on mind, brain, behavior, psychology, and politics - as well as random curious stuff. (Try the Dynamic Views at top of right column.)
Thursday, October 29, 2015
Low-power people are more trusting in social exchange.
Schilke et al. make observations suggesting that low-power individuals want the high-power people they interact with to be trustworthy, and act according to that desire:
How does lacking vs. possessing power in a social exchange affect people’s trust in their exchange partner? An answer to this question has broad implications for a number of exchange settings in which dependence plays an important role. Here, we report on a series of experiments in which we manipulated participants’ power position in terms of structural dependence and observed their trust perceptions and behaviors. Over a variety of different experimental paradigms and measures, we find that more powerful actors place less trust in others than less powerful actors do. Our results contradict predictions by rational actor models, which assume that low-power individuals are able to anticipate that a more powerful exchange partner will place little value on the relationship with them, thus tends to behave opportunistically, and consequently cannot be trusted. Conversely, our results support predictions by motivated cognition theory, which posits that low-power individuals want their exchange partner to be trustworthy and then act according to that desire. Mediation analyses show that, consistent with the motivated cognition account, having low power increases individuals’ hope and, in turn, their perceptions of their exchange partners’ benevolence, which ultimately leads them to trust.
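For readers who want to see what a mediation estimate of this kind actually involves, here is a minimal sketch in Python - synthetic data and hypothetical variable names, not the authors' code or dataset. The serial indirect effect (power -> hope -> perceived benevolence -> trust) is just the product of the three path coefficients, with a bootstrap confidence interval around it.

# Illustrative sketch of a serial mediation analysis (power -> hope ->
# perceived benevolence -> trust). Synthetic data and hypothetical variable
# names; not the authors' data or code.
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulate: low power (coded 1) raises hope, hope raises perceived
# benevolence, and benevolence raises trust.
low_power = rng.integers(0, 2, n).astype(float)
hope = 0.5 * low_power + rng.normal(size=n)
benevolence = 0.6 * hope + rng.normal(size=n)
trust = 0.7 * benevolence + rng.normal(size=n)

def ols_slopes(y, predictors):
    """Coefficients of y regressed on the given predictors (intercept dropped)."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

def indirect_effect(lp, h, b, t):
    a = ols_slopes(h, [lp])[0]          # power -> hope
    b_path = ols_slopes(b, [h, lp])[0]  # hope -> benevolence (controlling for power)
    c = ols_slopes(t, [b, h, lp])[0]    # benevolence -> trust (controlling for hope, power)
    return a * b_path * c               # serial indirect effect

point = indirect_effect(low_power, hope, benevolence, trust)

# Percentile bootstrap confidence interval for the indirect effect.
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(low_power[idx], hope[idx], benevolence[idx], trust[idx]))
low, high = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {point:.3f}, 95% CI [{low:.3f}, {high:.3f}]")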
Wednesday, October 28, 2015
How much sleep do we really need?
A study by Yetish et al. casts fascinating light on the widespread idea that a large fraction of us in modern industrial societies are sleep-deprived, going to bed later than is "natural" and sleeping less than our bodies need. They monitored the sleep patterns of three hunter-gatherer cultures in Bolivia, Tanzania, and Namibia. Here is their summary:
Highlights
• Preindustrial societies in Tanzania, Namibia, and Bolivia show similar sleep parameters
• They do not sleep more than “modern” humans, with average durations of 5.7–7.1 hr
• They go to sleep several hours after sunset and typically awaken before sunrise
• Temperature appears to be a major regulator of human sleep duration and timing
Summary
How did humans sleep before the modern era? Because the tools to measure sleep under natural conditions were developed long after the invention of the electric devices suspected of delaying and reducing sleep, we investigated sleep in three preindustrial societies. We find that all three show similar sleep organization, suggesting that they express core human sleep patterns, most likely characteristic of pre-modern era Homo sapiens. Sleep periods, the times from onset to offset, averaged 6.9–8.5 hr, with sleep durations of 5.7–7.1 hr, amounts near the low end of those industrial societies. There was a difference of nearly 1 hr between summer and winter sleep. Daily variation in sleep duration was strongly linked to time of onset, rather than offset. None of these groups began sleep near sunset, onset occurring, on average, 3.3 hr after sunset. Awakening was usually before sunrise. The sleep period consistently occurred during the nighttime period of falling environmental temperature, was not interrupted by extended periods of waking, and terminated, with vasoconstriction, near the nadir of daily ambient temperature. The daily cycle of temperature change, largely eliminated from modern sleep environments, may be a potent natural regulator of sleep. Light exposure was maximal in the morning and greatly decreased at noon, indicating that all three groups seek shade at midday and that light activation of the suprachiasmatic nucleus is maximal in the morning. Napping occurred on fewer than 7% of days in winter and fewer than 22% of days in summer. Mimicking aspects of the natural environment might be effective in treating certain modern sleep disorders.
Tuesday, October 27, 2015
Chilling down our religiosity and intolerance with some magnets?
A group of collaborators has used transcranial magnetic stimulation (TMS) to dial down activity in the posterior medial frontal cortex (pMFC), an area that evaluates threats and plans responses. Subjects who had undergone this procedure expressed less bias against immigrants and less belief in God than a group that received a sham TMS treatment.
People cleave to ideological convictions with greater intensity in the aftermath of threat. The posterior medial frontal cortex (pMFC) plays a key role in both detecting discrepancies between desired and current conditions and adjusting subsequent behavior to resolve such conflicts. Building on prior literature examining the role of the pMFC in shifts in relatively low-level decision processes, we demonstrate that the pMFC mediates adjustments in adherence to political and religious ideologies. We presented participants with a reminder of death and a critique of their in-group ostensibly written by a member of an out-group, then experimentally decreased both avowed belief in God and out-group derogation by downregulating pMFC activity via transcranial magnetic stimulation. The results provide the first evidence that group prejudice and religious belief are susceptible to targeted neuromodulation, and point to a shared cognitive mechanism underlying concrete and abstract decision processes. We discuss the implications of these findings for further research characterizing the cognitive and affective mechanisms at play.
Monday, October 26, 2015
The hippocampus is essential for recall but not for recognition.
From Patai et al:
Which specific memory functions are dependent on the hippocampus is still debated. The availability of a large cohort of patients who had sustained relatively selective hippocampal damage early in life enabled us to determine which type of mnemonic deficit showed a correlation with extent of hippocampal injury. We assessed our patient cohort on a test that provides measures of recognition and recall that are equated for difficulty and found that the patients' performance on the recall tests correlated significantly with their hippocampal volumes, whereas their performance on the equally difficult recognition tests did not and, indeed, was largely unaffected regardless of extent of hippocampal atrophy. The results provide new evidence in favor of the view that the hippocampus is essential for recall but not for recognition.
Friday, October 23, 2015
Brain activity associated with predicting rewards to others.
Lockwood et al. make the interesting observation that a subregion of the anterior cingulate cortex shows specialization for processing others' versus one's own rewards.
Empathy—the capacity to understand and resonate with the experiences of others—can depend on the ability to predict when others are likely to receive rewards. However, although a plethora of research has examined the neural basis of predictions about the likelihood of receiving rewards ourselves, very little is known about the mechanisms that underpin variability in vicarious reward prediction. Human neuroimaging and nonhuman primate studies suggest that a subregion of the anterior cingulate cortex in the gyrus (ACCg) is engaged when others receive rewards. Does the ACCg show specialization for processing predictions about others' rewards and not one's own and does this specialization vary with empathic abilities? We examined hemodynamic responses in the human brain time-locked to cues that were predictive of a high or low probability of a reward either for the subject themselves or another person. We found that the ACCg robustly signaled the likelihood of a reward being delivered to another. In addition, ACCg response significantly covaried with trait emotion contagion, a necessary foundation for empathizing with other individuals. In individuals high in emotion contagion, the ACCg was specialized for processing others' rewards exclusively, but for those low in emotion contagion, this region also responded to information about the subject's own rewards. Our results are the first to show that the ACCg signals probabilistic predictions about rewards for other people and that the substantial individual variability in the degree to which the ACCg is specialized for processing others' rewards is related to trait empathy.
Thursday, October 22, 2015
Drugs or therapy for depression?
I want to pass on a few clips from a piece by Friedman, summarizing work by Mayberg and collaborators at Emory University, who looked for brain activity that might predict whether a depressed patient would respond better to psychotherapy or antidepressant medication:
Using PET scans, she randomized a group of depressed patients to either 12 weeks of treatment with the S.S.R.I. antidepressant Lexapro or to cognitive behavior therapy, which teaches patients to correct their negative and distorted thinking.
Over all, about 40 percent of the depressed subjects responded to either treatment. But Dr. Mayberg found striking brain differences between patients who did well with Lexapro compared with cognitive behavior therapy, and vice versa. Patients who had low activity in a brain region called the anterior insula measured before treatment responded quite well to C.B.T. but poorly to Lexapro; conversely, those with high activity in this region had an excellent response to Lexapro, but did poorly with C.B.T.
We know that the insula is centrally involved in the capacity for emotional self-awareness, cognitive control and decision making, all of which are impaired by depression. Perhaps cognitive behavior therapy has a more powerful effect than an antidepressant in patients with an underactive insula because it teaches patients to control their emotionally disturbing thoughts in a way that an antidepressant cannot.
These neurobiological differences may also have important implications for treatment, because for most forms of depression, there is little evidence to support one form of treatment over another...Currently, doctors typically prescribe antidepressants on a trial-and-error basis, selecting or adding one antidepressant after another when a patient fails to respond to the first treatment. Rarely does a clinician switch to an empirically proven psychotherapy like cognitive behavior therapy after a patient fails to respond to medication, although these data suggest this might be just the right strategy. One day soon, we may be able to quickly scan a patient with an M.R.I. or PET, check the brain activity “fingerprint” and select an antidepressant or psychotherapy accordingly.
Is the nonspecific nature of talk therapy — feeling understood and cared for by another human being — responsible for its therapeutic effect? Or will specific types of therapy — like C.B.T. or interpersonal or psychodynamic therapy — show distinctly different clinical and neurobiological effects for various psychiatric disorders?...Right now we don’t have a clue, in part because of the current research funding priorities of the National Institute of Mental Health, which strongly favors brain science over psychosocial treatments. But these are important questions, and we owe it to our patients to try to answer them.
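The decision rule implicit in these clips is simple enough to write down. Purely as an illustration of the "fingerprint" idea - the threshold, reference value, and function name below are hypothetical, and nothing here is a clinical tool - it amounts to comparing pretreatment anterior insula activity against a reference level:

# Sketch of the treatment-selection logic described above: low pretreatment
# anterior insula activity -> favor cognitive behavior therapy, high -> favor
# an SSRI. Threshold, reference value, and names are hypothetical; this
# illustrates the idea and is not a clinical tool.
def suggest_treatment(insula_activity: float, reference_level: float) -> str:
    """Suggest a first-line treatment from pretreatment anterior insula activity."""
    if insula_activity < reference_level:
        return "cognitive behavior therapy"      # hypoactive insula
    return "SSRI (e.g., escitalopram/Lexapro)"   # hyperactive insula

print(suggest_treatment(insula_activity=0.85, reference_level=1.0))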
Wednesday, October 21, 2015
Hoopla over a bit of rat brain…a complete brain simulation?
A vastly expensive and heavily marketed international collaboration, the "Blue Brain Project" (BBP), has now reported its first digital reconstruction of a slice of rat somatosensory cortex, the most complete simulation of a piece of excitable brain matter to date (still, a speck of tissue compared with the human brain, which is two million times larger). I, along with a chorus of critics, cannot see how a static depiction and reconstruction of a cortical column (~30,000 neurons, ~40 million synapses) is anything but a waste of money. The biological reality is that those neurons and synapses are not just sitting there, with static components cranking away like the innards of a computer. The wiring is plastic, constantly changing as axons, dendrites, and synapses grow and retract, altering the number and kind of connections over which information flows.
Koch and Buice make the generous point that all this might not matter if one could devise the biological equivalent of Alan Turing's Imitation game, seeing if an observer could tell whether output they observe for a given input is being generated by the simulation or by electrical recording from living tissue. Here are some interesting clips from their article in Cell.
...the current BBP model stops with the continuous and deterministic Hodgkin-Huxley currents...And therein lies an important lesson. If the real and the synthetic can’t be distinguished at the level of firing rate activity (even though it is uncontroversial that spiking is caused by the concerted action of tens of thousands of ionic channel proteins), the molecular level of granularity would appear to be irrelevant to explain electrical activity. Teasing out which mechanisms contribute to any specific phenomena is essential to what is meant by understanding.
Markram et al. claim that their results point to the minimal datasets required to model cortex. However, we are not aware of any rigorous argument in the present triptych of manuscripts, specifying the relevant level of granularity. For instance, are active dendrites, such as those of the tall, layer 5 pyramidal cells, essential? Could they be removed without any noticeable effect? Why not replace the continuous, macroscopic, and deterministic HH equations with stochastic Markov models of thousands of tiny channel conductances? Indeed, why not consider quantum mechanical levels of descriptions? Presumably, the latter two avenues have not been chosen because of their computational burden and the intuition that they are unlikely to be relevant. The Imitation Game offers a principled way of addressing these important questions: only add a mechanism if its impact on a specific set of measurables can be assessed by a trained observer.
Consider the problem of numerical weather prediction and climate modeling, tasks whose physico-chemical and computational complexity is comparable to whole-brain modeling. Planet-wide simulations that cover timescales from hours to decades require a deep understanding of how physical systems interact across multiple scales and careful choices about the scale at which different phenomena are modeled. This has led to an impressive increase in predictive power since 1950, when the first such computer calculations were performed. Of course, a key difference between weather prediction and whole-brain simulation is that the former has a very specific and quantifiable scientific question (to wit: “is it going to rain tomorrow?”). The BBP has created an impressive initial scaffold that will facilitate asking these kinds of questions for brains.
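Koch and Buice's biological "Imitation Game" can be made concrete with a toy example: extract summary statistics from recorded and simulated activity and ask whether a classifier (standing in for the trained observer) can tell them apart better than chance. Everything below is synthetic and illustrative - made-up spike counts stand in for real recordings and BBP output, and the features are just mean rate and Fano factor:

# A toy version of the "Imitation Game" test discussed above: can an observer
# (here, a classifier) distinguish simulated from recorded activity using
# simple summary statistics? Synthetic data only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

def spike_features(counts):
    """Per-trial features: mean rate and Fano factor of spike counts."""
    return np.array([counts.mean(), counts.var() / (counts.mean() + 1e-9)])

# 200 "recorded" trials (overdispersed counts) and 200 "simulated" trials
# (Poisson-like counts) -- a deliberately detectable difference.
recorded  = [spike_features(rng.negative_binomial(5, 0.5, size=100)) for _ in range(200)]
simulated = [spike_features(rng.poisson(5.0, size=100)) for _ in range(200)]

X = np.vstack(recorded + simulated)
y = np.array([0] * 200 + [1] * 200)

acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
print(f"cross-validated discrimination accuracy: {acc:.2f}  (0.5 = indistinguishable)")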
Blog Categories:
attention/perception,
brain plasticity,
technology
Tuesday, October 20, 2015
Meditation madness
Adam Grant has written a NYTimes Op-Ed piece that mirrors some of my own sentiments about the current meditation craze. There would seem to be almost nothing that practicing meditation doesn't enhance (ingrown toenails?). I'm fascinated by what studies on meditation have told us about how the mind works, and MindBlog has done many posts on the topic (click the meditation link under 'selected blog categories' in the right column). I and many others personally find it very useful in maintaining a calm and focused mind. But.... it is not a universal panacea, and many of its effects can be accomplished, as Grant points out, by other means. (By the way, a Wisconsin colleague of mine who has assisted in a number of the meditation studies conducted by Richard Davidson and collaborators at the University of Wisconsin feels that people who engage in meditation regimens display more depressive behaviors after a period of time.) Some clips from Grant's screed:
...Every benefit of the practice can be gained through other activities...This is the conclusion from an analysis of 47 trials of meditation programs, published last year in JAMA Internal Medicine: “We found no evidence that meditation programs were better than any active treatment (i.e., drugs, exercise and other behavioral therapies).”
O.K., so meditation is just one of many ways to fight stress. But there’s another major benefit of meditating: It makes you mindful. After meditating, people are more likely to focus their attention in the present. But as the neuroscientist Richard Davidson and the psychologist Alfred Kaszniak recently lamented, “There are still very few methodologically rigorous studies that demonstrate the efficacy of mindfulness-based interventions in either the treatment of specific diseases or in the promotion of well-being.”
And guess what? You don’t need to meditate to achieve mindfulness either...you can become more mindful by thinking in conditionals instead of absolutes...Change “is” to “could be,” and you become more mindful. The same is true when you look for an answer rather than the answer. (I would also point out that 'mindfulness' can frequently be generated by switching in your thoughts from a first-person to a third-person perspective.) Finally:
...in some situations, meditation may be harmful: Willoughby Britton, a Brown University Medical School professor, has discovered numerous cases of traumatic meditation experiences that intensify anxiety, reduce focus and drive, and leave people feeling incapacitated.
Monday, October 19, 2015
A brain switch that can make the familiar seem new?
We all face the issue of how to refresh and renew our energy and perspective after our brains have adapted, habituated, or desensitized to an ongoing interest or activity that has lost its novelty. As I engage my long-term interests in piano performance and studying how our minds work, I wish I could throw a "reset" switch in my brain that would let me approach the material as if it were new again. Ho et al. appear to have found such a switch, in the perirhinal cortex of rats, that regulates whether images are perceived as familiar or novel:
Perirhinal cortex (PER) has a well established role in the familiarity-based recognition of individual items and objects. For example, animals and humans with perirhinal damage are unable to distinguish familiar from novel objects in recognition memory tasks. In the normal brain, perirhinal neurons respond to novelty and familiarity by increasing or decreasing firing rates. Recent work also implicates oscillatory activity in the low-beta and low-gamma frequency bands in sensory detection, perception, and recognition. Using optogenetic methods in a spontaneous object exploration (SOR) task, we altered recognition memory performance in rats. In the SOR task, normal rats preferentially explore novel images over familiar ones. We modulated exploratory behavior in this task by optically stimulating channelrhodopsin-expressing perirhinal neurons at various frequencies while rats looked at novel or familiar 2D images. Stimulation at 30–40 Hz during looking caused rats to treat a familiar image as if it were novel by increasing time looking at the image. Stimulation at 30–40 Hz was not effective in increasing exploration of novel images. Stimulation at 10–15 Hz caused animals to treat a novel image as familiar by decreasing time looking at the image, but did not affect looking times for images that were already familiar. We conclude that optical stimulation of PER at different frequencies can alter visual recognition memory bidirectionally.

Unfortunately, given that rather fancy optogenetic methods were used to vary oscillatory activity in the perirhinal cortex, no human applications of this work are imminent.
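The behavioral readout in spontaneous exploration tasks like this is usually a simple looking-time preference score. Here is a generic discrimination index as an illustration - a standard way such preferences are quantified, though not necessarily the exact metric Ho et al. used:

# Generic novelty-preference (discrimination) index for a spontaneous
# object/image recognition test: +1 = looks only at the novel image,
# 0 = no preference, -1 = looks only at the familiar image. Not
# necessarily the exact metric used by Ho et al.
def discrimination_index(time_on_novel: float, time_on_familiar: float) -> float:
    total = time_on_novel + time_on_familiar
    if total == 0:
        raise ValueError("no exploration recorded")
    return (time_on_novel - time_on_familiar) / total

# Hypothetical looking times (seconds): 30-40 Hz stimulation would be
# expected to push the index for a familiar image back toward novelty.
print(discrimination_index(time_on_novel=18.0, time_on_familiar=6.0))   # 0.5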
Blog Categories:
attention/perception,
brain plasticity,
technology
Sunday, October 18, 2015
Sir Reginald's Marvellous Organ
Under the "random curious stuff" category noted in MindBlog's title, above, I can't resist passing on this naughty video sent by a friend...apologies to sensitive readers who only want the brain stuff.
Friday, October 16, 2015
Great apes can look ahead in time
Yet another supposed distinction between human and animal minds has bitten the dust. The prevailing dogma (expressed in my talk "The Beast Within") has been that animals don't anticipate the future. Now Kano and Hirata show that great apes (bonobos and chimpanzees) remember a movie they viewed a day earlier: when the movie is shown again, their eyes move in anticipation to the part of the screen where an action relevant to the storyline is about to happen.
Highlights
•We developed a novel eye-tracking task to examine great apes’ memory skills
•Apes watched the same videos twice across 2 days, with a 24-hr delay
•Apes made anticipatory looks based on where-what information on the second day
•Apes thus encoded ongoing events into long-term memory by single experiences
Summary
Everyday life poses a continuous challenge for individuals to encode ongoing events, retrieve past events, and predict impending events. Attention and eye movements reflect such online cognitive and memory processes, especially through “anticipatory looks”. Previous studies have demonstrated the ability of nonhuman animals to retrieve detailed information about single events that happened in the distant past. However, no study has tested whether nonhuman animals employ online memory processes, in which they encode ongoing movie-like events into long-term storage during single viewing experiences. Here, we developed a novel eye-tracking task to examine great apes’ anticipatory looks to the events that they had encountered one time 24 hr earlier. Half-minute movie clips depicted novel and potentially alarming situations to the participant apes (six bonobos, six chimpanzees). In the experiment 1 clip, an aggressive ape-like character came out from one of two identical doors. While viewing the same movie again, apes anticipatorily looked at the door where the character would show up. In the experiment 2 clip, the human actor grabbed one of two objects and attacked the character with it. While viewing the same movie again but with object-location switched, apes anticipatorily looked at the object that the human would use, rather than the former location of the object. Our results thus show that great apes, just by watching the events once, encoded particular information (location and content) into long-term memory and later retrieved that information at a particular time in anticipation of the impending events.
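For the curious, anticipatory looking in eye-tracking studies of this kind is typically scored as the fraction of fixation time that falls in the area of interest (AOI) where the event is about to occur, during a window before the event. The sketch below is illustrative only - the variable names and scoring rule are mine, not Kano and Hirata's code:

# Sketch of how anticipatory looking might be scored: during a pre-event
# window on the second viewing, what fraction of fixation time falls in the
# area of interest (AOI) where the event is about to occur? Illustrative only.
from typing import List, Tuple

Fixation = Tuple[float, float, float]  # (x, y, duration in ms)

def anticipatory_looking(fixations: List[Fixation],
                         aoi: Tuple[float, float, float, float]) -> float:
    """Fraction of total fixation time spent inside the AOI (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = aoi
    in_aoi = sum(d for x, y, d in fixations if x0 <= x <= x1 and y0 <= y <= y1)
    total = sum(d for _, _, d in fixations)
    return in_aoi / total if total else 0.0

# Hypothetical fixations in the seconds before the character appears,
# scored against the AOI covering the "correct" door.
fixations = [(310, 200, 400), (620, 180, 900), (640, 210, 700)]
print(anticipatory_looking(fixations, aoi=(560, 120, 720, 280)))   # 0.8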
Blog Categories:
animal behavior,
future,
memory/learning
Thursday, October 15, 2015
Rhodopsin curing blindness?
In a previous life (1962-1998) my laboratory studied how the rhodopsin visual pigment in our eyes changes light into a nerve signal, so it excites me to see major advances in understanding our vision and curing visual diseases. I want to pass on a nice graphic offered by Van Gelder and Kaur illustrating recent work of Cehajic-Kapetanovic et al. (open access), who show that introducing the visual pigment rhodopsin by viral gene therapy into the inner-retina nerve cells of retinas whose rods and cones have degenerated can restore light sensitivity and vision-like physiology and behavior to mice blind from outer retinal degeneration:
Figure - Gene therapy rescue of vision in retinal degeneration. (A) In the healthy retina, light penetrates from inner to outer retina to reach the cones and rods, which transduce signals through horizontal, bipolar, amacrine, and ultimately retinal ganglion cells to the brain. (B) In outer retinal degenerative diseases, loss of photoreceptors renders the retina insensitive to light. (C) Gene therapy with AAV2/2 virus expressing human rhodopsin (hRod) under the control of the CAG promoter results in expression of the photopigment in many surviving cells of the inner retina, and results in restoration of light responses recognized by the brain. (D) More selective expression of rhodopsin in a subset of bipolar cells is achieved by use of a virus in which expression is driven by the grm6 promoter. This version appeared to restore the most natural visual function to blind mice.
Wednesday, October 14, 2015
Can epigenetics explain homosexuality?
Michael Balter notes work presented by Vilain's UCLA laboratory at this year's American Society of Human Genetics meeting. His summary, followed by some clips from his text:
(added note: an alert reader, see comment below, just added this critique of the following work from The Atlantic)
A new study suggests that epigenetic effects—chemical modifications of the human genome that alter gene activity without changing the DNA sequence—may sometimes influence sexual orientation. Researchers studied methylation, the attachment of a methyl group to specific regions of DNA, in 37 pairs of male identical twins who were discordant—meaning that one was gay and the other straight—and 10 pairs who were both gay. Their search yielded five genome regions where the methylation pattern appears very closely linked to sexual orientation. A model that predicted sexual orientation based on these patterns was almost 70% accurate within this group—although that predictive ability does not necessarily apply to the general population.
Researchers thought they were hot on the trail of “gay genes” in 1993, when a team led by geneticist Dean Hamer of the National Cancer Institute reported that one or more genes for homosexuality had to reside on Xq28, a large region on the X chromosome...but some teams were unable to replicate the findings and the actual genes have not been found...Twin studies suggested, moreover, that gene sequences can't be the full explanation. For example, the identical twin of a gay man, despite having the same genome, only has a 20% to 50% chance of being gay himself.
That's why some have suggested that epigenetics—instead of or in addition to traditional genetics—might be involved. During development, chromosomes are subject to chemical changes that don't affect the nucleotide sequence but can turn genes on or off; the best known example is methylation, in which a methyl group is attached to specific DNA regions. Such “epi-marks” can remain in place for a lifetime, but most are erased when eggs and sperm are produced, so that a fetus starts with a blank slate. Recent studies, however, have shown that some marks are passed on to the next generation.
In a 2012 paper, Rice and his colleagues suggested that such unerased epi-marks might cause homosexuality when they are passed on from father to daughter or from mother to son...Such ideas inspired Tuck Ngun, a postdoc in Vilain's lab, to study the methylation patterns at 140,000 regions in the DNA of 37 pairs of male identical twins who were discordant—meaning that one was gay and the other straight—and 10 pairs who were both gay...the team identified five regions in the genome where the methylation pattern appears very closely linked to sexual orientation...Just why identical twins sometimes end up with different methylation patterns isn't clear. If Rice's hypothesis is right, their mothers' epi-marks might have been erased in one son, but not the other; or perhaps neither inherited any marks but one of them picked them up in the womb...In an earlier review, Ngun and Vilain cited evidence that methylation may be determined by subtle differences in the environment each fetus experiences during gestation, such as their exact locations within the womb and how much of the maternal blood supply each receives.
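The caveat above - that nearly-70% accuracy within this group need not generalize - is worth unpacking. With on the order of 140,000 candidate regions and only a few dozen twin pairs, a predictive model can look impressive on the very data used to build it. Here is a minimal sketch of the pitfall, using pure-noise synthetic data (no real methylation measurements, and subjects treated as independent for simplicity): selecting the "best" sites before cross-validation leaks information and inflates accuracy, whereas selecting them inside each training fold does not.

# Illustrative sketch (synthetic data, no real methylation measurements):
# why within-sample accuracy on many features and few subjects can mislead.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
n_subjects, n_sites = 74, 140_000           # roughly the scale described above
X = rng.normal(size=(n_subjects, n_sites))  # pure noise: no true signal
y = rng.integers(0, 2, n_subjects)

# Leaky protocol: choose the 5 most predictive sites using ALL subjects,
# then cross-validate a classifier on just those sites.
top = SelectKBest(f_classif, k=5).fit(X, y).get_support()
leaky_acc = cross_val_score(LogisticRegression(max_iter=1000), X[:, top], y, cv=5).mean()

# Proper protocol: do the site selection inside each training fold.
pipe = make_pipeline(SelectKBest(f_classif, k=5), LogisticRegression(max_iter=1000))
honest_acc = cross_val_score(pipe, X, y, cv=5).mean()

print(f"leaky accuracy:  {leaky_acc:.2f}  (looks impressive despite pure noise)")
print(f"honest accuracy: {honest_acc:.2f}  (hovers near chance, as it should)")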
Tuesday, October 13, 2015
Musical expertise changes the brain's functional connectivity during audiovisual integration
Music notation reading encapsulates auditory, visual, and motor information in a highly organized manner and therefore provides a useful model for studying multisensory phenomena. Paraskevopoulos et al. show that the large-scale functional brain networks underpinning audiovisual integration are organized differently in musicians and nonmusicians. They examine brain responses to congruent stimuli (the sound played corresponds to the musical notation) and incongruent stimuli (the sound played differs from the notation).
Multisensory integration engages distributed cortical areas and is thought to emerge from their dynamic interplay. Nevertheless, large-scale cortical networks underpinning audiovisual perception have remained undiscovered. The present study uses magnetoencephalography and a methodological approach to perform whole-brain connectivity analysis and reveals, for the first time to our knowledge, the cortical network related to multisensory perception. The long-term training-related reorganization of this network was investigated by comparing musicians to nonmusicians. Results indicate that nonmusicians rely on processing visual clues for the integration of audiovisual information, whereas musicians use a denser cortical network that relies mostly on the corresponding auditory information. These data provide strong evidence that cortical connectivity is reorganized due to expertise in a relevant cognitive domain, indicating training-related neuroplasticity.
Figure - Paradigm of an audiovisual congruent and incongruent trial. (A) A congruent trial. (B) An incongruent trial. The line “time” represents the duration of the presentation of the auditory and visual part of the stimulus. The last picture of each trial represents the intertrial stimulus in which subjects had to answer if the trial was congruent or incongruent.
Figure - Cortical network underpinning audiovisual integration. (Upper) Statistical parametric maps of the significant networks for the congruent > incongruent comparison. Networks presented are significant at P less than 0.001, FDR corrected. The color scale indicates t values. (Lower) Node strength of the significant networks for each comparison. Strength is represented by node size.
Blog Categories:
attention/perception,
brain plasticity,
music
Monday, October 12, 2015
Runner's high? Thank your internal marijuana...
From Fuss et al.:
Exercise is rewarding, and long-distance runners have described a runner’s high as a sudden pleasant feeling of euphoria, anxiolysis, sedation, and analgesia. A popular belief has been that endogenous endorphins mediate these beneficial effects. However, running exercise increases blood levels of both β-endorphin (an opioid) and anandamide (an endocannabinoid). Using a combination of pharmacologic, molecular genetic, and behavioral studies in mice, we demonstrate that cannabinoid receptors mediate acute anxiolysis and analgesia after running. We show that anxiolysis depends on intact cannabinoid receptor 1 (CB1) receptors on forebrain GABAergic neurons and pain reduction on activation of peripheral CB1 and CB2 receptors. We thus demonstrate that the endocannabinoid system is crucial for two main aspects of a runner's high. Sedation, in contrast, was not influenced by cannabinoid or opioid receptor blockage, and euphoria cannot be studied in mouse models.
Friday, October 09, 2015
A Gee Whiz! moment. Activating neurons with ultrasound.
Optogenetics, making nerve cells sensitive to light by a genetic manipulation, has the limitation that light doesn't penetrate living tissue very well, and so must be delivered invasively through thin fiber-optic implants. Frank and Gorman offer a video clip describing work of Ibsen et al., who show that a nerve cell can be genetically altered to become sensitive to activation by non-invasive ultrasound, an approach they describe as "sonogenetics." The video (I could do without the rock music soundtrack) shows a worm's movement changing direction as a nerve cell is stimulated by ultrasound.
Thursday, October 08, 2015
1/f brain noise increases with aging.
From Gazzaley and collaborators, a description of what is going on in our aging brains:
Aging is associated with performance decrements across multiple cognitive domains. The neural noise hypothesis, a dominant view of the basis of this decline, posits that aging is accompanied by an increase in spontaneous, noisy baseline neural activity. Here we analyze data from two different groups of human subjects: intracranial electrocorticography from 15 participants over a 38 year age range (15–53 years) and scalp EEG data from healthy younger (20–30 years) and older (60–70 years) adults to test the neural noise hypothesis from a 1/f noise perspective. Many natural phenomena, including electrophysiology, are characterized by 1/f noise. The defining characteristic of 1/f is that the power of the signal frequency content decreases rapidly as a function of the frequency (f) itself. The slope of this decay, the noise exponent (χ), is often <−1 for electrophysiological data and has been shown to approach white noise (defined as χ = 0) with increasing task difficulty. We observed, in both electrophysiological datasets, that aging is associated with a flatter (more noisy) 1/f power spectral density, even at rest, and that visual cortical 1/f noise statistically mediates age-related impairments in visual working memory. These results provide electrophysiological support for the neural noise hypothesis of aging.
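The "1/f noise" measure at the center of this study is simply the slope of the power spectrum plotted on log-log axes. Here is a minimal sketch of how such an exponent can be estimated - illustrative only, not the authors' analysis pipeline:

# Estimate the 1/f exponent (chi) of a signal: compute the power spectral
# density and fit a line to log10(power) vs. log10(frequency). A flatter
# slope (chi closer to 0, i.e., whiter noise) is what the study above
# reports in older adults. Illustrative sketch, not the authors' pipeline.
import numpy as np

def one_over_f_exponent(signal, fs, fmin=2.0, fmax=50.0):
    """Return chi, the slope of log10(PSD) vs log10(f) over [fmin, fmax] Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    band = (freqs >= fmin) & (freqs <= fmax)
    slope, _ = np.polyfit(np.log10(freqs[band]), np.log10(psd[band]), deg=1)
    return slope   # typically < -1 for electrophysiological data

# Synthetic demo: white noise should give chi near 0; cumulatively summed
# (Brownian) noise should give a steeper, more negative exponent.
rng = np.random.default_rng(3)
fs, n = 1000, 60_000
white = rng.normal(size=n)
brown = np.cumsum(white)
print("white noise chi:", round(one_over_f_exponent(white, fs), 2))   # ~ 0
print("brown noise chi:", round(one_over_f_exponent(brown, fs), 2))   # ~ -2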
Wednesday, October 07, 2015
Methionine, an amino acid, enhances recovery from cocaine addiction.
Wright et al. use a rat model to show that the common amino acid methionine - which can serve as a methyl group donor for the DNA methylation that regulates neural functions associated with learning, memory, and synaptic plasticity - can reduce addiction-like behaviors such as drug seeking, and can block a cocaine-induced marker of neuronal activation after reinstatement in the nucleus accumbens and the medial prefrontal cortex, two brain regions responsible for drug seeking and relapse. Here is the technical abstract:
Epigenetic mechanisms, such as histone modifications, regulate responsiveness to drugs of abuse, such as cocaine, but relatively little is known about the regulation of addictive-like behaviors by DNA methylation. To investigate the influence of DNA methylation on the locomotor-activating effects of cocaine and on drug-seeking behavior, rats receiving methyl supplementation via chronic L-methionine (MET) underwent either a sensitization regimen of intermittent cocaine injections or intravenous self-administration of cocaine, followed by cue-induced and drug-primed reinstatement. MET blocked sensitization to the locomotor-activating effects of cocaine and attenuated drug-primed reinstatement, with no effect on cue-induced reinstatement or sucrose self-administration and reinstatement. Furthermore, upregulation of DNA methyltransferase 3a and 3b and global DNA hypomethylation were observed in the nucleus accumbens core (NAc), but not in the medial prefrontal cortex (mPFC), of cocaine-pretreated rats. Glutamatergic projections from the mPFC to the NAc are critically involved in the regulation of cocaine-primed reinstatement, and activation of both brain regions is seen in human addicts when reexposed to the drug. When compared with vehicle-pretreated rats, the immediate early gene c-Fos (a marker of neuronal activation) was upregulated in the NAc and mPFC of cocaine-pretreated rats after cocaine-primed reinstatement, and chronic MET treatment blocked its induction in both regions. Cocaine-induced c-Fos expression in the NAc was associated with reduced methylation at CpG dinucleotides in the c-Fos gene promoter, effects reversed by MET treatment. Overall, these data suggest that drug-seeking behaviors are, in part, attributable to a DNA methylation-dependent process, likely occurring at specific gene loci (e.g., c-Fos) in the reward pathway.
Blog Categories:
acting/choosing,
attention/perception,
motivation/reward
Tuesday, October 06, 2015
Memory aging and brain maintenance
An open access article by Nyberg et al. notes
The association of intact memory functioning in old age with maintenance and preservation of a functionally young and healthy brain may seem obvious. However, up to the present the focus has largely been on possible forms of compensatory brain responses. This is so, even though it remains unclear whether memory performance in old age can benefit from altered patterns of brain activation, with almost as many studies showing positive as negative relationships.

Their abstract suggests the relevance of "brain maintenance":
Episodic memory and working memory decline with advancing age. Nevertheless, large-scale population-based studies document well-preserved memory functioning in some older individuals. The influential ‘reserve’ notion holds that individual differences in brain characteristics or in the manner people process tasks allow some individuals to cope better than others with brain pathology and hence show preserved memory performance. Here, we discuss a complementary concept, that of brain maintenance (or relative lack of brain pathology), and argue that it constitutes the primary determinant of successful memory aging. We discuss evidence for brain maintenance at different levels: cellular, neurochemical, gray- and white-matter integrity, and systems-level activation patterns. Various genetic and lifestyle factors support brain maintenance in aging and interventions may be designed to promote maintenance of brain structure and function in late life.

The figures are worth a look, for they illustrate how a fraction of older individuals have brains that, at different levels of brain organization, are similar to younger brains in their relative lack of brain pathology. They say very little about the "lifestyle factors" or "interventions" that might promote brain maintenance.
Monday, October 05, 2015
The wealthy are different from you and me...
The abstract from an article titled "The distributional preferences of an elite" by Fisman et al.:
We studied the distributional preferences of an elite cadre of Yale Law School students, a group that will assume positions of power in U.S. society. Our experimental design allows us to test whether redistributive decisions are consistent with utility maximization and to decompose underlying preferences into two qualitatively different tradeoffs: fair-mindedness versus self-interest, and equality versus efficiency. Yale Law School subjects are more consistent than subjects drawn from the American Life Panel, a diverse sample of Americans. Relative to the American Life Panel, Yale Law School subjects are also less fair-minded and substantially more efficiency-focused. We further show that our measure of equality-efficiency tradeoffs predicts Yale Law School students’ career choices: Equality-minded subjects are more likely to be employed at nonprofit organizations.
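A common way experimental economists parameterize these two tradeoffs is with a CES utility over one's own and the other person's payoff, where one parameter captures self-interest versus fair-mindedness and another captures the equality-versus-efficiency tradeoff. The sketch below uses that generic specification with made-up parameter values; it is not necessarily Fisman et al.'s exact model or experimental design:

# Generic CES parameterization of distributional preferences:
#   U(pi_self, pi_other) = (alpha * pi_self**rho + (1 - alpha) * pi_other**rho) ** (1 / rho)
# alpha captures self-interest vs. fair-mindedness; rho captures the
# equality-vs-efficiency tradeoff (rho far below 0: equality-minded/maximin;
# rho near 1: efficiency-minded, i.e., cares mostly about the total).
# A common specification in this literature, not necessarily the authors' own.
def ces_utility(pi_self: float, pi_other: float, alpha: float, rho: float) -> float:
    return (alpha * pi_self ** rho + (1 - alpha) * pi_other ** rho) ** (1 / rho)

# Two allocations: equal-but-smaller vs. unequal-but-larger total payoff.
equal, efficient = (50, 50), (90, 30)

for label, (alpha, rho) in {"equality-minded": (0.5, -2.0),
                            "efficiency-minded": (0.5, 0.9)}.items():
    pick = max([equal, efficient], key=lambda a: ces_utility(*a, alpha, rho))
    print(f"{label}: chooses {pick}")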