This blog reports new ideas and work on mind, brain, behavior, psychology, and politics - as well as random curious stuff.
Monday, June 15, 2015
Sign language as a window on universals in linguistic representation of events.
From Strickland et al., I've learned a new word - telicity:
According to a theoretical tradition dating back to Aristotle, verbs can be classified into two broad categories. Telic verbs (e.g., “decide,” “sell,” “die”) encode a logical endpoint, whereas atelic verbs (e.g., “think,” “negotiate,” “run”) do not, and the denoted event could therefore logically continue indefinitely. Here we show that sign languages encode telicity in a seemingly universal way and moreover that even nonsigners lacking any prior experience with sign language understand these encodings. In experiments 1–5, nonsigning English speakers accurately distinguished between telic (e.g., “decide”) and atelic (e.g., “think”) signs from (the historically unrelated) Italian Sign Language, Sign Language of the Netherlands, and Turkish Sign Language. These results were not due to participants' inferring that the sign merely imitated the action in question. In experiment 6, we used pseudosigns to show that the presence of a salient visual boundary at the end of a gesture was sufficient to elicit telic interpretations, whereas repeated movement without salient boundaries elicited atelic interpretations. Experiments 7–10 confirmed that these visual cues were used by all of the sign languages studied here. Together, these results suggest that signers and nonsigners share universally accessible notions of telicity as well as universally accessible “mapping biases” between telicity and visual form.
Blog Categories:
evolutionary psychology,
human evolution,
language
Friday, June 12, 2015
Oxytocin makes formidable men more likeable.
An interesting tidbit from Chen et al.:
Physical size and strength are associated with dominance and threat. The current study tested (i) whether men’s evaluations of male strangers would be negatively influenced by cues indicating physical formidability, and (ii) whether these evaluations would be influenced by oxytocin, a neuropeptide that mediates social behavior and reduces social anxiety. In a placebo-controlled double-blind design, we administered either oxytocin (24 I.U.) or placebo intranasally to 100 healthy males and assessed their responses to an image of either a physically formidable (strong) or physically non-formidable (weak) male peer. Whereas participants receiving placebo expressed dislike and avoidance of the strong male relative to the weak male, oxytocin selectively improved social evaluation of the strong male. These results provide first evidence that oxytocin regulates social evaluation of peers based on body features indicating strength and formidability. We discuss the possibility that oxytocin may promote the expansion of social networks by increasing openness toward potentially threatening individuals.
Blog Categories:
attention/perception,
social cognition
Thursday, June 11, 2015
Serotonin deficiency correlates with stress vulnerability.
From Sachs et al., a statement of significance, followed by a more detailed abstract:
Significance
The biological factors that determine whether an individual develops mental illness, such as depression or posttraumatic stress disorder, or responds adequately to pharmacotherapy remain almost completely unknown. Using genetically modified mice, we demonstrate that low levels of brain serotonin lead to increased vulnerability to psychosocial stress and prevent the antidepressant-like effects of fluoxetine following stress exposure. Our data also show that inhibiting the lateral habenula can reverse stress-induced behavioral avoidance in serotonin-deficient animals, which fail to respond to fluoxetine. Our results provide additional insight into the serotonin deficiency hypothesis of depression and highlight the potential of targeting the lateral habenula to treat depression and anxiety disorders in patients who fail to respond to selective serotonin reuptake inhibitors.
Abstract
Brain serotonin (5-HT) deficiency and exposure to psychosocial stress have both been implicated in the etiology of depression and anxiety disorders, but whether 5-HT deficiency influences susceptibility to depression- and anxiety-like phenotypes induced by psychosocial stress has not been formally established. Most clinically effective antidepressants increase the extracellular levels of 5-HT, and thus it has been hypothesized that antidepressant responses result from the reversal of endogenous 5-HT deficiency, but this hypothesis remains highly controversial. Here we evaluated the impact of brain 5-HT deficiency on stress susceptibility and antidepressant-like responses using tryptophan hydroxylase 2 knockin (Tph2KI) mice, which display 60–80% reductions in brain 5-HT. Our results demonstrate that 5-HT deficiency leads to increased susceptibility to social defeat stress (SDS), a model of psychosocial stress, and prevents the fluoxetine (FLX)-induced reversal of SDS-induced social avoidance, suggesting that 5-HT deficiency may impair antidepressant responses. In light of recent clinical and preclinical studies highlighting the potential of inhibiting the lateral habenula (LHb) to achieve antidepressant and antidepressant-like responses, we also examined whether LHb inhibition could achieve antidepressant-like responses in FLX-insensitive Tph2KI mice subjected to SDS. Our data reveal that using designer receptors exclusively activated by designer drugs (DREADDs) to inhibit LHb activity leads to reduced SDS-induced social avoidance behavior in both WT and Tph2KI mice. This observation provides additional preclinical evidence that inhibiting the LHb might represent a promising alternative therapeutic approach under conditions in which selective 5-HT reuptake inhibitors are ineffective.
Wednesday, June 10, 2015
The urge to instruct.
I thought I would pass on this humorous piece by Peter Funt on his review of YouTube videos on how to break in a new baseball glove. After reviewing videos on using microwave ovens, jacuzzis, mallets, oil, heat, hot water:
After nearly two hours of viewing, I’ve learned that: (a) it appears no certification is necessary to teach on YouTube; (b) although baseball is the national pastime, no one knows how to break in a glove; and (c) if you accidentally lock a new mitt in an old car, you might be able to use a tennis ball to break in.
This sets me to mulling on what motivates the making of such videos, and indeed, what motivates me to pass on or describe articles about mind, brain, and behavior that I have very little real critical insight into. Just as with a kid showing off a new toy, the 'gee whiz' or 'this is neat' moment that comes from encountering a new idea or bit of work is enhanced by sharing it with others.
Tuesday, June 09, 2015
The "Good Life"
The previous post noting recent work by Dacher Keltner and collaborators prompted me to have another look at the website of "The Greater Good Science Center" at the University of California, Berkeley that Keltner and others have established. The center has recently developed another website, Greater Good in Action, that offers engaging brief exercises in practices shown by research to enhance and build all sorts of good stuff: connection, empathy, kindness, compassion, forgiveness, gratitude, happiness, optimism, resilience to stress, awe, etc. Just clicking through and spending maybe 5-10 minutes on a few of the exercises leaves me with a mushy warm glow of contentment, which persists for varying periods of time until my usual default curmudgeonly self reappears. You might enjoy trying some.
Blog Categories:
culture/politics,
happiness,
self help
Monday, June 08, 2015
Why do we experience awe?
Awe, like reverence, is a pro-social emotion that subordinates individual self interest to a larger whole. Dacher Keltner and Paul Piff, in their NYTimes piece publicizing their more academic publication (marketing is a necessity these days), describe five different studies, each providing experimental evidence that awe helps bind us to others, motivating us to be more generous and helpful to strangers and to act in collaborative ways that enable strong groups and cohesive communities. The positive effects of awe on prosociality are partly explained by feelings of a smaller self.
...even brief experiences of awe, such as being amid beautiful tall trees, lead people to feel less narcissistic and entitled and more attuned to the common humanity people share with one another. In the great balancing act of our social lives, between the gratification of self-interest and a concern for others, fleeting experiences of awe redefine the self in terms of the collective, and orient our actions toward the needs of those around us.
You could make the case that our culture today is awe-deprived. Adults spend more and more time working and commuting and less time outdoors and with other people...Attendance at arts events — live music, theater, museums and galleries — has dropped over the years...Arts and music programs in schools are being dismantled in lieu of programs better suited to standardized testing; time outdoors and for novel, unbounded exploration are sacrificed for résumé-building activities.
We believe that awe deprivation has had a hand in a broad societal shift that has been widely observed over the past 50 years: People have become more individualistic, more self-focused, more materialistic and less connected to others. To reverse this trend, we suggest that people insist on experiencing more everyday awe, to actively seek out what gives them goose bumps, be it in looking at trees, night skies, patterns of wind on water or the quotidian nobility of others — the teenage punk who gives up his seat on public transportation, the young child who explores the world in a state of wonder, the person who presses on against all odds.
Blog Categories:
culture/politics,
happiness,
social cognition
Friday, June 05, 2015
Lack of exercise disrupts body’s rhythms.
Natural daily rhythms in spontaneous movement patterns in both humans and mice show scale invariance, i.e., movement patterns repeat over time scales of minutes to hours. These scale invariant patterns decay with aging in both humans and mice, apparently correlating with progressive dysfunction of circadian pacemaker circuits in the brain's suprachiasmatic nucleus. Scheer and collaborators have now shown that in both aged and young mice exercise is a crucial variable. Loss of scale invariance associated with both inactivity and aging can be restored by exercise, even in old animals.
In healthy humans and other animals, behavioral activity exhibits scale invariance over multiple timescales from minutes to 24 h, whereas in aging or diseased conditions, scale invariance is usually reduced significantly. Accordingly, scale invariance can be a potential marker for health. Given compelling indications that exercise is beneficial for mental and physical health, we tested to what extent a lack of exercise affects scale invariance in young and aged animals. We studied six or more mice in each of four age groups (0.5, 1, 1.5, and 2 y) and observed an age-related deterioration of scale invariance in activity fluctuations. We found that limiting the amount of exercise, by removing the running wheels, leads to loss of scale-invariant properties in all age groups. Remarkably, in both young and old animals a lack of exercise reduced the scale invariance in activity fluctuations to the same level. We next showed that scale invariance can be restored by returning the running wheels. Exercise during the active period also improved scale invariance during the resting period, suggesting that activity during the active phase may also be beneficial for the resting phase. Finally, our data showed that exercise had a stronger influence on scale invariance than the effect of age. The data suggest that exercise is beneficial as revealed by scale-invariant parameters and that, even in young animals, a lack of exercise leads to strong deterioration in these parameters.
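To make the idea of scale invariance a bit more concrete, here is a toy Python sketch of detrended fluctuation analysis, one common way such scaling is quantified. This is my own illustration with simulated signals and made-up parameters, not the authors' analysis code; loosely, loss of scale invariance corresponds to the scaling exponent drifting toward the value expected for an uncorrelated signal.

```python
import numpy as np

def dfa_exponent(signal, scales):
    """Detrended fluctuation analysis: slope of log F(n) versus log n.
    A bare-bones sketch; real analyses take more care with windowing and detrending."""
    profile = np.cumsum(signal - np.mean(signal))         # integrated, mean-centered series
    flucts = []
    for n in scales:
        n_windows = len(profile) // n
        rms = []
        for w in range(n_windows):
            seg = profile[w * n:(w + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear trend in each window
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        flucts.append(np.mean(rms))
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]   # F(n) ~ n**alpha

rng = np.random.default_rng(0)
white_noise = rng.normal(size=4096)               # no temporal structure: alpha near 0.5
random_walk = np.cumsum(rng.normal(size=4096))    # fluctuations at all scales: alpha near 1.5
scales = np.array([16, 32, 64, 128, 256, 512])
print("white noise alpha:", round(dfa_exponent(white_noise, scales), 2))
print("random walk alpha:", round(dfa_exponent(random_walk, scales), 2))
```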
Blog Categories:
acting/choosing,
aging,
brain plasticity,
exercise
Thursday, June 04, 2015
It’s not a stream of consciousness, it’s a rhythm.
In the NYTimes Gray Matter series Gregory Hickok gives an exegesis on the implications of his Psychological Science paper "The Rhythm of Perception." Some edited clips:
In 1890, the American psychologist William James famously likened our conscious experience to the flow of a stream...recent research has shown that the “stream” of consciousness is, in fact, an illusion. We actually perceive the world in rhythmic pulses (brain waves that correlate with states like calm alertness and deep sleep) rather than as a continuous flow....We are exploring the possibility that brain rhythms are not merely a reflection of mental activity but a cause of it, helping shape perception, movement, memory and even consciousness itself...What this means is that the brain samples the world in rhythmic pulses, perhaps even discrete time chunks, much like the individual frames of a movie. From the brain’s perspective, experience is not continuous but quantized.
It turns out, for example, that our ability to detect a subtle event, like a slight change in a visual scene, oscillates over time, cycling between better and worse perceptual sensitivity several times a second. Research shows that these rhythms correlate with electrical rhythms of the brain.
If that’s hard to picture, here’s an analogy: Imagine trying to see an animal through a thick, swirling fog that varies in density as it drifts. The distinctness of the animal’s form will oscillate with the density of the fog, alternating between periods of relative clarity and opaqueness. According to recent experiments, this is how our perceptual systems sample the world — but rather than fog, it’s brain waves that drive the oscillations...Rhythms in the environment, such as those in music or speech, can draw neural oscillations into their tempo, effectively synchronizing the brain’s rhythms with those of the world around us.
In the study reported in Psychological Science, Hickok and colleagues:
...presented listeners with a three-beat-per-second rhythm (a pulsing “whoosh” sound) for only a few seconds and then asked the listeners to try to detect a faint tone immediately afterward. The tone was presented at a range of delays between zero and 1.4 seconds after the rhythm ended. Not only did we find that the ability to detect the tone varied over time by up to 25 percent — that’s a lot — but it did so precisely in sync with the previously heard three-beat-per-second rhythm.
Why would the brain do this? One theory is that it’s the brain’s way of focusing attention. Picture a noisy cafe filled with voices, clanging dishes and background music. As you attend to one particular acoustic stream — say, your lunch mate’s voice — your brain synchronizes its rhythm to the rhythm of the voice and enhances the perceptibility of that stream, while suppressing other streams, which have their own, different rhythms. (More broadly, this kind of synchronization has been proposed as a mechanism for communication between neural networks within the brain.)
All of this points to the need for a new metaphor. We should talk of the “rhythm” of thought, of perception, of consciousness. Conceptualizing our mental experience this way is not only more accurate, but it also situates our mind within the broader context of the daily, monthly and yearly rhythms that dominate our lives.
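Returning to the Psychological Science study described above: as a back-of-the-envelope illustration of that three-beat-per-second result, here is a toy Python sketch (my own, with invented detection rates and parameters, not the authors' analysis) that simulates tone-detection performance oscillating at 3 Hz across the post-rhythm delays and then asks which modulation frequency fits the data best.

```python
import numpy as np

rng = np.random.default_rng(1)
delays = np.linspace(0.0, 1.4, 36)                            # seconds after the rhythm ends
true_rate = 0.60 + 0.12 * np.sin(2 * np.pi * 3.0 * delays)    # 3 Hz modulation of detection rate
observed = true_rate + rng.normal(0, 0.03, size=delays.size)  # measurement noise

def fit_error(freq):
    # least-squares fit of a sinusoid at this frequency (sine + cosine + constant regressors)
    X = np.column_stack([np.sin(2 * np.pi * freq * delays),
                         np.cos(2 * np.pi * freq * delays),
                         np.ones_like(delays)])
    beta = np.linalg.lstsq(X, observed, rcond=None)[0]
    return np.sum((observed - X @ beta) ** 2)

freqs = np.arange(1.0, 6.01, 0.1)
best = freqs[np.argmin([fit_error(f) for f in freqs])]
print(f"best-fitting modulation frequency: {best:.1f} Hz (entraining rhythm was 3 Hz)")
```

The recovered frequency lands on the entraining rhythm, which is the signature the study reports.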
Wednesday, June 03, 2015
Humans need not apply.
Check out this chilling video on the increasing obsolescence of humans, which was referenced in a recent meeting of the Chaos and Complexity Seminar group I attend at the University of Wisconsin (when I am in Madison during the warm months). Then note the partial solace offered by Carr's essay "Why Robots Will Always Need Us."
Blog Categories:
culture/politics,
human evolution,
technology
Tuesday, June 02, 2015
How we fashion meaning and purpose.
This is a brief post about some material I thought I might get to cohere, maybe something along the lines of purpose as an evolved means of generating order, part of the big story of order evolving from chaos in the universe. I was wrong about the coherence, but I think the links are worth mentioning. The “Big History Project” is an effort to generate a modern origin story that transcends previous stories because it is global. From David Christian:
...in modern science we've gotten used to the idea that science doesn't offer meaning in the way that institutional religions did in the past. I'm increasingly thinking that this idea that modernity puts us in a world without meaning….may be completely wrong. We may be living in an intellectual building site, where a new story is being constructed. It's vastly more powerful than the previous stories because it's the first one that is global. It's not anchored in a particular culture or a particular society. This is an origin story that works for humans in Beijing as well as in Buenos Aires...it sums over vastly more information than any early origin story….across so many domains, the amount of information, of good, rigorous ideas, is so rich that we can tease out that story.
Christian's Big History project reminds me of the Natural Sciences 5 course originated by my mentor George Wald, which I taught in when I was a Harvard senior and then graduate student in 1963-67. Both efforts begin with cosmology and the origin of the universe, move on to the solar system and Earth and the appearance and evolution of life, and end with the human story. I had a look at Chapter 5 of the online Big History Project (aimed at middle and high school level, 13-17 year olds) and found it reasonably engaging.
The second source I want to mention is a piece by Worthen titled "Wanted: A Theology of Atheism." The title is an oxymoron [Greek Theos (god) + logia (subject of study)], presumably intentional. It discusses efforts, of the sort described in some previous MindBlog posts, to form secular (godless) forums, churches, or assemblies that meet our human need for communal settings that reinforce kindness and moral behavior, that balance the needs of the community against self interest. Worthen quotes Sam Harris's:
...promoting science as a universal moral guide. This proposal is an old one. The 19th-century French philosopher Auguste Comte and the American intellectuals Walter Lippmann and John Dewey all wrote that moral progress depended on the scientific method.
Morality depends on “the totality of facts that relate to human well-being, and our knowledge of it grows the more we learn about ourselves, in fields ranging from molecular biology to economics.” Harris has stressed the special role of his own field, cognitive science. Every discovery about the brain’s experience of pleasure and suffering has implications for how we should treat other humans. Moral philosophy is really an “undeveloped branch of science” whose laws apply in Peoria just as they do in the Punjab.
Pragmatist philosophers like Philip Kitcher offer a different approach to the question of atheist morality, one based on “the sense that ethical life grows out of our origins, the circumstances under which our ancestors lived, and it’s a work in progress,” he said. In the pragmatist tradition, science is useful, but ethical claims are not objective scientific facts. They are only “true” if they seem to “work” in real life.
Blog Categories:
culture/politics,
human evolution,
religion
Monday, June 01, 2015
Sleep stabilizes, but does not enhance, motor performance training.
From Nettersheim et al., a result challenging the prominent model that sleep enhances the performance of a newly learned skill (which is my experience in learning difficult new piano passages):
Sleep supports the consolidation of motor sequence memories, yet it remains unclear whether sleep stabilizes or actually enhances motor sequence performance. Here we assessed the time course of motor memory consolidation in humans, taking early boosts in performance into account and varying the time between training and sleep. Two groups of subjects, each participating in a short wake condition and a longer sleep condition, were trained on the sequential finger-tapping task in the evening and were tested (1) after wake intervals of either 30 min or 4 h and (2) after a night of sleep that ensued either 30 min or 4 h after training. The results show an early boost in performance 30 min after training and a subsequent decay across the 4 h wake interval. When sleep followed 30 min after training, post-sleep performance was stabilized at the early boost level. Sleep at 4 h after training restored performance to the early boost level, such that, 12 h after training, performance was comparable regardless of whether sleep occurred 30 min or 4 h after training. These findings indicate that sleep does not enhance but rather stabilizes motor sequence performance without producing additional gains.
Blog Categories:
acting/choosing,
memory/learning,
sleep
Friday, May 29, 2015
Cultural differences, emotional expressivity, and smiles
Rychlowska et al. analyze cultural display rules from 32 countries to reveal that the extent to which a country’s present-day population descends from numerous versus few source countries is associated with norms favoring greater emotional expressivity.
A small number of facial expressions may be universal in that they are produced by the same basic affective states and recognized as such throughout the world. However, other aspects of emotionally expressive behavior vary widely across culture. Just why do they vary? We propose that some cultural differences in expressive behavior are determined by historical heterogeneity, or the extent to which a country’s present-day population descended from migration from numerous vs. few source countries over a period of 500 y. Our reanalysis of data on cultural rules for displaying emotion from 32 countries reveals that historical heterogeneity explains substantial, unique variance in the degree to which individuals believe that emotions should be openly expressed. We also report an original study of the underlying states that people believe are signified by a smile. Cluster analysis applied to data from nine countries, including Canada, France, Germany, India, Indonesia, Israel, Japan, New Zealand, and the United States, reveals that countries group into “cultures of smiling” determined by historical heterogeneity. Factor analysis shows that smiles sort into three social-functional subtypes: pleasure, affiliative, and dominance. The relative importance of these smile subtypes varies as a function of historical heterogeneity. These findings thus highlight the power of social-historical factors to explain cross-cultural variation in emotional expression and smile behavior.
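To show what a cluster analysis of "cultures of smiling" might look like mechanically, here is a toy Python sketch. The country list comes from the abstract, but every rating below is invented, and the clustering method is a generic hierarchical one, not necessarily the authors' procedure.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical illustration only: rows are countries, columns are made-up mean ratings
# of what a smile signifies (pleasure, affiliative, dominance), on a 1-7 scale.
countries = ["Canada", "France", "Germany", "India", "Indonesia",
             "Israel", "Japan", "New Zealand", "United States"]
ratings = np.array([
    [6.1, 6.3, 2.0],
    [5.2, 4.9, 3.1],
    [5.5, 5.0, 2.8],
    [4.8, 4.2, 3.9],
    [4.6, 4.4, 4.1],
    [5.0, 4.7, 3.3],
    [4.5, 4.0, 3.8],
    [6.0, 6.2, 2.2],
    [6.2, 6.4, 2.1],
])
Z = linkage(ratings, method="average")            # hierarchical clustering of country profiles
labels = fcluster(Z, t=2, criterion="maxclust")   # force a two-cluster solution
for cluster in sorted(set(labels)):
    members = [c for c, l in zip(countries, labels) if l == cluster]
    print(f"cluster {cluster}: {', '.join(members)}")
```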
Thursday, May 28, 2015
How biased are our brains?
Kristof reviews some recent work on unconscious bias, particularly racial bias.
Scholars suggest that in evolutionary times we became hard-wired to make instantaneous judgments about whether someone is in our “in group” or not — because that could be lifesaving. A child who didn’t prefer his or her own group might have been at risk of being clubbed to death...tests of unconscious biases... suggest that people turn out to have subterranean racial and gender biases that they are unaware of and even disapprove of.
I thought I would point out a recently published book, the subject of a forthcoming multiple review in Behavioral and Brain Sciences, which argues that the power of biases on perception is usually overstated, that perceptions of individuals and groups tend to be accurate. The précis of the book can be downloaded here. Book title and abstract:
Lee Jussim - Social Perception and Social Reality: Why Accuracy Dominates Bias and Self-Fulfilling Prophecy (Oxford University Press, 2012)
Abstract: Social Perception and Social Reality reviews the evidence in social psychology and related fields and reaches three conclusions: 1. Although errors, biases, and self-fulfilling prophecies in person perception are real, reliable, and occasionally quite powerful, on average, they tend to be weak, fragile and fleeting; 2. Perceptions of individuals and groups tend to be at least moderately, and often highly accurate; and 3. Conclusions based on the research on error, bias, and self-fulfilling prophecies routinely greatly overstate their power and pervasiveness, and consistently ignore evidence of accuracy, agreement, and rationality in social perception. The weight of the evidence - including some of the most classic research widely interpreted as testifying to the power of biased and self-fulfilling processes - is that interpersonal expectations relate to social reality primarily because they reflect rather than cause social reality. This is the case not only of teacher expectations, but also social stereotypes, both as perceptions of groups, and as the bases of expectations regarding individuals. The time is long overdue to replace cherry-picked and unjustified stories emphasizing error, bias, the power of self-fulfilling prophecies and the inaccuracy of stereotypes with conclusions that more closely correspond to the full range of empirical findings, which includes multiple failed replications of classic expectancy studies, meta-analyses consistently demonstrating small or at best moderate expectancy effects, and high accuracy in social perception.
Wednesday, May 27, 2015
After Phrenology: Neural Reuse and the Interactive Brain
I've been reading through an interesting article by Michael Anderson, a précis of a book accepted for publication and available as a PDF through BBS. I pass on the abstract:
Neural reuse is a form of neuroplasticity whereby neural elements originally developed for one purpose are put to multiple uses. A diverse behavioral repertoire is achieved via the creation of multiple, nested, and overlapping neural coalitions, in which each neural element is a member of multiple different coalitions and cooperates with a different set of partners at different times. This has profound implications for how we think about our continuity with other species, for how we understand the similarities and differences between psychological processes, and for how best to pursue a unified science of the mind. After Phrenology surveys the terrain and advocates for a series of reforms in psychology and cognitive neuroscience. The book argues that, among other things, we should capture brain function in a multi-dimensional manner, develop a new, action-oriented vocabulary for psychology, and recognize that higher-order cognitive processes are built from complex configurations of already evolved circuitry.
Tuesday, May 26, 2015
Neural basis of anxiety reduction by placebo.
Meyer et al. directly measure neural consequences of expecting a placebo treatment to be effective in relieving anxiety:
The beneficial effects of placebo treatments on fear and anxiety (placebo anxiolysis) are well known from clinical practice, and there is strong evidence indicating a contribution of treatment expectations to the efficacy of anxiolytic drugs. Although clinically highly relevant, the neural mechanisms underlying placebo anxiolysis are poorly understood. In two studies in humans, we tested whether the administration of an inactive treatment along with verbal suggestions of anxiolysis can attenuate experimentally induced states of phasic fear and/or sustained anxiety. Phasic fear is the response to a well defined threat and includes attentional focusing on the source of threat and concomitant phasic increases of autonomic arousal, whereas in sustained states of anxiety potential and unclear danger requires vigilant scanning of the environment and elevated tonic arousal levels. Our placebo manipulation consistently reduced vigilance measured in terms of undifferentiated reactivity to salient cues (indexed by subjective ratings, skin conductance responses and EEG event-related potentials) and tonic arousal [indexed by cue-unrelated skin conductance levels and enhanced EEG alpha (8–12 Hz) activity], indicating a downregulation of sustained anxiety rather than phasic fear. We also observed a placebo-dependent sustained increase of frontal midline EEG theta (4–7 Hz) power and frontoposterior theta coupling, suggesting the recruitment of frontally based cognitive control functions. Our results thus support the crucial role of treatment expectations in placebo anxiolysis and provide insight into the underlying neural mechanisms.
Monday, May 25, 2015
How alarm amplifies through social networks.
We've all probably played the parlor game with 10 or more people sitting in a circle, with one whispering a word into the ear of the person to their right, continuing to pass the word by whispering to the right until it comes back to the originator, frequently altered from its original form. Moussaïd et al. do a version of this routine in an experiment on how risk perception of hazardous events such as contagious outbreaks, terrorist attacks, and climate change spreads through social networks. They find that although the content of a message is gradually lost over repeated social transmissions, subjective perceptions of risk propagate and amplify due to social influence.
Understanding how people form and revise their perception of risk is central to designing efficient risk communication methods, eliciting risk awareness, and avoiding unnecessary anxiety among the public. However, public responses to hazardous events such as climate change, contagious outbreaks, and terrorist threats are complex and difficult-to-anticipate phenomena. Although many psychological factors influencing risk perception have been identified in the past, it remains unclear how perceptions of risk change when propagated from one person to another and what impact the repeated social transmission of perceived risk has at the population scale. Here, we study the social dynamics of risk perception by analyzing how messages detailing the benefits and harms of a controversial antibacterial agent undergo change when passed from one person to the next in 10-subject experimental diffusion chains. Our analyses show that when messages are propagated through the diffusion chains, they tend to become shorter, gradually inaccurate, and increasingly dissimilar between chains. In contrast, the perception of risk is propagated with higher fidelity due to participants manipulating messages to fit their preconceptions, thereby influencing the judgments of subsequent participants. Computer simulations implementing this simple influence mechanism show that small judgment biases tend to become more extreme, even when the injected message contradicts preconceived risk judgments. Our results provide quantitative insights into the social amplification of risk perception, and can help policy makers better anticipate and manage the public response to emerging threats.
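The "simple influence mechanism" invoked in the abstract is easy to caricature in a few lines of Python. The sketch below is my own toy model with made-up parameters, not the authors' simulation: each participant reports a judgment that blends the message they received with their own preconceived risk, then passes it along the chain.

```python
import numpy as np

rng = np.random.default_rng(2)

def run_chain(seed_risk, bias, length=10, weight=0.6, noise=0.05):
    """seed_risk: risk implied by the injected message (0 = harmless, 1 = very risky).
    bias: mean of participants' preconceived risk; weight: pull toward that preconception."""
    judgment = seed_risk
    for _ in range(length):
        preconception = float(np.clip(rng.normal(bias, 0.1), 0, 1))
        judgment = (1 - weight) * judgment + weight * preconception + rng.normal(0, noise)
        judgment = float(np.clip(judgment, 0, 1))
    return judgment

# inject a reassuring, low-risk message into 1,000 chains of mildly risk-biased participants
finals = [run_chain(seed_risk=0.2, bias=0.7) for _ in range(1000)]
print("risk implied by the seed message: 0.20")
print("mean judgment after 10 transmissions:", round(float(np.mean(finals)), 2))
```

Even though the injected message implies low risk, after ten transmissions the chains end up near the participants' shared preconception, the kind of amplification described above.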
Blog Categories:
culture/politics,
fear/anxiety/stress,
social cognition
Friday, May 22, 2015
Modulating movement intention and the extended present with tDCS.
Douglas et al. do a fascinating bit of work on the 'extended present' in which our brains function, during which our experienced intention to make a movement actually comes ~200 milliseconds after motor cortex signals initiating the movement (the famous Libet experiment showing we are 'late to action'). Conscious intention, or volition, provides the foundation for our attributing agency to ourselves, and for society attributing responsibility to an individual. A distorted sense of volition is a hallmark of many neurological and psychiatric illnesses such as alien hand syndrome, psychogenic movement disorders, and schizophrenia.
Conscious intention is a fundamental aspect of the human experience. Despite long-standing interest in the basis and implications of intention, its underlying neurobiological mechanisms remain poorly understood. Using high-definition transcranial DC stimulation (tDCS), we observed that enhancing spontaneous neuronal excitability in both the angular gyrus and the primary motor cortex caused the reported time of conscious movement intention to be ∼60–70 ms earlier. Slow brain waves recorded ∼2–3 s before movement onset, as well as hundreds of milliseconds after movement onset, independently correlated with the modulation of conscious intention by brain stimulation. These brain activities together accounted for 81% of interindividual variability in the modulation of movement intention by brain stimulation. A computational model using coupled leaky integrator units with biophysically plausible assumptions about the effect of tDCS captured the effects of stimulation on both neural activity and behavior. These results reveal a temporally extended brain process underlying conscious movement intention that spans seconds around movement commencement.
Thursday, May 21, 2015
Brain correlates of loving kindness meditation.
Garrison and collaborators extend their work on brain correlates of meditation practice, noting again a central role for the posterior cingulate cortex/precuneus (for previous posts in this thread, enter "Garrison" in the MindBlog search box in the left column).
Loving kindness is a form of meditation involving directed well-wishing, typically supported by the silent repetition of phrases such as “may all beings be happy,” to foster a feeling of selfless love. Here we used functional magnetic resonance imaging to assess the neural substrate of loving kindness meditation in experienced meditators and novices. We first assessed group differences in blood oxygen level-dependent (BOLD) signal during loving kindness meditation. We next used a relatively novel approach, the intrinsic connectivity distribution of functional connectivity, to identify regions that differ in intrinsic connectivity between groups, and then used a data-driven approach to seed-based connectivity analysis to identify which connections differ between groups. Our findings suggest group differences in brain regions involved in self-related processing and mind wandering, emotional processing, inner speech, and memory. Meditators showed overall reduced BOLD signal and intrinsic connectivity during loving kindness as compared to novices, more specifically in the posterior cingulate cortex/precuneus (PCC/PCu), a finding that is consistent with our prior work and other recent neuroimaging studies of meditation. Furthermore, meditators showed greater functional connectivity during loving kindness between the PCC/PCu and the left inferior frontal gyrus, whereas novices showed greater functional connectivity during loving kindness between the PCC/PCu and other cortical midline regions of the default mode network, the bilateral posterior insula lobe, and the bilateral parahippocampus/hippocampus. These novel findings suggest that loving kindness meditation involves a present-centered, selfless focus for meditators as compared to novices.
Blog Categories:
happiness,
meditation,
mindfulness,
self help
Wednesday, May 20, 2015
Essentialism
I pass on some clips from Richard Dawkins' brief essay, and suggest you also take a look at Lisa Barrett's comments on essentialist views of the mind:
Essentialism—what I’ve called "the tyranny of the discontinuous mind"—stems from Plato, with his characteristically Greek geometer’s view of things. For Plato, a circle, or a right triangle, were ideal forms, definable mathematically but never realised in practice. A circle drawn in the sand was an imperfect approximation to the ideal Platonic circle hanging in some abstract space. That works for geometric shapes like circles, but essentialism has been applied to living things and Ernst Mayr blamed this for humanity’s late discovery of evolution—as late as the nineteenth century. If, like Aristotle, you treat all flesh-and-blood rabbits as imperfect approximations to an ideal Platonic rabbit, it won’t occur to you that rabbits might have evolved from a non-rabbit ancestor, and might evolve into a non-rabbit descendant. If you think, following the dictionary definition of essentialism, that the essence of rabbitness is "prior to" the existence of rabbits (whatever "prior to" might mean, and that’s a nonsense in itself) evolution is not an idea that will spring readily to your mind, and you may resist when somebody else suggests it.
Essentialism rears its ugly head in racial terminology. The majority of "African Americans" are of mixed race. Yet so entrenched is our essentialist mind-set, American official forms require everyone to tick one race/ethnicity box or another: no room for intermediates. A different but also pernicious point is that a person will be called "African American" even if only, say, one of his eight great grandparents was of African descent. As Lionel Tiger put it to me, we have here a reprehensible "contamination metaphor." But I mainly want to call attention to our society’s essentialist determination to dragoon a person into one discrete category or another. We seem ill-equipped to deal mentally with a continuous spectrum of intermediates. We are still infected with the plague of Plato’s essentialism.
Moral controversies such as those over abortion and euthanasia are riddled with the same infection. At what point is a brain-dead accident-victim defined as "dead"? At what moment during development does an embryo become a "person"? Only a mind infected with essentialism would ask such questions. An embryo develops gradually from single-celled zygote to newborn baby, and there’s no one instant when "personhood" should be deemed to have arrived. The world is divided into those who get this truth and those who wail, "But there has to be some moment when the fetus becomes human." No, there really doesn’t, any more than there has to be a day when a middle aged person becomes old. It would be better—though still not ideal—to say the embryo goes through stages of being a quarter human, half human, three quarters human . . . The essentialist mind will recoil from such language and accuse me of all manner of horrors for denying the essence of humanness...Our essentialist urge toward rigid definitions of "human" (in debates over abortion and animal rights) and "alive" (in debates over euthanasia and end-of-life decisions) makes no sense in the light of evolution and other gradualistic phenomena.
We define a poverty "line": you are either "above" or "below" it. But poverty is a continuum. Why not say, in dollar-equivalents, how poor you actually are? The preposterous Electoral College system in US presidential elections is another, and especially grievous, manifestation of essentialist thinking. Florida must go either wholly Republican or wholly Democrat—all 25 Electoral College votes—even though the popular vote is a dead heat. But states should not be seen as essentially red or blue: they are mixtures in various proportions.
You can surely think of many other examples of "the dead hand of Plato"—essentialism. It is scientifically confused and morally pernicious. It needs to be retired.
Tuesday, May 19, 2015
What to do with brain markers that predict future behaviors?
An interesting review from Gabrieli et al. suggesting how predicting individual futures with neuromarkers might make a pragmatic contribution to human welfare.
Neuroimaging has greatly enhanced the cognitive neuroscience understanding of the human brain and its variation across individuals (neurodiversity) in both health and disease. Such progress has not yet, however, propelled changes in educational or medical practices that improve people’s lives. We review neuroimaging findings in which initial brain measures (neuromarkers) are correlated with or predict future education, learning, and performance in children and adults; criminality; health-related behaviors; and responses to pharmacological or behavioral treatments. Neuromarkers often provide better predictions (neuroprognosis), alone or in combination with other measures, than traditional behavioral measures. With further advances in study designs and analyses, neuromarkers may offer opportunities to personalize educational and clinical practices that lead to better outcomes for people.
Figure - Functional Brain Measure Predicting a Clinical Outcome: Prior to treatment, patients with social anxiety disorder who exhibited greater posterior activation (left) for angry relative to neutral facial expressions had a better clinical response to CBT (cognitive behavioral therapy) than patients who exhibited lesser activation (right)
Blog Categories:
brain plasticity,
fear/anxiety/stress
Monday, May 18, 2015
More comment on anti-aging regimes
A recent brief review by Jane Brody points to work showing a small effect of cognitive training programs engaging memory, reasoning, or speed of processing. After 10 years, 60 percent of those in the training programs, compared with 50 percent of the controls, had maintained or improved their ability to perform activities of daily living. Reasoning and speed training, but not memory training, resulted in improved targeted cognitive abilities for 10 years. The article also contains a brief video demonstrating the NeuroRacer program developed by Gazzaley and colleagues. The article points out that very few of the potions and gizmos on the market "...have been proven to have a meaningful, sustainable benefit beyond lining the pockets of their sellers. Before you invest in them, you’d be wise to look for well-designed, placebo-controlled studies that attest to their ability to promote a youthful memory and other cognitive functions."
Friday, May 15, 2015
Grasp posture of our hands biases our visual processing.
Fascinating observations from Laura Thomas:
Observers experience biases in visual processing for objects within easy reach of their hands; these biases may assist them in evaluating items that are candidates for action. I investigated the hypothesis that hand postures that afford different types of actions differentially bias vision. Across three experiments, participants performed global-motion-detection and global-form-perception tasks while their hands were positioned (a) near the display in a posture affording a power grasp, (b) near the display in a posture affording a precision grasp, or (c) in their laps. Although the power-grasp posture facilitated performance on the motion-detection task, the precision-grasp posture instead facilitated performance on the form-perception task. These results suggest that the visual system weights processing on the basis of an observer’s current affordances for specific actions: Fast and forceful power grasps enhance temporal sensitivity, whereas detail-oriented precision grasps enhance spatial sensitivity.
Blog Categories:
acting/choosing,
attention/perception
Thursday, May 14, 2015
Literary (like musical and athletic) expertise shifts brain activity to the caudate nucleus.
Erhard et al. find that creative writing by expert versus amateur writers is associated with more activation in the caudate nucleus, the same area that becomes more active in expert versus amateur athletes and musicians. The increased recruitment of the basal ganglia network with increasing levels of expertise correlates with behavioral automatization that facilitates complex cognitive tasks.
The aim of the present study was to explore brain activities associated with creativity and expertise in literary writing. Using functional magnetic resonance imaging (fMRI), we applied a real-life neuroscientific setting that consisted of different writing phases (brainstorming and creative writing; reading and copying as control conditions) to well-selected expert writers and to an inexperienced control group.
During creative writing, experts showed cerebral activation in a predominantly left-hemispheric fronto-parieto-temporal network. When compared to inexperienced writers, experts showed increased left caudate nucleus and left dorsolateral and superior medial prefrontal cortex activation. In contrast, less experienced participants recruited increasingly bilateral visual areas. During creative writing activation in the right cuneus showed positive association with the creativity index in expert writers.
High experience in creative writing seems to be associated with a network of prefrontal (mPFC and DLPFC) and basal ganglia (caudate) activation. In addition, our findings suggest that high verbal creativity specific to literary writing increases activation in the right cuneus associated with increased resources obtained for reading processes.
Blog Categories:
acting/choosing,
memory/learning,
music
Wednesday, May 13, 2015
Human Purpose
The recent NYTimes David Brooks Op-Ed piece “What is your purpose?” has drawn a lot of feedback and comment. He laments the passing of the era of lofty authority figures like Reinhold Niebuhr who argued for a coherent moral ecology. (I remember as a Quincy House Harvard undergraduate in 1962 having breakfast with Reinhold and Ursula Niebuhr during their several weeks' residence in the house.)
These days we live in a culture that is more diverse, decentralized, interactive and democratized…Public debate is now undermoralized and overpoliticized…Intellectual prestige has drifted away from theologians, poets and philosophers and toward neuroscientists, economists, evolutionary biologists and big data analysts. These scholars have a lot of knowledge to bring, but they’re not in the business of offering wisdom on the ultimate questions.
And there you have it. Per Thomas Wolfe, “You Can’t Go Home Again.” Brooks’ "neuroscientists, economists, evolutionary biologists and big data analysts" have let the genie out of the bottle. We are very clear now that “purpose” is a human invention in the service of passing on our genes. I have seen no clearer statement on purpose than that given by E. O. Wilson, which I excerpted in my Dec. 5, 2014 post.
Blog Categories:
culture/politics,
human evolution,
morality
Tuesday, May 12, 2015
Trust and the Insula
From Belfi et al:
Reciprocal trust is a crucial component of cooperative, mutually beneficial social relationships. Previous research using tasks that require judging and developing interpersonal trust has suggested that the insula may be an important brain region underlying these processes. Here, using a neuropsychological approach, we investigated the role of the insula in reciprocal trust during the Trust Game (TG), an interpersonal economic exchange. Consistent with previous research, we found that neurologically normal adults reciprocate trust in kind, i.e., they increase trust in response to increases from their partners, and decrease trust in response to decreases. In contrast, individuals with damage to the insula displayed abnormal expressions of trust. Specifically, these individuals behaved benevolently (expressing misplaced trust) when playing the role of investor, and malevolently (violating their partner's trust) when playing the role of the trustee. Our findings lend further support to the idea that the insula is important for expressing normal interpersonal trust, perhaps because the insula helps to recognize risk during decision-making and to identify social norm violations.
Blog Categories:
emotion,
emotions,
morality,
social cognition
Monday, May 11, 2015
The immensity of the vacated present.
The title of this post is a phrase from a recent essay by Vivian Gornick, "The cost of daydreaming," describing an experience that very much resonates with my own, and that I think captures her discovery of the distinction between our internal mind-wandering (default mode) and present-centered, outwardly oriented (attentional) brain networks (the subject of many MindBlog posts). On finding that she could sense the start of daydreaming and suppress it:
...the really strange and interesting thing happened. A vast emptiness began to open up behind my eyes as I went about my daily business. The daydreaming, it seemed, had occupied more space than I’d ever imagined. It was as though a majority of my waking time had routinely been taken up with fantasizing, only a narrow portion of consciousness concentrated on the here and now...I began to realize what daydreaming had done for me — and to me.
Turning 60 was like being told I had six months to live. Overnight, retreating into the refuge of a fantasized tomorrow became a thing of the past. Now there was only the immensity of the vacated present...It wasn’t hard to cut short the daydreaming, but how exactly did one manage to occupy the present when for so many years one hadn’t?
Then, after a period of time:
...I became aware, after a street encounter, that the vacancy within was stirring with movement. A week later another encounter left me feeling curiously enlivened. It was the third one that did it. A hilarious exchange had taken place between me and a pizza deliveryman, and sentences from it now started repeating themselves in my head as I walked on, making me laugh each time anew, and each time with yet deeper satisfaction. Energy — coarse and rich — began to swell inside the cavity of my chest. Time quickened, the air glowed, the colors of the day grew vivid; my mouth felt fresh. A surprising tenderness pressed against my heart with such strength it seemed very nearly like joy; and with unexpected sharpness I became alert not to the meaning but to the astonishment of human existence. It was there on the street, I realized, that I was filling my skin, occupying the present.
Blog Categories:
attention/perception,
mindfulness,
self,
self help
Friday, May 08, 2015
Yet another self therapy electrical widget.
I've done a number of posts on transcranial direct current stimulation (tDCS). Entering 'transcranial' in MindBlog's search box takes you to posts on tDCS effects on impulse control, memory enhancement, general cognition, etc. Entering tDCS in a google search box gets you a plethora of offered devices costing from $50 to $300+. An article by Kira Peikoff now points to a forthcoming new-agey device called Thync that is claimed to alter your mood as desired ("calm vibes" or "energy vibes"). An interesting clip from that article:
In January, the journal Brain Stimulation published the largest meta-analysis of tDCS to date. After examining every finding replicated by at least two research groups, leading to 59 analyses, the authors reported that one session of tDCS failed to show any significant benefit for users.
Thync developers claim to be bypassing the brain and using pulsed currents to stimulate subcutaneous peripheral nerves to modulate the stress response. The placebo-controlled studies they say they have done to document effectiveness of the procedure have not been published.
Thursday, May 07, 2015
Observing brain correlates of social interactions.
Bilek et al. look at coordinated brain activity during social interactions between two people in a joint attention paradigm, using a hyperscanning procedure in which the two subjects' brain activity is measured simultaneously in scanners at two sites coupled over the internet. Allowing two humans to see (and possibly hear) each other in a hyperscanning framework makes possible an immersive social interaction while both participants' brains are imaged. The authors constructed a setup with delay-free data transmission and precisely synchronized data acquisition, in addition to a live video stream provided between scanner sites during the sessions.
From their significance and abstract sections:
Social interaction is the likely driver of human brain evolution, critical for health, and underlies phenomena as varied as childhood development, stock market behavior, and much of what is studied in the humanities. However, appropriate experimental methods to study the underlying brain processes are still developing and technically challenging...Here, we used hyperscanning functional MRI (fMRI) to study information flow between brains of human dyads during real-time social interaction in a joint attention paradigm. In a hardware setup enabling immersive audiovisual interaction of subjects in linked fMRI scanners, we characterize cross-brain connectivity components that are unique to interacting individuals, identifying information flow between the sender’s and receiver’s temporoparietal junction. We replicate these findings in an independent sample and validate our methods by demonstrating that cross-brain connectivity relates to a key real-world measure of social behavior. Together, our findings support a central role of human-specific cortical areas in the brain dynamics of dyadic interactions and provide an approach for the noninvasive examination of the neural basis of healthy and disturbed human social behavior with minimal a priori assumptions.
Figure: Neural coupling of sender and receiver right temporoparietal junctions in a shared attention paradigm. A, Discovery study performed on 26 subjects (13 pairs); B, Replication study performed on 50 subjects (25 pairs).
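The core quantity in a hyperscanning analysis like this is a statistical dependence between time series recorded from two different brains. Here is a deliberately simple Python sketch, my own illustration with simulated signals rather than the authors' connectivity pipeline, showing how one might find the lag at which a "receiver" signal tracks a "sender" signal.

```python
import numpy as np

rng = np.random.default_rng(3)
n_vols, true_lag = 300, 3                    # fMRI volumes; receiver follows sender by 3 volumes
sender = rng.normal(size=n_vols)             # simulated sender rTPJ signal
receiver = 0.6 * np.roll(sender, true_lag) + 0.8 * rng.normal(size=n_vols)  # lagged, noisy copy

def lagged_corr(x, y, lag):
    # correlation between x and a copy of y shifted back by `lag` samples
    if lag == 0:
        return float(np.corrcoef(x, y)[0, 1])
    return float(np.corrcoef(x[:-lag], y[lag:])[0, 1])

corrs = [lagged_corr(sender, receiver, lag) for lag in range(8)]
best_lag = int(np.argmax(corrs))
print("cross-brain correlation by lag:", [round(c, 2) for c in corrs])
print(f"coupling peaks at lag {best_lag} volumes (simulated true lag = {true_lag})")
```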
Blog Categories:
attention/perception,
social cognition
Early environment and stress systems in children.
Most of the detailed work on the long term effects of adverse early life environment on stress response systems has been done with rodent models. McLaughlin et al. now present results from the Bucharest Early Intervention Project examining whether randomized placement of children into a family caregiving environment alters development of the autonomic nervous system and HPA axis in children exposed to early-life deprivation associated with institutional rearing. They provide the first experimental evidence in humans for a sensitive period with regard to stress response system development.
Disruptions in stress response system functioning are thought to be a central mechanism by which exposure to adverse early-life environments influences human development. Although early-life adversity results in hyperreactivity of the sympathetic nervous system (SNS) and hypothalamic–pituitary–adrenal (HPA) axis in rodents, evidence from human studies is inconsistent. We present results from the Bucharest Early Intervention Project examining whether randomized placement into a family caregiving environment alters development of the autonomic nervous system and HPA axis in children exposed to early-life deprivation associated with institutional rearing. Electrocardiogram, impedance cardiograph, and neuroendocrine data were collected during laboratory-based challenge tasks from children (mean age = 12.9 y) raised in deprived institutional settings in Romania randomized to a high-quality foster care intervention (n = 48) or to remain in care as usual (n = 43) and a sample of typically developing Romanian children (n = 47). Children who remained in institutional care exhibited significantly blunted SNS and HPA axis responses to psychosocial stress compared with children randomized to foster care, whose stress responses approximated those of typically developing children. Intervention effects were evident for cortisol and parasympathetic nervous system reactivity only among children placed in foster care before age 24 and 18 months, respectively, providing experimental evidence of a sensitive period in humans during which the environment is particularly likely to alter stress response system development. We provide evidence for a causal link between the early caregiving environment and stress response system reactivity in humans with effects that differ markedly from those observed in rodent models.
Blog Categories:
fear/anxiety/stress,
human development
Wednesday, May 06, 2015
Cognitive training enhances intrinsic brain connectivity in childhood.
A fascinating open access article from Astle et al., who found changes in brain connectivity in 8-11 year old children who completed 20 sessions of computerized working memory training at home:
In human participants, the intensive practice of particular cognitive activities can induce sustained improvements in cognitive performance, which in some cases transfer to benefits on untrained activities. Despite the growing body of research examining the behavioral effects of cognitive training in children, no studies have explored directly the neural basis of these training effects in a systematic, controlled fashion. Therefore, the impact of training on brain neurophysiology in childhood, and the mechanisms by which benefits may be achieved, are unknown. Here, we apply new methods to examine dynamic neurophysiological connectivity in the context of a randomized trial of adaptive working memory training undertaken in children. After training, connectivity between frontoparietal networks and both lateral occipital complex and inferior temporal cortex was altered. Furthermore, improvements in working memory after training were associated with increased strength of neural connectivity at rest, with the magnitude of these specific neurophysiological changes being mirrored by individual gains in untrained working memory performance.
Tuesday, May 05, 2015
Ideology distorts perception of social mobility in the U.S.
I find it interesting that I share the same misperceptions of changes in social mobility that are noted in this piece of work by Chambers et al.:
The ability to move upward in social class or economic position (i.e., social mobility) is a defining feature of the American Dream, yet recent public-opinion polls indicate that many Americans are losing confidence in the essential fairness of the system and their opportunities for financial advancement. In two studies, we examined Americans’ perceptions of both current levels of mobility in the United States and temporal trends in mobility, and we compared these perceptions with objective indicators to determine perceptual accuracy. Overall, participants underestimated current mobility and erroneously concluded that mobility has declined over the past four decades. These misperceptions were more pronounced among politically liberal participants than among politically moderate or conservative ones. These perception differences were accounted for by liberals’ relative dissatisfaction with the current social system, social hierarchies, and economic inequality. These findings have important implications for theories of political ideology.
The authors' introduction, after noting pessimistic views on social mobility - "A recent Gallup poll (Dugan & Newport, 2013) found that only 52% of Americans agreed that there is plenty of opportunity for the average person to get ahead in life—down from 81% a mere 15 years earlier and the lowest level in over six decades." - points to the actual data on social mobility changes:
The publication of a recent, multidecade report provided us with an opportunity to compare those public perceptions with economic reality. Chetty, Hendren, Kline, Saez, and Turner (2014a, 2014b) compared the tax records of nearly 40 million American adults with those of their parents 20 years earlier, assessing changes in individuals’ economic position relative to their starting point in life (i.e., their parents’ economic position). They also compared individuals born in different decades, from the early 1970s through the mid 1990s, to assess any generational changes in mobility patterns. First, they found (as have Hertz, 2007; Lee & Solon, 2009) that intergenerational mobility rates have not declined, but in fact remained stable during the three-decade period they examined—contrary to popular belief (Dugan & Newport, 2013; Pew Research Center, 2012). Second, their data revealed that Americans enjoy—depending on one’s perspective—a substantial amount of social mobility. For example, of individuals born to parents in the bottom third of the income distribution (i.e., lower-class parents), 49% remained in the bottom third later in life, whereas 51% moved up to the middle or top third. In other words, despite their disadvantaged backgrounds, half of them were upwardly mobile (though still below the two-thirds one might expect based on the American Dream). Moreover, because they utilized much larger sample sizes, actual tax records (instead of self-reported income), and multiple indicators of mobility (e.g., incomes, college attendance rates), Chetty and colleagues’ study yields more precise estimates of social mobility than prior studies, and their findings are consistent with those of other published reports (Pew Research Center, 2013; U.S. Department of the Treasury, 2007). This makes their study the most appropriate standard to gauge our participants’ perceptual accuracy.
Monday, May 04, 2015
Economic origins of ultrasociality
Gowdy and Krall, in a manuscript under review by Behavioral and Brain Sciences, argue that the transition to agriculture in ants, termites, and our own human species generated the need for the extreme role specializations and ultrasociality that distinguish us from social species that depend on foraging for food. (Interested readers can obtain a copy of the MS from me.) I pass on a clip from the introduction, followed by their abstract.
With the widespread adoption of agriculture some 10,000 years ago human societies took on some important characteristics shared with social insects—ants and termites in particular—that also engage in the production of their own food. These characteristics represented a sharp break in the evolutionary history of these lineages and led to two important outcomes (1) ecosystem domination as a product of a dramatic increase in population size and much more intensive resource exploitation and (2) the suppression of individual autonomy as the group itself became the focus of economic organization. The evolution of agriculture in fungus-growing ants and termites, and in human societies, is an example of convergent evolution—the independent evolution of similar characteristics in species not closely related. In terms of genetics, ants, humans and termites could hardly be more different. Yet in all three lineages similar patterns of economic organization emerge through similar selection pressures. We use the term ultrasociality to refer to these lineages and we address the question of its origin through the fundamental question of evolutionary biology: “where did something come from and what were the selection pressures that favored its spread?”
Abstract:
Ultrasociality refers to the social organization of a few species, including humans and some social insects, having complex division of labor, city states and an almost exclusive dependence on agriculture for subsistence. We argue that the driving forces in the evolution of these ultrasocial societies were economic. With the agricultural transition, species could directly produce their own food and this was such a competitive advantage that those species now dominate the planet. Once underway, this transition was propelled by the selection of within-species groups that could best capture the advantages of (1) actively managing the inputs to food production, (2) a more complex division of labor, and (3) increasing returns to larger scale and larger group size. Together these factors reoriented productive life and radically altered the structure of these societies. Once agriculture began, populations expanded as these economic drivers opened up new opportunities for the exploitation of resources and the active management of inputs to food production. With intensified group-level competition, larger populations and intensive resource exploitation became competitive advantages and the “social conquest of earth” was underway. Ultrasocial species came to dominate the earth’s ecosystems. Ultrasociality also brought a loss of autonomy for individuals within the group. We argue that exploring the common causes and consequences of ultrasociality in humans and the social insects that adopted agriculture can provide fruitful insights into the evolution of complex human society.
Blog Categories:
culture,
culture/politics,
evolution/debate,
human evolution
Friday, May 01, 2015
Explaining the increase in individualism in the U.S. over the past 150 years.
Grossman and Varnum quantify shifts in eight cultural-level correlates of individualism: individualist versus collectivist themes in books; behavioral markers of uniqueness, such as baby-naming practices; and behavioral and demographic indicators of the strength of family ties, such as family size, the percentage of single-person and multigenerational households, and divorce rates. They then test the relationship between these indicators and trends in pathogen prevalence, the number of disasters, urbanization, secularism, and socioeconomic structure. Their data suggest that shifts in socioeconomic structure have been the most potent predictor of changes across a wide range of individualism-related markers: compared with blue-collar occupations, white-collar occupations afford and demand more autonomy and self-direction, and greater affluence enables individuals to pursue their own interests without consulting or depending on larger collectives. Their abstract:
Why do cultures change? The present work examined cultural change in eight cultural-level markers, or correlates, of individualism in the United States, all of which increased over the course of the 20th century: frequency of individualist themes in books, preference for uniqueness in baby naming, frequency of single-child relative to multichild families, frequency of single-generation relative to multigeneration households, percentage of adults and percentage of older adults living alone, small family size, and divorce rates (relative to marriage rates). We tested five key hypotheses regarding cultural change in individualism-collectivism. As predicted by previous theories, changes in socioeconomic structure, pathogen prevalence, and secularism accompanied changes in individualism averaged across all measures. The relationship with changes in individualism was less robust for urbanization. Contrary to previous theories, changes in individualism were positively (as opposed to negatively) related to the frequency of disasters. Time-lagged analyses suggested that only socioeconomic structure had a robust effect on individualism; changes in socioeconomic structure preceded changes in individualism. Implications for anthropology, psychology, and sociology are discussed.
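The "time-lagged analyses" mentioned in the abstract can be illustrated with a toy computation: for two yearly indices, ask whether changes in one series correlate more strongly with later changes in the other than the reverse. The sketch below uses made-up data and a simple lagged Pearson correlation; it only conveys the logic, and is not the authors' actual models or data.

```python
# Toy illustration of a time-lagged analysis: do changes in a socioeconomic
# index precede changes in an individualism index? (Synthetic data only.)
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2001)

# Hypothetical indices: individualism loosely follows socioeconomic change
# with a delay of roughly 10 years, plus noise.
socioeconomic = np.cumsum(rng.normal(0.1, 1.0, size=years.size))
individualism = np.roll(socioeconomic, 10) + rng.normal(0, 2.0, size=years.size)
individualism[:10] = individualism[10]  # overwrite the wrapped-around values

def lagged_corr(x, y, lag):
    """Correlate x(t) with y(t + lag); a positive lag means x leads y."""
    if lag > 0:
        return np.corrcoef(x[:-lag], y[lag:])[0, 1]
    if lag < 0:
        return np.corrcoef(x[-lag:], y[:lag])[0, 1]
    return np.corrcoef(x, y)[0, 1]

for lag in (-20, -10, 0, 10, 20):
    print(f"lag {lag:+3d} years: r = {lagged_corr(socioeconomic, individualism, lag):+.2f}")
# If the socioeconomic series drives the individualism series, correlations
# should peak at positive lags (socioeconomic change preceding individualism).
```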
Thursday, April 30, 2015
The most clever crow yet....
MindBlog has done posts on a raven making a tool, an urban crow with street smarts, and now I pass on a video that a friend alerted me to, showing a crow solving a multi-step problem.
Wednesday, April 29, 2015
Effortless awareness: Linking subjective experience with brain activity during focused meditation with real-time fMRI
Garrison and collaborators, whose work I have referenced in a previous post and also at the end of my "Upstairs/Downstairs" web lecture, give a further account of their work in this abstract from their open-access article in Frontiers in Human Neuroscience:
Neurophenomenological studies seek to utilize first-person self-report to elucidate cognitive processes related to physiological data. Grounded theory offers an approach to the qualitative analysis of self-report, whereby theoretical constructs are derived from empirical data. Here we used grounded theory methodology (GTM) to assess how the first-person experience of meditation relates to neural activity in a core region of the default mode network—the posterior cingulate cortex (PCC). We analyzed first-person data consisting of meditators' accounts of their subjective experience during runs of a real time fMRI neurofeedback study of meditation, and third-person data consisting of corresponding feedback graphs of PCC activity during the same runs. We found that for meditators, the subjective experiences of “undistracted awareness” such as “concentration” and “observing sensory experience,” and “effortless doing” such as “observing sensory experience,” “not efforting,” and “contentment,” correspond with PCC deactivation. Further, the subjective experiences of “distracted awareness” such as “distraction” and “interpreting,” and “controlling” such as “efforting” and “discontentment,” correspond with PCC activation. Moreover, we derived several novel hypotheses about how specific qualities of cognitive processes during meditation relate to PCC activity, such as the difference between meditation and “trying to meditate.” These findings offer novel insights into the relationship between meditation and mind wandering or self-related thinking and neural activity in the default mode network, driven by first-person reports.
Tuesday, April 28, 2015
Improving vision in older adults.
I'm now a Fort Lauderdale, Florida resident (except for 5 months of spring and summer in Madison, WI), and I have several friends 85 and older still driving on the death-defying I-95 interstate that links Palm Beach, Fort Lauderdale, and Miami, even though their visual capabilities have clearly declined. This is an age cohort projected to increase by 350% between 2000 and 2050. One of the most obvious declines in their visual processing is in contrast sensitivity: resolving small changes in illumination and shape detail, especially at high spatial frequencies. DeLoss et al., in a study in the same vein as others reported in this blog (enter aging in the search box in the left column), show that simple discrimination exercises, 1.5 hr per day of testing and training over 7 days, yielded performance that was not statistically different from that of college-age younger adults prior to training. (These were exercises of the sort currently available online; see Brainhq.com or Luminosity.com.) Here is the abstract, followed by figures illustrating the test employed.
A major problem for the rapidly growing population of older adults (age 65 and over) is age-related declines in vision, which have been associated with increased risk of falls and vehicle crashes. Research suggests that this increased risk is associated with declines in contrast sensitivity and visual acuity. We examined whether a perceptual-learning task could be used to improve age-related declines in contrast sensitivity. Older and younger adults were trained over 7 days using a forced-choice orientation-discrimination task with stimuli that varied in contrast with multiple levels of additive noise. Older adults performed as well after training as did college-age younger adults prior to training. Improvements transferred to performance for an untrained stimulus orientation and were not associated with changes in retinal illuminance. Improvements in far acuity in younger adults and in near acuity in older adults were also found. These findings indicate that behavioral interventions can greatly improve visual performance for older adults.
Example of the task used in the study. In each trial, subjects saw a Gabor patch at one of two standard orientations—25° clockwise (shown here) or 25° counterclockwise for training and testing trials, 45° clockwise or 45° counterclockwise for familiarization trials. After this Gabor patch disappeared, subjects saw a second stimulus and had to judge whether it was rotated clockwise or counterclockwise in comparison with the standard orientation (the examples shown here are rotated 15° clockwise and counterclockwise off the standard orientation, respectively).
Example of contrast and noise levels used in the experiment. Gabor patches are displayed at 75% contrast in the top row and at 25% contrast in the bottom row. Stimuli were presented in five blocks (examples shown from left to right). There was no noise in the first block, but starting with the second block, stimuli were presented in Gaussian noise, with the noise level increasing in each subsequent block.
A clip from the NY Times review:
...the subjects watched 750 striped images that were rapidly presented on a computer screen with subtle changes in the visual “noise” surrounding them — like snow on a television. The viewer indicated whether the images were rotating clockwise or counterclockwise. The subject would hear a beep for every correct response.
Each session took an hour and a half. The exercises were taxing, although the subjects took frequent breaks. But after five sessions, the subjects had learned to home in more precisely on the images and to filter out the distracting visual noise. After the training, the older adults performed as well as those 40 years younger, before their own training.
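For readers curious about what such a stimulus looks like computationally, here is a minimal sketch that generates an oriented Gabor patch (a sinusoidal grating in a Gaussian envelope) at a given contrast and embeds it in additive Gaussian noise, in the spirit of the displays described above. The specific parameter values (image size, spatial frequency, envelope width, noise level) are illustrative assumptions, not the values used by DeLoss et al.

```python
# Minimal sketch of a Gabor patch in additive Gaussian noise, in the spirit
# of the orientation-discrimination stimuli described above. All parameter
# values are illustrative, not those of the published study.
import numpy as np

def gabor(size=256, orientation_deg=25, contrast=0.75,
          cycles_per_image=8, sigma_frac=0.15, noise_sd=0.0, seed=None):
    """Return a [0, 1] luminance image of a Gabor patch plus optional noise."""
    rng = np.random.default_rng(seed)
    half = size / 2
    y, x = np.mgrid[-half:half, -half:half]
    theta = np.deg2rad(orientation_deg)
    # Coordinate along the grating's modulation direction.
    xr = x * np.cos(theta) + y * np.sin(theta)
    grating = np.sin(2 * np.pi * cycles_per_image * xr / size)
    envelope = np.exp(-(x**2 + y**2) / (2 * (sigma_frac * size) ** 2))
    signal = contrast * grating * envelope           # roughly [-contrast, contrast]
    noise = rng.normal(0.0, noise_sd, size=(size, size))
    return np.clip(0.5 + 0.5 * (signal + noise), 0.0, 1.0)

# A 25-degree standard at 75% contrast, then a noisier 25% contrast version.
standard = gabor(orientation_deg=25, contrast=0.75)
low_contrast_noisy = gabor(orientation_deg=25, contrast=0.25, noise_sd=0.3, seed=1)
```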
Monday, April 27, 2015
Emotion is a component of the earliest stages of perception.
Emotions and perceptions are generally assumed to be separate and parallel realms of the mind. Topolinski et al. show, to the contrary, that affect is a genuine online component of perception, instantaneously mirroring the success of different perceptual stages.
Here is their abstract, followed by a graphic of the impossible Necker cube used in experiments 4 and 5.
Current theories assume that perception and affect are separate realms of the mind. In contrast, we argue that affect is a genuine online-component of perception instantaneously mirroring the success of different perceptual stages. Consequently, we predicted that the success (or failure) of even very early and cognitively encapsulated basic visual processing steps would trigger immediate positive (or negative) affective responses. To test this assumption, simple visual stimuli that either allowed or obstructed early visual processing stages without participants being aware of this were presented briefly. Across 5 experiments, we found more positive affective responses to stimuli that allowed rather than obstructed Gestalt completion at certain early visual stages (Experiments 1–3; briefest presentation 100 ms with post-mask), and visual disambiguation in possible vs. impossible Necker cubes (Experiments 4 and 5; briefest presentation 100 ms with post-mask). This effect was observed both on verbal preference ratings (Experiments 1, 2, and 4) and as facial muscle responses occurring within 2–4 s after stimulus onset (zygomaticus activity; Experiments 3 and 7). For instance, in participants unaware of spatial possibility we found affective discrimination between possible and impossible Necker cubes (the famous Freemish Crate) for 100 ms presentation timings, although a conscious discrimination took more than 2000 ms (Experiment 4).
The Freemish Crate (impossible Necker cube), which features inconsistent mutual occlusions of some of the lines constituting the cube and thus represents a cube that cannot exist in three dimensions; such an impossible cube renders visual disambiguation impossible (a). The visual manipulation from Experiments 4 and 5 (b), and examples of a possible (c) and an impossible (d) Necker cube.
Blog Categories:
acting/choosing,
consciousness,
emotion,
emotions
Friday, April 24, 2015
Subliminal learning can nudge future control decisions.
Most dual-system approaches to cognition (as, for example, in Kahneman's 2011 book "Thinking, Fast and Slow") propose a dichotomy in which processes are either nonconscious, fast, associative, automatic, rigid, and subjectively effortless, or conscious, slow, propositional, controlled, flexible, and effortful. Farooqui and Manly demonstrate a more nuanced reality: unconscious associative learning of subliminal relations. They describe the setup:
In our experiments, participants performed task-switching tests. On each trial, participants saw a colored rectangle that indicated which of two tasks should be performed on the numbers or letter inside it. A couple of seconds before these were presented, one of three subliminal cues appeared: One cue predicted a switch from the prior task, another cue predicted a repetition of the prior task, and a third cue was nonpredictive (and could therefore appear before either type of trial). The cues were not linked to any task set; rather, they predicted the switch/repeat status of the trial, and therefore the participants could use them to make proactive goal-directed changes in the currently active cognitive set.
They found that
Despite participants’ being entirely unaware of subliminal cues in a series of challenging switch tasks, cues predicting switch trials were reliably associated with improved performance. This robust effect was observed across four variants of stimuli and tasks in four independent participant groups.
From their summary abstract:
... This utilization of subliminal information was flexible and adapted to a change in cues predicting task switches and occurred only when switch trials were difficult and effortful. When cues were consciously visible, participants were unable to discern their relevance and could not use them to enhance switch performance. Our results show that unconscious cognition can implicitly use subliminal information in a goal-directed manner for anticipatory control, and they also suggest that subliminal representations may be more conducive to certain forms of associative learning.
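To make the design concrete, here is a small sketch of how such a cued trial list could be generated: one cue type precedes switch trials, another precedes repeat trials, and a third is uninformative. The cue labels, probabilities, and task names are my own assumptions for illustration, not the authors' materials or code.

```python
# Sketch of a cued task-switching trial list: cue "A" precedes switch trials,
# cue "B" precedes repeat trials, cue "C" is nonpredictive. Illustration of
# the design logic described above only; not the authors' code.
import random

def make_trials(n_trials=120, p_switch=0.5, p_nonpredictive=1/3, seed=0):
    rng = random.Random(seed)
    trials, task = [], rng.choice(["letter", "number"])
    for _ in range(n_trials):
        is_switch = rng.random() < p_switch
        if is_switch:
            task = "number" if task == "letter" else "letter"
        if rng.random() < p_nonpredictive:
            cue = "C"                        # uninformative cue
        else:
            cue = "A" if is_switch else "B"  # predictive cues
        trials.append({"cue": cue, "task": task, "switch": is_switch})
    return trials

trials = make_trials()
# Sanity check: predictive cues track switch status perfectly in this sketch.
assert all(t["switch"] for t in trials if t["cue"] == "A")
assert not any(t["switch"] for t in trials if t["cue"] == "B")
```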
Blog Categories:
acting/choosing,
memory/learning,
unconscious
Thursday, April 23, 2015
Mind wandering and mental autonomy.
I'm on my third reading of a dense open-access paper by Thomas Metzinger in Frontiers in Psychology titled "The myth of cognitive agency: subpersonal thinking as a cyclically recurring loss of mental autonomy." Readers interested in my Upstairs/Downstairs web lecture or the March 19 post on Metzinger might want to check it out. My headache sets in from trying to remember and keep in mind the numerous abbreviations he uses for economy in the text - AA, CA, PSM, EAM, SRB, UI, etc. - a whole glossary is provided for them. Of particular interest are his two suggested criteria, noted in the abstract just below, for "individuating single episodes of mind-wandering, namely, the “self-representational blink” (SRB) and a sudden shift in the phenomenological “unit of identification” (UI)." I pass on the abstract first, and then a few clips from the sections of the paper titled "Mind wandering as a switch in the unit of identification" and "The re-appearance of meta-awareness."
This metatheoretical paper investigates mind wandering from the perspective of philosophy of mind. It has two central claims. The first is that, on a conceptual level, mind wandering can be fruitfully described as a specific form of mental autonomy loss. The second is that, given empirical constraints, most of what we call “conscious thought” is better analyzed as a subpersonal process that more often than not lacks crucial properties traditionally taken to be the hallmark of personal-level cognition - such as mental agency, explicit, consciously experienced goal-directedness, or availability for veto control. I claim that for roughly two thirds of our conscious life-time we do not possess mental autonomy (M-autonomy) in this sense. Empirical data from research on mind wandering and nocturnal dreaming clearly show that phenomenally represented cognitive processing is mostly an automatic, non-agentive process and that personal-level cognition is an exception rather than the rule. This raises an interesting new version of the mind-body problem: How is subpersonal cognition causally related to personal-level thought? More fine-grained phenomenological descriptions for what we called “conscious thought” in the past are needed, as well as a functional decomposition of umbrella terms like “mind wandering” into different target phenomena and a better understanding of the frequent dynamic transitions between spontaneous, task-unrelated thought and meta-awareness. In an attempt to lay some very first conceptual foundations for the now burgeoning field of research on mind wandering, the third section proposes two new criteria for individuating single episodes of mind-wandering, namely, the “self-representational blink” (SRB) and a sudden shift in the phenomenological “unit of identification” (UI). I close by specifying a list of potentially innovative research goals that could serve to establish a stronger connection between mind wandering research and philosophy of mind.
And, from the text:
Mind Wandering as a Switch in the Unit of Identification
Let us look at a second phenomenological feature of mind wandering that could, if correctly described, yield a new theoretical perspective. Perhaps the most interesting phenomenological feature of mind wandering is a sudden shift in the UI (phenomenal unit of identification). The UI is the phenomenal property with which we currently identify, exactly the form of currently active conscious content that generates the subjective experience of “I am this!” Please note how many mind wandering episodes are phenomenologically disembodied states, because perceptual decoupling often also means decoupling from current body perception...
The Re-Appearance of Meta-Awareness
How exactly does an episode of mind wandering end? Schooler and colleagues, referring to work by the late Daniel Wegner, point out that regaining meta-awareness may be accompanied by an illusion of control...“I have just regained meta-awareness, because I just introspectively realized that I was lost in mind wandering!”
Because mindfulness and mind wandering are opposing constructs, the process of losing and regaining meta-awareness can be most closely studied in different stages of classical mindfulness meditation. In the early stages of object-oriented meditation, there will typically be cyclically recurring losses of mental autonomy, plus an equally recurring mental action, namely the decision to gently, but firmly bring the focus of attention back to the formal object of meditation, for example to interoceptive sensations associated with the respiratory process. Here, the phenomenology will often be one of mental agency, goal directedness, and a mild sense of effort. In advanced stages of open monitoring meditation, however, the aperture of attention has gradually widened, typically resulting in an effortless and choiceless awareness of the present moment as a whole. Such forms of stable meta-awareness may now be described as shifts to a state without a UI. Whereas in beginning stages of object-oriented mindfulness practice, the meditator identifies with an internal model of a mental agent directed at a certain goal-state (“the meditative self”), meta-awareness of the second kind is typically described as having an effortless and non-agentive quality. Interestingly, the neural correlates pertaining to this difference between “trying to meditate” and meditation are now beginning to emerge (Garrison et al., 2013, graphic from this paper is in my Upstairs/Downstairs web lecture).
Blog Categories:
attention/perception,
consciousness,
meditation
Wednesday, April 22, 2015
The evolution of gender effects on empathy.
Christov-Moore et al. offer a review making the point that differences in the capacity for empathy between males and females have deep evolutionary and developmental roots, in addition to any cultural expectations about gender roles. The review references a graphic summary of brain areas involved in experience sharing, which I also pass on.
Highlights
• Sex differences in empathy have phylogenetic and ontogenetic roots in biology.
• As primary caregivers females evolved adaptations to be sensitive to infants’ signals.
• Sex differences in empathy appear to be consistent and stable across the lifespan.
• In affective empathy, females, compared to men, show higher emotional responsivity.
• Males show greater recruitment of brain areas for the control of cognitive empathy.
Abstract
Evidence suggests that there are differences in the capacity for empathy between males and females. However, how deep do these differences go? Stereotypically, females are portrayed as more nurturing and empathetic, while males are portrayed as less emotional and more cognitive. Some authors suggest that observed gender differences might be largely due to cultural expectations about gender roles. However, empathy has both evolutionary and developmental precursors, and can be studied using implicit measures, aspects that can help elucidate the respective roles of culture and biology. This article reviews evidence from ethology, social psychology, economics, and neuroscience to show that there are fundamental differences in implicit measures of empathy, with parallels in development and evolution. Studies in nonhuman animals and younger human populations (infants/children) offer converging evidence that sex differences in empathy have phylogenetic and ontogenetic roots in biology and are not merely cultural byproducts driven by socialization. We review how these differences may have arisen in response to males’ and females’ different roles throughout evolution. Examinations of the neurobiological underpinnings of empathy reveal important quantitative gender differences in the basic networks involved in affective and cognitive forms of empathy, as well as a qualitative divergence between the sexes in how emotional information is integrated to support decision making processes. Finally, the study of gender differences in empathy can be improved by designing studies with greater statistical power and considering variables implicit in gender (e.g., sexual preference, prenatal hormone exposure). These improvements may also help uncover the nature of neurodevelopmental and psychiatric disorders in which one sex is more vulnerable to compromised social competence associated with impaired empathy.
The summary graphic by Zaki and Ochsner:
Neuroscientific approaches to studying experience sharing and mentalizing. (a) The experimental logic underlying first-person perception studies of experience sharing. The blue circle represents brain regions engaged by direct, first-person experience of an affective response, motor intention, or other internal state. The yellow circle represents regions engaged by third-person observation of someone else experiencing the same kind of internal state. To the extent that a region demonstrates neural resonance—common engagement by first- and third-person experience (green overlap)—it is described as supporting a perceiver's vicarious experience of a target's state (regions demonstrating such properties are highlighted in green in c). (b) Studies of mentalizing typically ask participants to make judgments about targets’ beliefs, thoughts, intentions and/or feelings, as depicted in highly stylized social cues, including vignettes (top left), posed facial expressions (right), or even more isolated nonverbal cues, such as target eye gaze (bottom left). Regions engaged by such tasks (blue in c) are described as contributing to perceivers’ ability to mentalize. (c) Brain regions associated with experience sharing (green) and mentalizing (blue). IPL, inferior parietal lobule; TPJ, temporoparietal junction; pSTS, posterior superior temporal sulcus; TP, temporal pole; AI, anterior insula; PMC, premotor cortex; PCC, posterior cingulate cortex; ACC, anterior cingulate cortex; MPFC, medial prefrontal cortex.
Blog Categories:
culture/politics,
mirror neurons,
social cognition
Tuesday, April 21, 2015
Observing leadership emergence through interpersonal brain synchronization.
Interesting work from Jiang et al., who show that interpersonal neural synchronization is significantly higher between leaders and followers than between followers and followers, suggesting that leaders emerge by synchronizing their brain activity with that of their followers:
The neural mechanism of leader emergence is not well understood. This study investigated (i) whether interpersonal neural synchronization (INS) plays an important role in leader emergence, and (ii) whether INS and leader emergence are associated with the frequency or the quality of communications. Eleven three-member groups were asked to perform a leaderless group discussion (LGD) task, and their brain activities were recorded via functional near infrared spectroscopy (fNIRS)-based hyperscanning. Video recordings of the discussions were coded for leadership and communication. Results showed that the INS for the leader–follower (LF) pairs was higher than that for the follower–follower (FF) pairs in the left temporo-parietal junction (TPJ), an area important for social mentalizing. Although communication frequency was higher for the LF pairs than for the FF pairs, the frequency of leader-initiated and follower-initiated communication did not differ significantly. Moreover, INS for the LF pairs was significantly higher during leader-initiated communication than during follower-initiated communications. In addition, INS for the LF pairs during leader-initiated communication was significantly correlated with the leaders’ communication skills and competence, but not their communication frequency. Finally, leadership could be successfully predicted based on INS as well as communication frequency early during the LGD (before half a minute into the task). In sum, this study found that leader emergence was characterized by high-level neural synchronization between the leader and followers and that the quality, rather than the frequency, of communications was associated with synchronization. These results suggest that leaders emerge because they are able to say the right things at the right time.
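The study derives INS from fNIRS hyperscanning signals; the published coherence-based measure is beyond a blog post, but a crude stand-in for the idea, a sliding-window correlation between two people's channel time series, is sketched below with synthetic data. This is an illustration of the intuition only, not the authors' analysis.

```python
# Crude stand-in for interpersonal neural synchronization (INS): sliding-window
# correlation between two simulated fNIRS channel time series. The published
# analysis uses a coherence-based measure; this only conveys the intuition.
import numpy as np

def sliding_window_corr(a, b, window=50, step=10):
    """Pearson correlation of a and b in overlapping windows."""
    out = []
    for start in range(0, len(a) - window + 1, step):
        seg_a, seg_b = a[start:start + window], b[start:start + window]
        out.append(np.corrcoef(seg_a, seg_b)[0, 1])
    return np.array(out)

rng = np.random.default_rng(42)
shared = rng.normal(size=2000)                           # hypothetical shared signal
leader = shared + rng.normal(scale=1.0, size=2000)
follower = shared + rng.normal(scale=1.0, size=2000)     # partly synchronized pair
stranger = rng.normal(size=2000)                          # unsynchronized control

print("leader-follower mean r:", sliding_window_corr(leader, follower).mean().round(2))
print("leader-stranger mean r:", sliding_window_corr(leader, stranger).mean().round(2))
```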
Blog Categories:
culture/politics,
faces,
social cognition,
vision
Monday, April 20, 2015
Glycogen recovery after exercise: junk food as good as sport supplements
I note this article because one of its authors, Chuck Dumke, now at the University of Montana, worked in my vision research laboratory at the University of Wisconsin in the 1990s, where he also studied kinesiology and sports performance. I'm passing this on to several friends who buy expensive post-exercise food supplements.
A variety of dietary choices are marketed to enhance glycogen recovery after physical activity. Past research informs recommendations regarding the timing, dose, and nutrient compositions to facilitate glycogen recovery. This study examined the effects of isoenergetic sport supplements (SS) vs. fast food (FF) on glycogen recovery and exercise performance. Eleven males completed two experimental trials in a randomized, counterbalanced order. Each trial included a 90-minute glycogen depletion ride followed by a 4-hour recovery period. Absolute amounts of macronutrients (1.54 ± 0.27 g·kg-1 carbohydrate, 0.24 ± 0.04 g·kg-1 fat, and 0.18 ± 0.03 g·kg-1 protein) as either SS or FF were provided at 0 and 2 hours. Muscle biopsies were collected from the vastus lateralis at 0 and 4 hours post exercise. Blood samples were analyzed at 0, 30, 60, 120, 150, 180, and 240 minutes post exercise for insulin and glucose, with blood lipids analyzed at 0 and 240 minutes. A 20k time-trial (TT) was completed following the final muscle biopsy. There were no differences in the blood glucose and insulin responses. Similarly, rates of glycogen recovery were not different across the diets (6.9 ± 1.7 and 7.9 ± 2.4 mmol·kg wet weight-1·hr-1 for SS and FF, respectively). There was also no difference across the diets for TT performance (34.1 ± 1.8 and 34.3 ± 1.7 minutes for SS and FF, respectively). These data indicate that short-term food options to initiate glycogen resynthesis can include dietary options not typically marketed as sports nutrition products, such as fast food menu items.
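To make the per-kilogram feeding doses concrete, here is the arithmetic for a hypothetical 70 kg rider (my example; the body mass is not from the paper), with each dose given at 0 and 2 hours of recovery.

```python
# Worked arithmetic for the per-feeding doses reported above, applied to a
# hypothetical 70 kg rider (illustrative; the body mass is not from the paper).
body_mass_kg = 70
doses_g_per_kg = {"carbohydrate": 1.54, "fat": 0.24, "protein": 0.18}

for nutrient, dose in doses_g_per_kg.items():
    grams = dose * body_mass_kg
    print(f"{nutrient}: {grams:.0f} g per feeding (at 0 h and 2 h)")
# carbohydrate: ~108 g, fat: ~17 g, protein: ~13 g per feeding
```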
Friday, April 17, 2015
Your friends know how long you will live.
An interesting study from Jackson et al., analyzing data from an East Coast cohort of 600 people followed from the 1930s through 2013:
Although self-rated personality traits predict mortality risk, no study has examined whether one’s friends can perceive personality characteristics that predict one’s mortality risk. Moreover, it is unclear whether observers’ reports (compared with self-reports) provide better or unique information concerning the personal characteristics that result in longer and healthier lives. To test whether friends’ reports of personality predict mortality risk, we used data from a 75-year longitudinal study (the Kelly/Connolly Longitudinal Study on Personality and Aging). In that study, 600 participants were observed beginning in 1935 through 1938, when they were in their mid-20s, and continuing through 2013. Male participants seen by their friends as more conscientious and open lived longer, whereas friend-rated emotional stability and agreeableness were protective for women. Friends’ ratings were better predictors of longevity than were self-reports of personality, in part because friends’ ratings could be aggregated to provide a more reliable assessment. Our findings demonstrate the utility of observers’ reports in the study of health and provide insights concerning the pathways by which personality traits influence health.
Some details from the text of the article:
Between 1935 and 1938, 600 individuals (300 engaged heterosexual couples) began participating in the KCLS, a longitudinal study on personality and newly formed marriages. Participants were recruited through newspaper advertisements, other advertisements, and word of mouth in the state of Connecticut. The participants were primarily from middle-class backgrounds...Peer ratings were obtained from people that participants identified as knowing them well enough to provide accurate ratings; most of these friends had been in the participants’ wedding parties. Each participant named three to eight friends, and the majority of participants were rated by five friends...Self-ratings and peer ratings of personality were obtained using the 36-item Kelly Personality Rating Scale...we conducted a study to validate the PRS using more modern personality measures: the Big Five Inventory, the Iowa Personality Questionnaire, and the Mini International Personality Item Pool...The resulting five-factor solution reflected the Big Five factor structure. Extraversion was assessed with five items (e.g., quiet, popular), agreeableness with six items (e.g., courteous, sincere), conscientiousness with five items (e.g., persistent, reliable), emotional stability with four items (e.g., nervous, temperamental), and openness with four items (e.g., cultured, intelligent). Analyses indicated that the model adequately captured variation in modern Big Five composite scores...The average life span for men was 75.2 years (range = 23–98 years, SD = 15.5). The average life span for women was 81.3 years (range = 23–102 years, SD = 13.4). The 21 surviving participants had an average age of 97.2 years (SD = 2.1) in 2013.