Wednesday, September 14, 2016

A psychological mechanism to explain why childhood adversity diminishes adult health?

A large number of studies have by now shown that harsh social and physical environments early in life are associated with a substantial increase in the risk of chronic illnesses, such as heart disease, diabetes, and some forms of cancer. It is generally assumed that the hypothalamic-pituitary-adrenal (HPA) axis is an essential biological intermediary of these poor health outcomes in adulthood. Zilioli et al. suggest that a lowered sense of self-worth is the psychological mechanism that persists into adulthood to alter stress physiology. Their abstract:
Childhood adversity is associated with poor health outcomes in adulthood; the hypothalamic-pituitary-adrenal (HPA) axis has been proposed as a crucial biological intermediary of these long-term effects. Here, we tested whether childhood adversity was associated with diurnal cortisol parameters and whether this link was partially explained by self-esteem. In both adults and youths, childhood adversity was associated with lower levels of cortisol at awakening, and this association was partially driven by low self-esteem. Further, we found a significant indirect pathway through which greater adversity during childhood was linked to a flatter cortisol slope via self-esteem. Finally, youths who had a caregiver with high self-esteem experienced a steeper decline in cortisol throughout the day compared with youths whose caregiver reported low self-esteem. We conclude that self-esteem is a plausible psychological mechanism through which childhood adversity may get embedded in the activity of the HPA axis across the life span.
And, a clip from their discussion, noting limits to the interpretation of the correlations they observe:
These findings suggest that one’s sense of self-worth might act as a proximal psychological mechanism through which childhood adversity gets embedded in human stress physiology. Specifically, higher self-esteem was associated with a steeper (i.e., healthier) cortisol decline during the day, whereas low self-esteem was associated with a flatter cortisol slope. Depression and neuroticism were tested as alternative pathways linking childhood adversity to cortisol secretion and were found not to be significant, which suggests that the indirect effect was specific to self-esteem. Nevertheless, it is plausible that other psychological pathways exist that might carry the effects of childhood adversity across the life span. For example, attachment security, a potential antecedent of self-esteem that forms during childhood, would be a strong candidate for playing such a role. Unfortunately, this construct was not assessed in our studies, but we hope that future work will test this hypothesis.
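The "diurnal cortisol slope" in these findings is, in essence, the fitted trend of cortisol across the waking day: a more negative slope means a steeper (healthier) decline. As a purely illustrative sketch - not the authors' analysis - with hypothetical sample times and values:

```python
# Illustrative only: estimating a diurnal cortisol slope by ordinary
# least squares. Sample times and values are hypothetical, not study data.

def cortisol_slope(hours_since_waking, cortisol_nmol_l):
    """Return (intercept, slope) of cortisol regressed on time since waking."""
    n = len(hours_since_waking)
    mean_t = sum(hours_since_waking) / n
    mean_c = sum(cortisol_nmol_l) / n
    cov = sum((t - mean_t) * (c - mean_c)
              for t, c in zip(hours_since_waking, cortisol_nmol_l))
    var = sum((t - mean_t) ** 2 for t in hours_since_waking)
    slope = cov / var
    return mean_c - slope * mean_t, slope

# A typical healthy profile: high at awakening, declining across the day.
times = [0, 0.5, 4, 9, 14]            # hours since waking
levels = [15.0, 18.0, 8.0, 5.0, 3.0]  # nmol/L (hypothetical values)
intercept, slope = cortisol_slope(times, levels)
print(round(slope, 2))  # negative: cortisol falls over the day
```

A "flatter slope" in the paper's sense is simply this coefficient being closer to zero.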

Tuesday, September 13, 2016

The ecstasy of speed - or leisure?

Because I so frequently feel overwhelmed by input streams of chunks of information, I wonder how readers of this blog manage to find time to attend to its contents. (I am gratified that so many seem to do so.) Thoughts like this made me pause over Maria Popova's recent essay on our anxiety about time. I want to pass on a few clips, and recommend that you read all of it. She quotes extensively from James Gleick's 2000 book "Faster: The Acceleration of Just About Everything," and begins by noting a 1918 Bertrand Russell quote: “both in thought and in feeling, even though time be real, to realise the unimportance of time is the gate of wisdom.”
Half a century after German philosopher Josef Pieper argued that leisure is the basis of culture and the root of human dignity, Gleick writes:
We are in a rush. We are making haste. A compression of time characterizes the life of the century....We have a word for free time: leisure. Leisure is time off the books, off the job, off the clock. If we save time, we commonly believe we are saving it for our leisure. We know that leisure is really a state of mind, but no dictionary can define it without reference to passing time. It is unrestricted time, unemployed time, unoccupied time. Or is it? Unoccupied time is vanishing. The leisure industries (an oxymoron maybe, but no contradiction) fill time, as groundwater fills a sinkhole. The very variety of experience attacks our leisure as it attempts to satiate us. We work for our amusement...Sociologists in several countries have found that increasing wealth and increasing education bring a sense of tension about time. We believe that we possess too little of it: that is a myth we now live by.
To fully appreciate Gleick’s insightful prescience, it behooves us to remember that he is writing long before the social web as we know it, before the conspicuous consumption of “content” became the currency of the BuzzMalnourishment industrial complex, before the timelines of Twitter and Facebook came to dominate our record and experience of time. (Prescience, of course, is a form of time travel — perhaps our only nonfictional way to voyage into the future.) Gleick writes:
We live in the buzz. We wish to live intensely, and we wonder about the consequences — whether, perhaps, we face the biological dilemma of the waterflea, whose heart beats faster as the temperature rises. This creature lives almost four months at 46 degrees Fahrenheit but less than one month at 82 degrees...Yet we have made our choices and are still making them. We humans have chosen speed and we thrive on it — more than we generally admit. Our ability to work fast and play fast gives us power. It thrills us… No wonder we call sudden exhilaration a rush.
Gleick considers what our units of time reveal about our units of thought:
We have reached the epoch of the nanosecond. This is the heyday of speed. “Speed is the form of ecstasy the technical revolution has bestowed on man,” laments the Czech novelist Milan Kundera, suggesting by ecstasy a state of simultaneous freedom and imprisonment… That is our condition, a culmination of millennia of evolution in human societies, technologies, and habits of mind.
The more I experience and read about the winding up and acceleration of our lives (think of the rate and omnipresence of the current presidential campaign!), the more I realize the importance of rediscovering the sanity of leisure and quiet spaces.

Monday, September 12, 2016

Mind and Body - A neural substrate of psychosomatic illness

We all have our "hot buttons" - events or issues that can trigger an acute stress response as our adrenal medulla releases adrenaline, causing heart rate increases, sweating, pupil dilation, etc. Dum et al. use a clever tracer technique to show neural connections between the adrenal medulla and higher cortical centers that might exert a 'top-down' cognitive control of this arousal:

Significance
How does the “mind” (brain) influence the “body” (internal organs)? We identified key areas in the primate cerebral cortex that are linked through multisynaptic connections to the adrenal medulla. The most substantial influence originates from a broad network of motor areas that are involved in all aspects of skeletomotor control from response selection to motor preparation and movement execution. A smaller influence originates from a network in medial prefrontal cortex that is involved in the regulation of cognition and emotion. Thus, cortical areas involved in the control of movement, cognition, and affect are potential sources of central commands to influence sympathetic arousal. These results provide an anatomical basis for psychosomatic illness where mental states can alter organ function.
Abstract
Modern medicine has generally viewed the concept of “psychosomatic” disease with suspicion. This view arose partly because no neural networks were known for the mind, conceptually associated with the cerebral cortex, to influence autonomic and endocrine systems that control internal organs. Here, we used transneuronal transport of rabies virus to identify the areas of the primate cerebral cortex that communicate through multisynaptic connections with a major sympathetic effector, the adrenal medulla. We demonstrate that two broad networks in the cerebral cortex have access to the adrenal medulla. The larger network includes all of the cortical motor areas in the frontal lobe and portions of somatosensory cortex. A major component of this network originates from the supplementary motor area and the cingulate motor areas on the medial wall of the hemisphere. These cortical areas are involved in all aspects of skeletomotor control from response selection to motor preparation and movement execution. The second, smaller network originates in regions of medial prefrontal cortex, including a major contribution from pregenual and subgenual regions of anterior cingulate cortex. These cortical areas are involved in higher-order aspects of cognition and affect. These results indicate that specific multisynaptic circuits exist to link movement, cognition, and affect to the function of the adrenal medulla. This circuitry may mediate the effects of internal states like chronic stress and depression on organ function and, thus, provide a concrete neural substrate for some psychosomatic illness.

Friday, September 09, 2016

Want to predict a group’s social standing? Get a hormonal profile.

Analyses of a group's social standing usually focus on demographic or psychological characteristics of its members. Akinola et al. suggest that the collective hormonal profile of the group can be equally predictive, providing a neurobiological perspective on the factors that determine who rises to the top across, not just within, social hierarchies:

Significance
Past research has focused primarily on demographic and psychological characteristics of group members without taking into consideration the biological make-up of groups. Here we introduce a different construct—a group’s collective hormonal profile—and find that a group’s biological profile predicts its standing across groups and that the particular profile supports a dual-hormone hypothesis. Groups with a collective hormonal profile characterized by high testosterone and low cortisol exhibit the highest performance. The current work provides a neurobiological perspective on factors determining group behavior and performance that are ripe for further exploration.
Abstract
Prior research has shown that an individual’s hormonal profile can influence the individual’s social standing within a group. We introduce a different construct—a collective hormonal profile—which describes a group’s hormonal make-up. We test whether a group’s collective hormonal profile is related to its performance. Analysis of 370 individuals randomly assigned to work in 74 groups of three to six individuals revealed that group-level concentrations of testosterone and cortisol interact to predict a group’s standing across groups. Groups with a collective hormonal profile characterized by high testosterone and low cortisol exhibited the highest performance. These collective hormonal level results remained reliable when controlling for personality traits and group-level variability in hormones. These findings support the hypothesis that groups with a biological propensity toward status pursuit (high testosterone) coupled with reduced stress-axis activity (low cortisol) engage in profit-maximizing decision-making. The current work extends the dual-hormone hypothesis to the collective level and provides a neurobiological perspective on the factors that determine who rises to the top across, not just within, social hierarchies.
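As a rough sketch of the kind of analysis a "dual-hormone" interaction implies - regressing group performance on group-mean testosterone, cortisol, and their product - one could write the following. The data are simulated and the variable names hypothetical; this is not the study's pipeline:

```python
# Illustrative dual-hormone interaction test: regress group performance on
# group-mean testosterone, cortisol, and their product. Simulated data only.
import numpy as np

rng = np.random.default_rng(0)
n_groups = 74
testo = rng.standard_normal(n_groups)  # standardized group-mean testosterone
cort = rng.standard_normal(n_groups)   # standardized group-mean cortisol

# Simulate the hypothesized pattern: high T combined with low C -> better performance.
perf = (0.5 * testo - 0.5 * cort - 0.4 * testo * cort
        + 0.3 * rng.standard_normal(n_groups))

X = np.column_stack([np.ones(n_groups), testo, cort, testo * cort])
beta, *_ = np.linalg.lstsq(X, perf, rcond=None)
print(beta.round(2))  # [intercept, b_T, b_C, b_TxC]; a negative b_TxC
                      # is the signature of the dual-hormone pattern
```

In the real study the authors additionally controlled for personality traits and within-group hormone variability; the interaction term above is just the core idea.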

Thursday, September 08, 2016

Reason is not required for a life of meaning.

Robert Burton, former neurology chief at UCSF and a neuroscience author, has contributed an excellent short essay to the NYTimes philosophy series The Stone. A few clips:
Few would disagree with two age-old truisms: We should strive to shape our lives with reason, and a central prerequisite for the good life is a personal sense of meaning...Any philosophical approach to values and purpose must acknowledge this fundamental neurological reality: a visceral sense of meaning in one’s life is an involuntary mental state that, like joy or disgust, is independent from and resistant to the best of arguments...Anyone who has experienced a bout of spontaneous depression knows the despair of feeling that nothing in life is worth pursuing and that no argument, no matter how inspired, can fill the void. Similarly, we are all familiar with the countless narratives of religious figures “losing their way” despite retaining their formal beliefs.
As neuroscience attempts to pound away at the idea of pure rationality and underscore the primacy of subliminal mental activity, I am increasingly drawn to the metaphor of idiosyncratic mental taste buds. From genetic factors (a single gene determines whether we find brussels sprouts bitter or sweet), to the cultural — considering fried grasshoppers and grilled monkey brains as delicacies — taste isn’t a matter of the best set of arguments...If thoughts, like foods, come in a dazzling variety of flavors, and personal taste trumps reason, philosophy — which relies most heavily on reason, and aims to foster the acquisition of objective knowledge — is in a bind.
Though we don’t know how thoughts are produced by the brain, it is hard to imagine having a thought unaccompanied by some associated mental state. We experience a thought as pleasing, revolting, correct, incorrect, obvious, stupid, brilliant, etc. Though integral to our thoughts, these qualifiers arise out of different brain mechanisms from those that produce the raw thought. As examples, feelings of disgust, empathy and knowing arise from different areas of brain and can be provoked de novo in volunteer subjects via electrical stimulation even when the subjects are unaware of having any concomitant thought at all. This chicken-and-egg relationship between feelings and thought can readily be seen in how we make moral judgments...The psychologist Jonathan Haidt and others have shown that our moral stances strongly correlate with the degree of activation of those brain areas that generate a sense of disgust and revulsion. According to Haidt, reason provides an after-the-fact explanation for moral decisions that are preceded by inherently reflexive positive or negative feelings. Think about your stance on pedophilia or denying a kidney transplant to a serial killer.
After noting work of Libet and others showing that our sense of agency is an illusion - the conscious feeling of initiating an action occurs well after our brains have already started that action, as is especially evident in tennis players and baseball batters - Burton suggests that:
It is unlikely that there is any fundamental difference in how the brain initiates thought and action. We learn the process of thinking incrementally, acquiring knowledge of language, logic, the external world and cultural norms and expectations just as we learn physical actions like talking, walking or playing the piano. If we conceptualize thought as a mental motor skill subject to the same temporal reorganization as high-speed sports, it’s hard to avoid the conclusion that the experience of free will (agency) and conscious rational deliberation are both biologically generated illusions.
What then are we to do with the concept of rationality? It would be a shame to get rid of a term useful in characterizing the clarity of a line of reasoning. Everyone understands that “being rational” implies trying to strip away biases and innate subjectivity in order to make the best possible decision. But what if the word rational leads us to scientifically unsound conclusions?
Going forward, the greatest challenge for philosophy will be to remain relevant while conceding that, like the rest of the animal kingdom, we are decision-making organisms rather than rational agents, and that our most logical conclusions about moral and ethical values can’t be scientifically verified nor guaranteed to pass the test of time. (The history of science should serve as a cautionary tale for anyone tempted to believe in the persistent truth of untestable ideas).
Even so, I would hate to discard such truisms such as “know thyself” or “the unexamined life isn’t worth living.” Reason allows us new ways of seeing, just as close listening to a piece of music can reveal previously unheard melodies and rhythms or observing an ant hill can give us an unexpected appreciation of nature’s harmonies. These various forms of inquiry aren’t dependent upon logic and verification; they are modes of perception.

Wednesday, September 07, 2016

Brain network characteristics of highly intelligent people.

Schultz and Cole show that higher intelligence is associated with less task-related brain network reconfiguration:

SIGNIFICANCE STATEMENT
The brain's network configuration varies based on current task demands. For example, functional brain connections are organized in one way when one is resting quietly but in another way if one is asked to make a decision. We found that the efficiency of these updates in brain network organization is positively related to general intelligence, the ability to perform a wide variety of cognitively challenging tasks well. Specifically, we found that brain network configuration at rest was already closer to a wide variety of task configurations in intelligent individuals. This suggests that the ability to modify network connectivity efficiently when task demands change is a hallmark of high intelligence.
ABSTRACT
The human brain is able to exceed modern computers on multiple computational demands (e.g., language, planning) using a small fraction of the energy. The mystery of how the brain can be so efficient is compounded by recent evidence that all brain regions are constantly active as they interact in so-called resting-state networks (RSNs). To investigate the brain's ability to process complex cognitive demands efficiently, we compared functional connectivity (FC) during rest and multiple highly distinct tasks. We found previously that RSNs are present during a wide variety of tasks and that tasks only minimally modify FC patterns throughout the brain. Here, we tested the hypothesis that, although subtle, these task-evoked FC updates from rest nonetheless contribute strongly to behavioral performance. One might expect that larger changes in FC reflect optimization of networks for the task at hand, improving behavioral performance. Alternatively, smaller changes in FC could reflect optimization for efficient (i.e., small) network updates, reducing processing demands to improve behavioral performance. We found across three task domains that high-performing individuals exhibited more efficient brain connectivity updates in the form of smaller changes in functional network architecture between rest and task. These smaller changes suggest that individuals with an optimized intrinsic network configuration for domain-general task performance experience more efficient network updates generally. Confirming this, network update efficiency correlated with general intelligence. The brain's reconfiguration efficiency therefore appears to be a key feature contributing to both its network dynamics and general cognitive ability.
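The "smaller changes in functional network architecture" can be made concrete: reconfiguration is often quantified as the dissimilarity between rest and task functional-connectivity (FC) matrices. A minimal sketch of that idea, using random stand-in matrices rather than real fMRI data (not the authors' actual pipeline):

```python
# Illustrative sketch: task-evoked network reconfiguration measured as
# dissimilarity between rest and task FC matrices. Smaller values mean a
# more "efficient" (smaller) update in the paper's sense.
import numpy as np

def fc_reconfiguration(fc_rest, fc_task):
    """1 - Pearson r between the upper triangles of two FC matrices."""
    iu = np.triu_indices_from(fc_rest, k=1)
    r = np.corrcoef(fc_rest[iu], fc_task[iu])[0, 1]
    return 1.0 - r

rng = np.random.default_rng(1)
rest = rng.standard_normal((10, 10))
rest = (rest + rest.T) / 2                           # symmetric stand-in FC
small_update = rest + 0.1 * rng.standard_normal((10, 10))
large_update = rest + 1.0 * rng.standard_normal((10, 10))

# A subtle task-induced change yields a smaller reconfiguration score.
print(fc_reconfiguration(rest, small_update) <
      fc_reconfiguration(rest, large_update))
```

The paper's claim is that this kind of rest-to-task distance is smaller in high-performing, more intelligent individuals.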

Tuesday, September 06, 2016

Feeling Good? Do something unpleasant.

A curious piece from Taquet et al.:
Most theories of motivation have highlighted that human behavior is guided by the hedonic principle, according to which our choices of daily activities aim to minimize negative affect and maximize positive affect. However, it is not clear how to reconcile this idea with the fact that people routinely engage in unpleasant yet necessary activities. To address this issue, we monitored in real time the activities and moods of over 28,000 people across an average of 27 d using a multiplatform smartphone application. We found that people’s choices of activities followed a hedonic flexibility principle. Specifically, people were more likely to engage in mood-increasing activities (e.g., play sports) when they felt bad, and to engage in useful but mood-decreasing activities (e.g., housework) when they felt good. These findings clarify how hedonic considerations shape human behavior. They may explain how humans overcome the allure of short-term gains in happiness to maximize long-term welfare.

Monday, September 05, 2016

Do your friends really like you?

I found this article by Murphy, pointing to work by Almaatouq et al., to align with my recent experience of having two long-term friends (or so I thought) simply stop responding to emails about getting together - and, from the other direction, of being described as "our good friend" by a couple I didn't particularly like. It turns out that studies show only about half of perceived friendships are mutual. The Almaatouq et al. study:
...analyzed friendship ties among 84 subjects (ages 23 to 38) in a business management class by asking them to rank one another on a five-point continuum of closeness from “I don’t know this person” to “One of my best friends.” The feelings were mutual 53 percent of the time while the expectation of reciprocity was pegged at 94 percent. This is consistent with data from several other friendship studies conducted over the past decade, encompassing more than 92,000 subjects, in which the reciprocity rates ranged from 34 percent to 53 percent.
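The reciprocity rate quoted above is straightforward to compute from directed friendship nominations. A minimal sketch with a made-up three-person network (not the study's data):

```python
# Illustrative sketch: reciprocity rate of directed friendship nominations.
# The tiny network below is invented for demonstration.

def reciprocity_rate(nominations):
    """Fraction of directed friendship ties that are returned."""
    ties = [(a, b) for a, friends in nominations.items() for b in friends]
    mutual = sum(1 for a, b in ties if a in nominations.get(b, set()))
    return mutual / len(ties)

# A -> B and B -> A are mutual; A -> C is not returned.
noms = {"A": {"B", "C"}, "B": {"A"}, "C": set()}
print(reciprocity_rate(noms))  # 2 of the 3 directed ties are reciprocated
```

The studies above applied essentially this count to nomination data from tens of thousands of subjects, finding rates between 34 and 53 percent.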
Clips from the last portion of Murphy's article:
Because time is limited, so, too, is the number of friends you can have, according to the work of the British evolutionary psychologist Robin I.M. Dunbar. He describes layers of friendship, where the topmost layer consists of only one or two people, say a spouse and best friend with whom you are most intimate and interact daily. The next layer can accommodate at most four people for whom you have great affinity, affection and concern and who require weekly attention to maintain. Out from there, the tiers contain more casual friends with whom you invest less time and tend to have a less profound and more tenuous connection. Without consistent contact, they easily fall into the realm of acquaintance.
...playing it safe by engaging in shallow, unfulfilling or nonreciprocal relationships has physical repercussions. Not only do the resulting feelings of loneliness and isolation increase the risk of death as much as smoking, alcoholism and obesity; you may also lose tone, or function, in the so-called smart vagus nerve, which brain researchers think allows us to be in intimate, supportive and reciprocal relationships in the first place...In the presence of a true friend...the smart or modulating aspect of the vagus nerve is what makes us feel at ease rather than on guard as when we are with a stranger or someone judgmental. It’s what enables us to feel O.K. about exposing the soft underbelly of our psyche and helps us stay engaged and present in times of conflict. Lacking authentic friendships, the smart vagus nerve is not exercised. It loses tone and one’s anxiety remains high, making abiding, deep connections difficult.

Friday, September 02, 2016

Growing Older, Getting Happier

A brief piece from Nicholas Bakalar in the NYTimes summarizing the recent paper by Thomas et al. (senior author Dilip Jeste):
Older people tend to be happier than younger people, and their happiness increases with age...Researchers contacted 1,546 people ages 21 to 99 via random telephone calls and found that older age was, not surprisingly, tied to declines in physical and cognitive function. But it was also associated with higher levels of overall satisfaction, happiness and well-being, and lower levels of anxiety, depression and stress. The older the person, the study found, the better his or her mental health tended to be.
The researchers used well-validated scales to assess mental health, although the study relied on self-reports and was a snapshot in time that did not follow an individual through a lifetime. Other studies have found similar results linking advancing age and higher levels of happiness.
The reasons for the effect remain unclear, but the senior author, Dr. Dilip V. Jeste, a professor of psychiatry at the University of California, San Diego, had some suggestions...“Brain studies show that the amygdala in older people responds less to stressful or negative images than in a younger person,” he said. “We become wise. Peer pressure loses its sting. Better decision-making, more control of emotions, doing things that are not just for yourself, knowing oneself better, being more studious and yet more decisive...“This is good news for young people, too,” he added. “You have something to look forward to.”
Here are the methods and results sections from the abstract:
Methods: Cross-sectional data were obtained from 1,546 individuals aged 21–100 years, selected using random digit dialing for the Successful AGing Evaluation (SAGE) study, a structured multicohort investigation that included telephone interviews and in-home surveys of community-based adults without dementia. Data were collected from 1/26/2010 to 10/07/2011 targeting participants aged 50–100 years and from 6/25/2012 to 7/15/2013 targeting participants aged 21–100 years with an emphasis on adding younger individuals. Data included self-report measures of physical health, measures of both positive and negative attributes of mental health, and a phone interview–based measure of cognition.
Results: Comparison of age cohorts using polynomial regression suggested a possible accelerated deterioration in physical and cognitive functioning, averaging 1.5 to 2 standard deviations over the adult lifespan. In contrast, there appeared to be a linear improvement of about 1 standard deviation in various attributes of mental health over the same life period.
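The contrast the authors describe - an accelerating (polynomial) decline in function versus a roughly linear improvement in well-being - can be sketched with simulated data. Nothing here reflects their actual model beyond that polynomial-versus-linear distinction:

```python
# Illustrative sketch: fit a quadratic trend to (simulated) physical/cognitive
# function and a linear trend to (simulated) well-being across age cohorts.
import numpy as np

rng = np.random.default_rng(2)
ages = np.linspace(21, 100, 80)

# Simulated pattern: accelerating decline in function, steady rise in well-being.
function = -0.0004 * (ages - 21) ** 2 + 0.05 * rng.standard_normal(80)
wellbeing = 0.01 * (ages - 21) + 0.05 * rng.standard_normal(80)

quad = np.polyfit(ages, function, 2)  # leading coefficient captures acceleration
lin = np.polyfit(ages, wellbeing, 1)  # slope captures steady improvement
print(quad[0] < 0, lin[0] > 0)        # declining function, improving well-being
```

A negative quadratic term is what "accelerated deterioration" looks like in a cross-sectional polynomial fit; the positive linear slope corresponds to the steady gain in mental health.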

Wednesday, August 31, 2016

Climate disasters act as threat multipliers in ethnic conflicts.

Schleussner et al. offer evidence for a common assumption about the effects of climate disasters: that they drive people further apart rather than closer together:
Social and political tensions keep on fueling armed conflicts around the world. Although each conflict is the result of an individual context-specific mixture of interconnected factors, ethnicity appears to play a prominent and almost ubiquitous role in many of them. This overall state of affairs is likely to be exacerbated by anthropogenic climate change and in particular climate-related natural disasters. Ethnic divides might serve as predetermined conflict lines in case of rapidly emerging societal tensions arising from disruptive events like natural disasters. Here, we hypothesize that climate-related disaster occurrence enhances armed-conflict outbreak risk in ethnically fractionalized countries. Using event coincidence analysis, we test this hypothesis based on data on armed-conflict outbreaks and climate-related natural disasters for the period 1980–2010. Globally, we find a coincidence rate of 9% regarding armed-conflict outbreak and disaster occurrence such as heat waves or droughts. Our analysis also reveals that, during the period in question, about 23% of conflict outbreaks in ethnically highly fractionalized countries robustly coincide with climatic calamities. Although we do not report evidence that climate-related disasters act as direct triggers of armed conflicts, the disruptive nature of these events seems to play out in ethnically fractionalized societies in a particularly tragic way. This observation has important implications for future security policies as several of the world’s most conflict-prone regions, including North and Central Africa as well as Central Asia, are both exceptionally vulnerable to anthropogenic climate change and characterized by deep ethnic divides.
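Event coincidence analysis, at its simplest, asks what fraction of conflict outbreaks fall within a tolerance window after a disaster. A stripped-down sketch with made-up event times - the published method is considerably more elaborate, e.g. testing coincidence rates against surrogate event series:

```python
# Illustrative, simplified sketch of event coincidence analysis: the fraction
# of conflict outbreaks preceded by a disaster within a tolerance window.
# Event times below are invented, not from the study's 1980-2010 data.

def coincidence_rate(disaster_months, conflict_months, window=3):
    """Fraction of conflicts with a disaster in the preceding `window` months."""
    hits = sum(
        1 for c in conflict_months
        if any(0 <= c - d <= window for d in disaster_months)
    )
    return hits / len(conflict_months)

disasters = [2, 15, 40]      # months when climate disasters struck (made up)
conflicts = [4, 20, 41, 60]  # months when conflicts broke out (made up)
print(coincidence_rate(disasters, conflicts))  # 2 of 4 conflicts coincide
```

The study's 9% global figure (23% in highly fractionalized countries) is a rate of exactly this kind, computed over real disaster and armed-conflict records.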

Tuesday, August 30, 2016

Our self and our temporo-parietal junction

Eddy reviews the temporo-parietal junction (TPJ), a brain area that appears to be central to our sense of self and other:

Highlights
•Existing literature places the TPJ at the interface between mind and matter. 
•The right TPJ is critical for the control of self and other representations. 
•Dysfunction of right TPJ may therefore compromise our sense of self. 
•Disintegration of the self may in turn underpin various neuropsychiatric symptoms.
Abstract
The temporo-parietal junction (TPJ) is implicated in a variety of processes including multisensory integration, social cognition, sense of agency and stimulus-driven attention functions. Furthermore, manipulation of cortical excitation in this region can influence a diverse range of personal and interpersonal perceptions, from those involved in moral decision making to judgments about the location of the self in space. Synthesis of existing studies places the TPJ at the neural interface between mind and matter, where information about both mental and physical states is processed and integrated, contributing to self-other differentiation. After first summarising the functions of the TPJ according to existing literature, this narrative review aims to offer insight into the potential role of TPJ dysfunction in neuropsychiatric disorders, with a focus on the involvement of the right TPJ in controlling representations relating to the self and other. Problems with self-other distinctions may reflect or pose a vulnerability to the symptoms associated with Tourette syndrome, Schizophrenia, Autistic Spectrum Disorder and Obsessive Compulsive Disorder. Further study of this most fascinating neural region will therefore make a substantial contribution to our understanding of neuropsychiatric symptomatology and highlight significant opportunities for therapeutic impact.

Anatomical and functional subdivisions of the temporo-parietal junction. Top row: Functional MRI meta-analysis data...Showing forward inference data identified using the terms ‘social’ in red, and ‘attention’ in green, with overlap in yellow. Bottom row: Standard anatomical maps using Automated Anatomical Labelling. Showing right inferior parietal lobe (cyan), supramarginal gyrus (green), angular gyrus (deep blue), superior temporal gyrus (yellow) and middle temporal gyrus (red).

Monday, August 29, 2016

Psychological disruptions of our online lives.

I want to pass on clips from a review by Steiner-Adair in the Washington Post, describing Mary Aiken's book "The Cyber Effect," that describes how cyberspace is changing the way we think, feel, and behave:
She uses the science of human behavior to define cyberspace as a unique environment — an actual space — not simply a virtual extension of the pre-digital world and our characteristic behaviors there. Yes, we still hang out, connect, flirt, fight, learn, do business and do good online. But disinhibition and anonymity in cyberspace foster a particular pattern of impulsivity, careless or inflammatory expression, social cruelty, deception, exploitation — and vulnerability. Consider the unsettling phenomenon of ubiquitous victimology, in which “the criminals are well hidden but you aren’t.” That extends from the ordinary streets of online life to the deep, criminal underground where predators roam and perps hawk illicit wares from drugs, guns and hired assassins to trafficked humans and tools for terrorism. Forget reality TV, this is reality. And it’s a mouse click away from your living room — and your curious child.
Our real-world senses do not serve or protect us adequately in cyberspace, Aiken warns. As humans, we’re caught in the gap between evolution and a sea change in our environment. Our instincts for appraising mates, pals and trustworthy others are visceral, designed by nature for face-to-face, embodied interaction in a physical environment. They fail to pick up signals when we meet in the cyber-realm. Without those protective filters, and unaware that they’ve been disabled, we’re vulnerable in new ways. Connecting on line feels so easy and natural that we come to assume a newfound sameness and closeness with strangers.
This phenomenon of “online syndication,” as Aiken calls it — using the Internet to find others we think are like-minded and to normalize and socialize underlying tendencies — is a setup for easy disaster, as Aiken shows in her examples of people caught in cyber-crises: humiliating exchanges or exposure, debt, love affairs, fetishes, porn and gaming addictions, or the lure of criminal behavior. They fail to see the big disconnect between who they are in real life and who they are online, and the gap is fraught with consequences.
Aiken is concerned for children’s development, health and safety in a cyber-environment that replaces face-to-face interaction with online engagement and includes easy access to pornography and hyper-stimulating, addictive activity. The evidence is in, she says, and it shows conclusively that “there are windows in the formative years when very specific skills need to be learned. When those developmental windows close, a child may be developmentally or emotionally crippled for life.”
...the Internet “is clearly, unmistakably, and emphatically an adult environment. It simply wasn’t designed for children. So why are they there?” Indeed, why are we giving kids keys to the Internet? Who would ever think it’s a good idea for children to have miniature computers in their pockets that can take them anywhere online, unsupervised and unprotected? Aiken describes the lack of regulation, accountability, privacy and protection for children caught in this digital transition as a “crime against innocence.” It represents a massive seduction of parents and other adults who should know better, she argues. Her forensic perspective compels us all to demand better protection, reminding us that children ages 4 through 12 are the most vulnerable population on the Web.

Friday, August 26, 2016

A bit of nostalgia - Powers of 10

I just stumbled across a charming relic from my counterculture days in the 1970s, when I was watching whales and monarch butterflies at the Esalen Institute in Big Sur, and learning gestalt, TA, Alexander, massage, and meditation techniques. At one point I signed up for transcendental meditation instruction, and this 1977 video was shown in the first session, after which the instructor said "That's all there is to it"........Sigh.....


Thursday, August 25, 2016

Alerting or Somnogenic light - pick your color

Bourgin and Hubbard summarize work by Pilorz et al.:
Light exerts profound effects on our physiology and behaviour, setting our biological clocks to the correct time and regulating when we are asleep and when we are awake. The photoreceptors mediating these responses include the rods and cones involved in vision, as well as a subset of photosensitive retinal ganglion cells (pRGCs) expressing the blue light-sensitive photopigment melanopsin. Previous studies have shown that mice lacking melanopsin show impaired sleep in response to light. However, other studies have shown that light increases glucocorticoid release—a response typically associated with stress. To address these contradictory findings, we studied the responses of mice to light of different colours. We found that blue light was aversive, delaying sleep onset and increasing glucocorticoid levels. By contrast, green light led to rapid sleep onset. These different behavioural effects appear to be driven by different neural pathways. Surprisingly, both responses were impaired in mice lacking melanopsin. These data show that light can promote either sleep or arousal. Moreover, they provide the first evidence that melanopsin directly mediates the effects of light on glucocorticoids. This work shows the extent to which light affects our physiology and has important implications for the design and use of artificial light sources.

Wednesday, August 24, 2016

Oxytocin - a molecular substrate for forming optimistic beliefs about the future

Ma et al. demonstrate a molecular basis for why people tend to incorporate desirable, but not undesirable, feedback into their beliefs:

Significance
People tend to incorporate desirable feedback into their beliefs but discount undesirable ones. Such optimistic updating has evolved as an advantageous mechanism for social adaptation and physical/mental health. Here, in three independent studies, we show that intranasally administered oxytocin (OT), an evolutionarily ancient neuropeptide pivotal to social adaptation, augments optimistic belief updating by increasing updates and learning of desirable feedback but impairing updates of undesirable feedback. Moreover, the OT-impaired updating of undesirable feedback is more salient in individuals with high, rather than with low, depression or anxiety traits. OT also increases second-order confidence judgment after desirable feedback. These findings reveal a molecular substrate underlying the formation of optimistic beliefs about the future.
Abstract
Humans update their beliefs upon feedback and, accordingly, modify their behaviors to adapt to the complex, changing social environment. However, people tend to incorporate desirable (better than expected) feedback into their beliefs but to discount undesirable (worse than expected) feedback. Such optimistic updating has evolved as an advantageous mechanism for social adaptation. Here, we examine the role of oxytocin (OT)―an evolutionarily ancient neuropeptide pivotal for social adaptation―in belief updating upon desirable and undesirable feedback in three studies (n = 320). Using a double-blind, placebo-controlled between-subjects design, we show that intranasally administered OT (IN-OT) augments optimistic belief updating by facilitating updates of desirable feedback but impairing updates of undesirable feedback. The IN-OT–induced impairment in belief updating upon undesirable feedback is more salient in individuals with high, rather than with low, depression or anxiety traits. IN-OT selectively enhances learning rate (the strength of association between estimation error and subsequent update) of desirable feedback. IN-OT also increases participants’ confidence in their estimates after receiving desirable but not undesirable feedback, and the OT effect on confidence updating upon desirable feedback mediates the effect of IN-OT on optimistic belief updating. Our findings reveal distinct functional roles of OT in updating the first-order estimation and second-order confidence judgment in response to desirable and undesirable feedback, suggesting a molecular substrate for optimistic belief updating.
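The "learning rate" asymmetry the abstract describes can be illustrated with a toy calculation. Everything below is invented for illustration, not data or code from the study; the point is only the mechanic: an update is modeled as proportional to the estimation error, with a separate slope fit for desirable versus undesirable feedback.

```python
# Toy illustration of asymmetric belief updating ("learning rates").
# A subject estimates the likelihood of an adverse event, sees feedback,
# then re-estimates. Feedback lower than the first estimate counts as
# desirable. All numbers are made up.

first_estimates  = [40, 60, 30, 70, 50, 20]
feedback         = [25, 45, 50, 90, 35, 30]
second_estimates = [29, 49, 36, 76, 39, 23]

def learning_rate(pairs):
    # least-squares slope through the origin: update per unit of estimation error
    return sum(err * upd for err, upd in pairs) / sum(err * err for err, _ in pairs)

desirable, undesirable = [], []
for first, fb, second in zip(first_estimates, feedback, second_estimates):
    err = abs(fb - first)        # size of the estimation error
    upd = abs(second - first)    # size of the update toward the feedback
    (desirable if fb < first else undesirable).append((err, upd))

# optimistic updating: a steeper slope for good news than for bad news
print(round(learning_rate(desirable), 2), round(learning_rate(undesirable), 2))
```

With these toy numbers the desirable-feedback slope comes out larger than the undesirable-feedback one, which is the pattern the study reports being amplified by oxytocin.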

Tuesday, August 23, 2016

Slow motion increases perceived intent.

The abstract from interesting work by Caruso et al.:
To determine the appropriate punishment for a harmful action, people must often make inferences about the transgressor’s intent. In courtrooms and popular media, such inferences increasingly rely on video evidence, which is often played in “slow motion.” Four experiments (n = 1,610) involving real surveillance footage from a murder or broadcast replays of violent contact in professional football demonstrate that viewing an action in slow motion, compared with regular speed, can cause viewers to perceive an action as more intentional. This slow motion intentionality bias occurred, in part, because slow motion video caused participants to feel like the actor had more time to act, even when they knew how much clock time had actually elapsed. Four additional experiments (n = 2,737) reveal that allowing viewers to see both regular speed and slow motion replay mitigates the bias, but does not eliminate it. We conclude that an empirical understanding of the effect of slow motion on mental state attribution should inform the life-or-death decisions that are currently based on tacit assumptions about the objectivity of human perception.

Monday, August 22, 2016

Lifespan changes in brain and cognition - early life sets the stage.

Walhovd et al. present a fascinating study on the origins of lifespan changes in brain and cognition, defining an extensive cortical region wherein surface area relates positively to general cognitive ability (GCA) in development. They find that prefrontal and medial and posterolateral temporal clusters in particular relate most strongly to GCA:

Significance
Brain and cognition change with age, with early gains and later declines. Attempts have been made to identify age-specific mechanisms, focusing on when and how declines begin in adults. However, even though general cognitive ability declines with age, there is a high stability in individuals’ cognitive ability relative to their same-age peers. Here we show that the relation between brain and cognition appears remarkably stable through the human lifespan. The cortical area change trajectories of higher and lower cognitive ability groups were parallel through life. Birth weight and parental education were identified as predictors, which provides novel evidence for stability in brain–cognition relationships throughout life, and indicates that early life factors impact brain and cognition for the entire life course.
Abstract
Neurodevelopmental origins of functional variation in older age are increasingly being acknowledged, but identification of how early factors impact human brain and cognition throughout life has remained challenging. Much focus has been on age-specific mechanisms affecting neural foundations of cognition and their change. In contrast to this approach, we tested whether cerebral correlates of general cognitive ability (GCA) in development could be extended to the rest of the lifespan, and whether early factors traceable to prenatal stages, such as birth weight and parental education, may exert continuous influences. We measured the area of the cerebral cortex in a longitudinal sample of 974 individuals aged 4–88 y (1,633 observations). An extensive cortical region was identified wherein area related positively to GCA in development. By tracking area of the cortical region identified in the child sample throughout the lifespan, we showed that the cortical change trajectories of higher and lower GCA groups were parallel through life, suggesting continued influences of early life factors. Birth weight and parental education obtained from the Norwegian Mother–Child Cohort study were identified as such early factors of possible life-long influence. Support for a genetic component was obtained in a separate twin sample (Vietnam Era Twin Study of Aging), but birth weight in the child sample had an effect on cortical area also when controlling for possible genetic differences in terms of parental height. Our results provide novel evidence for stability in brain–cognition relationships throughout life, and indicate that early life factors impact brain and cognition for the entire life course.
A summary graphic from the review by Jagust:


Conceptual model linking brain development, cognition, brain reserve, and late-life cognitive decline. Early life exposures and genes affect brain development, which in turn is related to GCA. GCA and education are related to one another, and provide brain reserve with advancing age. The graph demonstrates two individuals with high (blue) and low (red) brain reserve. Although the rate of their age-related cognitive decline is identical, the person with higher reserve crosses the threshold for dependence at an older age, thus experiencing a longer independent life. Early-life exposures, however, also confer indirect beneficial effects in addition to brain development, and these are likely to be salutary over the lifespan.

Friday, August 19, 2016

Neural link between affective understanding and interpersonal attraction

From Anders et al.:
Being able to comprehend another person’s intentions and emotions is essential for successful social interaction. However, it is currently unknown whether the human brain possesses a neural mechanism that attracts people to others whose mental states they can easily understand. Here we show that the degree to which a person feels attracted to another person can change while they observe the other’s affective behavior, and that these changes depend on the observer’s confidence in having correctly understood the other’s affective state. At the neural level, changes in interpersonal attraction were predicted by activity in the reward system of the observer’s brain. Importantly, these effects were specific to individual observer–target pairs and could not be explained by a target’s general attractiveness or expressivity. Furthermore, using multivoxel pattern analysis (MVPA), we found that neural activity in the reward system of the observer’s brain varied as a function of how well the target’s affective behavior matched the observer’s neural representation of the underlying affective state: The greater the match, the larger the brain’s intrinsic reward signal. Taken together, these findings provide evidence that reward-related neural activity during social encounters signals how well an individual’s “neural vocabulary” is suited to infer another person’s affective state, and that this intrinsic reward might be a source of changes in interpersonal attraction.

Thursday, August 18, 2016

Statistics versus judgement.

This interesting website, pointed out to me by a friend, offers to send a daily gem of information to you, usually an excerpt from a published book...so, being a glutton for input streams, I signed up. I usually move on after glancing at a given day's topic, but this excerpt from Kahneman's "Thinking, Fast and Slow" I pass on, after excerpting even further:
In his book Clinical vs. Statistical Prediction: A Theoretical Analysis and a Review of the Evidence, psychologist Paul Meehl gave evidence that statistical models almost always yield better predictions and diagnoses than the judgment of trained professionals. In fact, experts frequently give different answers when presented with the same information within a matter of a few minutes...Meehl's book provoked shock and disbelief among clinical psychologists, and the controversy it started has engendered a stream of research that is still flowing today, more than fifty years after its publication. The number of studies reporting comparisons of clinical and statistical predictions has increased to roughly two hundred, but the score in the contest between algorithms and humans has not changed. About 60% of the studies have shown significantly better accuracy for the algorithms. The other comparisons scored a draw in accuracy, but a tie is tantamount to a win for the statistical rules, which are normally much less expensive to use than expert judgment. No exception has been convincingly documented.
The range of predicted outcomes has expanded to cover medical variables such as the longevity of cancer patients, the length of hospital stays, the diagnosis of cardiac disease, and the susceptibility of babies to sudden infant death syndrome; economic measures such as the prospects of success for new businesses, the evaluation of credit risks by banks, and the future career satisfaction of workers; questions of interest to government agencies, including assessments of the suitability of foster parents, the odds of recidivism among juvenile offenders, and the likelihood of other forms of violent behavior; and miscellaneous outcomes such as the evaluation of scientific presentations, the winners of football games, and the future prices of Bordeaux wine. Each of these domains entails a significant degree of uncertainty and unpredictability. We describe them as 'low-validity environments.' In every case, the accuracy of experts was matched or exceeded by a simple algorithm.
Another reason for the inferiority of expert judgment is that humans are incorrigibly inconsistent in making summary judgments of complex information. When asked to evaluate the same information twice, they frequently give different answers. The extent of the inconsistency is often a matter of real concern. Experienced radiologists who evaluate chest X-rays as 'normal' or 'abnormal' contradict themselves 20% of the time when they see the same picture on separate occasions. A study of 101 independent auditors who were asked to evaluate the reliability of internal corporate audits revealed a similar degree of inconsistency. A review of 41 separate studies of the reliability of judgments made by auditors, pathologists, psychologists, organizational managers, and other professionals suggests that this level of inconsistency is typical, even when a case is reevaluated within a few minutes. Unreliable judgments cannot be valid predictors of anything.
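Why inconsistent experts lose to crude formulas can be demonstrated in a few lines of simulation, in the spirit of Dawes' "improper linear models." The sketch below invents all its parameters: an outcome depends on three valid cues plus heavy noise (a "low-validity environment"), an equal-weight rule simply adds the cues, and a simulated "expert" uses the right cues but weights them inconsistently from case to case.

```python
import random
import statistics

random.seed(0)

def corr(xs, ys):
    # Pearson correlation, written out to keep the sketch dependency-free
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

n = 2000
cases = [[random.gauss(0, 1) for _ in range(3)] for _ in range(n)]

# a "low-validity environment": three valid cues plus irreducible noise
outcome = [0.4 * c[0] + 0.3 * c[1] + 0.3 * c[2] + random.gauss(0, 1) for c in cases]

# the improper linear model: just add the cues with equal weights
algorithm = [sum(c) for c in cases]

# the "expert": right cues, but weights that drift from case to case,
# plus judgment noise -- the inconsistency Kahneman describes
expert = [sum(random.gauss(0.33, 0.3) * x for x in c) + random.gauss(0, 0.8)
          for c in cases]

print(round(corr(algorithm, outcome), 2), round(corr(expert, outcome), 2))
```

Under these (made-up) settings the equal-weight rule tracks the outcome noticeably better than the noisy expert, even though the expert "knows" exactly which cues matter.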

Wednesday, August 17, 2016

How China is changing your internet.

Here is a fascinating piece done by the NYTimes on the parallel universe of the internet in China.

 

Tuesday, August 16, 2016

The long lives of fairy tales.

I pass on some clips from a review by Pagel of work by Da Silva and Tehrani suggesting that some common fairy tales can be traced back 7,000 years or more, long before written languages appeared.
The Indo-European language family is a collection of related languages that probably arose in Anatolia and is now spoken all over western Eurasia. Its modern descendants include the Celtic, Germanic and Italic or Romance languages of western Europe, the Slavic languages of Russia and much of the Balkans, and the Indo-Iranian languages including Persian, as well as Sanskrit and most of the languages of the Indian sub-continent.
Language evolves faster than genes, and it is predominantly vertically transmitted. Similarities and differences among vocabulary items, then, play the same role for cultural phylogenies as genes do for species trees, and provide greater resolution over short timescales. The Indo-European language tree is one of the most carefully studied of these language phylogenies.
With a phylogenetic tree in hand, the authors recorded the presence or absence of each of 275 fairy tales in fifty Indo-European languages...Of the 275 tales, the authors discarded 199 after performing two tests of horizontal transmission...This left a group of 76 tales for which vertical transmission over the course of Indo-European history was the dominant signal for the patterns of shared presence and absence among contemporary societies. Hänsel and Gretel didn’t make this cut, but Beauty and the Beast did.
Evolutionary statistical methods were then applied to calculate a probability that each of the tales was present at each of various major historical splitting points on the Indo-European language phylogeny, taking account of uncertainty both in the phylogeny and in the reconstructed state. Calculating the ancestral probabilities depends only upon the distribution of tales in the contemporary languages in combination with the phylogenetic tree and so neatly gets around the problem that few if any tales exist as ‘fossil’ texts...Fourteen of the 76 tales, including Beauty and the Beast, were assigned a 50% or greater chance of having been present in the common ancestor of the entire western branch of the Indo-European languages...
A further four of the fourteen tales — but not Beauty and the Beast — had a 50% or greater probability of being present at the root of the Indo-European tree. A proto-Indo-European origin for these four tales represents a probable age of over 7,000 years. The tale with the highest probability (87%) of being present at the root was The Smith and the Devil whose story of a smith selling his soul to the devil is echoed today in the modern story of Faust. The authors suggest that metal working technology — as implied by the presence of a smith — could have been available this long ago.
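The kind of ancestral-state calculation Pagel describes can be sketched with Felsenstein's pruning algorithm on a toy tree. Everything here is invented for illustration (the tree, the tip states, the per-branch flip probability); the actual study used Bayesian methods over sampled Indo-European phylogenies.

```python
# Felsenstein's pruning algorithm on a toy four-language tree, for a binary
# trait (tale present = 1 / absent = 0). Toy numbers throughout.

p = 0.1                               # chance the trait flips along one branch
P = [[1 - p, p], [p, 1 - p]]          # P[i][j]: prob a branch ends in j given i

def leaf(state):
    # conditional likelihoods at a tip: certainty about the observed state
    return [1.0 if s == state else 0.0 for s in (0, 1)]

def join(left, right):
    # likelihood of each parent state, given the two child subtrees
    return [sum(P[s][i] * left[i] for i in (0, 1)) *
            sum(P[s][j] * right[j] for j in (0, 1))
            for s in (0, 1)]

# toy tree ((A, B), (C, D)): the tale is attested in A, B, C but absent in D
root = join(join(leaf(1), leaf(1)), join(leaf(1), leaf(0)))

# posterior probability the tale was present at the root, with a flat prior
p_present = root[1] / (root[0] + root[1])
print(round(p_present, 2))            # 0.89 with these toy numbers
```

Scale this up to 50 languages, 76 tales, and a distribution over trees, and you get numbers like the 87% root probability reported for The Smith and the Devil.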
Considering all these notions might lead us to ask why not more of the fairy tales appeared right back at the Indo-European root, or perhaps to wonder if some could go back even further. Perhaps some do. Flood myths appear in many of the world’s cultures, with some speculation that they date to the end of the last Ice Age perhaps 15,000 to 20,000 years ago when sea levels rose dramatically — if true, the western Bible story of Noah is just a comparatively recent hand-me-down.

Monday, August 15, 2016

Brain changes during hypnosis

Jiang et al. do the most detailed analysis to date of brain changes that are distinctive to people undergoing hypnosis:
Hypnosis has proven clinical utility, yet changes in brain activity underlying the hypnotic state have not yet been fully identified. Previous research suggests that hypnosis is associated with decreased default mode network (DMN) activity and that high hypnotizability is associated with greater functional connectivity between the executive control network (ECN) and the salience network (SN). We used functional magnetic resonance imaging to investigate activity and functional connectivity among these three networks in hypnosis. We selected 57 of 545 healthy subjects with very high or low hypnotizability using two hypnotizability scales. All subjects underwent four conditions in the scanner: rest, memory retrieval, and two different hypnosis experiences guided by standard pre-recorded instructions in counterbalanced order. Seeds for the ECN, SN, and DMN were left and right dorsolateral prefrontal cortex, dorsal anterior cingulate cortex (dACC), and posterior cingulate cortex (PCC), respectively. During hypnosis there was reduced activity in the dACC, increased functional connectivity between the dorsolateral prefrontal cortex (DLPFC; ECN) and the insula in the SN, and reduced connectivity between the ECN (DLPFC) and the DMN (PCC). These changes in neural activity underlie the focused attention, enhanced somatic and emotional control, and lack of self-consciousness that characterize hypnosis.

Friday, August 12, 2016

Why do people infer “ought” from “is”?

Tworek and Cimpian offer an interesting perspective, with experiments illustrating how we ascribe intrinsic value to what is customary. I give the start of their introduction setting the context, and then their abstract:
In his dissent from the Supreme Court decision recognizing a federal constitutional right for people to marry a same-sex partner, Chief Justice Roberts noted that heterosexual marriage has been around “for millennia” in societies all over the world: “the Kalahari Bushmen and the Han Chinese, the Carthaginians and the Aztecs”. A possible reading of this remark is that we should take what is typical as a signpost for what is good—how things ought to be. Whatever the correct interpretation here, the tendency to move seamlessly from “is” to “ought” is a mainstay of everyday reasoning. However, the validity of such “is”-to-“ought” inferences (or ought inferences) is at best uncertain. The mere existence of a pattern of behavior does not, by itself, reveal that the behavior is good. For instance, slavery and child labor were common throughout history, and still are in some parts of the world, yet it does not follow that people ought to engage in these practices. Why, then, do people frequently draw ought inferences and find them persuasive?
Abstract
People tend to judge what is typical as also good and appropriate—as what ought to be. What accounts for the prevalence of these judgments, given that their validity is at best uncertain? We hypothesized that the tendency to reason from “is” to “ought” is due in part to a systematic bias in people’s (nonmoral) explanations, whereby regularities (e.g., giving roses on Valentine’s Day) are explained predominantly via inherent or intrinsic facts (e.g., roses are beautiful). In turn, these inherence-biased explanations lead to value-laden downstream conclusions (e.g., it is good to give roses). Consistent with this proposal, results from five studies (N = 629 children and adults) suggested that, from an early age, the bias toward inherence in explanations fosters inferences that imbue observed reality with value. Given that explanations fundamentally determine how people understand the world, the bias toward inherence in these judgments is likely to exert substantial influence over sociomoral understanding.

Thursday, August 11, 2016

How our brain and visceral monitoring encode the ‘self’

Babo-Rebelo et al. show that two seemingly distinct roles of the default brain network (DN), in self-related cognition on the one hand, and in the monitoring of bodily signals for autonomic regulation on the other, are functionally coupled. They do this by testing whether the amplitudes of heartbeat-evoked responses (HERs) during thoughts systematically covary with their self-relatedness, and whether this mechanism engages the DN. They employ two scales of self-relatedness: the “Me” scale rated whether the content of the thought was oriented toward oneself or toward an external object, event, or person, and the “I” scale rated the engagement of the participant as the protagonist or the agent in the thought. Here is their abstract:
The default network (DN) has been consistently associated with self-related cognition, but also to bodily state monitoring and autonomic regulation. We hypothesized that these two seemingly disparate functional roles of the DN are functionally coupled, in line with theories proposing that selfhood is grounded in the neural monitoring of internal organs, such as the heart. We measured with magnetoencephalography neural responses evoked by heartbeats while human participants freely mind-wandered. When interrupted by a visual stimulus at random intervals, participants scored the self-relatedness of the interrupted thought. They evaluated their involvement as the first-person perspective subject or agent in the thought (“I”), and on another scale to what degree they were thinking about themselves (“Me”). During the interrupted thought, neural responses to heartbeats in two regions of the DN, the ventral precuneus and the ventromedial prefrontal cortex, covaried, respectively, with the “I” and the “Me” dimensions of the self, even at the single-trial level. No covariation between self-relatedness and peripheral autonomic measures (heart rate, heart rate variability, pupil diameter, electrodermal activity, respiration rate, and phase) or alpha power was observed. Our results reveal a direct link between selfhood and neural responses to heartbeats in the DN and thus directly support theories grounding selfhood in the neural monitoring of visceral inputs. More generally, the tight functional coupling between self-related processing and cardiac monitoring observed here implies that, even in the absence of measured changes in peripheral bodily measures, physiological and cognitive functions have to be considered jointly in the DN.
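The study's key measure, the heartbeat-evoked response, is at bottom an epoch average: windows of the neural recording, time-locked to each heartbeat, averaged sample by sample. The sketch below uses a fake signal and fake R-peak times purely to show that mechanic; real MEG pipelines (e.g., MNE-Python) add filtering, artifact rejection, and baseline correction on top.

```python
import math

fs = 100                                   # sampling rate, Hz (toy value)
signal = [math.sin(2 * math.pi * t / fs) for t in range(fs * 10)]  # 10 s fake trace
r_peaks = list(range(fs, fs * 9, fs))      # one fake heartbeat per second

def heartbeat_evoked_response(signal, r_peaks, pre=10, post=40):
    # average all windows [peak - pre, peak + post) across heartbeats
    epochs = [signal[p - pre:p + post] for p in r_peaks
              if p - pre >= 0 and p + post <= len(signal)]
    return [sum(e[i] for e in epochs) / len(epochs) for i in range(pre + post)]

evoked = heartbeat_evoked_response(signal, r_peaks)
print(len(evoked))   # 50 samples: 100 ms before to 400 ms after each heartbeat
```

The study's finding is then a statement about this average: its amplitude in precuneus and ventromedial prefrontal sources covaried with how "I"- or "Me"-related the interrupted thought was.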

Wednesday, August 10, 2016

Leave the kids alone! A cognitive case for un-parenting

I want to pass on some clips from the text of a recent review by Glausiusz of Alison Gopnik's book on child-rearing "The Gardener and the Carpenter," and also from the NYTimes pieces by Gopnik summarizing its main arguments. (Her bottom line: "We don’t have to make children learn, we just have to let them learn.") Clips from the book review:
An Amazon trawl for “parenting books” last month offered up 186,262 results...This is less genre than tsunami...Yet, as Alison Gopnik notes...the word parenting became common only in the 1970s, rising in popularity as traditional sources of wisdom about child-rearing — large extended families, for example — fell away...Gopnik...argues that the message of this massive modern industry is misguided.
It assumes that the 'right' parenting techniques or expertise will sculpt your child into a successful adult. But using a scheme to shape material into a product is the modus operandi of a carpenter, whose job it is to make the chair steady or the door true. There is very little empirical evidence, Gopnik says, that “small variations” in what parents do (such as whether they sleep-train) “have reliable and predictable long-term effects on who those children become”. Raising and caring for children is more like tending a garden: it involves “a lot of exhausted digging and wallowing in manure” to create a safe, nurturing space in which innovation, adaptability and resilience can thrive. Her approach focuses on helping children to find their own way, even if it isn't one you'd choose for them. The lengthy childhood of our species gives kids ample opportunity to explore, exploit and experiment before they are turned out into an unpredictable world.
Clips from Gopnik:
It’s not just that young children don’t need to be taught in order to learn. In fact, studies show that explicit instruction, the sort of teaching that goes with school and “parenting,” can be limiting. When children think they are being taught, they are much more likely to simply reproduce what the adult does, instead of creating something new.
My lab tried a different version of the experiment with the complicated toy. This time, though, the experimenter acted like a teacher. She said, “I’m going to show you how my toy works,” instead of “I wonder how this toy works.” The children imitated exactly what she did, and didn’t come up with their own solutions.
The children seem to work out, quite rationally, that if a teacher shows them one particular way to do something, that must be the right technique, and there’s no point in trying something new. But as a result, the kind of teaching that comes with schools and “parenting” pushes children toward imitation and away from innovation.
There is a deep irony here. Parents and policy makers care about teaching because they recognize that learning is increasingly important in an information age. But the new information economy, as opposed to the older industrial one, demands more innovation and less imitation, more creativity and less conformity.
In fact, children’s naturally evolved learning techniques are better suited to that sort of challenge than the teaching methods of the past two centuries.

Tuesday, August 09, 2016

Uncalculating cooperation is used to signal trustworthiness.

Jordan et al. devise an economic game experiment whose results help to explain a range of puzzling behaviors, such as extreme altruism, the use of ethical principles, and romantic love:

Significance
Human prosociality presents an evolutionary puzzle, and reciprocity has emerged as a dominant explanation: cooperating today can bring benefits tomorrow. Reciprocity theories clearly predict that people should only cooperate when the benefits outweigh the costs, and thus that the decision to cooperate should always depend on a cost–benefit analysis. Yet human cooperation can be very uncalculating: good friends grant favors without asking questions, romantic love “blinds” us to the costs of devotion, and ethical principles make universal moral prescriptions. Here, we provide the first evidence, to our knowledge, that reputation effects drive uncalculating cooperation. We demonstrate, using economic game experiments, that people engage in uncalculating cooperation to signal that they can be relied upon to cooperate in the future.
Abstract
Humans frequently cooperate without carefully weighing the costs and benefits. As a result, people may wind up cooperating when it is not worthwhile to do so. Why risk making costly mistakes? Here, we present experimental evidence that reputation concerns provide an answer: people cooperate in an uncalculating way to signal their trustworthiness to observers. We present two economic game experiments in which uncalculating versus calculating decision-making is operationalized by either a subject’s choice of whether to reveal the precise costs of cooperating (Exp. 1) or the time a subject spends considering these costs (Exp. 2). In both experiments, we find that participants are more likely to engage in uncalculating cooperation when their decision-making process is observable to others. Furthermore, we confirm that people who engage in uncalculating cooperation are perceived as, and actually are, more trustworthy than people who cooperate in a calculating way. Taken together, these data provide the first empirical evidence, to our knowledge, that uncalculating cooperation is used to signal trustworthiness, and is not merely an efficient decision-making strategy that reduces cognitive costs. Our results thus help to explain a range of puzzling behaviors, such as extreme altruism, the use of ethical principles, and romantic love.

Monday, August 08, 2016

A brain area crucial to coping with stress.

Sinha et al. show that “neuroflexibility” in a specific region of our ventromedial prefrontal cortex enhances resilience to stress: an increase in its activity damps down brain areas initially activated by stress. Subjects showing lower levels of this flexibility exhibited higher levels of maladaptive coping behaviors in real life.
Active coping underlies a healthy stress response, but neural processes supporting such resilient coping are not well-known. Using a brief, sustained exposure paradigm contrasting highly stressful, threatening, and violent stimuli versus nonaversive neutral visual stimuli in a functional magnetic resonance imaging (fMRI) study, we show significant subjective, physiologic, and endocrine increases and temporally related dynamically distinct patterns of neural activation in brain circuits underlying the stress response. First, stress-specific sustained increases in the amygdala, striatum, hypothalamus, midbrain, right insula, and right dorsolateral prefrontal cortex (DLPFC) regions supported the stress processing and reactivity circuit. Second, dynamic neural activation during stress versus neutral runs, showing early increases followed by later reduced activation in the ventrolateral prefrontal cortex (VLPFC), dorsal anterior cingulate cortex (dACC), left DLPFC, hippocampus, and left insula, suggested a stress adaptation response network. Finally, dynamic stress-specific mobilization of the ventromedial prefrontal cortex (VmPFC), marked by initial hypoactivity followed by increased VmPFC activation, pointed to the VmPFC as a key locus of the emotional and behavioral control network. Consistent with this finding, greater neural flexibility signals in the VmPFC during stress correlated with active coping ratings whereas lower dynamic activity in the VmPFC also predicted a higher level of maladaptive coping behaviors in real life, including binge alcohol intake, emotional eating, and frequency of arguments and fights. These findings demonstrate acute functional neuroplasticity during stress, with distinct and separable brain networks that underlie critical components of the stress response, and a specific role for VmPFC neuroflexibility in stress-resilient coping.

Friday, August 05, 2016

To remember something better, wait, then exercise.

Another nice bit on what exercise can do for you. The highlights and summary from van Dongen et al.:

Highlights
•Performing aerobic exercise 4 hr after learning improved associative memory 
•Exercise at this time also increased hippocampal pattern similarity during retrieval 
•Exercise performed immediately after learning had no effect on memory retention 
•Exercise could have potential as a memory intervention in educational settings
Summary
Persistent long-term memory depends on successful stabilization and integration of new memories after initial encoding. This consolidation process is thought to require neuromodulatory factors such as dopamine, noradrenaline, and brain-derived neurotrophic factor. Without the release of such factors around the time of encoding, memories will decay rapidly. Recent studies have shown that physical exercise acutely stimulates the release of several consolidation-promoting factors in humans, raising the question of whether physical exercise can be used to improve memory retention. Here, we used a single session of physical exercise after learning to exogenously boost memory consolidation and thus long-term memory. Three groups of randomly assigned participants first encoded a set of picture-location associations. Afterward, one group performed exercise immediately, one 4 hr later, and the third did not perform any exercise. Participants otherwise underwent exactly the same procedures to control for potential experimental confounds. Forty-eight hours later, participants returned for a cued-recall test in a magnetic resonance scanner. With this design, we could investigate the impact of acute exercise on memory consolidation and retrieval-related neural processing. We found that performing exercise 4 hr, but not immediately, after encoding improved the retention of picture-location associations compared to the no-exercise control group. Moreover, performing exercise after a delay was associated with increased hippocampal pattern similarity for correct responses during delayed retrieval. Our results suggest that appropriately timed physical exercise can improve long-term memory and highlight the potential of exercise as an intervention in educational and clinical settings.

Thursday, August 04, 2016

Distinguishing brain correlations from causes.

Many researchers, even though they know better, fall into the trap of assuming that correlations are causes (i.e., if brain activity X occurs just before or at the same time as action Y, it must be causing Y). Katz et al. offer a nice counterexample in a brain region (the lateral intraparietal (LIP) cortex) whose activity reflects deciding on the direction of a moving set of dots. When this region was inactivated in rhesus macaque monkeys performing a motion-direction discrimination task, decision-making performance was unaffected. But when area MT (a motion-detection area that shows only weak correlations with choices) was inhibited, performance was profoundly impaired. This suggests that larger networks should always be considered, even in what seem to be simple decisions. The abstract:
During decision making, neurons in multiple brain regions exhibit responses that are correlated with decisions. However, it remains uncertain whether or not various forms of decision-related activity are causally related to decision making. Here we address this question by recording and reversibly inactivating the lateral intraparietal (LIP) and middle temporal (MT) areas of rhesus macaques performing a motion direction discrimination task. Neurons in area LIP exhibited firing rate patterns that directly resembled the evidence accumulation process posited to govern decision making, with strong correlations between their response fluctuations and the animal’s choices. Neurons in area MT, in contrast, exhibited weak correlations between their response fluctuations and choices, and had firing rate patterns consistent with their sensory role in motion encoding. The behavioural impact of pharmacological inactivation of each area was inversely related to their degree of decision-related activity: while inactivation of neurons in MT profoundly impaired psychophysical performance, inactivation in LIP had no measurable impact on decision-making performance, despite having silenced the very clusters that exhibited strong decision-related activity. Although LIP inactivation did not impair psychophysical behaviour, it did influence spatial selection and oculomotor metrics in a free-choice control task. The absence of an effect on perceptual decision making was stable over trials and sessions and was robust to changes in stimulus type and task geometry, arguing against several forms of compensation. Thus, decision-related signals in LIP do not appear to be critical for computing perceptual decisions, and may instead reflect secondary processes. Our findings highlight a dissociation between decision correlation and causation, showing that strong neuron-decision correlations do not necessarily offer direct access to the neural computations underlying decisions.

Wednesday, August 03, 2016

Competition does not improve quality of work.

I have always been phobic about competition, especially in my scientific laboratory work, because I could feel its toxic effects on my risk-taking, spontaneity, and creativity. The reason that I made some useful contributions to understanding the chemistry of how we see is that I chose to emphasize questions and areas that were not in the current arenas of competition. I also felt the peer-review processes involved were frequently biased (I served as a grant peer reviewer for many years). Balietti et al. designed a laboratory experiment that produces results exactly matching my own experience:
Significance
Competition is an essential mechanism in increasing the effort and performance of human groups in real life. However, competition has side effects: it can be detrimental to creativity and reduce cooperation. We conducted an experiment called the Art Exhibition Game to investigate the effect of competitive incentives in environments where the quality of creative products and the amount of innovation allowed are decided through peer review. Our approach is general and can provide insights in domains such as clinical evaluations, scientific admissibility, and science funding. Our results show that competition leads to more innovation but also to more unfair reviews and to a lower level of agreement between reviewers. Moreover, competition does not improve the average quality of published works.  
Abstract
To investigate the effect of competitive incentives under peer review, we designed a novel experimental setup called the Art Exhibition Game. We present experimental evidence of how competition introduces both positive and negative effects when creative artifacts are evaluated and selected by peer review. Competition proved to be a double-edged sword: on the one hand, it fosters innovation and product diversity, but on the other hand, it also leads to more unfair reviews and to a lower level of agreement between reviewers. Moreover, an external validation of the quality of peer reviews during the laboratory experiment, based on 23,627 online evaluations on Amazon Mechanical Turk, shows that competition does not significantly increase the level of creativity. Furthermore, the higher rejection rate under competitive conditions does not improve the average quality of published contributions, because more high-quality work is also rejected. Overall, our results could explain why many ground-breaking studies in science end up in lower-tier journals. Differences and similarities between the Art Exhibition Game and scholarly peer review are discussed and the implications for the design of new incentive systems for scientists are explained.

Tuesday, August 02, 2016

Turn-taking skills unique to humans?

In yet another "humans are unique with respect to..." type article, Melis, Tomasello and collaborators do experiments showing that humans differ in their ability to carry out long-term collaborative relationships that involve taking turns. I've been reading de Waal's recent book "Are we smart enough to know how smart animals are?" (which I highly recommend), which suggests that the "unique to humans" implicit in the article's title may not be appropriate, for the article demonstrates a more 'advanced' behavior in humans only in a specific paradigm involving just these two species. Potential turn-taking behavior in other social animals, invertebrates as well as vertebrates, is still a possibility. The experiments:
...gave pairs of 3- and 5-year-old children and chimpanzees a collaboration task in which equal rewards could be obtained only if the members of a pair worked together first to reward one and then to reward the other. Neither species had previously been tested in a paradigm in which partners can distribute collaboratively produced rewards in “fair” ways only by taking turns being the sole beneficiary.
Here is their abstract:
Long-term collaborative relationships require that any jointly produced resources be shared in mutually satisfactory ways. Prototypically, this sharing involves partners dividing up simultaneously available resources, but sometimes the collaboration makes a resource available to only one individual, and any sharing of resources must take place across repeated instances over time. Here, we show that beginning at 5 years of age, human children stabilize cooperation in such cases by taking turns across instances of obtaining a resource. In contrast, chimpanzees do not take turns in this way, and so their collaboration tends to disintegrate over time. Alternating turns in obtaining a collaboratively produced resource does not necessarily require a prosocial concern for the other, but rather requires only a strategic judgment that partners need incentives to continue collaborating. These results suggest that human beings are adapted for thinking strategically in ways that sustain long-term cooperative relationships and that are absent in their nearest primate relatives.

Monday, August 01, 2016

Zapping your brain at home.

MindBlog has done a number of posts on transcranial electrical stimulation, usually reporting some beneficial cognitive or emotional effects (enter 'transcranial' in the MindBlog search box to see some of these). Because the technique requires only a 9-volt battery and a couple of wires, do-it-yourself (D.I.Y.) kits have been marketed by a number of websites, but in general professional cognitive scientists caution against home-brew science efforts because of potential deleterious effects of brain stimulation (though none have been reported). A recent Gray Matter piece by Anna Wexler reports her study over the past three years of D.I.Y. brain stimulators:
Their conflict with neuroscientists offers a fascinating case study of what happens when experimental tools normally kept behind the closed doors of academia — in this case, transcranial direct current stimulation — are appropriated for use outside them...To date, more than 1,000 peer-reviewed studies of the technique have been published. Studies have suggested, among other things, that the stimulation may be beneficial for treating problems like depression and chronic pain as well as enhancing cognition and learning in healthy individuals.
(I should point out my post noting a review by Farah that references one meta-analysis of the literature that does not support reported cognitive effects.)
Home use remains a subculture, part of the contemporary movement to “hack” one’s body — using supplements, brain-training games and self-tracking devices — to optimize productivity...I have conducted long interviews with dozens of D.I.Y. stimulators, both in person and via Skype; collected hundreds of questionnaire responses; and tracked online forums, websites, blogs and other platforms on which practitioners communicate. I’ve found that they are — for the most part — astute, inventive and resourceful....I’ve found (as I reported last year in The Journal of Medical Ethics) that users adhere to many of the protocols used in scientific studies.
The growth of D.I.Y. brain stimulation stems in part from a larger frustration with the exclusionary institutions of modern medicine, such as the exorbitant price of pharmaceuticals and the glacial pace at which new therapies trickle down to patients. For people without an institutional affiliation, even reading a journal article can be prohibitively expensive.
As neuroscientists continue to conduct brain stimulation experiments, publish results in journals and hold conferences, the D.I.Y. practitioners have remained quiet downstream listeners, blogging about scientists’ experiments, posting unrestricted versions of journal articles and linking to videos of conference talks. Some practitioners create their own manuals and guides based on published papers.
Added note: Want a brain stimulation conference? Check this out.

Friday, July 29, 2016

Worldwide internet access from light-based wireless communication

Under the "random curious stuff" category in MindBlog's subtitle, I want to point to this NYTimes piece that fascinated me by using some simple analogies to explain complex technical issues. The work is a prelude to providing internet access anywhere on the planet, from the skies, using drones:
In a paper published Tuesday in Optica, researchers from Internet.org’s Connectivity Lab have outlined a new type of light detector that can be used for free-space optical communication, a communication technique that uses light to send data wirelessly.
Free-space optical communication works by encoding communication signals in laser beams. Transmitters on the ground or in satellites shoot that light through the air to receivers that can decode the data. (To understand this in simple terms, think of encoding and sending information through Morse code using a flashlight.)
...many free-space optical communication systems use smaller receivers with complex pointing and tracking systems. Because laser beams are narrow and travel in straight lines from point A to point B, these receivers have to continuously maneuver to catch laser beams head-on...Imagine trying to water a small potted plant with a water gun from different angles...To maximize the amount of water you catch, you have to constantly move the pot around.
The Facebook researchers’ solution to this problem is a light detector that doesn’t need pointing and tracking, but still allows for fast transmission...Facebook’s detector contains a spherical bundle of special fluorescent fibers. The bundle, somewhere between the size of a golf ball and tennis ball, is able to absorb blue laser light from any direction and re-emit it as green light. Because that green light is diffuse, it can then be funneled to a small receiver that converts the light back to data...imagine that instead of a water gun, you’re pointing a blow dart gun at a water balloon attached to a funnel over the potted plant. As soon as you hit the balloon, it pops and releases water. With the addition of the balloon, you’ve eliminated the need to move the pot around. You can shoot at the water balloon from any direction, and the plant will get watered.
Facebook’s new detector is able to achieve fast data rates of two gigabits per second — several orders of magnitude higher than those from radio frequencies — because light has a higher frequency than radio waves, and because the fluorescence process is fast. Free-space optical communication can also carry more information than radio communication, and is more secure because narrow laser beams are harder to intercept than wide radio waves.
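A rough back-of-envelope calculation shows why an optical carrier leaves so much more room for data than a radio carrier. The wavelength and radio frequency below are typical illustrative values I have chosen, not figures from the paper:

```python
# Back-of-envelope: an optical carrier oscillates vastly faster than a radio
# carrier, leaving far more room for modulation bandwidth.
# Illustrative values only (not from the paper).
c = 3.0e8                        # speed of light, m/s
blue_wavelength = 450e-9         # m, a typical blue laser diode
blue_freq = c / blue_wavelength  # carrier frequency of the laser, ~6.7e14 Hz

radio_freq = 2.4e9               # Hz, a common radio carrier (Wi-Fi band)

ratio = blue_freq / radio_freq
print(f"optical carrier: {blue_freq:.1e} Hz, radio: {radio_freq:.1e} Hz, "
      f"~{ratio:.0e} times higher")
```

With these assumed numbers the optical carrier comes out roughly five orders of magnitude higher in frequency, consistent with the article's "orders of magnitude" framing of the achievable data rates.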
The technology fits in with Facebook’s plans to beam internet access down from the skies using drones.

Thursday, July 28, 2016

Abstract thinking in newborn ducklings!

Martinho and Kacelnik have shown that ducklings imprint on the relational concept of "same or different." From Wasserman's perspective on the work:
Adhering to the adage that “actions speak more loudly than words,” scientists are deploying powerful behavioral tests that provide animals with nonverbal ways to reveal their intelligence to us. Although animals may not be able to speak, studying their behavior may be a suitable substitute for assaying their thoughts, and this in turn may allow us to jettison the stale canard that thought without language is impossible. Following this behavioral approach and using the familiar social learning phenomenon of imprinting, Martinho and Kacelnik report that mallard ducklings preferentially followed a novel pair of objects that conformed to the learned relation between a familiar pair of objects. Ducklings that had earlier been exposed to a pair of identical objects preferred a pair of identical objects to a pair of nonidentical objects; other ducklings that had been exposed to a pair of nonidentical objects preferred a pair of nonidentical objects to a pair of identical objects. Because the testing objects were decidedly unlike the training objects, Martinho and Kacelnik concluded that the ducklings had effectively understood the abstract concepts of “same” and “different.”
A duckling in the testing arena approaches a stimulus pair composed of “different” shapes.
This study is important for at least three reasons. First, it indicates that animals not generally believed to be especially intelligent are capable of abstract thought. Second, even very young animals may be able to display behavioral signs of abstract thinking. And, third, reliable behavioral signs of abstract relational thinking can be obtained without deploying explicit reward-and-punishment procedures.

Wednesday, July 27, 2016

Brain scans are prone to false positives.

An analysis by Eklund et al. has raised serious doubts about positive correlations reported in many of the 40,000 fMRI studies published in the last 2 decades (think of all those ‘This is your brain on politics’ type articles).

Significance
Functional MRI (fMRI) is 25 years old, yet surprisingly its most common statistical methods have not been validated using real data. Here, we used resting-state fMRI data from 499 healthy controls to conduct 3 million task group analyses. Using this null data with different experimental designs, we estimate the incidence of significant results. In theory, we should find 5% false positives (for a significance threshold of 5%), but instead we found that the most common software packages for fMRI analysis (SPM, FSL, AFNI) can result in false-positive rates of up to 70%. These results question the validity of some 40,000 fMRI studies and may have a large impact on the interpretation of neuroimaging results.
Abstract
The most widely used task functional magnetic resonance imaging (fMRI) analyses use parametric statistical methods that depend on a variety of assumptions. In this work, we use real resting-state data and a total of 3 million random task group analyses to compute empirical familywise error rates for the fMRI software packages SPM, FSL, and AFNI, as well as a nonparametric permutation method. For a nominal familywise error rate of 5%, the parametric statistical methods are shown to be conservative for voxelwise inference and invalid for clusterwise inference. Our results suggest that the principal cause of the invalid cluster inferences is spatial autocorrelation functions that do not follow the assumed Gaussian shape. By comparison, the nonparametric permutation test is found to produce nominal results for voxelwise as well as clusterwise inference. These findings speak to the need of validating the statistical methods being used in the field of neuroimaging.
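The multiple-comparisons intuition behind these inflated rates is easy to see in a toy simulation. This is a generic illustration of familywise error on pure noise, not the authors' cluster-level analysis, and the counts and thresholds are just illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n_studies = 2000    # simulated null "studies"
n_tests = 100       # independent tests (think voxels) per study

# Pure noise: no true effect anywhere, so every "hit" is a false positive.
z = rng.standard_normal((n_studies, n_tests))

crit = 1.96  # two-sided 5% threshold for a single test

# Per-test false-positive rate: ~5%, as the nominal alpha promises.
per_test_rate = (np.abs(z) > crit).mean()

# Familywise rate: chance a study finds AT LEAST ONE "significant" test.
# With 100 uncorrected tests it approaches 1 - 0.95**100, about 99%.
familywise_rate = (np.abs(z) > crit).any(axis=1).mean()

# A Bonferroni-style correction (alpha / n_tests) restores ~5% familywise.
crit_bonf = 3.48  # approx two-sided threshold for alpha = 0.05 / 100
familywise_bonf = (np.abs(z) > crit_bonf).any(axis=1).mean()

print(per_test_rate, familywise_rate, familywise_bonf)
```

Eklund et al.'s point is subtler than this sketch: the fMRI packages do apply corrections, but the cluster-level corrections assume a Gaussian spatial autocorrelation that real data violate, so "corrected" inferences can still behave more like the uncorrected case above.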
Added note: Check out this NYTimes piece on these issues.

Tuesday, July 26, 2016

A must-see Rube Goldberg machine

The intractability of implicit beliefs

Cao and Banaji in the Harvard Psychology Dept. introduce their study:
Imagine meeting Jonathan and Elizabeth. One person is a doctor. The other is a nurse. Who is the doctor? Or imagine that an employer is deciding to hire either Colin or Jamaal. A background check will reveal that one person has a violent felony on his record and therefore will not be hired. Who is the violent felon? Before individuating facts are learned, when only gender or race is known, one of two principles can guide beliefs.
The first, which we call the base rate principle, supports the belief that Jonathan is the doctor and Jamaal is the violent felon. If ignoring base rates is considered an error, then one must realize that doctors are more likely to be men than women and people with violent felonies on their record are more likely to be Black than White. In fact, because group membership contains useful information for deciding whether an individual has a certain attribute, stereotypes have been conceptualized as base rates. Moreover, decision theorists have shown that base rates are critical ingredients for making predictions, as neglecting base rates will cause predictions to deviate from what is statistically likely.
Using these base rates, however, is inconsistent with a second principle that we call the fairness principle. By this account, it is morally proper to assume a fair coin, so to speak. Jonathan and Elizabeth are equally likely to be the doctor and Colin and Jamaal are equally likely to have a violent felony on their record. Motivated by egalitarian values, many people believe that base rates cannot and should not be used to make such predictions. In fact, the value of fairness is deeply woven into many legal systems. American courts have rejected the use of base rates to determine guilt, and the European Union has banned gender-based insurance premiums.
In the present work, we assess which principle guides beliefs before individuating facts are learned. Given only information about gender, do beliefs favor Jonathan to be the doctor or both Jonathan and Elizabeth equally to be the doctor? We then assess if the base rate and fairness principles are set aside after individuating facts are learned. Given facts that make abundantly clear who is—and who is not—the doctor, do beliefs align with the facts?
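The base-rate principle sketched above is just Bayes' rule applied to the pair. A toy calculation, using hypothetical base rates invented for illustration rather than numbers from the paper, makes the contrast with the fairness principle concrete:

```python
# Toy Bayes calculation for "who is the doctor?" given only gender.
# The base rates below are hypothetical, chosen purely for illustration.
p_doctor_male = 0.65   # assumed share of doctors who are men
p_nurse_male = 0.10    # assumed share of nurses who are men

# Exactly one of the pair is the doctor, the other the nurse.
# Likelihood of each assignment of roles to the male (Jonathan)
# and the female (Elizabeth):
l_jonathan_doctor = p_doctor_male * (1 - p_nurse_male)    # male doctor, female nurse
l_elizabeth_doctor = (1 - p_doctor_male) * p_nurse_male   # female doctor, male nurse

# Base-rate principle: posterior probability that Jonathan is the doctor.
p_jonathan = l_jonathan_doctor / (l_jonathan_doctor + l_elizabeth_doctor)
print(round(p_jonathan, 3))  # 0.944 under these assumed base rates

# Fairness principle: simply assign 0.5 to each person, ignoring base rates.
```

Even modest gender base rates tilt the posterior strongly toward Jonathan; the fairness principle ignores this and assigns 0.5 to each, which is the pattern the authors find in explicit beliefs.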
Here is the abstract summarizing their findings:
Meet Jonathan and Elizabeth. One person is a doctor and the other is a nurse. Who is the doctor? When nothing else is known, the base rate principle favors Jonathan to be the doctor and the fairness principle favors both individuals equally. However, when individuating facts reveal who is actually the doctor, base rates and fairness become irrelevant, as the facts make the correct answer clear. In three experiments, explicit and implicit beliefs were measured before and after individuating facts were learned. These facts were either stereotypic (e.g., Jonathan is the doctor, Elizabeth is the nurse) or counterstereotypic (e.g., Elizabeth is the doctor, Jonathan is the nurse). Results showed that before individuating facts were learned, explicit beliefs followed the fairness principle, whereas implicit beliefs followed the base rate principle. After individuating facts were learned, explicit beliefs correctly aligned with stereotypic and counterstereotypic facts. Implicit beliefs, however, were immune to counterstereotypic facts and continued to follow the base rate principle. Having established the robustness and generality of these results, a fourth experiment verified that gender stereotypes played a causal role: when both individuals were male, explicit and implicit beliefs alike correctly converged with individuating facts. Taken together, these experiments demonstrate that explicit beliefs uphold fairness and incorporate obvious and relevant facts, but implicit beliefs uphold base rates and appear relatively impervious to counterstereotypic facts.

Monday, July 25, 2016

Our sociability correlates with epigenetic oxytocin gene modifications.

From Haas et al.:

 Significance
Elucidating the genetic and biological substrates of social behavior serves to advance the way basic human nature is understood and improves the way genetic and biological markers can be used to prevent, diagnose, and treat people with impairments in social cognition and behavior. This study shows that epigenetic modification of the structural gene for oxytocin (OXT) is an important factor associated with individual differences in social processing, including self-report, behavior, and brain function and structure in humans.
Abstract
Across many mammalian species there exist genetic and biological systems that facilitate the tendency to be social. Oxytocin is a neuropeptide involved in social-approach behaviors in humans and other mammals. Although there exists a large, mounting body of evidence showing that oxytocin signaling genes are associated with human sociability, very little is currently known regarding the way the structural gene for oxytocin (OXT) confers individual differences in human sociability. In this study, we undertook a comprehensive approach to investigate the association between epigenetic modification of OXT via DNA methylation, and overt measures of social processing, including self-report, behavior, and brain function and structure. Genetic data were collected via saliva samples and analyzed to target and quantify DNA methylation across the promoter region of OXT. We observed a consistent pattern of results across sociability measures. People that exhibit lower OXT DNA methylation (presumably linked to higher OXT expression) display more secure attachment styles, improved ability to recognize emotional facial expressions, greater superior temporal sulcus activity during two social-cognitive functional MRI tasks, and larger fusiform gyrus gray matter volume than people that exhibit higher OXT DNA methylation. These findings provide empirical evidence that epigenetic modification of OXT is linked to several overt measures of sociability in humans and serve to advance progress in translational social neuroscience research toward a better understanding of the evolutionary and genetic basis of normal and abnormal human sociability.

Friday, July 22, 2016

Cultural differences in music perception

McDermott et al. find that the isolated Tsimane people, who live in the Amazonian rainforest in northwest Bolivia, have no preference for consonance over dissonance.
Music is present in every culture, but the degree to which it is shaped by biology remains debated. One widely discussed phenomenon is that some combinations of notes are perceived by Westerners as pleasant, or consonant, whereas others are perceived as unpleasant, or dissonant. The contrast between consonance and dissonance is central to Western music and its origins have fascinated scholars since the ancient Greeks. Aesthetic responses to consonance are commonly assumed by scientists to have biological roots, and thus to be universally present in humans. Ethnomusicologists and composers, in contrast, have argued that consonance is a creation of Western musical culture. The issue has remained unresolved, partly because little is known about the extent of cross-cultural variation in consonance preferences. Here we report experiments with the Tsimane’—a native Amazonian society with minimal exposure to Western culture—and comparison populations in Bolivia and the United States that varied in exposure to Western music. Participants rated the pleasantness of sounds. Despite exhibiting Western-like discrimination abilities and Western-like aesthetic responses to familiar sounds and acoustic roughness, the Tsimane’ rated consonant and dissonant chords and vocal harmonies as equally pleasant. By contrast, Bolivian city- and town-dwellers exhibited significant preferences for consonance, albeit to a lesser degree than US residents. The results indicate that consonance preferences can be absent in cultures sufficiently isolated from Western music, and are thus unlikely to reflect innate biases or exposure to harmonic natural sounds. The observed variation in preferences is presumably determined by exposure to musical harmony, suggesting that culture has a dominant role in shaping aesthetic responses to music.