Tuesday, September 06, 2016

Feeling Good? Do something unpleasant.

A curious piece from Taquet et al.:
Most theories of motivation have highlighted that human behavior is guided by the hedonic principle, according to which our choices of daily activities aim to minimize negative affect and maximize positive affect. However, it is not clear how to reconcile this idea with the fact that people routinely engage in unpleasant yet necessary activities. To address this issue, we monitored in real time the activities and moods of over 28,000 people across an average of 27 d using a multiplatform smartphone application. We found that people’s choices of activities followed a hedonic flexibility principle. Specifically, people were more likely to engage in mood-increasing activities (e.g., play sports) when they felt bad, and to engage in useful but mood-decreasing activities (e.g., housework) when they felt good. These findings clarify how hedonic considerations shape human behavior. They may explain how humans overcome the allure of short-term gains in happiness to maximize long-term welfare.
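The "hedonic flexibility" pattern the authors describe boils down to a conditional frequency: how often a mood-raising activity is chosen, given the person's current mood. A minimal sketch on invented records (the activity labels and data below are hypothetical, not the study's):

```python
# Toy tabulation of the "hedonic flexibility" pattern: the share of
# activity choices that are mood-raising, split by current mood.
# All records here are invented for illustration.
records = [
    ("bad", "sports"), ("bad", "sports"), ("bad", "housework"),
    ("good", "housework"), ("good", "housework"), ("good", "sports"),
]

def p_mood_raising(records, mood, raising=frozenset({"sports"})):
    """Fraction of activity choices in the given mood that raise mood."""
    chosen = [act for m, act in records if m == mood]
    return sum(act in raising for act in chosen) / len(chosen)

when_bad = p_mood_raising(records, "bad")    # 2/3: seek a lift when low
when_good = p_mood_raising(records, "good")  # 1/3: do chores when high
```

With real data the comparison would of course need within-person analysis and controls, but the qualitative signature is just this asymmetry: when_bad exceeding when_good.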

Monday, September 05, 2016

Do your friends really like you?

I found this article by Murphy, pointing to work by Almaatouq et al., to align with my recent experience of having two long-term friends (or so I thought) simply stop responding to emails about getting together. And, from the other direction, of being described as "our good friend" by a couple I didn't particularly like. It turns out that studies show that only about half of perceived friendships are mutual. The Almaatouq et al. study:
...analyzed friendship ties among 84 subjects (ages 23 to 38) in a business management class by asking them to rank one another on a five-point continuum of closeness from “I don’t know this person” to “One of my best friends.” The feelings were mutual 53 percent of the time while the expectation of reciprocity was pegged at 94 percent. This is consistent with data from several other friendship studies conducted over the past decade, encompassing more than 92,000 subjects, in which the reciprocity rates ranged from 34 percent to 53 percent.
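The reciprocity figure is simple to reproduce on toy data: treat nominations as a directed graph, and count a tie as mutual only if the edge runs both ways. A minimal sketch (the people and nominations here are invented):

```python
# Reciprocity of perceived friendships: each person names who they
# consider a friend; a directed tie is mutual only if both directions
# exist. The nominations below are invented for illustration.
def reciprocity_rate(nominations):
    """nominations: dict mapping person -> set of people they name."""
    ties = [(a, b) for a, friends in nominations.items() for b in friends]
    mutual = sum(1 for a, b in ties if a in nominations.get(b, set()))
    return mutual / len(ties)

nominations = {
    "Ann": {"Bob", "Cal"},   # Ann names two friends
    "Bob": {"Ann"},          # only Ann <-> Bob is reciprocated
    "Cal": {"Dee"},
    "Dee": set(),
}
rate = reciprocity_rate(nominations)  # 2 of 4 directed ties are mutual
```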
Clips from the last portion of Murphy's article:
Because time is limited, so, too, is the number of friends you can have, according to the work of the British evolutionary psychologist Robin I.M. Dunbar. He describes layers of friendship, where the topmost layer consists of only one or two people, say a spouse and best friend with whom you are most intimate and interact daily. The next layer can accommodate at most four people for whom you have great affinity, affection and concern and who require weekly attention to maintain. Out from there, the tiers contain more casual friends with whom you invest less time and tend to have a less profound and more tenuous connection. Without consistent contact, they easily fall into the realm of acquaintance.
...playing it safe by engaging in shallow, unfulfilling or nonreciprocal relationships has physical repercussions. Not only do the resulting feelings of loneliness and isolation increase the risk of death as much as smoking, alcoholism and obesity; you may also lose tone, or function, in the so-called smart vagus nerve, which brain researchers think allows us to be in intimate, supportive and reciprocal relationships in the first place...In the presence of a true friend...the smart or modulating aspect of the vagus nerve is what makes us feel at ease rather than on guard as when we are with a stranger or someone judgmental. It’s what enables us to feel O.K. about exposing the soft underbelly of our psyche and helps us stay engaged and present in times of conflict. Lacking authentic friendships, the smart vagus nerve is not exercised. It loses tone and one’s anxiety remains high, making abiding, deep connections difficult.

Friday, September 02, 2016

Growing Older, Getting Happier

A brief piece from Nicholas Bakalar in the NYTimes summarizing the recent paper by Thomas et al. (senior author Dilip Jeste):
Older people tend to be happier than younger people, and their happiness increases with age...Researchers contacted 1,546 people ages 21 to 99 via random telephone calls and found that older age was, not surprisingly, tied to declines in physical and cognitive function. But it was also associated with higher levels of overall satisfaction, happiness and well-being, and lower levels of anxiety, depression and stress. The older the person, the study found, the better his or her mental health tended to be.
The researchers used well-validated scales to assess mental health, although the study relied on self-reports and was a snapshot in time that did not follow an individual through a lifetime. Other studies have found similar results linking advancing age and higher levels of happiness.
The reasons for the effect remain unclear, but the senior author, Dr. Dilip V. Jeste, a professor of psychiatry at the University of California, San Diego, had some suggestions...“Brain studies show that the amygdala in older people responds less to stressful or negative images than in a younger person,” he said. “We become wise. Peer pressure loses its sting. Better decision-making, more control of emotions, doing things that are not just for yourself, knowing oneself better, being more studious and yet more decisive.”...“This is good news for young people, too,” he added. “You have something to look forward to.”
Here are the methods and results sections from the abstract:
Methods: Cross-sectional data were obtained from 1,546 individuals aged 21–100 years, selected using random digit dialing for the Successful AGing Evaluation (SAGE) study, a structured multicohort investigation that included telephone interviews and in-home surveys of community-based adults without dementia. Data were collected from 1/26/2010 to 10/07/2011 targeting participants aged 50–100 years and from 6/25/2012 to 7/15/2013 targeting participants aged 21–100 years with an emphasis on adding younger individuals. Data included self-report measures of physical health, measures of both positive and negative attributes of mental health, and a phone interview–based measure of cognition.
Results: Comparison of age cohorts using polynomial regression suggested a possible accelerated deterioration in physical and cognitive functioning, averaging 1.5 to 2 standard deviations over the adult lifespan. In contrast, there appeared to be a linear improvement of about 1 standard deviation in various attributes of mental health over the same life period.

Wednesday, August 31, 2016

Climate disasters act as threat multipliers in ethnic conflicts.

Schleussner et al. offer evidence for a common assumption about the effects of climate disasters: that they drive people further apart rather than closer together:
Social and political tensions keep on fueling armed conflicts around the world. Although each conflict is the result of an individual context-specific mixture of interconnected factors, ethnicity appears to play a prominent and almost ubiquitous role in many of them. This overall state of affairs is likely to be exacerbated by anthropogenic climate change and in particular climate-related natural disasters. Ethnic divides might serve as predetermined conflict lines in case of rapidly emerging societal tensions arising from disruptive events like natural disasters. Here, we hypothesize that climate-related disaster occurrence enhances armed-conflict outbreak risk in ethnically fractionalized countries. Using event coincidence analysis, we test this hypothesis based on data on armed-conflict outbreaks and climate-related natural disasters for the period 1980–2010. Globally, we find a coincidence rate of 9% regarding armed-conflict outbreak and disaster occurrence such as heat waves or droughts. Our analysis also reveals that, during the period in question, about 23% of conflict outbreaks in ethnically highly fractionalized countries robustly coincide with climatic calamities. Although we do not report evidence that climate-related disasters act as direct triggers of armed conflicts, the disruptive nature of these events seems to play out in ethnically fractionalized societies in a particularly tragic way. This observation has important implications for future security policies as several of the world’s most conflict-prone regions, including North and Central Africa as well as Central Asia, are both exceptionally vulnerable to anthropogenic climate change and characterized by deep ethnic divides.
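Event coincidence analysis, at its core, counts how often one event type falls within a tolerance window of another. A toy sketch with invented years (this omits the significance testing and country-level stratification the authors used):

```python
# Toy event coincidence analysis: the fraction of conflict outbreaks
# that fall within a tolerance window after a climate disaster.
# Event years below are invented, not the study's data.
def coincidence_rate(conflicts, disasters, window=1):
    """conflicts, disasters: lists of event years (illustrative)."""
    hits = sum(
        1 for c in conflicts
        if any(0 <= c - d <= window for d in disasters)
    )
    return hits / len(conflicts)

conflict_years = [1981, 1985, 1992, 1999, 2004]
disaster_years = [1984, 1992, 2003]
rate = coincidence_rate(conflict_years, disaster_years)  # 3 of 5
```

The real analysis compares such rates against what chance co-occurrence would produce; a raw coincidence rate alone does not establish causation, as the authors themselves caution.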

Tuesday, August 30, 2016

Our self and our temporo-parietal junction

Eddy reviews the temporo-parietal junction (TPJ), a region of our brain that appears to be central to our sense of self and other:

Highlights
•Existing literature places the TPJ at the interface between mind and matter. 
•The right TPJ is critical for the control of self and other representations. 
•Dysfunction of right TPJ may therefore compromise our sense of self. 
•Disintegration of the self may in turn underpin various neuropsychiatric symptoms.
Abstract
The temporo-parietal junction (TPJ) is implicated in a variety of processes including multisensory integration, social cognition, sense of agency and stimulus-driven attention functions. Furthermore, manipulation of cortical excitation in this region can influence a diverse range of personal and interpersonal perceptions, from those involved in moral decision making to judgments about the location of the self in space. Synthesis of existing studies places the TPJ at the neural interface between mind and matter, where information about both mental and physical states is processed and integrated, contributing to self-other differentiation. After first summarising the functions of the TPJ according to existing literature, this narrative review aims to offer insight into the potential role of TPJ dysfunction in neuropsychiatric disorders, with a focus on the involvement of the right TPJ in controlling representations relating to the self and other. Problems with self-other distinctions may reflect or pose a vulnerability to the symptoms associated with Tourette syndrome, Schizophrenia, Autistic Spectrum Disorder and Obsessive Compulsive Disorder. Further study of this most fascinating neural region will therefore make a substantial contribution to our understanding of neuropsychiatric symptomatology and highlight significant opportunities for therapeutic impact.

Anatomical and functional subdivisions of the temporo-parietal junction. Top row: Functional MRI meta-analysis data...Showing forward inference data identified using the terms ‘social’ in red, and ‘attention’ in green, with overlap in yellow. Bottom row: Standard anatomical maps using Automated Anatomical Labelling. Showing right inferior parietal lobe (cyan), supramarginal gyrus (green), angular gyrus (deep blue), superior temporal gyrus (yellow) and middle temporal gyrus (red).

Monday, August 29, 2016

Psychological disruptions of our online lives.

I want to pass on clips from a review by Steiner-Adair in the Washington Post of Mary Aiken's book "The Cyber Effect," which describes how cyberspace is changing the way we think, feel, and behave:
She uses the science of human behavior to define cyberspace as a unique environment — an actual space — not simply a virtual extension of the pre-digital world and our characteristic behaviors there. Yes, we still hang out, connect, flirt, fight, learn, do business and do good online. But disinhibition and anonymity in cyberspace foster a particular pattern of impulsivity, careless or inflammatory expression, social cruelty, deception, exploitation — and vulnerability. Consider the unsettling phenomenon of ubiquitous victimology, in which “the criminals are well hidden but you aren’t.” That extends from the ordinary streets of online life to the deep, criminal underground where predators roam and perps hawk illicit wares from drugs, guns and hired assassins to trafficked humans and tools for terrorism. Forget reality TV, this is reality. And it’s a mouse click away from your living room — and your curious child.
Our real-world senses do not serve or protect us adequately in cyberspace, Aiken warns. As humans, we’re caught in the gap between evolution and a sea change in our environment. Our instincts for appraising mates, pals and trustworthy others are visceral, designed by nature for face-to-face, embodied interaction in a physical environment. They fail to pick up signals when we meet in the cyber-realm. Without those protective filters, and unaware that they’ve been disabled, we’re vulnerable in new ways. Connecting on line feels so easy and natural that we come to assume a newfound sameness and closeness with strangers.
This phenomenon of “online syndication,” as Aiken calls it — using the Internet to find others we think are like-minded and to normalize and socialize underlying tendencies — is a setup for easy disaster, as Aiken shows in her examples of people caught in cyber-crises: humiliating exchanges or exposure, debt, love affairs, fetishes, porn and gaming addictions, or the lure of criminal behavior. They fail to see the big disconnect between who they are in real life and who they are online, and the gap is fraught with consequences.
Aiken is concerned for children’s development, health and safety in a cyber-environment that replaces face-to-face interaction with online engagement and includes easy access to pornography and hyper-stimulating, addictive activity. The evidence is in, she says, and it shows conclusively that “there are windows in the formative years when very specific skills need to be learned. When those developmental windows close, a child may be developmentally or emotionally crippled for life.”
...the Internet “is clearly, unmistakably, and emphatically an adult environment. It simply wasn’t designed for children. So why are they there?” Indeed, why are we giving kids keys to the Internet? Who would ever think it’s a good idea for children to have miniature computers in their pockets that can take them anywhere online, unsupervised and unprotected? Aiken describes the lack of regulation, accountability, privacy and protection for children caught in this digital transition as a “crime against innocence.” It represents a massive seduction of parents and other adults who should know better, she argues. Her forensic perspective compels us all to demand better protection, reminding us that children ages 4 through 12 are the most vulnerable population on the Web.

Friday, August 26, 2016

A bit of nostalgia - Powers of 10

I just stumbled across a charming relic from my counterculture days in the 1970s, when I was watching whales and monarch butterflies at the Esalen Institute in Big Sur and learning gestalt, TA, Alexander, massage, and meditation techniques. At one point I signed up for transcendental meditation instruction, and this 1977 video was shown in the first session, after which the instructor said "That's all there is to it"........Sigh.....


Thursday, August 25, 2016

Alerting or Somnogenic light - pick your color

Bourgin and Hubbard summarize work by Pilorz et al.:
Light exerts profound effects on our physiology and behaviour, setting our biological clocks to the correct time and regulating when we are asleep and when we are awake. The photoreceptors mediating these responses include the rods and cones involved in vision, as well as a subset of photosensitive retinal ganglion cells (pRGCs) expressing the blue light-sensitive photopigment melanopsin. Previous studies have shown that mice lacking melanopsin show impaired sleep in response to light. However, other studies have shown that light increases glucocorticoid release—a response typically associated with stress. To address these contradictory findings, we studied the responses of mice to light of different colours. We found that blue light was aversive, delaying sleep onset and increasing glucocorticoid levels. By contrast, green light led to rapid sleep onset. These different behavioural effects appear to be driven by different neural pathways. Surprisingly, both responses were impaired in mice lacking melanopsin. These data show that light can promote either sleep or arousal. Moreover, they provide the first evidence that melanopsin directly mediates the effects of light on glucocorticoids. This work shows the extent to which light affects our physiology and has important implications for the design and use of artificial light sources.

Wednesday, August 24, 2016

Oxytocin - a molecular substrate for forming optimistic beliefs about the future

Ma et al. demonstrate a molecular basis for why people tend to incorporate desirable, but not undesirable, feedback into their beliefs:

Significance
People tend to incorporate desirable feedback into their beliefs but discount undesirable ones. Such optimistic updating has evolved as an advantageous mechanism for social adaptation and physical/mental health. Here, in three independent studies, we show that intranasally administered oxytocin (OT), an evolutionary ancient neuropeptide pivotal to social adaptation, augments optimistic belief updating by increasing updates and learning of desirable feedback but impairing updates of undesirable feedback. Moreover, the OT-impaired updating of undesirable feedback is more salient in individuals with high, rather than with low, depression or anxiety traits. OT also increases second-order confidence judgment after desirable feedback. These findings reveal a molecular substrate underlying the formation of optimistic beliefs about the future.
Abstract
Humans update their beliefs upon feedback and, accordingly, modify their behaviors to adapt to the complex, changing social environment. However, people tend to incorporate desirable (better than expected) feedback into their beliefs but to discount undesirable (worse than expected) feedback. Such optimistic updating has evolved as an advantageous mechanism for social adaptation. Here, we examine the role of oxytocin (OT)―an evolutionary ancient neuropeptide pivotal for social adaptation―in belief updating upon desirable and undesirable feedback in three studies (n = 320). Using a double-blind, placebo-controlled between-subjects design, we show that intranasally administered OT (IN-OT) augments optimistic belief updating by facilitating updates of desirable feedback but impairing updates of undesirable feedback. The IN-OT–induced impairment in belief updating upon undesirable feedback is more salient in individuals with high, rather than with low, depression or anxiety traits. IN-OT selectively enhances learning rate (the strength of association between estimation error and subsequent update) of desirable feedback. IN-OT also increases participants’ confidence in their estimates after receiving desirable but not undesirable feedback, and the OT effect on confidence updating upon desirable feedback mediates the effect of IN-OT on optimistic belief updating. Our findings reveal distinct functional roles of OT in updating the first-order estimation and second-order confidence judgment in response to desirable and undesirable feedback, suggesting a molecular substrate for optimistic belief updating.

Tuesday, August 23, 2016

Slow motion increases perceived intent.

The abstract from interesting work by Caruso et al.:
To determine the appropriate punishment for a harmful action, people must often make inferences about the transgressor’s intent. In courtrooms and popular media, such inferences increasingly rely on video evidence, which is often played in “slow motion.” Four experiments (n = 1,610) involving real surveillance footage from a murder or broadcast replays of violent contact in professional football demonstrate that viewing an action in slow motion, compared with regular speed, can cause viewers to perceive an action as more intentional. This slow motion intentionality bias occurred, in part, because slow motion video caused participants to feel like the actor had more time to act, even when they knew how much clock time had actually elapsed. Four additional experiments (n = 2,737) reveal that allowing viewers to see both regular speed and slow motion replay mitigates the bias, but does not eliminate it. We conclude that an empirical understanding of the effect of slow motion on mental state attribution should inform the life-or-death decisions that are currently based on tacit assumptions about the objectivity of human perception.

Monday, August 22, 2016

Lifespan changes in brain and cognition - early life sets the stage.

Walhovd et al. present a fascinating study on the origins of lifespan changes in brain and cognition, defining an extensive cortical region in which surface area relates positively to general cognitive ability (GCA) during development. They find that prefrontal and medial and posterolateral temporal clusters in particular relate most strongly to GCA:

Significance
Brain and cognition change with age, with early gains and later declines. Attempts have been made to identify age-specific mechanisms, focusing on when and how declines begin in adults. However, even though general cognitive ability declines with age, there is a high stability in individuals’ cognitive ability relative to their same-age peers. Here we show that the relation between brain and cognition appears remarkably stable through the human lifespan. The cortical area change trajectories of higher and lower cognitive ability groups were parallel through life. Birth weight and parental education were identified as predictors, which provides novel evidence for stability in brain–cognition relationships throughout life, and indicates that early life factors impact brain and cognition for the entire life course.
Abstract
Neurodevelopmental origins of functional variation in older age are increasingly being acknowledged, but identification of how early factors impact human brain and cognition throughout life has remained challenging. Much focus has been on age-specific mechanisms affecting neural foundations of cognition and their change. In contrast to this approach, we tested whether cerebral correlates of general cognitive ability (GCA) in development could be extended to the rest of the lifespan, and whether early factors traceable to prenatal stages, such as birth weight and parental education, may exert continuous influences. We measured the area of the cerebral cortex in a longitudinal sample of 974 individuals aged 4–88 y (1,633 observations). An extensive cortical region was identified wherein area related positively to GCA in development. By tracking area of the cortical region identified in the child sample throughout the lifespan, we showed that the cortical change trajectories of higher and lower GCA groups were parallel through life, suggesting continued influences of early life factors. Birth weight and parental education obtained from the Norwegian Mother–Child Cohort study were identified as such early factors of possible life-long influence. Support for a genetic component was obtained in a separate twin sample (Vietnam Era Twin Study of Aging), but birth weight in the child sample had an effect on cortical area also when controlling for possible genetic differences in terms of parental height. Our results provide novel evidence for stability in brain–cognition relationships throughout life, and indicate that early life factors impact brain and cognition for the entire life course.
A summary graphic from the review by Jagust:


Conceptual model linking brain development, cognition, brain reserve, and late-life cognitive decline. Early life exposures and genes affect brain development, which in turn is related to GCA. GCA and education are related to one another, and provide brain reserve with advancing age. The graph demonstrates two individuals with high (blue) and low (red) brain reserve. Although the rate of their age-related cognitive decline is identical, the person with higher reserve crosses the threshold for dependence at an older age, thus experiencing a longer independent life. Early-life exposures, however, also confer indirect beneficial effects in addition to brain development, and these are likely to be salutary over the lifespan.

Friday, August 19, 2016

Neural link between affective understanding and interpersonal attraction

From Anders et al.:
Being able to comprehend another person’s intentions and emotions is essential for successful social interaction. However, it is currently unknown whether the human brain possesses a neural mechanism that attracts people to others whose mental states they can easily understand. Here we show that the degree to which a person feels attracted to another person can change while they observe the other’s affective behavior, and that these changes depend on the observer’s confidence in having correctly understood the other’s affective state. At the neural level, changes in interpersonal attraction were predicted by activity in the reward system of the observer’s brain. Importantly, these effects were specific to individual observer–target pairs and could not be explained by a target’s general attractiveness or expressivity. Furthermore, using multivoxel pattern analysis (MVPA), we found that neural activity in the reward system of the observer’s brain varied as a function of how well the target’s affective behavior matched the observer’s neural representation of the underlying affective state: The greater the match, the larger the brain’s intrinsic reward signal. Taken together, these findings provide evidence that reward-related neural activity during social encounters signals how well an individual’s “neural vocabulary” is suited to infer another person’s affective state, and that this intrinsic reward might be a source of changes in interpersonal attraction.

Thursday, August 18, 2016

Statistics versus judgment.

This interesting website, pointed out to me by a friend, offers to send you a daily gem of information, usually an excerpt from a published book...so, being a glutton for input streams, I signed up. I usually move on after glancing at a given day's topic, but I pass on this excerpt from Kahneman's "Thinking, Fast and Slow," after excerpting even further:
In his book Clinical vs. Statistical Prediction: A Theoretical Analysis and a Review of the Evidence, psychologist Paul Meehl gave evidence that statistical models almost always yield better predictions and diagnoses than the judgment of trained professionals. In fact, experts frequently give different answers when presented with the same information within a matter of a few minutes...Meehl's book provoked shock and disbelief among clinical psychologists, and the controversy it started has engendered a stream of research that is still flowing today, more than fifty years after its publication. The number of studies reporting comparisons of clinical and statistical predictions has increased to roughly two hundred, but the score in the contest between algorithms and humans has not changed. About 60% of the studies have shown significantly better accuracy for the algorithms. The other comparisons scored a draw in accuracy, but a tie is tantamount to a win for the statistical rules, which are normally much less expensive to use than expert judgment. No exception has been convincingly documented.
The range of predicted outcomes has expanded to cover medical variables such as the longevity of cancer patients, the length of hospital stays, the diagnosis of cardiac disease, and the susceptibility of babies to sudden infant death syndrome; economic measures such as the prospects of success for new businesses, the evaluation of credit risks by banks, and the future career satisfaction of workers; questions of interest to government agencies, including assessments of the suitability of foster parents, the odds of recidivism among juvenile offenders, and the likelihood of other forms of violent behavior; and miscellaneous outcomes such as the evaluation of scientific presentations, the winners of football games, and the future prices of Bordeaux wine. Each of these domains entails a significant degree of uncertainty and unpredictability. We describe them as 'low-validity environments.' In every case, the accuracy of experts was matched or exceeded by a simple algorithm.
Another reason for the inferiority of expert judgment is that humans are incorrigibly inconsistent in making summary judgments of complex information. When asked to evaluate the same information twice, they frequently give different answers. The extent of the inconsistency is often a matter of real concern. Experienced radiologists who evaluate chest X-rays as 'normal' or 'abnormal' contradict themselves 20% of the time when they see the same picture on separate occasions. A study of 101 independent auditors who were asked to evaluate the reliability of internal corporate audits revealed a similar degree of inconsistency. A review of 41 separate studies of the reliability of judgments made by auditors, pathologists, psychologists, organizational managers, and other professionals suggests that this level of inconsistency is typical, even when a case is reevaluated within a few minutes. Unreliable judgments cannot be valid predictors of anything.
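Kahneman's consistency point is easy to illustrate: a fixed formula returns the identical score every time it sees a case, while a judge whose weighting drifts between sessions does not. A small simulation (the weights, features, and noise level are invented for illustration):

```python
import random

# A fixed linear scoring rule is deterministic: the same case always
# gets the same score, unlike a simulated judge re-rating it later.
def linear_rule(features, weights=(0.5, 0.3, 0.2)):
    return sum(w * f for w, f in zip(weights, features))

def noisy_expert(features, rng):
    # Simulated judge: the same weighting plus session-to-session noise.
    return linear_rule(features) + rng.gauss(0, 0.5)

rng = random.Random(42)
case = (3.0, 1.0, 2.0)
first, second = linear_rule(case), linear_rule(case)
rule_consistent = (first == second)                  # always True
expert_scores = [noisy_expert(case, rng) for _ in range(2)]
```

Note this is why a tie in accuracy favors the formula: the rule's judgments are free and perfectly reproducible, while the expert's vary from one sitting to the next.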

Wednesday, August 17, 2016

How China is changing your internet.

Here is a fascinating piece done by the NYTimes on the parallel universe of the internet in China.


Tuesday, August 16, 2016

The long lives of fairy tales.

I pass on some clips from a review by Pagel of work by Da Silva and Tehrani suggesting that some common fairy tales can be traced back 7,000 years or more, long before written languages appeared.
The Indo-European language family is a collection of related languages that probably arose in Anatolia and is now spoken all over western Eurasia. Its modern descendants include the Celtic, Germanic and Italic or Romance languages of western Europe, the Slavic languages of Russia and much of the Balkans, and the Indo-Iranian languages including Persian, as well as Sanskrit and most of the languages of the Indian sub-continent.
Language evolves faster than genes and language is predominantly vertically transmitted. Similarities and differences among vocabulary items, then, play the same role for cultural phylogenies as genes do for species trees, and provide greater resolution over short timescales. The Indo-European language tree is one of the most carefully studied of these language phylogenies
With a phylogenetic tree in hand, the authors recorded the presence or absence of each of 275 fairy tales in fifty Indo-European languages...Of the 275 tales, the authors discarded 199 after performing two tests of horizontal transmission...This left a group of 76 tales for which vertical transmission over the course of Indo-European history was the dominant signal for the patterns of shared presence and absence among contemporary societies. Hänsel and Gretel didn’t make this cut, but Beauty and the Beast did.
Evolutionary statistical methods were then applied to calculate a probability that each of the tales was present at each of various major historical splitting points on the Indo-European language phylogeny, taking account of uncertainty both in the phylogeny and in the reconstructed state. Calculating the ancestral probabilities depends only upon the distribution of tales in the contemporary languages in combination with the phylogenetic tree and so neatly gets around the problem that few if any tales exist as ‘fossil’ texts...Fourteen of the 76 tales, including Beauty and the Beast, were assigned a 50% or greater chance of having been present in the common ancestor of the entire western branch of the Indo-European languages. ..
A further four of the fourteen tales — but not Beauty and the Beast — had a 50% or greater probability of being present at the root of the Indo-European tree. A proto-Indo-European origin for these four tales represents a probable age of over 7,000 years. The tale with the highest probability (87%) of being present at the root was The Smith and the Devil whose story of a smith selling his soul to the devil is echoed today in the modern story of Faust. The authors suggest that metal working technology — as implied by the presence of a smith — could have been available this long ago.
Considering all these notions might lead us to ask why more of the fairy tales did not appear right back at the Indo-European root, or to wonder whether some could go back even further. Perhaps some do. Flood myths appear in many of the world’s cultures, with some speculation that they date to the end of the last Ice Age, perhaps 15,000 to 20,000 years ago, when sea levels rose dramatically. If true, the western Bible story of Noah is just a comparatively recent hand-me-down.
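The ancestral-state reconstruction behind those probabilities can be illustrated with a toy calculation. This is not the authors' actual method (they used Bayesian phylogenetic software averaging over many trees); the sketch below applies the same underlying idea to a single two-language split under a symmetric two-state Markov model, with all tip states, branch lengths, and rates purely illustrative:

```python
import math

def transition_prob(i, j, t, rate=1.0):
    # Two-state symmetric Markov model (tale present/absent):
    # probability of being in state j after time t, starting in state i.
    same = 0.5 + 0.5 * math.exp(-2 * rate * t)
    return same if i == j else 1.0 - same

def ancestral_probs(tip_states, branch_lengths, rate=1.0):
    """Posterior probability of each state (0 = tale absent, 1 = present)
    at the common ancestor of the tips, assuming a flat prior."""
    likelihood = []
    for anc in (0, 1):
        l = 1.0
        for tip, t in zip(tip_states, branch_lengths):
            l *= transition_prob(anc, tip, t, rate)
        likelihood.append(l)
    total = sum(likelihood)
    return [l / total for l in likelihood]

# Two descendant languages that both preserve the tale,
# one on a long branch and one on a short branch:
p_absent, p_present = ancestral_probs(tip_states=[1, 1],
                                      branch_lengths=[0.3, 0.1])
```

With both descendants retaining the tale, the reconstruction favors presence at the ancestor; on a real tree the same likelihood computation is simply applied recursively from the tips toward the root.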

Monday, August 15, 2016

Brain changes during hypnosis

Jiang et al. do the most detailed analysis to date of brain changes that are distinctive to people undergoing hypnosis:
Hypnosis has proven clinical utility, yet changes in brain activity underlying the hypnotic state have not yet been fully identified. Previous research suggests that hypnosis is associated with decreased default mode network (DMN) activity and that high hypnotizability is associated with greater functional connectivity between the executive control network (ECN) and the salience network (SN). We used functional magnetic resonance imaging to investigate activity and functional connectivity among these three networks in hypnosis. We selected 57 of 545 healthy subjects with very high or low hypnotizability using two hypnotizability scales. All subjects underwent four conditions in the scanner: rest, memory retrieval, and two different hypnosis experiences guided by standard pre-recorded instructions in counterbalanced order. Seeds for the ECN, SN, and DMN were left and right dorsolateral prefrontal cortex, dorsal anterior cingulate cortex (dACC), and posterior cingulate cortex (PCC), respectively. During hypnosis there was reduced activity in the dACC, increased functional connectivity between the dorsolateral prefrontal cortex (DLPFC; ECN) and the insula in the SN, and reduced connectivity between the ECN (DLPFC) and the DMN (PCC). These changes in neural activity underlie the focused attention, enhanced somatic and emotional control, and lack of self-consciousness that characterize hypnosis.
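Seed-based functional connectivity of the kind reported here boils down to correlating a seed region's BOLD time series with the time series of other regions. A minimal sketch of that core computation (not the authors' pipeline; the signals and numbers below are invented for illustration):

```python
import numpy as np

def seed_connectivity(seed_ts, region_ts):
    """Pearson correlation between a seed region's time series (length T)
    and each column of region_ts (T x n_regions)."""
    seed = (seed_ts - seed_ts.mean()) / seed_ts.std()
    regions = (region_ts - region_ts.mean(axis=0)) / region_ts.std(axis=0)
    return (regions * seed[:, None]).mean(axis=0)

rng = np.random.default_rng(0)
t = np.arange(200)
# A slow oscillation shared by the seed and one "coupled" region,
# plus independent noise; a second region is pure noise.
seed = np.sin(t / 10.0) + 0.5 * rng.standard_normal(200)
coupled = np.sin(t / 10.0) + 0.5 * rng.standard_normal(200)
uncoupled = rng.standard_normal(200)
r = seed_connectivity(seed, np.column_stack([coupled, uncoupled]))
```

The coupled region shows a strong positive correlation with the seed, while the independent region's correlation hovers near zero; real analyses add preprocessing (motion correction, nuisance regression, filtering) before this step.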

Friday, August 12, 2016

Why do people infer “ought” from “is”?

Tworek and Cimpian offer an interesting perspective, reporting experiments that illustrate how we ascribe intrinsic value to what is customary. I give the start of their introduction to set the context, and then their abstract:
In his dissent from the Supreme Court decision recognizing a federal constitutional right for people to marry a same-sex partner, Chief Justice Roberts noted that heterosexual marriage has been around “for millennia” in societies all over the world: “the Kalahari Bushmen and the Han Chinese, the Carthaginians and the Aztecs”. A possible reading of this remark is that we should take what is typical as a signpost for what is good—how things ought to be. Whatever the correct interpretation here, the tendency to move seamlessly from “is” to “ought” is a mainstay of everyday reasoning. However, the validity of such “is”-to-“ought” inferences (or ought inferences) is at best uncertain. The mere existence of a pattern of behavior does not, by itself, reveal that the behavior is good. For instance, slavery and child labor were common throughout history, and still are in some parts of the world, yet it does not follow that people ought to engage in these practices. Why, then, do people frequently draw ought inferences and find them persuasive?
Abstract
People tend to judge what is typical as also good and appropriate—as what ought to be. What accounts for the prevalence of these judgments, given that their validity is at best uncertain? We hypothesized that the tendency to reason from “is” to “ought” is due in part to a systematic bias in people’s (nonmoral) explanations, whereby regularities (e.g., giving roses on Valentine’s Day) are explained predominantly via inherent or intrinsic facts (e.g., roses are beautiful). In turn, these inherence-biased explanations lead to value-laden downstream conclusions (e.g., it is good to give roses). Consistent with this proposal, results from five studies (N = 629 children and adults) suggested that, from an early age, the bias toward inherence in explanations fosters inferences that imbue observed reality with value. Given that explanations fundamentally determine how people understand the world, the bias toward inherence in these judgments is likely to exert substantial influence over sociomoral understanding.

Thursday, August 11, 2016

How our brain and visceral monitoring encode the ‘self’

Babo-Rebelo et al. show that two seemingly distinct roles of the default brain network (DN), in self-related cognition on the one hand, and in the monitoring of bodily signals for autonomic regulation on the other, are functionally coupled. They do this by testing whether the amplitudes of heartbeat-evoked responses (HERs) during thoughts systematically covary with the thoughts' self-relatedness, and whether this mechanism engages the DN. They employ two scales of self-relatedness. The “Me” scale describes whether the content of the thought is oriented toward oneself or toward an external object, event, or person. The “I” scale describes the engagement of the participant as the protagonist or agent in the thought. Here is their abstract:
The default network (DN) has been consistently associated with self-related cognition, but also with bodily state monitoring and autonomic regulation. We hypothesized that these two seemingly disparate functional roles of the DN are functionally coupled, in line with theories proposing that selfhood is grounded in the neural monitoring of internal organs, such as the heart. We measured with magnetoencephalography neural responses evoked by heartbeats while human participants freely mind-wandered. When interrupted by a visual stimulus at random intervals, participants scored the self-relatedness of the interrupted thought. They evaluated their involvement as the first-person perspective subject or agent in the thought (“I”), and on another scale to what degree they were thinking about themselves (“Me”). During the interrupted thought, neural responses to heartbeats in two regions of the DN, the ventral precuneus and the ventromedial prefrontal cortex, covaried, respectively, with the “I” and the “Me” dimensions of the self, even at the single-trial level. No covariation between self-relatedness and peripheral autonomic measures (heart rate, heart rate variability, pupil diameter, electrodermal activity, respiration rate, and phase) or alpha power was observed. Our results reveal a direct link between selfhood and neural responses to heartbeats in the DN and thus directly support theories grounding selfhood in the neural monitoring of visceral inputs. More generally, the tight functional coupling between self-related processing and cardiac monitoring observed here implies that, even in the absence of measured changes in peripheral bodily measures, physiological and cognitive functions have to be considered jointly in the DN.

Wednesday, August 10, 2016

Leave the kids alone! A cognitive case for un-parenting

I want to pass on some clips from the text of a recent review by Glausiusz of Alison Gopnik's book on child-rearing, "The Gardener and the Carpenter," and also from the NYTimes pieces by Gopnik summarizing its main arguments. (Her bottom line: "We don’t have to make children learn, we just have to let them learn.") Clips from the book review:
An Amazon trawl for “parenting books” last month offered up 186,262 results...This is less genre than tsunami...Yet, as Alison Gopnik notes...the word parenting became common only in the 1970s, rising in popularity as traditional sources of wisdom about child-rearing — large extended families, for example — fell away...Gopnik...argues that the message of this massive modern industry is misguided.
It assumes that the 'right' parenting techniques or expertise will sculpt your child into a successful adult. But using a scheme to shape material into a product is the modus operandi of a carpenter, whose job it is to make the chair steady or the door true. There is very little empirical evidence, Gopnik says, that “small variations” in what parents do (such as whether they sleep-train) “have reliable and predictable long-term effects on who those children become”. Raising and caring for children is more like tending a garden: it involves “a lot of exhausted digging and wallowing in manure” to create a safe, nurturing space in which innovation, adaptability and resilience can thrive. Her approach focuses on helping children to find their own way, even if it isn't one you'd choose for them. The lengthy childhood of our species gives kids ample opportunity to explore, exploit and experiment before they are turned out into an unpredictable world.
Clips from Gopnik:
It’s not just that young children don’t need to be taught in order to learn. In fact, studies show that explicit instruction, the sort of teaching that goes with school and “parenting,” can be limiting. When children think they are being taught, they are much more likely to simply reproduce what the adult does, instead of creating something new.
My lab tried a different version of the experiment with the complicated toy. This time, though, the experimenter acted like a teacher. She said, “I’m going to show you how my toy works,” instead of “I wonder how this toy works.” The children imitated exactly what she did, and didn’t come up with their own solutions.
The children seem to work out, quite rationally, that if a teacher shows them one particular way to do something, that must be the right technique, and there’s no point in trying something new. But as a result, the kind of teaching that comes with schools and “parenting” pushes children toward imitation and away from innovation.
There is a deep irony here. Parents and policy makers care about teaching because they recognize that learning is increasingly important in an information age. But the new information economy, as opposed to the older industrial one, demands more innovation and less imitation, more creativity and less conformity.
In fact, children’s naturally evolved learning techniques are better suited to that sort of challenge than the teaching methods of the past two centuries.