Friday, September 16, 2016

Predicting false memories with fMRI

Chadwick et al. find that the apex of the ventral processing stream in the brain's temporal pole (TP) contains partially overlapping neural representations of related concepts, and that the extent of this neural overlap directly reflects the degree of semantic similarity between the concepts. Furthermore, the neural overlap between sets of related words predicts the likelihood of making a false-memory error. (One could wonder whether further development of work of this sort might make it possible to perform an fMRI evaluation of an eyewitness in an important trial to determine whether their testimony is more or less likely to be correct.)

Significance
False memories can arise in daily life through a mixture of factors, including misinformation and prior conceptual knowledge. This can have serious consequences in settings, such as legal eyewitness testimony, which depend on the accuracy of memory. We investigated the brain basis of false memory with fMRI, and found that patterns of activity in the temporal pole region of the brain can predict false memories. Furthermore, we show that each individual has unique patterns of brain activation that can predict their own idiosyncratic set of false-memory errors. Together, these results suggest that the temporal pole may be responsible for the conceptual component of illusory memories.
Abstract
Recent advances in neuroscience have given us unprecedented insight into the neural mechanisms of false memory, showing that artificial memories can be inserted into the memory cells of the hippocampus in a way that is indistinguishable from true memories. However, this alone is not enough to explain how false memories can arise naturally in the course of our daily lives. Cognitive psychology has demonstrated that many instances of false memory, both in the laboratory and the real world, can be attributed to semantic interference. Whereas previous studies have found that a diverse set of regions show some involvement in semantic false memory, none have revealed the nature of the semantic representations underpinning the phenomenon. Here we use fMRI with representational similarity analysis to search for a neural code consistent with semantic false memory. We find clear evidence that false memories emerge from a similarity-based neural code in the temporal pole, a region that has been called the “semantic hub” of the brain. We further show that each individual has a partially unique semantic code within the temporal pole, and this unique code can predict idiosyncratic patterns of memory errors. Finally, we show that the same neural code can also predict variation in true-memory performance, consistent with an adaptive perspective on false memory. Taken together, our findings reveal the underlying structure of neural representations of semantic knowledge, and how this semantic structure can both enhance and distort our memories.
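The core logic of representational similarity analysis, as used here, can be sketched in a few lines: the correlation between the multivoxel activity patterns evoked by two concepts serves as a proxy for their neural overlap. All data below are invented for illustration; the real analysis operates on fMRI voxel patterns, not toy vectors.

```python
# Sketch of representational similarity analysis (RSA): pattern
# correlation between two concepts' voxel responses is taken as a
# measure of their neural overlap. All values here are made up.

def pearson(x, y):
    """Pearson correlation between two equal-length vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical voxel patterns evoked by three words in the temporal pole.
pattern = {
    "doctor": [0.9, 0.1, 0.8, 0.2, 0.7],
    "nurse":  [0.8, 0.2, 0.9, 0.1, 0.6],   # semantically close to "doctor"
    "apple":  [0.1, 0.9, 0.2, 0.8, 0.1],   # semantically distant
}

overlap_related = pearson(pattern["doctor"], pattern["nurse"])
overlap_unrelated = pearson(pattern["doctor"], pattern["apple"])

# The paper's claim, in these terms: the greater the overlap, the more
# likely one word is to be falsely remembered in place of its neighbor.
print(overlap_related > overlap_unrelated)  # True for this toy data
```

In the study, the equivalent similarity structure was computed across many word sets and related to each participant's false-memory errors.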

Thursday, September 15, 2016

Self-regulation via neural simulation

A fascinating study from Gilead et al.:

Significance
As Harper Lee tells us in To Kill a Mockingbird, “You never really understand a person until you consider things from his point of view, until you climb in his skin and walk around in it.” Classic theories in social psychology argue that this purported process of social simulation provides the foundations for self-regulation. In light of this, we investigated the neural processes whereby humans may regulate their affective responses to an event by simulating the way others would respond to it. Our results suggest that during perspective-taking, behavioral and neural signatures of negative affect indeed mimic the presumed affective state of others. Furthermore, the anterior medial prefrontal cortex—a region implicated in mental state inference—may orchestrate this affective simulation process.
Abstract
Can taking the perspective of other people modify our own affective responses to stimuli? To address this question, we examined the neurobiological mechanisms supporting the ability to take another person’s perspective and thereby emotionally experience the world as they would. We measured participants’ neural activity as they attempted to predict the emotional responses of two individuals that differed in terms of their proneness to experience negative affect. Results showed that behavioral and neural signatures of negative affect (amygdala activity and a distributed multivoxel pattern reflecting affective negativity) simulated the presumed affective state of the target person. Furthermore, the anterior medial prefrontal cortex (mPFC)—a region implicated in mental state inference—exhibited a perspective-dependent pattern of connectivity with the amygdala, and the multivoxel pattern of activity within the mPFC differentiated between the two targets. We discuss the implications of these findings for research on perspective-taking and self-regulation.

Wednesday, September 14, 2016

A psychological mechanism to explain why childhood adversity diminishes adult health?

A large number of studies have by now shown that harsh social and physical environments early in life are associated with a substantial increase in the risk of chronic illnesses, such as heart disease, diabetes, and some forms of cancer. It is generally assumed that the hypothalamic-pituitary-adrenal (HPA) axis is an essential biological intermediary of these poor health outcomes in adulthood. Zilioli et al. suggest that a lowered sense of self-worth is the psychological mechanism that persists into adulthood to alter stress physiology. Their abstract:
Childhood adversity is associated with poor health outcomes in adulthood; the hypothalamic-pituitary-adrenal (HPA) axis has been proposed as a crucial biological intermediary of these long-term effects. Here, we tested whether childhood adversity was associated with diurnal cortisol parameters and whether this link was partially explained by self-esteem. In both adults and youths, childhood adversity was associated with lower levels of cortisol at awakening, and this association was partially driven by low self-esteem. Further, we found a significant indirect pathway through which greater adversity during childhood was linked to a flatter cortisol slope via self-esteem. Finally, youths who had a caregiver with high self-esteem experienced a steeper decline in cortisol throughout the day compared with youths whose caregiver reported low self-esteem. We conclude that self-esteem is a plausible psychological mechanism through which childhood adversity may get embedded in the activity of the HPA axis across the life span.
And, a clip from their discussion, noting limits to the interpretation of the correlations they observe:
These findings suggest that one’s sense of self-worth might act as a proximal psychological mechanism through which childhood adversity gets embedded in human stress physiology. Specifically, higher self-esteem was associated with a steeper (i.e., healthier) cortisol decline during the day, whereas low self-esteem was associated with a flatter cortisol slope. Depression and neuroticism were tested as alternative pathways linking childhood adversity to cortisol secretion and were found not to be significant, which suggests that the indirect effect was specific to self-esteem. Nevertheless, it is plausible that other psychological pathways exist that might carry the effects of childhood adversity across the life span. For example, attachment security, a potential antecedent of self-esteem that forms during childhood, would be a strong candidate for playing such a role. Unfortunately, this construct was not assessed in our studies, but we hope that future work will test this hypothesis.
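The diurnal cortisol slope at the center of these findings is just a regression of cortisol level on time of day: a steeper decline from the morning peak is the "healthier" profile. The sketch below uses invented sampling times and values to show the computation.

```python
# Minimal sketch of the diurnal cortisol slope measure. Sampling times
# and cortisol values are hypothetical illustrations, not study data.

def slope(times, values):
    """Least-squares slope of values regressed on times (change per hour)."""
    n = len(times)
    mt, mv = sum(times) / n, sum(values) / n
    num = sum((t - mt) * (v - mv) for t, v in zip(times, values))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

hours = [7, 12, 17, 22]           # sampling times across the day

healthy = [15.0, 9.0, 5.0, 2.0]   # high awakening level, steep decline
flat    = [8.0, 7.5, 7.0, 6.5]    # lower awakening level, flatter slope

# A steeper (more negative) slope is the healthier profile; childhood
# adversity and low self-esteem were associated with the flatter one.
print(slope(hours, healthy) < slope(hours, flat))  # True: steeper decline
```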

Tuesday, September 13, 2016

The ecstasy of speed - or leisure?

Because I so frequently feel overwhelmed by input streams of chunks of information, I wonder how readers of this blog manage to find time to attend to its contents. (I am gratified that so many seem to do so.) Thoughts like this made me pause over Maria Popova's recent essay on our anxiety about time. I want to pass on a few clips, and recommend that you read all of it. She quotes extensively from James Gleick's 2000 book "Faster: The Acceleration of Just About Everything," and begins by noting a 1918 Bertrand Russell quote: “both in thought and in feeling, even though time be real, to realise the unimportance of time is the gate of wisdom.”
Half a century after German philosopher Josef Pieper argued that leisure is the basis of culture and the root of human dignity, Gleick writes:
We are in a rush. We are making haste. A compression of time characterizes the life of the century....We have a word for free time: leisure. Leisure is time off the books, off the job, off the clock. If we save time, we commonly believe we are saving it for our leisure. We know that leisure is really a state of mind, but no dictionary can define it without reference to passing time. It is unrestricted time, unemployed time, unoccupied time. Or is it? Unoccupied time is vanishing. The leisure industries (an oxymoron maybe, but no contradiction) fill time, as groundwater fills a sinkhole. The very variety of experience attacks our leisure as it attempts to satiate us. We work for our amusement...Sociologists in several countries have found that increasing wealth and increasing education bring a sense of tension about time. We believe that we possess too little of it: that is a myth we now live by.
To fully appreciate Gleick’s insightful prescience, it behooves us to remember that he is writing long before the social web as we know it, before the conspicuous consumption of “content” became the currency of the BuzzMalnourishment industrial complex, before the timelines of Twitter and Facebook came to dominate our record and experience of time. (Prescience, of course, is a form of time travel — perhaps our only nonfictional way to voyage into the future.) Gleick writes:
We live in the buzz. We wish to live intensely, and we wonder about the consequences — whether, perhaps, we face the biological dilemma of the waterflea, whose heart beats faster as the temperature rises. This creature lives almost four months at 46 degrees Fahrenheit but less than one month at 82 degrees...Yet we have made our choices and are still making them. We humans have chosen speed and we thrive on it — more than we generally admit. Our ability to work fast and play fast gives us power. It thrills us… No wonder we call sudden exhilaration a rush.
Gleick considers what our units of time reveal about our units of thought:
We have reached the epoch of the nanosecond. This is the heyday of speed. “Speed is the form of ecstasy the technical revolution has bestowed on man,” laments the Czech novelist Milan Kundera, suggesting by ecstasy a state of simultaneous freedom and imprisonment… That is our condition, a culmination of millennia of evolution in human societies, technologies, and habits of mind.
The more I experience and read about the winding up and acceleration of our lives (think of the rate and omnipresence of the current presidential campaign!), the more I realize the importance of rediscovering the sanity of leisure and quiet spaces.

Monday, September 12, 2016

Mind and Body - A neural substrate of psychosomatic illness

We all have our "hot buttons" - events or issues that can trigger an acute stress response as our adrenal medulla releases adrenaline, causing heart rate increases, sweating, pupil dilation, etc. Dum et al. use a clever tracer technique to show neural connections between the adrenal medulla and higher cortical centers that might exert a 'top-down' cognitive control of this arousal:

Significance
How does the “mind” (brain) influence the “body” (internal organs)? We identified key areas in the primate cerebral cortex that are linked through multisynaptic connections to the adrenal medulla. The most substantial influence originates from a broad network of motor areas that are involved in all aspects of skeletomotor control from response selection to motor preparation and movement execution. A smaller influence originates from a network in medial prefrontal cortex that is involved in the regulation of cognition and emotion. Thus, cortical areas involved in the control of movement, cognition, and affect are potential sources of central commands to influence sympathetic arousal. These results provide an anatomical basis for psychosomatic illness where mental states can alter organ function.
Abstract
Modern medicine has generally viewed the concept of “psychosomatic” disease with suspicion. This view arose partly because no neural networks were known for the mind, conceptually associated with the cerebral cortex, to influence autonomic and endocrine systems that control internal organs. Here, we used transneuronal transport of rabies virus to identify the areas of the primate cerebral cortex that communicate through multisynaptic connections with a major sympathetic effector, the adrenal medulla. We demonstrate that two broad networks in the cerebral cortex have access to the adrenal medulla. The larger network includes all of the cortical motor areas in the frontal lobe and portions of somatosensory cortex. A major component of this network originates from the supplementary motor area and the cingulate motor areas on the medial wall of the hemisphere. These cortical areas are involved in all aspects of skeletomotor control from response selection to motor preparation and movement execution. The second, smaller network originates in regions of medial prefrontal cortex, including a major contribution from pregenual and subgenual regions of anterior cingulate cortex. These cortical areas are involved in higher-order aspects of cognition and affect. These results indicate that specific multisynaptic circuits exist to link movement, cognition, and affect to the function of the adrenal medulla. This circuitry may mediate the effects of internal states like chronic stress and depression on organ function and, thus, provide a concrete neural substrate for some psychosomatic illness.

Friday, September 09, 2016

Want to predict a group’s social standing? Get a hormonal profile.

The analysis of a group's social standing is usually attempted by examining demographic or psychological characteristics of group members. Akinola et al. suggest that the collective hormonal profile of the group can be equally predictive, providing a neurobiological perspective on the factors that determine who rises to the top across, not just within, social hierarchies:

Significance
Past research has focused primarily on demographic and psychological characteristics of group members without taking into consideration the biological make-up of groups. Here we introduce a different construct—a group’s collective hormonal profile—and find that a group’s biological profile predicts its standing across groups and that the particular profile supports a dual-hormone hypothesis. Groups with a collective hormonal profile characterized by high testosterone and low cortisol exhibit the highest performance. The current work provides a neurobiological perspective on factors determining group behavior and performance that are ripe for further exploration.
Abstract
Prior research has shown that an individual’s hormonal profile can influence the individual’s social standing within a group. We introduce a different construct—a collective hormonal profile—which describes a group’s hormonal make-up. We test whether a group’s collective hormonal profile is related to its performance. Analysis of 370 individuals randomly assigned to work in 74 groups of three to six individuals revealed that group-level concentrations of testosterone and cortisol interact to predict a group’s standing across groups. Groups with a collective hormonal profile characterized by high testosterone and low cortisol exhibited the highest performance. These collective hormonal level results remained reliable when controlling for personality traits and group-level variability in hormones. These findings support the hypothesis that groups with a biological propensity toward status pursuit (high testosterone) coupled with reduced stress-axis activity (low cortisol) engage in profit-maximizing decision-making. The current work extends the dual-hormone hypothesis to the collective level and provides a neurobiological perspective on the factors that determine who rises to the top across, not just within, social hierarchies.
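The dual-hormone hypothesis is at heart an interaction claim: testosterone's association with status pursuit and performance is moderated by cortisol. A toy linear model makes the shape of the claim concrete; the coefficients below are invented for illustration, not fitted values from the study.

```python
# Toy model of the dual-hormone hypothesis: performance depends on a
# testosterone x cortisol interaction. Coefficients are invented.

def predicted_performance(t, c, b_t=1.0, b_c=-0.5, b_tc=-1.5):
    """Linear model with an interaction term; inputs are z-scored hormones."""
    return b_t * t + b_c * c + b_tc * t * c

# Standardized group-level hormone profiles (z-scores).
profiles = {
    "high T / low C":  (1.0, -1.0),
    "high T / high C": (1.0, 1.0),
    "low T / low C":   (-1.0, -1.0),
    "low T / high C":  (-1.0, 1.0),
}

scores = {name: predicted_performance(t, c) for name, (t, c) in profiles.items()}
best = max(scores, key=scores.get)
print(best)  # "high T / low C" under these illustrative coefficients
```

The negative interaction coefficient is what encodes the hypothesis: high testosterone predicts high performance only when cortisol is low.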

Thursday, September 08, 2016

Reason is not required for a life of meaning.

Robert Burton, former neurology chief at UCSF and a neuroscience author, has contributed an excellent short essay to the NYTimes philosophy series The Stone. A few clips:
Few would disagree with two age-old truisms: We should strive to shape our lives with reason, and a central prerequisite for the good life is a personal sense of meaning...Any philosophical approach to values and purpose must acknowledge this fundamental neurological reality: a visceral sense of meaning in one’s life is an involuntary mental state that, like joy or disgust, is independent from and resistant to the best of arguments...Anyone who has experienced a bout of spontaneous depression knows the despair of feeling that nothing in life is worth pursuing and that no argument, no matter how inspired, can fill the void. Similarly, we are all familiar with the countless narratives of religious figures “losing their way” despite retaining their formal beliefs.
As neuroscience attempts to pound away at the idea of pure rationality and underscore the primacy of subliminal mental activity, I am increasingly drawn to the metaphor of idiosyncratic mental taste buds. From genetic factors (a single gene determines whether we find brussels sprouts bitter or sweet), to the cultural — considering fried grasshoppers and grilled monkey brains as delicacies — taste isn’t a matter of the best set of arguments...If thoughts, like foods, come in a dazzling variety of flavors, and personal taste trumps reason, philosophy — which relies most heavily on reason, and aims to foster the acquisition of objective knowledge — is in a bind.
Though we don’t know how thoughts are produced by the brain, it is hard to imagine having a thought unaccompanied by some associated mental state. We experience a thought as pleasing, revolting, correct, incorrect, obvious, stupid, brilliant, etc. Though integral to our thoughts, these qualifiers arise out of different brain mechanisms from those that produce the raw thought. As examples, feelings of disgust, empathy and knowing arise from different areas of brain and can be provoked de novo in volunteer subjects via electrical stimulation even when the subjects are unaware of having any concomitant thought at all. This chicken-and-egg relationship between feelings and thought can readily be seen in how we make moral judgments...The psychologist Jonathan Haidt and others have shown that our moral stances strongly correlate with the degree of activation of those brain areas that generate a sense of disgust and revulsion. According to Haidt, reason provides an after-the-fact explanation for moral decisions that are preceded by inherently reflexive positive or negative feelings. Think about your stance on pedophilia or denying a kidney transplant to a serial killer.
After noting the work of Libet and others showing that our sense of agency is an illusion - our conscious decision to initiate an action arises well after our brains have already set that action in motion, strikingly so in tennis players and baseball batters - Burton suggests that:
It is unlikely that there is any fundamental difference in how the brain initiates thought and action. We learn the process of thinking incrementally, acquiring knowledge of language, logic, the external world and cultural norms and expectations just as we learn physical actions like talking, walking or playing the piano. If we conceptualize thought as a mental motor skill subject to the same temporal reorganization as high-speed sports, it’s hard to avoid the conclusion that the experience of free will (agency) and conscious rational deliberation are both biologically generated illusions.
What then are we to do with the concept of rationality? It would be a shame to get rid of a term useful in characterizing the clarity of a line of reasoning. Everyone understands that “being rational” implies trying to strip away biases and innate subjectivity in order to make the best possible decision. But what if the word rational leads us to scientifically unsound conclusions?
Going forward, the greatest challenge for philosophy will be to remain relevant while conceding that, like the rest of the animal kingdom, we are decision-making organisms rather than rational agents, and that our most logical conclusions about moral and ethical values can’t be scientifically verified nor guaranteed to pass the test of time. (The history of science should serve as a cautionary tale for anyone tempted to believe in the persistent truth of untestable ideas).
Even so, I would hate to discard such truisms as “know thyself” or “the unexamined life isn’t worth living.” Reason allows us new ways of seeing, just as close listening to a piece of music can reveal previously unheard melodies and rhythms or observing an ant hill can give us an unexpected appreciation of nature’s harmonies. These various forms of inquiry aren’t dependent upon logic and verification; they are modes of perception.

Wednesday, September 07, 2016

Brain network characteristics of highly intelligent people.

Schultz and Cole show that higher intelligence is associated with less task-related brain network reconfiguration:

SIGNIFICANCE STATEMENT
The brain's network configuration varies based on current task demands. For example, functional brain connections are organized in one way when one is resting quietly but in another way if one is asked to make a decision. We found that the efficiency of these updates in brain network organization is positively related to general intelligence, the ability to perform a wide variety of cognitively challenging tasks well. Specifically, we found that brain network configuration at rest was already closer to a wide variety of task configurations in intelligent individuals. This suggests that the ability to modify network connectivity efficiently when task demands change is a hallmark of high intelligence.
ABSTRACT
The human brain is able to exceed modern computers on multiple computational demands (e.g., language, planning) using a small fraction of the energy. The mystery of how the brain can be so efficient is compounded by recent evidence that all brain regions are constantly active as they interact in so-called resting-state networks (RSNs). To investigate the brain's ability to process complex cognitive demands efficiently, we compared functional connectivity (FC) during rest and multiple highly distinct tasks. We found previously that RSNs are present during a wide variety of tasks and that tasks only minimally modify FC patterns throughout the brain. Here, we tested the hypothesis that, although subtle, these task-evoked FC updates from rest nonetheless contribute strongly to behavioral performance. One might expect that larger changes in FC reflect optimization of networks for the task at hand, improving behavioral performance. Alternatively, smaller changes in FC could reflect optimization for efficient (i.e., small) network updates, reducing processing demands to improve behavioral performance. We found across three task domains that high-performing individuals exhibited more efficient brain connectivity updates in the form of smaller changes in functional network architecture between rest and task. These smaller changes suggest that individuals with an optimized intrinsic network configuration for domain-general task performance experience more efficient network updates generally. Confirming this, network update efficiency correlated with general intelligence. The brain's reconfiguration efficiency therefore appears to be a key feature contributing to both its network dynamics and general cognitive ability.
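The "reconfiguration efficiency" measure can be sketched simply: correlate the functional-connectivity (FC) pattern at rest with the FC pattern during a task, and a higher rest-task similarity means a smaller network update. The tiny 3-region matrices below are invented examples, far smaller than real whole-brain FC matrices.

```python
# Sketch of rest-to-task network reconfiguration: pattern similarity
# between two functional-connectivity matrices. Matrices are invented.

def upper_triangle(m):
    """Flatten the above-diagonal entries of a square matrix."""
    n = len(m)
    return [m[i][j] for i in range(n) for j in range(i + 1, n)]

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    dx = sum((a - mx) ** 2 for a in x) ** 0.5
    dy = sum((b - my) ** 2 for b in y) ** 0.5
    return num / (dx * dy)

rest      = [[1.0, 0.8, 0.3], [0.8, 1.0, 0.2], [0.3, 0.2, 1.0]]
task_near = [[1.0, 0.7, 0.4], [0.7, 1.0, 0.3], [0.4, 0.3, 1.0]]  # small update
task_far  = [[1.0, 0.1, 0.9], [0.1, 1.0, 0.8], [0.9, 0.8, 1.0]]  # large update

sim_near = pearson(upper_triangle(rest), upper_triangle(task_near))
sim_far = pearson(upper_triangle(rest), upper_triangle(task_far))

# The finding, in these terms: individuals whose task FC stays close to
# their resting FC (like sim_near) tended to score higher on general
# intelligence measures.
print(sim_near > sim_far)
```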

Tuesday, September 06, 2016

Feeling Good? Do something unpleasant.

A curious piece from Taquet et al.:
Most theories of motivation have highlighted that human behavior is guided by the hedonic principle, according to which our choices of daily activities aim to minimize negative affect and maximize positive affect. However, it is not clear how to reconcile this idea with the fact that people routinely engage in unpleasant yet necessary activities. To address this issue, we monitored in real time the activities and moods of over 28,000 people across an average of 27 d using a multiplatform smartphone application. We found that people’s choices of activities followed a hedonic flexibility principle. Specifically, people were more likely to engage in mood-increasing activities (e.g., play sports) when they felt bad, and to engage in useful but mood-decreasing activities (e.g., housework) when they felt good. These findings clarify how hedonic considerations shape human behavior. They may explain how humans overcome the allure of short-term gains in happiness to maximize long-term welfare.

Monday, September 05, 2016

Do your friends really like you?

I found this article by Murphy, pointing to work by Almaatouq et al., to align with my recent experience of having two long-term friends (or so I thought) simply stop responding to emails about getting together. And, from the other direction, being described as "our good friend" by a couple I didn't particularly like. It turns out that studies show only about half of perceived friendships are mutual. The Almaatouq et al. study:
...analyzed friendship ties among 84 subjects (ages 23 to 38) in a business management class by asking them to rank one another on a five-point continuum of closeness from “I don’t know this person” to “One of my best friends.” The feelings were mutual 53 percent of the time while the expectation of reciprocity was pegged at 94 percent. This is consistent with data from several other friendship studies conducted over the past decade, encompassing more than 92,000 subjects, in which the reciprocity rates ranged from 34 percent to 53 percent.
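A reciprocity rate like the 53 percent figure falls directly out of a directed "I consider X a friend" relation: count the named ties, then count how many are returned. The little network below is invented to show the arithmetic.

```python
# Computing a friendship reciprocity rate from directed ties.
# The network below is invented for illustration.

claims = {          # person -> set of people they name as friends
    "ana": {"ben", "cal"},
    "ben": {"ana"},
    "cal": {"dia"},
    "dia": {"cal"},
}

named = [(a, b) for a, friends in claims.items() for b in friends]
mutual = [(a, b) for a, b in named if a in claims.get(b, set())]

reciprocity = len(mutual) / len(named)
print(reciprocity)  # 0.8: four of the five directed ties are returned
```

In the study the relation was graded (a five-point closeness scale) rather than binary, but the mutuality count works the same way once a closeness threshold is chosen.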
Clips from the last portion of Murphy's article:
Because time is limited, so, too, is the number of friends you can have, according to the work of the British evolutionary psychologist Robin I.M. Dunbar. He describes layers of friendship, where the topmost layer consists of only one or two people, say a spouse and best friend with whom you are most intimate and interact daily. The next layer can accommodate at most four people for whom you have great affinity, affection and concern and who require weekly attention to maintain. Out from there, the tiers contain more casual friends with whom you invest less time and tend to have a less profound and more tenuous connection. Without consistent contact, they easily fall into the realm of acquaintance.
...playing it safe by engaging in shallow, unfulfilling or nonreciprocal relationships has physical repercussions. Not only do the resulting feelings of loneliness and isolation increase the risk of death as much as smoking, alcoholism and obesity; you may also lose tone, or function, in the so-called smart vagus nerve, which brain researchers think allows us to be in intimate, supportive and reciprocal relationships in the first place...In the presence of a true friend...the smart or modulating aspect of the vagus nerve is what makes us feel at ease rather than on guard as when we are with a stranger or someone judgmental. It’s what enables us to feel O.K. about exposing the soft underbelly of our psyche and helps us stay engaged and present in times of conflict. Lacking authentic friendships, the smart vagus nerve is not exercised. It loses tone and one’s anxiety remains high, making abiding, deep connections difficult.

Friday, September 02, 2016

Growing Older, Getting Happier

A brief piece from Nicholas Bakalar in the NYTimes summarizing the recent paper by Thomas et al. (senior author Dilip Jeste):
Older people tend to be happier than younger people, and their happiness increases with age...Researchers contacted 1,546 people ages 21 to 99 via random telephone calls and found that older age was, not surprisingly, tied to declines in physical and cognitive function. But it was also associated with higher levels of overall satisfaction, happiness and well-being, and lower levels of anxiety, depression and stress. The older the person, the study found, the better his or her mental health tended to be.
The researchers used well-validated scales to assess mental health, although the study relied on self-reports and was a snapshot in time that did not follow an individual through a lifetime. Other studies have found similar results linking advancing age and higher levels of happiness.
The reasons for the effect remain unclear, but the senior author, Dr. Dilip V. Jeste, a professor of psychiatry at the University of California, San Diego, had some suggestions...“Brain studies show that the amygdala in older people responds less to stressful or negative images than in a younger person,” he said. “We become wise. Peer pressure loses its sting. Better decision-making, more control of emotions, doing things that are not just for yourself, knowing oneself better, being more studious and yet more decisive.”...“This is good news for young people, too,” he added. “You have something to look forward to.”
Here are the methods and results sections from the abstract:
Methods: Cross-sectional data were obtained from 1,546 individuals aged 21–100 years, selected using random digit dialing for the Successful AGing Evaluation (SAGE) study, a structured multicohort investigation that included telephone interviews and in-home surveys of community-based adults without dementia. Data were collected from 1/26/2010 to 10/07/2011 targeting participants aged 50–100 years and from 6/25/2012 to 7/15/2013 targeting participants aged 21–100 years with an emphasis on adding younger individuals. Data included self-report measures of physical health, measures of both positive and negative attributes of mental health, and a phone interview–based measure of cognition.
Results: Comparison of age cohorts using polynomial regression suggested a possible accelerated deterioration in physical and cognitive functioning, averaging 1.5 to 2 standard deviations over the adult lifespan. In contrast, there appeared to be a linear improvement of about 1 standard deviation in various attributes of mental health over the same life period.
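The polynomial-regression comparison of age cohorts can be sketched with numpy: a sizable quadratic coefficient signals the accelerating (curved) decline reported for physical and cognitive function, while a near-zero one signals the linear trend reported for mental well-being. The cohort values below are invented to mimic those two shapes, not the study's data.

```python
# Sketch of polynomial regression on age cohorts. The physical-function
# values are invented to show an accelerating decline; the well-being
# values are invented to show a steady linear rise.

import numpy as np

ages = np.array([25.0, 45.0, 65.0, 85.0])
physical = np.array([60.0, 58.0, 52.0, 40.0])   # accelerating decline
wellbeing = np.array([52.0, 55.0, 58.0, 61.0])  # steady improvement

# np.polyfit returns coefficients highest degree first: [quad, lin, const].
phys_quad, phys_lin, _ = np.polyfit(ages, physical, 2)
well_quad, well_lin, _ = np.polyfit(ages, wellbeing, 2)

print(abs(phys_quad) > abs(well_quad))  # True: curvature only in physical decline
print(well_lin > 0)                     # True: well-being rises with age
```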

Wednesday, August 31, 2016

Climate disasters act as threat multipliers in ethnic conflicts.

Schleussner et al. offer evidence consistent with a common assumption about the effects of climate disasters: that they drive people further apart rather than closer together:
Social and political tensions keep on fueling armed conflicts around the world. Although each conflict is the result of an individual context-specific mixture of interconnected factors, ethnicity appears to play a prominent and almost ubiquitous role in many of them. This overall state of affairs is likely to be exacerbated by anthropogenic climate change and in particular climate-related natural disasters. Ethnic divides might serve as predetermined conflict lines in case of rapidly emerging societal tensions arising from disruptive events like natural disasters. Here, we hypothesize that climate-related disaster occurrence enhances armed-conflict outbreak risk in ethnically fractionalized countries. Using event coincidence analysis, we test this hypothesis based on data on armed-conflict outbreaks and climate-related natural disasters for the period 1980–2010. Globally, we find a coincidence rate of 9% regarding armed-conflict outbreak and disaster occurrence such as heat waves or droughts. Our analysis also reveals that, during the period in question, about 23% of conflict outbreaks in ethnically highly fractionalized countries robustly coincide with climatic calamities. Although we do not report evidence that climate-related disasters act as direct triggers of armed conflicts, the disruptive nature of these events seems to play out in ethnically fractionalized societies in a particularly tragic way. This observation has important implications for future security policies as several of the world’s most conflict-prone regions, including North and Central Africa as well as Central Asia, are both exceptionally vulnerable to anthropogenic climate change and characterized by deep ethnic divides.
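Event coincidence analysis, the method named in the abstract, reduces to a windowed counting problem: what fraction of conflict outbreaks fall within some tolerance window after a climate disaster? The years below are invented, not the study's data.

```python
# Miniature event coincidence analysis: the fraction of conflict
# outbreaks preceded by a climate disaster within a tolerance window.
# All years are invented for illustration.

def coincidence_rate(conflicts, disasters, window=1):
    """Fraction of conflict years with a disaster at most `window` years before."""
    hits = sum(
        1 for c in conflicts
        if any(0 <= c - d <= window for d in disasters)
    )
    return hits / len(conflicts)

conflict_years = [1984, 1991, 1994, 2003, 2008]
disaster_years = [1983, 1990, 1999, 2008]

rate = coincidence_rate(conflict_years, disaster_years)
print(rate)  # 0.6: three of five outbreaks follow a disaster within a year
```

The real analysis additionally tests whether such a rate exceeds what chance coincidence would produce, and stratifies countries by ethnic fractionalization.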

Tuesday, August 30, 2016

Our self and our temporo-parietal junction

Eddy reviews the temporo-parietal junction (TPJ), an area of our brain that appears to be central to our sense of self and other:

Highlights
•Existing literature places the TPJ at the interface between mind and matter. 
•The right TPJ is critical for the control of self and other representations. 
•Dysfunction of right TPJ may therefore compromise our sense of self. 
•Disintegration of the self may in turn underpin various neuropsychiatric symptoms.
Abstract
The temporo-parietal junction (TPJ) is implicated in a variety of processes including multisensory integration, social cognition, sense of agency and stimulus-driven attention functions. Furthermore, manipulation of cortical excitation in this region can influence a diverse range of personal and interpersonal perceptions, from those involved in moral decision making to judgments about the location of the self in space. Synthesis of existing studies places the TPJ at the neural interface between mind and matter, where information about both mental and physical states is processed and integrated, contributing to self-other differentiation. After first summarising the functions of the TPJ according to existing literature, this narrative review aims to offer insight into the potential role of TPJ dysfunction in neuropsychiatric disorders, with a focus on the involvement of the right TPJ in controlling representations relating to the self and other. Problems with self-other distinctions may reflect or pose a vulnerability to the symptoms associated with Tourette syndrome, Schizophrenia, Autistic Spectrum Disorder and Obsessive Compulsive Disorder. Further study of this most fascinating neural region will therefore make a substantial contribution to our understanding of neuropsychiatric symptomatology and highlight significant opportunities for therapeutic impact.

Anatomical and functional subdivisions of the temporo-parietal junction. Top row: Functional MRI meta-analysis data...Showing forward inference data identified using the terms ‘social’ in red, and ‘attention’ in green, with overlap in yellow. Bottom row: Standard anatomical maps using Automated Anatomical Labelling. Showing right inferior parietal lobe (cyan), supramarginal gyrus (green), angular gyrus (deep blue), superior temporal gyrus (yellow) and middle temporal gyrus (red).

Monday, August 29, 2016

Psychological disruptions of our online lives.

I want to pass on clips from a review by Steiner-Adair in the Washington Post of Mary Aiken's book "The Cyber Effect," which examines how cyberspace is changing the way we think, feel, and behave:
She uses the science of human behavior to define cyberspace as a unique environment — an actual space — not simply a virtual extension of the pre-digital world and our characteristic behaviors there. Yes, we still hang out, connect, flirt, fight, learn, do business and do good online. But disinhibition and anonymity in cyberspace foster a particular pattern of impulsivity, careless or inflammatory expression, social cruelty, deception, exploitation — and vulnerability. Consider the unsettling phenomenon of ubiquitous victimology, in which “the criminals are well hidden but you aren’t.” That extends from the ordinary streets of online life to the deep, criminal underground where predators roam and perps hawk illicit wares from drugs, guns and hired assassins to trafficked humans and tools for terrorism. Forget reality TV, this is reality. And it’s a mouse click away from your living room — and your curious child.
Our real-world senses do not serve or protect us adequately in cyberspace, Aiken warns. As humans, we’re caught in the gap between evolution and a sea change in our environment. Our instincts for appraising mates, pals and trustworthy others are visceral, designed by nature for face-to-face, embodied interaction in a physical environment. They fail to pick up signals when we meet in the cyber-realm. Without those protective filters, and unaware that they’ve been disabled, we’re vulnerable in new ways. Connecting online feels so easy and natural that we come to assume a newfound sameness and closeness with strangers.
This phenomenon of “online syndication,” as Aiken calls it — using the Internet to find others we think are like-minded and to normalize and socialize underlying tendencies — is a setup for easy disaster, as Aiken shows in her examples of people caught in cyber-crises: humiliating exchanges or exposure, debt, love affairs, fetishes, porn and gaming addictions, or the lure of criminal behavior. They fail to see the big disconnect between who they are in real life and who they are online, and the gap is fraught with consequences.
Aiken is concerned for children’s development, health and safety in a cyber-environment that replaces face-to-face interaction with online engagement and includes easy access to pornography and hyper-stimulating, addictive activity. The evidence is in, she says, and it shows conclusively that “there are windows in the formative years when very specific skills need to be learned. When those developmental windows close, a child may be developmentally or emotionally crippled for life.”
...the Internet “is clearly, unmistakably, and emphatically an adult environment. It simply wasn’t designed for children. So why are they there?” Indeed, why are we giving kids keys to the Internet? Who would ever think it’s a good idea for children to have miniature computers in their pockets that can take them anywhere online, unsupervised and unprotected? Aiken describes the lack of regulation, accountability, privacy and protection for children caught in this digital transition as a “crime against innocence.” It represents a massive seduction of parents and other adults who should know better, she argues. Her forensic perspective compels us all to demand better protection, reminding us that children ages 4 through 12 are the most vulnerable population on the Web.

Friday, August 26, 2016

A bit of nostalgia - Powers of 10

I just stumbled across a charming relic from my counter culture days in the 1970's, when I was watching whales and monarch butterflies at the Esalen Institute in Big Sur, and learning gestalt, TA, Alexander, massage, and meditation techniques. At one point I signed up for transcendental meditation instruction, and this 1977 video was shown in the first session, after which the instructor said "That's all there is to it"........Sigh.....


Thursday, August 25, 2016

Alerting or Somnogenic light - pick your color

Bourgin and Hubbard summarize work by Pilorz et al.:
Light exerts profound effects on our physiology and behaviour, setting our biological clocks to the correct time and regulating when we are asleep and when we are awake. The photoreceptors mediating these responses include the rods and cones involved in vision, as well as a subset of photosensitive retinal ganglion cells (pRGCs) expressing the blue light-sensitive photopigment melanopsin. Previous studies have shown that mice lacking melanopsin show impaired sleep in response to light. However, other studies have shown that light increases glucocorticoid release—a response typically associated with stress. To address these contradictory findings, we studied the responses of mice to light of different colours. We found that blue light was aversive, delaying sleep onset and increasing glucocorticoid levels. By contrast, green light led to rapid sleep onset. These different behavioural effects appear to be driven by different neural pathways. Surprisingly, both responses were impaired in mice lacking melanopsin. These data show that light can promote either sleep or arousal. Moreover, they provide the first evidence that melanopsin directly mediates the effects of light on glucocorticoids. This work shows the extent to which light affects our physiology and has important implications for the design and use of artificial light sources.

Wednesday, August 24, 2016

Oxytocin - a molecular substrate for forming optimistic beliefs about the future

Ma et al. demonstrate a molecular basis for why people tend to incorporate desirable, but not undesirable, feedback into their beliefs:

Significance
People tend to incorporate desirable feedback into their beliefs but discount undesirable ones. Such optimistic updating has evolved as an advantageous mechanism for social adaptation and physical/mental health. Here, in three independent studies, we show that intranasally administered oxytocin (OT), an evolutionary ancient neuropeptide pivotal to social adaptation, augments optimistic belief updating by increasing updates and learning of desirable feedback but impairing updates of undesirable feedback. Moreover, the OT-impaired updating of undesirable feedback is more salient in individuals with high, rather than with low, depression or anxiety traits. OT also increases second-order confidence judgment after desirable feedback. These findings reveal a molecular substrate underlying the formation of optimistic beliefs about the future.
Abstract
Humans update their beliefs upon feedback and, accordingly, modify their behaviors to adapt to the complex, changing social environment. However, people tend to incorporate desirable (better than expected) feedback into their beliefs but to discount undesirable (worse than expected) feedback. Such optimistic updating has evolved as an advantageous mechanism for social adaptation. Here, we examine the role of oxytocin (OT)―an evolutionary ancient neuropeptide pivotal for social adaptation―in belief updating upon desirable and undesirable feedback in three studies (n = 320). Using a double-blind, placebo-controlled between-subjects design, we show that intranasally administered OT (IN-OT) augments optimistic belief updating by facilitating updates of desirable feedback but impairing updates of undesirable feedback. The IN-OT–induced impairment in belief updating upon undesirable feedback is more salient in individuals with high, rather than with low, depression or anxiety traits. IN-OT selectively enhances learning rate (the strength of association between estimation error and subsequent update) of desirable feedback. IN-OT also increases participants’ confidence in their estimates after receiving desirable but not undesirable feedback, and the OT effect on confidence updating upon desirable feedback mediates the effect of IN-OT on optimistic belief updating. Our findings reveal distinct functional roles of OT in updating the first-order estimation and second-order confidence judgment in response to desirable and undesirable feedback, suggesting a molecular substrate for optimistic belief updating.
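The "learning rate" the authors refer to can be pictured with a simple asymmetric-update model: the belief moves toward the feedback by a learning rate times the estimation error, with a larger rate when the feedback is desirable than when it is undesirable. This is a hedged sketch of that general class of optimism-bias models; the specific rates and numbers below are illustrative, not values from the study:

```python
# Sketch of asymmetric belief updating. Learning rates are illustrative;
# feedback that lowers an estimated risk counts as "desirable" here.
def update_belief(estimate, feedback, lr_desirable=0.6, lr_undesirable=0.2):
    error = feedback - estimate  # estimation error
    lr = lr_desirable if error < 0 else lr_undesirable
    return estimate + lr * error

# Estimated risk of an adverse event (%), then feedback:
print(update_belief(40, 20))  # desirable feedback:  40 + 0.6*(-20) = 28.0
print(update_belief(40, 60))  # undesirable feedback: 40 + 0.2*(20) = 44.0
```

On this reading, the paper's finding is that intranasal oxytocin widens the gap between the two learning rates, and more so in individuals with high depression or anxiety traits.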

Tuesday, August 23, 2016

Slow motion increases perceived intent.

The abstract from interesting work by Caruso et al.:
To determine the appropriate punishment for a harmful action, people must often make inferences about the transgressor’s intent. In courtrooms and popular media, such inferences increasingly rely on video evidence, which is often played in “slow motion.” Four experiments (n = 1,610) involving real surveillance footage from a murder or broadcast replays of violent contact in professional football demonstrate that viewing an action in slow motion, compared with regular speed, can cause viewers to perceive an action as more intentional. This slow motion intentionality bias occurred, in part, because slow motion video caused participants to feel like the actor had more time to act, even when they knew how much clock time had actually elapsed. Four additional experiments (n = 2,737) reveal that allowing viewers to see both regular speed and slow motion replay mitigates the bias, but does not eliminate it. We conclude that an empirical understanding of the effect of slow motion on mental state attribution should inform the life-or-death decisions that are currently based on tacit assumptions about the objectivity of human perception.

Monday, August 22, 2016

Lifespan changes in brain and cognition - early life sets the stage.

Walhovd et al. present a fascinating study on the origins of lifespan changes in brain and cognition, defining an extensive cortical region in which surface area relates positively to general cognitive ability (GCA) during development. Prefrontal and medial and posterolateral temporal clusters relate most strongly to GCA:

Significance
Brain and cognition change with age, with early gains and later declines. Attempts have been made to identify age-specific mechanisms, focusing on when and how declines begin in adults. However, even though general cognitive ability declines with age, there is a high stability in individuals’ cognitive ability relative to their same-age peers. Here we show that the relation between brain and cognition appears remarkably stable through the human lifespan. The cortical area change trajectories of higher and lower cognitive ability groups were parallel through life. Birth weight and parental education were identified as predictors, which provides novel evidence for stability in brain–cognition relationships throughout life, and indicates that early life factors impact brain and cognition for the entire life course.
Abstract
Neurodevelopmental origins of functional variation in older age are increasingly being acknowledged, but identification of how early factors impact human brain and cognition throughout life has remained challenging. Much focus has been on age-specific mechanisms affecting neural foundations of cognition and their change. In contrast to this approach, we tested whether cerebral correlates of general cognitive ability (GCA) in development could be extended to the rest of the lifespan, and whether early factors traceable to prenatal stages, such as birth weight and parental education, may exert continuous influences. We measured the area of the cerebral cortex in a longitudinal sample of 974 individuals aged 4–88 y (1,633 observations). An extensive cortical region was identified wherein area related positively to GCA in development. By tracking area of the cortical region identified in the child sample throughout the lifespan, we showed that the cortical change trajectories of higher and lower GCA groups were parallel through life, suggesting continued influences of early life factors. Birth weight and parental education obtained from the Norwegian Mother–Child Cohort study were identified as such early factors of possible life-long influence. Support for a genetic component was obtained in a separate twin sample (Vietnam Era Twin Study of Aging), but birth weight in the child sample had an effect on cortical area also when controlling for possible genetic differences in terms of parental height. Our results provide novel evidence for stability in brain–cognition relationships throughout life, and indicate that early life factors impact brain and cognition for the entire life course.
A summary graphic from the review by Jagust:


Conceptual model linking brain development, cognition, brain reserve, and late-life cognitive decline. Early life exposures and genes affect brain development, which in turn is related to GCA. GCA and education are related to one another, and provide brain reserve with advancing age. The graph demonstrates two individuals with high (blue) and low (red) brain reserve. Although the rate of their age-related cognitive decline is identical, the person with higher reserve crosses the threshold for dependence at an older age, thus experiencing a longer independent life. Early-life exposures, however, also confer indirect beneficial effects in addition to brain development, and these are likely to be salutary over the lifespan.