
Friday, May 24, 2024

Think AI Can Perceive Emotion? Think Again.

Numerous MindBlog posts have presented the work and writing of Lisa Feldman Barrett (enter Barrett in the search box in the right column of this web page). Her book, "How Emotions Are Made," is the one I recommend when anyone asks me what I think is the best popular book on how our brains work. Here I want to pass on her piece on AI and emotions in the Saturday, May 18 Wall Street Journal, which collects the various reasons that AI cannot, and should not, be used to detect our emotional state from our facial expressions or other body language. Here is her text:

Imagine that you are interviewing for a job. The interviewer asks a question that makes you think. While concentrating, you furrow your brow and your face forms a scowl. A camera in the room feeds your scowling face to an AI model, which determines that you’ve become angry. The interview team decides not to hire you because, in their view, you are too quick to anger. Well, if you weren’t angry during the interview, you probably would be now.

This scenario is less hypothetical than you might realize. So-called emotion AI systems already exist, and some are specifically designed for job interviews. Other emotion AI products try to create more empathic chatbots, build more precise medical treatment plans and detect confused students in classrooms. But there’s a catch: The best available scientific evidence indicates that there are no universal expressions of emotion.

In real life, angry people don’t commonly scowl. Studies show that in Western cultures, they scowl about 35% of the time, which is more than chance but not enough to be a universal expression of anger. The other 65% of the time, they move their faces in other meaningful ways. They might pout or frown. They might cry. They might laugh. They might sit quietly and plot their enemy’s demise. Even when Westerners do scowl, half the time it isn’t in anger. They scowl when they concentrate, when they enjoy a bad pun or when they have gas.

Similar findings hold true for every so-called universal facial expression of emotion. Frowning in sadness, smiling in happiness, widening your eyes in fear, wrinkling your nose in disgust and yes, scowling in anger, are stereotypes—common but oversimplified notions about emotional expressions.

Where did these stereotypes come from? You may be surprised to learn that they were not discovered by observing how people move their faces during episodes of emotion in real life. They originated in a book by Charles Darwin, “The Expression of the Emotions in Man and Animals,” which proposed that humans evolved certain facial movements from ancient animals. But Darwin didn’t conduct careful observations for these ideas as he had for his masterwork, “On the Origin of Species.” Instead, he came up with them by studying photographs of people whose faces were stimulated with electricity and then asking his colleagues whether they agreed.

In 2019, the journal Psychological Science in the Public Interest engaged five senior scientists, including me, to examine the scientific evidence for the idea that people express anger, sadness, fear, happiness, disgust and surprise in universal ways. We came from different fields—psychology, neuroscience, engineering and computer science—and began with opposing views. Yet, after reviewing more than a thousand papers during almost a hundred videoconferences, we reached a consensus: In the real world, an emotion like anger or sadness is a broad category full of variety. People express different emotions with the same facial movements and the same emotion with different facial movements. The variation is meaningfully tied to a person’s situation.

In short, we can’t train AI on stereotypes and expect the results to work in real life, no matter how big the data set or sophisticated the algorithm. Shortly after the paper was published, Microsoft retired the emotion AI features of their facial recognition software.

Other scientists have also demonstrated that faces are a poor indicator of a person’s emotional state. In a study published in the journal Psychological Science in 2008, scientists combined photographs of stereotypical but mismatched facial expressions and body poses, such as a scowling face attached to a body that’s holding a dirty diaper. Viewers asked to identify the emotion in each image typically chose what was implied by the body, not the face— in this case disgust, not anger. In a study published in the journal Science in 2012, the same lead scientist showed that winning and losing athletes, in the midst of their glory or defeat, make facial movements that are indistinguishable.

Nevertheless, these stereotypes are still widely assumed to be universal expressions of emotion. They’re in posters in U.S. preschools, spread through the media, designed into emojis and now enshrined in AI code. I recently asked two popular AI-based image generators, Midjourney and OpenAI’s DALL-E, to depict “an angry person.” I also asked two AI chatbots, OpenAI’s ChatGPT and Google’s Gemini, how to tell if a person is angry. The results were filled with scowls, furrowed brows, tense jaws and clenched teeth.

Even AI systems that appear to sidestep emotion stereotypes may still apply them in stealth. A 2021 study in the journal Nature trained an AI model with thousands of video clips from the internet and tested it on millions more. The authors concluded that 16 facial expressions are made worldwide in certain social contexts. Yet the trainers who labeled the clips with emotion words were all English speakers from a single country, India, so they effectively transmitted cultural stereotypes to a machine. Plus, there was no way to objectively confirm what the strangers in the videos were actually feeling at the time.

Clearly, large data sets alone cannot protect an AI system from applying preconceived assumptions about emotion. The European Union’s AI Act, passed in 2023, recognizes this reality by barring the use of emotion AI in policing, schools and workplaces.

So what is the path forward? If you encounter an emotion AI product that purports to hire skilled job candidates, diagnose anxiety and depression, assess guilt or innocence in court, detect terrorists in airports or analyze a person’s emotional state for any other purpose, it pays to be skeptical. Here are three questions you can ask about any emotion AI product to probe the scientific approach behind it.

Is the AI model trained to account for the huge variation of real-world emotional life? Any individual may express an emotion like anger differently at different times and in different situations, depending on context. People also use the same movements to express different states, even nonemotional ones. AI models must be trained to reflect this variety.

Does the AI model distinguish between observing facial movements and inferring meaning from these movements? Muscle movements are measurable; inferences are guesses. If a system or its designers confuse description with inference, like considering a scowl to be an “anger expression” or even calling a facial movement a “facial expression,” that’s a red flag.

Given that faces by themselves don’t reveal emotion, does the AI model include abundant context? I don’t mean just a couple of signals, such as a person’s voice and heart rate. In real life, when you perceive someone else as emotional, your brain combines signals from your eyes, ears, nose, mouth, skin, and the internal systems of your body and draws on a lifetime of experience. An AI model would need much more of this information to make reasonable guesses about a person’s emotional state.

AI promises to simplify decisions by providing quick answers, but these answers are helpful and justified only if they draw from the true richness and variety of experience. None of us wants important outcomes in our lives, or the lives of our loved ones, to be determined by a stereotype.

Wednesday, May 22, 2024

The Happiness Gap Between Left and Right

I want to pass on a few clips from a recent Thomas Edsall essay, followed by a condensed version of the longer piece provided by ChatGPT-4:

Why is it that a substantial body of social science research finds that conservatives are happier than liberals?...psychologists and other social scientists have begun to dig deeper into the underpinnings of liberal discontent — not only unhappiness but also depression and other measures of dissatisfaction.
One of the findings emerging from this research is that the decline in happiness and in a sense of agency is concentrated among those on the left who stress matters of identity, social justice and the oppression of marginalized groups.
There is, in addition, a parallel phenomenon taking place on the right as Donald Trump and his MAGA loyalists angrily complain of oppression by liberals who engage in a relentless vendetta to keep Trump out of the White House.
There is a difference in the way the left and right react to frustration and grievance. Instead of despair, the contemporary right has responded with mounting anger, rejecting democratic institutions and norms.

Here is my edited version of a ChatGPT-4 fourfold condensation of Edsall's essay:

…surveys have consistently shown that those on the right of the political spectrum enjoy a higher self-reported sense of happiness compared to their counterparts on the left. The reasons are as complex as they are intriguing.

Conservatives tend to view the social and economic systems as just and fair, where hard work is rewarded and natural hierarchies are maintained. This perspective shields them from much of the anger or dissatisfaction that might arise from witnessing social or economic inequalities. They see market outcomes and social stratifications as generally fair and based on merit, which fosters a sense of contentment or acceptance of their circumstances.

On the other hand, liberals are more likely to perceive social and economic systems as flawed or unfair, nurturing a sense of injustice and dissatisfaction. This ideological stance makes them more sensitive to the inequities and imperfections of society, which can manifest as frustration, sadness, or a pervasive sense of being wronged. The liberal focus on social justice, equity, and the protection of marginalized groups, while morally compelling, can also be a source of continuous discontent and agitation as these goals are often far from being realized.

Recent psychological research has started to probe deeper into these disparities, shifting the focus from documenting differences to understanding their underlying causes. This body of work suggests that the liberal emphasis on identity and the systemic oppression of marginalized groups can sometimes lead to a feeling of disempowerment. By defining themselves in terms of victimhood and systemic barriers, liberals might inadvertently undermine their sense of personal agency, which is closely linked to psychological well-being.

The current political climate, especially with the rise of Donald Trump and his brand of populism, has also highlighted a stark difference in how frustration and grievance are expressed across the political spectrum. While liberals might internalize their discontent, leading to despair and dejection, many conservatives have channeled their frustrations into anger and defiance. This is exemplified by the significant number of Republicans who view Democrats not just as political opponents, but as outright enemies, and who believe in the necessity of strong, even authoritarian leadership to preserve their way of life.

This divergence in emotional response is not without consequences. As observed in various studies and polls, more than twice as many Republicans as Democrats believe that extreme measures, including violence, might be necessary to protect the nation from its leaders. This growing acceptance of force and the bending of democratic norms and institutions reflect a profound shift in conservative sentiment, fueled by perceived threats to their traditional values and way of life.

The implications of these ideological and psychological divides extend beyond mere political debates to affect the very fabric of individual well-being. Scholars like Jamin Halberstadt and Timothy A. Judge argue that a focus on systemic injustices and an external locus of control can significantly dampen happiness and self-esteem. Liberals, with their emphasis on the collective and the structural, might find themselves feeling powerless and disillusioned, while conservatives, with their focus on individualism and personal accountability, maintain a more optimistic and empowered outlook.

Moreover, the phenomenon of concept creep, as discussed by Nick Haslam, illustrates another layer of complexity. This expansion of definitions around harm and abuse, often driven by liberal ideologies, has increased sensitivity to various issues, which while raising awareness, also intensifies feelings of vulnerability and injustice. This heightened sensitivity can lead to an atmosphere where free speech and expression are more heavily scrutinized, further complicating the landscape of political and social discourse.

In conclusion, the happiness gap between conservatives and liberals is a multifaceted issue that reflects deeper ideological beliefs and psychological orientations. While conservatives may find comfort in a worldview that sees the social order as just and self-determined, liberals' commitment to challenging this order and addressing systemic injustices, though noble, may paradoxically contribute to their own discontent. This dynamic interplay between ideology and well-being underscores the profound impact of our political beliefs on our personal lives, shaping not only how we view the world but also how we experience it.

Monday, May 15, 2023

People who talk too much

I host a monthly discussion group in Austin, TX, the Austin Rainbow Forum, which meets at 2 pm on the first Sunday of every month to consider interesting topics and ideas. This past May 7, one of our group members led a discussion of "overtalking" in the modern world, which, according to Dan Lyons in his recent book "STFU: The Power of Keeping Your Mouth Shut in an Endlessly Noisy World," has us all spouting opinions, giving advice, and getting ourselves into trouble. The central ideas in Lyons’ book are summarized in this Time Magazine article. I looked through a reviewer's copy of the book I was sent, and suggest that it is worth a look if you are stimulated by the summary article. The bottom line of the book could be stated as "Shut up and listen instead of talking so much." Lyons offers five nudges:

-When possible, say nothing

-Master the power of the pause

-Quit social media

-Seek out silence

-Learn how to listen

Lyons is a professional columnist who writes with a very engaging style, even if the level of his coverage is sometimes a bit superficial. (He quotes a researcher who studied brain activity and “figured out what causes talkaholism”; unfortunately, on doing a quick lookup of the work describing the neuronal measurements, I found that there is no there there.)

Wednesday, April 05, 2023

The fundamentals of empathy

Akinrinade et al. show that the neuropeptide oxytocin is responsible for emotional fear contagion and that it involves the same regions of the brain in zebrafish as in mammals, suggesting that this most basal form of empathy could have evolved many millions of years ago.
Emotional contagion is the most ancestral form of empathy. We tested to what extent the proximate mechanisms of emotional contagion are evolutionarily conserved by assessing the role of oxytocin, known to regulate empathic behaviors in mammals, in social fear contagion in zebrafish. Using oxytocin and oxytocin receptor mutants, we show that oxytocin is both necessary and sufficient for observer zebrafish to imitate the distressed behavior of conspecific demonstrators. The brain regions associated with emotional contagion in zebrafish are homologous to those involved in the same process in rodents (e.g., striatum, lateral septum), receiving direct projections from oxytocinergic neurons located in the pre-optic area. Together, our results support an evolutionary conserved role for oxytocin as a key regulator of basic empathic behaviors across vertebrates.

Saturday, October 22, 2022

New Perspectives on how our Minds Work

I want to pass on to MindBlog readers this link to a lecture I gave this past Friday (10/21/22) to the Univ. of Texas OLLI (Osher Lifelong Learning Institute) UT FORUM group. Here is the brief description of the talk:

Abstract

Recent research shows that much of what we thought we knew about how our minds work is wrong. Rather than arising from our essential natures, our emotional and social realities are mainly invented by each of us. Modern and ancient perspectives allow us to have some insight into what we have made.
Description
This talk offers a description of how our predictive brains work to generate our perceptions, actions, emotions, concepts, language, and social structures. Our experience that a self or "I" inside our heads is responsible for these behaviors is a useful illusion, but there is in fact no homunculus or discrete place inside our heads where “it all comes together.” Starting before we are born, diffuse networks of brain cells begin generating actions and perceiving their consequences to build an internal library of sensing and acting correlations that keep us alive and well, a library that is the source of predictions about what we might expect to happen next in our worlds. Insights from both modern neuroscience research and ancient meditative traditions allow us to partially access and sometimes change this archive that manages our behaviors.

Monday, October 03, 2022

Triggers for mother love

A fascinating open source article from Margaret Livingstone carrying forward the famous experiments by Harry Harlow:  

Significance

Harry Harlow found that infant monkeys form strong and lasting attachments to inanimate surrogates, but only if the surrogate is soft; here I report that postpartum monkey mothers can also form strong and lasting attachments to soft inanimate objects. Thus, mother/infant and infant/mother bonds may both be triggered by soft touch.
Abstract
Previous studies showed that baby monkeys separated from their mothers develop strong and lasting attachments to inanimate surrogate mothers, but only if the surrogate has a soft texture; soft texture is more important for the infant’s attachment than is the provision of milk. Here I report that postpartum female monkeys also form strong and persistent attachments to inanimate surrogate infants, that the template for triggering maternal attachment is also tactile, and that even a brief period of attachment formation can dominate visual and auditory cues indicating a more appropriate target.

Wednesday, September 28, 2022

Neural synchronization predicts marital satisfaction

From Li et al.:  

Significance

Humans establish intimate social and personal relationships with their partners, which enable them to survive, successfully mate, and raise offspring. Here, we examine the neurobiological basis of marital satisfaction in humans using naturalistic, ecologically relevant, interpersonal communicative cues that capture shared neural representations between married couples. We show that in contrast to demographic and personality measures, which are unreliable predictors of marital satisfaction, neural synchronization of brain responses during viewing of naturalistic maritally relevant movies predicted higher levels of marital satisfaction in couples. Our findings demonstrate that brain similarities that reflect real-time mental responses to subjective perceptions, thoughts, and feelings about interpersonal and social interactions are strong predictors of marital satisfaction and advance our understanding of human marital bonding.
Abstract
Marital attachment plays an important role in maintaining intimate personal relationships and sustaining psychological well-being. Mate-selection theories suggest that people are more likely to marry someone with a similar personality and social status, yet evidence for the association between personality-based couple similarity measures and marital satisfaction has been inconsistent. A more direct and useful approach for understanding fundamental processes underlying marital satisfaction is to probe similarity of dynamic brain responses to maritally and socially relevant communicative cues, which may better reflect how married couples process information in real time and make sense of their mates and themselves. Here, we investigate shared neural representations based on intersubject synchronization (ISS) of brain responses during free viewing of marital life-related, and nonmarital, object-related movies. Compared to randomly selected pairs of couples, married couples showed significantly higher levels of ISS during viewing of marital movies and ISS between married couples predicted higher levels of marital satisfaction. ISS in the default mode network emerged as a strong predictor of marital satisfaction and canonical correlation analysis revealed a specific relation between ISS in this network and shared communication and egalitarian components of martial satisfaction. Our findings demonstrate that brain similarities that reflect real-time mental responses to subjective perceptions, thoughts, and feelings about interpersonal and social interactions are strong predictors of marital satisfaction, reflecting shared values and beliefs. Our study advances foundational knowledge of the neurobiological basis of human pair bonding.
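The core measure here, intersubject synchronization (ISS), is at root a correlation between two viewers' brain-response time series recorded while they watch the same movie, computed for married pairs and compared against randomly re-paired individuals. Here is a minimal illustrative sketch of that logic in Python (not the authors' actual pipeline; the time series, noise level, and pairings below are hypothetical):

import numpy as np

def intersubject_synchronization(ts_a, ts_b):
    """Correlate two viewers' response time series from the same brain region,
    recorded while both watched the same movie clip."""
    return np.corrcoef(ts_a, ts_b)[0, 1]

rng = np.random.default_rng(0)
n_timepoints = 300
movie_driven = rng.standard_normal(n_timepoints)  # stand-in for stimulus-driven response

# Hypothetical married pair: both partly track the movie-driven signal
spouse_1 = movie_driven + 0.8 * rng.standard_normal(n_timepoints)
spouse_2 = movie_driven + 0.8 * rng.standard_normal(n_timepoints)
# Hypothetical random re-pairing: an unrelated response
random_pair = rng.standard_normal(n_timepoints)

print(intersubject_synchronization(spouse_1, spouse_2))     # substantial shared response
print(intersubject_synchronization(spouse_1, random_pair))  # near zero

In the study itself, ISS values computed region by region (notably in the default mode network) were then related to the couples' marital-satisfaction scores.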

Wednesday, July 27, 2022

Emotional contagion and prosocial behavior

Keysers et al. do an open source review of studies on emotional contagion and prosocial behavior in rodents, whose brain regions necessary for emotional contagion closely resemble those associated with human empathy:
Rats and mice show robust emotional contagion by aligning their fear and pain to that of others.
Brain regions necessary for emotional contagion in rodents closely resemble those associated with human empathy; understanding the biology of emotional contagion in rodents can thus shed light on the evolutionary origin and mechanisms of human empathy.
Cingulate area 24 in rats and mice contains emotional mirror neurons that map the emotions of others onto the witnesses’ own emotions.
Emotional contagion prepares animals to deal with threats by using others as sentinels; the fact that rodents approach individuals in distress facilitates such contagion.
In some conditions, rats and mice learn to prefer actions that benefit others, with notable individual differences. This effect depends on structures that overlap with those of emotional contagion.

Monday, July 25, 2022

Efficiently irrational: deciphering the riddle of human choice

Highlights of an open source article from Paul Glimcher:
A central question for decision-making scholars is: why are humans and animals so predictably inconsistent in their choices? In the language of economics, why are they irrational?
Data suggest that this reflects an optimal trade-off between the precision with which the brain represents the values of choices and the biological costs of that precision. Increasing representational precision may improve choice consistency, but the metabolic cost of increased precision is significant.
Given the cost of precision, the brain might use efficient value-encoding mechanisms that maximize informational content. Mathematical analyses suggest that a mechanism called divisive normalization approximates maximal efficiency per action potential in decision systems.
Behavioral studies appear to validate this claim. Inconsistencies produced by decision-makers can be well modeled as the byproduct of efficient divisive normalization mechanisms that maximize information while minimizing metabolic costs.
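The highlights above name divisive normalization without spelling out the computation. As a purely illustrative sketch (not Glimcher's actual model; the option values and the semisaturation constant sigma below are made up), the operation rescales each option's value by the summed value of the whole choice set:

import numpy as np

def divisive_normalization(values, sigma=1.0):
    """Rescale each option's subjective value by the summed value of the
    choice set plus a semisaturation constant sigma."""
    values = np.asarray(values, dtype=float)
    return values / (sigma + values.sum())

# Two hypothetical choice sets with identical relative values but different stakes
print(divisive_normalization([1.0, 2.0, 4.0]))     # small-stakes options
print(divisive_normalization([10.0, 20.0, 40.0]))  # large-stakes options

Because the denominator depends on every option present, the same limited neural dynamic range can represent very different choice sets, but adding or removing an option shifts the represented values of the others, one route by which an efficient code can yield the predictable inconsistencies described above.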

Friday, June 17, 2022

Testosterone production in adult men is regulated by an adolescent period sensitive to family experiences.

 From Gettler et al.:

Significance
Testosterone influences how animals devote energy and time toward reproduction, including opposing demands of mating and competition versus parenting. Reflecting this, testosterone often declines in new fathers and lower testosterone is linked to greater caregiving. Given these roles, there is strong interest in factors that affect testosterone, including early-life experiences. In this multidecade study, Filipino sons whose fathers were present and involved with raising them when they were adolescents had lower testosterone when they later became fathers, compared to sons whose fathers were present but uninvolved or were not coresident. Sons’ own parenting behaviors did not explain these patterns. These results connect key social experiences during adolescence to adult testosterone, and point to possible intergenerational effects of parenting style.
Abstract
Across vertebrates, testosterone is an important mediator of reproductive trade-offs, shaping how energy and time are devoted to parenting versus mating/competition. Based on early environments, organisms often calibrate adult hormone production to adjust reproductive strategies. For example, favorable early nutrition predicts higher adult male testosterone in humans, and animal models show that developmental social environments can affect adult testosterone. In humans, fathers’ testosterone often declines with caregiving, yet these patterns vary within and across populations. This may partially trace to early social environments, including caregiving styles and family relationships, which could have formative effects on testosterone production and parenting behaviors. Using data from a multidecade study in the Philippines (n = 966), we tested whether sons’ developmental experiences with their fathers predicted their adult testosterone profiles, including after they became fathers themselves. Sons had lower testosterone as parents if their own fathers lived with them and were involved in childcare during adolescence. We also found a contributing role for adolescent father–son relationships: sons had lower waking testosterone, before and after becoming fathers, if they credited their own fathers with their upbringing and resided with them as adolescents. These findings were not accounted for by the sons’ own parenting and partnering behaviors, which could influence their testosterone. These effects were limited to adolescence: sons’ infancy or childhood experiences did not predict their testosterone as fathers. Our findings link adolescent family experiences to adult testosterone, pointing to a potential pathway related to the intergenerational transmission of biological and behavioral components of reproductive strategies.

Wednesday, June 08, 2022

Stories move the heart - literally

Continuing my thread of heart-activity-related posts (here, and here), I'll mention that I've enjoyed reading this open access PNAS Science and Culture article by Carolyn Beans on the meaning and usefulness of heart rate fluctuations. Here are the starting paragraphs:
In June 2019, at the University of Birmingham in England, psychologist Damian Cruse invited 27 young adults to come to the lab, on separate occasions, and listen to the same clips from an audiobook of Jules Verne’s 20,000 Leagues Under the Sea. Sitting alone, each donned headphones and electrocardiogram (EKG) equipment while a voice with a British accent recounted tales of a mysterious monster taking down ships. When researchers later compared volunteers’ heart rates, a curious phenomenon emerged: The heart rates of nearly two-thirds of the participants rose and fell together as the story progressed (1).
“It’s not that the beats align synchronously, but rather the heart rate fluctuations go up and down in unison,” explains Lucas Parra, a biomedical engineer at City College of New York, and co-senior author on the study.
Research has already shown that brain activity can synchronize when listeners pay attention to the same video or story (2). Now, Parra and others are finding that the heart, too, offers insight into who is really paying attention to a story. Potential applications are myriad. With heart rate recordings from smart watches, a webinar host may one day learn whether the audience is engaged, or a doctor could offer a family insight into whether a loved one will recover consciousness.
But the technology is new and researchers are still grappling with how to harness heart rate data responsibly, even as they continue to explore why stories move hearts in synchrony in the first place.
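To make "heart rate fluctuations going up and down in unison" concrete, here is a rough sketch of the kind of computation involved (not the study's actual analysis; the R-peak times, time grid, and listeners are invented): derive each listener's instantaneous heart rate from the EKG R-peak times, resample it onto a common clock, and correlate the resulting traces.

import numpy as np

def heart_rate_series(r_peak_times, grid):
    """Instantaneous heart rate (beats/min) from EKG R-peak times,
    interpolated onto a common time grid so listeners can be compared."""
    intervals = np.diff(r_peak_times)            # seconds between successive beats
    rate = 60.0 / intervals                      # beats per minute
    midpoints = r_peak_times[:-1] + intervals / 2.0
    return np.interp(grid, midpoints, rate)

# Invented R-peak times (seconds) for two listeners hearing the same audiobook clip,
# each with a slow shared modulation of their beat-to-beat intervals
beat_index = np.arange(200)
listener_a = np.cumsum(0.80 + 0.05 * np.sin(beat_index / 10))
listener_b = np.cumsum(0.85 + 0.05 * np.sin(beat_index / 10))

grid = np.arange(5, 150, 1.0)                    # common 1-second time grid
hr_a = heart_rate_series(listener_a, grid)
hr_b = heart_rate_series(listener_b, grid)

# "Rising and falling together" shows up as a positive correlation of the fluctuations
print(np.corrcoef(hr_a, hr_b)[0, 1])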

Monday, May 30, 2022

Brain-Heart interplay in emotional arousal - resolving a hundred-year-old debate

Candia-Rivera et al. do a fascinating piece of work that answers some long-standing issues in the century-old debate on the role of the autonomic nervous system in feelings. I will be slowly re-reading this paper a number of times. The introduction provides an excellent review of contrasting theories of what emotions are.
...The debate about the role of the ANS in emotions can be condensed into two views: specificity or causation. The specificity view is related to the James–Lange theory, which states that bodily responses precede emotions’ central processing, meaning that bodily states would be a response to the environment, followed by an interpretation carried out by the CNS that would result in the feeling felt. However, causation theories represent an updated view of the James–Lange theory, suggesting that peripheral changes influence the conscious emotional experience.... While more “classical” theories point to emotions as “the functional states of the brain that provide causal explanations of certain complex behaviors—like evading a predator or attacking prey”, other theories suggest how they are constructions of the world, not reactions to it (see MindBlog posts on Lisa Feldman Barrett's work). Namely, emotions are internal states constructed on the basis of previous experiences as predictive schemes to react to external stimuli.
Here is a clip from the discussion of their open source paper, followed by the significance and abstract sections at the beginning of the article:
....To the best of our knowledge, major novelties of the current study with respect to prior state of the art are related to 1) the uncovering of the directed functional interplay between central and peripheral neural dynamics during an emotional elicitation, using ad-hoc mathematical models for synchronized EEG and ECG time series; 2) the uncovering of temporal dynamics of cortical and cardiovascular neural control during emotional processing in both ascending, from the heart to the brain, and descending, from the brain to the heart, functional directions; and 3) the experimental support for causation theories of physiological feelings.
In the frame of investigating the visceral origin of emotions, main findings of this study suggest that ascending BHI (brain-heart interplay) coupling initiates emotional processing and is mainly modulated by the subjective experience of emotional arousal. Such a relationship between arousal and ascending BHI may not be related to the attention levels, as controlled with two different neural correlates of attention. The main interactions begin through afferent vagal pathways (HF power) sustaining EEG oscillations, in which the theta band was repeatedly found related to major vagal modulations. In turn, with a later onset, this ascending modulation actually triggers a cascade of cortical neural activations that, in turn, modulate directed neural control onto the heart, namely from-brain-to-heart interplay. Concurrent bidirectional communication between the brain and body occurs throughout the emotional processing at specific timings, reaching a maximum coupling around 15 to 20 s from the elicitation onset, involving both cardiac sympathetic and vagal activity.

From the beginning of the article;  

Significance

We investigate the temporal dynamics of brain and cardiac activities in healthy subjects who underwent an emotional elicitation through videos. We demonstrate that, within the first few seconds, emotional stimuli modulate heartbeat activity, which in turn stimulates an emotion intensity (arousal)–specific cortical response. The emotional processing is then sustained by a bidirectional brain–heart interplay, where the perceived arousal level modulates the amplitude of ascending heart-to-brain neural information flow. These findings may constitute fundamental knowledge linking neurophysiology and psychiatric disorders, including the link between depressive symptoms and cardiovascular disorders.
Abstract
A century-long debate on bodily states and emotions persists. While the involvement of bodily activity in emotion physiology is widely recognized, the specificity and causal role of such activity related to brain dynamics has not yet been demonstrated. We hypothesize that the peripheral neural control on cardiovascular activity prompts and sustains brain dynamics during an emotional experience, so these afferent inputs are processed by the brain by triggering a concurrent efferent information transfer to the body. To this end, we investigated the functional brain–heart interplay under emotion elicitation in publicly available data from 62 healthy subjects using a computational model based on synthetic data generation of electroencephalography and electrocardiography signals. Our findings show that sympathovagal activity plays a leading and causal role in initiating the emotional response, in which ascending modulations from vagal activity precede neural dynamics and correlate to the reported level of arousal. The subsequent dynamic interplay observed between the central and autonomic nervous systems sustains the processing of emotional arousal. These findings should be particularly revealing for the psychophysiology and neuroscience of emotions.

Monday, March 14, 2022

Addicted to dreaming.

Dopamine (DA) is usually associated with pleasure and addiction. Now Hasegawa et al. show that release of DA in the basolateral amygdala (BLA), a brain structure associated with emotional processing, can trigger rapid eye movement (REM) dreaming sleep in mice.
The sleep cycle is characterized by alternating non–rapid eye movement (NREM) and rapid eye movement (REM) sleeps. The mechanisms by which this cycle is generated are incompletely understood. We found that a transient increase of dopamine (DA) in the basolateral amygdala (BLA) during NREM sleep terminates NREM sleep and initiates REM sleep. DA acts on dopamine receptor D2 (Drd2)–expressing neurons in the BLA to induce the NREM-to-REM transition. This mechanism also plays a role in cataplectic attacks—a pathological intrusion of REM sleep into wakefulness—in narcoleptics. These results show a critical role of DA signaling in the BLA in initiating REM sleep and provide a neuronal basis for sleep cycle generation.

Friday, February 11, 2022

A special issue of Social Cognitive and Affective Neuroscience on tDCS

I want to point to this special open source issue of Social Cognitive and Affective Neuroscience. Paulo Boggio provides an interesting historical introduction, starting in Roman times with the use of the electrical discharge of the torpedo fish to treat headaches (imagine being treated with fish applications over your head!). The articles in the issue consider the effects of low-intensity direct current stimulation of the surface of the scalp on prosocial behavior, aggression, impulsivity, etc. A review article by Galli et al. considers the use of tDCS to relieve the symptomatology of individuals with affective or social cognition disorders. (DIY kits for home experimenters - which I would not recommend - abound on the internet, regular flashlight batteries being a sufficient source of the low currents used.)

Tuesday, January 25, 2022

Using big data to track major shifts in human cognition

I want to pass on the first few paragraphs of a fascinating commentary by Simon DeDeo on an article by Scheffer et al. that was the subject of MindBlog's 12/31/21 post. Motivated readers can obtain a copy of the whole article by emailing me:
Scheffer et al.’s (1) exciting new work reports an historic rearrangement, occurring in the late 20th century, of the balance between reason and emotion. Its approach is part of a new trend in the psychological sciences that uses extremely large volumes of text to study basic patterns of human cognition. Recent work in this vein has included studies of the universal properties of gender representations (2), the rise of causal thinking (3), and a cognitive bias towards positivity in language itself (4). The goal of going “from text to thought” (5) is an attractive one, and the promise of the machine learning era is that we will only get better at extracting the imprints left, in text, by the mechanisms of the mind.
To establish their claims, Scheffer et al. (1) use principal component analysis to identify two major polarities of correlated vocabulary words in the Google Books corpus (6). The first polarity (PC1) tracks a shift from archaic to modern, in both material life (“iron” is archaic, “computer” is modern) and culture (“liberty” is archaic, “privacy” is modern). The second polarity (PC2) that emerges is the intriguing one, and forms the basis of their paper: Its two poles, the authors argue, correspond to the distinction between “rational” and “intuitive” language.
Their main finding then has two pieces: a shift from the intuitive pole to the rational pole (the “rise” of rationality) and then back (the “fall”) (1). The rise has begun by the start of their data in 1850, and unfolds over the course of a century or more. They attribute it to a society increasingly concerned with quantifying, and justifying, the world through scientific and impersonal language—a gradual tightening of Max Weber’s famous “iron cage” of collectivized, rationalized bureaucracy in service of the capitalist profit motive (7). The fall, meaning a shift from the rational back to the intuitive, begins in 1980, and is more rapid than the rise: By 2020, the balance is similar to that seen in the early 1900s. The fall appears to accelerate in the early 2000s, which leads the authors to associate it with social media use and a “post-truth era” where “feelings trump facts.” Both these interpretations are supported by accompanying shifts toward “collective” pronouns (we, our, and they) in the Weberian period, and then toward the “individualistic” ones (I, my, he, and she) after.
The raw effect sizes the authors report are extraordinarily large (1). At the peak in 1980, rationality words outnumbered intuition words, on average, three to one. Forty years later (and 100 y earlier), however, the balance was roughly one to one. If these represent changes in actual language use, let alone the time devoted to the underlying cognitive processes, they are enormous shifts in the nature of human experience.
1. M. Scheffer, I. van de Leemput, E. Weinans, J. Bollen, The rise and fall of rationality in language. Proc. Natl. Acad. Sci. U.S.A. 118, e2107848118 (2021).
2. T. E. S. Charlesworth, V. Yang, T. C. Mann, B. Kurdi, M. R. Banaji, Gender stereotypes in natural language: Word embeddings show robust consistency across child and adult language corpora of more than 65 million words. Psychol. Sci. 32, 218–240 (2021).
3. R. Iliev, R. Axelrod, Does causality matter more now? Increase in the proportion of causal language in English texts. Psychol. Sci. 27, 635–643 (2016).
4. P. S. Dodds et al., Human language reveals a universal positivity bias. Proc. Natl. Acad. Sci. U.S.A. 112, 2389–2394 (2015).
5. J. C. Jackson et al., From text to thought: How analyzing language can advance psychological science. Perspect. Psychol. Sci., 10.1177/17456916211004899 (2021).
6. J. B. Michel et al.; Google Books Team, Quantitative analysis of culture using millions of digitized books. Science 331, 176–182 (2011).
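As a rough illustration of the approach DeDeo describes (and not the Scheffer et al. pipeline; the word list and frequencies below are invented stand-ins for corpus data), principal component analysis of a year-by-word frequency matrix yields components whose loadings define word "polarities" and whose projections give each polarity's trajectory over time:

import numpy as np

# Invented year-by-word frequency matrix (rows = years, columns = words),
# standing in for per-word usage frequencies from a large book corpus.
words = ["iron", "liberty", "computer", "privacy", "feel", "believe", "determine", "conclude"]
years = np.arange(1850, 2021)
rng = np.random.default_rng(1)
freqs = rng.random((len(years), len(words)))

# Standardize each word's trajectory, then get principal components via the SVD
z = (freqs - freqs.mean(axis=0)) / freqs.std(axis=0)
u, s, vt = np.linalg.svd(z, full_matrices=False)

# Each row of vt is one "polarity": the sign and size of a word's loading say which
# words rise and fall together; projecting the years onto it gives its time course.
pc1_loadings = dict(zip(words, vt[0]))
pc1_trajectory = z @ vt[0]

print(pc1_loadings)
print(pc1_trajectory[:5])

With real corpus data, Scheffer et al. report that the first such component tracks the archaic-to-modern shift and the second the rational-versus-intuitive polarity discussed above.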

Monday, January 17, 2022

Different circuits in the brain for reward seeking and novelty seeking.

Work by Ogasawara et al. is noted by Peter Stern.
Novelty seeking is a key feature of intelligent behavior and adaptive cognition. However, we know little about the circuits that regulate our attraction to novel objects for novelty’s sake. Ogasawara et al. discovered that a brain nucleus called the zona incerta was causally related to novelty seeking. A region in the anterior medial temporal lobe projected to the zona incerta and sent motivational signals required to control novelty seeking through the zona incerta circuit. A novelty-seeking task, in which monkeys were motivated by the receipt of novel objects, showed that this behavior was not regulated by the dopamine reward-seeking circuitry. This work provides evidence for a clear dissociation in the brain circuitry between reward seeking and novelty seeking.

Wednesday, December 01, 2021

The Science of Hugs?

Schultz describes an entertaining bit of work, pursuing the obvious, done by Düren et al. Guys hugging each other use their arms differently than women do, more frequently doing a crisscross hug than a neck-waist hug, most likely because the neck-waist hug feels a bit more intimate.

Without prompting the students on how to hug, the researchers found the crisscross style was more common, accounting for 66 out of 100 hugs. The preference for crisscross was especially prevalent in pairs of men, with 82% of 28 observed pairs opting for the style. Neither emotional closeness nor height had significant effects on the style of hugging; however, the researchers note that most participants were relatively close in height, and they guess that neck-waist might be more common when heights differ more drastically.

Friday, November 05, 2021

Variability, not stereotypical expressions, in facial portrayals of emotional states.

Barrett and collaborators use a novel method to offer more evidence against reliable mapping between certain emotional states and facial muscle movements:
It is long hypothesized that there is a reliable, specific mapping between certain emotional states and the facial movements that express those states. This hypothesis is often tested by asking untrained participants to pose the facial movements they believe they use to express emotions during generic scenarios. Here, we test this hypothesis using, as stimuli, photographs of facial configurations posed by professional actors in response to contextually-rich scenarios. The scenarios portrayed in the photographs were rated by a convenience sample of participants for the extent to which they evoked an instance of 13 emotion categories, and actors’ facial poses were coded for their specific movements. Both unsupervised and supervised machine learning find that in these photographs, the actors portrayed emotional states with variable facial configurations; instances of only three emotion categories (fear, happiness, and surprise) were portrayed with moderate reliability and specificity. The photographs were separately rated by another sample of participants for the extent to which they portrayed an instance of the 13 emotion categories; they were rated when presented alone and when presented with their associated scenarios, revealing that emotion inferences by participants also vary in a context-sensitive manner. Together, these findings suggest that facial movements and perceptions of emotion vary by situation and transcend stereotypes of emotional expressions. Future research may build on these findings by incorporating dynamic stimuli rather than photographs and studying a broader range of cultural contexts.
This perspective is the opposite of the one expressed by Cowen, Keltner et al., who use another novel method to reach contrary conclusions, in work that was noted in MindBlog's 12/29/20 post, along with some reservations about their conclusions.

Friday, October 08, 2021

Reconsolidation of a reactivated memory can be altered by stress hormone levels.

Stern's summary in Science Magazine of work by Antypa et al.:
Reactivation of a memory can make it malleable to subsequent change during reconsolidation. Targeted pharmacological and behavioral manipulations after memory reactivation can modulate reconsolidation and modify the memory. Antypa et al. investigated whether changes in stress hormone levels during sleep affected later memory of a reactivated episode. The authors recited a story accompanied by a slide show to a group of male and female subjects. If subjects were given treatment to block cortisol synthesis during early morning sleep, then their 3-day-old memory of the story was more precisely recalled than if the early morning cortisol spike was uncontrolled. However, this improvement only occurred if the subjects had been given a visual cue for the story just before anti-cortisol treatment.