Showing posts with label attention/perception.

Monday, January 16, 2023

COVID-19 and brain aging

Over the Christmas and New Year's holidays I was hit first by a mild Covid infection that lasted only a few days (I've had all 5 vaccinations and immediately took Paxlovid on testing positive) and then five days later had a slightly longer Paxlovid-rebound infection. A transient brain fog seems to have cleared by now. This personal experience makes me especially attentive to articles like Welberg's note on Covid-19 and brain aging, which suggests that the brain changes associated with Covid infection are most likely due to neuroinflammation triggered by the infection rather than to a direct effect of the virus. Here is the abstract:
Severe COVID-19 has been associated with cognitive impairment and changes in the frontal cortex. In a study published in Nature Aging, Mavrikaki, Lee et al. performed RNA sequencing on frontal cortex samples from 21 individuals with severe COVID-19, 22 age- and sex-matched uninfected controls, and 9 uninfected people who had received intensive care or ventilator treatment. The authors found almost 7,000 differentially expressed genes (DEGs) in the patient samples compared to controls. Upregulated DEGs were enriched for genes involved in immune-related pathways, and downregulated DEGs were enriched for genes involved in synaptic activity, cognition and memory — a profile of transcriptional changes that resembles those previously observed in aging brains. Direct comparisons between frontal cortex samples from young and old individuals confirmed this overlap. Application of tumor necrosis factor, interferon-β or interferon-γ to cultured human primary neurons induced transcriptional changes similar to those seen in patients with severe COVID-19. As no SARS-CoV-2 RNA was detected in the patient samples, these data suggest that the transcriptomic changes in frontal cortex of patients with severe COVID-19 were due to neuroinflammatory processes rather than a direct effect of the virus.

Monday, October 10, 2022

A sleeping touch improves vision.

Interesting work reported by Onuki et al. in the Journal of Neuroscience:

SIGNIFICANCE STATEMENT
Tactile sensations can bias our visual perception as a form of cross-modal interaction. However, such effects have been reported only in the awake state. Here we show that repetitive directional tactile motion stimulation on the fingertip during slow wave sleep selectively enhanced subsequent visual motion perception. Moreover, the visual improvement was positively associated with sleep slow wave activity. The tactile motion stimulation during slow wave activity increased the activation at the high beta frequency over the occipital electrodes. The visual improvement occurred in agreement with a hand-centered reference frame. These results suggest that our sleeping brain can interpret tactile information based on a hand-centered reference frame, which can cause the sleep-dependent improvement of visual motion detection.

ABSTRACT

Tactile sensations can bias visual perception in the awake state while visual sensitivity is known to be facilitated by sleep. It remains unknown, however, whether the tactile sensation during sleep can bias the visual improvement after sleep. Here, we performed nap experiments in human participants (n = 56, 18 males, 38 females) to demonstrate that repetitive tactile motion stimulation on the fingertip during slow wave sleep selectively enhanced subsequent visual motion detection. The visual improvement was associated with slow wave activity. The high activation at the high beta frequency was found in the occipital electrodes after the tactile motion stimulation during sleep, indicating a visual-tactile cross-modal interaction during sleep. Furthermore, a second experiment (n = 14, 14 females) to examine whether a hand- or head-centered coordination is dominant for the interpretation of tactile motion direction showed that the biasing effect on visual improvement occurs according to the hand-centered coordination. These results suggest that tactile information can be interpreted during sleep, and can induce the selective improvement of post-sleep visual motion detection.

Friday, August 26, 2022

Our anterior insula signals salience and deviations from expectations via bursts of beta oscillations

Haufler et al. show that the insula signals salience and prediction errors via amplitude modulations of beta bursts (~15-40 Hertz, or cycles per second), which coincide with the near simultaneous recruitment of vast cortical territories. 

NEW & NOTEWORTHY

Functional imaging studies indicate that the anterior insula encodes salience and deviations from expectations. Beyond changing BOLD signals, however, the physiological underpinnings of these signals are unknown. By recording local field potentials in patients with epilepsy, we found that the anterior insula generates large bursts of beta oscillations whose amplitude is modulated by the salience of outcomes and deviations from expectations. Moreover, insular beta bursts coincide with the activation of many high-order cortical areas.

ABSTRACT

Functional imaging studies indicate that the insula encodes the salience of stimuli and deviations from expectations, signals that can mobilize cognitive resources and facilitate learning. However, there is no information about the physiological underpinnings of these phenomena beyond changing BOLD signals. To shed light on this question, we analyzed intracerebral local field potentials (LFPs) in five patients with epilepsy of both genders performing a virtual reality task that featured varying odds of monetary rewards and losses. Upon outcome disclosure, the anterior (but not the posterior) insula generated bursts of beta oscillations whose amplitudes were lower for neutral than positive and negative outcomes, consistent with a salience signal. Moreover, beta burst power was higher when outcomes deviated from expectations, whether the outcome was better or worse than expected, indicating that the insula provides an unsigned prediction error signal. Last, in relation to insular beta bursts, many higher-order cortical areas exhibited robust changes in LFP activity that ranged from spectrally nonspecific or differentiated increases in gamma power to bursts of beta activity that closely resembled the insular beta bursts themselves. Critically, the activity of these other cortical regions was more closely tied in time to insular bursts than task events, suggesting that they are associated with particularly significant cognitive phenomena. Overall, our findings suggest that the insula signals salience and prediction errors via amplitude modulations of beta bursts, which coincide with the near simultaneous recruitment of vast cortical territories.

Wednesday, August 24, 2022

The brain chemistry underlying mental exhaustion.

Emily Underwood reviews work by Wiehler et al. (open access) on the brain chemistry underlying mental fatigue, and also describes several reservations expressed by other researchers. From her description:
The researchers divided 39 paid study participants into two groups, assigning one to a series of difficult cognitive tasks that were designed to induce mental exhaustion. In one task, participants had to decide whether letters and numbers flashing on a computer screen in quick succession were green or red, uppercase or lowercase, and other variations. In another, volunteers had to remember whether a number matched one they’d seen three characters earlier...As the day dragged on, the researchers repeatedly measured cognitive fatigue by asking participants to make choices that required self-control—deciding to forgo cash that was immediately available so they could earn a larger amount later, for example. The group that had been assigned to more difficult tasks made about 10% more impulsive choices than the group with easier tasks, the researchers observed. At the same time, their glutamate levels rose by about 8% in the lateral prefrontal cortex—a pattern that did not show up in the other group...

Here is the Wiehler et al. abstract:  

Highlights

• Cognitive fatigue is explored with magnetic resonance spectroscopy during a workday 
• Hard cognitive work leads to glutamate accumulation in the lateral prefrontal cortex 
• The need for glutamate regulation reduces the control exerted over decision-making 
• Reduced control favors the choice of low-effort actions with short-term rewards
Summary
Behavioral activities that require control over automatic routines typically feel effortful and result in cognitive fatigue. Beyond subjective report, cognitive fatigue has been conceived as an inflated cost of cognitive control, objectified by more impulsive decisions. However, the origins of such control cost inflation with cognitive work are heavily debated. Here, we suggest a neuro-metabolic account: the cost would relate to the necessity of recycling potentially toxic substances accumulated during cognitive control exertion. We validated this account using magnetic resonance spectroscopy (MRS) to monitor brain metabolites throughout an approximate workday, during which two groups of participants performed either high-demand or low-demand cognitive control tasks, interleaved with economic decisions. Choice-related fatigue markers were only present in the high-demand group, with a reduction of pupil dilation during decision-making and a preference shift toward short-delay and little-effort options (a low-cost bias captured using computational modeling). At the end of the day, high-demand cognitive work resulted in higher glutamate concentration and glutamate/glutamine diffusion in a cognitive control brain region (lateral prefrontal cortex [lPFC]), relative to low-demand cognitive work and to a reference brain region (primary visual cortex [V1]). Taken together with previous fMRI data, these results support a neuro-metabolic model in which glutamate accumulation triggers a regulation mechanism that makes lPFC activation more costly, explaining why cognitive control is harder to mobilize after a strenuous workday.

Wednesday, August 10, 2022

Music training enhances auditory and linguistic processing.

From Neves et al.:  

Highlights

• Systematic review and meta-analysis of neurobehavioral effects of music training. 
• We ask whether music training shapes auditory-perceptual and linguistic skills. 
• Multivariate meta-analytic models are combined with narrative synthesis. 
• Music training has a positive effect on auditory and linguistic processing. 
• Our work informs research on plasticity, transfer, and music-based interventions.
Abstract
It is often claimed that music training improves auditory and linguistic skills. Results of individual studies are mixed, however, and most evidence is correlational, precluding inferences of causation. Here, we evaluated data from 62 longitudinal studies that examined whether music training programs affect behavioral and brain measures of auditory and linguistic processing (N = 3928). For the behavioral data, a multivariate meta-analysis revealed a small positive effect of music training on both auditory and linguistic measures, regardless of the type of assignment (random vs. non-random), training (instrumental vs. non-instrumental), and control group (active vs. passive). The trim-and-fill method provided suggestive evidence of publication bias, but meta-regression methods (PET-PEESE) did not. For the brain data, a narrative synthesis also documented benefits of music training, namely for measures of auditory processing and for measures of speech and prosody processing. Thus, the available literature provides evidence that music training produces small neurobehavioral enhancements in auditory and linguistic processing, although future studies are needed to confirm that such enhancements are not due to publication bias.
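For readers curious about the statistical machinery behind such pooled effects, here is a minimal random-effects pooling sketch in Python. It uses the generic DerSimonian-Laird estimator with invented effect sizes and variances; the authors' actual analyses were multivariate meta-analytic models, so treat this only as an illustration of the core inverse-variance idea.

```python
# Generic random-effects meta-analysis pooling (DerSimonian-Laird), with invented
# per-study effect sizes -- a sketch of the core idea, not the authors' multivariate model.
import numpy as np

effects = np.array([0.25, 0.10, 0.40, 0.05, 0.30])     # hypothetical standardized effects (g)
variances = np.array([0.02, 0.015, 0.05, 0.01, 0.03])  # hypothetical sampling variances

w = 1.0 / variances                                    # fixed-effect (inverse-variance) weights
fixed_mean = np.sum(w * effects) / np.sum(w)
Q = np.sum(w * (effects - fixed_mean) ** 2)            # heterogeneity statistic
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (len(effects) - 1)) / C)          # between-study variance estimate

w_re = 1.0 / (variances + tau2)                        # random-effects weights
pooled = np.sum(w_re * effects) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled g = {pooled:.3f}, 95% CI = [{pooled - 1.96*se:.3f}, {pooled + 1.96*se:.3f}]")
```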

Wednesday, June 22, 2022

Effortless training of attention and self-control

I pass on the highlights statement from a fascinating opinion piece by Tang et al. (motivated readers can obtain a copy of the text from me). 

Highlights

A long-held belief in cognitive science is that training attention and self-control must recruit effort. Therefore, various effortful training programs such as attention or working memory training have been developed to improve attention and self-control (or executive function). However, effortful training has limited far-transfer effects.
A growing literature suggests a new way of effortless training for attention and self-control. Effortless training – such as nature exposure, flow experience, and effortless practices – has shown promising effects on improving attention and self-control.
Effortful training requires cognitive control supported by the frontoparietal network to sustain mental effort over the course of training. Effortless training engages autonomic control with less effort, and is supported by the anterior and posterior cingulate cortex, striatum, and parasympathetic nervous system (PNS).
For the past 50 years, cognitive scientists have assumed that training attention and self-control must be effortful. However, growing evidence suggests promising effects of effortless training approaches such as nature exposure, flow experience, and effortless practice on attention and self-control. This opinion article focuses on effortless training of attention and self-control. We begin by introducing our definitions of effortful and effortless training and reviewing the growing literature on these two different forms of training. We then discuss the similarities and differences in their respective behavioral outcomes and neural correlates. Finally, we propose a putative neural mechanism of effortless training. We conclude by highlighting promising directions for research, development, and application of effortless training.
Figure Legend: Core brain regions and their functions during effortless training.
Three colored areas represent the anterior cingulate cortex–posterior cingulate cortex (ACC–PCC)–striatum (APS) and their corresponding functions during training. The broken line arrows indicate that these regions actively communicate with each other during effortless training.

Wednesday, May 25, 2022

Why is a moving hand less sensitive to touch than a stationary hand?

Fuehrer et al. do a nice piece showing how our brains' predictive processing can alter our sensory experience:  

Significance

Tactile sensations on a moving hand are perceived weaker than when presented on the same but stationary hand. There is an ongoing debate about whether this weaker perception is based on sensorimotor predictions or is due to a blanket reduction in sensitivity. Here, we show greater suppression of sensations matching predicted sensory feedback. This reinforces the idea of precise estimations of future body sensory states suppressing the predicted sensory feedback. Our results shine light on the mechanisms of human sensorimotor control and are relevant for understanding clinical phenomena related to predictive processes.
Abstract
The ability to sample sensory information with our hands is crucial for smooth and efficient interactions with the world. Despite this important role of touch, tactile sensations on a moving hand are perceived weaker than when presented on the same but stationary hand. This phenomenon of tactile suppression has been explained by predictive mechanisms, such as internal forward models, that estimate future sensory states of the body on the basis of the motor command and suppress the associated predicted sensory feedback. The origins of tactile suppression have sparked a lot of debate, with contemporary accounts claiming that suppression is independent of sensorimotor predictions and is instead due to an unspecific mechanism. Here, we target this debate and provide evidence for specific tactile suppression due to precise sensorimotor predictions. Participants stroked with their finger over textured objects that caused predictable vibrotactile feedback signals on that finger. Shortly before touching the texture, we probed tactile suppression by applying external vibrotactile probes on the moving finger that either matched or mismatched the frequency generated by the stroking movement along the texture. We found stronger suppression of the probes that matched the predicted sensory feedback. These results show that tactile suppression is specifically tuned to the predicted sensory states of a movement.

Tuesday, May 03, 2022

Older adults store too much information.

From Amer et al.:  

Highlights

Healthy aging is accompanied by declines in control of attention.
These reductions in the control of attention result in older adults processing too much information, creating cluttered memory representations.
Cluttered representations can impair memory by interfering with the retrieval of target information, but can also provide an advantage on tasks that benefit from extensive knowledge.

Abstract

Declines in episodic memory in older adults are typically attributed to differences in encoding strategies and/or retrieval processes. These views omit a critical factor in age-related memory differences: the nature of the representations that are formed. Here, we review evidence that older adults create more cluttered (or richer) representations of events than do younger adults. These cluttered representations might include target information along with recently activated but no-longer-relevant information, prior knowledge cued by the ongoing situation, as well as irrelevant information in the current environment. Although these representations can interfere with the retrieval of target information, they can also support other memory-dependent cognitive functions.

Monday, March 21, 2022

Mellow Mice - Why deep breathing can keep us calm

How we are breathing is usually a good indicator of whether we are calm or aroused, and when we become anxious, often the best thing we can do is stop and take a deep breath. Gretchen Reynolds points to interesting work in mice suggesting that slow, deep breathing is calming because it does not activate the neurons in the brain's breathing center that communicate with the brain's arousal center (breathing pacemakers in humans closely resemble those in mice). Here is the abstract from Yackle et al.:
Slow, controlled breathing has been used for centuries to promote mental calming, and it is used clinically to suppress excessive arousal such as panic attacks. However, the physiological and neural basis of the relationship between breathing and higher-order brain activity is unknown. We found a neuronal subpopulation of about 350 neurons in the mouse preBötzinger complex (preBötC), the primary breathing rhythm generator, which regulates the balance between calm and arousal behaviors. Conditional, bilateral genetic ablation of the ~175 Cdh9/Dbx1 double-positive preBötC neurons in adult mice left breathing intact but increased calm behaviors and decreased time in aroused states. These neurons project to, synapse on, and positively regulate noradrenergic neurons in the locus coeruleus, a brain center implicated in attention, arousal, and panic that projects throughout the brain.

Friday, February 18, 2022

Illusory faces are more likely to be perceived as male than female

Interesting observations from Wardle et al.:
Despite our fluency in reading human faces, sometimes we mistakenly perceive illusory faces in objects, a phenomenon known as face pareidolia. Although illusory faces share some neural mechanisms with real faces, it is unknown to what degree pareidolia engages higher-level social perception beyond the detection of a face. In a series of large-scale behavioral experiments (total n = 3,815 adults), we found that illusory faces in inanimate objects are readily perceived to have a specific emotional expression, age, and gender. Most strikingly, we observed a strong bias to perceive illusory faces as male rather than female. This male bias could not be explained by preexisting semantic or visual gender associations with the objects, or by visual features in the images. Rather, this robust bias in the perception of gender for illusory faces reveals a cognitive bias arising from a broadly tuned face evaluation system in which minimally viable face percepts are more likely to be perceived as male.

Wednesday, February 16, 2022

Our brains store concepts as sensory-motor and affective information

Fascinating work from Fernandino et al., who show that concept representations are not independent of sensory-motor experience: 

Significance

The ability to identify individual objects or events as members of a kind (e.g., “knife,” “dog,” or “party”) is a fundamental aspect of human cognition. It allows us to quickly access a wealth of information pertaining to a newly encountered object or event and use it to guide our behavior. How is this information represented in the brain? We used functional MRI to analyze patterns of brain activity corresponding to hundreds of familiar concepts and quantitatively characterized the informational structure of these patterns. Our results indicate that conceptual knowledge is stored as patterns of neural activity that encode sensory-motor and affective information about each concept, contrary to the long-held idea that concept representations are independent of sensory-motor experience.
Abstract
The nature of the representational code underlying conceptual knowledge remains a major unsolved problem in cognitive neuroscience. We assessed the extent to which different representational systems contribute to the instantiation of lexical concepts in high-level, heteromodal cortical areas previously associated with semantic cognition. We found that lexical semantic information can be reliably decoded from a wide range of heteromodal cortical areas in the frontal, parietal, and temporal cortex. In most of these areas, we found a striking advantage for experience-based representational structures (i.e., encoding information about sensory-motor, affective, and other features of phenomenal experience), with little evidence for independent taxonomic or distributional organization. These results were found independently for object and event concepts. Our findings indicate that concept representations in the heteromodal cortex are based, at least in part, on experiential information. They also reveal that, in most heteromodal areas, event concepts have more heterogeneous representations (i.e., they are more easily decodable) than object concepts and that other areas beyond the traditional “semantic hubs” contribute to semantic cognition, particularly the posterior cingulate gyrus and the precuneus.

Friday, February 04, 2022

Attention and executive functions - improvements and declines with ageing.

From Verissimo et al.:
Many but not all cognitive abilities decline during ageing. Some even improve due to lifelong experience. The critical capacities of attention and executive functions have been widely posited to decline. However, these capacities are composed of multiple components, so multifaceted ageing outcomes might be expected. Indeed, prior findings suggest that whereas certain attention/executive functions clearly decline, others do not, with hints that some might even improve. We tested ageing effects on the alerting, orienting and executive (inhibitory) networks posited by Posner and Petersen’s influential theory of attention, in a cross-sectional study of a large sample (N = 702) of participants aged 58–98. Linear and nonlinear analyses revealed that whereas the efficiency of the alerting network decreased with age, orienting and executive inhibitory efficiency increased, at least until the mid-to-late 70s. Sensitivity analyses indicated that the patterns were robust. The results suggest variability in age-related changes across attention/executive functions, with some declining while others improve.
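For context, the three networks referenced here are usually quantified in the Attention Network Test as simple reaction-time contrasts. The sketch below uses the standard Fan et al. scoring with made-up condition means; the paper's own task version and analysis pipeline may differ.

```python
# Standard Attention Network Test (ANT) scoring from mean reaction times (ms).
# The condition means below are invented for illustration.
mean_rt = {
    "no_cue": 620.0, "double_cue": 585.0,        # alerting contrast
    "center_cue": 590.0, "spatial_cue": 560.0,   # orienting contrast
    "incongruent": 650.0, "congruent": 560.0,    # executive/inhibition contrast
}

alerting = mean_rt["no_cue"] - mean_rt["double_cue"]          # benefit of a temporal warning cue
orienting = mean_rt["center_cue"] - mean_rt["spatial_cue"]    # benefit of a spatially informative cue
executive = mean_rt["incongruent"] - mean_rt["congruent"]     # cost of flanker conflict

print(alerting, orienting, executive)   # larger alerting/orienting = more efficient;
                                        # larger executive cost = less efficient inhibition
```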

Wednesday, January 26, 2022

Our brains have multiple representations of the same body part.

Here is a neat finding. Remember your elementary biology textbook picture of the homunculi in our somatosensory and motor cortices? The small human figure spread across the surface of the brain, with a cortical location for the hand and every other body part? Matsumiya shows that when we direct eye and hand movements to the same body part, the two movements are guided by different body maps! Here is his abstract:

Significance

Accurate motor control depends on maps of the body in the brain, called the body schema. Disorders of the body schema cause motor deficits. Although we often execute actions with different motor systems such as the eye and hand, how the body schema operates during such actions is unknown. In this study, participants simultaneously directed eye and hand movements to the same body part. These two movements were found to be guided by different body maps. This finding demonstrates multiple motor system–specific representations of the body schema, suggesting that the choice of motor system toward one’s body can determine which of the brain’s body maps is observed. This may offer a new way to visualize patients’ body schema.
Abstract
Purposeful motor actions depend on the brain’s representation of the body, called the body schema, and disorders of the body schema have been reported to show motor deficits. The body schema has been assumed for almost a century to be a common body representation supporting all types of motor actions, and previous studies have considered only a single motor action. Although we often execute multiple motor actions, how the body schema operates during such actions is unknown. To address this issue, I developed a technique to measure the body schema during multiple motor actions. Participants made simultaneous eye and reach movements to the same location of 10 landmarks on their hand. By analyzing the internal configuration of the locations of these points for each of the eye and reach movements, I produced maps of the mental representation of hand shape. Despite these two movements being simultaneously directed to the same bodily location, the resulting hand map (i.e., a part of the body schema) was much more distorted for reach movements than for eye movements. Furthermore, the weighting of visual and proprioceptive bodily cues to build up this part of the body schema differed for each effector. These results demonstrate that the body schema is organized as multiple effector-specific body representations. I propose that the choice of effector toward one’s body can determine which body representation in the brain is observed and that this visualization approach may offer a new way to understand patients’ body schema.

Monday, January 10, 2022

Transcranial stimulation of alpha oscillations up-regulates the default mode network

Interesting work from Clancy et al. on the brain's default mode network, which carries out our self-referential rumination:

Significance

In the brain’s functional organization, the default mode network (DMN) represents a key architecture, whose dysregulation is involved in a host of major neuropsychiatric disorders. However, insights into the regulation of the DMN remain scarce. Through neural synchrony, the alpha-frequency oscillation represents another key underpinning of the brain’s organization and is thought to share an inherent interdependence with the DMN. Here, we demonstrated that transcranial alternating current stimulation of alpha oscillations (α-tACS) not only augmented alpha activity but also strengthened connectivity of the DMN, with the former serving as a mediator of the latter. These findings reveal that alpha oscillations can support DMN functioning. In addition, they identify an effective noninvasive approach to regulate the DMN via α-tACS.
Abstract
The default mode network (DMN) is the most-prominent intrinsic connectivity network, serving as a key architecture of the brain’s functional organization. Conversely, dysregulated DMN is characteristic of major neuropsychiatric disorders. However, the field still lacks mechanistic insights into the regulation of the DMN and effective interventions for DMN dysregulation. The current study approached this problem by manipulating neural synchrony, particularly alpha (8 to 12 Hz) oscillations, a dominant intrinsic oscillatory activity that has been increasingly associated with the DMN in both function and physiology. Using high-definition alpha-frequency transcranial alternating current stimulation (α-tACS) to stimulate the cortical source of alpha oscillations, in combination with simultaneous electroencephalography and functional MRI (EEG-fMRI), we demonstrated that α-tACS (versus Sham control) not only augmented EEG alpha oscillations but also strengthened fMRI and (source-level) alpha connectivity within the core of the DMN. Importantly, increase in alpha oscillations mediated the DMN connectivity enhancement. These findings thus identify a mechanistic link between alpha oscillations and DMN functioning. That transcranial alpha modulation can up-regulate the DMN further highlights an effective noninvasive intervention to normalize DMN functioning in various disorders.

Tuesday, December 14, 2021

New articles on exercise and the brain

Gretchen Reynolds has done two recent brief reviews:

 The Quiet Brain of the Athlete describes work showing that the brains of fit, young athletes dial down extraneous noise and attend to important sounds better than those of other young people. 

And, 

 Staying physically active may protect the aging brain. Simple activities like walking boost immune cells in the brain that may help to keep memory sharp and even ward off Alzheimer’s disease.

Friday, December 10, 2021

Temporal Self-Compression

Brietzke and Meyer (open access) provide behavioral and neural evidence that our past and future selves are compressed as they move away from the present:  

Significance

For centuries, great thinkers have struggled to understand how people represent a personal identity that changes over time. Insight may come from a basic principle of perception: as objects become distant, they also become less discriminable or “compressed.” In Studies 1–3, we demonstrate that people’s ratings of their own personality become increasingly less differentiated as they consider more distant past and future selves. In Study 4, we found neural evidence that the brain compresses self-representations with time as well. When we peer out a window, objects close to us are in clear view, whereas distant objects are hard to tell apart. We provide evidence that self-perception may operate similarly, with the nuance of distant selves increasingly harder to perceive.
Abstract
A basic principle of perception is that as objects increase in distance from an observer, they also become logarithmically compressed in perception (i.e., not differentiated from one another), making them hard to distinguish. Could this basic principle apply to perhaps our most meaningful mental representation: our own sense of self? Here, we report four studies that suggest selves are increasingly non-discriminable with temporal distance from the present as well. In Studies 1 through 3, participants made trait ratings across various time points in the past and future. We found that participants compressed their past and future selves, relative to their present self. This effect was preferential to the self and could not be explained by the alternative possibility that individuals simply perceive arbitrary self-change with time irrespective of temporal distance. In Study 4, we tested for neural evidence of temporal self-compression by having participants complete trait ratings across time points while undergoing functional MRI. Representational similarity analysis was used to determine whether neural self-representations are compressed with temporal distance as well. We found evidence of temporal self-compression in areas of the default network, including medial prefrontal cortex and posterior cingulate cortex. Specifically, neural pattern similarity between self-representations was logarithmically compressed with temporal distance. Taken together, these findings reveal a “temporal self-compression” effect, with temporal selves becoming increasingly non-discriminable with distance from the present.
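To make the "logarithmic compression" idea concrete, here is a toy Python sketch of the representational-similarity logic: pattern similarity to the present self falling off with the logarithm of temporal distance. The data, time points, and generative model are synthetic and are not taken from the study.

```python
# Toy illustration of temporal self-compression: if self-representations compress with
# temporal distance, pattern similarity to the present self should track log(distance).
# All numbers here are synthetic; the time points are arbitrary, not the study's.
import numpy as np

rng = np.random.default_rng(1)
distances = np.array([1, 3, 6, 12, 24, 60], dtype=float)   # months from the present (assumed)
n_features = 200                                           # stand-in for voxels or trait ratings

present = rng.standard_normal(n_features)

def distant_self(d):
    """Blend the present-self pattern toward noise in proportion to log(distance)."""
    drift = np.log1p(d) / np.log1p(distances.max())
    return (1 - drift) * present + drift * rng.standard_normal(n_features)

similarity = np.array([np.corrcoef(present, distant_self(d))[0, 1] for d in distances])

# Compression: similarity declines roughly linearly in log(distance), i.e. nearby selves
# stay well differentiated while distant selves blur together.
print(np.round(similarity, 2))
print("corr with log(distance):", np.corrcoef(similarity, np.log(distances))[0, 1])
```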

Monday, December 06, 2021

The Science of Mind Reading

James Somers offers a fascinating article in the Nov. 29 issue of The New Yorker, which I recommend that you read. It describes the development of the technique of Latent Semantic Analysis (L.S.A.), which originated in the work of the psychologist Charles Osgood nearly 70 years ago and is now being applied to fMRI recordings to infer what people are internally thinking or seeing.
In 2013, researchers at Google unleashed a descendant of it onto the text of the whole World Wide Web. Google’s algorithm turned each word into a “vector,” or point, in high-dimensional space. The vectors generated by the researchers’ program, word2vec, are eerily accurate: if you take the vector for “king” and subtract the vector for “man,” then add the vector for “woman,” the closest nearby vector is “queen.” Word vectors became the basis of a much improved Google Translate, and enabled the auto-completion of sentences in Gmail. Other companies, including Apple and Amazon, built similar systems. Eventually, researchers realized that the “vectorization” made popular by L.S.A. and word2vec could be used to map all sorts of things. Today’s facial-recognition systems have dimensions that represent the length of the nose and the curl of the lips, and faces are described using a string of coördinates in “face space.” Chess A.I.s use a similar trick to “vectorize” positions on the board. The technique has become so central to the field of artificial intelligence that, in 2017, a new, hundred-and-thirty-five-million-dollar A.I. research center in Toronto was named the Vector Institute. Matthew Botvinick, a professor at Princeton whose lab was across the hall from Norman’s, and who is now the head of neuroscience at DeepMind, Alphabet’s A.I. subsidiary, told me that distilling relevant similarities and differences into vectors was “the secret sauce underlying all of these A.I. advances.”
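The vector arithmetic Somers describes is easy to try yourself with the publicly released word2vec vectors, fetched here through the gensim library's downloader (the model name below is the one distributed with gensim; it is a multi-gigabyte download):

```python
# A minimal sketch of the "king - man + woman ≈ queen" arithmetic described above,
# using gensim's downloader to fetch the pretrained Google News word2vec vectors.
import gensim.downloader as api

model = api.load("word2vec-google-news-300")  # large download on first use

# Subtract "man" from "king", add "woman", then find the nearest vector in the vocabulary.
result = model.most_similar(positive=["king", "woman"], negative=["man"], topn=1)
print(result)  # typically [('queen', ~0.71)]
```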

Subsequent sections of the article describe how machine learning has been brought to bear on brain imaging, with voxels of neural activity serving as dimensions in a kind of thought space.

...today’s thought-decoding researchers mostly look for specific thoughts that have been defined in advance. But a “general-purpose thought decoder,” Norman told me, is the next logical step for the research. Such a device could speak aloud a person’s thoughts, even if those thoughts have never been observed in an fMRI machine. In 2018, Botvinick, Norman’s hall mate, co-wrote a paper in the journal Nature Communications titled “Toward a Universal Decoder of Linguistic Meaning from Brain Activation.” Botvinick’s team had built a primitive form of what Norman described: a system that could decode novel sentences that subjects read silently to themselves. The system learned which brain patterns were evoked by certain words, and used that knowledge to guess which words were implied by the new patterns it encountered.
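As a rough illustration of how such decoders work, here is a toy sketch of the general approach with synthetic data. It is not the Nature Communications pipeline (which decoded whole sentences); it only shows the logic of learning a linear map from voxel patterns to word-embedding vectors and then decoding a new pattern by nearest neighbor in embedding space.

```python
# Toy decoder sketch: learn a linear map from voxel activity patterns to word-embedding
# vectors, then decode a new brain pattern by finding the nearest word in embedding space.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_voxels, n_dims = 5000, 300            # voxel count and embedding size are arbitrary
vocab = ["dog", "house", "justice", "music", "river"]
embeddings = rng.standard_normal((len(vocab), n_dims))   # stand-ins for real word vectors

# Simulated training data: one brain pattern per word (real studies use many stimuli).
true_map = rng.standard_normal((n_dims, n_voxels)) * 0.01
brain_patterns = embeddings @ true_map + rng.standard_normal((len(vocab), n_voxels)) * 0.1

decoder = Ridge(alpha=1.0)
decoder.fit(brain_patterns, embeddings)   # map voxels -> embedding dimensions

def decode(pattern):
    """Predict an embedding from a brain pattern and return the closest vocabulary word."""
    pred = decoder.predict(pattern[None, :])[0]
    sims = embeddings @ pred / (np.linalg.norm(embeddings, axis=1) * np.linalg.norm(pred))
    return vocab[int(np.argmax(sims))]

print(decode(brain_patterns[2]))   # recovers "justice" on this toy data
```

Real decoders are trained on many stimuli and evaluated on held-out items; the point here is only the mapping-plus-nearest-neighbor logic.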

Monday, November 22, 2021

Fluid intelligence and the locus coeruleus-norepinephrine system

Tsukahara and Engle suggest that the cognitive mechanisms of fluid intelligence map onto the locus coeruleus–norepinephrine system. I pass on their introductory paragraph (the link takes you to their abstract, which I think is less informative):
In this article, we outline what we see as a potentially important relationship for understanding the biological basis of intelligence: that is, the relationship between fluid intelligence and the locus coeruleus–norepinephrine system. This is largely motivated by our findings that baseline pupil size is related to fluid intelligence; the larger the pupils, the higher the fluid intelligence. The connection to the locus coeruleus is based on research showing that the size of the pupil can be used as an indicator of locus coeruleus activity. A large body of research on the locus coeruleus–norepinephrine system in animal and human studies has shown how this system is critical for an impressively wide range of behaviors and cognitive processes, from regulating sleep/wake cycles, to sensation and perception, attention, learning and memory, decision making, and more. The locus coeruleus–norepinephrine system achieves this primarily through its widespread projection system throughout the cortex, strong connections with the prefrontal cortex, and the effect of norepinephrine at many levels of brain function. Given the broad role of this system in behavior, cognition, and brain function, we propose that the locus coeruleus–norepinephrine system is essential for understanding the biological basis of intelligence.

Wednesday, November 17, 2021

Our brainstems respond to fake therapies and fake side effects.

Here is the abstract from a Journal of Neuroscience paper by Crawford et al. titled "Brainstem mechanisms of pain modulation: a within-subjects 7T fMRI study of Placebo Analgesic and Nocebo Hyperalgesic Responses":
Pain perception can be powerfully influenced by an individual’s expectations and beliefs. Whilst the cortical circuitry responsible for pain modulation has been thoroughly investigated, the brainstem pathways involved in the modulatory phenomena of placebo analgesia and nocebo hyperalgesia remain to be directly addressed. This study employed ultra-high field 7 Tesla functional MRI (fMRI) to accurately resolve differences in brainstem circuitry present during the generation of placebo analgesia and nocebo hyperalgesia in healthy human participants (N = 25; 12 Male). Over two successive days, through blinded application of altered thermal stimuli, participants were deceptively conditioned to believe that two inert creams labelled ‘lidocaine’ (placebo) and ‘capsaicin’ (nocebo) were acting to modulate their pain relative to a third ‘Vaseline’ (control) cream. In a subsequent test phase, fMRI image sets were collected whilst participants were given identical noxious stimuli to all three cream sites. Pain intensity ratings were collected and placebo and nocebo responses determined. Brainstem-specific fMRI analysis revealed altered activity in key pain-modulatory nuclei, including a disparate recruitment of the periaqueductal gray (PAG) – rostral ventromedial medulla (RVM) pathway when both greater placebo and nocebo effects were observed. Additionally, we found that placebo and nocebo responses differentially activated the parabrachial nucleus but overlapped in their engagement of the substantia nigra and locus coeruleus. These data reveal that placebo and nocebo effects are generated through differential engagement of the PAG-RVM pathway, which in concert with other brainstem sites likely influence the experience of pain by modulating activity at the level of the dorsal horn.

Friday, November 12, 2021

Freedom From Illusion

A friend who attended the lecture I gave last Sunday (A New Vision of How Our Minds Work), which I mentioned in a Monday post, sent me an article from The Buddhist Review "TRICYCLE" by Pema Düddul titled "Freedom From Illusion". If you scan both texts, I suspect you will find, as I do, a striking consonance between the neuroscientific and Buddhist perspectives on "Illusion."

From the beginning of the Düddul article:

A shooting star, a clouding of the sight, 
a lamp, an illusion, a drop of dew, a bubble, 
a dream, a lightning’s flash, a thunder cloud: 
this is the way one should see the conditioned.
This revered verse from the Diamond Sutra points to one of Buddhism’s most profound yet confounding truths—the illusory nature of all things. The verse is designed to awaken us to ultimate reality, specifically to the fact that all things, especially thoughts and feelings, are the rainbow-like display of the mind. One of the Tibetan words for the dualistic mind means something like “a magician creating illusions.” As my teacher Ngakpa Karma Lhundup Rinpoche explained: “All of our thoughts are magical illusions created by our mind. We get trapped, carried away by our own illusions. We forget that we are the magician in the first place!”
Compare this with my talk's description of predictive processing, and how what we see, hear, touch, taste, and smell are largely simulations or illusions about the world. Here is a summary sentence in one of my slides, taken from a lecture by Ruben Laukkonen, in which I replace his last word, 'fantasies,' with the word 'illusions.'
Everything we do and experience is in service of reducing surprises by fulfilling illusions.