People overestimate their knowledge, at times claiming knowledge of concepts, events, and people that do not exist and cannot be known, a phenomenon called overclaiming. What underlies assertions of such impossible knowledge? We found that people overclaim to the extent that they perceive their personal expertise favorably. A first set of studies showed that self-perceived financial knowledge positively predicts claiming knowledge of nonexistent financial concepts (invented by the researchers: pre-rated stocks, fixed-rate deduction, annualized credit), independent of actual knowledge. A second study demonstrated that self-perceived knowledge within specific domains (e.g., biology) is associated specifically with overclaiming within those domains (taking the fictitious terms meta-toxins, bio-sexual, and retroplex to be real). In another study, warning participants that some of the concepts they saw were fictitious did not reduce the relationship between self-perceived knowledge and overclaiming, which suggests that this relationship is not driven by impression management. Finally, boosting self-perceived expertise in geography (by having participants take an easy versus difficult geography quiz) prompted assertions of familiarity with nonexistent places, which supports a causal role for self-perceived expertise in claiming impossible knowledge.

The authors note this line by American historian Daniel Boorstin (1914-2004): "The menace to understanding is not so much ignorance as the illusion of knowledge."
This blog reports new ideas and work on mind, brain, behavior, psychology, and politics - as well as random curious stuff.
Tuesday, August 11, 2015
Overclaiming - the illusion of knowledge
Atir et al. probe how our inflated perception of our expertise can lead us to make claims of impossible knowledge. Their edited abstract:
Blog Categories:
attention/perception,
memory/learning,
self
Monday, August 10, 2015
Habitual exercise correlates with lower distractibility.
Yet another demonstration of the salutary effects of long term exercise on mental function:
Aging is associated with compromised executive control functions. Several lines of evidence point to beneficial effects of physical activity on cognition which indicate that regular physical activity may counteract the age-related decline of some executive functions. Here, we investigate the effects of lifelong physical activity (about 50 years) on interference processing in two matched groups of 20 physically high active and 20 low active healthy older men using event-related potentials (ERPs). In a low interference block of the Stroop task, participants had to indicate the meaning of color-words, while color was either compatible or incompatible with the meaning. In the high interference block, participants were asked to respond according to the ink color of the word and to ignore its meaning. Physically active seniors showed faster reaction times, lower individual variability in reaction times, and higher accuracy compared to low active seniors, particularly in the high interference block. This result was confirmed in the classic paper-and-pencil version of the Stroop task showing higher interference score in the low active than high active individuals. ERPs revealed a shorter latency of the P2 and generally more negative amplitudes of the fronto-central N2 and N450 components in the high active group compared to the low active group. The amount of interference was negatively correlated with objectively measured fitness and self-reported physical activity. The positive effect of physical fitness on interference processing in the behavioral data was related to N2 and N450 amplitudes. Taken together, this suggests that seniors reporting long-term physical activity may exhibit generally enhanced activity in the frontal cortex which enables more efficient interference resolution in the Stroop task.
Blog Categories:
aging,
attention/perception,
brain plasticity,
exercise
Friday, August 07, 2015
The benefits of pupil orientation.
This is kinda neat, from the current issue of Science Magazine:
Slit-eyed animals have either vertical or horizontal pupils. It is unclear whether one orientation conveys any sort of competitive advantage over the other, and if so, under what circumstances. Banks et al. suggest that the optics of vertical pupil slits generally benefit predators, whereas the optics of horizontal slits benefit prey. Vertical slits are better for estimating object distance and distances along the ground—perfect for a predator stalking its prey. In contrast, horizontal slits are better for seeing objects on the horizon—ideal for prey seeing an approaching predator and deciding which way to flee.
Are chatbots destined to become our most sympathetic listeners?
Markoff and Mozur do a fascinating piece in the NYTimes describing how millions of young Chinese use a smartphone program as their intimate companion.
Xiaoice (pronounced Shao-ice) can chat with so many people for hours on end because she is not real. She is a chatbot, a program introduced last year by Microsoft that has become something of a hit in China. It is also making the 2013 film “Her,” in which the actor Joaquin Phoenix plays a character who falls in love with a computer operating system, seem less like science fiction.
The program remembers details from previous exchanges with users, such as a breakup with a girlfriend or boyfriend, and asks in later conversations how the user is feeling. Xiaoice is a text-messaging program; the next version will include a Siri-like voice so people can talk with Xiaoice.
Microsoft has been able to give Xiaoice a more compelling personality and sense of “intelligence” by systematically mining the Chinese Internet for human conversations. The company has developed language processing technology that picks out pairs of questions and answers from actual typed conversations. As a result, Xiaoice has a database of responses that are human and current — she is fond of using emojis, too. (Xiaoice translates roughly to “Little Bing,” after the Microsoft search engine.)

The Microsoft App website lists a few simple English-language chatbots, with only a few reviews, nothing like the sophisticated A.I. software being used by the Chinese Microsoft program in Beijing.
Blog Categories:
culture/politics,
social cognition,
technology
Thursday, August 06, 2015
Benefits of high school music training.
Maybe my avoiding gym classes in high school by being in the marching band and chorus paid off some brain benefits (in addition to my already being a pianist). This work from Tierney et al. also suggests that the nationwide savaging of high school music curricula is a really bad idea:
Fundamental changes in brain structure and function during adolescence are well-characterized, but the extent to which experience modulates adolescent neurodevelopment is not. Musical experience provides an ideal case for examining this question because the influence of music training begun early in life is well-known. We investigated the effects of in-school music training, previously shown to enhance auditory skills, versus another in-school training program that did not focus on development of auditory skills (active control). We tested adolescents on neural responses to sound and language skills before they entered high school (pretraining) and again 3 y later. Here, we show that in-school music training begun in high school prolongs the stability of subcortical sound processing and accelerates maturation of cortical auditory responses. Although phonological processing improved in both the music training and active control groups, the enhancement was greater in adolescents who underwent music training. Thus, music training initiated as late as adolescence can enhance neural processing of sound and confer benefits for language skills. These results establish the potential for experience-driven brain plasticity during adolescence and demonstrate that in-school programs can engender these changes.
Blog Categories:
brain plasticity,
human development,
language,
music
Wednesday, August 05, 2015
Synchronizing brain theta oscillations strengthens our adaptive behavior control.
Reinhart et al. show that synchronizing low-frequency theta (4-8 Hz) EEG oscillations over the medial-frontal cortex with noninvasive direct current electrical stimulation enhances adaptive control of behavior:
Significance
The ability to exert control over our behavior is fundamental to human cognition, and is impaired in many neuropsychiatric disorders. Here, we show evidence for the neural mechanisms of adaptive control that distinguish healthy people from people who have schizophrenia. We found that the noninvasive electrical stimulation phase aligns low-frequency brain rhythms and enhances functional connectivity. This brain stimulation modulated the temporal structure of low-frequency oscillations and synchrony, improving adaptive control. Moreover, we found that causal changes in the low-frequency oscillations improved behavioral responses to errors and long-range connectivity at the single-trial level. These results implicate theories of executive control and cortical dysconnectivity, and point to the possible development of nonpharmacological treatment alternatives for neuropsychiatric conditions.

Abstract
Executive control and flexible adjustment of behavior following errors are essential to adaptive functioning. Loss of adaptive control may be a biomarker of a wide range of neuropsychiatric disorders, particularly in the schizophrenia spectrum. Here, we provide support for the view that oscillatory activity in the frontal cortex underlies adaptive adjustments in cognitive processing following errors. Compared with healthy subjects, patients with schizophrenia exhibited low frequency oscillations with abnormal temporal structure and an absence of synchrony over medial-frontal and lateral-prefrontal cortex following errors. To demonstrate that these abnormal oscillations were the origin of the impaired adaptive control in patients with schizophrenia, we applied noninvasive dc electrical stimulation over the medial-frontal cortex. This noninvasive stimulation descrambled the phase of the low-frequency neural oscillations that synchronize activity across cortical regions. Following stimulation, the behavioral index of adaptive control was improved such that patients were indistinguishable from healthy control subjects. These results provide unique causal evidence for theories of executive control and cortical dysconnectivity in schizophrenia.
Tuesday, August 04, 2015
Another awesome brain video
New stuff on exercise, brain, and body.
I'll use this post to point readers to several recent interesting articles on physical activity. Hutchinson reviews work that distinguishes the effects of strength and endurance training versus balance and stability training. The former isn't all that useful without the latter, especially in older adults (have you tried standing on one leg with your eyes closed lately?). A German study followed older adults for 12 months, comparing those who did cardiovascular (walking) exercise three times a week with those who did coordination training. Both groups showed improvement in cognitive functioning, but in different ways. Cardiovascular training was associated with increased activation of the sensorimotor network, whereas coordination training was associated with increased activation in the visual–spatial network. Mouse studies show that aerobic exercise and strength training trigger brain chemicals that enhance neuron growth and survival, while balance and coordination exercises also recruit higher-level cognitive processes that seem to increase the number of synapses connecting neurons. Work by Kumpulainen et al. suggests that novelty and unpredictability (as in gymnasts or dancers), rather than repetition (as in endurance athletes), are essential for brain plasticity and engagement.
In another item, Reynolds updates the story on the beneficial effects of intense interval training. Just a few minutes of very intense exercise are much more effective in improving health and cardiovascular fitness than slow and steady repetitive exercise. To try to deal with the problem that most people really don't enjoy zonking themselves out with intense intervals, Bangsbo and collaborators tried a different approach, asking runners to run gently for 30 seconds, then accelerate to a moderate pace for 20 seconds, then sprint as hard as possible for 10 seconds. Repeat five times, rest for a bit, and continue the sequence during a 5-km run. They observed the same beneficial effects on blood pressure and endurance observed with more arduous (several minute) bouts of high intensity training. I tried this 30-20-10 sequence with my favored aerobic exercise, swimming (just counting the intervals to myself made them pass more quickly), and I came out of the routine feeling way more wired than after my usual moderately active swim period.
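As a small aside for programmer readers, the 30-20-10 structure is easy to lay out in code. Here is a minimal Python sketch; the three-block count and two-minute rest are my illustrative guesses, since the article specifies only the 30-20-10 pattern repeated five times with a rest before continuing:

```python
def thirty_twenty_ten(sets=5, blocks=3, rest_s=120):
    """Build a 30-20-10 interval schedule as (pace, seconds) pairs.

    Each set is 30 s gentle, 20 s moderate, 10 s sprint; sets repeat
    within a block, with a rest between blocks. Block count and rest
    length are illustrative assumptions, not from the article.
    """
    one_set = [("gentle", 30), ("moderate", 20), ("sprint", 10)]
    schedule = []
    for b in range(blocks):
        schedule.extend(one_set * sets)
        if b < blocks - 1:
            schedule.append(("rest", rest_s))
    return schedule

plan = thirty_twenty_ten()
work = sum(secs for pace, secs in plan if pace != "rest")
print(work)  # 900: 5 sets x 60 s x 3 blocks of actual running
```

Each block works out to just five minutes of running, which is the whole appeal of the approach.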
Blog Categories:
acting/choosing,
aging,
brain plasticity,
exercise
Monday, August 03, 2015
Brain correlates of the impatience of adolescents.
Teens and young adults show greater impulsivity than children and adults, reflected by increases in emergency room visits, accidents from drug or alcohol use, and increased mortality risk. van den Bos et al. relate this to developmental changes in the structural and functional connectivity of different frontostriatal tracts. Participants made choices between smaller, sooner (SS) and larger, later (LL) monetary rewards in a delay-discounting task. Accepting a delay for a larger, later reward correlated with prefrontal inhibition of areas in the striatum:
Adolescence is a developmental period associated with an increase in impulsivity. Impulsivity is a multidimensional construct, and in this study we focus on one of the underlying components: impatience. Impatience can result from (i) disregard of future outcomes and/or (ii) oversensitivity to immediate rewards, but it is not known which of these evaluative processes underlie developmental changes. To distinguish between these two causes, we investigated developmental changes in the structural and functional connectivity of different frontostriatal tracts. We report that adolescents were more impatient on an intertemporal choice task and reported less future orientation, but not more present hedonism, than young adults. Developmental increases in structural connectivity strength in the right dorsolateral prefrontal tract were related to increased negative functional coupling with the striatum and an age-related decrease in discount rates. Our results suggest that mainly increased control, and the integration of future-oriented thought, drives the reduction in impatience across adolescence.

This clip from one of the figures shows the relevant brain areas:
By the way, on the subject of adolescents, I'll point to another piece of work by Baker et al. on developmental changes in brain network hub connectivity in late adolescence.
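For readers who like the formalism: intertemporal choices of this kind are commonly modeled with hyperbolic discounting, where a reward of amount A delayed by D time units is valued at A/(1 + kD), and a larger k means steeper discounting (more impatience). A minimal sketch; the dollar amounts and k values below are illustrative, not taken from the study:

```python
def hyperbolic_value(amount, delay_days, k):
    """Subjective value of a delayed reward under hyperbolic discounting."""
    return amount / (1 + k * delay_days)

def chooses_larger_later(ss, ll, k):
    """Compare a smaller-sooner (SS) and larger-later (LL) option,
    each given as (amount, delay_days); True if LL is preferred."""
    return hyperbolic_value(*ll, k) > hyperbolic_value(*ss, k)

ss = (20, 0)   # $20 now
ll = (50, 30)  # $50 in 30 days
print(chooses_larger_later(ss, ll, k=0.01))  # True: patient chooser
print(chooses_larger_later(ss, ll, k=0.20))  # False: steep discounter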
Blog Categories:
acting/choosing,
human development,
motivation/reward
Friday, July 31, 2015
Unlearning social biases during sleep
Feld and Born note that tenacious implicit prejudices of race or gender drive discrimination seen in the rise of nationalistic groups, excessive police violence against minority group members, persisting unequal pay for women, and sexual harassment across the developed world. They point to work by Hu et al. showing that such unwanted attitudes may be persistently changed by social counterbias training when fresh memories of that training are systematically reactivated during subsequent sleep. Here is part of their summary:
Sleep, and specifically deep or slow-wave sleep [non–rapid eye movement (REM) sleep], benefits memory formation by reactivating neuronal traces that were formed during the preceding period of wakefulness. This reactivation of specific memories leads to their strengthening and transformation. Such reactivation can be experimentally induced during slow-wave sleep by presenting cues that were present during the prior period of memory acquisition. Initial studies showed that an odor present during learning of object locations enhances these memories when the participant is reexposed to the odor during slow-wave sleep after learning. These findings have been confirmed in numerous studies investigating different memory systems and also when auditory instead of olfactory cues are used. This basic research has firmly established the possibility of influencing sleep to enhance specific newly learned memories by targeted memory reactivation.
The findings by Hu et al. now suggest that this method can also be used to influence implicit attitudes that are known to typically manifest themselves early during childhood and remain very stable into adulthood. Before a 90-min nap, participants underwent training aimed at countering typical implicit gender and racial biases by learning to associate genders and races with opposing attributes; that is, to associate female faces with science-related words and black faces with “good” words. Critically, presentation of the to-be-learned counterassociations was combined with a sound, which served as a cue to promote the reactivation of the newly learned associations during a subsequent nap while the participant was deep in slow-wave sleep. Only when this sound was re-presented during slow-wave sleep did the posttraining reduction in implicit social bias survive and was even evident 1 week later. These findings are all the more convincing as the authors conducted the reactivation step during a 90-min daytime nap. During normal sleep at night, the effects are expected to be even stronger, owing to the generally deeper and longer periods of slow-wave sleep and REM sleep. Additionally, the accompanying neuroendocrine milieu makes nocturnal sleep even more efficient for memory reinforcement.
Previous studies have shown that such targeted reactivation of memory during sleep can effectively extinguish unwanted behavior such as experimentally induced fear in humans. The present study is the first to demonstrate that this method can be used to break long-lived, highly pervasive response habits deeply rooted in memory and thereby influence behavior at an entirely unconscious level.

A caution:
However, Aldous Huxley's description of a dystopian “brave new world” where young children are conditioned to certain values during sleep reminds us that this research also needs to be guided by ethical considerations. Sleep is a state in which the individual is without willful consciousness and therefore vulnerable to suggestion. Beyond that, Hu et al.'s findings highlight the breadth of possible applications to permanently modify any unwanted behavior by targeted memory reactivation during sleep.
Blog Categories:
acting/choosing,
culture/politics,
memory/learning,
sleep
Thursday, July 30, 2015
The danger of artificial intelligence is artificial stupidity.
Here are some clips from an interesting Op-Ed piece by Quentin Hardy on artificial intelligence (and, by the way, a recent issue of Science Magazine has a special section on A.I. with a series of related articles):
...the real worry...is a computer program rapidly overdoing a single task, with no context. A machine that makes paper clips proceeds unfettered, one example goes, and becomes so proficient that overnight we are drowning in paper clips.
There is little sense among practitioners in the field of artificial intelligence that machines are anywhere close to acquiring the kind of consciousness where they could form lethal opinions about their makers...doomsday scenarios confuse the science with remote philosophical problems about the mind and consciousness...If more people learned how to write software, they’d see how literal-minded these overgrown pencils we call computers actually are.
Deep Learning relies on a hierarchical reasoning technique called neural networks, suggesting the neurons of a brain. Comparing a node in a neural network to a neuron, though, is at best like comparing a toaster to the space shuttle....But machine learning is automation, a better version of what computers have always done. The “learning” is not stored and generalized in the ways that make people smart.
DeepMind made a program that mastered simple video games, but it never took the learning from one game into another. The 22 rungs of a neural net it climbs to figure out what is in a picture do not operate much like human image recognition and are still easily defeated...Moving out of that stupidity to a broader humanlike capability is called “transfer learning.” It is at best in the research phase.
“People in A.I. know that a chess-playing computer still doesn’t yearn to capture a queen,” said Stuart Russell, a professor of Computer Science at the University of California, Berkeley... He seeks mathematical ways to ensure dumb programs don’t conflict with our complex human values.
Blog Categories:
consciousness,
future,
human evolution,
technology
Wednesday, July 29, 2015
Placebo analgesia reduces empathy for pain.
Fascinating observations from Rütgen et al. They show that experimental modulation of a first-hand emotion experience also modulates empathy for that emotion experience. This confirms that the overlapping neural circuitry representing another's emotion is specifically grounded in neural mechanisms that also subserve the corresponding first-hand emotion experience, as opposed to unspecific or domain-general neural processes associated with emotion experiences:
Previous research in social neuroscience has consistently shown that empathy for pain recruits brain areas that are also activated during the first-hand experience of pain. This has been interpreted as evidence that empathy relies upon neural processes similar to those underpinning the first-hand experience of emotions. However, whether such overlapping neural activations imply that equivalent neural functions are engaged by empathy and direct emotion experiences remains to be demonstrated. We induced placebo analgesia, a phenomenon specifically modulating the first-hand experience of pain, to test whether this also reduces empathy for pain. Subjective and neural measures of pain and empathy for pain were collected using self-report and event-related potentials (ERPs) while participants underwent painful electrical stimulation or witnessed that another person was undergoing such stimulation. Self-report showed decreased empathy during placebo analgesia, and this was mirrored by reduced amplitudes of the pain-related P2, an ERP component indexing neural computations related to the affective-motivational component of pain. Moreover, these effects were specific for pain, as self-report and ERP measures of control conditions unrelated to pain were not affected by placebo analgesia. Together, the present results suggest that empathy seems to rely on neural processes that are (partially) functionally equivalent to those engaged by first-hand emotion experiences. Moreover, they imply that analgesics may have the unwanted side effect of reducing empathic resonance and concern for others.
Blog Categories:
emotion,
fear/anxiety/stress,
social cognition
Tuesday, July 28, 2015
Dopamine and subjective well-being
Dolan and collaborators note influences of dopamine on emotion and decision making that are distinct from its known role in learning:
The neuromodulator dopamine has a well established role in reporting appetitive prediction errors that are widely considered in terms of learning. However, across a wide variety of contexts, both phasic and tonic aspects of dopamine are likely to exert more immediate effects that have been less well characterized. Of particular interest is dopamine's influence on economic risk taking and on subjective well-being, a quantity known to be substantially affected by prediction errors resulting from the outcomes of risky choices. By boosting dopamine levels using levodopa (L-DOPA) as human subjects made economic decisions and repeatedly reported their momentary happiness, we show here an effect on both choices and happiness. Boosting dopamine levels increased the number of risky options chosen in trials involving potential gains but not trials involving potential losses. This effect could be better captured as increased Pavlovian approach in an approach–avoidance decision model than as a change in risk preferences within an established prospect theory model. Boosting dopamine also increased happiness resulting from some rewards. Our findings thus identify specific novel influences of dopamine on decision making and emotion that are distinct from its established role in learning.
Blog Categories:
acting/choosing,
happiness,
motivation/reward
Monday, July 27, 2015
Brain markers of individual differences in human prosociality.
Sul et al. make the fascinating observation that self-regarding and other-regarding regions of the medial prefrontal cortex show greater segregation in selfish individuals and more overlap in prosocial individuals.
Despite the importance of valuing another person’s welfare for prosocial behavior, currently we have only a limited understanding of how these values are represented in the brain and, more importantly, how they give rise to individual variability in prosociality. In the present study, participants underwent functional magnetic resonance imaging while performing a prosocial learning task in which they could choose to benefit themselves and/or another person. Choice behavior indicated that participants valued the welfare of another person, although less so than they valued their own welfare. Neural data revealed a spatial gradient in activity within the medial prefrontal cortex (MPFC), such that ventral parts predominantly represented self-regarding values and dorsal parts predominantly represented other-regarding values. Importantly, compared with selfish individuals, prosocial individuals showed a more gradual transition from self-regarding to other-regarding value signals in the MPFC and stronger MPFC–striatum coupling when they made choices for another person rather than for themselves. The present study provides evidence of neural markers reflecting individual differences in human prosociality.
Friday, July 24, 2015
Top-down alpha band oscillations optimize visual perception.
Interesting work from Samaha et al.:
Significance
In contrast to canonical, stimulus-driven models of perception, recent proposals argue that perceptual experiences are constructed in an active manner in which top-down influences play a key role. In particular, predictions that the brain makes about the world are incorporated into each perceptual experience. Because forming the appropriate sensory predictions can have a large impact on our visual experiences and visually guided behaviors, a mechanism thought to be disrupted in certain neurological conditions like autism and schizophrenia, an understanding of the neural basis of these predictions is critical. Here, we provide evidence that perceptual expectations about when a stimulus will appear are instantiated in the brain by optimally configuring prestimulus alpha-band oscillations so as to make subsequent processing most efficacious.

Abstract
The physiological state of the brain before an incoming stimulus has substantial consequences for subsequent behavior and neural processing. For example, the phase of ongoing posterior alpha-band oscillations (8–14 Hz) immediately before visual stimulation has been shown to predict perceptual outcomes and downstream neural activity. Although this phenomenon suggests that these oscillations may phasically route information through functional networks, many accounts treat these periodic effects as a consequence of ongoing activity that is independent of behavioral strategy. Here, we investigated whether alpha-band phase can be guided by top-down control in a temporal cueing task. When participants were provided with cues predictive of the moment of visual target onset, discrimination accuracy improved and targets were more frequently reported as consciously seen, relative to unpredictive cues. This effect was accompanied by a significant shift in the phase of alpha-band oscillations, before target onset, toward each participant’s optimal phase for stimulus discrimination. These findings provide direct evidence that forming predictions about when a stimulus will appear can bias the phase of ongoing alpha-band oscillations toward an optimal phase for visual processing, and may thus serve as a mechanism for the top-down control of visual processing guided by temporal predictions.
Thursday, July 23, 2015
Universal features of human music.
I have read through a fascinating paper by Savage et al. that makes a convincing case for statistical universals in the structure and function of human music. I pass on the abstract and a few clips from the text. Motivated readers can request a PDF of the article from me.
Music has been called “the universal language of mankind.” Although contemporary theories of music evolution often invoke various musical universals, the existence of such universals has been disputed for decades and has never been empirically demonstrated. Here we combine a music-classification scheme with statistical analyses, including phylogenetic comparative methods, to examine a well-sampled global set of 304 music recordings. Our analyses reveal no absolute universals but strong support for many statistical universals that are consistent across all nine geographic regions sampled. These universals include 18 musical features that are common individually as well as a network of 10 features that are commonly associated with one another. They span not only features related to pitch and rhythm that are often cited as putative universals but also rarely cited domains including performance style and social context. These cross-cultural structural regularities of human music may relate to roles in facilitating group coordination and cohesion, as exemplified by the universal tendency to sing, play percussion instruments, and dance to simple, repetitive music in groups. Our findings highlight the need for scientists studying music evolution to expand the range of musical cultures and musical features under consideration. The statistical universals we identified represent important candidates for future investigation.

The 18 universal features:
Pitch: Music tends to use discrete pitches (1) to form nonequidistant scales (2) containing seven or fewer scale degrees per octave (3). Music also tends to use descending or arched melodic contours (4) composed of small intervals (5) of less than 750 cents (i.e., a perfect fifth or smaller).
Rhythm: Music tends to use an isochronous beat (6) organized according to metrical hierarchies (7) based on multiples of two or three beats (8)—especially multiples of two beats (9). This beat tends to be used to construct motivic patterns (10) based on fewer than five durational values (11).
Form: Music tends to consist of short phrases (12) less than 9 s long.
Instrumentation: Music tends to use both the voice (13) and (nonvocal) instruments (14), often together in the form of accompanied vocal song.
Performance style: Music tends to use the chest voice (i.e., modal register) (15) to sing words (16), rather than vocables (nonlexical syllables).
Social context: Music tends to be performed predominantly in groups (17) and by males (18). The bias toward male performance is true of singing, but even more so of instrumental performance.
The geographic distribution of the recordings analyzed:
The 304 recordings from the Garland Encyclopedia of World Music show a widespread geographic distribution. They are grouped into nine regions specified a priori by the Encyclopedia’s editors, as color-coded in the legend at bottom: North America (n = 33 recordings), Central/South America (39), Europe (40), Africa (21), the Middle East (35), South Asia (34), East Asia (34), Southeast Asia (14), and Oceania (54).
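The interval sizes in the pitch features above are given in cents, the standard logarithmic unit for pitch (1,200 cents per octave). As a quick illustration of the unit (the helper name is my own, not from the paper):

```python
import math

def cents(f1, f2):
    """Interval size in cents between two frequencies (1200 cents per octave)."""
    return 1200 * math.log2(f2 / f1)

# A perfect fifth (3:2 frequency ratio) is ~702 cents, which is why the
# paper's "less than 750 cents" criterion amounts to "a fifth or smaller".
print(round(cents(440.0, 660.0)))  # → 702
print(cents(440.0, 880.0))         # → 1200.0 (an octave)
```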
Blog Categories:
human evolution,
music,
social cognition
Wednesday, July 22, 2015
A small bonbon for techie MindBlog readers.
Passed on to me by a friend... still vaporware, but cute.
Screams and the communication soundscape
A fascinating piece from Arnal et al. noting that human screams occupy a privileged niche in the communication soundscape.
Highlights
•We provide the first evidence of a special acoustic regime (“roughness”) for screams
•Roughness is used in both natural and artificial alarm signals
•Roughness confers a behavioral advantage to react rapidly and efficiently
•Acoustic roughness selectively activates amygdala, involved in danger processing
Summary
Screaming is arguably one of the most relevant communication signals for survival in humans. Despite their practical relevance and their theoretical significance as innate and virtually universal vocalizations, what makes screams a unique signal and how they are processed is not known. Here, we use acoustic analyses, psychophysical experiments, and neuroimaging to isolate those features that confer to screams their alarming nature, and we track their processing in the human brain. Using the modulation power spectrum, a recently developed, neurally informed characterization of sounds, we demonstrate that human screams cluster within a restricted portion of the acoustic space (between ∼30 and 150 Hz modulation rates) that corresponds to a well-known perceptual attribute, roughness. In contrast to the received view that roughness is irrelevant for communication, our data reveal that the acoustic space occupied by the rough vocal regime is segregated from other signals, including speech, a prerequisite to avoid false alarms in normal vocal communication. We show that roughness is present in natural alarm signals as well as in artificial alarms and that the presence of roughness in sounds boosts their detection in various tasks. Using fMRI, we show that acoustic roughness engages subcortical structures critical to rapidly appraise danger. Altogether, these data demonstrate that screams occupy a privileged acoustic niche that, being separated from other communication signals, ensures their biological and ultimately social efficiency.
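The ∼30–150 Hz modulation band the authors identify refers to how fast a sound's loudness envelope fluctuates, not to the pitch of the sound itself. A minimal NumPy sketch of that idea, using a crude rectified-envelope spectrum rather than the paper's full modulation power spectrum (function name and test signals are my own illustrative assumptions):

```python
import numpy as np

def modulation_band_fraction(signal, sr, lo=30.0, hi=150.0):
    """Fraction of amplitude-envelope modulation energy in [lo, hi] Hz.

    Crude proxy for the "roughness" band: rectify the signal to get an
    envelope, take its power spectrum, and compare energy inside the band
    to total non-DC modulation energy.
    """
    env = np.abs(signal)                  # rectified amplitude envelope
    env = env - env.mean()                # discard DC
    spec = np.abs(np.fft.rfft(env)) ** 2  # envelope power spectrum
    freqs = np.fft.rfftfreq(len(env), 1.0 / sr)
    band = spec[(freqs >= lo) & (freqs <= hi)].sum()
    return band / spec[freqs > 0].sum()

sr = 8000
t = np.arange(sr) / sr
# 500 Hz tone modulated at 4 Hz (slow, speech-like) vs. 70 Hz (rough, scream-like)
smooth = np.sin(2 * np.pi * 4 * t) * np.sin(2 * np.pi * 500 * t)
rough = np.sin(2 * np.pi * 70 * t) * np.sin(2 * np.pi * 500 * t)
print(modulation_band_fraction(smooth, sr), modulation_band_fraction(rough, sr))
```

The scream-like signal concentrates far more of its envelope energy in the roughness band than the speech-like one, which is the separation in modulation space that the paper reports.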
Blog Categories:
fear/anxiety/stress,
social cognition
Tuesday, July 21, 2015
Ohmygawd - a social app that aims for fleeting authenticity?
I prodded my troglodyte brain to actually read this article about yet another social networking app, Beme, and came away fascinated. More constant and fleeting interacting than I want or could handle, but a clever trick with an iPhone. You just hold the front of the phone against your chest, and the phone's proximity detector functions as a record button to film the scene you are facing. You can view the Bemes of friends you are following, as they can view yours. If you like, you can send a real time selfie to react to what your friend is doing.... after viewing is done, the exchange is erased. Talk about fleeting glances. WIRED magazine thinks that this flight from the curated self to authenticity may be judged to be too boring by many users.
How experiencing nature influences our brains.
Bratman et al., in their PNAS article with the title "Nature experience reduces rumination and subgenual prefrontal cortex activation," suggest an explanation for why nature experience, in contrast to urban experience, enhances our sense of well-being:
Urbanization has many benefits, but it also is associated with increased levels of mental illness, including depression. (More than 50% of people now live in urban areas. By 2050 this proportion will be 70%.) It has been suggested that decreased nature experience may help to explain the link between urbanization and mental illness. This suggestion is supported by a growing body of correlational and experimental evidence, which raises a further question: what mechanism(s) link decreased nature experience to the development of mental illness? One such mechanism might be the impact of nature exposure on rumination, a maladaptive pattern of self-referential thought that is associated with heightened risk for depression and other mental illnesses. We show in healthy participants that a brief nature experience, a 90-min walk in a natural setting, decreases both self-reported rumination and neural activity in the subgenual prefrontal cortex (sgPFC), whereas a 90-min walk in an urban setting has no such effects on self-reported rumination or neural activity. In other studies, the sgPFC has been associated with a self-focused behavioral withdrawal linked to rumination in both depressed and healthy individuals. This study reveals a pathway by which nature experience may improve mental well-being and suggests that accessible natural areas within urban contexts may be a critical resource for mental health in our rapidly urbanizing world.
From the body of the paper:
On arrival at our laboratory, each participant completed a self-report measure of rumination (RRQ) and underwent our scanning procedure. We then randomly assigned each participant to a 90-min walk in either a natural environment (19 participants) or urban environment (19 participants). The nature walk took place near Stanford University, in a greenspace comprising grassland with scattered oak trees and shrubs. The urban walk took place on the busiest thoroughfare in nearby Palo Alto (El Camino Real), a street with three to four lanes in each direction and a steady stream of traffic (Fig. S1). After the walk, each participant returned to the laboratory and provided a second, follow-up self-report of levels of rumination (RRQ) before undergoing a second resting-state ASL scan. Transportation to and from the walk was via a car ride of 15-min duration (for both walks). Participants were given a smartphone and told to take 10 photographs during their walk (Fig. S2). These photographs were used to verify that participants went on the walk. We also tracked the phone itself during the walk, as further verification that the correct route was taken by each participant.
The impact of nature experience on self-reported rumination and blood perfusion to the sgPFC. (A) Change in self-reported rumination (postwalk minus prewalk) for participants randomly assigned to take a 90-min walk either in a natural setting or in an urban setting. (B) A time-by-environment interaction in blood perfusion was evident in the sgPFC. F map of significant interactions at a threshold of P < 0.05, FWE corrected for multiple comparisons. (C) Change in blood perfusion (postwalk minus prewalk) for participants randomly assigned to take a 90-min walk either in a natural setting or in an urban setting. Error bars represent SE within subjects: *P < 0.05, ***P < 0.001.