Wednesday, February 15, 2017

Gender stereotypes emerge early.

Bian et al. (open source) find that children at age five do not consider boys and girls different with respect to being 'really, really smart' - the childhood version of adult brilliance. But by age 6, girls become more likely to put boys than members of their own gender in the 'really, really smart' category, and begin to steer themselves away from games said to be for that category.
Common stereotypes associate high-level intellectual ability (brilliance, genius, etc.) with men more than women. These stereotypes discourage women’s pursuit of many prestigious careers; that is, women are underrepresented in fields whose members cherish brilliance (such as physics and philosophy). Here we show that these stereotypes are endorsed by, and influence the interests of, children as young as 6. Specifically, 6-year-old girls are less likely than boys to believe that members of their gender are “really, really smart.” Also at age 6, girls begin to avoid activities said to be for children who are “really, really smart.” These findings suggest that gendered notions of brilliance are acquired early and have an immediate effect on children’s interests.

Tuesday, February 14, 2017

How our brains make meaning, with the help of a little LSD

Interesting work from Preller et al:

Highlights
• LSD-induced effects are blocked by the 5-HT2A receptor antagonist ketanserin
• LSD increased the attribution of meaning to previously meaningless music
• Stimulation of the 5-HT2A receptor is crucial for the generation of meaning
• Changes in personal meaning attribution are mediated by cortical midline structures
Summary
A core aspect of the human self is the attribution of personal relevance to everyday stimuli enabling us to experience our environment as meaningful. However, abnormalities in the attribution of personal relevance to sensory experiences are also critical features of many psychiatric disorders. Despite their clinical relevance, the neurochemical and anatomical substrates enabling meaningful experiences are largely unknown. Therefore, we investigated the neuropharmacology of personal relevance processing in humans by combining fMRI and the administration of the mixed serotonin (5-HT) and dopamine receptor (R) agonist lysergic acid diethylamide (LSD), well known to alter the subjective meaning of percepts, with and without pretreatment with the 5-HT2AR antagonist ketanserin. General subjective LSD effects were fully blocked by ketanserin. In addition, ketanserin inhibited the LSD-induced attribution of personal relevance to previously meaningless stimuli and modulated the processing of meaningful stimuli in cortical midline structures. These findings point to the crucial role of the 5-HT2AR subtype and cortical midline regions in the generation and attribution of personal relevance. Our results thus increase our mechanistic understanding of personal relevance processing and reveal potential targets for the treatment of psychiatric illnesses characterized by alterations in personal relevance attribution.

Monday, February 13, 2017

An emotional experience can enhance future memory formation.

Tambini et al. show that neural effects of an emotional experience can persist, and bias how new and unrelated information is encoded and stored by our brains:
Emotional arousal can produce lasting, vivid memories for emotional experiences, but little is known about whether emotion can prospectively enhance memory formation for temporally distant information. One mechanism that may support prospective memory enhancements is the carry-over of emotional brain states that influence subsequent neutral experiences. Here we found that neutral stimuli encountered by human subjects 9–33 min after exposure to emotionally arousing stimuli had greater levels of recollection during delayed memory testing compared to those studied before emotional and after neutral stimulus exposure. Moreover, multiple measures of emotion-related brain activity showed evidence of reinstatement during subsequent periods of neutral stimulus encoding. Both slow neural fluctuations (low-frequency connectivity) and transient, stimulus-evoked activity predictive of trial-by-trial memory formation present during emotional encoding were reinstated during subsequent neutral encoding. These results indicate that neural measures of an emotional experience can persist in time and bias how new, unrelated information is encoded and recollected.

Friday, February 10, 2017

Kind words in language - changes over time

John Carson does a nice precis of Iliev et al.:
It is debated whether linguistic positivity bias (LPB) — the cross-cultural tendency to use more positive words than negative — results from a common cognitive underpinning or our environmental and cultural context.
Rumen Iliev, from the University of Michigan, and colleagues tackle the theoretical stalemate by looking at changes in positive word usage within a language over time. They use time-stamped texts from Google Books Ngrams and the New York Times to analyse LPB trends in American English over the last 200 years. They show that LPB has declined overall since 1800, which discounts the importance of universal cognition and, they suggest, aligns most strongly with a decline in social cohesion and prosociality in the United States. They find a significant association between LPB and casualty levels in war, economic performance, and measures of public happiness, suggesting that objective circumstances and subjective public mood drive its dynamics.
Analysing time-stamped historical texts is a powerful way to investigate evolving behaviours. The next step will be to look across other languages and historical events and tease apart the contribution of different contextual factors to LPB.
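For readers who want a concrete feel for how such a trend is computed, here is a minimal Python sketch of a year-by-year positivity ratio. The tiny word lists and the function name are my own illustrative stand-ins; Iliev et al. worked from established affective word norms and the actual Ngrams counts.

```python
from collections import Counter

# Illustrative stand-ins; the study used established affective word norms.
POSITIVE = {"good", "happy", "love", "great", "peace"}
NEGATIVE = {"bad", "sad", "hate", "war", "fear"}

def positivity_bias(tokens):
    """Ratio of positive to negative word tokens in one year's text."""
    counts = Counter(t.lower() for t in tokens)
    pos = sum(counts[w] for w in POSITIVE)
    neg = sum(counts[w] for w in NEGATIVE)
    return pos / neg if neg else float("inf")

# Applied to one token list per year (e.g., drawn from Google Books
# Ngrams), the sequence of ratios traces the LPB trend over time.
```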

Thursday, February 09, 2017

Mysterianism

Here is Nicholas Carr's statement of an argument that has always appealed to me, a concept that should be more widely known. Roughly: "I don't expect my cat to understand quantum physics; why should I imagine that I will ever be able to understand consciousness or the basic nature of our universe?"
By leaps, steps, and stumbles, science progresses. Its seemingly inexorable advance promotes a sense that everything can be known and will be known. Through observation and experiment, and lots of hard thinking, we will come to explain even the murkiest and most complicated of nature’s secrets: consciousness, dark matter, time, the full story of the universe.
But what if our faith in nature’s knowability is just an illusion, a trick of the overconfident human mind? That’s the working assumption behind a school of thought known as mysterianism. Situated at the fruitful if sometimes fraught intersection of scientific and philosophic inquiry, the mysterianist view has been promulgated, in different ways, by many respected thinkers, from the philosopher Colin McGinn to the cognitive scientist Steven Pinker. The mysterians propose that human intellect has boundaries and that some of nature’s mysteries may forever lie beyond our comprehension.
Mysterianism is most closely associated with the so-called hard problem of consciousness: How can the inanimate matter of the brain produce subjective feelings? The mysterians argue that the human mind may be incapable of understanding itself, that we will never understand how consciousness works. But if mysterianism applies to the workings of the mind, there’s no reason it shouldn’t also apply to the workings of nature in general. As McGinn has suggested, “It may be that nothing in nature is fully intelligible to us.”
The simplest and best argument for mysterianism is founded on evolutionary evidence. When we examine any other living creature, we understand immediately that its intellect is limited. Even the brightest, most curious dog is not going to master arithmetic. Even the wisest of owls knows nothing of the anatomy of the field mouse it devours. If all the minds that evolution has produced have bounded comprehension, then it’s only logical that our own minds, also products of evolution, would have limits as well. As Pinker has observed, “The brain is a product of evolution, and just as animal brains have their limitations, we have ours.” To assume that there are no limits to human understanding is to believe in a level of human exceptionalism that seems miraculous, if not mystical.
Mysterianism, it’s important to emphasize, is not inconsistent with materialism. The mysterians don’t suggest that what’s unknowable must be spiritual. They posit that matter itself has complexities that lie beyond our ken. Like every other animal on earth, we humans are just not smart enough to understand all of nature’s laws and workings.
What’s truly disconcerting about mysterianism is that, if our intellect is bounded, we can never know how much of existence lies beyond our grasp. What we know or may in the future know may be trifling compared with the unknowable unknowns. “As to myself,” remarked Isaac Newton in his old age, “I seem to have been only like a boy playing on the sea-shore, and diverting myself in now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me.” It may be that we are all like that child on the strand, playing with the odd pebble or shell—and fated to remain so.
Mysterianism teaches us humility. Through science, we have come to understand much about nature, but much more may remain outside the scope of our perception and comprehension. If the mysterians are right, science’s ultimate achievement may be to reveal to us its own limits.

Wednesday, February 08, 2017

Feel good fractals.

I want to point to this excerpt from Florence Williams' new book, "The Nature Fix: Why Nature Makes Us Happier, Healthier, and More Creative," that appears on Aeon's website. It describes the work and ideas of physicist Richard Taylor, who noted in a Nature paper in 1999 that Jackson Pollock's paintings were fractal in design, some 25 years before fractals were formally described by science. Here is one clip from the text:
...Taylor ran experiments to gauge people’s physiological response to viewing images with similar fractal geometries. He measured people’s skin conductance (a measure of nervous system activity) and found that they recovered from stress 60 per cent better when viewing computer images with a mathematical fractal dimension (called D) of between 1.3 and 1.5. D measures the ratio of the large, coarse patterns (the coastline seen from a plane, the main trunk of a tree, Pollock’s big-sweep splatters) to the fine ones (dunes, rocks, branches, leaves, Pollock’s micro-flick splatters). Fractal dimension is typically notated as a number between 1 and 2; the more complex the image, the higher the D.
Next, Taylor and Caroline Hägerhäll, a Swedish environmental psychologist with a specialty in human aesthetic perception, converted a series of nature photos into a simplistic representation of the landforms’ fractal silhouettes against the sky. They found that people overwhelmingly preferred images with a low to mid-range D (between 1.3 and 1.5). To find out if that dimension induced a particular mental state, they used EEG to measure people’s brain waves while viewing geometric fractal images. They discovered that in that same dimensional magic zone, the subjects’ frontal lobes easily produced the feel-good alpha brainwaves of a wakefully relaxed state. This occurred even when people looked at the images for only one minute.
EEG measures waves, or electrical frequency, but it doesn’t precisely map the active real estate in the brain. For that, Taylor has now turned to functional MRI, which shows the parts of the brain working hardest by imaging the blood flow. Preliminary results show that mid-range fractals activate some brain regions that you might expect, such as the ventrolateral cortex (involved with high-level visual processing) and the dorsolateral cortex, which codes spatial long-term memory. But these fractals also engage the parahippocampus, which is involved with regulating emotions and is also highly active while listening to music. To Taylor, this is a cool finding. ‘We were delighted to find [mid-range fractals] are similar to music,’ he said. In other words, looking at an ocean might have a similar effect on us emotionally as listening to Brahms.
But why is the mid-range of D (remember, that’s the ratio of large to small patterns) so magical and so highly preferred among most people? Taylor and Hägerhäll have an interesting theory, and it doesn’t necessarily have to do with a romantic yearning for Arcadia. In addition to lungs, capillaries and neurons, another human system is branched into fractals: the visual system as expressed by the movement of the eye’s retina. When Taylor used an eye-tracking machine to measure precisely where people’s pupils were focusing on projected images (of Pollock paintings, for example, but also other things), he saw that the pupils used a search pattern that was itself fractal. The eyes first scanned the big elements in the scene and then made micro passes in smaller versions of the big scans, and it does this in a mid-range D. Interestingly, if you draw a line over the tracks that animals make to forage for food, for example albatrosses surveying the ocean, you also see this fractal pattern of search trajectories. It’s simply an efficient search strategy, said Taylor.
‘Your visual system is in some way hardwired to understand fractals,’ said Taylor. ‘The stress-reduction is triggered by a physiological resonance that occurs when the fractal structure of the eye matches that of the fractal image being viewed.’
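For the mathematically curious, the D in the excerpt is the fractal (box-counting) dimension. Here is a minimal Python sketch of how such a D can be estimated from a black-and-white image; the function name and box sizes are illustrative assumptions, not Taylor's actual analysis pipeline.

```python
import numpy as np

def box_counting_dimension(image, box_sizes=(2, 4, 8, 16, 32, 64)):
    """Estimate the fractal dimension D of a binary 2-D image: D is the
    slope of log(occupied boxes) against log(1 / box size)."""
    pixels = np.asarray(image, dtype=bool)
    counts = []
    for s in box_sizes:
        h = (pixels.shape[0] // s) * s  # trim so the image tiles evenly
        w = (pixels.shape[1] // s) * s
        tiles = pixels[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(tiles.any(axis=(1, 3)).sum())  # boxes with any 'on' pixel
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
    return slope

# A filled square comes out near D = 2 and a straight line near D = 1;
# the images Taylor's subjects preferred sit in between, around 1.3-1.5.
```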

Tuesday, February 07, 2017

Why our supermarket tomatoes are sturdy and flavorless.

Having tinkered with tomato breeding and genetic manipulation to make our supermarket tomatoes sturdy, colorful, and tasteless, geneticists have now tried to figure out why flavor got thrown away. Tieman et al. combined tasting panels with chemical and genomic analyses of nearly 400 varieties of tomatoes to identify flavorful components that have been lost over time. Now maybe they will get to work and put the flavor back in?
Modern commercial tomato varieties are substantially less flavorful than heirloom varieties. To understand and ultimately correct this deficiency, we quantified flavor-associated chemicals in 398 modern, heirloom, and wild accessions. A subset of these accessions was evaluated in consumer panels, identifying the chemicals that made the most important contributions to flavor and consumer liking. We found that modern commercial varieties contain significantly lower amounts of many of these important flavor chemicals than older varieties. Whole-genome sequencing and a genome-wide association study permitted identification of genetic loci that affect most of the target flavor chemicals, including sugars, acids, and volatiles. Together, these results provide an understanding of the flavor deficiencies in modern commercial varieties and the information necessary for the recovery of good flavor through molecular breeding.

Monday, February 06, 2017

MindBlog’s 11th anniversary…some statistics.

Today is MindBlog’s 11th anniversary. I let the 10th anniversary pass without noticing, so I want to briefly comment this year. Google Analytics tells me that there have been about four million views of the 4,115 posts that have appeared thus far. After I have weeded out the 4-5 comments submitted each week whose purpose is to insert a link to a commercial site, authentic comments on the posts are few and far between. I also receive 1-2 emails each week from sites wanting to contribute a post and get a crosslink to their site. My cut-and-paste response is: “I must decline your kind offer. MindBlog is my own idiosyncratic hobby, and I only post content that I initiate. I have no interest in revenue.”

The blog passes on material that I would be reading even if I were not doing the blogging gig, and it would seem a shame not to share what I find interesting. While I do write occasional posts that are entirely of my own composition, most of the posting is better described as ‘curated content.’ I keep a queue of 5-10 completed posts that are postdated for automatic 3 a.m. daily posting by the Blogger platform. My actual writing occurs in bursts. (It is not happening this week, while my husband and I are on a Caribbean cruise!)

Each day MindBlog receives about 1,500 page views. A typical post, on its first day, will gather 300-600 views, rising to ~1,000 views after two weeks. MindBlog’s RSS feed has ~750 followers, and the automatic reposts to twitter have ~1450 followers. I haven’t monitored the response to reposts on Facebook and Google+.

I’m grateful that this many people seem to find the material interesting. The occasional emails from readers who express gratitude for my efforts motivate me to continue.

Friday, February 03, 2017

Artificial intelligence: Machines that reason

Stavroula Kousta does a precis of work reported by Graves et al. of the Google DeepMind project:
Complex reasoning is a hallmark of natural intelligence, as is learning from experience. Artificial neural networks — biologically inspired computational models — also learn from examples and excel at pattern recognition tasks, such as object and speech recognition. However, they cannot handle complex reasoning tasks that require memory to be solved.
Alex Graves, Greg Wayne and co-workers at Google DeepMind have now developed a neural network with read–write access to external memory, called a differentiable neural computer (DNC). The DNC's two modules — the memory and the neural network that controls it — interact like a digital computer's RAM and CPU, but do not need to be programmed. The system learns through exposure to examples to provide highly accurate responses to questions that require deductive reasoning (for example, “Sheep are afraid of wolves. Gertrude is a sheep. What is Gertrude afraid of?”), to traverse a novel network (for example, the London Underground map), and to carry out logical planning tasks.
This work represents a major leap forward in showing how symbolic reasoning can arise from an entirely non-symbolic system that learns through experience.
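To make "read–write access to external memory" a little less abstract, here is a minimal NumPy sketch of content-based addressing, one ingredient of the DNC's memory access. The full DNC adds learned write weightings, temporal links, and usage tracking; the names and sizes below are illustrative.

```python
import numpy as np

def content_read(memory, key, beta):
    """Content-based addressing: compare a controller-emitted key with
    every memory row, then read a softly weighted average of the rows.

    memory: (N, W) array of N slots of width W; key: (W,); beta: focus sharpness.
    """
    norms = np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    similarity = memory @ key / norms      # cosine similarity per slot
    weights = np.exp(beta * similarity)    # softmax over slots yields
    weights /= weights.sum()               # differentiable read weights
    return weights @ memory                # the read vector

memory = np.random.randn(16, 8)                            # toy sizes
read_vector = content_read(memory, memory[3], beta=10.0)   # recalls slot 3
```

Because every step is differentiable, the read weighting (unlike a digital computer's hard addresses) can be trained end to end by gradient descent, which is what lets the system learn to use its memory from examples alone.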

Thursday, February 02, 2017

The origins of happiness

A MindBlog reader pointed me to this presentation, at this year's Davos World Economic Forum, on the subject of what government actions (in a European context) might be most relevant to fostering the wellbeing of citizens.
As Thomas Jefferson once said, “The care of human life and happiness… is the only legitimate object of good government”. But...wellbeing will only take off when policymakers have numbers that tell them how any change of policy will affect the measured wellbeing of the people, and at what cost.
...The right definition [of wellbeing], in our view, should be life satisfaction: “Overall how satisfied are you with your life, these days?”, measured on a scale of 0 to 10...That is a profoundly democratic concept because it allows people to evaluate their own wellbeing rather than have policymakers deciding what is more important for them and what is less so...policymakers like the concept – and so they should. Work in our group at LSE shows that in European elections since 1970 the life satisfaction of the people is the best predictor of whether the government gets re-elected – much more important than economic growth, unemployment or inflation.
The article then proceeds with tables and statistics showing that the big factors in determining life satisfaction are all non-economic (such as whether someone is partnered, and especially how healthy they are). The biggest feature distinguishing the miserable from the happy is mental illness, and the authors' analysis shows that funds directed towards mental health are much more cost-effective (with respect to increasing wellbeing in the population) than those directed towards reducing poverty or unemployment, or improving physical health.

Wednesday, February 01, 2017

Are human-specific plastic cortical synaptic connections what makes us human?

I want to pass on an excellent primer (open source) on the plasticity of a specific synapse, between pyramidal neurons and fast-spiking interneurons of the neocortex, that has been observed only in human brains.
One outstanding difference between Homo sapiens and other mammals is the ability to perform highly complex cognitive tasks and behaviors, such as language, abstract thinking, and cultural diversity. How is this accomplished? According to one prominent theory, cognitive complexity is proportional to the repetition of specific computational modules over a large surface expansion of the cerebral cortex (neocortex). However, the human neocortex was shown to also possess unique features at the cellular and synaptic levels, raising the possibility that expanding the computational module is not the only mechanism underlying complex thinking. In a study published in PLOS Biology, Szegedi and colleagues analyzed a specific cortical circuit from live postoperative human tissue, showing that human-specific, very powerful excitatory connections between principal pyramidal neurons and inhibitory neurons are highly plastic. This suggests that exclusive plasticity of specific microcircuits might be considered among the mechanisms endowing the human neocortex with the ability to perform highly complex cognitive tasks.

Tuesday, January 31, 2017

An individual's ultimate economic burden can be forecast in childhood

Important work from Caspi et al., who show that about 20% of the population accounts for close to 80% of the economic burden, and that this group can be predicted with high accuracy from as early as age 3.
Policymakers are interested in early-years interventions to ameliorate childhood risks. They hope for improved adult outcomes in the long run that bring a return on investment. The size of the return that can be expected partly depends on how strongly childhood risks forecast adult outcomes, but there is disagreement about whether childhood determines adulthood. We integrated multiple nationwide administrative databases and electronic medical records with the four-decade-long Dunedin birth cohort study to test child-to-adult prediction in a different way, using a population-segmentation approach. A segment comprising 22% of the cohort accounted for 36% of the cohort’s injury insurance claims; 40% of excess obese kilograms; 54% of cigarettes smoked; 57% of hospital nights; 66% of welfare benefits; 77% of fatherless child-rearing; 78% of prescription fills; and 81% of criminal convictions. Childhood risks, including poor brain health at three years of age, predicted this segment with large effect sizes. Early-years interventions that are effective for this population segment could yield very large returns on investment.
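The arithmetic behind this kind of concentration is easy to reproduce whenever per-person costs are heavily skewed. A minimal sketch (the lognormal distribution here is my illustrative stand-in, not the Dunedin data):

```python
import numpy as np

def burden_share(costs, top_fraction=0.2):
    """Fraction of total burden accounted for by the costliest segment."""
    costs = np.sort(np.asarray(costs))[::-1]   # most costly first
    k = int(len(costs) * top_fraction)
    return costs[:k].sum() / costs.sum()

# With heavily skewed per-person costs, a small segment carries most of
# the total, which is the pattern Caspi et al. report.
rng = np.random.default_rng(1)
print(burden_share(rng.lognormal(0.0, 1.5, 10_000)))  # roughly 0.75
```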

Monday, January 30, 2017

The uniformity illusion.

Otten et al. investigate a visual illusion in which the accurate and detailed vision in the center of our visual field, accomplished by the fovea, influences our perception of peripheral stimuli, making them seem more similar to the center. The open source article contains several nice examples of this illusion.
Vision in the fovea, the center of the visual field, is much more accurate and detailed than vision in the periphery. This is not in line with the rich phenomenology of peripheral vision. Here, we investigated a visual illusion that shows that detailed peripheral visual experience is partially based on a reconstruction of reality. Participants fixated on the center of a visual display in which central stimuli differed from peripheral stimuli. Over time, participants perceived that the peripheral stimuli changed to match the central stimuli, so that the display seemed uniform. We showed that a wide range of visual features, including shape, orientation, motion, luminance, pattern, and identity, are susceptible to this uniformity illusion. We argue that the uniformity illusion is the result of a reconstruction of sparse visual information (from the periphery) based on more readily available detailed visual information (from the fovea), which gives rise to a rich, but illusory, experience of peripheral vision.

Friday, January 27, 2017

Regression to the mean - Why we would all be better off if we ignored Trump’s tweets

O’Donnell’s answer to the annual edge.org question "What scientific term or concept ought to be more widely known?":
My candidate is an old, simple, and powerful one: the law of regression to the mean. It’s a concept from the discipline of statistics, but in real life it means that anomalies are anomalies, coincidences happen (all the time, with stunning frequency), and the main thing they tell us is that the next thing to happen is very likely to be a lot more boring, ordinary, and predictable. Put in the simplest human terms, it teaches us not to be so excitable, not to be so worried, not to be so excited: Life really will be, for the most part, boring and predictable.
The ancient and late antique intellectuals whom I spend my life studying wouldn’t talk so much about miracles and portents if they could calm down and think about the numbers. The baseball fans thrilled to see the guy on a hitting streak come to the plate wouldn’t be so disappointed when he struck out. Even people reading election returns would see much more normality lurking inside shocking results than television reporters can admit.
Heeding the law of regression to the mean would help us slow down, calm down, pay attention to the long term and the big picture, and react with a more strategic patience to crises large and small. We’d all be better off.
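The law is easy to demonstrate in a few lines of simulation. In this minimal sketch (the ability-plus-luck model is my illustrative assumption), people selected for an extreme first measurement land much closer to average on the second:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Each measurement = stable ability + independent luck.
ability = rng.normal(0.0, 1.0, n)
first = ability + rng.normal(0.0, 1.0, n)
second = ability + rng.normal(0.0, 1.0, n)

# Select the 'anomalies': the top 5% on the first measurement...
top = first > np.quantile(first, 0.95)
print(first[top].mean())   # about 2.9
# ...and watch them drift back toward the average on the second.
print(second[top].mean())  # about 1.5, halfway back to the mean
```

Nothing pulls the scores back; the luck that put them at the top simply fails to repeat.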

Thursday, January 26, 2017

Smartphone reprogramming of our brains?

Nicolelis makes some good points as he adds to the genre of literature that predicts a diminution of our brain power caused by dependence on the latest technological advance (abacus, slide rule, electronic calculator, computer, etc.). Here is his statement of alarm:
Could our constant reciprocal interaction with digital logic (through laptops, tablets, smartphones, all the way to highly immersive virtual reality environments), particularly when it leads to powerfully hedonic experiences, result in the slow compromise or even elimination of some of the behaviours and cognitive aptitudes that represent the most exquisite and cherished attributes of the human condition? Attributes such as our multifaceted social skills, empathy, linguistic semantics, aesthetic sense, artistic expression, intuition, creativity and the ability to improvise solutions to novel contingencies, to name just a few. In other words, could opting for the fast lane of the never-ending highway to full digital immersion and automation — an obvious current trend in our modern society — produce a reduction in human cognitive capabilities?
Nicolelis goes on to note that the human brain cannot be reduced to the algorithmic nature of a Turing machine, but rather is an organic computer in which hardware and software, from the molecular to the organismal level, cannot be dissociated, one that uses a recursive mix of analogue and digital processing.
Even though the brain cannot be reduced to a digital machine, could the human brain simply assimilate and begin to mimic the rigid binary logic and algorithmic mode of operation of digital machines due to the growing overexposure to digital devices and the hedonic response triggered by these interactions, and become a biological digital system?
I would volunteer the notion that passive immersion in the digital systems of modern airplanes (in the case of pilots), digital imaging diagnostics (radiologists) and computer-assisted design (architects) may gradually curtail the range and acuity of some mental functions and cognitive skills, such as creativity, insight and the ability to solve novel problems…when people believe that a series of statements that they have been asked to remember will be stored online, they perform worse than a control group that relies only on their own biological memory to remember the statements. This suggests that subcontracting some simple mental searches to Google may, after all, reduce our own brain’s ability to store and recall memories reliably.
The impact of online social media on our natural social skills is another area in which we may be able to measure the true effects of digital systems on human behaviour…An intense presence on social media and virtual reality environments can produce significant anxiety, a reduction in real social interactions, lack of social skills and human empathy, and difficulties in handling solitude. … symptoms and signs of addiction to virtual life are often reported…I began wondering whether the new ‘always connected’ routine is overtaxing our cerebral cortex by dramatically expanding the number of people with whom we can closely communicate, almost instantaneously, via the multitude of social media outlets available on the internet. Instead of respecting the group size limit (about 150 individuals) afforded by our cortical volume, we are now in continuous contact with a group of people that could far exceed that neurobiological limit. What are the consequences of this cortical overtaxing? Anxiety, attention, cognitive and even memory deficits?
Homo digitalis
Is the above scenario something we should pay attention to? I think so. If not because of the potential impact on the mental health of this and future generations, but also because of the far-reaching consequences of our increasing interaction with digital systems. For example, at the far limit, I can conceive that this staggering expansion in our online social connectivity is capable of providing a completely new type of selective pressure that may, eventually, bias the evolutionary future of our species. One may begin wondering whether the dawn of ‘Homo digitalis’ is upon us or, more surprisingly, whether he/she is already around, texting and tweeting without being noticed.

Wednesday, January 25, 2017

Gender and the conflation of equality and sameness

I want to pass on some clips from a sane brief essay by Helena Cronin, author of "The Ant and the Peacock: Altruism and Sexual Selection from Darwin to Today."
The poet Philip Larkin famously proclaimed that sex began in 1963. He was inaccurate by 800 million years. Moreover, what began in the 1960s was instead a campaign to oust sex—in particular, sex differences—in favor of gender...biological differences were thought to spell genetic determinism, immutability, anti-feminism and, most egregiously, women's oppression. Gender, however, was the realm of societal forces; "male" and "female" were social constructs...
...gender has distorted social policy. This is because the campaign has undergone baleful mission-creep. Its aim has morphed from ending discrimination against women into a deeply misguided quest for sameness of outcome for males and females in all fields—above all, 50:50 across the entire workplace. This stems from a fundamental error: the conflation of equality and sameness. And it's an error all too easily made if your starting point is that the sexes are "really" the same and that apparent differences are mere artifacts of sexist socialization.
Equality is about fair treatment, not about people or outcomes being identical; so fairness does not and should not require sameness. However, when sameness gets confused with equality—and equality is of course to do with fairness—then sameness ends up undeservedly sharing their moral high ground. And male/female discrepancies become a moral crusade. Why so few women CEOs or engineers? It becomes socially suspect to explain this as the result not of discrimination but of differential choice.
Well, it shouldn’t be suspect. Because the sexes do differ—and in ways that, on average, make a notable difference to their distribution in today's workplace.
Here's why the sexes differ. A sexual organism must divide its total reproductive investment into two—competing for mates and caring for offspring. Almost from the dawn of sexual reproduction, one sex specialized slightly more in competing for mates and the other slightly more in caring for offspring...the differences go far beyond reproductive plumbing. They are distinctive adaptations for the different life-strategies of competers and carers. Wherever ancestral males and females faced different adaptive problems, we should expect sex differences—encompassing bodies, brains and behaviour. And we should expect that, reflecting those differences, competers and carers will have correspondingly different life-priorities.
As for different outcomes in the workplace, the causes are above all different interests and temperaments (and not women being "less clever" than men). Women on average have a strong preference for working with people—hence the nurses and teachers; and, compared to men, they care more about family and relationships and have broader interests and priorities—hence little appeal in becoming CEOs. Men have far more interest in "things"—hence the engineers; and they are vastly more competitive: more risk-taking, ambitious, status-seeking, single-minded, opportunistic—hence the CEOs. So men and women have, on average, different conceptions of what constitutes success (despite the gender quest to impose the same—male—conception on all).
And here's some intriguing evidence. "Gender" predicts that, as discrimination diminishes, males and females will increasingly converge. But a study of 55 nations found that it was in the most liberal, democratic, equality-driven countries that divergence was greatest. The less the sexism, the greater the sex differences. Difference, this suggests, is evidence not of oppression but of choice; not socialization, not patriarchy, not false consciousness, not even pink t-shirts or personal pronouns … but female choice.
An evolutionary understanding shows that you can't have sex without sex differences. It is only within that powerful scientific framework—in which ideological questions become empirical answers—that gender can be properly understood. And, as the fluidity of "sexualities" enters public awareness, sex is again crucial for informed, enlightened discussion.
So for the sake of science, society and sense, bring back sex.

Tuesday, January 24, 2017

Knowing how confidently we know

Here is a fascinating piece of work from Miyamoto et al. showing that parallel streams of information in the brain regulate the confidence that a memory is correct, apart from the memory itself. From the journal's description of the work:
Self-monitoring and evaluation of our own memory is a mental process called metamemory. For metamemory, we need access to information about the strength of our own memory traces. The brain structures and neural mechanisms involved in metamemory are completely unknown. Miyamoto et al. devised a test paradigm for metamemory in macaques, in which the monkeys judged their own confidence in remembering past experiences. The authors combined this approach with functional brain imaging to reveal the neural substrates of metamemory for retrospection. A specific region in the prefrontal brain was essential for metamnemonic decision-making. Inactivation of this region caused selective impairment of metamemory, but not of memory itself.
and, the abstract from Miyamoto et al.:
We know how confidently we know: Metacognitive self-monitoring of memory states, so-called “metamemory,” enables strategic and efficient information collection based on past experiences. However, it is unknown how metamemory is implemented in the brain. We explored causal neural mechanism of metamemory in macaque monkeys performing metacognitive confidence judgments on memory. By whole-brain searches via functional magnetic resonance imaging, we discovered a neural correlate of metamemory for temporally remote events in prefrontal area 9 (or 9/46d), along with that for recent events within area 6. Reversible inactivation of each of these identified loci induced doubly dissociated selective impairments in metacognitive judgment performance on remote or recent memory, without impairing recognition performance itself. The findings reveal that parallel metamemory streams supervise recognition networks for remote and recent memory, without contributing to recognition itself.

Monday, January 23, 2017

How our evolutionary psychology elected Donald Trump.

While I feel that in principle our world might be best governed by a multinational meritocratic elite (of the sort that just met in Davos, Switzerland), I can’t even begin to feel the same kind of emotional bonding to this vague impersonal entity that I feel towards my hometown of Austin, Texas, or Madison, Wisconsin, where I spent my adult working life. (And business oligarchies governing the world have shown much more regard for maximizing profits than for the maintenance and quality of local human communities, the entities that most of us care about and can bond to.) Our brains evolved and are hardwired to care most about family and tribe. Brooks makes these points very compellingly in his recent Op-Ed piece, which notes the old German sociological distinction between gemeinschaft and gesellschaft.
All across the world, we have masses of voters who live in a world of gemeinschaft: where relationships are personal, organic and fused by particular affections. These people define their loyalty to community, faith and nation in personal, in-the-gut sort of ways.
But we have a leadership class and an experience of globalization that is from the world of gesellschaft: where systems are impersonal, rule based, abstract, indirect and formal.
Many people in Europe love their particular country with a vestigial affection that is like family — England, Holland or France. But meritocratic elites of Europe gave them an abstract intellectual construct called the European Union.
Many Americans think their families and their neighborhoods are being denuded by the impersonal forces of globalization, finance and technology. All the Republican establishment could offer was abstract paeans to the free market. All the Democrats could offer was Hillary Clinton, the ultimate cautious, remote, calculating, gesellschaft thinker.
It was the right moment for Trump, the ultimate gemeinschaft man. He is all gut instinct, all blood and soil, all about loyalty over detached reason. His business is a pre-modern family clan, not an impersonal corporation, and he is staffing his White House as a pre-modern family monarchy, with his relatives and a few royal retainers. In his business and political dealings, he simply doesn’t acknowledge the difference between private and public, personal and impersonal. Everything is personal, pulsating outward from his needy core.
Brooks goes on to argue that what made Trump right electorally will also make him an incompetent president. The danger is not so much the rise of fascism, a new authoritarian age, but that "everything will become disorganized, chaotic, degenerate, clownish and incompetent." How does the ultimate anti-institutional man sit at the nerve center of a four-million-person institution?

I think a good analogy is to hope that over time these millions of people, like the nerve cells in our brain, will perform a "work-around" of the focal lesion (Trump) to restore and maintain normal operations of the system.

Friday, January 20, 2017

The deepening of our cultural echo chambers.

Farhad Manjoo does a nice piece in the Tech and Society section of the NY Times, pointing out how much has changed since the 1970s, when TV programs like “All in the Family” had broad cultural reach, being watched by one out of every three households with a television. Norman Lear’s “One Day at a Time” was watched by 17 million viewers every week. A new version of “One Day at a Time” on Netflix will almost certainly fail to replicate such a broad cultural reach. Some clips from Manjoo:
The shows are separated by 40 years of technological advances — a progression from the over-the-air broadcast era in which Mr. Lear made it big, to the cable age of MTV and CNN and HBO, to, finally, the modern era of streaming services like Netflix. Each new technology allowed a leap forward in choice, flexibility and quality; the “Golden Age of TV” offers so much choice that some critics wonder if it’s become overwhelming…Across the entertainment business, from music to movies to video games, technology has flooded us with a profusion of cultural choice.
...we’re returning to the cultural era that predated radio and TV, an era in which entertainment was fragmented and bespoke…It was a really odd moment in history to have so many people watching the same thing at the same time… for a brief while, from the 1950s to the late 1980s, broadcast television served cultural, social and political roles far greater than the banality of its content would suggest. Because it featured little choice, TV offered something else: the raw material for a shared culture.
As the broadcast era changed into one of cable and then streaming, TV was transformed from a wasteland into a bubbling sea of creativity. But it has become a sea in which everyone swims in smaller schools...Only around 12 percent of television households, or about 14 million to 15 million people, regularly tuned into “NCIS” and “The Big Bang Theory,” the two most popular network shows of the 2015-16 season…Netflix’s biggest original drama last year, “Stranger Things,” was seen by about 14 million adults in the month after it first aired…during much of the 1980s, a broadcast show that attracted 14 million to 16 million viewers would have been in danger of cancellation.
A spokesman for Netflix pointed out that even if audiences were smaller than in the past, its shows still had impact. “Making a Murderer” set off a re-examination of a widely criticized murder trial, for instance, while “Orange Is the New Black” was one of the first shows to feature a transgender actor, Laverne Cox….But I suspect the impacts, like the viewership, tend to be restricted along the same social and cultural echo chambers into which we’ve split ourselves in the first place. Those effects do not approach the vast ways that TV once remade the culture.

Thursday, January 19, 2017

The immensity of the vacated present.

I am repeating, as I did with last Thursday's post, a post from several years ago with material that continues to be personally important to me. Here it is:

The title of this post is a phrase from a recent essay by Vivian Gornick, "The cost of daydreaming," describing an experience that very much resonates with my own, and that I think captures her discovery of, and way of noticing, the distinction between our internal mind-wandering (default mode) and present-centered, outwardly oriented (attentional) brain networks (the subject of many MindBlog posts). On finding that she could sense the start of daydreaming and suppress it:
...the really strange and interesting thing happened. A vast emptiness began to open up behind my eyes as I went about my daily business. The daydreaming, it seemed, had occupied more space than I’d ever imagined. It was as though a majority of my waking time had routinely been taken up with fantasizing, only a narrow portion of consciousness concentrated on the here and now...I began to realize what daydreaming had done for me — and to me.
Turning 60 was like being told I had six months to live. Overnight, retreating into the refuge of a fantasized tomorrow became a thing of the past. Now there was only the immensity of the vacated present...It wasn’t hard to cut short the daydreaming, but how exactly did one manage to occupy the present when for so many years one hadn’t?"
Then, after a period of time:
...I became aware, after a street encounter, that the vacancy within was stirring with movement. A week later another encounter left me feeling curiously enlivened. It was the third one that did it. A hilarious exchange had taken place between me and a pizza deliveryman, and sentences from it now started repeating themselves in my head as I walked on, making me laugh each time anew, and each time with yet deeper satisfaction. Energy — coarse and rich — began to swell inside the cavity of my chest. Time quickened, the air glowed, the colors of the day grew vivid; my mouth felt fresh. A surprising tenderness pressed against my heart with such strength it seemed very nearly like joy; and with unexpected sharpness I became alert not to the meaning but to the astonishment of human existence. It was there on the street, I realized, that I was filling my skin, occupying the present.