Monday, June 14, 2021

Four Americas

I want to recommend that MindBlog readers have a look at George Packer's essay in the Atlantic presenting a condensed version of arguments in his new book, Last Best Hope: America in Crisis and Renewal. Below, I pass on an even more condensed version in a few clips from the Atlantic article: 

1 - Free America

Call the first narrative “Free America.” In the past half century it’s been the most politically powerful of the four. Free America draws on libertarian ideas, which it installs in the high-powered engine of consumer capitalism. The freedom it champions is very different from Alexis de Tocqueville’s art of self-government. It’s personal freedom, without other people—the negative liberty of “Don’t tread on me.”...The conservative movement began to dominate the Republican Party in the 1970s, and then much of the country after 1980 with the presidency of Ronald Reagan.
A character in Jonathan Franzen’s 2010 novel, Freedom, puts it this way: “If you don’t have money, you cling to your freedoms all the more angrily. Even if smoking kills you, even if you can’t afford to feed your kids, even if your kids are getting shot down by maniacs with assault rifles. You may be poor, but the one thing nobody can take away from you is the freedom to fuck up your life.”
2 - Smart America
The new knowledge economy created a new class of Americans: men and women with college degrees, skilled with symbols and numbers—salaried professionals in information technology, computer engineering, scientific research, design, management consulting, the upper civil service, financial analysis, law, journalism, the arts, higher education...they dominate the top 10 percent of American incomes, with outsize economic and cultural influence...After the 1970s, meritocracy..., a system intended to give each new generation an equal chance to rise, created a new hereditary class structure. Educated professionals pass on their money, connections, ambitions, and work ethic to their children, while less educated families fall further behind, with less and less chance of seeing their children move up...a lower-class child is nearly as unlikely to be admitted to one of the top three Ivy League universities as they would have been in 1954.
In the early 1970s, the [Democratic] party became the home of educated professionals, nonwhite voters, and the shrinking unionized working class. The more the party identified with the winners of the new economy, the easier it became for the Republican Party to pull away white workers by appealing to cultural values...these two classes, rising professionals and sinking workers, which a couple of generations ago were close in income and not so far apart in mores, no longer believe they belong to the same country. But they can’t escape each other, and their coexistence breeds condescension, resentment, and shame...Smart Americans are uneasy with patriotism. It’s an unpleasant relic of a more primitive time, like cigarette smoke or dog racing. It stirs emotions that can have ugly consequences. The winners in Smart America—connected by airplane, internet, and investments to the rest of the globe—have lost the capacity and the need for a national identity, which is why they can’t grasp its importance for others...abandoning patriotism to other narratives guarantees that the worst of them will claim it.
3 - Real America
Real America is a very old place. The idea that the authentic heart of democracy beats hardest in common people who work with their hands goes back to the 18th century. It was embryonic in the founding creed of equality. “State a moral case to a ploughman and a professor,” Thomas Jefferson wrote in 1787. “The former will decide it as well, and often better than the latter, because he has not been led astray by artificial rules.” Moral equality was the basis for political equality...The triumph of popular democracy brought an anti-intellectual bias to American politics that never entirely disappeared. Self-government didn’t require any special learning, just the native wisdom of the people...The overwhelmingly white crowds that lined up to hear Palin speak were nothing new. Real America has always been a country of white people...
From its beginnings, Real America has also been religious, and in a particular way: evangelical and fundamentalist, hostile to modern ideas and intellectual authority...Finally, Real America has a strong nationalist character. Its attitude toward the rest of the world is isolationist, hostile to humanitarianism and international engagement, but ready to respond aggressively to any incursion against national interests...Ever since the age of Reagan, the Republican Party has been a coalition of business interests and less affluent white people, many of them evangelical Christians. The persistence of the coalition required an immense amount of self-deception on both sides.
When Trump ran for president, the party of Free America collapsed into its own hollowness...Trump didn’t try to shape his people ideologically with new words and concepts. He used the low language of talk radio, reality TV, social media, and sports bars, and to his listeners this language seemed far more honest and grounded in common sense than the mincing obscurities of “politically correct” experts. His populism brought Jersey Shore to national politics. The goal of his speeches was not to whip up mass hysteria but to get rid of shame. He leveled everyone down together...More than anything, Trump was a demagogue—a thoroughly American type, familiar to us from novels like All the King’s Men and movies like Citizen Kane. “Trump is a creature native to our own style of government and therefore much more difficult to protect ourselves against,” the Yale political theorist Bryan Garsten wrote. “He is a demagogue, a popular leader who feeds on the hatred of elites that grows naturally in democratic soil.” A demagogue can become a tyrant, but the people put him there—the people who want to be fed fantasies and lies, the people who set themselves apart from and above their compatriots. So the question isn’t who Trump was, but who we are.
4 - Just America
In 2014, American character changed....A large and influential generation came of age in the shadow of accumulating failures by the ruling class—especially by business and foreign-policy elites...My generation told our children’s generation a story of slow but steady progress...If anyone doubted that the country was becoming a more perfect union, the election of a Black president who loved to use that phrase proved it...Of course the kids didn’t buy it. In their eyes “progress” looked like a thin upper layer of Black celebrities and professionals, who carried the weight of society’s expectations along with its prejudices, and below them, lousy schools, overflowing prisons, dying neighborhoods...Then came one video after another of police killing or hurting unarmed Black people. Then came the election of an openly racist president. These were conditions for a generational revolt.
Call this narrative “Just America.” It’s another rebellion from below. As Real America breaks down the ossified libertarianism of Free America, Just America assails the complacent meritocracy of Smart America. It does the hard, essential thing that the other three narratives avoid, that white Americans have avoided throughout history. It forces us to see the straight line that runs from slavery and segregation to the second-class life so many Black Americans live today—the betrayal of equality that has always been the country’s great moral shame, the heart of its social problems.
In the same way that libertarian ideas had been lying around for Americans to pick up in the stagflated 1970s, young people coming of age in the disillusioned 2000s were handed powerful ideas about social justice to explain their world. The ideas came from different intellectual traditions: the Frankfurt School in 1920s Germany, French postmodernist thinkers of the 1960s and ’70s, radical feminism, Black studies. They converged and recombined in American university classrooms, where two generations of students were taught to think as critical theorists.
Critical theory upends the universal values of the Enlightenment: objectivity, rationality, science, equality, freedom of the individual. These liberal values are an ideology by which one dominant group subjugates another. All relations are power relations, everything is political, and claims of reason and truth are social constructs that maintain those in power...But in identity politics, equality refers to groups, not individuals, and demands action to redress disparate outcomes among groups—in other words, equity, which often amounts to new forms of discrimination. In practice, identity politics inverts the old hierarchy of power into a new one: bottom rail on top. The fixed lens of power makes true equality, based on common humanity, impossible...By the turn of the millennium, these ideas were nearly ubiquitous in humanities and social-science departments. Embracing them had become an important credential for admittance into sectors of the professorate...In turn, these scholars formed the worldview of young Americans educated by elite universities to thrive in the meritocracy.
Millions of young Americans were steeped in the assumptions of critical theory and identity politics without knowing the concepts...Here is the revolutionary power of the narrative: What had been considered, broadly speaking, American history (or literature, philosophy, classics, even math) is explicitly defined as white, and therefore supremacist...The most radical version of the narrative lashes together the oppression of all groups in an encompassing hell of white supremacy, patriarchy, homophobia, transphobia, plutocracy, environmental destruction, and drones...There are too many things that Just America can’t talk about for the narrative to get at the hardest problems. It can’t talk about the complex causes of poverty. Structural racism—ongoing disadvantages that Black people suffer as a result of policies and institutions over the centuries—is real. But so is individual agency, and in the Just America narrative, it doesn’t exist. The narrative can’t talk about the main source of violence in Black neighborhoods, which is young Black men, not police.
...another way to understand Just America is in terms of class. Why does so much of its work take place in human-resources departments, reading lists, and awards ceremonies? In the summer of 2020, the protesters in the American streets were disproportionately Millennials with advanced degrees making more than $100,000 a year. Just America is a narrative of the young and well educated, which is why it continually misreads or ignores the Black and Latino working classes. The fate of this generation of young professionals has been cursed by economic stagnation and technological upheaval. The jobs their parents took for granted have become much harder to get, which makes the meritocratic rat race even more crushing. Law, medicine, academia, media—the most desirable professions—have all contracted. The result is a large population of overeducated, underemployed young people living in metropolitan areas...The historian Peter Turchin coined the phrase elite overproduction to describe this phenomenon. He found that a constant source of instability and violence in previous eras of history, such as the late Roman empire and the French Wars of Religion, was the frustration of social elites for whom there were not enough jobs. Turchin expects this country to undergo a similar breakdown in the coming decade.
**********
All four of the narratives I’ve described emerged from America’s failure to sustain and enlarge the middle-class democracy of the postwar years. They all respond to real problems. Each offers a value that the others need and lacks ones that the others have. Free America celebrates the energy of the unencumbered individual. Smart America respects intelligence and welcomes change. Real America commits itself to a place and has a sense of limits. Just America demands a confrontation with what the others want to avoid.
In Free America, the winners are the makers, and the losers are the takers who want to drag the rest down in perpetual dependency on a smothering government. In Smart America, the winners are the credentialed meritocrats, and the losers are the poorly educated who want to resist inevitable progress. In Real America, the winners are the hardworking folk of the white Christian heartland, and the losers are treacherous elites and contaminating others who want to destroy the country. In Just America, the winners are the marginalized groups, and the losers are the dominant groups that want to go on dominating.
I don’t much want to live in the republic of any of them...I don’t think we are dying. We have no choice but to live together—we’re quarantined as fellow citizens. Knowing who we are lets us see what kinds of change are possible...Meanwhile, we remain trapped in two countries. Each one is split by two narratives—Smart and Just on one side, Free and Real on the other. Neither separation nor conquest is a tenable future. The tensions within each country will persist even as the cold civil war between them rages on.

Friday, June 11, 2021

Social Media isn't the problem...We are.

Mark Manson is one really smart guy. I have to pass on a précis of a nicely structured piece he points to in his weekly newsletter:

He begins his piece by recalling relatively recent moral panics over offensive music and violent video games:

Today, we chuckle at the hair metal bands of the late eighties as innocent fun while the shocking hip hop of the early nineties has evolved into a cornerstone of our modern culture. And after hundreds of studies across multiple decades, the American Psychological Association reports that they still haven’t found any evidence that playing video games motivates people to commit violence...Time has resolved our collective anxiety. The new has become the old, the shocking has become the expected. Yet, today we find ourselves in the grips of another moral panic—this time around social media.

1 - The New Culprit

Manson lists a number of books decrying the effects of social media and notes that:

...one has hit that “sky is falling” pitch of hysteria quite like the recent Netflix film, The Social Dilemma. I would call it a documentary except that there is a conspicuous absence of any data or actual scientific evidence in it. Instead, we’re treated to fictionalized reenactments of the repeated warnings given by tech industry “experts,” all of whom simply repeat and reinforce one another’s opinions for 94 minutes...The tech author and social media defender, Nir Eyal, has told me that the entirety of his three-hour interview was left out of the film, as was all but about ten seconds of the interview with another skeptic of social media criticism, Jonathan Haidt.
Then,
The problem is the data...there’s been research on social media and its effects on people. Lots of it...how it affects adults, how it affects children, how it influences politics and mood and self-esteem and general happiness...the results will probably surprise you. Social media is not the problem...We are.
2 - Three Common Criticisms of Social Media That Are Wrong
Criticism One: Social Media Harms Mental Health
...over the past two decades, we have seen a worrying increase in rates of suicide, depression, and anxiety, especially in young people. But it’s not clear that social media is the cause...A lot of scary research on social media usage is correlational research...lots of social media usage = lots of depressed teenagers...The problem with studies like this is that it’s a chicken-and-egg situation. Is it that social media causes kids to feel more depressed? Or is it that really depressed kids are more likely to use social media?...correlational studies kind of suck... So why do people do them?...because they’re easy.
It’s very easy to round up a few hundred kids, ask them how much they use social media, then ask them if they feel anxious or depressed, and create a spreadsheet. It’s much, much harder to round up thousands of kids, track them over the course of a decade and calculate how any shifts or changes in their social media usage actually affect their mental health over the years...Well, researchers with a lot of time and money have run those longitudinal studies, and the results are in...Manson lists a number of studies that are...leaning towards the conclusion that it’s anxiety and depression that drives us to use social media in all the horrible ways we use it—not the other way around.
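To make the chicken-and-egg point concrete, here is a toy simulation (my sketch, not anything from Manson's piece): two opposite causal stories generate the same cross-sectional correlation, which is exactly why a one-shot survey cannot tell them apart and longitudinal designs are needed.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Story A: depression drives social media use.
depression_a = rng.normal(size=n)
usage_a = 0.5 * depression_a + rng.normal(size=n)

# Story B: social media use drives depression.
usage_b = rng.normal(size=n)
depression_b = 0.5 * usage_b + rng.normal(size=n)

# A single cross-sectional snapshot cannot distinguish the two stories:
print(np.corrcoef(usage_a, depression_a)[0, 1])  # ~0.45
print(np.corrcoef(usage_b, depression_b)[0, 1])  # ~0.45
```

Both simulated datasets show a correlation of roughly 0.45 even though the causal arrows point in opposite directions; only following the same people over time can separate them.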
Then, there’s the studies you never hear about. Like the one from 2012 that found posting status updates on Facebook reduces feelings of loneliness. Or the one from earlier this year that found activity on Twitter can potentially increase happiness. Or one that found that active social media use actually decreases symptoms of depression and anxiety...Those of us who were around in 2004 can remember why social media was such a big deal in the first place—it connected you to everybody in your life in a way that was simply impossible in the before-times. And those initial benefits of social media are so immediate and obvious that we’ve likely become inured to them and take them for granted.
Criticism Two: Social Media Causes Political Extremism or Radicalization
...three facts make it unlikely that social media is the culprit: 
-Studies show that political polarization has increased most among the older generations who use social media the least. Younger generations who are more active on social media tend to have more moderate views. 
-Polarization has been widening in the United States and many other countries since the 1970s, long before the advent of the internet. 
-Polarization has not occurred universally around the world. In fact, some countries are experiencing less polarization than in previous decades.
Criticism Three: Big Tech Companies Are Profiting Off the Mayhem
...social media is not destroying society, and even if it was, Big Tech is not fanning the flames. They’re actually spending a lot of money trying to put it out...These companies have spent billions in efforts to fight back against disinformation and conspiracy theories. A recent study to see if Google’s algorithm promoted extremist alt-right content actually found the opposite: the YouTube algorithm seemed to go out of its way to promote mainstream, established news sites far more often than its fringe looney figures...Similarly, last year Facebook banned tens of thousands of conspiracy theorist and terrorist groups. This has been part of their ongoing campaign to clean up their platform. They’ve hired over 10,000 new employees in the past two years just to review content on the site for disinformation and violence.
3 - But Clearly Something’s Not Right… So What Is It?
Back in the 90s, conspiracy theories like the Y2K computer armageddon were just as common as they are now. The difference was that they were far less harmful because the social networks that existed at the time cut them off aggressively at the source...But today someone goes online, finds a web forum, or a Facebook group or a Clubhouse room, and all the little Y2Kers get together and spend all of their time socializing and validating each other based on the shared assumption that the world is about to end...Facebook didn’t create the crazy Y2Kers. It merely gives them an opportunity to find each other and connect—because, for better or worse, Facebook gives everybody the opportunity to find each other and connect...This asymmetry in beliefs is important, as the more extreme and negative the belief, the more motivated the person is to share it with others. And when you build massive platforms based on sharing… well, things get ugly.
4 - The 90/9/1 Rule
The Pareto Principle or the 80/20 Rule states that 80% of results come from 20% of the processes. I.e., 80% of a company’s revenue will often come from 20% of its customers; 80% of your social life is probably spent with 20% of your friends; 80% of traffic accidents are caused by 20% of the drivers; 80% of the crime is committed by 20% of the people. Etc.
People who have studied social networks and online communities have found a similar rule to describe information shared on the internet, dubbed the “90/9/1 Rule.” It states that in any social network or online community, 1% of the users generate 90% of the content, 9% of the users create 10% of the content, and the other 90% of people are mostly silent observers...Let’s call the 1% who create 90% of the content creators. We’ll call the 9% the engagers—as most of their content is a reaction to what the 1% is creating—and the 90% who are merely observers, we’ll refer to as lurkers.
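As a rough illustration of how such skewed shares arise (my sketch, with an arbitrarily chosen distribution, not data from any platform), drawing per-user activity from a heavy-tailed Pareto distribution reproduces both the 80/20 flavor and the dominance of a tiny creator class:

```python
import numpy as np

rng = np.random.default_rng(1)

# Per-user posting activity drawn from a classical Pareto distribution;
# the shape parameter 1.16 is the textbook "80/20" value, chosen for
# illustration rather than estimated from any real platform.
posts = 1.0 + rng.pareto(1.16, size=100_000)

posts_sorted = np.sort(posts)[::-1]               # most active users first
share = np.cumsum(posts_sorted) / posts.sum()

n = posts.size
print(f"top  1% of users produce ~{share[n // 100 - 1]:.0%} of the content")
print(f"top 20% of users produce ~{share[n // 5 - 1]:.0%} of the content")
```

With this shape the top 20% of users typically account for on the order of 80% of the content and the top 1% for roughly half (exact values vary run to run because the distribution is so heavy-tailed), the same qualitative pattern the 90/9/1 rule describes.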
The...dynamic of social networks comes to reflect Bertrand Russell’s old lament: “The whole problem with the world is that fools and fanatics are always so certain of themselves, and wiser people so full of doubts.”...The creators are largely the fools and fanatics who are so certain of themselves...It’s not necessarily the platform’s algorithms that favor these fanatics—it’s that human psychology favors these fools and fanatics and the algorithms simply reflect our psychology back to us...Issues that are important to small but loud minorities dictate the discussion of the majority...Because radical and unconventional views exert a disproportionate influence online, they are mistakenly seen as common and conventional...
People develop extreme and irrational levels of pessimism. Because creators online tend to be the doomsayers and extremists, the overall perception of the state of the world skews increasingly negative. Polling data shows optimism in much of the developed world to be at all-time lows despite the fact that by almost every statistical measurement—wealth, longevity, peace, education, equality, technology, etc.—we live in the best time in human history, and it’s not even close...Much of this can be summed up in the simple phrase: social media does not accurately reflect the underlying society.
5 - Optimizing for Controversy vs Consensus
A few generations ago, there were only a few television channels, a few radio stations, and a few international news services...if you were in charge of one of these few channels of information, it was in your interest to produce content that appealed to as many people as possible...what we saw was a traditional media in the 20th century that largely sought to produce content focused on consensus.
But with the internet, the supply of information exploded. Suddenly everyone had 500 TV channels and dozens of radio stations and an infinite number of websites to choose from...Therefore, the most profitable strategy in media and entertainment stopped being consensus and instead became controversy...This optimization for controversy trickled down, all the way from politicians and major news outlets, to individual influencers on social media...The result is this fun-house-mirror version of reality, where you go online (or turn on cable news) and feel like the world is constantly falling apart around you, but it’s not.
And the fun-house-mirror version of reality isn’t caused by social media, it’s caused by the profit incentives on media/entertainment in an environment where there’s way more supply of content than there is demand. Where there’s far more supply of news and information than there is time to consume it. Where there’s a natural human proclivity to pay more attention to a single car crash than the hundreds of people successfully driving down the highway, going on their merry way.
6 - The Silent Majority
What we get is a cultural environment in which, as Manson puts it:
Social media has not changed our culture. It’s shifted our awareness of culture to the extremes of all spectrums. And until we all recognize this, it will be impossible to have serious conversations about what to do or how to move forward.
You can affect the culture simply by shifting people’s awareness about certain subjects. The hysteria in the media over their own irrelevance has pushed our culture to a place where we overestimate social media and underestimate our own psychology.
Instead, we must push our perception back to a more realistic and mature understanding of social media and social networks. To do this, it’s important that each of us individually understands concepts such as The Attention Diet and the Attention Economy, that we learn to cut out most news consumption, and, you know, maybe spend some more time outside.
Moral panics find a scapegoat to blame for what we hate to admit about ourselves. My parents and their friends didn’t ask why kids were drawn to such aggressive and vulgar music. They were afraid of what it might have revealed about themselves. Instead, they simply blamed the musicians and the video games.
Similarly, rather than owning up to the fact that these online movements are part of who we are—that these are the ugly underbellies of our society that have existed and persisted for generations—we instead blame the social media platforms for accurately reflecting ourselves back to us.
The great biographer Robert Caro once said, “Power doesn’t always corrupt, but power always reveals.” Perhaps the same is true of the most powerful networks in human history.
Social media has not corrupted us, it’s merely revealed who we always were.



Thursday, June 10, 2021

Storytelling increases oxytocin and positive emotions, decreases cortisol and pain, in hospitalized kids

From Brockington et al.:
Storytelling is a distinctive human characteristic that may have played a fundamental role in humans’ ability to bond and navigate challenging social settings throughout our evolution. However, the potential impact of storytelling on regulating physiological and psychological functions has received little attention. We investigated whether listening to narratives from a storyteller can provide beneficial effects for children admitted to intensive care units. Biomarkers (oxytocin and cortisol), pain scores, and psycholinguistic associations were collected immediately before and after storytelling and an active control intervention (solving riddles that also involved social interaction but lacked the immersive narrative aspect). Compared with the control group, children in the storytelling group showed a marked increase in oxytocin combined with a decrease in cortisol in saliva after the 30-min intervention. They also reported less pain and used more positive lexical markers when describing their time in hospital. Our findings provide a psychophysiological basis for the short-term benefits of storytelling and suggest that a simple and inexpensive intervention may help alleviate the physical and psychological pain of hospitalized children on the day of the intervention.

Wednesday, June 09, 2021

Cultural Evolution of Genetic Heritability

Behavioral and Brain Sciences has accepted an article from Uchiyama et al., whose abstract I copy below, and invites the submission of commentary proposals.
Behavioral genetics and cultural evolution have both revolutionized our understanding of human behavior—largely independent of each other. Here we reconcile these two fields under a dual inheritance framework, offering a more nuanced understanding of the interaction between genes and culture. Going beyond typical analyses of gene-environment interactions, we describe the cultural dynamics that shape these interactions by shaping the environment and population structure. A cultural evolutionary approach can explain, for example, how factors such as rates of innovation and diffusion, density of cultural sub-groups, and tolerance for behavioral diversity impact heritability estimates, thus yielding predictions for different social contexts. Moreover, when cumulative culture functionally overlaps with genes, genetic effects become masked, unmasked, or even reversed, and the causal effects of an identified gene become confounded with features of the cultural environment. The manner of confounding is specific to a particular society at a particular time, but a WEIRD (Western, educated, industrialized, rich, democratic) sampling problem obscures this boundedness. Cultural evolutionary dynamics are typically missing from models of gene-to-phenotype causality, hindering generalizability of genetic effects across societies and across time. We lay out a reconciled framework and use it to predict the ways in which heritability should differ between societies, between socioeconomic levels and other groupings within some societies but not others, and over the life course. An integrated cultural evolutionary behavioral genetic approach cuts through the nature-nurture debate and helps resolve controversies in topics such as IQ.
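The "masking" idea in the abstract has a simple statistical core: heritability is the fraction of phenotypic variance attributable to genes, so two societies with identical genetics but different cultural variance yield different heritability estimates. A minimal toy illustration (mine, not the authors' model):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
genes = rng.normal(size=n)  # genetic contribution, identical in both societies

def heritability(cultural_sd: float) -> float:
    """h^2 = Var(G) / Var(P) for a phenotype P = genes + cultural environment."""
    culture = rng.normal(scale=cultural_sd, size=n)
    phenotype = genes + culture
    return genes.var() / phenotype.var()

print(f"culturally homogeneous society:   h^2 ~ {heritability(0.5):.2f}")  # ~0.80
print(f"culturally heterogeneous society: h^2 ~ {heritability(2.0):.2f}")  # ~0.20
```

The genetic effect is the same in both runs; only the cultural variance differs, yet the heritability estimate swings from about 0.8 to about 0.2.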

Tuesday, June 08, 2021

We’ve been great at extending our lives, but not at ending them.

Steven Pinker does a review of Steven Johnson's recent book "Extra Life - A Short History of Living Longer," whose subject matter overlaps considerably with Chapters 5 ("Life") and 6 ("Health") of Pinker's book "Enlightenment Now" that MindBlog abstracted in a series of posts March 1-12, 2018.
Starting in the second half of the 19th century, the average life span began to climb rapidly, giving humans not just extra life, but an extra life. In rich countries, life expectancy at birth hit 40 by 1880, 50 by 1900, 60 by 1930, 70 by 1960, and 80 by 2010. The rest of the world is catching up. Global life expectancy in 2019 was 72.6 years, higher than that of any country, rich or poor, in 1950...Of the eight innovations that have saved the most lives, as Johnson sees it, six are defenses against infectious disease.
The sin of ingratitude is said to condemn one to the ninth circle of hell, and that’s where we may be headed for our attitudes toward the granters of extra life. In the list of inventions that saved lives by the hundreds of millions, we find antibiotics (squandered to fatten chickens in factory farms), blood transfusions (condemned as sinful by the devout), and chlorination and pasteurization (often mistrusted as unnatural). Among those that saved lives by the billions, we find the lowly toilet and sewer (metaphors for the contemptible), artificial fertilizer (a devil for Whole Foods shoppers) and vaccines (perhaps the greatest invention in history, and the target of head-smackingly stupid resistance).
Johnson shakes us out of our damnable ingratitude and explains features of modernity that are reviled by sectors of the right and left: government regulation, processed food, high-tech farming, big data and bureaucracies like the Food and Drug Administration, the Centers for Disease Control and Prevention and the World Health Organization. He is open about their shortcomings and dangers. But much depends on whether we see them as evils that must be abolished or as lifesavers with flaws that must be mitigated.
A New Yorker essay by Brooke Jarvis also reviews Johnson's book and recaps the story of extending our lives, but moves on to consider another goal that we have been much less clear on - attaining a good death - by reviewing Katie Engelhart’s “The Inevitable: Dispatches on the Right to Die,” which describes the right-to-die underground, a world of people who ask why a medical system so good at extending their lives will do little to help them end those lives in a peaceful and painless way. She tells the stories of people who have managed their exit using the manual “The Peaceful Pill Handbook.”
In the United States, physician-assisted suicide is permitted in a slowly growing number of states, but only to ease the deaths of patients who fit a narrow set of legal criteria. Generally, they must have received a terminal diagnosis with a prognosis of six months or less; be physically able to administer the drugs to themselves; have been approved by doctors as mentally competent to make the decision; and have made a formal request more than once, including after a waiting period.
Doctors who specialize in aid in dying often distinguish between “despair suicides,” the most familiar version, and “rational suicides,” those sought by people who have, in theory, weighed a terminal or painful or debilitating diagnosis and made a measured, almost mathematical choice about how best to deal with it. In practice, though, Engelhart finds that it’s hard to isolate pure rationality; many emotional factors always seem to tilt the scales. People worry about their lives having a sense of narrative integrity and completion. They worry about autonomy, and about “dignity” (this is another word that comes up a lot, and when Engelhart digs in she finds that many people define it quite specifically: control over one’s own defecation and mess). They worry about what other people will think of them. They worry about who will take care of them when they can no longer take care of themselves.
Behind every fraught ethical debate about physician-assisted suicide stands this inescapable reality: there are many people for whom the way we do things is not working. The right to die can’t be extricated from a right to care. One of the doctors Engelhart interviews—an oncologist in Belgium, where euthanasia laws are widely supported, and aid in dying is legal even for psychiatric patients who request it and qualify—tells her that America is not ready for such laws. “It’s a developing country,” he says. “You shouldn’t try to implement a law of euthanasia in countries where there is no basic healthcare.”

Monday, June 07, 2021

Making the hard problem of consciousness easier

Yet another morsel of text for consciousness mavens. Melloni et al. (open source) describe efforts to narrow down which of several current theories best explains conscious experience, using the approach of adversarial collaboration...
...adversarial collaboration rests on identifying the most diagnostic points of divergence between competing theories, reaching agreement on precisely what they predict, and then designing experiments that directly test those diverging predictions. During the past 2 years, several groups have adopted this approach...The global neuronal workspace theory (GNWT)  claims that consciousness is instantiated by the global broadcasting and amplification of information across an interconnected network of prefrontal-parietal areas and many high-level sensory cortical areas...Conversely, the integrated information theory (IIT)  holds that consciousness should be understood in terms of cause-effect “power” that reflects the amount of maximally irreducible integrated information generated by certain neuronal architectures. On the basis of mathematical and neuroanatomical considerations, the IIT holds that the posterior cortex is ideally situated for generating a maximum of integrated information ...experiments designed by neuroscientists and philosophers not directly associated with the theories are being conducted in six independent laboratories.
I pass on their summary graphic:

Friday, June 04, 2021

Inequality is a law of nature

DeDeo and Hobson do a commentary on a model developed by Kawakatsu et al. (open source) that explains the emergence of hierarchy in networked endorsement dynamics. I pass on a few clips from both, and after that list titles with links to a number of previous MindBlog posts that have presented explanations of why inequality and hierarchy are features of all natural systems. First, from DeDeo and Hobson:
As an old Scottish proverb says, “give a Dog an ill Name, and he’ll soon be hanged.” Even when the signal has little to do with underlying reality, endorsement—or contempt—can produce lasting consequences for a person’s social position. The ease with which such pieces of folk wisdom translate across both time and species suggests that there is a general, and even perhaps universal, logic to hierarchies and how they form. Kawakatsu et al. make an important advance in the quest for this kind of understanding, providing a general model for how subtle differences in individual-level decision-making can lead to hard-to-miss consequences for society as a whole...Their work reveals two distinct regimes—one egalitarian, one hierarchical—that emerge from shifts in individual-level judgment. These lead to statistical methods that researchers can use to reverse engineer observed hierarchies, and understand how signaling systems work when prestige and power are in play. The results make a singular contribution at the intersection of two distinct traditions of research into social power: the mechanistic (how hierarchies get made) and the functional (the adaptive roles they can play in society).
Kawakatsu et al.'s abstract:
Many social and biological systems are characterized by enduring hierarchies, including those organized around prestige in academia, dominance in animal groups, and desirability in online dating. Despite their ubiquity, the general mechanisms that explain the creation and endurance of such hierarchies are not well understood. We introduce a generative model for the dynamics of hierarchies using time-varying networks, in which new links are formed based on the preferences of nodes in the current network and old links are forgotten over time. The model produces a range of hierarchical structures, ranging from egalitarianism to bistable hierarchies, and we derive critical points that separate these regimes in the limit of long system memory. Importantly, our model supports statistical inference, allowing for a principled comparison of generative mechanisms using data. We apply the model to study hierarchical structures in empirical data on hiring patterns among mathematicians, dominance relations among parakeets, and friendships among members of a fraternity, observing several persistent patterns as well as interpretable differences in the generative mechanisms favored by each. Our work contributes to the growing literature on statistically grounded models of time-varying networks.
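To make the abstract's two ingredients concrete (new links formed according to current preferences, old links forgotten over time), here is a drastically simplified toy dynamics in that spirit. This is my sketch with an invented parameterization, not the authors' model: a single "steepness" parameter beta controls whether endorsements stay spread out (egalitarian regime) or pile onto a few nodes (hierarchical regime).

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate(beta: float, n: int = 30, steps: int = 2000, decay: float = 0.99):
    """Toy endorsement dynamics: at each step a random node endorses another,
    chosen with probability proportional to exp(beta * current prestige);
    old endorsements fade at rate `decay`. Returns final prestige shares."""
    prestige = np.zeros(n)
    for _ in range(steps):
        endorser = rng.integers(n)
        w = np.exp(beta * (prestige - prestige.max()))  # numerically stable
        w[endorser] = 0.0                               # no self-endorsement
        target = rng.choice(n, p=w / w.sum())
        prestige *= decay                               # forgetting of old links
        prestige[target] += 1.0
    return prestige / prestige.sum()

for beta in (0.0, 0.5):
    shares = simulate(beta)
    print(f"beta={beta}: top node holds {shares.max():.0%} of total prestige")
```

With beta = 0 endorsements stay spread out; raising beta lets small random advantages compound into a stable hierarchy in which one node collects nearly all the prestige, which is the kind of regime shift the paper derives analytically.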
And, I list a few relevant past MindBlog posts:
Wealth inequality as a law of nature.
The science of inequality.
The Pareto Principle - unfairness is a law.
Simple mechanisms can generate wealth inequality.
A choice mind-set perpetuates acceptance of wealth inequality.

Thursday, June 03, 2021

Optogenetics used to induce pair bonding and restore vision.

I want to note two striking technical advances that make use of the light-activated protein rhodopsin that I spent 36 years of my laboratory life studying. Using genetic techniques, a version of this protein found in algae, called channelrhodopsin, can be inserted into nerve cells so that they become activated by light. Hughes does a lucid explanation of a technical tour de force in bioengineering reported by Yang et al. They used transgenic mice in which light-sensitive dopaminergic (DA) neurons in the ventral tegmental area (VTA) brain region (involved in processing reward and promoting social behavior) can be activated by blue light pulses from a tiny LED device implanted under the skull. It is known that some VTA areas fire in synchrony when two mice (or humans) are cooperating or bonding. When two male mice were dropped into a cage, they exhibited mild animus towards each other, but when both were zapped with blue light at the same high frequency they clung to and started grooming each other! (Aside from being forbidden and impractical in humans, how about this means of getting someone to like you!...all you would have to do is control the transmitters controlling VTA DA neuron activity in yourself and your intended.)

A second striking use of optogenetics is reported in Zimmer's summary of work of Sahel et al., who have partially restored sight in one eye of a blind man with retinitis pigmentosa, a hereditary disease that destroys light-sensitive photoreceptor cells in the retina but spares the ganglion cell layer whose axons normally send visual information to the brain. Here is the Sahel et al. abstract:

Optogenetics may enable mutation-independent, circuit-specific restoration of neuronal function in neurological diseases. Retinitis pigmentosa is a neurodegenerative eye disease where loss of photoreceptors can lead to complete blindness. In a blind patient, we combined intraocular injection of an adeno-associated viral vector encoding ChrimsonR with light stimulation via engineered goggles. The goggles detect local changes in light intensity and project corresponding light pulses onto the retina in real time to activate optogenetically transduced retinal ganglion cells. The patient perceived, located, counted and touched different objects using the vector-treated eye alone while wearing the goggles. During visual perception, multichannel electroencephalographic recordings revealed object-related activity above the visual cortex. The patient could not visually detect any objects before injection with or without the goggles or after injection without the goggles. This is the first reported case of partial functional recovery in a neurodegenerative disease after optogenetic therapy.

Wednesday, June 02, 2021

I’m Not Scared to Reenter Society. I’m Just Not Sure I Want To.

I have to pass on a few clips from Tim Kreider's piece in the Atlantic:
...after a year in isolation, I, at least, have gotten acclimated to a different existence—quieter, calmer, and almost entirely devoid of bullshit. If you’d told me in March 2020 that quarantine would last more than a year, I would have been appalled; I can’t imagine how I would’ve reacted if you’d told me, once it ended, I would miss it.
Quarantine has given us all time and solitude to think—a risk for any individual, and a threat to any status quo. People have gotten to have the experience—some of them for the first time in their life—of being left alone...Relieved of the... world’s battering demands and expectations, people’s personalities have started to assume their true shape. And a lot of them don’t want to return to wasting their days in purgatorial commutes, to the fluorescent lights and dress codes and middle-school politics of the office. Service personnel are apparently ungrateful for the opportunity to get paid not enough to live on by employers who have demonstrated they don’t care whether their workers live or die. More and more people have noticed that some of the basic American axioms—that hard work is a virtue, productivity is an end in itself—are horseshit. I’m remembering those science-fiction stories in which someone accidentally sees behind the façade of their blissful false reality to the grim dystopia they actually inhabit.
Maybe this period of seeming dormancy, of hibernation, has actually been a phase of metamorphosis. Though, before caterpillars become butterflies, they first digest themselves, dissolving into an undifferentiated mush called “the pupal soup.” People are at different stages of this transformation—some still unformed, some already opulently emergent. Some of us may wither on exposure to the air. Escape from the chrysalis is always a struggle. Me, I am still deep in the mush phase, still watching TV on the couch, trying to finish just this one essay, awaiting, with vague faith in the forces that shape us, whatever imago is assembling within.

Tuesday, June 01, 2021

Watching brain regions that help us anticipate what's going to happen next.

A primary function of the brain is to adaptively use past experience to generate expectations about events that are likely to occur in the future. Lee et al. have used a machine learning model to analyze fMRI measurements made on 30 individuals during repeated viewings of a movie, and a nice summary of this work is presented by the PNAS Journal Club. Areas in the frontal cortex anticipate (possibly foreseeing movie plot changes) up to 15 seconds in advance, while cortical areas at the back of the brain anticipate only about 1 second ahead. That is, frontal regions of the brain can keep track of tens of seconds, compared with only a few seconds at the back of the brain.

Vertical slices of the brain, imaged at different locations, reveal a timescale gradient for anticipation. Timescales are short at the back of the brain (cool blues) and longer at the front (warm reds).

These results demonstrate a hierarchy of anticipatory signals in the human brain and link them to subjective experiences of events...This hierarchical view of the brain is very different from the traditional, modular view,...In the traditional view, there are systems that process raw sensory inputs, such as sights or sounds, and there are separate systems that call up memories or make plans. The latest findings blur the lines between those systems, showing that regions known for processing simple pieces of visual information, such as the visual cortex, can also anticipate what’s coming up soon, even if just by a few seconds.
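Lee et al.'s actual analysis used a machine-learning event-segmentation model, but the underlying logic, that a region "anticipates" if its activity during a repeated viewing runs ahead of its activity during the first viewing, can be illustrated with a simple lagged-correlation sketch on synthetic time courses (everything here, including the 1.5 s sampling interval, is an assumption for illustration, not the authors' method):

```python
import numpy as np

rng = np.random.default_rng(4)
TR = 1.5  # seconds per fMRI volume (an assumed, typical value)

# A synthetic regional time course; on repeated viewing the region runs
# 10 volumes ahead of where it was on the first viewing.
base = rng.normal(size=410).cumsum()
first_view = base[:400]
repeat_view = base[10:410]

def anticipation_lag(first, repeat, max_lag=20):
    """Lag at which repeat-view activity best matches first-view activity
    shifted later in time; a larger lag means more anticipation."""
    n = len(first)
    corrs = [np.corrcoef(repeat[: n - lag], first[lag:])[0, 1]
             for lag in range(1, max_lag + 1)]
    return 1 + int(np.argmax(corrs))

print(f"anticipation: {anticipation_lag(first_view, repeat_view) * TR:.1f} s")  # ~15 s
```

A frontal-like region would show a large best-matching lag of this sort, while a sensory region at the back of the brain would show a lag near a single volume.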

Monday, May 31, 2021

Will our American democracy collapse?

An apocalyptic opinion piece by Krugman induces me to ruminations of the sort I’ve been inflicting on MindBlog readers in recent posts (see the last paragraph). But first, a few edited clips from Krugman. He notes:
...our two major political parties are very different in their underlying structures. The Democrats are a coalition of interest groups — labor unions, environmentalists, L.G.B.T.Q. activists and more. The Republican Party is the vehicle of a cohesive, monolithic movement. This is often described as an ideological movement, although given the twists and turns of recent years — the sudden embrace of protectionism, the attacks on “woke” corporations (see Edsall’s piece “Is Wokeness ‘Kryptonite for Democrats’?”) — the ideology of movement conservatism seems less obvious than its will to power.
America’s democratic experiment may well be nearing its end...Republicans might take power legitimately; they might win through pervasive voter suppression; G.O.P. legislators might simply refuse to certify Democratic electoral votes and declare Donald Trump or his political heir the winner. However it plays out, the G.O.P. will try to ensure a permanent lock on power and do all it can to suppress dissent.
...how did we get here?...the predominance of craven careerists is what has made the Republican Party so vulnerable to authoritarian takeover...a great majority of Republicans in Congress know that the election wasn’t stolen. Very few really believe that the storming of the Capitol was a false-flag antifa operation or simply a crowd of harmless tourists. But decades as a monolithic, top-down enterprise have filled the G.O.P. with people who will follow the party line wherever it goes... So if Trump or a Trump-like figure declares that we have always been at war with East Asia, well, his party will say that we’ve always been at war with East Asia. If he says he won a presidential election in a landslide, never mind the facts, they’ll say he won the election in a landslide...The point is that neither megalomania at the top nor rage at the bottom explains why American democracy is hanging by a thread. Cowardice, not craziness, is the reason government by the people may soon perish...

Does the democratic experiment end? Does the U.S. become an autocracy to protect the wealth and well-being of financially secure white men like myself? My early childhood experiences of being an outsider watching from the periphery of groups inclines me to view the current ‘crisis in democracy’ as another installment of the zig-zag that has resonated throughout history: Autocratic order -> inequality and poverty -> revolution -> more equity and equality but democratic chaos -> new privileged class assumes autocratic power -> repeat. If I were to get excited it would be on the side of democracy, but I am uncertain about whether either a democracy or an autocratic government is up to the task of regulating its human herd in our hi-tech future. Yuval Harari repeatedly makes this point in his writings. As I indicated in another recent rumination, I think it likely that our future lies in the hands of an ill-defined oligarchy of international information technology corporations managed by an educated elite. (‘Ill-defined’ because it is hard to imagine a global cabal - some modern version of the paranoid ‘Elders of Zion’ fantasy - running the show better than the current autocratic regimes that are botching things up.)

Friday, May 28, 2021

The latest on wearable stress-relief

Over its 15 year history MindBlog has done several reviews of widgets currently on the market that are meant to monitor and relieve stress. Chiu has done a review of three current devices you might have a look at if you have $250-$350 to burn. It seems that most users find mild benefits when trying them, but don't feel a strong urge to continue using them. The online reviews offered by vendors are predictably ecstatic (placebo effects, anyone?). There is no peer-reviewed research from large-scale clinical trials of their efficacy yet available.

The Apollo Neuro is a slightly curved rectangular box on a band that can fit around wrist or ankle and issue "soothing vibrations that speak to your nervous system" to increase heart-rate variability during normal activity and sleep... Hmmmmm. The other widgets are used in sessions that are set aside from daily activities. The Sensate 2 looks like a smooth river rock that is placed on your chest during sessions and "combines vibrations, or “sonic frequencies,” synchronized with specially composed soundtracks to enhance relaxation." The Muse 2 is a meditation headband whose built-in sensors monitor brain waves, using “advanced signal processing” to translate them into sounds of weather.

I tried similar widgets 5-10 years ago. This time, I don't think I'm gonna go there....from Chiu's article: "most people don’t need wearables or technology to effectively manage stress... Before investing in a device...why not try well-established, free approaches, such as spending time in nature, exercising, practicing mindfulness and cultivating social interactions."

Thursday, May 27, 2021

Bias Is a Big Problem. But So Is ‘Noise.’

I have to pass on the strong dose of sanity offered by Daniel Kahneman and his colleagues in a recent NYTimes guest essay. They make some elemental distinctions that are important to keep in mind. Some edited clips:
A bias is any predictable error that inclines your judgment in a particular direction (for instance against women or in favor of Ivy League graduates, or when forecasts of sales are consistently optimistic or investment decisions overly cautious).
There is another type of error that attracts far less attention: noise. While bias is the average of errors, noise is their variability. In a 1981 study, for example, 208 federal judges were asked to determine the appropriate sentences for the same 16 cases...The average difference between the sentences that two randomly chosen judges gave for the same crime was more than 3.5 years. Considering that the mean sentence was seven years, that was a disconcerting amount of noise...In 2015, we conducted a study of underwriters in a large insurance company. Forty-eight underwriters were shown realistic summaries of risks to which they assigned premiums, just as they did in their jobs...the typical difference we found between two underwriters was an astonishing 55 percent of their average premium.
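In the authors' terms, if each judgment carries an error (judgment minus reference value), bias is the mean of those errors and noise is their spread. A minimal sketch with made-up sentencing numbers (illustrative values, not the 1981 study's data):

```python
import numpy as np

# Hypothetical sentences (in years) from ten judges for the same case,
# with a reference sentence of 7 years.
sentences = np.array([4.0, 9.5, 6.0, 11.0, 5.5, 8.0, 3.0, 10.0, 7.5, 6.5])
errors = sentences - 7.0

bias = errors.mean()         # the systematic tilt of the whole group
noise = errors.std(ddof=1)   # judge-to-judge variability around that tilt
print(f"bias:  {bias:+.2f} years")   # ~ +0.1: almost no systematic tilt
print(f"noise: {noise:.2f} years")   # ~  2.6: large scatter between judges

# The quantity reported in the judges study: the expected gap between
# two randomly chosen judges for the same crime.
gaps = np.abs(sentences[:, None] - sentences[None, :])
print(f"mean pairwise gap: {gaps[np.triu_indices(10, k=1)].mean():.2f} years")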
Where does noise come from? ...irrelevant circumstances can affect judgments...a judge’s mood, fatigue and even the weather can all have modest but detectable effects on judicial decisions. Another source is general tendencies...There are “hanging” judges and lenient ones...a third source is different patterns of assessment (say, which types of cases they believe merit being harsh or lenient about). Underwriters differ in their views of what is risky, and doctors in their views of which ailments require treatment. We celebrate the uniqueness of individuals, but we tend to forget that, when we expect consistency, uniqueness becomes a liability.
Once you become aware of noise, you can look for ways to reduce it. For instance, independent judgments from a number of people can be averaged (a frequent practice in forecasting). Guidelines, such as those often used in medicine, can help professionals reach better and more uniform decisions. As studies of hiring practices have consistently shown, imposing structure and discipline in interviews and other forms of assessment tends to improve judgments of job candidates.
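The first of these remedies has a simple statistical basis: averaging k independent judgments leaves any shared bias untouched but shrinks noise by a factor of the square root of k. A quick simulation check (my sketch, with arbitrary numbers):

```python
import numpy as np

rng = np.random.default_rng(5)
true_value, bias, noise = 100.0, 5.0, 20.0

def averaged_judgment(k: int, trials: int = 50_000) -> np.ndarray:
    """Each trial averages k independent judgments sharing the same bias."""
    raw = true_value + bias + rng.normal(scale=noise, size=(trials, k))
    return raw.mean(axis=1)

for k in (1, 4, 16):
    j = averaged_judgment(k)
    print(f"k={k:2d}: bias = {j.mean() - true_value:+5.1f}, noise = {j.std():5.1f}")
# Noise falls roughly as 20 / sqrt(k): ~20, ~10, ~5. The +5 bias never budges,
# which is why averaging helps with noise but not with bias.
```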
No noise-reduction techniques will be deployed, however, if we do not first recognize the existence of noise. Noise is too often neglected. But it is a serious issue that results in frequent error and rampant injustice. Organizations and institutions, public and private, will make better decisions if they take noise seriously.

Wednesday, May 26, 2021

Darwin’s insights: How the evolutionary perspective has come to permeate the social sciences.

I want to pass on a review article on human evolution by Richerson, Gavrilets, and de Waal (open source) in a recent issue of Science Magazine. Here is the Editor's summary:

150 years of The Descent of Man

Charles Darwin's The Descent of Man was published in 1871. Ever since, it has been the foundation stone of human evolutionary studies. Richerson et al. have reviewed how modern studies of human biological and cultural evolution reflect the ideas in Darwin's work. They emphasize how cooperation, social learning, and cumulative culture in the ancestors of modern humans were key to our evolution and were enhanced during the environmental upheavals of the Pleistocene. The evolutionary perspective has come to permeate not just the biological sciences but the social sciences as well.

Tuesday, May 25, 2021

Can You Have More Than 150 Friends?

MindBlog has done more than 9 posts over the past 15 years (enter Dunbar in the search box in the left column of this web page) pointing to Robin Dunbar's work showing that for a large number of animal species brain size and social group size get larger together, with his curve predicting that the optimal group size for humans is about 150. The status of this widely accepted number has been challenged by Lind and collaborators, whose article argues that no single number can be reliably derived. Here is their abstract and a few clips from their discussion:
A widespread and popular belief posits that humans possess a cognitive capacity that is limited to keeping track of and maintaining stable relationships with approximately 150 people. This influential number, ‘Dunbar's number’, originates from an extrapolation of a regression line describing the relationship between relative neocortex size and group size in primates. Here, we test if there is statistical support for this idea. Our analyses on complementary datasets using different methods yield wildly different numbers. Bayesian and generalized least-squares phylogenetic methods generate approximations of average group sizes between 69–109 and 16–42, respectively. However, enormous 95% confidence intervals (4–520 and 2–336, respectively) imply that specifying any one number is futile. A cognitive limit on human group size cannot be derived in this manner.
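The statistical complaint is easy to reproduce. Dunbar's number is a point on a log-log regression line extrapolated out to the human neocortex ratio, and the prediction interval around that point balloons at the extrapolated end. A sketch with invented primate data (all numbers are hypothetical; only the method mirrors the debate):

```python
import numpy as np

rng = np.random.default_rng(6)

# Invented primate data: log neocortex ratio vs. log mean group size.
log_ncx = rng.uniform(0.3, 1.2, size=30)
log_grp = 0.5 + 3.0 * log_ncx + rng.normal(scale=0.35, size=30)

# Ordinary least squares fit in log-log space.
X = np.column_stack([np.ones_like(log_ncx), log_ncx])
beta, *_ = np.linalg.lstsq(X, log_grp, rcond=None)
resid = log_grp - X @ beta
s2 = resid @ resid / (len(log_grp) - 2)

# Extrapolate to a "human" neocortex ratio beyond the sampled range.
x0 = np.array([1.0, 1.5])
pred = x0 @ beta
se = np.sqrt(s2 * (1.0 + x0 @ np.linalg.inv(X.T @ X) @ x0))  # prediction SE

lo, hi = pred - 1.96 * se, pred + 1.96 * se
print(f"point estimate: {np.exp(pred):.0f} people")
print(f"95% prediction interval: {np.exp(lo):.0f} to {np.exp(hi):.0f} people")
```

The point estimate looks reassuringly crisp, but exponentiating the interval multiplies the uncertainty into a span running from a few dozen to several hundred people, which is the authors' core objection to quoting any single number.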
Ruiter et al. make the point that
Dunbar's assumption that the evolution of human brain physiology corresponds with a limit in our capacity to maintain relationships ignores the cultural mechanisms, practices, and social structures that humans develop to counter potential deficiencies...Human information process management, we argue, cannot be understood as a simple product of brain physiology. Cross-cultural comparison of not only group size but also relationship-reckoning systems like kinship terminologies suggests that although neocortices are undoubtedly crucial to human behavior, they cannot be given such primacy in explaining complex group composition, formation, or management.
An article by Jenny Gross quotes Dunbar's responses to the above.
The new analysis, he said, “is bonkers, absolutely bonkers,” adding that the Stockholm University researchers conducted a flawed statistical analysis and misunderstood both the nuances of his analyses and of human connections. “I marvel at their apparent failure to understand relationships.”
Dr. Dunbar defines meaningful relationships as those people you know well enough to greet without feeling awkward if you ran into them in an airport lounge. That number typically ranges from 100 to 250, with the average around 150...Around 6000 B.C., the size of Neolithic villages from the Middle East was 120 to 150 people, judging by the number of dwellings. In 1086, the average size of most English villages recorded in the Domesday Book was 160 people. In modern armies, fighting units contain an average of 130 to 150 people, he said...Dr. Dunbar contended that his theory is still viable, even in today’s hyper-connected world, since the quality of connections on social networks is often low. “These are not personalized relationships,” he said...“It’s fairly blatantly obvious to most people when they sit down and think about it that that’s how their social network is organized,” he said. Dunbar’s number, he said, is not going anywhere.

Monday, May 24, 2021

For consciousness theory mavens: an argument against Tononi’s Integrated Information Theory.

Behavioral and Brain Sciences invites commentary on a forthcoming article by Merker et al. Here is the abstract:
Giulio Tononi's Integrated Information Theory (IIT) proposes explaining consciousness by directly identifying it with integrated information. We examine the construct validity of IIT's measure of consciousness, phi (Φ), by analyzing its formal properties, its relation to key aspects of consciousness, and its co-variation with relevant empirical circumstances. Our analysis shows that IIT's identification of consciousness with the causal efficacy with which differentiated networks accomplish global information transfer (which is what Φ in fact measures) is mistaken. This misidentification has the consequence of requiring the attribution of consciousness to a range of natural systems and artifacts that include, but are not limited to, large-scale electrical power grids, gene-regulation networks, some electronic circuit boards, and social networks. Instead of treating this consequence of the theory as a disconfirmation, IIT embraces it. By regarding these systems as bearers of consciousness ex hypothesi, IIT is led towards the orbit of panpsychist ideation. This departure from science as we know it can be avoided by recognizing the functional misattribution at the heart of IIT's identity claim. We show, for example, what function is actually performed, at least in the human case, by the cortical combination of differentiation with integration that IIT identifies with consciousness. Finally, we examine what lessons may be drawn from IIT's failure to provide a credible account of consciousness for progress in the very active field of research concerned with exploring the phenomenon from formal and neural points of view.
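For readers who want a concrete handle on what "integrated information" means, here is a deliberately tiny toy sketch in Python. It is emphatically not the published Φ formalism (which has gone through several versions and is computationally expensive); it only computes how much the temporal mutual information of a whole two-node system exceeds the sum over the parts of a bipartition, to convey the flavor of the "whole beyond its parts" intuition that Merker et al. are contesting.

from collections import Counter
from itertools import product
import numpy as np

def mi_bits(pairs):
    """Mutual information (in bits) between the two coordinates of a
    list of equally probable (x, y) pairs."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * np.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def toy_integration(step):
    """Whole-minus-parts temporal mutual information for a two-node
    binary system with deterministic update rule `step`, assuming a
    uniform distribution over current states. A toy stand-in for
    'integration', not IIT's phi."""
    states = list(product([0, 1], repeat=2))
    whole = mi_bits([(s, step(s)) for s in states])
    part0 = mi_bits([(s[0], step(s)[0]) for s in states])
    part1 = mi_bits([(s[1], step(s)[1]) for s in states])
    return whole - (part0 + part1)

swap = lambda s: (s[1], s[0])       # each node copies the other
selfcopy = lambda s: (s[0], s[1])   # each node copies only itself

print(toy_integration(swap))      # 2.0 bits: information crosses the partition
print(toy_integration(selfcopy))  # 0.0 bits: the parts explain everything

Even this toy version shows why innocuous feedback networks such as power grids or gene-regulation circuits can score as "integrated": any system whose parts causally constrain one another's futures earns a positive value, which is precisely the over-attribution the abstract describes.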

Saturday, May 22, 2021

I owe my soul to the company store (Google).

As I pass through my 79th birthday, I offer a cathartic rant - without worrying about whether its parts cohere:

I am the white collar equivalent of the coal miner in Tennessee Ernie Ford's classic “16 Tons,” whose lyric line provides the title of this post... except that I am shoveling information in bytes rather than shoveling coal. The tendrils of Google extend into all my internet activities; it is a virtual prosthesis. I am symbiotic with the cloud, and losing access to it would be like losing the use of my limbs. Yuval Harari has it exactly right - Google’s A.I. knows more about me than I know about myself... what YouTube movies, programs, and classical piano performances with scrolling scores I want to watch, where I go (my Google Maps) and what I do (my Google Calendar).

Google knows what I think and write from https://mindblog.dericbownds.net/ (powered by Google’s Blogger platform), as well as from my Google Drive, which holds my important legal and financial documents. I put my piano performances on my YouTube channel at https://www.youtube.com/DericPiano. My techie son’s Google Workspace account supporting the bownds.com domain provides me with a deric@bownds.com email address and YouTube without advertisements. Underground Google fiber was recently installed on my street, and suddenly this week I have 1 Gbps down internet and a terabyte of Google Drive cloud storage for the same amount I had been paying another fiber optic provider for 300 Mbps down with no extras.

My only non-Google platform, http://www.dericbownds.net/ (the archive of my lectures, writing, and personal and laboratory history), is a historical relic dating back to geocities.com, now called "Yahoo Small Business." I tried switching mainly to Apple’s iCloud in early 2020, because it is more protective of privacy, but after a year of trying I have given up and returned to the evil empire. Apple’s web interface for word processing is slow and klutzy, has unnecessary prompts, is crash-prone, and too frequently requires logging back into iCloud.

Governance of me, along with the rest of the human herd, is quietly being transferred from nation states - whose internal conflicts render them ineffective - to an oligarchy of IT companies like Google, Apple, Microsoft, Facebook, and Amazon, along with their European and Asian counterparts. Tech companies are pretending to be governments, as in the quasi-governmental posturing of Amazon and Facebook. Their cloud-based artificial intelligence analyzes and manipulates what we want and sells it to us.

Given my numerous posts on the sociopathy of social media, you might think I would practice what I preach and abstain from having Facebook and Twitter accounts. But no, I maintain both (Deric Bownds and @DericBownds) to re-broadcast the posts I compose on the Blogger platform masquerading as mindblog.dericbownds.net. I don’t look at tweets or re-tweets on my Twitter account, but I do check Facebook for news of my family and two Facebook social groups.

I am continually amazed by my occasional looks down the rabbit hole of social media (see “How Roblox Sparked a Chaotic Music Scene”). Realizing that this is the generation that contains our future leaders reminds me of William Butler Yeats' poem “The Second Coming,” in which he seems to have seen it long ago:

Things fall apart; the centre cannot hold; 
Mere anarchy is loosed upon the world,
The blood-dimmed tide is loosed, and everywhere
The ceremony of innocence is drowned;
The best lack all conviction, while the worst
Are full of passionate intensity.
Surely some revelation is at hand; 
Surely the Second Coming is at hand.

What's not to enjoy? Sit back and read Niall Ferguson on "The Politics of Catastrophe" as our humanity slowly, surely, and chaotically merges with the great A.I. in the cloud!

Friday, May 21, 2021

Educational attainment does not influence brain aging

Well...so much for my smugness over knowing that educational attainment slows brain aging. Nyberg et al. (open source) show that it ain't so. Check out the link for some nice graphics of their data:
Education has been related to various advantageous lifetime outcomes. Here, using longitudinal structural MRI data (4,422 observations), we tested the influential hypothesis that higher education translates into slower rates of brain aging. Cross-sectionally, education was modestly associated with regional cortical volume. However, despite marked mean atrophy in the cortex and hippocampus, education did not influence rates of change. The results were replicated across two independent samples. Our findings challenge the view that higher education slows brain aging.
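For readers who want to see the shape of such a test, here is a minimal sketch in Python of the kind of longitudinal model the abstract describes, with hypothetical file and column names (the authors' actual models, covariates, and software differ). The question is carried entirely by the age-by-education interaction: does education change the slope of volume on age, not just its level?

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: one row per MRI scan, repeated scans per subject.
df = pd.read_csv("longitudinal_mri.csv")

# Mixed model: each subject gets a random intercept and a random age
# slope; the fixed age * education interaction tests whether education
# alters the *rate* of change rather than just baseline volume.
model = smf.mixedlm(
    "cortical_volume ~ age * years_education + sex",
    data=df,
    groups="subject_id",
    re_formula="~age",
)
result = model.fit()
print(result.summary())

# A reliable main effect of years_education alongside a null
# age:years_education coefficient is exactly the pattern Nyberg et al.
# report: education relates to volume cross-sectionally but does not
# slow atrophy.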

Wednesday, May 19, 2021

The curiosity circuits of our brains.

Every morning, as I am passing through the waking, exercise, and breakfast rituals that finally deliver me to my 'professor is in' office - a converted front bedroom of our house - I marvel at the parallel ritualistic behaviors of my two one-year-old Abyssinian cats, driven by an almost manic curiosity that impels them to seek out new objects and nooks and crannies they can explore, occasionally hitting the jackpot of finding a cockroach, or a new object that they can break or brush onto the floor. Curiosity is one of the most important innate drives that they or I possess, and many think it should be elevated to join the list of the four F's we teach first-year medical students (fighting, feeding, fleeing, and fornicating). As Farahbakhsh and Siciliano note in their perspectives article on the work of Ahmadlou et al., "Attraction to the unknown, or curiosity, is a prerequisite for higher-order knowledge. Innate attraction to novelty is thought to be an evolutionary prerequisite for complex learning and guides organisms toward acquisition of adaptive behavioral repertoires."

Ahmadlou et al. have found circuitry in the mouse brain that is necessary for the exploration of new objects and conspecifics. A specific population of genetically identified γ-aminobutyric acid (GABA)-ergic neurons in a brain region called the zona incerta receives excitatory input in the form of novelty and/or arousal information from the prelimbic cortex, and these neurons send inhibitory projections to the periaqueductal gray region. Here is a summary graphic from the perspectives article:

[Figure: summary graphic from the perspectives article showing the prelimbic cortex to zona incerta (GABAergic) to periaqueductal gray circuit]

Tuesday, May 18, 2021

Liberal fact-master nerds go down the rabbit hole of a social media conspiracy.

I pass on this clip from the Morning Dispatch, sent by a friend. Check out the link to the NYTimes article it refers to. Ben Smith describes how hundreds of “Jeopardy!” contestants talked themselves into a baseless conspiracy theory — and won’t be talked out of it.
It’s tempting to believe that conspiratorial, tin-hat thinking is something only other people are susceptible to—especially people less educated and more credulous than we imagine ourselves to be. Which is why this piece from the New York Times’s Ben Smith is so fascinating in its depiction of a moral panic that descended on a small online community of former winners of the quiz show Jeopardy! after a contestant who had just won his third game held up three fingers on his right hand—a gesture which, the contestants quickly decided, was likely some sort of white power symbol. “The element of this story that interests me most is how the beating heart of nerdy, liberal fact-mastery can pump blood into wild social media conspiracy, and send all these smart people down the sort of rabbit hole that leads other groups of Americans to believe that children are being transported inside refrigerators,” Smith wrote. “It reflects a depth of alienation among Americans, in which our warring tribes squint through the fog at one another for mysterious and abstruse signs of malice.”