This blog reports new ideas and work on mind, brain, behavior, psychology, and politics - as well as random curious stuff. (Try the Dynamic Views at top of right column.)
Monday, August 29, 2011
Estimates of social influence - the "unfriending problem"
In the latest issue (Aug 26) of Science Magazine Barbara Jasny does a nice summary of recent work by Noel and Nyhan:
Studies of social influences on behavior have led to the idea that a range of characteristics from loneliness to obesity might be contagious. A significant problem for the field has been to distinguish effects due to similarities between people (homophily) from social influence. One strategy for doing this has been to look at changes that occur over time. However, such studies have been the subject of considerable debate, and Noel and Nyhan now add a cautionary note. Their analyses of a model used in past social contagion studies suggest that previous investigations have not fully controlled for the possibility that friendship formation and termination are dynamic processes, and friendships between people who are more similar may tend to be more stable over time. Or to put it in Facebook terms, friendships between people who are less similar may be less stable, and therefore may result in “unfriending.” Homophily might thus be having a larger effect than appreciated, and under certain conditions could account for most of the contagion effects observed. They conclude that this unfriending problem renders a determination of causality much more complicated in longitudinal social network data.
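To see why this matters, here is a minimal toy simulation of how homophily plus dynamic friendships can masquerade as contagion in longitudinal network data. It is my own illustrative sketch, not Noel and Nyhan's actual model, and every parameter in it is invented: a trait spreads with no social influence at all, yet because friendships form homophilously and dissimilar friendships dissolve faster ("unfriending"), a naive estimate over surviving ties still shows egos adopting the trait more often when their friends already have it.

```python
# Toy illustration (parameters invented; not Noel and Nyhan's model): no social
# influence exists, yet homophilous tie formation plus faster dissolution of
# dissimilar ties produces an apparent "contagion" effect in longitudinal data.
import random

random.seed(1)
N, T = 2000, 10
propensity = [random.random() for _ in range(N)]   # hidden similarity dimension
trait = [p > 0.8 for p in propensity]              # initial adopters; no influence anywhere

# homophilous tie formation: friends tend to have similar propensities
edges = set()
while len(edges) < 4000:
    i, j = random.randrange(N), random.randrange(N)
    if i != j and abs(propensity[i] - propensity[j]) < 0.3:
        edges.add((min(i, j), max(i, j)))

adopt = [0, 0]   # adoptions by egos whose friend [lacked, had] the trait
pairs = [0, 0]   # at-risk ego-friend pairs in the same two categories

for t in range(T):
    old = trait[:]
    # each person adopts (or not) based only on their own hidden propensity
    trait = [tr or (random.random() < 0.05 * p) for tr, p in zip(old, propensity)]
    # "unfriending": dissimilar friendships dissolve much more often
    edges = {(i, j) for (i, j) in edges
             if random.random() > (0.05 if old[i] == old[j] else 0.40)}
    # naive contagion estimate over the surviving ties
    for i, j in edges:
        for ego, alter in ((i, j), (j, i)):
            if not old[ego]:
                pairs[old[alter]] += 1
                adopt[old[alter]] += trait[ego]

for k, label in ((0, "friend without trait"), (1, "friend with trait")):
    print(f"adoption rate, {label}: {adopt[k] / max(pairs[k], 1):.3f}")
```

Running this, egos whose surviving friends already have the trait adopt it at a noticeably higher rate, even though each adoption depends only on the person's own propensity - exactly the kind of spurious "contagion" the unfriending problem makes hard to rule out.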
Friday, August 26, 2011
Synthesis of new brain cells and depression.
In rodent models of depression, antidepressant drugs are effective only if the hippocampus is able to generate new nerve cells (neurogenesis), suggesting an association between adult hippocampal neurogenesis and depression. Snyder et al. have done the direct experiment, using a genetic trick to make mouse hippocampal cells sensitive to the antiviral drug valganciclovir, which then inhibits their proliferation. Valganciclovir treatment of the genetically altered mice almost completely abolished hippocampal neurogenesis. Their results support a direct role for adult neurogenesis in depressive illness. Here is their abstract:
Glucocorticoids are released in response to stressful experiences and serve many beneficial homeostatic functions. However, dysregulation of glucocorticoids is associated with cognitive impairments and depressive illness. In the hippocampus, a brain region densely populated with receptors for stress hormones, stress and glucocorticoids strongly inhibit adult neurogenesis. Decreased neurogenesis has been implicated in the pathogenesis of anxiety and depression, but direct evidence for this role is lacking. Here we show that adult-born hippocampal neurons are required for normal expression of the endocrine and behavioural components of the stress response. Using either transgenic or radiation methods to inhibit adult neurogenesis specifically, we find that glucocorticoid levels are slower to recover after moderate stress and are less suppressed by dexamethasone in neurogenesis-deficient mice than intact mice, consistent with a role for the hippocampus in regulation of the hypothalamic–pituitary–adrenal (HPA) axis. Relative to controls, neurogenesis-deficient mice also showed increased food avoidance in a novel environment after acute stress, increased behavioural despair in the forced swim test, and decreased sucrose preference, a measure of anhedonia. These findings identify a small subset of neurons within the dentate gyrus that are critical for hippocampal negative control of the HPA axis and support a direct role for adult neurogenesis in depressive illness.
Thursday, August 25, 2011
Brain excitation/inhibition balance and social dysfunction
Yates does a review of recent work by Deisseroth and colleagues, who have now shown that in mice an elevation in the excitation/inhibition ratio in the medial prefrontal cortex (mPFC) impairs cellular information processing and leads to specific behavioral impairments. They engineered different forms of opsin molecules and expressed them in different excitatory and inhibitory mPFC neuronal populations; by flashing the cortex with different wavelengths of light they could increase levels of either excitation or inhibition. Here is their abstract:
Severe behavioural deficits in psychiatric diseases such as autism and schizophrenia have been hypothesized to arise from elevations in the cellular balance of excitation and inhibition (E/I balance) within neural microcircuitry. This hypothesis could unify diverse streams of pathophysiological and genetic evidence, but has not been susceptible to direct testing. Here we design and use several novel optogenetic tools to causally investigate the cellular E/I balance hypothesis in freely moving mammals, and explore the associated circuit physiology. Elevation, but not reduction, of cellular E/I balance within the mouse medial prefrontal cortex was found to elicit a profound impairment in cellular information processing, associated with specific behavioural impairments and increased high-frequency power in the 30–80 Hz range, which have both been observed in clinical conditions in humans. Consistent with the E/I balance hypothesis, compensatory elevation of inhibitory cell excitability partially rescued social deficits caused by E/I balance elevation. These results provide support for the elevated cellular E/I balance hypothesis of severe neuropsychiatric disease-related symptoms.
Wednesday, August 24, 2011
A unified bottleneck in our brains limits our attention
It has been a common assumption that different tasks requiring our attention, like making perceptual distinctions or choosing actions, are limited by the brain areas most associated with those functions. Tombu et al. now find instead a unified attentional bottleneck that includes the inferior frontal junction, superior medial frontal cortex, and bilateral insula.
Human information processing is characterized by bottlenecks that constrain throughput. These bottlenecks limit both what we can perceive and what we can act on in multitask settings. Although perceptual and response limitations are often attributed to independent information processing bottlenecks, it has recently been suggested that a common attentional limitation may be responsible for both. To date, however, evidence supporting the existence of such a “unified” bottleneck has been mixed. Here, we tested the unified bottleneck hypothesis using time-resolved fMRI. The first experiment isolated brain regions involved in the response selection bottleneck that limits speeded dual-task performance. These same brain regions were not only engaged by a perceptual encoding task in a second experiment, their activity also tracked delays to a speeded decision-making task caused by concurrent perceptual encoding in a third experiment. We conclude that a unified attentional bottleneck, including the inferior frontal junction, superior medial frontal cortex, and bilateral insula, temporally limits operations as diverse as perceptual encoding and decision-making.
Blog Categories: acting/choosing, attention/perception
Tuesday, August 23, 2011
New views on cancer - 99% of the functioning genes in our bodies are not ‘ours’.
They are the genes of bacteria and fungi that have evolved with us in a symbiotic relationship. This fascinating factoid is from an article by George Johnson describing fundamental changes in the way researchers are viewing the cancer process, as the reigning model - that “Through a series of random mutations, genes that encourage cellular division are pushed into overdrive, while genes that normally send growth-restraining signals are taken offline” - is supplemented by a number of subtle variations:
..genes in this microbiome — [of bacteria and fungi] exchanging messages with genes inside human cells — may be involved with cancers of the colon, stomach, esophagus and other organs...The idea that people in different regions of the world have co-evolved with different microbial ecosystems may be a factor — along with diet, lifestyle and other environmental agents — in explaining why they are often subject to different cancers. The article lists a number of further ideas, involving various classes of small or micro RNAs; here's a great sentence:
...Most DNA…was long considered junk … Only about 2 percent of the human genome carries the code for making enzymes and other proteins…These days “junk” DNA is referred to more respectfully as “noncoding” DNA, and researchers are finding clues that “pseudogenes” lurking within this dark region may play a role in cancer.
...With so much internal machinery, malignant tumors are now being compared to renegade organs sprouting inside the body…[they] contain healthy cells that have been conscripted into the cause. Cells called fibroblasts collaborate by secreting proteins the tumor needs to build its supportive scaffolding and expand into surrounding tissues. Immune system cells, maneuvered into behaving as if they were healing a wound, emit growth factors that embolden the tumor and stimulate angiogenesis, the generation of new blood vessels. Endothelial cells, which form the lining of the circulatory system, are also enlisted in the construction of the tumor’s own blood supply.
...other exotic players: lincRNA, (for large intervening noncoding), siRNA (small interfering), snoRNA (small nucleolar) and piRNA (Piwi-interacting (short for “P-element induced wimpy testis” (a peculiar term that threatens to pull this sentence into a regress of nested parenthetical explanations))).
Monday, August 22, 2011
Trying to live forever - Centenarians have plenty of bad habits
Here are two recent bits on aging:
O'Connor points to a study that
...focused on Ashkenazi Jews, a group that is more genetically homogeneous than other populations, making it easier to identify genetic differences that contribute to life span. In the study, the researchers followed 477 Ashkenazi centenarians who were 95 or older and living independently. They asked them about their habits and the ways they lived when they were younger. Using data collected in the 1970s, the researchers compared the long-lived group with another group of 3,000 people in the general population who were born around the same time but who generally did not make it to age 95...They found that the people who lived to 95 and beyond did not seem to exhibit healthier lifestyles than those who died younger. The article continues to discuss social, personality, and genetic factors influencing longevity. The take-home message is that people with the genes for longevity live past age 95 with habits no different from most others, but the average person would probably have to follow a healthy lifestyle to live comfortably past 80.
And, here is a bit of sanity, from Gary Gutting, on trying to live forever. He emphasizes that correlations do not prove causes (lower HDL levels correlate with more heart attacks, but clinical studies show that raising HDL (good) cholesterol with drugs does nothing to protect against heart attacks). He argues against chasing after the latest dietary supplement whose relevance is implied from correlation studies ('It can't hurt, it might help'... which I'm guilty of), and for simply following the humdrum standard advice we’ve heard all our lives about eating sensibly, exercising regularly, and having recommended medical tests and exams. Apart from that, "how we die is a crap-shoot, and, short of avoiding obvious risks such as smoking and poor diet, there’s little we can do to load the dice."
Friday, August 19, 2011
Neoteny - how long does our prefrontal cortex stay young?
When I first looked at the title "Extraordinary neoteny of synaptic spines in the human prefrontal cortex", I excitedly thought "Great, I'm going to learn that my 69-year-old prefrontal cortex is still crafting and pruning synapses." Alas, by extraordinary the authors mean that the 2-3 fold decrease in the density of dendritic spines, previously thought to be largely complete by the end of adolescence, continues well into the third decade of life before stabilizing at the adult level.
The major mechanism for generating diversity of neuronal connections beyond their genetic determination is the activity-dependent stabilization and selective elimination of the initially overproduced synapses [Changeux JP, Danchin A (1976) Nature 264:705–712]. The largest number of supranumerary synapses has been recorded in the cerebral cortex of human and nonhuman primates. It is generally accepted that synaptic pruning in the cerebral cortex, including prefrontal areas, occurs at puberty and is completed during early adolescence [Huttenlocher PR, et al. (1979) Brain Res 163:195–205]. In the present study we analyzed synaptic spine density on the dendrites of layer IIIC cortico–cortical and layer V cortico–subcortical projecting pyramidal neurons in a large sample of human prefrontal cortices in subjects ranging in age from newborn to 91 y. We confirm that dendritic spine density in childhood exceeds adult values by two- to threefold and begins to decrease during puberty. However, we also obtained evidence that overproduction and developmental remodeling, including substantial elimination of synaptic spines, continues beyond adolescence and throughout the third decade of life before stabilizing at the adult level. Such an extraordinarily long phase of developmental reorganization of cortical neuronal circuitry has implications for understanding the effect of environmental impact on the development of human cognitive and emotional capacities as well as the late onset of human-specific neuropsychiatric disorders.
Thursday, August 18, 2011
The dark side of emotion in decision making.
A mindblog reader emailed me pointing out this 2005 publication (from before MindBlog started up) by Bechara and collaborators on the role of emotion in making decisions in risky situations (rather relevant to our current financial crisis, with investors rushing like lemmings to drive the market emotionally into huge up or down swings). Disabling normal emotional reactivity, by either brain lesions or substance abuse, leads people to make more advantageous decisions in risky situations. (In a 2009 post I noted Bechara's more recent work on reward processing in different parts of the brain.)
Can dysfunction in neural systems subserving emotion lead, under certain circumstances, to more advantageous decisions? To answer this question, we investigated how individuals with substance dependence (ISD), patients with stable focal lesions in brain regions related to emotion (lesion patients), and normal participants (normal controls) made 20 rounds of investment decisions. Like lesion patients, ISD made more advantageous decisions and ultimately earned more money from their investments than the normal controls. When normal controls either won or lost money on an investment round, they adopted a conservative strategy and became more reluctant to invest on the subsequent round, suggesting that they were more affected than lesion patients and ISD by the outcomes of decisions made in the previous rounds.
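The sense in which investing is the "advantageous" choice in this kind of task comes down to simple expected value. The abstract does not give the payoffs, so the numbers in this little sketch are illustrative assumptions (a $1 stake per round on a fair coin flip that either loses the dollar or returns $2.50), not figures taken from the paper:

```python
# Expected-value sketch for a 20-round investment task of the kind described
# above. The stake, win amount, and win probability are illustrative assumptions.
stake, win_return, p_win, rounds = 1.00, 2.50, 0.5, 20

ev_per_round = p_win * (win_return - stake) + (1 - p_win) * (-stake)
print(f"expected value of investing in one round: ${ev_per_round:+.2f}")           # +$0.25
print(f"expected gain from investing every round: ${rounds * ev_per_round:.2f}")   # $5.00
print(f"expected gain if loss-aversion keeps you out of half the rounds: "
      f"${0.5 * rounds * ev_per_round:.2f}")                                       # $2.50
```

On numbers like these every round is worth a positive expected amount, so the reluctance to reinvest after a win or a loss that the normal controls showed is exactly what leaves money on the table.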
Wednesday, August 17, 2011
Why worry? It's good for you.
I've been meaning to point out an interesting piece by Robert Frank in the business section of the NYTimes, on a subject mindblog has touched on in several posts. It's a bit of a gloss, but I pull out a few clips:
…people are particularly inept at predicting how changes in their life circumstances will affect their happiness. Even when the changes are huge — positive or negative — most people adapt much more quickly and completely than they expected…Paradoxically, our prediction errors often lead us to choices that are wisest in hindsight. In such cases, evolutionary biology often provides a clearer guide than cognitive psychology for thinking about why people behave as they do…the brain has evolved not to make us happy, but to motivate actions that help push our DNA into the next round. Much of the time, in fact, the brain accomplishes that by making us unhappy. Anxiety, hunger, fatigue, loneliness, thirst, anger and fear spur action to meet the competitive challenges we face…pleasure is an inherently fleeting emotion, one we experience while escaping from emotionally aversive states. In other words, pleasure is the carrot that provokes us to extricate ourselves from such states, but it almost always fades quickly…The human brain was formed by relentless competition in the natural world, so it should be no surprise that we adapt quickly to changes in circumstances.
Most people would love to have a job with interesting, capable colleagues, a high level of autonomy and ample opportunities for creative expression. But only a limited number of such jobs are available — and it’s our fretting that can motivate us to get them....Within limits, worry about success causes students to study harder to gain admission to better universities. It makes assistant professors work harder to earn tenure. It leads film makers to strive harder to create the perfect scene, and songwriters to dig deeper for the most pleasing melody. In every domain, people who work harder are more likely to succeed professionally, more likely to make a difference...The anxiety we feel about whether we’ll succeed is evolution’s way of motivating us.
Blog Categories: acting/choosing, happiness, motivation/reward
Tuesday, August 16, 2011
Class warfare and voting
I thought this cartoon was nicely done, and for days I have been debating whether to pass it on in a post... so, here it is (click to enlarge).
And,while I'm at it, I'll also pass on a George Carlin video a friend sent me that is hysterical, but has (be warned) VERY offensive language.
Information and ideas are not the same thing!
Neal Gabler does a terrific opinion piece in this past Sunday's NYTimes on how our culture increasingly follows present-centered and transient flashes of information at the expense of integrative ideas and metaphors. It hit me between the eyes, resonating with my own frustration over feeling that I am constantly awash in streams of information chunks that do not cohere - are not integrated into perceived patterns and overarching ideas. It was a reaffirmation of my recent decision to test the effect of going cold turkey for a while - to shut off my daily cruising of the Huffington Post and several other aggregators and news feeds. To stop watching Jon Stewart's Daily Show, the Colbert Report, and the evening news. Already I can feel a detoxification process settling in, a slightly calmer mind. Gabler starts by noting that The Atlantic's “14 Biggest Ideas of the Year” are not in fact ideas, they are observations (sample: “Wall Street: Same as it Ever Was”). Here are some clips from Gabler's article:
Ideas just aren’t what they used to be. Once upon a time, they could ignite fires of debate, stimulate other thoughts, incite revolutions and fundamentally change the ways we look at and think about the world…They could penetrate the general culture and make celebrities out of thinkers — notably Albert Einstein, but also Reinhold Niebuhr, Daniel Bell, Betty Friedan, Carl Sagan and Stephen Jay Gould, to name a few. The ideas themselves could even be made famous: for instance, for “the end of ideology,” “the medium is the message,” “the feminine mystique,” “the Big Bang theory,” “the end of history.”…we are living in an increasingly post-idea world — a world in which big, thought-provoking ideas that can’t instantly be monetized are of so little intrinsic value that fewer people are generating them and fewer outlets are disseminating them, the Internet notwithstanding.
…especially here in America...we live in a post-Enlightenment age in which rationality, science, evidence, logical argument and debate have lost the battle in many sectors, and perhaps even in society generally, to superstition, faith, opinion and orthodoxy. While we continue to make giant technological advances, we may be the first generation to have turned back the epochal clock — to have gone backward intellectually from advanced modes of thinking into old modes of belief. But post-Enlightenment and post-idea, while related, are not exactly the same...Post-Enlightenment refers to a style of thinking that no longer deploys the techniques of rational thought. Post-idea refers to thinking that is no longer done, regardless of the style.
We live in the much vaunted Age of Information. Courtesy of the Internet, we seem to have immediate access to anything that anyone could ever want to know…In the past, we collected information not simply to know things….We also collected information to convert it into something larger than facts and ultimately more useful — into ideas that made sense of the information..But if information was once grist for ideas, over the last decade it has become competition for them. We are like the farmer who has too much wheat to make flour. We are inundated with so much information that we wouldn’t have time to process it even if we wanted to, and most of us don’t want to…We prefer knowing to thinking because knowing has more immediate value. It keeps us in the loop, keeps us connected to our friends and our cohort. Ideas are too airy, too impractical, too much work for too little reward. Few talk ideas. Everyone talks information, usually personal information. Where are you going? What are you doing? Whom are you seeing? These are today’s big questions.
…social networking sites are the primary form of communication among young people, and they are supplanting print, which is where ideas have typically gestated. …social networking sites engender habits of mind that are inimical to the kind of deliberate discourse that gives rise to ideas. Instead of theories, hypotheses and grand arguments, we get instant 140-character tweets about eating a sandwich or watching a TV show.
…We have become information narcissists, so uninterested in anything outside ourselves and our friendship circles or in any tidbit we cannot share with those friends that if a Marx or a Nietzsche were suddenly to appear, blasting his ideas, no one would pay the slightest attention, certainly not the general media, which have learned to service our narcissism.
Monday, August 15, 2011
How google affects our memory.
Daniel Wegner can be counted on to always be coming up with interesting stuff. Here he does a series of experiments showing how google is taking a load off our explicit memory storage habits (a shift of the sort that occurred in the transition from the oral tradition to writing). As most of us know from our daily experience, google is replacing books and encyclopedias as our main group or transactive memory, and we increasingly remember where information is stored better than we remember the information itself:
The advent of the Internet, with sophisticated algorithmic search engines, has made accessing information as easy as lifting a finger. No longer do we have to make costly efforts to find the things we want. We can “Google” the old classmate, find articles online, or look up the actor who was on the tip of our tongue. The results of four studies suggest that when faced with difficult questions, people are primed to think about computers and that when people expect to have future access to information, they have lower rates of recall of the information itself and enhanced recall instead for where to access it. The Internet has become a primary form of external or transactive memory, where information is stored collectively outside ourselves.
Blog Categories: culture/politics, memory/learning, technology
Friday, August 12, 2011
The art of musical notation - scoring outside the lines
Pat Muchmore writes a fascinating piece on musical scores that are presented in a more exotic form than the standard music clef notations that I and other musicians have spent many thousands of hours with. He uses the term ergodic notations,
...which I derive from the game and literary theorist Espen J. Aarseth’s phrase “ergodic literature.” These are writings that require some amount of effort to read beyond simply moving one’s eyes and flipping pages. There are ancient examples, such as Egyptian texts that span several walls across several rooms or, more recently, Islamic calligrams that render Arabic words like Allah and Bismillah in many different directions and scales. The article contains numerous modern examples of ergodic scores, and also notes:
Ergodic notation is not new. Baude Cordier, a composer of the ars subtilior style, wrote during the first half of the 15th century. He created many graphic scores, one of the most elegant of which is for a piece called “Belle, bonne, sage.” The article provides further examples of ergodic notations from modern composers George Crumb, Peter Maxwell Davies, and John Cage. It also includes notation and audio files of a composition by the author.
“Belle, bonne, sage” by Baude Cordier.
It’s a love song, so it’s rendered in the shape of a heart. The performance is essentially unaffected by the shape, but we needn’t condemn it — it’s a beautiful addition to the artwork. Furthermore, not every visual element is purely decorative: the red notes indicate a rhythmic alteration that was otherwise very difficult to notate at the time.
Thursday, August 11, 2011
Cowboys and Pit Crews
Yesterday's posting on Atul Gawande's writing has reminded me of his more recent essay "Cowboys and Pit Crews," on medical practice, which has been languishing in my list of potential posts:
The core structure of medicine—how health care is organized and practiced—emerged in an era when doctors could hold all the key information patients needed in their heads and manage everything required themselves. One needed only an ethic of hard work, a prescription pad, a secretary, and a hospital willing to serve as one’s workshop, loaning a bed and nurses for a patient’s convalescence, maybe an operating room with a few basic tools. We were craftsmen. We could set the fracture, spin the blood, plate the cultures, administer the antiserum. The nature of the knowledge lent itself to prizing autonomy, independence, and self-sufficiency among our highest values, and to designing medicine accordingly. But you can’t hold all the information in your head any longer, and you can’t master all the skills. No one person can work up a patient’s back pain, run the immunoassay, do the physical therapy, protocol the MRI, and direct the treatment of the unexpected cancer found growing in the spine.
Everyone has just a piece of patient care. We’re all specialists now—even primary-care doctors. A structure that prioritizes the independence of all those specialists will have enormous difficulty achieving great care.
We don’t have to look far for evidence. Two million patients pick up infections in American hospitals, most because someone didn’t follow basic antiseptic precautions. Forty per cent of coronary-disease patients and sixty per cent of asthma patients receive incomplete or inappropriate care. And half of major surgical complications are avoidable with existing knowledge. It’s like no one’s in charge—because no one is. The public’s experience is that we have amazing clinicians and technologies but little consistent sense that they come together to provide an actual system of care, from start to finish, for people. We train, hire, and pay doctors to be cowboys. But it’s pit crews people need.
Recently, you might be interested to know, I met an actual cowboy. He described to me how cowboys do their job today, herding thousands of cattle. They have tightly organized teams, with everyone assigned specific positions and communicating with each other constantly. They have protocols and checklists for bad weather, emergencies, the inoculations they must dispense. Even the cowboys, it turns out, function like pit crews now. It may be time for us to join them.
Wednesday, August 10, 2011
Atul Gawande on aging.
I've been assembling a short list of possible essay/lecture topics, and one of the putative titles is "You're gonna die... get over it." It would be in the spirit of a crisp and clear essay in The New Yorker by Atul Gawande titled "The way we age now," which I have posted before and re-post here:
...one of the best articles on aging that I have read, written by Atul Gawande (Asst. Prof. in the Harvard School of Public Health, and staff writer for the New Yorker Magazine). The article appears in the April 30 issue of the New Yorker.
Some clips:
Even though some genes have been shown to influence longevity in worms, fruit flies, and mice..
...scientists do not believe that our life spans are actually programmed into us.. (Deric note: in the post I just prepared for next Tuesday, this point is contested). After all, for most of our hundred-thousand-year existence—all but the past couple of hundred years—the average life span of human beings has been thirty years or less...Today, the average life span in developed countries is almost eighty years. If human life spans depend on our genetics, then medicine has got the upper hand. We are, in a way, freaks living well beyond our appointed time. So when we study aging what we are trying to understand is not so much a natural process as an unnatural one...Gawande proceeds to a discussion of social and medical consequences of people over 65 becoming 20% of the population.
...complex systems—power plants, say—have to survive and function despite having thousands of critical components. Engineers therefore design these machines with multiple layers of redundancy: with backup systems, and backup systems for the backup systems. The backups may not be as efficient as the first-line components, but they allow the machine to keep going even as damage accumulates...within the parameters established by our genes, that’s exactly how human beings appear to work. We have an extra kidney, an extra lung, an extra gonad, extra teeth. The DNA in our cells is frequently damaged under routine conditions, but our cells have a number of DNA repair systems. If a key gene is permanently damaged, there are usually extra copies of the gene nearby. And, if the entire cell dies, other cells can fill in.
Nonetheless, as the defects in a complex system increase, the time comes when just one more defect is enough to impair the whole, resulting in the condition known as frailty. It happens to power plants, cars, and large organizations. And it happens to us: eventually, one too many joints are damaged, one too many arteries calcify. There are no more backups. We wear down until we can’t wear down anymore.
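A back-of-the-envelope way to see the redundancy point (the failure probability and backup counts below are made up purely for illustration, not taken from the article): a function covered by several independent backups almost never fails outright, but each defect that removes a backup multiplies the odds that the next one takes the whole function down.

```python
# Toy numbers illustrating redundancy and frailty (all values invented): a
# function is lost only if the primary component and every remaining backup fail.
p_fail = 0.30   # chance that any single component fails

for backups in (4, 3, 2, 1, 0):
    p_function_lost = p_fail ** (backups + 1)
    print(f"{backups} backups left -> chance the function is lost: {p_function_lost:.4f}")
```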
Improvements in the treatment and prevention of heart disease, respiratory illness, stroke, cancer, and the like mean that the average sixty-five-year-old can expect to live another nineteen years—almost four years longer than was the case in 1970. (By contrast, from the nineteenth century to 1970, sixty-five-year-olds gained just three years of life expectancy.)
The result has been called the “rectangularization” of survival. Throughout most of human history, a society’s population formed a sort of pyramid: young children represented the largest portion—the base—and each successively older cohort represented a smaller and smaller group. In 1950, children under the age of five were eleven per cent of the U.S. population, adults aged forty-five to forty-nine were six per cent, and those over eighty were one per cent. Today, we have as many fifty-year-olds as five-year-olds. In thirty years, there will be as many people over eighty as there are under five.
Americans haven’t come to grips with the new demography. We cling to the notion of retirement at sixty-five—a reasonable notion when those over sixty-five were a tiny percentage of the population, but completely untenable as they approach twenty per cent. People are putting aside less in savings for old age now than they have in any decade since the Great Depression. More than half of the very old now live without a spouse, and we have fewer children than ever before—yet we give virtually no thought to how we will live out our later years alone.
...medicine has been slow to confront the very changes that it has been responsible for—or to apply the knowledge we already have about how to make old age better. Despite a rapidly growing elderly population, the number of certified geriatricians fell by a third between 1998 and 2004.
Tuesday, August 09, 2011
Do 18-month-old humans have a theory of mind?
Senju et al., following up on an experiment by Meltzoff and Brooks, use a rather clever experimental design to show that 18-month-old children can attribute false beliefs to others, a capacity previously thought to appear only at 3-4 years of age:
In the research reported here, we investigated whether 18-month-olds would use their own past experience of visual access to attribute perception and consequent beliefs to other people. Infants in this study wore either opaque blindfolds (opaque condition) or trick blindfolds that looked opaque but were actually transparent (trick condition). Then both groups of infants observed an actor wearing one of the same blindfolds that they themselves had experienced, while a puppet removed an object from its location. Anticipatory eye movements revealed that infants who had experienced opaque blindfolds expected the actor to behave in accordance with a false belief about the object’s location, but that infants who had experienced trick blindfolds did not exhibit that expectation. Our results suggest that 18-month-olds used self-experience with the blindfolds to assess the actor’s visual access and to update her belief state accordingly. These data constitute compelling evidence that 18-month-olds infer perceptual access and appreciate its causal role in altering the epistemic states of other people.
Monday, August 08, 2011
Effects of oxytocin in humans - a critical review
Over the past several years MindBlog has posted examples from the outpouring of work on the "trust hormone" oxytocin. Trends in Cognitive Science offers open access to this more critical and balanced review by Bartz et al. Their abstract:
Building on animal research, the past decade has witnessed a surge of interest in the effects of oxytocin on social cognition and prosocial behavior in humans. This work has generated considerable excitement about identifying the neurochemical underpinnings of sociality in humans, and discovering compounds to treat social functioning deficits. Inspection of the literature, however, reveals that the effects of oxytocin in the social domain are often weak and/or inconsistent. We propose that this literature can be informed by an interactionist approach in which the effects of oxytocin are constrained by features of situations and/or individuals. We show how this approach can improve understanding of extant research, suggest novel mechanisms through which oxytocin might operate, and refine predictions about oxytocin pharmacotherapy. By the way, the same issue of Trends in Cognitive Science has a brief note by van Honk et al. on testosterone as a social hormone, also noting the complexity of hormone-behavior relationships (PDF here).
Blog Categories: emotion, happiness, social cognition
Friday, August 05, 2011
Macho mice make manly melodies.
Susan Reardon points to work of Pasch et al. at the University of Florida at Gainesville, who compared the songs of castrated male mice (singing mice from Costa Rica) with those of males given a male hormone implant. Females were attracted to speakers playing recordings of the songs of the hormonally enhanced males. (Audio file here, video file in links above.)
Thursday, August 04, 2011
Boredom - a Lively History
Peter Toohey's book with the title of this post is reviewed by Anthony Gottlieb in the NYTimes:
In Oscar Wilde’s play “A Woman of No Importance,” Lord Illingworth says of society: “To be in it is merely a bore. But to be out of it simply a tragedy.” To be a bore oneself is the ultimate failing and makes one the target for a quintessentially English put-down. “Even the grave yawns for him,” the actor and theater manager Sir Herbert Beerbohm Tree once said of an earnest writer. ...it was (and still is) regarded in some quarters as stylish and rather aristocratic to suffer from boredom, so the English ought really to thank their bores for providing them with the occasion to display wit and appear grand.
Toohey...suggests that the unpleasant feeling of simple boredom developed as a warning signal to steer us away from social situations that are “confined, predictable, too samey for one’s sanity.” In other words, it is a useful aversion: the discomfort of boredom is a blessing in disguise...a colleague of his once argued that there isn’t really any such thing as boredom, just a blurring together of a constellation of feelings and moods — frustration, surfeit, apathy and the like. Toohey rejects this idea, and perhaps there is indeed little harm in keeping the word, provided that one is vigilantly aware of the loose, subjective and confusing ways in which it is often used. When the actor George Sanders — the archetypal cad, at least on-screen, and in the title of his autobiography — committed suicide in a Spanish hotel in 1972, he left a note that began: “Dear World, I am leaving because I am bored.” It is worth noting that he was ill, lonely and had sold his beloved house on Majorca. Was boredom really what his death was about? When a man says he is bored — as Oscar Wilde never quite got round to saying — it sometimes means that he cannot be bothered to tell you what really ails him.
Blog Categories: culture/politics, emotion, motivation/reward