Ideas just aren’t what they used to be. Once upon a time, they could ignite fires of debate, stimulate other thoughts, incite revolutions and fundamentally change the ways we look at and think about the world…They could penetrate the general culture and make celebrities out of thinkers — notably Albert Einstein, but also Reinhold Niebuhr, Daniel Bell, Betty Friedan, Carl Sagan and Stephen Jay Gould, to name a few. The ideas themselves could even be made famous: for instance, for “the end of ideology,” “the medium is the message,” “the feminine mystique,” “the Big Bang theory,” “the end of history.”…we are living in an increasingly post-idea world — a world in which big, thought-provoking ideas that can’t instantly be monetized are of so little intrinsic value that fewer people are generating them and fewer outlets are disseminating them, the Internet notwithstanding.
…especially here in America...we live in a post-Enlightenment age in which rationality, science, evidence, logical argument and debate have lost the battle in many sectors, and perhaps even in society generally, to superstition, faith, opinion and orthodoxy. While we continue to make giant technological advances, we may be the first generation to have turned back the epochal clock — to have gone backward intellectually from advanced modes of thinking into old modes of belief. But post-Enlightenment and post-idea, while related, are not exactly the same...Post-Enlightenment refers to a style of thinking that no longer deploys the techniques of rational thought. Post-idea refers to thinking that is no longer done, regardless of the style.
We live in the much vaunted Age of Information. Courtesy of the Internet, we seem to have immediate access to anything that anyone could ever want to know…In the past, we collected information not simply to know things….We also collected information to convert it into something larger than facts and ultimately more useful — into ideas that made sense of the information..But if information was once grist for ideas, over the last decade it has become competition for them. We are like the farmer who has too much wheat to make flour. We are inundated with so much information that we wouldn’t have time to process it even if we wanted to, and most of us don’t want to…We prefer knowing to thinking because knowing has more immediate value. It keeps us in the loop, keeps us connected to our friends and our cohort. Ideas are too airy, too impractical, too much work for too little reward. Few talk ideas. Everyone talks information, usually personal information. Where are you going? What are you doing? Whom are you seeing? These are today’s big questions.
…social networking sites are the primary form of communication among young people, and they are supplanting print, which is where ideas have typically gestated. …social networking sites engender habits of mind that are inimical to the kind of deliberate discourse that gives rise to ideas. Instead of theories, hypotheses and grand arguments, we get instant 140-character tweets about eating a sandwich or watching a TV show.
…We have become information narcissists, so uninterested in anything outside ourselves and our friendship circles or in any tidbit we cannot share with those friends that if a Marx or a Nietzsche were suddenly to appear, blasting his ideas, no one would pay the slightest attention, certainly not the general media, which have learned to service our narcissism.
This blog reports new ideas and work on mind, brain, behavior, psychology, and politics - as well as random curious stuff.
Tuesday, August 16, 2011
Information and ideas are not the same thing!
Neal Gabler has a terrific opinion piece in this past Sunday's NYTimes on how our culture increasingly follows present-centered and transient flashes of information at the expense of integrative ideas and metaphors. It hit me between the eyes, resonating with my own frustration over feeling that I am constantly awash in streams of information chunks that do not cohere - that are not integrated into perceived patterns and overarching ideas. It was a reaffirmation of my recent decision to test the effect of going cold turkey for a while - to shut off my daily cruising of the Huffington Post and several other aggregators and news feeds, and to stop watching Jon Stewart's Daily Show, the Colbert Report, and the evening news. Already I can feel a detoxification process settling in, a slightly calmer mind. Gabler starts by noting that The Atlantic's “14 Biggest Ideas of the Year” are not in fact ideas; they are observations (sample: “Wall Street: Same as it Ever Was”). The clips from his article excerpted at the top of this page give the flavor of his argument.
Monday, August 15, 2011
How Google affects our memory.
Daniel Wegner can be counted on to always be coming up with interesting stuff. Here he and his colleagues report a series of experiments showing how Google is taking a load off our explicit memory storage habits (of the sort that occurred in the transition from the oral tradition to writing). As most of us know from daily experience, Google is replacing books and encyclopedias as our main group, or transactive, memory, and we are becoming better at remembering where information is stored than at remembering the information itself:
The advent of the Internet, with sophisticated algorithmic search engines, has made accessing information as easy as lifting a finger. No longer do we have to make costly efforts to find the things we want. We can “Google” the old classmate, find articles online, or look up the actor who was on the tip of our tongue. The results of four studies suggest that when faced with difficult questions, people are primed to think about computers and that when people expect to have future access to information, they have lower rates of recall of the information itself and enhanced recall instead for where to access it. The Internet has become a primary form of external or transactive memory, where information is stored collectively outside ourselves.
Blog Categories:
culture/politics,
memory/learning,
technology
Friday, August 12, 2011
The art of musical notation - scoring outside the lines
Pat Muchmore writes a fascinating piece on musical scores that are presented in more exotic forms than the standard clef notation that I and other musicians have spent many thousands of hours with. He uses the term ergodic notations,
...which I derive from the game and literary theorist Espen J. Aarseth’s phrase “ergodic literature.” These are writings that require some amount of effort to read beyond simply moving one’s eyes and flipping pages. There are ancient examples, such as Egyptian texts that span several walls across several rooms or, more recently, Islamic calligrams that render Arabic words like Allah and Bismillah in many different directions and scales.

The article contains numerous modern examples of ergodic scores, and also notes:
Ergodic notation is not new. Baude Cordier, a composer of the ars subtilior style, wrote during the first half of the 15th century. He created many graphic scores, one of the most elegant of which is for a piece called “Belle, bonne, sage.”

The article provides further examples of ergodic notations from the modern composers George Crumb, Peter Maxwell Davies, and John Cage. It also includes notation and audio files of a composition by the author.
“Belle, bonne, sage” by Baude Cordier.
It’s a love song, so it’s rendered in the shape of a heart. The performance is essentially unaffected by the shape, but we needn’t condemn it — it’s a beautiful addition to the artwork. Furthermore, not every visual element is purely decorative: the red notes indicate a rhythmic alteration that was otherwise very difficult to notate at the time.
Thursday, August 11, 2011
Cowboys and Pit Crews
Yesterday's posting on Atul Gawande's writing reminded me of his more recent essay "Cowboys and Pit Crews," on medical practice, which has been languishing in my list of potential posts:
The core structure of medicine—how health care is organized and practiced—emerged in an era when doctors could hold all the key information patients needed in their heads and manage everything required themselves. One needed only an ethic of hard work, a prescription pad, a secretary, and a hospital willing to serve as one’s workshop, loaning a bed and nurses for a patient’s convalescence, maybe an operating room with a few basic tools. We were craftsmen. We could set the fracture, spin the blood, plate the cultures, administer the antiserum. The nature of the knowledge lent itself to prizing autonomy, independence, and self-sufficiency among our highest values, and to designing medicine accordingly. But you can’t hold all the information in your head any longer, and you can’t master all the skills. No one person can work up a patient’s back pain, run the immunoassay, do the physical therapy, protocol the MRI, and direct the treatment of the unexpected cancer found growing in the spine.
Everyone has just a piece of patient care. We’re all specialists now—even primary-care doctors. A structure that prioritizes the independence of all those specialists will have enormous difficulty achieving great care.
We don’t have to look far for evidence. Two million patients pick up infections in American hospitals, most because someone didn’t follow basic antiseptic precautions. Forty per cent of coronary-disease patients and sixty per cent of asthma patients receive incomplete or inappropriate care. And half of major surgical complications are avoidable with existing knowledge. It’s like no one’s in charge—because no one is. The public’s experience is that we have amazing clinicians and technologies but little consistent sense that they come together to provide an actual system of care, from start to finish, for people. We train, hire, and pay doctors to be cowboys. But it’s pit crews people need.
Recently, you might be interested to know, I met an actual cowboy. He described to me how cowboys do their job today, herding thousands of cattle. They have tightly organized teams, with everyone assigned specific positions and communicating with each other constantly. They have protocols and checklists for bad weather, emergencies, the inoculations they must dispense. Even the cowboys, it turns out, function like pit crews now. It may be time for us to join them.
Wednesday, August 10, 2011
Atul Gawande on aging.
I've been assembling a short list of possible essay/lecture topics, and one of the putative titles is "You're gonna die... get over it." It would be in the spirit of a crisp and clear essay in The New Yorker by Atul Gawande titled "The way we age now," which I have posted before and re-post here:
...one of the best articles on aging that I have read, written by Atul Gawande (Asst. Prof. in the Harvard School of Public Health, and staff writer for the New Yorker Magazine). The article appears in the April 30 issue of the New Yorker.
Some clips:
Even though some genes have been shown to influence longevity in worms, fruit flies, and mice..
...scientists do not believe that our life spans are actually programmed into us.. (Deric note: in the post I just prepared for next Tuesday, this point is contested). After all, for most of our hundred-thousand-year existence—all but the past couple of hundred years—the average life span of human beings has been thirty years or less...Today, the average life span in developed countries is almost eighty years. If human life spans depend on our genetics, then medicine has got the upper hand. We are, in a way, freaks living well beyond our appointed time. So when we study aging what we are trying to understand is not so much a natural process as an unnatural one...Gawande proceeds to a discussion of social and medical consequences of people over 65 becoming 20% of the population.
...complex systems—power plants, say—have to survive and function despite having thousands of critical components. Engineers therefore design these machines with multiple layers of redundancy: with backup systems, and backup systems for the backup systems. The backups may not be as efficient as the first-line components, but they allow the machine to keep going even as damage accumulates...within the parameters established by our genes, that’s exactly how human beings appear to work. We have an extra kidney, an extra lung, an extra gonad, extra teeth. The DNA in our cells is frequently damaged under routine conditions, but our cells have a number of DNA repair systems. If a key gene is permanently damaged, there are usually extra copies of the gene nearby. And, if the entire cell dies, other cells can fill in.
Nonetheless, as the defects in a complex system increase, the time comes when just one more defect is enough to impair the whole, resulting in the condition known as frailty. It happens to power plants, cars, and large organizations. And it happens to us: eventually, one too many joints are damaged, one too many arteries calcify. There are no more backups. We wear down until we can’t wear down anymore.
Improvements in the treatment and prevention of heart disease, respiratory illness, stroke, cancer, and the like mean that the average sixty-five-year-old can expect to live another nineteen years—almost four years longer than was the case in 1970. (By contrast, from the nineteenth century to 1970, sixty-five-year-olds gained just three years of life expectancy.)
The result has been called the “rectangularization” of survival. Throughout most of human history, a society’s population formed a sort of pyramid: young children represented the largest portion—the base—and each successively older cohort represented a smaller and smaller group. In 1950, children under the age of five were eleven per cent of the U.S. population, adults aged forty-five to forty-nine were six per cent, and those over eighty were one per cent. Today, we have as many fifty-year-olds as five-year-olds. In thirty years, there will be as many people over eighty as there are under five.
Americans haven’t come to grips with the new demography. We cling to the notion of retirement at sixty-five—a reasonable notion when those over sixty-five were a tiny percentage of the population, but completely untenable as they approach twenty per cent. People are putting aside less in savings for old age now than they have in any decade since the Great Depression. More than half of the very old now live without a spouse, and we have fewer children than ever before—yet we give virtually no thought to how we will live out our later years alone.
...medicine has been slow to confront the very changes that it has been responsible for—or to apply the knowledge we already have about how to make old age better. Despite a rapidly growing elderly population, the number of certified geriatricians fell by a third between 1998 and 2004.
Tuesday, August 09, 2011
Do 18-month-old humans have a theory of mind?
Senju et al., following up on an experiment by Meltzoff and Brooks, use a rather clever experimental design to show that 18-month-old children can attribute false beliefs to others, a capacity previously thought to appear only at 3-4 years of age:
In the research reported here, we investigated whether 18-month-olds would use their own past experience of visual access to attribute perception and consequent beliefs to other people. Infants in this study wore either opaque blindfolds (opaque condition) or trick blindfolds that looked opaque but were actually transparent (trick condition). Then both groups of infants observed an actor wearing one of the same blindfolds that they themselves had experienced, while a puppet removed an object from its location. Anticipatory eye movements revealed that infants who had experienced opaque blindfolds expected the actor to behave in accordance with a false belief about the object’s location, but that infants who had experienced trick blindfolds did not exhibit that expectation. Our results suggest that 18-month-olds used self-experience with the blindfolds to assess the actor’s visual access and to update her belief state accordingly. These data constitute compelling evidence that 18-month-olds infer perceptual access and appreciate its causal role in altering the epistemic states of other people.
Monday, August 08, 2011
Effects of oxytocin in humans - a critical review
Over the past several years MindBlog has posted examples from the outpouring of work on the "trust hormone" oxytocin. Trends in Cognitive Sciences offers open access to this more critical and balanced review by Bartz et al. Their abstract:
Building on animal research, the past decade has witnessed a surge of interest in the effects of oxytocin on social cognition and prosocial behavior in humans. This work has generated considerable excitement about identifying the neurochemical underpinnings of sociality in humans, and discovering compounds to treat social functioning deficits. Inspection of the literature, however, reveals that the effects of oxytocin in the social domain are often weak and/or inconsistent. We propose that this literature can be informed by an interactionist approach in which the effects of oxytocin are constrained by features of situations and/or individuals. We show how this approach can improve understanding of extant research, suggest novel mechanisms through which oxytocin might operate, and refine predictions about oxytocin pharmacotherapy.

By the way, the same issue of Trends in Cognitive Sciences has a brief note by van Honk et al. on testosterone as a social hormone, also noting the complexity of hormone-behavior relationships (PDF here).
Blog Categories:
emotion,
happiness,
social cognition
Friday, August 05, 2011
Macho mice make manly melodies.
Susan Reardon points to work by Pasch et al. at the University of Florida at Gainesville, who compared the songs of castrated male mice (singing mice from Costa Rica) with those of males given a male hormone implant. Females were attracted to speakers playing recordings of the songs of hormonally enhanced males (audio file here, video file in links above).
Thursday, August 04, 2011
Boredom - a Lively History
Peter Toohey's book with the title of this post is reviewed by Anthony Gottlieb in the NYTimes:
In Oscar Wilde’s play “A Woman of No Importance,” Lord Illingworth says of society: “To be in it is merely a bore. But to be out of it simply a tragedy.” To be a bore oneself is the ultimate failing and makes one the target for a quintessentially English put-down. “Even the grave yawns for him,” the actor and theater manager Sir Herbert Beerbohm Tree once said of an earnest writer. ...it was (and still is) regarded in some quarters as stylish and rather aristocratic to suffer from boredom, so the English ought really to thank their bores for providing them with the occasion to display wit and appear grand.
Toohey...suggests that the unpleasant feeling of simple boredom developed as a warning signal to steer us away from social situations that are “confined, predictable, too samey for one’s sanity.” In other words, it is a useful aversion: the discomfort of boredom is a blessing in disguise...a colleague of his once argued that there isn’t really any such thing as boredom, just a blurring together of a constellation of feelings and moods — frustration, surfeit, apathy and the like. Toohey rejects this idea, and perhaps there is indeed little harm in keeping the word, provided that one is vigilantly aware of the loose, subjective and confusing ways in which it is often used. When the actor George Sanders — the archetypal cad, at least on-screen, and in the title of his autobiography — committed suicide in a Spanish hotel in 1972, he left a note that began: “Dear World, I am leaving because I am bored.” It is worth noting that he was ill, lonely and had sold his beloved house on Majorca. Was boredom really what his death was about? When a man says he is bored — as Oscar Wilde never quite got round to saying — it sometimes means that he cannot be bothered to tell you what really ails him.
Blog Categories:
culture/politics,
emotion,
motivation/reward
Wednesday, August 03, 2011
Collectivism promotes bribery
From Mazar and Aggarwal:
Why are there national differences in the propensity to bribe? To investigate this question, we conducted a correlational study with cross-national data and a laboratory experiment. We found a significant effect of the degree of collectivism versus individualism present in a national culture on the propensity to offer bribes to international business partners. Furthermore, the effect was mediated by individuals’ sense of responsibility for their actions. Together, these results suggest that collectivism promotes bribery through lower perceived responsibility for one’s actions.

Later note: I forgot to put the link to this article, it's now added.
Tuesday, August 02, 2011
Diversity is Universal
Here is an interesting little nugget from Joan Chiao:
At every level in the vast and dynamic world of living things lies diversity. From biomes to biomarkers, the complex array of solutions to the most basic problems regarding survival in a given environment afforded to us by nature is riveting. In the world of humans alone, diversity is apparent in the genome, in the brain and in our behavior.
The mark of multiple populations lies in the fabric of our DNA. The signature of selfhood in the brain holds dual frames, one for thinking about one's self as absolute, the other in context of others. From this biological diversity in humans arises cultural diversity directly observable in nearly every aspect of how people think, feel and behave. From classrooms to conventions across continents, the range and scope of human activities is stunning.
Recent centuries have seen the scientific debate regarding the nature of human nature cast as a dichotomy between diversity on the one hand and universalism on the other. Yet a seemingly paradoxical, but tractable, scientific concept that may enhance our cognitive toolkit over time is the simple notion that diversity is universal.
Monday, August 01, 2011
The sunny side of smut.
Coming across an article with the same title as this post gave me an immediate flashback to my days at Harvard, when as a graduate student and resident tutor in Winthrop House I would invite various campus notables down to have dinner in the dining hall at a table with my students (coats and ties were still required then), after which we retired to the common room for a chat over sherry (sigh....the good old days). The guest I'm remembering was the famous psychologist B.F. Skinner, whose immediate response, when asked how he managed to remain so vital at his advanced age, was "I read pornography." Here are a few clips from the Scientific American article on this topic by Moyer:
...Now pornography is just one Google search away, and much of it is free. Age restrictions have become meaningless, too, with the advent of social media—one teenager in five has sent or posted naked pictures of themselves online...Certainly pornography addiction or overconsumption seems to cause relationship problems...But what about the more casual exposure typical of most porn users?...“There’s absolutely no evidence that pornography does anything negative,” says Milton Diamond, director of the Pacific Center for Sex and Society at the University of Hawaii at Manoa. “It’s a moral issue, not a factual issue.”...Perhaps the most serious accusation against pornography is that it incites sexual aggression. But not only do rape statistics suggest otherwise, some experts believe the consumption of pornography may actually reduce the desire to rape by offering a safe, private outlet for deviant sexual desires...as access to pornography grew in once restrictive Japan, China and Denmark in the past 40 years, rape statistics plummeted. Within the U.S., the states with the least Internet access between 1980 and 2000—and therefore the least access to Internet pornography—experienced a 53 percent increase in rape incidence, whereas the states with the most access experienced a 27 percent drop in the number of reported rapes.
It is important to note that these associations are just that—associations. They do not prove that pornography is the cause of the observed crime reductions. Nevertheless, the trends just don’t fit with the theory that rape and sexual assault are in part influenced by pornography...patients requesting treatment in clinics for sex offenders commonly say that pornography helps them keep their abnormal sexuality within the confines of their imagination. Pornography seems to be protective...perhaps because exposure correlates with lower levels of sexual repression, a potential rape risk factor.
Friday, July 29, 2011
MindBlog retrospective: A new description of our inner lives.
This is another of my old posts that emerged from the retrospective scan of this blog that I did recently, another interesting perspective I don't want to lose touch with. It drew a number of comments, and a second post several months later discussed them. Here is a repeat of the original post:
I rarely mention my internal experience and sensations on this blog - first, because I have viewed readers as "wanting the beef," objective stuff on how minds work. Second and more important, because my experience of noting the flow of my brain products as emotion-laced chunks of sensing/cognition/action - knowing the names of the neurotransmitters and hormones acting during desire, arousal, calming, or affiliation - strikes me as a process that would feel quite alien to most people. Still, if we are materialists who believe that someday we will understand how the brain-body generates our consciousness and sense of a self, we will be able to think in terms like the following (a quote taken from Larissa MacFarquhar's profile of Paul and Patricia Churchland in the Feb. 12 New Yorker Magazine):
"...he and Pat like to speculate about a day when whole chunks of English, especially the bits that consitute folk psychology, are replaced by scientific words that call a thing by its proper name rather than some outworn metaphor... as people learn to speak differently they will learn to experience differently, and sooner or later even their most private introspections will be affected. Already Paul feels pain differently than he used to: when he cut himself shaving now he fells not "pain" but something more complicated - first the sharp, superficial A-delta-fibre pain, and then a couple of seconds later, the sickening, deeper feeling of C-fibre pain that lingers. The new words, far from being reductive or dry, have enhanced his sensations, he feels, as an oenophile's complex vocabulary enhances the taste of wine."
"Paul and Pat, realizing that the revolutionary neuroscience they dream of is still in its infancy, are nonetheless already preparing themselve for this future, making the appropriate adjustments in their everyday conversation. One afternoon recently, Paul says, he was home making dinner when Pat burst in the door, having come straight from a frustrating faculty meeting. "She said, 'Paul, don't speak to me, my serotonin levels have hit bottom, my brain is awash in glucocortocoids, my blood vessels are full of adrenaline, and if it weren't for my endogenous opiates I'd have driven the car into a tree on the way home. My dopamine levels need lifting. Pour me a Chardonnay, and I'll be down in a minute.' " Paul and Pat have noticed that it is not just they who talk this way - their students now talk of psychopharmacology as comfortably as of food."
Thursday, July 28, 2011
The utility of being vague.
I'm just getting around to glancing at the last few issues of Psychological Science, and I find this gem, "In Praise of Vagueness" by Mishra et al., which they introduce as follows:
People are increasingly surrounded by devices that provide highly precise information. For instance, technologically advanced bathroom scales can now give measurements of weight, body fat, and hydration levels within two and even three decimal places. People can find out exactly how many calories they are eating, how much weight they can lift, and how many steps they walk in a typical day. The overarching belief exemplified by the use of such technologies could be summed up by the phrase, “If I can measure it, I can manage it.” In other words, people seem to believe that precise information increases their likelihood of performing better and meeting personal goals (e.g., improving physical strength or losing weight). People generally prefer precise information over vague information because precise information gives them a greater sense of security and confidence in their ability to predict unknown outcomes in their environment. Despite this preference, we have found that vague information sometimes serves people better than precise information does.

Their experiments examined people's progress toward goals when they were given precise versus vague (an error range given) feedback on that progress. Perhaps the most striking example came from the weight loss experiment: participants given precise feedback gained, on average, one pound over the course of the experiment, while those given vague feedback lost nearly four pounds. Here is their abstract:
Why might individuals perform better when they receive vague information than when they receive precise information? We posit that vague information allows individuals leeway in interpretation so that they form expectancies in accordance with the outcomes that they desire. Further, we posit that these positive expectancies can give rise to favorable performance-related outcomes.
Is the eternal quest for precise information always worthwhile? Our research suggests that, at times, vagueness has its merits. Previous research has demonstrated that people prefer precise information over vague information because it gives them a sense of security and makes their environments more predictable. However, we show that the fuzzy boundaries afforded by vague information can actually help individuals perform better than can precise information. We document these findings across two laboratory studies and one quasi–field study that involved different performance-related contexts (mental acuity, physical strength, and weight loss). We argue that the malleability of vague information allows people to interpret it in the manner they desire, so that they can generate positive response expectancies and, thereby, perform better. The rigidity of precise information discourages desirable interpretations. Hence, on certain occasions, precise information is not as helpful as vague information in boosting performance.
Wednesday, July 27, 2011
Inappropriate cravings? Hold a magnet by your head!
Here's an idea for a BioTech startup!...(I'm not serious)...suggested by an article from McClernon et al. titled "Repetitive Transcranial Magnetic Stimulation of the Superior Frontal Gyrus Modulates Craving for Cigarettes."
By the way, an illustration of the superior frontal gyrus (from Wikipedia via Google Images) accompanies the original post.

BACKGROUND:
Previous functional magnetic resonance imaging studies have shown strong correlations between cue-elicited craving for cigarettes and activation of the superior frontal gyrus (SFG). Repetitive transcranial magnetic stimulation (rTMS) offers a noninvasive means to reversibly affect brain cortical activity, which can be applied to testing hypotheses about the causal role of SFG in modulating craving.
METHODS:
Fifteen volunteer smokers were recruited to investigate the effects of rTMS on subjective responses to smoking versus neutral cues and to controlled presentations of cigarette smoke. On different days, participants were exposed to three conditions: 1) high-frequency (10 Hz) rTMS directed at the SFG; 2) low-frequency (1 Hz) rTMS directed at the SFG; and 3) low-frequency (1 Hz) rTMS directed at the motor cortex (control condition).
RESULTS:
Craving ratings in response to smoking versus neutral cues were differentially affected by the 10-Hz versus 1-Hz SFG condition. Craving after smoking cue presentations was elevated in the 10-Hz SFG condition, whereas craving after neutral cue presentations was reduced. Upon smoking in the 10-Hz SFG condition, ratings of immediate craving reduction as well as the intensity of interoceptive airway sensations were also attenuated.
CONCLUSIONS:
These results support the view that the SFG plays a role in modulating craving reactivity; moreover, the results suggest that the SFG plays a role in both excitatory and inhibitory influences on craving, consistent with prior research demonstrating the role of the prefrontal cortex in the elicitation as well as inhibition of drug-seeking behaviors.
Tuesday, July 26, 2011
Watching Humor in the Brain
Bekinschtein et al. look at what may be the brain correlates of "humor as a cognitive cleanup mechanism" mentioned in my June 17 post, at least in the case of jokes that depend on semantic ambiguity resolution:
What makes us laugh? One crucial component of many jokes is the disambiguation of words with multiple meanings. In this functional MRI study of normal participants, the neural mechanisms that underlie our experience of getting a joke that depends on the resolution of semantically ambiguous words were explored. Jokes that contained ambiguous words were compared with sentences that contained ambiguous words but were not funny, as well as to matched verbal jokes that did not depend on semantic ambiguity. The results confirm that both the left inferior temporal gyrus and left inferior frontal gyrus are involved in processing the semantic aspects of language comprehension, while a more widespread network that includes both of these regions and the temporoparietal junction bilaterally is involved in processing humorous verbal jokes when compared with matched nonhumorous material. In addition, hearing jokes was associated with increased activity in a network of subcortical regions, including the amygdala, the ventral striatum, and the midbrain, that have been implicated in experiencing positive reward. Moreover, activity in these regions correlated with the subjective ratings of funniness of the presented material. These results allow a more precise account of how the neural and cognitive processes that are involved in ambiguity resolution contribute to the appreciation of jokes that depend on semantic ambiguity.
Monday, July 25, 2011
Confabulation
Here is an entry from Fiery Cushman on the Edge.org question "What scientific concept would improve everybody's cognitive toolkit?," on how we frequently rationalize our behavior, unaware of unconscious factors that actually guided it. Here are some clips:
We are shockingly ignorant of the causes of our own behavior. The explanations that we provide are sometimes wholly fabricated, and certainly never complete. Yet, that is not how it feels. Instead it feels like we know exactly what we're doing and why. This is confabulation: Guessing at plausible explanations for our behavior, and then regarding those guesses as introspective certainties… The problem is that we get all of our explanations partly right, correctly identifying the conscious and deliberate causes of our behavior. Unfortunately, we mistake "partly right" for "completely right", and thereby fail to recognize the equal influence of the unconscious, or to guard against it.
People make harsher moral judgments in foul-smelling rooms, reflecting the role of disgust as a moral emotion. Women are less likely to call their fathers (but equally likely to call their mothers) during the fertile phase of their menstrual cycle, reflecting a means of incest avoidance. Students indicate greater political conservatism when polled near a hand-sanitizing station during a flu epidemic, reflecting the influence of a threatening environment on ideology. They also indicate a closer bond to their mother when holding hot coffee versus iced coffee, reflecting the metaphor of a "warm" relationship.
Automatic behaviors can be remarkably organized, and even goal-driven. For example, research shows that people tend to cheat just as much as they can without realizing that they are cheating. This is a remarkable phenomenon: Part of you is deciding how much to cheat, calibrated at just the level that keeps another part of you from realizing it.
One of the ways that people pull off this trick is with innocent confabulations: When self-grading an exam, students think, "Oh, I was going to circle e, I really knew that answer!" This isn't a lie, any more than it's a lie to say you have always loved your mother (latte in hand), but don't have time to call your dad during this busy time of the month. These are just incomplete explanations, confabulations that reflect our conscious thoughts while ignoring the unconscious ones.
Perhaps you have noticed that people have an easier time sniffing out unseemly motivations for others' behavior than recognizing the same motivations for their own behavior…we jump to the conclusion that others' behaviors reflect their bad motives and poor judgment, attributing conscious choice to behaviors that may have been influenced unconsciously… we assume that our own choices were guided solely by the conscious explanations that we conjure, and reject or ignore the possibility of our own unconscious biases...By understanding confabulation we can begin to remedy both faults.
Friday, July 22, 2011
The importance of our brains’ resting state activity.
Pizoli et al. find, in an open access article describing the clinical case of a young boy with epileptic encephalopathy who underwent successful corpus callosotomy surgery (i.e., separation of the connections between the two hemispheres) for treatment of drop seizures, that resting-state brain activity (temporal synchrony across distributed brain regions, termed resting-state networks, that persists during waking, sleep, and anesthesia) may be required for normal brain development and maintenance:
One of the most intriguing recent discoveries concerning brain function is that intrinsic neuronal activity manifests as spontaneous fluctuations of the blood oxygen level–dependent (BOLD) functional MRI signal. These BOLD fluctuations exhibit temporal synchrony within widely distributed brain regions known as resting-state networks. Resting-state networks are present in the waking state, during sleep, and under general anesthesia, suggesting that spontaneous neuronal activity plays a fundamental role in brain function. Despite its ubiquitous presence, the physiological role of correlated, spontaneous neuronal activity remains poorly understood. One hypothesis is that this activity is critical for the development of synaptic connections and maintenance of synaptic homeostasis. We had a unique opportunity to test this hypothesis in a 5-y-old boy with severe epileptic encephalopathy. The child developed marked neurologic dysfunction in association with a seizure disorder, resulting in a 1-y period of behavioral regression and progressive loss of developmental milestones. His EEG showed a markedly abnormal pattern of high-amplitude, disorganized slow activity with frequent generalized and multifocal epileptiform discharges. Resting-state functional connectivity MRI showed reduced BOLD fluctuations and a pervasive lack of normal connectivity. The child underwent successful corpus callosotomy surgery for treatment of drop seizures. Postoperatively, the patient's behavior returned to baseline, and he resumed development of new skills. The waking EEG revealed a normal background, and functional connectivity MRI demonstrated restoration of functional connectivity architecture. These results provide evidence that intrinsic, coherent neuronal signaling may be essential to the development and maintenance of the brain's functional organization.
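To make the key measurement concrete: "functional connectivity" here boils down to correlating the slow BOLD fluctuations recorded from different brain regions. Below is a minimal, purely illustrative Python sketch of that computation; the synthetic data and variable names are my own invention and are not taken from the Pizoli study.

import numpy as np

# Purely illustrative: synthetic BOLD time series for a handful of brain regions.
# Real analyses use preprocessed resting-state fMRI data, not random numbers.
n_regions, n_timepoints = 6, 200
rng = np.random.default_rng(0)
bold = rng.standard_normal((n_regions, n_timepoints))

# Give regions 0-2 a shared slow fluctuation to mimic a resting-state "network".
shared = rng.standard_normal(n_timepoints)
bold[:3] += 0.8 * shared

# Functional connectivity: pairwise correlation of the regional time series.
connectivity = np.corrcoef(bold)
print(np.round(connectivity, 2))

Regions that share the common fluctuation show high mutual correlations; a pervasive loss of such correlations is what "reduced functional connectivity" meant in the boy's preoperative scans, and their reappearance is the "restoration of functional connectivity architecture" reported after surgery.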
Thursday, July 21, 2011
Pressure to conform - survey of tight and loose cultures.
Gelfand et al. have constructed a metric they term "tightness-looseness" - the extent to which societies impose social norms, and have collected data across 33 large-scale cultures from ~7,000 individuals. Their questionnaire asked people to rate the appropriateness of 12 behaviors (such as eating or crying) in 15 situations (such as being in a bank or at a party). Then, they compared the responses to an array of ecological and historical factors. From Norenzayan's summary:
...Overall, they found that societies exposed to contemporary or historical threats, such as territorial conflict, resource scarcity, or exposure to high levels of pathogens, more strictly regulate social behavior and punish deviance. These societies are also more likely to have evolved institutions that strictly regulate social norms. At the psychological level, individuals in tightly regulated societies report higher levels of self-monitoring, more intolerant attitudes toward outsiders, and paying stricter attention to time. In this multilevel analysis, ecological, historical, institutional, and psychological variables comprise a loosely integrated system that defines a culture.
These findings complement a growing literature that reveals the power of the comparative approach in explaining critically important features of human behavior. For example, research suggests that the substantial variation in religious involvement among nations can be explained, in large part, by perceived levels of security. Religion thrives when existential threats to human security, such as war or natural disaster, are rampant, and declines considerably in societies with high levels of economic development, low income inequality and infant mortality, and greater access to social safety nets.
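To give a concrete, if oversimplified, sense of how appropriateness ratings might be turned into a country-level "tightness" score, here is a toy Python sketch. The scoring rule and the numbers are my own illustration, not Gelfand et al.'s actual procedure.

import numpy as np

# Toy data: appropriateness ratings (1 = very inappropriate, 6 = very appropriate)
# for 12 behaviors in 15 situations, from 50 respondents in one country.
rng = np.random.default_rng(1)
ratings = rng.integers(1, 7, size=(50, 12, 15))

# One intuition: tight cultures leave little room for disagreement about what is
# appropriate, so low variance across respondents implies stronger shared norms.
item_variance = ratings.var(axis=0)  # disagreement for each behavior/situation pair
tightness_score = 1.0 / (1.0 + item_variance.mean())
print(f"toy tightness score: {tightness_score:.3f}")

A cross-national analysis would then relate such country-level scores to the ecological and historical variables mentioned above (pathogen prevalence, resource scarcity, histories of territorial conflict), which is the kind of multilevel comparison Norenzayan's summary describes.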
Blog Categories:
culture/politics,
human evolution,
social cognition