This blog reports new ideas and work on mind, brain, behavior, psychology, and politics - as well as random curious stuff.
Wednesday, February 08, 2012
Don't go to bed angry...
Several studies have shown that sleep enhances emotional memories. Baran et al. show that maybe it's better to let anger keep you awake at night than to sleep on it. Sleep consolidates the negative emotional memory. Having trouble sleeping after an unsettling experience may be the brain's way of trying to keep the memory or emotion from being stored. The abstract:
Sleep enhances memories, particularly emotional memories. As such, it has been suggested that sleep deprivation may reduce posttraumatic stress disorder. This presumes that emotional memory consolidation is paralleled by a reduction in emotional reactivity, an association that has not yet been examined. In the present experiment, we used an incidental memory task in humans and obtained valence and arousal ratings during two sessions separated either by 12 h of daytime wake or 12 h including overnight sleep. Recognition accuracy was greater following sleep relative to wake for both negative and neutral pictures. While emotional reactivity to negative pictures was greatly reduced over wake, the negative emotional response was relatively preserved over sleep. Moreover, protection of emotional reactivity was associated with greater time in REM sleep. Recognition accuracy, however, was not associated with REM. Thus, we provide the first evidence that sleep enhances emotional memory while preserving emotional reactivity.
Tuesday, February 07, 2012
Our connectomes 'R Us?
I have by now received several offers from the publisher Houghton Mifflin to send a reviewer's copy of Sebastian Seung's "Connectome: How the Brain's Wiring Makes Us Who We Are." I haven't acted on them, because a review of the synopsis has convinced me that Seung's own brilliant efforts, and the similar work he describes, to map every connection in our brains are not the complete key to understanding ourselves that he implies. I was actually starting to write a list of problems I see with the idea that once you've got the wiring diagram between nerve cells you've got it all, but this succinct critique by Christof Koch permits me to be lazy:
Treating the connectome as the be-all and end-all of brain function has its problems. Seung, for example, rebrands autism and schizophrenia as 'connectopathies' — diseases in which the brain's wiring goes awry. Yet plenty of other things are wrong in brains with these disorders besides their connectivity.
Faults in synaptic transmission and in processes inside neurons and the glial cells that support them have all been implicated in mental illness and brain disease. Neurons are intricate devices with elaborate input structures that show complex, time-dependent and nonlinear processing. They have various characteristic, and often tortuous, morphologies. Connectomics treats all this as irrelevant. Even though we have known the connectome of the nematode worm for 25 years, we are far from reading its mind. We don't yet understand how its nerve cells work.
Monday, February 06, 2012
Massage therapy suppresses expression of inflammatory genes after exercise..
I've been getting regular deep tissue structural massage for years, and continue to be amazed at how good it makes me feel. This report from Tarnopolsky and colleagues explains at least part of the reason why. They profiled the expression of genes involved both in inflammatory pathways and in pathways that regenerate energy-generating mitochondria in the leg muscles of eleven young men after very strenuous leg exercise, with one leg massaged after the exercise. The massaged legs had 30% more PGC-1alpha, a gene that helps muscle cells build mitochondria. They also had three times less NFκB, which turns on genes associated with inflammation. (The study found no evidence to support the often-repeated claims that massage removes lactic acid, a byproduct of exertion long blamed for muscle soreness, or other waste products from tired muscles.) Here is the detailed abstract:
Massage therapy is commonly used during physical rehabilitation of skeletal muscle to ameliorate pain and promote recovery from injury. Although there is evidence that massage may relieve pain in injured muscle, how massage affects cellular function remains unknown. To assess the effects of massage, we administered either massage therapy or no treatment to separate quadriceps of 11 young male participants after exercise-induced muscle damage. Muscle biopsies were acquired from the quadriceps (vastus lateralis) at baseline, immediately after 10 min of massage treatment, and after a 2.5-hour period of recovery. We found that massage activated the mechanotransduction signaling pathways focal adhesion kinase (FAK) and extracellular signal–regulated kinase 1/2 (ERK1/2), potentiated mitochondrial biogenesis signaling [nuclear peroxisome proliferator–activated receptor γ coactivator 1α (PGC-1α)], and mitigated the rise in nuclear factor κB (NFκB) (p65) nuclear accumulation caused by exercise-induced muscle trauma. Moreover, despite having no effect on muscle metabolites (glycogen, lactate), massage attenuated the production of the inflammatory cytokines tumor necrosis factor–α (TNF-α) and interleukin-6 (IL-6) and reduced heat shock protein 27 (HSP27) phosphorylation, thereby mitigating cellular stress resulting from myofiber injury. In summary, when administered to skeletal muscle that has been acutely damaged through exercise, massage therapy appears to be clinically beneficial by reducing inflammation and promoting mitochondrial biogenesis.
Friday, February 03, 2012
The good life can be a killer.
I've enjoyed the recent piece on our dysfunctional modern community structures by Jane Brody (who got her journalism degree at the University of Wisconsin, Madison, where I teach).
...homes and shopping malls far from city centers…[have created] vehicle-dependent environments that foster obesity, poor health, social isolation, excessive stress and depression…Physical activity has been disappearing from the lives of young and old, and many communities are virtual “food deserts,” serviced only by convenience stores that stock nutrient-poor prepared foods and drinks…people in the current generation (born since 1980) will be the first in America to live shorter lives than their parents do.
On the question of whether our suburbs can be saved, Brody notes environmental redesign projects to foster better physical and mental health proceeding in Atlanta, GA; Lakewood, CO; Syracuse, NY; and Elgin, IL (and see designinghealthycommunities.org).
In a healthy environment…people who are young, elderly, sick or poor can meet their life needs without getting in a car, which means creating places where it is safe and enjoyable to walk, bike, take in nature and socialize…People who walk more weigh less and live longer…People who are fit live longer… People who have friends and remain socially active live longer…In 1974, 66 percent of all children walked or biked to school. By 2000, that number had dropped to 13 percent…We’ve engineered physical activity out of children’s lives…two in seven volunteers for the military can’t get in because they’re not in good enough physical condition…Not only are Americans of all ages fatter than ever, but also growing numbers of children are developing diseases once seen only in adults: Type 2 diabetes, heart disease and fatty livers.
Thursday, February 02, 2012
Should I use condoms for sex? The millisecond scan.
This item caught my eye, since I spend the winter months each year living in the center of the gay Wilton Manors ghetto of Fort Lauderdale, FL., where the boys hook up at a fast and furious rate. Many base a decision on whether to use condoms on their 'intuition' of whether a potential partner is HIV positive. Renner et al. find that this guessing is very rapid and based on a few fairly simple facial trait characteristics (that in fact have not been shown to have any relationship to actual HIV status).
Research indicates that many people do not use condoms consistently but instead rely on intuition to identify sexual partners at high risk for HIV infection. The present studies examined neural correlates for first impressions of HIV risk and determined the association of perceived HIV risk with other trait characteristics. Participants were presented with 120 self-portraits retrieved from a popular online photo-sharing community (www.flickr.com). Factor analysis of various explicit ratings of trait characteristics yielded two orthogonal factors: (1) a ‘valence-approach’ factor encompassing perceived attractiveness, healthiness, valence, and approach tendencies, and (2) a ‘safeness’ factor, entailing judgments of HIV risk, trustworthiness, and responsibility. These findings suggest that HIV risk ratings systematically relate to cardinal features of a high-risk HIV stereotype. Furthermore, event-related brain potential recordings revealed neural correlates of first impressions about HIV risk. Target persons perceived as risky elicited a differential brain response in a time window from 220–340 ms and an increased late positive potential in a time window from 350–700 ms compared to those perceived as safe. These data suggest that impressions about HIV risk can be formed in a split second and despite a lack of information about the actual risk profile. Findings of neural correlates of risk impressions and their relationship to key features of the HIV risk stereotype are discussed in the context of the ‘risk as feelings’ theory.
Wednesday, February 01, 2012
Simplicity itself.
I have to pass on this brief essay by one of my heroes, Thomas Metzinger:
Elegance is more than an aesthetic quality, or some ephemeral sort of uplifting feeling we experience in deeper forms of intuitive understanding. Elegance is formal beauty. And formal beauty as a philosophical principle is one of the most dangerous, subversive ideas humanity has discovered: it is the virtue of theoretical simplicity. Its destructive force is greater than Darwin's algorithm or that of any other single scientific explanation, because it shows us what the depth of an explanation is.
Elegance as theoretical simplicity comes in many different forms. Everybody knows Occam's razor, the ontological principle of parsimony: Entities are not to be multiplied beyond necessity. William of Occam gave us a metaphysical principle for choosing between competing theories: All other things being equal, it is rational to always prefer the theory that makes fewer ontological assumptions about the kinds of entities that really exist (souls, life forces, abstract objects, or an absolute frame of reference like electromagnetic ether). We are to admit no more causes of natural things than such as are both true and sufficient to explain their appearances—Isaac Newton formulated this as the First Rule of Reasoning in Philosophy, in his Principia Mathematica. Throw out everything that is explanatorily idle, and then shift the burden of proof to the proponent of a less simple theory. In Albert Einstein's words: The grand aim of all science … is to cover the greatest possible number of empirical facts by logical deductions from the smallest possible number of hypotheses or axioms.
Of course, in today's technical debates new questions have emerged: Why do metaphysics at all? Isn't it simply the number of free, adjustable parameters in competing hypotheses that we should measure? Is it not syntactic simplicity that captures elegance best, say, the number of fundamental abstractions and guiding principles a theory makes use of? Or will the true criterion for elegance ultimately be found in statistics, in selecting the best model for a set of data points while optimally balancing parsimony with the "goodness of fit" of a suitable curve? And, of course, for Occam-style ontological simplicity the BIG question always remains: Why should a parsimonious theory be more likely to be true? Ultimately, isn't all of this rooted in a deeply hidden belief that God must have created a beautiful universe?
I find it fascinating to see how the original insight has kept its force over the centuries. The very idea of simplicity itself, applied as a metatheoretical principle, has demonstrated great power—the subversive power of reason and reductive explanation. The formal beauty of theoretical simplicity is deadly and creative at the same time. It destroys superfluous assumptions whose falsity we just cannot bring ourselves to believe, whereas truly elegant explanations always give birth to an entirely new way of looking at the world. What I would really like to know is this: Can the fundamental insight—the destructive, creative virtue of simplicity—be transposed from the realm of scientific explanation into culture or onto the level of conscious experience? What kind of formal simplicity would make our culture a deeper, more beautiful culture? And what is an elegant mind?
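As a concrete illustration of the statistical reading of elegance that Metzinger mentions (scoring a model by its fit penalized by its number of free parameters), here is a minimal sketch in Python. The data, the polynomial candidates, and the use of AIC as the scoring rule are my own choices for illustration, not anything from his essay.

```python
# A minimal sketch of "parsimony vs. goodness of fit": score candidate models
# by fit penalized by their number of free parameters (here via AIC).
# The data and candidate models are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=x.size)   # truth: a line plus noise

def aic_for_poly(degree):
    """AIC for a least-squares polynomial fit of the given degree."""
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    n, k = x.size, degree + 1                 # k = number of free parameters
    rss = float(np.sum(residuals ** 2))
    return n * np.log(rss / n) + 2 * k        # lower is better

for d in range(6):
    print(d, round(aic_for_poly(d), 1))
# The linear model (degree 1) typically wins: higher degrees fit slightly
# better but pay more in the parameter penalty -- parsimony made quantitative.
```

Whether this statistical criterion fully captures what Metzinger means by elegance is exactly the open question he raises.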
Tuesday, January 31, 2012
Our Thrifty Brains.
Andy Clark has done a piece in The Stone, a New York Times forum for contemporary philosophers, that is really worth reading:
Might the miserly use of neural resources be one of the essential keys to understanding how brains make sense of the world? Some recent work in computational and cognitive neuroscience suggests that it is indeed the frugal use of our native neural capacity (the inventive use of restricted “neural bandwidth,” if you will) that explains how brains like ours so elegantly make sense of noisy and ambiguous sensory input. That same story suggests, intriguingly, that perception, understanding and imagination, which we might intuitively consider to be three distinct chunks of our mental machinery, are inextricably tied together as simultaneous results of a single underlying strategy known as “predictive coding.” This strategy saves on bandwidth using (who would have guessed it?) one of the many technical wheezes that enable us to economically store and transmit pictures, sounds and videos using formats such as JPEG and MP3.
...perception may best be seen as what has sometimes been described as a process of “controlled hallucination” ...in which we (or rather, various parts of our brains) try to predict what is out there, using the incoming signal more as a means of tuning and nuancing the predictions rather than as a rich (and bandwidth-costly) encoding of the state of the world.
The basic effect hereabouts is neatly illustrated by a simple but striking demonstration (used by the neuroscientist Richard Gregory back in the 1970’s to make this very point) known as “the hollow face illusion.” This is a well-known illusion in which an ordinary face mask viewed from the back can appear strikingly convex. That is, it looks (from the back) to be shaped like a real face, with the nose sticking outward rather than having a concave nose cavity. Just about any hollow face mask will produce some version of this powerful illusion, and there are many examples on the Web, like this one:
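Setting the illusion aside and returning to the compression analogy Clark draws with formats like JPEG and MP3: the kernel of predictive coding is to transmit only the errors of a running prediction rather than the raw signal. A toy sketch of that idea (my own illustration, not Clark's or the brain's actual algorithm):

```python
# Toy illustration of predictive (delta) coding: instead of transmitting the
# raw signal, the sender transmits only the error between each sample and a
# simple prediction (here, "next sample equals the previous one"). Smooth,
# predictable signals yield small residuals that compress well.

signal = [100, 101, 103, 103, 104, 110, 111, 111]

def encode(samples):
    """Return prediction residuals; the first sample is sent as-is."""
    residuals = [samples[0]]
    for prev, curr in zip(samples, samples[1:]):
        residuals.append(curr - prev)   # only the surprise is transmitted
    return residuals

def decode(residuals):
    """Reconstruct the signal by adding each residual to the running prediction."""
    samples = [residuals[0]]
    for r in residuals[1:]:
        samples.append(samples[-1] + r)
    return samples

encoded = encode(signal)
assert decode(encoded) == signal
print(encoded)   # [100, 1, 2, 0, 1, 6, 1, 0] -- mostly small numbers
```

Real media codecs are far more elaborate, but the transmit-only-what-was-not-predicted principle is the shared core Clark is pointing at.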
Monday, January 30, 2012
A simple way to attenuate emotional arousal?
I just came across these interesting observations from Herwig et al. They show that simply using self-referential reflection (i.e., mindfulness) to become aware of an emotional state can attenuate amygdala activation and emotional arousal:
The regulation of emotions is an ongoing internal process and often a challenge. Current related neural models concern the intended control of reactions towards external events, mediated by prefrontal cortex regions upon basal emotion processing as in the amygdala. Cognitive strategies to regulate emotions in the context of affective disorders or stress reduction, increasingly applied in clinical practice, are also related to mindfulness techniques. We questioned their effects on neural emotion processing and investigated brain activity during purely internal mental self-referential processes of making current emotions and self-related cognitions aware. Thirty healthy subjects performed a task comprising periods of cognitive self-reflection, of introspection for actual own emotions and feelings, and of a neutral condition, while they were scanned with functional magnetic resonance imaging. Brain activations of twenty-seven subjects during emotion-introspection and self-reflection, and also a conjunction of both, were compared with the neutral condition. The conditions of self-reflection and emotion-introspection showed distinguishable activations in medial and ventrolateral prefrontal areas, in parietal regions and in the amygdala. Notably, amygdala activity decreased during emotion-introspection and increased compared to ‘neutral’ during self-reflection. The results indicate that already the self-referential mental state of making the actual emotional state aware is capable of attenuating emotional arousal. This extends current theories of emotion regulation and has implications for the application of mindfulness techniques as a component of psychotherapeutic strategies in affective disorders and also for possible everyday emotion regulation.
Friday, January 27, 2012
You think, therefore I am.
I pass on this contribution from Rose and Markus as their answer to this year's annual question from Edge.org (What is your favorite deep, elegant, or beautiful explanation?):
"I think, therefore I am." Cogito ergo sum. Remember this elegant and deep idea from René Descartes' Principles of Philosophy? The fact that a person is contemplating whether she exists, Descartes argued, is proof that she, indeed, actually does exist. With this single statement, Descartes knit together two central ideas of Western philosophy: 1) thinking is powerful, and 2) individuals play a big role in creating their own I's—that is, their psyches, minds, souls, or selves.
Most of us learn "the cogito" at some point during our formal education. Yet far fewer of us study an equally deep and elegant idea from social psychology: Other people's thinking likewise powerfully shapes the I's that we are. Indeed, in many situations, other people's thinking has a bigger impact on our own thoughts, feelings, and actions than do the thoughts we conjure while philosophizing alone.
In other words, much of the time, "You think, therefore I am." For better and for worse.
An everyday instance of how your thinking affects other people's being is the Pygmalion effect. Psychologists Robert Rosenthal and Lenore Jacobson captured this effect in a classic 1963 study. After giving an IQ test to elementary school students, the researchers told the teachers which students would be "academic spurters" because of their allegedly high IQs. In reality, these students' IQs were no higher than those of the "normal" students. At the end of the school year, the researchers found that the "spurters" had attained better grades and higher IQs than the "normals." The reason? Teachers had expected more from the spurters, and thus given them more time, attention, and care. And the conclusion? Expect more from students, and get better results.
A less sanguine example of how much our thoughts affect other people's I's is stereotype threat. Stereotypes are clouds of attitudes, beliefs, and expectations that follow around a group of people. A stereotype in the air over African Americans is that they are bad at school. Women labor under the stereotype that they suck at math.
As social psychologist Claude Steele and others have demonstrated in hundreds of studies, when researchers conjure these stereotypes—even subtly, by, say, asking people to write down their race or gender before taking a test—students from the stereotyped groups score lower than the stereotype-free group. But when researchers do not mention other people's negative views, the stereotyped groups meet or even exceed their competition. The researchers show that students under stereotype threat are so anxious about confirming the stereotype that they choke on the test. With repeated failures, they seek their fortunes in other domains. In this tragic way, other people's thoughts deform the I's of promising students.
As the planet gets smaller and hotter, knowing that "You think, therefore I am" could help us more readily understand how we affect our neighbours and how our neighbours affect us. Not acknowledging how much we impact each other, in contrast, could lead us to repeat the same mistakes.
Thursday, January 26, 2012
Cellular 'self eating' accounts for some beneficial effects of exercise.
Population studies suggest that exercise protects against diabetes, cancer, and age-related diseases such as Alzheimer's. Work by Congcong He et al. has now shown that at least part of this effect is due to the increased "self-eating" (autophagy) that cells must do to meet the energy demands of exercise. Autophagy recycles used or flawed membranes and internal cell structures by encircling the target material and then dumping it into a compartment that digests it. In animal models it has been shown to reduce diabetes, cancer, and neurodegenerative diseases. The He et al. work documents that exercise induces autophagy in the skeletal muscles of mice, which in turn lowers glucose and insulin in the bloodstream. Mutant mice that cannot increase autophagy during exercise did not show this effect. Further, exercise reversed the diabetes induced by overfeeding only in mice that showed an exercise-induced increase in autophagy. Here is the abstract with more details:
Exercise has beneficial effects on human health, including protection against metabolic disorders such as diabetes. However, the cellular mechanisms underlying these effects are incompletely understood. The lysosomal degradation pathway, autophagy, is an intracellular recycling system that functions during basal conditions in organelle and protein quality control. During stress, increased levels of autophagy permit cells to adapt to changing nutritional and energy demands through protein catabolism. Moreover, in animal models, autophagy protects against diseases such as cancer, neurodegenerative disorders, infections, inflammatory diseases, ageing and insulin resistance. Here we show that acute exercise induces autophagy in skeletal and cardiac muscle of fed mice. To investigate the role of exercise-mediated autophagy in vivo, we generated mutant mice that show normal levels of basal autophagy but are deficient in stimulus (exercise- or starvation)-induced autophagy. These mice (termed BCL2 AAA mice) contain knock-in mutations in BCL2 phosphorylation sites (Thr69Ala, Ser70Ala and Ser84Ala) that prevent stimulus-induced disruption of the BCL2–beclin-1 complex and autophagy activation. BCL2 AAA mice show decreased endurance and altered glucose metabolism during acute exercise, as well as impaired chronic exercise-mediated protection against high-fat-diet-induced glucose intolerance. Thus, exercise induces autophagy, BCL2 is a crucial regulator of exercise- (and starvation)-induced autophagy in vivo, and autophagy induction may contribute to the beneficial metabolic effects of exercise.
Wednesday, January 25, 2012
The psychology of perceived wealth.
Studies have shown that not every dollar contributes equally to perceived wealth, people’s standing relative to those around them often predicts well-being better than net worth does, and increasing income trends are preferred over decreasing ones. Sussman and Shafir (at Princeton, where Kahneman has carried out his behavioral economics studies) show several factors that can influence the perception of wealth:
We studied the perception of wealth as a function of varying levels of assets and debt. We found that with total wealth held constant, people with positive net worth feel and are seen as wealthier when they have lower debt (despite having fewer assets). In contrast, people with equal but negative net worth feel and are considered wealthier when they have greater assets (despite having larger debt). This pattern persists in the perception of both the self and others.
From their concluding discussion:
…people have a robust preference for higher assets in cases of negative net worth and for lower debt in cases of positive net worth…debt appears relatively salient in contexts of positive wealth, whereas assets loom relatively large in contexts of negative wealth, and this differential salience has a corresponding impact on financial judgments and decisions.
…the present findings show how the appeal of a loan may depend on one’s perceived financial state. For a person who is in the red, a loan may provide an appealing infusion of cash, whereas for a person in the black, it might present an aversive incursion into debt. Conversely, people who are in the black may be tempted to diminish their debt, whereas it may prove unappealing for those in the red to lower their debt at the expense of their assets.
Remarkably, the same striving for financial wealth and stability can trigger opposing behaviors: preference for greater assets in some circumstances, and for lower debt in others. Such impulses may not always be aligned with what is best financially. People who are in the red and eager to borrow will sometimes have access only to high-interest loans. And people who are eager to clear their debt will sometimes do so even when their debt (e.g., tax-incentivized mortgages) is financially beneficial. Such psychology may be of great consequence. A remarkable 25% of U.S. households had zero or negative net worth in 2009 (for Black households, the figure was about 40%). Better insight into the determinants of perceived financial wealth and financial decision making could help shape behaviorally informed policy.
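To make the contrast concrete, here is a small illustration with hypothetical numbers of my own (not figures from the paper): within each pair the net worth is identical, yet the reported pattern is that the lower-debt profile feels wealthier when net worth is positive, while the higher-asset profile feels wealthier when net worth is negative.

```python
# Hypothetical profile pairs (numbers are mine, not from Sussman & Shafir).
# Within each pair net worth is equal; the paper's finding is that the first
# profile in the positive pair (lower debt) and the second in the negative
# pair (greater assets) are judged and felt to be wealthier.

profiles = {
    "positive net worth": [
        {"assets": 120_000, "debt": 20_000},    # judged wealthier (less debt)
        {"assets": 220_000, "debt": 120_000},
    ],
    "negative net worth": [
        {"assets": 10_000, "debt": 60_000},
        {"assets": 110_000, "debt": 160_000},   # judged wealthier (more assets)
    ],
}

for label, pair in profiles.items():
    for p in pair:
        net = p["assets"] - p["debt"]
        print(f'{label}: assets {p["assets"]:>9,}  debt {p["debt"]:>9,}  net {net:>9,}')
```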
Tuesday, January 24, 2012
Bounded rationality.
I thought I would pass on clips from Mahzarin Banaji's response to the Edge.org annual question "What is your favorite deep, elegant, or beautiful explanation?":
…my candidate for the most deeply satisfying explanation of recent decades is the idea of bounded rationality…Herbert Simon put one stake in the ground through the study of information processing and AI, showing that both people and organizations follow principles of behavior such as "satisficing" that constrain them to decent but not the best decisions. The second stake was placed by Kahneman and Tversky, who showed the stunning ways in which even experts are error-prone—with consequences not only for their own health and happiness but for that of their societies broadly.
Together the view of human nature that evolved over the past four decades has systematically changed the explanation for who we are and why we do what we do. We are error-prone in the unique ways in which we are, the explanation goes, not because we have malign intent, but because of the evolutionary basis of our mental architecture, the manner in which we remember and learn information, the way in which we are affected by those around us and so on. The reason we are boundedly rational is because the information space in which we must do our work is large compared to the capacities we have, including severe limits on conscious awareness, the ability to control behavior, and the ability to act in line even with our own intentions.
The idea that bad outcomes result from limited minds that cannot store, compute and adapt to the demands of the environment is a radically different explanation of our capacities and thereby our nature. Its elegance and beauty come from placing the emphasis on the ordinary and the invisible rather than on specialness and malign motives. This seems not so dissimilar from another shift in explanation, from god to natural selection, and it is likely to be equally resisted.
Monday, January 23, 2012
The age of anxiety
Daniel Smith does an interesting piece asking whether it is appropriate to consider our current times an "age of anxiety." Some clips:
...it is undeniable that ours is an age in which an enormous and growing number of people suffer from anxiety. According to the National Institute of Mental Health, anxiety disorders now affect 18 percent of the adult population of the United States, or about 40 million people. By comparison, mood disorders — depression and bipolar illness, primarily — affect 9.5 percent…anti-anxiety drug alprazolam — better known by its brand name, Xanax — was the top psychiatric drug on the list, clocking in at 46.3 million prescriptions in 2010.
Just because our anxiety is heavily diagnosed and medicated, however, doesn’t mean that we are more anxious than our forebears. It might simply mean that we are better treated — that we are, as individuals and a culture, more cognizant of the mind’s tendency to spin out of control.
Earlier eras might have been even more jittery than ours. Fourteenth-century Europe, for example, experienced devastating famines, waves of pillaging mercenaries, peasant revolts, religious turmoil and a plague that wiped out as much as half the population in four years. The evidence suggests that all this resulted in mass convulsions of anxiety, a period of psychic torment in which, as one historian has put it, “the more one knew, the less sense the world made.”
It’s hard to imagine that we have it even close to as bad as that. Yet there is an aspect of anxiety that we clearly have more of than ever before: self-awareness…Anxiety didn’t emerge as a cohesive psychiatric concept until the early 20th century… By 1977, the psychoanalyst Rollo May was noting an explosion in papers, books and studies on the subject.
...we shouldn’t be possessive about our uncertainties, particularly as one of the dominant features of anxiety is its recursiveness. Anxiety begins with a single worry, and the more you concentrate on that worry, the more powerful it gets, and the more you worry. One of the best things you can do is learn to let go: to disempower the worry altogether. If you start to believe that anxiety is a foregone conclusion — if you start to believe the hype about the times we live in — then you risk surrendering the battle before it’s begun.
Friday, January 20, 2012
On Solitude.
Reading a recent New York Times Op-Ed piece by Susan Cain ("The Rise of the New Groupthink") transported me back over 20 years to what I then experienced as a transformative reading of British psychotherapist Anthony Storr's book "Solitude: A Return to the Self." Reading it provided much-needed validation of my own solitary and introspective nature (I prefer to do my work and thinking by myself, even while serving and respecting social groups, such as the laboratory I ran). Storr's book was a reaction against the popular psychotherapies of the 1980s, which emphasized intimate interpersonal relationships as the chief, if not the only, source of human happiness. He made a strong case that the life of an average person, not just that of a familiar list of brilliant scholars and artists such as Beethoven, Kant, and Newton, could be greatly enriched by more time spent alone.
In a similar vein, Cain writes against the current assumption that creativity, particularly in business, requires the collaboration of a group of people addressing the problem at hand. Her central illustration describes the origins of the Apple computer: its creation required the support of a creative group of engineers and Steve Jobs' business sense, but the creative kernel of work and insight that put together the core of the actual hardware and the code that ran it was Wozniak's solitary effort. Cain notes:
...brainstorming sessions are one of the worst possible ways to stimulate creativity...People in groups tend to sit back and let others do the work; they instinctively mimic others’ opinions and lose sight of their own; and they often succumb to peer pressure... fear of rejection activates the brain's amygdala.
The one important exception to this dismal record is electronic brainstorming, where large groups outperform individuals; and the larger the group the better. The protection of the screen mitigates many problems of group work. This is why the Internet has yielded such wondrous collective creations. Marcel Proust called reading a “miracle of communication in the midst of solitude,” and that’s what the Internet is, too. It’s a place where we can be alone together — and this is precisely what gives it power.
...most humans have two contradictory impulses: we love and need one another, yet we crave privacy and autonomy....To harness the energy that fuels both these drives, we need to move beyond the New Groupthink and embrace a more nuanced approach to creativity and learning. Our offices should encourage casual, cafe-style interactions, but allow people to disappear into personalized, private spaces when they want to be alone. Our schools should teach children to work with others, but also to work on their own for sustained periods of time. And we must recognize that introverts like Steve Wozniak need extra quiet and privacy to do their best work.
Before Mr. Wozniak started Apple, he designed calculators at Hewlett-Packard, a job he loved partly because HP made it easy to chat with his colleagues. Every day at 10 a.m. and 2 p.m., management wheeled in doughnuts and coffee, and people could socialize and swap ideas. What distinguished these interactions was how low-key they were. For Mr. Wozniak, collaboration meant the ability to share a doughnut and a brainwave with his laid-back, poorly dressed colleagues — who minded not a whit when he disappeared into his cubicle to get the real work done.
Thursday, January 19, 2012
Chill-out architecture - The use of tree metaphors
I gravitate towards forests and trees (typing right now at a desk that looks out at a large tree canopy on the opposite riverbank) because the vision of green trees under a blue sky is vastly more calming than having to look at the browner and redder tints of modern city structures. (My current Fort Lauderdale location is an extended strip mall, with only occasional small bits of nature intruding.) Old pine forests give me the same sheltered feeling as the great cathedrals of Europe.
Thus I am very sympathetic to efforts to argue for an evolutionary or biological basis for these feelings, which appear to be common to most human cultures. E.O. Wilson, the father of "Sociobiology" and evolutionary psychology, has written a book, "Biophilia," that essentially argues that our preference for natural scenes is innate, the product of a psychology that evolved in paleolithic times. I would like this to be the correct view, but alas it is, like much of evolutionary psychology, more like Rudyard Kipling's "Just So Stories" than hard science.
It is one thing to simply note trees as a metaphor for shelter, and thus to find it natural that architectural designs that incorporate the tree metaphor (such as the Metropol Parasol in Seville) would be pleasing to us. It is quite another to hang all this on the supposed cognitive neuroscience of embodied cognition, as Sarah Williams Goldhagen, the architecture critic for The New Republic, has done in a rather confused piece. A recent post by Voytek, and the discussion following it, point out a number of reservations and relevant points.
Wednesday, January 18, 2012
Living large - how the powerful overestimate.
From Duguid and Goncalo, their abstract, slightly edited:
In three experiments, we tested the prediction that individuals’ experience of power influences their perceptions of their own height. In the first experiment, high power, relative to low power, was associated with smaller estimates of a pole’s height relative to the self; in the second experiment, with larger estimates of one’s own height; and in the third experiment, with choice of a taller avatar to represent the self in a second-life game. These results emerged regardless of whether power was experientially primed (in the first and third experiments) or manipulated through assigned roles (in the second experiment). Although a great deal of research has shown that more physically imposing individuals are more likely to acquire power, this work is the first to show that powerful people feel taller than they are. The discussion considers the implications for existing and future research on the physical experience of power.
Tuesday, January 17, 2012
My pushing back against our diffusion into “the cloud”
My son visits over the new year's holiday every year, which gives me the chance to have a "techie" conference with him to see what I've been missing. One of the web applications he mentioned led me to Ghostery, a browser extension with a cute little Pac-Man-like ghost that shows you who is tracking your web movements and what cookies have been put on your browser (I was rather taken aback to see that I'm tracked by 759 'bugs' and have 412 cookies). Ghostery allows you to inactivate them individually or as a group. Even though most of the monitoring of our movements on the web is supposedly for benign marketing purposes, I'm more than happy to turn it all off.
A storm of controversy has arisen over Google's recent effort to conflate supposedly neutral web searches with its Google Plus social network, so that a search for information on some idea or item might now yield results that include posts, photos, profiles and conversations from Google Plus that are public or were shared privately with the person searching. I go to Google for links to expert information, and don't want my search results cluttered with friends’ postings. Since I use Google for practically everything I do on the web (this blog, mail, calendar, contacts, Google+, Google Voice, etc.), this cross-linking of my search results and my Google+ account is in fact happening. Fortunately, you can turn off this Google+ feature by going to the gear-shaped options icon at the top right of Google search results, selecting "Search settings," scrolling down to "Personal results," and ticking the box next to "Do not use personal results."
Monday, January 16, 2012
Remembering a rosy future.
Here is a fascinating tidbit from Dan Schacter's laboratory. When we imagine events in the future, our subsequent recall of negative simulations fades more rapidly than our recall of positive ones:
Mental simulations of future experiences are often concerned with emotionally arousing events. Although it is widely believed that mental simulations enhance future behavior, virtually nothing is known about how memory for these simulations changes over time or whether simulations of emotional experiences are especially well remembered. We used a novel paradigm that combined recently developed methods for generating simulations of future events and well-established procedures for testing memory to examine the retention of positive, negative, and neutral simulations over delays of 10 min and 1 day. We found that at the longer delay, details associated with negative simulations were more difficult to remember than details associated with positive or neutral simulations. We suggest that these effects reflect the influence of the fading-affect bias, whereby negative reactions fade more quickly than positive reactions, and that this influence results in a tendency to remember a rosy simulated future. We discuss implications of our findings for individuals with affective disorders, such as depression and anxiety.
(Schacter, in the Harvard Psychology department, is a prolific memory researcher, and is author of such popular books as "The Seven Sins of Memory: How the Mind Forgets and Remembers," as well as coauthor, along with Gilbert and Wegner, of a really excellent introductory college psychology text.)
Friday, January 13, 2012
Our bias against creativity
In principle we are all for creativity, but when faced with the prospect of actually altering our behavior or opinions, we falter. Mueller et al. suggest that this is a covert, largely unconscious process regulated by how uncertain we feel. Their results show that regardless of how open-minded people are, when they feel motivated to reduce uncertainty (either because they have an immediate goal of reducing uncertainty or because they feel uncertain generally), they may experience more negative associations with creativity, which results in lower evaluations of a creative idea. Their findings imply an irony: other research has shown that uncertainty spurs the search for and generation of creative ideas, yet these findings reveal that uncertainty also makes people less able to recognize creativity, perhaps when they need it most. Here is the abstract:
People often reject creative ideas, even when espousing creativity as a desired goal. To explain this paradox, we propose that people can hold a bias against creativity that is not necessarily overt and that is activated when people experience a motivation to reduce uncertainty. In two experiments, we manipulated uncertainty using different methods, including an uncertainty-reduction prime. The results of both experiments demonstrated the existence of a negative bias against creativity (relative to practicality) when participants experienced uncertainty. Furthermore, this bias against creativity interfered with participants’ ability to recognize a creative idea. These results reveal a concealed barrier that creative actors may face as they attempt to gain acceptance for their novel ideas.
Thursday, January 12, 2012
IQ scores are malleable.
Brinch and Galloway do a rather clean demonstration that contests the common notion that education has little effect on IQ. Here is the abstract and one figure from the paper:
Although some scholars maintain that education has little effect on intelligence quotient (IQ) scores, others claim that IQ scores are indeed malleable, primarily through intervention in early childhood. The causal effect of education on IQ at later ages is often difficult to uncover because analyses based on observational data are plagued by problems of reverse causation and self-selection into further education. We exploit a reform that increased compulsory schooling from 7 to 9 y in Norway in the 1960s to estimate the effect of education on IQ. We find that this schooling reform, which primarily affected education in the middle teenage years, had a substantial effect on IQ scores measured at the age of 19 y.
Average IQ and education by time to reform.
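For readers curious about the logic of using a reform to get at causation, here is a stylized sketch with simulated data. It is my own construction, not the authors' analysis (which is more careful and exploits the reform's actual rollout): cohorts exposed to the 9-year requirement get extra schooling for reasons unrelated to ability, so comparing exposed and unexposed cohorts sidesteps the self-selection problem the abstract describes.

```python
# Stylized sketch (simulated data, not the paper's) of the reform-as-natural-
# experiment logic: the IQ gap between exposed and unexposed cohorts, scaled by
# the schooling gap, estimates the effect of an extra year of schooling.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
exposed = rng.integers(0, 2, n)                # 1 if the cohort faced the 9-year rule
ability = rng.normal(size=n)                   # unobserved; raises both schooling and IQ
schooling = 7 + 2 * exposed + (ability > 1.0)  # reform adds ~2 years; able kids add a bit more
true_effect = 3.7                              # hypothetical IQ points per extra year
iq = 100 + true_effect * schooling + 5 * ability + rng.normal(scale=5, size=n)

# A naive regression of IQ on schooling is biased upward, because ability drives both.
naive_slope = np.polyfit(schooling, iq, 1)[0]

# The Wald (reform-based) estimate: IQ gap between cohorts divided by their schooling gap.
wald = (iq[exposed == 1].mean() - iq[exposed == 0].mean()) / (
    schooling[exposed == 1].mean() - schooling[exposed == 0].mean()
)
print(round(naive_slope, 2), round(wald, 2))   # naive overshoots; Wald lands near the true value
```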