Tuesday, October 21, 2008

Botnets

Here is a scary article.

Evolution of Religious Prosociality

Norenzayan and Shariff offer an interesting review article on empirical evidence for religious prosociality. Here is one clip and two figures from the article.
Agreement is emerging that selective pressures over the course of human evolution can explain the wide cross-cultural re-occurrence, historical persistence, and predictable cognitive structure of religious beliefs and behaviors. The tendency to detect agency in nature likely supplied the cognitive template that supports the pervasive belief in supernatural agents. These agents are widely believed to transcend physical, biological, and psychological limitations. However, other important details are subject to cultural variation. Although in many societies supernatural agents are not directly concerned with human morality, in many others, morally concerned agents use their supernatural powers to observe and, in some cases, to punish and reward human social interactions. Examples include the God of Abrahamic religions and Viracocha, the Incan supreme God, but also many morally concerned deities found in traditional societies, such as the adalo, ancestral spirits of the Kwaio Solomon islanders. These beliefs are likely to spread culturally to the extent that they facilitate ingroup cooperation. This could occur by conforming to individual psychology that favors reputation-sensitive prosocial tendencies, as the by-product account holds; by competition among social groups, as the cultural group selection account would suggest; or possibly by some combination of the two. Religious behaviors and rituals, if more costly to cooperating group members than to freeloaders, may have reliably signaled the presence of devotion and, therefore, cooperative intention toward ingroup members, in turn, buffering religious groups against defection from freeloaders and reinforcing cooperative norms. Religious prosociality, thus, may have softened the limitations that kinship-based and (direct or indirect) reciprocity-based altruism place on group size. 
In this way, the cultural spread of religious prosociality may have facilitated the rise of stable, large, cooperative communities of genetically unrelated individuals.


Figure - Implicit activation of God concepts, relative to a neutral prime, increased offers in the one-shot, anonymous Dictator Game. Priming secular concepts indicating moral authority had a similar effect. The results showed not only a quantitative increase in generosity, but also a qualitative shift in social norms. In the control group, the modal response was selfishness: a plurality of players pocketed all $10. In the God group, the mode shifted to fairness: a plurality of players split the money evenly (N = 75). It remains to be seen, however, whether these effects would occur if the recipient was clearly marked as an outgroup member.


Figure - Life expectancy of religious versus secular communes. In an analysis of 200 religious and secular communes in 19th-century America (29), religious communes were, for every year of their life course, about four times as likely to survive as their secular counterparts. This difference remained after statistically controlling for type of commune movement, year founded, and year at risk of dissolution (the last control assesses major historical trends that may independently impact commune dissolution).

Redefining Depression as Mere Sadness

An article by Pies with the title of this post is worth reading. It deals with the criticism that modern psychiatric practice, in collusion with pill pushing pharmaceutical companies, has medicalized “normal sadness” brought on by external circumstances. (Added note: Pies has emailed this link to a more detailed discussion posted on the PsychCentral website.) Here are some clips from the NYTimes article:
In their recent book “The Loss of Sadness” (Oxford, 2007), Allan V. Horwitz and Jerome C. Wakefield assert that for thousands of years, symptoms of sadness that were “with cause” were separated from those that were “without cause.” Only the latter were viewed as mental disorders.

With the advent of modern diagnostic criteria, these authors argue, doctors were directed to ignore the context of the patient’s complaints and focus only on symptoms — poor appetite, insomnia, low energy, hopelessness and so on. The current criteria for major depression, they say, largely fail to distinguish between “abnormal” reactions caused by “internal dysfunction” and “normal sadness” brought on by external circumstances. And they blame vested interests — doctors, researchers, pharmaceutical companies — for fostering this bloated concept of depression.

But while this increasingly popular thesis contains a kernel of truth, it conceals a bushel basket of conceptual and scientific problems.

For one thing, if modern diagnostic criteria were converting mere sadness into clinical depression, we would expect the number of new cases of depression to be skyrocketing compared with rates in a period like the 1950s to the 1970s. But several new studies in the United States and Canada find that the incidence of serious depression has held relatively steady in recent decades.

Second, it may seem easy to determine that someone with depressive complaints is reacting to a loss that touched off the depression. Experienced clinicians know this is rarely the case.

Third, and perhaps most troubling, is the implication that a recent major loss makes it more likely that the person’s depressive symptoms will follow a benign and limited course, and therefore do not need medical treatment. This has never been demonstrated, to my knowledge, in any well-designed studies. And what has been demonstrated, in a study by Dr. Sidney Zisook, is that antidepressants may help patients with major depressive symptoms occurring just after the death of a loved one.

Monday, October 20, 2008

The iPathology of your iBrain

Small and Vorgan offer an engaging article in Scientific American Mind on how daily exposure to television, computers, smart phones, video games, search engines and web browsers is rewiring our brains. A modern generation is rising with brains that are very different from the brains of those of us whose basic brain wiring was laid down in a time when direct social interactions were more the norm. One of the authors (Small) compared brain activity in computer-savvy versus computer-naive 50- to 60-year-olds while they searched Google for accurate information on a topic, subtracting the activity associated with simply reading a text (which was the same in the two groups) to isolate activity specific to the searching function. In the baseline scanning session during searching on Google, the computer-savvy subjects engaged their dorsolateral prefrontal cortex while the Internet-naive subjects showed minimal activation in this region. After just five days of practice, the exact same neural circuitry in the front part of the brain became active in the Internet-naive subjects. Five hours on the Internet, and these participants had already rewired their brains.
Our high-tech revolution has plunged us into a state of “continuous partial attention,” which software executive Linda Stone, who coined the term in 1998, describes as continually staying busy—keeping tabs on everything while never truly focusing on anything. Continuous partial attention differs from multitasking, wherein we have a purpose for each task and we are trying to improve efficiency and productivity. Instead, when our minds partially attend, and do so continuously, we scan for an opportunity for any type of contact at every given moment. We virtually chat as our text messages flow, and we keep tabs on active buddy lists (friends and other screen names in an instant message program); everything, everywhere, is connected through our peripheral attention. Although having all our pals online from moment to moment seems intimate, we risk losing personal touch with our real-life relationships and may experience an artificial sense of intimacy as compared with when we shut down our devices and devote our attention to one individual at a time.

When paying continuous partial attention, people may place their brain in a heightened state of stress. They no longer have time to reflect, contemplate or make thoughtful decisions. Instead they exist in a sense of constant crisis—on alert for a new contact or bit of exciting news or information at any moment. Once people get used to this state, they tend to thrive on the perpetual connectivity. It feeds their ego and sense of self-worth, and it becomes irresistible. Neuroimaging studies suggest that this sense of self-worth may protect the size of the hippocampus—the horseshoe-shaped brain region in the medial (inward-facing) temporal lobe, which allows us to learn and remember new information. Psychiatry professor Sonia J. Lupien and her associates at McGill University studied hippocampal size in healthy younger and older adult volunteers. Measures of self-esteem correlated significantly with hippocampal size, regardless of age. They also found that the more people felt in control of their lives, the larger the hippocampus. But at some point, the sense of control and self-worth we feel when we maintain continuous partial attention tends to break down—our brains were not built to sustain such monitoring for extended periods. Eventually the hours of unrelenting digital connectivity can create a unique type of brain strain. Many people who have been working on the Internet for several hours without a break report making frequent errors in their work. On signing off, they notice feeling spaced out, fatigued, irritable and distracted, as if they are in a “digital fog.” This new form of mental stress, what Small terms “techno-brain burnout,” is threatening to become an epidemic. Under this kind of stress, our brains instinctively signal the adrenal gland to secrete cortisol and adrenaline.
In the short run, these stress hormones boost energy levels and augment memory, but over time they actually impair cognition, lead to depression, and alter the neural circuitry in the hippocampus, amygdala and prefrontal cortex—the brain regions that control mood and thought. Chronic and prolonged techno-brain burnout can even reshape the underlying brain structure.

While the brains of today’s digital natives are wiring up for rapid-fire cyber searches, the neural circuits that control the more traditional learning methods are neglected and gradually diminished. The pathways for human interaction and communication weaken as customary one-on-one people skills atrophy. Our U.C.L.A. research team and other scientists have shown that we can intentionally alter brain wiring and reinvigorate some of these dwindling neural pathways, even while the newly evolved technology circuits bring our brains to extraordinary levels of potential.

Bayesian estimation on the presidential election.

From the "Random Samples" section of the Oct. 17 Science Magazine:
In the winner-take-all world of politics, candidates know that even a modest lead in the polls can spell almost certain victory. Sheldon Jacobson, an operations research specialist at the University of Illinois, Urbana-Champaign, and colleagues, including a group of students, have attempted to quantify that insight for the current United States presidential election, putting their predictions for the Electoral College on a Web site, election08.cs.uiuc.edu. Using a statistical method known as Bayesian estimation, they combined an analysis of results from the 2004 Bush-versus-Kerry contest with current state-by-state polls for Obama versus McCain to produce probabilities for each candidate of carrying each state. They then converted the estimates into a probability distribution for the total number of Electoral College votes a candidate might receive. In Indiana, for example, polls as of 4 October gave McCain a slight 2.5% lead. But given that Bush carried Indiana in 2004 by 20.7%, a Bayesian calculation indicates McCain's chance of winning the state's 11 Electoral College votes at about 87%. Most states are now in the bag for one candidate or the other; only a handful are truly in Bayesian play. Current calculations give McCain no chance of victory. "However," Jacobson cautions, "if the polls move, then so will our forecasts."
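The procedure described above — treat the 2004 result as a prior, update it with the current poll to get a per-state win probability, then convolve the state outcomes into an Electoral College distribution — can be sketched in a few lines. This is a toy illustration with made-up standard deviations, not Jacobson's actual model:

```python
import math

def win_probability(prior_margin, prior_sd, poll_margin, poll_sd):
    """Posterior probability that a candidate's true margin exceeds zero,
    combining a normal prior (here, the 2004 margin) with a normal poll
    likelihood via standard conjugate updating."""
    prior_prec = 1.0 / prior_sd ** 2
    poll_prec = 1.0 / poll_sd ** 2
    post_mean = (prior_prec * prior_margin + poll_prec * poll_margin) / (prior_prec + poll_prec)
    post_sd = math.sqrt(1.0 / (prior_prec + poll_prec))
    # P(margin > 0) under the posterior normal distribution
    return 0.5 * (1.0 + math.erf(post_mean / (post_sd * math.sqrt(2.0))))

def electoral_vote_distribution(states):
    """Convolve independent state outcomes, given as (win probability,
    electoral votes) pairs, into a distribution over total votes."""
    dist = {0: 1.0}
    for p, ev in states:
        new = {}
        for total, prob in dist.items():
            new[total] = new.get(total, 0.0) + prob * (1.0 - p)    # state lost
            new[total + ev] = new.get(total + ev, 0.0) + prob * p  # state won
        dist = new
    return dist

# Indiana-style example: a strong 2004 prior (+20.7) pulls a slim
# current poll lead (+2.5) toward a solid win probability.
p = win_probability(prior_margin=20.7, prior_sd=15.0, poll_margin=2.5, poll_sd=5.0)
```

With these illustrative variances the Indiana example comes out around 0.8, in the same ballpark as the 87% quoted in the article; Jacobson's real calculation evidently weights the prior and the polls somewhat differently.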

Friday, October 17, 2008

The rise of the machines

Richard Dooling writes on how a physicist, a wizard and a serial killer warned us of the current financial crisis.
We are living, we have long been told, in the Information Age. Yet now we are faced with the sickening suspicion that technology has run ahead of us. Man is a fire-stealing animal, and we can’t help building machines and machine intelligences, even if, from time to time, we use them not only to outsmart ourselves but to bring us right up to the doorstep of Doom.

We are still fearful, superstitious and all-too-human creatures. At times, we forget the magnitude of the havoc we can wreak by off-loading our minds onto super-intelligent machines, that is, until they run away from us, like mad sorcerers’ apprentices, and drag us up to the precipice for a look down into the abyss.

As the financial experts all over the world use machines to unwind Gordian knots of financial arrangements so complex that only machines can make — “derive” — and trade them, we have to wonder: Are we living in a bad sci-fi movie? Is the Matrix made of credit default swaps?

When Treasury Secretary Paulson (looking very much like a frightened primate) came to Congress seeking an emergency loan, Senator Jon Tester of Montana, a Democrat still living on his family homestead, asked him: “I’m a dirt farmer. Why do we have one week to determine that $700 billion has to be appropriated or this country’s financial system goes down the pipes?”

“Well, sir,” Mr. Paulson could well have responded, “the computers have demanded it.”

Men and women - different gene expression changes in brain on aging

Berchtold et al. pack quite a lot of interesting information into their abstract describing work on sexually dimorphic gene expression changes during human aging. I fall right in the middle of the age range described in this quote: "Prominent change occurred in the sixth to seventh decades across cortical regions, suggesting that this period is a critical transition point in brain aging, particularly in males." Here is the abstract:
Gene expression profiles were assessed in the hippocampus, entorhinal cortex, superior-frontal gyrus, and postcentral gyrus across the lifespan of 55 cognitively intact individuals aged 20–99 years. Perspectives on global gene changes that are associated with brain aging emerged, revealing two overarching concepts. First, different regions of the forebrain exhibited substantially different gene profile changes with age. For example, comparing equally powered groups, 5,029 probe sets were significantly altered with age in the superior-frontal gyrus, compared with 1,110 in the entorhinal cortex. Prominent change occurred in the sixth to seventh decades across cortical regions, suggesting that this period is a critical transition point in brain aging, particularly in males. Second, clear gender differences in brain aging were evident, suggesting that the brain undergoes sexually dimorphic changes in gene expression not only in development but also in later life. Globally across all brain regions, males showed more gene change than females. Further, Gene Ontology analysis revealed that different categories of genes were predominantly affected in males vs. females. Notably, the male brain was characterized by global decreased catabolic and anabolic capacity with aging, with down-regulated genes heavily enriched in energy production and protein synthesis/transport categories. Increased immune activation was a prominent feature of aging in both sexes, with proportionally greater activation in the female brain. These data open opportunities to explore age-dependent changes in gene expression that set the balance between neurodegeneration and compensatory mechanisms in the brain and suggest that this balance is set differently in males and females, an intriguing idea.

Thursday, October 16, 2008

Embodiment and Art

I just received an email from Andrew Werth regarding the previous post. Andrew is a former software engineer turned artist whose paintings are about perception and embodiment. I thought I would pass on this link to his website, which lets one view paintings in his Embodyment Series.

Arguing for Embodied Consciousness

I thought I would pass along portions of a review in Science by Harold Fromm which has the title of this post, of Edward Slingerland's new book, "What Science Offers the Humanities - Integrating Body and Culture."
...his overall task is to address the befuddled dualism that still dominates most of our intellectual disciplines...Slingerland's central theme is that everything human has evolved in the interests of the materiality of the body. He identifies objectivist realism and postmodern relativity, both insufficiently attentive to the body, as the major epistemologies to be swept away, followed by the dualism of body and soul. For Slingerland, the presiding genii behind such a cleansing are George Lakoff and Mark Johnson, with heavier debts to Johnson [whose terse summary of embodiment in (1) appeared too late for Slingerland to reference]. They view all thought and human behavior as generated by the body and expressed as conceptual metaphors that translate physical categories (such as forward, backward, up, and down) into abstract categories (such as progress, benightedness, divinity, immorality). These body-driven metaphors, Slingerland writes, are a "set of limitations on human cognition, constraining human conceptions of entities, categories, causation, physics, psychology, biology, and other humanly relevant domains."

The supposedly objective world is not "some preexisting object out there in the world, with a set of invariant and observer-independent properties, simply waiting to be found the way one finds a lost sock under the bed." All we can ever see or understand is what our own bodily faculties permit via the current structure of the brain.

In opposition to objective realism, postmodern relativity regards language and culture as constituting the only "real" world possible for us. It posits an endless hall of mirrors with no access to outside--epitomized by Derrida's notorious remark that there is nothing (at least for humans) outside of texts (i.e., culture). This view, which dominated the humanities for several decades, is mercifully beginning to fade as the cognitive sciences have matured and are increasingly promulgated.

Even though the knowing human subject is itself just a thing and not an immaterial locus of reason, the universe it experiences is as real and functional for us as any "thing" could possibly be. We do get some things "right," even if we can never know the noumenal genesis behind our knowledge. And the very concept of noumena (things in themselves independent of any observer) now seems somewhat obsolete, given that the intuition of discrete, self-bounded "things" is as built-in to the human psyche as the Kantian intuitions of space and time, grounding all experience.

Our million billion synapses produce a "person" with the illusion of a self. Slingerland holds that "we are robots designed to be constitutionally incapable of experiencing ourselves and other conspecifics as robots." Our innate and overactive theory of mind (that other people, like ourselves, have "intentions") projects agency onto everything--in the past, even onto stones and trees. The "hard problem" for philosophy of consciousness (to use David Chalmers's phrase) remains: what are thoughts, cogitations, thinkers, qualia? Chalmers's solution, alas, swept away Cartesian dualism only to sneak his own magic spook, conscious experience (for him, on par with mass, charge, and space-time), in through the back door (2, 3).

Slingerland starts with Darwin and eventually follows Daniel Dennett so far as to agree that consciousness can be done full justice through third-person descriptions that require no mysterious, unaccounted-for, nonmaterial, first-person entity as substrate. Thus the famous "Mary," who intellectually knows everything there is to know about color despite having been sequestered for life in a color-free lab, will recognize red the first time she steps outside (4). And Thomas Nagel's famous bats don't know anything about bathood that we can't figure out for ourselves from observation (5). No first-person construct, no locus of consciousness, need be invoked.

The next step, if you want to go so far (the jury is out), is to eliminate consciousness altogether, because there's nothing for it to do that can't be done without it. And with it, you need a spook to keep the show on the road. Choose your insoluble problem: eliminate consciousness altogether as superfluous or explain it (if there's really a you who makes such choices). Slingerland prefers the first option.

His conclusion, which I can hardly do justice to here, is relatively satisfying. He notes that although we don't have great difficulty knowing that Earth revolves around the Sun while feeling that the Sun is rising and setting (Dennett's favorite example of folk psychology), "no cognitively undamaged human being can help acting like and at some level really feeling that he or she is free"--however nonsensical the notion of agencyless free will (i.e., "choices" without a self to make them). Still, once the corrosive acid of Darwinism [to use Dennett's figure from (6)] has resolved the body-mind dualism into body alone, some but not most of us are able "to view human beings simultaneously under two descriptions: as physical systems and as persons."

References

1. M. Johnson, The Meaning of the Body: Aesthetics of Human Understanding (Univ. of Chicago Press, Chicago, 2007).
2. D. J. Chalmers, J. Consciousness Stud. 2, 200 (1995).
3. D. J. Chalmers, The Conscious Mind: In Search of a Fundamental Theory (Oxford Univ. Press, Oxford, 1996).
4. F. Jackson, Philos. Q. 32, 127 (1982).
5. T. Nagel, Philos. Rev. 83, 435 (1974).
6. D. C. Dennett, Darwin's Dangerous Idea: Evolution and the Meaning of Life (Simon and Schuster, New York, 1995).

Your bladder and your brain.

Overactive bladder, usually caused by bladder obstruction in males, apparently affects ~17% of the population, towards whom those awful pharmaceutical television ads are directed. Signals arising from bladder or colonic pathology are processed by the cortex and can potentially be expressed as central symptoms (e.g., hyperarousal, attention disorders, anxiety) that occur alongside the visceral pathology. Rickenbacher et al. now show, in a rat model, that bladder obstruction not only botches up the bladder, but also brain regions involved in its regulation.
Neural circuits that allow for reciprocal communication between the brain and viscera are critical for coordinating behavior with visceral activity. At the same time, these circuits are positioned to convey signals from pathologic events occurring in viscera to the brain, thereby providing a structural basis for comorbid central and peripheral symptoms. In the pons, Barrington's nucleus and the norepinephrine (NE) nucleus, locus coeruleus (LC), are integral to a circuit that links the pelvic viscera with the forebrain and coordinates pelvic visceral activity with arousal and behavior. Here, we demonstrate that a prevalent bladder dysfunction, produced by partial obstruction in rat, has an enduring disruptive impact on cortical activity through this circuit. Within 2 weeks of partial bladder obstruction, the activity of LC neurons was tonically elevated. LC hyperactivity was associated with cortical electroencephalographic activation that was characterized by decreased low-frequency (1–3 Hz) activity and prominent theta oscillations (6–8 Hz) that persisted for 4 weeks. Selective lesion of the LC–NE system significantly attenuated the cortical effects. The findings underscore the potential for significant neurobehavioral consequences of bladder disorders, including hyperarousal, sleep disturbances, and disruption of sensorimotor integration, as a result of central noradrenergic hyperactivity. The results further imply that pharmacological manipulation of central NE function may alleviate central sequelae of these visceral disorders.

Wednesday, October 15, 2008

Applied neuroeconomics - the fear of loss

Bajaj does an interesting writeup of well-known crowd psychological dynamics behind recent "irrational" drops in the stock market. Fear is a more powerful force than greed. Our aversive reaction to losing $1000 is greater than our pleasure at earning the same amount...
fear now seems to rule, with investors often exhibiting a Wall Street version of the fight-or-flight mechanism — selling first, and asking questions later...some analysts are starting to suggest the markets are showing signs of “capitulation” — what happens when even the bullish holdouts, the unflagging optimists, throw up their hands and join the stampede out of the market...To some, signs of capitulation can be read as an indicator that the bottom may be near.
The opposite swing of the cycle is buying at the top of a bubble. I remember during my winter stays in Ft. Lauderdale in 2005 and 2006, every fourth person I chatted with seemed to be a realtor and dinner conversations were dominated by stories about fast profits on flipped condominiums.

Wordwatchers

An article by Wapner points to the work of James Pennebaker and his Wordwatchers blog, which is tracking the candidates' use of words during the 2008 election. The blog makes fascinating reading. Here is just one clip:
Predicting how they will govern. Most language dimensions that we study are probably better markers of how people will lead than who will vote for them. Some dimensions that are relevant include:

Cognitive complexity. A particularly reliable marker of cognitive complexity is the exclusive word dimension. Exclusive words such as but, except, without, and exclude signal that the speaker is making an effort to distinguish what is in a category and what is not. Those who use more exclusive words make better grades in college, are more honest in lab studies, and have a more nuanced understanding of events and people. Through the primaries until now, Obama has consistently been the highest in exclusive word use and McCain the lowest.

Categorical versus fluid thinking. Some people naturally approach problems by assigning them to categories. Categorical thinking involves the use of articles (a, an, the) and concrete nouns. Men, for example, use articles at much higher rates than women. Fluid thinking involves describing actions and changes, often in more abstract ways. A crude measure of fluid thinking is the use of verbs. Women use verbs more than men.

McCain and Obama could not be more different in their use of articles and verbs. McCain uses verbs at an extremely low rate and articles at a fairly high rate. Obama, on the other hand, is remarkably high in his use of verbs and low in his use of articles. These patterns suggest that McCain’s natural way of understanding the world is to first label the problem and find a way to put it into a pre-existing category. Obama is more likely to define the world as ongoing actions or processes.

Personal and socially connected. Individuals who think about and try to connect with others tend to use more personal pronouns (I, we, you, she, they) than those who are more socially detached. Bush was higher than Kerry or Gore. McCain has consistently been much higher than any other candidate in this election cycle. His use of 1st person singular (I, me, my) is particularly high which often signals an openness and honesty. Obama uses personal pronouns at moderate levels - similar to Hillary Clinton and most other primary candidates of both parties.

Restrained versus impulsive. People vary in the degree to which they act quickly or shoot from the hip versus stand back and consider their options. Over the last few years, some have argued that the use of negations (e.g., no, not, never) indicates inhibition or constraint. Low use of negations may be linked to impulsiveness. Bush was low in negations whereas Kerry was quite high. Across the election cycle, Obama has consistently been the highest user of negations - suggesting a restrained approach - whereas McCain has been the lowest - a more impulsive way of dealing with the world.
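Counts like these reduce to tallying the rate of function words in a transcript. A minimal sketch of such a tally, using illustrative word lists rather than the actual dictionaries Pennebaker's group uses:

```python
import re

# Illustrative category lists, not the actual dictionaries used by
# Pennebaker's group.
CATEGORIES = {
    "exclusive": {"but", "except", "without", "exclude"},
    "articles": {"a", "an", "the"},
    "negations": {"no", "not", "never"},
    "first_person": {"i", "me", "my", "mine"},
}

def category_rates(text):
    """Return each category's share of the total word count, in percent."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words) or 1  # avoid dividing by zero on empty input
    return {name: 100.0 * sum(w in wordset for w in words) / total
            for name, wordset in CATEGORIES.items()}

# A speaker heavy on negations and light on articles would score as
# restrained rather than impulsive, and fluid rather than categorical.
rates = category_rates("I would not do that, but never say never.")
```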

Tuesday, October 14, 2008

MRI of moral emotions while causing harm.

Kédia et al. perform brain imaging of subjects as they imagine harm in several different combinations of agent and victim:
The statement "An agent harms a victim" depicts a situation that triggers moral emotions. Depending on whether the agent and the victim are the self or someone else, it can lead to four different moral emotions: self-anger ("I harm myself"), guilt ("I harm someone"), other-anger ("someone harms me"), and compassion ("someone harms someone"). In order to investigate the neural correlates of these emotions, we examined brain activation patterns elicited by variations in the agent (self vs. other) and the victim (self vs. other) of a harmful action. Twenty-nine healthy participants underwent functional magnetic resonance imaging while imagining being in situations in which they or someone else harmed themselves or someone else. Results indicated that the three emotional conditions associated with the involvement of other, either as agent or victim (guilt, other-anger, and compassion conditions), all activated structures that have been previously associated with the Theory of Mind (ToM, the attribution of mental states to others), namely, the dorsal medial prefrontal cortex, the precuneus, and the bilateral temporo-parietal junction. Moreover, the two conditions in which both the self and other were concerned by the harmful action (guilt and other-anger conditions) recruited emotional structures (i.e., the bilateral amygdala, anterior cingulate, and basal ganglia). These results suggest that specific moral emotions induce different neural activity depending on the extent to which they involve the self and other.

This year's Ig Nobel prize in cognitive science goes to a slime mold

A clip from Steve Nadis' write-up in Nature News of this year's Ig Nobel prizes:
Slime moulds exhibit the kind of "contemplative behaviour" that Hamlet is famous for, muses Toshiyuki Nakagaki of Hokkaido University in Japan. ...The slime mold's puzzle-solving ability — Shakespearean or otherwise — is a discovery that is unlikely to change the world, but it won Nakagaki and his colleagues an Ig Nobel Prize for cognitive science last week at the annual event held at Harvard University in Cambridge, Massachusetts. Their research... showed that slime molds looking for food have "the ability to find the minimum-length solution between two points in a labyrinth".

Subsequently, the team has found that molds can find the shortest path between 30–50 points, which is something even supercomputers cannot yet work out. "We can't even check the mold's solution," notes Nakagaki, "but it looks good."
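The remark that supercomputers cannot yet work this out reflects the factorial explosion in route-finding problems: brute force over n points must consider (n-1)!/2 distinct tours, roughly 4 × 10^30 for n = 30. Here is a toy brute-force tour finder, loosely framing the task as a traveling-salesman-style problem — an illustrative stand-in for whatever network problem the molds actually solve:

```python
import math
from itertools import permutations

def brute_force_shortest_tour(points):
    """Exhaustively find the length of the shortest closed tour through
    all points. Only feasible for a handful of points, since the number
    of distinct tours grows as (n-1)!/2."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    start, rest = points[0], list(points[1:])
    best = None
    for order in permutations(rest):
        tour = [start, *order, start]
        length = sum(dist(tour[i], tour[i + 1]) for i in range(len(tour) - 1))
        if best is None or length < best:
            best = length
    return best

# Four corners of a unit square: the shortest closed tour is the perimeter.
print(brute_force_shortest_tour([(0, 0), (0, 1), (1, 1), (1, 0)]))  # 4.0
```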

Monday, October 13, 2008

How context can set our emotional reaction to a smell.

Rolls et al. do a piece of work on how selective attention to affective value can alter how our brains process olfactory stimuli:
How does selective attention to affect influence sensory processing? In a functional magnetic resonance imaging investigation, when subjects were instructed to remember and rate the pleasantness of a jasmin odor, activations were greater in the medial orbitofrontal and pregenual cingulate cortex than when subjects were instructed to remember and rate the intensity of the odor. When the subjects were instructed to remember and rate the intensity, activations were greater in the inferior frontal gyrus. These top–down effects occurred not only during odor delivery but started in a preparation period after the instruction before odor delivery, and continued after termination of the odor in a short-term memory period. Thus, depending on the context in which odors are presented and whether affect is relevant, the brain prepares itself, responds to, and remembers an odor differently. These findings show that when attention is paid to affective value, the brain systems engaged to prepare for, represent, and remember a sensory stimulus are different from those engaged when attention is directed to the physical properties of a stimulus such as its intensity. This differential biasing of brain regions engaged in processing a sensory stimulus depending on whether the cognitive demand is for affect-related versus more sensory-related processing may be an important aspect of cognition and attention. This has many implications for understanding the effects not only of olfactory but also of other sensory stimuli.

Autistic people have the visual acuity of hawks.

While testing 15 men with autism-spectrum disorders using the Freiburg Visual Acuity and Contrast Test, Ashwin et al. made a fascinating observation: the men had, on average, 20:7 vision. This means they can see the same detail on an object 20 meters away that a person with average vision can see at 7 meters. Birds of prey have roughly 20:6 vision. What gives these people with autism their hawk-like vision isn't known.

Early Fall on Twin Valley Road

Picture of my front yard taken Saturday at my home in Town of Middleton, Wisconsin.

Friday, October 10, 2008

We seek mates that resemble our opposite-sex parents.

The research highlights section of Nature points to work by Bereczkei and his colleagues at the University of Pécs in Hungary, who report new evidence linking partner choice to parental appearance. By measuring 14 facial proportions of 312 adults from 52 families, Bereczkei et al. show significant correlations in appearance between young men and their partners' fathers, and between young women and their partners' mothers. This supports the theory that children are imprinted with their opposite-sex parent's face. The abstract from Proc. Roy. Soc. B:
Former studies have suggested that imprinting-like processes influence the shaping of human mate preferences. In this study, we provide more direct evidence by assessing facial resemblance between subjects' partner and subjects' parents. Fourteen facial proportions were measured on 312 adults belonging to 52 families, and the correlations between family members were compared with those of pairs randomly selected from the population. Spouses proved to be assortatively mated in the majority of measured facial proportions. Significant correlations have been found between the young men and their partner's father (but not their partner's mother), especially on facial proportions belonging to the central area of the face. Women also showed resemblance to their partner's mother (but not to their partner's father) in the facial characteristics of their lower face. Replicating our previous studies, facial photographs of participants were also matched by independent judges who ascribed higher resemblance between partners, and subjects and their partners' opposite-sex parents, compared with controls. Our results support the sexual imprinting hypothesis which states that children shape a mental template of their opposite-sex parents and search for a partner who resembles that perceptual schema. The fact that only the facial metrics of opposite-sex parents showed resemblance to the partner's face tends to rule out the role of familiarity in shaping mating preferences. Our findings also reject several other rival hypotheses. The adaptive value of imprinting-related human mating is discussed, and a hypothesis is made of why different facial areas are involved in males' and females' search for resemblance.

Light exciting our eyes, an intimate picture.

In a former life, I spent 30 years running a laboratory that studied how light is converted into a nerve signal in our eyes. Much of our work centered on the visual pigment rhodopsin, which, once excited by light, starts a signaling cascade by binding to the alpha subunit of a G-protein. I am in awe of the new technologies that have, since my time, revealed many of the finer details of this process. Thus I can't resist showing this beautiful graphic from a recent review by Schwartz and Hubbell, describing work by Scheerer et al.


a, Rhodopsin, shown here in its inactivated conformation, is a light-sensing receptor found in cell membranes. It consists of a protein (opsin, green) and a ligand (retinal, pink, also shown in its inactivated conformation). When activated by light, rhodopsin binds to part of an adjacent G protein (binding region in red), triggering a cascade of biological responses. The protein plug (blue) is part of the extracellular domain of opsin, and immobilizes the extracellular transmembrane segments of the receptor. b, Scheerer et al. have determined the activated structure of opsin in complex with the receptor-binding peptide fragment of the G protein (the Galpha peptide). The most notable difference when compared with the inactivated receptor is that transmembrane helix 6 (TM-VI) has moved substantially outward (indicated by the red arrow), thereby creating the binding pocket for the G-protein peptide.

Thursday, October 09, 2008

Models to compute and predict our current economic chaos?

Here is an interesting article on the resistance of economic theorists to modeling approaches that have proven useful in predicting dramatic, sudden transitions. Such models have been applied successfully to predicting heart attacks, epileptic seizures, stock-market bubbles, the eutrophication of lakes, etc. They are based in part on the observation that the variance of an apparent steady state begins to change in predictable ways in advance of large, rapid transitions. Simulating the workings of a system of linked agents from the bottom up can reveal how trouble in one agent gives rise to instabilities or phase transitions across the whole system.
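The rising-variance warning signal can be illustrated with a toy simulation. Below, a simple noisy process loses its "recovery rate" as it approaches a transition (so-called critical slowing down), and the variance measured in successive windows grows. All parameters here are illustrative choices of mine, not taken from the article:

```python
import random
import statistics

def simulate(n_steps=4000, noise=0.1, seed=42):
    """AR(1) process whose pull back to equilibrium weakens over time,
    mimicking critical slowing down before a sudden transition."""
    rng = random.Random(seed)
    x, series = 0.0, []
    for t in range(n_steps):
        a = 0.5 + 0.49 * t / n_steps   # autocorrelation creeps toward 1
        x = a * x + rng.gauss(0.0, noise)
        series.append(x)
    return series

def window_variances(series, size=500):
    """Variance of the signal in consecutive non-overlapping windows."""
    return [statistics.pvariance(series[i:i + size])
            for i in range(0, len(series) - size + 1, size)]

series = simulate()
vars_ = window_variances(series)
print(vars_)  # variance grows as the system nears the transition
```

For a stationary AR(1) process the variance is noise²/(1 − a²), so as a approaches 1 the variance blows up; watching that quantity climb in a sliding window is one of the early-warning indicators such models exploit.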