Friday, October 24, 2014

How to chill on aggression...get blood glucose levels up.

Here is an interesting and quirky piece by Bushman et al. that has been languishing in my queue of potential posts for quite a while:
People are often the most aggressive against the people to whom they are closest—intimate partners. Intimate partner violence might be partly a result of poor self-control. Self-control of aggressive impulses requires energy, and much of this energy is provided by glucose derived from the food we eat. We measured glucose levels in 107 married couples over 21 days. To measure aggressive impulses, participants stuck 0–51 pins into a voodoo doll that represented their spouse each night, depending on how angry they were with their spouse. To measure aggression, participants blasted their spouse with loud noise through headphones. Participants who had lower glucose levels stuck more pins into the voodoo doll and blasted their spouse with louder and longer noise blasts.
This is in line with a body of work (reviewed by Gailliot) suggesting that self-control requires, and can deplete, a limited energy source: glucose. A relationship between glucose utilization and aggression may be universal; it is also observed in honey bees and fruit flies (Li-Byarlay et al.).
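The study's design, nightly glucose readings paired with 0–51 pin counts over 21 days, lends itself to a simple within-person correlation. Here is a minimal sketch with invented data; the effect size, the glucose distribution, and every number below are assumptions for illustration, not the authors' dataset or analysis:

```python
import random
from statistics import mean

def pearson(xs, ys):
    """Plain Pearson correlation, stdlib only."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

random.seed(1)
NIGHTS, N_SUBJECTS = 21, 107  # matching the study's 21 days and 107 couples

def simulate_nights():
    # Invented relationship: lower evening glucose -> more pins (clamped to 0-51)
    glucose = [random.gauss(90, 15) for _ in range(NIGHTS)]
    pins = [max(0, min(51, round(40 - 0.3 * g + random.gauss(0, 5))))
            for g in glucose]
    return glucose, pins

# Average the within-person correlation across simulated participants
rs = []
for _ in range(N_SUBJECTS):
    g, p = simulate_nights()
    rs.append(pearson(g, p))

print(f"mean within-person r = {mean(rs):.2f}")
```

Because the simulated pin count falls as glucose rises, the mean within-person correlation comes out negative, mirroring the direction of the reported finding.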

Happiness and well-being sources.

I’m passing on three items from my queue of potential posts that touch on well-being and happiness. First, of course, there’s an App for that! Kit Eaton reviews three of these: Happify, iMoodJournal, and Smiling Mind. Basaraba points to a number of sources on the health benefits of gratitude, as does Dacher Keltner’s Greater Good site. Finally, Reynolds points to studies just published in the journal Cell that delve into the biochemical details of how exercise may protect us against depression.

Thursday, October 23, 2014

Speaking out in a group correlates with gender.

The effectiveness of group decision-making depends on whether the best informed members actually contribute to the discussion. Coffman uses a laboratory experiment to examine factors that influence an individual's propensity to contribute, finding that in general undergraduate women contribute less than men, but show the least reluctance in female-stereotyped subject areas such as art and the most in male-stereotyped subjects such as sports:
We use a lab experiment to explore the factors that predict an individual's decision to contribute her idea to a group. We find that contribution decisions depend upon the interaction of gender and the gender stereotype associated with the decision-making domain: conditional on measured ability, individuals are less willing to contribute ideas in areas that are stereotypically outside of their gender's domain. Importantly, these decisions are largely driven by self-assessments, rather than fear of discrimination. Individuals are less confident in gender incongruent areas and are thus less willing to contribute their ideas. Because even very knowledgeable group members under-contribute in gender incongruent categories, group performance suffers and, ex post, groups have difficulty recognizing who their most talented members are. Our results show that even in an environment where other group members show no bias, women in male-typed areas and men in female-typed areas may be less influential. An intervention that provides feedback about a woman's (man's) strength in a male-typed (female-typed) area does not significantly increase the probability that she contributes her ideas to the group. A back-of-the-envelope calculation reveals that a “lean in” style policy that increases contribution by women would significantly improve group performance in male-typed domains.
And, a related bit of work: although females outnumber males in biology, a study by Eddy et al. of 23 different introductory biology classrooms reveals systematic gender disparities in student performance on exams and in student participation when instructors ask students to volunteer answers to instructor-posed questions.

Wednesday, October 22, 2014

Scientific evidence does not support anti-aging claims of the brain game industry.

MindBlog has done numerous posts on brain training games as possible antidotes to cognitive decline in the elderly. (I've played with both Merzenich's BrainHQ exercises and Lumosity exercises.) The Stanford Center on Longevity and the Max Planck Institute for Human Development have together just issued a joint statement skeptical about the effectiveness of "brain game" products such as these (the full statement, with references, is here). It is signed by 69 prominent psychologists and cognitive scientists from around the world, even including Adam Gazzaley at UCSF, who has a financial interest in the brain gaming industry (and whose P.T. Barnum approach to publicizing his work I have criticized; see also a recent NY Times piece on Gazzaley, "Can Video Games Fend Off Mental Decline?"). Daniel Schacter at Harvard is among the other prominent signatories.

I pass on their closing recommendations and summary:
Much more research needs to be done before we understand whether and what types of challenges and engagements benefit cognitive functioning in everyday life. In the absence of clear evidence, the recommendation of the group, based largely on correlational findings, is that individuals lead physically active, intellectually challenging, and socially engaged lives, in ways that work for them. Before investing time and money on brain games, consider what economists call opportunity costs: If an hour spent doing solo software drills is an hour not spent hiking, learning Italian, making a new recipe, or playing with your grandchildren, it may not be worth it. But if it replaces time spent in a sedentary state, like watching television, the choice may make more sense for you.
Physical exercise is a moderately effective way to improve general health, including brain fitness. Scientists have found that regular aerobic exercise increases blood flow to the brain, and helps to support formation of new neural and vascular connections. Physical exercise has been shown to improve attention, reasoning, and components of memory. All said, one can expect small but noticeable gains in cognitive performance, or attenuation of loss, from taking up aerobic exercise training.
A single study, conducted by researchers with financial interests in the product, or one quote from a scientist advocating the product, is not enough to assume that a game has been rigorously examined. Findings need to be replicated at multiple sites, based on studies conducted by independent researchers who are funded by independent sources. Moreover, participants of training programs should show evidence of significant advantage over a comparison group that does not receive the treatment but is otherwise treated exactly the same as the trained group.
No studies have demonstrated that playing brain games cures or prevents Alzheimer’s disease or other forms of dementia.
Do not expect that cognitively challenging activities will work like one-shot treatments or vaccines; there is little evidence that you can do something once (or even for a concentrated period) and be inoculated against the effects of aging in an enduring way. In all likelihood, gains won’t last long after you stop the challenge.
In summary: We object to the claim that brain games offer consumers a scientifically grounded avenue to reduce or reverse cognitive decline when there is no compelling scientific evidence to date that they do. The promise of a magic bullet detracts from the best evidence to date, which is that cognitive health in old age reflects the long-term effects of healthy, engaged lifestyles. In the judgment of the signatories, exaggerated and misleading claims exploit the anxiety of older adults about impending cognitive decline. We encourage continued careful research and validation in this field.

Humans and robots.

A recent issue of Science magazine has a special section of articles on the social life of robots. The introduction by Stone and Lavine provides links to the abstracts of the articles (full text is not open access). I pass on their introduction:
Autonomous machines have gripped our imagination ever since the first robot flickered on the silver screen, Maria in the 1927 film Metropolis. Most of the robots we know today—unglamorous devices like robotic welders on car assembly lines and the Roomba vacuum cleaner—fall short of those in science fiction. But our relationship with robots is about to become far more intimate. Would you be comfortable with a robot butler, or a self-driving car? How about a robo-scientist toiling away next to you at the bench, not only pipetting but also formulating hypotheses and designing experiments?
As robots become more sophisticated, psychological paradoxes are coming into sharper relief. Robots that look human strike many of us as downright creepy (as this week's cover attests), while robots that act human—when they are programmed, for example, to cheat at cards—somehow put us at ease. And no matter how uncannily lifelike some of today's robots may seem, the resemblance is skin-deep. A stubborn challenge has been endowing robots with not only the capability to sense their environment, but also the wits to make sense of it. Robots will get there eventually, and when that happens we'll be confronted with a new array of ethical and moral questions. Questions like: Should robots be accorded rights as sentient beings? The rise of the machines will be anything but predictable.
And here is the abstract to one of the articles, "In our own image" by Dennis Normile:
For 2 decades, Hiroshi Ishiguro's teams have deployed various robots—some with vaguely human forms, others crafted to look indistinguishable from people—as customers in cafes, clerks in stores, guides in malls and museums, teachers in schools, and partners in recreational activities. The roboticists, who use robots both operating autonomously and under human remote control, have come to some startling conclusions. In some situations, people prefer to speak with an android instead of another person, and they feel that robots should be held accountable for mistakes and treated fairly. And humans can quickly form deep emotional bonds with robots. Some find the implications of the work worrisome. But with a wave of more sophisticated social robots about to hit the mass market, the debate is no longer academic.

Tuesday, October 21, 2014

Pianists’ brains are different from everyone else’s...

Because I'm a pianist who started lessons at age 6 and now usually give two concerts a year, I'm always fascinated by articles like this one from a music site (pointed out to me by my artistic daughter-in-law, who does improvisation theater) that points to several interesting studies on brain changes caused by high-level music training, most pronounced if training begins before age 7. Because the hands of pianists must negotiate 88 keys with ten fingers, sometimes playing 10 notes at once, the normal asymmetry of the brain's hand motor area associated with being right- or left-handed is reduced. (Usually the brain's central sulcus, which contains the hand motor area, is deeper on the dominant side.) Watson summarizes a number of other differences in hand and motor coordination. Also, high-level music training enhances the ability to integrate sensory information from hearing, touch, and sight. Brain circuits involved in musical improvisation are shaped by systematic training, leading to less reliance on working memory and more extensive connectivity within the brain. Finally, when experienced pianists play and improvise, they literally switch off the part of the brain associated with providing stereotypical responses, ensuring that they play with their own unique voice and not the voices of others.

Monday, October 20, 2014

Vitamin D prevents cognitive decline

...in aging rats, to be sure. Work like the following piece from my colleagues at the University of Wisconsin reinforces my determination to continue my vitamin D supplements (over the objection of my internist). At the risk of TMI (too much information), I can also report that I sense the association of my vitamin D (25-hydroxyvitamin D) levels with androgen (testosterone) levels that has been reported. Latimer et al.:
Significance
Higher blood levels of vitamin D are associated with better health outcomes. Vitamin D deficiency, however, is common among the elderly. Despite targets in the brain, little is known about how vitamin D affects cognitive function. In aging rodents, we modeled human serum vitamin D levels ranging from deficient to sufficient and tested whether increasing dietary vitamin D could maintain or improve cognitive function. Treatment was initiated at middle age, when markers of aging emerge, and maintained for ∼6 mo. Compared with low- or normal-dietary vitamin D groups, only aging rats on higher vitamin D could perform a complex memory task and had blood levels considered in the optimal range. These results suggest that vitamin D may improve the likelihood of healthy cognitive aging.
Abstract
Vitamin D is an important calcium-regulating hormone with diverse functions in numerous tissues, including the brain. Increasing evidence suggests that vitamin D may play a role in maintaining cognitive function and that vitamin D deficiency may accelerate age-related cognitive decline. Using aging rodents, we attempted to model the range of human serum vitamin D levels, from deficient to sufficient, to test whether vitamin D could preserve or improve cognitive function with aging. For 5–6 mo, middle-aged F344 rats were fed diets containing low, medium (typical amount), or high (100, 1,000, or 10,000 international units/kg diet, respectively) vitamin D3, and hippocampal-dependent learning and memory were then tested in the Morris water maze. Rats on high vitamin D achieved the highest blood levels (in the sufficient range) and significantly outperformed low and medium groups on maze reversal, a particularly challenging task that detects more subtle changes in memory. In addition to calcium-related processes, hippocampal gene expression microarrays identified pathways pertaining to synaptic transmission, cell communication, and G protein function as being up-regulated with high vitamin D. Basal synaptic transmission also was enhanced, corroborating observed effects on gene expression and learning and memory. Our studies demonstrate a causal relationship between vitamin D status and cognitive function, and they suggest that vitamin D-mediated changes in hippocampal gene expression may improve the likelihood of successful brain aging.

Friday, October 17, 2014

How culture shapes spatial conceptions of time - Is your past in front of, or behind you?

An interesting perspective from de la Fuente et al. on spatial conceptions of time. Some clips from their article:
Across many of the world’s languages, the future is “ahead” of the speaker, and the past is “behind.” In English, people can look “forward” to their retirement or look “back” on their childhood....yet some languages exhibit the opposite space-time mapping. In the Andean language Aymara, for example, metaphors place the past in front (e.g., nayra mara, tr. “front year,” means last year) and the future behind (e.g., qhipa marana, tr. “back year,” means next year)...In the research reported here, we investigated this question by exploring a surprising discovery about temporal language and thought in speakers of Darija, a Moroccan dialect of modern Arabic. Front-back time metaphors in Arabic are similar to metaphors in English and other future-in-front languages.
We compared how native Spanish and Darija speakers gesture when talking about past and future events. Whereas Spaniards showed a weak tendency to gesture according to the future-in-front mapping, Moroccans showed a strong tendency to gesture according to the past-in-front mapping—despite using future-in-front metaphors in speech. On the basis of their co-speech gestures, it appears that Darija speakers think about time like the Aymara do, even though they talk about it like speakers of English, Spanish, and other familiar future-in-front languages.
Since existing theories cannot explain the pattern of space-time mappings observed across cultures, we proposed an alternative explanation, the temporal-focus hypothesis: People’s implicit associations of “past” and “future” with “front” and “back” should depend on their temporal focus. That is, in people’s mental models, they should place in front of them whichever pole of the space-time continuum they tend to “focus on” metaphorically—locating it where they could focus on it literally with their eyes if events in time were visible objects. Consistent with the temporal-focus hypothesis, our results showed that, compared with Moroccans, Spaniards tend to be future focused, attributing more importance to social change, economic and technological progress, and modernization. By contrast, compared with Spaniards, Moroccans tend to be past focused, attributing more importance to older generations and respect for traditional practices.

Thursday, October 16, 2014

Why are we fooled by the ventriloquist?

As we watch the movements of a dummy's mouth while it sits in a ventriloquist's lap, we perceive the speech as coming from the dummy's mouth rather than from its master's voice. Berger and Ehrsson show that this illusory translocation is associated with increased activity in the left superior temporal sulcus (L. STS). This is the region that has been shown to be central in determining the spatial coordinates of our experienced self. (It is also associated, for example, with the out-of-body illusion.)
It is well understood that the brain integrates information that is provided to our different senses to generate a coherent multisensory percept of the world around us, but how does the brain handle concurrent sensory information from our mind and the external world? Recent behavioral experiments have found that mental imagery—the internal representation of sensory stimuli in one's mind—can also lead to integrated multisensory perception; however, the neural mechanisms of this process have not yet been explored. Here, using functional magnetic resonance imaging and an adapted version of a well known multisensory illusion (i.e., the ventriloquist illusion), we investigated the neural basis of mental imagery-induced multisensory perception in humans. We found that simultaneous visual mental imagery and auditory stimulation led to an illusory translocation of auditory stimuli and was associated with increased activity in the left superior temporal sulcus (L. STS), a key site for the integration of real audiovisual stimuli. This imagery-induced ventriloquist illusion was also associated with increased effective connectivity between the L. STS and the auditory cortex. These findings suggest an important role of the temporal association cortex in integrating imagined visual stimuli with real auditory stimuli, and further suggest that connectivity between the STS and auditory cortex plays a modulatory role in spatially localizing auditory stimuli in the presence of imagined visual stimuli.

Wednesday, October 15, 2014

Our microbial aura - in the house and in the garden

I've always been fascinated by the fact that most of the cells in "our" bodies are not our own; they are microbial symbionts. I pass on here two more takes on this. First, from Lax et al., on microbial interaction between humans and the indoor environment, showing that signature microbes follow us from house to house:
The bacteria that colonize humans and our built environments have the potential to influence our health. Microbial communities associated with seven families and their homes over 6 weeks were assessed, including three families that moved their home. Microbial communities differed substantially among homes, and the home microbiome was largely sourced from humans. The microbiota in each home were identifiable by family. Network analysis identified humans as the primary bacterial vector, and a Bayesian method significantly matched individuals to their dwellings. Draft genomes of potential human pathogens observed on a kitchen counter could be matched to the hands of occupants. After a house move, the microbial community in the new house rapidly converged on the microbial community of the occupants’ former house, suggesting rapid colonization by the family’s microbiota.
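The idea of matching individuals to their dwellings by microbial signature can be caricatured with a much simpler scheme than the study's Bayesian method: represent each person and each home as a set of observed taxa and assign each person to the most similar home. Everything below (the taxa labels, the households, the use of plain Jaccard similarity) is an invented illustration, not the authors' pipeline:

```python
# Toy caricature of matching occupants to dwellings by microbial overlap.
# Taxa sets are invented; the study itself used a far richer Bayesian
# source-tracking approach on sequencing data.
homes = {
    "house_a": {"t1", "t2", "t3", "t7"},
    "house_b": {"t4", "t5", "t6", "t8"},
}
people = {
    "alice": {"t1", "t2", "t7", "t9"},
    "bob":   {"t4", "t5", "t8"},
}

def jaccard(a, b):
    """Similarity of two taxa sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

# Assign each person to the most microbially similar home
matches = {person: max(homes, key=lambda h: jaccard(taxa, homes[h]))
           for person, taxa in people.items()}
print(matches)
```

With these toy sets, each person lands in the home whose taxa they share most, which is the intuition behind the study's result that occupants' microbiota identify their dwelling.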
And, Anna North points to the beneficial effects of exposure to soil organisms. Some soil bacteria have the same antidepressant effect on mice as serotonin reuptake inhibitors like Prozac. Clips from Lowry et al.'s abstract:
We have found that peripheral immune activation with antigens derived from the nonpathogenic, saprophytic bacterium, Mycobacterium vaccae, activated a specific subset of serotonergic neurons in the interfascicular part of the dorsal raphe nucleus (DRI) of mice...The effects of immune activation were associated with increases in serotonin metabolism within the ventromedial prefrontal cortex, consistent with an effect of immune activation on mesolimbocortical serotonergic systems. The effects of M. vaccae administration on serotonergic systems were temporally associated with reductions in immobility in the forced swim test, consistent with the hypothesis that the stimulation of mesolimbocortical serotonergic systems by peripheral immune activation alters stress-related emotional behavior.

Tuesday, October 14, 2014

Boredom = Stress.... and misbehavior

From Merrifield and Danckert, a crisp piece of work (using the usual covey of college undergraduates as subjects) demonstrating that boredom increases stress indicators:
Research on the experience and expression of boredom is underdeveloped. The purpose of the present study was to explore the psychophysiological signature of the subjective experience of boredom. Healthy undergraduates (n = 72) viewed previously validated and standardized video clips to induce boredom, sadness, and a neutral affective state, while their heart rate (HR), skin conductance levels (SCL), and cortisol levels were measured. Boredom yielded dynamic psychophysiological responses that differed from the other emotional states. Of particular interest, the physiological signature of boredom relative to sadness was characterized by rising HR, decreased SCL, and increased cortisol levels. This pattern of results suggests that boredom may be associated with both increased arousal and difficulties with sustained attention. These findings may help to resolve divergent conceptualizations of boredom in the extant literature and, ultimately, to enhance our understanding and treatment of clinical syndromes in which self-reported boredom is a prominent symptom.
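Within-subjects designs like this one are typically analyzed with paired comparisons of each physiological measure across conditions. A minimal sketch of a paired t statistic for heart rate, using invented numbers rather than the study's data:

```python
from statistics import mean, stdev

# Hypothetical paired heart-rate readings (bpm) for the same subjects in the
# boredom and sadness conditions -- invented, not Merrifield & Danckert's data.
hr_boredom = [72, 75, 80, 78, 74, 77, 79, 73]
hr_sadness = [70, 72, 76, 75, 71, 74, 75, 70]

# Per-subject differences; positive = higher HR when bored
diffs = [b - s for b, s in zip(hr_boredom, hr_sadness)]
n = len(diffs)

# Paired t statistic: mean difference over its standard error
t = mean(diffs) / (stdev(diffs) / n ** 0.5)
print(f"mean HR difference = {mean(diffs):.1f} bpm, t({n - 1}) = {t:.2f}")
```

The same computation would be repeated for skin conductance and cortisol; the study's signature of boredom was the pattern across all three (HR up, SCL down, cortisol up, relative to sadness).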
And, Bruursema et al. note a correlation between boredom and counterproductive work behavior:
In this study, the relationships among boredom proneness, job boredom, and counterproductive work behaviour (CWB) were examined. Boredom proneness consists of several factors, which include external stimulation and internal stimulation. Given the strong relationships between both the external stimulation factor of boredom proneness (BP-ext) and anger as well as the strong relationship between trait anger and CWB, we hypothesized that examining BP-ext would help us to better understand why employees commit CWB. Five types of CWB have previously been described: abuse against others, production deviance, sabotage, withdrawal and theft. To those we added a sixth, horseplay. Using responses received from 211 participants who were recruited by email from throughout North America (112 of them matched with co-workers), we found support for our central premise. Indeed, both BP-ext and job boredom showed significant relationships with various types of CWB. The boredom proneness factor also moderated the relationship between job boredom and some types of CWB, suggesting that a better understanding of boredom is imperative for designing interventions to prevent CWB.

Monday, October 13, 2014

Improvement of performance by transcranial stimulation depends on existing degree of expertise.

Furuya et al. make the interesting observation that the fine motor hand performance of musically untrained people is improved by transcranial direct current stimulation (tDCS) over the primary motor cortices, but the performance of skilled pianists can be degraded.
The roles of the motor cortex in the acquisition and performance of skilled finger movements have been extensively investigated over decades. Yet it is still not known whether these roles of motor cortex are expertise-dependent. The present study addresses this issue by comparing the effects of noninvasive transcranial direct current stimulation (tDCS) on the fine control of sequential finger movements in highly trained pianists and musically untrained individuals. Thirteen pianists and 13 untrained controls performed timed-sequence finger movements with each of the right and left hands before and after receiving bilateral tDCS over the primary motor cortices. The results demonstrate an improvement of fine motor control in both hands in musically untrained controls, but deterioration in pianists following anodal tDCS over the contralateral cortex and cathodal tDCS over the ipsilateral cortex compared with the sham stimulation. However, this change in motor performance was not evident after stimulating with the opposite montage. These findings support the notion that changes in dexterous finger movements induced by bihemispheric tDCS are expertise-dependent.
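The core contrast is a simple one: the pre-to-post change in performance has opposite signs in the two groups. A sketch of that comparison with invented timing-error scores (lower = better); the numbers are placeholders, not Furuya et al.'s measurements:

```python
# Hypothetical pre/post timing-error scores illustrating an
# expertise-dependent tDCS effect; all values are invented.
untrained = {"pre": [30, 32, 28, 31], "post": [25, 27, 24, 26]}
pianists  = {"pre": [12, 11, 13, 12], "post": [15, 14, 16, 15]}

def mean_change(group):
    # post - pre per subject: negative = improvement, positive = deterioration
    changes = [b - a for a, b in zip(group["pre"], group["post"])]
    return sum(changes) / len(changes)

print(f"untrained change: {mean_change(untrained):+.2f}")
print(f"pianists change:  {mean_change(pianists):+.2f}")
```

A negative change for the untrained group and a positive one for the pianists reproduces the qualitative pattern the abstract describes: the same stimulation montage helps novices and hurts experts.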

Friday, October 10, 2014

Our sleep cycle started 700 million years ago in a worm?

Zimmer points to a nice piece of work by Tosches et al. suggesting that the melatonin rhythm that regulates our sleep may have arisen ~700 million years ago in a marine worm larva, regulating swarming up to the surface of the sea at twilight to feed and then sinking back to lower depths during daylight to avoid sunlight and predation. A clip from the Zimmer review:
The new study offers an intriguing idea for how our vertebrate ancestors adapted the melatonin genes as they evolved a complex brain.
Originally, the day-night cycle was run by all-purpose cells that could catch light and make melatonin. But then the work was spread among specialized cells. The eyes now took care of capturing light, for example, while the pineal gland made melatonin.
The new study may also help explain how sleep cuts us off from the world. When we’re awake, signals from our eyes and other senses pass through the thalamus, a gateway in the brain. Melatonin shuts the thalamus down by causing its neurons to produce a regular rhythm of bursts. “They’re busy doing their own thing, so they can’t relay information to the rest of the brain,” Dr. Tosches said.
It may be no coincidence that in worms, melatonin also produces electrical rhythms that jam the normal signals of the day. We may sink into sleep the way our ancestors sank into the depths of the ocean.

Thursday, October 09, 2014

Inflammatory signaling is bad for the aging brain.

Baruch et al. do some interesting work suggesting that preventing antiviral-like responses may protect aging brain function. They find that the choroid plexus of older mice produces more RNA for the inflammatory cytokine interferon-I than that of younger mice. This increase is also seen in human post-mortem brain samples. (The choroid plexus produces the cerebrospinal fluid that bathes the brain, is exposed to both blood and cerebrospinal fluid, and constitutes the blood–cerebrospinal fluid barrier.) Blocking interferon signaling in the aging mouse brain partially restored cognitive function. Here is their abstract:
Aging-associated cognitive decline is affected by factors produced inside and outside the brain. By using multiorgan genome-wide analysis of aged mice, we found that the choroid plexus, an interface between the brain and the circulation, shows a type I interferon (IFN-I)–dependent gene expression profile that was also found in aged human brains. In aged mice, this response was induced by brain-derived signals, present in the cerebrospinal fluid. Blocking IFN-I signaling within the aged brain partially restored cognitive function and hippocampal neurogenesis and reestablished IFN-II–dependent choroid plexus activity, which is lost in aging. Our data identify a chronic aging-induced IFN-I signature, often associated with antiviral response, at the brain’s choroid plexus and demonstrate its negative influence on brain function, thereby suggesting a target for ameliorating cognitive decline in aging.

Wednesday, October 08, 2014

Why our childhood takes so long - the metabolic costs of brain development

Kuzawa et al. do a nice job of explaining how the energy requirements of our brain growth slow down our body growth in childhood:
The metabolic costs of brain development are thought to explain the evolution of humans’ exceptionally slow and protracted childhood growth; however, the costs of the human brain during development are unknown. We used existing PET and MRI data to calculate brain glucose use from birth to adulthood. We find that the brain’s metabolic requirements peak in childhood, when it uses glucose at a rate equivalent to 66% of the body’s resting metabolism and 43% of the body’s daily energy requirement, and that brain glucose demand relates inversely to body growth from infancy to puberty. Our findings support the hypothesis that the unusually high costs of human brain development require a compensatory slowing of childhood body growth.
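The internal consistency of the quoted percentages is easy to check with back-of-envelope arithmetic. The two energy values below are rough placeholder magnitudes chosen for illustration, not Kuzawa et al.'s measurements:

```python
# Back-of-envelope check of the quoted percentages, using rough
# placeholder values (not the paper's data).
resting_metabolism_kcal = 1000   # assumed childhood resting metabolic rate, kcal/day
daily_requirement_kcal = 1520    # assumed total daily energy requirement, kcal/day

brain_kcal = 0.66 * resting_metabolism_kcal            # brain = 66% of resting metabolism
share_of_daily = brain_kcal / daily_requirement_kcal   # fraction of the daily budget

print(f"brain uses ~{brain_kcal:.0f} kcal/day, "
      f"{share_of_daily:.0%} of the daily requirement")
```

With these magnitudes, 66% of resting metabolism works out to roughly 43% of the daily requirement, so the two figures in the abstract are mutually consistent whenever the daily requirement is about 1.5 times the resting rate.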

Tuesday, October 07, 2014

Is it love or lust? Look at eye gaze.

Bolmont et al. ask:
When you are on a date with a person you barely know, how do you evaluate that person’s goals and intentions regarding a long-term relationship with you? Love is not a prerequisite for sexual desire, and sexual desire does not necessarily lead to love. Love and lust can exist by themselves or in combination, and to any degree.
Using the usual collection of heterosexual college students as subjects, the authors tracked eye movements as subjects viewed a series of photographs of persons they had never met before. In a separate session the subjects were asked whether the same photographs elicited feelings (yes or no) of sexual desire or romantic love. The results of a lot of fancy eye tracking analysis?
...subjects were more likely to fixate on the face when making decisions about romantic love than when making decisions about sexual desire, and the same subjects were more likely to look at the body when making decisions about sexual desire than when making decisions about romantic love
Duh... Anyway, here is their abstract, which inexplicably doesn't include the above bottom line:
"Reading other people’s eyes is a valuable skill during interpersonal interaction. Although a number of studies have investigated visual patterns in relation to the perceiver’s interest, intentions, and goals, little is known about eye gaze when it comes to differentiating intentions to love from intentions to lust (sexual desire). To address this question, we conducted two experiments: one testing whether the visual pattern related to the perception of love differs from that related to lust and one testing whether the visual pattern related to the expression of love differs from that related to lust. Our results show that a person’s eye gaze shifts as a function of his or her goal (love vs. lust) when looking at a visual stimulus. Such identification of distinct visual patterns for love and lust could have theoretical and clinical importance in couples therapy when these two phenomena are difficult to disentangle from one another on the basis of patients’ self-reports."

Monday, October 06, 2014

Having 'no self' as self transcendence, or spirituality.

I've finally read another item in my queue of potential posts, an interview by Gary Gutting of Sam Harris, whose most recent book is titled "Waking Up: A Guide to Spirituality Without Religion." I recommend the article to philosophically inclined MindBlog readers. Harris takes deities and religion to be nonsense, but argues that spirituality (probably the foundation of many religions) is a noble pursuit. In the following clip, Harris contrasts the claims about mind and cosmos made by science and religion:
There is a big difference between making claims about the mind and making claims about the cosmos. Every religion (including Buddhism) uses first-person experience to do both of these things, but the latter pretensions to knowledge are almost always unwarranted. There is nothing that you can experience in the darkness of your closed eyes that will help you understand the Big Bang or the connection between consciousness and the physical world. Look within, and you will find no evidence that you even have a brain, much less gain any insight into how it works.
However, one can discover specific truths about the nature of consciousness through a practice like meditation. Religious people are always entitled to claim that certain experiences are possible — feelings of bliss or selfless love, for instance. But Christians, Hindus and atheists have experienced the same states of consciousness. So what do these experiences prove? They certainly don’t support claims about the unique divinity of Christ or about the existence of the monkey god Hanuman. Nor do they demonstrate the divine origin of certain books. These reports only suggest that certain rare and wonderful experiences are possible. But this is all we need to take “spirituality” (the unavoidable term for this project of self-transcendence) seriously. To understand what is actually going on — in the mind and in the world — we need to talk about these experiences in the context of science.
In the interview Harris gives one of the nicest and most simple expositions of how our sense of self can be an illusion that I have seen. It is a response to Gutting's question:
You deny the existence of the self, understood as “an inner subject thinking our thoughts and experiencing our experiences.” You say, further, that the experience of meditation (as practiced, for example, in Buddhism) shows that there is no self. But you also admit that we all “feel like an internal self at almost every waking moment.” Why should a relatively rare — and deliberately cultivated — experience of no-self trump this almost constant feeling of a self?
Harris:
Because what does not survive scrutiny cannot be real. Perhaps you can see the same effect in this perceptual illusion:
It certainly looks like there is a white square in the center of this figure, but when we study the image, it becomes clear that there are only four partial circles. The square has been imposed by our visual system, whose edge detectors have been fooled. Can we know that the black shapes are more real than the white one? Yes, because the square doesn’t survive our efforts to locate it — its edges literally disappear. A little investigation and we see that its form has been merely implied.
What could we say to a skeptic who insisted that the white square is just as real as the three-quarter circles and that its disappearance is nothing more than, as you say, “a relatively rare — and deliberately cultivated — experience”? All we could do is urge him to look more closely.
The same is true about the conventional sense of self — the feeling of being a subject inside your head, a locus of consciousness behind your eyes, a thinker in addition to the flow of thoughts. This form of subjectivity does not survive scrutiny. If you really look for what you are calling “I,” this feeling will disappear. In fact, it is easier to experience consciousness without the feeling of self than it is to banish the white square in the above image.

Sunday, October 05, 2014

An update of dericbownds.net

I wanted to mention some recent changes to my main website, dericbownds.net, which first started during my transition from a career of laboratory research on vision to a second phase of studies of the human mind in the early 1990s. It is the parent from which Deric's MindBlog sprang in 2006. In recent years I have not been very attentive to the site, having just haphazardly added to it the web versions of lectures I have prepared and given in various venues. I've now obtained the latest version of the web editing program Adobe Dreamweaver (the one I originally used now being dysfunctional) and done a bit of a cleanup, simplifying the home page to list just a few of the more recent lectures (Upstairs/Downstairs in our Brain - What’s running our show? - Making our Brains Younger - Are you holding your breath? Structures of arousal and calm - and Making Minds - Evolving and Constructing the "I"), as well as noting two earlier lectures that have been popular ("The I Illusion" and "The Beast Within"). A link is provided to a complete list of my mind lectures, writings, and podcasts. I also realized that a social history of my vision research laboratory at the University of Wisconsin, done for a laboratory reunion in 2012 (using the Prezi lecture and web presentation tool), was not very user-friendly, and so I attempted to make it easier to click through.

Friday, October 03, 2014

A few Self-Help Nostrums.

I thought I would pass on a few random self-help pieces from the NY Times that caught my eye and have accumulated in my queue of potential posts. (I call these items "nostrums" because insight or knowledge is an 'unproven cure' - its application to our real-life behaviors frequently doesn't happen.)

Feel starved for physical contact, touching, hugs, affection? There's an App for that! It's called Cuddlr, described by Anna Altman, and it allows individuals to find others nearby who wish to cuddle in a PG-rated, non-sexual way.

Feel like you've caught the general mood of despondency and passivity that seems to go with an unraveling of the international and domestic order? David Brooks argues that we should get a grip and snap out of it, noting that the scope of the problems we face is well below historic averages. He suggests possible remedies to the current domestic and international leadership crises.

Wondering why your favorite pleasures lose their glow as you repeat them?  Anna North writes on how performing the pleasures of life in a habitual way....Duh!.... causes them to habituate, losing their force and intensity.  It's how nerve cells work.... If you really want to enjoy something, don't repeat it in a routine way.

Thursday, October 02, 2014

Fish as brain food - it’s not just the omega-3

An interesting study from Raji et al. examines data from 260 cognitively normal people with an average age of 78 and finds that consumption of baked or broiled fish (but not fried fish) correlates with the volumes of brain gray matter areas responsible for memory and cognition, the areas where the amyloid plaques associated with Alzheimer's disease first appear. The correlation persists after controlling for covariates such as age, education, physical activity, body mass index, race, and sex. From their summary:
Data were analyzed from 260 cognitively normal individuals from the Cardiovascular Health Study with information on fish consumption from the National Cancer Institute Food Frequency Questionnaire and brain magnetic resonance imaging (MRI). The relationship between fish consumption data collected in 1989–1990 and brain structural MRI obtained in 1998–1999 was assessed using voxel-based morphometry in multiple regression analyses in 2012. Covariates were age, gender, race, education, white matter lesions, MRI-identified infarcts, waist–hip ratio, and physical activity as assessed by the number of city blocks walked in 1 week. Volumetric changes were further modeled with omega-3 fatty acid estimates to better understand the mechanistic link between fish consumption, brain health, and Alzheimer disease.
Weekly consumption of baked or broiled fish was positively associated with gray matter volumes in the hippocampus, precuneus, posterior cingulate, and orbital frontal cortex even after adjusting for covariates. These results did not change when including omega-3 fatty acid estimates in the analysis....Dietary consumption of baked or broiled fish is related to larger gray matter volumes independent of omega-3 fatty acid content. These findings suggest that a confluence of lifestyle factors influence brain health, adding to the growing body of evidence that prevention strategies for late-life brain health need to begin decades earlier.