Monday, September 29, 2014

Hearing and imagination shape what we see.

Vetter et al. have done an interesting experiment: they blindfolded people and scanned their brains while the participants listened to birds singing, traffic noise, or people talking. The category of sound could be identified just from the pattern of activity in early visual cortex, a nice demonstration of the interconnectedness of the brain's sensory systems.

Highlights
• Early visual cortex receives nonretinal input carrying abstract information
• Both auditory perception and imagery generate consistent top-down input
• Information feedback may be mediated by multisensory areas
• Feedback is robust to attentional, but not visuospatial, manipulation
Summary
Human early visual cortex was traditionally thought to process simple visual features such as orientation, contrast, and spatial frequency via feedforward input from the lateral geniculate nucleus. However, the role of nonretinal influence on early visual cortex is so far insufficiently investigated despite much evidence that feedback connections greatly outnumber feedforward connections. Here, we explored in five fMRI experiments how information originating from audition and imagery affects the brain activity patterns in early visual cortex in the absence of any feedforward visual stimulation. We show that category-specific information from both complex natural sounds and imagery can be read out from early visual cortex activity in blindfolded participants. The coding of nonretinal information in the activity patterns of early visual cortex is common across actual auditory perception and imagery and may be mediated by higher-level multisensory areas. Furthermore, this coding is robust to mild manipulations of attention and working memory but affected by orthogonal, cognitively demanding visuospatial processing. Crucially, the information fed down to early visual cortex is category specific and generalizes to sound exemplars of the same category, providing evidence for abstract information feedback rather than precise pictorial feedback. Our results suggest that early visual cortex receives nonretinal input from other brain areas when it is generated by auditory perception and/or imagery, and this input carries common abstract information. Our findings are compatible with feedback of predictive information to the earliest visual input level, in line with predictive coding models.
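The "read out" described in the summary refers to multivariate pattern classification of fMRI data. As a rough illustration of that kind of analysis (not the authors' actual pipeline; the synthetic data, voxel counts, and classifier choice below are assumptions), a linear classifier can be cross-validated within a condition and then trained on perception trials and tested on imagery trials to probe whether the two conditions share a common category code:

```python
# Illustrative sketch only: decoding sound category from voxel patterns,
# in the spirit of the MVPA analyses the summary describes. Data here are
# synthetic; in the study the features would be fMRI activity patterns
# from retinotopically defined early visual cortex.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score, StratifiedKFold

rng = np.random.default_rng(0)

n_trials_per_class, n_voxels, n_classes = 40, 200, 3  # birds / traffic / speech
labels = np.repeat(np.arange(n_classes), n_trials_per_class)

# Synthetic "perception" and "imagery" runs sharing the same weak class
# signal, standing in for the claim that auditory perception and imagery
# feed a common category code down to early visual cortex.
class_patterns = rng.normal(size=(n_classes, n_voxels))

def make_runs(noise=4.0):
    return class_patterns[labels] + noise * rng.normal(size=(labels.size, n_voxels))

X_perception = make_runs()
X_imagery = make_runs()

clf = make_pipeline(StandardScaler(), LinearSVC())

# 1) Within-condition decoding: can sound category be read out at all?
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
acc = cross_val_score(clf, X_perception, labels, cv=cv).mean()
print(f"perception decoding accuracy: {acc:.2f} (chance = {1/n_classes:.2f})")

# 2) Cross-condition decoding: train on perception, test on imagery,
#    asking whether the two conditions share a common activity code.
clf.fit(X_perception, labels)
print(f"perception -> imagery accuracy: {clf.score(X_imagery, labels):.2f}")
```

The same train-on-one-set, test-on-another logic underlies the generalization claim in the summary: a classifier trained on some sound exemplars and tested on new exemplars of the same categories indicates category-level rather than exemplar-specific feedback.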
