Verbal communication is a joint activity; however, speech production and comprehension have primarily been analyzed as independent processes within the boundaries of individual brains. Here, we applied fMRI to record brain activity from both speakers and listeners during natural verbal communication. We used the speaker's spatiotemporal brain activity to model listeners’ brain activity and found that the speaker's activity is spatially and temporally coupled with the listener's activity. This coupling vanishes when participants fail to communicate. Moreover, though on average the listener's brain activity mirrors the speaker's activity with a delay, we also find areas that exhibit predictive anticipatory responses. We connected the extent of neural coupling to a quantitative measure of story comprehension and find that the greater the anticipatory speaker–listener coupling, the greater the understanding. We argue that the observed alignment of production- and comprehension-based processes serves as a mechanism by which brains convey information.
Wednesday, August 04, 2010
Speakers and Listeners - fMRI shows coupled brains
As a followup to Monday's post on mirror neurons, this fascinating study by Stephens et al. shows that brain activities in a speaker-listener pair are tightly coupled, and that the magnitude of activity in areas exhibiting predictive anticipatory responses correlates with understanding. The graphic summaries of fMRI data in this open access article are quite nice, and you might want to check them out. The MindBlog reader who pointed out this early PNAS publication to me wonders: "Could their findings open a new window on how to interpret the 'function' of the 'conscious self,' with the conscious self as the evaluating 'outpost' of the coupled companion?"
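To make the idea of temporally shifted coupling a bit more concrete, here is a minimal toy sketch. It is not the authors' actual analysis (they fit a spatial model of each listener's response from temporally shifted speaker activity); it simply correlates a speaker's and a listener's regional time series at a range of lags, so a peak at a positive lag corresponds to delayed "mirroring" and a peak at a negative lag to anticipatory coupling. The function name and the synthetic data are purely illustrative.

```python
import numpy as np

def lagged_coupling(speaker_ts, listener_ts, max_lag=5):
    """Correlate a listener's regional time series with a speaker's at
    a range of temporal lags (in TRs).

    Positive lag: the listener's activity follows the speaker's (delayed
    mirroring). Negative lag: the listener's activity precedes the
    speaker's (anticipatory coupling).
    """
    results = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag > 0:
            s, l = speaker_ts[:-lag], listener_ts[lag:]
        elif lag < 0:
            s, l = speaker_ts[-lag:], listener_ts[:lag]
        else:
            s, l = speaker_ts, listener_ts
        results[lag] = np.corrcoef(s, l)[0, 1]
    return results

# Toy usage: the "listener" signal is a noisy copy of the "speaker"
# signal shifted by 3 TRs, so coupling should peak at lag +3.
rng = np.random.default_rng(0)
speaker = rng.standard_normal(300)
listener = np.roll(speaker, 3) + 0.5 * rng.standard_normal(300)
coupling = lagged_coupling(speaker, listener, max_lag=6)
best_lag = max(coupling, key=coupling.get)
print(f"peak coupling at lag {best_lag} TRs: r = {coupling[best_lag]:.2f}")
```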
This does remind me a little bit of the mirror neuron revelations - tangentially of course.
I don't know how much this will "open a new window," beyond showing that it is technically possible to apply this technique to two people and identify coupling in their fMRI readings. It does little more than confirm the idea that speaking to someone is associated with a certain type of brain activity, that listening to words is associated with a certain type of brain activity, that of course the two are coupled, and that of course they decouple when speaker and listener aren't working together. I don't think this is a theoretically groundbreaking study, although it does provide an interesting technique that might, as the authors suggest, allow story comprehension (or social attention) to be quantified at the neural level.
It seems as if it weren't only words eliciting such activity and coupling, but rather "communication," which is more than merely assigning sense to words; it is also a reconciliation of context.
Assuming that communicating partners have different episodic memories and "emotional maps," one would normally expect these differences to produce at least some variation in activation patterns.
The question, then, is whether another driving force could be behind it: a kind of priming to align on a higher-level structure, namely mind.
The evolutionary advantage of such "priming to align" could have been an automatic calibration of how we view "what is out there."
If this is so, mind would have to be interpreted less as something subjective and more as an attribute of agentivity.