More on mirror systems, from Warren et al. in the Journal of Neuroscience. Edited excerpts from their paper:
Social interaction relies on the ability to react to communication signals. Although cortical sensory–motor "mirror" networks are thought to play a key role in visual aspects of primate communication, evidence for a similar generic role for auditory–motor interaction in primate nonverbal communication is lacking.
In this functional magnetic resonance imaging (fMRI) study, the authors investigated cortical regions responsive both to the perception of human vocalizations and to the voluntary generation of facial expressions. In four auditory–perceptual conditions, subjects listened passively, without overt motor response, to nonverbal emotional vocalizations conveying two positive-valence emotions (amusement and triumph) and two negative-valence emotions (fear and disgust). The use of nonverbal, rather than verbal, vocalizations optimized the recognizability of emotional content and avoided confounds of phonological and verbal content. In a facial movement condition, subjects performed voluntary smiling movements in the absence of auditory input. The authors hypothesized that cortical regions showing combined auditory–perceptual and motor responses would be located within premotor and motor cortical regions.
Figure legend: Brain regions demonstrating auditory–motor mirror responses. A shows regions (red) displaying a significant modulatory effect of emotion category on perceptual activation. B shows regions (light green) displaying significant activation during voluntary facial movements (motor > baseline). C, the inclusive mask of A within B, shows regions (dark green) displaying both a significant modulatory effect of emotion category on perceptual activation and significant activation during voluntary facial movements.
Figure legend: Correlations with emotional valence and arousal in brain regions demonstrating auditory–motor mirror responses. Left, Regions (green) displaying both a significant modulatory effect of emotion category on perceptual activation and significant activation during voluntary facial movements as shown in the figure above. Right, Regions demonstrating a significant positive correlation between hemodynamic responses and emotional valence (red), emotional arousal (blue), or both (purple).
The authors demonstrated that a network of human premotor cortical regions activated during facial movement is also involved in auditory processing of affective nonverbal vocalizations. Within this auditory–motor mirror network, distinct functional subsystems respond preferentially to emotional valence and arousal properties of heard vocalizations. Positive emotional valence enhanced activation in a left posterior inferior frontal region involved in representation of prototypic actions, whereas increasing arousal enhanced activation in presupplementary motor area cortex involved in higher-order motor control. Their findings demonstrate that listening to nonverbal vocalizations can automatically engage preparation of responsive orofacial gestures, an effect that is greatest for positive-valence and high-arousal emotions. The automatic engagement of responsive orofacial gestures by emotional vocalizations suggests that auditory–motor interactions provide a fundamental mechanism for mirroring the emotional states of others during primate social behavior.
Motor facilitation by positive vocal emotions suggests a basic neural mechanism for establishing cohesive bonds within primate social groups.