A recent article by Benedict Carey suggests we may be heading towards a future in which the instructional and emotional needs of those unable to obtain adequate human contact are met by robots whose changing emotional expressions activate the same brain areas as normal human gestures. A
report by Chaminade et al., however, on a multinational collaboration involving the humanoid robot WE4-RII - which expresses emotions through facial expressions and movements of the upper body, including the neck, shoulders, trunk, waist, arms, and hands - suggests that we have some way to go:
...activity in cortical areas endowed with mirror properties, like left Broca's area for the perception of speech, and in the processing of emotions like the left anterior insula for the perception of disgust and the orbitofrontal cortex for the perception of anger, is reduced for robot stimuli, suggesting lesser resonance with the mechanical agent. Finally, instructions to explicitly attend to the emotion significantly increased response to robot, but not human facial expressions in the anterior part of the left inferior frontal gyrus, a neural marker of motor resonance.
The Carey article reviews a number of robotic instructional studies showing that, despite the attenuated effectiveness of robotic emotions relative to human ones, robots can engage people and teach them simple skills, including household tasks, vocabulary, and, in the case of autistic children, play, elementary imitation, and turn-taking.