Thursday, October 25, 2007

It's in the Eyes!

Another curious bit on our brain's specialization for recognizing faces, noting the central role of the eyes. The abstract and a figure:

Unlike most other objects, which are processed analytically, faces are processed configurally. This configural processing is reflected early in visual processing, following face inversion and contrast reversal, as an increase in the amplitude of the N170, a scalp-recorded event-related potential. Here, we show that these face-specific effects are mediated by the eye region. That is, they occurred only when the eyes were present, but not when the eyes were removed from the face. The N170 recorded to inverted and negative faces likely reflects the processing of the eyes. We propose a neural model of face processing in which face- and eye-selective neurons situated in the superior temporal sulcus region of the human brain respond differently to the face configuration and to the eyes depending on the face context. This dynamic response modulation accounts for the N170 variations reported in the literature. The eyes may be central to what makes faces so special.

Figure - Simplified neural model of early face processing. Three sources are simultaneously active around 170 msec poststimulus onset. One source in the superior temporal sulcus (STS) region with a radial orientation generates the ERP N170 component. The combination of tangential sources in the fusiform gyrus (FG) and middle occipital gyrus (MOG) generates the MEG M170. The dynamic response modulation of eye- and face-selective neurons within the STS accounts for inversion and contrast reversal (CR) effects on the face N170 amplitude and for the other existing ERP data on the N170. The + signs represent the amount of activation of the neurons. The absence of + signs signifies that the neurons are not responding.