Incoming sensory information is often ambiguous, and the brain must make decisions during perception. "Predictive coding" proposes that the brain resolves perceptual ambiguity by anticipating the forthcoming sensory environment, generating a template against which observed sensory evidence is matched. Summerfield et al. observed a neural representation of predicted perception in the medial frontal cortex while human subjects decided whether visual objects were faces. Perceptual decisions about faces were associated with increased top-down connectivity from the frontal cortex to face-sensitive visual areas, consistent with the matching of predicted and observed evidence for the presence of faces.
Figure: A simple dynamic causal model (DCM) with hierarchically ordered bidirectional connections between vMFC (ventromedial frontal cortex), amygdala, FFA (fusiform face area), and IOG (inferior occipital gyrus). Face and nonface stimuli were modeled as driving inputs to IOG, and face sets as inputs to vMFC.
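The connectivity layout described in the figure can be sketched as the kind of matrices used to specify a dynamic causal model: an intrinsic-connectivity matrix linking the four regions, and an input matrix routing stimuli into the network. The sketch below is illustrative only; the chain ordering of the four regions and the binary 0/1 coding are assumptions for clarity, not the authors' actual model specification.

```python
import numpy as np

# Hierarchically ordered regions, top to bottom, as named in the figure.
regions = ["vMFC", "amygdala", "FFA", "IOG"]
n = len(regions)

# A: intrinsic connectivity. Bidirectional links between neighbours in the
# hierarchy (assumed here to form a chain). A[i, j] = 1 means a connection
# from region j to region i is present.
A = np.zeros((n, n), dtype=int)
for i in range(n - 1):
    A[i, i + 1] = 1  # bottom-up: lower region drives higher region
    A[i + 1, i] = 1  # top-down: higher region drives lower region

# C: driving inputs. Column 0 = face/nonface visual stimuli entering IOG;
# column 1 = face-set input entering vMFC (as in the figure).
C = np.zeros((n, 2), dtype=int)
C[regions.index("IOG"), 0] = 1
C[regions.index("vMFC"), 1] = 1
```

Under this layout, the top-down effect reported in the study would correspond to a modulation of the higher-to-lower entries of A (e.g., vMFC to FFA) during face decisions.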