Monday, April 20, 2009

Automatically extracting group membership from faces.

Rule et al. show that everyone has 'gaydar': information about sexual orientation is automatically extracted from faces, even though sexual orientation has not been considered a property akin to the primary categories of sex, age, or race. This suggests that the automaticity of person categorization associated with perceptually salient groups may extend to categories with less obvious visual markers. Edited clips from their paper:

Individuals quickly and accurately categorize others into groups; indeed, for groups with salient perceptual markers (e.g., sex, age, race), category activation is deemed an unavoidable consequence of the person-perception process. But what about social groups with less obvious physical cues: do they also trigger automatic person categorization? Recent data hint that this may indeed be the case. Take, for example, male sexual orientation. Although the cues to male sexual orientation are ostensibly ambiguous (yielding categorization accuracy of approximately 60–70% against a chance guessing rate of 50%), differences between gay and straight men can be judged significantly better than chance after very brief (50 ms) exposure to a target and can modulate incidental memory for previously encountered faces.

To explore the possibility that information pertaining to male sexual orientation may be extracted automatically from faces (like sex, age, and race), we employed a lexical decision task in which participants responded to gay and straight verbal associates after the presentation of facial primes. A subset of 20 head shots of gay (n = 10) and straight (n = 10) men was randomly selected from a previously validated, standardized set of photographs obtained from Internet dating sites. The targets self-identified as either gay or straight and did not differ systematically along dimensions such as facial attractiveness. Pretesting showed that the faces were categorized with accuracy better than chance. Ten words relating to gay stereotypes (e.g., fabulous, rainbow) and 10 words relating to straight stereotypes (e.g., rough, football) were selected based on pretests. For the purpose of the lexical decision task, 20 nonword letter strings were constructed from these stereotype-related items.

The basic result was that exposure to faces of members of a perceptually ambiguous group slightly facilitated access to associated stereotypic material.
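In a priming design like the one described in these clips, "facilitated access" is typically indexed as a response-time advantage for stereotype-congruent prime-word pairs (e.g., a gay face followed by "rainbow") over incongruent pairs (a gay face followed by "football"). A minimal sketch of that computation follows; the trial data and the facilitation() helper are entirely made up for illustration and are not from the paper:

```python
# Sketch of a priming-facilitation measure for a lexical decision task.
# The trial records below are invented; real analyses would use per-
# participant data and an inferential test, not a raw mean difference.
from statistics import mean

# Each trial: (prime orientation, word's stereotype category, RT in ms)
trials = [
    ("gay", "gay", 540), ("gay", "straight", 565),
    ("straight", "straight", 548), ("straight", "gay", 572),
    ("gay", "gay", 535), ("straight", "straight", 550),
    ("gay", "straight", 580), ("straight", "gay", 568),
]

def facilitation(trials):
    """Mean RT on incongruent trials minus mean RT on congruent trials.

    A positive value means stereotype-congruent facial primes speeded
    lexical decisions, i.e., priming facilitated access to the
    associated stereotypic material.
    """
    congruent = [rt for prime, word, rt in trials if prime == word]
    incongruent = [rt for prime, word, rt in trials if prime != word]
    return mean(incongruent) - mean(congruent)

print(facilitation(trials))  # → 28.0 (ms advantage for congruent pairs)
```

With these invented numbers the congruent trials average 543.25 ms and the incongruent trials 571.25 ms, so the sketch reports a 28 ms facilitation effect.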