Friday, October 13, 2017

A 'Gaydar' machine?

Heather Murphy describes the kerfuffle that ensued after Stanford researchers posted a preprint of work that will soon appear in The Journal of Personality and Social Psychology. To teach a machine (a widely used facial-analysis program built on a pattern-identifying neural network) to detect sexuality, authors Kosinski and Wang copied more than 75,000 dating profiles of men and women seeking same- or different-sex partners. The software extracted information from thousands of facial data points to generate average composite heterosexual and gay male and female faces (the pictures are in Murphy's article). The model did much better than human judges at identifying sexual orientation, and when the computer was given five photos of each person instead of just one, accuracy rose to 83% for women and 91% for men.
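Murphy's article doesn't spell out how the five photos were combined. A common tactic, and the assumption in this minimal sketch, is to average the classifier's per-photo probabilities and apply a cutoff; the classify_person function and the scores fed to it are illustrative placeholders, not the authors' actual pipeline.

    # Hypothetical sketch: combining per-photo scores for one person.
    # Assumes some upstream model yields a probability-of-gay score per
    # photo; the numbers below are made-up placeholders.
    def classify_person(photo_scores, threshold=0.5):
        """Average the per-photo probabilities, then apply a cutoff."""
        mean_score = sum(photo_scores) / len(photo_scores)
        return "gay" if mean_score >= threshold else "straight"

    # A single unrepresentative photo can flip the call; averaging
    # five photos smooths it out.
    print(classify_person([0.44]))                          # -> "straight"
    print(classify_person([0.62, 0.71, 0.44, 0.66, 0.58]))  # -> "gay"

Averaging washes out any one atypical photo, which is consistent with the accuracy jump Murphy reports for five photos versus one.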

Negative tweet storms and blog posts criticized the study as a technology-fueled revival of physiognomy, the long-discredited notion that the size and shape of a person's eyes, nose, and face can predict personality traits: highly inaccurate science, racism by algorithm, and so on.

And, even if the machine works as stated, William T.L. Cox, a psychologist who studies stereotypes at the University of Wisconsin-Madison, notes:
Let’s say 5 percent of the population is gay, or 50 of every 1,000 people. A facial scan that is 91 percent accurate would misidentify 9 percent of straight people as gay; in the example above, that’s about 85 people (0.09 x 950).
The software would also mistake 9 percent of gay people for straight. The result: of the roughly 130 people the facial scan identified as gay, 85 actually would be straight.
When an algorithm with 91 percent accuracy operates in the real world, it will be wrong almost two-thirds of the times it says someone is gay.
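Cox's point is the classic base-rate effect, and his arithmetic is easy to check. A minimal script reproducing it, using his stated assumptions (5% prevalence, and 91% accuracy applying equally to both groups):

    # Reproduce Cox's back-of-the-envelope numbers.
    population = 1000
    base_rate = 0.05      # Cox's assumption: 5% of people are gay
    accuracy = 0.91       # assumed equal for gay and straight faces

    gay = population * base_rate              # 50 people
    straight = population - gay               # 950 people

    true_pos = gay * accuracy                 # gay, flagged gay: 45.5
    false_pos = straight * (1 - accuracy)     # straight, flagged gay: 85.5

    flagged = true_pos + false_pos            # ~131 flagged as gay
    print(f"Flagged as gay: {flagged:.0f}")
    print(f"Wrong calls among them: {false_pos / flagged:.0%}")  # ~65%

With Cox's rounding (85 false positives out of 130 flagged), the conclusion is the same: roughly two out of every three "gay" verdicts would land on a straight person.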
