The negative tweetstorms and blog posts criticized the study as a technology-fueled revival of physiognomy, the long-discredited notion that the size and shape of a person's eyes, nose and face can predict personality traits: highly inaccurate science, racism by algorithm, and so on.
And, even if the machine works as stated, William T.L. Cox, a psychologist who studies stereotypes at the University of Wisconsin-Madison, notes:
Let’s say 5 percent of the population is gay, or 50 of every 1,000 people. A facial scan that is 91 percent accurate would misidentify 9 percent of straight people as gay; in the example above, that’s 85 people (0.09 x 950).
The software would also mistake 9 percent of gay people for straight people, correctly flagging only about 45 of the 50 gay people (0.91 x 50). The result: of the 130 people the facial scan identified as gay (45 + 85), 85 actually would be straight.
When an algorithm with 91 percent accuracy operates in the real world, almost two-thirds of the time it says someone is gay, it would be wrong.
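The arithmetic behind Cox's point is the classic base-rate effect. A minimal Python sketch, using the numbers assumed in the passage (a 5 percent base rate, 91 percent accuracy, and a population of 1,000):

```python
population = 1_000
base_rate = 0.05      # assumed: 5 percent of the population is gay
accuracy = 0.91       # assumed: the scan is right 91 percent of the time

gay = population * base_rate          # 50 people
straight = population - gay           # 950 people

# Straight people wrongly flagged as gay (false positives)
false_positives = straight * (1 - accuracy)   # 85.5, rounded to 85 in the text
# Gay people correctly flagged as gay (true positives)
true_positives = gay * accuracy               # 45.5, rounded to 45 in the text

flagged = false_positives + true_positives    # ~131 people flagged in total
wrong_share = false_positives / flagged       # ~0.65: almost two-thirds wrong

print(f"{flagged:.0f} flagged, {wrong_share:.0%} of them actually straight")
```

The passage rounds each figure down (85 + 45 = 130), which is why its total differs slightly from the unrounded 131; either way, the share of wrong calls is about 65 percent, the "almost two-thirds" in the conclusion.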