Another clip from the NYTimes Magazine "Ideas" issue:
"The Emotional-Social Intelligence Prosthesis, developed by Rana el Kaliouby and Rosalind Picard, consists of a small camera mounted on a cap or glasses that monitors a conversation partner’s facial expressions and feeds the data into a hand-held computer. Software tracks the movement of facial features and classifies them using a coding system developed by the psychologist Paul Ekman, which is then correlated with a second taxonomy of emotional states created by the Cambridge autism researcher (and Ali G cousin) Simon Baron-Cohen. Almost instantaneously, the computer crunches each raised eyebrow and pucker of the lips, giving a whispered verdict about how the person is feeling. (Another version of the device, meant to be used separately, points back at users, allowing them to better understand — and perhaps modify — the face they present to the world.)"
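The pipeline the article describes — track facial features, code them Ekman-style, then look the combination up in an emotion taxonomy — can be sketched roughly like this. Everything here is a toy stand-in: the action-unit numbers follow the general idea of Ekman's Facial Action Coding System (FACS), but the mapping table is invented for illustration and is nothing like Baron-Cohen's actual taxonomy, and the "tracker" just passes codes through instead of doing computer vision.

```python
# Illustrative sketch only; AU combinations and labels are toy placeholders,
# not the real FACS or Baron-Cohen taxonomies.

def track_action_units(frame):
    """Step 1 (hypothetical): a real tracker would run vision on the
    camera frame; here the frame is already a list of AU codes."""
    return set(frame)

# Step 2: toy correlation table from action-unit combinations to an
# emotional state, standing in for a much larger taxonomy.
AU_TO_EMOTION = {
    frozenset({1, 2, 5}): "surprised",   # raised brows + raised upper lid
    frozenset({6, 12}): "happy",         # cheek raise + lip-corner pull
    frozenset({4, 15}): "displeased",    # brow lower + lip-corner depress
}

def classify(frame):
    """Step 3: crunch the raised eyebrows and lip puckers into a verdict."""
    aus = track_action_units(frame)
    for combo, emotion in AU_TO_EMOTION.items():
        if combo <= aus:  # all AUs in the combo were detected
            return emotion
    return "neutral"

print(classify([6, 12, 25]))  # → happy
```

The whispered verdict in the real device would come from this kind of lookup running on each frame, with the second, user-facing version simply pointing the same pipeline at one's own face.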
Reading body language is totally not reading the mind.