"The Emotional-Social Intelligence Prosthesis, developed by Rana el Kaliouby and Rosalind Picard, consists of a small camera mounted on a cap or glasses that monitors a conversation partner’s facial expressions and feeds the data into a hand-held computer. Software tracks the movement of facial features and classifies them using a coding system developed by the psychologist Paul Ekman, which is then correlated with a second taxonomy of emotional states created by the Cambridge autism researcher (and Ali G cousin) Simon Baron-Cohen. Almost instantaneously, the computer crunches each raised eyebrow and pucker of the lips, giving a whispered verdict about how the person is feeling. (Another version of the device, meant to be used separately, points back at users, allowing them to better understand — and perhaps modify — the face they present to the world.)" (CLICK to enlarge image below).
Saturday, December 16, 2006
A "mind reading" prosthesis for autistic people?
Another clip from the NYTimes Magazine "Ideas" issue:
"The Emotional-Social Intelligence Prosthesis, developed by Rana el Kaliouby and Rosalind Picard, consists of a small camera mounted on a cap or glasses that monitors a conversation partner’s facial expressions and feeds the data into a hand-held computer. Software tracks the movement of facial features and classifies them using a coding system developed by the psychologist Paul Ekman, which is then correlated with a second taxonomy of emotional states created by the Cambridge autism researcher (and Ali G cousin) Simon Baron-Cohen. Almost instantaneously, the computer crunches each raised eyebrow and pucker of the lips, giving a whispered verdict about how the person is feeling. (Another version of the device, meant to be used separately, points back at users, allowing them to better understand — and perhaps modify — the face they present to the world.)" (CLICK to enlarge image below).
"The Emotional-Social Intelligence Prosthesis, developed by Rana el Kaliouby and Rosalind Picard, consists of a small camera mounted on a cap or glasses that monitors a conversation partner’s facial expressions and feeds the data into a hand-held computer. Software tracks the movement of facial features and classifies them using a coding system developed by the psychologist Paul Ekman, which is then correlated with a second taxonomy of emotional states created by the Cambridge autism researcher (and Ali G cousin) Simon Baron-Cohen. Almost instantaneously, the computer crunches each raised eyebrow and pucker of the lips, giving a whispered verdict about how the person is feeling. (Another version of the device, meant to be used separately, points back at users, allowing them to better understand — and perhaps modify — the face they present to the world.)" (CLICK to enlarge image below).
Blog Categories: autism, social cognition
Reading body language is totally not reading the mind.