MIT Develops Electronic Glasses That Can Read Emotions
By Kevin Lee
Mind-reading still might be a ways off, but the MIT Media Lab has come up with spectacles that can read a person’s emotions. These social X-ray specs have a built-in camera linked to software that analyzes facial expressions.
The special glasses were developed by Rosalind Picard, Rana el Kaliouby, and Simon Baron-Cohen. The research was originally intended to help autistic individuals who might lack the social instincts to recognize a person’s emotions during conversations.
The prototype glasses are built with a rice-grain-size camera wired to a computer the size of a deck of cards. The camera tracks 24 facial feature points. The linked computer scans these micro-expressions to gauge how often they appear and for how long, comparing the data against a bank of expressions performed by actors and labeled by volunteers.
The glasses relay a summary of this emotional information to the user through an earpiece, announcing the subject's emotional state, such as confusion or disagreement. A built-in light also signals agreement or conflict by turning green or red.
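The pipeline described above, matching tracked feature points against a bank of labeled expressions and then relaying a summary through the earpiece and light, can be sketched roughly as follows. This is a minimal illustration only: every vector, label, and threshold here is invented, and the actual Affectiva software is not public.

```python
import math

# Hypothetical bank of expression "signatures": each emotion label maps to a
# vector of 24 facial feature-point measurements (all values invented).
EXPRESSION_BANK = {
    "agreeing": [0.9] * 12 + [0.1] * 12,
    "confused": [0.1] * 12 + [0.9] * 12,
    "neutral":  [0.5] * 24,
}

# Hypothetical mapping from an inferred state to the two feedback channels
# the article describes: a spoken summary for the earpiece and an LED color.
FEEDBACK = {
    "agreeing": ("subject appears to agree", "green"),
    "confused": ("subject seems confused", "red"),
    "neutral":  ("no strong signal", "off"),
}

def cosine_similarity(a, b):
    """Similarity between two feature vectors, in [0, 1] for non-negative data."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def classify(frame):
    """Return the bank label whose signature best matches one camera frame."""
    return max(EXPRESSION_BANK,
               key=lambda lbl: cosine_similarity(frame, EXPRESSION_BANK[lbl]))

def relay(frame):
    """Classify a frame and return the earpiece message and LED color."""
    return FEEDBACK[classify(frame)]
```

As the article notes, the real system also tracks how often and how long each micro-expression appears across frames; a production version would aggregate classifications over time rather than judging single frames in isolation as this sketch does.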
So far the device has been effective at getting autistic individuals more involved in conversations. Picard and el Kaliouby also found that an average person could correctly interpret expressions only 54 percent of the time, whereas the glasses identified them correctly 64 percent of the time. Still, though the social X-ray specs are effective, Picard says they are not foolproof: the glasses can be tricked, though doing so requires constant concentration.
The technology is currently being used by Picard and el Kaliouby's company, Affectiva, to provide deeper market testing for advertisements and movies. Meanwhile, another colleague of the researchers, Mohammed Hoque, has been fine-tuning the algorithms to detect subtle differences between expressions, such as smiles of delight versus smiles of frustration, along with 10 different types of Japanese smiles.