How do you feel? Your phone may soon tell you
A variety of projects unveiled in the past year aim to give mobile apps the ability to instantly detect a person’s emotional state.
A startup called Affectiva, which emerged out of MIT’s Media Lab, last month launched a software developer kit (SDK) for its emotion-tracking technology. The company claims it can assess the effect that advertising and branding have on a person by analyzing that person’s facial expressions through a mobile device’s camera.
With Affdex, Affectiva’s SDK, processing takes place on the device rather than on a remote server, as is the case with some comparable technologies, according to Affectiva. On-device processing opens the door to systems that sense emotions in real time and feed the results into another app, changing that app’s behavior. For example, continuous emotional feedback could alter the trajectory of a game or an interactive story depending on how the user feels about various scenes.
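As a rough illustration of that feedback loop, here is a minimal sketch of an emotion reading steering a branching story. The scene names, emotion labels and branch table are invented for illustration; they are not part of Affectiva’s SDK.

```python
# Hypothetical sketch: routing a story based on a real-time emotion
# reading. The (scene, emotion) -> next-scene table is invented.

def next_scene(current_scene: str, emotion: str) -> str:
    """Pick the next scene given the player's detected emotion."""
    branches = {
        ("chase", "fear"): "slow_reveal",   # back off when the player is scared
        ("chase", "joy"): "bigger_chase",   # lean in when the scene is landing
    }
    # Fall back to a default continuation when no branch matches.
    return branches.get((current_scene, emotion), current_scene + "_default")
```

In a real integration, the emotion label would arrive from the SDK’s per-frame analysis rather than being passed in by hand.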
Another startup, Emotient (pronounced like emotion with a T sound at the end), just received an additional $6 million in funding to further develop its emotion-sensing technology, which reads facial expressions, and in particular an API for third-party software development. The company was founded by six Ph.D.s from the University of California at San Diego with expertise in machine learning, computer vision, cognitive science and facial behavior analysis.
The Emotient system watches and analyzes facial expressions to detect seven emotions (joy, surprise, sadness, anger, fear, disgust and contempt), as well as overall sentiment: whether people are broadly happy, unhappy or somewhere in between.
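One way to picture how seven per-emotion scores might collapse into an overall sentiment reading is sketched below. The emotion names come from the article; the positive/negative grouping, thresholds and weighting are invented for illustration and are not Emotient’s actual method.

```python
# Hypothetical sketch: summarizing per-emotion confidences (0..1) into
# "positive", "negative" or "neutral". The grouping and 0.2 threshold
# are assumptions made for this example.

POSITIVE = {"joy", "surprise"}
NEGATIVE = {"sadness", "anger", "fear", "disgust", "contempt"}

def overall_sentiment(scores: dict) -> str:
    """scores maps each of the seven emotions to a 0..1 confidence."""
    pos = sum(scores.get(e, 0.0) for e in POSITIVE)
    neg = sum(scores.get(e, 0.0) for e in NEGATIVE)
    if pos - neg > 0.2:
        return "positive"
    if neg - pos > 0.2:
        return "negative"
    return "neutral"
```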
Like several other companies in this market niche, Emotient wants to help retailers understand how customers feel about products while shopping.
The company this week announced the development of a Google Glass “glassware” app that’s designed to perform on-the-spot “sentiment analysis.” If you’re wearing a Google Glass device, it will interpret the emotions of the people you’re looking at and then tell you generally how the others are feeling. The first target customers of the app are restaurant workers, salespeople and others in retail who want to know how happy customers are about products or services.
Intel is an investor in Emotient and reportedly plans to bring Emotient’s libraries into the next version of its RealSense SDK. RealSense is similar to Microsoft’s Kinect: it’s designed to be used with PCs and built into laptops to enable real-time 3D scans of the user and the surrounding environment for gaming and content creation.
A Norwegian computer scientist named Audun Øygard created a face-reading tool called CLMtrackr. Applications created to demonstrate CLMtrackr’s technology are available for free online. For example, you can visit this website to have your emotions tracked in real time.
CLMtrackr’s approach is to read facial expressions and interpret them by comparison with thousands of previously analyzed examples. The software fits roughly 70 specific points to the human face, drawn as green lines in the demos, then compares the relative geometry of those points to past examples.
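The compare-against-labeled-examples idea can be sketched as a nearest-neighbor match over landmark geometry. This is a toy illustration under stated assumptions, not CLMtrackr’s actual model: the feature (scale-normalized distances between consecutive points) and the example labels are invented.

```python
# Toy sketch: classify an expression by finding the labeled landmark
# set whose geometry most closely matches the observed one.
import math

def features(points):
    """points: list of (x, y) landmark coordinates.
    Returns scale-normalized distances between consecutive points,
    so the feature is insensitive to face size."""
    dists = [math.dist(points[i], points[i + 1]) for i in range(len(points) - 1)]
    total = sum(dists) or 1.0
    return [d / total for d in dists]

def classify(points, labeled_examples):
    """labeled_examples: list of (points, emotion_label) pairs,
    all with the same number of landmarks as `points`."""
    f = features(points)
    def distance(example):
        g = features(example[0])
        return sum((a - b) ** 2 for a, b in zip(f, g))
    return min(labeled_examples, key=distance)[1]
```

A production system would use many more landmarks (around 70, per the article) and a trained statistical model rather than raw nearest-neighbor lookup.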
Øygard expects the technology to be useful in retail and sales—to, for example, help analyze the effectiveness of TV commercials.
By the way, Øygard also created a program that superimposes the facial features of a famous celebrity on top of yours in real time. You can try it here.
Meanwhile, researchers at the University of Genoa in Italy have created a system that uses Microsoft Kinect cameras to figure out how you feel.
The system does this by reading and interpreting body language. It creates a stick figure in software, then interprets how the sticks move and how quickly. The software looks for the same cues people do when reading body language: a lowered head and drooping shoulders may signal sadness, for example.
The researchers are already applying their technology to build games that teach autistic children how to read body language, and how to use body language to express emotions.
What most of these projects have in common, beyond reading emotions and putting that data to work, is that they’re designed to be extended and built into other software, usually mobile apps and devices. None of these projects is holding its technology back as proprietary; instead, each makes its tools available as an open system for other companies to use.
That’s really what makes all these approaches to emotion detection so exciting: The systems can be integrated into a wide variety of mobile apps and devices and, conceivably, combined to improve accuracy or flexibility.
The first targets appear to be in retail sales—to figure out how customers feel. But with other app developers applying their creativity, we could see emotion sensing built into user interfaces to, say, make apps more friendly or more “tactful” in how they interact with users.
Who knows where emotion detection will show up next?
Read more about emerging technologies in Computerworld’s Emerging Technologies Topic Center.