A Norwegian computer scientist named Audun Øygard created a face-reading tool called CLMtrackr. Applications demonstrating CLMtrackr's technology are available for free online; for example, you can visit this website to have your emotions tracked in real time.
CLMtrackr's approach is to read facial expressions and interpret them by comparison with thousands of previously analyzed faces. Essentially, the technology fits a mesh of green lines to 70 specific points on the human face, then compares the relative orientation of those lines against past examples.
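The comparison step can be sketched very roughly as a nearest-neighbor match over landmark geometry. This is an illustrative toy, not CLMtrackr's actual code: the landmark vectors, labels, and point count here are made up for demonstration, and a real tracker would fit roughly 70 (x, y) points to the face.

```python
# Toy sketch of landmark-based expression matching (hypothetical data,
# not CLMtrackr's real algorithm or training set).
import math

def distance(a, b):
    # Euclidean distance between two flattened landmark vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(landmarks, examples):
    # Nearest-neighbor match: return the label of the closest example.
    return min(examples, key=lambda ex: distance(landmarks, ex[1]))[0]

# Made-up "training" examples: (label, landmark vector).
examples = [
    ("smile",   [0.0, 0.0, 1.0, -0.20, 2.0, 0.0]),  # mouth corners up
    ("neutral", [0.0, 0.0, 1.0,  0.00, 2.0, 0.0]),
    ("frown",   [0.0, 0.0, 1.0,  0.30, 2.0, 0.0]),  # mouth corners down
]

observed = [0.0, 0.0, 1.0, -0.15, 2.0, 0.0]
print(classify(observed, examples))  # -> smile
```

The real system is far more sophisticated, but the core idea is the same: reduce a face to a small set of coordinates and ask which labeled configuration it most resembles.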
Øygard expects the technology to be useful in retail and sales, for example, to help analyze the effectiveness of TV commercials.
By the way, Øygard also created a program that superimposes the facial features of a famous celebrity on top of yours in real time. You can try it here.
Meanwhile, researchers at the University of Genoa in Italy have created a system that uses Microsoft Kinect cameras to figure out how you feel.
The system does this by reading and interpreting body language. It builds a stick-figure model of the body in software, then interprets how the sticks move and how quickly. The software looks for the same cues people do when reading body language: a lowered head and drooping shoulders may signal sadness, for example.
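The cues described above can be sketched as a simple rule over skeleton joint positions. This is a minimal sketch under assumed data, not the Genoa researchers' actual code: the joint names, coordinates, and thresholds are hypothetical, and a real Kinect-based system would also track motion speed over time.

```python
# Illustrative posture rule (hypothetical joints and thresholds, not the
# University of Genoa system's real implementation).

def read_posture(joints):
    """joints: dict of joint name -> (x, y), with y increasing upward."""
    head_y = joints["head"][1]
    neck_y = joints["neck"][1]
    shoulder_y = (joints["left_shoulder"][1] + joints["right_shoulder"][1]) / 2
    head_down = head_y < neck_y + 0.05          # head barely above the neck
    shoulders_drooped = shoulder_y < neck_y - 0.15
    # Head down plus drooping shoulders may indicate sadness.
    if head_down and shoulders_drooped:
        return "possibly sad"
    return "neutral"

slumped = {"head": (0.0, 1.00), "neck": (0.0, 0.97),
           "left_shoulder": (-0.2, 0.78), "right_shoulder": (0.2, 0.78)}
upright = {"head": (0.0, 1.20), "neck": (0.0, 1.00),
           "left_shoulder": (-0.2, 0.90), "right_shoulder": (0.2, 0.90)}
print(read_posture(slumped))  # -> possibly sad
print(read_posture(upright))  # -> neutral
```

The point of the sketch is the shape of the pipeline: camera output is reduced to a skeleton, and simple geometric relationships between joints are mapped to emotional states.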
The researchers are already applying their technology to build games that teach autistic children how to read body language, and how to use body language to express emotions.
Besides reading emotions and putting that emotion data to use, what most of these projects have in common is that they're designed to be extended and built into other software, usually mobile apps and mobile devices. None of these major projects is holding its technology back as proprietary; instead, they're making their tools available as open systems for other companies to use.
That's really what makes all these various approaches to emotion detection so exciting: The systems can be integrated into a wide variety of mobile apps and devices and — I don't see why not — combined to enhance accuracy or flexibility.
The first targets appear to be in retail sales, to figure out how customers feel. But with other app developers applying their creativity, we could see emotion sensing built into user interfaces to, say, make apps more friendly or more "tactful" in how they interact with users.
Who knows where emotion detection will show up next?