Emotion Tracker

The Emotion Tracker was developed to extract contextual information and visual features of a user in order to detect his or her level of engagement with a robot.

Building on the 3D distribution of facial features extracted by the existing commercial FaceAPI* product, the system tracks the face and recovers the player's valence and interest towards the game companion, inferring the player's emotional state so that the companion's behaviour can be modified appropriately. A feeling index fuses facial expression, head direction and contextual information from the game state to predict the player's engagement with the companion accurately and robustly. The system has been successfully tested in long-term interactions in real social scenarios.
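The description above does not specify how the feeling index combines its inputs, so the following is only a minimal illustrative sketch in Python: it assumes per-frame valence, head pose and a game-state cue as inputs and fuses them with an arbitrary weighted sum. All field names, weights and thresholds are assumptions for illustration, not the LIREC implementation.

```python
from dataclasses import dataclass


@dataclass
class FrameObservation:
    """One frame of tracker output (hypothetical field names)."""
    valence: float          # estimated facial valence in [-1, 1]
    head_yaw_deg: float     # head rotation away from the companion, degrees
    head_pitch_deg: float   # head tilt away from the companion, degrees
    game_context: float     # contextual engagement cue from the game state, [0, 1]


def feeling_index(obs: FrameObservation,
                  w_valence: float = 0.4,
                  w_gaze: float = 0.35,
                  w_context: float = 0.25) -> float:
    """Fuse facial expression, head direction and game context into one
    engagement score in [0, 1]. Weighted fusion chosen for illustration;
    the weights and combination rule are assumptions."""
    # Map valence from [-1, 1] to [0, 1].
    valence_term = (obs.valence + 1.0) / 2.0
    # Treat a head pose directed towards the companion (small yaw/pitch) as attention.
    deviation = (obs.head_yaw_deg ** 2 + obs.head_pitch_deg ** 2) ** 0.5
    gaze_term = max(0.0, 1.0 - deviation / 60.0)  # 60-degree cut-off is arbitrary
    return (w_valence * valence_term
            + w_gaze * gaze_term
            + w_context * obs.game_context)


if __name__ == "__main__":
    frame = FrameObservation(valence=0.3, head_yaw_deg=10.0,
                             head_pitch_deg=-5.0, game_context=0.8)
    print(f"engagement estimate: {feeling_index(frame):.2f}")
```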

The Emotion Tracker was developed by Queen Mary University of London as part of LIREC (Living with Robots & Interactive Companions), a five-year EU-funded project that ends in August 2012.

Contact: Dr. Hamit Soyel - hamit.soyel@eecs.qmul.ac.uk
