2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII)

Abstract

Developing intelligent machines that recognize facial expressions, detect spontaneous emotions, and infer the affective states of an individual is a challenging problem. While a significant amount of recent work has focused on advancing machine learning techniques for affect recognition and affect classification, predicting mood from facial analysis and making use of mood data have received less attention. Questionnaires for the psychometric measurement of mood states are common, but administering them during interventions that target the psychological well-being of people is arduous and may burden an already troubled population. In this work, we present mood prediction as a sequence learning problem that uses facial Action Units (AUs) as inputs to a Long Short-Term Memory (LSTM) network. We create two separate automated LSTM models, a total mood disturbance predictor and a mood sub-scale predictor, and then use them to aid behavioral assessments of engagement. Our mood-aware engagement predictor uses the total mood disturbance score, and our analysis compares both the mood sub-scale predictors and the overall mood disturbance predictor for engagement prediction. We evaluate our mood models on a large-scale dataset of more than 8 million frames from multiple videos collected from 110 subjects during a web intervention for trauma recovery. Our experiments show that a mood-aware engagement predictor using our novel visual analysis approach performs significantly better than, or on par with, using self-reports.
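The sequence-learning setup described in the abstract — per-frame AU vectors fed to an LSTM that emits a mood score — can be sketched minimally as follows. This is an illustrative toy, not the paper's implementation: the AU count (17), hidden size (32), random weights, and linear read-out are all assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMMoodPredictor:
    """Toy sequence-to-one LSTM: per-frame AU intensities -> scalar mood score.

    All hyperparameters are illustrative assumptions; the paper's trained
    models and exact architecture are not reproduced here.
    """

    def __init__(self, n_aus=17, hidden=32, seed=0):
        rng = np.random.default_rng(seed)
        d = n_aus + hidden
        # One stacked weight matrix for the four gates (input, forget, cell, output).
        self.W = rng.normal(0.0, 0.1, (4 * hidden, d))
        self.b = np.zeros(4 * hidden)
        self.w_out = rng.normal(0.0, 0.1, hidden)  # linear read-out to a score
        self.hidden = hidden

    def predict(self, au_seq):
        """au_seq: (T, n_aus) array, one row of AU intensities per video frame."""
        H = self.hidden
        h = np.zeros(H)  # hidden state
        c = np.zeros(H)  # cell state
        for x in au_seq:
            z = self.W @ np.concatenate([x, h]) + self.b
            i, f, g, o = z[:H], z[H:2 * H], z[2 * H:3 * H], z[3 * H:]
            c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
            h = sigmoid(o) * np.tanh(c)
        # Read out a mood (e.g. total mood disturbance) score from the final state.
        return float(self.w_out @ h)
```

In practice the weights would be learned by regressing against questionnaire-derived mood scores; here an untrained model simply illustrates the data flow, e.g. `LSTMMoodPredictor().predict(np.random.rand(100, 17))` maps a 100-frame AU sequence to one scalar.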