Individualizing the New Interfaces: Extraction of User's Emotions from Facial Data
Abstract
When developing new multimodal user interfaces, emotional information about the user may be of great interest. In this paper we present a simple and computationally feasible method for automatic emotional classification of facial expressions. We propose the use of 10 characteristic points (a subset of the MPEG-4 feature points) to extract relevant emotional information (basically five distances, the presence of wrinkles, and mouth shape). The method defines and detects the six basic emotions (plus the neutral one) in terms of this information and has been fine-tuned with a database of 399 images. We analyze the effect of different facial parameters and other factors such as gender and ethnicity on the classification results. For the moment, the method is applied to static images.
BibTeX
@inproceedings{10.2312:LocalChapterEvents:siacg:siacg06:179-185,
booktitle = {SIACG 2006: Ibero-American Symposium in Computer Graphics},
editor = {Pere Brunet and Nuno Correia and Gladimir Baranoski},
title = {{Individualizing the New Interfaces: Extraction of User's Emotions from Facial Data}},
author = {Hupont, Isabelle and Cerezo, Eva},
year = {2006},
publisher = {The Eurographics Association},
ISBN = {3-905673-60-6},
DOI = {10.2312/LocalChapterEvents/siacg/siacg06/179-185}
}