
dc.contributor.author: Hupont, Isabelle
dc.contributor.author: Cerezo, Eva
dc.contributor.editor: Pere Brunet, Nuno Correia, and Gladimir Baranoski
dc.date.accessioned: 2014-01-31T18:53:40Z
dc.date.available: 2014-01-31T18:53:40Z
dc.date.issued: 2006
dc.identifier.isbn: 3-905673-60-6
dc.identifier.uri: http://dx.doi.org/10.2312/LocalChapterEvents/siacg/siacg06/179-185
dc.description.abstract: When developing new multimodal user interfaces, emotional user information may be of great interest. In this paper we present a simple and computationally feasible method for the automatic emotional classification of facial expressions. We propose the use of 10 characteristic points (a subset of the MPEG-4 feature points) to extract relevant emotional information (essentially five distances, the presence of wrinkles, and mouth shape). The method defines and detects the six basic emotions (plus the neutral one) in terms of this information and has been fine-tuned with a database of 399 images. We analyze the effect of the different facial parameters, as well as factors such as gender and ethnicity, on the classification results. For the moment, the method is applied to static images.
dc.publisher: The Eurographics Association
dc.subject: Categories and Subject Descriptors (according to ACM CCS): I.3.6 [Computer Graphics]: Interaction Techniques; I.4.8 [Image Processing and Computer Vision]: Scene Analysis
dc.title: Individualizing the New Interfaces: Extraction of User's Emotions from Facial Data
dc.description.seriesinformation: SIACG 2006: Ibero-American Symposium in Computer Graphics
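The abstract describes a rule-based classifier built on a few normalized distances between facial feature points. The following is a minimal illustrative sketch of that general idea only: the point names, the choice of distances, and every threshold below are invented for the example and are not the paper's actual feature set or values.

```python
# Hypothetical sketch of a distance-based facial-expression classifier in
# the spirit of the abstract: a handful of feature distances, normalized
# for scale, are compared against per-emotion reference ranges.
# All point names, features, and thresholds here are assumptions.

from math import dist

# Ten illustrative 2D feature points (loosely inspired by MPEG-4 feature
# points), posed here as a "surprise"-like face: raised brows, open mouth.
points = {
    "left_brow": (30, 35), "left_eye_top": (30, 50), "left_eye_bottom": (30, 62),
    "right_brow": (70, 35), "right_eye_top": (70, 50), "right_eye_bottom": (70, 62),
    "mouth_left": (38, 85), "mouth_right": (62, 85),
    "mouth_top": (50, 78), "mouth_bottom": (50, 98),
}

# Normalize by the inter-ocular distance so the rules are scale-invariant.
iod = dist(points["left_eye_top"], points["right_eye_top"])

def d(a, b):
    """Normalized distance between two named feature points."""
    return dist(points[a], points[b]) / iod

features = {
    "brow_raise": d("left_brow", "left_eye_top"),
    "eye_open": d("left_eye_top", "left_eye_bottom"),
    "mouth_open": d("mouth_top", "mouth_bottom"),
    "mouth_width": d("mouth_left", "mouth_right"),
}

# Illustrative per-emotion (min, max) reference ranges for each feature.
RULES = {
    "surprise": {"brow_raise": (0.30, 1.0), "eye_open": (0.25, 1.0), "mouth_open": (0.35, 1.0)},
    "joy":      {"mouth_width": (0.70, 2.0), "mouth_open": (0.05, 0.35)},
    "neutral":  {"brow_raise": (0.0, 0.30), "mouth_open": (0.0, 0.15)},
}

def classify(feats):
    # Score each emotion by the fraction of its rules the face satisfies,
    # then pick the best-scoring one.
    def score(ranges):
        hits = sum(lo <= feats[f] <= hi for f, (lo, hi) in ranges.items())
        return hits / len(ranges)
    return max(RULES, key=lambda e: score(RULES[e]))

print(classify(features))  # the posed points above satisfy the "surprise" rules
```

A real system of this kind would derive the reference ranges from a labeled image database (the paper mentions fine-tuning on 399 images) rather than hand-picking them as done here.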


