Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/2628
DC Field | Value | Language
dc.contributor.author | Tsapatsoulis, Nicolas | -
dc.contributor.author | Karpouzis, Kostas | -
dc.contributor.author | Stamou, Giorgos | -
dc.contributor.author | Piat, Frederic | -
dc.contributor.author | Kollias, Stefanos D. | -
dc.contributor.other | Τσαπατσούλης, Νικόλας | en
dc.date.accessioned | 2015-02-05T07:15:08Z | -
dc.date.accessioned | 2015-12-02T11:51:51Z | -
dc.date.available | 2015-02-05T07:15:08Z | -
dc.date.available | 2015-12-02T11:51:51Z | -
dc.date.issued | 2000 | -
dc.identifier.citation | EUSIPCO, Tampere, Finland, September 2000 | en
dc.identifier.uri | https://hdl.handle.net/20.500.14279/2628 | -
dc.description.abstract | The human face is, in essence, an advanced expression apparatus; despite its considerable complexity and variety of distinct expressions, researchers have concluded that at least six emotions conveyed by human faces are universally associated with distinct expressions. In particular, sadness, anger, joy, fear, disgust and surprise form categories of facial expressions that are recognizable across different cultures. In this work we form a description of the six universal facial expressions using the MPEG-4 Facial Definition Parameter Set (FDP) [1]. According to the MPEG-4 standard, this is a set of tokens that describe minimal perceptible actions in the facial area. Groups of such actions in different magnitudes produce the perception of expression [2]. A systematic approach towards the recognition and classification of such an expression is based on characteristic points in the facial area that can be automatically detected and tracked. Metrics obtained from these points feed a fuzzy inference system whose output is a vector of parameters that depicts the system's degree of belief with respect to the observed emotion. Apart from modeling the archetypal expressions we go a step further: by modifying the membership functions of the involved features according to the activation parameter [3], we provide an efficient way of recognizing a broader range of emotions than that related to the archetypal expressions. | en
dc.format | pdf | en
dc.language.iso | en | en
dc.subject | Facial expressions | en
dc.subject | MPEG-4 | en
dc.subject | Emotion Classification | en
dc.subject | Facial Definition Parameter Set | en
dc.title | A fuzzy system for emotion classification based on the MPEG-4 facial definition parameter set | en
dc.type | Conference Papers | en
dc.collaboration | National Technical University Of Athens | -
dc.subject.category | Media and Communications | en
dc.country | Greece | -
dc.subject.field | Social Sciences | en
dc.dept.handle | 123456789/54 | en
item.grantfulltext | open | -
item.languageiso639-1 | en | -
item.cerifentitytype | Publications | -
item.openairecristype | http://purl.org/coar/resource_type/c_c94f | -
item.openairetype | conferenceObject | -
item.fulltext | With Fulltext | -
crisitem.author.dept | Department of Communication and Marketing | -
crisitem.author.faculty | Faculty of Communication and Media Studies | -
crisitem.author.orcid | 0000-0002-6739-8602 | -
crisitem.author.parentorg | Faculty of Communication and Media Studies | -
Appears in Collections: Δημοσιεύσεις σε συνέδρια / Conference papers or poster or presentation
Files in This Item:
File | Size | Format
N. Tsapatsoulis_A fuzzy system for emotion classification.pdf | 124.07 kB | Adobe PDF
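
The abstract above describes metrics from automatically tracked facial feature points feeding a fuzzy inference system, whose output is a degree-of-belief vector over the archetypal expressions, with membership functions rescaled by an activation parameter. The sketch below is a minimal illustration of that idea and not the authors' implementation: the feature names, the triangular membership-function parameters, and the rule set (covering only three of the six expressions) are all assumptions made for illustration.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at 1 when x == b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)


def classify(features, activation=1.0):
    """Return a degree-of-belief vector over (some) archetypal expressions.

    `features` holds normalised distances between facial feature points
    (hypothetical names); `activation` scales the membership-function
    supports, loosely mimicking the activation-dependent modification
    described in the abstract.
    """
    s = activation
    # Fuzzify each metric with an (assumed) triangular membership function.
    mouth_open   = tri(features["mouth_opening"],   0.2 * s, 0.5 * s, 0.8 * s)
    brow_raised  = tri(features["eyebrow_raise"],   0.2 * s, 0.5 * s, 0.8 * s)
    mouth_corner = tri(features["mouth_corner_up"], 0.2 * s, 0.5 * s, 0.8 * s)
    brow_lowered = tri(features["eyebrow_lower"],   0.2 * s, 0.5 * s, 0.8 * s)

    # Each rule combines its antecedents with min (fuzzy AND); the result is
    # the degree of belief for that expression.
    return {
        "surprise": min(mouth_open, brow_raised),
        "joy":      mouth_corner,
        "anger":    brow_lowered,
    }


if __name__ == "__main__":
    beliefs = classify(
        {"mouth_opening": 0.6, "eyebrow_raise": 0.55,
         "mouth_corner_up": 0.1, "eyebrow_lower": 0.05},
        activation=1.0,
    )
    print(beliefs)  # surprise dominates for this (made-up) feature vector
```

In a fuller version of this sketch, lowering `activation` shrinks the membership supports so that weaker facial deformations still fire the rules, which is one plausible reading of how a broader range of emotion intensities could be covered.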