Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/2987
DC Field: Value (Language)
dc.contributor.author: Tsapatsoulis, Nicolas
dc.contributor.author: Raouzaiou, Amaryllis
dc.contributor.author: Kollias, Stefanos D.
dc.contributor.author: Cowie, Roddy I D
dc.contributor.author: Douglas-Cowie, Ellen
dc.contributor.other: Τσαπατσούλης, Νικόλας
dc.date.accessioned: 2015-05-27T12:49:37Z
dc.date.accessioned: 2015-12-02T12:29:02Z
dc.date.available: 2015-05-27T12:49:37Z
dc.date.available: 2015-12-02T12:29:02Z
dc.date.issued: 2002-07
dc.identifier.citation: MPEG-4 facial animation: the standard, implementation and applications, 2002, Part 3: Implementations (en_US)
dc.identifier.isbn: 978-0-470-84465-6
dc.identifier.uri: https://hdl.handle.net/20.500.14279/2987
dc.description.abstract: In the framework of MPEG-4 hybrid coding of natural and synthetic data streams, one can include teleconferencing and telepresence applications in which a synthetic proxy or a virtual agent substitutes for the actual user. Such agents can interact with each other, analyzing textual input entered by the user as well as multisensory data, including human emotions, facial expressions and nonverbal speech. This not only enhances interactivity, by replacing single-media representations with dynamic multimedia renderings, but also improves human–computer interaction, allowing the system to adapt to the user's current needs and feelings. Practical applications of this technology [1] are expected in educational environments, 3-D videoconferencing, collaborative workplaces, online shopping and gaming, virtual communities and interactive entertainment. Facial expression synthesis and animation have attracted much interest within the MPEG-4 framework, and explicit facial animation parameters (FAPs) have been dedicated to this purpose; however, FAP implementation remains an open research area [2]. In this chapter we describe a method for generating emotionally enriched human–computer interaction, focusing on the analysis and synthesis of primary [3] and intermediate facial expressions [4]. To achieve this goal we use both MPEG-4 facial definition parameters (FDPs) and FAPs. The contribution of the work is twofold: it proposes a way of modeling primary expressions using FAPs, and it describes a rule-based technique for analyzing both archetypal and intermediate expressions; for the latter we propose a novel model-generation framework. In particular, a relation is established between FAPs and the activation parameter proposed in classical psychological studies, extending the archetypal-expression studies on which the computing community has concentrated. The overall scheme leads to a parameterized approach to facial expression synthesis that is compatible with the MPEG-4 standard and can be used for emotion understanding. (en_US)
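As an illustration of the parameterized approach the abstract describes, the sketch below shows how archetypal FAP profiles might be scaled by an activation level to produce intermediate expressions. The FAP subset, the profile values, and the linear scaling rule are all illustrative assumptions made for this sketch, not the chapter's actual model.

```python
# Minimal sketch of FAP-profile scaling (hypothetical values; the chapter's
# actual FAP profiles and activation mapping may differ).

# Archetypal expressions as small MPEG-4 FAP profiles. The FAP names come
# from the standard's FAP list; the displacement values are made up.
ARCHETYPAL_PROFILES = {
    "joy":      {"raise_l_cornerlip": 80, "raise_r_cornerlip": 80, "open_jaw": 20},
    "sadness":  {"raise_l_cornerlip": -60, "raise_r_cornerlip": -60,
                 "raise_l_i_eyebrow": -40, "raise_r_i_eyebrow": -40},
    "surprise": {"open_jaw": 100, "raise_l_i_eyebrow": 90, "raise_r_i_eyebrow": 90},
}

def intermediate_expression(archetype, activation):
    """Scale an archetypal FAP profile by an activation level in [0, 1].

    A linear relation is assumed purely for illustration; the chapter
    relates FAPs to the activation parameter of classical psychological
    studies, but the exact mapping is defined there, not here."""
    if not 0.0 <= activation <= 1.0:
        raise ValueError("activation must lie in [0, 1]")
    profile = ARCHETYPAL_PROFILES[archetype]
    return {fap: value * activation for fap, value in profile.items()}

if __name__ == "__main__":
    # A mildly activated joy: each FAP of the archetypal profile at 40%.
    print(intermediate_expression("joy", 0.4))
```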
dc.format: pdf (en_US)
dc.language.iso: en (en_US)
dc.rights: © John Wiley & Sons, Inc. (en_US)
dc.subject: Emotion recognition (en_US)
dc.subject: MPEG-4 (en_US)
dc.subject: FAPs (en_US)
dc.title: Emotion recognition and synthesis based on MPEG-4 FAPs (en_US)
dc.type: Book Chapter (en_US)
dc.collaboration: National Technical University of Athens (en_US)
dc.collaboration: Queen's University Belfast (en_US)
dc.subject.category: Computer and Information Sciences (en_US)
dc.review: Peer Reviewed (en)
dc.country: Greece (en_US)
dc.country: Ireland (en_US)
dc.subject.field: Natural Sciences (en_US)
dc.dept.handle: 123456789/54 (en)
cut.common.academicyear: empty (en_US)
item.fulltext: With Fulltext
item.cerifentitytype: Publications
item.grantfulltext: open
item.openairecristype: http://purl.org/coar/resource_type/c_3248
item.openairetype: bookPart
item.languageiso639-1: en
crisitem.author.dept: Department of Communication and Marketing
crisitem.author.faculty: Faculty of Communication and Media Studies
crisitem.author.orcid: 0000-0002-6739-8602
crisitem.author.parentorg: Faculty of Communication and Media Studies
Appears in Collections: Κεφάλαια βιβλίων/Book chapters
Files in This Item:
File: Tsapatsoulis.pdf (652.73 kB, Adobe PDF)
Items in KTISIS are protected by copyright, with all rights reserved, unless otherwise indicated.