Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/1895
DC Field | Value | Language
dc.contributor.author | Raouzaiou, Amaryllis | -
dc.contributor.author | Tsapatsoulis, Nicolas | -
dc.contributor.author | Karpouzis, Kostas | -
dc.contributor.author | Kollias, Stefanos D. | -
dc.contributor.other | Κόλλιας, Στέφανος Δ. | -
dc.contributor.other | Τσαπατσούλης, Νικόλας | -
dc.date.accessioned | 2009-05-26T11:58:45Z | en
dc.date.accessioned | 2013-05-16T13:11:03Z | -
dc.date.accessioned | 2015-12-02T09:38:45Z | -
dc.date.available | 2009-05-26T11:58:45Z | en
dc.date.available | 2013-05-16T13:11:03Z | -
dc.date.available | 2015-12-02T09:38:45Z | -
dc.date.issued | 2002-10-22 | -
dc.identifier.citation | EURASIP Journal on Applied Signal Processing, 2002, vol. 2002, no. 10, pp. 1021-1038 | en_US
dc.identifier.uri | https://hdl.handle.net/20.500.14279/1895 | -
dc.description.abstract | In the framework of MPEG-4, one can include applications where virtual agents, utilizing both textual and multisensory data, including facial expressions and nonverbal speech, help systems become accustomed to the actual feelings of the user. Applications of this technology are expected in educational environments, virtual collaborative workplaces, communities, and interactive entertainment. Facial animation has gained much interest within the MPEG-4 framework, with implementation details remaining an open research area (Tekalp, 1999). In this paper, we describe a method for enriching human-computer interaction, focusing on the analysis and synthesis of primary and intermediate facial expressions (Ekman and Friesen, 1978). To achieve this goal, we utilize facial animation parameters (FAPs) to model primary expressions and describe a rule-based technique for handling intermediate ones. A relation between FAPs and the activation parameter proposed in classical psychological studies is established, leading to parameterized facial expression analysis and synthesis notions compatible with the MPEG-4 standard. | en_US
dc.format | pdf | en_US
dc.language.iso | en | en_US
dc.relation.ispartof | EURASIP Journal on Applied Signal Processing | en_US
dc.rights | © Springer | en_US
dc.subject | Facial expression | en_US
dc.subject | MPEG-4 facial definition parameters | en_US
dc.subject | Activation | en_US
dc.subject | Parameterized expression synthesis | en_US
dc.title | Parameterized facial expression synthesis based on MPEG-4 | en_US
dc.type | Article | en_US
dc.collaboration | National Technical University of Athens | en_US
dc.subject.category | ENGINEERING AND TECHNOLOGY | en_US
dc.journals | Open Access | en_US
dc.country | Greece | en_US
dc.subject.field | Engineering and Technology | en_US
dc.publication | Peer Reviewed | en_US
dc.identifier.doi | 10.1155/S1110865702206149 | en_US
dc.dept.handle | 123456789/54 | en
dc.relation.issue | 10 | en_US
dc.relation.volume | 2002 | en_US
cut.common.academicyear | 2002-2003 | en_US
dc.identifier.spage | 1021 | en_US
dc.identifier.epage | 1038 | en_US
item.openairetype | article | -
item.cerifentitytype | Publications | -
item.fulltext | With Fulltext | -
item.grantfulltext | open | -
item.openairecristype | http://purl.org/coar/resource_type/c_6501 | -
item.languageiso639-1 | en | -
crisitem.author.dept | Department of Communication and Marketing | -
crisitem.author.faculty | Faculty of Communication and Media Studies | -
crisitem.author.orcid | 0000-0002-6739-8602 | -
crisitem.author.parentorg | Faculty of Communication and Media Studies | -
crisitem.journal.journalissn | 1687-6180 | -
crisitem.journal.publisher | Springer Nature | -
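The abstract's rule-based handling of intermediate expressions via FAPs and an activation parameter can be sketched roughly as follows. This is an illustrative interpolation only, not the paper's actual method; the FAP numbers and displacement values below are hypothetical placeholders.

```python
# Illustrative sketch (assumptions, not from the paper): an intermediate
# expression is formed by blending the FAP displacement profiles of two
# primary expressions and scaling the result by an activation weight.

# Hypothetical FAP profiles, keyed by made-up MPEG-4 FAP numbers
# (values are invented displacements in FAP units).
JOY = {3: 120, 5: 80, 53: 200, 54: 200}
SURPRISE = {3: 300, 31: 150, 32: 150}

def intermediate(profile_a, profile_b, t, activation=1.0):
    """Linearly blend two FAP profiles (0 <= t <= 1), then scale every
    displacement by the activation weight."""
    keys = set(profile_a) | set(profile_b)
    return {
        k: activation * ((1 - t) * profile_a.get(k, 0) + t * profile_b.get(k, 0))
        for k in keys
    }

# Halfway between the two profiles at moderate activation.
faps = intermediate(JOY, SURPRISE, t=0.5, activation=0.6)
```

The linear blend is only one plausible reading of "rule-based"; the paper itself derives the FAP-to-activation mapping from psychological studies rather than fixed interpolation weights.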
Appears in Collections:Άρθρα/Articles
Files in This Item:
File | Description | Size | Format
Tsapatsoulis_Parameterized facial expression.pdf | | 1.23 MB | Adobe PDF

SCOPUS™ Citations: 81 (checked on Nov 9, 2023)
WEB OF SCIENCE™ Citations: 57 (Last week: 0; Last month: 0; checked on Oct 31, 2023)
Page view(s): 615 (Last week: 0; Last month: 4; checked on Jan 30, 2025)
Download(s): 372 (checked on Jan 30, 2025)


Items in KTISIS are protected by copyright, with all rights reserved, unless otherwise indicated.