Please use this identifier to cite or link to this item:
https://hdl.handle.net/20.500.14279/9220
| DC Field | Value | Language |
|---|---|---|
dc.contributor.author | Panis, Gabriel | - |
dc.contributor.author | Lanitis, Andreas | - |
dc.contributor.author | Tsapatsoulis, Nicolas | - |
dc.contributor.author | Cootes, Timothy F. | - |
dc.contributor.other | Λανίτης, Ανδρέας | - |
dc.contributor.other | Τσαπατσούλης, Νικόλας | - |
dc.date.accessioned | 2017-01-24T08:46:59Z | - |
dc.date.available | 2017-01-24T08:46:59Z | - |
dc.date.issued | 2016-06-01 | - |
dc.identifier.citation | IET Biometrics, 2016, vol. 5, no. 2, pp. 37-46 | en_US |
dc.identifier.issn | 2047-4946 | - |
dc.identifier.uri | https://hdl.handle.net/20.500.14279/9220 | - |
dc.description.abstract | The face and gesture recognition network (FG-NET) ageing database was released in 2004 to support research aimed at understanding the changes in facial appearance caused by ageing. Since then, the database has been used for research in various disciplines, including age estimation, age-invariant face recognition and age progression. Based on an analysis of published work that used the FG-NET ageing database, conclusions are presented about the type of research carried out and the impact of the dataset in shaping the research topic of facial ageing. This study also includes a review of key articles from the different thematic areas in which the FG-NET ageing database was used, together with a presentation of benchmark results. The ultimate aims of this study are to present concrete facts about research activities in facial ageing during the past decade, indicate the main methodologies adopted, present a comprehensive list of benchmark results and, most importantly, provide roadmaps for future trends, requirements and research directions in facial ageing. | en_US |
dc.format | - | en_US |
dc.language.iso | en | en_US |
dc.relation.ispartof | IET Biometrics | en_US |
dc.rights | © Institution of Engineering and Technology | en_US |
dc.subject | Database systems | en_US |
dc.subject | Gesture recognition | en_US |
dc.subject | Face recognition | en_US |
dc.title | Overview of research on facial ageing using the FG-NET ageing database | en_US |
dc.type | Article | en_US |
dc.collaboration | Cyprus University of Technology | en_US |
dc.collaboration | The University of Manchester | en_US |
dc.subject.category | Electrical Engineering - Electronic Engineering - Information Engineering | en_US |
dc.journals | Subscription | en_US |
dc.country | United Kingdom | en_US |
dc.country | Cyprus | en_US |
dc.subject.field | Engineering and Technology | en_US |
dc.publication | Peer Reviewed | en_US |
dc.identifier.doi | 10.1049/iet-bmt.2014.0053 | en_US |
dc.relation.issue | 2 | en_US |
dc.relation.volume | 5 | en_US |
cut.common.academicyear | 2015-2016 | en_US |
dc.identifier.spage | 37 | en_US |
dc.identifier.epage | 46 | en_US |
item.fulltext | No Fulltext | - |
item.cerifentitytype | Publications | - |
item.grantfulltext | none | - |
item.openairecristype | http://purl.org/coar/resource_type/c_6501 | - |
item.openairetype | article | - |
item.languageiso639-1 | en | - |
crisitem.author.dept | Department of Multimedia and Graphic Arts | - |
crisitem.author.dept | Department of Communication and Marketing | - |
crisitem.author.faculty | Faculty of Fine and Applied Arts | - |
crisitem.author.faculty | Faculty of Communication and Media Studies | - |
crisitem.author.orcid | 0000-0001-6841-8065 | - |
crisitem.author.orcid | 0000-0002-6739-8602 | - |
crisitem.author.parentorg | Faculty of Fine and Applied Arts | - |
crisitem.author.parentorg | Faculty of Communication and Media Studies | - |
Appears in Collections: Άρθρα/Articles
Citation and usage metrics:
- SCOPUS™ citations: 155 (checked on Nov 9, 2023)
- Web of Science™ citations: 124 (checked on Oct 29, 2023)
- Page views: 480 (checked on May 12, 2024)
Items in KTISIS are protected by copyright, with all rights reserved, unless otherwise indicated.