Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/22655
DC Field | Value | Language
dc.contributor.author | Loizou, Christos P. | -
dc.date.accessioned | 2021-06-08T07:56:18Z | -
dc.date.available | 2021-06-08T07:56:18Z | -
dc.date.issued | 2021-06 | -
dc.identifier.citation | Speech Communication, 2021, vol. 130, pp. 15-26 | en_US
dc.identifier.issn | 01676393 | -
dc.identifier.uri | https://hdl.handle.net/20.500.14279/22655 | -
dc.description.abstract | Objective: Human interactions are related to speech and facial characteristics. It has been suggested that speech signals and/or images of facial expressions may reveal human emotions, and that the two interact in verifying a person's identity. The present study proposes and evaluates an automated, integrated speech-signal and facial-image analysis system for the identification of seven different human emotions (Normal (N), Happy (H), Sad (S), Disgust (D), Fear (F), Anger (A), and Surprise (Su)). Methods: Speech recordings and face images from 7,441 subjects aged 20≤age≤74 were collected, normalized and filtered. From these recordings and images, 55 speech-signal features and 61 different face-image texture features were extracted. Statistical and multi-class model analyses were performed to select the features able to statistically significantly distinguish between the seven aforementioned human emotions (N, H, S, D, F, A and Su). The selected features alone, or combined with the age and gender of the subjects investigated, were used to build two learning-based classifiers, and the classifiers' accuracy was computed. Results: For each of the above-mentioned human emotions, statistically significantly different speech and face-image features were identified that may be used to distinguish between the aforementioned groups (N vs H, N vs S, N vs D, N vs F, N vs A, N vs Su). Using solely the statistically significant speech and image features identified, an overall percentage of correct classification (%CC) score of 93% was achieved. Conclusions: A significant number of speech and face-image features were derived from continuous speech and face images, and features were identified that can distinguish between seven different human emotional states. This study lays the basis for the development of an integrated system for the identification of emotional states from the automatic analysis of free speech and face images. Future work will investigate the development and integration of the proposed method into a mobile device. | en_US
dc.format | pdf | en_US
dc.language.iso | en | en_US
dc.relation.ispartof | Speech Communication | en_US
dc.rights | © Elsevier | en_US
dc.rights | Attribution-NonCommercial-NoDerivatives 4.0 International | *
dc.rights.uri | http://creativecommons.org/licenses/by-nc-nd/4.0/ | *
dc.subject | Speech analysis | en_US
dc.subject | Face analysis | en_US
dc.subject | Human emotions detection | en_US
dc.subject | Statistical analysis | en_US
dc.subject | Classification analysis | en_US
dc.title | An automated integrated speech and face image analysis system for the identification of human emotions | en_US
dc.type | Article | en_US
dc.collaboration | Cyprus University of Technology | en_US
dc.subject.category | Computer and Information Sciences | en_US
dc.journals | Subscription | en_US
dc.country | Cyprus | en_US
dc.subject.field | Natural Sciences | en_US
dc.publication | Peer Reviewed | en_US
dc.identifier.doi | 10.1016/j.specom.2021.04.001 | en_US
dc.relation.volume | 130 | en_US
cut.common.academicyear | 2020-2021 | en_US
dc.identifier.spage | 15 | en_US
dc.identifier.epage | 26 | en_US
item.grantfulltext | none | -
item.languageiso639-1 | en | -
item.cerifentitytype | Publications | -
item.openairecristype | http://purl.org/coar/resource_type/c_6501 | -
item.openairetype | article | -
item.fulltext | No Fulltext | -
crisitem.journal.journalissn | 0167-6393 | -
crisitem.journal.publisher | Elsevier | -
crisitem.author.dept | Department of Electrical Engineering, Computer Engineering and Informatics | -
crisitem.author.faculty | Faculty of Engineering and Technology | -
crisitem.author.orcid | 0000-0003-1247-8573 | -
crisitem.author.parentorg | Faculty of Engineering and Technology | -
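The abstract above describes a pipeline of statistical feature selection over speech and face-image features followed by learning-based classification, scored as a percentage of correct classification (%CC). The snippet below is a minimal illustrative sketch of such a pipeline, not the authors' implementation: the one-way ANOVA test per feature, the 0.05 threshold, the SVM classifier, the 10-fold cross-validation, and all names are assumptions, and the paper's actual 55 speech and 61 face-texture features are not reproduced.

```python
# Illustrative sketch only. The significance test, threshold, classifier and
# cross-validation scheme below are assumptions made for demonstration and do
# not reproduce the paper's method or its 55 speech / 61 face-texture features.
import numpy as np
from scipy.stats import f_oneway
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

EMOTIONS = ["N", "H", "S", "D", "F", "A", "Su"]  # the seven emotion classes

def select_significant_features(X, y, alpha=0.05):
    """Keep feature columns whose values differ significantly across emotion
    groups (one-way ANOVA per feature; alpha is an assumed threshold)."""
    keep = []
    for j in range(X.shape[1]):
        groups = [X[y == e, j] for e in np.unique(y)]
        _, p = f_oneway(*groups)
        if p < alpha:
            keep.append(j)
    return np.array(keep, dtype=int)

def percent_correct_classification(X, y):
    """Overall %CC estimated with 10-fold cross-validation."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    return 100.0 * cross_val_score(clf, X, y, cv=10).mean()

# Hypothetical usage with combined speech and face-texture feature matrices
# (optionally augmented with age and gender, as the abstract describes):
# X = np.hstack([speech_features, face_texture_features, age_gender])
# idx = select_significant_features(X, y)
# print(f"%CC = {percent_correct_classification(X[:, idx], y):.1f}")
```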
Appears in collections: Άρθρα/Articles
Scopus citations: 6 (checked on 9 Nov 2023)
Web of Science citations: 4 (last week: 0, last month: 0; checked on 29 Oct 2023)
Page views: 300 (last week: 1, last month: 8; checked on 6 Nov 2024)
This item is licensed under a Creative Commons License.