Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/10927
DC Field: Value (Language)
dc.contributor.author: Partaourides, Harris
dc.contributor.author: Chatzis, Sotirios P.
dc.date.accessioned: 2018-04-17T06:53:47Z
dc.date.available: 2018-04-17T06:53:47Z
dc.date.issued: 2018-05-15
dc.identifier.citation: Expert Systems with Applications, 2018, vol. 98, pp. 84-92 (en_US)
dc.identifier.issn: 0957-4174
dc.identifier.uri: https://hdl.handle.net/20.500.14279/10927
dc.description.abstract: Bayesian learning has recently been considered an effective means of accounting for uncertainty in trained deep network parameters. This is of crucial importance when dealing with small or sparse training datasets. On the other hand, shallow models that compute weighted sums of their inputs, after passing them through a bank of arbitrary randomized nonlinearities, have recently been shown to enjoy good test error bounds that depend on the number of nonlinearities. Inspired by these advances, in this paper we examine novel deep network architectures, where each layer comprises a bank of arbitrary nonlinearities, linearly combined using multiple alternative sets of weights. We effect model training by means of approximate inference based on a t-divergence measure; this generalizes the Kullback–Leibler divergence in the context of the t-exponential family of distributions. We adopt the t-exponential family since it can more flexibly accommodate real-world data, which entail outliers and fat-tailed distributions, compared to conventional Gaussian model assumptions. We extensively evaluate our approach using several challenging benchmarks, and provide comparative results to related state-of-the-art techniques. (en_US)
dc.format: pdf (en_US)
dc.language.iso: en (en_US)
dc.relation.ispartof: Expert Systems with Applications (en_US)
dc.rights: © Elsevier (en_US)
dc.subject: Random kitchen sinks (en_US)
dc.subject: Student's-t distribution (en_US)
dc.subject: t-divergence (en_US)
dc.subject: Variational Bayes (en_US)
dc.title: Deep learning with t-exponential Bayesian kitchen sinks (en_US)
dc.type: Article (en_US)
dc.collaboration: Cyprus University of Technology (en_US)
dc.subject.category: Electrical Engineering - Electronic Engineering - Information Engineering (en_US)
dc.journals: Hybrid Open Access (en_US)
dc.country: Cyprus (en_US)
dc.subject.field: Engineering and Technology (en_US)
dc.publication: Peer Reviewed (en_US)
dc.identifier.doi: 10.1016/j.eswa.2018.01.013 (en_US)
dc.relation.volume: 98 (en_US)
cut.common.academicyear: 2017-2018 (en_US)
dc.identifier.spage: 84 (en_US)
dc.identifier.epage: 92 (en_US)
item.fulltext: No Fulltext
item.cerifentitytype: Publications
item.grantfulltext: none
item.openairecristype: http://purl.org/coar/resource_type/c_6501
item.openairetype: article
item.languageiso639-1: en
crisitem.journal.journalissn: 0957-4174
crisitem.journal.publisher: Elsevier
crisitem.author.dept: Department of Electrical Engineering, Computer Engineering and Informatics
crisitem.author.dept: Department of Electrical Engineering, Computer Engineering and Informatics
crisitem.author.faculty: Faculty of Engineering and Technology
crisitem.author.faculty: Faculty of Engineering and Technology
crisitem.author.orcid: 0000-0002-8555-260X
crisitem.author.orcid: 0000-0002-4956-4013
crisitem.author.parentorg: Faculty of Engineering and Technology
crisitem.author.parentorg: Faculty of Engineering and Technology
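Note: the abstract and the "Random kitchen sinks" subject keyword above refer to layers built from a bank of fixed, randomly drawn nonlinearities whose outputs are linearly combined by trainable weights. The sketch below illustrates only that basic building block, using random Fourier cosine features and a plain ridge-regression readout; the feature dimension, bandwidth parameter, and readout shown here are illustrative assumptions, and the paper's actual contributions (deep stacks of such banks, multiple alternative weight sets, and Bayesian training via a t-divergence objective over the t-exponential family) are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_kitchen_sink_layer(X, n_features=256, gamma=1.0, rng=rng):
    """Map X of shape (n_samples, d) to random cosine features (n_samples, n_features).

    The projection directions W and phases b are drawn once at random and then
    kept fixed; only the downstream linear combination is learned.
    """
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Toy usage: random features followed by a ridge-regression readout, standing in
# for the trainable linear combination of the bank's outputs (the paper instead
# treats these combination weights in a Bayesian way).
X = rng.normal(size=(100, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)

Phi = random_kitchen_sink_layer(X)
lam = 1e-2
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)
y_hat = Phi @ w
print("training MSE:", np.mean((y - y_hat) ** 2))
```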
Appears in Collections: Άρθρα/Articles
Scopus citations: 1 (checked on Nov 9, 2023)
Web of Science citations: 1 (last week: 0, last month: 0; checked on Oct 29, 2023)
Page views: 458 (last week: 2, last month: 14; checked on May 11, 2024)
Items in KTISIS are protected by copyright, with all rights reserved, unless otherwise indicated.