Please use this identifier to cite or link to this item:
https://hdl.handle.net/20.500.14279/10927
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Partaourides, Harris | - |
dc.contributor.author | Chatzis, Sotirios P. | - |
dc.date.accessioned | 2018-04-17T06:53:47Z | - |
dc.date.available | 2018-04-17T06:53:47Z | - |
dc.date.issued | 2018-05-15 | - |
dc.identifier.citation | Expert Systems with Applications, 2018, vol. 98, pp. 84-92 | en_US |
dc.identifier.issn | 0957-4174 | - |
dc.identifier.uri | https://hdl.handle.net/20.500.14279/10927 | - |
dc.description.abstract | Bayesian learning has recently been considered an effective means of accounting for uncertainty in trained deep network parameters. This is of crucial importance when dealing with small or sparse training datasets. On the other hand, shallow models that compute weighted sums of their inputs, after passing them through a bank of arbitrary randomized nonlinearities, have recently been shown to enjoy good test error bounds that depend on the number of nonlinearities. Inspired by these advances, in this paper we examine novel deep network architectures, where each layer comprises a bank of arbitrary nonlinearities, linearly combined using multiple alternative sets of weights. We effect model training by means of approximate inference based on a t-divergence measure; this generalizes the Kullback–Leibler divergence in the context of the t-exponential family of distributions. We adopt the t-exponential family since it accommodates real-world data, which often entail outliers and fat-tailed distributions, more flexibly than conventional Gaussian model assumptions. We extensively evaluate our approach on several challenging benchmarks, and provide comparative results against related state-of-the-art techniques. (An illustrative sketch of the t-divergence and of a kitchen-sink layer appears below this record.) | en_US |
dc.format | - | en_US |
dc.language.iso | en | en_US |
dc.relation.ispartof | Expert systems with applications | en_US |
dc.rights | © Elsevier | en_US |
dc.subject | Random kitchen sinks | en_US |
dc.subject | Student's-t distribution | en_US |
dc.subject | t-divergence | en_US |
dc.subject | Variational Bayes | en_US |
dc.title | Deep learning with t-exponential Bayesian kitchen sinks | en_US |
dc.type | Article | en_US |
dc.collaboration | Cyprus University of Technology | en_US |
dc.subject.category | Electrical Engineering - Electronic Engineering - Information Engineering | en_US |
dc.journals | Hybrid Open Access | en_US |
dc.country | Cyprus | en_US |
dc.subject.field | Engineering and Technology | en_US |
dc.publication | Peer Reviewed | en_US |
dc.identifier.doi | 10.1016/j.eswa.2018.01.013 | en_US |
dc.relation.volume | 98 | en_US |
cut.common.academicyear | 2017-2018 | en_US |
dc.identifier.spage | 84 | en_US |
dc.identifier.epage | 92 | en_US |
item.fulltext | No Fulltext | - |
item.cerifentitytype | Publications | - |
item.grantfulltext | none | - |
item.openairecristype | http://purl.org/coar/resource_type/c_6501 | - |
item.openairetype | article | - |
item.languageiso639-1 | en | - |
crisitem.journal.journalissn | 0957-4174 | - |
crisitem.journal.publisher | Elsevier | - |
crisitem.author.dept | Department of Electrical Engineering, Computer Engineering and Informatics | - |
crisitem.author.dept | Department of Electrical Engineering, Computer Engineering and Informatics | - |
crisitem.author.faculty | Faculty of Engineering and Technology | - |
crisitem.author.faculty | Faculty of Engineering and Technology | - |
crisitem.author.orcid | 0000-0002-8555-260X | - |
crisitem.author.orcid | 0000-0002-4956-4013 | - |
crisitem.author.parentorg | Faculty of Engineering and Technology | - |
crisitem.author.parentorg | Faculty of Engineering and Technology | - |
Appears in Collections: | Άρθρα/Articles |
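The record above carries only the abstract, so what follows is an editor's illustrative sketch rather than material from the paper itself. First, the t-divergence the abstract refers to: assuming the standard definitions from the t-exponential-family literature (the paper's exact notation may differ), the t-logarithm, t-exponential, and the resulting divergence between densities p and q can be written as below; as t → 1 the t-divergence recovers the ordinary Kullback–Leibler divergence.

```latex
% Hedged sketch: standard t-exponential-family definitions are assumed;
% the paper's exact notation may differ.
\[
\log_t(x) = \frac{x^{1-t} - 1}{1-t}, \qquad
\exp_t(x) = \bigl[1 + (1-t)\,x\bigr]_{+}^{1/(1-t)}, \qquad t \neq 1,
\]
\[
D_t(p \,\|\, q) = \int \tilde{p}(x)\,\bigl[\log_t p(x) - \log_t q(x)\bigr]\,dx,
\qquad
\tilde{p}(x) = \frac{p(x)^{t}}{\int p(x')^{t}\,dx'}.
\]
% Here \tilde{p} is the escort distribution of p; as t -> 1 we have
% \log_t -> \log and \tilde{p} -> p, so D_t reduces to the KL divergence.
```

Second, the "kitchen sink" layer the abstract describes: a bank of fixed, randomly drawn nonlinearities whose outputs are linearly combined with trainable weights. The NumPy sketch below is hypothetical (the function name, feature count, and cosine-feature choice are the editor's, not the authors'); it only illustrates the data flow, not the Bayesian treatment of the weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_kitchen_sink_layer(x, n_features=256, d_out=8, scale=1.0):
    """Illustrative layer: fixed random features, trainable linear readout.

    x: array of shape (batch, d_in). All names here are hypothetical.
    """
    d_in = x.shape[1]
    # Bank of randomized nonlinearities: a fixed random projection
    # followed by a cosine, one common "random kitchen sinks" choice.
    omega = rng.normal(0.0, scale, size=(d_in, n_features))  # never trained
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)
    phi = np.cos(x @ omega + b)
    # Weighted sum of the random features; in the paper these weights are
    # the (Bayesian) trainable quantities, here just randomly initialized.
    w = rng.normal(0.0, 0.1, size=(n_features, d_out))
    return phi @ w

# Toy usage: stacking two such layers mimics the deep variant's data flow.
x = rng.normal(size=(4, 8))
y = random_kitchen_sink_layer(random_kitchen_sink_layer(x))
print(y.shape)  # (4, 8)
```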
Scopus™ citations: 1 (checked on Nov 9, 2023)
Web of Science™ citations: 1 (checked on Oct 29, 2023)
Page views: 458 (checked on May 11, 2024)
Items in KTISIS are protected by copyright, with all rights reserved, unless otherwise indicated.