Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/10120
dc.contributor.author: Partaourides, Charalampos
dc.contributor.author: Chatzis, Sotirios P.
dc.contributor.other: Παρταουρίδης, Χαράλαμπος
dc.contributor.other: Χατζής, Σωτήριος
dc.date.accessioned: 2017-06-16T10:33:13Z
dc.date.available: 2017-06-16T10:33:13Z
dc.date.issued: 2017-01-01
dc.identifier.citation: 21st Pacific-Asia Conference on Knowledge Discovery and Data Mining, Jeju, South Korea, 23-26 May 2017
dc.identifier.isbn: 9783319574530
dc.description: Advances in Knowledge Discovery and Data Mining, 2017, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Volume 10234 LNAI, Pages 30-41
dc.description.abstract: Deep neural networks (DNNs) often require good regularizers to generalize well. Currently, state-of-the-art DNN regularization techniques consist in randomly dropping units and/or connections on each iteration of the training algorithm; Dropout and DropConnect are characteristic examples of such regularizers, and are widely popular among practitioners. However, a drawback of these approaches is that the postulated probability of random unit/connection omission is a constant that must be selected heuristically, based on performance on some validation set. To alleviate this burden, in this paper we regard the DNN regularization problem from a Bayesian inference perspective: we impose a sparsity-inducing prior over the network synaptic weights, where the sparsity is induced by a set of Bernoulli-distributed binary variables with Beta (hyper-)priors over their prior parameters. This way, we eventually allow for marginalizing over the DNN synaptic connectivity for output generation, thus giving rise to an effective, heuristics-free network regularization scheme. We perform Bayesian inference for the resulting hierarchical model by means of an efficient black-box variational inference scheme. We exhibit the advantages of our method over existing approaches through an extensive experimental evaluation on benchmark datasets.
dc.format: pdf
dc.language.iso: en
dc.rights: © 2017, Springer
dc.subject: Bayesian networks
dc.subject: Data mining
dc.subject: Hierarchical systems
dc.subject: Inference engines
dc.subject: Iterative methods
dc.title: Deep network regularization via Bayesian inference of synaptic connectivity
dc.type: Conference Papers
dc.collaboration: Cyprus University of Technology
dc.subject.category: Electrical Engineering - Electronic Engineering - Information Engineering
dc.country: Cyprus
dc.subject.field: Engineering and Technology
dc.publication: Peer Reviewed
dc.relation.conference: Pacific-Asia Conference on Knowledge Discovery and Data Mining
dc.identifier.doi: 10.1007/978-3-319-57454-7_3
cut.common.academicyear: 2016-2017
item.fulltext: No Fulltext
item.cerifentitytype: Publications
item.grantfulltext: none
item.openairecristype: http://purl.org/coar/resource_type/c_c94f
item.openairetype: conferenceObject
item.languageiso639-1: en
crisitem.author.dept: Department of Electrical Engineering, Computer Engineering and Informatics
crisitem.author.dept: Department of Electrical Engineering, Computer Engineering and Informatics
crisitem.author.faculty: Faculty of Engineering and Technology
crisitem.author.faculty: Faculty of Engineering and Technology
crisitem.author.orcid: 0000-0002-8555-260X
crisitem.author.orcid: 0000-0002-4956-4013
crisitem.author.parentorg: Faculty of Engineering and Technology
crisitem.author.parentorg: Faculty of Engineering and Technology
Appears in Collections: Conference papers, posters, or presentations (Δημοσιεύσεις σε συνέδρια)
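The Beta-Bernoulli construction described in the abstract can be sketched in code. The following is an illustrative toy example only, not the authors' implementation: it samples per-synapse keep probabilities from a Beta hyperprior, draws Bernoulli connectivity masks from them, and marginalizes over connectivity at prediction time by Monte Carlo averaging. In the paper these quantities are inferred with black-box variational inference; here all network sizes, hyperparameter values, and function names are assumptions made for the sketch.

```python
# Toy sketch (not the paper's method): Beta-Bernoulli masking of synaptic
# weights in a one-hidden-layer network, with test-time marginalization
# over connectivity via Monte Carlo averaging of sampled masks.
import numpy as np

rng = np.random.default_rng(0)

# Toy network weights (input 4 -> hidden 8 -> output 3); values are arbitrary.
W1 = rng.normal(0.0, 0.5, size=(4, 8))
W2 = rng.normal(0.0, 0.5, size=(8, 3))

# Beta hyperprior over per-connection keep probabilities pi.
# The paper infers these variationally; here we simply sample them once.
a, b = 2.0, 2.0                      # assumed Beta(a, b) hyperparameters
pi1 = rng.beta(a, b, size=W1.shape)  # keep probability per synapse, layer 1
pi2 = rng.beta(a, b, size=W2.shape)  # keep probability per synapse, layer 2

def forward(x, z1, z2):
    """Forward pass with binary connectivity masks z1, z2 applied."""
    h = np.tanh(x @ (W1 * z1))
    return h @ (W2 * z2)

def predict(x, n_samples=100):
    """Approximate marginalization over connectivity:
    average outputs over Bernoulli-sampled synapse masks."""
    outs = []
    for _ in range(n_samples):
        z1 = rng.binomial(1, pi1)    # synapse on/off ~ Bernoulli(pi)
        z2 = rng.binomial(1, pi2)
        outs.append(forward(x, z1, z2))
    return np.mean(outs, axis=0)

x = rng.normal(size=(1, 4))
y = predict(x)
print(y.shape)  # (1, 3)
```

Note that with constant, hand-picked keep probabilities this reduces to DropConnect-style masking; the Beta layer is what lets the omission probabilities themselves be inferred rather than tuned on a validation set.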
Items in KTISIS are protected by copyright, with all rights reserved, unless otherwise indicated.