Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/29868
DC Field: Value
dc.contributor.author: Panousis, Konstantinos P.
dc.contributor.author: Antoniadis, Anastasios
dc.contributor.author: Chatzis, Sotirios P.
dc.date.accessioned: 2023-07-14T09:22:22Z
dc.date.available: 2023-07-14T09:22:22Z
dc.date.issued: 2022-06-30
dc.identifier.citation: Proceedings of the 36th AAAI Conference on Artificial Intelligence (Virtual, Online), 2022, pp. 7931-7940
dc.identifier.isbn: 1577358767
dc.identifier.isbn: 978-1-57735-876-3
dc.identifier.uri: https://hdl.handle.net/20.500.14279/29868
dc.description.abstract: This work aims to address the long-established problem of learning diversified representations. To this end, we combine information-theoretic arguments with stochastic competition-based activations, namely Stochastic Local Winner-Takes-All (LWTA) units. In this context, we forgo the conventional deep architectures commonly used in Representation Learning, which rely on non-linear activations, and replace them with sets of locally and stochastically competing linear units. In this setting, each network layer yields sparse outputs, determined by the outcome of the competition between units that are organized into blocks of competitors. We adopt stochastic arguments for the competition mechanism, performing posterior sampling to determine the winner of each block. We further endow the considered networks with the ability to infer the sub-part of the network that is essential for modeling the data at hand, imposing appropriate stick-breaking priors to this end. To further enrich the information of the emerging representations, we resort to information-theoretic principles, namely the Information Competing Process (ICP). All components are then tied together under the stochastic Variational Bayes framework for inference. We perform a thorough experimental investigation of our approach on benchmark image classification datasets. As we experimentally show, the resulting networks exhibit significant discriminative representation learning abilities. In addition, the introduced paradigm allows for a principled mechanism for investigating the emerging intermediate network representations.
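As a concrete illustration of the stochastic LWTA mechanism described in the abstract, below is a minimal PyTorch sketch, not the authors' released code: linear pre-activations are grouped into blocks of competing units, a winner is sampled per block (here via the Gumbel-Softmax relaxation as one possible stand-in for posterior sampling over winner indicators), and the losing units are zeroed out, yielding the sparse, block-structured layer outputs the abstract describes. The class name, block sizes, and temperature are illustrative assumptions; the stick-breaking priors and ICP objective of the full method are not sketched here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StochasticLWTA(nn.Module):
    """One dense layer with stochastic Local Winner-Takes-All blocks (illustrative sketch)."""
    def __init__(self, in_features, num_blocks, units_per_block, tau=0.67):
        super().__init__()
        self.num_blocks = num_blocks
        self.units_per_block = units_per_block
        self.tau = tau  # Gumbel-Softmax temperature (illustrative value)
        # Plain linear units; no conventional non-linear activation.
        self.linear = nn.Linear(in_features, num_blocks * units_per_block)

    def forward(self, x):
        h = self.linear(x)                                       # (B, K*U)
        h = h.view(-1, self.num_blocks, self.units_per_block)    # (B, K, U)
        # Sample a (relaxed) one-hot "winner" per block; hard=True returns an
        # exact one-hot in the forward pass with a differentiable surrogate.
        winner = F.gumbel_softmax(h, tau=self.tau, hard=True, dim=-1)
        # Only the winning unit of each block passes its linear value on.
        return (h * winner).flatten(start_dim=1)                 # sparse (B, K*U)

# Hypothetical usage: one LWTA layer on flattened 28x28 inputs.
layer = StochasticLWTA(in_features=784, num_blocks=64, units_per_block=2)
out = layer(torch.randn(8, 784))  # with U=2, roughly half the entries are exactly zero
```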
dc.language.iso: en
dc.rights: Copyright © Elsevier B.V.
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Artificial intelligence
dc.subject: Classification (of information)
dc.subject: Information theory
dc.subject: Network layers
dc.subject: Stochastic systems
dc.title: Competing Mutual Information Constraints with Stochastic Competition-Based Activations for Learning Diversified Representations
dc.type: Article
dc.collaboration: Cyprus University of Technology
dc.subject.category: Electrical Engineering - Electronic Engineering - Information Engineering
dc.country: Cyprus
dc.subject.field: Engineering and Technology
dc.relation.conference: 36th AAAI Conference on Artificial Intelligence (AAAI)
dc.relation.volume: 36
cut.common.academicyear: 2022-2023
dc.identifier.spage: 7931
dc.identifier.epage: 7940
item.fulltext: No Fulltext
item.cerifentitytype: Publications
item.grantfulltext: none
item.openairecristype: http://purl.org/coar/resource_type/c_6501
item.openairetype: article
item.languageiso639-1: en
crisitem.author.dept: Department of Electrical Engineering, Computer Engineering and Informatics
crisitem.author.faculty: Faculty of Engineering and Technology
crisitem.author.orcid: 0000-0002-4956-4013
crisitem.author.parentorg: Faculty of Engineering and Technology
Appears in Collections: Conference publications / Conference papers, posters, or presentations
This item is licensed under a Creative Commons License.