Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/4128
DC Field | Value | Language
dc.contributor.author | Demiris, Yiannis | -
dc.contributor.author | Chatzis, Sotirios P. | -
dc.date | 2012 | en
dc.date.accessioned | 2014-07-09T07:04:29Z | -
dc.date.accessioned | 2015-12-09T11:30:25Z | -
dc.date.available | 2014-07-09T07:04:29Z | -
dc.date.available | 2015-12-09T11:30:25Z | -
dc.date.issued | 2012-12 | -
dc.identifier.citation | IEEE Transactions on Neural Networks and Learning Systems, 2012, vol. 23, no. 12, pp. 1862-1871 | en_US
dc.identifier.issn | 21622388 | -
dc.identifier.uri | https://hdl.handle.net/20.500.14279/4128 | -
dc.description.abstract | Gaussian processes (GPs) constitute one of the most important Bayesian machine learning approaches, based on a particularly effective method for placing a prior distribution over the space of regression functions. Several researchers have considered postulating mixtures of GPs as a means of dealing with nonstationary covariance functions, discontinuities, multimodality, and overlapping output signals. In existing works, mixtures of GPs are based on the introduction of a gating function defined over the space of model input variables. This way, each postulated mixture component GP is effectively restricted to a limited subset of the input space. In this paper, we follow a different approach. We consider a fully generative nonparametric Bayesian model with power-law behavior, generating GPs over the whole input space of the learned task. We provide an efficient algorithm for model inference, based on the variational Bayesian framework, and prove its efficacy using benchmark and real-world datasets. © 2012 IEEE. (See the illustrative sketch after this record.) | en_US
dc.format | pdf | en_US
dc.language | en | en
dc.language.iso | en | en_US
dc.relation.ispartof | IEEE Transactions on Neural Networks and Learning Systems | en_US
dc.rights | © 2012 IEEE | en_US
dc.subject | Gaussian Processes | en_US
dc.subject | Machine learning approaches | en_US
dc.subject | Mixture model | en_US
dc.subject | Non-stationary covariance | en_US
dc.subject | Nonparametric Bayesian models | en_US
dc.subject | Nonparametric mixtures | en_US
dc.subject | Variational Bayesian frameworks | en_US
dc.subject | Algorithms | en_US
dc.subject | Bayesian networks | en_US
dc.subject | Gaussian distribution | en_US
dc.subject | Gaussian noise (electronic) | en_US
dc.subject | Mixtures | en_US
dc.subject | Space power generation | en_US
dc.subject | Process regression | en_US
dc.subject | Neural-networks | en_US
dc.subject | Models | en_US
dc.title | Nonparametric Mixtures of Gaussian Processes With Power-Law Behavior | en_US
dc.type | Article | en_US
dc.collaboration | Cyprus University of Technology | en_US
dc.collaboration | Imperial College London | en_US
dc.subject.category | Electrical Engineering - Electronic Engineering - Information Engineering | en_US
dc.journals | Subscription | en_US
dc.review | Peer Reviewed | -
dc.country | Cyprus | en_US
dc.country | United Kingdom | en_US
dc.subject.field | Engineering and Technology | en_US
dc.publication | Peer Reviewed | en_US
dc.identifier.doi | 10.1109/TNNLS.2012.2217986 | en_US
dc.dept.handle | 123456789/134 | en
dc.relation.issue | 12 | en_US
dc.relation.volume | 23 | en_US
cut.common.academicyear | 2012-2013 | en_US
dc.identifier.spage | 1862 | en_US
dc.identifier.epage | 1871 | en_US
item.grantfulltext | none | -
item.openairecristype | http://purl.org/coar/resource_type/c_6501 | -
item.fulltext | No Fulltext | -
item.languageiso639-1 | en | -
item.cerifentitytype | Publications | -
item.openairetype | article | -
crisitem.journal.journalissn | 2162237X | -
crisitem.journal.publisher | IEEE | -
crisitem.author.dept | Department of Electrical Engineering, Computer Engineering and Informatics | -
crisitem.author.faculty | Faculty of Engineering and Technology | -
crisitem.author.orcid | 0000-0002-4956-4013 | -
crisitem.author.parentorg | Faculty of Engineering and Technology | -
Appears in Collections: Άρθρα/Articles
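The abstract above describes a fully generative nonparametric Bayesian mixture of GPs whose component allocation exhibits power-law behavior, with every component GP defined over the whole input space rather than being gated by the inputs. Below is a minimal illustrative sketch of that style of generative construction, not the authors' model or their variational inference algorithm: cluster labels are drawn from a Pitman-Yor-style urn scheme (a standard nonparametric prior whose cluster sizes follow a power law), and each active component receives an independent GP draw under a squared-exponential kernel over all inputs. The function names, the hyperparameters d, alpha, and lengthscale, and the synthetic data are assumptions made purely for illustration.

```python
import numpy as np

# Illustrative sketch only (not the paper's code): power-law cluster assignments
# via a Pitman-Yor urn scheme, with one GP drawn over the WHOLE input space per
# active component, in contrast to input-gated mixtures of GPs.
rng = np.random.default_rng(0)

def pitman_yor_assignments(n, d=0.5, alpha=1.0):
    """Sequential (Chinese-restaurant style) draw of n cluster labels."""
    labels, counts = [], []
    for i in range(n):
        k = len(counts)
        # existing cluster j: (counts[j] - d) / (i + alpha); new cluster: (alpha + d*k) / (i + alpha)
        probs = np.array([c - d for c in counts] + [alpha + d * k]) / (i + alpha)
        j = rng.choice(k + 1, p=probs)
        if j == k:
            counts.append(1)
        else:
            counts[j] += 1
        labels.append(j)
    return np.array(labels)

def rbf_kernel(x, lengthscale=0.3, variance=1.0):
    """Squared-exponential covariance matrix for 1-D inputs x."""
    sq = (x[:, None] - x[None, :]) ** 2
    return variance * np.exp(-0.5 * sq / lengthscale**2)

# Inputs spread over the whole task input space [0, 1].
x = np.sort(rng.uniform(0.0, 1.0, size=200))
z = pitman_yor_assignments(len(x))            # component responsible for each point
K = rbf_kernel(x) + 1e-8 * np.eye(len(x))     # shared GP prior covariance (with jitter)
L = np.linalg.cholesky(K)

# One latent GP sample per active component, evaluated at every input.
functions = {j: L @ rng.standard_normal(len(x)) for j in np.unique(z)}
# Synthetic observations: each point reads off its component's function plus noise.
y = np.array([functions[z[i]][i] for i in range(len(x))]) + 0.05 * rng.standard_normal(len(x))

print(f"{len(functions)} active GP components for {len(x)} observations")
```

The contrast with gated mixtures is the assignment mechanism: here a component may claim observations anywhere in [0, 1], because labels come from the urn scheme rather than from a gating function of the inputs, matching the "GPs over the whole input space" idea the abstract emphasizes.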

SCOPUS™ Citations: 19 (checked on Nov 9, 2023)
Web of Science™ Citations: 20 (last week: 0; last month: 0; checked on Oct 29, 2023)
Page view(s): 405 (last week: 0; last month: 3; checked on Dec 23, 2024)

Items in KTISIS are protected by copyright, with all rights reserved, unless otherwise indicated.