Title: A latent variable Gaussian process model with Pitman–Yor process priors for multiclass classification
Authors: Chatzis, Sotirios P. 
Keywords: Gaussian process;Pitman–Yor process;Mixture model
Category: Computer and Information Sciences
Field: Natural Sciences
Issue Date: Nov-2013
Publisher: Elsevier B.V.
Source: Neurocomputing, 2013, Volume 120, Pages 482–489
Abstract: Gaussian processes (GPs) constitute one of the most important Bayesian machine learning approaches. Several researchers have considered postulating mixtures of Gaussian processes as a means of dealing with non-stationary covariance functions, discontinuities, multi-modality, and overlapping output signals. In existing works, mixtures of Gaussian processes are based on the introduction of a gating function defined over the space of model input variables. This way, each postulated mixture component Gaussian process is effectively restricted to a limited subset of the input space. Additionally, the applicability of these models is limited to regression tasks. In this paper, for the first time in the literature, we devise a Gaussian process mixture model especially suitable for multiclass classification applications: We consider a GP classification scheme the prior distribution of which is a fully generative nonparametric Bayesian model with power-law behavior, generating Gaussian processes over the whole input space of the learned task. We provide an efficient algorithm for model inference, based on the variational Bayesian framework, and demonstrate its efficacy using benchmark and real-world classification datasets.
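The power-law behavior mentioned in the abstract comes from the Pitman–Yor process prior over mixture components. As a minimal sketch (not the paper's inference algorithm), the standard stick-breaking construction of Pitman–Yor mixture weights can be written as follows; the function name and parameters here are illustrative, with discount d = 0 recovering the Dirichlet process and d > 0 yielding heavier, power-law component tails:

```python
import numpy as np

def pitman_yor_weights(alpha, d, K, rng=None):
    """Draw the first K stick-breaking weights of a Pitman-Yor process.

    Standard construction: V_k ~ Beta(1 - d, alpha + k*d) and
    w_k = V_k * prod_{j<k} (1 - V_j), for concentration alpha > -d
    and discount 0 <= d < 1. With d = 0 this reduces to the
    Dirichlet process; d > 0 gives the power-law behavior that
    motivates the Pitman-Yor prior in the mixture model.
    """
    rng = np.random.default_rng(rng)
    # One Beta draw per stick segment, k = 1..K.
    v = rng.beta(1.0 - d, alpha + d * np.arange(1, K + 1))
    # Fraction of the stick remaining before each break.
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return v * remaining

# Example: truncated weight vector for a mixture with many small components.
w = pitman_yor_weights(alpha=1.0, d=0.5, K=100, rng=0)
```

The truncated weights are positive and sum to less than one, with the leftover mass attributable to the infinitely many remaining components.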
ISSN: 0925-2312
Rights: © Elsevier B.V.
Type: Article
Appears in Collections: Άρθρα/Articles
