Title: A possibilistic clustering approach toward generative mixture models
Authors: Chatzis, Sotirios P.
Keywords: Pattern recognition; Mixtures; Computer science
Issue Date: 2012
Publisher: Elsevier
Source: Pattern Recognition, 2012, Volume 45, Issue 5, Pages 1819–1825
Abstract: Generative mixture models (MMs) provide one of the most popular methodologies for unsupervised data clustering. MMs are formulated on the assumption that each observation derives from (belongs to) a single cluster. However, in many applications data may intuitively belong to multiple classes, rendering the single-cluster assignment assumption of MMs inappropriate. Furthermore, even in applications where a single-cluster assignment is required, the multinomial allocation of the modeled data points to the clusters derived by an MM imposes the constraint that the membership probabilities of a data point across clusters sum to one; this makes MMs very vulnerable to the presence of outliers in the clustered data sets, and renders them ineffective in discriminating between cases of equal evidence or ignorance. To resolve these issues, in this paper we introduce a possibilistic formulation of MMs. Possibilistic clustering is a methodology that yields possibilistic data partitions, with the obtained membership values interpreted as degrees of possibility (compatibilities) of the data points with respect to the various clusters. We provide an efficient maximum-likelihood fitting algorithm for the proposed model, and we conduct an objective evaluation of its efficacy using benchmark data.
URI: http://ktisis.cut.ac.cy/handle/10488/7244
ISSN: 0031-3203
DOI: http://dx.doi.org/10.1016/j.patcog.2011.10.010
Rights: © 2011 Elsevier Ltd. All Rights Reserved
Type: Article
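The record gives only the abstract, not the paper's fitting equations, so the following is a hedged sketch rather than the paper's method: it uses a generic Krishnapuram–Keller-style typicality function (the `eta` scale and fuzzifier `m` are illustrative assumptions) to contrast sum-to-one mixture posteriors with possibilistic memberships on an outlier.

```python
import numpy as np

def probabilistic_memberships(X, centers):
    # Mixture-style posteriors: each row is normalized to sum to one,
    # so even a distant outlier is forced to commit fully to some cluster.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-0.5 * (d2 - d2.min(axis=1, keepdims=True)))  # stable softmax
    return w / w.sum(axis=1, keepdims=True)

def possibilistic_memberships(X, centers, eta=1.0, m=2.0):
    # Possibilistic (typicality-style) memberships: values in (0, 1]
    # with NO sum-to-one constraint, so an outlier can score low
    # against every cluster. eta and m are illustrative, not the
    # parameters of the paper's generative formulation.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return 1.0 / (1.0 + (d2 / eta) ** (1.0 / (m - 1.0)))

centers = np.array([[0.0, 0.0], [4.0, 0.0]])
X = np.array([[0.0, 0.0],     # near cluster 0
              [2.0, 0.0],     # equal evidence for both clusters
              [50.0, 50.0]])  # gross outlier

P = probabilistic_memberships(X, centers)
U = possibilistic_memberships(X, centers)
```

Each row of `P` sums to one, so the outlier receives a posterior near 1 for its (marginally) nearer cluster, whereas its possibilistic memberships in `U` are close to zero for both clusters; the equal-evidence point gets identical membership in both cases, but only the possibilistic values distinguish it from the case of total ignorance.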
Appears in Collections: Άρθρα/Articles