Title: Deformable probability maps: probabilistic shape and appearance-based object segmentation
Authors: Chatzis, Sotirios P. 
Tsechpenakis, Gavriil 
Keywords: Computer science;Classification;Data processing;Computer vision
Category: Computer and Information Sciences
Field: Engineering and Technology
Issue Date: 2011
Publisher: Elsevier
Source: Computer Vision and Image Understanding, 2011, vol. 115, no. 8, pp. 1157–1169
Journal: Computer Vision and Image Understanding 
Abstract: We present Deformable Probability Maps (DPMs) for object segmentation: graphical learning models that incorporate properties of deformable models into discriminative classification. The DPM configuration is described by probabilistic energy functionals, which incorporate shape and appearance, and determine boundary smoothness, image feature consistency, and topology with respect to the salient edges of the image. Similarly to deformable models, DPMs are dynamic, and their evolution is solved as a MAP inference problem. DPMs offer two major advantages: (i) they extend the Markovian property in the image domain to incorporate local shape constraints, similar to the internal energy of deformable models, and therefore provide increased robustness in capturing objects with fuzzy boundaries; (ii) during their evolution, DPMs update the region statistics, and are therefore robust to image feature variations. In our experiments we evaluate the performance of DPMs on a variety of images, and compare them with existing deformable models and classification approaches on standard benchmark datasets.
ISSN: 1077-3142 (print)
1090-235X (online)
DOI: 10.1016/j.cviu.2010.09.010
Rights: © 2011 Elsevier
Type: Article
Appears in Collections:Άρθρα/Articles
