Boosting Bayesian MAP classification

Paolo Piro*, Richard Nock, Frank Nielsen, Michel Barlaud

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In this paper we redefine and generalize the classic k-nearest neighbors (k-NN) voting rule in a Bayesian maximum-a-posteriori (MAP) framework. In this framework, annotated examples are used to estimate pointwise class probabilities in the feature space, giving rise to a new instance-based classification rule. Specifically, we propose to "boost" the classic k-NN rule by inducing a strong classifier from a combination of sparse training data, called "prototypes". To learn these prototypes, our MAPBOOST algorithm globally minimizes a multiclass exponential risk defined over the training data, which depends on the class probabilities estimated at the sample points themselves. We tested our method for image categorization on three benchmark databases. Experimental results show that MAPBOOST significantly outperforms classic k-NN (by up to 8%). Notably, thanks to the supervised selection of sparse prototypes and the multiclass classification framework, this accuracy improvement comes with a considerable reduction in computational cost.
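The paper's formal construction is in the full text; as a rough, hypothetical illustration of the ingredients the abstract names (pointwise k-NN class-posterior estimates, a multiclass exponential risk over the training data, and a sparse prototype set), the following Python sketch combines them in the simplest way: posteriors are estimated by k-NN voting, the exponential risk is computed from a one-vs-best margin, and prototypes are chosen by a naive greedy search. All names, the margin definition, and the greedy strategy are illustrative assumptions only; MAPBOOST itself performs a boosting-style global minimization that this sketch does not reproduce.

import numpy as np

def knn_map_posteriors(X_ref, y_ref, X_query, k, n_classes):
    # Classic k-NN voting read as MAP: p(c | x) is estimated as the
    # fraction of the k nearest reference points that carry label c.
    post = np.zeros((len(X_query), n_classes))
    for i, x in enumerate(X_query):
        d = np.linalg.norm(X_ref - x, axis=1)
        for c in y_ref[np.argsort(d)[:k]]:
            post[i, c] += 1.0 / k
    return post

def exp_risk(post, y):
    # Hypothetical multiclass exponential risk: exp(-margin) averaged
    # over samples, with margin = posterior of the true class minus the
    # largest posterior among the wrong classes.
    idx = np.arange(len(y))
    wrong = post.copy()
    wrong[idx, y] = -np.inf
    margin = post[idx, y] - wrong.max(axis=1)
    return float(np.exp(-margin).mean())

def greedy_prototypes(X, y, k, n_proto, n_classes):
    # Naive stand-in for prototype learning: greedily add the training
    # point whose inclusion most reduces the exponential risk of the
    # k-NN MAP rule evaluated on the training set itself.
    chosen, pool = [], list(range(len(X)))
    for _ in range(n_proto):
        best = min(
            pool,
            key=lambda j: exp_risk(
                knn_map_posteriors(X[chosen + [j]], y[chosen + [j]],
                                   X, min(k, len(chosen) + 1), n_classes),
                y,
            ),
        )
        chosen.append(best)
        pool.remove(best)
    return np.array(chosen)

# Toy usage on synthetic 2-D data with 3 classes.
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(m, 0.5, size=(30, 2)) for m in (0, 3, 6)])
y = np.repeat([0, 1, 2], 30)
proto = greedy_prototypes(X, y, k=3, n_proto=9, n_classes=3)
pred = knn_map_posteriors(X[proto], y[proto], X, k=3, n_classes=3).argmax(axis=1)
print("training accuracy with 9 prototypes:", (pred == y).mean())

Classifying against 9 prototypes instead of all 90 training points is what yields the computational saving the abstract reports, since each query now requires far fewer distance computations.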

Original language: English
Title of host publication: Proceedings - 2010 20th International Conference on Pattern Recognition, ICPR 2010
Pages: 661-665
Number of pages: 5
DOIs
Publication status: Published - 2010
Externally published: Yes
Event: 2010 20th International Conference on Pattern Recognition, ICPR 2010 - Istanbul, Turkey
Duration: 23 Aug 2010 - 26 Aug 2010

Publication series

Name: Proceedings - International Conference on Pattern Recognition
ISSN (Print): 1051-4651

Conference

Conference: 2010 20th International Conference on Pattern Recognition, ICPR 2010
Country/Territory: Turkey
City: Istanbul
Period: 23/08/10 - 26/08/10
