Properties of bagged nearest neighbour classifiers

Peter Hall, Richard J. Samworth*

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    42 Citations (Scopus)

    Abstract

    It is shown that bagging, a computationally intensive method, asymptotically improves the performance of nearest neighbour classifiers provided that the resample size is less than 69% of the actual sample size, in the case of with-replacement bagging, or less than 50% of the sample size, for without-replacement bagging. However, for larger sampling fractions there is no asymptotic difference between the risk of the regular nearest neighbour classifier and its bagged version. In particular, neither achieves the large-sample performance of the Bayes classifier. In contrast, when the sampling fractions converge to 0 but the resample sizes diverge to ∞, the bagged classifier converges to the optimal Bayes rule and its risk converges to the risk of the latter. These results are most readily seen when the two populations have well-defined densities, but they may also be derived in other cases, where densities exist only in a relative sense. Cross-validation can be used effectively to choose the sampling fraction. Numerical calculation is used to illustrate these theoretical properties.
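    The bagged nearest-neighbour classifier studied in the abstract can be sketched as follows. This is an illustrative implementation, not the authors' code: for each of B resamples of size m drawn without replacement (so the sampling fraction is m/n), a 1-nearest-neighbour vote is taken, and the majority label wins. The data, function names, and parameter values are all hypothetical.

    ```python
    # Sketch of bagged 1-nearest-neighbour classification with
    # without-replacement resampling; sampling fraction = m / n.
    import numpy as np

    def one_nn(train_X, train_y, x):
        """Label of the training point nearest to x (squared Euclidean distance)."""
        d = np.sum((train_X - x) ** 2, axis=1)
        return train_y[np.argmin(d)]

    def bagged_one_nn(train_X, train_y, x, m, B=200, seed=None):
        """Majority vote of 1-NN classifiers over B without-replacement
        resamples of size m from the n training points."""
        rng = np.random.default_rng(seed)
        n = len(train_y)
        votes = []
        for _ in range(B):
            idx = rng.choice(n, size=m, replace=False)
            votes.append(one_nn(train_X[idx], train_y[idx], x))
        labels, counts = np.unique(votes, return_counts=True)
        return labels[np.argmax(counts)]

    # Toy example: two well-separated Gaussian classes in the plane.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-2.0, 1.0, size=(50, 2)),
                   rng.normal(+2.0, 1.0, size=(50, 2))])
    y = np.array([0] * 50 + [1] * 50)

    # Sampling fraction m/n = 0.3, below the 50% threshold for
    # without-replacement bagging discussed in the abstract.
    print(bagged_one_nn(X, y, np.array([-2.0, -2.0]), m=30, seed=1))
    print(bagged_one_nn(X, y, np.array([+2.0, +2.0]), m=30, seed=1))
    ```

    Choosing m (equivalently, the sampling fraction m/n) by cross-validation, as the abstract suggests, amounts to running this classifier for a grid of m values and selecting the one with the smallest estimated risk.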

    Original language: English
    Pages (from-to): 363-379
    Number of pages: 17
    Journal: Journal of the Royal Statistical Society. Series B: Statistical Methodology
    Volume: 67
    Issue number: 3
    DOIs
    Publication status: Published - 2005
