Boosting through optimization of margin distributions

Chunhua Shen*, Hanxi Li

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    53 Citations (Scopus)

    Abstract

    Boosting has been of great interest recently in the machine learning community because of its impressive performance on classification and regression problems. The success of boosting algorithms may be interpreted in terms of the margin theory. Recently, it has been shown that bounds on the generalization error of classifiers can be obtained by explicitly taking the margin distribution of the training data into account. However, most boosting algorithms used in practice optimize a convex loss function and do not make use of the margin distribution. In this brief, we design a new boosting algorithm, termed margin-distribution boosting (MDBoost), which directly maximizes the average margin and minimizes the margin variance at the same time. This way the margin distribution is optimized. A totally corrective optimization algorithm based on column generation is proposed to implement MDBoost. Experiments on various data sets show that MDBoost outperforms AdaBoost and LPBoost in most cases.
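    The objective described in the abstract can be illustrated with a small sketch. The snippet below (not the paper's exact quadratic program or its column-generation solver) evaluates a margin-distribution objective — average margin minus a penalty on margin variance — for a fixed ensemble weighting. The data `H`, `y`, the weighting, and the trade-off parameter `lam` are made-up for illustration.

    ```python
    import numpy as np

    def margin_objective(H, y, w, lam=1.0):
        """Average margin minus (lam / 2) * margin variance.

        H : (n_samples, n_learners) weak-learner outputs in {-1, +1}
        y : (n_samples,) labels in {-1, +1}
        w : (n_learners,) nonnegative ensemble weights summing to 1
        """
        margins = y * (H @ w)                      # per-sample margins
        return margins.mean() - 0.5 * lam * margins.var()

    rng = np.random.default_rng(0)
    n, m = 20, 5
    y = rng.choice([-1, 1], size=n)
    # synthetic weak learners that agree with y about 70% of the time
    H = np.where(rng.random((n, m)) < 0.7, y[:, None], -y[:, None])

    uniform = np.full(m, 1.0 / m)
    print(margin_objective(H, y, uniform))
    ```

    Maximizing this objective over the simplex of weights favors weightings whose margins are both large on average and tightly concentrated, which is the intuition behind optimizing the margin distribution rather than only the minimum margin (as LPBoost does).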

    Original language: English
    Article number: 5411921
    Pages (from-to): 659-666
    Number of pages: 8
    Journal: IEEE Transactions on Neural Networks
    Volume: 21
    Issue number: 4
    Publication status: Published - Apr 2010

