On the dual formulation of boosting algorithms

Chunhua Shen*, Hanxi Li

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    104 Citations (Scopus)

    Abstract

    We study boosting algorithms from a new perspective. We show that the Lagrange dual problems of l1-norm-regularized AdaBoost, LogitBoost, and soft-margin LPBoost with generalized hinge loss are all entropy maximization problems. By examining the dual problems of these boosting algorithms, we show that the success of boosting can be understood in terms of maintaining a better margin distribution: maximizing margins while at the same time controlling the margin variance. We also theoretically prove that, approximately, l1-norm-regularized AdaBoost maximizes the average margin rather than the minimum margin. The dual formulation also enables us to develop column-generation-based optimization algorithms, which are totally corrective. We show that they achieve classification results almost identical to those of standard stagewise additive boosting algorithms, but with much faster convergence. Therefore, fewer weak classifiers are needed to build the ensemble using our proposed optimization technique.
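The abstract's column-generation scheme can be illustrated with a minimal sketch (not the paper's actual algorithm): each round adds the weak classifier with the largest edge under the current sample weights, then re-solves for all ensemble weights at once, which is what makes the method "totally corrective". The stump enumeration, the l1-regularized exponential loss, the projected-gradient solver, and the toy interval dataset below are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

def stump_predictions(X):
    """Enumerate all threshold stumps on every feature, both polarities.
    Each column of the returned matrix holds one candidate weak
    classifier's {-1, +1} predictions on the training set."""
    cols = []
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            p = np.where(X[:, f] <= t, 1.0, -1.0)
            cols.append(p)
            cols.append(-p)
    return np.array(cols).T  # shape (n_samples, n_candidates)

def totally_corrective_boost(X, y, n_rounds=10, reg=0.01, lr=0.1, inner=500):
    """Column-generation boosting sketch: per round, pick the candidate with
    the largest edge under the sample weights u (the most violated dual
    constraint), then re-optimize ALL ensemble weights (totally corrective
    step) by projected gradient on l1-regularized exponential loss."""
    H = stump_predictions(X)
    n = len(y)
    u = np.full(n, 1.0 / n)            # sample weights (dual variables)
    chosen, w = [], np.zeros(0)
    for _ in range(n_rounds):
        edges = (u * y) @ H            # edge of every candidate column
        j = int(np.argmax(edges))      # column generation: best new column
        if j not in chosen:
            chosen.append(j)
            w = np.append(w, 0.0)
        A = y[:, None] * H[:, chosen]  # per-sample margin contributions
        for _ in range(inner):         # totally corrective re-solve
            g = -(A.T @ np.exp(-(A @ w))) / n + reg
            w = np.maximum(w - lr * g, 0.0)   # keep weights nonnegative
        u = np.exp(-(A @ w))           # reweight samples by exp(-margin)
        u /= u.sum()
    return H[:, chosen] @ w            # ensemble scores on the training set

# Toy 1-D interval problem: no single stump separates it, an ensemble can.
X = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
y = np.where((X[:, 0] > 0.3) & (X[:, 0] < 0.7), 1.0, -1.0)
scores = totally_corrective_boost(X, y)
train_acc = float(np.mean(np.sign(scores) == y))
```

Because every past weight is re-optimized each round, the ensemble stays compact, which is the convergence advantage the abstract claims over stagewise boosting, where earlier weights are frozen.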

    Original language: English
    Article number: 5432192
    Pages (from-to): 2216-2231
    Number of pages: 16
    Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
    Volume: 32
    Issue number: 12
    DOIs
    Publication status: Published - 2010
