Abstract
We study boosting algorithms from a new perspective. We show that the Lagrange dual problems of l1-norm-regularized AdaBoost, LogitBoost, and soft-margin LPBoost with generalized hinge loss are all entropy maximization problems. By examining the dual problems of these boosting algorithms, we show that the success of boosting can be understood in terms of maintaining a better margin distribution by maximizing margins and, at the same time, controlling the margin variance. We also theoretically prove that, approximately, l1-norm-regularized AdaBoost maximizes the average margin instead of the minimum margin. The duality formulation also enables us to develop column-generation-based optimization algorithms, which are totally corrective. We show that they exhibit almost identical classification results to those of standard stagewise additive boosting algorithms, but with much faster convergence rates. Therefore, fewer weak classifiers are needed to build the ensemble using our proposed optimization technique.
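As a rough illustration of the column-generation idea mentioned in the abstract, the sketch below implements a totally corrective, soft-margin LPBoost-style training loop: each round adds the weak classifier with the largest edge under the current sample weights, then re-solves the restricted dual LP over all selected classifiers. This is a minimal sketch under assumed conventions (binary labels in {-1, +1}, a pool of decision stumps, SciPy's HiGHS LP solver), not the authors' implementation; names such as `stump_pool` and `lpboost_cg` are hypothetical.

```python
# Minimal column-generation LPBoost-style sketch (hypothetical, for illustration).
import numpy as np
from scipy.optimize import linprog

def stump_pool(X):
    """Enumerate stump outputs h(x) = sign(x[:, f] - t) over all features/thresholds."""
    cols = []
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            h = np.where(X[:, f] >= t, 1.0, -1.0)
            cols.append(h)
            cols.append(-h)
    return np.array(cols)  # shape: (num_stumps, num_samples)

def lpboost_cg(X, y, D=0.1, tol=1e-6, max_iters=100):
    """Soft-margin LPBoost via column generation; D must satisfy D >= 1/n."""
    H = stump_pool(X)
    n = len(y)
    u, beta = np.full(n, 1.0 / n), -np.inf  # dual sample weights, dual objective
    picked = []                             # indices of generated columns
    for _ in range(max_iters):
        edges = H @ (u * y)                 # edge of every candidate weak learner
        j = int(np.argmax(edges))
        if edges[j] <= beta + tol:          # no column can improve the dual: stop
            break
        picked.append(j)
        # Restricted dual LP over variables [u, beta]:
        #   min beta  s.t.  sum_i u_i y_i h_j(x_i) <= beta  for all picked j,
        #                   sum_i u_i = 1,  0 <= u_i <= D.
        c = np.r_[np.zeros(n), 1.0]
        A_ub = np.c_[H[picked] * y, -np.ones(len(picked))]
        A_eq = np.r_[np.ones(n), 0.0].reshape(1, -1)
        res = linprog(c, A_ub=A_ub, b_ub=np.zeros(len(picked)),
                      A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, D)] * n + [(None, None)], method="highs")
        u, beta = res.x[:n], res.x[n]
    # Totally corrective step: primal weights are the LP duals (marginals) of
    # the inequality constraints, so ALL weights are re-optimized every round.
    w = -res.ineqlin.marginals
    return picked, w
```

Usage: `picked, w = lpboost_cg(X_train, y_train)`; the ensemble predicts `sign(sum_k w[k] * H[picked[k]])`. The totally corrective character comes from re-solving the restricted LP from scratch each round, which is what yields the faster convergence (and hence smaller ensembles) claimed in the abstract, in contrast to stagewise methods that fix all previous weights.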
| Original language | English |
|---|---|
| Article number | 5432192 |
| Pages (from-to) | 2216-2231 |
| Number of pages | 16 |
| Journal | IEEE Transactions on Pattern Analysis and Machine Intelligence |
| Volume | 32 |
| Issue number | 12 |
| DOIs | |
| Publication status | Published - 2010 |