arXiv Analytics

arXiv:0904.2037 [cs.LG]

Boosting through Optimization of Margin Distributions

Chunhua Shen, Hanxi Li

Published 2009-04-14, updated 2010-01-06 (version 3)

Boosting has attracted much research attention in the past decade, and the success of boosting algorithms may be interpreted in terms of the margin theory. Recently it has been shown that a bound on the generalization error of a classifier can be obtained by explicitly taking the margin distribution of the training data into account. Yet most boosting algorithms used in practice optimize a convex loss function and do not make use of the margin distribution. In this work we design a new boosting algorithm, termed margin-distribution boosting (MDBoost), which simultaneously maximizes the average margin and minimizes the margin variance, thereby directly optimizing the margin distribution. A totally-corrective optimization algorithm based on column generation is proposed to implement MDBoost. Experiments on UCI datasets show that MDBoost outperforms AdaBoost and LPBoost in most cases.
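To make the idea concrete, below is a minimal Python sketch of an MDBoost-style objective and totally-corrective column-generation loop. It is an illustration under stated assumptions, not the authors' implementation: it assumes decision stumps as weak learners, a fixed trade-off parameter lam between average margin and margin variance, and uses SciPy's SLSQP solver in place of the paper's QP machinery; the sample-weight update is a gradient-based stand-in for the exact dual variables derived in the paper, and the names mdboost and stump_predictions are hypothetical.

```python
# Minimal MDBoost-style sketch: maximize mean margin, penalize margin variance.
import numpy as np
from scipy.optimize import minimize

def stump_predictions(X):
    """Enumerate decision stumps (feature, threshold, sign) on X.
    Returns an (n_samples, n_stumps) matrix of +/-1 predictions."""
    cols = []
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            p = np.where(X[:, j] <= t, 1.0, -1.0)
            cols.append(p)
            cols.append(-p)
    return np.column_stack(cols)

def mdboost(X, y, lam=1.0, n_rounds=30):
    H = stump_predictions(X)          # candidate weak-learner outputs
    n = len(y)
    u = np.full(n, 1.0 / n)           # sample weights (uniform start)
    chosen = []                       # indices of selected weak learners
    w = None

    def objective(w, Hsub):
        rho = y * (Hsub @ w)          # margins under the current ensemble
        return -rho.mean() + 0.5 * lam * rho.var()

    for _ in range(n_rounds):
        # Column generation: pick the stump with the largest weighted edge.
        edges = (u * y) @ H
        j = int(np.argmax(edges))
        if j in chosen:
            break                     # no new column improves the objective
        chosen.append(j)
        Hsub = H[:, chosen]

        # Totally-corrective step: re-optimize ALL weights (w >= 0, sum = 1).
        k = len(chosen)
        res = minimize(objective, np.full(k, 1.0 / k), args=(Hsub,),
                       method="SLSQP", bounds=[(0, None)] * k,
                       constraints=[{"type": "eq",
                                     "fun": lambda w: w.sum() - 1.0}])
        w = res.x

        # Sample-weight update from the margin gradient (a stand-in for the
        # paper's dual variables): emphasize small-margin examples.
        rho = y * (Hsub @ w)
        g = np.clip(1.0 - lam * (rho - rho.mean()), 0.0, None)
        u = g / g.sum() if g.sum() > 0 else np.full(n, 1.0 / n)

    return chosen, w, H

# Toy usage: two Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.r_[np.full(50, -1.0), np.full(50, 1.0)]
chosen, w, H = mdboost(X, y)
print("training accuracy:", (np.sign(H[:, chosen] @ w) == y).mean())
```

The key design point the sketch mirrors is that the inner optimization is totally corrective: every round re-solves for all weak-learner weights jointly, rather than fixing past weights as AdaBoost does.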

Comments: 9 pages. Published in IEEE Transactions on Neural Networks, 21(7), July 2010
Categories: cs.LG, cs.CV
Related articles:
arXiv:1909.12673 [cs.LG] (Published 2019-09-27)
A Constructive Prediction of the Generalization Error Across Scales
arXiv:1808.01174 [cs.LG] (Published 2018-08-03)
Generalization Error in Deep Learning
arXiv:1206.3274 [cs.LG] (Published 2012-06-13)
Small Sample Inference for Generalization Error in Classification Using the CUD Bound