Boosting

Cited by: 20
Authors
Bühlmann, Peter [1]
Yu, Bin [2]
Affiliations
[1] Swiss Fed Inst Technol, Stat, CH-8092 Zurich, Switzerland
[2] Univ Calif Berkeley, Dept Stat, Berkeley, CA 94720 USA
Keywords
boosting; gradient descent; AdaBoost; L2Boost; base learner
DOI
10.1002/wics.55
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
In this contribution, we review boosting, one of the most effective machine learning methods for classification and regression. Most of the article takes the gradient descent point of view, although we include the margin point of view as well. In particular, AdaBoost for classification and various versions of L2Boosting for regression are covered. Advice on how to choose base (weak) learners and loss functions, as well as pointers to software, is also given for practitioners. © 2009 John Wiley & Sons, Inc.
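To make the gradient descent view mentioned in the abstract concrete, below is a minimal sketch of L2Boosting with a componentwise linear least-squares base learner, in the spirit of the algorithms reviewed by the article. The step size nu, the number of iterations, and the synthetic data are illustrative assumptions, not settings taken from the paper.

```python
import numpy as np

def l2boost(X, y, n_iter=100, nu=0.1):
    """Sketch of L2Boosting with a componentwise linear base learner.

    Each iteration fits the current residuals against the single predictor
    that reduces squared error the most, then takes a small step (nu) in
    that direction: functional gradient descent on the L2 loss.
    """
    n, p = X.shape
    coef = np.zeros(p)
    intercept = y.mean()                  # start from the constant fit
    residual = y - intercept
    for _ in range(n_iter):
        # univariate least-squares slope of each predictor vs. the residuals
        betas = X.T @ residual / (X ** 2).sum(axis=0)
        # residual sum of squares of each candidate univariate fit
        sse = ((residual[:, None] - X * betas) ** 2).sum(axis=0)
        j = np.argmin(sse)                # best-fitting single predictor
        coef[j] += nu * betas[j]          # shrunken coordinate update
        residual -= nu * betas[j] * X[:, j]
    return intercept, coef

# Toy usage on synthetic data (illustrative only).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
y = 3 * X[:, 0] - 2 * X[:, 3] + rng.standard_normal(200)
intercept, coef = l2boost(X, y, n_iter=200, nu=0.1)
print(np.round(coef, 2))
```

With a small step size and enough iterations, the recovered coefficients concentrate on the informative predictors, which is the shrinkage-plus-selection behavior that makes L2Boosting attractive for regression with many predictors.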
Pages: 69-74
Number of pages: 6