TaylorBoost: First and Second Order Boosting Algorithms

Boosting algorithms can be interpreted as optimization in a functional space for the minimization of classification risk. In this project, a new family of boosting algorithms, denoted TaylorBoost, is proposed. TaylorBoost supports any combination of loss function and first- or second-order optimization, and includes classical algorithms such as AdaBoost, AnyBoost, and LogitBoost as special cases. Unlike previous approaches to deriving boosting methods in functional space, TaylorBoost does not require an arbitrary weight specification. Instead, for both first- and second-order algorithms, the weights follow naturally from the optimization. TaylorBoost shows that there are two distinct weighting mechanisms in boosting: the first concentrates on harder examples, while the second focuses on examples close to the classification boundary.
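The two weighting mechanisms can be illustrated with a small sketch. This is not the authors' implementation; function names and loss choices are illustrative. For the exponential loss L(v) = exp(-v), the first-order weights |L'(y_i f(x_i))| are the familiar AdaBoost weights, which grow for misclassified (hard) examples; for the logistic loss, the second-order weights L''(y_i f(x_i)) peak for examples whose margin is near zero, i.e. close to the boundary.

```python
import numpy as np

def exp_loss_grad(margin):
    # derivative of the exponential loss L(v) = exp(-v)
    return -np.exp(-margin)

def logistic_loss_hess(margin):
    # second derivative of the logistic loss L(v) = log(1 + exp(-v))
    p = 1.0 / (1.0 + np.exp(-margin))
    return p * (1.0 - p)

def first_order_weights(y, f):
    # weight each example by |L'(y_i f_i)|: large for hard examples
    return np.abs(exp_loss_grad(y * f))

def second_order_weights(y, f):
    # weight each example by L''(y_i f_i): largest near the boundary
    return logistic_loss_hess(y * f)

y = np.array([1, 1, -1, -1])            # labels
f = np.array([2.0, -0.5, 0.1, -3.0])    # current predictor outputs
w1 = first_order_weights(y, f)          # peaks at the misclassified example (index 1)
w2 = second_order_weights(y, f)         # peaks at the near-boundary example (index 2)
```

In a full boosting iteration, these weights would multiply the examples when fitting the next weak learner; the point here is only that each weighting emerges from a first- or second-order Taylor expansion of the risk, rather than from an ad-hoc specification.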

Related Publications:
TaylorBoost: First and Second Order Boosting Algorithms with Explicit Margin Control.
Mohammad J. Saberian, Hamed Masnadi-Shirazi and Nuno Vasconcelos.
Proc. IEEE Conference on Computer Vision and Pattern Recognition (CVPR),
Colorado Springs, CO, 2011.