Gradient Boosting

Creator
Seonglae Cho
Created
2021 Oct 6 14:39
Editor
Edited
2025 Mar 24 22:11
Refs

Gradient Boosting is a generalization of boosting to an arbitrary differentiable loss function.

Similar to AdaBoost, Gradient Boosting builds upon previous models by adding new models that compensate for the errors of earlier ones. However, it takes a different approach:
  • Forward stagewise boosting adds, at each step, the tree that most reduces the loss given the current model
  • Instead of updating sample weights at each learning stage, it trains new models on the residual errors from previous stages
  • This focus on residual errors allows for more precise error correction and model improvement
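The stagewise residual-fitting loop above can be sketched in a few lines. This is a minimal illustration (not any library's implementation), assuming squared-error loss, where the negative gradient is exactly the residual `y - F`; the toy data, depth-1 "stump" learner, learning rate, and stage count are all assumptions for the example.

```python
import numpy as np

def fit_stump(x, r):
    """Fit a depth-1 regression tree (stump) to residuals r by scanning thresholds."""
    best, best_sse = (0.0, r.mean(), r.mean()), np.inf
    for t in np.unique(x):
        left, right = r[x <= t], r[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_sse, best = sse, (t, left.mean(), right.mean())
    return best  # (threshold, left value, right value)

def stump_predict(stump, x):
    t, lv, rv = stump
    return np.where(x <= t, lv, rv)

# Toy 1-D regression problem (assumed for illustration)
rng = np.random.RandomState(0)
x = rng.uniform(-3, 3, 200)
y = np.sin(x) + rng.normal(scale=0.1, size=200)

lr, n_stages = 0.1, 100
F = np.full_like(y, y.mean())      # stage 0: constant model
stumps = []
for _ in range(n_stages):
    residuals = y - F              # negative gradient of squared-error loss
    s = fit_stump(x, residuals)    # new model trained on residuals, not reweighted samples
    F += lr * stump_predict(s, x)  # shrunken additive update
    stumps.append(s)

print(np.mean((y - F) ** 2))       # training MSE after boosting
```

Each stage greedily picks the tree that best fits the current residuals, which for squared error is the forward-stagewise step described above; other losses only change how the pseudo-residuals are computed.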
Gradient Boosting Tools
[image: gradient boosting tools, for classification]
