Gradient Boosting

Creator
Seonglae Cho
Created
2021 Oct 6 14:39
Edited
2025 Mar 24 22:11
Refs

Gradient Boosting
is a generalization of boosting to an arbitrary differentiable loss function

Similar to
AdaBoost
, Gradient Boosting builds upon previous models by adding new models that compensate for the errors of earlier ones. However, it takes a different approach:
  • Forward stagewise boosting adds, at each step, the tree that reduces the loss the most given the current model
  • Instead of updating sample weights at each learning stage, it trains new models on the residual errors of the previous stages
  • This focus on residuals allows more precise error correction and steady model improvement
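The residual-fitting loop above can be sketched in a few lines. This is a minimal illustration, not a production implementation: for squared-error loss the negative gradient is exactly the residual, so each round fits a small tree to the current residuals and adds it to the ensemble. The data, learning rate, tree depth, and round count are all illustrative choices.

```python
# Minimal gradient boosting sketch for squared-error loss.
# Assumption: negative gradient of squared error = residual (y - prediction).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

n_rounds, learning_rate = 50, 0.1
pred = np.full_like(y, y.mean())       # stage 0: constant model
trees = []
for _ in range(n_rounds):
    residual = y - pred                # negative gradient of squared error
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)
    pred += learning_rate * tree.predict(X)  # stagewise additive update
    trees.append(tree)

mse_final = np.mean((y - pred) ** 2)
mse_const = np.mean((y - y.mean()) ** 2)
```

Each iteration shrinks the training loss because the new tree points in the direction of steepest descent of the loss with respect to the model's predictions.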
Gradient Boosting Tools
ML simple works - A Gentle Introduction to Gradient Boosting
Boosting algorithms are a family of ensemble methods that train weak learners sequentially and sum the individual learners' predictions to produce the final prediction. Among them, gradient boosting is the most widely used algorithm thanks to its strong performance.
Gradient boosting
Gradient boosting is a machine learning technique used in regression and classification tasks, among others. It gives a prediction model in the form of an ensemble of weak prediction models, i.e., models that make very few assumptions about the data, which are typically simple decision trees. When a decision tree is the weak learner, the resulting algorithm is called gradient-boosted trees; it usually outperforms random forest. A gradient-boosted trees model is built in a stage-wise fashion as in other boosting methods, but it generalizes the other methods by allowing optimization of an arbitrary differentiable loss function.