Ridge regression

Creator
Seonglae Cho
Created
2023 Mar 14 2:28
Editor
Edited
2025 Jan 28 12:44

Ridge regression applies regularization to increase bias and reduce variance.

Regularization
The concept of regularization was first applied in the ridge regression model.
Unlike ordinary linear regression, ridge regression minimizes not only the fitting error but also a regularizer.

Linear regression + L2 Norm

J(\theta) = \frac{1}{2}(\Phi\theta - \vec{y})^T(\Phi\theta - \vec{y}) + \frac{\lambda}{2}\theta^T\theta
Find the minimizer by setting the derivative of J(\theta) with respect to \theta to zero.
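Setting this derivative to zero yields the standard closed-form ridge solution:

\theta = (\Phi^T\Phi + \lambda I)^{-1}\Phi^T\vec{y}

The added \lambda I term makes the matrix invertible even when \Phi^T\Phi is singular, which is another practical benefit of the penalty.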
  • \lambda controls the tradeoff between the fitting error and the regularizer (i.e., between overfitting and underfitting)
The terms alpha, lambda, regularization parameter, and penalty term all refer to the same quantity.
Ridge regression shrinks the model's coefficients toward zero but does not make them exactly zero.
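The closed-form solution and the shrinkage behavior above can be sketched with NumPy. This is a minimal illustration on synthetic data, assuming the \theta = (\Phi^T\Phi + \lambda I)^{-1}\Phi^T\vec{y} closed form; `ridge_fit` and the data shapes are illustrative choices, not a fixed API.

```python
import numpy as np

def ridge_fit(Phi, y, lam):
    """Closed-form ridge solution: theta = (Phi^T Phi + lam*I)^{-1} Phi^T y."""
    d = Phi.shape[1]
    return np.linalg.solve(Phi.T @ Phi + lam * np.eye(d), Phi.T @ y)

rng = np.random.default_rng(0)
Phi = rng.normal(size=(50, 3))               # design matrix (50 samples, 3 features)
theta_true = np.array([2.0, -1.0, 0.5])
y = Phi @ theta_true + 0.1 * rng.normal(size=50)

theta_ols = ridge_fit(Phi, y, lam=0.0)       # lam = 0 reduces to ordinary least squares
theta_ridge = ridge_fit(Phi, y, lam=10.0)    # larger lam shrinks the coefficients

# The penalty shrinks coefficients toward zero but does not zero them out
print(np.linalg.norm(theta_ridge) < np.linalg.norm(theta_ols))
print(np.all(theta_ridge != 0))
```

Increasing `lam` pulls every coefficient further toward zero (more bias, less variance), matching the bias-variance tradeoff described above.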
