Bias-Variance Trade-off

Creator: Seonglae Cho
Created: 2023 May 9 2:33
Edited: 2025 Jun 2 11:23

Model Complexity and Its Impact

Overly simple models underfit, while overly complex models overfit.
The relationship between model complexity and performance is a fundamental concept in machine learning, because complexity drives both bias and variance.
For better generalization performance, we therefore often prefer a somewhat biased model within the Bias-Variance Trade-off, to avoid the high variance that unbiased (highly flexible) models exhibit.
  • Simple Models (Low Complexity)
    • Have large bias but small variance
    • Tend to underfit the data
    • Use fewer parameters
  • Complex Models (High Complexity)
    • Have small bias but large variance
    • Tend to overfit the data
    • Use many parameters
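As a rough illustration of both failure modes, the sketch below (hypothetical data; numpy only; the sine target, noise level, and polynomial degrees are assumptions for demonstration) fits a low-degree and a high-degree polynomial to the same noisy sample. The simple model keeps a large training error (underfitting), while the complex model drives training error near zero at the cost of fitting the noise (overfitting).

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed ground-truth function with observation noise
def f(x):
    return np.sin(2 * np.pi * x)

x_train = np.linspace(0, 1, 20)
y_train = f(x_train) + rng.normal(0, 0.2, size=x_train.shape)
x_test = np.linspace(0, 1, 200)
y_test = f(x_test)  # noise-free targets to measure generalization

def errors(degree):
    # Fit a polynomial of the given degree by least squares
    coeffs = np.polyfit(x_train, y_train, degree)
    train = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train, test

simple_train, simple_test = errors(1)     # low complexity: underfits
complex_train, complex_test = errors(12)  # high complexity: overfits

print(f"degree 1:  train MSE={simple_train:.3f}  test MSE={simple_test:.3f}")
print(f"degree 12: train MSE={complex_train:.3f}  test MSE={complex_test:.3f}")
```

The telling comparison is the gap between training and test error: the high-degree fit wins on training data but that advantage does not transfer to unseen points.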
This trade-off suggests there may be fundamental limits to artificial intelligence capabilities, analogous to the uncertainty principle in physics: we may need to balance model complexity against generalization ability.
Modern interpolating regime by Belkin et al. (2018)

Bias-Variance Decomposition (Risk)
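The decomposition itself is standard: for squared-error risk with targets $y = f(x) + \varepsilon$, where $\mathbb{E}[\varepsilon] = 0$ and $\operatorname{Var}(\varepsilon) = \sigma^2$, and an estimator $\hat f_D$ trained on a random dataset $D$, the expected risk at a point $x$ splits into three non-negative terms:

```latex
\mathbb{E}_{D,\varepsilon}\!\left[\big(y - \hat f_D(x)\big)^2\right]
= \underbrace{\big(f(x) - \mathbb{E}_D[\hat f_D(x)]\big)^2}_{\text{bias}^2}
+ \underbrace{\mathbb{E}_D\!\left[\big(\hat f_D(x) - \mathbb{E}_D[\hat f_D(x)]\big)^2\right]}_{\text{variance}}
+ \underbrace{\sigma^2}_{\text{irreducible noise}}
```

The cross terms vanish in expectation because the noise on a new observation is independent of the training data, which is why the split is exact.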

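The decomposition can also be checked numerically: repeatedly resample training sets, refit an estimator, and compare the empirical risk at a point against bias² + variance + σ². A minimal Monte Carlo sketch (numpy only; the sine target, noise level, and deliberately simple linear estimator are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 0.2                       # observation-noise std (assumed)
f = lambda x: np.sin(2 * np.pi * x)
x0 = 0.25                         # point at which we evaluate the risk
trials, n = 5000, 30

preds = np.empty(trials)
sq_err = np.empty(trials)
for t in range(trials):
    # Fresh training set each trial, so preds samples the estimator's distribution
    x = rng.uniform(0, 1, n)
    y = f(x) + rng.normal(0, sigma, n)
    coeffs = np.polyfit(x, y, 1)           # simple (biased) linear model
    preds[t] = np.polyval(coeffs, x0)
    y_new = f(x0) + rng.normal(0, sigma)   # unseen noisy observation at x0
    sq_err[t] = (y_new - preds[t]) ** 2

risk = sq_err.mean()                        # empirical E[(y - f_hat(x0))^2]
bias2 = (preds.mean() - f(x0)) ** 2
variance = preds.var()
print(risk, bias2 + variance + sigma**2)    # the two sides nearly match
```

For this underfitting linear model the bias² term dominates the variance term, matching the "simple model" row of the list above.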
