Back Propagation

Creator
Seonglae Cho
Created
2021 Oct 6 10:3
Edited
2024 Nov 26 11:42

Backpropagation

One method of Parametric Optimization

Calculates the partial Derivatives via the chain rule

\frac{\partial L}{\partial W_L} = \frac{\partial L}{\partial h_L} \frac{\partial h_L}{\partial W_L}
Compute the gradient of the Loss Function and update the weights
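The chain-rule update above can be sketched end to end. This is a minimal toy example (assumed setup: one linear unit h = w·x with squared-error loss), not any particular library's API:

```python
# Toy backpropagation step (assumed setup): h = w * x, L = (h - y)^2.
# Chain rule: dL/dw = (dL/dh) * (dh/dw); then a gradient-descent update.

def backprop_step(w, x, y, lr=0.1):
    h = w * x                    # forward pass
    loss = (h - y) ** 2          # loss function
    dL_dh = 2 * (h - y)          # upstream gradient at h
    dh_dw = x                    # local gradient of h w.r.t. w
    dL_dw = dL_dh * dh_dw        # chain rule
    return w - lr * dL_dw, loss  # updated weight, current loss

w = 0.0
for _ in range(50):
    w, loss = backprop_step(w, x=2.0, y=4.0)
print(round(w, 3))  # converges toward 2.0, since y = 2 * x
```

Repeating the step drives the loss toward zero, which is the whole point of computing the gradient before each weight update.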
Requires the Chain Rule over a Computational Graph, and a model whose gradient is available through its parameters
downstream \; gradient = upstream \; gradient \times local \; gradient
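This rule can be shown at a single node. A sketch for an assumed multiply node z = a · b, whose local gradients are ∂z/∂a = b and ∂z/∂b = a:

```python
# Backward pass of a single multiply node z = a * b (assumed example):
# each input's downstream gradient = upstream gradient * local gradient.

def multiply_backward(a, b, upstream):
    grad_a = upstream * b  # local gradient dz/da = b
    grad_b = upstream * a  # local gradient dz/db = a
    return grad_a, grad_b

ga, gb = multiply_backward(a=3.0, b=4.0, upstream=2.0)
print(ga, gb)  # 8.0 6.0
```

Every node type only needs its own local gradient; composing them along the graph reproduces the full chain rule.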
Different paths through the Computational Graph are not reused; each function along a path contributes its own derivative to the chain
Back Propagation Notion