Update weights & manage learning rate
An optimizer is an algorithm that adjusts a model's parameters to minimize the difference between the model's predicted outputs and the actual outputs in the training data.
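A minimal sketch of this idea, using plain gradient descent on a toy one-parameter model (the function name, learning rate, and data values are illustrative, not from the original notes):

```python
import numpy as np

def sgd_step(params, grads, lr=0.05):
    """One plain gradient-descent update: move each parameter
    against its gradient, scaled by a fixed learning rate."""
    return params - lr * grads

# Toy example: fit y = w * x to a single training pair (x, y_true).
x, y_true = 2.0, 6.0          # illustrative training example
w = np.array([0.0])           # model parameter, initialized at zero

for step in range(50):
    y_pred = w * x                     # model prediction
    grad = 2 * (y_pred - y_true) * x   # gradient of (y_pred - y_true)**2 w.r.t. w
    w = sgd_step(w, grad)

print(w)  # approaches 3.0, the value that makes w * x match y_true
```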
With a fixed learning rate, the model may oscillate around the minimum or fail to converge. This is especially true for gradients of constant magnitude, such as the sign function produced by the L1 norm, which is why adaptive learning rates are applied.
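A sketch of this effect under stated assumptions: the toy objective |w - 3| has a gradient of constant magnitude (the sign function), so a fixed step keeps bouncing around the minimum, while an AdaGrad-style adaptive rule shrinks the effective step and settles. All names and values below are illustrative.

```python
import numpy as np

def adagrad_step(param, grad, accum, lr=0.7, eps=1e-8):
    """AdaGrad-style update: accumulate squared gradients and divide the
    learning rate by their square root, so the effective step shrinks over time."""
    accum = accum + grad ** 2
    param = param - lr * grad / (np.sqrt(accum) + eps)
    return param, accum

# Toy objective |w - 3|: its gradient is sign(w - 3), always magnitude 1,
# so a fixed learning rate steps the same distance every iteration.
w_fixed, w_adapt, accum = 0.0, 0.0, 0.0
for step in range(100):
    g_fixed = np.sign(w_fixed - 3.0)
    g_adapt = np.sign(w_adapt - 3.0)
    w_fixed = w_fixed - 0.7 * g_fixed                  # fixed learning rate
    w_adapt, accum = adagrad_step(w_adapt, g_adapt, accum)

print(w_fixed)  # keeps bouncing around 3 with a constant-size step
print(w_adapt)  # settles close to 3 as the effective step decays
```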
Model Optimizers
The Notion of a Model Optimizer