Texonom / Computing /Computing Theory/Computability Theory/Problem Solving/Optimization/Optimization Algorithm/

Newton–Raphson method

Creator
Seonglae Cho
Created
2023 Mar 16 2:22
Editor
Seonglae Cho
Edited
2026 Mar 20 16:48
Refs
Convexity
Gradient Descent
Curvature

Generalization to multiple dimensions

An n-th degree equation requires about n iterations for a meaningful approximation.
In practice the method is rarely used directly because of its computational complexity (forming and solving with the Hessian at every step), and Newton–Raphson works well when the Hessian matrix is positive definite.
H is the
Hessian Matrix
  • fewer steps than gradient descent, and each update moves in a single Newton direction rather than following the raw gradient
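A minimal numpy sketch of the multivariate Newton update described above (the function names and the example quadratic are illustrative assumptions, not from this page):

```python
import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-10, max_iter=50):
    """Newton's method for minimization: x <- x - H(x)^-1 grad(x).

    Works well when hess(x) is positive definite (locally convex region).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # Solve H @ step = grad instead of explicitly inverting the Hessian
        step = np.linalg.solve(hess(x), grad(x))
        x = x - step
        if np.linalg.norm(step) < tol:
            break
    return x

# Illustrative quadratic f(x, y) = (x - 1)^2 + 10 (y + 2)^2, minimum at (1, -2)
grad = lambda v: np.array([2 * (v[0] - 1), 20 * (v[1] + 2)])
hess = lambda v: np.array([[2.0, 0.0], [0.0, 20.0]])
print(newton_minimize(grad, hess, [5.0, 5.0]))  # one Newton step lands on the minimum
```

On a quadratic the Hessian model is exact, so a single step reaches the optimum, which is why Newton's method needs fewer steps than gradient descent near a well-conditioned minimum.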

Generalized form
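A sketch of the generalized (multivariate) form, assuming f is twice differentiable with gradient ∇f and Hessian H — likely what the missing image showed:

x_{k+1} = x_k − H(x_k)^{−1} ∇f(x_k)

Each step solves a linear system with the Hessian instead of scaling the gradient by a single learning rate.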

 
Quasi-Newton approximations to Newton's method
BFGS
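A hedged sketch of the BFGS idea in plain numpy (illustrative names; in practice one would call scipy.optimize.minimize with method="BFGS"): instead of computing the exact Hessian, BFGS maintains an inverse-Hessian approximation updated from gradient differences.

```python
import numpy as np

def bfgs_minimize(f, grad, x0, tol=1e-8, max_iter=100):
    """Sketch of BFGS: keep an inverse-Hessian approximation Hinv and
    update it from gradient differences, so no second derivatives are needed."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    Hinv = np.eye(n)                      # initial inverse-Hessian guess
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -Hinv @ g                     # quasi-Newton search direction
        t, fx = 1.0, f(x)
        # simple backtracking (Armijo) line search
        while f(x + t * p) > fx + 1e-4 * t * (g @ p) and t > 1e-10:
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                    # curvature condition keeps Hinv positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            Hinv = (I - rho * np.outer(s, y)) @ Hinv @ (I - rho * np.outer(y, s)) \
                   + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Illustrative quadratic with minimum at (3, -1)
f = lambda v: (v[0] - 3) ** 2 + 5 * (v[1] + 1) ** 2
grad = lambda v: np.array([2 * (v[0] - 3), 10 * (v[1] + 1)])
print(bfgs_minimize(f, grad, np.zeros(2)))
```

The rank-two Hinv update trades the O(n^3) Hessian solve of exact Newton for O(n^2) matrix products, which is the usual motivation for quasi-Newton methods.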
Newton's method
In numerical analysis, Newton's method, also known as the Newton–Raphson method, named after Isaac Newton and Joseph Raphson, is a root-finding algorithm which produces successively better approximations to the roots (or zeroes) of a real-valued function. The most basic version starts with a single-variable function f defined for a real variable x, the function's derivative f′, and an initial guess x0 for a root of f. If the function satisfies sufficient assumptions and the initial guess is close, then
https://en.wikipedia.org/wiki/Newton's_method
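The single-variable iteration the excerpt describes can be sketched as follows (function names and the sqrt(2) example are illustrative):

```python
def newton_root(f, fprime, x0, tol=1e-12, max_iter=50):
    """Single-variable Newton-Raphson: x_{n+1} = x_n - f(x_n) / f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= fx / fprime(x)
    return x

# sqrt(2) as the positive root of f(x) = x^2 - 2, starting from x0 = 1
print(newton_root(lambda x: x * x - 2, lambda x: 2 * x, 1.0))  # ≈ 1.41421356
```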
 
 

Backlinks

Convex Optimization
Logistic Regression
Gauss-Newton Method
Regression
Regression analysis

Copyright Seonglae Cho