# Newton's Method


Newton's method generates a sequence $\displaystyle x_{k}$ to find the root $\displaystyle \alpha$ of a function $\displaystyle f$ starting from an initial guess $\displaystyle x_{0}$. This initial guess $x_{0}$ must be close enough to the root $\alpha$ for convergence to be guaranteed. We construct the tangent line of $\displaystyle f$ at $\displaystyle x_{0}$ and find an approximation of $\alpha$ by computing the root of the tangent. Repeating this process iteratively yields the sequence $\displaystyle x_{k}$.

## Derivation of Newton's Method

Approximating $\displaystyle f(x)$ with a second-order Taylor expansion around $\displaystyle x_{k}$,

$f(x)=f(x_{k})+f'(x_{k})(x-x_{k})+{\frac {f''(\eta _{k})}{2}}(x-x_{k})^{2},$

with $\displaystyle \eta _{k}$ between $\displaystyle x$ and $\displaystyle x_{k}$. Imposing $x=\alpha$ and recalling that $f(\alpha )=0$, we obtain

$0=f(x_{k})+f'(x_{k})(\alpha -x_{k})+{\frac {f''(\eta _{k})}{2}}(\alpha -x_{k})^{2}.$

Dividing through by $f'(x_{k})$ and solving for $\displaystyle \alpha$ gives

$\alpha =x_{k}-{\frac {f(x_{k})}{f'(x_{k})}}-{\frac {(\alpha -x_{k})^{2}}{2}}{\frac {f''(\eta _{k})}{f'(x_{k})}}.$

Neglecting the last term, we find an approximation of $\displaystyle \alpha$ which we shall call $\displaystyle x_{k+1}$. We now have an iteration which can be used to find successively more precise approximations of $\displaystyle \alpha$:

Newton's method:

$x_{k+1}=x_{k}-{\frac {f(x_{k})}{f'(x_{k})}}.$

## Convergence Analysis

It's clear from the derivation that the error of Newton's method is given by

Newton's method error formula:

$\alpha -x_{k+1}=-{\frac {f''(\eta _{k})}{2f'(x_{k})}}(\alpha -x_{k})^{2}.$

From this we note that if the method converges, then the order of convergence is 2. On the other hand, the convergence of Newton's method depends on the initial guess $\displaystyle x_{0}$.
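The quadratic convergence predicted by the error formula can be observed numerically. The following is a minimal Python sketch; the test function $f(x)=x^{2}-2$ (with root $\alpha =\sqrt{2}$), the starting point, and the helper name `newton` are illustrative choices, not part of the text:

```python
import math

def newton(f, fprime, x0, n_iter):
    """Newton iterations x_{k+1} = x_k - f(x_k)/f'(x_k); returns all iterates."""
    xs = [x0]
    for _ in range(n_iter):
        xk = xs[-1]
        xs.append(xk - f(xk) / fprime(xk))
    return xs

# Illustrative example: f(x) = x^2 - 2, f'(x) = 2x, root alpha = sqrt(2).
alpha = math.sqrt(2.0)
xs = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=2.0, n_iter=4)

for k in range(len(xs) - 1):
    e_k, e_k1 = alpha - xs[k], alpha - xs[k + 1]
    # For order 2, |e_{k+1}| / e_k^2 should approach |f''(alpha) / (2 f'(alpha))|,
    # which here is 1 / (2*sqrt(2)) ~ 0.354.
    print(k, abs(e_k1) / e_k ** 2)
```

Printing the ratio $|\alpha -x_{k+1}|/(\alpha -x_{k})^{2}$ rather than the raw error makes the order of convergence visible directly: the errors themselves shrink too fast to compare by eye.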

The following theorem holds:

Theorem

Assume that $f(x),\,f'(x),$ and $\displaystyle f''(x)$ are continuous in a neighborhood of the root $\displaystyle \alpha$ and that $\displaystyle f'(\alpha )\neq 0$. Then, for $\displaystyle x_{0}$ taken close enough to $\displaystyle \alpha$, the sequence $\displaystyle x_{k}$, with $k\geq 0$, defined by Newton's method converges to $\alpha$. Moreover, the order of convergence is $\displaystyle p=2$, as

$\lim _{k\to \infty }{\frac {\alpha -x_{k+1}}{(\alpha -x_{k})^{2}}}=-{\frac {f''(\alpha )}{2f'(\alpha )}}.$

The disadvantages of using this method are numerous. First of all, it is not guaranteed that Newton's method will converge if we select an $\displaystyle x_{0}$ that is too far from the exact root. Likewise, if the tangent line becomes parallel or almost parallel to the x-axis (that is, $f'(x_{k})\approx 0$), we are not guaranteed convergence. Also, because two functions must be evaluated at each iteration ($f(x_{k})$ and $f'(x_{k})$), the method is computationally expensive. Another disadvantage is that we must have a functional representation of the derivative of our function, which is not always possible if we are working only from given data.
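The sensitivity to the initial guess can be illustrated with a short Python experiment; the choice $f(x)=\arctan(x)$ (whose only root is $0$ and whose derivative flattens toward zero for large $|x|$), the starting points, and the helper name `newton_steps` are illustrative assumptions:

```python
import math

def newton_steps(f, fprime, x0, n_iter):
    """Run a fixed number of Newton iterations (no convergence check)."""
    xs = [x0]
    for _ in range(n_iter):
        xk = xs[-1]
        xs.append(xk - f(xk) / fprime(xk))
    return xs

# f(x) = arctan(x): only root at 0, but f'(x) = 1/(1+x^2) -> 0 as |x| grows,
# so far from the root the tangent is nearly parallel to the x-axis.
f = math.atan
fprime = lambda x: 1.0 / (1.0 + x * x)

close = newton_steps(f, fprime, x0=0.5, n_iter=6)  # close guess: converges
far = newton_steps(f, fprime, x0=2.0, n_iter=6)    # far guess: iterates blow up
print(abs(close[-1]))  # essentially zero
print(abs(far[-1]))    # huge: each step overshoots further from the root
```

Here the far guess fails precisely through the mechanism described above: where $f'(x_{k})$ is small, the Newton step $f(x_{k})/f'(x_{k})$ becomes very large, throwing the next iterate even further out.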