Error Analysis of Newton's Method for Approximating Roots
Recall from the Newton's Method for Approximating Roots page that if $f$ is a differentiable function with a root $\alpha$, and $x_0$ is an approximation of $\alpha$, then we can obtain a sequence of approximations $\{ x_n \}$ defined by $x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$ for $n \geq 0$ that may or may not converge to $\alpha$. If our initial approximation $x_0$ is too far away from $\alpha$, then this sequence may not converge to $\alpha$. We will now look at what must hold so that the errors $\alpha - x_n$ converge to $0$, that is, so that $\{ x_n \}$ converges to $\alpha$.
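For reference, here is a minimal Python sketch of this iteration; the function `newton`, its tolerance `tol`, and the iteration cap are illustrative choices, not part of the original page.

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) until the step size drops below tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)   # requires f'(x) != 0 along the way
        x = x - step
        if abs(step) < tol:
            break
    return x                  # may be far from alpha if x0 was a poor start

# Example: approximating the root alpha = sqrt(2) of f(x) = x^2 - 2
print(newton(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0))  # 1.4142135623730951
```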
Consider the interval $[a, b]$ and suppose that there exists a root $\alpha \in (a, b)$ (so that $f(\alpha) = 0$). Assume that both $f'$ and $f''$ are continuous functions and that $f'(\alpha) \neq 0$ (that is, the tangent line at $(\alpha, 0)$ is not horizontal). Expanding $f$ about $x_n$ and evaluating at $\alpha$, Taylor's Theorem gives that for some $c_n$ between $\alpha$ and $x_n$:

(1)
$$0 = f(\alpha) = f(x_n) + (\alpha - x_n) f'(x_n) + \frac{(\alpha - x_n)^2}{2} f''(c_n)$$
If we divide both sides of the equation by $f'(x_n)$ (which is nonzero for $x_n$ sufficiently close to $\alpha$, since $f'$ is continuous and $f'(\alpha) \neq 0$), we get that:

(2)
$$0 = \frac{f(x_n)}{f'(x_n)} + (\alpha - x_n) + (\alpha - x_n)^2 \frac{f''(c_n)}{2 f'(x_n)}$$
Now since $x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$, rearranging these terms gives $\frac{f(x_n)}{f'(x_n)} = x_n - x_{n+1}$, and substituting this into the equation above and isolating $\alpha - x_{n+1}$ we get:

(3)
$$\alpha - x_{n+1} = (\alpha - x_n)^2 \left [ \frac{-f''(c_n)}{2 f'(x_n)} \right ]$$
Note that in the above equation, the error $\mathrm{Error}(x_{n+1}) = \alpha - x_{n+1}$ of the new approximation is proportional to the square of the previous error $\mathrm{Error}(x_n) = \alpha - x_n$. We can see that if the errors are already very small, then Newton's Method converges very quickly towards $\alpha$ upon successive iterations, roughly doubling the number of correct digits at each step.
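To make this squaring of the error concrete, here is a minimal Python sketch; the choice of $f(x) = x^2 - 2$, whose root is $\alpha = \sqrt{2}$, is an assumption for illustration only.

```python
import math

alpha = math.sqrt(2)   # known root of f(x) = x^2 - 2
x = 1.0                # initial approximation x_0

for n in range(5):
    x = x - (x**2 - 2) / (2 * x)   # Newton step: x - f(x)/f'(x)
    print(n + 1, abs(alpha - x))   # the error roughly squares at each step
```

The printed errors fall like $10^{-1}, 10^{-3}, 10^{-6}, 10^{-12}, \ldots$ until the limits of floating point precision, matching the doubling of correct digits described above.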
Now suppose that $x_n$ is very close to the root $\alpha$. Then, since $c_n$ lies between $x_n$ and $\alpha$, $c_n$ is also very close to $\alpha$, and hence:

(4)
$$\frac{-f''(c_n)}{2 f'(x_n)} \approx \frac{-f''(\alpha)}{2 f'(\alpha)} = M_{\alpha}$$
Therefore, for $n \geq 0$ we can approximate the error of $x_{n+1}$ from $\alpha$ as:

(5)
$$\alpha - x_{n+1} \approx M_{\alpha} (\alpha - x_n)^2$$
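As a quick numerical sanity check of (5), consider again the assumed example $f(x) = x^2 - 2$: here $f'(x) = 2x$ and $f''(x) = 2$, so $M_{\alpha} = \frac{-f''(\alpha)}{2 f'(\alpha)} = \frac{-1}{2\sqrt{2}}$.

```python
import math

alpha = math.sqrt(2)
M_alpha = -1 / (2 * alpha)         # -f''(alpha) / (2 f'(alpha)) for f(x) = x^2 - 2

x = 1.4                            # an approximation x_n already close to alpha
x_next = x - (x**2 - 2) / (2 * x)  # one Newton step

print(alpha - x_next)              # actual error:     about -7.22e-05
print(M_alpha * (alpha - x) ** 2)  # predicted by (5): about -7.14e-05
```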
We will now multiply both sides of (5) by $M_{\alpha}$ to get that:

(6)
$$M_{\alpha} (\alpha - x_{n+1}) \approx \left ( M_{\alpha} (\alpha - x_n) \right )^2$$
Now note that applying (6) at the previous step gives $M_{\alpha} (\alpha - x_n) \approx M_{\alpha}^2 (\alpha - x_{n-1})^2 = \left ( M_{\alpha} (\alpha - x_{n-1}) \right )^2$. If we repeat this process, then we get that for $n \geq 0$:

(7)
$$M_{\alpha} (\alpha - x_n) \approx \left ( M_{\alpha} (\alpha - x_0) \right )^{2^n}$$
Now for the error $\alpha - x_n$ to converge to $0$ (once again, so that our approximations $x_n$ converge to $\alpha$), we must have that $-1 < M_{\alpha} (\alpha - x_0) < 1$, i.e., $\mid M_{\alpha}(\alpha - x_0) \mid < 1$, because if so, then as $n \to \infty$:

(8)
$$\mid M_{\alpha} (\alpha - x_n) \mid \approx \mid M_{\alpha} (\alpha - x_0) \mid^{2^n} \to 0$$
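This doubly exponential decay can be observed directly. The sketch below (again assuming the illustrative example $f(x) = x^2 - 2$) compares the quantity $M_{\alpha} (\alpha - x_n)$ against the prediction $\left ( M_{\alpha} (\alpha - x_0) \right )^{2^n}$ from (7):

```python
import math

alpha = math.sqrt(2)
M_alpha = -1 / (2 * alpha)   # -f''(alpha) / (2 f'(alpha)) for f(x) = x^2 - 2
x0 = 1.3                     # chosen so that |M_alpha * (alpha - x0)| < 1

x = x0
for n in range(1, 5):
    x = x - (x**2 - 2) / (2 * x)                      # Newton step
    actual = M_alpha * (alpha - x)                    # left side of (7)
    predicted = (M_alpha * (alpha - x0)) ** (2 ** n)  # right side of (7)
    print(n, actual, predicted)
```

The two columns agree closely and collapse toward $0$ at the rate the exponent $2^n$ predicts.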
One important thing to note is that $M_{\alpha}$ can be bounded without knowing $\alpha$: if we let $M$ be the largest value that the ratio defining $M_{\alpha}$ can attain on $[a, b]$ (this requires $f'$ to have no zeros in $[a, b]$, which holds on a small enough interval around $\alpha$ since $f'(\alpha) \neq 0$), then for any such $M_{\alpha}$ we have that:

(9)
$$\mid M_{\alpha} \mid \leq \frac{\max_{a \leq x \leq b} \mid f''(x) \mid}{2 \min_{a \leq x \leq b} \mid f'(x) \mid} = M$$
If we have that $M \mid \alpha - x_0 \mid < 1$, then $\mid M_{\alpha} (\alpha - x_0) \mid \leq M \mid \alpha - x_0 \mid < 1$, which guarantees convergence of Newton's Method to $\alpha$. In particular, if $M < 1$, then any initial approximation within distance $1$ of the root suffices.
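In practice, one might estimate $M$ by sampling the ratio in (9) over a grid on $[a, b]$. The sketch below does this for the running example $f(x) = x^2 - 2$ on $[1, 2]$; the helper `bound_M` and the grid resolution are illustrative assumptions, and a rigorous bound would require analytic estimates rather than sampling.

```python
def bound_M(df, d2f, a, b, samples=10001):
    """Estimate M = max|f''| / (2 min|f'|) on [a, b] from a uniform grid."""
    xs = [a + (b - a) * i / (samples - 1) for i in range(samples)]
    max_d2f = max(abs(d2f(x)) for x in xs)
    min_df = min(abs(df(x)) for x in xs)
    return max_d2f / (2 * min_df)

# f(x) = x^2 - 2 on [1, 2]: f'(x) = 2x, f''(x) = 2
M = bound_M(lambda x: 2 * x, lambda x: 2.0, 1.0, 2.0)
print(M)   # 0.5, so any x_0 with |alpha - x_0| < 1/M = 2 guarantees convergence
```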