Error Analysis of Newton's Method for Approximating Roots

Recall from the Newton's Method for Approximating Roots page that if $f$ is a differentiable function with a root $\alpha$, and $x_0$ is an approximation of $\alpha$, then we can obtain a sequence of approximations $\{ x_n \}$ defined by $x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$ for $n ≥ 0$ that may or may not converge to $\alpha$. If our initial approximation $x_0$ is too far away from $\alpha$, then this sequence may fail to converge to $\alpha$. We will now look at what must hold so that the error between our approximations $x_n$ and $\alpha$ converges to $0$, that is, so that $\{ x_n \}$ converges to $\alpha$.
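
To make the iteration concrete, here is a minimal Python sketch of the update rule (the helper name `newton`, the tolerance, and the iteration cap are illustrative choices, not part of the original page):

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n) / f'(x_n) starting from x0."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:    # close enough to a root
            return x
        dfx = df(x)
        if dfx == 0:         # horizontal tangent: the update is undefined
            raise ZeroDivisionError("f'(x_n) = 0")
        x = x - fx / dfx
    return x                 # may be a poor approximation if x0 was too far away

# Example: approximating the root alpha = sqrt(2) of f(x) = x^2 - 2.
print(newton(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.5))  # 1.4142135623730951
```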

Consider the interval $[a, b]$ and suppose that there exists a root $\alpha \in (a, b)$ ($f(\alpha) = 0$). Assume that both $f'$ and $f''$ are continuous and that $f'(\alpha) \neq 0$ (that is, the slope of the tangent line at $(\alpha, f(\alpha))$ is not $0$, and hence the tangent line is not horizontal). Expanding $f$ about $x_n$ using Taylor's Theorem, we have that for some $c_n$ between $\alpha$ and $x_n$:

(1)
\begin{align} \quad f(\alpha) = f(x_n) + (\alpha - x_n)f'(x_n) + \frac{(\alpha - x_n)^2}{2} f''(c_n) \\ \quad 0 = f(x_n) + (\alpha - x_n)f'(x_n) + \frac{(\alpha - x_n)^2}{2} f''(c_n) \end{align}

Dividing both sides of the equation by $f'(x_n)$ (which is nonzero when $x_n$ is sufficiently close to $\alpha$, by the continuity of $f'$), we get that:

(2)
\begin{align} \quad 0 = \frac{f(x_n)}{f'(x_n)} + (\alpha - x_n) + \frac{(\alpha - x_n)^2}{2} \frac{f''(c_n)}{f'(x_n)} \end{align}

Now since $x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$, rearranging these terms gives $\frac{f(x_n)}{f'(x_n)} = x_n - x_{n+1}$. Substituting this into the equation above and solving for $\alpha - x_{n+1}$, we get:

(3)
\begin{align} \quad 0 = (x_n - x_{n+1}) + (\alpha - x_n) + \frac{(\alpha - x_n)^2}{2} \frac{f''(c_n)}{f'(x_n)} \\ \quad \alpha - x_{n+1} = -\frac{(\alpha - x_n)^2}{2} \frac{f''(c_n)}{f'(x_n)} \\ \quad \mathrm{Error} (x_{n+1}) = -\frac{(\alpha - x_n)^2}{2} \frac{f''(c_n)}{f'(x_n)} \end{align}

Note that in the above equation for the error in the approximation $x_{n+1}$ of $\alpha$, the previous error $\mathrm{Error} (x_n) = \alpha - x_n$ appears squared. We can see that once the error $\alpha - x_n$ is very small, each successive error is roughly proportional to the square of the last, so Newton's Method converges very quickly towards $\alpha$ upon successive iterations.

Now suppose that $x_n$ is very close to the root $\alpha$. Since $c_n$ is between $x_n$ and $\alpha$, $c_n$ is also very close to $\alpha$, and hence:

(4)
\begin{align} \quad -\frac{f''(c_n)}{2f'(x_n)} \approx -\frac{f''(\alpha)}{2f'(\alpha)} = M_{\alpha} \end{align}

Therefore, for $n ≥ 0$ we can approximate the error of $x_{n+1}$ from $\alpha$ as:

(5)
\begin{align} \quad \mathrm{Error} (x_{n+1}) = \alpha - x_{n+1} \approx M_{\alpha} (\alpha - x_n)^2 \end{align}
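
We can check this approximation numerically. For $f(x) = x^2 - 2$ with root $\alpha = \sqrt{2}$, we have $M_{\alpha} = -\frac{f''(\alpha)}{2f'(\alpha)} = -\frac{1}{2\sqrt{2}} \approx -0.35355$, and the ratio $\frac{\alpha - x_{n+1}}{(\alpha - x_n)^2}$ should settle near this value (a quick sketch; the example function and starting point are my own choices, not from the original page):

```python
import math

alpha = math.sqrt(2)
M_alpha = -1 / (2 * alpha)   # -f''(alpha) / (2 f'(alpha)) for f(x) = x^2 - 2

x = 1.5
for n in range(3):
    x_next = x - (x**2 - 2) / (2 * x)          # one Newton step
    ratio = (alpha - x_next) / (alpha - x)**2  # should approach M_alpha
    print(f"n={n}  error={alpha - x_next: .3e}  ratio={ratio: .5f}")
    x = x_next

print(f"M_alpha = {M_alpha:.5f}")  # -0.35355
```

The printed ratios ($-0.33333$, $-0.35294$, …) approach $M_{\alpha}$, just as (5) predicts.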

We will now multiply both sides of the equation above by $M_{\alpha}$ to get that:

(6)
\begin{align} \quad M_{\alpha} (\alpha - x_{n+1}) \approx M_{\alpha}^2 (\alpha - x_n)^2 = \left ( M_{\alpha} (\alpha - x_n) \right)^2 \end{align}

Now note that $M_{\alpha} (\alpha - x_n) \approx M_{\alpha}^2(\alpha - x_{n-1})^2 = \left ( M_{\alpha} (\alpha - x_{n-1}) \right)^2$. If we repeat this process, then we get that for $n ≥ 0$:

(7)
\begin{align} \quad M_{\alpha}(\alpha - x_n) \approx \left ( M_{\alpha} (\alpha - x_0) \right)^{2^n} \end{align}
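
This compounding can also be checked numerically (same $\sqrt{2}$ example as above; the agreement is only approximate, since (7) repeatedly applies the approximation (5)):

```python
import math

alpha = math.sqrt(2)
M = -1 / (2 * alpha)   # M_alpha for f(x) = x^2 - 2
x0 = 1.5

x = x0
for n in range(4):
    actual = M * (alpha - x)                    # left-hand side of (7)
    predicted = (M * (alpha - x0)) ** (2 ** n)  # right-hand side of (7)
    print(f"n={n}  actual={actual: .3e}  predicted={predicted: .3e}")
    x = x - (x**2 - 2) / (2 * x)                # next Newton iterate
```

Both columns are squared at each step, which is why the number of correct digits in $x_n$ roughly doubles per iteration.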

Now for the error $\alpha - x_n$ to converge to $0$ (once again, so that our approximations $x_n$ converge to $\alpha$), we must have that $-1 < M_{\alpha} (\alpha - x_0) < 1$, i.e., $\mid M_{\alpha}(\alpha - x_0) \mid < 1$, for then as $n \to \infty$ we have that $M_{\alpha}(\alpha - x_n) \to 0$. Equivalently, the initial approximation $x_0$ must satisfy:

(8)
\begin{align} \quad \mid \alpha - x_0 \mid < \frac{1}{\mid M_{\alpha} \mid} = \biggl \lvert \frac{2 f'(\alpha)}{f''(\alpha)} \biggr \rvert \end{align}
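
For example, with $f(x) = x^2 - 2$ and $\alpha = \sqrt{2}$, so that $f'(\alpha) = 2\sqrt{2}$ and $f''(\alpha) = 2$, this condition becomes:

\begin{align} \quad \mid \sqrt{2} - x_0 \mid < \biggl \lvert \frac{2 \cdot 2\sqrt{2}}{2} \biggr \rvert = 2\sqrt{2} \approx 2.83 \end{align}

so any $x_0$ within distance $2\sqrt{2}$ of $\sqrt{2}$ is predicted to work (predicted heuristically, since (5) is only an approximation near $\alpha$).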

One important thing to note is that if we bound $\mid M_{\alpha} \mid$ over the interval $[a, b]$, that is, let $M$ be the upper bound defined below, then for any root $\alpha \in (a, b)$ we have that:

(9)
\begin{align} \quad \mid M_{\alpha} \mid = \biggl \lvert - \frac{f''(\alpha)}{2f'(\alpha)} \biggr \rvert ≤ \max\limits_{a ≤ x ≤ b} \biggl \lvert \frac{f''(x)}{2f'(x)} \biggr \rvert ≤ \frac{\max\limits_{a ≤ x ≤ b} \mid f''(x) \mid}{2 \min\limits_{a ≤ x ≤ b} \mid f'(x) \mid} = M \end{align}

If we have that $M \mid \alpha - x_0 \mid < 1$, then $\mid M_{\alpha} (\alpha - x_0) \mid ≤ M \mid \alpha - x_0 \mid < 1$, which guarantees us convergence of Newton's Method to $\alpha$.
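
Here is a short sketch of estimating $M$ on an interval by sampling (the helper `interval_M`, the grid size, and the interval $[1, 2]$ are illustrative assumptions):

```python
def interval_M(df, d2f, a, b, samples=10001):
    """Estimate M = max|f''| / (2 min|f'|) over [a, b] on a uniform grid."""
    xs = [a + (b - a) * i / (samples - 1) for i in range(samples)]
    max_d2f = max(abs(d2f(x)) for x in xs)
    min_df = min(abs(df(x)) for x in xs)
    return max_d2f / (2 * min_df)

# f(x) = x^2 - 2 on [1, 2]: f'(x) = 2x, f''(x) = 2, so M = 2 / (2 * 2) = 0.5.
M = interval_M(lambda x: 2 * x, lambda x: 2.0, a=1.0, b=2.0)
print(M)  # 0.5
```

Here $\alpha = \sqrt{2} \in (1, 2)$ and $\mid \alpha - x_0 \mid < 1$ for every $x_0 \in [1, 2]$, so $M \mid \alpha - x_0 \mid < 0.5 < 1$ and the criterion above is satisfied for any starting point in the interval.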
