# The Convergence of Newton's Method

Suppose that $f$ is twice differentiable on an interval containing the root of interest, $\alpha$, and suppose that $f'(\alpha) \neq 0$. Now consider the first order Taylor polynomial of $f$ about $x_n$, denoted $P_1(x) = f(x_n) + (x - x_n)f'(x_n)$. By Taylor's Theorem, for each $x$ there exists a $\xi_n$ between $x$ and $x_n$ such that the remainder/error term is given by $E_n(x) = \frac{(x - x_n)^2}{2!} f''(\xi_n)$. We know that $f(x) = P_1(x) + E_n(x)$, and so:

(1)
\begin{align} \quad f(x) = f(x_n) + (x - x_n)f'(x_n) + \frac{(x - x_n)^2}{2} f''(\xi_n) \end{align}

Plugging in $x = \alpha$ (so that $\xi_n$ now lies between $\alpha$ and $x_n$) and noting that $f(\alpha) = 0$ since $\alpha$ is a root of $f$, we get:

(2)
\begin{align} \quad 0 = f(x_n) + (\alpha - x_n)f'(x_n) + \frac{(\alpha - x_n)^2}{2} f''(\xi_n) \end{align}

Note that Newton's iteration formula is $x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$ and so $\frac{f(x_n)}{f'(x_n)} = x_n - x_{n+1}$. We will apply this substitution after dividing the equation above by $f'(x_n)$:

(3)
\begin{align} \quad 0 = \frac{f(x_n)}{f'(x_n)} + \frac{(\alpha - x_n)f'(x_n)}{f'(x_n)} + \frac{(\alpha - x_n)^2}{2} \frac{f''(\xi_n)}{f'(x_n)} \\ \quad 0 = \frac{f(x_n)}{f'(x_n)} + (\alpha - x_n) + \frac{(\alpha - x_n)^2}{2} \frac{f''(\xi_n)}{f'(x_n)} \\ \quad 0 = (x_n - x_{n+1}) + (\alpha - x_n) + \frac{(\alpha - x_n)^2}{2} \frac{f''(\xi_n)}{f'(x_n)} \\ \quad 0 = \alpha - x_{n+1} + \frac{(\alpha - x_n)^2}{2} \frac{f''(\xi_n)}{f'(x_n)} \\ \quad \alpha - x_{n+1} = - \frac{(\alpha - x_n)^2}{2} \frac{f''(\xi_n)}{f'(x_n)} \\ \end{align}
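As an aside, the iteration $x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$ used in the substitution above is easy to implement. The sketch below is a minimal Python illustration; the function name `newton` and the simple stopping test are our own choices, not part of the derivation:

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) until |f(x_n)| < tol."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x = x - fx / fprime(x)
    return x

# Example: approximate the root alpha = sqrt(2) of f(x) = x^2 - 2.
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.5)
```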

So we have obtained a formula for our error; however, we do not necessarily know the value of $\xi_n$, so we want to replace it. Suppose that $x_n$ is near $\alpha$. Then $f'(x_n) \approx f'(\alpha)$. Furthermore, since $\xi_n$ is between $x_n$ and $\alpha$, we must have that $\xi_n$ is near $\alpha$, and so $f''(\xi_n) \approx f''(\alpha)$. We then define $M$ as follows:

(4)
\begin{align} \quad M = -\frac{f''(\alpha)}{2f'(\alpha)} \approx \frac{-f''(\xi_n)}{2f'(x_n)} \end{align}

Substituting $M$ into the error formula above and then multiplying both sides of the equation by $M$, we get:

(5)
\begin{align} \quad \alpha - x_{n+1} \approx M(\alpha - x_n)^2 \\ \quad M(\alpha - x_{n+1}) \approx M^2 (\alpha - x_n)^2 = [M(\alpha - x_n)]^2 \end{align}

Using induction, we have that:

(6)
\begin{align} \quad M(\alpha - x_{n+1}) \approx [M(\alpha - x_n)]^2 \approx [M^2(\alpha - x_{n-1})^2]^2 = [M(\alpha - x_{n-1})]^{2^2} \approx \cdots \approx [M(\alpha - x_0)]^{2^{n+1}} \end{align}
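As a concrete check, the relation $\alpha - x_{n+1} \approx M(\alpha - x_n)^2$ can be observed numerically. The sketch below (our own worked example, not part of the derivation) uses $f(x) = x^2 - 2$, whose root is $\alpha = \sqrt{2}$, so that $M = -\frac{f''(\alpha)}{2f'(\alpha)} = -\frac{1}{2\sqrt{2}}$:

```python
# Numerical check of the error relation for f(x) = x^2 - 2, alpha = sqrt(2).
alpha = 2 ** 0.5
M = -2 / (2 * 2 * alpha)             # M = -f''(alpha) / (2 f'(alpha))

x = 1.5                              # initial approximation x_0
errors = [alpha - x]
for _ in range(3):
    x = x - (x * x - 2) / (2 * x)    # Newton step for f(x) = x^2 - 2
    errors.append(alpha - x)

# Each ratio (alpha - x_{n+1}) / (alpha - x_n)^2 should approach M.
ratios = [errors[n + 1] / errors[n] ** 2 for n in range(3)]
```

The ratios tend toward $M \approx -0.3536$ as $x_n$ approaches $\alpha$, which is exactly the quadratic convergence the formula predicts.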

Now if $\lim_{n \to \infty} (\alpha - x_n) = 0$, then $\lim_{n \to \infty} x_n = \alpha$. If $\mid M(\alpha - x_0) \mid < 1$, then the right-hand side above tends to $0$ as $n \to \infty$, so $M(\alpha - x_n) \to 0$ and the iterates converge to $\alpha$. To ensure convergence, we therefore choose an initial approximation $x_0$ such that:

(7)
\begin{align} \quad \mid \alpha - x_0 \mid < \frac{1}{\mid M \mid} = \biggr \rvert \frac{2 f'(\alpha)}{f''(\alpha)} \biggr \rvert \end{align}

We will now construct an upper bound for $\mid M \mid$. We start with $\mid M \mid = \frac{1}{2} \biggr \rvert \frac{f''(\alpha)}{f'(\alpha)} \biggr \rvert$. Since $\alpha$ lies in the interval $[a, b]$, we can bound this quantity by maximizing over the whole interval:

(8)
\begin{align} \quad \mid M \mid = \frac{1}{2} \biggr \rvert \frac{f''(\alpha)}{f'(\alpha)} \biggr \rvert \leq \max\limits_{a \leq x \leq b} \frac{1}{2} \biggr \rvert \frac{f''(x)}{f'(x)} \biggr \rvert = \frac{\max\limits_{a \leq x \leq b} \mid f''(x) \mid}{2 \min\limits_{a \leq x \leq b} \mid f'(x) \mid} = M^* \end{align}

Thus, since $\mid M \mid \leq M^*$, it suffices to choose an initial approximation $x_0$ with $M^* \mid \alpha - x_0 \mid < 1$, that is, $\mid \alpha - x_0 \mid < \frac{1}{M^*}$, to guarantee that Newton's Method converges.
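To make the bound concrete, $M^*$ can be estimated by sampling $\mid f' \mid$ and $\mid f'' \mid$ over $[a, b]$. The sketch below (again our own example, with $f(x) = x^2 - 2$ on $[1, 2]$, so $f'(x) = 2x$ and $f''(x) = 2$) computes the grid estimate:

```python
# Estimate M* = max|f''| / (2 min|f'|) for f(x) = x^2 - 2 on [a, b] = [1, 2]
# by sampling on a uniform grid; here f'(x) = 2x and f''(x) = 2.
a, b = 1.0, 2.0
xs = [a + i * (b - a) / 1000 for i in range(1001)]
max_f2 = max(abs(2.0) for x in xs)        # max of |f''(x)| over the grid
min_f1 = min(abs(2.0 * x) for x in xs)    # min of |f'(x)| over the grid
M_star = max_f2 / (2.0 * min_f1)
```

Here $M^* = \frac{2}{2 \cdot 2} = 0.5$, so any $x_0$ with $\mid \alpha - x_0 \mid < \frac{1}{M^*} = 2$ suffices; since $[1, 2]$ has length $1$, every starting point in the interval qualifies.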