# The Convergence of The Fixed Point Method

Recall from The Fixed Point Method for Approximating Roots page that if we want to solve the equation $f(x) = 0$ and we can rewrite this equation as $x = g(x)$, then the fixed points of $g$ are precisely the roots of $f$. We saw that the Fixed Point Method formula $x_{n+1} = g(x_n)$, starting with an initial approximation $x_0$ and applied for $n ≥ 0$, can produce closer and closer approximations of a root $\alpha$, provided that the sequence of approximations $\{ x_n \}$ does in fact converge. However, the convergence of the Fixed Point Method is not guaranteed and relies heavily on $f$, the choice of $g$, and the initial approximation $x_0$. We will now show how to test the Fixed Point Method for convergence. We will build a condition under which we can guarantee, for a sufficiently close initial approximation $x_0$, that the sequence $\{ x_n \}$ generated by the Fixed Point Method will indeed converge to $\alpha$.
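As a concrete illustration (not from the original page), here is a minimal Python sketch of the iteration $x_{n+1} = g(x_n)$. The choice $g(x) = \cos x$, which solves $f(x) = x - \cos x = 0$, is a hypothetical example used only for demonstration:

```python
import math

def fixed_point(g, x0, tol=1e-10, max_iter=100):
    """Iterate x_{n+1} = g(x_n) until successive iterates are within tol."""
    x = x0
    for _ in range(max_iter):
        x_next = g(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# Example: solve x = cos(x), i.e. f(x) = x - cos(x) = 0, starting from x0 = 0.5.
alpha = fixed_point(math.cos, 0.5)
```

Whether this loop converges at all is exactly the question the theorem below answers.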

 Theorem 1: Let $g$ and $g'$ be continuous on $[a, b]$ and suppose that if $a ≤ x ≤ b$ then $a ≤ g(x) ≤ b$. Also suppose that $\lambda = \mathrm{max}_{a ≤ x ≤ b} \mid g'(x) \mid < 1$. Then:
a) There exists a unique solution $\alpha \in [a, b]$ to the equation $x = g(x)$.
b) For any initial estimate $x_0 \in [a, b]$, we have that $\lim_{n \to \infty} x_n = \alpha$.
c) For $n ≥ 0$ we have that $\mid \alpha - x_n \mid ≤ \frac{\lambda^n}{1 - \lambda} \mid x_1 - x_0 \mid$.
d) $\lim_{n \to \infty} \frac{\alpha - x_{n+1}}{\alpha - x_n} = g'(\alpha)$, and for $x_n$ close to $\alpha$ we have $\alpha - x_{n+1} \approx g'(\alpha)(\alpha - x_n)$.
• Proof of a) Since $g$ is continuous on $[a, b]$, since $x \in [a, b]$ implies that $g(x) \in [a, b]$, and since $\mid g'(x) \mid < 1$ for all $x \in [a, b]$, a unique solution $\alpha \in [a, b]$ is guaranteed by one of the theorems on the Fixed Points page.
• Proof of b) Let $x_0 \in [a, b]$ be an initial approximation of the root $\alpha$. Then since $x_0 \in [a, b]$ we have that $x_1 = g(x_0) \in [a, b]$. By induction we see that $x_n \in [a, b]$ for $n ≥ 0$. Now we will show that these iterates converge. Take the equation $\alpha = g(\alpha)$ and subtract $x_{n+1} = g(x_n)$.
(1)
\begin{align} \quad \alpha - x_{n+1} = g(\alpha) - g(x_n) \end{align}
• Applying the Mean Value Theorem, there exists a $\xi_n$ between $\alpha$ and $x_n$ (and hence $\xi_n \in [a, b]$) such that:
(2)
\begin{align} \quad \alpha - x_{n+1} = g'(\xi_n)(\alpha - x_n) \\ \quad \mid \alpha - x_{n+1} \mid = \mid g'(\xi_n) \mid \mid \alpha - x_n \mid \end{align}
• Since $\xi_n \in [a, b]$ for $n ≥ 0$ we have that $\mid g'(\xi_n) \mid ≤ \lambda = \mathrm{max}_{a ≤ x ≤ b} \mid g'(x) \mid < 1$, and so:
(3)
\begin{align} \quad \mid \alpha - x_{n+1} \mid ≤ \lambda \mid \alpha - x_n \mid \end{align}
• Now inductively we obtain the following sequence of inequalities:
(4)
\begin{align} \quad \mid \alpha - x_{n+1} \mid ≤ \lambda \mid \alpha - x_n \mid ≤ \lambda^2 \mid \alpha - x_{n-1} \mid ≤ \cdots ≤ \lambda^{n+1} \mid \alpha - x_0 \mid \end{align}
• Therefore $0 ≤ \mid \alpha - x_n \mid ≤ \lambda^n \mid \alpha - x_0 \mid$. Since $\lambda < 1$, we have that $\lim_{n \to \infty} \lambda^n \mid \alpha - x_0 \mid = 0$, which implies that $\lim_{n \to \infty} x_n = \alpha$.
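The per-step contraction $\mid \alpha - x_{n+1} \mid ≤ \lambda \mid \alpha - x_n \mid$ can be checked numerically. A small sketch, using the hypothetical choice $g(x) = \cos x$ on $[0, 1]$, where $\lambda = \mathrm{max} \mid {-\sin x} \mid = \sin(1) < 1$:

```python
import math

g = math.cos                       # hypothetical choice: g(x) = cos(x) on [0, 1]
lam = math.sin(1.0)                # max |g'(x)| = max |sin(x)| on [0, 1] is sin(1) < 1
alpha = 0.7390851332151607         # fixed point of cos to double precision

x = 0.5
errors = []
for _ in range(20):
    errors.append(abs(alpha - x))  # record |alpha - x_n|
    x = g(x)

# Each step should shrink the error by at least the factor lam:
ratios_ok = all(errors[n + 1] <= lam * errors[n] + 1e-15 for n in range(19))
```

The tiny `1e-15` slack only guards against floating-point rounding near convergence.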
• Proof of c) Note that $\alpha - x_0 = \alpha - x_1 + x_1 - x_0$. Taking the absolute value of both sides and applying the triangle inequality, we get that:
(5)
\begin{align} \quad \alpha - x_0 = \alpha - x_1 + x_1 - x_0 \\ \quad \mid \alpha - x_0 \mid = \mid \alpha - x_1 + x_1 - x_0 \mid \\ \quad \mid \alpha - x_0 \mid ≤ \mid \alpha - x_1 \mid + \mid x_1 - x_0 \mid \end{align}
• From the proof of b) above, we saw that $\mid \alpha - x_{n+1} \mid ≤ \lambda \mid \alpha - x_n \mid$ and so in particular, $\mid \alpha - x_1 \mid ≤ \lambda \mid \alpha - x_0 \mid$. Making this substitution into the inequality above, we get that:
(6)
\begin{align} \quad \mid \alpha - x_0 \mid ≤ \lambda \mid \alpha - x_0 \mid + \mid x_1 - x_0 \mid \\ \quad (1 - \lambda) \mid \alpha - x_0 \mid ≤ \mid x_1 - x_0 \mid \\ \quad \mid \alpha - x_0 \mid ≤ \frac{1}{1 - \lambda} \mid x_1 - x_0 \mid \end{align}
• Also from the proof of b) we have that $\mid \alpha - x_n \mid ≤ \lambda^n \mid \alpha - x_0 \mid$, and so $\frac{1}{\lambda^n} \mid \alpha - x_n \mid ≤ \mid \alpha - x_0 \mid$. Combining this inequality with the one above gives:
(7)
\begin{align} \quad \frac{1}{\lambda^n} \mid \alpha - x_n \mid ≤ \frac{1}{1 - \lambda} \mid x_1 - x_0 \mid \\ \quad \mid \alpha - x_n \mid ≤ \frac{\lambda^n}{1 - \lambda} \mid x_1 - x_0 \mid \end{align}
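The a priori bound in c) is computable before the root is known, since it only involves $\lambda$, $x_0$, and $x_1$. A quick numerical check of the bound, again with the hypothetical $g(x) = \cos x$ on $[0, 1]$:

```python
import math

g = math.cos
lam = math.sin(1.0)                # lambda for g(x) = cos(x) on [0, 1]
alpha = 0.7390851332151607         # fixed point of cos to double precision

x0 = 0.5
x1 = g(x0)
base = abs(x1 - x0) / (1.0 - lam)  # the bound for n = 0

# Verify |alpha - x_n| <= lam^n / (1 - lam) * |x1 - x0| for the first iterates.
x = x0
bound_holds = True
for n in range(25):
    if abs(alpha - x) > (lam ** n) * base + 1e-15:
        bound_holds = False
    x = g(x)
```

In practice the bound is pessimistic: the actual error contracts by $\mid g'(\xi_n) \mid$, which is often much smaller than $\lambda$.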
• Proof of d) From the proof of b) we have that $\alpha - x_{n+1} = g'(\xi_n)(\alpha - x_n)$ and so:
(8)
\begin{align} \quad g'(\xi_n) = \frac{\alpha - x_{n+1}}{\alpha - x_n} \end{align}
• Thus, taking the limit of both sides of the equation above, we get that:
(9)
\begin{align} \quad \lim_{n \to \infty} g'(\xi_n) = \lim_{n \to \infty} \frac{\alpha - x_{n+1}}{\alpha - x_n} \end{align}
• Now since $\xi_n$ lies between $\alpha$ and $x_n$ for $n ≥ 0$ and $\lim_{n \to \infty} x_n = \alpha$ by b), we must then have that $\lim_{n \to \infty} \xi_n = \alpha$. Now since $g'$ is continuous, then:
(10)
\begin{align} \quad \lim_{n \to \infty} g'(\xi_n) = g'\left ( \lim_{n \to \infty} \xi _n \right ) = g'(\alpha) \end{align}
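Part d) says the errors eventually contract by the factor $g'(\alpha)$ each step. This too can be observed numerically; for the hypothetical $g(x) = \cos x$ we have $g'(\alpha) = -\sin(\alpha) \approx -0.6736$:

```python
import math

g = math.cos
alpha = 0.7390851332151607          # fixed point of cos to double precision
g_prime_alpha = -math.sin(alpha)    # g'(alpha) = -sin(alpha), about -0.6736

x = 0.5
for _ in range(30):
    x_prev = x
    x = g(x)

# The ratio of successive errors should approach g'(alpha).
ratio = (alpha - x) / (alpha - x_prev)
```

The negative sign of $g'(\alpha)$ shows up as the iterates alternating sides of the root.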

The following corollary provides a criterion for determining whether our choice of $g(x)$ will produce iterates that converge to the root $\alpha$.

 Corollary 1 (Convergence Criterion of the Fixed Point Method): If $g$ and $g'$ are continuous on the interval $[a, b]$ containing the root $\alpha$ and if $\mathrm{max}_{a ≤ x ≤ b} \mid g'(x) \mid < 1$, then the fixed point iterates $x_{n+1} = g(x_n)$ will converge to $\alpha$, provided the initial approximation $x_0$ is chosen sufficiently close to $\alpha$.
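Different rearrangements of the same equation $f(x) = 0$ can pass or fail this criterion. A hypothetical worked example (not from the original page), using $f(x) = x^2 - 2$ with root $\alpha = \sqrt{2}$:

```python
import math

# g1(x) = 2/x rearranges x^2 = 2, but g1'(x) = -2/x^2 gives |g1'(sqrt(2))| = 1,
# so the criterion fails. The iterates just bounce between two values:
x = 1.5
for _ in range(10):
    x = 2.0 / x
no_progress = abs(x - math.sqrt(2)) > 0.05   # still far from the root

# g2(x) = (x + 2/x)/2 also rearranges x^2 = 2, and g2'(x) = (1 - 2/x^2)/2
# gives g2'(sqrt(2)) = 0, so the criterion holds near the root:
y = 1.5
for _ in range(10):
    y = (y + 2.0 / y) / 2.0
converges = abs(y - math.sqrt(2)) < 1e-12
```

Since $g_2'(\alpha) = 0$, part d) of the theorem explains the very fast convergence of the second choice: the error ratio $\alpha - x_{n+1} \approx g'(\alpha)(\alpha - x_n)$ tends to zero.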