# The Convergence of The Fixed Point Method

Recall from The Fixed Point Method for Approximating Roots page that if we want to solve the equation $f(x) = 0$ and we can rewrite this equation as $x = g(x)$, then the fixed points of $g$ are precisely the roots of $f$. We saw that the Fixed Point Method formula $x_{n+1} = g(x_n)$ for $n ≥ 0$, starting with an initial approximation $x_0$, produces closer and closer approximations of a root $\alpha$ provided that the sequence of approximations $\{ x_n \}$ does in fact converge. However, the convergence of the Fixed Point Method is not guaranteed and relies heavily on $f$, the choice of $g$, and the initial approximation $x_0$. We will now show how to test the Fixed Point Method for convergence by building a condition under which we can guarantee, for a sufficiently close initial approximation $x_0$, that the sequence $\{ x_n \}$ generated by the Fixed Point Method converges to $\alpha$.
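To make the iteration concrete, here is a minimal sketch in Python (not from the original page). The choice $g(x) = \cos x$ is purely illustrative: its fixed point is the root of $f(x) = \cos x - x$.

```python
import math

def fixed_point(g, x0, tol=1e-10, max_iter=100):
    """Iterate x_{n+1} = g(x_n) until successive iterates differ by less than tol."""
    x = x0
    for _ in range(max_iter):
        x_next = g(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x  # best approximation after max_iter steps

# Illustrative choice (an assumption, not from the page): g(x) = cos(x),
# whose fixed point solves f(x) = cos(x) - x = 0.
alpha = fixed_point(math.cos, x0=0.5)
print(alpha)  # ≈ 0.7390851332 (the Dottie number)
```

Stopping on the difference of successive iterates is a common heuristic; the theorem below gives a rigorous error bound when its hypotheses hold.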

Theorem 1: Let $g$ and $g'$ be continuous on $[a, b]$ and suppose that if $a ≤ x ≤ b$ then $a ≤ g(x) ≤ b$. Also suppose that $\lambda = \mathrm{max}_{a ≤ x ≤ b} \mid g'(x) \mid < 1$. Then:

a) There exists a unique solution $\alpha \in [a, b]$ to the equation $x = g(x)$.

b) For any initial estimation $x_0 \in [a, b]$, we have that $\lim_{n \to \infty} x_n = \alpha$.

c) For $n ≥ 0$ we have that $\mid \alpha - x_n \mid ≤ \frac{\lambda^n}{1 - \lambda} \mid x_1 - x_0 \mid$.

d) $\lim_{n \to \infty} \frac{\alpha - x_{n+1}}{\alpha - x_n} = g'(\alpha)$, and for $x_n$ close to $\alpha$ we have $\alpha - x_{n+1} \approx g'(\alpha)(\alpha - x_n)$.
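The hypotheses of Theorem 1 can be checked numerically for a candidate $g$ and interval. The following grid-based sketch is only a heuristic (a fine grid estimates $\lambda$, it does not bound it rigorously), and the choice $g(x) = \cos x$ on $[0, 1]$ is an assumption made for illustration:

```python
import math

def check_hypotheses(g, dg, a, b, n_grid=1000):
    """Heuristically check that g maps [a, b] into [a, b] and that
    lambda = max |g'(x)| on [a, b] is less than 1, using a uniform grid."""
    xs = [a + (b - a) * i / n_grid for i in range(n_grid + 1)]
    maps_into = all(a <= g(x) <= b for x in xs)
    lam = max(abs(dg(x)) for x in xs)
    return maps_into, lam

# Illustrative (assumed) choice: g(x) = cos(x), g'(x) = -sin(x), on [0, 1].
maps_into, lam = check_hypotheses(math.cos, lambda x: -math.sin(x), 0.0, 1.0)
print(maps_into, lam)  # True, and lam = sin(1) ≈ 0.8415 < 1
```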

**Proof of a)** Since $g$ is continuous on $[a, b]$, since $x \in [a, b]$ implies that $g(x) \in [a, b]$, and since $\mid g'(x) \mid ≤ \lambda < 1$ for all $x \in [a, b]$, we are already guaranteed a unique solution to $x = g(x)$ by one of the theorems on the Fixed Points page.

**Proof of b)** Let $x_0 \in [a, b]$ be an initial approximation of the root $\alpha$. Then since $x_0 \in [a, b]$ we have that $x_1 = g(x_0) \in [a, b]$. By induction we see that $x_n \in [a, b]$ for $n ≥ 0$. Now we will show that these iterates converge. Take the equation $\alpha = g(\alpha)$ and subtract $x_{n+1} = g(x_n)$ to get $\alpha - x_{n+1} = g(\alpha) - g(x_n)$.

- Applying the Mean Value Theorem, there exists a $\xi_n$ strictly between $\alpha$ and $x_n$ such that $g(\alpha) - g(x_n) = g'(\xi_n)(\alpha - x_n)$, and therefore $\alpha - x_{n+1} = g'(\xi_n)(\alpha - x_n)$.

- Since $\xi_n$ lies between $\alpha$ and $x_n$, both of which are in $[a, b]$, for $n ≥ 0$ we have that $\mid g'(\xi_n) \mid ≤ \lambda = \mathrm{max}_{a ≤ x ≤ b} \mid g'(x) \mid < 1$, and so $\mid \alpha - x_{n+1} \mid ≤ \lambda \mid \alpha - x_n \mid$.

- Now inductively we obtain the following sequence of inequalities: $\mid \alpha - x_n \mid ≤ \lambda \mid \alpha - x_{n-1} \mid ≤ \lambda^2 \mid \alpha - x_{n-2} \mid ≤ \cdots ≤ \lambda^n \mid \alpha - x_0 \mid$.

- Therefore $0 ≤ \mid \alpha - x_n \mid ≤ \lambda^n \mid \alpha - x_0 \mid$. Since $\lambda < 1$, we have that $\lim_{n \to \infty} \lambda^n \mid \alpha - x_0 \mid = 0$, which implies that $\lim_{n \to \infty} x_n = \alpha$.
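Here is a quick numerical check of the inequality $\mid \alpha - x_n \mid ≤ \lambda^n \mid \alpha - x_0 \mid$, a sketch using the illustrative (assumed) choice $g(x) = \cos x$ on $[0, 1]$, where $\lambda = \sin 1 \approx 0.8415$ and the value of $\alpha$ is precomputed:

```python
import math

g = math.cos
lam = math.sin(1.0)            # lambda = max |g'(x)| on [0, 1]
alpha = 0.7390851332151607     # fixed point of cos (precomputed)

x = 0.5                        # x_0, an assumed starting point in [0, 1]
err0 = abs(alpha - x)
for n in range(1, 21):
    x = g(x)
    # |alpha - x_n| ≤ lambda^n |alpha - x_0|
    assert abs(alpha - x) <= lam**n * err0
print("geometric error bound holds for n = 1, ..., 20")
```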

**Proof of c)** Note that $\alpha - x_0 = \alpha - x_1 + x_1 - x_0$. Taking the absolute value of both sides and applying the triangle inequality, we get that $\mid \alpha - x_0 \mid ≤ \mid \alpha - x_1 \mid + \mid x_1 - x_0 \mid$.

- From the proof of **b)** above, we saw that $\mid \alpha - x_{n+1} \mid ≤ \lambda \mid \alpha - x_n \mid$, and so in particular, $\mid \alpha - x_1 \mid ≤ \lambda \mid \alpha - x_0 \mid$. Making this substitution into the inequality above, we get that $\mid \alpha - x_0 \mid ≤ \lambda \mid \alpha - x_0 \mid + \mid x_1 - x_0 \mid$, and hence $\mid \alpha - x_0 \mid ≤ \frac{1}{1 - \lambda} \mid x_1 - x_0 \mid$.

- Also from the proof of **b)** we have that $\mid \alpha - x_n \mid ≤ \lambda^n \mid \alpha - x_0 \mid$, and so $\frac{1}{\lambda^n} \mid \alpha - x_n \mid ≤ \mid \alpha - x_0 \mid$. Merging this inequality with that of above gives $\mid \alpha - x_n \mid ≤ \frac{\lambda^n}{1 - \lambda} \mid x_1 - x_0 \mid$.
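The bound in part c) is useful because it is computable without knowing $\alpha$: it involves only $x_0$, $x_1$, and $\lambda$. A sketch checking it against the true error for the illustrative (assumed) $g(x) = \cos x$ on $[0, 1]$:

```python
import math

g = math.cos
lam = math.sin(1.0)            # lambda = max |g'(x)| on [0, 1]
alpha = 0.7390851332151607     # fixed point of cos (precomputed)

x0 = 0.5                       # assumed starting point in [0, 1]
x1 = g(x0)
c = abs(x1 - x0) / (1.0 - lam) # |x_1 - x_0| / (1 - lambda)

x = x0
for n in range(21):
    # a priori bound from part c): |alpha - x_n| ≤ lambda^n / (1 - lambda) |x_1 - x_0|
    assert abs(alpha - x) <= lam**n * c
    x = g(x)
print("a priori bound holds for n = 0, ..., 20")
```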

**Proof of d)** From the proof of **b)** we have that $\alpha - x_{n+1} = g'(\xi_n)(\alpha - x_n)$, and so $\frac{\alpha - x_{n+1}}{\alpha - x_n} = g'(\xi_n)$.

- Thus taking the limit of both sides of the equation above, we get that $\lim_{n \to \infty} \frac{\alpha - x_{n+1}}{\alpha - x_n} = \lim_{n \to \infty} g'(\xi_n)$.

- Now since $\xi_n$ lies between $\alpha$ and $x_n$ for $n ≥ 0$ and $\lim_{n \to \infty} x_n = \alpha$ by **b)**, we must then have that $\lim_{n \to \infty} \xi_n = \alpha$. Since $g'$ is continuous, it follows that $\lim_{n \to \infty} g'(\xi_n) = g'(\alpha)$.
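Part d) says the convergence is linear with asymptotic ratio $g'(\alpha)$. For the illustrative (assumed) choice $g(x) = \cos x$ we have $g'(\alpha) = -\sin \alpha \approx -0.6736$, and the ratios $(\alpha - x_{n+1})/(\alpha - x_n)$ can be watched approaching this value:

```python
import math

g = math.cos
alpha = 0.7390851332151607          # fixed point of cos (precomputed)
g_prime_alpha = -math.sin(alpha)    # g'(alpha) ≈ -0.6736

x = 0.5                             # assumed starting point
ratios = []
for n in range(25):
    x_next = g(x)
    ratios.append((alpha - x_next) / (alpha - x))
    x = x_next
print(ratios[-1])  # → approaches g'(alpha) ≈ -0.6736
```

The negative ratio matches the behavior one sees in practice with this $g$: the iterates alternate sides of $\alpha$ because $g'(\alpha) < 0$.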

The following Corollary provides a criterion for determining whether our choice of $g(x)$ will yield iterates converging to the root $\alpha$.

Corollary 1 (Convergence Criterion of the Fixed Point Method): If $g$ and $g'$ are continuous on an interval $[a, b]$ containing the root $\alpha$, if $g$ maps $[a, b]$ into $[a, b]$, and if $\mathrm{max}_{a ≤ x ≤ b} \mid g'(x) \mid < 1$, then for any $x_0 \in [a, b]$ the fixed point iterates $x_{n+1} = g(x_n)$ will converge to $\alpha$.
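The criterion explains why different rearrangements of the same equation behave differently near different roots. As an illustration not on the original page, $f(x) = x^2 - 3x + 2 = 0$ has roots $1$ and $2$, and can be rewritten as $x = g(x)$ with $g(x) = (x^2 + 2)/3$, so $g'(x) = 2x/3$. Near $\alpha = 1$ we have $\mid g'(1) \mid = 2/3 < 1$, so the iteration converges; near $\alpha = 2$ we have $\mid g'(2) \mid = 4/3 > 1$, and the iterates are pushed away:

```python
def g(x):
    # Rearrangement of x^2 - 3x + 2 = 0 as x = (x^2 + 2) / 3 (illustrative).
    return (x * x + 2.0) / 3.0

def iterate(x, steps):
    for _ in range(steps):
        x = g(x)
    return x

# Near alpha = 1: |g'(1)| = 2/3 < 1, so the iterates converge to 1.
print(iterate(1.5, 50))   # → very close to 1.0

# Near alpha = 2: |g'(2)| = 4/3 > 1, so the iterates move away from 2.
print(iterate(2.1, 10))   # → grows well past 2, diverging
```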