The Jacobi Iteration Method
So far we have looked at Gaussian Elimination for solving a system $Ax = b$ of $n$ linear equations in $n$ unknowns (provided that a solution exists). Unfortunately, for very large systems it is often unrealistic to apply Gaussian Elimination: the computations are long and cumbersome, and error can accumulate rapidly. As a result, we will begin looking at iterative methods to approximate the solution to $Ax = b$, the first being the Jacobi Iteration Method for solving a system of $n$ linear equations in $n$ unknowns.
Suppose that $x^{(0)} =\begin{bmatrix}x_1^{(0)}\\ x_2^{(0)}\\ \vdots\\ x_n^{(0)} \end{bmatrix}$ is an initial approximation to the solution $x$ of the following system of $n$ equations in $n$ unknowns:
(1)
\begin{align} \quad \begin{matrix} E(1): & a_{11}x_1 & + & a_{12}x_2 & + & \cdots & + & a_{1n}x_n & = & b_1\\ E(2): & a_{21}x_1 & + & a_{22}x_2 & + & \cdots & + & a_{2n}x_n & = & b_2\\ \vdots & \vdots & & \vdots & & \ddots & & \vdots & & \vdots\\ E(n): & a_{n1}x_1 & + & a_{n2}x_2 & + & \cdots & + & a_{nn}x_n & = & b_n & \end{matrix} \end{align}
For the Jacobi Iteration Method, we isolate the variable $x_i$ from the $i^{\mathrm{th}}$ equation of the system $Ax = b$. Provided that $a_{ii} \neq 0$ for $i = 1, 2, ..., n$, we get that:
(2)
\begin{align} x_1 = \frac{b_1 - \left [ a_{12}x_2 + a_{13}x_3 + ... + a_{1n}x_n \right ]}{a_{11}} \\ x_2 = \frac{b_2 - \left [ a_{21}x_1 + a_{23}x_3 + ... + a_{2n}x_n \right ]}{a_{22}} \\ \quad \quad \quad \quad\quad \quad \vdots \quad \quad \quad \quad \quad \quad \\ x_n = \frac{b_n - \left [ a_{n1}x_1 + a_{n2}x_2 + ... + a_{n,n-1}x_{n-1} \right ]}{a_{nn}} \end{align}
In sigma notation, for each $i = 1, 2, ..., n$, we have that:
(3)
\begin{align} \quad x_i = \frac{b_i - \sum_{\substack{j=1 \\ j \neq i}}^{n} a_{ij}x_j}{a_{ii}} \end{align}
To obtain our first approximation $x^{(1)}$ of the solution $x$ using the Jacobi Iteration Method, we take the isolated equations above and input the values of our initial approximation $x^{(0)}$ to get:
(4)
\begin{align} \quad x_1^{(1)} = \frac{b_1 - \left [ a_{12}x_2^{(0)} + a_{13}x_3^{(0)} + ... + a_{1n}x_n^{(0)} \right ]}{a_{11}} \\ x_2^{(1)} = \frac{b_2 - \left [ a_{21}x_1^{(0)} + a_{23}x_3^{(0)} + ... + a_{2n}x_n^{(0)} \right ]}{a_{22}} \\ \quad \quad \quad \quad\quad \quad \vdots \quad \quad \quad \quad \quad \quad \\ x_n^{(1)} = \frac{b_n - \left [ a_{n1}x_1^{(0)} + a_{n2}x_2^{(0)} + ... + a_{n,n-1}x_{n-1}^{(0)} \right ]}{a_{nn}} \end{align}
Or in sigma notation, for $i = 1, 2, ..., n$:
(5)
\begin{align} \quad x_i^{(1)} = \frac{b_i - \sum_{\substack{j=1 \\ j \neq i}}^{n} a_{ij}x_j^{(0)}}{a_{ii}} \end{align}
We can then use our approximation $x^{(1)}$ in a similar manner to obtain another approximation, $x^{(2)}$, and so forth, in the hope that these successive approximations converge to the actual solution $x$ of $Ax = b$. For each $k \geq 1$, the $k^{\mathrm{th}}$ iteration of the Jacobi Iteration Method yields:
(6)
\begin{align} \quad x_1^{(k)} = \frac{b_1 - \left [ a_{12}x_2^{(k-1)} + a_{13}x_3^{(k-1)} + ... + a_{1n}x_n^{(k-1)} \right ]}{a_{11}} \\ x_2^{(k)} = \frac{b_2 - \left [ a_{21}x_1^{(k-1)} + a_{23}x_3^{(k-1)} + ... + a_{2n}x_n^{(k-1)} \right ]}{a_{22}} \\ \quad \quad \quad \quad\quad \quad \vdots \quad \quad \quad \quad \quad \quad \\ x_n^{(k)} = \frac{b_n - \left [ a_{n1}x_1^{(k-1)} + a_{n2}x_2^{(k-1)} + ... + a_{n,n-1}x_{n-1}^{(k-1)} \right ]}{a_{nn}} \end{align}
In sigma notation, each entry of the approximate solution $x^{(k)}$, for $k \geq 1$ and $i = 1, 2, ..., n$, can be obtained as:
(7)
\begin{align} \quad x_i^{(k)} = \frac{b_i - \sum_{\substack{j=1 \\ j \neq i}}^{n} a_{ij}x_j^{(k-1)}}{a_{ii}} \end{align}
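As a concrete illustration of the update rule $(7)$, below is a minimal Python sketch of the Jacobi Iteration Method; the function name `jacobi` and the use of a fixed number of iterations as the stopping rule are illustrative assumptions rather than part of the method itself.

```python
def jacobi(A, b, x0, num_iterations):
    """Approximate the solution of Ax = b with the Jacobi Iteration Method.

    A  : list of n rows (each a list of n coefficients) with nonzero diagonal entries
    b  : list of n right-hand side values
    x0 : list of n values giving the initial approximation x^(0)
    """
    n = len(b)
    x_prev = list(x0)
    for _ in range(num_iterations):
        x_next = [0.0] * n
        for i in range(n):
            # Sum a_ij * x_j^(k-1) over all j except j = i, as in equation (7).
            s = sum(A[i][j] * x_prev[j] for j in range(n) if j != i)
            x_next[i] = (b[i] - s) / A[i][i]
        x_prev = x_next
    return x_prev
```

Note that every entry of $x^{(k)}$ is computed from the previous iterate $x^{(k-1)}$ only, which is why the sketch builds a fresh list `x_next` on each pass instead of overwriting entries of `x_prev` in place.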
For example, suppose that we want to solve the following system of $3$ linear equations in the three unknowns $x_1$, $x_2$, and $x_3$:
(8)
\begin{align} \quad \begin{matrix} E(1): & 3x_1 & + & x_2 & + & 4x_3 & = & 1 \\ E(2): & x_1 & + & 5x_2 & + & 9x_3 & = & 1\\ E(3): & 2x_1 & + & 6x_2 & + & 5x_3 & = & 1 \end{matrix} \end{align}
Furthermore, suppose that our initial approximation of the solution is $x^{(0)} = \begin{bmatrix} 0.22\\ 0.03\\ 0.06 \end{bmatrix}$. We first isolate the variable $x_i$ from equation $E(i)$ for $i = 1, 2, 3$ to get:
(9)
\begin{align} \quad x_1 = \frac{1 - x_2 - 4x_3}{3} \\ \quad x_2 = \frac{1 - x_1 - 9x_3}{5} \\ \quad x_3 = \frac{1 - 2x_1 - 6x_2}{5} \end{align}
For our first approximation $x^{(1)}$ using the Jacobi Iteration method, we have that:
(10)
\begin{align} \quad x_1^{(1)} = \frac{1 - (0.03) - 4(0.06)}{3} \approx 0.2433 \\ \quad x_2^{(1)} = \frac{1 - (0.22) - 9(0.06)}{5} = 0.048\\ \quad x_3^{(1)} = \frac{1 - 2(0.22) - 6(0.03)}{5} = 0.076 \end{align}
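This first iteration can be reproduced with the `jacobi` sketch above (again, a hypothetical helper rather than anything in the original presentation):

```python
A = [[3.0, 1.0, 4.0],
     [1.0, 5.0, 9.0],
     [2.0, 6.0, 5.0]]
b = [1.0, 1.0, 1.0]
x0 = [0.22, 0.03, 0.06]

# A single Jacobi pass starting from x^(0) produces x^(1).
x1 = jacobi(A, b, x0, num_iterations=1)
print(x1)  # approximately [0.2433, 0.048, 0.076]
```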
We can then continue the Jacobi Iteration method to get successive approximations. The actual solution to this system is $x = \begin{bmatrix}0.2\bar{3}\\ 0.0\bar{3}\\ 0.0\bar{6}\end{bmatrix}$. Whether or not the Jacobi Iteration Method converges to $x$ will be discussed subsequently.
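If we keep iterating with the same hypothetical `jacobi` helper (reusing `A`, `b`, and `x0` from the snippet above), we can track how far each iterate is from this exact solution; whether these distances actually shrink is precisely the convergence question deferred to the subsequent discussion.

```python
x_exact = [7/30, 1/30, 1/15]  # the repeating decimals 0.2333..., 0.0333..., 0.0666...

for k in range(1, 6):
    xk = jacobi(A, b, x0, num_iterations=k)
    # Maximum componentwise distance between x^(k) and the exact solution.
    err = max(abs(xk[i] - x_exact[i]) for i in range(3))
    print(f"k = {k}: max error = {err:.4f}")
```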