Gaussian Elimination and Back Substitution

Gaussian Elimination

Consider the system of linear equations:

(1)
\begin{align} a_{11}x_1 + a_{12}x_2 + \cdots + a_{1n}x_n = b_1 \\ a_{21}x_1 + a_{22}x_2 + \cdots + a_{2n}x_n = b_2 \\ \vdots \quad \quad \quad \vdots \quad \quad \quad \quad \vdots \quad \quad \vdots \: \: \\ a_{m1}x_1 + a_{m2}x_2 + \cdots + a_{mn}x_n = b_m \end{align}

If we take what is known as the augmented matrix of this system, that is, the matrix corresponding to the coefficients and constants of the system, $\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} & b_1\\ a_{21} & a_{22} & \cdots & a_{2n} & b_2\\ \vdots & \vdots & & \vdots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} & b_{m} \end{bmatrix}$, and reduce it to Row Echelon Form (REF), then we will be able to solve the system rather easily.
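
To make the procedure concrete, here is a minimal Python sketch of the reduction (supplementary to the text above; the function name row_echelon and the choice to store the augmented matrix as a list of rows are illustrative assumptions). It reduces an augmented matrix to REF using elementary row operations:

```python
def row_echelon(aug):
    """Reduce an augmented matrix (a list of rows of numbers) to row
    echelon form in place using elementary row operations."""
    rows, cols = len(aug), len(aug[0])
    pivot_row = 0
    for col in range(cols - 1):              # the last column holds the constants b
        # Find a row at or below pivot_row with a nonzero entry in this column.
        candidates = [r for r in range(pivot_row, rows) if aug[r][col] != 0]
        if not candidates:
            continue                         # no pivot in this column; move to the next
        r = candidates[0]
        aug[pivot_row], aug[r] = aug[r], aug[pivot_row]      # swap rows
        lead = aug[pivot_row][col]
        aug[pivot_row] = [x / lead for x in aug[pivot_row]]  # scale to get a leading 1
        for below in range(pivot_row + 1, rows):             # eliminate entries below the pivot
            factor = aug[below][col]
            aug[below] = [x - factor * y
                          for x, y in zip(aug[below], aug[pivot_row])]
        pivot_row += 1
    return aug
```

Using exact fractions.Fraction entries instead of integers or floats keeps the arithmetic free of round-off.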

For example, consider the following system of linear equations:

(2)
\begin{align} 2x + 3y -z = 2 \\ 2x + 4y +z = 4 \\ x + 2y +z = 1 \\ \end{align}

The augmented matrix for this system is $\begin{bmatrix} 2 & 3 & -1 & 2\\ 2 & 4 & 1 & 4\\ 1 & 2 & 1 & 1 \end{bmatrix}$. If we reduce this matrix to REF, we obtain the following matrix (as you should verify):

(3)
\begin{bmatrix} 1 & 0 & 0 & -9\\ 0 & 1 & 0 & 6\\ 0 & 0 & 1 & -2 \end{bmatrix}

From this matrix we can extract the following set of linear equations that is much easier to work with:

(4)
\begin{align} x + 0y + 0z = -9 \\ 0x + y + 0z = 6 \\ 0x + 0y + z = -2 \end{align}

From this matrix we can read off the solution $(x, y, z) = (-9, 6, -2)$ to the original system of linear equations. Sometimes we may also have to use the algebraic technique of back substitution, which we will now describe.
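
As a quick supplementary check (NumPy is used here only for verification, not as part of the method above), we can confirm numerically that $(-9, 6, -2)$ does solve system (2):

```python
import numpy as np

# Coefficients and constants of system (2).
A = np.array([[2.0, 3.0, -1.0],
              [2.0, 4.0,  1.0],
              [1.0, 2.0,  1.0]])
b = np.array([2.0, 4.0, 1.0])

# Solving directly reproduces the values read off from the reduced matrix (3).
print(np.linalg.solve(A, b))                              # approximately [-9.  6. -2.]
print(np.allclose(A @ np.array([-9.0, 6.0, -2.0]), b))    # True
```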

Back Substitution

Consider the following augmented matrix for a system of 3 linear equations in 4 unknowns $x_1, x_2, x_3, x_4$ that has already been put into REF:

(5)
\begin{bmatrix} 1 & 0 & 0 & 4 & -1\\ 0 & 1 & 0 & 2 & 6\\ 0 & 0 & 1 & 3 & 2 \end{bmatrix}

From this we obtain the following linear equations:

(6)
\begin{align} x_1 + 0x_2 + 0x_3 + 4x_4 = -1 \\ 0x_1 + x_2 + 0x_3 + 2x_4 = 6 \\ 0x_1 + 0x_2 + x_3 + 3x_4 = 2 \end{align}

The variables $x_1, x_2, x_3$ correspond to leading $1$s (pivots) in the matrix, so we call them leading variables, while the variable $x_4$, which does not correspond to a pivot, is called a free variable.
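
As a small supplementary illustration (not part of the original derivation), the leading and free columns of the coefficient part of (5) can be picked out programmatically, since the pivot of each row of an REF matrix is its first nonzero entry:

```python
import numpy as np

# Coefficient part of the augmented matrix (5); the constants column is omitted.
A = np.array([[1.0, 0.0, 0.0, 4.0],
              [0.0, 1.0, 0.0, 2.0],
              [0.0, 0.0, 1.0, 3.0]])

# Columns containing a pivot correspond to leading variables; the rest are free.
pivot_cols = [int(np.flatnonzero(row)[0]) for row in A if np.any(row != 0)]
free_cols = [c for c in range(A.shape[1]) if c not in pivot_cols]
print(pivot_cols, free_cols)   # [0, 1, 2] [3]  ->  x1, x2, x3 are leading; x4 is free
```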

To solve this system, we assign each free variable an arbitrary parameter, say $t$. So let $x_4 = t$ for $t \in \mathbb{R}$. Solving each equation for its leading variable then gives the general solution:

(7)
\begin{align} x_1 = -1 - 4t \\ x_2 = 6 - 2t \\ x_3 = 2 - 3t \\ x_4 = t \end{align}

That is, for each $t \in \mathbb{R}$ there is a solution $(x_1, x_2, x_3, x_4) = (-1 - 4t, 6 - 2t, 2 - 3t, t)$. Since every value of $t \in \mathbb{R}$ gives a corresponding solution to the system, the system has infinitely many solutions.
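
As a supplementary check (again using NumPy only for verification), we can confirm that every choice of the parameter $t$ in (7) satisfies the equations in (6):

```python
import numpy as np

# Augmented matrix (5): A is the coefficient part, b the constants.
aug = np.array([[1.0, 0.0, 0.0, 4.0, -1.0],
                [0.0, 1.0, 0.0, 2.0,  6.0],
                [0.0, 0.0, 1.0, 3.0,  2.0]])
A, b = aug[:, :-1], aug[:, -1]

def general_solution(t):
    """Return (x1, x2, x3, x4) for the free-variable value x4 = t, as in (7)."""
    return np.array([-1.0 - 4.0 * t, 6.0 - 2.0 * t, 2.0 - 3.0 * t, t])

# Each value of t gives a valid solution, illustrating the infinite solution set.
for t in (-2.0, 0.0, 3.5):
    assert np.allclose(A @ general_solution(t), b)
```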

Example 1

Given the following REF matrix that represents a system of 3 linear equations in 3 variables $x_1, x_2, x_3$, find a solution.

(8)
\begin{bmatrix} 1 & 1 & 0 & 4\\ 0 & 1 & 1 & 2\\ 0 & 0 & 1 & 4 \end{bmatrix}

We obtain the following system of linear equations:

(9)
\begin{align} x_1 + x_2 = 4 \\ x_2 + x_3 = 2 \\ x_3 = 4 \end{align}

We first note that $x_3 = 4$. We can back-substitute this into the second equation to get that:

(10)
\begin{align} x_2 + x_3 = 2 \\ x_2 + (4) = 2 \\ x_2 = -2 \end{align}

Therefore $x_2 = -2$. Lastly, we can substitute this information into the first equation to get that:

(11)
\begin{align} x_1 + x_2 = 4 \\ x_1 + (-2) = 4 \\ x_1 = 6 \end{align}

Therefore we have one solution to our system, namely $(x_1, x_2, x_3) = (6, -2, 4)$.
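
To round out the example, here is a small Python sketch of back substitution (the function name back_substitute and its assumptions, namely a square REF system with a leading 1 in every row, are my own illustrative choices); applied to matrix (8) it reproduces the solution found by hand:

```python
import numpy as np

def back_substitute(aug):
    """Back substitution for an n x (n+1) augmented matrix in row echelon
    form in which every variable is leading (unit diagonal, zeros below it)."""
    A, b = aug[:, :-1], aug[:, -1]
    n = len(b)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):               # work from the last equation upward
        x[i] = b[i] - A[i, i + 1:] @ x[i + 1:]   # the leading coefficient is already 1
    return x

# Matrix (8) from Example 1.
aug = np.array([[1.0, 1.0, 0.0, 4.0],
                [0.0, 1.0, 1.0, 2.0],
                [0.0, 0.0, 1.0, 4.0]])
print(back_substitute(aug))                      # [ 6. -2.  4.]
```

The printed vector agrees with the values obtained by hand above.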
