LU Decompositions for Tridiagonal Matrices

Consider an $n \times n$ matrix $A$ in the following form:

(1)
\begin{align} \quad A = \begin{bmatrix} b_1 & c_1 & 0 & 0 & 0 & 0\\ a_2 & b_2 & c_2 & 0 & 0 & 0\\ 0 & a_3 & b_3 & c_3 & 0 & 0\\ 0 & 0 & \ddots & \ddots & \ddots & 0\\ \vdots & \vdots & \ddots & a_{n-1} & b_{n-1} & c_{n-1}\\ 0 & 0 & \cdots & 0 & a_{n} & b_n \end{bmatrix} \end{align}

Such a matrix is known as a Tridiagonal Matrix, as it in a sense contains only three diagonals. If we have a system $Ax = f$ and assume that pivoting is not used, then most of the multipliers $m_{ik}$ are zero. This results in a corresponding $LU$ decomposition of the form:

(2)
\begin{align} \quad A = \begin{bmatrix} b_1 & c_1 & 0 & 0 & 0 & 0\\ a_2 & b_2 & c_2 & 0 & 0 & 0\\ 0 & a_3 & b_3 & c_3 & 0 & 0\\ 0 & 0 & \ddots & \ddots & \ddots & 0\\ \vdots & \vdots & \ddots & a_{n-1} & b_{n-1} & c_{n-1}\\ 0 & 0 & \cdots & 0 & a_{n} & b_n \end{bmatrix} = \begin{bmatrix}1 & 0 & 0 & \cdots & 0\\ \alpha_2 & 1 & 0 & \cdots & 0\\ 0 & \alpha_3 & 1 & \ddots & \vdots\\ \vdots & \ddots & \ddots & 1 & 0\\ 0 & \cdots & 0 & \alpha_n & 1 \end{bmatrix} \begin{bmatrix} \beta_1 & c_1 & 0 & \cdots & 0\\ 0 & \beta_2 & c_2 & \ddots & \vdots\\ 0 & 0 & \ddots & \ddots & 0\\ \vdots & \vdots & \ddots & \beta_{n-1} & c_{n-1}\\ 0 & 0 & \cdots & 0 & \beta_n \end{bmatrix} = LU \end{align}
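For concreteness, here is a minimal sketch (assuming NumPy is available; the particular $4 \times 4$ matrix is an arbitrary example, not taken from the text above) of how such a matrix can be assembled from its three diagonals:

```python
import numpy as np

# Diagonals of a 4x4 tridiagonal matrix, indexed to match the notation above:
# b_1, ..., b_4 on the main diagonal, a_2, ..., a_4 below it, c_1, ..., c_3 above it.
b = np.array([4.0, 4.0, 4.0, 4.0])   # main diagonal
a = np.array([1.0, 1.0, 1.0])        # subdiagonal (a_2, a_3, a_4)
c = np.array([1.0, 1.0, 1.0])        # superdiagonal (c_1, c_2, c_3)

# Place each vector on its diagonal and add the results.
A = np.diag(b) + np.diag(a, k=-1) + np.diag(c, k=1)
print(A)
```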

If we then apply Doolittle's Method, we first see from the first row of $A$ that:

(3)
\begin{align} \quad b_1 = \beta_1 \end{align}

Furthermore, from the second row of $A$ we have that:

(4)
\begin{align} \quad a_2 = \alpha_2 \beta_1 \quad , \quad b_2 = \alpha_2c_1 + \beta_2 \end{align}

More generally, for the $j^{\mathrm{th}}$ row of $A$, where $j = 2, 3, \dots, n$, we have that:

(5)
\begin{align} \quad a_j = \alpha_j \beta_{j-1} \quad , \quad b_j = \alpha_j c_{j-1} + \beta_j \end{align}
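
Solving these equations in order (and assuming that each $\beta_{j-1} \neq 0$, so that no pivoting is required) yields the simple recurrence:

(6)
\begin{align} \quad \beta_1 = b_1 \quad , \quad \alpha_j = \frac{a_j}{\beta_{j-1}} \quad , \quad \beta_j = b_j - \alpha_j c_{j-1} \quad \text{for } j = 2, 3, \dots, n \end{align}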

Thus, as you can see, the formulas for finding the entries of the matrices $L$ and $U$ are much nicer to work with. In terms of computing time, an $LU$ factorization of a tridiagonal coefficient matrix is much cheaper to obtain than that of a general dense matrix, and we can then apply forward and backward substitution where necessary.
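To make the computation explicit, the following is a minimal sketch of the resulting algorithm in Python with NumPy. The function names `tridiag_lu` and `tridiag_solve` are illustrative, not from the text, and the sketch assumes that no pivoting is needed (every $\beta_j \neq 0$):

```python
import numpy as np

def tridiag_lu(a, b, c):
    """Compute the alpha and beta coefficients of the LU decomposition above.

    a : subdiagonal (a_2, ..., a_n), length n-1
    b : main diagonal (b_1, ..., b_n), length n
    c : superdiagonal (c_1, ..., c_{n-1}), length n-1
    Assumes no pivoting is required (every beta_j is nonzero).
    """
    n = len(b)
    alpha = np.zeros(n - 1)
    beta = np.zeros(n)
    beta[0] = b[0]                                 # beta_1 = b_1
    for j in range(1, n):
        alpha[j - 1] = a[j - 1] / beta[j - 1]      # alpha_j = a_j / beta_{j-1}
        beta[j] = b[j] - alpha[j - 1] * c[j - 1]   # beta_j = b_j - alpha_j c_{j-1}
    return alpha, beta

def tridiag_solve(alpha, beta, c, f):
    """Solve A x = f using the factors: forward substitution with L (L y = f),
    then back substitution with U (U x = y)."""
    n = len(beta)
    # Forward substitution with the unit lower bidiagonal factor L.
    y = np.zeros(n)
    y[0] = f[0]
    for j in range(1, n):
        y[j] = f[j] - alpha[j - 1] * y[j - 1]
    # Back substitution with the upper bidiagonal factor U.
    x = np.zeros(n)
    x[n - 1] = y[n - 1] / beta[n - 1]
    for j in range(n - 2, -1, -1):
        x[j] = (y[j] - c[j] * x[j + 1]) / beta[j]
    return x

# Example: the 4x4 tridiagonal matrix from the earlier sketch, with exact solution x = (1, 1, 1, 1).
b = np.array([4.0, 4.0, 4.0, 4.0])
a = np.array([1.0, 1.0, 1.0])
c = np.array([1.0, 1.0, 1.0])
f = np.array([5.0, 6.0, 6.0, 5.0])

alpha, beta = tridiag_lu(a, b, c)
x = tridiag_solve(alpha, beta, c, f)
A = np.diag(b) + np.diag(a, k=-1) + np.diag(c, k=1)
print(np.allclose(A @ x, f))  # True
```

Note that the factorization and the two substitution sweeps each take only $O(n)$ operations, compared with the $O(n^3)$ cost of an $LU$ factorization of a general dense matrix.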
