The Characteristic Polynomial of a Matrix

Recall from The Eigenvalues of a Matrix page that if $A$ is an $n \times n$ matrix, then the number $\lambda$ is said to be an eigenvalue of $A$ if there exists a nonzero vector $v$ such that $Av = \lambda v$. Furthermore, the nonzero vectors $v$ for which $Av = \lambda v$ are called the eigenvectors corresponding to the eigenvalue $\lambda$.

Now suppose that $\lambda$ is an eigenvalue of the matrix $A$. Then $Av = \lambda v$ for some nonzero vector $v$. We can rewrite this equation as:

\begin{align} \quad \lambda v - Av = 0 \\ \quad \lambda I v - Av = 0 \\ \quad (\lambda I - A)v = 0 \end{align}

We note that the above equation represents a homogeneous system of linear equations, and that $v$ is a nonzero vector. If $\lambda I - A$ were invertible, then the only solution to this system would be the trivial solution $v = 0$, which contradicts $v$ being nonzero. Therefore $\lambda I - A$ is not invertible, and so $\mathrm{det} (\lambda I - A) = 0$. Conversely, if $\mathrm{det} (\lambda I - A) = 0$, then the system $(\lambda I - A)v = 0$ has a nonzero solution $v$, and so $\lambda$ is an eigenvalue of $A$. This equation (which produces a polynomial in $\lambda$) is therefore extremely useful for finding the eigenvalues of a matrix, and we formally define it below.

Definition: If $A$ is an $n \times n$ matrix, then the Characteristic Polynomial of $A$ is the function $f(\lambda) = \mathrm{det} (\lambda I - A)$.
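For a general $2 \times 2$ matrix $A = \begin{bmatrix} a & b\\ c & d \end{bmatrix}$, the characteristic polynomial can be written down directly, and the expansion shows that it is a monic polynomial of degree $2$ in $\lambda$:

\begin{align} \quad \mathrm{det} (\lambda I - A) = \begin{vmatrix} \lambda - a & -b\\ -c & \lambda - d \end{vmatrix} = (\lambda - a)(\lambda - d) - bc = \lambda^2 - (a + d) \lambda + (ad - bc) \end{align}

Note that the coefficient of $\lambda$ is the negative of the trace of $A$ and the constant term is the determinant of $A$.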

For example, consider the matrix $A = \begin{bmatrix} 1 & 2\\ 3 & 4 \end{bmatrix}$. Now let's form the matrix $\lambda I - A$:

\begin{align} \quad \lambda I - A = \begin{bmatrix} \lambda & 0\\ 0 & \lambda \end{bmatrix} - \begin{bmatrix} 1 & 2\\ 3 & 4 \end{bmatrix} = \begin{bmatrix} \lambda -1 & -2\\ -3 & \lambda - 4 \end{bmatrix} \end{align}

Now let's set the determinant of this matrix equal to zero:

\begin{align} \quad \begin{vmatrix} \lambda -1 & -2\\ -3 & \lambda - 4 \end{vmatrix} = (\lambda - 1)(\lambda - 4) - (-2)(-3) = \lambda^2 -5 \lambda - 2 = 0 \end{align}

The resulting eigenvalues are the roots of the polynomial above which can be computed using the quadratic formula. They are $\lambda_1 = - \frac{\sqrt{33} - 5}{2}$ and $\lambda_2 = \frac{\sqrt{33} + 5}{2}$ as you should verify.
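These roots can be checked numerically. The sketch below (assuming NumPy is available) uses `np.poly`, which returns the coefficients of $\mathrm{det} (\lambda I - A)$ for a square matrix, and compares the roots of the characteristic polynomial against the eigenvalues computed directly by `np.linalg.eigvals`:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])

# Coefficients of det(lambda*I - A): lambda^2 - 5*lambda - 2
coeffs = np.poly(A)

# The eigenvalues are the roots of the characteristic polynomial
roots = np.roots(coeffs)

# Compare against direct eigenvalue computation and the closed-form roots
eigs = np.linalg.eigvals(A)
expected = np.array([(5 - np.sqrt(33)) / 2, (5 + np.sqrt(33)) / 2])
```

Both `roots` and `eigs` agree with the closed-form values up to floating-point error.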

Now the following theorem, which follows from the fact that the characteristic polynomial of an $n \times n$ matrix has degree $n$ and hence at most $n$ roots, gives us an upper bound for the number of distinct eigenvalues that an $n \times n$ matrix can have.

Theorem 1: If $A$ is an $n \times n$ matrix then $A$ has at most $n$ distinct eigenvalues.
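A matrix can of course have strictly fewer than $n$ distinct eigenvalues. As a minimal numerical illustration (using NumPy, with the $3 \times 3$ identity matrix chosen as the example), the identity has characteristic polynomial $(\lambda - 1)^3$ and hence only one distinct eigenvalue:

```python
import numpy as np

# The 3x3 identity matrix has characteristic polynomial (lambda - 1)^3,
# so its only eigenvalue is 1, with algebraic multiplicity 3.
I3 = np.eye(3)
eigs = np.linalg.eigvals(I3)
distinct = np.unique(np.round(eigs, 10))  # collapse numerically equal values
```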

In general, if we have a relatively small matrix $A$, then computing the corresponding characteristic polynomial to find the eigenvalues of $A$ is manageable. However, for very large matrices this method is not practical, because the number of computations grows extremely fast. For example, consider a $100 \times 100$ matrix. Expanding the determinant by cofactors, we would need to compute the determinants of $100$ $99 \times 99$ matrices, and for each $99 \times 99$ matrix, we would need to compute the determinants of $99$ $98 \times 98$ matrices, and so forth. Even after all of the additions, subtractions, etc. associated with this process, we would end up with a polynomial of degree $100$, which could be very complicated to solve.

Of course, there are ways to work around this. One such way is to reduce $A$, by similarity transformations, to an upper triangular matrix, whose diagonal entries are then the eigenvalues of $A$. We will look at other methods later on.
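To illustrate why triangular matrices are convenient, note that if $T$ is upper triangular then $\lambda I - T$ is also upper triangular, so $\mathrm{det} (\lambda I - T) = \prod_{i} (\lambda - t_{ii})$ and the eigenvalues of $T$ are exactly its diagonal entries. A minimal NumPy sketch with a hypothetical triangular matrix:

```python
import numpy as np

# For an upper triangular matrix, det(lambda*I - T) factors as the
# product of (lambda - t_ii), so the eigenvalues are the diagonal entries.
T = np.array([[2.0, 7.0, 1.0],
              [0.0, 5.0, 3.0],
              [0.0, 0.0, -1.0]])
eigs = np.sort(np.linalg.eigvals(T))
```

Here `eigs` matches the sorted diagonal entries $-1, 2, 5$ of $T$.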

Unless otherwise stated, the content of this page is licensed under Creative Commons Attribution-ShareAlike 3.0 License