Eigenvectors of Square Matrices
Recall from the Eigenvalues of Square Matrices page that if $A$ is an $n \times n$ matrix and $\lambda \in \mathbb{R}$ then the characteristic polynomial for $A$ is the polynomial:
\begin{align} \quad p(\lambda) = \det (A - \lambda I) \end{align}
The roots of the characteristic polynomial are called the eigenvalues of $A$.
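For instance, if $A$ is a general $2 \times 2$ matrix with entries $a, b, c, d$, then expanding the determinant gives the quadratic:
\begin{align} \quad p(\lambda) = \det \begin{bmatrix} a - \lambda & b \\ c & d - \lambda \end{bmatrix} = (a - \lambda)(d - \lambda) - bc = \lambda^2 - (a + d)\lambda + (ad - bc) \end{align}
In general, the characteristic polynomial of an $n \times n$ matrix has degree $n$, so $A$ has at most $n$ distinct real eigenvalues.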
Now recall that we originally began with the matrix equation $Ax = \lambda x$, which is equivalent to the matrix equation $(A - \lambda I)x = 0$. We noted that this matrix equation always has the trivial solution $x = 0$, and that if $\det (A - \lambda I) \neq 0$ then $(A - \lambda I)x = 0$ has only the trivial solution. However, if $\det (A - \lambda I) = 0$ (which happens precisely when $\lambda$ is an eigenvalue of $A$), then there are infinitely many solutions $x$ corresponding to this $\lambda$. These nontrivial solutions are defined below.
Definition: Let $A$ be an $n \times n$ matrix. If $\lambda$ is an eigenvalue of $A$, then a nonzero vector $v$ satisfying $(A - \lambda I)v = 0$ is called an Eigenvector of $A$ corresponding to $\lambda$.
Note that eigenvectors of $A$ corresponding to an eigenvalue $\lambda$ are not unique; in fact, there are infinitely many of them.
Also note that the zero vector is not an eigenvector.
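Both of these notes follow directly from the definition. If $v$ is an eigenvector corresponding to $\lambda$ and $t \neq 0$, then $tv$ is also an eigenvector corresponding to $\lambda$, since:
\begin{align} \quad (A - \lambda I)(tv) = t(A - \lambda I)v = t \cdot 0 = 0 \end{align}
On the other hand, the zero vector is excluded precisely because $(A - \lambda I)0 = 0$ holds for every $\lambda \in \mathbb{R}$, so the zero vector would otherwise count as an eigenvector for every $\lambda$.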
For example, consider the following matrix:
\begin{align} \quad A = \begin{bmatrix} 2 & 7 \\ -1 & -6 \end{bmatrix} \end{align}
We have previously seen that the eigenvalues of $A$ are $\lambda_1 = 1$ and $\lambda_2 = -5$. Let's find a corresponding eigenvector for $\lambda_1 = 1$. We consider the following matrix equation:
\begin{align} \quad (A - 1I)x = \begin{bmatrix} 1 & 7 \\ -1 & -7 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} \end{align}
This matrix equation gives us the following system of equations:
\begin{align} \quad x_1 + 7x_2 = 0 \\ \quad -x_1 - 7x_2 = 0 \end{align}
Both equations reduce to $x_1 + 7x_2 = 0$. By letting $x_1 = 1$ we get that $x_2 = -\frac{1}{7}$. So an eigenvector of $A$ corresponding to the eigenvalue $\lambda_1 = 1$ is:
\begin{align} \quad v_1 = \begin{bmatrix} 1 \\ -\frac{1}{7} \end{bmatrix} \end{align}
In fact, for each $t \in \mathbb{R} \setminus \{ 0 \}$, the following is also an eigenvector of $A$ corresponding to the eigenvalue $\lambda_1 = 1$:
\begin{align} \quad v = t \begin{bmatrix} 1 \\ -\frac{1}{7} \end{bmatrix} = \begin{bmatrix} t \\ -\frac{t}{7} \end{bmatrix} \end{align}
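As a quick check, we can verify directly that $v_1$ is an eigenvector by computing $Av_1$ with the matrix $A$ from above and confirming that $Av_1 = 1 \cdot v_1$:
\begin{align} \quad A v_1 = \begin{bmatrix} 2 & 7 \\ -1 & -6 \end{bmatrix} \begin{bmatrix} 1 \\ -\frac{1}{7} \end{bmatrix} = \begin{bmatrix} 2 \cdot 1 + 7 \cdot \left ( -\frac{1}{7} \right ) \\ (-1) \cdot 1 + (-6) \cdot \left ( -\frac{1}{7} \right ) \end{bmatrix} = \begin{bmatrix} 1 \\ -\frac{1}{7} \end{bmatrix} = 1 \cdot v_1 \end{align}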