Invariant Subspaces in Finite-Dimensional Real Vector Spaces

Recall from The Existence of an Eigenvalue on Finite-Dimensional Complex Vector Spaces page that if $V$ is a finite-dimensional nonzero complex vector space, then every operator $T \in \mathcal L (V)$ has an eigenvalue $\lambda$. Thus the subspace $U = \mathrm{span} (u)$, where $u$ is an eigenvector of $T$ corresponding to $\lambda$, is a one-dimensional subspace of $V$ that is invariant under $T$. We can see this since if $v \in U$ then for some $a \in \mathbb{C}$ we have that $v = au$, and so:

(1)
\begin{align} \quad T(v) = T(au) = aT(u) = a\lambda u = (a \lambda) u \end{align}

Thus $T(v) \in U$. So every finite-dimensional nonzero complex vector space has an invariant subspace of dimension $1$. If instead $V$ is a finite-dimensional real vector space, then $T \in \mathcal L (V)$ may not have any eigenvalues, and hence no one-dimensional invariant subspaces. For example, the operator on $\mathbb{R}^2$ that rotates the plane by $90^{\circ}$ sends every nonzero vector to a vector perpendicular to it, so no nonzero vector is mapped to a scalar multiple of itself. The following theorem guarantees, however, that every operator on a finite-dimensional nonzero real vector space has an invariant subspace of dimension $1$ or $2$.
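The rotation example above can be checked numerically. Here is a minimal NumPy sketch (the matrix and tolerance are illustrative choices, not from the original text): the $90^{\circ}$ rotation matrix has eigenvalues $\pm i$, both non-real, so it has no eigenvalues as an operator on the real vector space $\mathbb{R}^2$.

```python
import numpy as np

# Rotation of the plane by 90 degrees: T(x, y) = (-y, x).
T = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# Over C this matrix has eigenvalues +i and -i; since neither is real,
# T has no eigenvalues as an operator on the real vector space R^2,
# and hence no one-dimensional invariant subspace.
eigenvalues = np.linalg.eigvals(T)
print(eigenvalues)                               # [0.+1.j  0.-1.j]
print(np.all(np.abs(eigenvalues.imag) > 1e-12))  # True: both eigenvalues are non-real
```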

Theorem 1: If $V$ is a finite-dimensional nonzero real vector space and $T \in \mathcal L (V)$ then there exists a subspace $U$ of $V$ that is invariant under $T$ with $\mathrm{dim} (U) = 1$ or $\mathrm{dim} (U) = 2$.
  • Proof: Let $V$ be a finite-dimensional nonzero real vector space with $\mathrm{dim} (V) = n$. Let $T \in \mathcal L (V)$, and choose a vector $v \in V$ such that $v \neq 0$. Consider the set of vectors $\{ v, T(v), T^2(v), ..., T^n(v) \}$. This set of $n + 1$ vectors cannot be linearly independent since $\mathrm{dim} (V) = n$, and so there exist scalars $a_0, a_1, ..., a_n \in \mathbb{R}$ that are not all zero such that:
(2)
\begin{align} \quad 0 = a_0v + a_1T(v) + a_2T^2(v) + ... + a_nT^n(v) \\ \quad 0 = (a_0I + a_1T + a_2T^2 + ... + a_nT^n)(v) \end{align}
  • Note that $a_1, a_2, ..., a_n$ cannot all be zero, for then the equation above would reduce to $a_0 v = 0$ with $v \neq 0$, forcing $a_0 = 0$ as well. Thus the polynomial $a_0 + a_1x + ... + a_nx^n$ is nonconstant, and we can factor it over $\mathbb{R}$ as a product of linear factors and irreducible quadratic factors. That is, for some $\lambda_1, \lambda_2, ..., \lambda_m, \alpha_1, \alpha_2, ..., \alpha_M, \beta_1, \beta_2, ..., \beta_M \in \mathbb{R}$ (with $\alpha_i^2 < 4\beta_i$ for each $i$) and some nonzero $c \in \mathbb{R}$, where $m + M \geq 1$, we have that:
(3)
\begin{align} \quad 0 = c\underbrace{(T - \lambda_1I)(T - \lambda_2I)...(T - \lambda_mI)}_{\mathrm{Linear \: Factors}} \underbrace{(T^2 + \alpha_1T + \beta_1I)(T^2 + \alpha_2T + \beta_2I)...(T^2 + \alpha_MT + \beta_MI)}_{\mathrm{Irreducible \: Quadratic \: Factors}}(v) \end{align}
  • Since this product of operators sends the nonzero vector $v$ to $0$, the product is not injective, and since a composition of injective maps is injective, at least one of the factors must fail to be injective. Thus either $T - \lambda_jI$ is not injective for some $j \in \{ 1, 2, ..., m \}$, or $T^2 + \alpha_iT + \beta_iI$ is not injective for some $i \in \{ 1, 2, ..., M \}$. We will look at these two cases separately.
  • Case 1: Suppose that $T - \lambda_jI$ is not injective for some $j \in \{ 1, 2, ..., m \}$. Then there exists a nonzero vector $u \in V$ such that $(T - \lambda_jI)(u) = 0$, that is, $T(u) = \lambda_j u$, and so $\lambda_j$ is an eigenvalue of $T$. As we've seen before, if we let $U = \mathrm{span} (u)$, then $U$ is a one-dimensional subspace of $V$ that is invariant under $T$. To reverify this, let $w \in U$. Then for some $a \in \mathbb{R}$ we have that $w = au$, and so $T(w) = T(au) = aT(u) = (a \lambda_j) u$, and so $T(w) \in U$. Thus there exists a one-dimensional subspace $U$ of $V$ that is invariant under $T$.
  • Case 2: Suppose that $T^2 + \alpha_iT + \beta_iI$ is not injective for some $i \in \{ 1, 2, ..., M \}$. Then there exists a nonzero vector $u \in V$ such that $(T^2 + \alpha_iT + \beta_iI)(u) = 0$. Let $U = \mathrm{span} (u, T(u))$. We will show that this subspace $U$ of $V$ is invariant under $T$. Let $w \in U$. Then for some $a, b \in \mathbb{R}$ we have that $w = au + bT(u)$, and so applying the operator $T$ to $w$ we have that:
(4)
\begin{align} \quad T(w) = T(au + bT(u)) = aT(u) + bT(T(u)) = aT(u) + bT^2(u) \end{align}
  • Now note that $(T^2 + \alpha_iT + \beta_iI)(u) = 0$ implies that $T^2(u) + \alpha_iT(u) + \beta_iu = 0$, and so rearranging these terms we get that $T^2(u) = -\alpha_iT(u) - \beta_iu$. Plugging this into the equation above, we get that:
(5)
\begin{align} \quad T(w) = aT(u) + bT^2(u) \\ \quad T(w) = aT(u) + b[-\alpha_iT(u) - \beta_iu] \\ \quad T(w) = (a - b\alpha_i)T(u) - b\beta_iu \end{align}
  • From above, we see that $T(w) \in \mathrm{span} (u, T(u))$, and so $U = \mathrm{span} (u, T(u))$ is a subspace of $V$ of dimension $1$ or $2$ that is invariant under $T$. $\blacksquare$ A numerical sketch of this construction is given below.
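To make the proof concrete, here is a minimal NumPy sketch of the construction (the operator, seed, and tolerances are illustrative assumptions, not from the original text). It builds the dependent list $\{ v, T(v), ..., T^n(v) \}$, extracts a polynomial annihilating $v$, reads off an irreducible quadratic factor $x^2 + \alpha x + \beta$ from a complex-conjugate pair of roots, finds a nonzero $u$ with $(T^2 + \alpha T + \beta I)(u) = 0$, and verifies that $\mathrm{span}(u, T(u))$ is invariant under $T$.

```python
import numpy as np

rng = np.random.default_rng(0)

def rot(theta):
    # 2x2 rotation by theta; its characteristic polynomial
    # x^2 - 2cos(theta)x + 1 is irreducible over R.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# An operator on R^4 made of two rotation blocks: every eigenvalue is
# non-real, so T has no one-dimensional invariant subspace.
T = np.zeros((4, 4))
T[:2, :2] = rot(0.7)
T[2:, 2:] = rot(1.9)
n = T.shape[0]

v = rng.standard_normal(n)

# The n + 1 vectors v, T(v), ..., T^n(v) must be linearly dependent.
K = np.column_stack([np.linalg.matrix_power(T, k) @ v for k in range(n + 1)])
a = np.linalg.svd(K)[2][-1]   # null-space coefficients: K @ a = 0
print(np.allclose(K @ a, 0))  # True: a_0 v + a_1 T(v) + ... + a_n T^n(v) = 0

# Factor p(x) = a_0 + a_1 x + ... + a_n x^n. A complex-conjugate root
# pair r, conj(r) gives the irreducible quadratic x^2 + alpha x + beta
# with alpha = -2 Re(r) and beta = |r|^2. (In general one should test
# each factor for a nontrivial kernel; here every quadratic factor of p
# comes from one of the two rotation blocks, so any pair works.)
roots = np.roots(a[::-1])     # np.roots expects highest power first
r = next(z for z in roots if abs(z.imag) > 1e-8)
alpha, beta = -2.0 * r.real, abs(r) ** 2

# A nonzero u in the kernel of T^2 + alpha T + beta I.
Q = T @ T + alpha * T + beta * np.eye(n)
u = np.linalg.svd(Q)[2][-1]
print(np.allclose(Q @ u, 0))  # True: (T^2 + alpha T + beta I)(u) = 0

# Verify that U = span(u, T(u)) is invariant: T(u) and T^2(u) lie in U.
U = np.column_stack([u, T @ u])
for w in (T @ u, T @ T @ u):
    coeffs = np.linalg.lstsq(U, w, rcond=None)[0]
    print(np.allclose(U @ coeffs, w))  # True: w is a combination of u and T(u)
```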