# Bases for a Vector Space

Definition: Let $E$ be a vector space. A Base (or Basis) for $E$ is a subset $A \subseteq E$ that spans $E$ and that is linearly independent.

Perhaps the simplest example of a base is the standard basis in $\mathbb{R}^n$, which is the collection of $n$ vectors:

\begin{align} \quad \{ (1, 0, 0, ..., 0), (0, 1, 0, ..., 0), ..., (0, 0, 0, ..., 1) \} \end{align}
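For example, in $\mathbb{R}^3$ every vector $(a, b, c)$ can be written in exactly one way as a linear combination of the standard basis vectors, which shows both that the standard basis spans $\mathbb{R}^3$ and that the coefficients in such a combination are uniquely determined:

\begin{align} \quad (a, b, c) = a(1, 0, 0) + b(0, 1, 0) + c(0, 0, 1) \end{align}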

Definition: A vector space $E$ is Finite-Dimensional if there exists an $n \in \mathbb{N}$ such that $E$ has a base with exactly $n$ vectors. If no such $n \in \mathbb{N}$ exists, then $E$ is Infinite-Dimensional.
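For example, $\mathbb{R}^n$ is finite-dimensional since the standard basis above contains exactly $n$ vectors. On the other hand, the vector space of all polynomials with real coefficients is infinite-dimensional: the set

\begin{align} \quad \{ 1, x, x^2, x^3, ... \} \end{align}

is a base for this space, and no finite subset of polynomials can span it, since the span of finitely many polynomials contains only polynomials of bounded degree.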

The following results are the key steps toward concluding that every vector space has a base.

Lemma 1: Let $E$ be a vector space. Then every maximal linearly independent subset of $E$ is a base for $E$.

**Proof:** Let $A$ be a maximal linearly independent subset of $E$. In other words, assume that $A$ is linearly independent and that for all $x \in E \setminus A$, $A \cup \{ x \}$ is linearly dependent.

- We want to show that $A$ spans $E$. So let $x \in E$. Then either $x \in A$ or $x \in E \setminus A$. If $x \in A$ then $x \in \mathrm{span}(A)$. If $x \in E \setminus A$, then from above, $A \cup \{ x \}$ is linearly dependent. So for some $n \in \mathbb{N}$ and for some $x_1, x_2, ..., x_n \in A$, the equation:

\begin{align} \quad \lambda x + \lambda_1 x_1 + \lambda_2 x_2 + ... + \lambda_n x_n = o \end{align}

- has a solution in $\lambda, \lambda_1, ..., \lambda_n$ for which not all of $\lambda, \lambda_1, ..., \lambda_n$ are zero. Note that any such solution must have $\lambda \neq 0$: if $\lambda = 0$, then the equation reduces to $\lambda_1 x_1 + \lambda_2 x_2 + ... + \lambda_n x_n = o$, and the linear independence of $A$ would force $\lambda_1 = \lambda_2 = ... = \lambda_n = 0$ as well, contradicting that not all of the coefficients are zero. Since $\lambda \neq 0$, the equation above can be rewritten as:

\begin{align} \quad x = \left ( -\frac{\lambda_1}{\lambda} \right ) x_1 + \left ( - \frac{\lambda_2}{\lambda} \right ) x_2 + ... + \left ( - \frac{\lambda_n}{\lambda} \right ) x_n \end{align}

- Hence $x \in \mathrm{span}(A)$, so $A$ spans $E$ and $A$ is a base for $E$. $\blacksquare$
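To illustrate Lemma 1, consider $A = \{ (1, 0), (0, 1) \}$ in $\mathbb{R}^2$. $A$ is linearly independent, and for any $x = (a, b) \in \mathbb{R}^2 \setminus A$ the set $A \cup \{ x \}$ is linearly dependent, since

\begin{align} \quad (-1)x + a(1, 0) + b(0, 1) = o \end{align}

Solving for $x$ exactly as in the proof gives $x = a(1, 0) + b(0, 1)$, so $x \in \mathrm{span}(A)$, and $A$ is a base for $\mathbb{R}^2$.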

Theorem 2: Let $E$ be a vector space. Then every minimal spanning subset of $E$ is a base for $E$.

**Proof:** Let $A$ be a minimal spanning subset of $E$. In other words, assume that $A$ spans $E$ and that for every proper subset $F \subsetneq A$ we have $\mathrm{span}(F) \neq E$.

- We want to show that $A$ is linearly independent. Suppose instead it were linearly dependent. Then there exists $n \in \mathbb{N}$ and $x_1, x_2, ..., x_n \in A$ such that the equation:

\begin{align} \quad \lambda_1 x_1 + \lambda_2 x_2 + ... + \lambda_n x_n = o \end{align}

- has a solution in $\lambda_1, \lambda_2, ..., \lambda_n$ for which not all of $\lambda_1, \lambda_2, ..., \lambda_n$ are zero. Without loss of generality, assume that $\lambda_1 \neq 0$. Then the equation above can be rewritten as:

\begin{align} \quad x_1 = \left ( -\frac{\lambda_2}{\lambda_1} \right ) x_2 + ... + \left ( - \frac{\lambda_n}{\lambda_1} \right ) x_n \end{align}

- Thus $x_1 \in \mathrm{span} (A \setminus \{ x_1 \})$. But then $\mathrm{span}(A \setminus \{ x_1 \}) = \mathrm{span}(A) = E$, so $A \setminus \{ x_1 \}$ is a proper subset of $A$ that still spans $E$, contradicting the assumption that $A$ is a minimal spanning set. Thus $A$ must be linearly independent, and so $A$ is a base for $E$. $\blacksquare$
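To illustrate Theorem 2, consider $A = \{ (1, 0), (0, 1), (1, 1) \}$ in $\mathbb{R}^2$. $A$ spans $\mathbb{R}^2$ but is not a minimal spanning set, since

\begin{align} \quad (1, 1) = (1, 0) + (0, 1) \end{align}

shows that the proper subset $A \setminus \{ (1, 1) \} = \{ (1, 0), (0, 1) \}$ still spans $\mathbb{R}^2$. This smaller set is a minimal spanning set, and as Theorem 2 guarantees, it is linearly independent and hence a base for $\mathbb{R}^2$.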