Orthonormal Bases of Vector Spaces

Recall that if $V$ is a finite-dimensional inner product space with $\mathrm{dim} (V) = n$, then we can obtain a basis $\{ v_1, v_2, ..., v_n \}$ of $V$. For purposes we will see later, we often want a basis of $V$ whose vectors are pairwise orthogonal, that is, $<v_i, v_j> = 0$ for all $i, j = 1, 2, ..., n$ with $i \neq j$. At the same time, it is convenient if each basis vector has norm equal to $1$, that is, $\| v_j \| = 1$ for each $j = 1, 2, ..., n$. Such bases are very important, and we formally define them below.

Definition: Let $V$ be a finite-dimensional inner product space. A set of vectors $\{ e_1, e_2, ..., e_n \}$ is an Orthonormal Basis of $V$ if this set is a basis of $V$, $<e_i, e_j> = 0$ whenever $i \neq j$, and $<e_i, e_i> = 1$ for each $i = 1, 2, ..., n$.

Note that the condition $<e_i, e_i> = 1$ for each $i = 1, 2, ..., n$ is equivalent to saying $\| e_i \| = 1$ for each $i = 1, 2, ..., n$. This is because $\| e_i \| = 1$ if and only if $\| e_i \|^2 = 1$, and $\| e_i \|^2 = <e_i, e_i>$.

Consider the inner product space $\mathbb{R}^3$ with the dot product. It is not hard to see that the standard basis vectors $\{ (1, 0, 0), (0, 1, 0), (0, 0, 1) \}$ form an orthonormal set: they are mutually perpendicular (running along the $x$, $y$, and $z$ axes respectively) and each has a norm (or in this case, length) of $1$. In fact, the standard basis vectors $\{ (1, 0, 0, ..., 0), (0, 1, 0, ..., 0), ..., (0, 0, ..., 0, 1) \}$ form an orthonormal basis of the vector space $\mathbb{F}^n$ with the Euclidean inner product.
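
As a quick sanity check of the definition (a numerical sketch using NumPy, not part of the original text, assuming the dot product on $\mathbb{R}^3$), the following code verifies that the standard basis vectors are pairwise orthogonal and each have unit norm:

```python
import numpy as np

# Standard basis of R^3 under the dot product.
basis = [np.array([1.0, 0.0, 0.0]),
         np.array([0.0, 1.0, 0.0]),
         np.array([0.0, 0.0, 1.0])]

# Check that <e_i, e_j> = 0 for i != j and <e_i, e_i> = 1.
for i, e_i in enumerate(basis):
    for j, e_j in enumerate(basis):
        expected = 1.0 if i == j else 0.0
        assert np.isclose(np.dot(e_i, e_j), expected)

print("The standard basis of R^3 is orthonormal under the dot product.")
```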

We will now look at some important properties of orthonormal bases that outline their usefulness.

Proposition 1: Let $V$ be a finite-dimensional inner product space over the field $\mathbb{F}$ ($\mathbb{R}$ or $\mathbb{C}$) and let $\{ e_1, e_2, ..., e_n \}$ be an orthonormal basis of $V$. Then $\| a_1 e_1 + a_2e_2 + ... + a_ne_n \|^2 = \mid a_1 \mid^2 + \mid a_2 \mid^2 + ... + \mid a_n \mid^2$ for all $a_1, a_2, ..., a_n \in \mathbb{F}$.
  • Proof: We will first show that each basis vector is orthogonal to any linear combination of the others. Without loss of generality, consider the vectors $a_1e_1 + a_2e_2 + ... + a_{n-1}e_{n-1}$ and $e_n$. These vectors are orthogonal to each other since:
(1)
\begin{align} \quad <a_1e_1 + a_2e_2 + ... + a_{n-1}e_{n-1}, e_n> = a_1<e_1, e_n> + a_2<e_2, e_n> + ... + a_{n-1}<e_{n-1}, e_n> \end{align}
  • But $<e_1, e_n> = <e_2, e_n> = ... = <e_{n-1}, e_n> = 0$ since $\{ e_1, e_2, ..., e_n \}$ is an orthonormal set of vectors, so $a_1e_1 + a_2e_2 + ... + a_{n-1}e_{n-1}$ is orthogonal to $e_n$, and hence also to the scalar multiple $a_ne_n$. Now, by applying the Pythagorean theorem, we have that:
(2)
\begin{align} \quad \| (a_1e_1 + a_2e_2 + ... + a_{n-1}e_{n-1}) + (a_ne_n) \|^2 = \| a_1e_1 + a_2e_2 + ... + a_{n-1}e_{n-1} \|^2 + \| a_ne_n \|^2 = \| a_1e_1 + a_2e_2 + ... + a_{n-1}e_{n-1} \|^2 + \mid a_n \mid^2 \| e_n \|^2 \end{align}
  • But $\| e_n \|^2 = 1$ since the vectors $\{ e_1, e_2, ..., e_n \}$ are orthonormal. By applying the Pythagorean theorem repeatedly in this way, we obtain that $\| a_1 e_1 + a_2e_2 + ... + a_ne_n \|^2 = \mid a_1 \mid^2 + \mid a_2 \mid^2 + ... + \mid a_n \mid^2$. $\blacksquare$
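
To illustrate Proposition 1 numerically (a sketch under the assumption of $\mathbb{R}^4$ with the dot product; NumPy's QR factorization is used only as a convenient way to produce an orthonormal basis), the code below checks that the squared norm of a linear combination equals the sum of the squared coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# The columns of Q from a QR factorization of a random (almost surely full-rank)
# matrix form an orthonormal basis of R^n under the dot product.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
basis = [Q[:, j] for j in range(n)]

# Random coefficients a_1, ..., a_n and the linear combination a_1 e_1 + ... + a_n e_n.
a = rng.standard_normal(n)
v = sum(a[j] * basis[j] for j in range(n))

# Proposition 1: ||a_1 e_1 + ... + a_n e_n||^2 = |a_1|^2 + ... + |a_n|^2.
assert np.isclose(np.dot(v, v), np.sum(a ** 2))
print("The squared norm equals the sum of the squared coefficients.")
```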

Now the following proposition will tell us that if $\{ e_1, e_2, ..., e_n \}$ is any set of orthonormal vectors, then this set of vectors is linearly independent.

Proposition 2: Let $V$ be an inner product space. If $\{ e_1, e_2, ..., e_n \}$ is a set of orthonormal vectors, then the vectors in $\{ e_1, e_2, ..., e_n \}$ are linearly independent.
  • Proof: Let $\{ e_1, e_2, ..., e_n \}$ be a set of orthonormal vectors, and for $a_1, a_2, ..., a_n \in \mathbb{F}$, consider the following vector equation:
(3)
\begin{align} \quad a_1e_1 + a_2e_2 + ... + a_ne_n = 0 \end{align}
  • We want to show that the only way we can express $0$ as a linear combination of the vectors in $\{ e_1, e_2, ..., e_n \}$ is by having $a_1 = a_2 = ... = a_n = 0$. Taking the norm squared of both sides of the equation above and applying the identity from Proposition 1 (whose proof only uses the orthonormality of the set), we get that $\| a_1e_1 + a_2e_2 + ... + a_ne_n \|^2 = \mid a_1 \mid^2 + \mid a_2 \mid^2 + ... + \mid a_n \mid^2 = 0$. But then we must have that $a_1 = a_2 = ... = a_n = 0$. Therefore every set of orthonormal vectors is linearly independent. $\blacksquare$
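
As an illustration of Proposition 2 (again a numerical sketch, not part of the proof), the code below takes three orthonormal vectors in $\mathbb{R}^5$ and confirms that they are linearly independent by checking that the matrix having them as columns has full column rank:

```python
import numpy as np

rng = np.random.default_rng(1)

# Three orthonormal vectors in R^5, obtained as the columns of Q
# from a reduced QR factorization of a random 5 x 3 matrix.
Q, _ = np.linalg.qr(rng.standard_normal((5, 3)))

# Orthonormality: Q^T Q should be the 3 x 3 identity matrix.
assert np.allclose(Q.T @ Q, np.eye(3))

# Linear independence: the columns of Q have full column rank,
# so Q a = 0 forces a = 0.
assert np.linalg.matrix_rank(Q) == 3
print("The orthonormal set of 3 vectors in R^5 is linearly independent.")
```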

Now let's continue to look at orthonormal bases of finite-dimensional inner product spaces. In general, if $\{ v_1, v_2, ..., v_n \}$ is any basis of a finite-dimensional inner product space $V$ over the field $\mathbb{F}$, then for any $v \in V$ we have that $v = a_1v_1 + a_2v_2 + ... + a_nv_n$ for some scalars $a_1, a_2, ..., a_n \in \mathbb{F}$. Finding these coefficients usually requires solving a system of linear equations. If we instead find an orthonormal basis $\{ e_1, e_2, ..., e_n \}$ of $V$, then the problem of finding the coefficients which allow us to write each vector $v \in V$ as a linear combination of the basis vectors becomes very simple, as outlined in the following theorem.

Theorem 1: Let $V$ be a finite-dimensional inner product space, and let $\{ e_1, e_2, ..., e_n \}$ be an orthonormal basis of $V$. Then for each vector $v \in V$ we have that $v = <v, e_1> e_1 + <v, e_2>e_2 + ... + <v, e_n> e_n$.
  • Proof: Since $\{ e_1, e_2, ..., e_n \}$ is a basis of $V$, for each vector $v \in V$ there exist scalars $a_1, a_2, ..., a_n \in \mathbb{F}$ such that:
(4)
\begin{align} \quad v = a_1e_1 + a_2e_2 + ... + a_ne_n \end{align}
  • We want to prove that $a_j = <v, e_j>$ for each $j = 1, 2, ..., n$. Taking the inner product of both sides with $e_j$, we get that:
(5)
\begin{align} \quad <v, e_j> &= <a_1e_1 + a_2e_2 + ... + a_ne_n, e_j> = <a_1e_1, e_j> + <a_2e_2, e_j> + ... + <a_je_j, e_j> + ... + <a_ne_n, e_j> \\ &= a_1 <e_1, e_j> + a_2<e_2, e_j> + ... + a_j<e_j, e_j> + ... + a_n<e_n, e_j> \end{align}
  • But $<e_1, e_j>$, $<e_2, e_j>$, …, $<e_{j-1}, e_j>$, $<e_{j+1}, e_j>$, …, $<e_n, e_j>$ are all equal to zero since $\{e_1, e_2, ..., e_n \}$ is an orthonormal set of vectors. Furthermore, $<e_j, e_j> = 1$, and so from above we see that $a_j = <v, e_j>$ as desired. So for any vector $v \in V$ we have that:
(6)
\begin{align} \quad v = <v, e_1> e_1 + <v, e_2>e_2 + ... + <v, e_n> e_n \quad \blacksquare \end{align}
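
To see Theorem 1 in action (a numerical sketch assuming $\mathbb{R}^4$ with the dot product), the code below expands an arbitrary vector $v$ in an orthonormal basis using the coefficients $<v, e_j>$ and checks that the expansion reproduces $v$:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# An orthonormal basis of R^4 (columns of Q) and an arbitrary vector v.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
basis = [Q[:, j] for j in range(n)]
v = rng.standard_normal(n)

# Theorem 1: v = <v, e_1> e_1 + <v, e_2> e_2 + ... + <v, e_n> e_n.
reconstruction = sum(np.dot(v, e_j) * e_j for e_j in basis)
assert np.allclose(v, reconstruction)
print("v equals the sum of <v, e_j> e_j over the orthonormal basis.")
```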