# Orthonormal Vectors Review

We will now review some of the recent content regarding orthonormal vectors.

- Recall from the Orthonormal Bases of Vector Spaces page that if $V$ is a finite-dimensional inner product space then an **Orthonormal Basis** of $V$ is a basis $\{ e_1, e_2, ..., e_n \}$ such that $<e_i, e_j> = 0$ for all $i, j = 1, 2, ..., n$ with $i \neq j$, and $<e_i, e_i> = 1$ for all $i = 1, 2, ..., n$ - the first condition giving us orthogonality of the vectors and the second condition giving us unit (normal) vectors.

- We then looked at a very important proposition regarding orthonormal vectors which says that if $V$ is a finite-dimensional vector space over $\mathbb{R}$ or $\mathbb{C}$ and $\{ e_1, e_2, ..., e_n \}$ is an orthonormal basis of $V$ then the norm squared of any vector $v = a_1e_1 + a_2e_2 + ... + a_ne_n$ in $V$ is given by:

\begin{align} \quad \| a_1e_1 + a_2e_2 + ... + a_ne_n \|^2 = \mid a_1 \mid^2 + \mid a_2 \mid^2 + ... + \mid a_n \mid^2 \end{align}

- We also obtained the nice property that if $\{ e_1, e_2, ..., e_n \}$ is an orthonormal set of vectors in $V$ then $\{ e_1, e_2, ..., e_n \}$ is a linearly independent set.

- Furthermore, we noted that if $V$ is an inner product space and $\{ e_1, e_2, ..., e_n \}$ is an orthonormal basis of $V$ then for every $v \in V$:

\begin{align} \quad v = <v, e_1>e_1 + <v, e_2>e_2 + ... + <v, e_n>e_n \end{align}
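The two formulas above can be checked numerically. The sketch below (in Python, assuming the standard dot product on $\mathbb{R}^3$; the helper `dot` and the particular rotated basis are illustrative choices, not part of the original text) computes the coefficients $a_i = <v, e_i>$, reconstructs $v$ from them, and confirms that $\| v \|^2 = \mid a_1 \mid^2 + \mid a_2 \mid^2 + \mid a_3 \mid^2$:

```python
# Check v = <v,e1>e1 + <v,e2>e2 + <v,e3>e3 and the norm formula
# ||v||^2 = |a1|^2 + |a2|^2 + |a3|^2 in R^3 with the standard dot product.
import math

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

# An orthonormal basis of R^3: two vectors rotated by 0.7 radians
# in the xy-plane, plus the standard third basis vector.
c, s = math.cos(0.7), math.sin(0.7)
basis = [(c, s, 0.0), (-s, c, 0.0), (0.0, 0.0, 1.0)]

v = (2.0, -1.0, 3.0)
coeffs = [dot(v, e) for e in basis]   # a_i = <v, e_i>

# Reconstruct v from its coefficients in the orthonormal basis.
recon = tuple(sum(a * e[k] for a, e in zip(coeffs, basis)) for k in range(3))

assert all(abs(recon[k] - v[k]) < 1e-12 for k in range(3))
assert abs(sum(a * a for a in coeffs) - dot(v, v)) < 1e-12   # norm formula
```
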

- We then looked at a very important process known as The Gram-Schmidt Process which allows us to take a linearly independent set of vectors $\{ v_1, v_2, ..., v_n \}$ in an inner product space $V$ and produce an orthonormal set of vectors $\{ e_1, e_2, ..., e_n \}$ from it where:

\begin{align} \quad e_1 = \frac{v_1}{\| v_1 \|} \: , \end{align}

\begin{align} \quad e_j = \frac{v_j - <v_j, e_1>e_1 - <v_j, e_2>e_2 - ... - <v_j, e_{j-1}> e_{j-1}}{\| v_j - <v_j, e_1>e_1 - <v_j, e_2>e_2 - ... - <v_j, e_{j-1}>e_{j-1} \| } \end{align}

for each $j = 2, 3, ..., n$.
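The process translates directly into code. Below is a minimal Gram-Schmidt sketch in Python, assuming $\mathbb{R}^n$ with the standard dot product (the helper names `dot` and `gram_schmidt` are illustrative):

```python
# Gram-Schmidt over R^n with the standard dot product: take linearly
# independent vectors v_1, ..., v_n and return orthonormal e_1, ..., e_n.
import math

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        # Subtract from v its components along the e_j built so far.
        w = list(v)
        for e in basis:
            a = dot(v, e)
            w = [wk - a * ek for wk, ek in zip(w, e)]
        # Normalize the remainder to get the next orthonormal vector.
        norm = math.sqrt(dot(w, w))
        basis.append([wk / norm for wk in w])
    return basis

e = gram_schmidt([(1.0, 1.0, 0.0), (1.0, 0.0, 1.0), (0.0, 1.0, 1.0)])

# The output is orthonormal: <e_i, e_j> = 0 for i != j, <e_i, e_i> = 1.
for i in range(3):
    for j in range(3):
        expected = 1.0 if i == j else 0.0
        assert abs(dot(e[i], e[j]) - expected) < 1e-12
```

Note that, as in the formula above, the projections subtracted are $<v_j, e_i>$ of the *original* vector $v_j$; since the $e_i$ are orthonormal, this agrees with subtracting projections of the running remainder.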

- As an important corollary to the Gram-Schmidt process we noted that if $V$ is a finite-dimensional inner product space then $V$ has an orthonormal basis.

- As another important corollary we had that if $V$ is a finite-dimensional inner product space then any orthonormal set of vectors $\{ e_1, e_2, ..., e_n \}$ can be extended to a basis of $V$ due to the linear independence of this set of vectors.

- On the Orthogonal Complements page we said that if $V$ is an inner product space and $U$ is a subset of $V$ ($U$ need not be a subspace of $V$) then the **Orthogonal Complement of $U$**, denoted $U^{\perp}$, is defined to be the set of vectors $v \in V$ such that $<u, v> = 0$ for all $u \in U$.

- Some nice properties of an orthogonal complement $U^{\perp}$ are that $U^{\perp}$ is a subspace of $V$, $\{ 0 \}^{\perp} = V$, $V^{\perp} = \{ 0 \}$, and if $U_1$ and $U_2$ are subsets of $V$ such that $U_1 \subseteq U_2$ then $U_1^{\perp} \supseteq U_2^{\perp}$.
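As a concrete illustration (a Python sketch assuming $\mathbb{R}^3$ with the standard dot product; the particular vectors are chosen for the example): for $U = \mathrm{span} \{ (1, 1, 0) \}$, the vectors $(1, -1, 0)$ and $(0, 0, 1)$ lie in $U^{\perp}$, and so does any linear combination of them, reflecting the fact that $U^{\perp}$ is a subspace:

```python
# U = span{(1, 1, 0)} in R^3: every element of U is a scalar multiple
# of u, so w is in U^perp exactly when <u, w> = 0.
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

u = (1.0, 1.0, 0.0)
w1, w2 = (1.0, -1.0, 0.0), (0.0, 0.0, 1.0)

assert dot(u, w1) == 0.0 and dot(u, w2) == 0.0

# Any linear combination of w1 and w2 is also orthogonal to u,
# illustrating that U^perp is closed under addition and scaling.
comb = tuple(2.0 * a - 3.0 * b for a, b in zip(w1, w2))
assert dot(u, comb) == 0.0
```
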

- If $U$ is actually a subspace of a finite-dimensional inner product space $V$ then we also saw that $V = U \oplus U^{\perp}$.

- We also saw that in this case (with $U$ a subspace of a finite-dimensional inner product space $V$) we have $(U^{\perp})^{\perp} = U$.

- On the Orthogonal Projection Operators page we saw that if $V = U \oplus U^{\perp}$, so that each $v \in V$ can be written uniquely as $v = u + w$ where $u \in U$ and $w \in U^{\perp}$, then the **Orthogonal Projection Operator** of $V$ onto $U$ is defined to be the linear map $P_U \in \mathcal L(V)$ given by $P_U(v) = u$.

- Some of the nice properties of the orthogonal projection operator of $V$ onto $U$ are that $\mathrm{range} (P_U) = U$, $\mathrm{null} (P_U) = U^{\perp}$, $(v - P_U(v)) \in U^{\perp}$, $P_U^2 = P_U$, and $\| P_U(v) \| \leq \| v \|$.
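These properties can be checked numerically. The sketch below (Python, assuming $\mathbb{R}^3$ with the standard dot product and using the expansion $P_U(v) = <v, e_1>e_1 + ... + <v, e_m>e_m$ for an orthonormal basis $\{ e_1, ..., e_m \}$ of $U$; the helper names are illustrative) verifies $P_U^2 = P_U$, $(v - P_U(v)) \in U^{\perp}$, and $\| P_U(v) \| \leq \| v \|$:

```python
# Orthogonal projection onto U = span of an orthonormal set in R^n:
# P_U(v) = <v, e_1>e_1 + ... + <v, e_m>e_m.
import math

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def project(v, onb):
    out = [0.0] * len(v)
    for e in onb:
        a = dot(v, e)
        out = [ok + a * ek for ok, ek in zip(out, e)]
    return out

U = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]   # orthonormal basis of the xy-plane
v = (3.0, -2.0, 5.0)
p = project(v, U)

# P_U^2 = P_U: projecting a second time changes nothing.
assert all(abs(a - b) < 1e-12 for a, b in zip(project(p, U), p))
# (v - P_U(v)) is orthogonal to every basis vector of U, hence in U^perp.
assert all(abs(dot([vi - pi for vi, pi in zip(v, p)], e)) < 1e-12 for e in U)
# ||P_U(v)|| <= ||v||.
assert math.sqrt(dot(p, p)) <= math.sqrt(dot(v, v))
```
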