# Vector Norms

Before we continue our discussion of the error in a computed solution $\hat{x}$ relative to the actual solution $x$ of a system of equations $Ax = b$ (where $A$ is an $n \times n$ matrix), we first need to define the norm of a vector and look at some important norms. We will only be looking at norms on $\mathbb{R}^n$, that is, the set of $n$-component vectors with real components, or equivalently, the set of $n \times 1$ matrices whose entries are real.

Definition: A Vector Norm on $\mathbb{R}^n$ is a function that maps each vector $x \in \mathbb{R}^n$ to a number $\| x \| \in \mathbb{R}$ that has the following properties: 1) $\| x \| ≥ 0$ for all $x \in \mathbb{R}^n$ and $\| x \| = 0$ if and only if $x = 0$ (Positivity and Definiteness Property). 2) $\| \alpha x \| = \mid \alpha \mid \| x \|$ for all $\alpha \in \mathbb{R}$ and for all $x \in \mathbb{R}^n$ (Absolute Homogeneity Property). 3) $\| x + y \| ≤ \| x \| + \| y \|$ for all $x, y \in \mathbb{R}^n$ (The Triangle Inequality for Vector Norms).

There are many different types of vector norms that can be defined on $\mathbb{R}^n$. For now we will only be interested in the following three. Let $x = (x_1, x_2, ..., x_n) \in \mathbb{R}^n$. Then:

(1)
\begin{align} \quad \| x \|_1 = \sum_{i=1}^{n} \mid x_i \mid \end{align}
(2)
\begin{align} \quad \| x \|_2 = \sqrt{\sum_{i=1}^n x_i^2} \end{align}
(3)
\begin{align} \quad \| x \|_{\infty} = \max_{1≤i≤n} \mid x_i \mid \end{align}
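The three norms above translate directly into code. The following is a minimal sketch in plain Python; the sample vector $x = (1, -2, 3)$ is an arbitrary illustration, not taken from the text.

```python
def norm_1(x):
    # ||x||_1: sum of the absolute values of the components
    return sum(abs(xi) for xi in x)

def norm_2(x):
    # ||x||_2: square root of the sum of the squared components
    return sum(xi ** 2 for xi in x) ** 0.5

def norm_inf(x):
    # ||x||_inf: largest absolute value among the components
    return max(abs(xi) for xi in x)

x = [1.0, -2.0, 3.0]
print(norm_1(x))    # 6.0
print(norm_inf(x))  # 3.0
print(norm_2(x))    # sqrt(1 + 4 + 9) = sqrt(14)
```

Note that for any $x$ we have $\| x \|_{\infty} ≤ \| x \|_2 ≤ \| x \|_1$, which the sample values above illustrate.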

We will verify that $\| x \|_1$ is indeed a norm on $\mathbb{R}^n$ by verifying the three properties that define a norm. The reader should verify that the functions $\| x \|_2$ and $\| x \|_{\infty}$ are also norms on $\mathbb{R}^n$.

Let $x = (x_1, x_2, ..., x_n) \in \mathbb{R}^n$. Then we have that:

(4)
\begin{align} \quad \| x \|_1 = \sum_{i=1}^{n} \mid x_i \mid = \mid x_1 \mid + \mid x_2 \mid + ... + \mid x_n \mid \end{align}

But each $\mid x_j \mid ≥ 0$ for $j = 1, 2, ..., n$ since $x_1, x_2, ..., x_n \in \mathbb{R}$, and thus $\| x \|_1 ≥ 0$.

Furthermore, if $\| x \|_1 = 0$ then we must have that $\mid x_j \mid = 0$ for $j = 1, 2, ..., n$ which implies that $x_j = 0$ for $j = 1, 2, ..., n$, so $x = (0, 0, ..., 0)$. Conversely, if $x = (0, 0, ..., 0)$ then $\mid x_j \mid = 0$ for $j = 1, 2, ..., n$ so $0 = \sum_{i=1}^{n} \mid x_i \mid = \| x \|_1$. Thus the positivity and definiteness properties hold.

Now let $\alpha \in \mathbb{R}$. Then we have that:

(5)
\begin{align} \quad \| \alpha x \|_1 = \sum_{i=1}^{n} \mid \alpha x_i \mid = \sum_{i=1}^{n} \mid \alpha \mid \mid x_i \mid = \mid \alpha \mid \sum_{i=1}^{n} \mid x_i \mid = \mid \alpha \mid \| x \|_1 \end{align}

Thus property 2 holds. Lastly, let $x, y \in \mathbb{R}^n$ with $x = (x_1, x_2, ..., x_n)$ and $y = (y_1, y_2, ..., y_n)$. By applying the triangle inequality for the absolute value of two real numbers, we have that:

(6)
\begin{align} \quad \| x + y \|_1 = \sum_{i=1}^{n} \mid x_i + y_i \mid ≤ \sum_{i=1}^{n} \left ( \mid x_i \mid + \mid y_i \mid \right ) = \sum_{i=1}^{n} \mid x_i \mid + \sum_{i=1}^{n} \mid y_i \mid = \| x \|_1 + \| y \|_1 \end{align}

Thus property 3 holds, and so we have verified that $\| x \|_1$ is a norm on $\mathbb{R}^n$.
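The three properties just verified can also be checked numerically. The following sketch tests positivity, absolute homogeneity, and the triangle inequality for $\| \cdot \|_1$ on sample vectors; the vectors and the scalar $\alpha$ are arbitrary choices for illustration.

```python
def norm_1(x):
    # ||x||_1: sum of the absolute values of the components
    return sum(abs(xi) for xi in x)

x = [1.0, -2.0, 3.0]
y = [-4.0, 0.5, 2.0]
alpha = -3.0

# Property 1: positivity and definiteness
assert norm_1(x) >= 0
assert norm_1([0.0, 0.0, 0.0]) == 0.0

# Property 2: ||alpha x||_1 = |alpha| ||x||_1
assert norm_1([alpha * xi for xi in x]) == abs(alpha) * norm_1(x)

# Property 3: the triangle inequality ||x + y||_1 <= ||x||_1 + ||y||_1
assert norm_1([xi + yi for xi, yi in zip(x, y)]) <= norm_1(x) + norm_1(y)
```

Of course, a finite set of test vectors cannot prove the properties; the proof above is what establishes them for all of $\mathbb{R}^n$.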