Matrix Arithmetic

This page is intended to be a part of the Numerical Analysis section of Math Online. Similar topics can also be found in the Linear Algebra section of the site.

Matrix Arithmetic

We are about to look at various numerical methods for solving systems of linear equations. In doing so, we will be talking a lot about matrices, since they can be used to represent such systems. Thus, on this page we will look at what exactly a matrix is, along with some important arithmetic operations and properties of matrices. We begin by defining a matrix.

Definition: A Matrix is a rectangular array of numbers of the form $\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & \cdots & a_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix}$. The Entries are the values $a_{ij}$ for $i = 1, 2, ..., m$ and $j = 1, 2, ..., n$. The Size of the matrix is written as $m \times n$, where $m$ represents the number of rows and $n$ represents the number of columns of the matrix. A matrix is said to be Square if it has the same number of rows and columns, that is, $m = n$.

In some cases it is neater to write $a_{i,j}$ (separating the row number from the column number with a comma) instead of $a_{ij}$ when denoting the entries of a matrix.

Oftentimes we use capital letters to denote matrices. For example, $A = \begin{bmatrix} 1 & -2 & 8 \\ 0 & 5 & 2 \end{bmatrix}$ is a $2 \times 3$ matrix with real number entries. Another example is $B = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$, which is a $2 \times 2$ (square) matrix whose entries are $a, b, c, d$, whatever they may represent (usually numbers). Furthermore, it should not be difficult to see that a matrix of size $m \times n$ has $mn$ entries.

Various arithmetic operations can be defined on matrices. For example, if $A$ and $B$ are two matrices that have the same size, $m \times n$, then we can define the sum $A + B$ to be the $m \times n$ matrix whose entries are the sums of the corresponding entries from $A$ and $B$, that is:

(1)
\begin{align} \quad A + B = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & \cdots & a_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix} + \begin{bmatrix} b_{11} & b_{12} & \cdots & b_{1n}\\ b_{21} & b_{22} & \cdots & b_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ b_{m1} & b_{m2} & \cdots & b_{mn} \end{bmatrix} = \begin{bmatrix} a_{11} + b_{11} & a_{12} + b_{12} & \cdots & a_{1n} + b_{1n} \\ a_{21} + b_{21} & a_{22} + b_{22} & \cdots & a_{2n} + b_{2n} \\ \vdots & \vdots & \ddots & \vdots\\ a_{m1} + b_{m1} & a_{m2} + b_{m2} & \cdots & a_{mn} + b_{mn} \end{bmatrix} \end{align}
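
For example, $\begin{bmatrix} 1 & -2 \\ 0 & 5 \end{bmatrix} + \begin{bmatrix} 3 & 1 \\ 2 & -4 \end{bmatrix} = \begin{bmatrix} 1 + 3 & -2 + 1 \\ 0 + 2 & 5 + (-4) \end{bmatrix} = \begin{bmatrix} 4 & -1 \\ 2 & 1 \end{bmatrix}$.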

Subtraction of two matrices $A$ and $B$ of the same size $m \times n$ is defined analogously. If $A$ and $B$ do not have the same size, then we say that the sum/difference of $A$ and $B$ is undefined.

Now if $k$ is a scalar, then we can define the product of $k$ with the $m \times n$ matrix $A$ to be the $m \times n$ matrix $kA$ whose entries are obtained by multiplying each corresponding entry of $A$ by $k$, that is:

(2)
\begin{align} \quad kA = \begin{bmatrix} ka_{11} & ka_{12} & \cdots & ka_{1n}\\ ka_{21} & ka_{22} & \cdots & ka_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ ka_{m1} & ka_{m2} & \cdots & ka_{mn} \end{bmatrix} \end{align}
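
For example, $3 \begin{bmatrix} 1 & -2 \\ 0 & 5 \end{bmatrix} = \begin{bmatrix} 3 & -6 \\ 0 & 15 \end{bmatrix}$.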

Defining the multiplication of one matrix by another is more complex. The obvious entrywise definition of matrix multiplication turns out to be of little mathematical importance. Instead, we say that if $A$ is an $m \times n$ matrix and $B$ is an $n \times r$ matrix, then the product of $A$ and $B$, denoted $AB = C$, is the resulting $m \times r$ matrix whose entries for $i = 1, 2, ..., m$ and $j = 1, 2, ..., r$ can be computed as:

(3)
\begin{align} \quad c_{ij} = a_{i1}b_{1j} + a_{i2}b_{2j} + ... + a_{in}b_{nj} = \sum_{k=1}^{n} a_{ik}b_{kj} \end{align}

The size of the matrix product $AB$ is thus the number of rows of $A$ by the number of columns of $B$. Furthermore, if $A$ is an $m \times n$ matrix and $B$ is a $p \times r$ matrix where $n \neq p$, that is, the number of columns of $A$ does not equal the number of rows of $B$, then the matrix product $AB$ is said to be undefined. It is also important to note that $AB$ need not equal $BA$. In fact, $AB$ may be defined while $BA$ is not, such as when $A$ is $m \times n$ and $B$ is $n \times r$ with $r \neq m$.
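
For example, if $A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}$ and $B = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$, then the entry $c_{11}$ of $AB$ is $a_{11}b_{11} + a_{12}b_{21} = (1)(0) + (2)(1) = 2$, and computing the remaining entries gives $AB = \begin{bmatrix} 2 & 1 \\ 4 & 3 \end{bmatrix}$, whereas $BA = \begin{bmatrix} 3 & 4 \\ 1 & 2 \end{bmatrix}$, so even when both products are defined they need not be equal.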

We will now look at some important types of matrices and related definitions.

Definition: If $A$ is an $m \times n$ matrix, then the Transpose of $A$ denoted $A^T$ is the $n \times m$ matrix that is obtained by interchanging the rows of $A$ with the columns of $A$.

In general, the transpose of an $m \times n$ matrix $A$ has the form:

(4)
\begin{align} \quad A^T = \begin{bmatrix} a_{11} & a_{21} & \cdots & a_{m1}\\ a_{12} & a_{22} & \cdots & a_{m2}\\ \vdots & \vdots & \ddots & \vdots\\ a_{1n} & a_{2n} & \cdots & a_{mn} \end{bmatrix} \end{align}
Definition: If $A$ is a square $n \times n$ matrix, then the Main Diagonal of $A$ consists of the entries $a_{ii}$ for $i = 1, 2, ..., n$.

Simply put, the entries on the main diagonal of a square matrix are those whose row and column numbers are equal. Returning to the transpose for a concrete example, if $A = \begin{bmatrix} 0 & 3 & 2 \\ 1 & -1 & 2 \end{bmatrix}$ then $A^T = \begin{bmatrix} 0 & 1 \\ 3 & -1 \\ 2 & 2 \end{bmatrix}$. Furthermore, if $A$ is a square matrix, then the entries along the main diagonal of $A$ are the same as the entries along the main diagonal of $A^T$, as you should verify.

Definition: If $A$ is a square $n \times n$ matrix, then the Trace of $A$ denoted $\mathrm{tr}(A)$ is the sum of the elements on the main diagonal of $A$.

From the definition above, we see that the trace of an $n \times n$ matrix $A$ is simply:

(5)
\begin{align} \quad \mathrm{tr} (A) = \sum_{i=1}^{n} a_{ii} \end{align}
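
For example, $\mathrm{tr} \left ( \begin{bmatrix} 2 & 7 \\ 1 & -3 \end{bmatrix} \right ) = 2 + (-3) = -1$.
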
Definition: The $n \times n$ Identity Matrix denoted $I_n$ is the square matrix whose entries are zero everywhere except on the main diagonal where they are ones.

For example, $I_2 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$ is the $2 \times 2$ identity matrix.

Definition: If $A$ is a square $n \times n$ matrix, then $A$ is said to be Invertible if there exists an $n \times n$ matrix $B$ such that $AB = I_n = BA$. In such a case, we say that $B = A^{-1}$ is the Inverse of $A$.
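
For example, if $A = \begin{bmatrix} 2 & 1 \\ 1 & 1 \end{bmatrix}$ and $B = \begin{bmatrix} 1 & -1 \\ -1 & 2 \end{bmatrix}$, then one can verify directly that $AB = I_2 = BA$, so $A$ is invertible with $A^{-1} = \begin{bmatrix} 1 & -1 \\ -1 & 2 \end{bmatrix}$.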

It is important to note that not all square matrices are invertible. Furthermore, when the inverse of a matrix exists, it is unique; this can easily be proven by assuming that two inverses exist and then showing that they are equal.
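
Since this page leads into numerical methods, it may also help to see the operations above carried out on a computer. The following is a minimal sketch assuming NumPy is available; the particular matrices in it are illustrative only and are not tied to the examples above.

```python
# A minimal sketch of the matrix operations discussed above, assuming NumPy;
# the matrices here are arbitrary illustrative examples.
import numpy as np

A = np.array([[1, -2, 8],
              [0,  5, 2]])      # a 2 x 3 matrix
B = np.array([[ 3, 0],
              [ 1, 4],
              [-2, 1]])         # a 3 x 2 matrix

print(A + A)                    # matrix addition (sizes must match)
print(3 * A)                    # scalar multiplication
C = A @ B                       # matrix product: (2 x 3)(3 x 2) gives a 2 x 2 matrix
print(C)
print(A.T)                      # transpose: a 3 x 2 matrix
print(np.trace(C))              # trace of the square matrix C
print(np.eye(2))                # the 2 x 2 identity matrix I_2
print(np.linalg.inv(C))         # inverse of C (raises LinAlgError if C is not invertible)
```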
