The Ring of $n \times n$ Matrices
Recall from the Rings page that if $+$ and $*$ are binary operations on the set $R$, then $R$ is called a ring under $+$ and $*$, denoted $(R, +, *)$, when the following are satisfied:
- 1. For all $a, b \in R$ we have that $a + b \in R$ (Closure under $+$).
- 2. For all $a, b, c \in R$, $a + (b + c) = (a + b) + c$ (Associativity of elements in $R$ under $+$).
- 3. There exists a $0 \in R$ such that for all $a \in R$ we have that $a + 0 = a$ and $0 + a = a$ (The existence of an identity element $0$ of $R$ under $+$).
- 4. For all $a \in R$ there exists a $-a \in R$ such that $a + (-a) = 0$ and $(-a) + a = 0$ (The existence of inverses for each element in $R$ under $+$).
- 5. For all $a, b \in R$ we have that $a + b = b + a$ (Commutativity of elements in $R$ under $+$).
- 6. For all $a, b \in R$ we have that $a * b \in R$ (Closure under $*$).
- 7. For all $a, b, c \in R$, $a * (b * c) = (a * b) * c$ (Associativity of elements in $R$ under $*$).
- 8. There exists a $1 \in R$ such that for all $a \in R$ we have that $a * 1 = a$ and $1 * a = a$ (The existence of an identity element $1$ of $R$ under $*$).
- 9. For all $a, b, c \in R$ we have that $a * (b + c) = (a * b) + (a * c)$ and $(a + b) * c = (a * c) + (b * c)$ (Distributivity of $*$ over $+$).
We will now look at the ring of $n \times n$ matrices.
Let $M_{nn}$ denote the set of $n \times n$ matrices (with real entries, say), let $+$ be the operation of standard matrix addition, and let $*$ be the operation of standard square matrix multiplication. Let $A, B, C \in M_{nn}$.
We know from linear algebra that the sum of two $n \times n$ matrices is again an $n \times n$ matrix, so $M_{nn}$ is clearly closed under $+$.
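More explicitly, matrix addition is defined entrywise: writing $A = [a_{ij}]$ and $B = [b_{ij}]$, the sum is the $n \times n$ matrix whose entries are
\begin{align} \quad [A + B]_{ij} = a_{ij} + b_{ij}, \quad 1 \leq i, j \leq n \end{align}
so $A + B \in M_{nn}$.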
Furthermore, matrix addition is associative, since addition of the underlying entries is associative:
(1)
\begin{align} \quad A + (B + C) = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & \cdots & a_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{bmatrix} + \left ( \begin{bmatrix} b_{11} & b_{12} & \cdots & b_{1n}\\ b_{21} & b_{22} & \cdots & b_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ b_{n1} & b_{n2} & \cdots & b_{nn} \end{bmatrix} + \begin{bmatrix} c_{11} & c_{12} & \cdots & c_{1n}\\ c_{21} & c_{22} & \cdots & c_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ c_{n1} & c_{n2} & \cdots & c_{nn} \end{bmatrix} \right ) \\ \quad A + (B + C) = \begin{bmatrix} a_{11} + (b_{11} + c_{11}) & a_{12} + (b_{12} + c_{12}) & \cdots & a_{1n} + (b_{1n} + c_{1n})\\ a_{21} + (b_{21} + c_{21}) & a_{22} + (b_{22} + c_{22}) & \cdots & a_{2n} + (b_{2n} + c_{2n})\\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} + (b_{n1} + c_{n1}) & a_{n2} + (b_{n2} + c_{n2}) & \cdots & a_{nn} + (b_{nn} + c_{nn}) \end{bmatrix} = \begin{bmatrix} (a_{11} + b_{11}) + c_{11} & (a_{12} + b_{12}) + c_{12} & \cdots & (a_{1n} + b_{1n}) + c_{1n}\\ (a_{21} + b_{21}) + c_{21} & (a_{22} + b_{22}) + c_{22} & \cdots & (a_{2n} + b_{2n}) + c_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ (a_{n1} + b_{n1}) + c_{n1} & (a_{n2} + b_{n2}) + c_{n2} & \cdots & (a_{nn} + b_{nn}) + c_{nn} \end{bmatrix} \\ \quad A + (B + C) = \begin{bmatrix} a_{11} + b_{11} & a_{12} + b_{12} & \cdots & a_{1n} + b_{1n}\\ a_{21} + b_{21} & a_{22} + b_{22} & \cdots & a_{2n} + b_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} + b_{n1} & a_{n2} + b_{n2} & \cdots & a_{nn} + b_{nn} \end{bmatrix} + \begin{bmatrix} c_{11} & c_{12} & \cdots & c_{1n}\\ c_{21} & c_{22} & \cdots & c_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ c_{n1} & c_{n2} & \cdots & c_{nn} \end{bmatrix} \\ \quad A + (B + C) = \left ( \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & \cdots & a_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{bmatrix} + \begin{bmatrix} b_{11} & b_{12} & \cdots & b_{1n}\\ b_{21} & b_{22} & \cdots & b_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ b_{n1} & b_{n2} & \cdots & b_{nn} \end{bmatrix} \right ) + \begin{bmatrix} c_{11} & c_{12} & \cdots & c_{1n}\\ c_{21} & c_{22} & \cdots & c_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ c_{n1} & c_{n2} & \cdots & c_{nn} \end{bmatrix} = (A + B) + C \end{align}
The identity element of $+$ is the $n \times n$ zero matrix $0 = \begin{bmatrix} 0 & 0 & \cdots & 0\\ 0 & 0 & \cdots & 0\\ \vdots & \vdots & \ddots & \vdots\\ 0 & 0 & \cdots & 0 \end{bmatrix}$ since:
(2)
\begin{align} \quad A + 0 = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & \cdots & a_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{bmatrix} + \begin{bmatrix} 0 & 0 & \cdots & 0\\ 0 & 0 & \cdots & 0\\ \vdots & \vdots & \ddots & \vdots\\ 0 & 0 & \cdots & 0 \end{bmatrix} = \begin{bmatrix} a_{11} + 0 & a_{12} + 0 & \cdots & a_{1n} + 0\\ a_{21} + 0 & a_{22} + 0 & \cdots & a_{2n} + 0 \\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} + 0 & a_{n2} + 0 & \cdots & a_{nn} + 0 \end{bmatrix} = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & \cdots & a_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{bmatrix} = A \end{align}
Similarly, it can be shown rather easily that $0 + A = A$.
For each $n \times n$ matrix $A$, the additive inverse is $-A$ since:
(3)
\begin{align} \quad A + (-A) = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & \cdots & a_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{bmatrix} + \begin{bmatrix} -a_{11} & -a_{12} & \cdots & -a_{1n}\\ -a_{21} & -a_{22} & \cdots & -a_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ -a_{n1} & -a_{n2} & \cdots & -a_{nn} \end{bmatrix} = \begin{bmatrix} a_{11} - a_{11} & a_{12} - a_{12} & \cdots & a_{1n} - a_{1n} \\ a_{21} - a_{21} & a_{22} - a_{22} & \cdots & a_{2n} - a_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} - a_{n1} & a_{n2} - a_{n2} & \cdots & a_{nn} - a_{nn} \end{bmatrix} = \begin{bmatrix} 0 & 0 & \cdots & 0\\ 0 & 0 & \cdots & 0\\ \vdots & \vdots & \ddots & \vdots\\ 0 & 0 & \cdots & 0 \end{bmatrix} = 0 \end{align}
Once again, it can also be shown that $(-A) + A = 0$.
The operation $+$ of matrix addition is also commutative since:
(4)
\begin{align} \quad A + B = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & \cdots & a_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{bmatrix} + \begin{bmatrix} b_{11} & b_{12} & \cdots & b_{1n}\\ b_{21} & b_{22} & \cdots & b_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ b_{n1} & b_{n2} & \cdots & b_{nn} \end{bmatrix} = \begin{bmatrix} a_{11} + b_{11} & a_{12} + b_{12} & \cdots & a_{1n} + b_{1n} \\ a_{21} + b_{21} & a_{22} + b_{22} & \cdots & a_{2n} + b_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} + b_{n1} & a_{n2} + b_{n2} & \cdots & a_{nn} + b_{nn} \end{bmatrix} \\ = \begin{bmatrix} b_{11} + a_{11} & b_{12} + a_{12} & \cdots & b_{1n} + a_{1n} \\ b_{21} + a_{21} & b_{22} + a_{22} & \cdots & b_{2n} + a_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ b_{n1} + a_{n1} & b_{n2} + a_{n2} & \cdots & b_{nn} + a_{nn} \end{bmatrix} = \begin{bmatrix} b_{11} & b_{12} & \cdots & b_{1n}\\ b_{21} & b_{22} & \cdots & b_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ b_{n1} & b_{n2} & \cdots & b_{nn} \end{bmatrix} + \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & \cdots & a_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{bmatrix} = B + A \end{align}
We know from linear algebra that an $m \times n$ matrix can be multiplied by an $n \times r$ matrix to produce an $m \times r$ matrix. Therefore an $n \times n$ matrix multiplied by an $n \times n$ matrix will produce an $n \times n$ matrix, so $M_{nn}$ is closed under $*$.
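Concretely, the entries of the product are given by
\begin{align} \quad [A * B]_{ij} = \sum_{k=1}^{n} a_{ik} b_{kj}, \quad 1 \leq i, j \leq n \end{align}
which is a well-defined number for each of the $n^2$ positions, so $A * B \in M_{nn}$. This entry formula can also be used to sketch the remaining multiplicative properties.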
Showing that $*$ is associative on $M_{nn}$ is rather tedious, though it can be done. We will only sketch the computation here.
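In terms of the entry formula above, the key step is interchanging the order of a finite double sum: for each $i, j$,
\begin{align} \quad [(A * B) * C]_{ij} = \sum_{l=1}^{n} \left ( \sum_{k=1}^{n} a_{ik} b_{kl} \right ) c_{lj} = \sum_{k=1}^{n} a_{ik} \left ( \sum_{l=1}^{n} b_{kl} c_{lj} \right ) = [A * (B * C)]_{ij} \end{align}
and since all corresponding entries agree, $(A * B) * C = A * (B * C)$.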
The identity element for multiplication of matrices in $M_{nn}$ is the $n \times n$ identity matrix $I_n$, which has $1$'s on the main diagonal and $0$'s everywhere else.
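Indeed, since $[I_n]_{kj} = 1$ when $k = j$ and $[I_n]_{kj} = 0$ otherwise, the entry formula gives
\begin{align} \quad [A * I_n]_{ij} = \sum_{k=1}^{n} a_{ik} [I_n]_{kj} = a_{ij} \end{align}
so $A * I_n = A$, and similarly $I_n * A = A$.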
Lastly, it can be shown that the distributivity property also holds, but once again the full computation is rather cumbersome, so we only sketch one of the two laws and leave the rest to the reader.
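As a sketch of the first distributive law in terms of entries: for each $i, j$,
\begin{align} \quad [A * (B + C)]_{ij} = \sum_{k=1}^{n} a_{ik} (b_{kj} + c_{kj}) = \sum_{k=1}^{n} a_{ik} b_{kj} + \sum_{k=1}^{n} a_{ik} c_{kj} = [(A * B) + (A * C)]_{ij} \end{align}
and the second law $(A + B) * C = (A * C) + (B * C)$ follows by an analogous computation.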
Therefore $(M_{nn}, +, *)$ is a ring.
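Note that commutativity of $*$ is not one of the ring axioms, and indeed matrix multiplication is not commutative for $n \geq 2$. For example, with $n = 2$:
\begin{align} \quad \begin{bmatrix} 0 & 1\\ 0 & 0 \end{bmatrix} \begin{bmatrix} 0 & 0\\ 1 & 0 \end{bmatrix} = \begin{bmatrix} 1 & 0\\ 0 & 0 \end{bmatrix} \quad \text{while} \quad \begin{bmatrix} 0 & 0\\ 1 & 0 \end{bmatrix} \begin{bmatrix} 0 & 1\\ 0 & 0 \end{bmatrix} = \begin{bmatrix} 0 & 0\\ 0 & 1 \end{bmatrix} \end{align}
so $(M_{nn}, +, *)$ is a ring but not a commutative ring when $n \geq 2$.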