
The Existence of a Linear Map σ on X⊗Y to Z that Matches a Bilinear Map on X×Y to Z

Theorem 1: Let $X$, $Y$, and $Z$ be normed spaces. If $T : X \times Y \to Z$ is a bilinear map then there exists a unique linear map $\sigma : X \otimes Y \to Z$ such that for all $x \in X$ and for all $y \in Y$ we have that $\sigma (x \otimes y) = T(x, y)$.

If we let $\tau : X \times Y \to X \otimes Y$ be defined for all $(x, y) \in X \times Y$ by $\tau(x, y) = x \otimes y$, then Theorem 1 states that for any bilinear map $T : X \times Y \to Z$ there exists a unique linear map $\sigma : X \otimes Y \to Z$ such that $T = \sigma \circ \tau$.
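Before the proof, it may help to see the factorization $T = \sigma \circ \tau$ concretely in finite dimensions. The following is a minimal sketch, not part of the proof: it assumes $X = \mathbf{R}^m$, $Y = \mathbf{R}^n$, $Z = \mathbf{R}^p$, identifies $X \otimes Y$ with $\mathbf{R}^{mn}$ via the Kronecker product ($\tau(x, y) = \mathrm{kron}(x, y)$), and represents a bilinear $T$ by a tensor `W` of shape `(p, m, n)`. All names (`W`, `T`, `sigma`) are illustrative.

```python
import numpy as np

# Finite-dimensional sketch: X = R^m, Y = R^n, Z = R^p, and X ⊗ Y is
# identified with R^(m*n), with x ⊗ y represented by np.kron(x, y).
rng = np.random.default_rng(0)
m, n, p = 3, 4, 2

# A bilinear map T : R^m x R^n -> R^p is determined by a tensor W of shape
# (p, m, n) via T(x, y)_a = sum_{i,j} W[a, i, j] * x_i * y_j.
W = rng.standard_normal((p, m, n))
def T(x, y):
    return np.einsum('aij,i,j->a', W, x, y)

# The induced linear map sigma on X ⊗ Y has matrix W flattened to shape
# (p, m*n), because np.kron(x, y)[i*n + j] = x_i * y_j.
S = W.reshape(p, m * n)
def sigma(u):
    return S @ u

# Check the factorization sigma(x ⊗ y) = T(x, y) on random vectors.
x, y = rng.standard_normal(m), rng.standard_normal(n)
assert np.allclose(sigma(np.kron(x, y)), T(x, y))
```

In this picture the theorem says that every bilinear map "is" a linear map on the Kronecker vectors, which is exactly the reshaping above.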

  • Proof: We first prove a lemma that will guarantee that the map $\sigma$ constructed below is well-defined. Suppose that $\displaystyle{\sum_{r=1}^{k} x_r \otimes y_r = 0}$. We will show that then $\displaystyle{\sum_{r=1}^{k} T(x_r, y_r) = 0}$.
  • Let $\{ a_i \}$ be a basis for $\mathrm{span} (x_1, x_2, \ldots, x_k)$ and let $\{ b_j \}$ be a basis for $\mathrm{span} (y_1, y_2, \ldots, y_k)$. Then each $x_r$, $1 \leq r \leq k$, can be written uniquely as a linear combination of $\{ a_i \}$, and similarly, each $y_r$, $1 \leq r \leq k$, can be written uniquely as a linear combination of $\{ b_j \}$. So for each $1 \leq r \leq k$ write:
(1)
\begin{align} \quad x_r = \sum_{i} \alpha_{ir} a_i \: , \quad \alpha_{ir} \in \mathbf{F} \end{align}
(2)
\begin{align} \quad y_r = \sum_{j} \beta_{jr} b_j \: , \quad \beta_{jr} \in \mathbf{F} \end{align}
  • Then:
(3)
\begin{align} \quad 0 &= \sum_{r=1}^{k} x_r \otimes y_r \\ &= \sum_{r=1}^{k} \left [ \left [ \sum_{i} \alpha_{ir} a_i\right ] \otimes \left [ \sum_{j} \beta_{jr} b_j \right ] \right ] \\ &= \sum_{r=1}^{k} \sum_{i} \sum_{j} \alpha_{ir} \beta_{jr} a_i \otimes b_j \\ &= \sum_{i} \sum_{j} \sum_{r=1}^{k} \alpha_{ir} \beta_{jr} a_i \otimes b_j \end{align}
  • Since $\{ a_i \} \subset X$ is linearly independent in $X$ and $\{ b_j \} \subset Y$ is linearly independent in $Y$, one of the propositions on the Basic Theorems Regarding the Algebraic Tensor Product of Two Normed Spaces page tells us that $\{ a_i \otimes b_j \}_{i, j}$ is linearly independent in $X \otimes Y$. Thus the above equation implies that for all $i$ and all $j$:
(4)
\begin{align} \quad \sum_{r=1}^{k} \alpha_{ir} \beta_{jr} = 0 \end{align}
  • Therefore we have that:
(5)
\begin{align} \quad \sum_{r=1}^{k} T(x_r, y_r) &= \sum_{r=1}^{k} T \left ( \sum_{i} \alpha_{ir} a_i, \sum_{j} \beta_{jr} b_j \right ) \\ &= \sum_{r=1}^{k} \sum_{i} \sum_{j} \alpha_{ir} \beta_{jr} T(a_i, b_j) \\ &= \sum_{i} \sum_{j} \underbrace{\sum_{r=1}^{k} \alpha_{ir} \beta_{jr}}_{=0 \: \forall i, \: \forall j} T(a_i, b_j) \\ &= 0 \end{align}
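The lemma can be checked numerically in the finite-dimensional picture. The sketch below (all names illustrative) builds a combination of elementary tensors that sums to zero, namely $x_1 \otimes y + x_2 \otimes y + (-(x_1 + x_2)) \otimes y$, and verifies that the corresponding $T$-values cancel as well.

```python
import numpy as np

# Numeric sanity check of the lemma (finite-dimensional sketch): if a sum of
# elementary tensors is zero, the corresponding T-values also sum to zero.
rng = np.random.default_rng(1)
m, n, p = 3, 4, 2
W = rng.standard_normal((p, m, n))   # illustrative bilinear map, as a tensor
def T(x, y):
    return np.einsum('aij,i,j->a', W, x, y)

x1, x2 = rng.standard_normal(m), rng.standard_normal(m)
y = rng.standard_normal(n)

# x1 ⊗ y + x2 ⊗ y + (-(x1 + x2)) ⊗ y = 0 in X ⊗ Y (here: Kronecker vectors).
terms = [(x1, y), (x2, y), (-(x1 + x2), y)]
assert np.allclose(sum(np.kron(x, z) for x, z in terms), 0)

# ...and, as the lemma predicts, the T-values cancel too.
assert np.allclose(sum(T(x, z) for x, z in terms), 0)
```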
  • So we define $\sigma : X \otimes Y \to Z$ for all $u = \sum_{r=1}^{k} x_r \otimes y_r \in X \otimes Y$ by:
(6)
\begin{align} \quad \sigma (u) = \sigma \left ( \sum_{r=1}^{k} x_r \otimes y_r \right ) := \sum_{r=1}^{k} T(x_r, y_r) \end{align}
  • Then $\sigma (x \otimes y) = T(x, y)$ for all $x \in X$ and $y \in Y$. Now there are a few things to check. First we will check that $\sigma$ is well-defined, that is, the values of $\sigma$ are independent of how we choose to write $u$ as a linear combination in $X \otimes Y$.
  • Showing that $\sigma$ is well-defined: Let $u \in X \otimes Y$ and suppose that $u = \sum_{r=1}^{k} x_r \otimes y_r$ and $u = \sum_{r=1}^{k'} x_r' \otimes y_r'$. Then:
(7)
\begin{align} \quad 0 = u - u = \sum_{r=1}^{k'} x_r' \otimes y_r' - \sum_{r=1}^{k} x_r \otimes y_r \end{align}
  • The right-hand side is itself a sum of elementary tensors equal to $0$ (note that $-x_r \otimes y_r = (-x_r) \otimes y_r$), so the lemma proven above gives:
(8)
\begin{align} \quad 0 = \sum_{r=1}^{k'} T(x_r', y_r') - \sum_{r=1}^{k} T(x_r, y_r) \end{align}
  • So $\sigma (u)$ is well-defined regardless of the way we choose to write $u$ as a linear combination of elements of the form $x \otimes y$.
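The well-definedness argument can also be seen numerically in the finite-dimensional sketch (all names illustrative): two different representations of the same element $u$, such as $(x_1 + x_2) \otimes y$ versus $x_1 \otimes y + x_2 \otimes y$, yield the same value of $\sigma$.

```python
import numpy as np

# Well-definedness check (finite-dimensional sketch): two representations of
# the same u in X ⊗ Y give the same value of sigma.
rng = np.random.default_rng(2)
m, n, p = 3, 4, 2
W = rng.standard_normal((p, m, n))
def T(x, y):
    return np.einsum('aij,i,j->a', W, x, y)

x1, x2 = rng.standard_normal(m), rng.standard_normal(m)
y = rng.standard_normal(n)

# Representation 1: u = (x1 + x2) ⊗ y.  Representation 2: u = x1 ⊗ y + x2 ⊗ y.
rep1 = [(x1 + x2, y)]
rep2 = [(x1, y), (x2, y)]
assert np.allclose(np.kron(x1 + x2, y), np.kron(x1, y) + np.kron(x2, y))

# sigma(u), computed from each representation, agrees.
sigma1 = sum(T(x, z) for x, z in rep1)
sigma2 = sum(T(x, z) for x, z in rep2)
assert np.allclose(sigma1, sigma2)
```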
  • Showing that $\sigma$ is linear: Let $u, v \in X \otimes Y$ with $u = \sum_{r=1}^{k} x_r \otimes y_r$ and let $v = \sum_{r=1}^{k'} x_r' \otimes y_r'$, and let $\alpha \in \mathbf{F}$. Then:
(9)
\begin{align} \quad \quad \sigma (u + v) = \sigma \left ( \sum_{r=1}^{k} x_r \otimes y_r + \sum_{r=1}^{k'} x_r' \otimes y_r' \right ) = \sum_{r=1}^{k} T(x_r, y_r) + \sum_{r=1}^{k'} T(x_r', y_r') = \sigma(u) + \sigma(v) \end{align}
(10)
\begin{align} \quad \quad \sigma (\alpha u) = \sigma \left ( \alpha \sum_{r=1}^{k} x_r \otimes y_r \right ) = \sigma \left ( \sum_{r=1}^{k} (\alpha x_r) \otimes y_r \right ) = \sum_{r=1}^{k} T(\alpha x_r, y_r) = \alpha \sum_{r=1}^{k} T(x_r, y_r) = \alpha \sigma (u) \end{align}
  • Therefore $\sigma$ is linear.
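In the finite-dimensional sketch $\sigma$ is just multiplication by a matrix, so linearity is immediate; the check below (all names illustrative) confirms additivity and homogeneity on Kronecker representatives.

```python
import numpy as np

# Linearity of sigma, checked numerically (finite-dimensional sketch).
rng = np.random.default_rng(3)
m, n, p = 3, 4, 2
W = rng.standard_normal((p, m, n))
S = W.reshape(p, m * n)              # matrix of sigma on R^(m*n)
def sigma(u):
    return S @ u

u = np.kron(rng.standard_normal(m), rng.standard_normal(n))
v = np.kron(rng.standard_normal(m), rng.standard_normal(n))
alpha = 2.5

assert np.allclose(sigma(u + v), sigma(u) + sigma(v))   # additivity
assert np.allclose(sigma(alpha * u), alpha * sigma(u))  # homogeneity
```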
  • Showing that $\sigma$ is unique: Suppose there exists another linear map $\sigma^* : X \otimes Y \to Z$ such that $\sigma^*(x \otimes y) = T(x, y)$ for all $x \in X$ and for all $y \in Y$. Let $u = \sum_{r=1}^{k} x_r \otimes y_r \in X \otimes Y$. Then by the linearity of $\sigma^*$ and $\sigma$ we have that:
(11)
\begin{align} \quad \sigma^*(u) &= \sigma^* \left ( \sum_{r=1}^{k} x_r \otimes y_r \right ) \\ &= \sum_{r=1}^{k} \sigma^* (x_r \otimes y_r) \\ &= \sum_{r=1}^{k} T(x_r, y_r) \\ &= \sum_{r=1}^{k} \sigma(x_r \otimes y_r) \\ &= \sigma \left ( \sum_{r=1}^{k} x_r \otimes y_r \right ) \\ &= \sigma (u) \end{align}
  • So $\sigma^* = \sigma$ on all of $X \otimes Y$, and hence $\sigma$ is unique. $\blacksquare$
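The uniqueness step has the following concrete counterpart in the finite-dimensional sketch (all names illustrative): a linear map on $\mathbf{R}^{mn}$ is determined by its values on the basis vectors $e_i \otimes e_j = \mathrm{kron}(e_i, e_j)$, and agreement with $T$ on elementary tensors forces those values, so the matrix of any such map is determined column by column.

```python
import numpy as np

# Uniqueness sketch: agreement with T on elementary tensors forces the matrix
# of sigma* on the basis vectors kron(e_i, e_j), so it must equal S.
rng = np.random.default_rng(4)
m, n, p = 3, 4, 2
W = rng.standard_normal((p, m, n))
def T(x, y):
    return np.einsum('aij,i,j->a', W, x, y)
S = W.reshape(p, m * n)              # matrix of sigma on R^(m*n)

# Build the matrix any such sigma* must have, column by column: the column
# for basis vector kron(e_i, e_j) is forced to be T(e_i, e_j).
I_m, I_n = np.eye(m), np.eye(n)
S_star = np.column_stack([T(I_m[i], I_n[j])
                          for i in range(m) for j in range(n)])
assert np.allclose(S_star, S)
```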
Unless otherwise stated, the content of this page is licensed under Creative Commons Attribution-ShareAlike 3.0 License