The Projective Tensor Product of X⊗Y
Recall from The Weak Tensor Product of X⊗Y page that if $X$ and $Y$ are normed linear spaces and $u \in X \otimes Y$ with $u = \sum_{i=1}^{m} x_i \otimes y_i$, then the weak tensor norm on $X \otimes Y$ is defined by:

(1)
\begin{align} \quad w(u) = \sup \left \{ \left | \sum_{i=1}^{m} f(x_i)g(y_i) \right | : f \in X^*, \: g \in Y^*, \: \| f \| \leq 1, \: \| g \| \leq 1 \right \} \end{align}

Furthermore, we defined the weak tensor product of $X$ and $Y$, denoted $X \otimes_w Y$, to be the completion of $(X \otimes Y, w)$ in $\mathrm{BL}(X^*, Y^*; \mathbf{F})$.
We will now define another norm on $X \otimes Y$, called the projective tensor norm.
Definition: Let $X$ and $Y$ be normed linear spaces. The Projective Tensor Norm on $X \otimes Y$ is the function $p : X \otimes Y \to [0, \infty)$ defined for all $u \in X \otimes Y$ by $\displaystyle{p(u) = \inf \left \{ \sum_{i=1}^{m} \| x_i \| \| y_i \| : u = \sum_{i=1}^{m} x_i \otimes y_i \right \}}$ (the infimum is taken over all ways in which $u$ can be expressed as a finite sum of elementary tensors). Furthermore, the Projective Tensor Product of $X$ and $Y$, denoted $X \otimes_p Y$, is the completion of $(X \otimes Y, p)$ in $\mathrm{BL}(X^*, Y^*; \mathbf{F})$.
Sometimes the projective tensor norm of $u$ is denoted by $\pi(u)$ and the projective tensor product of $X$ and $Y$ is sometimes denoted by $X \widehat{\otimes}_{\pi} Y$.
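These definitions are abstract, but in the special case $X = Y = \ell_2^n$ (a special case not assumed anywhere on this page) a tensor $u = \sum_{i=1}^{m} x_i \otimes y_i$ may be identified with the matrix $A = \sum_{i=1}^{m} x_i y_i^T$; under this standard Hilbert-space identification the projective tensor norm becomes the nuclear norm of $A$ (the sum of its singular values) and the weak tensor norm becomes the operator norm of $A$. The following Python sketch is purely illustrative of that finite-dimensional picture:

```python
import numpy as np

# In the special case X = Y = l2^n, identify u = sum_i x_i (x) y_i with the
# matrix A = sum_i x_i y_i^T. Under this identification (a standard fact for
# Hilbert spaces, not proved on this page):
#   p(u) = nuclear norm of A  (sum of singular values)
#   w(u) = operator norm of A (largest singular value)
rng = np.random.default_rng(0)
xs = [rng.standard_normal(3) for _ in range(2)]
ys = [rng.standard_normal(3) for _ in range(2)]
A = sum(np.outer(x, y) for x, y in zip(xs, ys))

p_u = np.linalg.norm(A, ord="nuc")  # projective tensor norm p(u)
w_u = np.linalg.norm(A, ord=2)      # weak tensor norm w(u)

# The defining infimum is witnessed by the SVD representation
# A = sum_k s_k u_k v_k^T, for which sum_k ||s_k u_k|| ||v_k|| = sum_k s_k.
print(f"p(u) = {p_u:.4f}, w(u) = {w_u:.4f}")
```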
Proposition 1: Let $X$ and $Y$ be normed linear spaces. Then: a) If $u \in X \otimes Y$ then $w(u) \leq p(u)$. b) $p(x \otimes y) = \| x \| \| y \|$ for all $x \in X$ and for all $y \in Y$.
- Proof of a): Let $u = \sum_{i=1}^{m} x_i \otimes y_i$. Then for all $f \in X^*$ and all $g \in Y^*$ with $\| f \| \leq 1$, $\| g \| \leq 1$ we have that:

(2)
\begin{align} \quad \left | \sum_{i=1}^{m} f(x_i)g(y_i) \right | \leq \sum_{i=1}^{m} |f(x_i)| |g(y_i)| \leq \sum_{i=1}^{m} \| f \| \| x_i \| \| g \| \| y_i \| \leq \sum_{i=1}^{m} \| x_i \| \| y_i \| \end{align}
- Taking the supremum of the left-hand side over all $f \in X^*$ with $\| f \| \leq 1$ and all $g \in Y^*$ with $\| g \| \leq 1$ shows that for every representation $u = \sum_{i=1}^{m} x_i \otimes y_i$ we have $w(u) \leq \sum_{i=1}^{m} \| x_i \| \| y_i \|$. Taking the infimum of the right-hand side over all such representations of $u$ gives us that $w(u) \leq p(u)$. $\blacksquare$
- Proof of b): Let $u = x \otimes y$. Then by part (a) we have that $w(x \otimes y) \leq p(x \otimes y)$. But we know that $w(x \otimes y) = \| x \| \| y \|$ and thus $\| x \| \| y \| \leq p(x \otimes y)$.
- For the reverse inequality, observe that $x \otimes y$ is itself a one-term representation of $u$, so by definition:

(3)
\begin{align} \quad p(x \otimes y) \leq \| x \| \| y \| \end{align}

- Thus we conclude that $p(x \otimes y) = \| x \| \| y \|$ for all $x \in X$, $y \in Y$. $\blacksquare$
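Continuing the illustrative $\ell_2^n$ sketch from above, both parts of Proposition 1 can be spot-checked numerically: the operator norm of a matrix never exceeds its nuclear norm, and a rank-one matrix $x y^T$ has exactly one nonzero singular value, namely $\| x \| \| y \|$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Part (a): w(u) <= p(u), i.e. operator norm <= nuclear norm.
for _ in range(5):
    A = rng.standard_normal((4, 4))
    assert np.linalg.norm(A, ord=2) <= np.linalg.norm(A, ord="nuc") + 1e-12

# Part (b): for an elementary tensor x (x) y the matrix x y^T has rank one,
# so its only nonzero singular value is ||x|| ||y||, whence p(x (x) y) = ||x|| ||y||.
x, y = rng.standard_normal(4), rng.standard_normal(4)
B = np.outer(x, y)
assert np.isclose(np.linalg.norm(B, ord="nuc"),
                  np.linalg.norm(x) * np.linalg.norm(y))
print("Proposition 1 holds on these samples.")
```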
Before we move on we should verify that the projective tensor norm $p$ is indeed a norm on $X \otimes Y$.
Proposition 2: Let $X$ and $Y$ be normed linear spaces. Then the projective tensor norm on $X \otimes Y$ is a norm.
- Proof: There are three things to show.
- 1. Showing that $p(u) = 0$ if and only if $u = 0$: Suppose that $p(u) = 0$. From Proposition 1 above we have that $w(u) \leq p(u) = 0$, so $w(u) = 0$. But $w$ is a norm on $X \otimes Y$ and thus $u = 0$.
- On the other hand, suppose that $u = 0$. Then certainly $u = 0 \otimes 0$, so by Proposition 1 we have that $p(u) = p(0 \otimes 0) = \| 0 \| \| 0 \| = 0$.
- 2. Showing that $p(\alpha u) = |\alpha| p(u)$: Let $u \in X \otimes Y$ and let $\alpha \in \mathbf{F}$. If $\alpha = 0$ then $p(\alpha u) = p(0) = 0 = |\alpha| p(u)$, so assume $\alpha \neq 0$. Observe that if $u = \sum_{i=1}^{m} x_i \otimes y_i$ then $\alpha u = \alpha \sum_{i=1}^{m} x_i \otimes y_i = \sum_{i=1}^{m} (\alpha x_i) \otimes y_i$, and conversely every representation of $\alpha u$ arises this way (apply the same observation to $\alpha u$ with the scalar $\alpha^{-1}$). Therefore:

(4)
\begin{align} \quad p(\alpha u) = \inf \left \{ \sum_{i=1}^{m} \| \alpha x_i \| \| y_i \| : u = \sum_{i=1}^{m} x_i \otimes y_i \right \} = |\alpha| \inf \left \{ \sum_{i=1}^{m} \| x_i \| \| y_i \| : u = \sum_{i=1}^{m} x_i \otimes y_i \right \} = |\alpha| p(u) \end{align}
- 3. Showing that $p(u + v) \leq p(u) + p(v)$: Let $u, v \in X \otimes Y$. Observe that if $u = \sum_{i=1}^{m} x_i \otimes y_i$ and $v = \sum_{j=1}^{n} x_j' \otimes y_j'$ then $u + v = \sum_{i=1}^{m} x_i \otimes y_i + \sum_{j=1}^{n} x_j' \otimes y_j'$. If $A$ denotes the set of representations of $u$, $B$ the set of representations of $v$, and $C$ the set of representations of $u + v$ then $A + B \subseteq C$. Therefore:

(5)
\begin{align} \quad p(u + v) \leq \sum_{i=1}^{m} \| x_i \| \| y_i \| + \sum_{j=1}^{n} \| x_j' \| \| y_j' \| \end{align}

- Taking the infimum over all representations of $u$ and all representations of $v$ gives $p(u + v) \leq p(u) + p(v)$.
- Thus $p$ is a norm on $X \otimes Y$. $\blacksquare$
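Under the same illustrative $\ell_2^n$ identification used above, the norm axioms of Proposition 2 amount to familiar properties of the nuclear norm, which can be spot-checked numerically:

```python
import numpy as np

rng = np.random.default_rng(2)
nuc = lambda M: np.linalg.norm(M, ord="nuc")  # stands in for p under the l2 identification
A, B = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))

# Axiom 1: p(u) = 0 iff u = 0 (the zero matrix has nuclear norm 0).
assert nuc(np.zeros((3, 3))) == 0.0

# Axiom 2: absolute homogeneity, p(alpha u) = |alpha| p(u).
alpha = -2.5
assert np.isclose(nuc(alpha * A), abs(alpha) * nuc(A))

# Axiom 3: triangle inequality, p(u + v) <= p(u) + p(v).
assert nuc(A + B) <= nuc(A) + nuc(B) + 1e-12
print("Norm axioms hold on these samples.")
```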
In the proof of Proposition 2 above, we used the fact that $w$ is a norm on $X \otimes Y$ in order to show that $p(u) = 0$ if and only if $u = 0$. We will now demonstrate a direct proof of this result.
First suppose that $p(u) = 0$. Then:

(6)
\begin{align} \quad \inf \left \{ \sum_{i=1}^{n} \| x_i \| \| y_i \| : u = \sum_{i=1}^{n} x_i \otimes y_i \right \} = 0 \end{align}

So for each $\epsilon > 0$ there exists a representation $\sum_{i=1}^{n} x_i \otimes y_i$ of $u$ with the property that:

(7)
\begin{align} \quad \sum_{i=1}^{n} \| x_i \| \| y_i \| < \epsilon \end{align}

So for all $f \in X^*$ and all $g \in Y^*$ we have that:

(8)
\begin{align} \quad \left | \sum_{i=1}^{n} f(x_i)g(y_i) \right | \leq \sum_{i=1}^{n} \| f \| \| x_i \| \| g \| \| y_i \| = \| f \| \| g \| \sum_{i=1}^{n} \| x_i \| \| y_i \| < \epsilon \| f \| \| g \| \end{align}

Since the value of the left-hand side sum does not depend on the choice of representation of $u$, letting $\epsilon \to 0$ gives that $\sum_{i=1}^{n} f(x_i)g(y_i) = 0$ for all $f \in X^*$ and all $g \in Y^*$. By the Criteria for a Tensor u in X⊗Y to be 0 page we have that $u = 0$.
Proposition 3: Let $X$ and $Y$ be normed linear spaces. Then every $u \in X \otimes_p Y$ can be written in the form $u = \sum_{i=1}^{\infty} x_i \otimes y_i$ for which $\sum_{i=1}^{\infty} \| x_i \| \| y_i \| < \infty$ and moreover, $\displaystyle{p(u) = \inf \left \{ \sum_{i=1}^{\infty} \| x_i \| \| y_i \| : u = \sum_{i=1}^{\infty} x_i \otimes y_i \: \mathrm{and} \: \sum_{i=1}^{\infty} \| x_i \| \| y_i \| < \infty \right \}}$.
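As a concrete (and again purely illustrative) instance of Proposition 3, take $u = \sum_{i=1}^{\infty} 2^{-i} e_i \otimes e_i$ in $\ell_2 \otimes_p \ell_2$: here $\sum_{i=1}^{\infty} \| 2^{-i} e_i \| \| e_i \| = 1 < \infty$, and the partial sums form a Cauchy sequence in $p$. The sketch below truncates everything to $n$ coordinates and uses the nuclear-norm identification as before:

```python
import numpy as np

# u = sum_{i>=1} 2^{-i} e_i (x) e_i, truncated to n coordinates for illustration.
# The representation satisfies sum_i ||2^{-i} e_i|| ||e_i|| = sum_i 2^{-i} = 1,
# and the tails S - S_N have projective (nuclear) norm 2^{-N} - 2^{-n} -> 0,
# so the partial sums S_N are Cauchy in p, as Proposition 3 suggests.
n = 20

def partial_sum(N):
    A = np.zeros((n, n))
    for i in range(N):
        A[i, i] = 2.0 ** -(i + 1)
    return A

S = partial_sum(n)
for N in (5, 10, 15):
    tail = np.linalg.norm(S - partial_sum(N), ord="nuc")
    assert np.isclose(tail, 2.0 ** -N - 2.0 ** -n)
    print(f"p-distance from S_{N} to S_{n}: {tail:.6f}")
```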