The Projective Tensor Product of X⊗Y

Recall from The Weak Tensor Product of X⊗Y page that if $X$ and $Y$ are normed linear spaces and $u \in X \otimes Y$ with $u = \sum_{i=1}^{m} x_i \otimes y_i$, then the weak tensor norm on $X \otimes Y$ is defined by:

(1)
\begin{align} \quad w(u) = \sup_{\| f \| \leq 1, \| g \| \leq 1} \left \{ \left | \sum_{i=1}^{m} f(x_i)g(y_i) \right | \right \} \end{align}

Furthermore, we defined the weak tensor product of $X$ and $Y$, denoted $X \otimes_w Y$, to be the completion of $(X \otimes Y, w)$ in $\mathrm{BL}(X^*, Y^*; \mathbf{F})$.

We will now define another norm on $X \otimes Y$ called the projective tensor norm on $X \otimes Y$.

Definition: Let $X$ and $Y$ be normed linear spaces. The Projective Tensor Norm on $X \otimes Y$ is the function $p : X \otimes Y \to [0, \infty)$ defined for all $u \in X \otimes Y$ by $\displaystyle{p(u) = \inf \left \{ \sum_{i=1}^{m} \| x_i \| \| y_i \| : u = \sum_{i=1}^{m} x_i \otimes y_i \right \}}$ (the infimum is taken over all of the finite representations $u = \sum_{i=1}^{m} x_i \otimes y_i$ of $u$ as a sum of elementary tensors). Furthermore, the Projective Tensor Product of $X$ and $Y$ denoted $X \otimes_p Y$ is the completion of $(X \otimes Y, p)$ in $\mathrm{BL}(X^*, Y^*; \mathbf{F})$.

The projective tensor norm of $u$ is sometimes denoted by $\pi(u)$, and the projective tensor product of $X$ and $Y$ is sometimes denoted by $X \widehat{\otimes}_{\pi} Y$.
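To see why the infimum is needed, note that different representations of the same tensor can give very different values of $\sum_{i=1}^{m} \| x_i \| \| y_i \|$. For example, if $x \in X$ and $y \in Y$ then by bilinearity:

\begin{align} \quad x \otimes y = (2x) \otimes \left ( \frac{1}{2} y \right ) = x \otimes (2y) + x \otimes (-y) \end{align}

The first two representations each give the value $\| x \| \| y \|$, while the third gives $2 \| x \| \| y \| + \| x \| \| y \| = 3 \| x \| \| y \|$. The infimum in the definition of $p$ ignores such wasteful representations.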

Proposition 1: Let $X$ and $Y$ be normed linear spaces. Then:
a) If $u \in X \otimes Y$ then $w(u) \leq p(u)$.
b) $p(x \otimes y) = \| x \| \| y \|$ for all $x \in X$ and for all $y \in Y$.
  • Proof of a): Let $u = \sum_{i=1}^{m} x_i \otimes y_i$. Then for all $f \in X^*$ and all $g \in Y^*$ with $\| f \| \leq 1$, $\| g \| \leq 1$ we have that
(2)
\begin{align} \quad |u(f, g)| = \left | \left [ \sum_{i=1}^{m} x_i \otimes y_i \right ](f, g) \right | = \left | \sum_{i=1}^{m} f(x_i)g(y_i) \right | \leq \sum_{i=1}^{m} |f(x_i)||g(y_i)| \leq \sum_{i=1}^{m} \| f \| \| x_i \| \| g \| \| y_i \| = \underbrace{\| f \| \| g \|}_{\leq 1} \sum_{i=1}^{m} \| x_i \| \| y_i \| \leq \sum_{i=1}^{m} \| x_i \| \| y_i \| \end{align}
  • Taking the supremum of the left-hand side as $f \in X^*$ varies over $\| f \| \leq 1$ and $g \in Y^*$ varies over $\| g \| \leq 1$ shows that for every representation $u = \sum_{i=1}^{m} x_i \otimes y_i$ we have $w(u) \leq \sum_{i=1}^{m} \| x_i \| \| y_i \|$. Taking the infimum of the right-hand side over all such representations of $u$ in $X \otimes Y$ gives us that $w(u) \leq p(u)$. (An example following this proposition shows that this inequality can be strict.) $\blacksquare$
  • Proof of b): Let $u = x \otimes y$. Then by part (a) we have that $w(x \otimes y) \leq p(x \otimes y)$. But we know that $w(x \otimes y) = \| x \| \| y \|$ and thus $\| x \| \| y \| \leq p(x \otimes y)$.
  • For the reverse inequality, observe that $x \otimes y$ is itself a one-term representation of $u$, so by definition:
(3)
\begin{align} \quad p(x \otimes y) = \inf \left \{ \sum_{i=1}^{m} \| x_i \| \| y_i \| : u = \sum_{i=1}^{m} x_i \otimes y_i \right \} \leq \| x \| \| y \| \end{align}
  • Thus we conclude that $p(x \otimes y) = \| x \| \| y \|$ for all $x \in X$, $y \in Y$. $\blacksquare$
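The inequality in part (a) can be strict. As a concrete illustration (using the standard identification of a tensor $\sum_{i} x_i \otimes y_i$ in $\mathbf{R}^2 \otimes \mathbf{R}^2$ with the matrix $\sum_{i} x_i y_i^T$), take $X = Y = \mathbf{R}^2$ with the Euclidean norm and let $u = e_1 \otimes e_1 + e_2 \otimes e_2$, which corresponds to the identity matrix $I$. Then $\displaystyle{w(u) = \sup_{\| a \| \leq 1, \| b \| \leq 1} |a_1 b_1 + a_2 b_2| = 1}$, while for every representation $u = \sum_{i=1}^{m} x_i \otimes y_i$ the Cauchy-Schwarz inequality gives:

\begin{align} \quad \sum_{i=1}^{m} \| x_i \| \| y_i \| \geq \sum_{i=1}^{m} | \langle x_i, y_i \rangle | \geq \left | \sum_{i=1}^{m} \langle x_i, y_i \rangle \right | = | \mathrm{tr}(I) | = 2 \end{align}

Hence $p(u) = 2$ (the representation $e_1 \otimes e_1 + e_2 \otimes e_2$ attains this bound), so $w(u) = 1 < 2 = p(u)$.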

Before we move on we should verify that the projective tensor norm $p$ is indeed a norm on $X \otimes Y$.

Proposition 2: Let $X$ and $Y$ be normed linear spaces. Then the projective tensor norm on $X \otimes Y$ is a norm.
  • Proof: There are three things to show.
  • 1. Showing that $p(u) = 0$ if and only if $u = 0$: Suppose that $p(u) = 0$. From Proposition 1 (a) above we have that $w(u) \leq p(u) = 0$, so $w(u) = 0$. But $w$ is a norm on $X \otimes Y$ and thus $u = 0$.
  • On the other hand, suppose that $u = 0$. Then certainly $u = 0 \otimes 0$, so by Proposition 1 (b) we have that $p(u) = p(0 \otimes 0) = \| 0 \| \| 0 \| = 0$.
  • 2. Showing that $p(\alpha u) = |\alpha| p(u)$: Let $u \in X \otimes Y$ and let $\alpha \in \mathbf{F}$. If $\alpha = 0$ then $\alpha u = 0$ and so $p(\alpha u) = 0 = |\alpha| p(u)$, so assume that $\alpha \neq 0$. Then $u = \sum_{i=1}^{m} x_i \otimes y_i$ if and only if $\alpha u = \sum_{i=1}^{m} (\alpha x_i) \otimes y_i$, so the representations of $\alpha u$ are precisely the sums of this form. Therefore:
(4)
\begin{align} \quad p(\alpha u) = \inf \left \{ \sum_{i=1}^{m} \| \alpha x_i \| \| y_i \| : \alpha u = \sum_{i=1}^{m} \alpha x_i \otimes y_i \right \} = |\alpha| \inf \left \{ \sum_{i=1}^{m} \| x_i \| \| y_i \| : u = \sum_{i=1}^{m} x_i \otimes y_i \right \} = |\alpha| p(u) \end{align}
  • 3. Showing that $p(u + v) \leq p(u) + p(v)$: Let $u, v \in X \otimes Y$. Observe that if $u = \sum_{i=1}^{m} x_i \otimes y_i$ and $v = \sum_{j=1}^{n} x_j' \otimes y_j'$ then $u + v = \sum_{i=1}^{m} x_i \otimes y_i + \sum_{j=1}^{n} x_j' \otimes y_j'$. So if $A$ denotes the set of representations of $u$, $B$ the set of representations of $v$, and $C$ the set of representations of $u + v$, then concatenating a representation in $A$ with a representation in $B$ yields a representation in $C$, i.e., $A + B \subseteq C$ (an $\epsilon$-argument making this precise is sketched after this proof). Therefore:
(5)
\begin{align} \quad p(u + v) = \inf \left \{ \sum_{k=1}^{o} \| x_k'' \| \| y_k'' \| : u + v = \sum_{k=1}^{o} x_k'' \otimes y_k'' \right \} & \leq \inf \left \{ \sum_{i=1}^{m} \| x_i \| \| y_i \| : u = \sum_{i=1}^{m} x_i \otimes y_i \right \} + \inf \left \{ \sum_{j=1}^{n} \| x_j' \| \| y_j' \| : v = \sum_{j=1}^{n} x_j' \otimes y_j' \right \} \\ & = p(u) + p(v) \end{align}
  • Thus $p$ is a norm on $X \otimes Y$. $\blacksquare$
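The inequality in step 3 can also be obtained directly by the following $\epsilon$-argument. Given $\epsilon > 0$, choose representations $u = \sum_{i=1}^{m} x_i \otimes y_i$ and $v = \sum_{j=1}^{n} x_j' \otimes y_j'$ such that:

\begin{align} \quad \sum_{i=1}^{m} \| x_i \| \| y_i \| \leq p(u) + \frac{\epsilon}{2} \quad \mathrm{and} \quad \sum_{j=1}^{n} \| x_j' \| \| y_j' \| \leq p(v) + \frac{\epsilon}{2} \end{align}

Concatenating these two sums gives a representation of $u + v$, and so $p(u + v) \leq p(u) + p(v) + \epsilon$. Since $\epsilon > 0$ was arbitrary, we get $p(u + v) \leq p(u) + p(v)$.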

In the proof of Proposition 2 above, we used the fact that $w$ is a norm on $X \otimes Y$ in order to show that $p(u) = 0$ implies $u = 0$. We will now give a direct proof of this implication.

Suppose that $p(u) = 0$. Then:

(6)
\begin{align} \quad p(u) = \inf \left \{ \sum_{i=1}^{n} \| x_i \| \| y_i \| : u = \sum_{i=1}^{n} x_i \otimes y_i \right \} = 0 \end{align}

So for each $\epsilon > 0$ there exists a representation $\sum_{i=1}^{n} x_i \otimes y_i$ of $u$ with the property that:

(7)
\begin{align} \quad \sum_{i=1}^{n} \| x_i \| \| y_i \| \leq \epsilon \end{align}

So for all $f \in X^*$ and all $g \in Y^*$ we have that:

(8)
\begin{align} \quad \left | \sum_{i=1}^{n} f(x_i)g(y_i) \right | \leq \sum_{i=1}^{n} |f(x_i)||g(y_i)| \leq \| f \| \| g \| \sum_{i=1}^{n} \| x_i \| \| y_i \| \leq \| f \| \| g \| \epsilon \end{align}

Since the value of the left-hand side sum does not depend on the choice of representation of $u$, and $\epsilon > 0$ was arbitrary, we conclude that $\sum_{i=1}^{n} f(x_i)g(y_i) = 0$ for all $f \in X^*$ and all $g \in Y^*$. By the Criteria for a Tensor u in X⊗Y to be 0 page we have that $u = 0$.

Proposition 3: Let $X$ and $Y$ be normed linear spaces. Then every $u \in X \otimes_p Y$ can be written in the form $u = \sum_{i=1}^{\infty} x_i \otimes y_i$ for which $\sum_{i=1}^{\infty} \| x_i \| \| y_i \| < \infty$ and moreover, $\displaystyle{p(u) = \inf \left \{ \sum_{i=1}^{\infty} \| x_i \| \| y_i \| : u = \sum_{i=1}^{\infty} x_i \otimes y_i \: \mathrm{and} \: \sum_{i=1}^{\infty} \| x_i \| \| y_i \| < \infty \right \}}$.
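A sketch of the standard argument: since $X \otimes_p Y$ is the completion of $(X \otimes Y, p)$, every $u \in X \otimes_p Y$ is the limit of a sequence $(u_n)$ in $X \otimes Y$, which we may choose so that $p(u - u_n) < 2^{-n}$ for each $n$. Then:

\begin{align} \quad u = u_1 + \sum_{n=1}^{\infty} (u_{n+1} - u_n) \quad \mathrm{where} \quad p(u_{n+1} - u_n) \leq p(u_{n+1} - u) + p(u - u_n) < 2^{-(n+1)} + 2^{-n} < 2^{-n+1} \end{align}

Choosing for each $n$ a finite representation of $u_{n+1} - u_n$ whose sum of norms is at most $p(u_{n+1} - u_n) + 2^{-n}$, together with a finite representation of $u_1$, and concatenating all of these representations produces a series $\sum_{i=1}^{\infty} x_i \otimes y_i$ converging to $u$ in $X \otimes_p Y$ with $\sum_{i=1}^{\infty} \| x_i \| \| y_i \| < \infty$. A similar bookkeeping argument yields the stated infimum formula for $p(u)$.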