Cross Norms on X⊗Y

Definition: Let $X$ and $Y$ be normed linear spaces. A Cross Norm on $X \otimes Y$ is a norm $\alpha$ on $X \otimes Y$ such that $\alpha (x \otimes y) = \| x \| \| y \|$ for all $x \in X$ and for all $y \in Y$.
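For a concrete example (a standard illustration, not taken from the pages cited below): identify $\mathbb{R}^n \otimes \mathbb{R}^m$, with the Euclidean norm on each factor, with the $n \times m$ matrices via $x \otimes y \mapsto xy^T$. The Frobenius (Hilbert-Schmidt) norm is then a cross norm, since for every elementary tensor:

\begin{align} \quad \| x \otimes y \|_F = \left ( \sum_{i=1}^{n} \sum_{j=1}^{m} x_i^2 y_j^2 \right )^{1/2} = \| x \|_2 \| y \|_2 \end{align}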

Recall from The Weak Tensor Product of X⊗Y and The Projective Tensor Product of X⊗Y pages that both the weak tensor norm and the projective tensor norm are cross norms.
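For convenience, these two norms can be recalled as follows (a restatement of the standard definitions; the letters $w$ and $p$ for the weak and projective tensor norms are assumed to match the notation of those pages). For $u \in X \otimes Y$ with representation $u = \sum_{i=1}^{m} x_i \otimes y_i$:

\begin{align} \quad w(u) = \sup \left \{ \left | \sum_{i=1}^{m} f(x_i) g(y_i) \right | : f \in X^*, \: g \in Y^*, \: \| f \| \leq 1, \: \| g \| \leq 1 \right \} \quad , \quad p(u) = \inf \left \{ \sum_{i=1}^{m} \| x_i \| \| y_i \| : u = \sum_{i=1}^{m} x_i \otimes y_i \right \} \end{align}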

Proposition 1: Let $X$ and $Y$ be normed linear spaces. If $\alpha$ is a cross norm on $X \otimes Y$ then $\alpha (u) \leq p(u)$ for all $u \in X \otimes Y$. In other words, the projective tensor norm is the greatest cross norm on $X \otimes Y$.
  • Proof: Let $u \in X \otimes Y$ and let $u = \sum_{i=1}^{m} x_i \otimes y_i$ be any representation of $u$ as a finite sum of elementary tensors. Since $\alpha$ is a norm, the triangle inequality gives the first estimate below, and since $\alpha$ is a cross norm, $\alpha(x_i \otimes y_i) = \| x_i \| \| y_i \|$ for each $i$, which gives the second:
(1)
\begin{align} \quad \alpha (u) = \alpha \left ( \sum_{i=1}^{m} x_i \otimes y_i \right ) \leq \sum_{i=1}^{m} \alpha(x_i \otimes y_i) = \sum_{i=1}^{m} \| x_i \| \| y_i \| \end{align}
  • Since the above inequality holds for every representation of $u$, taking the infimum over all such representations gives:
(2)
\begin{align} \quad \alpha (u) \leq \inf \left \{ \sum_{i=1}^{m} \| x_i \| \| y_i \| : u = \sum_{i=1}^{m} x_i \otimes y_i \right \} = p(u) \quad \blacksquare \end{align}
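To see Proposition 1 concretely in finite dimensions, the following Python sketch (an illustration added here, not part of the original argument; it relies only on NumPy's built-in 'fro' and 'nuc' matrix norms) identifies $\mathbb{R}^4 \otimes \mathbb{R}^3$, with Euclidean norms on both factors, with the $4 \times 3$ matrices. Under this identification the projective tensor norm $p$ is the nuclear (trace) norm, the Frobenius norm is the cross norm from the example above, and Proposition 1 predicts that the Frobenius norm never exceeds the nuclear norm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Cross norm property on an elementary tensor x (x) y = x y^T:
# both the Frobenius and the nuclear norm of this rank-one matrix
# equal ||x||_2 * ||y||_2.
x, y = rng.standard_normal(4), rng.standard_normal(3)
u = np.outer(x, y)
elementary = np.linalg.norm(x) * np.linalg.norm(y)
assert np.isclose(np.linalg.norm(u, 'fro'), elementary)
assert np.isclose(np.linalg.norm(u, 'nuc'), elementary)

# Proposition 1 in this setting: a cross norm (Frobenius) is dominated by
# the projective norm (here, the nuclear norm) on every tensor u.
for _ in range(1000):
    u = rng.standard_normal((4, 3))
    assert np.linalg.norm(u, 'fro') <= np.linalg.norm(u, 'nuc') + 1e-12

print("Frobenius <= nuclear on all samples, as Proposition 1 predicts.")
```

In terms of singular values, the check above is just the elementary inequality $\| \sigma \|_2 \leq \| \sigma \|_1$, which is what Proposition 1 reduces to in this finite-dimensional Hilbert space setting.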