
Tensor product

The tensor product is an operation that combines two vector spaces into a single larger vector space. Elements of this larger space are called tensors.

Let V and U be two finite dimensional vector spaces over the same field of scalars 𝔽. Let α = { v1, v2, … , vn } and β = { u1, u2, … , um } be their respective bases. Then the tensor product of the vector spaces V and U, denoted by V ⊗ U, is spanned by the basis { vi ⊗ uj : i = 1, 2, … , n; j = 1, 2, … , m }. Elements of V ⊗ U are called tensors; they are linear combinations of these nm basis vectors, and the symbol ⊗ obeys the following two axioms: \begin{equation} \label{EqTensor.1} c\left( {\bf v} \otimes {\bf u} \right) = \left( c\,{\bf v} \right) \otimes {\bf u} = {\bf v} \otimes \left( c\,{\bf u} \right) , \qquad c\in \mathbb{F}, \end{equation} and \begin{equation} \label{EqTensor.2} \begin{split} {\bf a} \otimes {\bf u} + {\bf b} \otimes {\bf u} &= \left( {\bf a} + {\bf b} \right) \otimes {\bf u} , \\ {\bf v} \otimes {\bf u} + {\bf v} \otimes {\bf w} &= {\bf v} \otimes \left( {\bf u} + {\bf w} \right) . \end{split} \end{equation}
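These two axioms say precisely that the operation ⊗ is bilinear. For coordinate vectors, bilinearity is easy to confirm numerically in Mathematica with the built-in TensorProduct command; the vectors below are chosen arbitrarily for illustration.
(* distributivity over addition and compatibility with scalars *)
a = {1, 2}; b = {3, -1}; u = {0, 1, 5};
TensorProduct[a + b, u] == TensorProduct[a, u] + TensorProduct[b, u]
Out[1]= True
TensorProduct[7 a, u] == 7 TensorProduct[a, u] == TensorProduct[a, 7 u]
Out[2]= True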

This definition does not specify the structure of the basis elements vi ⊗ uj. Therefore, the tensor product can be applied to a great variety of objects and structures, including vectors, matrices, tensors, vector spaces, algebras, topological vector spaces, and modules, among others. The most familiar case is, perhaps, when V = ℝm and U = ℝn. Then ℝm⊗ℝn ≌ ℝmn (see Example 4).

Two interpretations of the tensor product are known, and both are widely used. Since the tensor product of an n-dimensional vector \( \displaystyle {\bf a} = a_1 {\bf v}_1 + a_2 {\bf v}_2 + \cdots + a_n {\bf v}_n \) and an m-dimensional vector \( \displaystyle {\bf b} = b_1 {\bf u}_1 + b_2 {\bf u}_2 + \cdots + b_m {\bf u}_m \) has dimension nm, it is natural to represent this product in matrix form:

\begin{equation} \label{EqTensor.3} {\bf a} \otimes {\bf b} = {\bf a} \,\overline{{\bf b}^{\mathrm{T}}} = \begin{bmatrix} a_1 b_1^{\ast} & a_1 b_2^{\ast} & \cdots & a_1 b_m^{\ast} \\ a_2 b_1^{\ast} & a_2 b_2^{\ast} & \cdots & a_2 b_m^{\ast} \\ \vdots & \vdots & \ddots & \vdots \\ a_n b_1^{\ast} & a_n b_2^{\ast} & \cdots & a_n b_m^{\ast} \end{bmatrix} \in \mathbb{F}^{n,m} . \end{equation}
Here \( \displaystyle \overline{b} = b^{\ast} = x -{\bf j}\,y \) is the complex conjugate of the complex number b = x + jy, where j is the imaginary unit in the complex plane ℂ, so j² = −1.

This approach is common, for instance, in digital image processing, where images are represented by rectangular matrices. Hence, the matrix representation \eqref{EqTensor.3} of the tensor product deserves a special name:

The outer product of two vectors \( {\bf u} = \left[ u_1 , u_2 , \ldots , u_m \right] \) and \( {\bf v} = \left[ v_1 , v_2 , \ldots , v_n \right] \), denoted by \( {\bf u} \otimes {\bf v} \), is the m-by-n matrix W whose entries satisfy \( w_{i,j} = u_i v_j \) (or \( w_{i,j} = u_i v_j^{\ast} \) in the complex case). The outer product \( {\bf u} \otimes {\bf v} \) is equivalent to the matrix multiplication \( {\bf u} \, {\bf v}^{\ast} \) (or \( {\bf u} \, {\bf v}^{\mathrm T} \) if the vectors are real), provided that u is represented as an \( m \times 1 \) column vector and v as an \( n \times 1 \) column vector. Here \( {\bf v}^{\ast} = \overline{{\bf v}^{\mathrm T}} . \)

In particular, u ⊗ v is a matrix of rank 1, which means that most matrices cannot be written as tensor products of two vectors. A special case is constituted by the elements ei ⊗ ej: each is the matrix that is 1 in position (i, j) and 0 elsewhere, and the set of all such matrices forms a basis for the space of m × n matrices, denoted by ℝm,n. Note that we use the field of real numbers as an example.
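As a quick check, here is a minimal Mathematica sketch (the vectors are chosen arbitrarily for illustration): the outer product of two standard basis vectors is the corresponding matrix unit, and an outer product of nonzero vectors always has rank 1.
(* e2 ⊗ e3 with e2 ∈ ℝ³ and e3 ∈ ℝ⁴: the 3×4 matrix with a single 1 in position (2, 3) *)
Outer[Times, UnitVector[3, 2], UnitVector[4, 3]]
Out[1]= {{0, 0, 0, 0}, {0, 0, 1, 0}, {0, 0, 0, 0}}
MatrixRank[Outer[Times, {1, 2}, {5, 7, 11}]]
Out[2]= 1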

On the other hand, in quantum mechanics it is customary to represent the tensor product of two vectors as a long (column) vector of length n + m, similar to an element of the direct product. For example, the state of a two-particle system can be described by something called a density matrix ρ on the tensor product of their respective spaces ℂn⊗ℂn. A density matrix is a generalization of a unit vector; it accounts for interactions between the two particles.

The vector representation of the tensor product captures all the ways that basis elements can "interact" with each other. We can visualize this situation in the following picture, which shows the interactions of the basis elements in the tensor product of 2- and 3-vectors.

[Figure: interaction diagram with 5 nodes, the 2 + 3 basis elements, whose connections represent the basis interactions; it illustrates ℝ²⊗ℝ³ ≌ ℝ⁶.]

It is possible to represent the tensor product of two vectors of lengths m and n as a single vector of length mn. However, this approach, known as the Kronecker product, is less efficient than the vector representation inherited from the direct product. Therefore, the Kronecker product is described in the part of this tutorial devoted to matrices, where it originated.
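For comparison, here is a brief Mathematica sketch of the length-mn representation, using the vectors of Example 1 below: the Kronecker product of two vectors is exactly the row-by-row flattening of their outer-product matrix.
(* the mn-vector of the Kronecker product equals the flattened outer-product matrix *)
Flatten[Outer[Times, {3, 2, 1}, {1, 2, 3, 4}]] == KroneckerProduct[{3, 2, 1}, {1, 2, 3, 4}]
Out[1]= True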
Example 1: Let us consider two vectors
\[ {\bf v} = \begin{pmatrix} 3 \\ 2 \\ 1 \end{pmatrix} \qquad \mbox{and} \qquad {\bf u} = \begin{pmatrix} 1 \\ 2 \\ 3 \\ 4 \end{pmatrix} \]
Then their tensor product in matrix form is
\[ {\bf v} \otimes {\bf u} = \begin{bmatrix} 3 & 6 & 9 & 12 \\ 2 & 4 & 6 & 8 \\ 1 & 2 & 3 & 4 \end{bmatrix} . \]
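This matrix can be reproduced with Mathematica's Outer command (the same command is used in Example 5 below):
Outer[Times, {3, 2, 1}, {1, 2, 3, 4}]
Out[1]= {{3, 6, 9, 12}, {2, 4, 6, 8}, {1, 2, 3, 4}}
End of Example 1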
Example 2: Let us again consider the two vectors
\[ {\bf v} = \begin{pmatrix} 3 \\ 2 \\ 1 \end{pmatrix} \qquad \mbox{and} \qquad {\bf u} = \begin{pmatrix} 1 \\ 2 \\ 3 \\ 4 \end{pmatrix} \]
For ℝ³, we have the standard basis:
\[ {\bf e}_1 = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} , \qquad {\bf e}_2 = \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} , \qquad {\bf e}_3 = \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} ; \]
So the given vector v can be expanded as
\[ {\bf v} = 3\,{\bf e}_1 + 2\,{\bf e}_2 + {\bf e}_3 . \]
For ℝ⁴, we have a similar standard basis:
\[ {\bf e}^1 = \begin{pmatrix} 1 \\ 0 \\ 0 \\ 0 \end{pmatrix} , \qquad {\bf e}^2 = \begin{pmatrix} 0 \\ 1 \\ 0 \\ 0 \end{pmatrix} , \qquad {\bf e}^3 = \begin{pmatrix} 0 \\ 0 \\ 1 \\ 0 \end{pmatrix} , \qquad {\bf e}^4 = \begin{pmatrix} 0 \\ 0 \\ 0 \\ 1 \end{pmatrix} . \]
This allows us to represent vector u as
\[ {\bf u} = {\bf e}^1 + 2\,{\bf e}^2 + 3\,{\bf e}^3 + 4\, {\bf e}^4 . \]
From these basis vectors for ℝ³ and ℝ⁴, we build 12 basis vectors for the tensor product by concatenating (stacking) each pair of basis vectors:
\[ {\bf e}_1 \otimes {\bf e}^1 = \left( \begin{array}{c} 1 \\ 0 \\ 0 \\ \hline 1 \\ 0 \\ 0 \\ 0 \end{array} \right) , \quad {\bf e}_2 \otimes {\bf e}^1 = \left( \begin{array}{c} 0 \\ 1 \\ 0 \\ \hline 1 \\ 0 \\ 0 \\ 0 \end{array} \right) , \quad {\bf e}_3 \otimes {\bf e}^1 = \left( \begin{array}{c} 0 \\ 0 \\ 1 \\ \hline 1 \\ 0 \\ 0 \\ 0 \end{array} \right) , \]
and
\[ {\bf e}_1 \otimes {\bf e}^2 = \left( \begin{array}{c} 1 \\ 0 \\ 0 \\ \hline 0 \\ 1 \\ 0 \\ 0 \end{array} \right) , \quad {\bf e}_2 \otimes {\bf e}^2 = \left( \begin{array}{c} 0 \\ 1 \\ 0 \\ \hline 0 \\ 1 \\ 0 \\ 0 \end{array} \right) , \quad {\bf e}_3 \otimes {\bf e}^2 = \left( \begin{array}{c} 0 \\ 0 \\ 1 \\ \hline 0 \\ 1 \\ 0 \\ 0 \end{array} \right) , \]
and
\[ {\bf e}_1 \otimes {\bf e}^3 = \left( \begin{array}{c} 1 \\ 0 \\ 0 \\ \hline 0 \\ 0 \\ 1 \\ 0 \end{array} \right) , \quad {\bf e}_2 \otimes {\bf e}^3 = \left( \begin{array}{c} 0 \\ 1 \\ 0 \\ \hline 0 \\ 0 \\ 1 \\ 0 \end{array} \right) , \quad {\bf e}_3 \otimes {\bf e}^3 = \left( \begin{array}{c} 0 \\ 0 \\ 1 \\ \hline 0 \\ 0 \\ 1 \\ 0 \end{array} \right) , \]
and
\[ {\bf e}_1 \otimes {\bf e}^4 = \left( \begin{array}{c} 1 \\ 0 \\ 0 \\ \hline 0 \\ 0 \\ 0 \\ 1 \end{array} \right) , \quad {\bf e}_2 \otimes {\bf e}^4 = \left( \begin{array}{c} 0 \\ 1 \\ 0 \\ \hline 0 \\ 0 \\ 0 \\ 1 \end{array} \right) , \quad {\bf e}_3 \otimes {\bf e}^4 = \left( \begin{array}{c} 0 \\ 0 \\ 1 \\ \hline 0 \\ 0 \\ 0 \\ 1 \end{array} \right) . \]
Then we can express the tensor product v ⊗ u through these basis vectors:
\begin{align*} {\bf v} \otimes {\bf u} &= \left( 3\,{\bf e}_1 + 2\,{\bf e}_2 + {\bf e}_3 \right) \otimes \left( {\bf e}^1 + 2\,{\bf e}^2 + 3\,{\bf e}^3 + 4\, {\bf e}^4 \right) \\ &= 3\,{\bf e}_1 \otimes {\bf e}^1 + 6\, {\bf e}_1 \otimes {\bf e}^2 + 9\,{\bf e}_1 \otimes {\bf e}^3 + 12\,{\bf e}_1 \otimes {\bf e}^4 \\ &\quad + 2\,{\bf e}_2 \otimes {\bf e}^1 + 4\, {\bf e}_2 \otimes {\bf e}^2 + 6\,{\bf e}_2 \otimes {\bf e}^3 + 8\,{\bf e}_2 \otimes {\bf e}^4 \\ &\quad + {\bf e}_3 \otimes {\bf e}^1 + 2\, {\bf e}_3 \otimes {\bf e}^2 + 3\,{\bf e}_3 \otimes {\bf e}^3 + 4\,{\bf e}_3 \otimes {\bf e}^4 . \end{align*}
If you instead collapse this tensor into a single 7-dimensional vector by summing the coefficients of the stacked basis vectors, most of the information about the tensor product is lost:
\[ {\bf v} \otimes {\bf u} \ne \left( \begin{array}{c} 30 \\ 20 \\ 10 \\ \hline 6 \\ 12 \\ 18 \\ 24 \end{array} \right) . \]
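A short Mathematica sketch makes the collapse explicit; the helper function basis is hypothetical, built with Join to mimic the stacking above. Summing the 12 coefficients vi uj against the stacked basis vectors merges them into only 7 numbers, reproducing the vector above.
v = {3, 2, 1}; u = {1, 2, 3, 4};
(* the stacked vector e_i ⊗ e^j of length 3 + 4 = 7 *)
basis[i_, j_] := Join[UnitVector[3, i], UnitVector[4, j]]
Sum[v[[i]] u[[j]] basis[i, j], {i, 3}, {j, 4}]
Out[1]= {30, 20, 10, 6, 12, 18, 24}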
End of Example 2
There is a similarity between the direct product V × U and the tensor product V ⊗ U: the space V × U consists of pairs (v, u) with v ∈ V and u ∈ U, while V ⊗ U is built from vectors v ⊗ u. Multiplication by scalars in the tensor product is defined according to the rule (axiom) in Eq.\eqref{EqTensor.1}, and addition is formulated in Eq.\eqref{EqTensor.2}. So we see that these two operations are completely different from the addition and scalar multiplication in the direct product.

These seemingly innocuous changes clearly have huge implications for the structure of the tensor space. It can be shown that the tensor product V ⊗ U is a vector space because it is a quotient space of a product space (we do not prove it here; see, for instance, the notes of Phil Lucht).

Example 3: Let us simplify
\[ (2, 1) \otimes (1, 4) + (2, -3) \otimes (-2, 3) . \]
Let us introduce the basis vectors x = (1, 0) and y = (0, 1). Then
\begin{align*} &(2, 1) \otimes (1, 4) + (2, -3) \otimes (-2, 3) \\ &= \left( 2x + y \right) \otimes \left( x + 4y \right) + \left( 2x -3y \right) \otimes \left( -2x + 3y \right) \\ &= 2\, x \otimes x + 8\, x \otimes y + y \otimes x + 4\, y \otimes y - 4\, x \otimes x + 6\, x \otimes y + 6\, y \otimes x - 9\, y \otimes y \\ &= - 2\, x \otimes x + 14\, x \otimes y + 7\, y \otimes x - 5\, y \otimes y . \end{align*}
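The same simplification can be checked with Mathematica's built-in TensorProduct command; the entries of the resulting 2×2 array are the coefficients of x⊗x, x⊗y, y⊗x, and y⊗y.
TensorProduct[{2, 1}, {1, 4}] + TensorProduct[{2, -3}, {-2, 3}]
Out[1]= {{-2, 14}, {7, -5}}
End of Example 3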
Example 4: First, we consider the simple case m = n = 1 and find ℝ⊗ℝ. It is spanned by the single basis vector i⊗j, where i and j are unit vectors in ℝ (hence, they are the same). Therefore, ℝ⊗ℝ can be visualized as the line (x, x) drawn in the plane ℝ².

In general, we have ℝm,1⊗ℝ1,n ≌ ℝm,n with the product defined to be u⊗v = uv, the matrix product of a column vector and a row vector. Of course, the two spaces 𝔽m,n and 𝔽n,m are naturally isomorphic to each other, 𝔽m,n ≌ 𝔽n,m, so in that sense they are the same, but we would never write v⊗u when we mean u⊗v.

End of Example 4
Example 5: For instance, if m = 4 and n = 3, then
\[ {\bf u} \otimes {\bf v} = {\bf u} \, {\bf v}^{\mathrm T} = \begin{bmatrix} u_1 \\ u_2 \\ u_3 \\ u_4 \end{bmatrix} \begin{bmatrix} v_1 & v_2 & v_3 \end{bmatrix} = \begin{bmatrix} u_1 v_1 & u_1 v_2 & u_1 v_3 \\ u_2 v_1 & u_2 v_2 & u_2 v_3 \\ u_3 v_1 & u_3 v_2 & u_3 v_3 \\ u_4 v_1 & u_4 v_2 & u_4 v_3 \end{bmatrix} . \]
In Mathematica, the outer product has a special command:
Outer[Times, {1, 2, 3, 4}, {a, b, c}]
Out[1]= {{a, b, c}, {2 a, 2 b, 2 c}, {3 a, 3 b, 3 c}, {4 a, 4 b, 4 c}}
If we take two complex-valued vectors \( {\bf u} = [1 + {\bf j}, 2, -1 -2{\bf j}, 2 -{\bf j}] \) and \( {\bf v} = [3 + {\bf j}, -1 + {\bf j}, 2 -{\bf j}] , \) then their outer product becomes
\[ {\bf u} \otimes {\bf v} = {\bf u} \, \overline{{\bf v}^{\mathrm T}} = \begin{bmatrix} 4+2{\bf j} & -2{\bf j} & 1+3{\bf j} \\ 6-2{\bf j} & -2-2{\bf j} & 4+2{\bf j} \\ -5 -5{\bf j} & -1 + 3{\bf j} & -5{\bf j} \\ 5 - 5{\bf j} & -3-{\bf j} & 5 \end{bmatrix} . \]
Outer[Times, {1 + I, 2, -1 - 2*I, 2 - I}, Conjugate[{3 + I, -1 + I, 2 - I}]]
or, equivalently, as the matrix product of a 4×1 column and a conjugated 1×3 row:
{{1 + I}, {2}, {-1 - 2*I}, {2 - I}} . Conjugate[{{3 + I, -1 + I, 2 - I}}]
Out[2]= {{4 + 2 I, -2 I, 1 + 3 I}, {6 - 2 I, -2 - 2 I, 4 + 2 I}, {-5 - 5 I, -1 + 3 I, -5 I}, {5 - 5 I, -3 - I, 5}}
MatrixRank[%]
Out[3]= 1
which confirms that the outer product is a rank-1 matrix.
End of Example 5
Example 6: Let V = 𝔽[x] be the vector space of polynomials in one variable. Then V⊗V = 𝔽[x₁, x₂] is the space of polynomials in two variables, where the product is defined by p(x)⊗q(x) = p(x₁) q(x₂).

Note: this is not a commutative product, because in general

\[ p(x)\otimes q(x) = p\left( x_1 \right) q\left( x_2 \right) \ne q \left( x_1 \right) p \left( x_2 \right) = q\left( x \right) \otimes p \left( x \right) . \]
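A minimal Mathematica sketch, with hypothetical polynomials p and q, realizes this product and exhibits the noncommutativity:
p[x_] := 1 + x; q[x_] := x^2;
(* p ⊗ q is realized as p(x1) q(x2) *)
Expand[p[x1] q[x2]]
Out[1]= x2^2 + x1 x2^2
Expand[q[x1] p[x2]]
Out[2]= x1^2 + x1^2 x2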
End of Example 6

Example 7: If V is any vector space over 𝔽, then V ⊗ 𝔽 ≌ V. In this case, the tensor product ⊗ is just scalar multiplication.
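This can be seen directly in Mathematica, assuming TensorProduct is given a scalar as its second argument (a scalar is treated as a rank-0 tensor, so the product reduces to scaling):
TensorProduct[{1, 2, 3}, 5]
Out[1]= {5, 10, 15}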
End of Example 7
Example 8: If V and U are finite dimensional vector spaces over a field 𝔽, then V* = ℒ(V, 𝔽) is the set of all linear functionals on V. Hence, V* ⊗ U = ℒ(V, U), where φ⊗u ∈ ℒ(V, U) is defined to be the linear transformation from V to U given by (φ⊗u)(v) = φ(v)⋅u, for φ ∈ V* and u ∈ U.

This result is just the abstract version of Example 4. If V = 𝔽n,1 and U = 𝔽m,1, then V* = 𝔽1,n. From Example 4, V* ⊗ U is identified with 𝔽m,n, which is in turn identified with ℒ(V, U), the set of all linear transformations from V to U.
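In coordinates this identification is easy to test. The following Mathematica sketch, with arbitrarily chosen illustrative vectors phi, u, and v, checks that the matrix of φ⊗u sends v to φ(v)⋅u:
phi = {1, -2, 3}; (* a functional on V = ℝ³, written as a row vector *)
u = {4, 5};       (* a vector in U = ℝ² *)
v = {2, 0, 1};    (* a test vector in V *)
(* the matrix of phi⊗u is the outer product of u and phi *)
Outer[Times, u, phi].v == (phi.v) u
Out[1]= True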

Note: If V and U are both infinite dimensional, then V* ⊗ U is a subspace of ℒ(V, U), but not equal to it, because linear combinations include only a finite number of terms. Specifically, V* ⊗ U = { T ∈ ℒ(V, U) : dim range(T) < ∞ } is the set of finite-rank linear transformations in ℒ(V, U), the space of all linear maps from V to U. Recall that the tensor product V* ⊗ U consists only of finite linear combinations of rank-1 elements φ⊗u.

End of Example 8
Example 9: Let us consider ℚn and ℝ as vector spaces over the field of rational numbers. Then ℚn⊗ℝ is isomorphic as a vector space over ℚ to ℝn, where the multiplication is just scalar multiplication on ℝn.

Let α = { e1, e2, … , en } be the standard basis for ℚn and let β be an (infinite) basis for ℝ as a vector space over ℚ. First we show that α⊗β spans ℝn. Let (𝑎1, 𝑎2, … , 𝑎n) ∈ ℝn. Every component can be expanded into a finite sum \( \displaystyle a_i = \sum_j b_{ij} x_j , \quad i=1,2,\ldots , n , \) with rational coefficients bij and xj ∈ β. Thus, \( \displaystyle ( a_1 , a_2 , \ldots , a_n ) = \sum_{i,j} b_{ij} {\bf e}_i \otimes x_j . \)

Next we show that α⊗β is linearly independent. Suppose that \( \displaystyle \sum_{i,j} c_{ij} {\bf e}_i \otimes x_j = 0 . \) Since \[ \sum_{i,j} c_{ij} {\bf e}_i \otimes x_j = \left( \sum_j c_{1,j} x_j , \sum_j c_{2,j} x_j , \ldots , \sum_j c_{n,j} x_j \right) , \] we have \( \displaystyle \sum_{j} c_{ij} x_j = 0 \) for all i. Since the { xj } are linearly independent over ℚ, all cij = 0, as required.

End of Example 9

 
