$$\underline{\boldsymbol{e}}_1\otimes\underline{\boldsymbol{e}}_2 = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}$$

$\underline{\boldsymbol{e}}_1\otimes\underline{\boldsymbol{e}}_2$ is similar to a basis vector, $\underline{\boldsymbol{e}}_i$, except that we now have a 2nd order base tensor. In pure index notation, the open product corresponds to multiplying two index objects, say $b_i$ and $c_j$, with different free indices, i.e. $a_{ij} = b_i c_j$.
If we want to express this using matrix operations, it corresponds to the following vector–vector multiplication:
$$\begin{aligned} \underline{\boldsymbol{u}} = \begin{bmatrix} u_1 \\ u_2 \\ u_3 \end{bmatrix}, \quad \underline{\boldsymbol{v}} = \begin{bmatrix} v_1 \\ v_2 \\ v_3 \end{bmatrix}, \quad \underline{\boldsymbol{u}}\,\underline{\boldsymbol{v}}^{\mathrm{T}} = \begin{bmatrix} u_1 v_1 & u_1 v_2 & u_1 v_3 \\ u_2 v_1 & u_2 v_2 & u_2 v_3 \\ u_3 v_1 & u_3 v_2 & u_3 v_3 \end{bmatrix} \end{aligned}$$

where we see that we increased the order from vectors (1st order) to a matrix (2nd order).

The single contraction is defined as the dot product between the closest basis vectors. For two 2nd order tensors this becomes

$$\begin{aligned} \boldsymbol{a}\cdot\boldsymbol{b} &= a_{ij}\underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{j} \cdot b_{kl} \underline{\boldsymbol{e}}_{k}\otimes\underline{\boldsymbol{e}}_{l} = a_{ij} b_{kl} \underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{j} \cdot \underline{\boldsymbol{e}}_{k}\otimes\underline{\boldsymbol{e}}_{l} \\ &= a_{ij} b_{kl} \delta_{jk} \underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{l} = a_{ij} b_{jl} \underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{l} \end{aligned}$$

Often, we skip the dot, "$\cdot$", except between two vectors, for convenience: e.g. $\boldsymbol{a}\cdot\boldsymbol{b}=\boldsymbol{a}\boldsymbol{b}$.
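As a numerical sketch of the open product above (NumPy is used purely for illustration here; the array values are made up), the open product of two vectors is exactly the outer product $\underline{\boldsymbol{u}}\,\underline{\boldsymbol{v}}^{\mathrm{T}}$:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# Open product in index notation: a_ij = u_i v_j
# (raises the order: two 1st order objects -> one 2nd order object)
a = np.einsum('i,j->ij', u, v)

# Equivalent matrix operation: u v^T
assert np.allclose(a, np.outer(u, v))
```

Here `einsum('i,j->ij', ...)` spells out the index expression $a_{ij} = u_i v_j$ directly.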
Additional examples

Two 1st order tensors, $\underline{\boldsymbol{u}}\cdot\underline{\boldsymbol{v}}$:

$$c = \underline{\boldsymbol{u}}\cdot\underline{\boldsymbol{v}} = u_i v_i$$
Two 2nd order tensors, $\boldsymbol{a}\boldsymbol{b}$:

$$\boldsymbol{c}=c_{ij}\underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{j}=\boldsymbol{a}\boldsymbol{b} = a_{ij} b_{jk} \underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{k}$$

In pure index notation: $c_{ij} = a_{ik} b_{kj}$
2nd and 1st order tensors, $\boldsymbol{a}\underline{\boldsymbol{u}}$:

$$\boldsymbol{v}=v_{i}\underline{\boldsymbol{e}}_{i}=\boldsymbol{a}\underline{\boldsymbol{u}} = a_{ij} u_{j} \underline{\boldsymbol{e}}_{i}$$

In pure index notation: $v_i=a_{ij} u_j$
Did you notice anything strange about the expression for two 2nd order tensors? The indices on the left and right hand side basis don't match ($c_{ij}\underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{j}$ versus $a_{ij}b_{jk}\underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{k}$). This is perfectly valid! The indices are dummy indices. However, when not writing out the basis vectors, we assume that the indices and index order of the basis vectors are the same in each term of an expression. If that holds, such as in the equivalent expression $c_{ik}\underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{k}=a_{ij} b_{jk} \underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{k}$, we can express it as $c_{ik} = a_{ij} b_{jk}$ and drop the basis vectors. This way of expressing it works if: (1) we have an orthonormal basis system and (2) the basis vector indices and their order are the same on both sides.
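The single contractions above can be checked numerically. A sketch with NumPy (the arrays are illustrative, not from the text): `einsum` spells out $c_{ik} = a_{ij} b_{jk}$ and $v_i = a_{ij} u_j$, which coincide with the ordinary matrix products.

```python
import numpy as np

a = np.arange(9.0).reshape(3, 3)
b = np.arange(9.0, 18.0).reshape(3, 3)
u = np.array([1.0, 2.0, 3.0])

# Single contraction of two 2nd order tensors: c_ik = a_ij b_jk
c = np.einsum('ij,jk->ik', a, b)
assert np.allclose(c, a @ b)  # same as the matrix product

# Single contraction of 2nd and 1st order tensors: v_i = a_ij u_j
v = np.einsum('ij,j->i', a, u)
assert np.allclose(v, a @ u)  # same as the matrix-vector product
```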
The single contraction is the "standard" product used in matrix–vector and matrix–matrix multiplication. It is sometimes called the inner product.
The double contraction between tensors $\textbf{\textsf{A}}$ and $\textbf{\textsf{B}}$ (arbitrary order $\geq 2$) is defined as the dot product between $\textbf{\textsf{A}}$'s second last basis vector and $\textbf{\textsf{B}}$'s first basis vector and, at the same time, the dot product between $\textbf{\textsf{A}}$'s last basis vector and $\textbf{\textsf{B}}$'s second basis vector. For two 4th order tensors this becomes

$$\begin{aligned} \textbf{\textsf{A}}:\textbf{\textsf{B}} &= \textsf{A}_{ijkl}\underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{j}\otimes\underline{\boldsymbol{e}}_{k}\otimes\underline{\boldsymbol{e}}_{l} : \textsf{B}_{mnop} \underline{\boldsymbol{e}}_{m}\otimes\underline{\boldsymbol{e}}_{n}\otimes\underline{\boldsymbol{e}}_{o}\otimes\underline{\boldsymbol{e}}_{p} \\ &= \textsf{A}_{ijkl} \textsf{B}_{mnop} (\underline{\boldsymbol{e}}_{k} \cdot \underline{\boldsymbol{e}}_{m}) (\underline{\boldsymbol{e}}_{l} \cdot \underline{\boldsymbol{e}}_{n})\,\underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{j}\otimes\underline{\boldsymbol{e}}_{o}\otimes\underline{\boldsymbol{e}}_{p}\\ &= \textsf{A}_{ijkl} \textsf{B}_{mnop} \delta_{km} \delta_{ln} \underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{j}\otimes\underline{\boldsymbol{e}}_{o}\otimes\underline{\boldsymbol{e}}_{p}\\ &= \textsf{A}_{ijkl} \textsf{B}_{klop} \underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{j}\otimes\underline{\boldsymbol{e}}_{o}\otimes\underline{\boldsymbol{e}}_{p} \end{aligned}$$

If we have $\textbf{\textsf{C}}=\textbf{\textsf{A}}:\textbf{\textsf{B}}$, then, in pure index notation, $\textsf{C}_{ijop}=\textsf{A}_{ijkl}\textsf{B}_{klop}$.

Additional examples

Two 2nd order tensors, $\boldsymbol{a}:\boldsymbol{b}$:

$$c = \boldsymbol{a}:\boldsymbol{b} = a_{ij} b_{ij}$$
4th and 2nd order tensors, $\textbf{\textsf{A}}:\boldsymbol{b}$:

$$\boldsymbol{c} = c_{ij}\underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{j} = \textbf{\textsf{A}}:\boldsymbol{b} = \textsf{A}_{ijkl} b_{kl} \underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{j}$$

In pure index notation: $c_{ij} = \textsf{A}_{ijkl} b_{kl}$
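Both double contraction examples translate directly into `einsum` index strings. A sketch (random NumPy arrays standing in for the tensors):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3, 3, 3))  # 4th order tensor A_ijkl
b = rng.standard_normal((3, 3))        # 2nd order tensor b_kl

# 4th : 2nd order, c_ij = A_ijkl b_kl (result is 2nd order)
c = np.einsum('ijkl,kl->ij', A, b)
assert np.allclose(c, np.tensordot(A, b, axes=2))

# 2nd : 2nd order, scalar result: s = b_ij b_ij
s = np.einsum('ij,ij->', b, b)
assert np.isclose(s, np.sum(b * b))
```

Note how each double contraction removes two pairs of indices, lowering the combined order by four.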
The double contraction can loosely be thought of, for 2nd and 4th order tensors, as the equivalent of the dot product for 1st and 2nd order tensors (cf. vectors and matrices). For example, as we will see later, the norm of a 2nd order tensor $\boldsymbol{a}$ is $\sqrt{\boldsymbol{a}:\boldsymbol{a}}$, and an "angle", $\theta$, between two tensors $\boldsymbol{a}$ and $\boldsymbol{b}$ can be defined via $\sqrt{\boldsymbol{a}:\boldsymbol{a}}\sqrt{\boldsymbol{b}:\boldsymbol{b}}\cos(\theta)=\boldsymbol{a}:\boldsymbol{b}$.
We may introduce special open products that permute the order of the basis vectors. Two common definitions for 2nd order tensors are

$$\begin{aligned} \boldsymbol{a}\overline{\otimes}\boldsymbol{b} &= \left(a_{ik}\underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{k}\right) \overline{\otimes} \left(b_{jl} \underline{\boldsymbol{e}}_{j}\otimes\underline{\boldsymbol{e}}_{l}\right) \\ &= a_{ik} b_{jl} \left[\underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{k}\right] \overline{\otimes} \left[\underline{\boldsymbol{e}}_{j}\otimes\underline{\boldsymbol{e}}_{l}\right] \\ &= a_{ik} b_{jl} \underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{j}\otimes\underline{\boldsymbol{e}}_{k}\otimes\underline{\boldsymbol{e}}_{l} \end{aligned}$$

If we have $\textbf{\textsf{C}}=\boldsymbol{a}\overline{\otimes}\boldsymbol{b}$, then, in pure index notation, $\textsf{C}_{ijkl}=a_{ik} b_{jl}$.

$$\begin{aligned} \boldsymbol{a}\underline{\otimes}\boldsymbol{b} &= \left(a_{il}\underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{l}\right) \underline{\otimes} \left(b_{jk} \underline{\boldsymbol{e}}_{j}\otimes\underline{\boldsymbol{e}}_{k}\right) \\ &= a_{il} b_{jk} \left[\underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{l}\right] \underline{\otimes} \left[\underline{\boldsymbol{e}}_{j}\otimes\underline{\boldsymbol{e}}_{k}\right] \\ &= a_{il} b_{jk} \underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{j}\otimes\underline{\boldsymbol{e}}_{k}\otimes\underline{\boldsymbol{e}}_{l} \end{aligned}$$

If we have $\textbf{\textsf{C}}=\boldsymbol{a}\underline{\otimes}\boldsymbol{b}$, then, in pure index notation, $\textsf{C}_{ijkl}=a_{il} b_{jk}$.

While it might seem arbitrary to introduce these special open products, they are useful in many cases. For example, the 4th order identity tensor is
$$\textbf{\textsf{I}}=\boldsymbol{I}\overline{\otimes}\boldsymbol{I}$$

where $\boldsymbol{I}$ is the 2nd order identity tensor.
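The two special open products, and the claim that $\boldsymbol{I}\overline{\otimes}\boldsymbol{I}$ is the 4th order identity, can be sketched numerically (NumPy here is only an illustration):

```python
import numpy as np

I2 = np.eye(3)  # 2nd order identity tensor, I_ij = δ_ij

# Upper open product: (a ⊗̄ b)_ijkl = a_ik b_jl
II = np.einsum('ik,jl->ijkl', I2, I2)        # 4th order identity tensor
# Lower open product: (a ⊗̲ b)_ijkl = a_il b_jk
II_low = np.einsum('il,jk->ijkl', I2, I2)    # the "transposer"

b = np.arange(9.0).reshape(3, 3)

# I ⊗̄ I acts as the identity under double contraction: (I ⊗̄ I) : b = b
assert np.allclose(np.einsum('ijkl,kl->ij', II, b), b)
# I ⊗̲ I instead returns the transpose: (I ⊗̲ I) : b = b^T
assert np.allclose(np.einsum('ijkl,kl->ij', II_low, b), b.T)
```

The last two lines show why the distinction matters: the two products differ only in how the basis vectors are interleaved, yet they define very different 4th order tensors.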
By using our rules for index notation, we can investigate how the above multiplication operations behave. The first example is the order of the operands in the dot product between two vectors, that is,
$$\begin{aligned} \underline{\boldsymbol{u}}\cdot\underline{\boldsymbol{v}} = u_i v_i = v_i u_i = \underline{\boldsymbol{v}}\cdot\underline{\boldsymbol{u}} \end{aligned}$$

so clearly the order of the vectors does not matter. Similarly, for the double contraction between two 2nd order tensors,
$$\begin{aligned} \boldsymbol{a}:\boldsymbol{b} = a_{ij} b_{ij} = b_{ij} a_{ij} = \boldsymbol{b}:\boldsymbol{a} \end{aligned}$$

we also have no difference if we flip the order. However, if we take the dot product between a 2nd and 1st order tensor,
$$\begin{aligned} \boldsymbol{a}\cdot\underline{\boldsymbol{u}} = a_{ij} u_j \underline{\boldsymbol{e}}_{i} = u_j a_{ij} \underline{\boldsymbol{e}}_{i} \neq u_j a_{ji} \underline{\boldsymbol{e}}_{i} = \underline{\boldsymbol{u}}\cdot\boldsymbol{a}\; \forall\; \boldsymbol{a} \end{aligned}$$

The reason that the last relation does not hold for all $\boldsymbol{a}$ is that we contract the last index of $\boldsymbol{a}$ with $\underline{\boldsymbol{u}}$'s index. (Note that if $\boldsymbol{a}$ is symmetric, then this would be an equality.)
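A small numerical example makes the asymmetry concrete (illustrative NumPy arrays, not from the text):

```python
import numpy as np

a = np.array([[1.0, 2.0],
              [3.0, 4.0]])  # a non-symmetric 2nd order tensor
u = np.array([1.0, 1.0])

# a·u contracts a's LAST index:  (a·u)_i = a_ij u_j
au = np.einsum('ij,j->i', a, u)
# u·a contracts a's FIRST index: (u·a)_i = u_j a_ji
ua = np.einsum('j,ji->i', u, a)

assert not np.allclose(au, ua)  # they differ, since a is not symmetric

# For a symmetric tensor the two products coincide
a_sym = 0.5 * (a + a.T)
assert np.allclose(a_sym @ u, u @ a_sym)
```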
If we instead look at a sum, we can consider $\textbf{\textsf{A}}:\left[\boldsymbol{b} + \boldsymbol{c}\right]$,
$$\begin{aligned} \textsf{A}_{ijkl}\left[b_{kl} + c_{kl}\right] = \textsf{A}_{ijkl} b_{kl} + \textsf{A}_{ijkl} c_{kl} \end{aligned}$$

which holds as we are only considering summation and multiplication when working with the indices, i.e. $\textbf{\textsf{A}}:\left[\boldsymbol{b} + \boldsymbol{c}\right]=\textbf{\textsf{A}}:\boldsymbol{b} + \textbf{\textsf{A}}:\boldsymbol{c}$
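This distributivity can be spot-checked numerically (random illustrative arrays):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3, 3, 3))
b = rng.standard_normal((3, 3))
c = rng.standard_normal((3, 3))

# A : (b + c) == A : b + A : c
lhs = np.einsum('ijkl,kl->ij', A, b + c)
rhs = np.einsum('ijkl,kl->ij', A, b) + np.einsum('ijkl,kl->ij', A, c)
assert np.allclose(lhs, rhs)
```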
For completeness, the cross product between a 2nd order tensor and a vector is given as
$$\begin{aligned} \boldsymbol{a}\times\underline{\boldsymbol{v}} = a_{ij} v_k \underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{j} \times \underline{\boldsymbol{e}}_{k} = a_{ij} v_k \underline{\boldsymbol{e}}_{i} \otimes [\varepsilon_{jkm} \underline{\boldsymbol{e}}_{m}] = \varepsilon_{jkm} a_{ij} v_k \underline{\boldsymbol{e}}_{i}\otimes\underline{\boldsymbol{e}}_{m} \end{aligned}$$
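Building the permutation symbol $\varepsilon_{jkm}$ explicitly lets us evaluate this in index notation and compare against row-wise vector cross products (a NumPy sketch with illustrative values):

```python
import numpy as np

# Levi-Civita permutation symbol ε_jkm
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0   # even permutations
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0  # odd permutations

a = np.arange(9.0).reshape(3, 3)
v = np.array([1.0, 2.0, 3.0])

# (a × v)_im = ε_jkm a_ij v_k
axv = np.einsum('jkm,ij,k->im', eps, a, v)

# Sanity check: row i of the result is the cross product of a's row i with v
assert np.allclose(axv, np.cross(a, v))
```

The result stays a 2nd order tensor: the cross product replaces $\underline{\boldsymbol{e}}_j\times\underline{\boldsymbol{e}}_k$ by a single basis vector, so no order is lost.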