Tensor invariants

One important application of tensor invariants is the modeling of material behavior. For example, the von Mises effective stress, $\sqrt{\frac{3}{2}\,\boldsymbol{\sigma}^{\mathrm{dev}}:\boldsymbol{\sigma}^{\mathrm{dev}}}$, is an invariant, because both the double contraction of a tensor with itself and the trace (used to obtain the deviatoric part) are invariants.
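As a quick numerical check, the sketch below (using NumPy, with a made-up stress state and the conventional factor $3/2$) verifies that the von Mises stress is unchanged when the stress tensor is rotated:

```python
import numpy as np

# Hypothetical stress state (values are made up for illustration).
sigma = np.array([[100.0,  30.0,  0.0],
                  [ 30.0, -50.0, 10.0],
                  [  0.0,  10.0, 20.0]])

def von_mises(s):
    """Effective stress sqrt(3/2 * s_dev : s_dev)."""
    dev = s - np.trace(s) / 3.0 * np.eye(3)
    return np.sqrt(1.5 * np.tensordot(dev, dev))  # tensordot: full double contraction

# Transform the stress state as sigma' = R sigma R^T (rotation about the z-axis).
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
sigma_rot = R @ sigma @ R.T

print(np.isclose(von_mises(sigma), von_mises(sigma_rot)))  # True
```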

Basic invariants

The basic invariants of a 2nd order tensor are

  • $I^{(1)}_{\boldsymbol{a}} = \mathrm{tr}(\boldsymbol{a})$

  • $I^{(2)}_{\boldsymbol{a}} = \mathrm{tr}(\boldsymbol{a}^2) = \boldsymbol{a}:\boldsymbol{a}^{\mathrm{T}}$

  • $I^{(3)}_{\boldsymbol{a}} = \mathrm{tr}(\boldsymbol{a}^3)$

That these are invariant under a rotation $\boldsymbol{R}$, i.e. under the transformation $\boldsymbol{a} \rightarrow \boldsymbol{R}\boldsymbol{a}\boldsymbol{R}^{\mathrm{T}}$, follows from index manipulation, using $R^{\mathrm{T}}_{ki} R_{ij} = \delta_{kj}$:

$$\begin{aligned} \mathrm{tr}(\boldsymbol{R}\boldsymbol{a}\boldsymbol{R}^{\mathrm{T}}) &= R_{ij} a_{jk} R^{\mathrm{T}}_{ki} = R^{\mathrm{T}}_{ki} R_{ij} a_{jk} = \delta_{kj} a_{jk} = a_{jj} = \mathrm{tr}(\boldsymbol{a})\\ (\boldsymbol{R}\boldsymbol{a}\boldsymbol{R}^{\mathrm{T}}):(\boldsymbol{R}\boldsymbol{a}^{\mathrm{T}}\boldsymbol{R}^{\mathrm{T}}) &= R_{ij} a_{jk} R^{\mathrm{T}}_{kl}\, R_{im} a^{\mathrm{T}}_{mn} R^{\mathrm{T}}_{nl}\\ &= R^{\mathrm{T}}_{mi} R_{ij}\, R^{\mathrm{T}}_{kl} R_{ln}\, a_{jk} a^{\mathrm{T}}_{mn} = \delta_{mj}\delta_{kn}\, a_{jk} a^{\mathrm{T}}_{mn} = a_{mn} a^{\mathrm{T}}_{mn} = \boldsymbol{a}:\boldsymbol{a}^{\mathrm{T}} \end{aligned}$$
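The index gymnastics above can be confirmed numerically. The sketch below checks that all three basic invariants are unchanged under an orthogonal transformation (the tensor and the transformation are randomly generated for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal((3, 3))          # an arbitrary 2nd order tensor

# An orthogonal matrix Q from the QR factorization of a random matrix.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

b = Q @ a @ Q.T                          # the transformed tensor Q a Q^T

I1 = lambda t: np.trace(t)
I2 = lambda t: np.trace(t @ t)
I3 = lambda t: np.trace(t @ t @ t)

print(np.isclose(I1(a), I1(b)), np.isclose(I2(a), I2(b)), np.isclose(I3(a), I3(b)))
```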

Eigenvalues

Another set of invariants of a 2nd order tensor are its eigenvalues, $\lambda_i$. The eigenvalues and corresponding eigenvectors are defined by

$$\begin{aligned} \boldsymbol{a}\underline{\boldsymbol{v}}_i = \lambda_i \underline{\boldsymbol{v}}_i \end{aligned}$$

The non-zero vectors $\underline{\boldsymbol{v}}_i$ that fulfill this equation are called eigenvectors of $\boldsymbol{a}$, and the corresponding values $\lambda_i$ are called eigenvalues.

As for matrices, we can determine the eigenvalues by solving the characteristic equation

$$\begin{aligned} \mathrm{det}\left(\boldsymbol{a}-\lambda \boldsymbol{I}\right) = 0 \end{aligned}$$

Given the eigenvalues, $\lambda_i$, each corresponding eigenvector $\underline{\boldsymbol{v}}_i$ can be obtained by solving

$$\begin{aligned} \left(\boldsymbol{a}-\lambda_i \boldsymbol{I}\right)\underline{\boldsymbol{v}}_i = \underline{\boldsymbol{0}} \end{aligned}$$

for a nonzero $\underline{\boldsymbol{v}}_i$. Since the right-hand side is zero, the magnitude of $\underline{\boldsymbol{v}}_i$ is arbitrary. In practice, it is common to normalize the eigenvectors.
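In numerical practice, the eigenvalue problem is solved directly rather than via the characteristic polynomial. A minimal sketch with NumPy (the example tensor is made up; `eigh` assumes a symmetric tensor):

```python
import numpy as np

a = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])   # a symmetric example tensor

# eigh returns the eigenvalues in ascending order and the normalized
# eigenvectors as the columns of V.
lam, V = np.linalg.eigh(a)

# Check a v_i = lambda_i v_i for every eigenpair.
for i in range(3):
    assert np.allclose(a @ V[:, i], lam[i] * V[:, i])

# The returned eigenvectors have unit length (the magnitude is arbitrary).
print(np.allclose(np.linalg.norm(V, axis=0), 1.0))  # True
```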

Spectral decomposition

Above, we introduced the right eigenvectors $\underline{\boldsymbol{v}}$, such that $\boldsymbol{a}\underline{\boldsymbol{v}}=\lambda\underline{\boldsymbol{v}}$. It is also possible to introduce left eigenvectors, $\underline{\boldsymbol{w}}$, such that $\underline{\boldsymbol{w}}\boldsymbol{a}=\lambda\underline{\boldsymbol{w}}$. This leads to the equivalent characteristic equation, $\mathrm{det}\left(\boldsymbol{a}^{\mathrm{T}}-\lambda\boldsymbol{I}\right) = \mathrm{det}\left(\boldsymbol{a}-\lambda\boldsymbol{I}\right) = 0$, so the eigenvalues are the same. The left and right eigenvectors, however, are not always the same: they coincide if $\boldsymbol{a}$ is symmetric.

If we have distinct eigenvalues, $\lambda_i \neq \lambda_j$ for $i\neq j$, then we have

$$\begin{aligned} \underline{\boldsymbol{w}}_i \boldsymbol{a} \underline{\boldsymbol{v}}_j &= \underline{\boldsymbol{w}}_i \cdot (\lambda_j \underline{\boldsymbol{v}}_j) = \lambda_j\, \underline{\boldsymbol{w}}_i\cdot\underline{\boldsymbol{v}}_j\\ &= (\lambda_i \underline{\boldsymbol{w}}_i) \cdot \underline{\boldsymbol{v}}_j = \lambda_i\, \underline{\boldsymbol{w}}_i \cdot \underline{\boldsymbol{v}}_j \end{aligned}$$

Subtracting the first line from the second, we obtain

$$\begin{aligned} \left(\lambda_i - \lambda_j\right) \underline{\boldsymbol{w}}_i \cdot \underline{\boldsymbol{v}}_j &= 0\\ \Rightarrow \quad \underline{\boldsymbol{w}}_i \cdot \underline{\boldsymbol{v}}_j &= 0, \quad i\neq j \end{aligned}$$

showing that for distinct eigenvalues, $\lambda_i\neq\lambda_j$, the left and right eigenvectors $\underline{\boldsymbol{w}}_i$ and $\underline{\boldsymbol{v}}_j$ are orthogonal.
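This orthogonality between *left* and *right* eigenvectors can be checked numerically. A sketch, using a made-up non-symmetric tensor with distinct real eigenvalues (the left eigenvectors are obtained as the right eigenvectors of $\boldsymbol{a}^{\mathrm{T}}$):

```python
import numpy as np

# A non-symmetric tensor with distinct real eigenvalues (3, 2, 1).
a = np.array([[3.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 1.0]])

# Right eigenvectors: a v = lambda v.  Left eigenvectors satisfy w a = lambda w,
# i.e. they are the (right) eigenvectors of a^T.
lam_r, V = np.linalg.eig(a)
lam_l, W = np.linalg.eig(a.T)

# Match the two eigenvalue orderings before comparing eigenvectors.
V = V[:, np.argsort(lam_r)]
W = W[:, np.argsort(lam_l)]

# Entry (i, j) of G is w_i . v_j; off-diagonal entries vanish for i != j.
G = W.T @ V
print(np.allclose(G - np.diag(np.diag(G)), 0.0))  # True
```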

If the tensor $\boldsymbol{a}$ is symmetric, the right and left eigenvectors are equal, and the result above then implies that its eigenvectors are mutually orthogonal. A symmetric tensor has 6 degrees of freedom (independent components). Let's write $\boldsymbol{a}$ as

$$\begin{aligned} \boldsymbol{a} = \lambda_{ij}\, \underline{\boldsymbol{v}}_i \otimes \underline{\boldsymbol{v}}_j \end{aligned}$$

where $\lambda_{ij}$ are just the coefficients for the eigenvector basis. If we evaluate $\boldsymbol{a}\cdot\underline{\boldsymbol{v}}_k$, we obtain

$$\begin{aligned} \boldsymbol{a}\cdot\underline{\boldsymbol{v}}_k &= \lambda_{ij}\, \underline{\boldsymbol{v}}_i \left(\underline{\boldsymbol{v}}_j \cdot \underline{\boldsymbol{v}}_k\right) = \lambda_{ij}\,\delta_{jk}\, \underline{\boldsymbol{v}}_i = \lambda_{ik}\, \underline{\boldsymbol{v}}_i\\ &= \lambda_k\, \underline{\boldsymbol{v}}_k \quad (\text{no sum on } k) \end{aligned}$$

showing that

$$\begin{aligned} \lambda_{ij} = \left\lbrace\begin{matrix} \lambda_i & i=j \\ 0 & i\neq j \end{matrix}\right. \end{aligned}$$

The conditions $\boldsymbol{a}\cdot\underline{\boldsymbol{v}}_k = \lambda_k\underline{\boldsymbol{v}}_k$ give 9 conditions, as for each $k$ there are 3 equations. However, we have already restricted ourselves to symmetric tensors, for which we showed that the eigenvectors are orthogonal. Therefore, only 6 of the conditions are linearly independent. Still, this suffices to fully specify our tensor: if we know the eigenvalues, $\lambda_i$, and normalized eigenvectors, $\underline{\boldsymbol{v}}_i$, of a symmetric tensor $\boldsymbol{a}$, we can express it as

$$\begin{aligned} \boldsymbol{a} = \sum_{i=1}^3 \lambda_i\, \underline{\boldsymbol{v}}_i \otimes \underline{\boldsymbol{v}}_i \quad (\text{no Einstein summation convention on } i) \end{aligned}$$

This is the spectral decomposition of $\boldsymbol{a}$.
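The spectral decomposition can be verified directly: reconstructing the tensor from its eigenpairs reproduces it. A minimal sketch with NumPy (the symmetric example tensor is made up):

```python
import numpy as np

a = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])   # a symmetric example tensor

lam, V = np.linalg.eigh(a)        # normalized, mutually orthogonal eigenvectors

# Rebuild a from its spectral decomposition: sum_i lambda_i v_i (x) v_i
a_rebuilt = sum(lam[i] * np.outer(V[:, i], V[:, i]) for i in range(3))

print(np.allclose(a, a_rebuilt))  # True
```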

Invariants of tensors which are not 2nd order

Invariants are not restricted to 2nd order tensors. For example, the length of a vector is an invariant. In fact, the norm of any tensor is an invariant.
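The simplest case, the invariance of a vector's length under rotation, can be sketched as follows (the vector and the rotation angle are arbitrary):

```python
import numpy as np

v = np.array([1.0, 2.0, 2.0])              # |v| = 3
theta = 1.2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])  # rotation about the z-axis

# The rotated vector has the same length as the original.
print(np.isclose(np.linalg.norm(R @ v), np.linalg.norm(v)))  # True
```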

Invariants of multiple tensors (combined invariants)

It is also possible to define combined invariants that depend on multiple tensors. These remain invariant as long as all involved tensors are transformed in the same way. A common example is the cosine of the angle between two tensors, $\boldsymbol{a}:\boldsymbol{b}/(\left\vert\left\vert \boldsymbol{a}\right\vert\right\vert\,\left\vert\left\vert \boldsymbol{b}\right\vert\right\vert)$. From this expression, we can also easily see that the double contraction $\boldsymbol{a}:\boldsymbol{b}$ is itself an invariant.
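A numerical sketch of such a combined invariant, checking that the cosine of the angle between two (randomly generated) tensors is unchanged when both are transformed by the same orthogonal matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.standard_normal((3, 3))
b = rng.standard_normal((3, 3))

Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # an orthogonal transformation

double_contraction = lambda x, y: np.tensordot(x, y)  # x : y = x_ij y_ij

# cos(angle) between a and b, and between the identically transformed tensors.
cos_ab = double_contraction(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
a2, b2 = Q @ a @ Q.T, Q @ b @ Q.T
cos_ab2 = double_contraction(a2, b2) / (np.linalg.norm(a2) * np.linalg.norm(b2))

print(np.isclose(cos_ab, cos_ab2))  # True
```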