Examples of transformations

 

Isometric transformations


A transformation A is isometric when ∥Ax∥ = ∥x∥ for every vector x.
This implies that the eigenvalues of an isometric transformation satisfy |λ| = 1, i.e. λ = exp(jφ). It also implies that ⟨ Ax , Ay ⟩ = ⟨ x , y ⟩.
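To see why the eigenvalues lie on the unit circle, let Ax = λx with x ≠ 0; then

\[ \| {\bf x} \| = \| A{\bf x} \| = \| \lambda {\bf x} \| = |\lambda| \, \| {\bf x} \| \quad\Longrightarrow\quad |\lambda| = 1 , \qquad \lambda = e^{j\varphi} . \]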

When W is an invariant subspace of the isometric transformation A with dim(V) < ∞, then the orthogonal complement W⊥ is also an invariant subspace.

 

Orthogonal transformations


A transformation A is orthogonal if A is isometric and its inverse exists.
For an orthogonal transformation O, the identity OᵀO = I holds, so Oᵀ = O⁻¹. If A and B are orthogonal, then AB and A⁻¹ are also orthogonal.
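For instance, closure under products and inverses follows directly from this identity:

\[ ({\bf A}{\bf B})^{T} ({\bf A}{\bf B}) = {\bf B}^{T}{\bf A}^{T}{\bf A}\,{\bf B} = {\bf B}^{T}{\bf B} = {\bf I} , \qquad ({\bf A}^{-1})^{T}{\bf A}^{-1} = ({\bf A}{\bf A}^{T})^{-1} = {\bf I} . \]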

Let A : V → V be orthogonal with dim(V) < ∞; then A is called direct orthogonal if det(A) = +1, and it describes a rotation. In particular, a rotation of ℝ² through an angle φ is given by

\[ {\bf R} = \begin{bmatrix} \cos\varphi & -\sin\varphi \\ \sin\varphi & \phantom{-}\cos\varphi \end{bmatrix} . \]
So the rotation angle φ is determined by the trace: tr(A) = 2 cos φ with 0 ≤ φ ≤ π. Let λ₁ and λ₂ be the roots of the characteristic equation; then Re(λ₁) = Re(λ₂) = cos φ, with λ₁ = exp(jφ) and λ₂ = exp(−jφ).
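Indeed, the characteristic equation of the rotation matrix above reads

\[ \det({\bf R} - \lambda {\bf I}) = \lambda^{2} - 2\lambda\cos\varphi + 1 = 0 \quad\Longrightarrow\quad \lambda_{1,2} = \cos\varphi \pm j\sin\varphi = e^{\pm j\varphi} . \]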

In ℝ³ a direct orthogonal transformation has eigenvalues λ₁ = 1 and λ₂ = λ₃* = exp(jφ). With respect to an orthonormal basis whose first vector spans the eigenspace of λ₁ (the rotation axis), the rotation is given by the matrix

\[ {\bf R} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\varphi & -\sin\varphi \\ 0 & \sin\varphi & \phantom{-}\cos\varphi \end{bmatrix} . \]
A transformation A is called mirrored orthogonal if det(A) = −1. Vectors from E₋₁ are mirrored by A with respect to the invariant subspace E₊₁ of A. A mirroring of ℝ² in the line < \( \left( \cos \left( \frac{1}{2}\,\varphi \right) , \sin \left( \frac{1}{2}\,\varphi \right) \right) \) > is given by
\[ {\bf S} = \begin{bmatrix} \cos\varphi & \phantom{-}\sin\varphi \\ \sin\varphi & - \cos\varphi \end{bmatrix} . \]
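A short check, using the angle-difference formulas, confirms that the direction of the mirror line is indeed left fixed (eigenvalue +1):

\[ {\bf S} \begin{bmatrix} \cos\frac{\varphi}{2} \\ \sin\frac{\varphi}{2} \end{bmatrix} = \begin{bmatrix} \cos\varphi\cos\frac{\varphi}{2} + \sin\varphi\sin\frac{\varphi}{2} \\ \sin\varphi\cos\frac{\varphi}{2} - \cos\varphi\sin\frac{\varphi}{2} \end{bmatrix} = \begin{bmatrix} \cos\frac{\varphi}{2} \\ \sin\frac{\varphi}{2} \end{bmatrix} . \]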
Mirrored orthogonal transformations in ℝ³ are rotational mirrorings: rotations about an axis < a > through an angle φ combined with a mirroring in the plane < a >⊥ perpendicular to that axis. The matrix of such a transformation is given by
\[ {\bf S} = \begin{bmatrix} -1 & 0 & 0 \\ 0 & \cos\varphi & -\sin\varphi \\ 0 & \sin\varphi & \phantom{-}\cos\varphi \end{bmatrix} . \]
For every orthogonal transformation O in ℝ³, O(x) × O(y) = det(O) O(x × y); in particular, rotations preserve the cross product.

ℝⁿ (n < ∞) can be decomposed into invariant subspaces of dimension 1 or 2 for every orthogonal transformation.
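Concretely, this means that with respect to a suitable orthonormal basis the matrix of an orthogonal transformation is block diagonal, with entries ±1 on the one-dimensional subspaces and 2 × 2 rotation blocks on the two-dimensional ones:

\[ {\bf O} = \begin{bmatrix} \pm 1 & & & \\ & \ddots & & \\ & & R(\varphi_1) & \\ & & & \ddots \end{bmatrix} , \qquad R(\varphi_k) = \begin{bmatrix} \cos\varphi_k & -\sin\varphi_k \\ \sin\varphi_k & \phantom{-}\cos\varphi_k \end{bmatrix} . \]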

 

Unitary transformations


Let V be a complex vector space with an inner product. A linear transformation U of V is called unitary if it is isometric and its inverse exists.
An n × n matrix U is unitary if U*U = I, the identity matrix. Its determinant satisfies |det(U)| = 1. Every isometric transformation of a finite-dimensional complex vector space is unitary.
Theorem 1: For an n × n matrix A, the following statements are equivalent:
  1. A is unitary.
  2. The columns of A form an orthonormal set.
  3. The rows of matrix A form an orthonormal set.
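As a concrete illustration (the matrix below is chosen here as an example, not taken from the text above), take

\[ {\bf U} = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 & 1 \\ j & -j \end{bmatrix} , \qquad {\bf U}^{*}{\bf U} = \frac{1}{2} \begin{bmatrix} 1 & -j \\ 1 & \phantom{-}j \end{bmatrix} \begin{bmatrix} 1 & 1 \\ j & -j \end{bmatrix} = {\bf I} , \qquad \det({\bf U}) = -j . \]

Its columns form an orthonormal set in ℂ², in agreement with Theorem 1, and its determinant has modulus 1 without being ±1.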

 

Symmetric transformations


A transformation of ℝⁿ is called symmetric if ⟨ Ax , y ⟩ = ⟨ x , Ay ⟩ for any vectors x and y from the vector space.
A square matrix A is symmetric if Aᵀ = A. A linear transformation is symmetric if its matrix with respect to an orthonormal basis is symmetric. All eigenvalues of a symmetric transformation are real. Eigenvectors corresponding to distinct eigenvalues are orthogonal. If A is symmetric, then Aᵀ = A = A* with respect to any orthonormal basis. The product AᵀTA is symmetric if T is.
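The orthogonality of eigenvectors for distinct eigenvalues follows in one line: if Ax = λx, Ay = μy and λ ≠ μ, then

\[ \lambda \langle {\bf x} , {\bf y} \rangle = \langle A{\bf x} , {\bf y} \rangle = \langle {\bf x} , A{\bf y} \rangle = \mu \langle {\bf x} , {\bf y} \rangle \quad\Longrightarrow\quad (\lambda - \mu) \langle {\bf x} , {\bf y} \rangle = 0 \quad\Longrightarrow\quad \langle {\bf x} , {\bf y} \rangle = 0 . \]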

 

Self-adjoint transformations


A transformation H : ℂⁿ → ℂⁿ is called self-adjoint or Hermitian if ⟨ Hx , y ⟩ = ⟨ x , Hy ⟩ for any vectors x and y from the vector space.
The product AB of two self-adjoint matrices A and B is self-adjoint if and only if their commutator vanishes, [A, B] = AB − BA = 0.
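This follows by taking adjoints of the product:

\[ ({\bf A}{\bf B})^{*} = {\bf B}^{*}{\bf A}^{*} = {\bf B}{\bf A} , \]

so AB equals its own adjoint exactly when BA = AB, that is, when [A, B] = 0.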

Eigenvalues of any self-adjoint matrix are real numbers.
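To see this (taking the inner product to be linear in its first argument), let Hx = λx with x ≠ 0; then

\[ \lambda \langle {\bf x} , {\bf x} \rangle = \langle H{\bf x} , {\bf x} \rangle = \langle {\bf x} , H{\bf x} \rangle = \bar{\lambda} \langle {\bf x} , {\bf x} \rangle \quad\Longrightarrow\quad \lambda = \bar{\lambda} , \]

so λ is real.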

 

Normal transformations


A linear transformation A is called normal if A*A = AA*.
Let the distinct roots of the characteristic equation of a normal matrix A be βᵢ with multiplicities nᵢ. Then the dimension of each eigenspace Vᵢ equals nᵢ. These eigenspaces are mutually perpendicular, and each vector x ∈ V can be written in exactly one way as
\[ {\bf x} = \sum_i {\bf x}_i , \qquad {\bf x}_i = P_i {\bf x} \in V_i , \]
where Pᵢ is the orthogonal projection onto Vᵢ.
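Since A multiplies each component xᵢ by the corresponding root βᵢ, this gives the spectral decomposition

\[ {\bf A}{\bf x} = \sum_i \beta_i {\bf x}_i , \qquad {\bf A} = \sum_i \beta_i P_i . \]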

 
