Direct-sum decompositions
Let V be a linear space over a field 𝔽 (either ℂ, the set of complex numbers; ℝ, the set of real numbers; or ℚ, the set of rational numbers), and let f : V ⇾ V be a linear mapping; then f is an endomorphism of V.
A subspace U ⊆ V is called an invariant subspace of f if f(U) ⊆ U. Then, restricting the domain to U, we obtain a linear mapping f|U : U ⇾ U defined by
f|U(x) = f(x) for x ∈ U.
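As a numerical sketch of this definition (assuming NumPy; the matrix A and the helper `is_invariant` are illustrative, not from the text), one can test f(U) ⊆ U by checking that every image A u projects back onto U with zero residual:

```python
import numpy as np

# Illustrative endomorphism of R^3 (an assumed example).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

def is_invariant(A, U, tol=1e-10):
    """Check f(U) ⊆ U numerically: each column of A @ U must lie in the
    column span of U, i.e. the least-squares residual vanishes."""
    AU = A @ U
    coeffs, *_ = np.linalg.lstsq(U, AU, rcond=None)
    return np.allclose(U @ coeffs, AU, atol=tol)

U = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])  # span{e1, e2}
W = np.array([[0.0], [1.0], [0.0]])                  # span{e2}

print(is_invariant(A, U))  # True:  A e1, A e2 stay in span{e1, e2}
print(is_invariant(A, W))  # False: A e2 = e1 + 2 e2 leaves span{e2}
```

On an invariant subspace the restriction f|U is well defined; the failing case W shows why invariance is needed.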
Recall the following definition from the section on direct sums.
A family { Xi }i∈I of subspaces of a linear space V, where I is some (finite or infinite) index set, is called independent when
a finite sum \( \displaystyle \quad \sum \mathbf{x}_j \quad \) of elements xj ∈ Xj can vanish only if xj = 0 for all j.
As we shall mainly be concerned with finite families of subspaces, we
restrict the index set to be finite or ℕ; this is only a notational simplification.
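Numerically, independence of a finite family can be tested by rank: the family is independent exactly when dim(X₁ + ⋯ + Xk) equals the sum of the individual dimensions. A small sketch (NumPy; the matrices and the helper name are illustrative assumptions):

```python
import numpy as np

def is_independent(*bases):
    """Subspaces are given by matrices whose columns form bases.
    The family is independent iff the rank of the stacked bases
    equals the total number of basis vectors."""
    stacked = np.hstack(bases)
    return np.linalg.matrix_rank(stacked) == sum(B.shape[1] for B in bases)

X1 = np.array([[1.0], [0.0], [0.0]])   # span{e1}
X2 = np.array([[0.0], [1.0], [0.0]])   # span{e2}
X3 = np.array([[1.0], [1.0], [0.0]])   # span{e1 + e2}

print(is_independent(X1, X2))        # True
print(is_independent(X1, X2, X3))    # False: x1 + x2 - x3 = 0 is a
                                     # vanishing sum with nonzero terms
```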
Let X = {X₁, X₂, … , Xk} be an independent family of subspaces of V, and let W = X₁ ⊕ X₂ ⊕ ⋯ ⊕ Xk be their direct sum. As we know from Part 3, any x ∈ W is uniquely written as x = x₁ + x₂ + ⋯ + xk with x₁ ∈ X₁, x₂ ∈ X₂, … , xk ∈ Xk. For a family of linear endomorphisms fi : Xi ⇾ Xi (i = 1, 2, … , k), we define the linear mapping f₁ ⊕ f₂ ⊕ ⋯ ⊕ fk : W ⇾ W by
\[
\left( f_1 \oplus f_2 \oplus \cdots \oplus f_k \right) (\mathbf{x}) = f_1 (\mathbf{x}_1 ) + f_2 (\mathbf{x}_2 ) + \cdots + f_k (\mathbf{x}_k ) .
\]
We say that this endomorphism is the direct sum of linear mappings. Suppose that each subspace Xi is finite dimensional with
basis βi, and that the linear mapping fi is represented by a matrix Ai = ⟦fi⟧βi in this basis, for i = 1, 2, … , k. Then,
β = β₁ ∪ β₂ ∪ ⋯ ∪ βk is a basis of W, and on this basis, the linear mapping f₁ ⊕ f₂ ⊕ ⋯ ⊕ fk
is represented by the matrix
\begin{equation} \label{EqDirect.1}
\mathbf{A} =
\begin{bmatrix} \mathbf{A}_1 & \mathbf{0} & \cdots & \mathbf{0} \\
\mathbf{0} & \mathbf{A}_2 & \cdots & \mathbf{0} \\
\vdots & \vdots & \ddots & \vdots \\
\mathbf{0} & \mathbf{0} & \cdots & \mathbf{A}_k
\end{bmatrix} .
\end{equation}
We call this block diagonal matrix \eqref{EqDirect.1} the direct sum of matrices, which is written as A = A₁ ⊕ A₂ ⊕ ⋯ ⊕ Ak.
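The block diagonal matrix can be assembled mechanically; the helper below is a small stand-in for scipy.linalg.block_diag (the blocks A₁, A₂ are illustrative choices):

```python
import numpy as np

def direct_sum(*blocks):
    """Assemble the block diagonal matrix A1 ⊕ A2 ⊕ ... ⊕ Ak."""
    rows = sum(B.shape[0] for B in blocks)
    cols = sum(B.shape[1] for B in blocks)
    A = np.zeros((rows, cols))
    r = c = 0
    for B in blocks:
        A[r:r + B.shape[0], c:c + B.shape[1]] = B
        r += B.shape[0]
        c += B.shape[1]
    return A

A1 = np.array([[0.0, 1.0],
               [-1.0, 0.0]])   # a 2x2 block
A2 = np.array([[3.0]])         # a 1x1 block
A = direct_sum(A1, A2)
print(A)
# [[ 0.  1.  0.]
#  [-1.  0.  0.]
#  [ 0.  0.  3.]]
```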
Let f : W ⇾ W be a linear endomorphism such that each Xi is an invariant subspace of f for i = 1, 2, … , k. In this case we say that X decomposes f. Then, we have \[ f = \left. f\right\vert_{X_1} \oplus \left. f\right\vert_{X_2} \oplus \cdots \oplus \left. f\right\vert_{X_k} , \] and we call this expression the direct sum decomposition of f.
For a linear mapping f : V ⇾ V, we define fn : V ⇾ V inductively by
\[
f^n \stackrel{\tiny def}{=} \begin{cases}
I , & \quad \mbox{if} \quad n = 0 ,
\\
f \circ f^{n-1} & \quad \mbox{for} \quad n=1, 2, \ldots ,
\end{cases}
\]
where I is the identity mapping on V. If V is finite dimensional with basis β and A = ⟦f⟧β is the representation
matrix of f on V, then An is a representation matrix of fn on V.
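In NumPy the matrix powers representing fⁿ are available directly (the 2×2 shear matrix is an illustrative choice); note that matrix_power(A, 0) returns the identity, matching the n = 0 case above:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # an illustrative representation matrix

I2 = np.linalg.matrix_power(A, 0)   # f^0 = identity
A3 = np.linalg.matrix_power(A, 3)   # represents f^3 = f ∘ f ∘ f

print(I2)        # the 2x2 identity matrix
print(A3[0, 1])  # 3.0: composing the shear three times adds the slopes
```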
We define the subspaces K(n) and R(n) of V by
\[
K^{(n)} = \mbox{kernel}\left( f^n \right) \qquad \mbox{and} \qquad R^{(n)} = \mbox{image}\left( f^n \right) .
\]
We have the inclusion relations
\begin{align*}
\left\{ \mathbf{0} \right\} &= K^{(0)} \subseteq K^{(1)} \subseteq K^{(2)} \subseteq \cdots ,
\\
V &= R^{(0)} \supseteq R^{(1)} \supseteq R^{(2)} \supseteq \cdots .
\end{align*}
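These chains can be watched numerically; for an illustrative nilpotent shift matrix the kernels grow and the images shrink by one dimension per step until they stabilize (a NumPy sketch, with rank-nullity giving the kernel dimensions):

```python
import numpy as np

# Nilpotent shift on R^3: e3 -> e2 -> e1 -> 0 (an illustrative choice).
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])

kernel_dims, image_dims = [], []
for n in range(5):
    An = np.linalg.matrix_power(A, n)
    r = int(np.linalg.matrix_rank(An))
    image_dims.append(r)                  # dim R^(n)
    kernel_dims.append(A.shape[0] - r)    # dim K^(n), by rank-nullity

print(kernel_dims)   # [0, 1, 2, 3, 3]
print(image_dims)    # [3, 2, 1, 0, 0]
```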
Let us define
\begin{equation} \label{EqDirect.2}
K \stackrel{\tiny def}{=} \bigcup_{n\ge 0} K^{(n)} \qquad \mbox{and} \qquad R \stackrel{\tiny def}{=} \bigcap_{n\ge 0} R^{(n)} .
\end{equation}
These two sets, K and R, are invariant subspaces of f. Suppose that V is a finite-dimensional space. Then there exist k and m such that
\begin{align*}
K^{(k)} &= K^{(k+1)} = K^{(k+2)} = \cdots = K ,
\\
R^{(m)} &= R^{(m+1)} = R^{(m+2)} = \cdots = R .
\end{align*}
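The stabilization index can be found by comparing successive ranks (the helper below is an illustrative sketch; the sample matrix mixes a nilpotent 2-block with an invertible 1-block, so its chains stop moving at n = 2):

```python
import numpy as np

def stabilization_index(A):
    """Smallest m with rank(A^m) == rank(A^(m+1)); from there on
    R^(n) = R and, by rank-nullity, K^(n) = K."""
    m, r_prev = 0, A.shape[0]           # rank(A^0) = dim V
    while True:
        r = int(np.linalg.matrix_rank(np.linalg.matrix_power(A, m + 1)))
        if r == r_prev:
            return m
        m, r_prev = m + 1, r

A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 2.0]])
print(stabilization_index(A))          # 2: ranks go 3, 2, 1, 1, ...
print(stabilization_index(np.eye(3)))  # 0: already stable
```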
Theorem 1:
Let f : V ⇾ V be a linear endomorphism on a finite-dimensional vector space V, and let K and R be the kernel and range subspaces constructed above.
Then V = K ⊕ R.
Let k₀ = max{k, m}. The restricted mapping \( \left. f\right\vert_R : R \to R \) is bijective: it is surjective because \( f\left( R^{(m)} \right) = R^{(m+1)} = R \), and a surjective endomorphism of a finite-dimensional space is bijective. In particular, we see that \( \displaystyle \quad K \cap R = \mbox{kernel}\left( \left( \left. f \right\vert_R \right)^{k_0} \right) = \{ \mathbf{0} \} . \quad \) Hence {K, R} is an independent family. For any u ∈ V, set
\( \displaystyle \quad \mathbf{v} = f^{k_0} (\mathbf{u}) . \quad \) Since v ∈ R and \( \left. f\right\vert_R \) is bijective, there exists w ∈ R with \( \mathbf{v} = f^{k_0} (\mathbf{w}) \). Then \( f^{k_0} (\mathbf{u} - \mathbf{w}) = \mathbf{0} \), so u − w ∈ K, and u = (u − w) + w ∈ K + R. Together with K ∩ R = {0}, this yields V = K ⊕ R.
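The decomposition can be checked numerically for a concrete singular matrix (an illustrative example; orthonormal bases for R and K are read off from an SVD of A^{k₀}, and k₀ = dim V always suffices):

```python
import numpy as np

A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 2.0]])      # an illustrative singular matrix
n = A.shape[0]
Ak = np.linalg.matrix_power(A, n)    # k0 = dim V always suffices

# SVD gives orthonormal bases: U[:, :r] spans R = image(A^k0),
# and Vt[r:].T spans K = kernel(A^k0).
U, s, Vt = np.linalg.svd(Ak)
r = int(np.sum(s > 1e-10))
R_basis, K_basis = U[:, :r], Vt[r:, :].T

P = np.hstack([R_basis, K_basis])
print(int(np.linalg.matrix_rank(P)))   # 3: so K + R = V and K ∩ R = {0}
```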
Corollary 1:
Every singular square matrix is similar to a block diagonal matrix of two blocks, with an invertible block R (representing the restriction of f to the range subspace R) and a nilpotent block K (representing the restriction to the kernel subspace K):
\[
\mathbf{A} \sim \begin{bmatrix} \mathbf{R} & \mathbf{0} \\ \mathbf{0} & \mathbf{K}\end{bmatrix} .
\]
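Concretely, stacking a basis of R and a basis of K into a change-of-basis matrix P exhibits the similarity (an illustrative NumPy check on the same sample matrix; tolerances are assumptions):

```python
import numpy as np

A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 2.0]])      # an illustrative singular matrix
n = A.shape[0]
Ak = np.linalg.matrix_power(A, n)

U, s, Vt = np.linalg.svd(Ak)
r = int(np.sum(s > 1e-10))
P = np.hstack([U[:, :r], Vt[r:, :].T])   # [basis of R | basis of K]
B = np.linalg.inv(P) @ A @ P

# B is block diagonal: B[:r, :r] (f on R) is invertible and
# B[r:, r:] (f on K) is nilpotent; the off-diagonal blocks vanish.
print(np.allclose(B[:r, r:], 0, atol=1e-8))   # True
print(np.allclose(B[r:, :r], 0, atol=1e-8))   # True
```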