This section is divided into the following subsections, links to which are:
Duality in matrix/vector multiplication
Elementary column operations
A linear system of equations
- Interchanging the positions of two columns. We abbreviate it as Ci ↔ Cj.
- Multiplying one column of the matrix by a nonzero scalar λ. This operation is denoted as Ci ← λ·Ci for column i.
- Adding a constant multiple of one column to another column. The following notation is used: Ci ← Ci + λ·Cj.
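These three operations are easy to experiment with numerically. Below is a minimal sketch in Python with numpy; the helper names and the sample matrix are our own illustration, not part of the text:

```python
import numpy as np

# A sample matrix, chosen only for illustration.
A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.]])

def swap_cols(M, i, j):
    """Ci <-> Cj: interchange columns i and j."""
    M = M.copy()
    M[:, [i, j]] = M[:, [j, i]]
    return M

def scale_col(M, i, lam):
    """Ci <- lam * Ci: multiply column i by a nonzero scalar lam."""
    M = M.copy()
    M[:, i] *= lam
    return M

def add_col(M, i, j, lam):
    """Ci <- Ci + lam * Cj: add a multiple of column j to column i."""
    M = M.copy()
    M[:, i] += lam * M[:, j]
    return M
```

Each helper copies its argument first, so the original matrix is left untouched, mirroring the fact that an elementary operation produces a new (column-equivalent) matrix.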
Theorem 1: An elementary column operation on a matrix A can be accomplished by multiplying A on the right (postmultiplying) by the identity matrix on which the same operation has been performed, that is, A(ECO) = A · I(ECO), or more explicitly
- Interchanging the positions of two columns: A(Ci ↔ Cj) = A · I(Ci ↔ Cj).
- Multiplying one column of the matrix by a nonzero scalar: A(Ci ← λ·Ci) = A · I(Ci ← λ·Ci).
- Adding a constant multiple of a column to another column: A(Ci ← Ci + λ·Cj) = A · I(Ci ← Ci + λ·Cj).
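Theorem 1 can be checked numerically: applying an operation to A directly gives the same matrix as postmultiplying A by the correspondingly modified identity. A small numpy sketch (the sample matrix is our own, not from the text):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.]])
I = np.eye(3)

# Interchange columns 0 and 1 directly on A ...
A_swapped = A[:, [1, 0, 2]]
# ... and by postmultiplying with I(C0 <-> C1).
E_swap = I[:, [1, 0, 2]]
assert np.allclose(A @ E_swap, A_swapped)

# Scale column 2 by lambda = 5.
E_scale = I.copy(); E_scale[2, 2] = 5.0
A_scaled = A.copy(); A_scaled[:, 2] *= 5.0
assert np.allclose(A @ E_scale, A_scaled)

# C0 <- C0 + 3*C1: entry (1, 0) of the identity becomes 3.
E_add = I.copy(); E_add[1, 0] = 3.0
A_added = A.copy(); A_added[:, 0] += 3.0 * A[:, 1]
assert np.allclose(A @ E_add, A_added)
```

Note the placement in the last case: to add 3 times column 1 to column 0, the identity is modified in row 1 of column 0, because column 0 of A·E is A times column 0 of E.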
The matrices I(ECO) (which we could call elementary column matrices) can be identified with the elementary (row) matrices from the previous section. With this identification we obtain
Theorem 2: \( \displaystyle \mathbf{A} \,\underset{C}{\sim} \, \mathbf{B} \) if and only if there exists a nonsingular matrix P such that A P = B.
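To illustrate Theorem 2, we can build B from A by a sequence of column operations and observe that the accumulated product of elementary matrices is precisely a nonsingular P with A P = B. A numpy sketch with matrices chosen for illustration:

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])

# Three elementary column matrices (operations on the identity).
E1 = np.eye(2); E1[:, [0, 1]] = E1[:, [1, 0]]   # C0 <-> C1
E2 = np.eye(2); E2[1, 1] = 0.5                  # C1 <- 0.5 * C1
E3 = np.eye(2); E3[0, 1] = -2.0                 # C1 <- C1 - 2 * C0

# B is column-equivalent to A by construction.
B = A @ E1 @ E2 @ E3

# The nonsingular matrix of Theorem 2 is the product of the
# elementary matrices, in the order the operations were applied.
P = E1 @ E2 @ E3
assert np.allclose(A @ P, B)
assert abs(np.linalg.det(P)) > 1e-12   # P is nonsingular
```

Because each elementary matrix is invertible, any finite product of them is nonsingular, which is the "only if" direction of the theorem.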
Theorem 3: If a sequence of elementary column operations reduces A to the identity matrix I, then the same sequence of operations on \[ \left[ \begin{array}{c} \mathbf{A} \\ \hline \mathbf{I} \end{array} \right] \qquad \mbox{yields} \qquad \left[ \begin{array}{c} \mathbf{I} \\ \hline \mathbf{A}^{-1} \end{array} \right] . \]
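Theorem 3 translates directly into an inversion procedure: stack I under A and column-reduce until the top block becomes I; the bottom block is then A⁻¹. A minimal numpy sketch (the function name and the pivoting details are our own, and the test matrix is for illustration only):

```python
import numpy as np

def inverse_by_column_ops(A):
    """Column-reduce [A; I] until the top block is I;
    by Theorem 3 the bottom block is then A^{-1}."""
    n = A.shape[0]
    M = np.vstack([A.astype(float), np.eye(n)])
    for i in range(n):
        # Choose a pivot for row i among columns i..n-1 (Ci <-> Cp).
        p = i + int(np.argmax(np.abs(M[i, i:])))
        M[:, [i, p]] = M[:, [p, i]]
        M[:, i] /= M[i, i]                     # Ci <- Ci / pivot
        for j in range(n):
            if j != i:
                M[:, j] -= M[i, j] * M[:, i]   # Cj <- Cj - M[i,j] * Ci
    return M[n:, :]

A = np.array([[2., 1.],
              [1., 1.]])
Ainv = inverse_by_column_ops(A)
assert np.allclose(A @ Ainv, np.eye(2))
```

Every step touches a whole column of the stacked matrix, so the bottom block silently records the product of all the elementary column matrices applied, exactly as the theorem asserts.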
There is another point of view worth mentioning, which would allow us to deduce the "column" theorems above from their row analogs. Since the columns of A are the transposes of the rows of A^T, we could perform row operations on A^T and then transpose the result.
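This duality is easy to verify numerically: a column operation on A matches the corresponding row operation on A^T, followed by a transpose. A short numpy sketch (example matrix ours):

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])

# Perform the column operation C1 <- C1 - 2*C0 directly on A ...
A_col = A.copy()
A_col[:, 1] -= 2.0 * A_col[:, 0]

# ... and as the row operation R1 <- R1 - 2*R0 on the transpose.
At = A.T.copy()
At[1, :] -= 2.0 * At[0, :]

# Transposing back recovers the column-operated matrix.
assert np.allclose(At.T, A_col)
```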
The column operations introduced here could be used to determine a simplifying substitution for the linear system A x = b. If Q is a nonsingular matrix such that A Q is in column-reduced echelon form, then the nonsingular substitution x = Q y yields a new system, (A Q)y = b, which is readily solvable for y. We could then find x = Q y by matrix multiplication. This approach would, however, involve more numerical work than the row-reduction procedure.
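As a sketch of this substitution idea in numpy: for a nonsingular A the column-reduced echelon form is I itself, so A Q = I, the new system is simply y = b, and x = Q y. The example matrix is deliberately small and chosen so that no pivot interchange is needed:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 1.]])
b = np.array([3., 2.])

# Column-reduce the stacked matrix [A; I]; the bottom block
# accumulates Q with A Q = I. (Assumes the natural pivots are
# nonzero, which holds for this example.)
n = 2
M = np.vstack([A, np.eye(n)])
for i in range(n):
    M[:, i] /= M[i, i]                     # Ci <- Ci / pivot
    for j in range(n):
        if j != i:
            M[:, j] -= M[i, j] * M[:, i]   # Cj <- Cj - M[i,j] * Ci
Q = M[n:, :]

y = b          # (A Q) y = I y = b, so y = b directly
x = Q @ y      # recover x = Q y by matrix multiplication
assert np.allclose(A @ x, b)
```

For a general (possibly singular or rectangular) A the column-reduced form is not the identity, and the solve for y involves the echelon structure; the sketch above only covers the invertible square case.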
- Define "column-reduced echelon matrix".
- Use Theorem 3 to compute \[ \begin{bmatrix} 2& \phantom{-}1& 1 \\ 3& -1& 2 \\ 1& -1& 1 \end{bmatrix}^{-1} . \]
- Show that \( \displaystyle \mathbf{A} \, \underset{C}{\sim} \, \mathbf{B} \iff \mathbf{A}^{\mathrm T} \, \underset{R}{\sim} \, \mathbf{B}^{\mathrm T} . \)
- Find nonsingular matrices Q₁ and Q₂ so that A Q₁ and B Q₂ are in column-reduced echelon form where \[ \mathbf{A} = \begin{bmatrix} 1& 3& 2 \\ 3& -1& 3 \\ 2& 6& 4 \end{bmatrix} , \qquad \mathbf{B} = \begin{bmatrix} 1& 2& -1 \\ -1& 1& 2 \\ 3& 6& -3 \end{bmatrix} . \] Then find a nonsingular matrix Q such that A Q = B.
- Using column operations only, express \[ \mathbf{C} = \begin{bmatrix} 2& -2& -3 \\ -2& 3& -2 \\ -3& 1& 1 \end{bmatrix} \] as a product of elementary matrices.
- Show that if \[ \left[ \begin{array}{c} \mathbf{A} \\ \hline \mathbf{I} \end{array} \right] \,\underset{C}{\sim} \, \left[ \begin{array}{c} \mathbf{B} \\ \hline \mathbf{Q} \end{array}\right] , \] then A Q = B.
- Prove that matrix A is invertible if and only if it is column equivalent to the identity matrix.
