This section is divided into a few subsections; links to them are:

Reduction Operations

Duality in matrix/vector multiplication

Elementary operations

Elementary row operations

Elementary column operations

Consider a linear system of m equations in n unknowns:

\begin{equation} \label{EqRow.1} \begin{split} a_{1,1} x_1 + a_{1,2} x_2 + \cdots + a_{1,n} x_n &= b_1 , \\ a_{2,1} x_1 + a_{2,2} x_2 + \cdots + a_{2,n} x_n &= b_2 , \\ \vdots & \vdots \\ a_{m,1} x_1 + a_{m,2} x_2 + \cdots + a_{m,n} x_n &= b_m . \end{split} \end{equation}
To represent system \eqref{EqRow.1} concisely, it is convenient to use the augmented matrix. In this format, each row corresponds to an equation and the columns hold the coefficients of each variable, with the constants from the right-hand side of each equation placed in the final column, often separated by a vertical line:
\begin{equation} \label{EqRow.2} \left[ \mathbf{A} \mid \mathbf{b} \right] = \left[ \begin{array}{cccc|c} a_{1,1} & a_{1,2} & \cdots & a_{1,n} & b_1 \\ a_{2,1} & a_{2,2} & \cdots & a_{2,n} & b_2 \\ \vdots& \vdots & \ddots & \vdots & \vdots \\ a_{m,1} & a_{m,2} & \cdots & a_{m,n} & b_m \end{array} \right] , \end{equation}
where
\[ \mathbf{A} = \begin{bmatrix} a_{1,1} & a_{1,2} & \cdots & a_{1,n} \\ a_{2,1} & a_{2,2} & \cdots & a_{2,n} \\ \vdots& \vdots & \ddots & \vdots \\ a_{m,1} & a_{m,2} & \cdots & a_{m,n} \end{bmatrix} , \qquad \mathbf{b} = \begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_m \end{pmatrix} . \]
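In code, an augmented matrix is simply the coefficient matrix with b appended as a final column. A minimal NumPy sketch (the 2×2 data are made up for illustration):

```python
import numpy as np

# a made-up 2 x 2 system: coefficient matrix A and right-hand side b
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([[5.0],
              [6.0]])

augmented = np.hstack([A, b])   # the augmented matrix [A | b]
print(augmented)
```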
All of the results of the previous section can be rephrased in terms of the columns of matrix A. We will list here some definitions and theorems, as well as some clarifying examples.
An elementary column operation on a matrix M is any one of the following three ways of modifying M to produce a new, equivalent matrix; a short computational sketch follows the list.
  1. Interchanging the positions of two columns. We abbreviate it as Ci ↔ Cj.
  2. Multiplying one column of the matrix by a nonzero scalar. This operation is denoted as Ci ← λ·Ci for column i.
  3. Adding a constant multiple of a column to another column. The following notation is used: Ci ← Ci + λ·Cj.
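To make the three operations concrete, here is a minimal NumPy sketch applying each of them to a made-up 2×3 matrix (the matrix and the scalar λ = 5 are illustrative choices, not taken from this page):

```python
import numpy as np

M = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# 1. C1 <-> C2 : interchange columns 1 and 2 (NumPy indexes from 0)
M1 = M.copy()
M1[:, [0, 1]] = M1[:, [1, 0]]

# 2. C1 <- 5*C1 : multiply column 1 by the nonzero scalar 5
M2 = M.copy()
M2[:, 0] *= 5.0

# 3. C3 <- C3 + 5*C1 : add 5 times column 1 to column 3
M3 = M.copy()
M3[:, 2] += 5.0 * M3[:, 0]

print(M1, M2, M3, sep="\n\n")
```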
Example 1:    ■
End of Example 1
Matrices A and B are said to be column equivalent (written \( \displaystyle \mathbf{A}\,\underset{C}{\sim}\,\mathbf{B} \)) if B can be obtained from A by a finite sequence of elementary column operations.

Theorem 1: An elementary column operation on a matrix A can be accomplished by multiplying A on the right (postmultiplying) by the identity matrix on which that same operation has been performed, that is, A(ECO) = A · I(ECO), or more explicitly (a numerical check follows Example 2):

  1. Interchanging the positions of two columns: A(Ci ↔ Cj) = A · I(Ci ↔ Cj).
  2. Multiplying one column of the matrix by a nonzero scalar: A(λ·Ci) = A · I(λ·Ci).
  3. Adding a constant multiple of a column to another column: A(λ·Ci + Cj) = A · I(λ·Ci + Cj).
Example 2:    ■
End of Example 2
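As a quick numerical check of Theorem 1 (a minimal sketch with a made-up 2×3 matrix; NumPy assumed), applying an operation directly to A agrees with postmultiplying A by the operated identity:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
n = A.shape[1]

# E = I with the operation C3 <- C3 + 7*C1 applied to it
E = np.eye(n)
E[:, 2] += 7.0 * E[:, 0]

# the same operation applied directly to A
A_op = A.copy()
A_op[:, 2] += 7.0 * A_op[:, 0]

print(np.allclose(A @ E, A_op))   # True
```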

The matrices I(ECO) (which we could call elementary column matrices) can be identified with the elementary (row) matrices from the previous section. Specifically, we have

\begin{align*} \mathbf{I}_{(C_i \leftrightarrow C_j)} &\rightleftharpoons \mathbf{I}_{(R_i \leftrightarrow R_j)} , \\ \mathbf{I}_{(\lambda\cdot C_i)} &\rightleftharpoons \mathbf{I}_{(\lambda\cdot R_i)} , \\ \mathbf{I}_{(\lambda\cdot C_i + C_j)} &\rightleftharpoons \mathbf{I}_{(\lambda\cdot R_i + R_j)} . \end{align*}
In the first two correspondences the matrices are actually equal, while the matrices in the third pair are transposes of each other: I(λ·Ci + Cj) carries the entry λ in position (i, j), whereas I(λ·Ri + Rj) carries it in position (j, i). The following statement characterizes column equivalent matrices.

Theorem 2: \( \displaystyle \mathbf{A} \,\underset{C}{\sim} \, \mathbf{B} \) if and only if there exists a nonsingular matrix P such that A P = B.
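Indeed, by Theorem 1 each elementary column operation amounts to postmultiplication by an elementary matrix, and elementary matrices are nonsingular; a finite sequence of operations therefore gives B = A E₁ E₂ ⋯ Eₖ with P = E₁ E₂ ⋯ Eₖ nonsingular. Conversely, every nonsingular matrix is a product of elementary matrices, so postmultiplying by P realizes a sequence of elementary column operations.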

Example 3:    ■
End of Example 3

Theorem 3: If a sequence of elementary column operations reduces a square matrix A to the identity matrix I, then the same sequence of operations on \[ \left[ \begin{array}{c} \mathbf{A} \\ \hline \mathbf{I} \end{array} \right] \qquad \mbox{yields} \qquad \left[ \begin{array}{c} \mathbf{I} \\ \hline \mathbf{A}^{-1} \end{array} \right] . \]

Example 4:    ■
End of Example 4
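Theorem 3 suggests a column-based analog of Gauss–Jordan inversion: stack I beneath A, column-reduce the top block to I, and read A⁻¹ off the bottom block. A minimal NumPy sketch (the function name and the 2×2 test matrix are made up for illustration):

```python
import numpy as np

def inverse_by_column_ops(A):
    """Column-reduce the stacked matrix [A; I]; the bottom block becomes A^{-1}."""
    n = A.shape[0]
    M = np.vstack([A.astype(float), np.eye(n)])
    for i in range(n):
        # bring a nonzero pivot into position (i, i) of the top block: C_i <-> C_p
        p = i + int(np.argmax(np.abs(M[i, i:])))
        if np.isclose(M[i, p], 0.0):
            raise ValueError("matrix is singular")
        M[:, [i, p]] = M[:, [p, i]]
        M[:, i] /= M[i, i]                    # C_i <- (1/pivot) * C_i
        for j in range(n):
            if j != i:
                M[:, j] -= M[i, j] * M[:, i]  # C_j <- C_j - M[i, j] * C_i
    return M[n:, :]                           # the block that started out as I

A = np.array([[2.0, 1.0],
              [7.0, 4.0]])
print(inverse_by_column_ops(A))   # [[ 4. -1.]  [-7.  2.]]
```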

There is another point of view worth mentioning that would allow us to deduce the "column" theorems above from their row analogs. Since the columns of A are the transposes of the rows of AT, we can perform row operations on AT and then transpose back. For example,

\begin{align*} \mathbf{A}_{(\lambda \cdot C_i + C_j)} &= \left( \mathbf{A}^{\mathrm T}_{(\lambda \cdot R_i + R_j)} \right)^{\mathrm T} = \left( \mathbf{I}_{(\lambda \cdot R_i + R_j)} \, \mathbf{A}^{\mathrm T} \right)^{\mathrm T} \\ &= \mathbf{A} \left( \mathbf{I}_{(\lambda \cdot R_i + R_j)} \right)^{\mathrm T} = \mathbf{A} \, \mathbf{I}_{(\lambda \cdot C_i + C_j)} . \end{align*}
It is important to note that column operations, unlike row operations, should not be used to solve linear systems    A x = b.    That is, \( \displaystyle \left[ \mathbf{A} \mid \mathbf{b} \right] \underset{C}{\sim} \left[ \mathbf{B} \mid \mathbf{h} \right] \ \) does not imply that A x = b and B x = h are equivalent systems.
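A minimal illustration (a made-up one-equation system, not this page's Example 5): the system x = 1 has augmented matrix \( [\, 1 \mid 1 \,] \); the column operation C1 ← 2·C1 yields \( [\, 2 \mid 1 \,] \), whose system 2x = 1 has the different solution x = 1/2.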

Example 5:    ■
End of Example 5

The column operations introduced here could be used to determine a simplifying substitution for the linear system A x = b. If Q is a nonsingular matrix such that A Q is in column-reduced echelon form, then the nonsingular substitution x = Q y yields a new system, (A Q) y = b, which is readily solvable for y. We could then recover x = Q y by matrix multiplication. This approach would, however, involve more numerical work than the row-reduction procedure.

Example 6:    ■
End of Example 6
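As a small worked illustration of this substitution (with a made-up 2×2 system, not this page's Example 6), take \[ \mathbf{A} = \begin{bmatrix} 1&2 \\ 2&5 \end{bmatrix} , \qquad \mathbf{b} = \begin{pmatrix} 1 \\ 0 \end{pmatrix} . \] The column operations C2 ← C2 − 2·C1 followed by C1 ← C1 − 2·C2 reduce A to I, so \[ \mathbf{Q} = \mathbf{I}_{(-2\cdot C_1 + C_2)} \, \mathbf{I}_{(-2\cdot C_2 + C_1)} = \begin{bmatrix} 1&-2 \\ 0&\phantom{-}1 \end{bmatrix} \begin{bmatrix} \phantom{-}1&0 \\ -2&1 \end{bmatrix} = \begin{bmatrix} \phantom{-}5&-2 \\ -2&\phantom{-}1 \end{bmatrix} , \] and A Q = I. Then (A Q) y = b gives y = b immediately, and \( \mathbf{x} = \mathbf{Q}\,\mathbf{y} = (5, -2)^{\mathrm T} \), which indeed satisfies A x = b.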

Exercises
  1. Define "column-reduced echelon matrix".
  2. Use Theorem 3 to compute \[ \begin{bmatrix} 2& \phantom{-}1& 1 \\ 3& -1& 2 \\ 1& -1& 1 \end{bmatrix}^{-1} . \]
  3. Show that \( \displaystyle \mathbf{A} \, \underset{C}{\sim} \, \mathbf{B} \iff \mathbf{A}^{\mathrm T} \, \underset{R}{\sim} \, \mathbf{B}^{\mathrm T} . \)
  4. Find nonsingular matrices Q₁ and Q₂ so that A Q₁ and B Q₂ are in column-reduced echelon form, where \[ \mathbf{A} = \begin{bmatrix} 1& 3& 2 \\ 3& -1& 3 \\ 2& 6& 4 \end{bmatrix} , \qquad \mathbf{B} = \begin{bmatrix} 1& 2& -1 \\ -1& 1& 2 \\ 3& 6& -3 \end{bmatrix} . \] Then find a nonsingular matrix Q such that A Q = B.
  5. Using column operations only, express \[ \mathbf{C} = \begin{bmatrix} 2& -2& -3 \\ -2& 3& -2 \\ -3& 1& 1 \end{bmatrix} \] as a product of elementary matrices.
  6. Show that if \[ \left[ \begin{array}{c} \mathbf{A} \\ \hline \mathbf{I} \end{array} \right] \,\underset{C}{\sim} \, \left[ \begin{array}{c} \mathbf{B} \\ \hline \mathbf{Q} \end{array}\right] , \] then A Q = B.
  7. Prove that matrix A is invertible if and only if it is column equivalent to the identity matrix.
