Inverse Matrices

Theorem 4: If a finite sequence of elementary row operations reduces a square matrix A to the identity matrix I, then the same sequence of operations reduces the augmented matrix \( \left[ {\bf A} \,\vert \, {\bf I} \right] \) to \( \left[ {\bf I} \,\vert \, {\bf A}^{-1} \right] . \)
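This theorem follows from the fact that every elementary row operation amounts to left multiplication by an elementary matrix. If the operations \( {\bf E}_1 , {\bf E}_2 , \ldots , {\bf E}_k \) reduce A to I, then
\[ {\bf E}_k \cdots {\bf E}_2 {\bf E}_1 {\bf A} = {\bf I} \qquad \Longrightarrow \qquad {\bf E}_k \cdots {\bf E}_2 {\bf E}_1 = {\bf A}^{-1} , \]
so applying the same operations to the augmented matrix yields
\[ {\bf E}_k \cdots {\bf E}_2 {\bf E}_1 \left[ {\bf A} \,\vert \, {\bf I} \right] = \left[ {\bf A}^{-1} {\bf A} \,\vert \, {\bf A}^{-1} {\bf I} \right] = \left[ {\bf I} \,\vert \, {\bf A}^{-1} \right] . \]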

 

Inversion Algorithm: To find the inverse of a square matrix A, proceed in two steps:
  • Step 1: form the augmented matrix \( \left[ {\bf A} \ {\bf I} \right] . \)
  • Step 2: apply the Gauss--Jordan method to attempt to reduce \( \left[ {\bf A} \ {\bf I} \right] \) to \( \left[ {\bf I} \ {\bf C} \right] . \) If the reduction can be carried out, then \( {\bf A}^{-1} = {\bf C} . \) Otherwise, \( {\bf A}^{-1} \) does not exist.
A simple method for carrying out this algorithm is given in the following example.
Example 11: Consider the 3 × 3 matrix
\[ {\bf A} = \begin{bmatrix} 2&3&1 \\ 4&7&4 \\ 1&-2&-6 \end{bmatrix} . \]
We find its inverse by applying the Gauss--Jordan method to the augmented matrix \( \left[ {\bf A} \ {\bf I} \right] . \) Of course, with Mathematica this is an easy job:
A = {{2, 3, 1}, {4, 7, 4}, {1, -2, -6}}
Inverse[A]
{{-34, 16, 5}, {28, -13, -4}, {-15, 7, 2}}
B = Join[A, IdentityMatrix[3], 2]
RowReduce[B]
{{1, 0, 0, -34, 16, 5}, {0, 1, 0, 28, -13, -4}, {0, 0, 1, -15, 7, 2}}
Next, we extract the last three columns, which define the inverse matrix:
BB = RowReduce[B]
{{1, 0, 0, -34, 16, 5}, {0, 1, 0, 28, -13, -4}, {0, 0, 1, -15, 7, 2}}
invA = BB[[All, 4 ;; 6]]
{{-34, 16, 5}, {28, -13, -4}, {-15, 7, 2}}
Next, we show all the steps required to determine the inverse matrix without using the built-in commands. We start with the augmented matrix, which we denote by B:
\[ {\bf B} = \left[ {\bf A}\ \vert \ {\bf I}_3 \right] = \begin{bmatrix} 2&3&1&1&0&0 \\ 4&7&4&0&1&0 \\ 1&-2&-6&0&0&1 \end{bmatrix} . \]
We multiply the first row by (-2) and add to the second row. Then we multiply the first row by (-1/2) and add to the third row. This yields
\[ {\bf B} \,\sim \, {\bf B}_1 = \begin{bmatrix} 2&3&1&1&0&0 \\ 0&1&2&-2&1&0 \\ 0&-\frac{7}{2} & -\frac{13}{2} & -\frac{1}{2} & 0 & 1 \end{bmatrix} . \]
This can also be achieved with multiplication by the elementary matrices:
\[ {\bf E}_1 = \begin{bmatrix} 1&0&0 \\ -2&1&0 \\ 0&0&1 \end{bmatrix} , \qquad {\bf E}_2 = \begin{bmatrix} 1&0&0 \\ 0&1&0 \\ -1/2&0&1\end{bmatrix} . \]
Then
\[ {\bf E}_{21} = {\bf E}_2 {\bf E}_1 = {\bf E}_1 {\bf E}_2 = \begin{bmatrix} 1&0&0 \\ -2&1&0 \\ -1/2&0&1 \end{bmatrix} \qquad \Longrightarrow \qquad {\bf E}_{21}^{-1} = \begin{bmatrix} 1&0&0 \\ 2&1&0 \\ 1/2&0&1 \end{bmatrix} . \]
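Because \( {\bf E}_1 \) and \( {\bf E}_2 \) both add a multiple of the first row to a lower row, their product is inverted simply by negating the subdiagonal entries; a direct multiplication confirms this:
\[ {\bf E}_{21} {\bf E}_{21}^{-1} = \begin{bmatrix} 1&0&0 \\ -2&1&0 \\ -1/2&0&1 \end{bmatrix} \begin{bmatrix} 1&0&0 \\ 2&1&0 \\ 1/2&0&1 \end{bmatrix} = \begin{bmatrix} 1&0&0 \\ 0&1&0 \\ 0&0&1 \end{bmatrix} = {\bf I}_3 . \]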
B = Join[A, IdentityMatrix[3], 2]
{{2, 3, 1, 1, 0, 0}, {4, 7, 4, 0, 1, 0}, {1, -2, -6, 0, 0, 1}}
B1 = {{1, 0, 0}, {-2, 1, 0}, {0, 0, 1}}.B
{{2, 3, 1, 1, 0, 0}, {0, 1, 2, -2, 1, 0}, {1, -2, -6, 0, 0, 1}}
B1 = {{1, 0, 0}, {0, 1, 0}, {-1/2, 0, 1}}.B1
{{2, 3, 1, 1, 0, 0}, {0, 1, 2, -2, 1, 0}, {0, -(7/2), -(13/2), -(1/2), 0, 1}}
The next move is to multiply the second row by 7/2 and add it to the third row. This is equivalent to multiplication by the elementary matrix:
\[ {\bf E}_3 = \begin{bmatrix} 1&0&0 \\ 0&1&0 \\ 0&7/2&1 \end{bmatrix} . \]
Multiplying by it, we get
\[ {\bf E}_{13} = {\bf E}_3 {\bf E}_2 {\bf E}_1 = \begin{bmatrix} 1&0&0 \\ -2&1&0 \\ -\frac{15}{2}&\frac{7}{2}&1 \end{bmatrix} \qquad \Longrightarrow \qquad {\bf E}_{13}^{-1} = \begin{bmatrix} 1&0&0 \\ 2&1&0 \\ \frac{1}{2}&-\frac{7}{2}&1 \end{bmatrix} . \]
E13 = {{1, 0, 0}, {0, 1, 0}, {0, 7/2, 1}}.{{1, 0, 0}, {0, 1, 0}, {-1/2, 0, 1}}.{{1, 0, 0}, {-2, 1, 0}, {0, 0, 1}}
{{1, 0, 0}, {-2, 1, 0}, {-(15/2), 7/2, 1}}
Inverse[E13]
{{1, 0, 0}, {2, 1, 0}, {1/2, -(7/2), 1}}
We are now halfway: the first three columns of the obtained matrix form an upper triangular matrix U:
\[ {\bf E}_{13} {\bf B} = \begin{bmatrix} 2&3&1&1&0&0 \\ 0&1&2&-2&1&0 \\ 0&0&\frac{1}{2} & -\frac{15}{2}&\frac{7}{2}&1 \end{bmatrix} . \]
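As an aside, this halfway point also yields the LU factorization of A: since \( {\bf E}_{13} {\bf A} = {\bf U} , \) we have \( {\bf A} = {\bf E}_{13}^{-1} {\bf U} = {\bf L} {\bf U} \) with
\[ {\bf L} = {\bf E}_{13}^{-1} = \begin{bmatrix} 1&0&0 \\ 2&1&0 \\ \frac{1}{2}&-\frac{7}{2}&1 \end{bmatrix} , \qquad {\bf U} = \begin{bmatrix} 2&3&1 \\ 0&1&2 \\ 0&0&\frac{1}{2} \end{bmatrix} , \]
where L is unit lower triangular and U is upper triangular.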
We can speed up this part by using a subroutine:
A = {{2, 3, 1}, {4, 7, 4}, {1, -2, -6}}
B = Join[A, IdentityMatrix[3], 2]

(* PivotDown eliminates all entries below the pivot m[[i, j]];
   when oneflag == 1, the pivot row is also scaled so the pivot becomes 1 *)
PivotDown[m_, {i_, j_}, oneflag_: 0] :=
 Block[{k}, If[m[[i, j]] == 0, Return[m]];
  Return[Table[
    Which[k < i, m[[k]],
     k > i, m[[k]] - m[[k, j]]/m[[i, j]] m[[i]],
     k == i && oneflag == 0, m[[k]],
     k == i && oneflag == 1, m[[k]]/m[[i, j]]],
    {k, 1, Length[m]}]]]

PivotDown[B, {1, 1}]
{{2, 3, 1, 1, 0, 0}, {0, 1, 2, -2, 1, 0}, {0, -(7/2), -(13/2), -(1/2), 0, 1}}
B = %
PivotDown[B, {2, 2}]
{{2, 3, 1, 1, 0, 0}, {0, 1, 2, -2, 1, 0}, {0, 0, 1/2, -(15/2), 7/2, 1}}
The forward elimination procedure would finish here because Gauss is happy with this part, called row echelon form. The pivots 2, 1, 1/2 are on the diagonal of the submatrix U. The contribution of the geodesist Wilhelm Jordan was to continue the elimination all the way to the reduced row echelon form: multiples of rows are added to the rows above them to produce zeroes above the pivots:
\begin{align*} {\bf E}_{13} {\bf B} & = \begin{bmatrix} 2&3&1&1&0&0 \\ 0&1&2&-2&1&0 \\ 0&0&\frac{1}{2} & -\frac{15}{2}&\frac{7}{2}&1 \end{bmatrix} \qquad (-3) \mbox{ row 2 + row 1} \\ & \sim \begin{bmatrix} 2&0&-5&7&-3&0 \\ 0&1&2&-2&1&0 \\ 0&0&\frac{1}{2} & -\frac{15}{2}&\frac{7}{2}&1 \end{bmatrix} \qquad (-4) \mbox{ row 3 + row 2} \\ & \sim \begin{bmatrix} 2&0&-5&7&-3&0 \\ 0&1&0&28&-13&-4 \\ 0&0&\frac{1}{2} & -\frac{15}{2}&\frac{7}{2}&1 \end{bmatrix} \qquad (10) \mbox{ row 3 + row 1} \\ & \sim \begin{bmatrix} 2&0&0&-68&32&10 \\ 0&1&0&28&-13&-4 \\ 0&0&\frac{1}{2} & -\frac{15}{2}&\frac{7}{2}&1 \end{bmatrix} . \end{align*}
The last step is to divide each row (except the second, whose pivot is already 1) by its pivot. The new pivots are all ones, and we have reached I in the first half of the matrix. The last three columns then give us the inverse matrix \( {\bf A}^{-1} \):
\[ {\bf A}^{-1} = \begin{bmatrix} -34&16&5 \\ 28&-13&-4 \\ -15&7&2 \end{bmatrix} . \qquad\blacksquare \]
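As a final check, multiplying A by the computed inverse recovers the identity:
\[ {\bf A}\, {\bf A}^{-1} = \begin{bmatrix} 2&3&1 \\ 4&7&4 \\ 1&-2&-6 \end{bmatrix} \begin{bmatrix} -34&16&5 \\ 28&-13&-4 \\ -15&7&2 \end{bmatrix} = \begin{bmatrix} 1&0&0 \\ 0&1&0 \\ 0&0&1 \end{bmatrix} = {\bf I}_3 . \]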
End of Example 11
Column Transformations

 

