Numerical analysts have developed a number of algorithms for orthogonal matrices [Golub 89] [Press 88], in large part because orthogonality limits the accumulation of numerical error. Given a square (and presumably non-singular) matrix, three promising orthogonal decompositions are available: QR decomposition, Singular Value Decomposition (SVD), and Polar Decomposition. The QR factors of a matrix M = Q R are, respectively, orthogonal and upper triangular. The SVD gives three factors, M = U Λ Vᵀ, with U and V orthogonal and Λ diagonal and positive. The less common Polar Decomposition, M = Q S, yields an orthogonal factor and a symmetric positive definite factor. The latter two decompositions can also factor singular matrices, with “positive” replaced by “non-negative” in the factors.

More than one algorithm is available to compute each decomposition. The oldest and best-known method for QR Decomposition is called Gram-Schmidt orthogonalization. Each column of the matrix is considered in turn: it is divided by its magnitude to give a unit vector, and the remaining columns then have their components parallel to it subtracted out. A better method is to accumulate Householder reflections, orthogonal transformations which can zero out the elements below the diagonal.
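The Gram-Schmidt step can be sketched in pure Python, treating the matrix as a list of vectors to orthonormalize (one vector per column or row, depending on convention). The helper names here are illustrative, not from the paper:

```python
# A minimal sketch of Gram-Schmidt orthogonalization, assuming the
# matrix is supplied as a list of vectors to be orthonormalized.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Return mutually orthogonal unit vectors spanning the input."""
    q = [list(v) for v in vectors]
    for i in range(len(q)):
        # Divide vector i by its magnitude to give a unit vector.
        norm = dot(q[i], q[i]) ** 0.5
        q[i] = [x / norm for x in q[i]]
        # Subtract from each later vector its component parallel to q[i].
        for j in range(i + 1, len(q)):
            c = dot(q[j], q[i])
            q[j] = [x - c * y for x, y in zip(q[j], q[i])]
    return q

ortho = gram_schmidt([[2.0, 0.0, 0.0],
                      [1.0, 3.0, 0.0],
                      [1.0, 1.0, 1.0]])
# ortho is now an orthonormal set of vectors.
```

This is the "modified" ordering of Gram-Schmidt, which subtracts each projection as soon as the unit vector is formed; it behaves somewhat better numerically than forming all projections from the original vectors.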

There is no simple SVD algorithm. The most common approach is first to use Householder reflections to make M bidiagonal, then to perform an iteration involving QR Decomposition until the off-diagonal entries converge to zero. While this is numerically reliable, it is complicated to code, and by no means cheap.

It is possible to compute a Polar Decomposition from the results of the SVD, suggesting great cost; but a simpler method is available [Higham 86]. Compute the orthogonal factor by averaging the matrix with its inverse transpose until convergence: set Q₀ = M, then Qᵢ₊₁ = ½ (Qᵢ + Qᵢ⁻ᵀ), iterating until Qᵢ₊₁ − Qᵢ ≈ 0. This is essentially a Newton algorithm for the square root of I, the identity matrix, and converges quadratically when Qᵢ is nearly orthogonal.
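A minimal sketch of this averaging iteration, using 2×2 matrices stored as nested lists so that the inverse transpose can be written out explicitly; the function names and the convergence tolerance are illustrative choices, not from the paper:

```python
import math

# A sketch of the averaging iteration Q <- (Q + Q^-T)/2 for the
# orthogonal factor of a polar decomposition, on 2x2 matrices.

def transpose(m):
    return [[m[0][0], m[1][0]], [m[0][1], m[1][1]]]

def inverse(m):
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return [[ m[1][1] / det, -m[0][1] / det],
            [-m[1][0] / det,  m[0][0] / det]]

def polar_orthogonal(m, tol=1e-12):
    """Iterate Q <- (Q + Q^-T)/2 until successive iterates agree."""
    q = [row[:] for row in m]
    while True:
        it = inverse(transpose(q))
        nxt = [[(q[i][j] + it[i][j]) / 2.0 for j in range(2)]
               for i in range(2)]
        if max(abs(nxt[i][j] - q[i][j])
               for i in range(2) for j in range(2)) < tol:
            return nxt
        q = nxt

# Example: a rotation by 30 degrees times a positive scale of 3.
c, s = math.cos(math.pi / 6), math.sin(math.pi / 6)
m = [[3 * c, -3 * s], [3 * s, 3 * c]]
q = polar_orthogonal(m)
# q recovers the rotation; the symmetric factor is then S = Q^T M.
```

Note that when the input is an orthogonal matrix times a positive scale, each step simply averages the scale with its reciprocal, which makes the quadratic convergence toward the pure rotation easy to see.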
