More than one algorithm is available to compute each decomposition. The oldest and best-known method for QR Decomposition is called Gram-Schmidt orthogonalization. Each row of the matrix is considered in turn: it is divided by its magnitude to give a unit vector, and its parallel component is then subtracted out of each of the remaining rows. A better method is to accumulate Householder reflections, orthogonal transformations which can zero out the elements below the diagonal.
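The Gram-Schmidt step described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's code; the function name is my own, and it operates on columns (the row version is the transpose of the same idea), using the numerically steadier "modified" variant that subtracts each component as soon as it is computed:

```python
import numpy as np

def gram_schmidt_qr(M):
    """QR decomposition by modified Gram-Schmidt.

    Each column is divided by its magnitude to give a unit vector,
    and its parallel component is subtracted from the remaining
    columns.  Returns Q (orthonormal columns) and upper-triangular R
    with M = Q @ R.
    """
    M = np.array(M, dtype=float)
    n = M.shape[1]
    Q = M.copy()
    R = np.zeros((n, n))
    for j in range(n):
        R[j, j] = np.linalg.norm(Q[:, j])
        Q[:, j] /= R[j, j]                   # normalize to a unit vector
        for k in range(j + 1, n):
            R[j, k] = Q[:, j] @ Q[:, k]      # parallel component
            Q[:, k] -= R[j, k] * Q[:, j]     # subtract it out
    return Q, R
```

Production code would prefer the Householder approach (as in LAPACK's `dgeqrf`), since Gram-Schmidt can lose orthogonality on ill-conditioned input.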
There is no simple SVD algorithm. The most common approach is first to use Householder reflections to make M bidiagonal, then to perform an iteration involving QR Decomposition until the off-diagonal entries converge to zero. While this is numerically reliable, it is complicated to code, and by no means cheap.
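Because the bidiagonalize-then-iterate scheme is so involved, in practice one calls a library routine. NumPy's `np.linalg.svd`, for example, wraps LAPACK, whose driver follows this general approach; a quick check that the factors behave as expected:

```python
import numpy as np

M = np.array([[2., 0., 1.],
              [0., 3., 0.],
              [1., 0., 2.]])

# U and Vt are orthogonal; s holds the singular values,
# returned in descending order.
U, s, Vt = np.linalg.svd(M)

# The factors reconstruct M.
assert np.allclose(U @ np.diag(s) @ Vt, M)
assert np.allclose(U.T @ U, np.eye(3))
assert np.all(s[:-1] >= s[1:])
```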
It is possible to compute a Polar Decomposition using the results of SVD, suggesting great cost; but a simpler method is available [Higham 86]. Compute the orthogonal factor by averaging the matrix with its inverse transpose until convergence: set Q₀ = M, then iterate Qᵢ₊₁ = ½(Qᵢ + Qᵢ⁻ᵀ) until Qᵢ₊₁ − Qᵢ ≈ 0. This is essentially a Newton algorithm for the square root of I, the identity matrix, and converges quadratically when Qᵢ is nearly orthogonal.
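Higham's averaging iteration is short enough to sketch directly. A minimal version in Python with NumPy (the function name and tolerance are my own choices, assuming M is square and invertible):

```python
import numpy as np

def polar_orthogonal(M, tol=1e-12, max_iter=100):
    """Orthogonal factor of the polar decomposition via the Newton
    iteration of [Higham 86]: Q_{i+1} = (Q_i + Q_i^{-T}) / 2,
    starting from Q_0 = M."""
    Q = np.array(M, dtype=float)
    for _ in range(max_iter):
        Q_next = 0.5 * (Q + np.linalg.inv(Q).T)
        if np.linalg.norm(Q_next - Q) < tol:   # Q_{i+1} - Q_i ≈ 0
            return Q_next
        Q = Q_next
    return Q
```

Once Q is known, the symmetric positive-definite factor follows as S = Qᵀ M, giving M = Q S.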