Note that the zeros below the diagonal in both R and S imply that, apart from
orthogonality conditions imposed by (2.47), the elements of columns (n + 1),
(n + 2), . . . , m of Q are arbitrary. In fact, they are not needed in most calcula-
tions, so will be dropped, leaving the m by n matrix U, where
U^T U = 1_n .     (2.52)
Note that it is no longer possible to make any statement regarding UU^T. Omitting
rows (n + 1) to m of both R and S allows the decompositions to be written as
A = UR = USV^T     (2.53)
where A is m by n, U is m by n and U^T U = 1_n , R is n by n and upper-triangular, S
is n by n and diagonal, and V is n by n and orthogonal. In the singular-value
decomposition the diagonal elements of S are chosen to be non-negative.
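As a minimal illustration of (2.52) and (2.53), the following Python sketch (using NumPy, a library outside the book's own compact routines; the small test matrix is invented for the example) forms the 'economy' QR and singular-value decompositions of a rectangular matrix and confirms the stated properties.

import numpy as np

A = np.array([[2., 1.],
              [1., 3.],
              [0., 1.]])                  # m = 3, n = 2

# 'Economy' QR: U is m by n with U^T U = 1_n, R is n by n upper-triangular
U, R = np.linalg.qr(A, mode='reduced')
print(np.allclose(U.T @ U, np.eye(2)))    # orthogonality condition (2.52)
print(np.allclose(U @ R, A))              # A = UR

# Singular-value decomposition A = USV^T with non-negative diagonal S
U2, s, Vt = np.linalg.svd(A, full_matrices=False)
print(np.allclose(U2 @ np.diag(s) @ Vt, A))   # equation (2.53)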
Both the QR and singular-value decompositions can also be applied to square
matrices. In addition, an n by n matrix A can be decomposed into a product of
a lower- and an upper-triangular matrix, thus
A = LR. (2.54)
In the literature this is also known as the LU decomposition, from the use of ‘U’ for
‘upper triangular’. Here another mnemonic, ‘U’ for ‘unitary’, has been employed.
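A brief sketch of (2.54), assuming SciPy is available (again not one of the book's own algorithms): note that practical library routines interchange rows for numerical stability, so what is factored is a row-permuted version of A rather than the plain product LR of the text.

import numpy as np
from scipy.linalg import lu

A = np.array([[4., 3.],
              [6., 3.]])

# SciPy returns A = P L R with P a permutation, L unit lower-triangular,
# R upper-triangular (pivoting is an addition to the plain A = LR of (2.54))
P, L, R = lu(A)
print(np.allclose(P @ L @ R, A))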
If the matrix A is symmetric and positive definite, the decomposition
A = LL^T     (2.55)
is possible and is referred to as the Choleski decomposition.
A scaled form of this decomposition with unit diagonal elements for L can be written
A = LDL^T
where D is a diagonal matrix.
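The following sketch (NumPy only; the symmetric positive definite test matrix is invented for the example) computes the Choleski factor of (2.55) and rescales its columns to obtain the unit-diagonal form A = LDL^T.

import numpy as np

A = np.array([[4., 2.],
              [2., 3.]])            # symmetric and positive definite

# Choleski decomposition: lower-triangular L with A = L L^T, equation (2.55)
L = np.linalg.cholesky(A)
print(np.allclose(L @ L.T, A))

# Scaled form A = L D L^T: divide column j of L by its diagonal element,
# and collect the squared diagonal elements into D
d = np.diag(L)
L_unit = L / d                      # unit diagonal elements
D = np.diag(d**2)
print(np.allclose(L_unit @ D @ L_unit.T, A))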
To underline the importance of decompositions, it can be shown by direct
substitution that if
A = USV^T     (2.53)
then the matrix
A^+ = VS^+ U^T     (2.56)
where
S^+_ii = 1/S_ii   for S_ii ≠ 0
       = 0        for S_ii = 0     (2.57)
satisfies the four conditions (2.40), (2.41), (2.44) and (2.45), that is
AA^+ A = USV^T VS^+ U^T USV^T
       = USS^+ SV^T     (2.58)
       = USV^T = A     (2.59)
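A short numerical check of (2.56) and (2.57), again using NumPy on an invented matrix: the pseudo-inverse is built from the singular-value decomposition and the four conditions (2.40), (2.41), (2.44) and (2.45) are verified directly. The tolerance used to decide which singular values count as zero is an addition beyond the exact definition in (2.57).

import numpy as np

A = np.array([[1., 2.],
              [2., 4.],
              [3., 1.]])            # m = 3, n = 2

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# S^+ as in (2.57): reciprocate the non-zero singular values, keep zeros at zero
# (a small tolerance replaces the exact test S_ii = 0 in floating-point arithmetic)
tol = max(A.shape) * np.finfo(float).eps * s.max()
s_plus = np.array([1.0 / x if x > tol else 0.0 for x in s])
A_plus = Vt.T @ np.diag(s_plus) @ U.T     # A^+ = VS^+U^T, equation (2.56)

# The four conditions satisfied by the pseudo-inverse
print(np.allclose(A @ A_plus @ A, A))             # AA^+A = A
print(np.allclose(A_plus @ A @ A_plus, A_plus))   # A^+AA^+ = A^+
print(np.allclose((A @ A_plus).T, A @ A_plus))    # AA^+ symmetric
print(np.allclose((A_plus @ A).T, A_plus @ A))    # A^+A symmetric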