Solution:
\[
k = 1:\qquad u_1 \leftarrow \frac{x_1}{\|x_1\|} = \frac{1}{\sqrt{2}}\begin{pmatrix} 1\\ 0\\ 0\\ -1 \end{pmatrix}
\]
\[
k = 2:\qquad u_2 \leftarrow x_2 - (u_1^T x_2)\,u_1 = \begin{pmatrix} 0\\ 1\\ 0\\ 0 \end{pmatrix},
\qquad u_2 \leftarrow \frac{u_2}{\|u_2\|} = \begin{pmatrix} 0\\ 1\\ 0\\ 0 \end{pmatrix}
\]
\[
k = 3:\qquad u_3 \leftarrow x_3 - (u_1^T x_3)\,u_1 - (u_2^T x_3)\,u_2 = \begin{pmatrix} 1\\ 0\\ 1\\ 1 \end{pmatrix},
\qquad u_3 \leftarrow \frac{u_3}{\|u_3\|} = \frac{1}{\sqrt{3}}\begin{pmatrix} 1\\ 0\\ 1\\ 1 \end{pmatrix}
\]
Thus
\[
u_1 = \frac{1}{\sqrt{2}}\begin{pmatrix} 1\\ 0\\ 0\\ -1 \end{pmatrix},\qquad
u_2 = \begin{pmatrix} 0\\ 1\\ 0\\ 0 \end{pmatrix},\qquad
u_3 = \frac{1}{\sqrt{3}}\begin{pmatrix} 1\\ 0\\ 1\\ 1 \end{pmatrix}
\]
is the desired orthonormal basis.
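As a quick numerical check, these three vectors can be stacked as the columns of a matrix $U$ and orthonormality confirmed by verifying $U^T U = I$. The following NumPy sketch (variable names chosen only for illustration) performs this check with the vectors from the solution above.

    import numpy as np

    # Orthonormal basis produced by the Gram-Schmidt steps above
    u1 = np.array([1.0, 0.0, 0.0, -1.0]) / np.sqrt(2)
    u2 = np.array([0.0, 1.0, 0.0, 0.0])
    u3 = np.array([1.0, 0.0, 1.0, 1.0]) / np.sqrt(3)

    # Stack the vectors as columns; orthonormal columns satisfy U^T U = I
    U = np.column_stack([u1, u2, u3])
    print(np.allclose(U.T @ U, np.eye(3)))   # prints True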
The Gram–Schmidt process frequently appears in the disguised form of a
matrix factorization. To see this, let $A_{m\times n} = [\,a_1\,|\,a_2\,|\cdots|\,a_n\,]$ be a matrix with
linearly independent columns. When Gram–Schmidt is applied to the columns
of $A$, the result is an orthonormal basis $\{q_1, q_2, \ldots, q_n\}$ for $R(A)$, where
\[
q_1 = \frac{a_1}{\nu_1}
\qquad\text{and}\qquad
q_k = \frac{a_k - \sum_{i=1}^{k-1}(q_i^T a_k)\,q_i}{\nu_k}
\quad\text{for } k = 2, 3, \ldots, n,
\]
where $\nu_1 = \|a_1\|$ and $\nu_k = \bigl\|\,a_k - \sum_{i=1}^{k-1}(q_i^T a_k)\,q_i\,\bigr\|$ for $k > 1$. The above
relationships can be rewritten as
\[
a_1 = \nu_1 q_1
\qquad\text{and}\qquad
a_k = (q_1^T a_k)\,q_1 + \cdots + (q_{k-1}^T a_k)\,q_{k-1} + \nu_k q_k
\quad\text{for } k > 1,
\]
which in turn can be expressed in matrix form by writing
\[
[\,a_1\,|\,a_2\,|\cdots|\,a_n\,] \;=\; [\,q_1\,|\,q_2\,|\cdots|\,q_n\,]
\begin{pmatrix}
\nu_1  & q_1^T a_2 & q_1^T a_3 & \cdots & q_1^T a_n \\
0      & \nu_2     & q_2^T a_3 & \cdots & q_2^T a_n \\
0      & 0         & \nu_3     & \cdots & q_3^T a_n \\
\vdots & \vdots    & \vdots    & \ddots & \vdots    \\
0      & 0         & 0         & \cdots & \nu_n
\end{pmatrix}.
\]
This says that it's possible to factor a matrix with independent columns as
$A_{m\times n} = Q_{m\times n} R_{n\times n}$, where the columns of $Q$ are an orthonormal basis for
$R(A)$ and $R$ is an upper-triangular matrix with positive diagonal elements.
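To see the factorization in action, here is a minimal NumPy sketch of the classical Gram–Schmidt recurrences above, written as a routine that produces $Q$ and $R$ from a matrix with linearly independent columns; the function name and the sample matrix are illustrative choices, and the code is a direct transcription of the formulas rather than a numerically robust QR implementation.

    import numpy as np

    def gram_schmidt_qr(A):
        """Classical Gram-Schmidt: A (m x n, independent columns) -> Q, R with A = Q R."""
        m, n = A.shape
        Q = np.zeros((m, n))
        R = np.zeros((n, n))
        for k in range(n):
            v = A[:, k].copy()
            for i in range(k):
                R[i, k] = Q[:, i] @ A[:, k]   # entry q_i^T a_k of R
                v -= R[i, k] * Q[:, i]        # subtract the component along q_i
            R[k, k] = np.linalg.norm(v)       # nu_k > 0 because the columns are independent
            Q[:, k] = v / R[k, k]
        return Q, R

    # Sample matrix with linearly independent columns (for illustration only)
    A = np.array([[1., 1., 0.],
                  [1., 0., 1.],
                  [0., 1., 1.],
                  [0., 0., 1.]])
    Q, R = gram_schmidt_qr(A)
    print(np.allclose(A, Q @ R), np.allclose(Q.T @ Q, np.eye(3)))   # True True

In floating-point work one would usually prefer a modified version of this recurrence or a library routine such as numpy.linalg.qr, which behaves better numerically than the classical formulation transcribed here.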