4.3 Linear Independence
4.3.12. Suppose that S = {u_1, u_2, ..., u_n} is a set of vectors from ℜ^m. Prove that S is linearly independent if and only if the set

    S' = \left\{\, u_1,\ \sum_{i=1}^{2} u_i,\ \sum_{i=1}^{3} u_i,\ \ldots,\ \sum_{i=1}^{n} u_i \,\right\}

is linearly independent.
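Before attempting the proof, the equivalence can be sanity-checked numerically by comparing the rank of the matrix whose columns are the u_i with the rank of the matrix of partial sums. The sketch below is only an illustration, not a proof; it assumes NumPy, and the random test vectors are arbitrary.

    import numpy as np

    # Illustration only: the partial-sum set {u_1, u_1+u_2, ..., u_1+...+u_n}
    # should have the same rank as {u_1, ..., u_n}.
    rng = np.random.default_rng(0)
    m, n = 5, 4
    U = rng.standard_normal((m, n))      # columns play the role of u_1, ..., u_n
    P = np.cumsum(U, axis=1)             # column j holds u_1 + ... + u_j
    print(np.linalg.matrix_rank(U), np.linalg.matrix_rank(P))   # ranks agree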
4.3.13. Which of the following sets of functions are linearly independent?
(a) {sin x, cos x, x sin x}.
(b) {e^x, x e^x, x^2 e^x}.
(c) {sin^2 x, cos^2 x, cos 2x}.
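One way to form a conjecture for each part (a heuristic, not a proof) is to sample the functions at several points and compute the rank of the sample matrix: full column rank means the sampled vectors, and hence the functions, are linearly independent, while a smaller rank hints at a dependence. A sketch assuming NumPy, with arbitrarily chosen sample points:

    import numpy as np

    # Sample each set of functions on a few points and report the rank of the
    # 7x3 sample matrix; rank 3 indicates independence of the sampled vectors.
    x = np.linspace(0.1, 2.0, 7)
    sets = {
        "(a)": [np.sin(x), np.cos(x), x * np.sin(x)],
        "(b)": [np.exp(x), x * np.exp(x), x**2 * np.exp(x)],
        "(c)": [np.sin(x)**2, np.cos(x)**2, np.cos(2 * x)],
    }
    for label, fs in sets.items():
        print(label, np.linalg.matrix_rank(np.column_stack(fs)))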
4.3.14. Prove that the converse of the statement given in Example 4.3.6 is false by showing that S = {x^3, |x|^3} is a linearly independent set, but the associated Wronski matrix W(x) is singular for all values of x.
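As a numerical check of the singularity claim (not a substitute for the proof), one can evaluate det W(x) at a few points, where W(x) stacks the two functions over their derivatives and d/dx |x|^3 = 3x|x| for every x. A small sketch assuming NumPy, with arbitrary sample points:

    import numpy as np

    # W(x) = [[x^3, |x|^3], [3x^2, 3x|x|]]; its determinant should vanish for all x.
    def wronski_det(x):
        W = np.array([[x**3,     abs(x)**3],
                      [3 * x**2, 3 * x * abs(x)]])
        return np.linalg.det(W)

    print([round(abs(wronski_det(x)), 12) for x in (-2.0, -0.5, 0.0, 1.0, 3.0)])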
4.3.15. If A^T is diagonally dominant, explain why partial pivoting is not needed when solving Ax = b by Gaussian elimination. Hint: If after one step of Gaussian elimination we have

    A = \begin{pmatrix} \alpha & d^T \\ c & B \end{pmatrix}
        \xrightarrow{\ \text{one step}\ }
        \begin{pmatrix} \alpha & d^T \\ 0 & B - \dfrac{c\,d^T}{\alpha} \end{pmatrix},
show that A^T being diagonally dominant implies that X^T must also be diagonally dominant, where X = B − cd^T/α.
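The sketch below illustrates the hint numerically (it does not prove it): it forms X = B − cd^T/α after one elimination step on a matrix whose transpose is diagonally dominant and checks that X is diagonally dominant by columns, i.e., that X^T is diagonally dominant. NumPy and the particular 3×3 test matrix are assumptions, not part of the exercise.

    import numpy as np

    # True when |m_jj| > sum of |m_ij| over i != j for every column j,
    # i.e., when M^T is diagonally dominant.
    def columns_dominant(M):
        return all(2 * abs(M[j, j]) > np.abs(M[:, j]).sum() for j in range(M.shape[0]))

    A = np.array([[ 4.0, 1.0, -1.0],
                  [ 1.0, 5.0,  2.0],
                  [-2.0, 1.0,  6.0]])        # A^T is diagonally dominant
    alpha, d = A[0, 0], A[0, 1:]
    c, B = A[1:, 0], A[1:, 1:]
    X = B - np.outer(c, d) / alpha           # block left after one elimination step
    print(columns_dominant(A), columns_dominant(X))   # True True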