Since $\mathbf{v}_1, \ldots, \mathbf{v}_n$ are linearly independent, the only way that
$c_1\mathbf{u}_1 + \cdots + c_m\mathbf{u}_m$ can be zero is if the sums $\sum_{i=1}^{m} a_{ji}c_i$ vanish
for $j = 1, \ldots, n$. This amounts to requiring that $AC = 0$,
where $C$ is the column consisting of $c_1, \ldots, c_m$. We know
from 2.1.3 that there is such a $C$ different from $0$ precisely
when the number of pivots of $A$ is less than $m$. So this is the
condition for $\mathbf{u}_1, \ldots, \mathbf{u}_m$ to be linearly dependent.
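The pivot test just described is easy to carry out mechanically. Below is a minimal sketch (our illustration, not from the text) using SymPy; the matrix A is a made-up example whose columns are the coordinate columns of three vectors relative to some chosen basis.

```python
# Pivot test for linear (in)dependence, as described above.
# Illustrative sketch; the matrix A is a made-up example.
from sympy import Matrix

# Columns of A are the coordinate columns of the vectors u_1, ..., u_m
# relative to a chosen basis v_1, ..., v_n (here n = 3, m = 3).
A = Matrix([
    [1, 0, 1],
    [0, 1, 1],
    [2, 1, 3],
])

m = A.cols             # the number of vectors being tested
num_pivots = A.rank()  # the rank equals the number of pivots after row reduction
if num_pivots < m:
    print("linearly dependent: AC = 0 has a solution C != 0")
else:
    print("linearly independent")
```

Here the third column is the sum of the first two, so the test reports linear dependence.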
Example 5.1.6
Are the polynomials $1 - x + 2x^2 - x^3$, $x + x^3$, $2 + x + 4x^2 + x^3$
linearly independent in $P_4(\mathbf{R})$?
Use the standard ordered basis $\{1, x, x^2, x^3\}$ of $P_4(\mathbf{R})$.
Then the coordinate columns of the given polynomials are the
columns of the matrix
$$\begin{pmatrix} 1 & 0 & 2 \\ -1 & 1 & 1 \\ 2 & 0 & 4 \\ -1 & 1 & 1 \end{pmatrix}.$$
Using row operations, we see that the number of pivots of
the matrix is 2, which is less than the number of vectors.
Therefore the given polynomials are linearly dependent.
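For the reader's convenience, here is one possible row reduction (our addition; the text leaves this step to the reader):
$$\begin{pmatrix} 1 & 0 & 2 \\ -1 & 1 & 1 \\ 2 & 0 & 4 \\ -1 & 1 & 1 \end{pmatrix}
\xrightarrow[\substack{R_3 \to R_3 - 2R_1 \\ R_4 \to R_4 + R_1}]{R_2 \to R_2 + R_1}
\begin{pmatrix} 1 & 0 & 2 \\ 0 & 1 & 3 \\ 0 & 0 & 0 \\ 0 & 1 & 3 \end{pmatrix}
\xrightarrow{R_4 \to R_4 - R_2}
\begin{pmatrix} 1 & 0 & 2 \\ 0 & 1 & 3 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix},$$
which has two pivots. One can even read off a dependence relation: the third column equals twice the first plus three times the second, that is,
$$2 + x + 4x^2 + x^3 = 2(1 - x + 2x^2 - x^3) + 3(x + x^3).$$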
The next theorem lessens the work needed to show that
a particular set is a basis.
Theorem 5.1.9
Let V be a finitely generated vector space with positive dimension n. Then
(i) any set of n linearly independent vectors of V is a
basis;
(ii) any set of n vectors that generates V is a basis.
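For instance (an illustration added here, not part of the original text), the columns of the matrix
$$\begin{pmatrix} 1 & 1 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 1 \end{pmatrix}$$
are linearly independent, since the matrix is already in echelon form with three pivots. Because $\mathbf{R}^3$ has dimension 3, part (i) guarantees that these three columns form a basis of $\mathbf{R}^3$; there is no need to check separately that they generate $\mathbf{R}^3$.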
Proof
Assume first that the vectors vi, v 2 ) ..., v n are linearly inde-
pendent. Then by 5.1.4 the set {vi, v 2 ,..., v n } is contained