4.3 LINEAR INDEPENDENCE
For a given set of vectors $S = \{v_1, v_2, \ldots, v_n\}$ there may or may not exist
dependency relationships in the sense that it may or may not be possible to
express one vector as a linear combination of the others. For example, in the set
$$A = \left\{ \begin{pmatrix} 1 \\ -1 \\ 2 \end{pmatrix},\; \begin{pmatrix} 3 \\ 0 \\ -1 \end{pmatrix},\; \begin{pmatrix} 9 \\ -3 \\ 4 \end{pmatrix} \right\},$$
the third vector is a linear combination of the first two—i.e., $v_3 = 3v_1 + 2v_2$.
Such a dependency can always be expressed in terms of a homogeneous equation
by writing
$$3v_1 + 2v_2 - v_3 = 0.$$
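The dependency in $A$ can also be checked numerically. The following is a minimal sketch (not part of the text) using NumPy; it verifies that $3v_1 + 2v_2 - v_3 = 0$ and recovers the coefficients by solving a least-squares problem.

```python
import numpy as np

# The three vectors of the set A above.
v1 = np.array([1.0, -1.0,  2.0])
v2 = np.array([3.0,  0.0, -1.0])
v3 = np.array([9.0, -3.0,  4.0])

# Verify the dependency 3*v1 + 2*v2 - v3 = 0.
print(3*v1 + 2*v2 - v3)            # -> [0. 0. 0.]

# Recover the coefficients by solving [v1 v2] x = v3.
M = np.column_stack([v1, v2])
x, *_ = np.linalg.lstsq(M, v3, rcond=None)
print(x)                           # -> approximately [3. 2.]
```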
On the other hand, it is evident that there are no dependency relationships in
the set
$$B = \left\{ \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix},\; \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix},\; \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} \right\}$$
because no vector can be expressed as a combination of the others. Another way
to say this is that there are no solutions for $\alpha_1$, $\alpha_2$, and $\alpha_3$ in the
homogeneous equation
$$\alpha_1 v_1 + \alpha_2 v_2 + \alpha_3 v_3 = 0$$
other than the trivial solution $\alpha_1 = \alpha_2 = \alpha_3 = 0$. These observations are the
basis for the following definitions.
Linear Independence
A set of vectors $S = \{v_1, v_2, \ldots, v_n\}$ is said to be a linearly in-
dependent set whenever the only solution for the scalars $\alpha_i$ in the
homogeneous equation
$$\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_n v_n = 0 \tag{4.3.1}$$
is the trivial solution $\alpha_1 = \alpha_2 = \cdots = \alpha_n = 0$. Whenever there is a
nontrivial solution for the $\alpha$'s (i.e., at least one $\alpha_i \neq 0$) in (4.3.1), the
set $S$ is said to be a linearly dependent set. In other words, linearly
independent sets are those that contain no dependency relationships,
and linearly dependent sets are those in which at least one vector is a
combination of the others. We will agree that the empty set is always
linearly independent.
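As a computational companion to this definition, here is a small NumPy sketch (not from the text); the helper name is_independent is my own. It tests whether (4.3.1) admits only the trivial solution by checking that the matrix whose columns are the $v_i$ has rank equal to the number of vectors.

```python
import numpy as np

def is_independent(vectors):
    """Return True when a1*v1 + ... + an*vn = 0 has only the
    trivial solution, i.e. when the matrix whose columns are
    the given vectors has rank equal to the number of vectors."""
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == len(vectors)

# The dependent set A and the independent set B from this section.
A = [np.array([1, -1, 2]), np.array([3, 0, -1]), np.array([9, -3, 4])]
B = [np.array([1, 0, 0]), np.array([0, 1, 0]), np.array([0, 0, 1])]

print(is_independent(A))   # False: 3*v1 + 2*v2 - v3 = 0
print(is_independent(B))   # True: only the trivial solution exists
```

Note that matrix_rank relies on a tolerance applied to singular values, so in floating-point arithmetic this is a practical test rather than an exact one.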