$$\tau v = \lambda v$$
In this case, $v$ is called an eigenvector (or characteristic vector) of $\tau$ associated with $\lambda$.
2) A scalar $\lambda$ is an eigenvalue for a matrix $A$ if there exists a nonzero column vector $x$ for which
$$Ax = \lambda x$$
In this case, $x$ is called an eigenvector (or characteristic vector) for $A$ associated with $\lambda$.
3) The set of all eigenvectors associated with a given eigenvalue $\lambda$, together with the zero vector, forms a subspace of $V$, called the eigenspace of $\lambda$ and denoted by $\mathcal{E}_\lambda$. This applies to both linear operators and matrices.
4) The set of all eigenvalues of an operator or matrix is called the spectrum of the operator or matrix. We denote the spectrum of $\tau$ by $\operatorname{Spec}(\tau)$.
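For a concrete illustration (a small example added here, not part of the original text), take the real matrix
$$A = \begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix}$$
Direct computation gives
$$A \begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 2 \\ 0 \end{pmatrix} = 2\begin{pmatrix} 1 \\ 0 \end{pmatrix}
\qquad\text{and}\qquad
A \begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 3 \\ 3 \end{pmatrix} = 3\begin{pmatrix} 1 \\ 1 \end{pmatrix}$$
so $2$ and $3$ are eigenvalues of $A$ with eigenvectors $(1,0)^T$ and $(1,1)^T$, the eigenspaces are $\mathcal{E}_2 = \operatorname{span}\{(1,0)^T\}$ and $\mathcal{E}_3 = \operatorname{span}\{(1,1)^T\}$, and $\operatorname{Spec}(A) = \{2,3\}$.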
Theorem 8.1 Let $\tau \in \mathcal{L}(V)$ have minimal polynomial $m_\tau(x)$ and characteristic polynomial $c_\tau(x)$.
1) The spectrum of $\tau$ is the set of all roots of $m_\tau(x)$ or of $c_\tau(x)$, not counting multiplicity.
2) The eigenvalues of a matrix are invariants under similarity.
3) The eigenspace $\mathcal{E}_\lambda$ of the matrix $A$ is the solution space to the homogeneous system of equations
$$(\lambda I - A)x = 0$$
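To illustrate parts 1) and 3) with a small example (added here, not part of the original text): for $A = 2I_2 = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix}$ the characteristic polynomial is $c_A(x) = (x-2)^2$ while the minimal polynomial is $m_A(x) = x - 2$; both have the same root set $\{2\} = \operatorname{Spec}(A)$, differing only in multiplicity. The eigenspace $\mathcal{E}_2$ is the solution space of $(2I - A)x = 0$, which in this case is all of $\mathbb{R}^2$ since $2I - A = 0$.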
One way to compute the eigenvalues of a linear operator $\tau$ is to first represent $\tau$ by a matrix $A$ and then solve the characteristic equation
$$\det(xI - A) = 0$$
Unfortunately, it is quite likely that this equation cannot be solved algebraically when $\dim(V) \ge 5$. As a result, the art of approximating the eigenvalues of a matrix is a very important area of applied linear algebra.
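As a brief worked illustration of this method (continuing the $2\times 2$ example above; not part of the original text), for $A = \begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix}$ we have
$$\det(xI - A) = \det\begin{pmatrix} x-2 & -1 \\ 0 & x-3 \end{pmatrix} = (x-2)(x-3) = 0$$
whose roots $x = 2$ and $x = 3$ are precisely the eigenvalues found earlier. For a $5\times 5$ matrix, however, the characteristic polynomial has degree $5$, and by the Abel-Ruffini theorem there is no general formula for its roots in terms of radicals, which is why numerical approximation methods are needed.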
The following theorem describes the relationship between eigenspaces and
eigenvectors of distinct eigenvalues.
Theorem 8.2 Suppose that $\lambda_1, \ldots, \lambda_k$ are distinct eigenvalues of a linear operator $\tau \in \mathcal{L}(V)$.
1) Eigenvectors associated with distinct eigenvalues are linearly independent; that is, if $v_i \in \mathcal{E}_{\lambda_i}$, then the set $\{v_1, \ldots, v_k\}$ is linearly independent.
2) The sum $\mathcal{E}_{\lambda_1} + \cdots + \mathcal{E}_{\lambda_k}$ is direct; that is, $\mathcal{E}_{\lambda_1} \oplus \cdots \oplus \mathcal{E}_{\lambda_k}$ exists.
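Before the proof, a quick illustration (continuing the example above; not part of the original text): for $A = \begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix}$, the eigenvectors $(1,0)^T \in \mathcal{E}_2$ and $(1,1)^T \in \mathcal{E}_3$ belong to distinct eigenvalues and are linearly independent, and $\mathcal{E}_2 \oplus \mathcal{E}_3 = \mathbb{R}^2$ since $\mathcal{E}_2 \cap \mathcal{E}_3 = \{0\}$.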
Proof. For part 1), if $\{v_1, \ldots, v_k\}$ is linearly dependent, then by renumbering if necessary, we may assume that among all nontrivial linear combinations of