The Minimal Polynomial
Proof: Let
p(z) = a_0 + a_1 z + a_2 z^2 + · · · + a_{m−1} z^{m−1} + z^m
be the minimal polynomial of T.
First suppose that λ ∈ F is a root of p. Then the minimal polynomial
of T can be written in the form
p(z) = (z − λ)q(z),
where q is a monic polynomial with coefficients in F (see 4.1). Because
p(T) = 0, we have
0 = (T − λI)(q(T)v)
for all v ∈ V. Because the degree of q is less than the degree of the
minimal polynomial p, there must exist at least one vector v ∈ V such
that q(T)v ≠ 0. For such a v, the equation above shows that q(T)v is a
nonzero vector satisfying (T − λI)(q(T)v) = 0. Thus λ is an eigenvalue
of T, as desired.
To prove the other direction, now suppose that λ ∈ F is an eigen-
value of T. Let v be a nonzero vector in V such that Tv = λv. Repeated
applications of T to both sides of this equation show that T^j v = λ^j v
for every nonnegative integer j. Thus
0 = p(T)v = (a_0 + a_1 T + a_2 T^2 + · · · + a_{m−1} T^{m−1} + T^m)v
          = (a_0 + a_1 λ + a_2 λ^2 + · · · + a_{m−1} λ^{m−1} + λ^m)v
          = p(λ)v.
Because v ≠ 0, the equation above implies that p(λ) = 0, as desired.
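As a concrete illustration (not part of the original proof), suppose T is the
operator on F^2 whose matrix is
[ 3  1 ]
[ 0  3 ].
Then T − 3I ≠ 0 but (T − 3I)^2 = 0, so the minimal polynomial of T is
(z − 3)^2 = z^2 − 6z + 9, and its only root, 3, is indeed the only eigenvalue
of T.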
Suppose we are given, in concrete form, the matrix (with respect to
some basis) of some operator T ∈ L(V). To find the minimal polynomial
of T, consider
(M(I), M(T), M(T)^2, . . . , M(T)^m)
for m = 1, 2, . . . until this list is linearly dependent. Then find the
scalars a_0, a_1, a_2, . . . , a_{m−1} ∈ F such that
a_0 M(I) + a_1 M(T) + a_2 M(T)^2 + · · · + a_{m−1} M(T)^{m−1} + M(T)^m = 0.
You can think of this as a system of (dim V)^2 equations in the m variables
a_0, a_1, . . . , a_{m−1}. The scalars a_0, a_1, a_2, . . . , a_{m−1}, 1 will then be
the coefficients of the minimal polynomial of T. All this can be computed
using a familiar process such as Gaussian elimination.
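The sketch below (not from the text) carries out this procedure numerically
with NumPy over F = R; the function name minimal_polynomial_coeffs and the
tolerance tol are illustrative choices. Flattening each power M(T)^j turns the
matrix equation above into the (dim V)^2 linear equations in m variables
described above, which are solved here by a least-squares routine rather than
by hand Gaussian elimination.

import numpy as np

def minimal_polynomial_coeffs(A, tol=1e-10):
    # Return [a_0, a_1, ..., a_{m-1}, 1], the coefficients of the minimal
    # polynomial of the square matrix A, following the procedure above:
    # increase m until (I, A, A^2, ..., A^m) is linearly dependent, then
    # solve a_0 I + a_1 A + ... + a_{m-1} A^{m-1} + A^m = 0 for the a_j.
    n = A.shape[0]
    powers = [np.eye(n)]                 # M(I), M(T), M(T)^2, ...
    for m in range(1, n * n + 1):        # n^2 + 1 such matrices must be
        powers.append(powers[-1] @ A)    # dependent, since dim of the space
                                         # of n-by-n matrices is n^2
        # Flattening each power gives (dim V)^2 equations in m variables.
        B = np.column_stack([P.ravel() for P in powers[:m]])
        rhs = -powers[m].ravel()
        a, *_ = np.linalg.lstsq(B, rhs, rcond=None)
        if np.linalg.norm(B @ a - rhs) < tol:   # dependence found at this m
            return np.append(a, 1.0)

A = np.array([[3.0, 1.0],
              [0.0, 3.0]])
print(minimal_polynomial_coeffs(A))   # approximately [9, -6, 1]

For the 2-by-2 matrix above this prints the coefficients of
z^2 − 6z + 9 = (z − 3)^2, matching the worked example following the proof.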