10
Approximation of Eigenvalues
The computation of the eigenvalues of a square matrix is a problem of
considerable difficulty. The naive idea, according to which it is enough to
compute the characteristic polynomial and then find its roots, turns out
to be hopeless because of Abel’s theorem, which states that the general
equation P(x) = 0, where P is a polynomial of degree d ≥ 5, is not solvable
by radicals, that is, by algebraic operations and the extraction of roots of
arbitrary order. For this reason, there
exists no direct method, even an expensive one, for the computation of
Sp(M).
Dropping half of that program, one could compute the characteristic
polynomial exactly, then compute an approximation of its roots. But the
cost and the instability of the computation are prohibitive. Amazingly, the
opposite strategy is often used: A standard algorithm for computing the
roots of a polynomial of high degree consists in forming its companion
1
matrix and then applying to this matrix the QR algorithm to compute its
eigenvalues with good accuracy.
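As an illustration of that strategy, here is a minimal sketch in Python/NumPy
(the helper companion below is an assumed construction for the monic case,
not part of the text): we form the companion matrix of a cubic and hand it to
np.linalg.eigvals, whose underlying LAPACK routine is QR-based, so the
computed eigenvalues approximate the roots of the polynomial.

    import numpy as np

    def companion(coeffs):
        # Companion matrix of the monic polynomial
        #   x^d + c_{d-1} x^{d-1} + ... + c_1 x + c_0,
        # with coeffs = [c_0, c_1, ..., c_{d-1}] (constant term first).
        d = len(coeffs)
        C = np.zeros((d, d))
        C[1:, :-1] = np.eye(d - 1)        # ones on the subdiagonal
        C[:, -1] = -np.asarray(coeffs)    # last column carries -c_0, ..., -c_{d-1}
        return C

    # p(x) = x^3 - 6x^2 + 11x - 6 = (x - 1)(x - 2)(x - 3)
    C = companion([-6.0, 11.0, -6.0])
    print(np.sort(np.linalg.eigvals(C)))  # approximately [1. 2. 3.]

This is essentially how numpy.roots proceeds; note also that this companion
matrix is already in Hessenberg form, which is the point of the footnote below.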
Hence, all the methods are iterative. In particular, we shall limit ourselves
to the cases K = ℝ or ℂ. The general strategy consists in constructing a
sequence of matrices

M^{(0)}, M^{(1)}, ..., M^{(m)}, ...,
¹ Fortunately, the companion matrix is a Hessenberg matrix; see below for this notion and its practical aspects.