where c_1, ..., c_r are the eigenvalues corresponding to
X_1, ..., X_r respectively. Hence
$$
AU = (X_1\ X_2\ \cdots\ X_r)
\begin{pmatrix}
c_1 & 0 & \cdots & 0 \\
0 & c_2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & c_r
\end{pmatrix}
= UD,
$$
where D is the diagonal matrix with diagonal entries c_1, ...,
c_r. Since the columns of U form an orthonormal set, U*U = I_r.
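The two identities AU = UD and U*U = I_r can be checked numerically. The following is a minimal sketch and not part of the text: the random Hermitian matrix, the choice r = 3, and the use of numpy.linalg.eigh to supply orthonormal eigenvectors are all assumptions made purely for the illustration.

```python
# Minimal numerical check of AU = UD and U*U = I_r for r orthonormal
# eigenvectors of a Hermitian matrix (illustrative example only).
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (B + B.conj().T) / 2                       # an arbitrary Hermitian matrix

eigenvalues, eigenvectors = np.linalg.eigh(A)  # orthonormal eigenvectors of A

r = 3                                          # take r of the n eigenvectors
U = eigenvectors[:, :r]                        # columns X_1, ..., X_r
D = np.diag(eigenvalues[:r])                   # diagonal entries c_1, ..., c_r

print(np.allclose(A @ U, U @ D))                # AU = UD
print(np.allclose(U.conj().T @ U, np.eye(r)))   # U*U = I_r
```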
In general r ≤ n, but should it be the case that r = n,
then U is n x n and we have U^{-1} = U*, so that U is unitary
(see 7.3). Therefore U*AU = D and A is diagonalized by the
matrix U. In other words, if there exist n mutually orthogonal
eigenvectors of A, then A can be diagonalized by a unitary
matrix. The outstanding question is, of course, whether there
are always that many linearly independent eigenvectors. We
shall shortly see that this is the case.
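For the case r = n just described, a similarly minimal sketch (again an illustration only, with an assumed random Hermitian example and numpy.linalg.eigh) confirms that U is then unitary and U*AU = D:

```python
# Illustration of the case r = n: a Hermitian matrix is diagonalized by a
# unitary matrix built from n orthonormal eigenvectors.
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (B + B.conj().T) / 2            # an arbitrary Hermitian matrix

eigenvalues, U = np.linalg.eigh(A)  # n orthonormal eigenvectors: U is unitary

print(np.allclose(U.conj().T @ U, np.eye(4)))                 # U^{-1} = U*
print(np.allclose(U.conj().T @ A @ U, np.diag(eigenvalues)))  # U*AU = D
```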
A key result must first be established.
Theorem 9.1.2 (Schur's Theorem)
Let A be an arbitrary square complex matrix. Then there is a
unitary matrix U such that U*AU is upper triangular. More-
over, if A is a real symmetric matrix, then U can be chosen
real and orthogonal.
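Before turning to the proof, the conclusion of the theorem can be observed numerically. The sketch below is illustrative only; it assumes SciPy's scipy.linalg.schur routine and a randomly chosen complex matrix, neither of which appears in the text.

```python
# Numerical illustration of Schur's Theorem: U*AU is upper triangular
# for some unitary U (here obtained from scipy.linalg.schur).
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))  # arbitrary

T, U = schur(A, output='complex')  # A = U T U* with U unitary, T triangular

print(np.allclose(U.conj().T @ A @ U, T))      # U*AU equals the triangular T
print(np.allclose(np.tril(T, -1), 0))          # entries below the diagonal are 0
print(np.allclose(U.conj().T @ U, np.eye(4)))  # U is unitary
```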
Proof
Let A be an n x n matrix. The proof is by induction on n.
Of course, if n = 1, then A is already upper triangular, so
let n > 1. There is an eigenvector X_1 of A, with associated
eigenvalue c_1 say. Here we can choose X_1 to be a unit vector
in C^n. Using 5.1.4 we adjoin vectors to X_1 to form a basis of
C^n. Then the Gram-Schmidt procedure (in the complex case)
may be applied to produce an orthonormal basis X_1, ..., X_n
of C^n; note that X_1 is a member of this basis.
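The step just described can be mirrored numerically. This sketch is an illustration only: it uses numpy.linalg.eig to obtain a unit eigenvector and a QR factorization in place of the Gram-Schmidt procedure, and the example matrix is an arbitrary choice. Conjugating A by the resulting unitary matrix produces a first column of the form (c_1, 0, ..., 0)^T, which is the block shape the induction argument relies on.

```python
# Illustration of the proof's first step: extend a unit eigenvector X_1 to an
# orthonormal basis and conjugate A by the resulting unitary matrix.
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))  # arbitrary

eigenvalues, eigenvectors = np.linalg.eig(A)
c1 = eigenvalues[0]
X1 = eigenvectors[:, 0]               # a unit eigenvector of A

# Adjoin random vectors to X1 and orthonormalize; QR stands in for the
# Gram-Schmidt procedure, and a phase correction keeps X1 as the first column.
M = np.column_stack([X1] + [rng.standard_normal(4) + 1j * rng.standard_normal(4)
                            for _ in range(3)])
Q, R = np.linalg.qr(M)
Q[:, 0] *= R[0, 0]                    # |R[0, 0]| = 1 since X1 is a unit vector

T = Q.conj().T @ A @ Q
print(np.allclose(T[1:, 0], 0))       # first column is (c_1, 0, ..., 0)^T
print(np.isclose(T[0, 0], c1))        # with c_1 in the top left corner
```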