3 Matrix eigenvalue analysis
We now resume our discussion of linear algebra, which previously focused upon the solution of linear systems Ax = b by Gaussian elimination. The interpretation of A as a linear transformation was found useful in understanding the existence and uniqueness of solutions. Here, we consider a powerful tool in analyzing the transformational properties of a matrix, eigenvalue analysis, based upon identifying for a matrix A the eigenvectors w and corresponding scalar eigenvalues λ such that

Aw = λw    (3.1)

We shall encounter numerous situations in which eigenvalue analysis provides insight into the behavior and performance of an algorithm, or is itself of direct use, as when estimating the vibrational frequencies of a structure or when calculating the states of a system in quantum mechanics. The related method of singular value decomposition (SVD), an extension of eigenvalue analysis to nonsquare matrices, is also discussed.
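As a concrete point of reference, the following sketch (not from the text; it assumes NumPy and an arbitrary symmetric example matrix) computes the eigenvalue/eigenvector pairs of a small matrix and checks that each pair satisfies Eq. (3.1).

import numpy as np

# Arbitrary symmetric example matrix (illustrative choice only)
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns
# are the corresponding eigenvectors w
eigenvalues, eigenvectors = np.linalg.eig(A)

for k in range(A.shape[0]):
    lam = eigenvalues[k]
    w = eigenvectors[:, k]
    # residual ||Aw - lambda*w|| should be near machine precision
    print(lam, np.linalg.norm(A @ w - lam * w))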
Orthogonal matrices
We begin our discussion of eigenvalue analysis by demonstrating how it may be used
to diagnose the transformational properties of a matrix. Here, we consider a 3 × 3 real
matrix Q that rotates vectors in ℝ³. We specify the particular rotation that it performs by designating an orthonormal basis set {u^[1], u^[2], u^[3]} that is obtained from the orthonormal basis
e^[1] = [1, 0, 0]^T    e^[2] = [0, 1, 0]^T    e^[3] = [0, 0, 1]^T    i.e.  e_j^[m] = δ_mj    (3.2)
by transformation under Q (Figure 3.1),
u^[k] = Qe^[k]        u^[j] · u^[k] = δ_jk    (3.3)
Note that we rotate the vectors, not the coordinate system. We now ask:
If we only know the new basis vectors, can we determine the elements of Q?
Given only Q, is there any way that we can recognize that it performs a rotation, and if so,
can we extract from Q any information about the particular rotation that it represents?
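A minimal numerical sketch of these questions follows, assuming NumPy and an illustrative choice of Q as a rotation about the e^[3] axis by an angle θ (the axis and angle are assumptions made here for the example only). It shows that the rotated basis vectors u^[k] = Qe^[k] are simply the columns of Q and remain orthonormal, and it examines the eigenvalues that such a rotation matrix produces.

import numpy as np

theta = 0.3                      # rotation angle in radians, arbitrary for illustration
c, s = np.cos(theta), np.sin(theta)
Q = np.array([[c, -s, 0.0],
              [s,  c, 0.0],
              [0.0, 0.0, 1.0]])  # rotation about the e^[3] axis

# The columns of Q are the rotated basis vectors u^[k] = Q e^[k],
# and they satisfy u^[j] . u^[k] = delta_jk, i.e. Q^T Q = I
print(np.allclose(Q.T @ Q, np.eye(3)))   # True

# Eigenvalues of a 3x3 rotation matrix: one equals 1, and the other two
# form the complex pair cos(theta) +/- i*sin(theta); the eigenvector
# belonging to lambda = 1 lies along the rotation axis (here e^[3])
eigenvalues, eigenvectors = np.linalg.eig(Q)
print(eigenvalues)
axis = eigenvectors[:, np.argmin(np.abs(eigenvalues - 1.0))].real
print(axis)                               # proportional to [0, 0, 1]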