Page 305 - Advanced engineering mathematics
9.3 Some Special Types of Matrices 285
THEOREM 9.8

Let A be an n × n matrix of real numbers. Then
1. A is orthogonal if and only if the row vectors are mutually orthogonal unit vectors in R^n.
2. A is orthogonal if and only if the column vectors are mutually orthogonal unit vectors in R^n.
We say that the row vectors of an orthogonal matrix form an orthonormal set of vectors in R^n. The column vectors also form an orthonormal set.
Proof  The i, j element of AA^t is the dot product of row i of A with column j of A^t, and this is the dot product of row i of A with row j of A.
Suppose A is orthogonal, so that AA^t = I_n. If i ≠ j, then this dot product is zero, because the i, j element of I_n is zero. And if i = j, then this dot product is 1, because the i, i element of I_n is 1. This proves that, if A is an orthogonal matrix, then its rows form an orthonormal set of vectors in R^n.
Conversely, suppose the rows are mutually orthogonal unit vectors in R^n. Then the i, j element of AA^t is 0 if i ≠ j and 1 if i = j, so AA^t = I_n.
By applying this argument to A^t, the transpose is orthogonal if and only if its rows are mutually orthogonal unit vectors, and these rows are the columns of A.
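The equivalence in the proof can be checked numerically. The following sketch (our illustration, not from the text; the sample matrix and helper names are our own choices) verifies for one 3 × 3 matrix that AA^t = I_n exactly when the rows form an orthonormal set:

```python
import math

def transpose(A):
    # transpose a matrix stored as a list of rows
    return [list(col) for col in zip(*A)]

def matmul(A, B):
    # standard row-by-column matrix product
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def is_identity(M, tol=1e-12):
    n = len(M)
    return all(abs(M[i][j] - (1.0 if i == j else 0.0)) < tol
               for i in range(n) for j in range(n))

def rows_orthonormal(A, tol=1e-12):
    # dot product of row i with row j should be 1 if i == j, else 0
    n = len(A)
    return all(abs(sum(A[i][k] * A[j][k] for k in range(n))
                   - (1.0 if i == j else 0.0)) < tol
               for i in range(n) for j in range(n))

t = math.pi / 7  # an arbitrary angle; rotation about the z-axis
A = [[math.cos(t),  math.sin(t), 0.0],
     [-math.sin(t), math.cos(t), 0.0],
     [0.0,          0.0,         1.0]]

# both characterizations of orthogonality agree for this A
assert is_identity(matmul(A, transpose(A)))
assert rows_orthonormal(A)
```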
We now know a lot about orthogonal matrices. We will use this information to determine all
2 × 2 real orthogonal matrices. Suppose
    Q = [ a  b ]
        [ c  d ]
is orthogonal. What does this tell us about a,b,c and d? Because the row (column) vectors are
mutually orthogonal unit vectors,
    ac + bd = 0
    ab + cd = 0
    a^2 + b^2 = 1
    c^2 + d^2 = 1.
Furthermore, |Q| = ±1, so
    ad − bc = 1  or  ad − bc = −1.
By analyzing these equations in all cases, we find that there must be some θ in [0, 2π) such that
a = cos(θ) and b = sin(θ), and Q must have one of the two forms:
    [  cos(θ)  sin(θ) ]      [ cos(θ)   sin(θ) ]
    [ −sin(θ)  cos(θ) ]  or  [ sin(θ)  −cos(θ) ],
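One step of that case analysis can be sketched as follows (our summary, not part of the original derivation):

```latex
% Since a^2 + b^2 = 1, choose \theta \in [0, 2\pi) with
%   a = \cos\theta, \quad b = \sin\theta.
% Then ac + bd = 0 says (c, d) is orthogonal to (a, b), and
% c^2 + d^2 = 1 makes (c, d) a unit vector, so
%   (c, d) = \pm(-\sin\theta, \cos\theta).
% The choice + gives ad - bc = \cos^2\theta + \sin^2\theta = 1,
% and the choice - gives ad - bc = -1.
```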
depending on whether the determinant is 1 or −1. For example, with θ = π/6, we obtain the
orthogonal 2 × 2 matrices
    [ √3/2   1/2  ]      [ √3/2    1/2  ]
    [ −1/2   √3/2 ]  or  [  1/2  −√3/2  ].
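As a quick numeric check of the θ = π/6 example (our sketch; the names Q_plus and Q_minus are our own), the two matrices have determinants +1 and −1 respectively, and their rows are unit vectors because cos² θ + sin² θ = 1:

```python
import math

theta = math.pi / 6
c, s = math.cos(theta), math.sin(theta)  # sqrt(3)/2 and 1/2

Q_plus = [[c, s], [-s, c]]    # the determinant-(+1) form (a rotation)
Q_minus = [[c, s], [s, -c]]   # the determinant-(-1) form (a reflection)

def det2(M):
    # determinant ad - bc of a 2 x 2 matrix
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

assert abs(det2(Q_plus) - 1.0) < 1e-12
assert abs(det2(Q_minus) + 1.0) < 1e-12
assert abs(c * c + s * s - 1.0) < 1e-12  # rows are unit vectors
```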
If we put Theorems 9.4 and 9.8 together, we obtain an interesting conclusion. Suppose S is
a real, symmetric n × n matrix with n distinct eigenvalues. Then the associated eigenvectors are
orthogonal. These may not be unit vectors. However, a scalar multiple of an eigenvector is still
an eigenvector. Divide each eigenvector by its length and use these unit eigenvectors as columns
of an orthogonal matrix Q that diagonalizes S. This proves the following.
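The construction in this paragraph can be carried out concretely. In the sketch below (our illustrative example; the matrix S and all names are our own choices), S = [[2, 1], [1, 2]] is symmetric with distinct eigenvalues 3 and 1 and orthogonal eigenvectors (1, 1) and (1, −1); dividing each by its length √2 and using them as columns of Q gives an orthogonal matrix with Q^t S Q diagonal:

```python
import math

S = [[2.0, 1.0], [1.0, 2.0]]   # symmetric, eigenvalues 3 and 1

r = 1.0 / math.sqrt(2.0)
# columns of Q are the unit eigenvectors (1,1)/sqrt(2) and (1,-1)/sqrt(2)
Q = [[r,  r],
     [r, -r]]

def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# Q^t S Q should be the diagonal matrix of eigenvalues, diag(3, 1)
D = matmul(matmul(transpose(Q), S), Q)
assert abs(D[0][0] - 3.0) < 1e-12 and abs(D[1][1] - 1.0) < 1e-12
assert abs(D[0][1]) < 1e-12 and abs(D[1][0]) < 1e-12
```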