Page 189 - Matrix Analysis & Applied Linear Algebra
184 Chapter 4 Vector Spaces
Example 4.3.3
Diagonal Dominance. A matrix $A_{n\times n}$ is said to be diagonally dominant whenever
$$
|a_{ii}| \;>\; \sum_{\substack{j=1 \\ j\ne i}}^{n} |a_{ij}| \qquad \text{for each } i = 1, 2, \ldots, n.
$$
That is, the magnitude of each diagonal entry exceeds the sum of the magni-
tudes of the off-diagonal entries in the corresponding row. Diagonally dominant
matrices occur naturally in a wide variety of practical applications, and when
solving a diagonally dominant system by Gaussian elimination, partial pivoting
is never required—you are asked to provide the details in Exercise 4.3.15.
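The row-wise test in the definition is easy to mechanize. The following is a minimal Python sketch (the function name and example matrix are illustrative, not from the text) that checks strict diagonal dominance exactly as defined above:

```python
def is_diagonally_dominant(A):
    """Return True when each diagonal entry strictly exceeds, in magnitude,
    the sum of the magnitudes of the off-diagonal entries in its row."""
    n = len(A)
    return all(
        abs(A[i][i]) > sum(abs(A[i][j]) for j in range(n) if j != i)
        for i in range(n)
    )

# Example: every row is dominated by its diagonal entry.
A = [[4, 1, 2],
     [1, 5, 3],
     [2, 2, 6]]
print(is_diagonally_dominant(A))          # → True
print(is_diagonally_dominant([[1, 2],
                              [3, 4]]))   # → False (row 1: |1| is not > |2|)
```

Note that the inequality must be strict; matrices satisfying only $|a_{ii}| \ge \sum_{j\ne i}|a_{ij}|$ (weak dominance) can be singular.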
Problem: In 1900, Minkowski (p. 278) discovered that all diagonally dominant
matrices are nonsingular. Establish the validity of Minkowski’s result.
Solution: The strategy is to prove that if $A$ is diagonally dominant, then $N(A) = \{0\}$, so that (4.3.2) together with (4.3.6) will provide the desired conclusion. Use an indirect argument: suppose there exists a vector $x \ne 0$ such that $Ax = 0$, and let $x_k$ be an entry of maximum magnitude in $x$ (so $|x_k| > 0$ because $x \ne 0$).
Focus on the $k$th component of $Ax$, and write the equation $A_{k*}x = 0$ as
$$
a_{kk}x_k = -\sum_{\substack{j=1 \\ j\ne k}}^{n} a_{kj}x_j.
$$
Taking absolute values of both sides and using the triangle inequality together with the fact that $|x_j| \le |x_k|$ for each $j$ produces
$$
|a_{kk}||x_k|
= \Bigl|\sum_{\substack{j=1 \\ j\ne k}}^{n} a_{kj}x_j\Bigr|
\le \sum_{\substack{j=1 \\ j\ne k}}^{n} |a_{kj}x_j|
= \sum_{\substack{j=1 \\ j\ne k}}^{n} |a_{kj}||x_j|
\le \Bigl(\sum_{\substack{j=1 \\ j\ne k}}^{n} |a_{kj}|\Bigr)|x_k|.
$$
But since $|x_k| > 0$, dividing through by $|x_k|$ implies that
$$
|a_{kk}| \le \sum_{\substack{j=1 \\ j\ne k}}^{n} |a_{kj}|,
$$
which violates the hypothesis that $A$ is diagonally dominant. Therefore, the assumption that there exists a nonzero vector in $N(A)$ must be false, so we may conclude that $N(A) = \{0\}$, and hence $A$ is nonsingular.
Note: An alternate solution is given in Example 7.1.6 on p. 499.
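Minkowski's result can be sanity-checked numerically. The sketch below (pure Python; the function name and test matrix are illustrative assumptions) runs Gaussian elimination *without* pivoting, which, per the remark above and Exercise 4.3.15, never encounters a zero pivot on a diagonally dominant matrix, and returns the product of the pivots, i.e., the determinant. A nonzero result confirms nonsingularity for this example:

```python
def det_no_pivot(A):
    """Determinant via Gaussian elimination with NO row interchanges.
    Safe for diagonally dominant matrices, whose pivots never vanish."""
    n = len(A)
    U = [row[:] for row in A]   # work on a copy
    d = 1.0
    for k in range(n):
        d *= U[k][k]            # accumulate the pivot
        for i in range(k + 1, n):
            m = U[i][k] / U[k][k]
            for j in range(k, n):
                U[i][j] -= m * U[k][j]
    return d

# The same diagonally dominant example: det = 80, nonzero as Minkowski predicts.
A = [[4, 1, 2],
     [1, 5, 3],
     [2, 2, 6]]
print(det_no_pivot(A))  # → 80.0
```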