3.9 Elementary Matrices and Equivalence
Proof. It is always true that $A \overset{\text{row}}{\sim} \mathbf{E}_A$, so there is a nonsingular matrix $P$ such that $PA = \mathbf{E}_A$. If $\operatorname{rank}(A) = r$, then the basic columns in $\mathbf{E}_A$ are the $r$ unit columns. Apply column interchanges to $\mathbf{E}_A$ so as to move these $r$ unit columns to the far left-hand side. If $Q_1$ is the product of the elementary matrices corresponding to these column interchanges, then $PAQ_1$ has the form
$$
PAQ_1 = \mathbf{E}_A Q_1 = \begin{pmatrix} I_r & J \\ 0 & 0 \end{pmatrix}.
$$
Multiplying both sides of this equation on the right by the nonsingular matrix
$$
Q_2 = \begin{pmatrix} I_r & -J \\ 0 & I \end{pmatrix}
\quad\text{produces}\quad
PAQ_1Q_2 = \begin{pmatrix} I_r & J \\ 0 & 0 \end{pmatrix}\begin{pmatrix} I_r & -J \\ 0 & I \end{pmatrix}
= \begin{pmatrix} I_r & 0 \\ 0 & 0 \end{pmatrix}.
$$
Thus $A \sim N_r$ because $P$ and $Q = Q_1 Q_2$ are nonsingular.
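The construction in the proof can be carried out numerically. The following is a minimal sketch (not from the text) using SymPy's exact row reduction; the matrix `A` is an arbitrary illustrative choice, and the names `Q1`, `Q2`, `J` mirror the symbols used above.

```python
# Sketch of the proof's construction: find P with P*A = E_A, permute the basic
# columns to the front with Q1, then clear the J block with Q2.
import sympy as sp

A = sp.Matrix([[1, 2, 1, 3],
               [2, 4, 0, 2],
               [3, 6, 1, 5]])                  # an illustrative matrix with rank 2
m, n = A.shape

# Row-reduce [A | I_m]: the left block becomes E_A and the right block records P.
aug, _ = sp.Matrix.hstack(A, sp.eye(m)).rref()
E_A, P = aug[:, :n], aug[:, n:]                # P is nonsingular and P*A = E_A

_, pivots = A.rref()
r = len(pivots)                                # rank(A)

# Q1: permutation moving the r basic (unit) columns of E_A to the far left.
order = list(pivots) + [j for j in range(n) if j not in pivots]
Q1 = sp.zeros(n, n)
for new_col, old_col in enumerate(order):
    Q1[old_col, new_col] = 1

block = E_A * Q1                               # block form [[I_r, J], [0, 0]]
J = block[:r, r:]

# Q2 = [[I_r, -J], [0, I_{n-r}]] annihilates the J block.
Q2 = sp.Matrix.vstack(sp.Matrix.hstack(sp.eye(r), -J),
                      sp.Matrix.hstack(sp.zeros(n - r, r), sp.eye(n - r)))

print(P * A * Q1 * Q2)   # N_2: the 3x4 matrix [[1,0,0,0],[0,1,0,0],[0,0,0,0]]
```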
Example 3.9.3
Problem: Explain why
$$
\operatorname{rank}\begin{pmatrix} A & 0 \\ 0 & B \end{pmatrix} = \operatorname{rank}(A) + \operatorname{rank}(B).
$$
Solution: If $\operatorname{rank}(A) = r$ and $\operatorname{rank}(B) = s$, then $A \sim N_r$ and $B \sim N_s$. Consequently,
$$
\begin{pmatrix} A & 0 \\ 0 & B \end{pmatrix} \sim \begin{pmatrix} N_r & 0 \\ 0 & N_s \end{pmatrix}
\implies
\operatorname{rank}\begin{pmatrix} A & 0 \\ 0 & B \end{pmatrix}
= \operatorname{rank}\begin{pmatrix} N_r & 0 \\ 0 & N_s \end{pmatrix} = r + s.
$$
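As a quick sanity check (not part of the text), the rank additivity in Example 3.9.3 can be verified numerically; the blocks below are arbitrary illustrative choices.

```python
# Rank of a block-diagonal matrix equals the sum of the ranks of its blocks.
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 6]])                 # rank 1
B = sp.Matrix([[1, 0],
               [0, 1],
               [1, 1]])                    # rank 2

C = sp.diag(A, B)                          # the block-diagonal matrix [[A, 0], [0, B]]
print(A.rank(), B.rank(), C.rank())        # 1 2 3
```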
Given matrices $A$ and $B$, how do we decide whether or not $A \sim B$, $A \overset{\text{row}}{\sim} B$, or $A \overset{\text{col}}{\sim} B$? We could use a trial-and-error approach by attempting to reduce $A$ to $B$ by elementary operations, but this would be silly because there are easy tests, as described below.
Testing for Equivalence
For m × n matrices A and B the following statements are true.
• $A \sim B$ if and only if $\operatorname{rank}(A) = \operatorname{rank}(B)$. (3.9.8)
• $A \overset{\text{row}}{\sim} B$ if and only if $\mathbf{E}_A = \mathbf{E}_B$. (3.9.9)
• $A \overset{\text{col}}{\sim} B$ if and only if $\mathbf{E}_{A^T} = \mathbf{E}_{B^T}$. (3.9.10)
Corollary. Multiplication by nonsingular matrices cannot change rank.
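The three tests translate directly into a computation: compare ranks, compare reduced row echelon forms, or compare reduced row echelon forms of the transposes. Below is a small sketch (not from the text) using SymPy; the function names and the sample matrices are illustrative.

```python
# Tests (3.9.8)-(3.9.10) for matrices of the same shape.
import sympy as sp

def equivalent(A, B):
    """A ~ B  iff  rank(A) = rank(B)."""
    return A.shape == B.shape and A.rank() == B.rank()

def row_equivalent(A, B):
    """A ~row B  iff  E_A = E_B."""
    return A.shape == B.shape and A.rref()[0] == B.rref()[0]

def col_equivalent(A, B):
    """A ~col B  iff  E_{A^T} = E_{B^T}."""
    return A.shape == B.shape and A.T.rref()[0] == B.T.rref()[0]

A = sp.Matrix([[1, 2], [2, 4], [0, 1]])
B = sp.Matrix([[1, 0], [0, 1], [3, 5]])
print(equivalent(A, B), row_equivalent(A, B), col_equivalent(A, B))   # True True False
```

The sample pair shows that the three relations really are different: $A$ and $B$ are row equivalent (hence equivalent), yet they are not column equivalent.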