
3.7 Matrix Inversion

Proof. The fact that (3.7.6) ⇐⇒ (3.7.7) is a direct consequence of the definition of rank, and (3.7.6) ⇐⇒ (3.7.8) was established in §2.4. Consequently, statements (3.7.6), (3.7.7), and (3.7.8) are equivalent, so if we establish that (3.7.5) ⇐⇒ (3.7.6), then the proof will be complete.
    Proof of (3.7.5) =⇒ (3.7.6). Begin by observing that (3.5.5) guarantees that a matrix X = [X∗1 | X∗2 | ··· | X∗n] satisfies the equation AX = I if and only if X∗j is a solution of the linear system Ax = I∗j. If A is nonsingular, then we know from (3.7.4) that there exists a unique solution to AX = I, and hence each linear system Ax = I∗j has a unique solution. But in §2.5 we learned that a linear system has a unique solution if and only if the rank of the coefficient matrix equals the number of unknowns, so rank(A) = n.
    Proof of (3.7.6) =⇒ (3.7.5). If rank(A) = n, then (2.3.4) ensures that each system Ax = I∗j is consistent because rank[A | I∗j] = n = rank(A). Furthermore, the results of §2.5 guarantee that each system Ax = I∗j has a unique solution, and hence there is a unique solution to the matrix equation AX = I. We would like to say that X = A⁻¹, but we cannot jump to this conclusion without first arguing that XA = I. Suppose this is not true, i.e., suppose that XA − I ≠ 0. Since

        A(XA − I) = (AX)A − A = IA − A = 0,

it follows from (3.5.5) that any nonzero column of XA − I is a nontrivial solution of the homogeneous system Ax = 0. But this contradicts the fact that (3.7.6) ⇐⇒ (3.7.8). Therefore, the supposition that XA − I ≠ 0 must be false, and thus AX = I = XA, which means A is nonsingular.
    The definition of matrix inversion says that in order to compute A⁻¹, it is necessary to solve both of the matrix equations AX = I and XA = I. These two equations are needed to rule out the possibility of nonsquare inverses. But when only square matrices are involved, either one of the two equations will suffice—the following example elaborates.
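The proof above suggests a practical recipe: build A⁻¹ one column at a time by solving Ax = I∗j for each column j of the identity. The sketch below illustrates this numerically with NumPy; the particular matrix A is an arbitrary nonsingular example chosen here for illustration, not one from the text.

```python
import numpy as np

# Illustrative nonsingular matrix (not from the text)
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
n = A.shape[0]
I = np.eye(n)

# Solve Ax = I_{*j} for each j; the unique solutions are the columns of X
X = np.column_stack([np.linalg.solve(A, I[:, j]) for j in range(n)])

assert np.allclose(A @ X, I)              # AX = I by construction
assert np.allclose(X @ A, I)              # XA = I follows, since A is square
assert np.allclose(X, np.linalg.inv(A))   # so X is indeed A^{-1}
```

Note that only AX = I is enforced when solving; the check XA = I succeeds automatically, which is exactly the point of Example 3.7.2 below.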
Example 3.7.2

Problem: If A and X are square matrices, explain why

        AX = I =⇒ XA = I.                      (3.7.9)

In other words, if A and X are square and AX = I, then X = A⁻¹.
Solution: Notice first that AX = I implies X is nonsingular because if X is singular, then, by (3.7.8), there is a column vector x ≠ 0 such that Xx = 0, which is contrary to the fact that x = Ix = AXx = 0. Now that we know X⁻¹ exists, we can establish (3.7.9) by writing

        AX = I =⇒ AXX⁻¹ = X⁻¹ =⇒ A = X⁻¹ =⇒ XA = I.
Caution! The argument above is not valid for nonsquare matrices. When m ≠ n, it is possible that A_{m×n} X_{n×m} = I_m, but XA ≠ I_n.
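A small numerical check makes the caution concrete. The pair below is an illustrative choice (not from the text): with A of size 1 × 2 and X of size 2 × 1, the product AX equals I₁, yet XA fails to be I₂.

```python
import numpy as np

# Illustrative nonsquare pair: A is 1x2, X is 2x1
A = np.array([[1.0, 0.0]])       # A_{1x2}
X = np.array([[1.0],
              [0.0]])            # X_{2x1}

assert np.allclose(A @ X, np.eye(1))       # AX = I_1 holds ...
assert not np.allclose(X @ A, np.eye(2))   # ... but XA != I_2
```

Here XA = [[1, 0], [0, 0]], which annihilates the second coordinate, so X is a one-sided "inverse" only; the square hypothesis in Example 3.7.2 is essential.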