Page 97 - A Course in Linear Algebra with Applications

3.3:  Determinants  and  Inverses  of  Matrices  81

        is

                             [   5   -11     7 ]
                             [ -18     2     3 ]
                             [ -16     7   -13 ]
             The  significance  of  the  adjoint  matrix  is  made  clear  by
        the  next  two  results.
        Theorem 3.3.6
        If A is any n x n matrix, then


                      A adj(A) = (det(A))I_n = adj(A)A.




        Proof
        The (i, j) entry of the matrix product A adj(A) is

                          n                           n
                          Σ  a_ik (adj(A))_kj    =    Σ  a_ik A_jk.
                         k=1                         k=1

        If i = j, this is just the expansion of det(A) by row i; on the
        other hand, if i ≠ j, the sum is also a row expansion of a
        determinant, but one in which rows i and j are identical. By
        3.2.2 the sum will vanish in this case. This means that the
        off-diagonal entries of the matrix product A adj(A) are zero,
        while the entries on the diagonal all equal det(A). Therefore
        A adj(A) is the scalar matrix (det(A))I_n, as claimed. The
        second statement can be proved in a similar fashion.
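The identity A adj(A) = (det(A))I_n can be checked numerically. The following is a minimal sketch, not part of the text: it builds the adjoint as the transpose of the cofactor matrix (computing each cofactor from the corresponding minor, as in the book's definition) for an arbitrary example matrix and verifies both products against (det(A))I_n.

```python
import numpy as np

def adjoint(A):
    """Classical adjoint (adjugate): transpose of the cofactor matrix."""
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # Minor M_ij: delete row i and column j, then take the determinant.
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T  # the adjoint is the transpose of the cofactor matrix

# An arbitrary 3 x 3 example matrix (chosen for illustration only).
A = np.array([[1.0, 2.0, 0.0],
              [3.0, -1.0, 4.0],
              [0.0, 2.0, 1.0]])

# Both products should equal det(A) times the identity matrix.
print(np.allclose(A @ adjoint(A), np.linalg.det(A) * np.eye(3)))
print(np.allclose(adjoint(A) @ A, np.linalg.det(A) * np.eye(3)))
```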

             Theorem 3.3.6 leads to an attractive formula for the in-
        verse of an invertible matrix.
        Theorem 3.3.7
        If A is an invertible matrix, then A^(-1) = (1/det(A)) adj(A).
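As a self-contained numerical sketch of Theorem 3.3.7 (the matrix is an arbitrary example, not from the text): for a 2 x 2 matrix the adjoint has the well-known closed form adj([[a, b], [c, d]]) = [[d, -b], [-c, a]], so the inverse formula can be checked directly against a library inverse.

```python
import numpy as np

# An arbitrary invertible 2 x 2 example matrix.
A = np.array([[4.0, 7.0],
              [2.0, 6.0]])

# Closed-form adjoint of a 2 x 2 matrix: swap the diagonal,
# negate the off-diagonal entries.
adj_A = np.array([[A[1, 1], -A[0, 1]],
                  [-A[1, 0], A[0, 0]]])

# Theorem 3.3.7: A^(-1) = (1/det(A)) adj(A).
A_inv = adj_A / np.linalg.det(A)

print(np.allclose(A_inv, np.linalg.inv(A)))  # True
```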