
448    Advanced Linear Algebra




                                      A⁺ = UΣ⁺V*

            where Σ⁺ is obtained from Σ by replacing all nonzero entries by their
            multiplicative inverses. This follows from the characterization above and also
            from the fact that for i ≤ r,

                        UΣ⁺V*(v_i) = UΣ⁺(e_i) = s_i⁻¹U(e_i) = s_i⁻¹u_i

            and for i > r,

                        UΣ⁺V*(v_i) = UΣ⁺(e_i) = 0
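As a concrete illustration (not from the text), here is a minimal pure-Python sketch: build a matrix from a hand-chosen singular value decomposition, form its MP inverse by inverting the nonzero singular values, and check two of the defining Penrose identities. The factor names P, Sigma, Q and all numeric values are illustrative choices, not tied to the book's U, Σ, V.

```python
# Sketch: MP inverse from a hand-chosen SVD (illustrative values only).

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(row) for row in zip(*X)]

# Hand-chosen real SVD: A = P @ Sigma @ Q^T with singular values 5 and 0.
P = [[0.6, -0.8],
     [0.8,  0.6]]          # orthogonal
Sigma = [[5.0, 0.0],
         [0.0, 0.0]]       # rank 1
Q = [[1.0, 0.0],
     [0.0, 1.0]]

A = matmul(matmul(P, Sigma), transpose(Q))        # [[3, 0], [4, 0]]

# Sigma^+: replace each nonzero entry by its multiplicative inverse.
Sigma_plus = [[0.2, 0.0],
              [0.0, 0.0]]

A_plus = matmul(matmul(Q, Sigma_plus), transpose(P))

# Penrose identities: A A^+ A = A and A^+ A A^+ = A^+.
AAA = matmul(matmul(A, A_plus), A)
PAP = matmul(matmul(A_plus, A), A_plus)
```

Since the zero singular value is simply left as zero (never inverted), the construction works for rank-deficient matrices, where no ordinary inverse exists.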


            Least Squares Approximation
            Let  us  now  discuss  the  most important use of the MP inverse. Consider the
            system of linear equations
                                         Ax = v

            where A ∈ M_{m,n}(F). As usual, F = ℝ or F = ℂ. This system has a solution
            if and only if v ∈ im(A). If the system has no solution, then it is of
            considerable practical importance to be able to solve the system

                                         Ax = v̂

            where v̂ is the unique vector in im(A) that is closest to v, as measured by the
            unitary or Euclidean distance. This problem is called the linear least squares
            problem. Any solution to the system Ax = v̂ is called a least squares solution
            to the system Ax = v. Put another way, a least squares solution to Ax = v is a
            vector x for which ‖Ax − v‖ is minimized.
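To make the definition concrete, here is a small sketch (illustrative data, not from the text) that computes a least squares solution for an overdetermined 3×2 real system via the normal equations AᵀAx = Aᵀv, a standard computational route when AᵀA is invertible, and confirms that perturbing the answer only increases ‖Ax − v‖.

```python
# Sketch: least squares for an overdetermined 3x2 real system via the
# normal equations A^T A x = A^T v (illustrative data, assumed full rank).

A = [[1.0, 0.0],
     [0.0, 1.0],
     [1.0, 1.0]]
v = [1.0, 2.0, 2.0]

# Form A^T A (2x2) and A^T v (length 2).
AtA = [[sum(A[k][i] * A[k][j] for k in range(3)) for j in range(2)]
       for i in range(2)]
Atv = [sum(A[k][i] * v[k] for k in range(3)) for i in range(2)]

# Solve the 2x2 system by Cramer's rule.
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
x = [(Atv[0] * AtA[1][1] - AtA[0][1] * Atv[1]) / det,
     (AtA[0][0] * Atv[1] - Atv[0] * AtA[1][0]) / det]

def residual_sq(y):
    """Squared Euclidean norm of Ay - v."""
    return sum((sum(A[k][i] * y[i] for i in range(2)) - v[k]) ** 2
               for k in range(3))
```

For this data the minimizer is x = (2/3, 5/3) with squared residual 1/3; any nearby vector gives a strictly larger residual, which is exactly the minimization property in the definition above.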
            Suppose that w and z are least squares solutions to Ax = v. Then

                                       Aw = v̂ = Az

            and so w − z ∈ ker(A). Thus, if w is a particular least squares
            solution, then the set of all least squares solutions is w + ker(A).
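A quick numerical sketch of this coset description (illustrative data, not from the text): for a rank-deficient matrix, every vector obtained by adding a kernel element to one particular least squares solution achieves the same residual.

```python
# Sketch: every vector in w + ker(A) has the same residual norm.
# Illustrative rank-1 matrix; its kernel is spanned by (1, -1).

A = [[1.0, 1.0],
     [2.0, 2.0]]           # rank 1, ker(A) = span{(1, -1)}
v = [1.0, 0.0]

def residual_sq(y):
    """Squared Euclidean norm of Ay - v."""
    return sum((A[k][0] * y[0] + A[k][1] * y[1] - v[k]) ** 2 for k in range(2))

# A particular least squares solution for this data (Ay depends only on
# y[0] + y[1]; minimizing over that scalar gives y[0] + y[1] = 0.2).
w = [0.1, 0.1]

# Other points of the coset w + ker(A).
others = [[w[0] + t, w[1] - t] for t in (-2.0, 0.5, 3.0)]
```

All of the vectors in `others` produce exactly the residual of `w`, while a vector outside the coset, such as (1, 0), does worse.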
            Among all solutions, the most interesting is the solution of  minimum  norm.
            Note that if there is a least squares solution w that lies in ker(A)⊥, then for
            any z ∈ ker(A), we have

                            ‖w + z‖² = ‖w‖² + ‖z‖² ≥ ‖w‖²

            and so w will be the unique least squares solution of minimum norm.
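The Pythagorean identity above can be checked numerically. In this sketch (illustrative system, not from the text) the equation is x₁ + x₂ = 2, so the kernel of the coefficient matrix is spanned by (1, −1); the solution w = (1, 1) lies in the orthogonal complement of the kernel, and adding any kernel vector only adds its squared norm.

```python
# Sketch: a least squares solution lying in ker(A)^⊥ has minimum norm.
# Illustrative 1x2 system x1 + x2 = 2; ker(A) = span{(1, -1)},
# ker(A)^⊥ = span{(1, 1)}.

w = [1.0, 1.0]             # solution of x1 + x2 = 2 lying in ker(A)^⊥

def norm_sq(y):
    return y[0] ** 2 + y[1] ** 2

# For each z in ker(A), compare ||w + z||^2 with ||w||^2 + ||z||^2.
checks = []
for t in (-1.5, 0.25, 2.0):
    z = [t, -t]            # an element of ker(A)
    lhs = norm_sq([w[0] + z[0], w[1] + z[1]])
    checks.append((lhs, norm_sq(w) + norm_sq(z)))
```

Each pair in `checks` agrees exactly, and every left-hand side is at least ‖w‖², so w is the minimum-norm solution, as the inequality in the text asserts.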
            Before proceeding, we recall from Theorem 9.14 that if S is a subspace of a
            finite-dimensional inner product space V, then the best approximation to a
            vector v ∈ V from within S is the unique vector v̂ ∈ S for which v − v̂ ⊥ S.
            Now we can see how the MP inverse comes into play.
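The orthogonality characterization of the best approximation can be verified directly. In this sketch (illustrative subspace and vector, not from the text), S is spanned by two orthonormal vectors in ℝ³, the best approximation is the orthogonal projection of v onto S, and the residual v − v̂ is checked to be orthogonal to S.

```python
# Sketch: best approximation to v from a subspace S of R^3.
# S = span{e1, e2}, two orthonormal vectors (illustrative choice).

e1 = [1.0, 0.0, 0.0]
e2 = [0.0, 1.0, 0.0]
v = [3.0, -2.0, 5.0]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Orthogonal projection onto S: v_hat = <v,e1> e1 + <v,e2> e2.
c1, c2 = dot(v, e1), dot(v, e2)
v_hat = [c1 * e1[i] + c2 * e2[i] for i in range(3)]

# Residual v - v_hat: should be orthogonal to every vector of S.
r = [v[i] - v_hat[i] for i in range(3)]
```

Since r is orthogonal to both spanning vectors, it is orthogonal to all of S, which is precisely the condition v − v̂ ⊥ S that characterizes the best approximation.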