
•  If Ax = b is a consistent system of linear equations in which b ∈ R(A^k), then x = A^D b is the unique solution that belongs to R(A^k) (Exercise 5.10.9).

•  AA^D is the projector onto R(A^k) along N(A^k), and I − AA^D is the complementary projector onto N(A^k) along R(A^k) (Exercise 5.10.10).
•  If A is considered as a linear operator on C^n, then, with respect to a basis B_R for R(A^k), C is the matrix representation for the restricted operator A_{/R(A^k)} (see p. 263). Thus A_{/R(A^k)} is invertible. Moreover,

       [A^D_{/R(A^k)}]_{B_R} = C^{-1} = [A^{-1}_{/R(A^k)}]_{B_R},   so   A^D_{/R(A^k)} = A^{-1}_{/R(A^k)}.

   In other words, A^D is the inverse of A on R(A^k), and A^D is the zero operator on N(A^k), so, in the context of Example 5.10.4,

       A = A_{/R(A^k)} ⊕ A_{/N(A^k)}   and   A^D = A^{-1}_{/R(A^k)} ⊕ 0_{/N(A^k)}.

   (These facts are illustrated numerically in the sketch following this list.)
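The properties listed above can be sanity-checked numerically. The following NumPy sketch (an illustration added here, not part of the text) builds a 4 × 4 matrix A of index k = 2 directly from a chosen core-nilpotent decomposition A = Q [C 0; 0 N] Q^{-1}, takes A^D = Q [C^{-1} 0; 0 0] Q^{-1} as in the preceding discussion, and then checks the solution, projector, and restriction properties. The particular blocks C and N, the change of basis Q, and the random seed are arbitrary illustrative choices.

    import numpy as np

    rng = np.random.default_rng(0)

    def blkdiag(X, Y):
        """Assemble the block-diagonal matrix [[X, 0], [0, Y]] from 2 x 2 blocks."""
        Z = np.zeros((2, 2))
        return np.block([[X, Z], [Z, Y]])

    # A 4 x 4 matrix of index k = 2, built from a known core-nilpotent decomposition:
    # C is the nonsingular "core" block, N is nilpotent with N^2 = 0.
    C = np.array([[2.0, 1.0],
                  [0.0, 3.0]])
    N = np.array([[0.0, 1.0],
                  [0.0, 0.0]])
    Q = rng.standard_normal((4, 4))      # a (generically) nonsingular change of basis
    Qinv = np.linalg.inv(Q)

    A  = Q @ blkdiag(C, N) @ Qinv
    AD = Q @ blkdiag(np.linalg.inv(C), np.zeros((2, 2))) @ Qinv   # the Drazin inverse
    k  = 2

    # By construction, R(A^k) is spanned by the first two columns of Q,
    # and N(A^k) is spanned by the last two columns of Q.
    R_basis, N_basis = Q[:, :2], Q[:, 2:]

    # AA^D is the projector onto R(A^k) along N(A^k).
    P = A @ AD
    assert np.allclose(P @ P, P)                  # idempotent
    assert np.allclose(P @ R_basis, R_basis)      # fixes R(A^k)
    assert np.allclose(P @ N_basis, 0)            # annihilates N(A^k)

    # For b in R(A^k), x = A^D b solves Ax = b and lies in R(A^k).
    b = np.linalg.matrix_power(A, k) @ rng.standard_normal(4)   # forces b into R(A^k)
    x = AD @ b
    assert np.allclose(A @ x, b)
    assert np.allclose(P @ x, x)                  # x belongs to R(A^k)

    # A^D inverts A on R(A^k) and acts as the zero operator on N(A^k).
    assert np.allclose(AD @ (A @ R_basis), R_basis)
    assert np.allclose(AD @ N_basis, 0)
    print("all listed properties of A^D hold for this example")

Constructing A from its core-nilpotent form keeps R(A^k) and N(A^k) explicit; computing those subspaces from a given A would instead require a rank-revealing factorization of A^k.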
                   Exercises for section 5.10


5.10.1. If A is a square matrix of index k > 0, prove that index(A^k) = 1.

5.10.2. If A is a nilpotent matrix of index k, describe the components in a core-nilpotent decomposition of A.

5.10.3. Prove that if A is a symmetric matrix, then index(A) ≤ 1.


5.10.4. A ∈ C^{n×n} is said to be a normal matrix whenever AA* = A*A.
        Prove that if A is normal, then index(A) ≤ 1.
        Note: All real-symmetric matrices are normal, so the result of this exercise
        includes the result of Exercise 5.10.3 as a special case.
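As a quick numerical sanity check of the claim in Exercise 5.10.4 (a check, not a proof), the sketch below builds a singular normal matrix as A = U diag(d) U* with U unitary and tests index(A) ≤ 1 via the rank criterion rank(A) = rank(A^2); a nilpotent Jordan block is included for contrast. The eigenvalues in d, the matrix size, and the random seed are arbitrary illustrative choices.

    import numpy as np

    rng = np.random.default_rng(1)

    # Any unitarily diagonalizable matrix A = U diag(d) U* is normal.
    X = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
    U, _ = np.linalg.qr(X)                        # unitary factor of a random matrix
    d = np.array([3.0, -1.0, 2.0j, 0.0, 0.0])     # two zero eigenvalues, so A is singular
    A = U @ np.diag(d) @ U.conj().T

    assert np.allclose(A @ A.conj().T, A.conj().T @ A)    # confirms AA* = A*A

    # index(A) <= 1 is equivalent to rank(A) = rank(A^2).
    print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(A @ A))   # prints: 3 3

    # Contrast: a nonzero nilpotent Jordan block is not normal and has index 2.
    J = np.array([[0.0, 1.0],
                  [0.0, 0.0]])
    print(np.linalg.matrix_rank(J), np.linalg.matrix_rank(J @ J))   # prints: 1 0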

Drazin’s concept attracted little interest—perhaps due to Drazin’s abstract algebraic presentation. But eventually Drazin’s generalized inverse was recognized to be a useful tool for
                                    analyzing nonorthogonal types of problems involving singular matrices. In this respect, the
                                    Drazin inverse is complementary to the Moore–Penrose pseudoinverse discussed in Exercise
                                    4.5.20 and on p. 423 because the Moore–Penrose pseudoinverse is more useful in applications
                                    where orthogonality is somehow wired in (e.g., least squares).