
408              Chapter 5                    Norms, Inner Products, and Orthogonality

                                    the singular value decomposition (p. 412), and in a somewhat parallel manner,
                                    the core-nilpotent decomposition paves the way to the Jordan form (p. 590).
                                    These two parallel tracks constitute the backbone for the theory of modern linear
                                    algebra, so it’s worthwhile to take a moment and reflect on them.
                                        The range-nullspace decomposition decomposes ℝⁿ with square matrices,
                                    while the orthogonal decomposition theorem does it with rectangular matrices.
                                    So does this mean that the range-nullspace decomposition is a special case of,
                                    or somehow weaker than, the orthogonal decomposition theorem? No! Even for
                                    square matrices they are not very comparable because each says something that
                                    the other doesn’t. The core-nilpotent decomposition (and eventually the Jordan
                                    form) is obtained by a similarity transformation, and, as discussed in §§4.8–4.9,
                                    similarity is the primary mechanism for revealing characteristics of A that are
                                    independent of bases or coordinate systems. The URV factorization has little
                                    to say about such things because it is generally not a similarity transforma-
                                    tion. Orthogonal decomposition has the advantage whenever orthogonality is
                                    naturally built into a problem—such as least squares applications. And, as dis-
                                    cussed in §5.7, orthogonal methods often produce numerically stable algorithms
                                    for floating-point computation, whereas similarity transformations are generally
                                    not well suited for numerical computations. The value of similarity is mainly on
                                    the theoretical side of the coin.
                                        So when do we get the best of both worlds—i.e., when is a URV factoriza-
                                    tion also a core-nilpotent decomposition? First, A must be square and, second,
                                    (5.11.11) must be a similarity transformation, so U = V. Surprisingly, this
                                    happens for a rather large class of matrices described below.


                                                  Range Perpendicular to Nullspace
                                       For rank(A_{n×n}) = r, the following statements are equivalent:
                                       •   R(A) ⊥ N(A),                                  (5.11.12)
                                       •   R(A) = R(Aᵀ),                                 (5.11.13)
                                       •   N(A) = N(Aᵀ),                                 (5.11.14)

                                       •   A = U [ C_{r×r}  0 ] Uᵀ                       (5.11.15)
                                                 [    0     0 ]
                                       in which U is orthogonal and C is nonsingular. Such matrices will
                                       be called RPN matrices, short for “range perpendicular to nullspace.”
                                       Some authors call them range-symmetric or EP matrices. Nonsingular
                                       matrices are trivially RPN because they have a zero nullspace. For
                                       complex matrices, replace (·)ᵀ by (·)* and “orthogonal” by “unitary.”
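
                                    The equivalences in the box lend themselves to a quick numerical check.
                                    The sketch below (assuming NumPy; the helper `is_rpn` is illustrative,
                                    not from the text) tests R(A) ⊥ N(A) by verifying that Aᵀv = 0 for every
                                    nullspace vector v of A, which by (5.11.12)–(5.11.14) characterizes RPN
                                    matrices.

```python
import numpy as np

def is_rpn(A, tol=1e-10):
    """Numerically test whether R(A) is perpendicular to N(A).

    Since the columns of A span R(A), the condition R(A) ⊥ N(A) is
    equivalent to A^T v = 0 for every v in N(A), i.e., N(A) ⊆ N(A^T);
    equal dimensions then force N(A) = N(A^T), as in (5.11.14).
    """
    A = np.asarray(A, dtype=float)
    # The SVD yields an orthonormal basis for N(A): the last n - r
    # right singular vectors (rows r onward of Vt).
    U, s, Vt = np.linalg.svd(A)
    r = int(np.sum(s > tol * max(1.0, s[0] if s.size else 1.0)))
    N = Vt[r:].T                       # columns span N(A)
    return np.allclose(A.T @ N, 0, atol=tol)

# Symmetric, hence RPN: nullspace vector (1, -1) is orthogonal to the columns.
print(is_rpn([[1, 1], [1, 1]]))        # True
# Nilpotent counterexample: here R(A) = N(A) = span{e1}, far from perpendicular.
print(is_rpn([[0, 1], [0, 0]]))        # False
```

                                    Note that the symmetric example is a special case of a general fact:
                                    every normal matrix (in particular every symmetric one) is RPN, while
                                    a nonzero nilpotent matrix never is.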


                                    Proof.  The fact that (5.11.12) ⇐⇒ (5.11.13) ⇐⇒ (5.11.14) is a direct conse-
                                    quence of (5.11.5). It suffices to prove (5.11.15) ⇐⇒ (5.11.13). If (5.11.15) is a