
Chapter 19
                            THE CONJUGATE GRADIENTS METHOD APPLIED
                                   TO PROBLEMS IN LINEAR ALGEBRA




                                                    19.1. INTRODUCTION

                            This monograph concludes by applying the conjugate gradients method, de-
                            veloped in chapter 16 for the minimisation of nonlinear functions, to linear equations,
                            linear least-squares and algebraic eigenvalue problems. The methods suggested
                            may not be the most efficient or effective of their type, since this subject area has
                            not attracted a great deal of careful research. In fact much of the work which has
                            been performed on the sparse algebraic eigenvalue problem has been carried out
                            by those scientists and engineers in search of solutions. Stewart (1976) has
                            prepared an extensive bibliography on the large, sparse, generalised symmetric
                            matrix eigenvalue problem in which it is unfortunately difficult to find many
                            reports that do more than describe a method. Thorough or even perfunctory
                            testing is often omitted and convergence is rarely demonstrated, let alone proved.
                            The work of Professor Axel Ruhe and his co-workers at Umeå is a notable
                            exception to this generality. Partly, the lack of testing is due to the sheer size of
                            the matrices that may be involved in real problems and the cost of finding
                            eigensolutions.
                              The linear equations and least-squares problems have enjoyed a more diligent
                            study. A number of studies have been made of the conjugate gradients method
                            for linear-equation systems with positive definite coefficient matrices, of which
                            one is that of Reid (1971). Related methods have been developed in particular by
                            Paige and Saunders (1975) who relate the conjugate gradients methods to the
                            Lanczos algorithm for the algebraic eigenproblem. The Lanczos algorithm has
                            been left out of this work because I feel it to be a tool best used by someone
                            prepared to tolerate its quirks. This sentiment accords with the statement of
                            Kahan and Parlett (1976): 'The urge to write a universal Lanczos program should
                            be resisted, at least until the process is better understood.’ However, in the hands
                            of an expert, it is a very powerful method for finding the eigenvalues of a large
                            symmetric matrix. For indefinite coefficient matrices, I would expect the
                            Paige-Saunders method to be preferred, by virtue of its design. In preparing the first
                            edition of this book, I experimented briefly with FORTRAN codes for several
                            iterative methods for linear equations and least-squares problems, finding
                            no clear advantage for any one approach, though I did not focus on indefinite
                            matrices. Therefore, the treatment which follows will stay with conjugate gradients,
                            which has the advantage of introducing no fundamentally new ideas.
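                              As a point of reference for the discussion which follows, the classical
                            conjugate gradients iteration of Hestenes and Stiefel for the linear system
                            A x = b, with A symmetric and positive definite, may be sketched as below. The
                            sketch is in Python purely for illustration; it is not one of the algorithms of
                            this book, and the function name and convergence tolerance are arbitrary choices.

        import numpy as np

        def conjugate_gradients(A, b, x=None, tol=1e-10, max_iter=None):
            # Illustrative sketch of conjugate gradients for A x = b with A
            # symmetric and positive definite. Not one of the published codes.
            n = b.shape[0]
            if x is None:
                x = np.zeros(n)          # starting guess
            if max_iter is None:
                max_iter = n             # in exact arithmetic, n steps suffice
            r = b - A @ x                # residual = negative gradient of (1/2) x'Ax - x'b
            p = r.copy()                 # first search direction
            rs_old = r @ r
            for _ in range(max_iter):
                Ap = A @ p               # the single matrix-vector product per iteration
                alpha = rs_old / (p @ Ap)        # exact line search along p
                x = x + alpha * p
                r = r - alpha * Ap
                rs_new = r @ r
                if rs_new ** 0.5 < tol:
                    break
                p = r + (rs_new / rs_old) * p    # new direction, conjugate to its predecessors
                rs_old = rs_new
            return x

                              Each iteration requires only one product of A with a vector, which is why the
                            method is attractive when A is large and sparse.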
                              It must be pointed out that the general-purpose minimisation algorithm 22 does
                            not perform very well on linear least-squares or Rayleigh quotient minimisations.
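                              For reference, the Rayleigh quotient mentioned here is, for the generalised
                            symmetric matrix eigenvalue problem A x = e B x with B positive definite,
                            R(x) = (x^T A x) / (x^T B x); its stationary values are the eigenvalues e, which
                            is what allows eigensolutions to be sought by minimising R.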