
                        By using the shorthand of vector notation, the nonlinear least-squares problem
                      is written: minimise
                                S(b) = f^T f = f^T(b, Y) f(b, Y)           (12.4)
                      with respect to the parameters b. Once again, K is the number of variables, M is
                      the number of data points and n is the number of parameters.
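                         As a concrete illustration (not taken from the text), the objective (12.4) can be evaluated
                       directly once a routine for the residuals f(b, Y) is available. The sketch below assumes a
                       hypothetical one-exponential model and made-up data, chosen purely for illustration.

import numpy as np

def residuals(b, Y):
    # Hypothetical residual function f(b, Y), for illustration only:
    # a one-exponential model y ~ b1 * exp(b2 * t), with the data Y
    # holding one row (t, y) per data point (M rows, n = 2 parameters).
    t, y = Y[:, 0], Y[:, 1]
    return b[0] * np.exp(b[1] * t) - y

def sum_of_squares(b, Y):
    # S(b) = f^T f as in (12.4).
    f = residuals(b, Y)
    return f @ f

# Made-up data and a trial parameter vector b.
Y = np.array([[0.0, 1.1], [1.0, 2.9], [2.0, 8.2], [3.0, 22.5]])
print(sum_of_squares(np.array([1.0, 1.0]), Y))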
                        Every nonlinear least-squares problem as defined above is an unconstrained
                       minimisation problem, though the converse is not true. In later sections, methods
                       will be presented whose aim is to minimise S(b), where S is any function of the
                       parameters. Some ways of handling constraints will also be mentioned. Unfortu-
                      nately, the mathematical programming problem, in which a minimum is sought for
                      a function subject to many constraints, will only be touched upon briefly since
                      very little progress has been made to date in developing algorithms with minimal
                      storage requirements.
                        There is also a close relationship between the nonlinear least-squares problem
                      and the problem of finding solutions of systems of nonlinear equations. A system
                      of nonlinear equations
                                                f(b, Y) = 0                (12.5)
                      having n = M (number of parameters equal to the number of equations) can be
                      approached as the nonlinear least-squares problem: minimise
                                                S = f^T f                  (12.6)
                      with respect to b. For M greater than n, solutions can be sought in the least-
                      squares sense; from this viewpoint the problems are then indistinguishable. The
                      minimum in (12.6) should be found with S = 0 if the system of equations has a
                      solution. Conversely, the derivatives
                                       ∂S/∂b_j          j = 1, 2, . . . , n          (12.7)

                      for an unconstrained minimisation problem, and in particular a least-squares
                      problem, should be zero at the minimum of the function S(b), so that these
                      problems may be solved by a method for nonlinear equations, though local
                      maxima and saddle points of the function will also have zero derivatives and are
                      acceptable solutions of the nonlinear equations. In fact, very little research has
                      been done on the general minimisation or nonlinear-equation problem where
                      either all solutions or extrema are required or a global minimum is to be found.
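                         The equivalence between (12.5) and (12.6) can be shown with a small sketch. The
                       two-equation system, the fixed iteration count and the forward-difference Jacobian below
                       are illustrative assumptions only; a Gauss-Newton iteration (one standard way of minimising
                       a sum of squares) drives S = f^T f towards zero, the value it must attain at a solution
                       of the equations.

import numpy as np

def equations(b):
    # Hypothetical system f(b) = 0 with n = M = 2, for illustration:
    #   f1 = b1^2 + b2^2 - 4,   f2 = b1*b2 - 1.
    return np.array([b[0]**2 + b[1]**2 - 4.0, b[0] * b[1] - 1.0])

def jacobian_fd(f, b, h=1e-7):
    # Forward-difference approximation to the Jacobian of f at b.
    f0 = f(b)
    J = np.empty((f0.size, b.size))
    for j in range(b.size):
        bj = b.copy()
        bj[j] += h
        J[:, j] = (f(bj) - f0) / h
    return J

def gauss_newton(f, b, iters=25):
    # Minimise S = f^T f as in (12.6); at a solution of (12.5), S = 0.
    for _ in range(iters):
        fb = f(b)
        J = jacobian_fd(f, b)
        step, *_ = np.linalg.lstsq(J, -fb, rcond=None)
        b = b + step
    return b

b = gauss_newton(equations, np.array([2.0, 0.5]))
print(b, equations(b) @ equations(b))   # the second number (S) should be near zero

                       If the system had no solution, the same iteration would still seek the least-squares
                       minimum of S, but S would remain positive there.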
                        The minimisation problem when n = 1 is of particular interest as a subproblem
                      in some of the methods to be discussed. Because it has only one parameter it is
                      usually termed the linear search problem. The comparable nonlinear-equation
                      problem is usually called root-finding. For the case that f(b) is a polynomial of
                      degree (K – 1), that is

                          f(b) = c_1 + c_2 b + c_3 b^2 + . . . + c_K b^(K-1)          (12.8)

                      the problem has a particularly large literature (see, for instance, Jenkins and
                      Traub 1975).
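                         As a minimal sketch of this one-parameter root-finding subproblem (and not of the
                       particular algorithms developed later in the book), plain bisection on a bracketing
                       interval suffices; the polynomial test function, its coefficients and the tolerance below
                       are assumptions chosen only for illustration.

def poly_value(c, b):
    # Evaluate a polynomial of degree (K - 1) by Horner's rule;
    # c holds the K coefficients ordered from the constant term upward.
    v = 0.0
    for coeff in reversed(c):
        v = v * b + coeff
    return v

def bisect_root(f, lo, hi, tol=1e-10):
    # Plain bisection: find b in [lo, hi] with f(b) = 0, assuming
    # f(lo) and f(hi) have opposite signs so the interval brackets a root.
    flo = f(lo)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if flo * f(mid) <= 0.0:
            hi = mid                  # root lies in [lo, mid]
        else:
            lo, flo = mid, f(mid)     # root lies in [mid, hi]
    return 0.5 * (lo + hi)

# Illustrative example: f(b) = b^3 - 2b - 5, which has a real root near 2.0946.
root = bisect_root(lambda b: poly_value([-5.0, -2.0, 0.0, 1.0], b), 2.0, 3.0)
print(root)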