Page 362 - Applied Numerical Methods Using MATLAB

MATLAB BUILT-IN ROUTINES FOR OPTIMIZATION  351

             %nm731_1
             % to minimize an objective function f(x) by various methods
             clear, clf
             % An objective function and its gradient function
             f = inline('(x(1) - 0.5).^2.*(x(1) + 1).^2 + (x(2) + 1).^2.*(x(2) - 1).^2','x');
             g0 = '[2*(x(1) - 0.5)*(x(1) + 1)*(2*x(1) + 0.5)  4*(x(2)^2 - 1)*x(2)]';
             g = inline(g0,'x');
             x0 = [0 0.5] %initial guess
             [xon,fon] = opt_Nelder(f,x0) %min point, its ftn value by opt_Nelder()
             [xos,fos] = fminsearch(f,x0) %min point, its ftn value by fminsearch()
             [xost,fost] = opt_steep(f,x0) %min point, its ftn value by opt_steep()
             TolX = 1e-4; MaxIter = 100;
             xont = Newtons(g,x0,TolX,MaxIter);
             xont, f(xont) %min point, its ftn value by Newtons()
             [xocg,focg] = opt_conjg(f,x0) %min point, its ftn value by opt_conjg()
             [xou,fou] = fminunc(f,x0) %min point, its ftn value by fminunc()


              Noting that whether each routine succeeds in finding a minimum point
            depends mainly on the initial value x0, we summarize the results of running
            those routines with various initial values in Table 7.2. It can be seen from
            this table that the gradient-based optimization routines "opt_steep()",
            "Newtons()", "opt_conjg()", and "fminunc()" sometimes end up at a saddle
            point or even a maximum point (Remark 7.1), and that the routines do not
            always approach the extremum closest to the initial point. It is interesting
            to note that even the non-gradient-based MATLAB built-in routine
            "fminsearch()" may get lost, while our routine "opt_Nelder()" works well
            for this case. We cannot, however, conclude from a single trial that one
            routine is better than the other, because there may be some problems for
            which the MATLAB built-in routine works well but our routine does not. All
            we can say about this observation is that no routine is free from defect.
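The saddle/maximum labels in Table 7.2 can be verified from the second derivatives. Since f(x) is a sum of a term in x1 and a term in x2, its Hessian is diagonal, and the signs of the two diagonal entries at a stationary point decide minimum, maximum, or saddle. A small check (illustrative code, not from the book):

```python
import numpy as np

def hessian_diag(x):
    # Diagonal of the Hessian of f(x) = (x1-0.5)^2 (x1+1)^2 + (x2^2-1)^2
    # d2f/dx1^2 by the product rule on f1'(x1) = 2(x1-0.5)(x1+1)(2x1+0.5):
    h11 = 2*(x[0] + 1)*(2*x[0] + 0.5) + 2*(x[0] - 0.5)*(2*x[0] + 0.5) \
          + 4*(x[0] - 0.5)*(x[0] + 1)
    # d2f/dx2^2 from f2'(x2) = 4(x2^3 - x2):
    h22 = 12*x[1]**2 - 4
    return np.array([h11, h22])

def classify(x):
    d = hessian_diag(x)
    if np.all(d > 0): return 'minimum'
    if np.all(d < 0): return 'maximum'
    return 'saddle'

print(classify([0.5, 1]))    # -> minimum
print(classify([0.5, 0]))    # -> saddle
print(classify([-0.25, 0]))  # -> maximum
```

These three points reproduce the first row of Table 7.2: [0.5, 1] is a minimum, [0.5, 0] a saddle, and [-0.25, 0] a maximum.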
              Now, we introduce the MATLAB built-in routine "lsqnonlin(f,x0,l,u,
            options,p1,..)", which presents a nonlinear least-squares (NLLS) solution to


            Table 7.2 Results of Running Several Unconstrained Optimization Routines with
            Various Initial Values

            x0           opt_Nelder   fminsearch   opt_steep    Newtons      opt_conjg    fminunc
            [0, 0]       [-1, 1]      [0.5, 1]     [0.5, 0]     [-0.25, 0]   [0.5, 0]     [0.5, 0]
                         (minimum)    (minimum)    (saddle)     (maximum)    (saddle)     (saddle)
            [0, 0.5]     [0.5, 1]     [0.02, 1]    [0.5, 1]     [-0.25, -1]  [0.5, 1]     [0.5, 1]
                         (minimum)    (lost)       (minimum)    (saddle)     (minimum)    (minimum)
            [0.4, 0.5]   [0.5, 1]     [0.5, 1]     [0.5, 1]     [0.5, -1]    [0.5, 1]     [0.5, 1]
                         (minimum)    (minimum)    (minimum)    (minimum)    (minimum)    (minimum)
            [-0.5, 0.5]  [0.5, 1]     [-1, 1]      [-1, 1]      [-0.25, -1]  [-1, 1]      [-1, 1]
                         (minimum)    (minimum)    (minimum)    (saddle)     (minimum)    (minimum)
            [-0.8, 0.5]  [-1, 1]      [-1, 1]      [-1, 1]      [-1, -1]     [-1, 1]      [-1, 1]
                         (minimum)    (minimum)    (minimum)    (minimum)    (minimum)    (minimum)