


                  using the sampling distribution

                                              P(x) = H(x)/V                         (7.262)
                  Simulated annealing

                  Consider again the Boltzmann distribution for a system with a state vector x and an energy
                  function E(x), such that the probability of finding the system at state x is

                                       P(x) ∝ exp[−E(x)/(k_b T)]                    (7.263)
                   As T → 0, the system is confined more closely to its minimum energy state, until at T = 0⁺
                  only the minimum energy state is observed. By analogy to this result, we obtain a method
                  for finding the global minimum of a cost function known as simulated annealing. Unlike
                  the methods of Chapter 5, simulated annealing may be used for both continuously-varying
                  and discretely-varying parameters. We merely replace the energy with the cost function and
                   set k_b to 1:

                                        P(x) ∝ exp[−F(x)/T]                         (7.264)
                  We sample from this distribution using the Metropolis Monte Carlo method, starting initially
                  at a very high temperature so that the system is able to move efficiently around parameter
                  space. We then slowly decrease the temperature to zero, to allow the system sufficient time
                  to escape from higher-lying local minima and find the global minimum (which is easier
                  said than done).
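
                   As a concrete illustration, the core of such a Metropolis annealing loop might look like the
                   following sketch; this is not the implementation of the routine simulated_annealing.m described
                   below, and the uniform random moves, geometric cooling schedule, move size, and stopping
                   temperature are arbitrary choices made here for illustration.

                   function [x_best, F_best] = sa_sketch(cost_fun, x0, ModelParam)
                   % Minimal sketch of Metropolis-based simulated annealing (illustrative only)
                   T = 10;           % initial (high) temperature
                   T_min = 1e-4;     % stop once the temperature falls below this value
                   alpha = 0.95;     % geometric cooling factor, T <- alpha*T
                   n_per_T = 100;    % Metropolis moves attempted at each temperature
                   dx_max = 0.5;     % maximum random displacement per component
                   x = x0;  F = feval(cost_fun, x, ModelParam);
                   x_best = x;  F_best = F;
                   while T > T_min
                       for k = 1:n_per_T
                           x_new = x + dx_max*(2*rand(size(x)) - 1);  % propose a random move
                           F_new = feval(cost_fun, x_new, ModelParam);
                           % Metropolis criterion with F(x) playing the role of E(x) and k_b = 1
                           if (F_new <= F) || (rand < exp(-(F_new - F)/T))
                               x = x_new;  F = F_new;                 % accept the move
                           end
                           if F < F_best
                               x_best = x;  F_best = F;               % record the best state seen so far
                           end
                       end
                       T = alpha*T;                                   % slowly lower the temperature
                   end
                   end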
                     This method is implemented by simulated_annealing.m, called by

                   [x,F,iflag,x_traj] = simulated_annealing(func_name,x0,OptParam,ModelParam);
                   func_name is the name of a routine that returns the cost function,

                   function f = func_name(x,ModelParam);
                  ModelParam is a structure of fixed parameters passed to the cost function routine. x0 is the
                  initial guess of the state vector. Here, all components of x are assumed to vary continuously,
                  but the routine can be modified easily to treat discretely-varying parameters. OptParam
                  is an optional structure that allows the user to modify the behavior of the algorithm. Type
                   help simulated_annealing for further details. The output includes the estimated minimum
                   x and its cost function value F, an integer iflag for which 1 denotes success, and x_traj, a
                  trajectory of state values sampled during the annealing.
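
                   For example, a call accepting the default algorithm settings might look like the following
                   sketch; the cost function name my_cost, the contents of ModelParam, and the use of an empty
                   OptParam are assumptions made here for illustration, not details taken from
                   simulated_annealing.m itself.

                   ModelParam.a = 1;            % fixed parameters needed by my_cost (assumed)
                   x0 = [0; 0];                 % initial guess of the state vector
                   % pass an empty OptParam to accept the default settings (assumed behavior)
                   [x, F, iflag, x_traj] = simulated_annealing('my_cost', x0, [], ModelParam);
                   if iflag == 1
                       disp('annealing reports success');
                   end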
                    The performance of this simulated annealing routine is examined for the following cost
                   function, which has two minima, one at x̃⁽¹⁾ = (−3, 3) and a lower one at x̃⁽²⁾ = (3, −3):
                          F(x) = ⎧ 10 + 0.1‖x − x̃⁽¹⁾‖₂²,   if ‖x − x̃⁽¹⁾‖₂² ≤ ‖x − x̃⁽²⁾‖₂²     (7.265)
                                 ⎩ ‖x − x̃⁽²⁾‖₂²,           otherwise
                   test_simulated_annealing.m performs this calculation, and calls global_min_cost_func.m,
                   which returns the value of the cost function.
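
                   A cost function routine implementing (7.265) in the form expected by the annealing routine
                   might be sketched as follows; the actual global_min_cost_func.m may differ in its details.

                   function f = global_min_cost_sketch(x, ModelParam)
                   % Cost function of (7.265); ModelParam is not needed for this simple example
                   x1 = [-3; 3];           % location of the higher local minimum, x~(1)
                   x2 = [3; -3];           % location of the global minimum, x~(2)
                   d1 = norm(x - x1)^2;    % squared 2-norm distance to each minimum
                   d2 = norm(x - x2)^2;
                   if d1 <= d2
                       f = 10 + 0.1*d1;
                   else
                       f = d2;
                   end
                   end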