
the computing cost associated with the use of neural networks is the cost of training the networks, whereas the use of a trained network to evaluate a new individual adds negligible computation cost [45].


5.3 Using RBFN for Accelerating the DE Algorithm
In each DE generation, during the evaluation procedure, each trial vector must be evaluated and then compared with the corresponding current vector, in order to select the better fitted of the two to pass to the next generation. The concept is to replace the costly exact evaluations of trial vectors with fast inexact approximations, while maintaining the robustness of the DE algorithm. During the evaluation phase, each trial vector is pre-evaluated using the approximate model. If it is pre-evaluated as lower-fitted (having a higher objective function value in minimization problems) than the corresponding vector of the current population, no further exact evaluation is needed: the current vector is transferred to the next generation and the trial vector is abandoned. If the trial vector is pre-evaluated as better fitted than the corresponding current vector, an exact re-evaluation takes place after the pre-evaluation, followed by a new comparison between the two vectors. If the trial vector is still better fitted than the current vector, the trial vector passes to the next generation; otherwise the current vector is the one that passes. Additionally, a small percentage of the candidate solutions is selected with uniform probability to be exactly evaluated, regardless of the performance predicted by the approximation model. In the first two generations, all vectors are exactly evaluated. According to the aforementioned procedure, only exactly evaluated trial vectors have the opportunity to pass to the new generation, so the current population always comprises exactly evaluated individuals. In this way, one part of each comparison (the current vector) is always an exactly evaluated vector, which enhances the robustness of the procedure.
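
As an illustration, the following sketch shows one way the surrogate-assisted selection step described above could be organized. The names surrogate, exact_eval and the 5% random exact-evaluation rate are assumptions made for the sketch, not details of the authors' implementation; the two-generation warm-up follows the statement that all vectors are exactly evaluated in the first two generations.

import random

def select_next_generation(current_pop, current_fitness, trial_pop,
                           exact_eval, surrogate, generation,
                           p_random_exact=0.05, warmup_generations=2):
    """One DE selection step with surrogate pre-evaluation (minimization)."""
    next_pop, next_fitness = [], []
    for current, current_f, trial in zip(current_pop, current_fitness, trial_pop):
        force_exact = (generation < warmup_generations
                       or random.random() < p_random_exact)
        if not force_exact:
            # Cheap pre-evaluation with the approximation model.
            if surrogate.predict(trial) >= current_f:
                # Pre-evaluated as lower-fitted: the current vector is kept
                # and no exact evaluation of the trial vector is performed.
                next_pop.append(current)
                next_fitness.append(current_f)
                continue
        # The trial vector looks promising (or exact evaluation was forced):
        # exact re-evaluation and a new comparison between the two vectors.
        trial_f = exact_eval(trial)
        if trial_f < current_f:
            next_pop.append(trial)
            next_fitness.append(trial_f)
        else:
            next_pop.append(current)
            next_fitness.append(current_f)
    return next_pop, next_fitness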
The result of each evaluation (exact or inexact), along with the corresponding chromosome, is stored in a database. In order to obtain a local approximation model, only the best-fitted individuals among the database entries are used in each generation to re-train the RBFN. In this way the approximation model evolves with the population and uses only the information that is useful for approximating the objective function. The surrogate model predictions replace exact and costly evaluations only for the less promising individuals, while the more promising ones are always exactly evaluated.
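
A minimal sketch of the per-generation re-training on the best-fitted database entries is given below. The Gaussian kernel, the width heuristic, the use of every selected sample as a centre and the n_best cut-off are illustrative assumptions, not the configuration reported by the authors.

import numpy as np

class SimpleRBFN:
    """A small Gaussian RBF network fitted by linear least squares."""

    def fit(self, X, y):
        X = np.asarray(X, dtype=float)
        y = np.asarray(y, dtype=float)
        self.centres = X                      # one centre per training sample
        dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        off_diag = dists[dists > 0.0]
        self.sigma = off_diag.mean() if off_diag.size else 1.0
        Phi = np.exp(-(dists / self.sigma) ** 2)
        self.weights = np.linalg.lstsq(Phi, y, rcond=None)[0]
        return self

    def predict(self, x):
        dists = np.linalg.norm(self.centres - np.asarray(x, dtype=float), axis=-1)
        return float(np.exp(-(dists / self.sigma) ** 2) @ self.weights)

def retrain_surrogate(database, n_best=60):
    """Re-train the RBFN on the best-fitted database entries only,
    so that the approximation stays local to the promising region."""
    best = sorted(database, key=lambda entry: entry[1])[:n_best]  # entries are (vector, fitness)
    X = [vector for vector, _ in best]
    y = [fitness for _, fitness in best]
    return SimpleRBFN().fit(X, y)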


6 Simulation Results

The same artificial environment was used for all the test cases considered, with different starting and target points. The (experimentally optimized) settings of the Differential Evolution algorithm were as follows: population size = 50,