TABLE 32.2 Population of Second Generation

String Number   String   Decimal Value   Variable Value   Function Value   Fraction of Total
1               101111   47              1.175            0.0696           0.1587
2               100101   37              0.925            0.0307           0.0701
3               110101   53              1.325            0.0774           0.1766
4               101001   41              1.025            0.0475           0.1084
5               100001   33              0.825            0.0161           0.0368
6               110101   53              1.325            0.0774           0.1766
7               110000   48              1.200            0.0722           0.1646
8               101001   41              1.025            0.0475           0.1084
Total                                                     0.4387           1.0000
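The derived columns of Table 32.2 are straightforward to reproduce. The Python sketch below takes the binary strings and function values directly from the table and computes the remaining columns; the decoding x = decimal/40 is implied by the table's Decimal and Variable columns, and since the fitness function itself is defined earlier in the chapter, its values are simply copied here rather than recomputed:

# Reproducing the derived columns of Table 32.2.
# The strings and function values come from the table; the fitness
# function itself is defined earlier in the chapter.

population = ["101111", "100101", "110101", "101001",
              "100001", "110101", "110000", "101001"]
f_values   = [0.0696, 0.0307, 0.0774, 0.0475,
              0.0161, 0.0774, 0.0722, 0.0475]

total = sum(f_values)   # 0.4384 from the rounded values above (0.4387 in the table)
for s, v in zip(population, f_values):
    decimal  = int(s, 2)        # binary string -> decimal value
    x        = decimal / 40.0   # decode to the variable value
    fraction = v / total        # fraction of total: roulette-wheel selection probability
    print(f"{s}  {decimal:2d}  {x:.3f}  {v:.4f}  {fraction:.4f}")

The fraction of total is the probability of a string being selected as a parent, which is how the parent pairs below are drawn.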
Note that the two identical highest-ranking members of the second generation (110101 → 53, i.e., x = 1.325) are very close to the solution x = 1.309. The randomly chosen parent pairs for the third generation are (one pair per column):

101111 → 47   110101 → 53   110000 → 48   101001 → 41
110101 → 53   110000 → 48   101001 → 41   110101 → 53
which, after crossover at the midpoint of each pair, produce the following children:

101101 → 45   110000 → 48   110001 → 49   101101 → 45
110111 → 55   110101 → 53   101000 → 40   110001 → 49
The best result in the third population is the same as in the second. Careful inspection of the strings in the second and third generations shows that with crossover that always splits strings in half, the best solution 110100 → 52 can never be reached, no matter how many generations are created: every child's tail is the tail of one of its parents, and no string in the second generation ends with the substring 100. With such a crossover scheme, a better result can be obtained only through mutation, which may require many generations. Better results in future generations can also be obtained when strings are split at random points, or when only randomly chosen bits are exchanged between the parents.
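A minimal sketch of these operators in Python follows; the function names and the mutation rate are illustrative choices, not from the chapter. The final check confirms why midpoint crossover alone can never produce 110100 here:

import random

def midpoint_crossover(a, b):
    # The scheme used above: split both parents in half and swap the tails.
    cut = len(a) // 2
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def random_point_crossover(a, b):
    # Split at a random position instead of always at the midpoint.
    cut = random.randint(1, len(a) - 1)
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def uniform_crossover(a, b):
    # Exchange randomly chosen bits between the parents.
    pairs = [(x, y) if random.random() < 0.5 else (y, x) for x, y in zip(a, b)]
    return "".join(p[0] for p in pairs), "".join(p[1] for p in pairs)

def mutate(s, rate=0.02):
    # Flip each bit independently with a small probability.
    return "".join(c if random.random() > rate else "10"[int(c)] for c in s)

# With midpoint crossover every child's tail is some parent's tail, and no
# selected parent ends with "100", so 110100 (decimal 52) is unreachable
# without mutation:
parents = ["101111", "110101", "110000", "101001"]
print(any(p.endswith("100") for p in parents))   # False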
The genetic algorithm is very rapid and leads to a good solution within a few generations. This solution is usually close to the global maximum, but it is not guaranteed to be the best.
Defining Terms
Backpropagation: Training technique for multilayer neural networks.
Bipolar neuron: Neuron with output signal between −1 and +1.
Feedforward network: Network without feedback.
Perceptron: Network with hard threshold neurons.
Recurrent network: Network with feedback.
Supervised learning: Learning procedure in which the desired outputs are known.
Unipolar neuron: Neuron with output signal between 0 and +1.
Unsupervised learning: Learning procedure in which the desired outputs are unknown.