interconnexions.
In this spirit, evolutionary design principles may become essential for designing nanodevices. An example of an evolutionary design algorithm, or
simply evolutionary algorithm (EA) for short, is shown in Figure 10.3. It might be initialized by a collection of existing designs, or guesses at
possible new designs. Since new variety within the design population is generated randomly, the algorithm effectively expands the imagination of
the human designer.
Figure 10.3 An evolutionary design algorithm. All relevant design features are encoded in the genome (in a very simple genome, each gene is a single binary digit indicating absence (0) or presence (1) of a feature). The genomes are evaluated (the "survivor selection strategy"), a stage that could include human (interactive) as well as automated evaluation, and only genomes fulfilling the evaluation criteria are retained. The diminished population is then expanded in numbers and in variety; typically the successful genomes are used as the basis for generating new ones via biologically inspired processes such as recombination and mutation.
Essential requirements of evolutionary design are, then:
1. Encoding the features of the entity to be designed. Each feature is a “gene”, and the ensemble of genes is the “genome”.
2. Generating a population of variants of the entity; in the simplest genomes, each possible feature may be present or absent, in which case the
genome is simply a binary string; more elaborate ones may allow features to take a range of possible values.
3. Selection. If the selection criterion or criteria can be applied directly to the genome, the process is very rapid, especially if the criterion is a
simple threshold (e.g., is the ultimate functional requirement met?). It may be, however, that the genome needs to be translated into the entity it
encodes, and the selection criterion or criteria applied to the entity itself; an intermediate situation is if the genome can be translated into a
model of the entity, which is then subjected to the selection criterion or criteria. Only those genomes fulfilling the criteria are retained; the rest are
discarded.
4. Regeneration of variety. Since the selection (3.) implies more or less drastic elimination, the survivors (there must be at least one) become the parents of the next generation. Reproduction may be asexual ("mutation", i.e., randomly changing the values of genes), sexual (two or more parents are selected, according to some strategy, which may be random, and fragments of their genomes, again selected according to some strategy, are combined into the offspring's genome), or both.
5. Return to 3. and continue iterating. If the selection criterion is graded, one can continue until no further increase in quality takes place. It is almost inevitable that human intervention must then make the final choice (unless there is a single clear "winner"). Indeed, human intervention may be incorporated at every selection stage, although this makes the process very slow. Typically human intervention takes place after, say, one hundred iterations of automatic selection and regeneration; the human selector is presented with a small number (~10) of the best candidates according to the automatic selection process, and selects a few from those (e.g., [25]). This "interactive evolutionary computation" (IEC) is an excellent way to incorporate expert human knowledge into the design process without having to define and quantify it.
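The five steps above can be sketched in code. The following is a minimal illustrative implementation, not a reference one: the binary genome, population size, mutation rate and the placeholder fitness function (simply counting features present) are all invented for the example.

```python
import random

GENOME_LENGTH = 8   # number of features (genes) per genome
POP_SIZE = 20       # number of genomes in the population

def fitness(genome):
    # Placeholder selection criterion: count of desired features present.
    return sum(genome)

def mutate(genome, rate=0.1):
    # Asexual variation: randomly flip gene values.
    return [1 - g if random.random() < rate else g for g in genome]

def recombine(a, b):
    # Sexual variation: one-point crossover of two parents' genomes.
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def evolve(generations=50):
    # Steps 1-2: encode features as binary strings and generate a
    # population of random variants.
    pop = [[random.randint(0, 1) for _ in range(GENOME_LENGTH)]
           for _ in range(POP_SIZE)]
    for _ in range(generations):
        # Step 3: selection; retain the better half (at least one survivor).
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:max(1, POP_SIZE // 2)]
        # Step 4: regeneration of variety from the surviving parents.
        pop = survivors[:]
        while len(pop) < POP_SIZE:
            a, b = random.choice(survivors), random.choice(survivors)
            pop.append(mutate(recombine(a, b)))
    # Step 5: iteration complete; return the best genome found.
    return max(pop, key=fitness)

best = evolve()
```

Here selection is applied directly to the genome, the fast case noted in step 3; in practice the fitness function would translate the genome into the entity (or a model of it) before evaluation.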
Although the EA design strategy was inspired by biological ("Darwinian") evolution, there is a crucial qualitative difference: the latter is open-ended, whereas the former is generally applied to achieve a preset goal, even though that goal may be specified at a high level. In fact, the main
current use of evolutionary algorithms is multi-objective optimization of designs whose main features are already fixed.
The evolutionary paradigm can operate at various levels. “Genetic algorithms” merely allow the parameters of the system's description to evolve
(e.g., the coefficients of equations). “Genetic programming” allows more freedom—the algorithm itself may evolve. One may also allow the
genome's structure to evolve, and so forth. EAs are generally implemented synchronously (the entire population is simultaneously subjected to the
same process), which is of course different from evolution in nature. In a real evolving system, “bad” species may persist, and “good” species may
be eliminated for no good reason [5]. These and other biological features [14] will hopefully be implemented in the future as EAs themselves
continue to evolve. Ultimately one might reach the ability to create self-organizing functionality.
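The "genetic algorithm" level can be illustrated concretely: only numerical parameters evolve, while the structure of the model stays fixed. The sketch below evolves the coefficients of a straight line to fit some data; the data, mutation scale and population settings are invented for illustration.

```python
import random

# Target relationship y = 2x + 1; only the coefficients (a, b) evolve,
# never the form of the equation itself.
data = [(x, 2.0 * x + 1.0) for x in range(10)]

def error(coeffs):
    # Selection criterion: sum of squared residuals (lower is better).
    a, b = coeffs
    return sum((a * x + b - y) ** 2 for x, y in data)

def evolve_coefficients(generations=300, pop_size=30):
    # Genome = the pair of real-valued coefficients (a, b).
    pop = [[random.uniform(-5, 5), random.uniform(-5, 5)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=error)              # selection: lowest error first
        survivors = pop[:pop_size // 2]
        pop = survivors[:]
        while len(pop) < pop_size:       # regenerate by mutating survivors
            parent = random.choice(survivors)
            pop.append([c + random.gauss(0, 0.2) for c in parent])
    return min(pop, key=error)

a, b = evolve_coefficients()
```

"Genetic programming", by contrast, would allow the expression `a * x + b` itself to mutate, e.g. into a polynomial or another functional form.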
Although this strategy of evolutionary design enables the design size (i.e., the number of individual features that must be explicitly specified) to be
expanded practically without limit, one sacrifices knowledge of the exact internal workings of the system, introducing a level of unpredictability into
device performance that may require a new engineering paradigm to be made acceptable. One should, however, bear in mind that even
deterministically designed complex systems (e.g., a motor car, ship or airplane), at current levels of technological sophistication, have a vast
behavior space, and not every possible combination of control parameters can be explicitly tested. Given, therefore, that we already in practice sacrifice complete knowledge of the system (even though in principle it is attainable), and yet still have a high degree of confidence in its ability to function safely, it may be unreasonable to object to using an evolutionarily designed artifact.
10.8. Performance Criteria
The evolutionary algorithm approach demands “fitness” criteria against which the performance of offspring can be judged. Although a system's
performance can only be meaningfully measured by considering its ultimate functional output, it may still be helpful to consider component
performance. Thus, a logic gate may be assessed according to its rectification ratio and switching speed. A memory cell can be assessed
according to its flip energy (between states) and the stability of those states. Sensors may be assessed by (in order of increasing preference) gain, signal-to-noise ratio [73] and detection efficiency [114]. Note that the performance of microelectromechanical systems (MEMS) such as accelerometers is degraded if they are further miniaturized down to the nanoscale [73].
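Component criteria such as these must be combined into a single scalar "fitness" before the algorithm can rank offspring. One common approach is a weighted sum; the sketch below is illustrative only, and the weights, the logarithmic scaling and the figures for the hypothetical logic gates are all invented assumptions.

```python
import math

def gate_fitness(rectification_ratio, switching_speed_hz,
                 w_ratio=1.0, w_speed=1.0):
    # Combine the two criteria on logarithmic scales so that neither
    # dominates merely because of its units or magnitude.
    return (w_ratio * math.log10(rectification_ratio)
            + w_speed * math.log10(switching_speed_hz))

# Two hypothetical gates: equal switching speed (1 GHz), but the first
# has a tenfold better rectification ratio, so it scores higher.
f_better = gate_fitness(1e4, 1e9)
f_worse = gate_fitness(1e3, 1e9)
```

When the criteria genuinely conflict (e.g., flip energy versus state stability in a memory cell), a multi-objective formulation that retains a whole front of non-dominated designs is often preferred to a single weighted sum.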