Nguyen and Widrow (1990) proposed an experimental approach for weight initialization in two-layer networks. In the second layer, weights are chosen at random in the range from −0.5 to +0.5. In the first layer, the initial weights are calculated from
$$
w_{ij} = \frac{b\,z_{ij}}{\lVert \mathbf{z}_j \rVert},
\qquad
w_{(n+1)j} = \mathrm{random}(-b,\,+b)
\tag{32.32}
$$
where $z_{ij}$ is a random number from −0.5 to +0.5, $\lVert \mathbf{z}_j \rVert$ is the norm of the vector of these random numbers for the $j$th neuron, and the scaling factor $b$ is given by

$$
b = 0.7\,N^{1/n}
\tag{32.33}
$$

where $n$ is the number of inputs and $N$ is the number of hidden neurons in the first layer. This type of weight initialization usually leads to faster convergence.
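As a concrete illustration, the sketch below implements Eqs. (32.32) and (32.33) with NumPy. The function name and the (n_hidden, n_inputs) weight layout are assumptions made for this example, not notation from the text.

```python
import numpy as np

def nguyen_widrow_init(n_inputs, n_hidden, rng=None):
    """Sketch of Nguyen-Widrow first-layer initialization.

    Returns a weight matrix of shape (n_hidden, n_inputs) and the
    bias weights w_(n+1)j; the layout is an assumption of this example.
    """
    rng = np.random.default_rng() if rng is None else rng
    # z_ij: random numbers from -0.5 to +0.5, one row per hidden neuron
    z = rng.uniform(-0.5, 0.5, size=(n_hidden, n_inputs))
    # Scaling factor b = 0.7 * N**(1/n), Eq. (32.33)
    b = 0.7 * n_hidden ** (1.0 / n_inputs)
    # w_ij = b * z_ij / ||z_j||, Eq. (32.32)
    w = b * z / np.linalg.norm(z, axis=1, keepdims=True)
    # Bias weights w_(n+1)j drawn at random from (-b, +b)
    w_bias = rng.uniform(-b, b, size=n_hidden)
    return w, w_bias
```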
For adequate solutions with backpropagation networks, many attempts with different network structures and different initial random weights are typically required. It is important that the trained network acquires the ability to generalize, meaning that it should also handle correctly patterns that were not used for training. Therefore, in the training procedure, some data are often removed from the training patterns and later used for verification (a minimal sketch of such a split follows this paragraph). The results obtained with backpropagation networks often depend on luck. This has encouraged researchers to develop feedforward networks that can be more reliable; some of them are described in the following sections.
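The helper below is one way to carry out this hold-out procedure; the function name and the 20% split ratio are illustrative assumptions, since the text does not prescribe them.

```python
import numpy as np

def split_patterns(x, d, holdout_fraction=0.2, rng=None):
    """Remove some patterns from the training set for verification.

    The 20% hold-out ratio is an illustrative assumption.
    """
    rng = np.random.default_rng() if rng is None else rng
    idx = rng.permutation(len(x))          # shuffle pattern indices
    n_hold = int(len(x) * holdout_fraction)
    hold, train = idx[:n_hold], idx[n_hold:]
    return (x[train], d[train]), (x[hold], d[hold])
```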
Functional Link Network
One-layer neural networks are relatively easy to train, but they can solve only linearly separable problems. One possible solution for nonlinear problems was presented by Nilsson (1965) and later elaborated by Pao (1989) using the functional link network shown in Fig. 32.11. By adding nonlinear terms with initially determined functions, the actual number of inputs supplied to the one-layer neural network is increased. In the simplest case, the nonlinear elements are higher-order terms of the input patterns. Note that the functional link network can be treated as a one-layer network whose additional input data are generated off-line using nonlinear transformations. The learning procedure for a one-layer network is easy and fast. Figure 32.12 shows an XOR problem solved using functional link networks. Note that when the functional link approach is used, this difficult problem becomes a trivial one. The drawback of the functional link network is that proper selection of the nonlinear elements is not an easy task. In many practical cases, however, it is not difficult to predict what kind of transformation of the input data may linearize the problem, and then the functional link approach can be used.
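The XOR case can be sketched in a few lines. The example below is an illustration, not necessarily the construction used in Fig. 32.12: bipolar XOR inputs are expanded off-line with the product term x1*x2, and the resulting one-layer problem is solved by ordinary least squares, one of several possible ways to train a linear unit.

```python
import numpy as np

# Bipolar XOR patterns: inputs (x1, x2) and desired outputs d.
X = np.array([[-1, -1], [-1, +1], [+1, -1], [+1, +1]], dtype=float)
d = np.array([-1, +1, +1, -1], dtype=float)

# Functional link expansion: append the nonlinear term x1*x2
# (generated off-line) and the constant bias input +1.
X_fl = np.column_stack([X, X[:, 0] * X[:, 1], np.ones(len(X))])

# With the product term the problem becomes linearly separable,
# so a one-layer network suffices; least squares finds exact weights.
w, *_ = np.linalg.lstsq(X_fl, d, rcond=None)
print(np.sign(X_fl @ w))  # [-1.  1.  1. -1.] -- matches d
```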
FIGURE 32.11 The functional link network.