Page 954 - The Mechatronics Handbook
                       FIGURE 32.4  Typical activation functions: (a) hard threshold unipolar, (b) hard threshold bipolar, (c) continuous
                       unipolar, (d) continuous bipolar.

FIGURE 32.5  An example of a three-layer feedforward neural network, sometimes also known as the backpropagation network.

                       32.3 Feedforward Neural Networks

Feedforward neural networks allow only one-directional signal flow. Furthermore, most feedforward neural networks are organized in layers. An example of a three-layer feedforward neural network is shown in Fig. 32.5. This network consists of input nodes, two hidden layers, and an output layer.
  A single neuron is capable of separating input patterns into two categories, and this separation is linear. For example, for the patterns shown in Fig. 32.6, the separation line crosses the x1 and x2 axes at points x10 and x20. This separation can be achieved with a neuron having the following weights: w1 = 1/x10, w2 = 1/x20, and w3 = −1. In general, for n dimensions, the weights are

    $$w_i = \frac{1}{x_{i0}}, \quad i = 1, \ldots, n, \qquad \text{with } w_{n+1} = -1 \tag{32.8}$$
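The weight rule above can be sketched as a hard-threshold neuron in a few lines of Python. This is a minimal illustration, not code from the text; the intercepts x10 = 2.0 and x20 = 3.0 are assumed example values, and the bias input is the constant +1 node shown in Fig. 32.5.

```python
def neuron(x, weights):
    """Hard-threshold unipolar neuron: output 1 if net >= 0, else 0.

    The input vector is extended with a constant +1 so that the last
    weight acts as the bias (w_{n+1} = -1 in Eq. 32.8).
    """
    net = sum(w * xi for w, xi in zip(weights, x + [1.0]))
    return 1 if net >= 0 else 0

x10, x20 = 2.0, 3.0                 # assumed axis intercepts of the separation line
weights = [1 / x10, 1 / x20, -1.0]  # w1 = 1/x10, w2 = 1/x20, w3 = -1

print(neuron([3.0, 3.0], weights))  # point above the line -> 1
print(neuron([0.5, 0.5], weights))  # point below the line -> 0
```

Any point on the line x1/x10 + x2/x20 = 1 gives net = 0, so the weights place the decision boundary exactly through the two intercepts.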
  One neuron can divide only linearly separable patterns. To select just one region in n-dimensional input space, more than n + 1 neurons should be used. If more input clusters are to be selected, then the number of neurons in the input (hidden) layer should be increased accordingly. If the number of neurons in the input (hidden) layer is not limited, then all classification problems can be solved using the three-layer network. An example of such a neural network, classifying three clusters in two-dimensional space, is shown in Fig. 32.7. Neurons in the first hidden layer create the separation lines between input clusters.
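The layered region-selection idea can be sketched as a forward pass with hard-threshold neurons. All weights below are illustrative assumptions, not values from the text: the hidden layer realizes two separation lines, and the output neuron ANDs the resulting half-planes to select one region, mirroring how the first hidden layer in Fig. 32.7 draws boundaries that later layers combine.

```python
def layer(inputs, weight_rows):
    """Apply one layer of hard-threshold neurons.

    Each row is [w1, ..., wn, bias weight]; the input vector is
    extended with a constant +1 (the bias node of Fig. 32.5).
    """
    extended = inputs + [1.0]
    return [1 if sum(w * x for w, x in zip(row, extended)) >= 0 else 0
            for row in weight_rows]

def forward(x, layers):
    """Feedforward pass: signals flow one way through successive layers."""
    for weight_rows in layers:
        x = layer(x, weight_rows)
    return x

# Assumed example: two separation lines, x1 = 1 and x2 = 1.
hidden = [[1.0, 0.0, -1.0],   # fires when x1 >= 1
          [0.0, 1.0, -1.0]]   # fires when x2 >= 1
output = [[1.0, 1.0, -1.5]]   # AND of the two half-planes

print(forward([2.0, 2.0], [hidden, output]))  # inside the region  -> [1]
print(forward([2.0, 0.0], [hidden, output]))  # outside the region -> [0]
```

Selecting more clusters amounts to adding more separation-line neurons to the hidden layer and more combining neurons afterward, which is why an unlimited hidden layer suffices for any classification problem.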


                       ©2002 CRC Press LLC