the same layer. One example of an ANN with lateral connections is the Recurrent MultiLayer Perceptron (RMLP) network [8–10].

2.1.2.2 Examples of Layered Structural Organization for Neural Network Models
Examples of structural organization options for static-type ANN models (i.e., without TDL elements and/or feedbacks) are shown in Fig. 2.9. The ADALINE network [11] is a single-layer (i.e., without hidden layers) linear ANN model; its structure is shown in Fig. 2.9A. A more general variant of feedforward neural network (FFNN) is the MLP (MultiLayer Perceptron) [10,11], which is a nonlinear network with one or more hidden layers (Fig. 2.9B).

FIGURE 2.9 Examples of a structural organization for feedforward neural networks. (A) ADALINE (Adaptive Linear Network). (B) MLP (MultiLayer Perceptron). D_in are source (input) data; D_out are output data (results); L^(0) is the input layer; L^(1) is the output layer.
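To make the structural distinction concrete, the following is a minimal NumPy sketch contrasting an ADALINE-style network (a single linear layer) with an MLP that has one nonlinear hidden layer. The layer sizes and the tanh activation are illustrative assumptions, not taken from the text.

```python
# Sketch: ADALINE (single linear layer) vs. MLP (one nonlinear hidden layer).
import numpy as np

rng = np.random.default_rng(0)

def adaline(x, W, b):
    """ADALINE: a single-layer linear map from inputs to outputs."""
    return W @ x + b

def mlp(x, W1, b1, W2, b2):
    """MLP: nonlinear hidden layer between two affine maps."""
    h = np.tanh(W1 @ x + b1)   # hidden layer
    return W2 @ h + b2         # output layer

n_in, n_hidden, n_out = 3, 5, 2
x = rng.normal(size=n_in)

# ADALINE parameters: one weight matrix and one bias vector
W = rng.normal(size=(n_out, n_in)); b = np.zeros(n_out)

# MLP parameters: input-to-hidden and hidden-to-output
W1 = rng.normal(size=(n_hidden, n_in)); b1 = np.zeros(n_hidden)
W2 = rng.normal(size=(n_out, n_hidden)); b2 = np.zeros(n_out)

print("ADALINE output:", adaline(x, W, b))
print("MLP output:    ", mlp(x, W1, b1, W2, b2))
```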
Dynamic networks can be divided into two classes [12–19]:

• feedforward networks, in which the input signals are fed through delay lines (TDL elements);
• recurrent networks, in which feedbacks exist, and there may also be TDL elements at the inputs of the network.
Examples of the structural organization of ANN models of the dynamic type of the first kind (i.e., with TDL elements at the network inputs, but without feedbacks) are shown in Fig. 2.10. Typical variants of ANN models of this type are the Time Delay Neural Network (TDNN) [10,20–27], whose structure is shown in Fig. 2.10A (the Focused Time Delay Neural Network (FTDNN) is organized similarly in structural terms), as well as the Distributed Time Delay Neural Network (DTDNN) [28] (see Fig. 2.10B).

FIGURE 2.10 Examples of a structural organization for feedforward dynamic neural networks. (A) TDNN (Time Delay Neural Network). (B) DTDNN (Distributed Time Delay Neural Network). D_in are source (input) data; D_out are output data (results); L^(0) is the input layer; L^(1) is the hidden layer; L^(2) is the output layer; TDL_1^(n) and TDL_2^(m) are tapped delay lines (TDLs) of order n and m, respectively.
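As a concrete illustration of how a tapped delay line feeds a static network in a TDNN-style model, here is a minimal NumPy sketch of one forward step: the current input and its delayed copies are stacked and passed to a static MLP. The delay order, layer sizes, and activation are illustrative assumptions rather than a prescribed configuration.

```python
# Sketch: TDNN-style forward pass with a tapped delay line (TDL) at the input.
import numpy as np
from collections import deque

rng = np.random.default_rng(1)

n_in, order, n_hidden, n_out = 1, 3, 8, 1
W1 = rng.normal(size=(n_hidden, n_in * (order + 1))); b1 = np.zeros(n_hidden)
W2 = rng.normal(size=(n_out, n_hidden));              b2 = np.zeros(n_out)

# Tapped delay line of order n: stores u(k), u(k-1), ..., u(k-n)
tdl = deque([np.zeros(n_in)] * (order + 1), maxlen=order + 1)

def tdnn_step(u_k):
    """Push the new input into the TDL and evaluate the static part of the network."""
    tdl.appendleft(u_k)                 # newest sample in front, oldest dropped
    z = np.concatenate(list(tdl))       # stacked vector [u(k); u(k-1); ...; u(k-n)]
    h = np.tanh(W1 @ z + b1)            # hidden layer
    return W2 @ h + b2                  # output layer

# Drive the network with a short input sequence
for k in range(5):
    u_k = np.array([np.sin(0.5 * k)])
    print(f"y({k}) =", tdnn_step(u_k))
```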
Examples of the structural organization of dynamic ANN models of the second kind, that is, of recurrent neural networks (RNNs), are shown in Figs. 2.11–2.13. Classical examples of recurrent networks, from which this area of research to a large extent began to develop, are the Jordan network [14,15] (Fig. 2.11A), the Elman network [10,29–32] (Fig. 2.11B), the Hopfield network [10,11] (Fig. 2.12A), and the Hamming network [11,28] (Fig. 2.12B).