The most general way of introducing feedback into a "stack of layers"–type structure is shown in Fig. 2.7C. Here the feedback comes from some hidden layer L^(q), 1 < q < N_L, to the layer L^(p), 1 ≤ p < N_L, q > p. Similar to the case shown in Fig. 2.7A, this variant can be treated as a serial connection of a feedforward neural network (layers L^(1), ..., L^(p−1)), a network with feedback (layers L^(p), ..., L^(q)), and another feedforward network (layers L^(q+1), ..., L^(N_L)). The operation of such a network can, for example, be interpreted as follows. The recurrent subnet (the layers L^(p), ..., L^(q)) is the main part of the ANN as a whole. One feedforward subnet (layers L^(1), ..., L^(p−1)) preprocesses the data entering the main subnet, while the second subnet (layers L^(q+1), ..., L^(N_L)) performs some postprocessing of the data produced by the main recurrent subnet.
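This decomposition maps naturally onto code. The following minimal sketch (our own illustration, not taken from the book) implements the Fig. 2.7C scheme as a serial connection of a feedforward preprocessor, a recurrent core with a feedback loop, and a feedforward postprocessor; the layer sizes, the tanh activations, and the collapsing of each subnet into a single layer are assumptions made for brevity.

```python
# Minimal sketch of the Fig. 2.7C decomposition: preprocessing subnet,
# recurrent core, postprocessing subnet. Sizes/activations are assumed.
import numpy as np

rng = np.random.default_rng(0)

def dense(n_in, n_out):
    """One feedforward layer: weight matrix plus bias."""
    return rng.standard_normal((n_out, n_in)) * 0.1, np.zeros(n_out)

W_pre, b_pre = dense(3, 5)      # layers L^(1)..L^(p-1), collapsed to one here
W_rec, b_rec = dense(5, 4)      # recurrent subnet L^(p)..L^(q): input weights
W_fb = rng.standard_normal((4, 4)) * 0.1   # feedback from L^(q) back to L^(p)
W_post, b_post = dense(4, 2)    # layers L^(q+1)..L^(N_L)

def step(u, h):
    """Process one input vector u given the recurrent state h."""
    x = np.tanh(W_pre @ u + b_pre)              # preprocessing subnet
    h = np.tanh(W_rec @ x + W_fb @ h + b_rec)   # main recurrent subnet
    y = np.tanh(W_post @ h + b_post)            # postprocessing subnet
    return y, h

h = np.zeros(4)                                 # initial recurrent state
for u in rng.standard_normal((6, 3)):           # a short input sequence
    y, h = step(u, h)
print(y)
```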
Fig. 2.7D shows an example of a generalization of the structure shown in Fig. 2.7C, for the case in which, in addition to the strictly consecutive connections between the layers of the network, there are also bypass connections.
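As a hedged illustration of such bypass connections, the sketch below (the layer sizes and the placement of the bypass link are our assumptions) lets the last layer receive input both from its immediate predecessor and, through a bypass link, from an earlier layer.

```python
# Minimal sketch of bypass connections: layer outputs are retained so a
# later layer can also read the output of an earlier, non-adjacent layer.
import numpy as np

rng = np.random.default_rng(1)
sizes = [3, 5, 5, 2]                       # widths of L^(1)..L^(4), illustrative
W = [rng.standard_normal((sizes[i + 1], sizes[i])) * 0.1
     for i in range(len(sizes) - 1)]
W_bypass = rng.standard_normal((sizes[3], sizes[1])) * 0.1   # L^(2) -> L^(4)

def forward(u):
    outputs = [u]                          # outputs of successive layers
    for Wk in W[:-1]:
        outputs.append(np.tanh(Wk @ outputs[-1]))
    # The last layer sums its consecutive input and the bypass input.
    return np.tanh(W[-1] @ outputs[-1] + W_bypass @ outputs[1])

print(forward(rng.standard_normal(3)))
```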
In all the ANN variants shown in Fig. 2.6, the strict sequence of layers is preserved unchanged. The layers are activated one after the other, in the order specified by the forward and backward connections in the considered ANN. For a feedforward network, this means that any neuron from the layer L^(p) receives its inputs only from neurons of the layer L^(p−1) and passes its outputs to the layer L^(p+1), i.e.,

L^(p−1) → L^(p) → L^(p+1),   p ∈ {0, 1, ..., N_L}.   (2.7)

Two or more layers cannot be executed ("fired") simultaneously, even if there is such a technical capability (e.g., the network is executed on some parallel computing system), due to the sequential operation logic of the ANN layers noted above.
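The strict firing sequence of Eq. (2.7) can be made explicit in code. The following minimal sketch (layer sizes and activations are illustrative assumptions) executes the layers strictly one after another; no two layers are ever active at once.

```python
# Minimal sketch of the strict firing sequence of Eq. (2.7): each layer
# L^(p) fires only after L^(p-1) has completely finished its work.
import numpy as np

rng = np.random.default_rng(2)
sizes = [3, 4, 4, 2]                       # illustrative layer widths
layers = [(rng.standard_normal((sizes[i + 1], sizes[i])) * 0.1,
           np.zeros(sizes[i + 1])) for i in range(len(sizes) - 1)]

x = rng.standard_normal(3)
for p, (W, b) in enumerate(layers, start=1):
    x = np.tanh(W @ x + b)                 # only one layer runs at a time
    print(f"layer L^({p}) finished")
print(x)
```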
The use of feedback introduces cyclicity into the order of operation of the layers. We can implement this cyclicity for all layers, beginning with L^(1) and up to L^(N_L), or only for some of them, i.e., for some range of numbers p_1 ≤ p ≤ p_2. The implementation depends on which layers of the ANN we cover by feedback. However, in any case, some strict sequence of operation of the layers is preserved: if one of the ANN layers has started its work, then, until this work is completed, no other layer will be launched for processing.
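One possible reading of this cyclic operation is sketched below (the number of cycles, the layer widths, and the placement of the feedback link are our assumptions): the layers covered by the feedback loop fire repeatedly, one at a time and in a fixed order, while the layers outside the loop fire once.

```python
# Minimal sketch of cyclicity over a range of layers: the looped layers
# re-fire for several cycles; the surrounding layers fire only once.
import numpy as np

rng = np.random.default_rng(3)
W_in  = rng.standard_normal((4, 3)) * 0.1   # feedforward layer before the loop
W_a   = rng.standard_normal((4, 4)) * 0.1   # first layer inside the loop
W_b   = rng.standard_normal((4, 4)) * 0.1   # last layer inside the loop
W_fb  = rng.standard_normal((4, 4)) * 0.1   # feedback: last -> first looped layer
W_out = rng.standard_normal((2, 4)) * 0.1   # feedforward layer after the loop

u = rng.standard_normal(3)
x = np.tanh(W_in @ u)                       # fires once
fb = np.zeros(4)                            # feedback signal, zero on cycle one
for _ in range(3):                          # looped layers fire in strict order,
    h = np.tanh(W_a @ x + W_fb @ fb)        # cycle after cycle
    fb = np.tanh(W_b @ h)
y = np.tanh(W_out @ fb)                     # fires once, after the cycling ends
print(y)
```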
The rejection of this kind of strict firing sequence for the ANN layers leads to the appearance of parallelism in the network at the level of its layers. In the most general case, we allow any neuron from the layer L^(p) and any neuron from the layer L^(q) to establish a connection of any type. Namely, we allow forward and backward connections (for these cases p ≠ q) as well as lateral connections (in this case p = q). Here, for the time being, it is still assumed that a layered organization of the "stack of layers" type is used.
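The classification of connection types by the layer indices p and q reduces to a simple rule, illustrated by this small helper (ours, not the book's):

```python
# Classify a connection from a neuron of layer p to a neuron of layer q.
def connection_type(p: int, q: int) -> str:
    if q > p:
        return "forward"
    if q < p:
        return "backward"
    return "lateral"                        # p == q: within the same layer

for p, q in [(1, 2), (3, 1), (2, 2)]:
    print(f"L^({p}) -> L^({q}): {connection_type(p, q)}")
```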
The variants of the ANN structural organization shown in Fig. 2.7 use the same "stack of layers" scheme for ordering the layers of the network. Here, at each time interval, the neurons of only one layer work; the remaining layers either have already worked or are waiting for their turn. This approach applies to both feedforward networks and recurrent networks.

The following variant allows us to abandon the "stack of layers" scheme and to replace it with more complex structures. As an example illustrating structures of this kind, Fig. 2.8 shows two variants of ANN structures with parallelism at the layer level.⁴

Consider the schemes shown in Fig. 2.7 and Fig. 2.8. Obviously, to activate a neuron from some pth layer, it must first get the values of all the inputs it "waits for" until that moment. To parallelize the work of neurons, we must meet the same condition: namely, all neurons that have a complete set of inputs at a given moment can be fired simultaneously.
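This firing condition amounts to a dataflow schedule: at every moment, all neurons whose input sets are complete form one parallel "wave". The sketch below (the connection graph and input values are our assumptions, and connection weights are omitted for brevity) makes this condition explicit for an arbitrary directed acyclic connection structure, with no "stack of layers" assumed.

```python
# Minimal sketch of dataflow firing: a neuron may be activated only when
# all of its inputs have arrived; all neurons that are ready at a given
# moment fire together, as one parallel wave.
import math

edges = {"u1": ["n1", "n2"], "u2": ["n2"],   # neuron -> its successors
         "n1": ["n3"], "n2": ["n3"], "n3": []}
pending = {n: sum(n in dst for dst in edges.values()) for n in edges}
values = {"u1": 0.5, "u2": -1.0}             # network inputs are available
ready = [n for n, k in pending.items() if k == 0]

while ready:
    wave = ready                             # these neurons fire simultaneously
    print("parallel wave:", wave)
    ready = []
    for n in wave:
        # Unweighted sum of predecessor outputs, then tanh (for brevity).
        values.setdefault(n, math.tanh(sum(values[m] for m in edges
                                           if n in edges[m])))
        for succ in edges[n]:
            pending[succ] -= 1
            if pending[succ] == 0:           # input set complete -> may fire
                ready.append(succ)
```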
⁴ If we abandon the "stack of layers" scheme, some layers in the ANN can work in parallel, i.e., simultaneously with each other, if there is such a technical possibility.