Figure 5.51. Topological map of the Rocks dataset, labelled according to the winning neurons for each case: G-granite; D-diorite; S-slate; SE-serpentine; M-marble; L-limestone; B-breccia. Distance measures increase from a full black to a full white neuron. The case shown is granite case number 1.
Let us now consider the result of applying Kohonen mapping, with an output grid of 6×5 neurons, to the Rocks dataset. Training was performed with 100 epochs, starting with an initial neighbourhood radius of 3. Different clustering solutions are obtained depending on the initial weights and on small variations of the initial and final values of the learning rate. All solutions revealed, however, a topology similar to the one exemplified by the topological map of Figure 5.51, where the main categories of rocks are clearly and reasonably identified. In fact, the Kohonen map shows a clear distinction between Si-rich rocks in the upper clusters and Ca-rich rocks in the lower clusters (see also sections 3.4 and 3.5). The other clusters, e.g. ML, also have a logical interpretation, since they exhibit values of the most relevant features that lie midway between the values of the limiting clusters.
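For illustration only, and not the procedure actually used to obtain Figure 5.51, the following minimal NumPy sketch trains a Kohonen map with a 6×5 output grid for 100 epochs, starting from a neighbourhood radius of 3. The data matrix (one row per rock case, standardised features), the random weight initialisation, the Gaussian neighbourhood and the linear decay schedules are all assumptions, since the text does not specify them.

import numpy as np

def train_som(data, rows=6, cols=5, epochs=100, radius0=3.0, lr0=0.5, seed=0):
    # Minimal Kohonen SOM on a rectangular grid with a Gaussian neighbourhood.
    rng = np.random.default_rng(seed)
    d = data.shape[1]
    weights = rng.normal(size=(rows, cols, d))          # random initial weights
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                indexing="ij"), axis=-1)  # neuron grid coordinates
    n_iter = epochs * len(data)
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        # Winning neuron: minimum Euclidean distance between x and the weight vectors.
        dists = np.linalg.norm(weights - x, axis=2)
        winner = np.unravel_index(dists.argmin(), dists.shape)
        # Neighbourhood radius and learning rate shrink linearly during training (assumed schedule).
        frac = t / n_iter
        radius = max(radius0 * (1.0 - frac), 0.5)
        lr = max(lr0 * (1.0 - frac), 0.01)
        # Gaussian neighbourhood centred on the winner, measured on the output grid.
        g = np.exp(-np.sum((grid - np.array(winner)) ** 2, axis=2) / (2.0 * radius ** 2))
        weights += lr * g[..., None] * (x - weights)
    return weights

def topological_map(weights, data):
    # Winning neuron (grid coordinates) for each case, as in the topological map above.
    return [np.unravel_index(np.linalg.norm(weights - x, axis=2).argmin(),
                             weights.shape[:2]) for x in data]

Calling topological_map on the trained weights returns, for each case, the grid position of its winning neuron; labelling those positions by rock class yields a map of the kind shown in Figure 5.51.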
5.12 Hopfield Networks
The Hopfield network is a recurrent neural network with a feedback loop from
each output to the inputs of all other units, with no self-feedback, as shown in
Figure 5.22. The full analysis of the Hopfield net, with a d-dimensional input
vector x, requires dynamical considerations that can be found e.g. in Haykin
(1999). The main result of the dynamic analysis is the convergence of the net to
local minima of the following energy function:

E = -\frac{1}{2}\sum_{i=1}^{d}\sum_{\substack{j=1\\ j\neq i}}^{d} w_{ji}\, x_i x_j ,
where it is assumed that: