5 Neural Networks
1 - Storage phase
Let z_k, k = 1, ..., c, be a set of c prototype patterns, with binary-valued
features, to be memorized. Assign weights to the net as:

    w_ij = Σ_{k=1}^{c} z_ki z_kj  for j ≠ i;   w_ii = 0;   i, j = 1, ..., d.
Note that a symmetric weight matrix is used; therefore, only half of the d²
weights have to be computed.
The computation of each weight as a product of neuron inputs is known as the
Hebb rule (see Table 5-2). The idea behind this rule is to strengthen the
"synaptic" link between neurons that are in the same state.
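The storage phase can be sketched as follows. This is a minimal Python/NumPy illustration, not the Hopfield program of Appendix B; bipolar (+1/-1) feature values and the function name `store` are assumptions for the sketch:

```python
import numpy as np

def store(prototypes):
    """Hebb-rule storage: w_ij = sum over k of z_ki * z_kj, with w_ii = 0.

    prototypes: array of shape (c, d), bipolar (+1/-1) components assumed.
    Returns the symmetric d x d weight matrix.
    """
    Z = np.asarray(prototypes, dtype=float)
    W = Z.T @ Z               # outer-product (Hebbian) sum over the c prototypes
    np.fill_diagonal(W, 0.0)  # no self-connections: w_ii = 0
    return W

# Two 4-dimensional prototypes (c = 2, d = 4)
W = store([[1, -1,  1, -1],
           [1,  1, -1, -1]])
```

Because the matrix is symmetric with a zero diagonal, only the d(d-1)/2 weights above the diagonal actually need to be computed and stored.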
2 - Retrieval phase
2.1 - Let y be the unknown pattern. At time t = 0, initialize the inputs with
the unknown pattern:

    x_i(0) = y_i,   i = 1, ..., d.
2.2 - Iterate until convergence:

    x_j(t+1) = sgn( Σ_{i=1}^{d} w_ji x_i(t) ),   j = 1, ..., d.

The iteration is performed asynchronously, by randomly selecting a neuron
from the subset of neurons not yet selected in the current cycle. The updating
of all d neurons is called a cycle. If a stable state has been reached at the
end of a cycle, one proceeds to the next step; otherwise, a new cycle takes
place.
2.3 - The retrieved vector is the one that best matches the network output vector
(stable state).
Ideally, the Hopfield net will converge to one of the c vertices of the d-
dimensional hypercube corresponding to the prototypes. In practice, especially
for large c, extra stable states, known as spurious states, will appear. A
discussion of the spurious-states phenomenon can be found in Haykin (1999).
The possible presence of spurious states justifies the need, in step 2.3 above,
for selecting the prototype vector that best matches the stable state, counting
the number of differences in component values (the so-called Hamming distance).
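Steps 2.1 to 2.3 can be sketched as follows. Again, this is a hypothetical Python/NumPy illustration with bipolar (+1/-1) values, where sgn ties are resolved to +1; the names `retrieve`, `protos` and `best` are assumptions of the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hebb-rule storage for two 5-dimensional prototypes (c = 2, d = 5)
protos = np.array([[1, -1, 1, -1, 1],
                   [1,  1, 1,  1, 1]], dtype=float)
W = protos.T @ protos
np.fill_diagonal(W, 0.0)

def retrieve(W, y):
    """Asynchronous retrieval: one cycle updates each of the d neurons
    once, in random order; stop when a whole cycle changes nothing."""
    x = np.array(y, dtype=float)                 # step 2.1: x(0) = y
    while True:
        changed = False
        for j in rng.permutation(x.size):        # step 2.2, one cycle
            new = 1.0 if W[j] @ x >= 0 else -1.0  # sgn, ties -> +1
            if new != x[j]:
                x[j], changed = new, True
        if not changed:                          # stable state reached
            return x

y = np.array([1, -1, 1, -1, -1], dtype=float)    # prototype 1, one bit flipped
x = retrieve(W, y)

# Step 2.3: select the stored prototype nearest in Hamming distance
best = protos[np.argmin([(p != x).sum() for p in protos])]
```

Here the noisy input settles back onto the first prototype; with more prototypes (larger c), the stable state may instead be spurious, which is why the Hamming-distance match in step 2.3 is still applied to the result.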
We now exemplify the use of a Hopfield net as a CAM device, using the
Hopfield program (see Appendix B) with the two-dimensional binary prototype
images shown in Figure 5.52.
The Hopfield program allows the addition of random noise to the binary images,
thereby obtaining "unknown" patterns. It is also possible to create an arbitrary
pattern by directly editing the binary grid.
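The noise-addition step can be mimicked as follows. This sketch assumes the same bipolar (+1/-1) encoding; `add_noise` and its `flip_prob` parameter are illustrative assumptions, not necessarily the controls exposed by the Hopfield program:

```python
import numpy as np

def add_noise(pattern, flip_prob, rng=np.random.default_rng(1)):
    """Flip each component of a bipolar (+1/-1) pattern independently
    with probability flip_prob, yielding an "unknown" noisy pattern."""
    x = np.array(pattern)
    mask = rng.random(x.shape) < flip_prob   # which components to flip
    x[mask] = -x[mask]
    return x

clean = np.ones((8, 8), dtype=int)   # an 8x8 binary grid, all components +1
noisy = add_noise(clean, 0.1)        # roughly 10% of the components flipped
```

Feeding such a `noisy` grid to the retrieval phase is the CAM experiment described above: the net should settle back onto the clean prototype.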