5.12 Hopfield Networks
- The inputs xi (i = 1, ..., d) are continuous-valued variables applied at a given
initial instant;
- The weights are symmetric, wij = wji;
- The activation functions f(x) are sigmoidal functions, identical for all neurons,
with parameter a governing the sigmoidal slope (see formulas 5-10b and 5-10c);
- The Rj are non-negative quantities that represent dissipative factors, responsible
for energy that is not spent in the "stimulation" of the neurons.
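These assumptions correspond to the usual continuous (analog) Hopfield dynamics. As a rough illustration only, since formulas 5-110 and 5-10b/c are not reproduced here, the following Euler-integration sketch assumes a tanh-type sigmoid f(u) = tanh(a*u), unit capacitances and a single dissipative factor R common to all neurons:

import numpy as np

def simulate_continuous_hopfield(W, x0, a=5.0, R=1.0, dt=0.01, steps=2000):
    """Euler integration of continuous Hopfield-style dynamics (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    # internal states u chosen so that f(u) = tanh(a*u) reproduces the initial outputs
    u = np.arctanh(np.clip(x, -0.999, 0.999)) / a
    for _ in range(steps):
        du = -u / R + W @ x        # leak (dissipative factor R) plus weighted feedback
        u = u + dt * du
        x = np.tanh(a * u)         # sigmoidal activation with slope parameter a
    return x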
The second term of (5-110) vanishes when a → ∞, i.e., when the sigmoid
converges to the step function. The net then converges to local minima of:

E = - (1/2) Σi Σj wij xi xj                (5-111)
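For concreteness, a small helper that evaluates this energy for a given output vector might look like the following sketch (it assumes the quadratic form above, with no external-input term):

import numpy as np

def hopfield_energy(W, x):
    """Energy E(x) = -1/2 * sum_ij w_ij * x_i * x_j (no external-input term assumed)."""
    x = np.asarray(x, dtype=float)
    return -0.5 * x @ W @ x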
A particularly interesting version of the network in this case corresponds to
imposing binary-valued inputs, e.g. xi = ±1, the so-called discrete Hopfield net.
Suppose that, given a matrix W of weights wij, with wii = 0 (no self-feedback), the
output of neuron i, xi, is computed as:
xi = sgn( Σ(j=1..d) wij xj ),
where sgn is the sign function, identical to the step function 5-10a except that
sgn(0)=0. If the linear combination of the inputs yields a positive value, the output
is set to +1; if a negative value is obtained, the output is set to -1; for the zero
value the output is left unchanged. With this activation function the outputs of the
Hopfield neurons will also be binary valued vectors, corresponding to vertices of a
d-dimensional hypercube, known as states.
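As an illustration, a literal transcription of this update rule in Python (a sketch, not the book's code) could be:

import numpy as np

def update_neuron(W, x, i):
    """x_i <- sgn(sum_j w_ij x_j), with the output left unchanged when the sum is zero."""
    s = W[i] @ x                  # linear combination of the current outputs (w_ii = 0)
    if s > 0:
        x[i] = 1
    elif s < 0:
        x[i] = -1
    # s == 0: sgn(0) = 0, so x[i] keeps its previous value
    return x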
The updating can be done in a random serial way - asynchronous updating - i.e.,
each neuron is selected randomly for updating, or in a fully parallel way -
synchronous updating -, i.e., all the neurons are updated at the same time. When the
updating of the outputs is done in an asynchronous way and the matrix W is
symmetric, the net will converge to a stable state, with the outputs remaining
unchanged with further iterations. The demonstration of this important result can
be found e.g. in (Looney, 1997). The stable states of a Hopfield net correspond to
minima of the energy function (5-111). If the updating is done in a fully parallel
way, the network can either converge to a stable state or oscillate between two
states.
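A minimal sketch of the asynchronous (random serial) scheme, assuming a symmetric W with zero diagonal and iterating until no output changes over a full sweep, might read:

import numpy as np

def run_asynchronous(W, x0, max_sweeps=100, rng=None):
    """Randomly update one neuron at a time until a stable state is reached."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=int)
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(x)):   # asynchronous: neurons visited in random order
            s = W[i] @ x                    # weighted sum of the current outputs
            new = x[i] if s == 0 else (1 if s > 0 else -1)
            if new != x[i]:
                x[i] = new
                changed = True
        if not changed:                     # unchanged over a full sweep: stable state,
            break                           # i.e., a local minimum of (5-111)
    return x

Each accepted update can only lower (or keep) the energy (5-111) when W is symmetric with zero diagonal, which is why the loop must terminate in a stable state.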
The discrete Hopfield net has found interesting applications as a content-
addressed memory (CAM) device, which allows the retrieval of a previously
memorized pattern that most resembles a given input pattern. In this case, the
neurons play the role of memories that are able to recall a previously stored
prototype pattern. The algorithm for using the Hopfield net as a CAM device is as
follows: