the threshold, the postsynaptic neuron will fire at a rate that increases with the magnitude of the (SUM) above the threshold. The threshold voltage within the postsynaptic neuron is a “resting potential” close to −70 mV. Summing in the postsynaptic neuron is accomplished by Kirchhoff addition.
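To make the summation and threshold behavior concrete, the following is a minimal Python sketch. The linear rate-versus-excess relationship, the parameter names, and the default gain are illustrative assumptions rather than the chapter's model; the threshold default is set near the −70 mV resting potential mentioned above.

```python
import numpy as np

def postsynaptic_rate(inputs, weights, threshold=-0.070, gain=1.0):
    """Weighted summation and threshold firing (illustrative sketch).

    (SUM) is the weighted sum of the presynaptic input signals,
    formed by Kirchhoff addition of currents in the postsynaptic
    neuron. The firing rate is zero at or below the threshold and
    grows with the excess of (SUM) over the threshold; the linear
    `gain` is an assumption, and `threshold` is in volts.
    """
    total = float(np.dot(weights, inputs))     # (SUM)
    return gain * max(total - threshold, 0.0)  # no firing below threshold
```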
Learning and weight changing can occur only in the presence of neurotransmitter in the synaptic cleft. Thus, there will be no weight change if the presynaptic neuron is not firing, that is, if the input signal to the synapse is zero. If the presynaptic neuron is firing, there will be a weight change. The number of receptors will gradually increase (up to a limit) if the postsynaptic neuron is firing, that is, when the (SUM) of the postsynaptic neuron has a voltage above threshold. The synaptic membrane that the receptors are attached to will then have a voltage above threshold, since this membrane is part of the postsynaptic neuron. See Fig. 1.17. All this corresponds to Hebbian learning: neurons that fire together wire together. Extending Hebb’s rule, if the presynaptic neuron is firing and the postsynaptic neuron is not firing, the postsynaptic (SUM) will be negative and below the threshold, the membrane voltage will be negative and below the threshold, and the number of receptors will gradually decrease.
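A minimal sketch of this extended Hebbian rule, treating each weight as a bounded receptor count; the learning rate, the bound, and the boolean firing flags are simplifying assumptions, not the chapter's Hebbian-LMS algorithm itself.

```python
def hebbian_update(weight, presynaptic_firing, postsynaptic_firing,
                   rate=0.01, w_max=1.0):
    """Extended Hebbian rule from the text (rate and w_max are assumed).

    - Presynaptic neuron silent: no neurotransmitter in the cleft,
      so the weight cannot change at all.
    - Both neurons firing ((SUM) above threshold): the receptor
      count, modeled by the weight, grows gradually up to a limit.
    - Presynaptic firing, postsynaptic silent ((SUM) below
      threshold): the receptor count gradually decreases.
    """
    if not presynaptic_firing:
        return weight                     # no learning without neurotransmitter
    if postsynaptic_firing:
        return min(weight + rate, w_max)  # fire together, wire together
    return max(weight - rate, 0.0)        # weaken when postsynaptic is silent
```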
There is another mechanism that exerts further control over the synaptic weight values, called synaptic scaling [26–30]. This natural mechanism is implemented chemically for stability, to maintain the voltage of (SUM) within an approximate range about two set points. This is done by scaling up or down all of the
synapses supplying signal to a given neuron. There is a positive set point and a negative set point.
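The multiplicative character of synaptic scaling can be sketched as follows. The set point, tolerance band, and scaling step are hypothetical values, and a single set point is handled per call for simplicity, whereas the text describes two.

```python
import numpy as np

def synaptic_scaling(weights, inputs, set_point, band=0.005, step=0.01):
    """Multiplicative synaptic scaling (illustrative; values assumed).

    If the neuron's (SUM) drifts outside a band around the set point,
    all of its synaptic weights are multiplied by a common factor,
    nudging (SUM) back toward the set point while preserving the
    relative weight pattern learned across the synapses.
    """
    total = float(np.dot(weights, inputs))  # (SUM)
    if total > set_point + band:
        return weights * (1.0 - step)       # scale all synapses down
    if total < set_point - band:
        return weights * (1.0 + step)       # scale all synapses up
    return weights
```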
FIGURE 1.17
A neuron, dendrites, and a synapse.