Page 39 - Artificial Intelligence in the Age of Neural Networks and Brain Computing

26     CHAPTER 1 Nature’s Learning Rule: The Hebbian-LMS Algorithm





                         11. THE POSTULATES AND THE HEBBIAN-LMS ALGORITHM
The Hebbian-LMS algorithm of Eqs. (1.6)–(1.8), diagrammed in Figs. 1.9 and 1.10, performs in complete accord with the biological postulates of synaptic plasticity when applied to both excitatory and inhibitory inputs.
                            An algorithm based on Hebb’s original rule would cause all the weights to
                         converge and saturate at their maximum values after many adaptive cycles. Weights
                         would only increase, never decrease. A neural network with all equal weights would
                         not be useful. Accordingly, Hebb’s rule is extended to apply to both excitatory and
                         inhibitory synapses and to the case where the presynaptic neuron fires and the post-
                         synaptic neuron does not fire. Synaptic scaling to maintain stability also needs to be
                         taken into account. The Hebbian-LMS algorithm does all this.
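The saturation problem with Hebb's original rule can be seen in a few lines. The sketch below is illustrative only: the learning rate, firing rates, and saturation limit are assumptions, not values from the chapter. Because firing rates are nonnegative, the Hebbian product term is never negative, so the weight can only grow until it pins at its maximum.

```python
# Sketch: why Hebb's original rule saturates. The rate mu, input x,
# and limit w_max are hypothetical values chosen for illustration.
mu = 0.1       # learning rate (assumed)
w_max = 1.0    # saturation limit (assumed)
w = 0.2        # initial synaptic weight (assumed)

for _ in range(100):
    x = 0.8                 # presynaptic firing rate (nonnegative)
    y = max(0.0, w * x)     # postsynaptic firing rate (nonnegative)
    # Hebb's rule: delta_w = mu * x * y, which is never negative,
    # so the weight only increases until it hits the limit.
    w = min(w_max, w + mu * x * y)

print(w)  # the weight has climbed to the saturation limit
```

With every weight driven to the same limit in this way, the network carries no useful information, which is why the extensions described above are needed.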




                         12. NATURE’S HEBBIAN-LMS ALGORITHM
                         The Hebbian-LMS algorithm performs in accord with the synaptic postulates. These
                         postulates indicate the direction of synaptic weight change, increase or decrease, but
not the rate of change. The Hebbian-LMS algorithm of Eq. (1.6), on the other hand, specifies not only the direction of weight change but also its rate. The question is: could nature be implementing something like Hebbian-LMS at the level of the individual neuron and its synapses, and across a full-blown neural network?
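A minimal sketch of a Hebbian-LMS-style update may make the "direction and rate" point concrete. The tanh sigmoid, gamma, mu, and input values below are illustrative assumptions; the chapter's Eq. (1.6) gives the exact form. Each weight change is the product of a shared error signal and that weight's own input, so one expression fixes both the direction and the rate of every change.

```python
import math

# Hypothetical Hebbian-LMS-style step (Eq. (1.6) in the chapter gives
# the exact form; sigmoid choice, gamma, and mu here are assumptions).
def hebbian_lms_step(w, x, mu=0.05, gamma=0.5):
    s = sum(wi * xi for wi, xi in zip(w, x))   # (SUM)
    err = math.tanh(s) - gamma * s             # error signal
    # delta_w_i = 2*mu*err*x_i: the same product sets both the
    # direction (sign of err) and the rate (magnitude of err * x_i).
    w = [wi + 2 * mu * err * xi for wi, xi in zip(w, x)]
    return w, s

w = [0.1, -0.2, 0.3]    # initial weights (assumed)
x = [1.0, 0.5, 0.8]     # presynaptic firing rates (assumed)
for _ in range(200):
    w, s = hebbian_lms_step(w, x)

print(round(s, 3))  # (SUM) settles near an equilibrium point
```

Run repeatedly on the same input, the update drives the (SUM) toward a value where the error vanishes, the stable equilibrium discussed next.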
                            The Hebbian-LMS algorithm changes the individual weights at a rate propor-
                         tional to the product of the input signal and the error signal. The Hebbian-LMS error
                         signal is roughly proportional to the (SUM) signal for a range of values about zero.
                         The error drops off and the rate of adaptation slows as (SUM) approaches either
                         equilibrium point. The direction of adaptation reverses as (SUM) goes beyond the
                         equilibrium point, creating homeostasis.
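The shape of this error curve can be checked numerically. The sketch below assumes a tanh sigmoid and gamma = 0.5 purely for illustration; the chapter's Eq. (1.6) defines the exact error.

```python
import math

# Assumed error form: err(s) = sgm(s) - gamma*s, with s = (SUM).
GAMMA = 0.5  # slope of the reference line (assumed)

def err(s):
    return math.tanh(s) - GAMMA * s

# Near zero, tanh(s) ~ s, so err(s) ~ (1 - GAMMA)*s: roughly
# proportional to (SUM), as stated above.
print(err(0.2) / 0.2)   # close to 1 - GAMMA

# Locate the positive equilibrium point s*, where tanh(s*) = GAMMA*s*,
# by bisection (err changes sign between 1 and 3 for this gamma):
lo, hi = 1.0, 3.0
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if err(mid) > 0 else (lo, mid)
s_star = (lo + hi) / 2

# The error is small near s* (adaptation slows) and changes sign
# beyond it (adaptation reverses): the homeostasis described above.
print(err(s_star - 0.5) > 0, err(s_star + 0.5) < 0)
```

The sign reversal past the equilibrium point is what pulls the (SUM) back, so the operating point is self-stabilizing.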
                            In the synaptic cleft, the amount of neurotransmitter present is proportional to the
                         firing rate of the presynaptic neuron, that is, the input signal to the synapse. By
                         ohmic conduction, the synaptic membrane voltage is proportional to the voltage
                         of the postsynaptic soma, the (SUM), which determines the error signal. The rate
                         of change in the number of neurotransmitter receptors is approximately proportional
                         to the product of the amount of neurotransmitter present and the voltage of the syn-
                         aptic membrane, negative or positive. This is all in agreement with the Hebbian-
                         LMS algorithm. It is instructive to compare the drawings of Fig. 1.17 with those
of Figs. 1.9 and 1.15. In a functional sense, they are very similar. Figs. 1.9 and 1.15 show the weight changes depending on the error signal, a function of the (SUM), and on the input signals to the individual weights. Fig. 1.17 indicates that
                         the (SUM) signal is available to the synaptic membrane by linear ohmic conduction
                         and the input signal is available in the synaptic cleft as the concentration of
                         neurotransmitter.