Page 26 - Artificial Intelligence in the Age of Neural Networks and Brain Computing
                  6. BOOTSTRAP LEARNING WITH A MORE “BIOLOGICALLY
                     CORRECT” SIGMOIDAL NEURON

                  The inputs to the weights of the sigmoidal neuron in Fig. 1.7 could be positive or
                  negative, the weights could be positive or negative, and the outputs could be positive
                  or negative. As a biological model, this would not be satisfactory. In the biological
                  world, an input signal coming from a presynaptic neuron must have positive values
                  (presynaptic neuron firing at a given rate) or have a value of zero (presynaptic neuron
                  not firing). Some presynaptic neurons and their associated synapses are excitatory,
                  some inhibitory. Excitatory and inhibitory synapses have different neurotransmitter
                  chemistries. The inhibitory inputs to the postsynaptic neuron are subtracted from the
                  excitatory inputs to form (SUM) in the cell body of the postsynaptic neuron. Biolog-
                  ical weights or synapses behave like variable attenuators and can only have positive
                  weight values. The output of the postsynaptic neuron can only be zero (neuron not
                  firing) or positive (neuron firing) corresponding to (SUM) being negative or positive.
                  The postsynaptic neuron and its synapses diagrammed in Fig. 1.9 have the indicated
                  properties and are capable of learning exactly like the neuron and synapses in
Fig. 1.7. The LMS algorithm of Eq. (1.1) operates as usual with positive excitatory
inputs and negative inhibitory inputs; for LMS, these are equivalent to the positive
and negative components of the input pattern vector.
LMS will keep the weight values within their natural positive range even if
adaptation pushes a weight value to one of its limits. Subsequent adaptation could
bring the weight value away from the limit and into its more normal






[Figure 1.9 diagram: all-positive inputs pass through all-positive weights; the
excitatory branches add to (SUM) and the inhibitory branches subtract from it.
(SUM) drives a HALF SIGMOID to produce the OUTPUT, and the error is formed as the
SIGMOID of (SUM) minus γ·(SUM).]
                  FIGURE 1.9
                  A postsynaptic neuron with excitatory and inhibitory inputs and all positive weights trained
                  with Hebbian-LMS learning. All outputs are positive. The (SUM) could be positive or
                  negative.
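The behavior of the neuron of Fig. 1.9 can be sketched in a few lines of code. The sketch below assumes the Hebbian-LMS error of the figure, SGM(SUM) − γ·(SUM), and the LMS update of Eq. (1.1); the synapse counts, step size μ, gain γ, and the upper weight limit are illustrative assumptions, not values from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative constants (assumed, not from the text).
n_exc, n_inh = 6, 4            # excitatory / inhibitory synapse counts
mu, gamma = 0.01, 0.3          # LMS step size and sigmoid feedback gain
w_max = 2.0                    # upper limit of the "variable attenuator" weights

x = rng.uniform(0.0, 1.0, n_exc + n_inh)   # all-positive firing rates
w = rng.uniform(0.0, 1.0, n_exc + n_inh)   # all-positive synaptic weights

# The excitatory/inhibitory sign lives in the wiring, not in the weights.
sign = np.concatenate([np.ones(n_exc), -np.ones(n_inh)])

errors = []
for _ in range(200):
    s = np.dot(w * sign, x)                # (SUM) formed in the cell body
    e = np.tanh(s) - gamma * s             # error: SGM(SUM) - gamma*(SUM)
    errors.append(e)
    # LMS update of Eq. (1.1): signed inputs play the role of the positive
    # and negative input-vector components; clipping keeps weights positive.
    w = np.clip(w + 2.0 * mu * e * sign * x, 0.0, w_max)

# Half sigmoid: output is zero when (SUM) is negative (neuron not firing).
output = max(np.tanh(s), 0.0)
```

Training drives (SUM) toward an equilibrium where SGM(SUM) = γ·(SUM), while the weights remain in their natural positive range throughout.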