The above description of synaptic plasticity is highly simplified; the reality, and the literature on the subject, is far more complicated. The description is intended only as a high-level picture of what occurs with adaptation and learning in neural networks.

The question remains: when nature performs learning in neural networks, does it use an algorithm similar to Hebbian-LMS? No one knows for sure, but "if it walks like a duck, quacks like a duck, and looks like a duck, maybe it is a duck."



                  13. CONCLUSION
The Hebbian learning rule of 1949, "fire together, wire together," has stood the test of time in the field of neurobiology. The LMS learning rule of 1959 has also stood the test of time in the fields of signal processing and telecommunications. This chapter has reviewed several forms of a Hebbian-LMS algorithm that implements Hebbian learning by means of the LMS algorithm. Hebbian-LMS extends the Hebbian rule to cover inhibitory as well as excitatory neuronal inputs, making Hebbian learning more "biologically correct." At the same time, Hebbian-LMS is an unsupervised clustering algorithm that is very useful for automatic pattern classification.
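For readers who want a concrete picture, here is a minimal sketch of one form of the Hebbian-LMS update for a single neuron. It assumes the sigmoid-minus-ramp error signal discussed in the body of the chapter; the tanh sigmoid and the particular values of the learning rate mu and the slope parameter gamma are illustrative assumptions, not prescriptions.

    import numpy as np

    def sgm(s):
        # Symmetric sigmoid; tanh is an illustrative choice.
        return np.tanh(s)

    def hebbian_lms_step(w, x, mu=0.01, gamma=0.5):
        # One unsupervised Hebbian-LMS update for a single neuron.
        # The error is the sigmoidal output minus a scaled-down copy of
        # the linear sum; driving it toward zero pushes each input
        # pattern toward one of the sigmoid's saturation regions, which
        # is what produces clustering.
        s = np.dot(x, w)               # weighted sum of the inputs
        err = sgm(s) - gamma * s       # "sigmoid minus ramp" error
        return w + 2 * mu * err * x    # LMS-style weight update

Repeating this step over a stream of input patterns, with no desired response supplied, is what makes the rule unsupervised: each neuron's training signal is derived from its own output.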
Given the available parts of nature's neural networks, namely neurons, synapses, and their interconnections and wiring configurations, what kinds of learning algorithms could be implemented? There may not be a unique answer to this question, but it is possible that nature performs Hebbian-LMS in at least some parts of the brain.
                     The basic building blocks of modern-day computers are flip-flops and logic
                  gates. By analogy, is it possible that clustering is a basic building block of living
                  neural systems?



                  APPENDIX: TRAINABLE NEURAL NETWORK INCORPORATING
                  HEBBIAN-LMS LEARNING

Trainable neural networks can be constructed with several Hebbian-LMS layers followed by a supervised output layer. An example of such a network is shown in Fig. 1.18.
The unsupervised hidden layers serve as a preprocessor for the output layer: clustering in the hidden layers aids the output layer in making the final classification decisions. The Hebbian-LMS neurons train independently but are in communication with their neighbors, like cells in living tissue; each works on its own, yet all work toward the common goal of pattern clustering.
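A hypothetical sketch of such a network follows, assuming tanh sigmoids and small random initial weights; the class name, layer sizes, and learning parameters are invented for illustration. The hidden layers adapt with the unsupervised Hebbian-LMS rule sketched above, while the output layer adapts with ordinary supervised LMS against a desired response.

    import numpy as np

    rng = np.random.default_rng(0)

    class HebbianLMSNet:
        # Unsupervised Hebbian-LMS hidden layers feeding a supervised
        # LMS output layer, in the spirit of Fig. 1.18.
        def __init__(self, layer_sizes, n_outputs, mu=0.01, gamma=0.5):
            self.mu, self.gamma = mu, gamma
            self.hidden = [rng.normal(scale=0.1, size=(m, n))
                           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
            self.out = rng.normal(scale=0.1, size=(layer_sizes[-1], n_outputs))

        def forward(self, x):
            # Return the activations entering each layer (input first).
            acts = [np.asarray(x, dtype=float)]
            for W in self.hidden:
                acts.append(np.tanh(acts[-1] @ W))
            return acts

        def train_step(self, x, target):
            acts = self.forward(x)
            # Each hidden layer trains on its own local error signal;
            # no desired response reaches the hidden layers.
            for W, a in zip(self.hidden, acts):
                s = a @ W
                err = np.tanh(s) - self.gamma * s
                W += 2 * self.mu * np.outer(a, err)
            # Only the output layer trains with supervised LMS.
            h = acts[-1]
            err_out = np.asarray(target, dtype=float) - h @ self.out
            self.out += 2 * self.mu * np.outer(h, err_out)

As a usage example, a network with two hidden layers of 50 neurons on 50-dimensional inputs could be built as HebbianLMSNet([50, 50, 50], n_outputs=10) and trained one pattern at a time with train_step(x, target).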
A four-layer network like that of Fig. 1.18 has been simulated. The neurons had symmetric sigmoids, and the input signals and weights could be either positive or negative, as is typical of engineering applications. Adaptation is demonstrated by the learning curves of Fig. 1.19. The hidden layers of the network were