
3. Neural Networks Enter Mainstream Science
These developments led to the alliance between engineering and neuroscience that spawned INNS, the IJCNN conferences, and a host of smaller neural network conferences across the world. They also led to a rediscovery of the work of researchers who had started in the "dark ages," such as Grossberg, Anderson, and Kohonen.
Yet the popularization of neural networks within academia has often been accompanied by a restricted view of their nature and explanatory capabilities. Because of the influence of the PDP approach developed by Rumelhart and McClelland [4], many neuroscientists, psychologists, and even philosophers (e.g., Bechtel [53]) write as if all or most neural networks require extensive training to achieve a predetermined goal. On this view, the components between the sensory and effector ends of the network start out with little or no intrinsic structure, and "internal representations" simply emerge from the training process. Also, many authors seem to believe that the three-layer backpropagation network, with input, hidden, and output layers, is the standard "neural network."
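To make the object of this critique concrete, the following is a minimal sketch of that "standard" three-layer backpropagation network in NumPy. The layer sizes, learning rate, and sigmoid activation are illustrative assumptions, not a reference implementation from any of the works cited here.

import numpy as np

# Minimal three-layer "standard" backpropagation network: input layer,
# one hidden layer, output layer. Sizes and learning rate are illustrative.
rng = np.random.default_rng(0)
n_in, n_hid, n_out, lr = 4, 8, 2, 0.1

W1 = rng.normal(0, 0.5, (n_in, n_hid))   # input -> hidden weights
W2 = rng.normal(0, 0.5, (n_hid, n_out))  # hidden -> output weights

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(x, target):
    """One gradient-descent step on squared error for a single pattern."""
    global W1, W2
    h = sigmoid(x @ W1)          # hidden-layer activations
    y = sigmoid(h @ W2)          # output-layer activations
    err = y - target
    # Backpropagate the error signal through the two weight layers.
    delta2 = err * y * (1 - y)
    delta1 = (delta2 @ W2.T) * h * (1 - h)
    W2 -= lr * np.outer(h, delta2)
    W1 -= lr * np.outer(x, delta1)
    return float(0.5 * np.sum(err ** 2))

x, t = rng.random(n_in), np.array([1.0, 0.0])
for _ in range(200):
    loss = train_step(x, t)
print(round(loss, 4))  # loss shrinks as the hidden "representations" form

Note that the hidden units begin with random weights and acquire structure only through training toward the externally supplied target, which is precisely the assumption the next paragraph argues is too narrow.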
By contrast, an overview of the neural network field [1] shows the great diversity of neural network architectures, of which PDP networks are just a fraction. Also, while the PDP approach emphasizes development over genetics, both innate and learned processes are of evolutionary importance to organisms, and the richest neural networks incorporate and integrate both types of processes.
Most importantly, different neural and psychological functions often require different architectures, even if those functions are integrated into a larger network. This means that any "one size fits all" approach to neural network modeling is bound to have limitations. For example, Gaudiano and Grossberg [54] discuss the different requirements for sensory pattern processing and for motor control. Motor control involves comparing the present position with a target position and inhibiting movement if the present and target positions match. In sensory pattern processing, by contrast, the pattern recognizer is excited if the present and expected patterns match. These different requirements suggest different architectures for the two subprocesses, and the two architectures can be concatenated into a larger system architecture that generates motion in response to appropriate sensory events.
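The contrast can be sketched in code. The Python fragment below assumes a cosine-similarity match signal and a simple difference-based motor command; these are illustrative stand-ins for, not reproductions of, the equations in Gaudiano and Grossberg [54]. The point is only the opposite sign of the response to a match: excitation for the sensory recognizer, inhibition (zero output) for the motor controller.

import numpy as np

def match(a, b):
    """Cosine similarity clipped to [0, 1] as a degree-of-match signal."""
    sim = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
    return max(0.0, sim)

def sensory_response(present, expected, threshold=0.9):
    """Recognition output grows with the match: excitation on match."""
    m = match(present, expected)
    return m if m >= threshold else 0.0

def motor_command(present, target, gain=1.0):
    """Movement is driven by the difference vector, so a perfect match
    (present == target) yields zero output: inhibition on match."""
    return gain * (target - present)

pattern = np.array([1.0, 0.2, 0.0])
print(sensory_response(pattern, pattern))     # strong recognition: 1.0
print(motor_command(pattern, pattern))        # no movement: [0. 0. 0.]
print(motor_command(pattern, pattern + 0.5))  # nonzero drive toward target

Concatenating the two, as the text suggests, would mean letting a sensory_response above threshold trigger the selection of a target that motor_command then drives the system toward.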
Buoyed by early successes in modeling perception and motor control, the neural network field has expanded in the last 30 years into processes several synapses away from the sensory and motor ends, such as conditioning, attention, cognitive control, and executive function. Some models of those processes have built on and refined earlier, more abstract models of simpler processes. Others have started from data about the complex processes and used simple equations for neural interactions to simulate those data without reference to simpler processes. Because of the unity of the brain and mind, it is my contention that the models that build on models of simpler processes are more likely to have staying power.