Page 84 - Foundations of Cognitive Psychology : Core Readings

The Appeal of Parallel Distributed Processing  83

               For example, if the connection from a particular A unit to a particular B unit
               has a positive sign, when the A unit is excited (activation greater than 0), it will
               excite the B unit. For this example, we’ll simply assume that the activation of
               each unit is set to the sum of the excitatory and inhibitory effects operating on
               it. This is one of the simplest possible cases.
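This summation rule can be written out directly. A minimal sketch, with hypothetical weight values and unit counts (the figure’s actual values are not reproduced here):

```python
import numpy as np

# Two A units, both excited, sending effects to one B unit.
a_activations = np.array([1.0, 1.0])

# Hypothetical connection strengths: one excitatory (positive),
# one inhibitory (negative).
weights_to_b = np.array([0.5, -0.25])

# The B unit's activation is simply the sum of the excitatory and
# inhibitory effects operating on it.
b_activation = np.sum(weights_to_b * a_activations)
print(b_activation)  # 0.25: the excitatory effect outweighs the inhibitory one
```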
                 Suppose, now, that we have created on the A units the pattern corresponding
               to the first visual pattern shown in figure 4.13, the rose. How should we ar-
               range the strengths of the interconnections between the A units and the B units
               to reproduce the pattern corresponding to the aroma of a rose? We simply need
               to arrange for each A unit to tend to excite each B unit which has a positive
               activation in the aroma pattern and to inhibit each B unit which has a negative
               activation in the aroma pattern. It turns out that this goal is achieved by
               setting the strength of the connection between a given A unit and a given B
               unit to a value proportional to the product of the activation of the two units.
               In figure 4.12, the weights on the connections were chosen to allow the A
               pattern illustrated there to produce the illustrated B pattern according to
               this principle. The actual strengths of the connections were set to ±.25,
               rather than ±1, so that the A pattern will produce the right magnitude, as
               well as the right sign, for the activations of the units in the B pattern.
               The same connections are reproduced in matrix form in figure 4.13.
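This construction amounts to an outer product of the two patterns. A sketch with hypothetical four-unit patterns standing in for the figure’s rose examples (the actual values in figure 4.12 may differ):

```python
import numpy as np

a = np.array([1.0, -1.0, -1.0, 1.0])   # "visual" pattern on the A units
b = np.array([-1.0, 1.0, 1.0, -1.0])   # "aroma" pattern on the B units

# Each weight is proportional to the product of the two units' activations.
# The factor 0.25 (1 divided by the number of A units) makes the output
# magnitudes come out right, not just the signs.
W = 0.25 * np.outer(b, a)

# A B unit's activation is the sum of its inputs: a matrix-vector product.
print(W @ a)  # reproduces the aroma pattern: [-1.  1.  1. -1.]
```

Because each A activation is ±1, the sum W @ a works out to 0.25 × 4 × b, recovering b exactly.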
                 Pattern associators like the one in figure 4.12 have a number of nice proper-
               ties. One is that they do not require a perfect copy of the input to produce the
               correct output, though the output will be weaker in this case. For example,
               suppose that the associator shown in figure 4.12 were presented with an A
               pattern of (1, −1, 0, 1). This is the A pattern shown in the figure, with the
               activation of one of its elements set to 0. The B pattern produced in response
               will have the activations of all of the B units in the right direction; however,
               they will be somewhat weaker than they would be, had the complete A pattern
               been shown. Similar effects are produced if an element of the pattern is
               distorted, or if the model is damaged, either by removing whole units or
               random sets of connections. Thus, the pattern-retrieval performance of the
               model degrades gracefully both under degraded input and under damage.
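This graceful degradation can be checked directly. In the sketch below the patterns are hypothetical stand-ins for those in figure 4.12, and one element of the input is zeroed as in the text’s example:

```python
import numpy as np

a = np.array([1.0, -1.0, -1.0, 1.0])   # complete "visual" pattern
b = np.array([-1.0, 1.0, 1.0, -1.0])   # "aroma" pattern to be retrieved
W = 0.25 * np.outer(b, a)              # weights set as described in the text

# Degrade the input: set one element's activation to 0.
a_partial = a.copy()
a_partial[2] = 0.0                      # a_partial is now (1, -1, 0, 1)

out = W @ a_partial
print(out)  # [-0.75  0.75  0.75 -0.75]: right signs, weaker magnitudes
```

With one element missing, the sum over inputs is 0.25 × 3 × b rather than 0.25 × 4 × b, so every B unit moves in the right direction at three-quarters strength.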

               How a Pattern Associator Learns  So far, we have seen how we as model builders
               can construct the right set of weights to allow one pattern to cause another.
               The interesting thing, though, is that we do not need to build these inter-
               connection strengths in by hand. Instead, the pattern associator can teach
               itself the right set of interconnections through experience processing the
               patterns in conjunction with each other.
                 A number of different rules for adjusting connection strengths have been
               proposed. One of the first—and definitely the best known—is due to D. O.
               Hebb (1949). Hebb’s actual proposal was not sufficiently quantitative to build
               into an explicit model. However, a number of different variants can trace their
               ancestry back to Hebb. Perhaps the simplest version is:
                    When unit A and unit B are simultaneously excited, increase the strength
                    of the connection between them.
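A minimal sketch of learning under this rule, using the signed product-of-activations variant implied by the weight-setting principle above (hypothetical patterns; units with activations of opposite sign weaken the connection between them):

```python
import numpy as np

n_a, n_b = 4, 4
W = np.zeros((n_b, n_a))               # start with no knowledge: all weights 0

a = np.array([1.0, -1.0, -1.0, 1.0])   # hypothetical A ("visual") pattern
b = np.array([-1.0, 1.0, 1.0, -1.0])   # hypothetical B ("aroma") pattern

# Hebbian update: change each connection in proportion to the product of
# the two units' activations. With a learning rate of 1/n_a, a single
# pairing of the two patterns stores the association exactly.
eta = 1.0 / n_a
W += eta * np.outer(b, a)

print(W @ a)  # presenting the visual pattern now evokes the aroma pattern
```

After this single learning trial the weights are identical to the hand-built ±.25 matrix described earlier; the associator has taught itself the same interconnections.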