
have strong links to the symbol for Stock two word lexicons later (independent of what word follows it). However, if the parse has activated the phrase New Orleans, no such erroneous knowledge will be invoked. The other advantage of using the parsed representation is that the knowledge links tend to have a longer range of utility, since they represent originally extended conceptual collections that have been unitized.
If, as often occurs, we need to restore the words of a sentence to the word lexicons after a parse has occurred (and the involved word lexicons have been automatically shut off by the resulting action commands), all we need to do is activate all the relevant downward knowledge bases and simultaneously carry out confabulation on all of the word regions. This restores the word-level representation. If it is not clear why this will work, it may be useful to consider the details of Figure 3.2 and the above description. The fact that “canned” thought processes (issued action commands), triggered by particular confabulation outcomes, can actually do the above information processing, generally without mistakes, is rather impressive.
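This restoration step can be made concrete with a small sketch. The Python fragment below is a minimal illustration, not the chapter's implementation: the phrase symbols, the downward_links table, and its link strengths are all hypothetical, and confabulation is approximated as a winner-take-all over the summed link excitations arriving in each word lexicon.

from collections import defaultdict

# Hypothetical "downward" knowledge: each active phrase symbol excites the
# word symbols it unitizes, at the word-lexicon positions it spans.
# downward_links[phrase] -> list of (word_position, word_symbol, link_strength)
downward_links = {
    "new_orleans": [(0, "new", 1.0), (1, "orleans", 1.0)],
    "stock_exchange": [(2, "stock", 1.0), (3, "exchange", 1.0)],
}

def restore_words(active_phrase_symbols):
    """Confabulate on all word lexicons at once, driven only by the
    downward knowledge links from the currently active phrase symbols."""
    excitation = defaultdict(lambda: defaultdict(float))
    for phrase in active_phrase_symbols:
        for position, word, strength in downward_links[phrase]:
            excitation[position][word] += strength
    # Winner-take-all confabulation within each word lexicon.
    return {pos: max(cands, key=cands.get) for pos, cands in excitation.items()}

print(restore_words(["new_orleans", "stock_exchange"]))
# -> {0: 'new', 1: 'orleans', 2: 'stock', 3: 'exchange'}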

3.3.3 Consensus Building

For sentence continuation (adding more than just one word), we must introduce yet another new concept: consensus building. Consensus building is simply a set of brief, but not instantaneous, temporally overlapping, mutually interacting confabulation operations that are conducted in such a way that the outcomes of each of the involved operations are consistent with one another in terms of the knowledge possessed by the system. Consensus building is an example of constraint satisfaction, a classic topic introduced into neurocomputing in the early 1980s by studies of Boltzmann machines (Ackley et al., 1985).
For example, consider the problem of adding two more sensible words onto the following sentence-starting word string (or simply starter): The hyperactive puppy. One approach would be simply to do a W simultaneously on the fourth and fifth word lexicons. This might yield: The hyperactive puppy was water, because was is the strongest fourth word choice and, based upon the first three words alone, water (as in drank water) is the strongest fifth word choice. The final result does not make sense, as the sketch below illustrates.
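The following toy Python fragment shows why independent winner-take-all fails here. The excitation values are invented for illustration, and W is approximated as an argmax over each lexicon's excitations from the starter alone.

# Excitation each candidate receives from the starter "The hyperactive puppy"
# alone (hypothetical values). Each lexicon is maximized independently.
fourth_choices = {"was": 0.9, "drank": 0.7}
fifth_choices = {"water": 0.8, "everywhere": 0.5}

w4 = max(fourth_choices, key=fourth_choices.get)  # "was"   (strongest 4th word)
w5 = max(fifth_choices, key=fifth_choices.get)    # "water" (strongest 5th word)
print("The hyperactive puppy", w4, w5)
# -> "The hyperactive puppy was water": each choice is locally best, but the
#    pair is jointly nonsensical because neither W saw the other's outcome.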
But what if the given three-word starter were first used to create expectations on both the fourth and fifth lexicons (e.g., using C3Fs)? These would contain all the words consistent with this set of assumed facts. Then, what if W's on word lexicons four and five were carried out simultaneously, with the requirement that the only symbols considered on lexicon five are those which receive inputs from lexicon four? Further, the knowledge links from word lexicons four and five back to phrase lexicons having unresolved expectations, and those in the opposite direction, are used as well to incrementally enhance the excitation of symbols that are consistent. Expectation symbols which do not receive incremental enhancement have their excitation levels incrementally decreased (to keep the total excitation of each expectation constant at 1.0). This multiple, mutually interacting confabulation process is called consensus building. The details of consensus building, which would take us far beyond the introductory scope of this chapter, are not discussed here, although a toy sketch of its flavor follows.
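In the sketch below, everything is assumed for illustration: the two expectation dictionaries stand in for C3F outputs, link_4to5 is a hypothetical knowledge table between the two lexicons, and the update rule simply enhances symbols that receive support from the other lexicon, decrements unsupported ones via renormalization to a total excitation of 1.0, and then reads out the winners. This is only an approximation of the flavor of consensus building, not the procedure the chapter leaves undescribed.

def consensus(exp4, exp5, link_4to5, steps=20, rate=0.1):
    """Mutually interacting confabulation on word lexicons four and five.
    exp4/exp5: expectations (symbol -> excitation, each summing to 1.0)."""
    for _ in range(steps):
        # Support each symbol receives from the other lexicon's excitations.
        sup4 = {a: sum(link_4to5.get(a, {}).get(b, 0.0) * exp5[b] for b in exp5)
                for a in exp4}
        sup5 = {b: sum(link_4to5.get(a, {}).get(b, 0.0) * exp4[a] for a in exp4)
                for b in exp5}
        # Supported symbols are incrementally enhanced; renormalizing then
        # decreases unsupported ones, keeping each total excitation at 1.0.
        exp4 = {a: exp4[a] * (1.0 + rate * sup4[a]) for a in exp4}
        exp5 = {b: exp5[b] * (1.0 + rate * sup5[b]) for b in exp5}
        t4, t5 = sum(exp4.values()), sum(exp5.values())
        exp4 = {a: v / t4 for a, v in exp4.items()}
        exp5 = {b: v / t5 for b, v in exp5.items()}
    return max(exp4, key=exp4.get), max(exp5, key=exp5.get)

exp4 = {"was": 0.6, "drank": 0.4}          # hypothetical expectations
exp5 = {"water": 0.5, "everywhere": 0.5}
link_4to5 = {"drank": {"water": 1.0}}      # only "drank water" is consistent
print(consensus(exp4, exp5, link_4to5))    # -> ('drank', 'water')

Although was starts as the strongest fourth word, it receives no support from any fifth-word candidate in this toy knowledge table, so its excitation decays while the mutually consistent pair drank water wins on both lexicons simultaneously.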
Applying consensus building yields sensible continuations of starters. For example, the starter I was very continues to: I was very pleased with my team's, and the starter There was little continues to: There was little disagreement about what importance. Thanks to my colleague Robert W. Means for these examples.

3.3.4 Multi-Sentence Language Units

The ability to exploit long-range context using accumulated knowledge is one of the hallmarks of human cognition (and one of the glaring missing capabilities in today's computer and AI systems). This section presents a simple example of how confabulation architectures can use long-range