




developed extensively over many years by many authors, though the more formal strategy which I proposed in 2011 [25] could be useful in consolidating and strengthening that literature.

On the other hand, an ANN with just two layers (not even a hidden layer) and simultaneous recurrence would be able to learn even deeper relations than those of Fig. 8.11; in that design, "breadth" actually creates depth automatically, by learning. In fact, this is how the brain achieves depth: it uses a cerebral cortex which has only six layers, but which uses recurrent connections between those layers to learn structures which end up behaving like many, many functional layers of processing.
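To make this concrete, here is a minimal sketch, in Python/NumPy, of a network with simultaneous recurrence in the spirit described above (the function name, weights, and sizes are illustrative assumptions, not a design from this chapter): a single shallow map is iterated toward a fixed point, so effective depth comes from the number of iterations rather than from stacking distinct layers.

    import numpy as np

    def srn_forward(x, W_in, W_rec, n_iter=50, tol=1e-6):
        # Simultaneous recurrence: reuse one shallow map many times,
        # so iteration count, not layer count, sets the effective depth.
        y = np.zeros(W_rec.shape[0])               # start the state at rest
        for _ in range(n_iter):
            y_new = np.tanh(W_in @ x + W_rec @ y)  # same weights every pass
            if np.max(np.abs(y_new - y)) < tol:    # settled to a fixed point
                return y_new
            y = y_new
        return y

    # Illustrative sizes: 10 inputs, 20 recurrent units
    rng = np.random.default_rng(0)
    W_in = rng.normal(scale=0.3, size=(20, 10))
    W_rec = rng.normal(scale=0.3, size=(20, 20))
    out = srn_forward(rng.normal(size=10), W_in, W_rec)

In training, the same weight matrices would be adapted by a learning rule; the point of the sketch is only that one shallow, recurrent map can emulate the behavior of many stacked layers.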


3.3 ROADMAP FOR MOUSE-LEVEL COMPUTATIONAL INTELLIGENCE (MLCI)

At its start, the neural network field hoped to uncover simple, basic laws of learning, powerful enough that all the complexities of higher intelligence could be understood as emergent phenomena, as things which can be learned and need not be assumed a priori. That hope emanated in large part from the great vision of Donald Hebb [29].

By about 1990, however, I understood that there are fundamental reasons why learning and accounting for symmetries in a general way, together with some kind of principled chunking of time, really are crucial tools, even for a general-purpose learning system like a brain. I was inspired in part by a close consideration of work by Albus [30], but also by understanding how the mathematics really does allow much greater capability when such principles are fully exploited. This led me to a workable new framework for trying to build brain-like neural networks [31], including substantial new algorithms. There were certain gaps, particularly on how to go from the stage of temporal intelligence to the full creative mouse-level stage; these were filled in, at least conceptually, by 2009 [32]. That work led to the two connected roadmaps for cognitive prediction and cognitive optimization summarized in my main talk at WCCI2014 [5] and in Fig. 8.12.

3.4 EMERGING NEW HARDWARE TO ENHANCE CAPABILITY BY ORDERS OF MAGNITUDE

True neural network designs are inherently parallel, like the brain. They are designed to fully exploit any computer hardware which allows thousands or even billions of calculations to be done in parallel. Section 2 mentioned how GPUs played an important role in the deep learning revolution, but GPUs (like feedforward CoNNs) are just the first step on a path which can go much further.
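To see why such designs map so naturally onto parallel hardware, consider the following sketch in Python/NumPy (the sizes and names are illustrative assumptions): every output neuron in a layer depends only on the shared inputs, never on its siblings, so a whole layer update is one matrix-vector product that parallel hardware can evaluate all at once.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 4096                           # illustrative layer width
    W = rng.normal(size=(n, n))        # one layer's weights
    x = rng.normal(size=n)             # input activations

    # Sequential view: one neuron at a time.
    y_seq = np.empty(n)
    for i in range(n):
        y_seq[i] = np.tanh(W[i] @ x)   # depends only on x, not on y_seq

    # Parallel view: the identical computation as a single
    # matrix-vector product, which a GPU (or analog crossbar)
    # can spread across thousands of compute units at once.
    y_par = np.tanh(W @ x)

    assert np.allclose(y_seq, y_par)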
One major landmark in recent years was the discovery of memristors in actual electronics, as proposed many decades before by Leon Chua. Memristors are uniquely well suited to more efficient hardware implementation of neural networks, and are opening the door to many commercial opportunities [33].
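One way to see why memristors fit neural networks so well: a crossbar of memristive conductances computes a matrix-vector product in a single analog step, with Ohm's law performing the multiplications and Kirchhoff's current law performing the sums on each output wire. The sketch below (Python/NumPy, an idealized device model with hypothetical conductance values) simulates that mapping digitally.

    import numpy as np

    def crossbar_matvec(G, v):
        # Idealized memristor crossbar: voltages v drive the rows,
        # G[i, j] is the conductance at crosspoint (i, j). Each device
        # contributes current I_ij = G[i, j] * v[i] (Ohm's law), and the
        # currents on each column wire sum (Kirchhoff's current law),
        # so the array computes G.T @ v in one analog step.
        return G.T @ v   # digital stand-in for the analog physics

    # Hypothetical 3x2 crossbar storing a small weight matrix
    G = np.array([[1.0e-6, 2.0e-6],
                  [0.5e-6, 1.5e-6],
                  [2.0e-6, 0.1e-6]])   # conductances in siemens
    v = np.array([0.2, 0.1, 0.3])      # input voltages in volts
    print(crossbar_matvec(G, v))       # output column currents in amperes

Because the weights are stored at the same place where the multiplications happen, such hardware avoids the constant memory traffic that dominates the cost of the same computation on a conventional processor.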
Many years ago, GPUs were designed and marketed simply for playing games on a computer, or for other video and graphics applications. Dan Hammerstrom