the art use of “ECoG” technology, a cousin of EEG technology which is far more precise in recording activity in the dendrites (the top level of what you see in Fig. 8.14) at a similarly high sampling rate (kilohertz). They are explained further in the final joint paper with Freeman [8].
For my own study [9], in collaboration with Yeshua Davis (who also did much of
the computer analysis for Freeman’s recent studies), I used existing data, the best
data I could find for this purpose. I used the data from a groundbreaking study by
Buzsáki’s group [38], which was perhaps the most important mainstream study
done by then on how general learning actually takes place at a systems level in
the brain. Their paper started from an intensive review of the serious bottom-up
work already done on learning in the brain, and asked whether that small-scale
work is actually reflected in what we see in a systems-level study of changes in
the whole brain when it learns new tasks. Buzsáki’s group took data from more
than 100 channels from microelectrodes deep in the cerebral cortex and hippocampus, at a rate of 20,000 measurements per second. This data estimated the actual
firing levels of the neurons, the outputs from the bodies of the neurons, including
outputs from the large pyramid cells you see in Fig. 8.9, from the bottom layer of
the cortex.
The graphics in our paper are not as impressive as Fig. 8.15, but the paper contains a number of hard quantitative measures directly testing the two key questions: Do we see a regular, precise, and persistent clock cycle time in the data from an individual rat over time? Do we see an alternation of the direction of flow of information (like the mirror-image impression you see in the top panels of Fig. 8.15), or does flow just keep going from the input side of the cortex to the output side, as older computational theories would suggest? The paper gives extensive details of many new measures, all of which agreed that the clock cycle time can be measured with high precision in this kind of data, and that the “mirror image” hypothesis fits the data with about 40% less error than more conventional theories of biological neural network dynamics.
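To make the first of these questions concrete, here is a minimal sketch in Python of one conventional way to estimate a dominant cycle period from multichannel recordings. This is not one of the measures used in our paper; the function name, the 1-100 Hz search band, and the synthetic test data are purely illustrative assumptions.

# Illustrative sketch only: estimate a dominant "clock cycle" period from
# multichannel data via the channel-averaged power spectrum. Not the
# measures used in the paper [9]; names and parameters are assumptions.
import numpy as np

def estimate_cycle_period(x, fs):
    """Return the dominant oscillation period in seconds.

    x  : array of shape (n_channels, n_samples)
    fs : sampling rate in Hz (e.g., 20,000 as in the Buzsaki recordings)
    """
    x = x - x.mean(axis=1, keepdims=True)          # remove per-channel offsets
    spectra = np.abs(np.fft.rfft(x, axis=1)) ** 2  # power spectrum per channel
    mean_spectrum = spectra.mean(axis=0)           # average across channels
    freqs = np.fft.rfftfreq(x.shape[1], d=1.0 / fs)
    band = (freqs > 1.0) & (freqs < 100.0)         # assumed 1-100 Hz search band
    peak_freq = freqs[band][np.argmax(mean_spectrum[band])]
    return 1.0 / peak_freq

# Synthetic check: 8 channels, 2 s at 20 kHz, sharing an 8 Hz rhythm plus noise.
rng = np.random.default_rng(0)
fs = 20_000
t = np.arange(0, 2.0, 1.0 / fs)
data = np.sin(2 * np.pi * 8.0 * t) + 0.5 * rng.standard_normal((8, t.size))
print(f"estimated period: {estimate_cycle_period(data, fs):.3f} s")  # about 0.125 s

A persistent clock cycle in the sense discussed above would show up as essentially the same estimated period when this kind of calculation is repeated across recording sessions from the same animal.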
Of course, there are lots of caveats here, and I would urge the reader to click on
the link to the paper itself for a more complete picture. Backpropagation does not predict that the backwards pass is a precise mirror image of the forward pass, but on the whole it does predict a reverse flow of information in the pass which calculates derivatives (locally, of course), as sketched after this paragraph. The results in our new paper are hopefully just a
beginning of a whole new direction, and not an end. If I were still at NSF, I would try
to organize a new forecasting competition to predict the Buzsáki data (perhaps even funding Buzsáki's group to collect more data, to allow fair blind testing). I would inform the competitors of several resources, including our paper, which may be as important to forecasting that data as seasonal effects are to predicting things
like monthly or weekly economic data. Who knows what a full mobilization
of the computational community could offer, in deepening our understanding of
what is really happening in the brain?
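As promised above, here is a toy sketch in Python of the reverse flow of information that ordinary backpropagation implies: the forward pass moves signals from the input side toward the output side, and the derivative pass then moves error information in the opposite direction. The two-layer network, variable names, and squared-error loss are arbitrary illustrative choices, not anything taken from our paper or from the Buzsáki data.

# Toy illustration of backpropagation's two passes in a two-layer network.
import numpy as np

rng = np.random.default_rng(1)
x, target = rng.standard_normal(3), 0.7          # one input vector, one target
W1, W2 = rng.standard_normal((4, 3)), rng.standard_normal(4)

# Forward pass: information flows from the input side toward the output side.
h_pre = W1 @ x            # hidden pre-activations
h = np.tanh(h_pre)        # hidden activations
y = W2 @ h                # scalar network output
error = y - target        # loss is 0.5 * error**2

# Backward pass: derivatives are computed in reverse order, so information
# flows from the output side back toward the input side.
dL_dy = error                                   # gradient at the output first...
dL_dW2 = dL_dy * h
dL_dh = dL_dy * W2                              # ...then at the hidden layer...
dL_dhpre = dL_dh * (1.0 - np.tanh(h_pre) ** 2)
dL_dW1 = np.outer(dL_dhpre, x)                  # ...and at the input weights last.

print("output:", y, "|grad W2|:", np.linalg.norm(dL_dW2),
      "|grad W1|:", np.linalg.norm(dL_dW1))

The point is only the ordering: the derivative at the output is available first and the derivative at the earliest weights last, which is the sense in which the derivative-calculating pass reverses the direction of information flow, whether or not it is a precise mirror image of the forward pass.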
But again, these two hypotheses are just a small entry point to a large range of
new opportunities discussed in detail in the paper.