3. Unsupervised learning of spatiotemporal spike sequences in a scalable 3D SNN
reservoir;
4. Ongoing learning and classification of data over time;
5. Dynamic parameter optimization;
6. Evaluating the time for predictive modeling;
7. Adaptation to new data, possibly in an online/real-time mode;
8. Model visualization and interpretation for a better understanding of the data and
the processes that generated it;
9. Implementation of the SNN model as both software and a neuromorphic hardware
system (if necessary).
A NeuCube development system that allows the above steps to be explored when building
an efficient application system is available from: http://www.kedri.aut.ac.nz/neucube/.
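To make the steps above more concrete, the following is a minimal sketch in Python of how such a pipeline might be put together: threshold-based spike encoding of temporal data, a small spiking reservoir driven by those spikes, and a state vector that a classifier can learn from. All names, network sizes, and thresholds are illustrative assumptions and do not reproduce the actual NeuCube implementation; in particular, the unsupervised STDP learning stage and the parameter optimization step are omitted for brevity.

import numpy as np

def encode_threshold(series, threshold=0.1):
    # Threshold-based (address-event) encoding: emit a spike (1) whenever
    # the signal increases by more than `threshold` between samples.
    return (np.diff(series, axis=0) > threshold).astype(float)

def run_reservoir(spikes, n_neurons=100, seed=0):
    # Drive a small leaky integrate-and-fire reservoir with the input spike
    # trains and return each neuron's binary firing state over time.
    rng = np.random.default_rng(seed)
    w_in = rng.normal(0.0, 1.0, (spikes.shape[1], n_neurons))   # input weights
    w_rec = rng.normal(0.0, 0.1, (n_neurons, n_neurons))        # recurrent weights
    v = np.zeros(n_neurons)                                     # membrane potentials
    fired = np.zeros(n_neurons)
    states = []
    for t in range(spikes.shape[0]):
        v = 0.9 * v + spikes[t] @ w_in + fired @ w_rec          # leaky integration
        fired = (v > 1.0).astype(float)                         # spike on threshold
        v = np.where(fired > 0, 0.0, v)                         # reset fired neurons
        states.append(fired.copy())
    return np.array(states)

# Example: encode one multichannel temporal sample (e.g., EEG-like data)
# and compute a mean-firing-rate state vector usable by any classifier.
signal = np.random.rand(200, 14)
states = run_reservoir(encode_threshold(signal))
state_vector = states.mean(axis=0)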
Several applications of NeuCube are described in Ref. [116]. In Ref. [117], a
method for using SNN for efficient data compression is introduced, with a wide range
of applications in telecommunications. In Ref. [118], a survey of applications using
the NeuCube SNN machine can be found. These include: temporal data compression
[117]; bio- and neuroinformatics applications [119]; personal assistance [120,121];
brain data analysis [122]; automated financial, banking, and trading systems [123];
and traffic streaming data modeling [124].
4.2 DEEP LEARNING AND DEEP KNOWLEDGE REPRESENTATION IN
NEUCUBE SNN MODELS: METHODS AND AI APPLICATIONS [6]
Deep learning neural networks (DNN) are ANNs that have several layers of neurons and
connections in their structure (rather than the three shown in Fig. 6.2B). A class of DNN is
the convolutional DNN, where neurons at the first layer learn features only within a
small subsection of the input vector data (e.g., a small square of pixels from an image).
These neurons are connected to the next layer, where features are combined, and so on until the
output classification layer, where the output classes are determined [125–130].
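As an illustration of the layering just described (and not of any specific model discussed in this chapter), the following is a minimal convolutional DNN sketch in Python using PyTorch; the layer sizes are arbitrary assumptions chosen for a 28 x 28 grayscale input.

import torch
import torch.nn as nn

class SmallConvNet(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        # First layer: each neuron sees only a small 3 x 3 patch of pixels.
        self.conv1 = nn.Conv2d(1, 8, kernel_size=3, padding=1)
        # Second layer: combines the features detected by the first layer.
        self.conv2 = nn.Conv2d(8, 16, kernel_size=3, padding=1)
        self.pool = nn.MaxPool2d(2)
        self.relu = nn.ReLU()
        # Output classification layer: determines the output classes.
        self.fc = nn.Linear(16 * 7 * 7, num_classes)

    def forward(self, x):
        x = self.pool(self.relu(self.conv1(x)))   # 28 x 28 -> 14 x 14
        x = self.pool(self.relu(self.conv2(x)))   # 14 x 14 -> 7 x 7
        return self.fc(x.flatten(start_dim=1))

model = SmallConvNet()
logits = model(torch.randn(4, 1, 28, 28))  # output shape: (4, 10)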
DNNs are excellent for vector- or frame-based data, but much less suitable for temporal
(or spatio-/spectrotemporal) data. The timing of asynchronous events is not learned in
the model. They are difficult to adapt to new data, and their structures are not flexible.
The human brain learns without fixing in advance the number of layers or the num-
ber of neurons in each layer. It forms long trajectories of neuronal connections
even when learning simple tasks, such as repeating a spoken word. More complex connectivity
patterns are learned when learning music or languages [131].
In Ref. [6], algorithms for unsupervised, supervised, and semisupervised learning
are presented for deep in-time learning of temporal data in a NeuCube SNN archi-
tecture and for deep knowledge representation.
The unsupervised deep learning algorithm includes the following steps:
1. Initialization of an SNN model:
a. An SNN model is prestructured to map structural and functional areas of the
modeled process, represented by the temporal or spatiotemporal data. The SNN