Page 148 - Artificial Intelligence in the Age of Neural Networks and Brain Computing

References




                  [111] J. Binas, U. Rutishauser, G. Indiveri, M. Pfeiffer, Learning and stabilization of winner-
                       take-all dynamics through interacting excitatory and inhibitory plasticity, Frontiers in
                       Computational Neuroscience 8 (68) (2014).
                  [112] J. Sawada, et al., TrueNorth ecosystem for brain-inspired computing: scalable systems,
                       software, and applications, in: Proc. of the Int'l Conf. for High Performance
                       Computing, Networking, Storage and Analysis, USA, November 2016,
                       pp. 130–141.
                  [113] INI Labs: http://inilabs.com.
                  [114] B.V. Benjamin, et al., Neurogrid: a mixed-analog-digital multichip system for large-
                       scale neural simulations, Proceedings of the IEEE 102 (5) (2014) 699–716.
                  [115] S.B. Furber, D.R. Lester, L.A. Plana, J.D. Garside, E. Painkras, S. Temple,
                       A.D. Brown, Overview of the SpiNNaker system architecture, IEEE Transactions
                       on Computers 62 (12) (2013) 2454–2467.
                  [116] A. Mohemmed, S. Schliebs, S. Matsuda, N.K. Kasabov, Training spiking neural net-
                       works to associate spatio-temporal input–output spike patterns, Neurocomputing
                       107 (2013) 3–10.
                  [117] N. Sengupta, N.K. Kasabov, Spike-time encoding as a data compression technique for
                       pattern recognition of temporal data, Information Sciences 406–407 (2017) 133–145.
                  [118] N.K. Kasabov, N. Scott, E. Tu, S. Marks, N. Sengupta, E. Capecci, M. Othman,
                       M. Doborjeh, N. Murli, R. Hartono, J.I. Espinosa-Ramos, L. Zhou, F. Alvi,
                       G. Wang, D. Taylor, V.L. Feigin, S. Gulyaev, M. Mahmoudh, Z.-G. Hou, J. Yang,
                       Design methodology and selected applications of evolving spatio-temporal data
                       machines in the NeuCube neuromorphic framework, Neural Networks 78 (2016) 1–14.
                  [119] N. Kasabov (Ed.), The Springer Handbook of Bio- and Neuroinformatics, Springer,
                       2014, 1230 pp.
                  [120] N. Kasabov, From multilayer perceptrons and neuro-fuzzy systems to deep learning
                       machines: which method to use? – a survey, International Journal on Information
                       Technologies and Security 9 (20) (2017) 3–24.
                  [121] N. Kasabov, M. Doborjeh, Z. Doborjeh, Mapping, learning, visualisation, classifica-
                       tion and understanding of fMRI data in the NeuCube Spatio Temporal Data
                       Machine, IEEE Transactions on Neural Networks and Learning Systems 28 (4)
                       (2017) 887–899, https://doi.org/10.1109/TNNLS.2016.2612890.
                  [122] C. Ge, N. Kasabov, Z. Liu, J. Yang, A spiking neural network model for obstacle
                       avoidance in simulated prosthetic vision, Information Sciences 399 (2017) 30–42.
                  [123] R. Khansama, V. Ravi, N. Sengupta, A.R. Gollahalli, N. Kasabov, Stock market move-
                       ment prediction using evolving spiking neural networks, Evolving Systems (2018).
                  [124] E. Tu, N. Kasabov, J. Yang, Mapping temporal variables into the NeuCube spiking
                       neural network architecture for improved pattern recognition and predictive
                       modelling, IEEE Transactions on Neural Networks and Learning Systems 28 (6)
                       (2017) 1305–1317, https://doi.org/10.1109/TNNLS.2016.2536742.
                  [125] Y. LeCun, Y. Bengio, G. Hinton, Deep learning, Nature 521 (7553) (2015) 436–444.
                  [126] I. Goodfellow, Y. Bengio, A. Courville, Deep Learning, MIT Press, 2016.
                  [127] J. Schmidhuber, Deep learning in neural networks: an overview, Neural Networks 61
                       (2015) 85–117.
                  [128] Y. Bengio, Learning deep architectures for AI, Foundations and Trends® in Machine
                       Learning 2 (1) (2009) 1–127. https://doi.org/10.1561/2200000006.