Artificial Intelligence in the Age of Neural Networks and Brain Computing

CHAPTER 11 Deep Learning Approaches to Electrophysiological






                         1. INTRODUCTION

The processing of multivariate time-series for identification, classification, prediction, or feature extraction is a relevant research and application topic in many different science domains. The development of information technology and the spread of sensor networks underpinning the emergence of the Internet of Things (IoT) have strongly motivated a resurgence of interest in machine learning (ML) and its potential impact on multivariate time-series analysis. Deep learning (DL) techniques are a new trend in ML, well founded in classical neural network theory. They use many hidden layers of neurons to generate a lower-dimensional projection of the input space that corresponds, for example, to the signals generated by the network of sensors in monitoring applications. The successive hidden layers are able to build effective high-level abstractions of the raw data. State-of-the-art DL processors present architectural advantages and benefit from novel training paradigms that are synergistic with other approaches, such as compressive sensing and sparse representations. The high number of neurons and links is reminiscent of brain networks and allows the storage of the essential features of the underlying input-output mapping. In biomedical signal processing, many diagnostic systems produce multivariate time-series, and the automatic extraction of features without human intervention is of high interest for supporting clinical diagnoses and for highlighting latent aspects hidden from standard visual interpretation. For example, in medical imaging, small irregularities in tissues may be precursors of tumors and can be detected in the successive levels of abstraction of a DL network. The development of efficient DL systems can have a significant impact on public health, also given the possibility of incorporating real-time information into existing computational models. In this chapter, DL methods are briefly presented in the historical perspective of neural network (NN) studies. Electroencephalographic (EEG) multivariate data are considered because many application domains, spanning from brain-computer interfaces (BCI) to neuroscience, take advantage of this noninvasive and inexpensive technique as a basis for brain studies. Two different DL architectures are then proposed that successfully address difficult neurology applications.
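As a rough illustration of the dimensionality-reducing role of successive hidden layers described above, the following sketch passes a multivariate "signal" through two progressively narrower fully connected layers. The layer sizes, the random (untrained) weights, and the toy sine-wave input are all hypothetical choices for illustration only, not an architecture from this chapter:

```python
import math
import random

random.seed(0)

def layer(x, n_out):
    """One fully connected layer: weighted sums of the inputs followed by tanh."""
    # Hypothetical random weights; a real network would learn these from data.
    return [math.tanh(sum(random.uniform(-0.1, 0.1) * xi for xi in x))
            for _ in range(n_out)]

# A toy multivariate input of 64 raw sensor samples.
raw = [math.sin(0.1 * t) for t in range(64)]

h1 = layer(raw, 16)   # first hidden layer: 64 -> 16
h2 = layer(h1, 4)     # second hidden layer: 16 -> 4 (compact, abstract features)

print(len(raw), len(h1), len(h2))  # 64 16 4
```

Each layer maps its input to a smaller representation, so the final four values can be read as a low-dimensional projection of the original 64-sample input, in the spirit of the feature abstraction discussed above.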


                         2. THE NEURAL NETWORK APPROACH

Through ML, computers develop the ability to autonomously learn from and interact with their environment. By exploiting the available data, they learn optimal behaviors without the need for an explicit programming step. NNs are machines explicitly designed to possess this ability. NNs are collections of elementary processing nodes suitably arranged in various topological architectures. The elementary node of the network is referred to as a neuron and includes a linear part, which takes a weighted linear combination of its inputs, and a nonlinear part, in which a selected nonlinear function transforms that combination into the final output of the node. The inputs of the neuron