




[Figure 3.3: two scatter-plot panels of classes A and B, labeled "Linear classifier" (A) and "Dendrite Net" (B).]


FIGURE 3.3
Artificial neural networks (ANNs) need multiple layers, known as "deep learning" [7,9]. (A) The left panel shows that a single layer of an ANN can simply be a linear classifier, while (B) the right panel shows the dendrite net giving rise to the linear classifier equation after the sigmoid threshold.
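The caption's point that a single layer amounts to a linear classifier can be made concrete with a short sketch (illustrative only, not the authors' code; the weights, bias, and sample points are arbitrary choices): a dendritic weighted sum is passed through a sigmoid threshold, and the resulting decision boundary $\mathbf{w}\cdot\mathbf{x} + b = 0$ is a straight line.

import numpy as np

def sigmoid(net):
    # Sigmoid threshold applied to the net dendritic input
    return 1.0 / (1.0 + np.exp(-net))

def single_layer_classify(x, w, b):
    # One layer: weighted dendritic sum, then sigmoid; boundary is w.x + b = 0
    return 'A' if sigmoid(np.dot(w, x) + b) >= 0.5 else 'B'

# Hand-chosen linear boundary x1 + x2 = 1 separating two clusters
w, b = np.array([1.0, 1.0]), -1.0
print(single_layer_classify(np.array([0.2, 0.1]), w, b))  # below the line -> 'B'
print(single_layer_classify(np.array([0.9, 0.8]), w, b))  # above the line -> 'A'

No nonlinearly separable arrangement of the A and B clusters (e.g., an XOR-like layout) can be split this way, which is why multiple layers ("deep learning") are needed.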

cells' hemoglobin, squeezing through capillaries that supply the glucose and oxygen. Also, the thermodynamic equilibrium keeps the chemical diffusion rate constant for all generations to accumulate the experience. This is the basis of human natural intelligence (NI), which rests on two necessary conditions, as follows:

2.1 MAXWELL–BOLTZMANN HOMEOSTASIS [8]


The product of the Boltzmann constant $k_B$ and the Kelvin temperature $T$ sets the thermal reservoir energy (at 300 K, i.e., $27\,^{\circ}\mathrm{C}$, $k_B T = 1/40$ eV); thus the body temperature $T_o = 37\,^{\circ}\mathrm{C}$ corresponds to a slightly higher thermal reservoir energy.
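For reference, a short worked check of the two thermal energies just quoted (numerical values computed here from $k_B \approx 8.617\times10^{-5}$ eV/K, not taken from the source):

$$k_B T\big|_{300\,\mathrm{K}} \approx (8.617\times10^{-5}\ \mathrm{eV/K})(300\ \mathrm{K}) \approx 0.0259\ \mathrm{eV} \approx \tfrac{1}{40}\ \mathrm{eV},$$
$$k_B T_o\big|_{310\,\mathrm{K}} \approx (8.617\times10^{-5}\ \mathrm{eV/K})(310\ \mathrm{K}) \approx 0.0267\ \mathrm{eV},$$

i.e., the $37\,^{\circ}\mathrm{C}$ blood reservoir is only about 3% more energetic than room temperature.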
The Maxwell–Boltzmann probability $W$ is derived from the third law of thermodynamics (Nernst) at nonzero temperature, which ensures the incessant collision mixing that homogenizes the degree of uniformity measured by the total entropy $S\!\uparrow\; = S_{env} + S(x_o)\!\uparrow\; = k_B \log(W\!\uparrow)$:

$$W(x_o) = \exp\!\big(-H(x_o)/k_B T_o\big); \qquad (3.1)$$
here $H(x_o)$ is the Helmholtz free energy derived within the head $x_o$, defined from the internal energy $E(x_o)$ in contact with a blood environment at the temperature $T_o$. $H(x_o)$ is $E(x_o)$ minus the thermal entropy energy $T_o S(x_o)$, and the net is the free-to-do-work energy, kept at a minimum to be stable:

$$\min:\; H(x_o)\!\downarrow\; = E(x_o) - T_o S(x_o)\!\uparrow \qquad (3.2)$$
Use is made of the isothermal equilibrium of the brain in the warm blood reservoir at the homeostasis temperature $T_o$. It is further combined with the second law for the isothermal heat exchange, $\Delta Q_{env} = T_o \Delta S_{env}$, and with the conservation of energy for the brain internal energy, $\Delta E_{brain} + \Delta Q_{env} = 0$; we then integrate the change and drop the integration constant owing to the arbitrary probability normalization.
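Spelling out the steps just described (a reconstruction consistent with Eqs. (3.1) and (3.2), not quoted verbatim from the source): the environmental entropy change is eliminated in favor of the brain energy, and the total entropy is exponentiated into the probability $W$:

\begin{align*}
\Delta S_{env} &= \frac{\Delta Q_{env}}{T_o} = -\frac{\Delta E_{brain}}{T_o},\\
S_{tot} &= S_{env} + S(x_o) = S(x_o) - \frac{E(x_o)}{T_o} + \mathrm{const} = -\frac{H(x_o)}{T_o} + \mathrm{const},\\
W(x_o) &= \exp\big(S_{tot}/k_B\big) \propto \exp\big(-H(x_o)/k_B T_o\big),
\end{align*}

where the integration constant drops out upon normalizing $W$, recovering Eq. (3.1); maximizing $W$ (maximum total entropy) is then equivalent to minimizing $H(x_o)$, Eq. (3.2).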
The WSEP has made AI ANN deep learning powerful, because we can trade the temporal evolution average, denoted by the bar, for the wide-sense-equivalent spatial ensemble average, denoted by the angular brackets.
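In symbols (a sketch of the trade just stated, with the notation assumed here rather than taken from the source), the time average of an observable $f$ along the evolution $x(t)$ is replaced by the ensemble average over the Maxwell–Boltzmann weight $W$:

$$\overline{f} = \lim_{T\to\infty}\frac{1}{T}\int_0^{T} f\big(x(t)\big)\,dt \;\approx\; \langle f\rangle = \int f(x)\,W(x)\,dx,$$

which holds in the wide sense (for means and correlations) when the process is wide-sense stationary and ergodic.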