
150    5 Neural Networks

Figure 5.2. Linear discriminant for a two-class one-dimensional situation: (a) Design set; (b) Linear network; (c) Energy surface.




The solution d(x) = x does indeed satisfy the problem. Let us now see what happens in terms of the energy function. Simple calculations show that:




The parabolic surface corresponding to E is shown in Figure 5.2c. The minimum of E does indeed occur at the point (a=1, b=0).
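The parabolic shape is what one expects from a sum-of-squares energy. As a sketch (the design set and the exact form of the energy equation are not reproduced here, so the following is an assumed but standard form), for targets $t_k$ and network outputs $a x_k + b$ the energy would read:

```latex
E(a,b) = \frac{1}{2}\sum_{k=1}^{n}\bigl(t_k - (a\,x_k + b)\bigr)^2
```

which is quadratic in $(a,b)$, hence a paraboloidal surface with a unique minimum at the least-squares solution, here $(a,b)=(1,0)$.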
It is interesting to see the role of the bias weights by differentiating (5-2a) with respect to the bias weights alone:





                              which, solved for the biases, gives:






Therefore, the role of the biases for each class is to compensate for the difference between the mean of the target values and the mean of the output values corresponding to the feature vectors alone (without the biases).
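For the one-dimensional case of Figure 5.2, assuming the usual sum-of-squares energy $E = \tfrac{1}{2}\sum_k (t_k - a x_k - b)^2$, this compensation property can be seen directly by setting the bias derivative to zero:

```latex
\frac{\partial E}{\partial b}
  = -\sum_{k}\bigl(t_k - a\,x_k - b\bigr) = 0
\quad\Longrightarrow\quad
b = \bar{t} - a\,\bar{x}
```

i.e. the bias equals the mean of the targets minus the mean of the outputs produced by the feature term alone.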
Note that by transforming the input variables using non-linear functions, one can obtain more complex decision boundaries than linear ones. In fact, this corresponds to using the concept of generalized decision functions, already presented in section 2.1.1. Instead of (5-3) we would now have:
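The chapter's exact generalized decision function formula is not reproduced here, but the idea can be illustrated with a minimal sketch (the toy design set and the quadratic feature map below are my assumptions, not taken from the text): a discriminant that is linear in non-linearly transformed inputs can separate a one-dimensional set that no purely linear d(x) = ax + b could.

```python
import numpy as np

# Hypothetical 1-D two-class design set that is NOT linearly separable:
# class -1 sits in the middle, class +1 at both ends.
x = np.array([-2.0, -1.5, -0.5, 0.0, 0.5, 1.5, 2.0])
t = np.array([ 1.0,  1.0, -1.0, -1.0, -1.0, 1.0, 1.0])   # target values

# Generalized decision function: map x to non-linear features (1, x, x^2),
# then fit a *linear* discriminant in the transformed space by least squares
# (i.e., minimize the sum-of-squares energy over the weights).
Phi = np.column_stack([np.ones_like(x), x, x * x])
w, *_ = np.linalg.lstsq(Phi, t, rcond=None)

d = Phi @ w                          # discriminant values d(x)
pred = np.where(d >= 0, 1.0, -1.0)   # sign of d(x) gives the class
```

The decision boundary d(x) = 0 is quadratic in x (two crossing points), so the middle class is carved out correctly, which a single linear threshold on x cannot do.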