
Exercises


                                 5.14 Consider a set of three points X = {x_i; x_i ∈ ℝ, i = 1, 2, 3}.
                                     a)  Show that X is not pseudo-shattered by the class of linear functions F (see Figure
                                        5.35).
                                     b)  Show that X is pseudo-shattered by the class of quadratic functions Q = {q(x); q(x) =
                                        ax² + b; a, b ∈ ℝ} (the definition of pseudo-shattering is recalled below).
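As a reminder (the exact formulation in the text may differ slightly), pseudo-shattering of the three points by a class F of real-valued functions is usually stated as:

\[
\exists\, r_1, r_2, r_3 \in \mathbb{R}\ \ \text{such that}\ \ \forall\, b \in \{0,1\}^3\ \ \exists f \in F:\quad f(x_i) \ge r_i \iff b_i = 1,\quad i = 1, 2, 3 .
\]

With this definition, part a) amounts to showing that no choice of reference values r_1, r_2, r_3 allows all 2³ = 8 labellings of the three points to be realised by functions f(x) = ax + b.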

                                 5.15 Using an MLP approach, determine which shares of the StockExchange dataset are best
                                     predicted one-day ahead, using features LISBOR and  USD.
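A minimal Python sketch of the one-day-ahead set-up is given below. It uses scikit-learn's MLPRegressor; the random-walk series generated here merely stand in for the StockExchange shares and the LISBOR/USD features, whose actual file format is not assumed.

```python
# One-day-ahead regression sketch: predict tomorrow's share value from today's
# LISBOR and USD features. Synthetic random walks stand in for the real data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 400
lisbor = np.cumsum(rng.normal(0, 0.1, n))                    # stand-in for LISBOR
usd = np.cumsum(rng.normal(0, 0.1, n))                       # stand-in for USD
share = 0.6 * lisbor - 0.3 * usd + rng.normal(0, 0.05, n)    # stand-in share

X = np.column_stack([lisbor[:-1], usd[:-1]])                 # features at day t
y = share[1:]                                                # share value at day t+1

split = int(0.7 * len(y))                                    # keep the time order
mlp = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
mlp.fit(X[:split], y[:split])
rmse = mean_squared_error(y[split:], mlp.predict(X[split:])) ** 0.5
print(f"one-day-ahead RMSE: {rmse:.3f}")
```

Repeating this per share and comparing the test errors (or R² scores) indicates which shares are best predicted.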

                                 5.16 Design  an  RBF classifier for the  three classes of  cork  stoppers, using  features ART,
                                     PRM, NG  and RAAR. Compare the solution obtained  with the one derived in Exercise
                                     5.7, using ROC curve analysis.
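One common way to build such an RBF classifier (a sketch, not the book's specific procedure) is to place Gaussian kernels at k-means centres and fit a linear output layer on the kernel activations; the synthetic data below merely stand in for the four cork-stopper features.

```python
# RBF-network sketch: k-means centres + Gaussian activations + linear output.
# Synthetic 3-class, 4-feature data stand in for ART, PRM, NG and RAAR.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=300, n_features=4, n_informative=4,
                           n_redundant=0, n_classes=3, random_state=0)
Xs = StandardScaler().fit_transform(X)

km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(Xs)
centers = km.cluster_centers_
sigma = np.linalg.norm(Xs - centers[km.labels_], axis=1).mean()   # common kernel width

def rbf(X, centers, sigma):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * sigma ** 2))                         # Gaussian activations

clf = LogisticRegression(max_iter=2000).fit(rbf(Xs, centers, sigma), y)
print("training accuracy:", clf.score(rbf(Xs, centers, sigma), y))
```

For the ROC comparison with Exercise 5.7, the per-class scores from predict_proba can be fed to sklearn.metrics.roc_curve.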
                                 5.17 Train the MLP18:22:10 classifier for the CTG data, described in section 5.7.1, with the
                                     back-propagation algorithm. Compare the results with those shown in Table 5.7. What
                                     are the VC dimension and the lower bound on the number of training samples needed,
                                     assuming hard-limiting activation functions?
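A commonly quoted heuristic (not necessarily the exact bound used in the text) relates the number of weights of a hard-limiter MLP to the required sample size. For a single-hidden-layer network with n_i inputs, n_h hidden units and n_o outputs:

\[
W = (n_i + 1)\,n_h + (n_h + 1)\,n_o, \qquad n \gtrsim \frac{W}{\varepsilon},
\]

where \(\varepsilon\) is the tolerated classification error; for threshold units the VC dimension grows essentially with the number of weights (it is \(O(W \log W)\) for such networks).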

                                 5.18 Design  an  MLP classifier for the  CTG data for the  three classes N, S and P with  a
                                     reduced feature set. Use the conjugate-gradient method for training.
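Scikit-learn's MLP does not expose a conjugate-gradient solver, so the sketch below trains a small single-hidden-layer network by handing its flattened weight vector to scipy.optimize.minimize(method="CG"). The gradient is approximated numerically, which is acceptable only for small networks, and synthetic data stand in for the reduced CTG feature set.

```python
# Minimal sketch: one-hidden-layer MLP trained with scipy's conjugate-gradient
# optimiser. Synthetic 3-class data stand in for the reduced CTG feature set.
import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import make_classification

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=4, n_informative=4,
                           n_redundant=0, n_classes=3, random_state=0)
Y = np.eye(3)[y]                                  # one-hot targets
n_in, n_hid, n_out = X.shape[1], 8, 3

def unpack(w):
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = w[i:i + n_out]
    return W1, b1, W2, b2

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    H = np.tanh(X @ W1 + b1)                      # hidden layer
    Z = H @ W2 + b2
    Z -= Z.max(axis=1, keepdims=True)             # softmax output
    P = np.exp(Z); P /= P.sum(axis=1, keepdims=True)
    return P

def loss(w):                                      # cross-entropy training error
    P = forward(w, X)
    return -np.mean(np.sum(Y * np.log(P + 1e-12), axis=1))

w0 = 0.1 * rng.standard_normal(n_in * n_hid + n_hid + n_hid * n_out + n_out)
res = minimize(loss, w0, method="CG", options={"maxiter": 200})
pred = forward(res.x, X).argmax(axis=1)
print("training accuracy:", (pred == y).mean())
```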
                                 5.19 Train  the  MLP  classifier  from  the  previous  exercise  with  the  genetic  algorithm
                                     approach (use the NeuroGenetic  program). Compare the solution obtained with the one
                                     from the previous exercise, regarding classification performance and convergence time.
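The NeuroGenetic program's operators are not assumed here; the sketch below only illustrates the generic real-coded genetic-algorithm loop (tournament selection, blend crossover, Gaussian mutation) that would be applied to the flattened MLP weight vector, with a simple quadratic standing in for the training-error fitness.

```python
# Generic real-coded genetic algorithm sketch. The fitness function is a
# quadratic stand-in; for the exercise it would be the MLP training error
# evaluated on the flattened weight vector.
import numpy as np

rng = np.random.default_rng(0)
n_genes, pop_size, n_gen = 20, 60, 200

def fitness(w):                                       # lower is better
    return np.sum(w ** 2)

pop = rng.uniform(-1, 1, size=(pop_size, n_genes))
for gen in range(n_gen):
    scores = np.array([fitness(ind) for ind in pop])
    new_pop = [pop[scores.argmin()].copy()]           # elitism: keep the best
    while len(new_pop) < pop_size:
        idx = rng.integers(pop_size, size=(2, 3))     # two 3-way tournaments
        p1 = pop[idx[0][scores[idx[0]].argmin()]]
        p2 = pop[idx[1][scores[idx[1]].argmin()]]
        alpha = rng.uniform(0, 1, n_genes)            # blend crossover
        child = alpha * p1 + (1 - alpha) * p2
        mask = rng.random(n_genes) < 0.1              # Gaussian mutation
        child[mask] += rng.normal(0, 0.1, mask.sum())
        new_pop.append(child)
    pop = np.array(new_pop)

best = pop[np.argmin([fitness(ind) for ind in pop])]
print("best fitness:", fitness(best))
```

Comparing the number of generations (times the population size) against the conjugate-gradient iteration count gives a rough view of the convergence-time trade-off asked for above.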

                                 5.20 Trace the progress of the conjugate-gradient method on the error surfaces shown in
                                     Figures 5.8b and 5.9b by determining the successive minima and the gradients at those
                                     points.
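For tracing the iterations, the Fletcher-Reeves form of the conjugate-gradient update is (the text may use a different variant):

\[
\mathbf{d}_0 = -\mathbf{g}_0, \qquad
\mathbf{w}_{k+1} = \mathbf{w}_k + \alpha_k \mathbf{d}_k, \qquad
\beta_k = \frac{\mathbf{g}_{k+1}^{T}\mathbf{g}_{k+1}}{\mathbf{g}_k^{T}\mathbf{g}_k}, \qquad
\mathbf{d}_{k+1} = -\mathbf{g}_{k+1} + \beta_k \mathbf{d}_k ,
\]

where \(\mathbf{g}_k\) is the error gradient at \(\mathbf{w}_k\) and \(\alpha_k\) is obtained by a line-search minimisation along \(\mathbf{d}_k\).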
                                 5.21 Design a two-layer MLP for classification of the Rocks data into three classes: granites,
                                     limestones and marbles.  Use features SiO2, CaO and RMCS, and perform  the training
                                     with the conjugate-gradient and genetic algorithm methods.

                                 5.22 Foetal weight estimation is also clinically relevant when approached as a classification
                                     task. Design an MLP classifier for the foetal weight dataset, considering the classes
                                     corresponding to the following weight intervals (in grams):
                                     a)  ω1 = below 1000; ω2 = [1000, 1500[; ω3 = [1500, 3000[; ω4 = [3000, 4500[; ω5 = above
                                        4500.
                                     b)  ω1 = below 2000 (too low); ω2 = [2000, 4000]; ω3 = above 4000 (a binning sketch is
                                        given below).
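Translating the continuous weights into the class labels above is a simple binning step; a sketch follows (the weight values are synthetic, and np.digitize uses half-open bins, so closed/open endpoints may need a small adjustment to match the intervals exactly).

```python
# Sketch: binning foetal weight (grams) into the class labels listed above.
import numpy as np

weights = np.array([850, 1200, 2400, 3300, 4700])    # stand-in data
edges_a = [1000, 1500, 3000, 4500]                   # boundaries for omega_1..omega_5
labels_a = np.digitize(weights, edges_a)             # 0..4  ->  omega_1..omega_5
edges_b = [2000, 4000]                               # boundaries for omega_1..omega_3
labels_b = np.digitize(weights, edges_b)             # 0..2  ->  omega_1..omega_3
print(labels_a, labels_b)
```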

                                  5.23 Design an RBF network with Gaussian kernel for the same Rocks data classification as
                                     in  the  previous  exercise. Compare  both  solutions, using  scatter  plots  with  the  class
                                     boundaries determined by the classifiers.
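A sketch of the requested scatter plot with class boundaries: a dense mesh over two features is classified and the predicted regions are drawn behind the samples. Synthetic blobs stand in for a pair of the Rocks features, and any trained classifier (the RBF network or the MLP) can replace the k-NN placeholder.

```python
# Decision-region scatter plot sketch: classify a mesh and draw the regions.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_blobs
from sklearn.neighbors import KNeighborsClassifier

X, y = make_blobs(n_samples=150, centers=3, random_state=0)   # stand-in data
clf = KNeighborsClassifier(5).fit(X, y)                       # placeholder classifier

xx, yy = np.meshgrid(np.linspace(X[:, 0].min() - 1, X[:, 0].max() + 1, 300),
                     np.linspace(X[:, 1].min() - 1, X[:, 1].max() + 1, 300))
Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)
plt.contourf(xx, yy, Z, alpha=0.25)                           # class regions / boundaries
plt.scatter(X[:, 0], X[:, 1], c=y, edgecolors="k")
plt.xlabel("feature 1"); plt.ylabel("feature 2")
plt.show()
```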

                                  5.24 Determine the optimal SVM hyperplane for a data set as in the example of Figure 5.44,
                                     the only difference being that point x3 is now x3=(0, 1).
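For reference, the optimal separating hyperplane solves the standard hard-margin problem (the data points of Figure 5.44 are not reproduced here):

\[
\min_{\mathbf{w},\,b}\ \tfrac{1}{2}\|\mathbf{w}\|^{2}
\quad \text{s.t.} \quad y_i\,(\mathbf{w}^{T}\mathbf{x}_i + b) \ge 1, \quad i = 1, \ldots, n ,
\]

whose solution has margin \(2/\|\mathbf{w}\|\) and can be written as \(\mathbf{w} = \sum_i \alpha_i y_i \mathbf{x}_i\), with nonzero \(\alpha_i\) only for the support vectors; checking how moving x3 to (0, 1) changes the active constraints gives the new hyperplane.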