
178                        Introduction to Statistical Pattern Recognition



                          (b)  Assuming that the density functions of h(X) for ω₁ and ω₂ can be
                               approximated by normal densities, compute the approximate value
                               of the Bayes error for n = 8, h2A = 2.5, and P₁ = P₂ = 0.5.
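Under the normal approximation, the error of part (b) reduces to two Gaussian tail integrals around the decision threshold on h(X). A minimal sketch, where the class-conditional moments of h(X) are hypothetical placeholders (the actual values would come from part (a)):

```python
from math import erf, sqrt

def norm_cdf(x, mean, std):
    # CDF of a normal density N(mean, std^2) evaluated at x.
    return 0.5 * (1.0 + erf((x - mean) / (std * sqrt(2.0))))

def approx_bayes_error(eta1, var1, eta2, var2, t=0.0, p1=0.5, p2=0.5):
    """Approximate Bayes error when h(X) is roughly normal under each
    class: omega_1 samples err when h(X) > t, omega_2 when h(X) < t."""
    return p1 * (1.0 - norm_cdf(t, eta1, sqrt(var1))) \
         + p2 * norm_cdf(t, eta2, sqrt(var2))

# Hypothetical moments of h(X) under omega_1 and omega_2 -- placeholders
# for illustration, not the values derived in the problem.
err = approx_bayes_error(eta1=-2.0, var1=4.0, eta2=2.0, var2=4.0)
```

With these symmetric placeholder moments each tail contributes 1 − Φ(1), so the two class errors coincide.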


                      2.   Two normal distributions are characterized by











                          Calculate the errors due to the Bayes classifier and the bisector.
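The two errors of Problem 2 can be estimated by Monte Carlo: the Bayes classifier compares the class log-densities, while the bisector assigns each sample to the nearer class mean. Since the distribution parameters printed above did not survive the scan, the means and covariances below are illustrative placeholders only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder parameters -- not the book's data, which is illegible here.
M1, M2 = np.array([0.0, 0.0]), np.array([2.0, 2.0])
S1, S2 = np.eye(2), np.diag([4.0, 1.0])
P1 = P2 = 0.5
n = 200_000  # samples per class

def log_gauss(X, M, S):
    # Log normal density up to a shared additive constant.
    d = X - M
    Si = np.linalg.inv(S)
    return -0.5 * np.einsum('ij,jk,ik->i', d, Si, d) \
           - 0.5 * np.log(np.linalg.det(S))

X1 = rng.multivariate_normal(M1, S1, n)
X2 = rng.multivariate_normal(M2, S2, n)

# Bayes classifier: pick the class with the larger posterior
# (equal priors cancel, leaving the likelihood comparison).
bayes_err = P1 * np.mean(log_gauss(X1, M2, S2) > log_gauss(X1, M1, S1)) \
          + P2 * np.mean(log_gauss(X2, M1, S1) > log_gauss(X2, M2, S2))

# Bisector classifier: the perpendicular bisector of the segment M1-M2,
# i.e. assign X to the nearer mean.
def nearer_to_M2(X):
    return np.linalg.norm(X - M2, axis=1) < np.linalg.norm(X - M1, axis=1)

bisector_err = P1 * np.mean(nearer_to_M2(X1)) + P2 * np.mean(~nearer_to_M2(X2))
```

Because the covariances differ, the Bayes boundary is quadratic and its error falls below the bisector's.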


                       3.  Using the same data as in Problem 2 except




                          find the linear discriminant function which maximizes the Fisher criterion,
                          and minimize the error by adjusting the threshold.
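The Fisher direction maximizes (Vᵀ(M₂−M₁))² / Vᵀ(Σ₁+Σ₂)V, and after projection the two classes remain 1-D normal, so the threshold can be adjusted by minimizing the resulting pair of tail integrals. A sketch with placeholder parameters (the modified data of Problem 3 is not legible in the scan):

```python
import numpy as np
from math import erf, sqrt

# Placeholder parameters -- illustrative only.
M1, M2 = np.array([0.0, 0.0]), np.array([2.0, 2.0])
S1, S2 = np.eye(2), np.diag([4.0, 1.0])
P1 = P2 = 0.5

# Fisher criterion maximizer: V proportional to (S1 + S2)^{-1} (M2 - M1).
V = np.linalg.solve(S1 + S2, M2 - M1)

# Statistics of the projected (1-D, still normal) class densities.
m1, m2 = V @ M1, V @ M2
s1, s2 = sqrt(V @ S1 @ V), sqrt(V @ S2 @ V)

phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))

def error(t):
    # Decide omega_2 when V'X > t; exact error under the projected normals.
    return P1 * (1.0 - phi((t - m1) / s1)) + P2 * phi((t - m2) / s2)

# Adjust the threshold by a dense scan between the projected means.
ts = np.linspace(m1, m2, 2001)
t_best = ts[np.argmin([error(t) for t in ts])]
```

Because s1 ≠ s2, the minimizing threshold is not the midpoint of the projected means.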


                       4.  Using the same data as in Problem 3, find the optimum linear discriminant
                           function which minimizes the probability of error. Show that this error is
                           smaller than that of Problem 3. (Check the errors for s = 0, 0.02, and
                           0.25.)
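One way to check the s-values of Problem 4 numerically: assuming the candidate directions have the form V(s) = [sΣ₁ + (1−s)Σ₂]⁻¹(M₂ − M₁) (the form of the optimum linear discriminant for normal distributions; an assumption here, stated without the chapter's derivation), each s yields a 1-D projected problem whose threshold can be optimized exactly. Parameters are again illustrative placeholders:

```python
import numpy as np
from math import erf, sqrt

# Placeholder parameters -- illustrative only.
M1, M2 = np.array([0.0, 0.0]), np.array([2.0, 2.0])
S1, S2 = np.eye(2), np.diag([4.0, 1.0])
P1 = P2 = 0.5
phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))

def best_error_for_s(s):
    # Assumed candidate direction, indexed by the scalar s in [0, 1].
    V = np.linalg.solve(s * S1 + (1.0 - s) * S2, M2 - M1)
    # Projected class statistics, then optimize the threshold by scanning.
    m1, m2 = V @ M1, V @ M2
    d1, d2 = sqrt(V @ S1 @ V), sqrt(V @ S2 @ V)
    ts = np.linspace(m1, m2, 2001)
    return min(P1 * (1.0 - phi((t - m1) / d1)) + P2 * phi((t - m2) / d2)
               for t in ts)

errors = {s: best_error_for_s(s) for s in (0.0, 0.02, 0.25)}
```

Scanning s and the threshold jointly locates the best linear classifier within this family.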

                       5.  Design the optimum linear classifier by minimizing the mean-square error

                               ε̄² = E{(VᵀX + v₀ − γ(X))²} ,

                           where γ(X) = +1 for X ∈ ω₂ and −1 for X ∈ ω₁. Without using the
                           procedure discussed in this chapter, take the derivative of ε̄² with
                           respect to V and v₀, equate the derivative to zero, and solve the
                           equation for V. Setting the mixture mean, M₀ = P₁M₁ + P₂M₂, as the
                           coordinate origin, confirm that the resulting optimum V is
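Setting the derivatives of ε̄² with respect to V and v₀ to zero yields exactly the least-squares normal equations, so the solution can be checked numerically by ordinary least squares on samples drawn with the origin shifted to M₀. A sketch under assumed placeholder distributions (γ(X) = ±1 as defined above):

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder distributions -- illustrative only.
M1, M2 = np.array([0.0, 0.0]), np.array([2.0, 2.0])
S1, S2 = np.eye(2), np.diag([4.0, 1.0])
P1 = P2 = 0.5
n = 100_000  # samples per class

# Shift the coordinate origin to the mixture mean M0 = P1*M1 + P2*M2.
M0 = P1 * M1 + P2 * M2
X = np.vstack([rng.multivariate_normal(M1 - M0, S1, n),
               rng.multivariate_normal(M2 - M0, S2, n)])
# Desired output: gamma = -1 on omega_1 samples, +1 on omega_2 samples.
gamma = np.concatenate([-np.ones(n), np.ones(n)])

# d(eps^2)/dV = 0 and d(eps^2)/dv0 = 0 are the normal equations of the
# least-squares problem min ||[X 1][V; v0] - gamma||^2, so lstsq solves them.
A = np.hstack([X, np.ones((2 * n, 1))])
sol, *_ = np.linalg.lstsq(A, gamma, rcond=None)
V, v0 = sol[:2], sol[2]
```

With the origin at M₀ and equal priors, E{X} = 0, so the bias term decouples and v₀ ≈ P₂ − P₁ = 0, while V solves E{XXᵀ}V = E{Xγ(X)}.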