Page 34 - Classification Parameter Estimation & State Estimation An Engg Approach Using MATLAB

BAYESIAN CLASSIFICATION

            Listing 2.2
            PRTools code for estimating decision boundaries taking account of the
            cost.


            load nutsbolts;
            cost = [0.20  0.07  0.07  0.07 ; ...
                    0.07  0.15  0.07  0.07 ; ...
                    0.07  0.07  0.05  0.07 ; ...
                    0.03  0.03  0.03  0.03];
            w1 = qdc(z);                      % Estimate a single Gaussian per class
            w2 = w1*classc*costm([],cost);    % Change output according to cost
            scatterd(z);
            plotc(w1);                        % Plot without using cost
            plotc(w2);                        % Plot using cost



            2.1.1  Uniform cost function and minimum error rate

            A uniform cost function is obtained if a unit cost is assumed when an
            object is misclassified, and zero cost when the classification is correct.
            This can be written as:


$$ C(\hat{\omega}_i \mid \omega_k) = 1 - \delta(i,k) \quad \text{with:} \quad \delta(i,k) = \begin{cases} 1 & \text{if } i = k \\ 0 & \text{elsewhere} \end{cases} \qquad (2.9) $$

            $\delta(i,k)$ is the Kronecker delta function. With this cost function the condi-
            tional risk given in (2.4) simplifies to:

$$ R(\hat{\omega}_i \mid z) = \sum_{\substack{k=1 \\ k \neq i}}^{K} P(\omega_k \mid z) = 1 - P(\omega_i \mid z) \qquad (2.10) $$
            Minimization of this risk is equivalent to maximization of the posterior
            probability $P(\omega_i \mid z)$. Therefore, with a uniform cost function, the Bayes
            decision function (2.8) becomes the maximum a posteriori probability
            classifier (MAP classifier):

$$ \hat{\omega}_{\mathrm{MAP}}(z) = \operatorname*{argmax}_{\omega \in \Omega} \{ P(\omega \mid z) \} \qquad (2.11) $$
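The equivalence stated above is easy to check numerically: with the uniform cost matrix $C(i \mid k) = 1 - \delta(i,k)$, the conditional risk of each decision is exactly one minus its posterior, so the minimum-risk class and the MAP class coincide. A small Python sketch, using a hypothetical posterior vector:

```python
import numpy as np

K = 4
uniform_cost = 1.0 - np.eye(K)                # C[i, k] = 1 if i != k, else 0

posteriors = np.array([0.1, 0.6, 0.2, 0.1])   # hypothetical P(k | z)
risks = uniform_cost @ posteriors             # conditional risk per class

# (2.10): under uniform cost, R(i | z) = 1 - P(i | z)
assert np.allclose(risks, 1.0 - posteriors)

map_class = int(np.argmax(posteriors))        # MAP classifier (2.11)
min_risk_class = int(np.argmin(risks))        # Bayes classifier, uniform cost
print(map_class, min_risk_class)              # prints: 1 1
```

Both rules select class 1, the class with the largest posterior, as (2.10) and (2.11) predict.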

            Application of Bayes’ theorem for conditional probabilities and cancel-
            lation of irrelevant terms yield a classification equivalent to a MAP