            The inequality is called the Bhattacharyya upper bound. A more com-
            pact notation of it is achieved with the so-called Bhattacharyya distance.
            This performance measure is defined as:


\[
J_{\mathrm{BHAT}} = -\ln \int_z \sqrt{p(z|\omega_1)\, p(z|\omega_2)}\, \mathrm{d}z \qquad (6.14)
\]

            With that, the Bhattacharyya upper bound simplifies to:


\[
E_{\min} \le \sqrt{P(\omega_1)P(\omega_2)}\, \exp(-J_{\mathrm{BHAT}}) \qquad (6.15)
\]
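As a numerical illustration, the following MATLAB fragment is a minimal sketch of how (6.14) and (6.15) can be evaluated for two one-dimensional Gaussian class densities. The means, standard deviations and priors are assumed values chosen for demonstration only; the true minimum error rate is obtained by numerically integrating $\min\{P(\omega_1)p(z|\omega_1),\, P(\omega_2)p(z|\omega_2)\}$.

gauss = @(z,m,sd) exp(-(z-m).^2/(2*sd^2)) ./ (sd*sqrt(2*pi));
z  = linspace(-10,12,10001);               % integration grid
p1 = gauss(z,0,1);   p2 = gauss(z,2,1.5);  % assumed p(z|w1), p(z|w2)
P1 = 0.6;  P2 = 0.4;                       % assumed priors P(w1), P(w2)

J_bhat  = -log(trapz(z, sqrt(p1.*p2)));    % Bhattacharyya distance, eq. (6.14)
E_upper = sqrt(P1*P2)*exp(-J_bhat);        % Bhattacharyya upper bound, eq. (6.15)
E_min   = trapz(z, min(P1*p1, P2*p2));     % true minimum error rate
fprintf('J_BHAT = %.4f, bound = %.4f, E_min = %.4f\n', J_bhat, E_upper, E_min);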

The bound can be made tighter if inequality (6.11) is replaced with the more general inequality $\min\{a,b\} \le a^{s} b^{1-s}$. This last inequality holds true for any $s$ in the interval $[0,1]$ and any non-negative $a$ and $b$. The inequality leads to the Chernoff distance, defined as:

\[
J_C(s) = -\ln \int_z p^{s}(z|\omega_1)\, p^{1-s}(z|\omega_2)\, \mathrm{d}z \qquad \text{with } 0 \le s \le 1 \qquad (6.16)
\]
            Application of the Chernoff distance in a derivation similar to (6.12)
            yields:

\[
E_{\min} \le P(\omega_1)^{s} P(\omega_2)^{1-s} \exp(-J_C(s)) \qquad \text{for any } s \in [0,1] \qquad (6.17)
\]
The so-called Chernoff bound encompasses the Bhattacharyya upper bound. In fact, for $s = 0.5$ the Chernoff distance and the Bhattacharyya distance are equal: $J_{\mathrm{BHAT}} = J_C(0.5)$.
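This relation can be checked numerically. The sketch below re-uses the assumed one-dimensional Gaussian densities and priors of the previous fragment (again illustration only): it evaluates $J_C(s)$ on a grid of $s$ according to (6.16), verifies that $J_C(0.5)$ coincides with $J_{\mathrm{BHAT}}$, and selects the value of $s$ for which the bound (6.17) is tightest.

gauss = @(z,m,sd) exp(-(z-m).^2/(2*sd^2)) ./ (sd*sqrt(2*pi));
z  = linspace(-10,12,10001);
p1 = gauss(z,0,1);   p2 = gauss(z,2,1.5);  % assumed p(z|w1), p(z|w2)
P1 = 0.6;  P2 = 0.4;                       % assumed priors

s  = 0.05:0.05:0.95;                       % grid of s values
Jc = zeros(size(s));
for k = 1:numel(s)
    Jc(k) = -log(trapz(z, p1.^s(k) .* p2.^(1-s(k))));   % eq. (6.16)
end
bound = P1.^s .* P2.^(1-s) .* exp(-Jc);    % Chernoff bound, eq. (6.17)
[tightest, kmin] = min(bound);             % tightest bound over the grid

J_bhat  = -log(trapz(z, sqrt(p1.*p2)));              % eq. (6.14)
Jc_half = -log(trapz(z, p1.^0.5 .* p2.^0.5));        % eq. (6.16) at s = 0.5
fprintf('J_C(0.5) = %.4f, J_BHAT = %.4f\n', Jc_half, J_bhat);
fprintf('tightest Chernoff bound %.4f at s = %.2f\n', tightest, s(kmin));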
              There also exists a lower bound based on the Bhattacharyya distance.
            This bound is expressed as:


\[
\tfrac{1}{2}\left[\,1 - \sqrt{1 - 4P(\omega_1)P(\omega_2)\exp(-2J_{\mathrm{BHAT}})}\,\right] \le E_{\min} \qquad (6.18)
\]
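Continuing the same assumed example, the short sketch below evaluates the lower bound (6.18) together with the upper bound (6.15) and confirms that they bracket the true minimum error rate.

gauss = @(z,m,sd) exp(-(z-m).^2/(2*sd^2)) ./ (sd*sqrt(2*pi));
z  = linspace(-10,12,10001);
p1 = gauss(z,0,1);   p2 = gauss(z,2,1.5);  % assumed p(z|w1), p(z|w2)
P1 = 0.6;  P2 = 0.4;                       % assumed priors

J_bhat = -log(trapz(z, sqrt(p1.*p2)));                  % eq. (6.14)
E_low  = 0.5*(1 - sqrt(1 - 4*P1*P2*exp(-2*J_bhat)));    % lower bound, eq. (6.18)
E_up   = sqrt(P1*P2)*exp(-J_bhat);                      % upper bound, eq. (6.15)
E_min  = trapz(z, min(P1*p1, P2*p2));                   % true minimum error rate
fprintf('%.4f <= E_min = %.4f <= %.4f\n', E_low, E_min, E_up);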

            A further simplification occurs when we specify the conditional
            probability densities. An important application of the Chernoff and
            Bhattacharyya distance is the Gaussian case. Suppose that these densities
have class-dependent expectation vectors $\mathbf{m}_k$ and covariance matrices $\mathbf{C}_k$,