Bhattacharyya bound: If we do not insist on the optimum selection of
s, we may obtain a less complicated upper bound. One of the possibilities is to
select s = 1/2. Then, the upper bound is
$$
\varepsilon_u = \sqrt{P_1 P_2}\int \sqrt{p_1(X)\,p_2(X)}\,dX = \sqrt{P_1 P_2}\,e^{-\mu(1/2)} \tag{3.151}
$$
in general, and for normal distributions
$$
\mu(1/2) = \frac{1}{8}(M_2 - M_1)^T \left[\frac{\Sigma_1 + \Sigma_2}{2}\right]^{-1}(M_2 - M_1) + \frac{1}{2}\ln\frac{\left|\frac{\Sigma_1 + \Sigma_2}{2}\right|}{\sqrt{|\Sigma_1|\,|\Sigma_2|}} \,. \tag{3.152}
$$
The term $\mu(1/2)$ is called the Bhattacharyya distance, and will be used as an important measure of the separability of two distributions [17].
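As an illustration, the following is a minimal numpy sketch (our own, not from the text; the helper names `bhattacharyya_distance` and `bhattacharyya_bound` are hypothetical) that evaluates $\mu(1/2)$ of (3.152) and the bound (3.151) for a pair of normal distributions:

```python
import numpy as np

def bhattacharyya_distance(m1, cov1, m2, cov2):
    """Bhattacharyya distance mu(1/2) of (3.152) between two normals."""
    cov_avg = 0.5 * (cov1 + cov2)                 # (Sigma1 + Sigma2) / 2
    dm = m2 - m1                                  # M2 - M1
    term_mean = 0.125 * dm @ np.linalg.solve(cov_avg, dm)
    term_cov = 0.5 * np.log(
        np.linalg.det(cov_avg) / np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2))
    )
    return term_mean + term_cov

def bhattacharyya_bound(p1, mu_half):
    """Upper bound (3.151) on the Bayes error: sqrt(P1 P2) exp(-mu(1/2))."""
    return np.sqrt(p1 * (1.0 - p1)) * np.exp(-mu_half)

# Example: two bivariate normals with shifted means and different covariances.
m1, m2 = np.zeros(2), np.array([2.0, 0.0])
cov1, cov2 = np.eye(2), np.diag([2.0, 0.5])
mu = bhattacharyya_distance(m1, cov1, m2, cov2)
print(mu, bhattacharyya_bound(0.5, mu))
```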
When $\Sigma_1 = \Sigma_2 = \Sigma$, the Chernoff distance, (3.150), becomes
$$
\mu(s) = \frac{s(1-s)}{2}(M_2 - M_1)^T \Sigma^{-1}(M_2 - M_1) \,. \tag{3.153}
$$
In this case, the optimum s can be obtained by solving
$$
\frac{d\mu(s)}{ds} = \frac{1-2s}{2}(M_2 - M_1)^T \Sigma^{-1}(M_2 - M_1) = 0 \,. \tag{3.154}
$$
The solution is $s = 0.5$. That is, the Bhattacharyya distance is the optimum Chernoff distance when $\Sigma_1 = \Sigma_2$.
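To see this numerically, a small sketch (again our own, assuming numpy) evaluates the Chernoff distance $\mu(s)$ of (3.150) on a grid of $s$ and confirms that, with $\Sigma_1 = \Sigma_2$, the maximum falls at $s = 0.5$:

```python
import numpy as np

def chernoff_distance(s, m1, cov1, m2, cov2):
    """Chernoff distance mu(s) of (3.150) between two normals."""
    cov_mix = s * cov1 + (1.0 - s) * cov2         # s*Sigma1 + (1-s)*Sigma2
    dm = m2 - m1
    quad = 0.5 * s * (1.0 - s) * dm @ np.linalg.solve(cov_mix, dm)
    logdet = 0.5 * np.log(
        np.linalg.det(cov_mix)
        / (np.linalg.det(cov1) ** s * np.linalg.det(cov2) ** (1.0 - s))
    )
    return quad + logdet

# With equal covariances, mu(s) is maximized (tightest bound) at s = 0.5.
m1, m2, cov = np.zeros(2), np.array([2.0, 1.0]), np.eye(2)
grid = np.linspace(0.01, 0.99, 99)
best = grid[np.argmax([chernoff_distance(s, m1, cov, m2, cov) for s in grid])]
print(best)  # ~0.5
```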
As seen in (3.151), $\varepsilon_u = \sqrt{P_1 P_2}\exp[-\mu(1/2)]$, or $\ln \varepsilon_u = -\mu(1/2) + \ln\sqrt{P_1 P_2}$. Figure 3-17 shows the relation between $\mu(1/2)$ and $\varepsilon_u$ for $P_1 = P_2 = 0.5$.
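For example, under the hypothetical values $\mu(1/2) = 1$ and $P_1 = P_2 = 0.5$, the bound gives $\varepsilon_u = 0.5\,e^{-1} \approx 0.18$, so the Bayes error cannot exceed about 18%.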
Throughout this book, we use the Bhattacharyya distance rather than the Chernoff because of its simplicity. However, all discussions about the Bhattacharyya distance in this book could be extended to the Chernoff.
As seen in (3.152), the Bhattacharyya distance consists of two terms. The first or second term disappears when $M_1 = M_2$ or $\Sigma_1 = \Sigma_2$, respectively. Therefore, the first term gives the class separability due to the mean-difference, while the second term gives the class separability due to the covariance-difference.
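This two-term decomposition is easy to check numerically; the sketch below (our own helper, assuming numpy) returns the two terms of (3.152) separately, so that either can be seen to vanish under the corresponding condition:

```python
import numpy as np

def bhattacharyya_terms(m1, cov1, m2, cov2):
    """Return the mean term and covariance term of (3.152) separately."""
    cov_avg = 0.5 * (cov1 + cov2)
    dm = m2 - m1
    t_mean = 0.125 * dm @ np.linalg.solve(cov_avg, dm)
    t_cov = 0.5 * np.log(
        np.linalg.det(cov_avg) / np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2))
    )
    return t_mean, t_cov

m, i = np.array([1.0, 0.0]), np.eye(2)
print(bhattacharyya_terms(m, i, m, 4.0 * i))         # M1 = M2: mean term is 0
print(bhattacharyya_terms(np.zeros(2), i, m, i))     # Sigma1 = Sigma2: cov term is 0
```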