probability. It represents the knowledge that we have about the class of
an object before the measurements of that object are available. Since the
number of possible classes is K, we have:
$$\sum_{k=1}^{K} P(\omega_k) = 1 \qquad (2.1)$$
The sensory system produces a measurement vector z with dimension N.
Objects from different classes should have different measurement vec-
tors. Unfortunately, the measurement vectors from objects within the
same class also vary. For instance, the eccentricities of the bolts in
Figure 2.2 are not fixed, since the bolts themselves vary in shape. In addition, all
measurements are subject to some degree of randomness due to all kinds
of unpredictable phenomena in the sensory system, e.g. quantum noise,
thermal noise, quantization noise. The variations and randomness are
taken into account by the probability density function of z.
The conditional probability density function of the measurement vector z is
denoted by p(z|ω_k). It is the density of z coming from an object with known
class ω_k. If z comes from an object with unknown class, its density is
indicated by p(z). This density is the unconditional density of z.
Since classes are supposed to be mutually exclusive, the unconditional
density can be derived from the conditional densities by weighting these
densities by the prior probabilities:
$$p(z) = \sum_{k=1}^{K} p(z \mid \omega_k)\, P(\omega_k) \qquad (2.2)$$
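To make (2.2) concrete, here is a minimal numerical sketch in MATLAB. Everything beyond the equation itself is an assumption made for illustration: the class-conditional densities are taken to be N-dimensional Gaussians with invented means and a shared covariance, and the priors are the relative frequencies quoted in Example 2.2 below.

% Sketch: evaluate the unconditional density p(z) of (2.2) as a
% prior-weighted sum of class-conditional densities. The Gaussian model
% and all parameter values are illustrative assumptions, not the book's
% actual 'mechanical parts' model.
K  = 4;                                      % number of classes
P  = [20 28 27 19] / 94;                     % priors, sum to 1 as in (2.1)
mu = [0.2 0.4; 0.8 0.3; 0.9 0.7; 0.5 0.5];   % assumed class means (N = 2)
C  = 0.01 * eye(2);                          % assumed common covariance

z  = [0.7; 0.4];                             % measurement vector to evaluate
N  = numel(z);
pz = 0;
for k = 1:K
    d  = z - mu(k, :)';                      % deviation from the class-k mean
    pk = exp(-0.5 * d' * (C \ d)) / ((2*pi)^(N/2) * sqrt(det(C)));  % p(z|omega_k)
    pz = pz + pk * P(k);                     % accumulate the sum of (2.2)
end

For a full density plot such as Figure 2.4, the same loop would simply be evaluated over a grid of z values.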
The pattern classifier assigns the measurement vector, and thereby the
object, to a class. This is accomplished by the so-called decision
function ω̂(·) that maps the measurement space onto the set of possible
classes. Since z is an N-dimensional vector, the function maps ℝ^N onto Ω.
That is: ω̂(·): ℝ^N → Ω.
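As one concrete, purely illustrative instance of such a decision function, the sketch below assigns z to the class with the largest prior-weighted conditional density p(z|ω_k)P(ω_k), reusing the assumed Gaussian model from the previous sketch; the name omega_hat and all parameters are hypothetical.

% Sketch: a decision function omega_hat(.): R^N -> Omega that returns the
% index of the class maximizing p(z|omega_k) * P(omega_k). The Gaussian
% class-conditional model is an illustrative assumption, not the book's.
function k_hat = omega_hat(z, mu, C, P)
    K = numel(P);
    N = numel(z);
    score = zeros(K, 1);
    for k = 1:K
        d = z - mu(k, :)';                   % deviation from the class-k mean
        score(k) = P(k) * exp(-0.5 * d' * (C \ d)) / ((2*pi)^(N/2) * sqrt(det(C)));
    end
    [~, k_hat] = max(score);                 % index of the winning class in Omega
end

For instance, omega_hat([0.7; 0.4], mu, C, P) would return an index into the class set {bolt, nut, ring, scrap} under these assumptions.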
Example 2.2 Probability densities of the ‘mechanical parts’ data
Figure 2.4 is a graphical representation of the probability densities of
the measurement data from Example 2.1. The unconditional density
p(z) is derived from (2.2) by assuming that the prior probabilities
P(ω_k) are reflected in the frequencies of occurrence of each type of
object in Figure 2.2. In that figure, there are 94 objects with frequen-
cies bolt:nut:ring:scrap = 20:28:27:19. Hence the corresponding prior
probabilities are estimated as 20/94, 28/94, 27/94 and 19/94, respectively.
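Under that frequency-based assumption the priors follow directly from the counts; a minimal MATLAB sketch:

% Sketch: priors estimated as relative frequencies of occurrence,
% using the counts quoted from Figure 2.2.
counts = [20 28 27 19];                      % bolt, nut, ring, scrap
P = counts / sum(counts);                    % P ~ [0.213 0.298 0.287 0.202]
% sum(P) equals 1, consistent with (2.1)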