4. Repeat Projects 1 and 3 for various values of r and k. Plot MSE vs. r for the Parzen and MSE vs. k for the kNN. Determine the optimal r and k experimentally. (A simulation sketch for this project follows the list.)
5. Repeat Experiment 1.
6. Repeat Experiments 2 and 3.
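The following Python sketch illustrates one way to carry out Project 4: estimate the density of samples from N(0,I) with a Parzen estimate (normal kernel) over a grid of r, and with a kNN estimate over a grid of k, reporting the mean-square error averaged over evaluation points and over training-set draws. The dimension, sample sizes, grids, and the (k-1)/(Nv) form of the kNN estimate are illustrative assumptions, not the text's prescribed experimental settings.

import math
import numpy as np

rng = np.random.default_rng(0)
n, N, n_trials = 2, 200, 20                       # illustrative settings, not the book's

def parzen_estimate(X_eval, X_train, r):
    """Parzen density estimate with a normal kernel of size r."""
    dim = X_train.shape[1]
    d2 = ((X_eval[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    return (2 * np.pi * r ** 2) ** (-dim / 2) * np.exp(-d2 / (2 * r ** 2)).mean(axis=1)

def knn_estimate(X_eval, X_train, k):
    """kNN density estimate (k-1)/(N v), with v the volume of the kth-NN sphere."""
    dim = X_train.shape[1]
    d = np.sqrt(((X_eval[:, None, :] - X_train[None, :, :]) ** 2).sum(-1))
    d_k = np.sort(d, axis=1)[:, k - 1]            # distance to the kth nearest neighbor
    v = math.pi ** (dim / 2) / math.gamma(dim / 2 + 1) * d_k ** dim
    return (k - 1) / (X_train.shape[0] * v)

X_eval = rng.standard_normal((300, n))            # expectation over X approximated by draws from p
p_true = (2 * np.pi) ** (-n / 2) * np.exp(-(X_eval ** 2).sum(1) / 2)

r_grid = [0.1, 0.2, 0.4, 0.8, 1.6]
k_grid = [2, 4, 8, 16, 32, 64]
mse_r = np.zeros(len(r_grid))
mse_k = np.zeros(len(k_grid))
for _ in range(n_trials):                         # average the squared error over training-set draws
    X_train = rng.standard_normal((N, n))
    for i, r in enumerate(r_grid):
        mse_r[i] += np.mean((parzen_estimate(X_eval, X_train, r) - p_true) ** 2)
    for i, k in enumerate(k_grid):
        mse_k[i] += np.mean((knn_estimate(X_eval, X_train, k) - p_true) ** 2)

for r, m in zip(r_grid, mse_r / n_trials):
    print(f"Parzen  r={r:4.2f}  MSE={m:.6f}")
for k, m in zip(k_grid, mse_k / n_trials):
    print(f"kNN     k={k:3d}   MSE={m:.6f}")

Plotting the two resulting curves against r and k, respectively, and reading off the minima gives the experimentally determined optimal r and k.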
Problems
1. Prove that
(1) Equation (6.3) is a density function, and
(2) the covariance matrix of (6.3) is r²A.
2. Find w of (6.15) for the kernel function of (6.3). Inserting m = 1 and ∞ into the w obtained above, confirm that the w's for normal and uniform kernels are obtained.
[Hint: Γ(ε) → 1/ε as ε goes to zero.]
3. Using a normal kernel, find the optimal r* and MSE*. Compare them with the optimal r* and MSE* for a uniform kernel.
4. Using E_X{MSE{p̂(X)}} instead of ∫MSE{p̂(X)}dX, find the optimal r and criterion value. Use p(X) = N_X(0,I) and the uniform kernel.
5. Derive the joint density function of the coverages u_1, . . ., u_N. Compute the marginal density function of u_k. (A simulation sketch related to this problem follows the list.)
6. Using E_X{MSE{p̂(X)}} instead of ∫MSE{p̂(X)}dX, find the optimal k and criterion value. Use p(X) = N_X(0,I).
7. Derive E_X E{d_kNN(X)} for the density function of (6.3). Inserting m = 1 and ∞ into the above result, confirm that the averaged distances for normal and uniform distributions are obtained.
8. Compute E_X E{d_kNN(X)}, using the second-order approximation u ≅ pv(1 + αr²/2).
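The sketch below is a numerical companion to Problems 5 and 6. It simulates, for one-dimensional N(0,1) data, the probability mass u_k captured inside the interval reaching to the kth nearest neighbor of a test point (one common notion of the kth coverage; use the text's own definition when solving the problems), and compares its sample moments with those of a Beta(k, N-k+1) density, the standard order-statistics form quoted here only as a check. The sample size, k, and trial count are illustrative choices.

import math
import numpy as np

rng = np.random.default_rng(1)
N, k, trials = 50, 5, 20000                      # illustrative settings

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

u = np.empty(trials)
for t in range(trials):
    x0 = rng.standard_normal()                   # test point X drawn from p(X) = N(0,1)
    samples = rng.standard_normal(N)             # N samples drawn from the same density
    d_k = np.sort(np.abs(samples - x0))[k - 1]   # distance to the kth nearest neighbor of x0
    u[t] = phi(x0 + d_k) - phi(x0 - d_k)         # mass of p inside the kth-NN interval

beta_mean = k / (N + 1)                          # Beta(k, N-k+1) moments for comparison
beta_var = k * (N - k + 1) / ((N + 1) ** 2 * (N + 2))
print(f"simulated        mean={u.mean():.4f}  var={u.var():.6f}")
print(f"Beta(k, N-k+1)   mean={beta_mean:.4f}  var={beta_var:.6f}")

The agreement of the simulated moments with the Beta moments, for any choice of test point distribution, reflects the fact that the captured mass of each sample is uniformly distributed on (0,1), so the kth-NN value is a uniform order statistic.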