% Classification is based on the class-conditional probabilities.
% Counter for the number of correctly classified test points.
ncc = 0;
% We will use only the first two features of
% the iris data for our classification.
% This should make it more difficult to
% separate the classes.
% Delete 3rd and 4th features.
virginica(:,3:4) = [];
versicolor(:,3:4) = [];
[nver,d] = size(versicolor);
[nvir,d] = size(virginica);
n = nvir + nver;
First, we will loop through all of the versicolor observations. We build a classifier, leaving out one pattern at a time for testing purposes. Throughout this loop, the class-conditional probability for virginica remains the same, so we estimate its mean and covariance first.
% Loop first through all of the patterns corresponding
% to versicolor. Here correct classification
% is obtained if pxgver > pxgvir;
muvir = mean(virginica);
covvir = cov(virginica);
% These will be the same for this part.
for i = 1:nver
   % Get the test point and the training set.
   versitrain = versicolor;
   % This is the testing point.
   x = versitrain(i,:);
   % Delete the test point from the copy;
   % what remains is the training set.
   versitrain(i,:) = [];
   muver = mean(versitrain);
   covver = cov(versitrain);
   pxgver = csevalnorm(x,muver,covver);
   pxgvir = csevalnorm(x,muvir,covvir);
   if pxgver > pxgvir
      % Then we correctly classified it.
      ncc = ncc + 1;
   end
end
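In both calls above, csevalnorm evaluates the class-conditional (multivariate normal) probability density at the test point x, given the estimated mean and covariance. If that function from the accompanying toolbox is not at hand, a minimal sketch of an equivalent evaluation is given below; the file name evalnormsketch.m is our own choice, and we assume the covariance matrix is nonsingular.

function p = evalnormsketch(x,mu,covm)
% EVALNORMSKETCH Sketch of evaluating a d-dimensional normal density.
%   x    - 1 x d test point
%   mu   - 1 x d mean vector
%   covm - d x d covariance matrix (assumed nonsingular)
d = length(mu);
xc = x - mu;                  % center the test point
arg = xc*(covm\xc');          % quadratic form (x-mu)*inv(covm)*(x-mu)'
p = exp(-0.5*arg)/sqrt((2*pi)^d*det(covm));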
We repeat the same procedure, leaving out each virginica observation as the test pattern.
% Loop through all of the patterns of virginica.
% Here correct classification is obtained when
% pxgvir > pxgver.
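A sketch of this second loop, mirroring the one above, follows; the variable names virtrain and pcc are our own, and the final line is one way to turn the count into a cross-validation estimate of the probability of correct classification. Note that the versicolor parameters now stay fixed throughout the loop.

% The class-conditional estimate for versicolor stays
% the same throughout this loop.
muver = mean(versicolor);
covver = cov(versicolor);
for i = 1:nvir
   % Get the test point and the training set.
   virtrain = virginica;
   x = virtrain(i,:);
   % Delete the test point; what remains is the training set.
   virtrain(i,:) = [];
   muvir = mean(virtrain);
   covvir = cov(virtrain);
   pxgver = csevalnorm(x,muver,covver);
   pxgvir = csevalnorm(x,muvir,covvir);
   if pxgvir > pxgver
      % Then we correctly classified it.
      ncc = ncc + 1;
   end
end
% Cross-validation estimate of the probability of
% correct classification (pcc is our name for it).
pcc = ncc/n;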