Using the Support Vector Machine Toolbox for Matlab, developed by S.R. Gunn (Gunn, 1997), several experiments were conducted as explained in the following. Figure 5.47 illustrates the SVM approach for the non-separable situation, exemplifying the influence of the constant C on the separation region, namely for C=100 and C=∞ (a very large value of C). In both cases the number of misclassified samples is the same (3 misclassified samples). However, in the first case the margin is smaller, therefore attempting to decrease the values of the slack variables ξᵢ, with a smaller number of support vectors.
Figure 5.47. SVM linear discrimination of two non-separable classes. (a) C=100, with nine support vectors; (b) C=∞, with twelve support vectors.
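The influence of C can be reproduced with any soft-margin SVM implementation. The following is a minimal sketch of such an experiment, using the scikit-learn library in Python instead of the original Matlab toolbox; the synthetic overlapping dataset and the very large value used to stand in for C=∞ are assumptions for illustration, not the data of Figure 5.47.

```python
# Minimal sketch (not the original Matlab toolbox experiment): a linear
# soft-margin SVM trained with a moderate C and with a very large C,
# the latter approximating C = infinity. Synthetic overlapping data.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)),      # class -1
               rng.normal(2.0, 1.0, (50, 2))])     # class +1, overlapping
y = np.hstack([-np.ones(50), np.ones(50)])

for C in (100, 1e6):                                # 1e6 plays the role of C = infinity
    clf = SVC(kernel='linear', C=C).fit(X, y)
    margin = 2.0 / np.linalg.norm(clf.coef_)        # geometric margin width, 2/||w||
    errors = np.sum(clf.predict(X) != y)
    print(f"C={C:g}: {clf.support_vectors_.shape[0]} support vectors, "
          f"margin={margin:.3f}, training errors={errors}")
```

The printout allows the margin width and the number of support vectors to be compared for the two settings of C, in the spirit of the comparison shown in Figure 5.47.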
The SVM approach was also applied to the classification of the first two classes of cork stoppers. Figure 5.48 shows the results obtained for C=10. The overall error is 9% (2 misclassified cases of class ω₁ and 7 misclassified cases of class ω₂). The solution is remarkably close to the solution obtained with a perceptron (see Figure 5.19), with somewhat better performance, at least for the training set. Using other values of C, similar solutions were obtained, with some variation of the separation margin and the number of support vectors.
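The comparison with the perceptron can likewise be carried out with standard tools. The sketch below is a stand-in for this experiment (the two cork-stopper features used in the book are not loaded here; the placeholder data is an assumption): it fits a linear SVM with C=10 and a perceptron on the same two-feature data and reports the training errors and the two decision lines.

```python
# Minimal sketch with assumed placeholder data (not the cork-stopper set):
# linear SVM with C = 10 versus a perceptron, comparing training errors
# and the resulting linear decision boundaries.
import numpy as np
from sklearn.svm import SVC
from sklearn.linear_model import Perceptron

rng = np.random.default_rng(1)
X = np.vstack([rng.normal([55, 360], [20, 120], (50, 2)),   # placeholder class 1
               rng.normal([80, 550], [25, 150], (50, 2))])  # placeholder class 2
y = np.hstack([np.ones(50), 2 * np.ones(50)])

svm = SVC(kernel='linear', C=10).fit(X, y)
per = Perceptron(max_iter=1000).fit(X, y)

for name, clf in (("SVM (C=10)", svm), ("Perceptron", per)):
    errors = np.sum(clf.predict(X) != y)
    w, b = clf.coef_[0], clf.intercept_[0]
    print(f"{name}: training errors = {errors}, "
          f"decision line {w[0]:.3f}*x1 + {w[1]:.3f}*x2 + {b:.3f} = 0")
```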
Let us now consider the support vector approach for non-linear decision
functions. The basic idea is to perform a non-linear mapping into a higher dimensional space where the linear approach can be applied. The transformation to a higher dimension, already presented in (2-4), is:

y = [f_1(x) f_2(x) ... f_k(x) 1]'.
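As an illustration of this idea (a sketch with assumed synthetic data, not an example from the book), the fragment below maps two-dimensional points x = (x1, x2) through one possible explicit second-degree mapping and trains a linear SVM in the mapped space; the discriminant is linear in y but quadratic in x, so it can separate a class enclosed by another.

```python
# Minimal sketch of the explicit mapping idea (assumed synthetic data):
# map x = (x1, x2) to y = [x1, x2, x1^2, x2^2, x1*x2, 1]' and apply a
# linear SVM in the mapped space; the boundary is non-linear in x.
import numpy as np
from sklearn.svm import SVC

def phi(X):
    """Explicit mapping to a higher-dimensional space (one possible choice of f_i)."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([x1, x2, x1**2, x2**2, x1*x2, np.ones(len(X))])

rng = np.random.default_rng(2)
X = rng.uniform(-2, 2, (200, 2))
y = np.where(np.sum(X**2, axis=1) < 1.5, 1, -1)    # inner disk vs. outer ring: not linearly separable in x

lin_x = SVC(kernel='linear', C=10).fit(X, y)        # linear SVM in the original space
lin_y = SVC(kernel='linear', C=10).fit(phi(X), y)   # linear SVM in the mapped space

print("training errors, linear in x:", np.sum(lin_x.predict(X) != y))
print("training errors, linear in mapped space:", np.sum(lin_y.predict(phi(X)) != y))
```

In the mapped space the circular class boundary corresponds to a hyperplane, so the second classifier attains a much lower training error than the first.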