5.3 The Perceptron Concept 165
A little thought shows that by multiplying the original features by 2 and
subtracting 1 we convert the original features to the [-1, 1] interval, with the
convenient outcome that the product of equal features is now +1, and of unequal
features -1. Thus, we must have: -(2x1 - 1)(2x2 - 1) = -4x1x2 + 2x1 + 2x2 - 1.
Therefore, the perceptron with the features and weights of Table 5.1 will solve
the XOR problem.
Table 5.1. Features used to solve the XOR problem with a quadratic classifier.

    Feature    Weight
    x1x2       -4
    x1          2
    x2          2
    bias       -1
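As a quick check (a sketch not in the original text), the weights of Table 5.1 can be verified on all four XOR input patterns; the perceptron computes the weighted sum of x1, x2 and the quadratic feature x1x2 plus the bias, and thresholds it at zero:

```python
def xor_perceptron(x1, x2):
    # Weighted sum using the Table 5.1 weights:
    # -4 for x1*x2, +2 for x1, +2 for x2, bias -1.
    s = -4 * x1 * x2 + 2 * x1 + 2 * x2 - 1
    # Hard-limiter (step) activation: class 1 if s > 0, else class 0.
    return 1 if s > 0 else 0

# Truth table of the quadratic perceptron on the four XOR patterns.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, '->', xor_perceptron(x1, x2))
```

The output reproduces the XOR truth table: equal inputs give 0, unequal inputs give 1.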
For other types of problems requiring complex decision surfaces, such as a
generalized version of the U vs. V problem consisting of recognizing all sorts of
handwritten characters, one would have to select the appropriate transforming
functions of the original features. However, as previously mentioned, this is a
difficult selection, with no available rules or guidance except perhaps what a
topological analysis of the problem could reveal. Therefore, although the
perceptron could in principle solve any classification problem, what we really need
is a flexible architecture that can be adapted to any problem. We will see how this
is achieved in the following section.
[Figure: training error plotted against epoch number (0 to 200).]
Figure 5.18. Perceptron learning curve for the two-class classification of cork
stoppers.