Page 953 - The Mechatronics Handbook
FIGURE 32.3 Threshold implementation with an additional weight and constant input with +1 value: (a) neuron
with threshold T, (b) modified neuron with threshold T = 0 and additional weight equal to −T.
threshold is always set to be zero and the net value is calculated as
net = \sum_{i=1}^{n} w_i x_i + w_{n+1}    (32.3)
where w n+1 has the same value as the required threshold and the opposite sign. Single-layer perceptrons
were successfully used to solve many pattern classification problems. The hard threshold activation
functions are given by
o = f(net) = \frac{\operatorname{sgn}(net) + 1}{2} = \begin{cases} 1, & \text{if } net \ge 0 \\ 0, & \text{if } net < 0 \end{cases}    (32.4)
for unipolar neurons and
o = f(net) = \operatorname{sgn}(net) = \begin{cases} 1, & \text{if } net \ge 0 \\ -1, & \text{if } net < 0 \end{cases}    (32.5)
for bipolar neurons. For these types of neurons, most of the known training algorithms are able to adjust
weights only in single-layer networks.
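As an illustrative sketch (not code from the handbook), the augmented neuron of Fig. 32.3(b) and the hard-threshold activations of Eqs. (32.4) and (32.5) can be written as follows; the function and variable names are chosen here for clarity:

```python
# Hard-threshold neuron with the augmented-weight trick of Fig. 32.3(b):
# the threshold T becomes an extra weight w_{n+1} = -T on a constant +1
# input, so Eq. (32.3) gives net = sum(w_i * x_i) + w_{n+1} and the
# firing test is simply net >= 0.

def net_value(weights, inputs):
    """Eq. (32.3): weights has n+1 entries; the last one equals -T."""
    n = len(inputs)
    return sum(w * x for w, x in zip(weights[:n], inputs)) + weights[n]

def unipolar(net):
    """Eq. (32.4): (sgn(net) + 1) / 2, i.e. output in {0, 1}."""
    return 1 if net >= 0 else 0

def bipolar(net):
    """Eq. (32.5): sgn(net), i.e. output in {-1, +1}."""
    return 1 if net >= 0 else -1

# Example: two inputs with threshold T = 1.5 (a logical AND), so the
# augmented weight vector is [1.0, 1.0, -1.5].
weights = [1.0, 1.0, -1.5]
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, unipolar(net_value(weights, x)))
```

With these weights, only the input [1, 1] yields a nonnegative net value, reproducing the AND behavior of a single threshold unit.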
Multilayer neural networks usually use continuous activation functions, either unipolar
o = f(net) = \frac{1}{1 + \exp(-\lambda \, net)}    (32.6)
or bipolar
o = f(net) = \tanh(0.5 \lambda \, net) = \frac{2}{1 + \exp(-\lambda \, net)} - 1    (32.7)
These continuous activation functions allow for the gradient-based training of multilayer networks.
Typical activation functions are shown in Fig. 32.4. When neurons with the additional threshold input are used (Fig. 32.3(b)), the λ parameter can be eliminated from Eqs. (32.6) and (32.7), and the steepness of the neuron response can be controlled by weight scaling alone. There is therefore no real need for neurons with variable gains.
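The two continuous activations can be sketched in a few lines; this is an illustrative example, not handbook code, and the parameter name `lam` stands for the λ gain of the text. The loop verifies numerically that the two forms in Eq. (32.7) agree:

```python
import math

# Continuous activation functions of Eqs. (32.6) and (32.7). With the
# augmented neuron of Fig. 32.3(b), the gain lam can be folded into the
# weights, which is why lam defaults to 1 here.

def unipolar_sigmoid(net, lam=1.0):
    """Eq. (32.6): 1 / (1 + exp(-lam * net)), output in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-lam * net))

def bipolar_sigmoid(net, lam=1.0):
    """Eq. (32.7): tanh(0.5*lam*net) = 2/(1 + exp(-lam*net)) - 1."""
    return math.tanh(0.5 * lam * net)

# Numerical check of the identity stated in Eq. (32.7):
for net in (-2.0, -0.5, 0.0, 0.5, 2.0):
    lhs = bipolar_sigmoid(net, lam=1.3)
    rhs = 2.0 * unipolar_sigmoid(net, lam=1.3) - 1.0
    assert abs(lhs - rhs) < 1e-12
```

The identity holds because tanh(x/2) = (1 − e^(−x)) / (1 + e^(−x)), which is exactly 2/(1 + e^(−x)) − 1; both functions are differentiable everywhere, which is what makes gradient-based training possible.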
Note that even neuron models with continuous activation functions are far from an actual biological
neuron, which operates with frequency-modulated pulse trains.
©2002 CRC Press LLC

