PVT Property Correlations
connection defines which nodes are connected to one another in the calcula-
tion of connection weights. The selection among different connection layouts
depends on the network application and configuration.
Weight Initialization
The first step of the ANN calculation is to initialize the connection
weights. Several techniques can be used for weight initialization. A
straightforward approach is to set all weights to a constant value. The
weights can also be initialized with random values, or with a booster
algorithm, which initializes the weights through random forward calculations
over random subsets of the training and testing data. A good weight
initialization can speed up the convergence of the ANN during training.
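The constant and random initialization schemes described above can be sketched as follows (the layer sizes and value ranges are illustrative assumptions; the booster algorithm is not shown):

```python
import numpy as np

rng = np.random.default_rng(42)
shape = (3, 4)  # weights between a hypothetical 3-node and 4-node layer

# Constant initialization: every weight starts at the same value.
w_const = np.full(shape, 0.1)

# Random initialization: small values drawn from a uniform distribution.
w_random = rng.uniform(-0.5, 0.5, size=shape)

print(w_const[0, 0], w_random.shape)
```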
Transformation Function
The transformation function computes the value at a particular node from the
values and weights of the nodes connected to it. Of the several types of
transformation function, the most commonly used is the sum of the connected
node values multiplied by their weights. An alternative is to take the
maximum (or minimum, average, etc.) of the weighted values of the nodes
connected to the node being calculated.
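Both transformation functions can be illustrated in a few lines; the node values and weights below are made up for the example.

```python
import numpy as np

# Values of the nodes connected to the node being computed, and the
# weights of those connections (hypothetical numbers).
values = np.array([0.2, 0.8, 0.5])
weights = np.array([0.4, -0.3, 0.9])

# Most common transformation: sum of connected node values times weights.
summed = np.dot(values, weights)

# Alternative transformation: maximum of the weighted node values.
maxed = np.max(values * weights)

print(summed, maxed)
```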
Activation Function
The activation function converts the calculated hidden-layer node values
into different values through typical algebraic functions. Examples of
activation functions include the cube, ELU, hard sigmoid, hard tanh,
identity, leaky ReLU, rational tanh, ReLU, RReLU, sigmoid, softmax,
softplus, softsign, and tanh functions. The objective of the activation
function is to keep the node values bounded and thereby maintain the
convergence of the ANN. The activation function can be applied to the output
nodes as well. The most commonly used activation function in PVT prediction
applications is the sigmoid function.
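A minimal sketch of the sigmoid activation, applied elementwise to hypothetical hidden-layer values:

```python
import numpy as np

def sigmoid(x):
    """Sigmoid activation: squashes any real value into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical transformed values of three hidden-layer nodes.
hidden = np.array([-2.0, 0.0, 3.0])
activated = sigmoid(hidden)

print(activated)  # all values bounded between 0 and 1
```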
Objective Function
The objective function measures the difference between the actual output
values of the training dataset and the output node values calculated by the
ANN. Several types of error-minimization function are available; the mean
squared error is the most commonly used objective function. Other objective
functions include exponential log likelihood, cross entropy, multiclass
cross entropy, negative log likelihood, root mean squared cross entropy, and
many others. The choice of objective function for a particular ANN depends
on the problem, the number of output nodes, and the interdependency among
the output nodes.
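The mean squared error named above as the most common objective function can be sketched as follows; the target and predicted values are hypothetical:

```python
import numpy as np

def mean_squared_error(actual, predicted):
    """Mean of squared differences between training targets and ANN outputs."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return np.mean((actual - predicted) ** 2)

# Hypothetical training targets and network outputs for three samples.
mse = mean_squared_error([1.0, 2.0, 3.0], [1.1, 1.9, 3.2])
print(mse)  # approximately 0.02
```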