Page 284 - MATLAB an introduction with applications
4. Find the value of the next penalty parameter, r_{k+1}, as
r_{k+1} = c r_k
where c < 1.
5. Set the new value of k = k + 1, take the new starting point as X_1 = X_k*, and go to step 2.
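The outer loop above (minimize the penalized function, shrink r_k by the factor c, restart from the previous minimizer X_k*) can be sketched in Python. The book gives no listing at this point, so the function names (interior_penalty, golden_section) and the one-dimensional test problem are illustrative assumptions, not the text's own code:

```python
import math

def golden_section(fun, a, b, tol=1e-8):
    """Golden-section search for the minimum of a unimodal fun on [a, b]."""
    gr = (math.sqrt(5.0) - 1.0) / 2.0
    c1, c2 = b - gr * (b - a), a + gr * (b - a)
    while b - a > tol:
        if fun(c1) < fun(c2):
            b, c2 = c2, c1                 # minimum lies in [a, old c2]
            c1 = b - gr * (b - a)
        else:
            a, c1 = c1, c2                 # minimum lies in [old c1, b]
            c2 = a + gr * (b - a)
    return 0.5 * (a + b)

def interior_penalty(f, gs, bracket, r1=1.0, c=0.1, tol=1e-6, max_outer=30):
    """SUMT with an interior penalty: minimize f(x) subject to g_j(x) <= 0
    by repeating steps 2-5 of the text: minimize phi(x, r_k), test for
    convergence, then shrink r_{k+1} = c * r_k with c < 1."""
    def phi(x, r):
        # Interior penalty function phi = f(x) - r * sum_j 1/g_j(x);
        # points that violate any constraint are rejected outright.
        if any(g(x) >= 0.0 for g in gs):
            return math.inf
        return f(x) - r * sum(1.0 / g(x) for g in gs)

    a, b = bracket
    r, prev = r1, math.inf
    x = 0.5 * (a + b)
    for _ in range(max_outer):
        x = golden_section(lambda t: phi(t, r), a, b)  # minimize phi(X, r_k)
        if abs(phi(x, r) - prev) < tol:                # convergence test
            return x
        prev = phi(x, r)
        r *= c                                         # r_{k+1} = c * r_k
        # the restart at X_1 = X_k* is implicit here: the same bracket is
        # searched again with the smaller penalty parameter
    return x

# Example: minimize f(x) = x subject to g(x) = 2 - x <= 0 (true minimum x* = 2)
x_star = interior_penalty(lambda x: x, [lambda x: 2.0 - x], bracket=(2.0, 10.0))
```

As r_k shrinks, the minimizer of phi approaches the constraint boundary from the feasible side, which is why the method needs the strictly feasible starting point discussed next.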
All these aspects are discussed in the following paragraphs.
Starting Feasible Point X_1:
1. Select an arbitrary point X_1 and evaluate the constraints g_j(X) at the point X_1. Since the point X_1 is arbitrary, it may not satisfy all the constraints with strict inequality sign. If r out of a total of m constraints are violated, renumber the constraints such that the last r constraints become the violated ones, that is,
g_j(X_1) < 0, j = 1, 2, …, m − r
g_j(X_1) ≥ 0, j = m − r + 1, m − r + 2, …, m
2. Identify the constraint that is violated most at the point X_1, that is, obtain the integer k such that
g_k(X_1) = max[g_j(X_1)] for j = m − r + 1, m − r + 2, …, m
3. Formulate a new optimization problem as:
Find X which minimizes g_k(X)
subject to
g_j(X) ≤ 0, j = 1, 2, …, m − r
g_j(X) − g_k(X_1) ≤ 0, j = m − r + 1, m − r + 2, …, k − 1, k + 1, …, m
4. Solve the optimization problem formulated in step 3 by taking the point X_1 as a feasible starting point using the interior penalty function method. Note that this optimization method can be terminated whenever the value of the objective function g_k(X) drops below zero. The solution obtained, X_M, will satisfy at least one more constraint than did the original point X_1.
5. If all the constraints are not satisfied at the point X_M, set the new starting point as X_1 = X_M, renumber the constraints such that the last r constraints will be the unsatisfied ones (this value of r will be different from the previous value), and go to step 2.
This procedure is repeated until all the constraints are satisfied and a point X_1 = X_M is obtained for which g_j(X_1) < 0, j = 1, 2, …, m.
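A minimal sketch of this feasible-point procedure in Python. The book gives no listing here; the name find_feasible_point and the two sample constraints are assumptions, and a crude fixed-step descent on the most violated constraint stands in for the interior penalty sub-solver of step 4:

```python
import math

def find_feasible_point(gs, x1, step=0.05, max_iter=2000):
    """Drive an infeasible scalar point x1 toward g_j(x) < 0 for all j:
    pick the most violated constraint g_k (step 2), reduce it while no
    other constraint rises above the current worst violation (the bound
    of step 3), and stop once every g_j(x) < 0 holds strictly."""
    x = x1
    for _ in range(max_iter):
        vals = [g(x) for g in gs]
        if all(v < 0.0 for v in vals):
            return x                                    # strictly feasible
        k = max(range(len(gs)), key=lambda j: vals[j])  # most violated index
        bound = vals[k]                                 # cap for the others
        h = 1e-6                                        # finite-difference slope
        slope = (gs[k](x + h) - gs[k](x - h)) / (2.0 * h)
        trial = x - step * math.copysign(1.0, slope)    # crude descent on g_k
        if all(g(trial) <= bound + 1e-12 for g in gs):
            x = trial                                   # step respects all caps
        else:
            step *= 0.5                                 # shrink step and retry
    raise RuntimeError("no strictly feasible point found")

# Example: g1(x) = x - 5 <= 0 and g2(x) = 1 - x <= 0; x1 = 0 violates g2,
# so the search reduces g2 until both constraints hold with strict inequality
x_feas = find_feasible_point([lambda x: x - 5.0, lambda x: 1.0 - x], x1=0.0)
```

Unlike the text's version, this sketch re-identifies the most violated constraint after every step rather than after each full sub-problem; for well-behaved constraints the two variants reach the same feasible region.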
Initial Value of the Penalty Parameter (r_1):
Since the unconstrained minimization of φ(X, r_k) is to be carried out for a decreasing sequence of r_k, it might appear that by choosing a very small value of r_1 we can avoid an excessive number of minimizations of the function φ. From a computational point of view, however, it is easier to minimize the unconstrained function φ(X, r_k) when r_k is large, so a moderate value has to be chosen for the initial penalty parameter r_1. In practice, a value of r_1 that gives the value of φ(X_1, r_1) approximately equal to 1.1 to 2.0 times the value of f(X_1) has been found to be quite satisfactory in achieving quick convergence of the process. Hence, for any initial feasible starting point X_1, the value of r_1 can be taken as
r_1 = (0.1 to 1.0) f(X_1) / [−Σ_{j=1}^{m} 1/g_j(X_1)]    ...(5.17)
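Equation (5.17) translates directly into a few lines of Python. The helper name initial_penalty_parameter and the sample f and g_j below are illustrative assumptions; note that X_1 must be strictly feasible so that every g_j(X_1) < 0:

```python
def initial_penalty_parameter(f, gs, x1, factor=0.5):
    """Eq. (5.17): r1 = (0.1 to 1.0) * f(X1) / (-sum_j 1/g_j(X1)).
    'factor' picks a value from the suggested 0.1-1.0 range; x1 must
    satisfy g_j(x1) < 0 strictly for every constraint j."""
    denom = -sum(1.0 / g(x1) for g in gs)
    return factor * f(x1) / denom

# Example: f(x) = x**2 with g1(x) = x - 5 <= 0 and g2(x) = 1 - x <= 0
# at the feasible point x1 = 2: denom = -(1/(-3) + 1/(-1)) = 4/3,
# so r1 = 0.5 * 4 / (4/3) = 1.5
r1 = initial_penalty_parameter(lambda x: x * x,
                               [lambda x: x - 5.0, lambda x: 1.0 - x],
                               x1=2.0)
```

Because every g_j(x1) is negative at a feasible point, the denominator is positive and r_1 inherits the sign of f(X_1).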