                   is expressed mathematically as
$$
\begin{aligned}
\text{minimize} \quad & F(x) \\
\text{subject to} \quad & g_i(x) = 0 \qquad i = 1, 2, \ldots, n_e \\
& h_j(x) \ge 0 \qquad j = 1, 2, \ldots, n_i
\end{aligned}
\tag{5.55}
$$
                   One approach to solve (5.55), the penalty method, is to add to F(x) penalty terms that push
                   x away from regions in which constraints are violated,
$$
F_\mu(x) = F(x) + \frac{1}{2\mu} \left\{ \sum_{i=1}^{n_e} [g_i(x)]^2 + \sum_{j=1}^{n_i} H(-h_j(x))\,[h_j(x)]^2 \right\}
\tag{5.56}
$$
The Heaviside step function H(x) is
$$
H(x) = \begin{cases} 1, & x \ge 0 \\ 0, & x < 0 \end{cases}
\tag{5.57}
$$
                   The problem with this approach is that to enforce the constraints exactly, we must take the
                   limit µ → 0. But as this limit is approached, the penalty terms come to dominate the actual
cost function of interest, and so the numerical minimization of F_µ(x) in this limit is difficult.
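As a minimal sketch of (5.56), the code below assembles F_µ(x) for a small hypothetical problem and minimizes it for a decreasing sequence of µ values with SciPy's unconstrained minimizer; the test problem and all names below are illustrative, not from the text.

import numpy as np
from scipy.optimize import minimize

# Hypothetical test problem (not from the text):
#   minimize   F(x) = (x1 - 1)^2 + (x2 - 2)^2
#   subject to g(x) = x1 + x2 - 2  = 0      (equality)
#              h(x) = x1          >= 0      (inequality)

def F(x):
    return (x[0] - 1.0)**2 + (x[1] - 2.0)**2

def g(x):
    return x[0] + x[1] - 2.0

def h(x):
    return x[0]

def F_mu(x, mu):
    # Penalty function (5.56); the inequality term contributes only
    # where the constraint is violated, i.e. where H(-h(x)) = 1.
    penalty = g(x)**2 + (h(x)**2 if h(x) < 0.0 else 0.0)
    return F(x) + penalty / (2.0 * mu)

x0 = np.array([0.0, 0.0])
for mu in [1.0, 1e-2, 1e-4]:
    result = minimize(lambda x: F_mu(x, mu), x0)
    x0 = result.x          # warm start the next, stiffer subproblem
    print(mu, result.x)    # approaches the constrained minimum (0.5, 1.5)

Warm-starting each minimization from the previous solution helps, but as µ shrinks the Hessian of F_µ becomes increasingly ill-conditioned, which is exactly the difficulty described above.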
                     Here we consider the augmented Lagrangian method, which converts the constrained
                   problem into a sequence of unconstrained minimizations. We first treat equality constraints,
                   and then extend the method to include inequality constraints.

                   Optimization with equality constraints

                   For simplicity, we first restrict our discussion to the case where we have only equality
                   constraints,

$$
\begin{aligned}
\text{minimize} \quad & F(x) \\
\text{subject to} \quad & g_i(x) = 0 \qquad i = 1, 2, \ldots, n_e
\end{aligned}
\tag{5.58}
$$

                   To make things even simpler, we consider at first only a single constraint,
$$
\begin{aligned}
\text{minimize} \quad & F(x) \\
\text{subject to} \quad & g(x) = 0
\end{aligned}
\tag{5.59}
$$
For an unconstrained problem, a necessary condition for x_min to be a minimum is that the local gradient be zero,
$$
\nabla F \big|_{x_{\min}} = 0
\tag{5.60}
$$
What is the analogous necessary condition for a point to be a constrained minimum in the presence of an equality constraint?
Let x_min be a constrained minimum. Then, the curve g(x) = 0 and the contours of F(x) will look something like Figure 5.11. If x_min is a constrained minimum, we cannot move in either direction along the curve g(x) = 0 and decrease the cost function.
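As a concrete illustration (a hypothetical example, not from the text), take F(x) = x_1^2 + x_2^2 with the single constraint g(x) = x_1 + x_2 - 1 = 0. The constrained minimum lies at x_min = (1/2, 1/2), and moving along the constraint line in either direction only increases F:
$$
F\!\left(\tfrac{1}{2} + \varepsilon,\; \tfrac{1}{2} - \varepsilon\right)
= \tfrac{1}{2} + 2\varepsilon^{2} \;\ge\; F(x_{\min})
\quad \text{for all } \varepsilon,
$$
because ∇F|_{x_min} = (1, 1) has no component along the tangent direction t ∝ (1, −1).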
If t is a tangent vector to the curve at x_min, a first-order Taylor approximation in the tangent direction gives
$$
F(x_{\min} + \varepsilon t) - F(x_{\min}) = \varepsilon \, \nabla F \big|_{x_{\min}} \cdot t
\tag{5.61}
$$
                   Thus, at a constrained minimum, ∇F must be perpendicular to any tangent direction of the