(left plot in Fig. 9.1), there is only one minimum or optimal solution. By
definition, a function is convex if the line segment connecting any two points
on its graph always lies on or above the graph. This is not the case for the
right-hand plot in Fig. 9.1, which is nonconvex. The definition generalizes
directly to functions of several variables. Although the majority of real-life
optimization problems are nonconvex, classical, simple optimization techniques
can quickly find local optima. Global optimization techniques are more
complicated and, accordingly, more computationally expensive.
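As a minimal sketch (not from the book), the convexity definition above can be checked numerically by sampling pairs of points and testing the inequality f(tx + (1 − t)y) ≤ t f(x) + (1 − t) f(y); the function names and tolerances below are illustrative assumptions.

```python
# Numerically test the convexity inequality on sampled point pairs.
import numpy as np

def is_convex_on_samples(f, xs, n_pairs=2000, seed=0):
    """Return False if any sampled pair violates the convexity inequality."""
    rng = np.random.default_rng(seed)
    for _ in range(n_pairs):
        x, y = rng.choice(xs, size=2)
        t = rng.random()
        if f(t * x + (1 - t) * y) > t * f(x) + (1 - t) * f(y) + 1e-12:
            return False
    return True

xs = np.linspace(-3.0, 3.0, 400)
print(is_convex_on_samples(lambda x: x**2, xs))         # True: x^2 is convex
print(is_convex_on_samples(lambda x: x**4 - x**2, xs))  # False: two local minima
```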
9.1.2 Search Methods and Optimization Algorithms
To briefly explain how an optimization algorithm works, the basic concepts of
search techniques are outlined here. An optimization technique searches for a
point that minimizes the objective function. Many mathematical techniques are
iterative: they start from an arbitrary initial point in the search space and
update that point at each iteration:
$$x_{k+1} = x_k + \alpha_k g_k \tag{9.3}$$
in which $\alpha_k$ is the step size and $g_k$ is the step direction. During each
iteration, the decision-variable vector is updated using Eq. (9.3); the step size
and step direction are computed according to the optimization algorithm (e.g. the
maximum-gradient, or steepest-descent, method).
Fig. 9.2 shows a 2D schematic of an optimization that starts at point $x_1$ and
moves iteratively towards the optimum point.
FIG. 9.2 Two-dimensional schematic of optimization using iterative gradient techniques.
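A short sketch of the iterative update in Eq. (9.3) is given below; it is an assumed implementation, not the book's code. It uses the steepest-descent direction (the negative gradient) and a fixed step size on a simple two-variable quadratic objective; the function, starting point, and step size are illustrative choices.

```python
# Iterative update x_{k+1} = x_k + alpha_k * g_k (Eq. 9.3) with the
# steepest-descent direction g_k = -grad f(x_k) and a fixed step size.
import numpy as np

def f(x):
    # Simple convex quadratic with its minimum at (1, -2).
    return (x[0] - 1.0)**2 + 4.0 * (x[1] + 2.0)**2

def grad_f(x):
    return np.array([2.0 * (x[0] - 1.0), 8.0 * (x[1] + 2.0)])

x = np.array([4.0, 3.0])      # arbitrary initial point x_1
alpha = 0.1                   # fixed step size alpha_k
for k in range(200):
    g = -grad_f(x)            # step direction from the gradient
    x = x + alpha * g         # update via Eq. (9.3)
    if np.linalg.norm(g) < 1e-8:
        break                 # stop once the gradient is (nearly) zero

print(x)                      # approaches the optimum point (1, -2)
```

In practice the step size is often chosen adaptively (e.g. by a line search) rather than fixed, but the update structure of Eq. (9.3) is the same.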