Multi-parametric Optimization and Control. Efstratios N. Pistikopoulos


If strict inequality holds in expression (1.4) for all $x_1 \neq x_2$ and $0 < \alpha < 1$, then $f(x)$ is a strictly concave function; $f(x) = -x^2$ is an example.

      1.1.1.2 Properties of Convex Functions

      1. Let $f_1(x), f_2(x), \ldots, f_k(x)$ be convex functions defined on a convex subset $X \subseteq \mathbb{R}^n$. Their summation

      (1.5)  $f(x) = \sum_{i=1}^{k} f_i(x)$

      is convex, and if at least one of them is a strictly convex function, then their summation is strictly convex (see the numerical sketch after this list).

      2. Let $a$ be a positive number and $f(x)$ be a (strictly) convex function defined on a convex subset $X \subseteq \mathbb{R}^n$. Then the product $af(x)$ is (strictly) convex.

      3. Let $f(x)$ be a (strictly) convex function defined on $X \subseteq \mathbb{R}^n$, and let $g$ be an increasing convex function defined on the range of $f(x)$ in $\mathbb{R}$. Then the composite function $g(f(x))$ defined on $X$ is a (strictly) convex function.

      4. Let $f_i(x)$, $i \in I$, be convex functions defined on a convex subset $X \subseteq \mathbb{R}^n$. If these functions are bounded from above, their pointwise supremum

      (1.6)  $f(x) = \sup_{i \in I} f_i(x)$

      is a convex function on $X$.

      5. Let $f_i(x)$, $i \in I$, be concave functions defined on a convex subset $X \subseteq \mathbb{R}^n$. If these functions are bounded from below, their pointwise infimum

      (1.7)  $f(x) = \inf_{i \in I} f_i(x)$

      is a concave function on $X$.
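
      As a numerical illustration of properties 1 and 4, the following sketch (illustrative only; the sample functions, the random sampling scheme, and the tolerance are assumptions made here, not part of the text) checks the convexity inequality $f(\alpha x_1 + (1 - \alpha)x_2) \leq \alpha f(x_1) + (1 - \alpha) f(x_2)$ on random pairs of points:

    import numpy as np

    # Two convex building blocks (illustrative choices): a quadratic and a shifted absolute value.
    f1 = lambda x: x**2
    f2 = lambda x: abs(x - 1.0)

    f_sum = lambda x: f1(x) + f2(x)      # property 1: sum of convex functions
    f_max = lambda x: max(f1(x), f2(x))  # property 4: pointwise supremum (here a max of two)

    rng = np.random.default_rng(0)
    for f in (f_sum, f_max):
        for _ in range(10_000):
            x1, x2 = rng.uniform(-5.0, 5.0, size=2)
            a = rng.uniform(0.0, 1.0)
            # Convexity inequality on the sampled pair (small tolerance for roundoff)
            assert f(a * x1 + (1 - a) * x2) <= a * f(x1) + (1 - a) * f(x2) + 1e-12
    print("convexity inequality held on all sampled pairs")

      Such a sampled check can refute convexity but never prove it; it is only a sanity test.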

      1.1.2 Optimality Conditions

      We introduce the following definitions for the solution of general nonlinear optimization problems:

      Definition 1.6 (Local Minimum)

      

A point $\bar{x} \in X$ is called a local minimum if there exists a ball $B_\varepsilon(\bar{x})$ of radius $\varepsilon > 0$ around $\bar{x}$ such that

      (1.8)  $f(\bar{x}) \leq f(x), \quad \forall x \in B_\varepsilon(\bar{x}) \cap X$

      Definition 1.7 (Global Minimum)

      

A point $\bar{x} \in X$ is called a global minimum if

      (1.9)  $f(\bar{x}) \leq f(x), \quad \forall x \in X$
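
      The distinction matters in practice, since a gradient-based search only guarantees convergence to a local minimum. A small sketch (the function $f(x) = x^4 - 3x^2 + x$ and the starting points are illustrative assumptions) in which the result depends on the initial guess:

    import numpy as np
    from scipy.optimize import minimize

    # f has two local minima; only the left one is the global minimum.
    f = lambda x: x[0]**4 - 3.0 * x[0]**2 + x[0]

    for x0 in (2.0, -2.0):
        res = minimize(f, x0=np.array([x0]))
        print(f"start {x0:+.1f} -> x* = {res.x[0]:+.4f}, f(x*) = {res.fun:+.4f}")
    # start +2.0 converges to the local minimum near x =  1.13 (f ~ -1.07)
    # start -2.0 converges to the global minimum near x = -1.30 (f ~ -3.51)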

The minimization of an objective function $f(x)$ subject to the inequality constraints $g_i(x) \leq 0$, $i = 1, \ldots, p$, and equality constraints $h_j(x) = 0$, $j = 1, \ldots, q$, is denoted as

      (1.10)  $\min_{x \in X} f(x) \quad \text{subject to} \quad g_i(x) \leq 0, \; i = 1, \ldots, p, \quad h_j(x) = 0, \; j = 1, \ldots, q$

      Problem (1.10) is a nonlinear optimization problem if and only if at least one of $f(x)$, $g_i(x)$, and $h_j(x)$ is a nonlinear function. We assume that the aforementioned functions are continuous and differentiable.
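
      As a concrete instance of formulation (1.10), a sketch (the objective, the constraints, and the SLSQP solver choice are illustrative assumptions, not from the text) using scipy.optimize.minimize:

    import numpy as np
    from scipy.optimize import minimize

    # min  f(x) = (x1 - 2)^2 + (x2 - 1)^2
    # s.t. g(x) = x1^2 - x2   <= 0   (nonlinear inequality)
    #      h(x) = x1 + x2 - 2  = 0   (linear equality)
    f = lambda x: (x[0] - 2.0)**2 + (x[1] - 1.0)**2
    constraints = [
        {"type": "ineq", "fun": lambda x: -(x[0]**2 - x[1])},  # SciPy expects fun(x) >= 0
        {"type": "eq",   "fun": lambda x: x[0] + x[1] - 2.0},
    ]
    res = minimize(f, x0=np.array([0.0, 0.0]), constraints=constraints, method="SLSQP")
    print(res.x, res.fun)  # optimum at (1, 1) with f = 1; both constraints are active there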

      Definition 1.8 (Active Constraints)

      An inequality constraint $g_i(x) \leq 0$ is called active at a point $\bar{x}$ if $g_i(\bar{x}) = 0$. Conversely, $g_i(x) \leq 0$ is called inactive at $\bar{x}$ if $g_i(\bar{x}) < 0$. The set of active inequality constraints at $\bar{x}$ is denoted by $\mathcal{A}(\bar{x}) = \{i \mid g_i(\bar{x}) = 0\}$.
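
      A numerical sketch of this classification (the constraint functions, the point $\bar{x}$, and the tolerance are illustrative assumptions):

    import numpy as np

    # Inequality constraints g_i(x) <= 0, evaluated at a candidate point x_bar.
    g = [
        lambda x: x[0] + x[1] - 2.0,  # equals  0 at (1, 1): active
        lambda x: x[0]**2 - 4.0,      # equals -3 at (1, 1): inactive
    ]
    x_bar = np.array([1.0, 1.0])
    tol = 1e-8  # numerical tolerance for "equal to zero"

    active   = [i for i, gi in enumerate(g) if abs(gi(x_bar)) <= tol]
    inactive = [i for i, gi in enumerate(g) if gi(x_bar) < -tol]
    print("active:", active, "inactive:", inactive)  # active: [0] inactive: [1]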

      Remark 1.1

      If one step of the dual simplex algorithm consists of changing one element of the active set, i.e. exchanging a single constraint in the active set for another, then the dual pivot involving that constraint yields the new active set.

      The first‐order constraint qualifications presented in the following text are necessary prerequisites to identify whether a feasible point $\bar{x}$ is a local optimum of the function $f(x)$.

       Linear independence constraint qualification: The gradients $\nabla g_i(\bar{x})$ for all $i \in \mathcal{A}(\bar{x})$ and $\nabla h_j(\bar{x})$ for all $j = 1, \ldots, q$ are linearly independent (see the sketch after this list).

       Slater constraint qualification: The constraints $g_i(x)$ for all $i \in \mathcal{A}(\bar{x})$ are pseudo‐convex1 at $\bar{x}$, while the constraints $h_j(x)$ for all $j = 1, \ldots, q$ are quasi‐convex or quasi‐concave.2 In addition, the gradients $\nabla h_j(\bar{x})$ for all $j = 1, \ldots, q$ are linearly independent and there exists an $x \in X$ such that $g_i(x) < 0$ for all $i \in \mathcal{A}(\bar{x})$ and $h_j(x) = 0$ for all $j = 1, \ldots, q$.
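
      A numerical sketch of the linear independence check (the gradient values are illustrative assumptions): stacking the gradients of the active inequality constraints and of all equality constraints as rows, LICQ holds at $\bar{x}$ exactly when this matrix has full row rank.

    import numpy as np

    # Rows: gradients at x_bar of the active inequality constraints and of all
    # equality constraints (illustrative numbers).
    jacobian_active = np.array([
        [1.0,  1.0],   # gradient of g_1 at x_bar (active)
        [2.0, -1.0],   # gradient of h_1 at x_bar
    ])
    licq_holds = np.linalg.matrix_rank(jacobian_active) == jacobian_active.shape[0]
    print("LICQ holds:", licq_holds)  # True: the rows are linearly independent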

      1.1.2.1 Karush–Kuhn–Tucker Necessary Optimality Conditions

Let $f(x)$ and $g_i(x)$, $i = 1, \ldots, p$, be differentiable at a feasible solution $\bar{x}$, and let $h_j(x)$, $j = 1, \ldots, q$, have continuous partial derivatives at $\bar{x}$. In addition, let $k$ be the number of active inequality constraints at $\bar{x}$. Then, if one of the aforementioned constraint qualifications holds, there exist Lagrange multipliers $\lambda_i$, $i = 1, \ldots, p$, and $\mu_j$, $j = 1, \ldots, q$, such that

      (1.11)  $\nabla f(\bar{x}) + \sum_{i=1}^{p} \lambda_i \nabla g_i(\bar{x}) + \sum_{j=1}^{q} \mu_j \nabla h_j(\bar{x}) = 0, \qquad \lambda_i g_i(\bar{x}) = 0, \quad \lambda_i \geq 0, \quad i = 1, \ldots, p$

      These conditions are the Karush–Kuhn–Tucker (KKT) Necessary Conditions, and they form the basis for the solution of nonlinear optimization problems.
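
      A sketch verifying conditions (1.11) at a candidate point (the problem data, the candidate $\bar{x}$, and the multiplier value are illustrative assumptions, not from the text):

    import numpy as np

    # min f(x) = (x1 - 2)^2 + (x2 - 1)^2   s.t.   g(x) = x1 + x2 - 2 <= 0
    grad_f = lambda x: np.array([2.0 * (x[0] - 2.0), 2.0 * (x[1] - 1.0)])
    g      = lambda x: x[0] + x[1] - 2.0
    grad_g = lambda x: np.array([1.0, 1.0])

    x_bar, lam = np.array([1.5, 0.5]), 1.0  # candidate point and Lagrange multiplier

    stationarity    = np.allclose(grad_f(x_bar) + lam * grad_g(x_bar), 0.0)
    complementarity = np.isclose(lam * g(x_bar), 0.0)
    dual_feas       = lam >= 0.0
    primal_feas     = g(x_bar) <= 1e-12
    print("KKT satisfied:", all([stationarity, complementarity, dual_feas, primal_feas]))  # True

      Here the multiplier was obtained by solving the stationarity equation on the active constraint; in general the multipliers are unknowns determined together with $\bar{x}$.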

      1.1.2.2 Karush–Kuhn–Tucker First‐Order Sufficient Optimality Conditions

      Consider the sets $J^{+} = \{j \mid \mu_j > 0\}$ and $J^{-} = \{j \mid \mu_j < 0\}$. Then, if the following conditions hold:

        $f(x)$ is pseudo‐convex at $\bar{x}$ with respect to all other feasible points $x$.

        $g_i(x)$ for all $i \in \mathcal{A}(\bar{x})$ are quasi‐convex at $\bar{x}$ with respect to all other feasible points $x$.

        $h_j(x)$ for all $j \in J^{+}$ are quasi‐convex at $\bar{x}$ with respect to all other feasible points $x$.

        $h_j(x)$ for all $j \in J^{-}$ are quasi‐concave at $\bar{x}$ with respect to all other feasible points $x$.