
1.4.2. Problems with equality and inequality constraints


Let fi : ℝn → ℝ be differentiable functions of n real variables. The constrained optimization problem with equality and inequality constraints is the following problem:

f0(x) → min,   fi(x) ≤ 0, i = 1, … , m,   fi(x) = 0, i = m + 1, … , m + s. [1.3]

We now formulate necessary conditions for a solution of problem [1.3].

THEOREM 1.19.– (Theorem on indeterminate Lagrange multipliers) Let x* be a point of local solution of the problem [1.3], and let the functions fi(x), i = 0, … , m + s, be continuously differentiable in some neighborhood U of the point x*. Then there exist Lagrange multipliers λ0, λ1, … , λm+s, not all equal to zero, such that for the Lagrange function

L(x, λ) = λ0 f0(x) + λ1 f1(x) + … + λm+s fm+s(x)

the following conditions hold true:

– stationarity condition with respect to x: Lx′(x*, λ) = λ0 f0′(x*) + λ1 f1′(x*) + … + λm+s fm+s′(x*) = 0;

– complementary slackness condition: λi fi(x*) = 0, i = 1, … , m;

– non-negativity condition: λi ≥ 0, i = 0, 1, … , m.
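
In numerical work, the conditions of theorem 1.19 can be checked directly at a candidate point. The following Python sketch is illustrative only: the objective, the constraints, the candidate point and the multipliers are hypothetical and are not taken from the text; it merely shows how stationarity, complementary slackness, non-negativity and admissibility can be verified for a problem with one inequality and one equality constraint.

import numpy as np

# Hypothetical illustration: f0(x) = x1^2 + x2^2 -> min,
# f1(x) = 1 - x1 - x2 <= 0 (inequality), f2(x) = x1 - x2 = 0 (equality).
def grad_f0(x): return np.array([2.0 * x[0], 2.0 * x[1]])
def f1(x):      return 1.0 - x[0] - x[1]
def grad_f1(x): return np.array([-1.0, -1.0])
def f2(x):      return x[0] - x[1]
def grad_f2(x): return np.array([1.0, -1.0])

def check_kkt(x, lam0, lam1, lam2, tol=1e-8):
    # stationarity: lam0*f0'(x) + lam1*f1'(x) + lam2*f2'(x) = 0
    stationarity = lam0 * grad_f0(x) + lam1 * grad_f1(x) + lam2 * grad_f2(x)
    # complementary slackness: lam1 * f1(x) = 0 (inequality constraints only)
    slackness = lam1 * f1(x)
    # non-negativity: lam0 >= 0, lam1 >= 0 (the equality multiplier lam2 is free)
    nonneg = lam0 >= -tol and lam1 >= -tol
    admissible = f1(x) <= tol and abs(f2(x)) <= tol
    return (np.linalg.norm(stationarity) <= tol and abs(slackness) <= tol
            and nonneg and admissible)

# Candidate point x* = (1/2, 1/2) with lam0 = 1, lam1 = 1, lam2 = 0.
print(check_kkt(np.array([0.5, 0.5]), 1.0, 1.0, 0.0))  # True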

From theorem 1.19 we obtain the following rule of indeterminate Lagrange multipliers for solving constrained optimization problems with equality and inequality constraints (a short symbolic illustration is given after the list):

1) Write the Lagrange function L(x, λ) = λ0 f0(x) + λ1 f1(x) + … + λm+s fm+s(x).

2) Write the necessary conditions:

– stationarity condition: λ0 f0′(x) + λ1 f1′(x) + … + λm+s fm+s′(x) = 0;

– complementary slackness condition: λi fi(x) = 0, i = 1, … , m;

– non-negativity condition: λi ≥ 0, i = 0, 1, … , m.

3) Find the critical points, that is, all admissible points that satisfy the necessary conditions, considering separately the cases λ0 = 0 and λ0 ≠ 0.

4) Find a solution of the problem among the critical points, or prove that the problem has no solution.
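
As an illustration of this rule, the four steps can be carried out symbolically, for instance with sympy. The problem below is a hypothetical toy example with a single equality constraint, not one of the problems of this section.

import sympy as sp

# Hypothetical toy problem: f0 = x1^2 + x2^2 -> min subject to f1 = x1 + x2 - 1 = 0.
x1, x2, lam0, lam1 = sp.symbols('x1 x2 lam0 lam1', real=True)
f0 = x1**2 + x2**2
f1 = x1 + x2 - 1

# 1) Lagrange function L = lam0*f0 + lam1*f1.
L = lam0 * f0 + lam1 * f1

# 2) Stationarity conditions dL/dx1 = dL/dx2 = 0, together with the constraint.
eqs = [sp.diff(L, x1), sp.diff(L, x2), f1]

# 3) Critical points: lam0 = 0 would force lam1 = 0 (excluded), so set lam0 = 1.
sols = sp.solve([e.subs(lam0, 1) for e in eqs], [x1, x2, lam1], dict=True)
print(sols)  # [{x1: 1/2, x2: 1/2, lam1: -1}]

# 4) The unique critical point (1/2, 1/2) is the solution, since f0 is convex and coercive.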

REMARK 1.1.– When using the rule of indeterminate Lagrange multipliers for constrained optimization problems with equality constraints only, the number λ0 may be chosen either positive or negative. For constrained optimization problems with equality and inequality constraints, the sign of λ0 is significant: the non-negativity condition requires λ0 ≥ 0.

EXAMPLE 1.2.– Solve the constrained optimization problem


The only obvious solution of this problem is the point (0, 0). We solve the problem by the Lagrange method.

1) Write the Lagrange function.

2) Write the stationarity equations.

3) If λ0 = 1, we obtain a system of equations whose first equation is incompatible with the constraint of the problem. Therefore, this system of equations has no solutions.

4) If λ0 = 0, then x1 = 0, x2 = 0 is a solution of the system of equations.

Answer. (0, 0) ∈ absmin.

Example 1.2 shows that, when applying the rule of indeterminate Lagrange multipliers, it is not always possible to take λ0 = 1.

EXAMPLE 1.3.– Solve the constrained optimization problem


where a > 0 and b > 0 are given numbers.

Solution.

1) Write out the regular Lagrange function (the regularity condition is satisfied here, as indicated in theorem 1.15):

2) Since

the system of equations for determining the stationary points has the form:


This system of equations has three solutions:


3) Next we have

For the three solutions found, this matrix takes, respectively, the forms


The condition fi′(x)h = 0, i = 1, … , m, for the first two solutions means that h2 = 0 and h1 = 0, respectively. It is clear from this that the matrices A1 and A2 satisfy the conditions of theorem 1.17 (although they are not positive definite). Therefore, the points (0, 1) and (1, 0) are strict local solutions of the problem. For the matrix A3, the conditions of theorem 1.17 are not satisfied. That is why the third stationary point cannot be a solution of the minimization problem. This point is a local solution of the maximization problem of the same function under the same restrictions.
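
Conditions of this type, positive definiteness of a matrix only on the subspace of directions h annihilated by the gradients of the active constraints, are easy to test numerically. The following Python sketch is illustrative only: the matrix A and the constraint gradient are hypothetical stand-ins, not the matrices A1, A2, A3 of this example.

import numpy as np

def definite_on_subspace(A, active_grads):
    # Restrict the quadratic form h -> h^T A h to the subspace
    # {h : <g, h> = 0 for every active constraint gradient g}.
    G = np.atleast_2d(active_grads)
    # Orthonormal basis of the null space of G via the SVD.
    _, s, vt = np.linalg.svd(G)
    rank = int(np.sum(s > 1e-12))
    N = vt[rank:].T                      # columns span {h : G h = 0}
    if N.shape[1] == 0:                  # the subspace is {0}: condition holds trivially
        return True
    reduced = N.T @ A @ N                # restricted quadratic form
    return bool(np.all(np.linalg.eigvalsh(reduced) > 0))

# Hypothetical data: A is indefinite on R^2 but positive on the line h1 = 0.
A = np.array([[-1.0, 0.0],
              [0.0,  2.0]])
g = np.array([[1.0, 0.0]])               # active gradient forces h1 = 0
print(definite_on_subspace(A, g))        # True, although A itself is not positive definite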

Answer. (0, 1) ∈ locmin, (1, 0) ∈ locmin, and the third stationary point ∈ locmax.


EXAMPLE 1.4.– Solve the constrained optimization problem


Solution.

1) Write out the Lagrange function

2) Write the necessary conditions:

– stationarity condition;

– complementary slackness condition;

– non-negativity condition: λ0 ≥ 0, λ1 ≥ 0.

3) If λ0 = 0, then, in accordance with the stationarity condition, we have λ1 = 0 and λ2 = 0. So all Lagrange multipliers are zero, which contradicts the conditions of the Lagrange theorem 1.19. Take λ0 = 1/2. Let λ1 ≠ 0. Then, by the complementary slackness condition, we have 2x1 – x2 + x3 – 5 = 0. We express x1, x2, x3 through λ1, λ2 and substitute them into this equation; we get λ1 = –9/14 < 0. This contradicts the non-negativity condition. Let λ1 = 0; then x1 = x2 = x3 = 1 is a critical point.

4) The objective function f0(x) → +∞ as ∥x∥ → ∞. By the corollary of the Weierstrass theorem, a solution of the problem exists. Since the critical point is unique, the solution of the problem can only be this point (a numerical cross-check is sketched below).
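
The existence-plus-uniqueness argument of step 4) can also be cross-checked numerically. Since the objective and the remaining constraints of example 1.4 are not reproduced in this text, the sketch below uses a hypothetical coercive stand-in objective together with the inequality 2x1 – x2 + x3 ≤ 5 mentioned above; it only illustrates how such a check could be organized with scipy.

import numpy as np
from scipy.optimize import minimize

# Hypothetical stand-in objective (coercive), NOT the objective of example 1.4.
f0 = lambda x: float(np.sum(x**2))

# Inequality constraint from the text: 2*x1 - x2 + x3 - 5 <= 0.
# scipy expects constraints in the form g(x) >= 0, hence the sign change.
cons = [{'type': 'ineq', 'fun': lambda x: 5.0 - (2.0 * x[0] - x[1] + x[2])}]

res = minimize(f0, x0=np.array([10.0, -10.0, 10.0]), constraints=cons)
print(res.x)  # converges to the unique minimizer (the origin) of the stand-in problem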

Answer. (1, 1, 1) ∈ absmin, Smin = 3.

EXAMPLE 1.5.– An example of the irregular problem. Consider the constrained optimization problem


Figure 1.1. Example 1.5

Solution. Figure 1.1 depicts the admissible set of the problem and a level line of the objective function. The solution of the problem is the point x* shown in the figure. The first and the second restrictions are active at this point. Moreover, the vector f0′(x*) cannot be represented as a linear combination of the vectors f1′(x*) and f2′(x*). Therefore, the stationarity relation

λ0 f0′(x*) + λ1 f1′(x*) + λ2 f2′(x*) = 0

can hold at the point x* only with λ0 = 0. The gradients f1′(x*) and f2′(x*) in this case are linearly dependent.

Answer. x* ∈ absmin, Smin = 0.

EXAMPLE 1.6.– Solve the convex constrained optimization problem


Solution. Slater’s condition is fulfilled. Therefore, we write the regular Lagrange function:


The system for finding stationary points in this case (s = 0, n = 2, k = m = 3) can be written in the form


At the point under consideration, the first and the third restrictions are active, and the second is passive. Therefore, λ2 = 0. As a result, we obtain a system for determining λ1 and λ3:


Solving this system, we find λ3 = 1/2 and the corresponding value of λ1. The point under consideration is a solution of the problem. One can check that there are no other stationary points and, therefore, no other solutions of the problem (see Figure 1.2).

Answer. The point found above ∈ absmin.
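
Slater's condition, invoked at the beginning of this solution, requires an admissible point at which every inequality constraint holds strictly. Since the constraints of example 1.6 are not reproduced in this text, the following Python sketch checks the condition for a hypothetical set of convex constraints.

import numpy as np

# Hypothetical convex inequality constraints g_i(x) <= 0 (NOT those of example 1.6).
constraints = [
    lambda x: x[0]**2 + x[1]**2 - 4.0,   # inside the disc of radius 2
    lambda x: -x[0],                     # x1 >= 0
    lambda x: -x[1],                     # x2 >= 0
]

def slater_holds(point, constraints):
    # Slater's condition: some admissible point satisfies every constraint strictly.
    return all(g(point) < 0 for g in constraints)

print(slater_holds(np.array([1.0, 1.0]), constraints))  # True: (1, 1) is strictly admissible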

EXAMPLE 1.7.– Let a > 0 and b > 0 be given numbers with a < b. Find the points of local minimum and local maximum of the function


on the set of solutions of the system


Solution. We denote this set by X. Let us write the Lagrange function



Figure 1.2. Example 1.6

The system for determining stationary points has the form

[1.4]

[1.5]

[1.6]

[1.7]

[1.8]

Let x1 = 0. Then it follows from the system that either x2 = 1 or x2 ≤ –1. In the latter case λ1 = 0; if, moreover, x2 < –1, then λ2 = 0, and then λ0 = 0 as well, which contradicts the requirement that not all multipliers are zero. Hence x2 = 1 or x2 = –1, and we easily obtain the first two groups of solutions of the system.

1) x1 = 0, x2 = 1, bλ0 + 3λ1 − 2λ2 = 0, λ1 ≥ 0, λ2 ≥ 0, (λ1, λ2) ≠ 0;

2) x1 = 0, x2 = −1, bλ0 − 2λ2 = 0, λ1 = 0, λ2 > 0.

Similarly, assuming that x2 = 0, we obtain two other groups of solutions:

3) x1 = 1, x2 = 0, aλ0 + 3λ1 − 2λ2 = 0, λ1 ≥ 0, λ2 ≥ 0, (λ1, λ2) ≠ 0;

4) x1 = −1, x2 = 0, aλ0 − 2λ2 = 0, λ1 = 0, λ2 > 0.

Assume now that x1 ≠ 0 and x2 ≠ 0. Then the equations of the system can be presented in the form


If here λ1 = 0, then λ0 = 0, since a ≠ b. But then λ2 = 0, which contradicts the system of conditions. It remains to assume that λ1 > 0. From the remaining equations we then deduce that λ2 = 0. Now we can easily obtain one more group of solutions of the system:


Note that in groups (1) and (3) the multiplier λ0 can take both positive and negative values, in (2) and (4) it can take only positive values, and in (5) only negative values. Therefore, (0, 1) and (1, 0) are stationary points both in the minimization problem and in the maximization problem, (0, –1) and (–1, 0) are stationary points only in the minimization problem, and the point of group (5) is a stationary point only in the maximization problem.

We now examine the stationary points for optimality. The function f is strongly convex on ℝ2. Therefore, it attains its global minimum on any nonempty closed set, in particular on X. We calculate the values of f at the stationary points of the minimization problem:


Since a < b, the points (1, 0) and (–1, 0) are points of the global minimum of the function f on X.

Let us represent the function f in the form


If we move from the points (0, 1) and (0, –1), remaining on the circle x1² + x2² = 1 and hence in X, then the value of f decreases. Consequently, these points are not points of local minimum of f on X. At the same time, for any ε > 0 the point (–ε, 1) belongs to X and f(0, 1) < f(–ε, 1). Therefore, the point (0, 1) is not a point of local maximum of f on X. Consequently, the stationary points (0, 1) and (0, –1) are not solutions of the problem.

We now consider the matrix of the second derivatives of the Lagrangian function:


For the values of group (5), this matrix is as follows:


Since λ0 < 0, this matrix is positive definite, so the sufficient conditions for a minimum of –f, that is, for a maximum of f, are fulfilled. Consequently, the point of group (5) is a point of strict local maximum of f on X.

Answer. (1, 0) ∈ absmin, (–1, 0) ∈ absmin, the point of group (5) ∈ absmax. Δ
