1.3. Optimization problems with objective functions of several variables
Let f : ℝⁿ → ℝ̄ = ℝ ∪ {−∞, +∞} be a function of n real variables x = (x₁, x₂, …, xₙ).
DEFINITION 1.3.– A function f is said to be lower semicontinuous (upper semicontinuous) at a point x̂ ∈ ℝⁿ if for every a < f(x̂) (respectively, for every a > f(x̂)) there exists a δ-neighborhood

Uδ(x̂) = {x ∈ ℝⁿ : ‖x − x̂‖ < δ}

of the point x̂ such that the inequality

f(x) > a  (respectively, f(x) < a)

holds true for all x ∈ Uδ(x̂).
THEOREM 1.8.– A function f is lower semicontinuous on ℝⁿ if and only if for all a ∈ ℝ the set f⁻¹((a, +∞]) is open (or, equivalently, the complementary set f⁻¹((−∞, a]) is closed).
PROOF.– Let f be a function lower semicontinuous on ℝⁿ, let a ∈ ℝ and let x̂ ∈ f⁻¹((a, +∞]). Then a < f(x̂), and there exists a δ-neighborhood Uδ(x̂) of the point x̂ such that for all points x ∈ Uδ(x̂) the inequality f(x) > a holds true. This means that Uδ(x̂) ⊂ f⁻¹((a, +∞]). Consequently, the set f⁻¹((a, +∞]) is open.
Vice versa, let the set f⁻¹((a, +∞]) be open for any a ∈ ℝ and let x̂ ∈ ℝⁿ. Then either f(x̂) = −∞, and the function f is lower semicontinuous at the point x̂ by agreement (there is no a < f(x̂) to check), or f(x̂) > −∞ and x̂ ∈ f⁻¹((a, +∞]) whenever a < f(x̂). Since the set f⁻¹((a, +∞]) is open, there exists a δ-neighborhood Uδ(x̂) of the point x̂ such that Uδ(x̂) ⊂ f⁻¹((a, +∞]), that is, f(x) > a for all x ∈ Uδ(x̂). Consequently, the function f is lower semicontinuous at the point x̂. □
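The characterization above can be probed numerically. Below is a minimal Python sketch (an editorial addition, not part of the original text) for two one-dimensional step functions: f is lower semicontinuous at 0, while g is not. The helper looks_lsc_at is a hypothetical name; it only tests the inequality of definition 1.3 on a finite grid for one fixed ε and δ, so it is a heuristic illustration rather than a proof.

```python
import numpy as np

# Two 1-D step functions (n = 1 for simplicity).
f = lambda x: 1.0 if x > 0 else 0.0   # f(0) = 0: lower semicontinuous at 0
g = lambda x: 1.0 if x >= 0 else 0.0  # g(0) = 1: not lower semicontinuous at 0

def looks_lsc_at(func, x_hat, eps=1e-6, delta=1e-3, n_probe=1001):
    """Heuristic probe of definition 1.3: for a = func(x_hat) - eps,
    check func(x) > a on a finite grid inside a delta-neighborhood.
    A grid check for one eps and one delta is not a proof."""
    a = func(x_hat) - eps
    xs = np.linspace(x_hat - delta, x_hat + delta, n_probe)
    return all(func(x) > a for x in xs)

print(looks_lsc_at(f, 0.0))  # True: no point near 0 drops below f(0) - eps
print(looks_lsc_at(g, 0.0))  # False: just left of 0, g = 0 < g(0) - eps
```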
THEOREM 1.9.– (Weierstrass theorem) A function that is lower (upper) semicontinuous on a non-empty bounded closed subset X of the space ℝⁿ is bounded from below (from above) on X and attains its smallest (largest) value on X.
THEOREM 1.10.– (Weierstrass theorem) If a function f(x) is lower semicontinuous and for some number a the set {x: f(x) ≤ a} is non-empty and bounded, then the function f(x) attains its absolute minimum.
COROLLARY 1.1.– If a function f(x) is lower (upper) semicontinuous on ℝⁿ and

f(x) → +∞ as ‖x‖ → +∞  (respectively, f(x) → −∞ as ‖x‖ → +∞),

then the function f(x) attains its minimum (maximum) on each closed subset of the space ℝⁿ.
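To illustrate the role of this growth condition, here is a small Python sketch (an addition; the objective below is an arbitrary coercive example, and scipy is assumed to be available). Since f(x) → +∞ as ‖x‖ → +∞, every sublevel set is bounded and the minimum over ℝ² is attained, so a local search started far away still converges to the minimizer (1, 0).

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative coercive function (an assumption, not from the text):
# f(x) = x1^2 + x2^2 - 2*x1 = (x1 - 1)^2 + x2^2 - 1, so f -> +inf as ||x|| -> +inf.
f = lambda x: x[0]**2 + x[1]**2 - 2.0 * x[0]

# Every sublevel set {x : f(x) <= a} is bounded, so the minimum over R^2
# exists (corollary 1.1) and a descent method cannot escape to infinity.
res = minimize(f, x0=np.array([5.0, -7.0]))
print(res.x, res.fun)  # expected: approximately [1. 0.] and -1.0
```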
THEOREM 1.11.– (Necessary conditions of the first order) If x̂ is a point of local extremum of a function f(x) that is differentiable at the point x̂, then all partial derivatives of the function f(x) are equal to zero at the point x̂:

∂f(x̂)/∂xᵢ = 0,  i = 1, …, n.
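In practice the stationarity conditions can be generated and solved symbolically. The following sketch (an addition; the quadratic function below is an illustrative assumption, and sympy is assumed to be available) computes the gradient and solves the system of equations ∂f/∂xᵢ = 0.

```python
import sympy as sp

# Illustrative function (an assumption, not taken from the text).
x1, x2 = sp.symbols('x1 x2', real=True)
f = x1**2 + x1*x2 + x2**2 - 3*x1

# Theorem 1.11: at a local extremum all partial derivatives vanish.
grad = [sp.diff(f, v) for v in (x1, x2)]
critical_points = sp.solve(grad, [x1, x2], dict=True)
print(grad)             # [2*x1 + x2 - 3, x1 + 2*x2]
print(critical_points)  # [{x1: 2, x2: -1}]
```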
THEOREM 1.12.– (Necessary conditions of the second order) If x̂ is a point of local minimum of the function f(x), which has second-order partial derivatives at the point x̂, then the following condition holds true:

Σᵢ,ⱼ (∂²f(x̂)/∂xᵢ∂xⱼ) hᵢhⱼ ≥ 0  for all h = (h₁, …, hₙ) ∈ ℝⁿ.

This condition means that the matrix

f″(x̂) = (∂²f(x̂)/∂xᵢ∂xⱼ),  i, j = 1, …, n,

is non-negative definite.
THEOREM 1.13.– (Sufficient conditions of the second order) Let a function f(x) have second-order partial derivatives at a point x̂ and let the following conditions hold true:

∂f(x̂)/∂xᵢ = 0,  i = 1, …, n;

Σᵢ,ⱼ (∂²f(x̂)/∂xᵢ∂xⱼ) hᵢhⱼ > 0  for all h ∈ ℝⁿ, h ≠ 0.

Then x̂ is a point of local minimum of the function f(x).

The second condition of the theorem means that the matrix

f″(x̂) = (∂²f(x̂)/∂xᵢ∂xⱼ),  i, j = 1, …, n,

is positive definite.
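Theorems 1.12 and 1.13 are convenient to apply by evaluating the Hessian matrix at a stationary point and inspecting its eigenvalues: non-negative eigenvalues are necessary for a local minimum, strictly positive ones are sufficient. A minimal sympy sketch, reusing the illustrative function from the previous snippet (again an assumption, not an example from the text):

```python
import sympy as sp

# Same illustrative function as in the sketch after theorem 1.11 (an assumption).
x1, x2 = sp.symbols('x1 x2', real=True)
f = x1**2 + x1*x2 + x2**2 - 3*x1

H = sp.hessian(f, (x1, x2))       # matrix of second-order partial derivatives
H0 = H.subs({x1: 2, x2: -1})      # Hessian at the stationary point (2, -1)
print(H0)                         # Matrix([[2, 1], [1, 2]])
print(H0.eigenvals())             # eigenvalues 1 and 3, both > 0:
                                  # positive definite, so (2, -1) is a local minimum
```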
THEOREM 1.14.– (Sylvester’s criterion) A symmetric matrix A is positive definite if and only if all its leading principal minors are positive. A matrix A is negative definite if and only if (−1)ᵏ det Aₖ > 0, k = 1, …, n, where

Aₖ = (aᵢⱼ),  i, j = 1, …, k,

is the leading principal submatrix of A formed by its first k rows and columns.

We write down the series of leading principal minors of the matrix A:

Δ₁ = a₁₁,  Δ₂ = a₁₁a₂₂ − a₁₂a₂₁,  …,  Δₖ = det Aₖ,  …,  Δₙ = det A.
Then:
– the matrix A is positive definite if Δ₁ > 0, Δ₂ > 0, …, Δₙ > 0;
– the matrix A is negative definite if Δ₁ < 0, Δ₂ > 0, Δ₃ < 0, …, that is, if (−1)ᵏ Δₖ > 0 for all k = 1, …, n;
– the matrix A is non-negative (non-positive) definite if Δₖ ≥ 0 (respectively, (−1)ᵏ Δₖ ≥ 0) for all k = 1, …, n and there exists j such that Δⱼ = 0;
– in all other cases the matrix A is indefinite.
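Sylvester’s criterion is straightforward to automate. The sketch below (an addition; the helper names leading_minors, is_positive_definite and is_negative_definite are mine) computes the leading principal minors Δ₁, …, Δₙ with numpy and applies the two definiteness tests of theorem 1.14 to a sample symmetric matrix.

```python
import numpy as np

def leading_minors(A):
    """Leading principal minors Delta_1, ..., Delta_n of a square matrix A."""
    n = A.shape[0]
    return [np.linalg.det(A[:k, :k]) for k in range(1, n + 1)]

def is_positive_definite(A):
    # Sylvester's criterion: Delta_k > 0 for every k.
    return all(d > 0 for d in leading_minors(A))

def is_negative_definite(A):
    # Sylvester's criterion: (-1)^k * Delta_k > 0 for every k.
    return all((-1) ** k * d > 0 for k, d in enumerate(leading_minors(A), start=1))

A = np.array([[10.0, -2.0], [-2.0, 10.0]])   # sample symmetric matrix
print(leading_minors(A))          # approximately [10.0, 96.0]
print(is_positive_definite(A))    # True
print(is_negative_definite(-A))   # True: -A has minors -10, 96 with alternating signs
```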
EXAMPLE 1.1.– Find solutions for the optimization problem

f(x₁, x₂) = x₁⁴ + x₂⁴ − (x₁ + x₂)² → extr.
Solution. The function f is continuous and f(x₁, x₂) → +∞ as ‖x‖ → +∞. Obviously, Smax = +∞. As a result of the Weierstrass theorem (corollary 1.1), the minimum is attained. The necessary conditions of the first order, ∂f(x)/∂x₁ = 0, ∂f(x)/∂x₂ = 0, are of the form

4x₁³ − 2(x₁ + x₂) = 0,  4x₂³ − 2(x₁ + x₂) = 0.
Solving these equations, we find the critical points (0, 0), (1, 1), (−1, −1). To use the second-order conditions, we compute the matrices of second-order partial derivatives at these points:

A₁ = f″(0, 0) = [[−2, −2], [−2, −2]],  A₂ = f″(1, 1) = f″(−1, −1) = [[10, −2], [−2, 10]].
The matrix −A₁ is non-negative definite. Therefore, the point (0, 0) satisfies the second-order necessary conditions for a maximum. However, a direct verification of the behavior of the function f in a neighborhood of the point (0, 0) shows that (0, 0) ∉ locextr f: for small t ≠ 0 we have f(t, t) = 2t⁴ − 4t² < 0 = f(0, 0), while f(t, −t) = 2t⁴ > 0. The matrix A₂ is positive definite. Thus, by theorem 1.13, the local minimum of the problem is attained at the points (1, 1), (−1, −1).
Answer. (0, 0) ∉ locextr f; (1, 1), (−1, −1) ∈ locmin f.
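For readers who wish to reproduce the computation, here is a short sympy sketch of example 1.1 (an editorial addition): it recovers the three critical points and the eigenvalues of the Hessian at each of them, which confirms the classification obtained above.

```python
import sympy as sp

# Example 1.1: f(x1, x2) = x1**4 + x2**4 - (x1 + x2)**2.
x1, x2 = sp.symbols('x1 x2', real=True)
f = x1**4 + x2**4 - (x1 + x2)**2

# First-order conditions and their real solutions.
grad = [sp.diff(f, v) for v in (x1, x2)]
critical_points = sp.solve(grad, [x1, x2], dict=True)
print(critical_points)  # expected: (0, 0), (1, 1), (-1, -1)

# Second-order conditions: eigenvalues of the Hessian at each critical point.
H = sp.hessian(f, (x1, x2))
for pt in ({x1: 0, x2: 0}, {x1: 1, x2: 1}, {x1: -1, x2: -1}):
    print(pt, H.subs(pt).eigenvals())
# at (0, 0): eigenvalues 0 and -4 (the Hessian A1 is non-positive definite,
#            so -A1 is non-negative definite);
# at (1, 1) and (-1, -1): eigenvalues 8 and 12, so A2 is positive definite.
```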