
Convex Optimization - Programming Problem
There are four types of convex programming problems −
Type 1 − $\min f\left ( x \right )$, where $x \in S$, $S$ is a non-empty convex set in $\mathbb{R}^n$ and $f\left ( x \right )$ is a convex function.
Type 2 − $\min f\left ( x \right ), x \in \mathbb{R}^n$ subject to
$g_i\left ( x \right ) \geq 0, 1 \leq i \leq m_1$ and $g_i\left ( x \right )$ is a concave function.
$g_i\left ( x \right ) \leq 0, m_1+1 \leq i \leq m_2$ and $g_i\left ( x \right )$ is a convex function.
$g_i\left ( x \right )=0, m_2+1 \leq i \leq m$ and $g_i\left ( x \right )$ is a linear function.
where $f\left ( x \right )$ is a convex function.
Type 3 − $\max f\left ( x \right )$, where $x \in S$, $S$ is a non-empty convex set in $\mathbb{R}^n$ and $f\left ( x \right )$ is a concave function.
Type 4 − $\max f\left ( x \right )$, where $x \in \mathbb{R}^n$ subject to
$g_i\left ( x \right ) \geq 0, 1 \leq i \leq m_1$ and $g_i\left ( x \right )$ is a concave function.
$g_i\left ( x \right ) \leq 0, m_1+1 \leq i \leq m_2$ and $g_i\left ( x \right )$ is a convex function.
$g_i\left ( x \right )=0, m_2+1 \leq i \leq m$ and $g_i\left ( x \right )$ is a linear function.
where $f\left ( x \right )$ is a concave function.
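To make Type 2 concrete, here is a minimal numerical sketch; the objective, the single linear constraint, and the use of scipy's general-purpose `minimize` solver are illustrative choices of mine, not part of the formulation above.

```python
# A minimal sketch of a Type-2 problem:
#   minimize   f(x) = x1^2 + x2^2          (convex objective)
#   subject to g(x) = x1 + x2 - 1 >= 0     (linear, hence concave)
import numpy as np
from scipy.optimize import minimize

f = lambda x: x[0]**2 + x[1]**2
g = lambda x: x[0] + x[1] - 1.0            # scipy's "ineq" type means g(x) >= 0

result = minimize(f, x0=np.array([5.0, -3.0]),
                  constraints=[{"type": "ineq", "fun": g}])
print(result.x)                            # ~ [0.5, 0.5], the global minimum
```

Note that scipy's `"ineq"` constraint encodes $g\left ( x \right ) \geq 0$, which matches the first constraint class of Type 2 when $g$ is concave.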
Cone of feasible directions
Let $S$ be a non-empty set in $\mathbb{R}^n$ and let $\hat{x} \in Closure\left ( S \right )$; then the cone of feasible directions of $S$ at $\hat{x}$, denoted by $D$, is defined as $D=\left \{ d : d \neq 0, \hat{x}+\lambda d \in S, \forall \lambda \in \left ( 0, \delta \right ), \delta > 0 \right \}$
Each non-zero vector $d \in D$ is called a feasible direction.
For a given function $f : \mathbb{R}^n \rightarrow \mathbb{R}$, the cone of improving directions at $\hat{x}$ is denoted by $F$ and is given by
$F=\left \{ d : f\left ( \hat{x}+\lambda d \right ) < f\left ( \hat{x} \right ), \forall \lambda \in \left ( 0, \delta \right ), \delta > 0 \right \}$
Each direction $d \in F$ is called an improving direction or descent direction of $f$ at $\hat{x}$.
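As a rough illustration, the sketch below tests candidate directions against both definitions; the set $S$ (the first quadrant) and the function $f$ are hypothetical choices of mine, and the finite sampling of small $\lambda$ values is only a numerical stand-in for the quantifier over $\left ( 0, \delta \right )$.

```python
import numpy as np

in_S = lambda x: x[0] >= 0 and x[1] >= 0             # S = first quadrant
f = lambda x: (x[0] - 2.0)**2 + x[1]**2

x_hat = np.array([1.0, 0.0])
lambdas = np.linspace(1e-6, 1e-2, 50)                # finite probe of (0, delta)

def is_feasible(d):
    # d in D if x_hat + lam*d stays inside S for all small lam > 0
    return all(in_S(x_hat + lam * d) for lam in lambdas)

def is_improving(d):
    # d in F if it strictly decreases f for all small lam > 0
    return all(f(x_hat + lam * d) < f(x_hat) for lam in lambdas)

for d in [np.array([1.0, 0.0]),    # feasible and improving (moves toward (2,0))
          np.array([0.0, -1.0]),   # leaves S: neither
          np.array([-1.0, 1.0])]:  # stays in S but increases f
    print(d, "feasible:", is_feasible(d), "improving:", is_improving(d))
```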
Theorem
Necessary Condition
Consider the problem $\min f\left ( x \right )$ such that $x \in S$, where $S$ is a non-empty set in $\mathbb{R}^n$. Suppose $f$ is differentiable at a point $\hat{x} \in S$. If $\hat{x}$ is a local optimal solution, then $F_0 \cap D=\phi$, where $F_0=\left \{ d:\bigtriangledown f\left ( \hat{x} \right )^T d < 0 \right \}$ and $D$ is the cone of feasible directions.
Sufficient Condition
If $F_0 \cap D=\phi$, $f$ is a pseudoconvex function at $\hat{x}$, and there exists a neighbourhood of $\hat{x}$, $N_\varepsilon\left ( \hat{x} \right ), \varepsilon > 0$, such that $d=x-\hat{x} \in D$ for any $x \in S \cap N_\varepsilon\left ( \hat{x} \right )$, then $\hat{x}$ is a local optimal solution.
Proof
Necessary Condition
Suppose $F_0 \cap D \neq \phi$, i.e., there exists a $d \in F_0 \cap D$ such that $d \in F_0$ and $d \in D$.
Since $d \in D$, there exists $\delta_1 > 0$ such that $\hat{x}+\lambda d \in S$ for $\lambda \in \left ( 0, \delta_1 \right )$.
Since $d \in F_0$, we have $\bigtriangledown f\left ( \hat{x} \right )^T d < 0$, i.e., the directional derivative of $f$ at $\hat{x}$ along $d$ is negative.
Thus, there exists $\delta_2 > 0$ such that $f\left ( \hat{x}+\lambda d \right ) < f\left ( \hat{x} \right )$ for $\lambda \in \left ( 0, \delta_2 \right )$.
Let $\delta=\min \left \{ \delta_1, \delta_2 \right \}$.
Then $\hat{x}+\lambda d \in S$ and $f\left ( \hat{x}+\lambda d \right ) < f\left ( \hat{x} \right )$ for every $\lambda \in \left ( 0, \delta \right )$.
This contradicts the assumption that $\hat{x}$ is a local optimal solution.
Thus $F_0 \cap D=\phi$
Sufficient Condition
Let $F_0 \cap D=\phi$ and let $f$ be a pseudoconvex function.
Let there exist a neighbourhood of $\hat{x}$, $N_\varepsilon\left ( \hat{x} \right )$, such that $d=x-\hat{x} \in D$ for all $x \in S \cap N_\varepsilon\left ( \hat{x} \right )$.
Suppose $\hat{x}$ is not a local optimal solution.
Then there exists $\bar{x} \in S \cap N_\varepsilon\left ( \hat{x} \right )$ such that $f\left ( \bar{x} \right ) < f\left ( \hat{x} \right )$.
By the assumption on $S \cap N_\varepsilon\left ( \hat{x} \right )$, $d=\left ( \bar{x}-\hat{x} \right ) \in D$.
By pseudoconvexity of $f$,
$$f\left ( \hat{x} \right )>f\left ( \bar{x} \right )\Rightarrow \bigtriangledown f\left ( \hat{x} \right )^T\left ( \bar{x}-\hat{x} \right ) < 0$$
$$\Rightarrow \bigtriangledown f\left ( \hat{x} \right )^T d < 0$$
$$\Rightarrow d \in F_0$$
Hence $F_0 \cap D \neq \phi$,
which is a contradiction.
Hence, $\hat{x}$ is a local optimal solution.
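The necessary condition can be sanity-checked numerically. In the sketch below (an illustrative problem of my own choosing: minimize $f\left ( x \right )=\left ( x_1-2 \right )^2+x_2^2$ over the closed unit disk, whose minimizer is $\hat{x}=\left ( 1, 0 \right )$), random directions are sampled and none should lie in both $F_0$ and $D$; feasibility is again tested by finite sampling of small $\lambda$.

```python
import numpy as np

rng = np.random.default_rng(0)
in_S = lambda x: x @ x <= 1.0                        # S = closed unit disk
grad_f = lambda x: np.array([2.0 * (x[0] - 2.0), 2.0 * x[1]])

x_hat = np.array([1.0, 0.0])                         # minimizer of f over S
lambdas = np.linspace(1e-8, 1e-4, 20)

violations = 0
for _ in range(10000):
    d = rng.normal(size=2)
    feasible = all(in_S(x_hat + lam * d) for lam in lambdas)   # d in D?
    improving = grad_f(x_hat) @ d < 0                          # d in F0?
    if feasible and improving:
        violations += 1
print("sampled directions in both F0 and D:", violations)     # expected: 0
```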
Consider the following problem: $\min f\left ( x \right )$ where $x \in X$ such that $g_i\left ( x \right ) \leq 0, i=1,2,...,m$
$f : X \rightarrow \mathbb{R}, g_i : X \rightarrow \mathbb{R}$, where $X$ is an open set in $\mathbb{R}^n$
Let $S=\left \{ x \in X : g_i\left ( x \right ) \leq 0, \forall i \right \}$
Let $\hat{x} \in S$ and let $M=\left \{ 1,2,...,m \right \}$
Let $I=\left \{ i : g_i\left ( \hat{x} \right )=0, i \in M \right \}$, where $I$ is called the index set of all active constraints at $\hat{x}$
Let $J=\left \{ i : g_i\left ( \hat{x} \right ) < 0, i \in M \right \}$, where $J$ is called the index set of all inactive constraints at $\hat{x}$
Let $F_0=\left \{ d \in \mathbb{R}^n:\bigtriangledown f\left ( \hat{x} \right )^T d < 0 \right \}$
Let $G_0=\left \{ d \in \mathbb{R}^n:\bigtriangledown g_i\left ( \hat{x} \right )^T d < 0, \forall i \in I \right \}$
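As a hedged sketch of how these index sets and cones are used, the code below builds $I$ and the membership tests for $F_0$ and $G_0$ on an example problem of my own choosing; since the tested direction lies in both cones, the theorem that follows implies the chosen point cannot be a local minimum.

```python
# Illustrative problem:
#   min f(x) = (x1 - 2)^2 + (x2 - 1)^2
#   s.t. g1(x) = x1^2 + x2^2 - 1 <= 0,   g2(x) = -x1 <= 0
import numpy as np

x_hat = np.array([1.0, 0.0])     # feasible: g1 = 0 (active), g2 = -1 (inactive)

grad_f = np.array([2 * (x_hat[0] - 2), 2 * (x_hat[1] - 1)])    # = (-2, -2)
grads_g = {1: np.array([2 * x_hat[0], 2 * x_hat[1]]),          # grad g1 = (2, 0)
           2: np.array([-1.0, 0.0])}                           # grad g2
g_vals = {1: x_hat[0]**2 + x_hat[1]**2 - 1.0,
          2: -x_hat[0]}

I = [i for i, v in g_vals.items() if np.isclose(v, 0.0)]       # active set

def in_F0(d):
    return grad_f @ d < 0

def in_G0(d):
    return all(grads_g[i] @ d < 0 for i in I)

d = np.array([-0.5, 1.0])
print("active set I:", I)                                      # [1]
print("d in F0:", in_F0(d), "| d in G0:", in_G0(d))            # True, True
```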
Lemma
Let $S=\left \{ x \in X : g_i\left ( x \right ) \leq 0, \forall i \in M \right \}$, where $X$ is a non-empty open set in $\mathbb{R}^n$. Let $\hat{x} \in S$, let $g_i, i \in I$, be differentiable at $\hat{x}$, and let $g_i, i \in J$, be continuous at $\hat{x}$. Then $G_0 \subseteq D$.
Proof
Let $d \in G_0$.
Since $\hat{x} \in X$ and $X$ is an open set, there exists $\delta_1 > 0$ such that $\hat{x}+\lambda d \in X$ for $\lambda \in \left ( 0, \delta_1 \right )$.
Also, since $g_i\left ( \hat{x} \right ) < 0$ for $i \in J$ and $g_i, i \in J$, are continuous at $\hat{x}$, there exists $\delta_2 > 0$ such that $g_i\left ( \hat{x}+\lambda d \right ) < 0$ for $\lambda \in \left ( 0, \delta_2 \right )$ and $i \in J$.
Since $d \in G_0$, $\bigtriangledown g_i\left ( \hat{x} \right )^T d < 0$ for $i \in I$; hence there exists $\delta_3 > 0$ such that $g_i\left ( \hat{x}+\lambda d \right ) < g_i\left ( \hat{x} \right )=0$ for $\lambda \in \left ( 0, \delta_3 \right )$ and $i \in I$.
Let $\delta=\min\left \{ \delta_1, \delta_2, \delta_3 \right \}$
Therefore, $\hat{x}+\lambda d \in X$ and $g_i\left ( \hat{x}+\lambda d \right ) < 0, \forall i \in M$ for each $\lambda \in \left ( 0, \delta \right )$
$\Rightarrow \hat{x}+\lambda d \in S$
$\Rightarrow d \in D$
$\Rightarrow G_0 \subseteq D$
Hence proved.
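The inclusion $G_0 \subseteq D$ can be spot-checked numerically. In the sketch below (using the same illustrative constraint $g\left ( x \right )=x_1^2+x_2^2-1$ at $\hat{x}=\left ( 1, 0 \right )$ as above, all choices mine), every sampled direction in $G_0$ is verified to keep $\hat{x}+\lambda d$ feasible for small $\lambda$; directions too close to the boundary case $\bigtriangledown g\left ( \hat{x} \right )^T d=0$ are skipped to avoid floating-point edge effects.

```python
import numpy as np

rng = np.random.default_rng(2)
g = lambda x: x[0]**2 + x[1]**2 - 1.0     # single constraint, active at x_hat
x_hat = np.array([1.0, 0.0])
grad_g = np.array([2.0, 0.0])             # gradient of g at x_hat
lambdas = np.linspace(1e-9, 1e-6, 20)

for _ in range(10000):
    d = rng.normal(size=2)
    if grad_g @ d < -1e-3:                # d in G0, away from the boundary case
        # the lemma says d is then a feasible direction: x_hat + lam*d stays in S
        assert all(g(x_hat + lam * d) < 0 for lam in lambdas)
print("all sampled directions in G0 were feasible directions")
```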
Theorem
Necessary Condition
Let $f$ and $g_i, i \in I$, be differentiable at $\hat{x} \in S$, and let $g_j, j \in J$, be continuous at $\hat{x} \in S$. If $\hat{x}$ is a local minimum of the problem, then $F_0 \cap G_0=\phi$.
Sufficient Condition
If $F_0 \cap G_0=\phi$, $f$ is a pseudoconvex function at $\hat{x}$, and $g_i, i \in I$, are strictly pseudoconvex functions over some $\varepsilon$-neighbourhood of $\hat{x}$, then $\hat{x}$ is a local optimal solution.
Remarks
Let $\hat{x}$ be a feasible point such that $\bigtriangledown f\left ( \hat{x} \right )=0$; then $F_0=\phi$, and thus $F_0 \cap G_0=\phi$, even though $\hat{x}$ need not be an optimal solution (for example, $f\left ( x \right )=x^3$ has $\bigtriangledown f\left ( 0 \right )=0$, but $x=0$ is not a minimum).
Similarly, if $\bigtriangledown g_i\left ( \hat{x} \right )=0$ for some $i \in I$, then $G_0=\phi$, and thus $F_0 \cap G_0=\phi$. The condition $F_0 \cap G_0=\phi$ is therefore only necessary, not sufficient, for optimality.
Consider the problem: $\min f\left ( x \right )$ such that $g\left ( x \right )=0$.
Since $g\left ( x \right )=0$, we can rewrite the constraint as $g_1\left ( x \right )=g\left ( x \right ) \leq 0$ and $g_2\left ( x \right )=-g\left ( x \right ) \leq 0$.
Let $\hat{x} \in S$; then $g_1\left ( \hat{x} \right )=0$ and $g_2\left ( \hat{x} \right )=0$, so both constraints are active.
But $\bigtriangledown g_1\left ( \hat{x} \right )=-\bigtriangledown g_2\left ( \hat{x} \right )$, so no direction $d$ can satisfy $\bigtriangledown g_1\left ( \hat{x} \right )^T d < 0$ and $\bigtriangledown g_2\left ( \hat{x} \right )^T d < 0$ at the same time.
Thus, $G_0=\phi$ and $F_0 \cap G_0=\phi$ at every feasible point.
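A tiny numerical illustration of this remark (the particular linear $g$ is my own choice): since $\bigtriangledown g_2=-\bigtriangledown g_1$, no sampled direction can make both inner products strictly negative.

```python
import numpy as np

rng = np.random.default_rng(1)
grad_g1 = np.array([1.0, 2.0])            # e.g. g(x) = x1 + 2*x2
grad_g2 = -grad_g1                        # gradient of g2 = -g

count = 0
for _ in range(100000):
    d = rng.normal(size=2)
    if d @ grad_g1 < 0 and d @ grad_g2 < 0:
        count += 1
print("sampled directions in G0:", count)  # always 0: d@grad_g2 = -(d@grad_g1)
```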