General Condition For Solving Optimization Problem
• Preliminaries
• Optimality Conditions
• Lagrange Multiplier
• General Condition for Solving Optimization Problem
Reading:
Chapter 4 JS Arora Page 112-178
Chapter 6 Mark French
Existence of Minima
Motivation:
• Before applying any optimization method to an optimization problem, it is important to check
if there exists a feasible solution.
1. Gradient Vector:
What is the gradient vector? Consider a function f(x) of n variables x1, x2, …, xn. The partial
derivative of f with respect to x1 at a given point x* is ∂f(x*)/∂x1, with respect to x2 it is
∂f(x*)/∂x2, and so on. The gradient vector collects all n partial derivatives into a column vector:
∇f(x*) = [∂f(x*)/∂x1, ∂f(x*)/∂x2, …, ∂f(x*)/∂xn]^T.
Preliminaries: Calculus Concepts
Necessary Condition:
• The conditions that must be satisfied at the optimum point are called necessary conditions. Stated
differently, if a point does not satisfy the necessary conditions, it cannot be an optimum.
• Points satisfying the necessary conditions are called candidate optimum points.
Sufficient Condition:
• If a candidate optimum point satisfies the sufficient condition, then it is indeed an optimum
point.
Example (necessary condition): Having oxygen in the earth's atmosphere is a necessary condition for
human life. Certainly, having oxygen will not guarantee human life; there are many other conditions
needed for human life other than oxygen in the atmosphere.
Example (sufficient condition): Rain pouring from the sky is a sufficient condition for the ground
to be wet. However, it is not necessary for rain to be pouring from the sky for the ground to be
wet; the sprinkler could be on as well.
Optimality Condition
1. How?
• Hypothesis: Because d appears in the Taylor expansion, it can be viewed as (x − x*). For two variables, d is a vector.
The Hessian acts on this vector, so we can think of it as a transformation matrix, leading to the eigenvalue problem
Av = λv
or (A − λI)v = 0
Optimality Condition for function with several Variables
• (A − λI)v = 0: along an eigenvector d of the Hessian H, we have Hd = λd.
• Then d^T H d = λ d^T d.
• What is d^T d? It is the squared length ‖d‖², which is always positive, so the sign of d^T H d follows the sign of the eigenvalue λ.
• T(v) = λv: the Hessian acts as a linear transformation that merely scales its eigenvectors.
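This eigenvalue test can be sketched for a 2×2 symmetric Hessian, where the eigenvalues follow from the quadratic formula. The matrix H below is an illustrative choice, not from the notes; if both eigenvalues are positive, d^T H d > 0 for every direction d, H is positive definite, and x* is a local minimum.

```python
import math

def eig_sym_2x2(H):
    """Eigenvalues of a symmetric 2x2 matrix [[a, b], [b, c]]."""
    a, b, c = H[0][0], H[0][1], H[1][1]
    mean = (a + c) / 2.0
    radius = math.sqrt(((a - c) / 2.0) ** 2 + b ** 2)
    return mean - radius, mean + radius

H = [[4.0, 1.0], [1.0, 3.0]]      # example Hessian
lam1, lam2 = eig_sym_2x2(H)
print(lam1 > 0 and lam2 > 0)       # True: H is positive definite
```

For larger Hessians the same check is usually done with a numerical eigenvalue routine rather than a closed-form formula.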
What would be the necessary and sufficient conditions for a general optimization problem?
Optimality Condition for function with several Variables
With Constraints
What would be the necessary and sufficient conditions for an optimization problem with equality constraints?
Convert the problem into an unconstrained optimization by substituting for one variable using the
constraint. We already know the necessary and sufficient conditions for the resulting objective function.
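The substitution method can be sketched as follows. The slide does not reproduce the objective function, so f below is a hypothetical choice: minimize f(x1, x2) = x1² + x2² subject to x1 + x2 = 2.

```python
# Substituting x2 = 2 - x1 turns the constrained problem into the
# unconstrained one-variable problem phi(x1) = x1**2 + (2 - x1)**2.
# Setting phi'(x1) = 4*x1 - 4 = 0 gives the candidate x1 = 1, x2 = 1.

def phi(x1):
    return x1 ** 2 + (2 - x1) ** 2

# Necessary condition: phi'(x1) = 0 at x1 = 1, checked numerically.
h = 1e-6
dphi = (phi(1 + h) - phi(1 - h)) / (2 * h)
print(round(dphi, 4))   # 0.0

# Sufficient condition: phi''(x1) = 4 > 0, so (1, 1) is a minimum.
```

The same two-step check (first derivative zero, second derivative positive) is what the slides apply to their own objective.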
x1+x2=2
Necessary Condition:
(1.5,1.5)
Sufficient condition:
x1+x2=2
Both conditions are satisfied, so a minimum exists.
Necessary Condition:
Chain Rule
How to eliminate φ?
Substituting v :
Necessary conditions
Then, the highlighted equation becomes
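The Lagrange necessary conditions derived above (∇L = 0 with L = f + v·h) can be sketched on a hypothetical example, since the slide's own equations are not reproduced here: f(x1, x2) = x1² + x2² with the equality constraint h(x1, x2) = x1 + x2 − 2 = 0.

```python
# Stationarity of L = f + v*h gives three linear equations:
#   dL/dx1 = 2*x1 + v = 0
#   dL/dx2 = 2*x2 + v = 0
#   h      = x1 + x2 - 2 = 0
# The first two give x1 = x2 = -v/2; the constraint then gives v = -2.

v = -2.0
x1 = x2 = -v / 2.0   # candidate point (1, 1)

# Verify all three necessary conditions at the candidate point:
print(2 * x1 + v, 2 * x2 + v, x1 + x2 - 2)   # 0.0 0.0 0.0
```

The multiplier v reports how sensitive the optimum cost is to a change in the constraint's right-hand side.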
General Optimality Condition for
function with several Variables
With Inequality Constraints
• In the previous slides, we dealt with the necessary and sufficient conditions for equality constraints.
• If the optimization problem has inequality constraints, how do we find the necessary and sufficient
conditions?
• One way is to convert the inequality constraints into equality constraints by introducing additional
slack variables.
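The slack-variable conversion can be sketched as follows: an inequality g(x) ≤ 0 becomes the equality g(x) + s² = 0 with a new unknown s. The constraint g below is an illustrative choice, not from the notes.

```python
import math

def g(x):
    """Hypothetical inequality constraint g(x) = x - 3 <= 0."""
    return x - 3.0

def slack(x):
    """Slack variable s such that g(x) + s**2 = 0 (needs g(x) <= 0)."""
    return math.sqrt(-g(x))

x = 1.0                              # feasible: g(1) = -2 < 0
s = slack(x)
print(abs(g(x) + s ** 2) < 1e-12)    # True: the equality holds
print(slack(3.0) == 0.0)             # True: s = 0 means g is active
```

Using s² (rather than s) keeps the slack term nonnegative automatically, so no extra sign condition on s is needed.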
Observe: substituting s = 0 makes the inequality active; solving the resulting equations gives
the stationary point of L. So, it is a candidate minimum.
Note that by keeping u >= 0, the gradient of the cost function and the gradient of the constraint function point in
opposite directions. This way f cannot be reduced any further by stepping in the negative gradient direction without
violating the constraint, as shown in the figure.
Inequality Constraints:
Lagrangian Functions:
KKT Conditions:
Thus, all the necessary KKT conditions are met.
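Checking the KKT conditions at a candidate point can be sketched as below. The problem is a hypothetical one, not the slide's own example: minimize x1² + x2² subject to g(x) = 2 − x1 − x2 ≤ 0, with candidate x* = (1, 1) and multiplier u = 2.

```python
x1, x2, u = 1.0, 1.0, 2.0

grad_f = (2 * x1, 2 * x2)    # gradient of the cost at x*
grad_g = (-1.0, -1.0)        # gradient of the constraint
g = 2 - x1 - x2              # constraint value at x*

# Stationarity of the Lagrangian: grad f + u * grad g = 0
stationarity = all(abs(gf + u * gg) < 1e-12
                   for gf, gg in zip(grad_f, grad_g))
feasible = g <= 1e-12        # primal feasibility: g(x*) <= 0
dual = u >= 0                # dual feasibility: u >= 0
comp = abs(u * g) < 1e-12    # complementary slackness: u * g = 0

print(stationarity and feasible and dual and comp)   # True
```

All four checks pass, so (1, 1) is a KKT point of this sketch problem; as the notes stress, that makes it a candidate minimum, with the sufficient (second-order) condition still to be verified.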