Chapter IV. Constrained Optimization
negative constraint
• Use the constraint to express L in terms of K and substitute it into the objective, so that it becomes a function of K alone.
• Hence, differentiating with respect to K gives two terms; use a common denominator:
• d(objective)/dK = [ … ] ⁄ [K^0.5 ( … )^0.5] = 0, so the numerator must equal zero
• ⟹ … = 0 ⟹ K = …
• ⟹ L = … (from the constraint), and the optimal value of the objective follows by substitution.
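• The coefficients in the derivation above are not fully legible in the source, so the following sympy sketch reproduces the same substitution steps on an assumed problem: the objective Q = L^0.5·K^0.5 and the constraint L = 60 − 4K are illustrative choices of mine, not the lecture's numbers.

```python
# A sketch of the substitution method above, with assumed data: the objective
# Q = sqrt(L)*sqrt(K) and the constraint L = 60 - 4*K are illustrative choices.
import sympy as sp

K = sp.symbols("K", positive=True)
L = 60 - 4*K                            # assumed constraint, solved for L
Q = sp.sqrt(L) * sp.sqrt(K)             # assumed objective

dQ = sp.together(sp.diff(Q, K))         # put the two terms over a common denominator
K_star = sp.solve(sp.numer(dQ), K)[0]   # set the numerator equal to zero
L_star = L.subs(K, K_star)
print(K_star, L_star)                   # K* = 15/2 and L* = 30 for the assumed numbers
```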
Lagrange Multiplier History
• |H̄| = | 0     g_L   g_K  |
        | g_L   Z_LL  Z_LK |  = … > 0  ⟹  maximum
        | g_K   Z_KL  Z_KK |
• For the above example, we have
• Z_λλ = 0, g_L = … , and g_K = …
• Z_LL = … < 0, by substituting the optimal L and K.
• Z_KK = … < 0, by substituting the optimal L and K.
• Z_LK = Z_KL = … > 0, by substituting the optimal L and K.
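• The numeric entries above are left as gaps because they are not legible in the source; the sympy sketch below repeats the same calculation for the assumed problem from the previous sketch (Q = √(LK) subject to L + 4K = 60), evaluating Z_LL, Z_KK and Z_LK at the optimum and checking the sign of the bordered determinant.

```python
# Second-order partials of the Lagrangian and the bordered determinant, for the
# assumed problem Q = sqrt(L*K) subject to L + 4*K = 60 (illustrative data only).
import sympy as sp

L, K, lam = sp.symbols("L K lam", positive=True)
Q = sp.sqrt(L*K)
g = L + 4*K - 60
Z = Q + lam*(60 - L - 4*K)                         # Lagrangian

opt = {L: 30, K: sp.Rational(15, 2)}               # optimum of the assumed problem

Z_LL = sp.diff(Z, L, 2).subs(opt)
Z_KK = sp.diff(Z, K, 2).subs(opt)
Z_LK = sp.diff(Z, L, K).subs(opt)
g_L, g_K = sp.diff(g, L), sp.diff(g, K)

# expanding the 3x3 bordered determinant gives 2*g_L*g_K*Z_LK - g_L**2*Z_KK - g_K**2*Z_LL
H_det = 2*g_L*g_K*Z_LK - g_L**2*Z_KK - g_K**2*Z_LL
print(Z_LL, Z_KK, Z_LK)                            # Z_LL < 0, Z_KK < 0, Z_LK > 0 here
print(H_det, H_det > 0)                            # |H| > 0 -> maximum
```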
The Bordered Hessian: Sufficient conditions for extrema
• For the sign definiteness of d²z, where d²z = Z_xx dx² + 2Z_xy dx dy + Z_yy dy²,
• for the values of dx and dy that satisfy the constraint dg = 0:
• d²z is positive definite subject to dg = 0, iff

  |H̄| = | 0    g_x   g_y  |
        | g_x  Z_xx  Z_xy |  < 0   ⟹ minimum
        | g_y  Z_yx  Z_yy |

• Determinants are always negative
• d²z is negative definite subject to dg = 0, iff

  |H̄| = | 0    g_x   g_y  |
        | g_x  Z_xx  Z_xy |  > 0   ⟹ maximum
        | g_y  Z_yx  Z_yy |

• Determinants alternate in sign, with |H̄2| > 0
• where the border of |H̄| consists of the first-order partials of the constraint g, and the remaining entries are the second-order partials of the Lagrangian Z.
SOC n-case Summary
Conditions    Maximization                   Minimization
F.O.C.        Z_x = 0, Z_y = 0, Z_λ = 0      Z_x = 0, Z_y = 0, Z_λ = 0
S.O.C.        |H̄2| > 0                       |H̄2| < 0
              |H̄3| < 0                       |H̄3| < 0
              |H̄4| > 0                       |H̄4| < 0
              …                              …
              (−1)ⁿ |H̄n| > 0, etc.           |H̄n| < 0, etc.
• Determinants are always negative …..Minimization
• Determinants alternate in sign …..Maximization
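• The two sign patterns in the table can be checked mechanically. The helper below (the function names are mine, not the lecture's) takes a bordered Hessian for a problem with one constraint, computes the bordered leading principal minors |H̄2|, …, |H̄n|, and applies the two rules above.

```python
# Classify a critical point from its bordered Hessian (single constraint case):
# all minors negative -> minimum; minors alternating with |H2| > 0 -> maximum.
import numpy as np

def bordered_minors(H_bar):
    """Determinants of the upper-left (k+1)x(k+1) blocks, k = 2..n."""
    n = H_bar.shape[0] - 1                    # number of choice variables
    return [np.linalg.det(H_bar[:k + 1, :k + 1]) for k in range(2, n + 1)]

def classify(H_bar):
    minors = bordered_minors(H_bar)
    if all(d < 0 for d in minors):
        return "second-order condition for a minimum"
    if all((-1) ** k * d > 0 for k, d in zip(range(2, len(minors) + 2), minors)):
        return "second-order condition for a maximum"
    return "inconclusive"

# two-variable illustration: border (g_x, g_y) = (1, 4), Z_xx = Z_yy = -1, Z_xy = 0
H_bar = np.array([[0., 1., 4.],
                  [1., -1., 0.],
                  [4., 0., -1.]])
print(classify(H_bar), bordered_minors(H_bar))    # maximum, [17.0]
```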
More Examples on constrained optimization
• Find the extreme values of the following functions, and identify whether they are minima or maxima, using the bordered Hessian (a worked sketch of exercise 1 follows below):
1. z = xy, subject to …
2. z = … , subject to …
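• The constraints in the two exercises are not legible above, so the sketch below works exercise 1 with an assumed constraint x + y = 6 (the constant 6 is my choice): find the critical point from the Lagrangian and classify it with the bordered Hessian.

```python
# Exercise 1 with an assumed constraint: extremum of z = x*y subject to x + y = 6,
# classified with the bordered Hessian.
import sympy as sp

x, y, lam = sp.symbols("x y lam")
f = x*y
g = x + y - 6                            # assumed constraint
Z = f + lam*(6 - x - y)                  # Lagrangian

sol = sp.solve([sp.diff(Z, v) for v in (x, y, lam)], [x, y, lam], dict=True)[0]

H_bar = sp.Matrix([
    [0,             sp.diff(g, x),    sp.diff(g, y)],
    [sp.diff(g, x), sp.diff(Z, x, 2), sp.diff(Z, x, y)],
    [sp.diff(g, y), sp.diff(Z, y, x), sp.diff(Z, y, 2)],
])
d = H_bar.det().subs(sol)
print(sol, d, "maximum" if d > 0 else "minimum")   # {x: 3, y: 3, lam: 3}, 2 -> maximum
```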
Lagrange Multiplier Interpretation
• Now, since the optimal values x*, y* and λ* all depend on m, we may consider the optimal value of the Lagrangian, Z*, to be a function of m alone.
• Z* = f(x*, y*) + λ*[m − g(x*, y*)]
• Differentiating Z* with respect to m, we have:
• dZ*/dm = f_x (dx*/dm) + f_y (dy*/dm) + (dλ*/dm)[m − g(x*, y*)] + λ*[1 − g_x (dx*/dm) − g_y (dy*/dm)]
• Grouping terms: dZ*/dm = (f_x − λ*g_x)(dx*/dm) + (f_y − λ*g_y)(dy*/dm) + [m − g(x*, y*)](dλ*/dm) + λ*
• dZ*/dm = λ*, since every bracketed term vanishes by the first-order conditions.
• An interpretation of the Lagrange Multiplier.
• The Lagrange multiplier, λ, measures the effect on the objective function of a one-unit change in the constant of the constraint function (the stock of a resource).
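• A quick symbolic check of dZ*/dm = λ*, using an assumed problem (z = xy subject to x + y = m, not necessarily the lecture's example): compute the optimal-value function, differentiate it with respect to m, and compare the result with λ*.

```python
# Check that the derivative of the optimal value with respect to the constraint
# constant m equals the Lagrange multiplier, for the assumed problem z = x*y, x + y = m.
import sympy as sp

x, y, lam, m = sp.symbols("x y lam m")
Z = x*y + lam*(m - x - y)
sol = sp.solve([sp.diff(Z, v) for v in (x, y, lam)], [x, y, lam], dict=True)[0]

z_star = (x*y).subs(sol)                       # optimal value as a function of m
print(sp.diff(z_star, m), sol[lam])            # both equal m/2, i.e. dZ*/dm = lambda*
print(z_star.subs(m, 7) - z_star.subs(m, 6))   # 13/4, close to lambda* = 3 at m = 6
```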
HOW to interpret λ
• Note:
• If λ > 0, then for every one-unit increase (decrease) in the constant of the constraining function, the objective function will increase (decrease) by a value approximately equal to λ.
• If λ < 0, a unit increase (decrease) in the constant of the constraint will lead to a decrease (increase) in the value of the objective function by a value approximately equal to λ.
S.O.C.: the Bordered Hessian
• The bordered Hessian provides the second-order condition for constrained optimization.
• For simplicity, consider sign definiteness in the two-variable quadratic case:
• q = au² + 2huv + bv²
• subject to αu + βv = 0
• The constraint implies v = −(α⁄β)u, so we can re-write q as a function of one variable only.
• q = au² + 2hu[−(α⁄β)u] + b[−(α⁄β)u]²
• q = (u²⁄β²)(aβ² − 2hαβ + bα²)
• The sign of q therefore depends on the sign of the expression in the brackets.
S.O.C.: the Bordered Hessian
• Now consider the symmetric discriminant of q bordered by the constraint coefficients:

  | 0  α  β |
  | α  a  h |  = 2hαβ − aβ² − bα²
  | β  h  b |

• The new determinant is the exact opposite (negative) of the expression above, aβ² − 2hαβ + bα².
• Therefore,
• q is positive definite subject to αu + βv = 0, iff  | 0  α  β ; α  a  h ; β  h  b | < 0
• q is negative definite subject to αu + βv = 0, iff  | 0  α  β ; α  a  h ; β  h  b | > 0
• where this bordered determinant, with Z_xx, Z_xy, Z_yy in place of a, h, b and the constraint partials g_x, g_y in place of α, β, is exactly the bordered Hessian |H̄| used above.
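• The algebra on the last two slides can be verified symbolically; the short sympy check below confirms both the single-variable form of q and the expansion of the bordered determinant.

```python
# Verify: restricting q = a*u**2 + 2*h*u*v + b*v**2 to alpha*u + beta*v = 0 gives
# (u**2/beta**2)*(a*beta**2 - 2*h*alpha*beta + b*alpha**2), which is the negative
# of the bordered determinant.
import sympy as sp

a, b, h, alpha, beta, u = sp.symbols("a b h alpha beta u")
v = -(alpha/beta)*u                                   # constraint solved for v
q = a*u**2 + 2*h*u*v + b*v**2

restricted = sp.simplify(q - (u**2/beta**2)*(a*beta**2 - 2*h*alpha*beta + b*alpha**2))
print(restricted)                                     # 0: the identity on the slide holds

H_bar = sp.Matrix([[0, alpha, beta],
                   [alpha, a, h],
                   [beta, h, b]])
print(sp.expand(H_bar.det()))                         # 2*h*alpha*beta - a*beta**2 - b*alpha**2
# so q restricted to the constraint has the opposite sign of the bordered determinant
```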
4.3 Inequality constraints and the theorem of Kuhn-Tucker, or Karush-Kuhn-Tucker (KKT)
• Often called the Kuhn-Tucker theorem
• Sometimes called the Karush-Kuhn-Tucker (KKT) theorem
• Optimization with inequality constraints.
• Developed by
  William Karush (1917-1997), Maths, Chicago
  Harold W. Kuhn (1925-2014), Maths, New York
  Albert William Tucker (1905-1995), Maths, Canada
KKT … Single variable case
1. Single variable case:
• Maximize y = f(x), subject to x ≥ 0 (a non-negativity constraint)
• We have three possible situations at the optimum:
• f′(x) = 0 and x > 0 (point A)
• f′(x) = 0 and x = 0 (point B)
• f′(x) < 0 and x = 0 (points C and D)
• [Figure: candidate maxima at points A, B, C and D]
• These three conditions can be consolidated into:
• f′(x) ≤ 0, x ≥ 0 and x·f′(x) = 0
KKT … Single variable case
• x·f′(x) = 0 is referred to as complementary slackness between x and f′(x).

Conclusion
f′(x) ≥ 0    x ≥ 0    x·f′(x) = 0    Minimization
f′(x) ≤ 0    x ≥ 0    x·f′(x) = 0    Maximization

• A quick numerical check of the maximization conditions follows below.
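```python
# A small sketch of the consolidated single-variable condition, for an assumed
# concave objective f(x) = 10*x - x**2 maximized subject to x >= 0 (my example,
# not the lecture's): check f'(x) <= 0, x >= 0 and x*f'(x) = 0 at candidate points.
import sympy as sp

x = sp.symbols("x")
f = 10*x - x**2                      # assumed objective (interior maximum at x = 5)
fprime = sp.diff(f, x)

def kkt_max_ok(x0, tol=1e-9):
    slope = float(fprime.subs(x, x0))
    return slope <= tol and x0 >= 0 and abs(x0*slope) <= tol

for cand in (5.0, 0.0, 2.0):
    print(cand, kkt_max_ok(cand))    # True only at x = 5 (f'(5) = 0, the point-A case)
```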
• Example: optimize Z = f(x, y)
• subject to two inequality constraints
• Form the Lagrangian Z = f(x, y) + λ₁(c₁ − …) + λ₂(c₂ − …) and apply the KKT conditions; a numerical sketch follows below.
• ANS
• λ₁ = 0, λ₂ = …, Z = …, x = 15, and y = 15 (λ₁ = 0 is consistent with complementary slackness: the first constraint is not binding at the optimum).
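• The data of this example are only partly legible, so the numerical sketch below solves an assumed version with scipy: maximize f(x, y) = x·y subject to the assumed constraints x + y ≤ 40 and x + y ≤ 30 with x, y ≥ 0, chosen to reproduce the pattern of the stated answer.

```python
# A numerical KKT sketch with assumed data (objective and constraint constants are
# my choices, picked to reproduce x = y = 15): maximize x*y subject to
# x + y <= 40 and x + y <= 30, with x, y >= 0.
from scipy.optimize import minimize

def neg_f(v):
    x, y = v
    return -x*y                      # scipy minimizes, so negate the objective

constraints = (
    {"type": "ineq", "fun": lambda v: 40 - v[0] - v[1]},   # assumed: x + y <= 40 (slack)
    {"type": "ineq", "fun": lambda v: 30 - v[0] - v[1]},   # assumed: x + y <= 30 (binds)
)
res = minimize(neg_f, x0=[1.0, 1.0], bounds=[(0, None), (0, None)],
               constraints=constraints, method="SLSQP")
print(res.x, -res.fun)               # approximately (15, 15) and 225
```

• Because the assumed 40-unit constraint never binds, its multiplier is zero by complementary slackness, matching the λ₁ = 0 pattern in the answer above.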
Additional Example
KKT: The generalized Case