Optimization PPT - Part-2
(19MAT202)
DEPARTMENT OF MATHEMATICS
Gradient-based Method
Line Search in Multidimensional
Optimization
• One-dimensional search methods play an
important role in multidimensional optimization
problems.
• In particular, iterative algorithms for solving such
optimization problems (to be discussed in the
upcoming classes) typically involve a line search
at every iteration.
• To be specific, let 𝑓: 𝑅𝑛 → 𝑅 be a function that we
wish to minimize.
• Iterative algorithms for finding a minimizer of 𝑓 are of the form
  x^(k+1) = x^(k) − α_k ∇f(x^(k)),
  where the step size α_k minimizes the one-dimensional function
  φ_k(α) := f(x^(k) − α∇f(x^(k))).
• In other words,
  α_k = argmin_{α ≥ 0} f(x^(k) − α∇f(x^(k))).
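As a sketch in code (using NumPy and SciPy's one-dimensional minimizer; the quadratic objective below is an illustrative assumption, not one of the slide examples), the line-search step amounts to minimizing φ_k over α ≥ 0:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative objective and its gradient (assumed for demonstration only)
f = lambda x: x[0]**2 + 4 * x[1]**2
grad = lambda x: np.array([2 * x[0], 8 * x[1]])

x_k = np.array([2.0, 1.0])
d_k = -grad(x_k)                          # steepest-descent direction
phi = lambda a: f(x_k + a * d_k)          # phi_k(alpha) = f(x^(k) - alpha * grad f(x^(k)))
alpha_k = minimize_scalar(phi, bounds=(0.0, 1.0), method='bounded').x
x_next = x_k + alpha_k * d_k              # x^(k+1)
```

For this quadratic the exact minimizer of φ is α = 80/544 ≈ 0.147, which the bounded scalar search recovers numerically.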
The Method of Steepest Descent
• To summarize, the steepest descent algorithm
proceeds as follows:
• At each step, starting from the point x^(k), we
conduct a line search in the direction −∇f(x^(k))
until a minimizer, x^(k+1), is found.
• A typical sequence resulting from the method of
steepest descent is depicted in the following
Figure:
• Observe that the method of steepest descent moves
in orthogonal steps, as stated in the following
proposition.
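The full steepest-descent loop, together with a numerical check of the orthogonal-steps property (under exact line search, consecutive search directions are perpendicular), can be sketched as follows. The ill-conditioned quadratic objective here is an illustrative assumption, not one of the slide examples:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def steepest_descent(f, grad, x0, iters=50):
    """Steepest descent with an (approximately) exact line search."""
    x = np.asarray(x0, dtype=float)
    dirs = []                                 # record search directions
    for _ in range(iters):
        d = -grad(x)                          # steepest-descent direction
        phi = lambda a: f(x + a * d)          # 1-D slice along d
        alpha = minimize_scalar(phi, bounds=(0.0, 10.0), method='bounded').x
        dirs.append(d)
        x = x + alpha * d
    return x, dirs

f = lambda x: 0.5 * (x[0]**2 + 10 * x[1]**2)  # minimizer at the origin
grad = lambda x: np.array([x[0], 10 * x[1]])

x_min, dirs = steepest_descent(f, grad, [10.0, 1.0])
# dirs[k] . dirs[k+1] is (numerically) zero: the iterates zig-zag in
# mutually orthogonal steps, as stated in the proposition above.
```

The zig-zagging visible in the recorded directions is exactly why steepest descent converges slowly on ill-conditioned problems.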
Remark: (Stopping Criteria)
In practice, the iteration is stopped when, for a prescribed tolerance ε > 0, one of the following holds: ‖∇f(x^(k))‖ < ε, |f(x^(k+1)) − f(x^(k))| < ε, or ‖x^(k+1) − x^(k)‖ < ε.
Problem: We use the method of steepest descent to find the
minimizer of
f(x₁, x₂) = x₁ − x₂ + 2x₁² + 2x₁x₂ + x₂²
The initial point is x^(0) = [0, 0]^T. Perform three iterations.
Solution:
(i). First find ∇f(x):
∇f(x) = [1 + 4x₁ + 2x₂, −1 + 2x₁ + 2x₂]^T
∇f(x^(0)) = [1, −1]^T
(ii). We need to find x^(1) = x^(0) − α₀∇f(x^(0)).
What is this α₀? It is the exact line-search step:
α₀ = argmin_{α ≥ 0} f(x^(0) − α∇f(x^(0))) = argmin_{α ≥ 0} (α² − 2α) = 1.
Hence x^(1) = [0, 0]^T − 1·[1, −1]^T = [−1, 1]^T.
(iii). We need to find x^(2) = x^(1) − α₁∇f(x^(1)).
∇f(x^(1)) = [−1, −1]^T
α₁ = argmin_{α ≥ 0} f(x^(1) + α[1, 1]^T) = argmin_{α ≥ 0} (5α² − 2α − 1) = 1/5.
Hence x^(2) = [−1, 1]^T − (1/5)·[−1, −1]^T = [−4/5, 6/5]^T.
(iv). Similarly, ∇f(x^(2)) = [1/5, −1/5]^T, α₂ = 1, and
x^(3) = [−4/5, 6/5]^T − 1·[1/5, −1/5]^T = [−1, 7/5]^T.
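The hand iterations can be verified numerically (a sketch; the step lengths α₀ = 1, α₁ = 1/5, α₂ = 1 come from minimizing the one-dimensional slice φ_k in closed form, as above):

```python
import numpy as np

# f(x1, x2) = x1 - x2 + 2 x1^2 + 2 x1 x2 + x2^2 and its gradient
f = lambda x: x[0] - x[1] + 2*x[0]**2 + 2*x[0]*x[1] + x[1]**2
grad = lambda x: np.array([1 + 4*x[0] + 2*x[1], -1 + 2*x[0] + 2*x[1]])

x0 = np.array([0.0, 0.0])
x1 = x0 - 1.0 * grad(x0)    # alpha_0 = 1    ->  [-1.0, 1.0]
x2 = x1 - 0.2 * grad(x1)    # alpha_1 = 1/5  ->  [-0.8, 1.2]
x3 = x2 - 1.0 * grad(x2)    # alpha_2 = 1    ->  [-1.0, 1.4]
```

Each iterate strictly decreases the objective value, as expected for steepest descent with an exact line search.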
Problem: We use the method of steepest descent to find the
minimizer of
f(x₁, x₂, x₃) = (x₁ − 4)⁴ + (x₂ − 3)² + 4(x₃ + 5)⁴
Solution:
(i). First Find 𝑓(𝒙)
∇f(x) = [4(x₁ − 4)³, 2(x₂ − 3), 16(x₃ + 5)³]^T
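As a quick sanity check (a sketch; the test point below is an arbitrary assumption, not taken from the slides), the gradient formula can be compared against a central finite-difference approximation:

```python
import numpy as np

f = lambda x: (x[0] - 4)**4 + (x[1] - 3)**2 + 4 * (x[2] + 5)**4
grad = lambda x: np.array([4*(x[0] - 4)**3, 2*(x[1] - 3), 16*(x[2] + 5)**3])

def fd_grad(f, x, h=1e-5):
    # Central-difference approximation of the gradient of f at x
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

x_test = np.array([4.0, 2.0, -1.0])   # arbitrary test point (assumption)
```

Agreement between the analytic and finite-difference gradients confirms the differentiation above.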
What is this α₀? As before, α₀ = argmin_{α ≥ 0} f(x^(0) − α∇f(x^(0))).
(iii). We need to find x^(2) = x^(1) − α₁∇f(x^(1)), where
α₁ = argmin_{α ≥ 0} f(x^(1) − α∇f(x^(1))).
Newton’s Method
Problem: Use Newton's method to minimize the following
function (Powell function):
f(x₁, x₂, x₃, x₄) = (x₁ + 10x₂)² + 5(x₃ − x₄)² + (x₂ − 2x₃)⁴ + 10(x₁ − x₄)⁴
Solution:
x^(1) = x^(0) − F(x^(0))⁻¹ g^(0)
      = x^(0) − F(x^(0))⁻¹ ∇f(x^(0)),
where F(x^(0)) is the Hessian matrix of f at x^(0) and g^(0) = ∇f(x^(0)).
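A single Newton step on the Powell function can be sketched as follows. The gradient and Hessian are derived directly from the function; the starting point x^(0) = (3, −1, 0, 1)^T is the conventional one for this test function and is an assumption here, since it is not visible in the slide text:

```python
import numpy as np

def f(x):
    return (x[0] + 10*x[1])**2 + 5*(x[2] - x[3])**2 \
         + (x[1] - 2*x[2])**4 + 10*(x[0] - x[3])**4

def grad(x):
    a, b = x[1] - 2*x[2], x[0] - x[3]
    return np.array([
        2*(x[0] + 10*x[1]) + 40*b**3,
        20*(x[0] + 10*x[1]) + 4*a**3,
        10*(x[2] - x[3]) - 8*a**3,
        -10*(x[2] - x[3]) - 40*b**3,
    ])

def hessian(x):
    a, b = x[1] - 2*x[2], x[0] - x[3]
    return np.array([
        [2 + 120*b**2,  20,             0,              -120*b**2],
        [20,            200 + 12*a**2,  -24*a**2,       0],
        [0,             -24*a**2,       10 + 48*a**2,   -10],
        [-120*b**2,     0,              -10,            10 + 120*b**2],
    ])

x0 = np.array([3.0, -1.0, 0.0, 1.0])                # assumed standard start
x1 = x0 - np.linalg.solve(hessian(x0), grad(x0))    # Newton update
# One step drops f from 215 to about 31.8
```

Note that solving F(x^(0)) d = g^(0) with `np.linalg.solve` avoids forming the inverse Hessian explicitly.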