Numerical Optimization

This document provides an overview of nonlinear programming techniques. It discusses: 1. Classification of nonlinear programming problems into single-variable and multi-variable, constrained and unconstrained cases. 2. Methods for solving one-dimensional problems, including elimination methods such as fixed-step-size search and Fibonacci search. 3. The Newton-Raphson method for solving unconstrained multi-dimensional problems using gradients and Hessians. 4. Sequential quadratic programming for solving constrained nonlinear problems by converting the problem into a quadratic programming subproblem at each iteration.


Unit Two

Nonlinear Optimization Techniques


Contents
• Nonlinear Programming
• Classification of NLP
– One-dimensional NLP
– Multi-dimensional NLP
• Unimodal function
• Elimination methods
• Unconstrained optimization Techniques
– Direct search methods
– Indirect search methods
• Descent methods
• Newton’s method
• Constrained optimization techniques
• Modern optimization techniques
Nonlinear programming

• A problem is a nonlinear program when the objective function or a constraint is nonlinear
• Analytical methods apply
– if the objective function is differentiable
– and the constraints are equalities
• Numerical methods are used
– when the objective function is not differentiable
Nonlinear programming
– Is a numerical method
– Used for non-differentiable or analytically unsolvable objective functions
– Example:

f x  
0.75 1 1
 0.65 x tan  0.65
1 x 2
x
General outline of NLP
• Start with an initial trial point X1
• Find a suitable direction Si that points in the direction of the optimum
• Find an appropriate step length λi for movement along direction Si
• Obtain the new approximation Xi+1 as
Xi+1 = Xi + λi·Si
• Test whether Xi+1 is optimum; if not, set i = i + 1 and go to step 2, otherwise stop
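
The loop below is a minimal MATLAB sketch of this scheme, using an assumed quadratic objective, the steepest-descent direction S = -∇f, and a fixed step length λ for illustration (practical methods choose Si and λi by the techniques discussed later in this unit):

f     = @(x) x(1)^2 + 2*x(2)^2;   % example objective (an assumption for the sketch)
gradf = @(x) [2*x(1); 4*x(2)];    % its gradient
X      = [1; 1];                  % step 1: initial trial point X1
lambda = 0.1;                     % step 3: step length (fixed here)
for i = 1:100
    S = -gradf(X);                % step 2: direction pointing toward the optimum
    X = X + lambda*S;             % step 4: new approximation Xi+1 = Xi + lambda*S
    if norm(gradf(X)) < 1e-6      % step 5: optimality test
        break                     % X is (approximately) optimum
    end
end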
Classification of NLP

1. Unconstrained single-variable case
2. Unconstrained multi-variable case
3. Constrained single-variable case
4. Constrained multi-variable case
Methods available for the one-dimensional case
Classification of NLP cont…
Unimodal function
• Has only one minimum (or maximum) in a given interval
• A function f is unimodal on an interval containing a minimizer x* iff, for any x1 < x2 in the interval: x2 < x* implies f(x1) > f(x2), and x1 > x* implies f(x1) < f(x2)
Elimination methods
• Unrestricted search – with fixed step size
Example: Find the minimum of f(x) = x(x - 1.5), starting from x1 = 0.0 with a fixed step size of 0.05.

• f0 = f(0.0) = 0
• x2 = 0.0 + 0.05 = 0.05; f1 = f(0.05) = 0.05(0.05 - 1.5) = -0.0725; f1 < f0, continue
• x3 = 0.05 + 0.05 = 0.10; f2 = f(0.10) = 0.10(0.10 - 1.5) = -0.14; f2 < f1, continue
• x4 = 0.15; f3 = f(0.15) = 0.15(0.15 - 1.5) = -0.2025; f3 < f2, continue
• x5 = 0.20; f4 = f(0.20) = 0.20(0.20 - 1.5) = -0.26; f4 < f3, continue
• The process repeats until the function value increases, which first happens once the minimizer x = 0.75 has been passed
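
A minimal MATLAB sketch of this fixed-step search (assuming, as above, that the minimum lies in the positive x direction):

f  = @(x) x.*(x - 1.5);       % objective of the example
x  = 0.0;  s = 0.05;          % starting point and fixed step size
fx = f(x);
while f(x + s) < fx           % continue while the function still decreases
    x  = x + s;
    fx = f(x);
end
fprintf('minimum near x = %.2f, f = %.4f\n', x, fx)   % gives x = 0.75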
Fibonacci method
• Makes use of Fibonacci numbers, defined by F0 = F1 = 1 and Fn = Fn-1 + Fn-2
• Algorithm
– Assume the initial interval of uncertainty is L0 = [a, b]
– Let the total number of experiments be n; then define the two points x1 and x2 as
x1 = a + L2* and x2 = b - L2*, where L2* = (Fn-2/Fn)·L0
– Compare f(x1) and f(x2)
– Delete the infeasible interval and form the new interval
– Repeat the procedure until the specified number of iterations is completed
Flow chart of the Fibonacci method
Example: minimize

f(x) = 0.65 - 0.75/(1 + x^2) - 0.65 x tan^-1(1/x)

in the interval [0, 3] with n = 6.

When n = 6, L2* = (F4/F6)·L0 = (5/13)(3) = 1.153846, so x1 and x2 become

x1 = 0 + L2* = 1.153846 and x2 = 3 - L2* = 1.846154.

Accordingly, f(x1) = -0.207270 and f(x2) = -0.115843; since f(x1) < f(x2), eliminate [x2, 3].
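
A minimal MATLAB sketch of the whole search for this example (the classical method perturbs the final experiment by a small epsilon; that refinement is omitted here):

f = @(x) 0.65 - 0.75./(1 + x.^2) - 0.65*x.*atan(1./x);
a = 0;  b = 3;  n = 6;                 % interval [0, 3], n experiments
F = ones(1, n + 1);                    % Fibonacci numbers F0..Fn
for k = 3:n + 1, F(k) = F(k-1) + F(k-2); end
L2 = (F(n-1)/F(n+1))*(b - a);          % L2* = (F(n-2)/F(n))*L0 = 1.153846
x1 = a + L2;  x2 = b - L2;             % first two experiments
for k = 1:n - 2
    if f(x1) < f(x2)                   % minimum cannot lie in [x2, b]
        b = x2;  x2 = x1;  x1 = a + (b - x2);
    else                               % minimum cannot lie in [a, x1]
        a = x1;  x1 = x2;  x2 = b - (x1 - a);
    end
end
fprintf('interval of uncertainty: [%.4f, %.4f]\n', a, b)  % brackets x* ~ 0.48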
Newton-Raphson method
• Is an NLP method based on the first and second derivatives of the objective function
• It starts with an initial guess x1
• An improved approximation is obtained as
xi+1 = xi - f'(xi)/f''(xi)
• The stopping criterion is |f'(xi+1)| ≤ ε
Example: Find the minimum of the function below, with starting point x = 0.1 and stopping criterion ε = 0.01:

f(x) = 0.65 - 0.75/(1 + x^2) - 0.65 x tan^-1(1/x)

Solution: x1 = 0.1. The derivatives are

f'(x) = 1.5x/(1 + x^2)^2 + 0.65x/(1 + x^2) - 0.65 tan^-1(1/x)

f''(x) = 1.5(1 - 3x^2)/(1 + x^2)^3 + 0.65(1 - x^2)/(1 + x^2)^2 + 0.65/(1 + x^2) = (2.8 - 3.2x^2)/(1 + x^2)^3

xi      f(xi)     f'(xi)    f''(xi)   xi+1     Check
0.100   -0.1882   -0.7448   2.6865    0.377    |f'(x2)| = 0.1382 > ε
0.377   -0.3030   -0.1382   1.5730    0.465    |f'(x3)| = 0.0179 > ε
0.465   -0.3098   -0.0179   1.1710    0.480    |f'(x4)| = 0.0005 < ε, stop
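
The same iteration in MATLAB, as a minimal sketch reproducing the table above:

df  = @(x) 1.5*x./(1 + x.^2).^2 + 0.65*x./(1 + x.^2) - 0.65*atan(1./x);
d2f = @(x) (2.8 - 3.2*x.^2)./(1 + x.^2).^3;
x = 0.1;  tol = 0.01;                % starting point and epsilon
while abs(df(x)) > tol               % stopping criterion |f'(x)| <= epsilon
    x = x - df(x)/d2f(x);            % Newton-Raphson update
end
fprintf('x* = %.3f\n', x)            % prints x* = 0.480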
Newton method for multivariable case

• The gradient ∇f is the n-component vector of partial derivatives of f; -∇f represents the direction of steepest descent
• The next iterate is obtained as Xi+1 = Xi - Ji^-1·∇f(Xi), where ∇f(Xi) is the gradient of f at Xi and Ji is the Hessian matrix of second-order derivatives
• Disadvantages
– Needs storage of J
– Computation of J is difficult
– Needs the inversion of J
• Solution: use a quasi-Newton method
Example: minimize f(x1,x2) given by

f x1 , x 2   x1  x 2  2x12  2x1x 2  x 22


 4 2  0.5  0.5
J  J 1

 2 2    0.5 0.5 

1 1
x i 1  x i  J f ( x i )   
  2
Xi         Xi+1        J^-1                 ∇f        Check
[0; 0]     [-1; 1.5]   [0.5 -0.5; -0.5 1]   [1; -1]   ‖∇f‖ = 1.41 > ε
[-1; 1.5]  [-1; 1.5]   [0.5 -0.5; -0.5 1]   [0; 0]    ‖∇f‖ = 0 < ε, stop
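
A minimal MATLAB sketch of this example (the backslash operator solves J·S = ∇f, avoiding an explicit inverse):

gradf = @(x) [1 + 4*x(1) + 2*x(2); -1 + 2*x(1) + 2*x(2)];
J = [4 2; 2 2];                      % Hessian (constant, since f is quadratic)
X = [0; 0];                          % initial point X1
while norm(gradf(X)) > 1e-6          % check ||grad f|| against epsilon
    X = X - J\gradf(X);              % Xi+1 = Xi - inv(J)*grad f(Xi)
end
disp(X')                             % converges in one step to [-1.0  1.5]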
Using MATLAB

• MATLAB has a built-in function named fminunc
• It solves unconstrained minimization problems
• Steps
– Write the objective function as a user-defined function in an m-file
– Call fminunc with the objective function and initial values as arguments
• Example: minimize the function
f(x1, x2) = 100(x2 - x1^2)^2 + (1 - x1)^2
• with initial point X0 = [-1.2; 1]

• Solution
• 1. Write the objective function as an m-file:
function y = objc(x)
y = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;   % Rosenbrock's function
end

• 2. Call fminunc with the objective function as argument:
xo = [-1.2; 1];
[x, fx] = fminunc(@objc, xo);
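For this function (Rosenbrock's), fminunc should converge from X0 = [-1.2; 1] to (approximately) the known minimizer x = [1; 1] with fx close to zero.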
Reading assignment

• Steepest descent method
• Quasi-Newton methods
Constrained NLP
• Problem statement:
Minimize f(X)
subject to gj(X) ≤ 0, j = 1, ..., m, and hk(X) = 0, k = 1, ..., p
Sequential quadratic programming
• Is among the most widely used and effective methods for constrained NLP
• Converts the constrained NLP into a sequence of quadratic programming subproblems using
– gradients of the objective and constraint functions
– Lagrange multipliers
• Derivation
– The Lagrangian of the above NLP is
L(X, λ, ν) = f(X) + Σj λj·gj(X) + Σk νk·hk(X)
• Converted problem: find the search direction S that minimizes
Q(S) = ∇f(X)^T S + (1/2) S^T H S
subject to the linearized constraints
∇gj(X)^T S + gj(X) ≤ 0 and ∇hk(X)^T S + hk(X) = 0
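
For illustration only, one such QP subproblem can be solved with quadprog from MATLAB's Optimization Toolbox; the numbers below (current iterate, gradients, and constraint value) are invented for the sketch, with H taken as the identity:

X      = [1; 1];                      % current iterate (assumed)
grad_f = [0.2; 0.1];                  % grad f(X) (assumed for the sketch)
H      = eye(2);                      % initial approximation of the Lagrangian Hessian
g      = -0.5;                        % value of one inequality constraint g(X) (assumed)
grad_g = [-0.6; -0.35];               % grad g(X) (assumed)
% QP subproblem: minimize grad_f'*S + 0.5*S'*H*S  subject to  grad_g'*S + g <= 0
S     = quadprog(H, grad_f, grad_g', -g);   % search direction from the QP
alpha = 1;                                  % step length (found by a line search in practice)
X     = X + alpha*S;                        % updated iterate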
Solution
• The solution procedure has the following steps
– Step 1: start with an initial guess X
– Step 2: update X as X ← X + α·S
– where S is the solution of the auxiliary (quadratic programming) problem given above
– H is a positive definite matrix, initially taken as the identity, that is updated at each iteration so that it converges to the Hessian of the Lagrangian
– the step length α is found from the solution of a one-dimensional minimization problem along the direction S
• MATLAB solution
– Steps:
• Write the objective function as an m-file and save it
• Write the constraint function as a separate m-file and save it
• Prepare the upper and lower bounds as vectors
• Call the built-in function fmincon() with the objective function, constraint function, and bounds as arguments
Example: solve the following minimization problem using
MATLAB

• Minimize f(X) = 0.1·x1 + 0.05773·x2
• subject to 0.6/x1 + 0.3464/x2 - 0.1 ≤ 0, x1 ≥ 6, x2 ≥ 7
• Use X0 = [11.8765, 7] as the starting point
• Solution: there are no equality constraints and no lower or upper bounds, so the corresponding fmincon arguments are passed as empty matrices
Programs
function y = objc2(x)
y = 0.1*x(1) + 0.05773*x(2);
end
Save this as objc2.m in a separate m-file.

function [c, ceq] = constr(x)
ceq = [];                              % no equality constraints
c = [0.6/x(1) + 0.3464/x(2) - 0.1;     % nonlinear inequality
     6 - x(1);                         % x1 >= 6
     7 - x(2)];                        % x2 >= 7
end
Save this also in a separate m-file named constr.m.
• Write the main calling script:
xo = [11.8765, 7];
[x, fx] = fmincon(@objc2, xo, [], [], [], [], [], [], @constr);
The answer will be
x=

9.4639 9.4642

fx =

1.4928
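
As a quick check, the nonlinear constraint is active at this solution: 0.6/9.4639 + 0.3464/9.4642 ≈ 0.1000, and the objective 0.1(9.4639) + 0.05773(9.4642) ≈ 1.4928 agrees with the returned fx.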
Modern optimization techniques
• Drawbacks of classical optimization techniques
– Need derivative information
– Single-directional search
– Get stuck at local extrema
• AI-based techniques
– Fuzzy logic systems
– Neural networks
– Evolutionary algorithms
• Genetic algorithm
• Simulated annealing
• Particle swarm optimization