SciPy Optimization

This document demonstrates solving optimization problems with SciPy, using a variant of the Rosenbrock function as the test problem. It covers brute-force grid search, differential evolution, and local minimization with BFGS and Nelder-Mead; every method converges to a function value near zero at the optimum [1, 1].


Solving Optimization Problems in SciPy

In [1]: from scipy import optimize

In [2]: def f(x):  # the Rosenbrock function
            return .5*(1 - x[0])**2 + (x[1] - x[0]**2)**2

        print f([1, 1])

0.0
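
For reference, f is a scaled variant of the Rosenbrock function,

    f(x) = (1/2)*(1 - x_0)^2 + (x_1 - x_0^2)^2,

whose global minimum is f(1, 1) = 0, which the check above confirms.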

In [3]: result = optimize.brute(f, ((-5, 5), (-5, 5)))
        print result

[ 0.99999324  1.00001283]
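
optimize.brute evaluates f on a regular grid over the given ranges and then polishes the best grid point with a local minimizer. A minimal sketch of inspecting the search more closely via brute's Ns (grid density) and full_output parameters:

    # Denser grid; full_output also returns the evaluation grid
    # and the function values computed on it.
    x0, fval, grid, Jout = optimize.brute(f, ((-5, 5), (-5, 5)),
                                          Ns=50, full_output=True)
    print x0, fval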

In [4]: print optimize.differential_evolution(f, ((-5, 5), (-5, 5)))

nfev: 1803
success: True
fun: 1.9180451127782141e-20
x: array([ 1., 1.])
message: 'Optimization terminated successfully.'
nit: 59
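
differential_evolution is a stochastic global method, so repeated runs can differ unless seeded. A minimal sketch using its seed parameter and the fields of the returned OptimizeResult:

    # Fix the random seed for a reproducible run.
    result = optimize.differential_evolution(f, ((-5, 5), (-5, 5)),
                                             seed=0)
    print result.x, result.fun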

In [5]: import numpy as np

        def g(x):  # analytic gradient of f
            return np.array((-2*.5*(1 - x[0]) - 4*x[0]*(x[1] - x[0]**2),
                             2*(x[1] - x[0]**2)))
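
g is the analytic gradient of f: differentiating the expression from In [2] gives

    df/dx_0 = -(1 - x_0) - 4*x_0*(x_1 - x_0^2)
    df/dx_1 = 2*(x_1 - x_0^2),

which is exactly what g returns (the code leaves the factor -2*.5 = -1 unsimplified).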

In [6]: print optimize.check_grad(f, g, [2, 2])

2.38418579102e-07
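
check_grad returns the 2-norm of the difference between g and a finite-difference estimate of the gradient, so a value near machine precision confirms the analytic gradient. The estimate itself comes from optimize.approx_fprime; a sketch:

    eps = np.sqrt(np.finfo(float).eps)            # standard step size
    print optimize.approx_fprime([2, 2], f, eps)  # numerical gradient
    print g([2, 2])                               # analytic gradient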

In [7]: print optimize.fmin_bfgs(f, [2, 2], fprime=g)

Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 16
         Function evaluations: 24
         Gradient evaluations: 24
[ 1.00000017  1.00000026]
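
fmin_bfgs is the legacy interface; the unified optimize.minimize used below accepts the same analytic gradient through its jac argument, avoiding the extra function evaluations spent on numerical differentiation. A sketch:

    # Same optimization, expressed through the unified interface.
    print optimize.minimize(f, [2, 2], method='BFGS', jac=g)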

In [8]: print optimize.minimize(f, [2, 2])

status: 0
success: True
njev: 24
nfev: 96
hess_inv: array([[ 0.98632031, 1.97824298],
[ 1.97824298, 4.46512254]])
fun: 9.536835216356594e-15
x: array([ 1.00000007, 1.00000005])
message: 'Optimization terminated successfully.'
jac: array([ 4.74151523e-07, -1.53924328e-07])

In [9]: print optimize.minimize(f, [2, 2], method='BFGS')

status: 0
success: True
njev: 24
nfev: 96
hess_inv: array([[ 0.98632031, 1.97824298],
[ 1.97824298, 4.46512254]])
fun: 9.536835216356594e-15
x: array([ 1.00000007, 1.00000005])
message: 'Optimization terminated successfully.'
jac: array([ 4.74151523e-07, -1.53924328e-07])
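
The result is identical to In [8]: for an unconstrained problem without bounds or constraints, minimize uses BFGS by default.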

In [10]: print optimize.minimize(f, [2, 2], method='Nelder-Mead')

status: 0
nfev: 91
success: True
fun: 1.2311995365407462e-10
x: array([ 0.99998568, 0.99996682])
message: 'Optimization terminated successfully.'
nit: 46
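
Nelder-Mead is derivative-free, so it typically needs more function evaluations but works when no gradient is available. Its stopping behaviour can be tuned through the generic options dict, e.g. the maxiter and disp keys:

    # Cap the iteration count and print a convergence message.
    res = optimize.minimize(f, [2, 2], method='Nelder-Mead',
                            options={'maxiter': 200, 'disp': True})
    print res.x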
