Optimization.jl: A Unified Optimization Package

Optimization.jl provides the easiest way to create an optimization problem and solve it. It enables rapid prototyping and experimentation with minimal syntax overhead by providing a uniform interface to more than 25 optimization libraries, and hence 100+ optimization solvers, covering almost all classes of optimization algorithms: global, mixed-integer, non-convex, second-order local, constrained, and more. It lets you choose an Automatic Differentiation (AD) backend simply by passing an argument indicating which package to use, and it automatically generates efficient derivatives of the objective and constraints while giving you the flexibility to switch between AD engines as your problem requires. Additionally, Optimization.jl takes care of passing problem-specific information, such as the sparsity pattern of the Hessian or constraint Jacobian and the expression graph, to solvers that can leverage it.
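As a minimal sketch of this workflow (assuming the OptimizationOptimJL solver wrapper described under Installation below has been added), defining the objective, choosing an AD backend, and picking a solver each take a single argument:

using Optimization, OptimizationOptimJL

# Rosenbrock objective: x is the decision vector, p holds fixed parameters.
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

# The AD backend is just an argument; switching to, e.g., Optimization.AutoZygote()
# or Optimization.AutoFiniteDiff() is a one-argument change.
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])

# Solve with a local gradient-based method from Optim.jl.
sol = solve(prob, BFGS())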

It extends the common SciML interface, so it is immediately familiar to anyone who has used the SciML ecosystem, and it is straightforward to extend to new solvers and new problem types. The package is actively maintained, and new features are added regularly.

Installation

Assuming that you already have Julia correctly installed, it suffices to install Optimization.jl in the standard way:

import Pkg
Pkg.add("Optimization")

The packages relevant to the core functionality of Optimization.jl will be installed automatically and, in most cases, you do not have to worry about manually installing dependencies. However, you will need to add the specific optimizer packages yourself.
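For example (a sketch assuming you want the Optim.jl and BlackBoxOptim.jl backends; each solver library has its own wrapper package):

import Pkg
# Each solver library is wrapped by its own package and installed separately.
Pkg.add("OptimizationOptimJL")   # wraps Optim.jl
Pkg.add("OptimizationBBO")       # wraps BlackBoxOptim.jl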

Contributing

Overview of the Optimizers

Each wrapped package is characterized by whether it provides local gradient-based, local Hessian-based, local derivative-free, box-constrained, local constrained, global unconstrained, and global constrained optimizers. Some of these capabilities are supported in the downstream library but not yet implemented in Optimization; PRs to add this functionality are welcome. The wrapped packages are:

- BlackBoxOptim
- CMAEvolutionaryStrategy
- Evolutionary
- Flux
- GCMAES
- MathOptInterface
- MultistartOptimization
- Metaheuristics
- NOMAD
- NLopt
- Optim
- PRIMA
- QuadDIRECT
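As a rough illustration of how these solver classes plug into the same solve call (a sketch assuming the OptimizationOptimJL and OptimizationBBO wrappers are installed; BBO_adaptive_de_rand_1_bin_radiuslimited is one of BlackBoxOptim's differential-evolution methods):

using Optimization, OptimizationOptimJL, OptimizationBBO

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())

# Local gradient-based solve with Optim.jl.
prob_local = OptimizationProblem(optf, zeros(2), [1.0, 100.0])
sol_local = solve(prob_local, BFGS())

# Global derivative-free solve with BlackBoxOptim; note the box constraints,
# which global methods require.
prob_global = OptimizationProblem(optf, zeros(2), [1.0, 100.0];
                                  lb = [-1.0, -1.0], ub = [2.0, 2.0])
sol_global = solve(prob_global, BBO_adaptive_de_rand_1_bin_radiuslimited())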

Citation

@software{vaibhav_kumar_dixit_2023_7738525,
  author = {Vaibhav Kumar Dixit and Christopher Rackauckas},
  month = mar,
  publisher = {Zenodo},
  title = {Optimization.jl: A Unified Optimization Package},
  version = {v3.12.1},
  doi = {10.5281/zenodo.7738525},
  url = {https://doi.org/10.5281/zenodo.7738525},
  year = 2023
}

Reproducibility

The documentation of this SciML package was built using these direct dependencies,

using Pkg # hide
Pkg.status() # hide

and using this machine and Julia version.

using InteractiveUtils # hide
versioninfo() # hide

A more complete overview of all dependencies and their versions is also provided.

using Pkg # hide
Pkg.status(; mode = PKGMODE_MANIFEST) # hide
using TOML
using Markdown
version = TOML.parse(read("../../Project.toml", String))["version"]
name = TOML.parse(read("../../Project.toml", String))["name"]
link_manifest = "https://github.com/SciML/" * name * ".jl/tree/gh-pages/v" * version *
                "/assets/Manifest.toml"
link_project = "https://github.com/SciML/" * name * ".jl/tree/gh-pages/v" * version *
               "/assets/Project.toml"
Markdown.parse("""You can also download the
[manifest]($link_manifest)
file and the
[project]($link_project)
file.
""")