# adtypes.jl
"""
AutoEnzyme <: AbstractADType
An AbstractADType choice for use in OptimizationFunction for automatically
generating the unspecified derivative functions. Usage:
```julia
OptimizationFunction(f, AutoEnzyme(); kwargs...)
```
This uses the [Enzyme.jl](https://github.com/EnzymeAD/Enzyme.jl) package. Enzyme performs automatic differentiation on the LLVM IR generated from Julia code.
It is highly efficient, and its ability to perform AD on optimized code allows Enzyme to meet or exceed the performance of state-of-the-art AD tools.
- Compatible with GPUs
- Compatible with Hessian-based optimization
- Compatible with Hv-based optimization
- Compatible with constraints
Note that only the unspecified derivative functions are defined. For example,
if a `hess` function is supplied to the `OptimizationFunction`, then the Hessian
is not defined via Enzyme.
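## Example

A minimal end-to-end sketch (illustrative Rosenbrock objective; assumes the
Enzyme and OptimizationOptimJL packages are installed alongside Optimization):

```julia
using Optimization, Enzyme
using OptimizationOptimJL  # provides the Newton solver

# Illustrative objective: the classic Rosenbrock function
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

optf = OptimizationFunction(rosenbrock, AutoEnzyme())
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])
sol = solve(prob, Newton())  # Hessian-based method; derivatives generated by Enzyme
```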
"""
AutoEnzyme
"""
AutoFiniteDiff{T1,T2,T3} <: AbstractADType
An AbstractADType choice for use in OptimizationFunction for automatically
generating the unspecified derivative functions. Usage:
```julia
OptimizationFunction(f, AutoFiniteDiff(); kwargs...)
```
This uses [FiniteDiff.jl](https://github.com/JuliaDiff/FiniteDiff.jl).
While not necessarily the most efficient, this is the only
choice that doesn't require the `f` function to be automatically
differentiable, so it works for any `f`. However, because
it uses finite differencing, one needs to be careful, as this procedure
introduces numerical error into the derivative estimates.
- Compatible with GPUs
- Compatible with Hessian-based optimization
- Compatible with Hv-based optimization
- Compatible with constraint functions
Note that only the unspecified derivative functions are defined. For example,
if a `hess` function is supplied to the `OptimizationFunction`, then the
Hessian is not defined via FiniteDiff.
## Constructor
```julia
AutoFiniteDiff(; fdtype = Val(:forward), fdjtype = fdtype, fdhtype = Val(:hcentral))
```
- `fdtype`: the method used for defining the gradient.
- `fdjtype`: the method used for defining the Jacobian of the constraints.
- `fdhtype`: the method used for defining the Hessian.
For more information on the derivative type specifiers, see the
[FiniteDiff.jl documentation](https://github.com/JuliaDiff/FiniteDiff.jl).
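## Example

A minimal sketch (illustrative Rosenbrock objective; assumes
OptimizationOptimJL is installed for the solver):

```julia
using Optimization
using OptimizationOptimJL  # provides the BFGS solver

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

# Central differences for the gradient instead of the default forward differences
adtype = AutoFiniteDiff(; fdtype = Val(:central))
optf = OptimizationFunction(rosenbrock, adtype)
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])
sol = solve(prob, BFGS())
```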
"""
AutoFiniteDiff
"""
AutoForwardDiff{chunksize} <: AbstractADType
An AbstractADType choice for use in OptimizationFunction for automatically
generating the unspecified derivative functions. Usage:
```julia
OptimizationFunction(f, AutoForwardDiff(); kwargs...)
```
This uses the [ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl)
package. It is the fastest choice for small systems, especially those with
heavy scalar interactions. It is easy to use and compatible with most
Julia functions, provided they are written generically enough to accept
ForwardDiff's dual numbers. However, because it is forward-mode, its cost
scales with the number of optimization variables, so it compares poorly to
reverse-mode choices on large problems. Hessian construction is suboptimal,
as it uses a forward-over-forward approach.
- Compatible with GPUs
- Compatible with Hessian-based optimization
- Compatible with Hv-based optimization
- Compatible with constraints
Note that only the unspecified derivative functions are defined. For example,
if a `hess` function is supplied to the `OptimizationFunction`, then the
Hessian is not defined via ForwardDiff.
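## Example

A minimal sketch (illustrative Rosenbrock objective; assumes the ForwardDiff
and OptimizationOptimJL packages are installed):

```julia
using Optimization, ForwardDiff
using OptimizationOptimJL  # provides the Newton solver

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

optf = OptimizationFunction(rosenbrock, AutoForwardDiff())
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])
sol = solve(prob, Newton())  # gradient and Hessian generated by ForwardDiff
```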
"""
AutoForwardDiff
"""
AutoModelingToolkit <: AbstractADType
An AbstractADType choice for use in OptimizationFunction for automatically
generating the unspecified derivative functions. Usage:
```julia
OptimizationFunction(f, AutoModelingToolkit(); kwargs...)
```
This uses the [ModelingToolkit.jl](https://github.com/SciML/ModelingToolkit.jl)
package's `modelingtoolkitize` functionality to generate the derivatives and other fields of an `OptimizationFunction`.
This backend creates symbolic expressions for the objective and its derivatives, as well as
the constraints and their derivatives. Through `structural_simplify`, it enforces simplifications
that can reduce the number of operations needed to compute the derivatives of the constraints. It also automatically
generates the expression graphs required by some solvers accessed through OptimizationMOI, such as
[AmplNLWriter.jl](https://github.com/jump-dev/AmplNLWriter.jl).
- Compatible with GPUs
- Compatible with Hessian-based optimization
- Compatible with Hv-based optimization
- Compatible with constraints
Note that only the unspecified derivative functions are defined. For example,
if a `hess` function is supplied to the `OptimizationFunction`, then the
Hessian is not generated via ModelingToolkit.
## Constructor
```julia
AutoModelingToolkit(false, false)
```
- `obj_sparse`: whether the objective's Hessian is sparse.
- `cons_sparse`: whether the constraints' Jacobian and Hessian are sparse.
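## Example

A minimal sketch (illustrative Rosenbrock objective; assumes the
ModelingToolkit and OptimizationOptimJL packages are installed):

```julia
using Optimization, ModelingToolkit
using OptimizationOptimJL  # provides the Newton solver

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

# AutoModelingToolkit(true, true) would instead request sparse derivatives
optf = OptimizationFunction(rosenbrock, AutoModelingToolkit())
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])
sol = solve(prob, Newton())  # Hessian built from the symbolic expressions
```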
"""
AutoModelingToolkit
"""
AutoReverseDiff <: AbstractADType
An AbstractADType choice for use in OptimizationFunction for automatically
generating the unspecified derivative functions. Usage:
```julia
OptimizationFunction(f, AutoReverseDiff(); kwargs...)
```
This uses the [ReverseDiff.jl](https://github.com/JuliaDiff/ReverseDiff.jl)
package. `AutoReverseDiff` has a keyword argument, `compile`, which
denotes whether the reverse pass should be compiled. **`compile` should only
be set to `true` if `f` contains no branches (if statements, while loops);
otherwise it can produce incorrect derivatives!**
`AutoReverseDiff` is generally applicable to many pure Julia codes,
and with `compile=true` it is one of the fastest options on code with
heavy scalar interactions. Hessian calculations are fast by mixing
ForwardDiff with ReverseDiff for forward-over-reverse. However, its
performance can falter when `compile=false`.
- Not compatible with GPUs
- Compatible with Hessian-based optimization by mixing with ForwardDiff
- Compatible with Hv-based optimization by mixing with ForwardDiff
- Not compatible with constraint functions
Note that only the unspecified derivative functions are defined. For example,
if a `hess` function is supplied to the `OptimizationFunction`, then the
Hessian is not defined via ReverseDiff.
## Constructor
```julia
AutoReverseDiff(; compile = false)
```
#### Note: currently, compilation is not defined/used!
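## Example

A minimal sketch (illustrative Rosenbrock objective; assumes the ReverseDiff
and OptimizationOptimJL packages are installed):

```julia
using Optimization, ReverseDiff
using OptimizationOptimJL  # provides the BFGS solver

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

# compile = true is only safe for branch-free objectives (see the warning above)
optf = OptimizationFunction(rosenbrock, AutoReverseDiff(; compile = false))
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])
sol = solve(prob, BFGS())
```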
"""
AutoReverseDiff
"""
AutoTracker <: AbstractADType
An AbstractADType choice for use in OptimizationFunction for automatically
generating the unspecified derivative functions. Usage:
```julia
OptimizationFunction(f, AutoTracker(); kwargs...)
```
This uses the [Tracker.jl](https://github.com/FluxML/Tracker.jl) package.
Though generally slower than ReverseDiff, it is applicable to many
pure Julia codes.
- Compatible with GPUs
- Not compatible with Hessian-based optimization
- Not compatible with Hv-based optimization
- Not compatible with constraint functions
Note that only the unspecified derivative functions are defined. For example,
if a `hess` function is supplied to the `OptimizationFunction`, then the
Hessian is not defined via Tracker.
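## Example

A minimal sketch (illustrative Rosenbrock objective; assumes the Tracker and
OptimizationOptimJL packages are installed):

```julia
using Optimization, Tracker
using OptimizationOptimJL  # provides the GradientDescent solver

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

optf = OptimizationFunction(rosenbrock, AutoTracker())
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])
# Gradient-only method, since Tracker does not support Hessian-based solvers
sol = solve(prob, GradientDescent())
```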
"""
AutoTracker
"""
AutoZygote <: AbstractADType
An AbstractADType choice for use in OptimizationFunction for automatically
generating the unspecified derivative functions. Usage:
```julia
OptimizationFunction(f, AutoZygote(); kwargs...)
```
This uses the [Zygote.jl](https://github.com/FluxML/Zygote.jl) package.
This is the staple reverse-mode AD that handles a large portion of
Julia with good efficiency. Hessian construction is fast via
forward-over-reverse, mixing ForwardDiff.jl with Zygote.jl.
- Compatible with GPUs
- Compatible with Hessian-based optimization via ForwardDiff
- Compatible with Hv-based optimization via ForwardDiff
- Not compatible with constraint functions
Note that only the unspecified derivative functions are defined. For example,
if a `hess` function is supplied to the `OptimizationFunction`, then the
Hessian is not defined via Zygote.
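## Example

A minimal sketch (illustrative Rosenbrock objective; assumes the Zygote and
OptimizationOptimJL packages are installed):

```julia
using Optimization, Zygote
using OptimizationOptimJL  # provides the LBFGS solver

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

optf = OptimizationFunction(rosenbrock, AutoZygote())
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])
sol = solve(prob, LBFGS())  # gradient via Zygote's reverse mode
```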
"""
AutoZygote