ADNLPModels


This package provides automatic differentiation (AD)-based model implementations that conform to the NLPModels API. The general form of the optimization problem is

\begin{aligned}
\min \quad & f(x) \\
\text{s.t.} \quad & c_L \leq c(x) \leq c_U \\
& \ell \leq x \leq u.
\end{aligned}

How to Cite

If you use ADNLPModels.jl in your work, please cite using the format given in CITATION.bib.

Installation

ADNLPModels is a Julia package. To install it, open Julia's interactive session (the REPL), press `]` to enter package mode, and type the following command:

pkg> add ADNLPModels

Examples

For optimization in the general form, this package exports two constructors ADNLPModel and ADNLPModel!.

using ADNLPModels

f(x) = 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
T = Float64
x0 = T[-1.2; 1.0]
# Rosenbrock
nlp = ADNLPModel(f, x0) # unconstrained

lvar, uvar = zeros(T, 2), ones(T, 2) # must have the same element type as `x0`
nlp = ADNLPModel(f, x0, lvar, uvar) # bound-constrained

c(x) = [x[1] + x[2]]
lcon, ucon = -T[0.5], T[0.5]
nlp = ADNLPModel(f, x0, lvar, uvar, c, lcon, ucon) # constrained

c!(cx, x) = begin
  cx[1] = x[1] + x[2]
  return cx
end
nlp = ADNLPModel!(f, x0, lvar, uvar, c!, lcon, ucon) # in-place constrained
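Once constructed, these models are evaluated through the NLPModels API. A minimal sketch, assuming NLPModels.jl is installed (`obj` and `grad` are part of that API):

```julia
using ADNLPModels, NLPModels

f(x) = 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
x0 = [-1.2; 1.0]
nlp = ADNLPModel(f, x0)

obj(nlp, x0)   # objective value: 24.2
grad(nlp, x0)  # gradient computed by AD: [-215.6, -88.0]
```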

It is also possible to declare linear constraints separately from nonlinear ones; see the documentation for details.
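For instance, linear constraints can be supplied as a matrix together with their bounds. A hedged sketch (the exact constructor signature may vary across versions; check the documentation of your installed version):

```julia
using ADNLPModels, SparseArrays

f(x) = 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
x0 = [-1.2; 1.0]
A = sparse([1.0 1.0])      # linear constraint x[1] + x[2]
lcon, ucon = [-0.5], [0.5]
nlp = ADNLPModel(f, x0, A, lcon, ucon) # linearly constrained
```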

This package also exports the constructors ADNLSModel and ADNLSModel! for Nonlinear Least Squares (NLS) problems, i.e., problems whose objective is a sum of squared residuals, f(x) = ½‖F(x)‖².

using ADNLPModels

F(x) = [10 * (x[2] - x[1]^2); x[1] - 1]
nequ = 2 # length of Fx
T = Float64
x0 = T[-1.2; 1.0]
# Rosenbrock in NLS format
nlp = ADNLSModel(F, x0, nequ)
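The residual and the corresponding least-squares objective can then be evaluated through the NLPModels API (`residual` and `obj` are part of that API; for an NLS model, `obj` returns ½‖F(x)‖²):

```julia
using ADNLPModels, NLPModels

F(x) = [10 * (x[2] - x[1]^2); x[1] - 1]
x0 = [-1.2; 1.0]
nls = ADNLSModel(F, x0, 2)

residual(nls, x0)  # [-4.4, -2.2]
obj(nls, x0)       # ½‖F(x0)‖² = 12.1
```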

The resulting models, ADNLPModel and ADNLSModel, are instances of AbstractNLPModel and implement the NLPModels API; see NLPModels.jl.

We refer to the documentation for more details on the resulting models. Tutorials are available at jso.dev/tutorials/; select the tag ADNLPModels.jl.

AD backend

The following AD packages are supported:

  • ForwardDiff.jl;
  • ReverseDiff.jl;

and, as optional dependencies (you must load the corresponding package first):

  • Enzyme.jl;
  • SparseDiffTools.jl;
  • Symbolics.jl;
  • Zygote.jl.
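A different backend can be selected per operation via keyword arguments at construction time. A hedged sketch (the keyword name `gradient_backend` and the backend type name `ADNLPModels.ReverseDiffADGradient` reflect the package at the time of writing; check the documentation of your installed version):

```julia
using ADNLPModels

f(x) = 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
x0 = [-1.2; 1.0]

# use ReverseDiff for the gradient instead of the default ForwardDiff
nlp = ADNLPModel(f, x0, gradient_backend = ADNLPModels.ReverseDiffADGradient)
```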

Bug reports and discussions

If you think you found a bug, feel free to open an issue. Focused suggestions and requests can also be opened as issues. Please start an issue or a discussion on the topic before opening a pull request.

If you want to ask a question not suited for a bug report, feel free to start a discussion here. This forum is for general discussion about this repository and the JuliaSmoothOptimizers organization, so questions about any of our packages are welcome.