First off, thanks for taking the time to contribute!
If you think you found a bug, feel free to open an issue. Focused suggestions and requests can also be opened as issues. Before opening a pull request, please start an issue or a discussion.
If you want to ask a question not suited for a bug report, feel free to start a discussion here, a forum for general discussion about this repository and the JuliaSmoothOptimizers organization. Discussions about any of our packages are welcome.
We welcome pull requests proposing new problems to the problem set. As a general guideline, a pull request should concern one problem only. We recommend checking existing problems as a template for your new problems.
Here is a to-do list to help you add new problems:
- Before implementing a new problem, make sure it does not already exist in this repository.
- This package contains implementations using `JuMP` and `ADNLPModels`. A pull request should include both implementations of a new problem. Additionally, a "meta" provides general information regarding the problem. Therefore, a PR adding a new problem should contain 3 files:
  - `src/ADNLPProblems/problem_name.jl`
  - `src/PureJuMP/problem_name.jl`
  - `src/Meta/problem_name.jl`

  In both cases, the function must have the same name `problem_name` as the file.
- When submitting a problem, please pay particular attention to the documentation. We would like to gather as much information as possible on the provenance of problems, other problem sets where the problems are present, and general information on the problem.
The documentation should be added to the file in the `PureJuMP` folder.
- New problems can be scalable; see `ADNLPProblems/arglina.jl` and `PureJuMP/arglina.jl` for examples. In that case, the first keyword parameter should be the number of variables `n::Int`, with default value `default_nvar` (a constant predefined in the module). If your problem has restrictions on the number of variables, e.g., `n` should be odd, or `n` should have the form `4k + 3`, then, instead of throwing an error when the restriction is not satisfied, adjust the number of variables to the closest admissible value. For example, if `n` must be odd and `n = 100` is passed, you can internally convert to `n = 99`. If `n` must have the form `4k + 3` and `n = 100` is passed, compute `k = round(Int, (n - 3) / 4)` and update `n` accordingly (see the sketch after this list).
- A first version of the `meta` can be generated using `generate_meta`. A `String` is returned that can be copy-pasted into the `Meta` folder, and then edited:
  ```julia
  using ADNLPModels, Distributed, NLPModels, NLPModelsJuMP, OptimizationProblems, Test
  include("test/utils.jl")
  # There must exist an exported function `problem_name` that loads the model in the environment.
  create_meta_files(String["catmix", "gasoil", "glider", "methanol", "pinene", "rocket", "steering"])
  ```
- Problems modeled with `ADNLPModels` should be type-stable, i.e., they should all have the keyword argument `type::Type{T} = Float64`, where `T` is the type of the initial guess and the type used by the `NLPModel` API.
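As an illustration of the dimension-adjustment and type-stability conventions above, here is a minimal sketch of a scalable `ADNLPProblems` function. The problem name `somefun`, its objective, and the `4k + 3` restriction are purely illustrative, and `default_nvar` and `ADNLPModels` are assumed to be available from the enclosing module:

```julia
export somefun

function somefun(; n::Int = default_nvar, type::Type{T} = Float64, kwargs...) where {T}
  # Snap the requested dimension to the closest admissible value of the form 4k + 3,
  # instead of throwing an error.
  k = max(round(Int, (n - 3) / 4), 1)
  n = 4 * k + 3
  f = x -> sum((x[i] - 1)^2 for i = 1:n)  # placeholder objective
  x0 = zeros(T, n)                        # initial guess in the requested type T
  return ADNLPModels.ADNLPModel(f, x0, name = "somefun"; kwargs...)
end
```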
In order to standardize the new functions, we offer here a template for both AD and JuMP models.
First, we describe the `PureJuMP` file `function_name.jl`. This file contains the documentation on the problem.
```julia
# Full name of the problem (while function_name could be an abbreviation)
#
# Source of the problem
# Don't hesitate to put more than one source if it is mentioned elsewhere
#
# CUTEst classification (if available)
#
# other information related to the problem
#
export function_name

"A short docstring on the problem"
function function_name(; n::Int = default_nvar, kwargs...)
  nlp = Model()
  # define the model: TODO
  return nlp
end
```
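For concreteness, here is a minimal sketch of what a filled-in `PureJuMP` file might look like. The problem `quadfun` is hypothetical (not in the repository), `default_nvar` is the module constant mentioned above, and `using JuMP` is normally provided by the enclosing module:

```julia
using JuMP

export quadfun

"Separable quadratic test problem (illustrative only)"
function quadfun(; n::Int = default_nvar, kwargs...)
  nlp = Model()
  # one variable per dimension, starting from the origin
  @variable(nlp, x[i = 1:n], start = 0.0)
  # convex quadratic objective: sum_i (x_i - 1)^2
  @objective(nlp, Min, sum((x[i] - 1)^2 for i = 1:n))
  return nlp
end
```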
Next, we describe the `ADNLPProblems` file `function_name.jl`.
```julia
export function_name

function function_name(; n::Int = default_nvar, type::Type{T} = Float64, kwargs...) where {T}
  # define the objective f
  # define the starting point x0 (with element type T)
  nlp = ADNLPModels.ADNLPModel(f, x0, name = "function_name"; kwargs...)
  return nlp
end
```
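The matching `ADNLPProblems` implementation of the same hypothetical `quadfun` could look as follows; the objective and starting point deliberately match the `PureJuMP` sketch above, and `x0` carries the requested element type `T`:

```julia
export quadfun

function quadfun(; n::Int = default_nvar, type::Type{T} = Float64, kwargs...) where {T}
  f = x -> sum((x[i] - 1)^2 for i = 1:n)  # same objective as the PureJuMP version
  x0 = zeros(T, n)                        # same starting point, in the requested type
  return ADNLPModels.ADNLPModel(f, x0, name = "quadfun"; kwargs...)
end
```

It can then be instantiated with, e.g., `quadfun(n = 10, type = Float32)`.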
- Ensure all meta fields are accurate and complete.
- For problem implementations in both ADNLP and PureJuMP, use the same initial point, variable bounds, and constraint bounds, and ensure that objective and constraint values match within a relative tolerance.
- The objective of both implementations must be callable at the starting point and should not return NaN unless expected.
- Problems modeled with `ADNLPModels` should support the `nls = true/false` keyword to allow both `ADNLPModel` and `ADNLSModel` instantiation from the same problem (see the sketch after this list).
- For least-squares problems, instantiate both `ADNLPModel` and `ADNLSModel`, and ensure that `residual!(nls, x, Fx)` is allocation-free and that the objectives agree (or differ by a factor of 2 for LS).
- For variable-size problems, verify that different values of `n` produce the correct `nvar`, that the meta formulas predict the actual values, and that instantiation works at various sizes.
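One possible way to support the `nls` keyword is to build either model from the same residual data. This is only a sketch with a hypothetical problem `hyplsq`; it uses the in-place constructor `ADNLPModels.ADNLSModel!` so that `residual!` can stay allocation-free:

```julia
export hyplsq

function hyplsq(; n::Int = default_nvar, type::Type{T} = Float64, nls::Bool = false, kwargs...) where {T}
  # Residual F(x) with F_i(x) = x_i - 1, so the objective is f(x) = ½‖F(x)‖².
  F! = (Fx, x) -> begin
    for i = 1:n
      Fx[i] = x[i] - 1
    end
    Fx
  end
  x0 = zeros(T, n)
  nls && return ADNLPModels.ADNLSModel!(F!, x0, n, name = "hyplsq-nls"; kwargs...)
  f = x -> sum((x[i] - 1)^2 for i = 1:n) / 2  # equals ½‖F(x)‖², so both objectives agree
  return ADNLPModels.ADNLPModel(f, x0, name = "hyplsq"; kwargs...)
end
```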
The following checklist summarizes what is verified for each new problem.

Meta
- Every new problem (ADNLP or PureJuMP) is registered in Meta, with all fields (origin, objtype, contype, bounds, best-known, etc.) filled correctly.
- Meta formulas for variable sizes match actual model behavior.
Definition
- No extra or spurious exports are introduced.
- Model name matches the file and function name.
Implementation
- Objective and constraint values agree within tolerance at test points.
- The numbers of variables and constraints match between the two implementations (see the sketch below).
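As an example of such a check, here is a sketch using the existing `arglina` problem and `MathOptNLPModel` from `NLPModelsJuMP` to evaluate the JuMP formulation through the NLPModels API:

```julia
using NLPModels, NLPModelsJuMP, OptimizationProblems, Test

nlp_ad   = OptimizationProblems.ADNLPProblems.arglina()
nlp_jump = MathOptNLPModel(OptimizationProblems.PureJuMP.arglina())

x0 = nlp_ad.meta.x0
@test nlp_ad.meta.nvar == nlp_jump.meta.nvar    # same number of variables
@test nlp_ad.meta.ncon == nlp_jump.meta.ncon    # same number of constraints
@test obj(nlp_ad, x0) ≈ obj(nlp_jump, x0) rtol = 1e-8
```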
Sanity
- Objective is callable at the starting point and does not return NaN (unless documented).
- Model instantiates without error for all supported types (`Float32`, `Float64`); see the sketch after this list.
- For scalable problems, changing `n` updates `nvar` and all related meta fields.
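A quick way to exercise these sanity checks, sketched here with `arglina` and the `type` keyword convention described above:

```julia
using NLPModels, OptimizationProblems, Test

for T in (Float32, Float64)
  nlp = OptimizationProblems.ADNLPProblems.arglina(type = T)
  fx = obj(nlp, nlp.meta.x0)       # the objective must be callable at the starting point
  @test !isnan(fx)
  @test eltype(nlp.meta.x0) == T   # the model carries the requested type
end
```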
Least-Squares & In-Place APIs
- If the problem is a least-squares problem, the ADNLP constructor supports `nls = true/false` for both `ADNLPModel` and `ADNLSModel`.
- In-place nonlinear constraint evaluation (`cons_nln!`) and least-squares residuals (`residual!`) are allocation-free.
- For least-squares problems, the objectives for NLP and NLS agree (or differ by a factor of 2, as appropriate).
Zero-Allocation
- All in-place APIs (constraints, residuals) are allocation-free (Julia ≥ 1.7); see the sketch after this list.
- No unnecessary allocations in tight loops or callbacks.
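One way to check the allocation-free requirement locally (a sketch; `nls` and `nlp` stand for already instantiated models with residuals and nonlinear constraints, and `residual!`/`cons_nln!` are the in-place evaluations from `NLPModels`) is to warm up the calls and then measure with `@allocated`:

```julia
using NLPModels, Test

# Least-squares residuals.
x  = nls.meta.x0
Fx = similar(x, nls_meta(nls).nequ)
residual!(nls, x, Fx)                           # warm-up (forces compilation)
@test (@allocated residual!(nls, x, Fx)) == 0

# In-place nonlinear constraints.
cx = similar(nlp.meta.x0, nlp.meta.nnln)
cons_nln!(nlp, nlp.meta.x0, cx)                 # warm-up (forces compilation)
@test (@allocated cons_nln!(nlp, nlp.meta.x0, cx)) == 0
```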