"Native" support for parametric (LP/MILP) models #3215
Closed
Idea
This PR is based on the initial idea and discussion over at jump-dev/MathOptInterface.jl#2092. In contrast, however, it builds up a pretty similar approach based mostly on JuMP functionality. It also does not rely on a bridge, and supports multiplicative parameters in the objective function (and, to some degree, duals of the parameters).
Preface: there are a lot of rough edges; this is just a first proof-of-concept.
This spends some additional time during model building to set up the proper data structures, but on parameter updates / new iterations it is almost as fast as possible (there is a single `LinearAlgebra.dot` call overhead, and that should be pretty fast for `SparseVector`s)!

Rough outline
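The core mechanism can be sketched conceptually like this (illustrative only, not the PR's actual code): each affected coefficient stores a constant part plus a sparse sensitivity to the parameters, so refreshing it for new parameter values costs a single sparse dot product.

```julia
using SparseArrays, LinearAlgebra

# Illustrative only: a coefficient that depends affinely on the parameter vector.
base = 2.0                          # constant part of the coefficient
sens = sparsevec([1], [1.0], 2)     # sensitivity: depends on parameter 1 only

# Refreshing the coefficient for new parameter values is one sparse dot product.
coeff(p) = base + dot(sens, p)

coeff([0.0, 0.0])   # parameters at zero -> 2.0
coeff([2.0, 0.0])   # parameter 1 set to 2 -> 4.0
```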
Example 1
Create a model and enable parameters for it:
We can now add parameters like variables, and bind them to initial values:
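The two steps above might look roughly like this; `enable_parameters!` and the `@parameter` binding syntax are guesses at the PR's API (only `@parameter` itself is mentioned later in this description), so treat this purely as a sketch:

```julia
using JuMP

model = Model()
enable_parameters!(model)    # hypothetical helper that enables parameter support

@variable(model, x[1:3] >= 0)

# Parameters are added like variables and bound to an initial value.
@parameter(model, q == 2.0)  # hypothetical binding syntax
```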
We can construct a parametric constraint using the constraint macro:
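A constraint consistent with the printed forms quoted in this description might be the following (reconstructed sketch, with `q` initially bound to 2; the PR's actual example may differ):

```julia
# q appears both multiplicatively (q * x[1], q * x[2]) and additively (- q).
@constraint(model, c1, 2x[1] + q * x[1] + q * x[2] + x[3] - q >= 0)
```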
You'll notice that this constraint actually reads
2 x[1] + x[3] - q ≥ 0.0
currently. This is due to the fact that multiplicative parameters are only substituted before calling the solver (so they are currently considered to be 0 from the constraint's point of view).

We can also define an objective using parameters:
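An objective matching the printed result quoted in this description could look like this, assuming a hypothetical parameter `o` bound to 3 (again just a sketch):

```julia
@parameter(model, o == 3.0)                       # hypothetical binding syntax
@objective(model, Min, o * (x[1] + x[2] + x[3]))  # multiplicative parameter in the objective
```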
And now we just call
optimize!(model)
Let's look at the constraint now; it results in
4 x[1] + x[3] - q + 2 x[2] ≥ 0.0
and the objective `objective_function(model)` results in `3 x[3] + 3 x[2] + 3 x[1]`.

We can also query duals of the parameters, but that needs to be handled with care, since it only works for parameters that do not occur in a multiplicative way.
Updating a parameter is as simple as rebinding its value, which again does not affect the constraints immediately, but only as soon as we actually call `optimize!(...)`
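Such an update might look like this; `bind!` is a hypothetical setter name (the PR's actual API may differ):

```julia
bind!(q, 4.0)   # hypothetical: rebind parameter q; takes effect at the next optimize!
```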
. It is therefore pretty fast.

Example 2
Using expressions works like we would expect - except for the fact that my constraint macro is ... lacking. But that can be fixed. See the spoiler for the full example:
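Expression usage might look roughly like the following; the `@parameter` binding syntax is a guess, and the spoiler mentioned above contains the real example:

```julia
# Hypothetical sketch of using parameters inside expressions.
@parameter(model, p[1:3] == 1.0)                        # guessed syntax
ex = @expression(model, sum(p[i] * x[i] for i in 1:3))  # parametric expression
@constraint(model, ex >= 1)                             # used in a constraint as usual
```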
Benchmark
Similar to the MOI PR, I am comparing against doing it manually. See the code in the spoiler:
Results
The best possible (manual) timings are:
The timings of the parametric model are:
Notes
I am using `Gurobi` for this at the moment, since `HiGHS` struggles with cases with "more" variables (see also this comment by odow).
Open points
- the `@parameter` macro is a lot slower than doing what I thought it does manually, so this is completely wrong; the expression/constraint macros need to properly bind their results in the calling scope

credits and history: discourse (I've tried building expressions, building functions, using `DynamicExpressions.jl`, ...)

cc because maybe interested: @fleimgruber