
Complete Plasmo.jl rewrite and updates for JuMP nonlinear interface #105

Merged Jul 20, 2024 · 130 commits

Conversation

@jalving (Member) commented Jul 3, 2024:

This is a mega PR that rewrites the core of Plasmo.jl to fully use the new JuMP nonlinear interface, in addition to a refactor that makes the package more graph-centric (versus the current v0.5, which is very node-centric).

The main theme of Plasmo.jl is now to treat an optigraph as an optimization problem made of nodes and edges that a user is free to modify, or from which a user can generate a new optigraph by partitioning or by querying the graph topology. The key change is to strictly use graphs as subproblems (as opposed to nodes), which standardizes the solution approaches users might take (it is still possible to optimize an individual node, but doing so internally creates a new graph containing that one node).

In v0.5 of Plasmo.jl, we use a JuMP.Model for each node, which does not scale when nodes contain few variables and constraints and the graph consists of thousands of nodes (e.g. dynamic optimization problems discretized with collocation methods). The new implementation makes nodes and edges more lightweight by storing their model data in the graph they are created in. The v0.5 implementation was also quite hacky in how it integrated JuMP.Model objects with the optigraph. It is still possible to set a JuMP.Model on a node, but doing so copies the model data over; the JuMP.Model itself is not modified by the operation. This also means users should work with the optinode after setting a JuMP.Model if they want to make further modifications to the graph.
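
As a minimal sketch of the new graph-centric workflow (the macro and node-indexing syntax matches what is used elsewhere in this PR; exact v0.6 details may differ):

using Plasmo

graph = OptiGraph()

# Nodes are lightweight references; their model data is stored in `graph`.
@optinode(graph, nodes[1:2])
@variable(nodes[1], x >= 0)
@variable(nodes[2], x >= 0)
@objective(graph, Min, nodes[1][:x] + 2 * nodes[2][:x])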

Other major changes include:

  • The legacy nonlinear interface is no longer supported: @NLconstraint, @NLobjective, and @NLexpression will not work. It would be incredibly difficult to make the legacy interface work with the new changes; in fact, the point of the rewrite is to take advantage of the new nonlinear interface and how easy it makes managing model structures (see the sketch after this list).
  • We no longer use a LinkConstraintRef type to store linking constraints. Linking constraints are treated as standard MOI constraints that exist on edges.
  • The projection interface was changed and no longer returns a mapping between an optigraph and the projected graph. Every projection function now returns a graph data structure that handles all of the mapping, which simplifies using projections for partitioning or querying topology.
  • A hypergraph is no longer stored internally on an optigraph. If a user wants to use hypergraph functions (e.g. to query neighbors), they need to create a hypergraph projection.
  • The optigraph backend was rewritten to support different possible model backends. Right now we only have a backend that maps graph elements to an internal MOI model. The intent is that we could develop and hook up a specialized structured-solver interface at some point. This is what GraphOptInterface.jl is intended to provide, but it is definitely a long way off.
  • We got rid of the OptiGraphNLPEvaluator. We don't need it now that we're using the new JuMP nonlinear expressions.
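
A short illustration of these items, continuing the sketch above (the projection and neighbor-query function names are assumptions based on this description and may differ from the released API):

# New JuMP nonlinear interface: nonlinear constraints use the standard macros.
@constraint(nodes[1], nodes[1][:x]^3 + sin(nodes[1][:x]) <= 10)

# Linking constraints are plain MOI constraints stored on an edge.
@linkconstraint(graph, nodes[1][:x] + nodes[2][:x] == 4)

# Hypergraph functions now require an explicit projection.
projection = hyper_projection(graph)              # assumed name
neighbors = all_neighbors(projection, nodes[1])   # assumed name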

Short-term issues to address:

  • We need to update the documentation
  • We need to make sure we aren't hitting any private JuMP methods.
  • We should update PlasmoPlots.jl to handle the new optigraph interface.
  • We should update SchwarzOpt.jl and write it so that it works as a standard graph solver. This would be a good demonstration of writing standard meta-algorithms that use the optigraph as the core algorithm data structure.
  • There are likely missing JuMP methods that need to be implemented for an OptiGraph.

The Long-term Roadmap

GraphOptInterface

There are a couple of directions we could take Plasmo.jl from here. I think developing GraphOptInterface.jl and using it to interface with MadNLP.jl to perform Schur decomposition could be a useful start. I have always wanted to create a standard interface to DSPopt.jl and HiOp.jl, although those packages do not seem to be actively maintained. We would also need to think about how to do distributed computing with GraphOptInterface.jl.

Distributed OptiGraphs

We could also develop distributed optigraphs that work the same way as normal optigraphs (possibly by parameterizing the optigraph with some new types). These graphs could support writing standard distributed algorithms using the same methods that exist on the current OptiGraph. The way we manage distributed structures could also use or mirror what we do in GraphOptInterface.jl.

Review comment on the following code:

edge_pointer = optiedge.backend.optimizers[graph.id]
dual_value = MOI.get(edge_pointer, MOI.ConstraintDual(), linkref)
return dual_value

function JuMP.set_objective_function(graph::OptiGraph, expr::JuMP.AbstractJuMPScalar)
Collaborator commented:

Wondering if there needs to be a separate function for this when expr is a single NodeVariableRef rather than an expression. If I run the code below, it throws an error, and if I query the moi_backend's optimizer.is_objective_function_set, it returns false.

using Plasmo, HiGHS

g = OptiGraph()
set_optimizer(g, HiGHS.Optimizer)
@optinode(g, ntest)
@variable(ntest, var[1:2] >= 0)

# objective is a single variable, not an expression
@objective(g, Min, ntest[:var][1])

optimize!(g)  # throws an error

@jalving (Member Author) replied:

I'm not quite sure what is going on here. A variable objective function works with other optimizers like Ipopt. It does not look like JuMP does anything special to handle this case, and a variable objective definitely works with HiGHS through plain JuMP. I'll open an issue for this if this PR doesn't resolve it.

Review comment on the following code:

JuMP.delete(JuMP.owner_model(nvref), JuMP.BinaryRef(nvref))
return nothing
end

Collaborator commented:

You had asked if there were any other JuMP methods we might be missing. This is not essential, but having JuMP.relax_integrality could be convenient. Maybe this is just a # TODO for now, but it could be nice in the future. In the DDP solver, I do this manually.
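
For context, JuMP.relax_integrality(model) relaxes all integrality restrictions and returns a function that restores them; an OptiGraph method would presumably mirror that contract. A hedged usage sketch (assuming the graph method matches JuMP's):

using Plasmo, HiGHS

g = OptiGraph()
set_optimizer(g, HiGHS.Optimizer)
@optinode(g, n)
@variable(n, x, Bin)
@objective(g, Max, n[:x])

undo = relax_integrality(g)  # assumed to mirror JuMP.relax_integrality
optimize!(g)                 # solves the continuous relaxation
undo()                       # restores the integrality restrictions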

@jalving (Member Author) replied:

We also need set_normalized_coefficient. These may be a simple copy-paste from JuMP.jl. Let's try to get these methods working in this PR.

Collaborator replied:

Sounds good. If you want me to try to add those, I can do that on Monday. If you want to do it yourself, that's fine too.

@jalving (Member Author) replied:

I added the relax_integrality methods. Feel free to take a stab at set_normalized_coefficient; JuMP has a good example implementation. You will want to call MOI.modify on each possible backend, like we do when adding constraints.
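
A minimal sketch of that approach (the helper names graph_backends and graph_index are hypothetical stand-ins for Plasmo's internal lookups, not the actual API; the sketch assumes Plasmo's internal MOI alias for MathOptInterface is in scope):

# Hypothetical sketch: mirror JuMP.set_normalized_coefficient by issuing a
# MOI.ScalarCoefficientChange to every backend that holds the constraint.
function JuMP.set_normalized_coefficient(
    cref::JuMP.ConstraintRef, nvref::NodeVariableRef, value::Number
)
    for backend in graph_backends(cref)      # hypothetical helper
        MOI.modify(
            backend,
            graph_index(backend, cref),      # hypothetical index lookup
            MOI.ScalarCoefficientChange(
                graph_index(backend, nvref), convert(Float64, value)
            ),
        )
    end
    return nothing
end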

@jalving (Member Author) commented Jul 15, 2024:

It looks like doing index_map_FS = index_map[F, S] here is no longer valid Julia code. I'll look into a workaround. Update: this turned out to be a bug that didn't get picked up in stable Julia.

@jalving (Member Author) commented Jul 17, 2024:

@odow All of the private MOI methods should be gone with this PR. Let me know if you come across anything problematic.

@jalving merged commit 06c106e into plasmo-dev:main on Jul 20, 2024 (5 checks passed). The plasmo-rewrite branch was deleted on January 19, 2025.