Make things work for general AbstractArrays #980

Conversation
cc: @rdeits
src/JuMP.jl (Outdated)

```diff
@@ -909,9 +909,9 @@ include("print.jl")
 # Deprecations
 include("deprecated.jl")

-getvalue{T<:JuMPTypes}(arr::Array{T}) = map(getvalue, arr)
+getvalue{T<:JuMPTypes}(arr::AbstractArray{T}) = map(getvalue, arr)
```
Not needed, use `getvalue.(arr)` instead.
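For context, a minimal sketch of the dot-broadcast pattern being suggested here (a plain function stands in for `getvalue`, which needs a solved model):

```julia
f(x) = 2x                 # hypothetical stand-in for getvalue
arr = [1 2; 3 4]
f.(arr) == map(f, arr)    # broadcast produces the same result as map here
```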
So delete the method?
It's been previously discussed. The conclusion for now is to allow `getvalue` on objects that JuMP creates, and JuMP can create `Array{Variable}`, e.g., PSD matrices.
src/JuMP.jl (Outdated)

```diff
-function setvalue{T<:AbstractJuMPScalar}(set::Array{T}, val::Array)
+function setvalue{T<:AbstractJuMPScalar}(set::AbstractArray{T}, val::AbstractArray)
```
Also not needed, use `setvalue.(set, val)`.
src/JuMPArray.jl (Outdated)

```diff
@@ -10,7 +10,7 @@ immutable JuMPArray{T,N,NT} <: JuMPContainer{T,N}
     meta::Dict{Symbol,Any}
 end

-@generated function JuMPArray{T,N}(innerArray::Array{T,N}, indexsets::NTuple{N,Any})
+@generated function JuMPArray{T,N}(innerArray::AbstractArray{T,N}, indexsets::NTuple{N,Any})
```
I don't think this is needed
src/affexpr.jl (Outdated)

```diff
 error("The operators <=, >=, and == can only be used to specify scalar constraints. If you are trying to add a vectorized constraint, use the element-wise dot comparison operators (.<=, .>=, or .==) instead")

-function addVectorizedConstraint(m::Model, v::Array{LinearConstraint})
+function addVectorizedConstraint(m::Model, v::AbstractArray{LinearConstraint})
```
This is an internal function; it's unclear how it could be called with an `AbstractArray`.
src/callbacks.jl (Outdated)

```diff
@@ -59,7 +59,7 @@ function addinfocallback(m::Model, f::Function; when::Symbol = _unspecifiedstate
     push!(m.callbacks, InfoCallback(f, when))
 end

-function lazycallback(d::MathProgBase.MathProgCallbackData, m::Model, cbs::Vector{LazyCallback})
+function lazycallback(d::MathProgBase.MathProgCallbackData, m::Model, cbs::AbstractVector{LazyCallback})
```
Internal, won't be called with AbstractVector
src/callbacks.jl (Outdated)

```diff
@@ -92,7 +92,7 @@ function lazycallback(d::MathProgBase.MathProgCallbackData, m::Model, cbs::Vecto
     :Continue
 end

-function attach_callbacks(m::Model, cbs::Vector{LazyCallback})
+function attach_callbacks(m::Model, cbs::AbstractVector{LazyCallback})
```
Internal, won't be called with AbstractVector
src/callbacks.jl (Outdated)

```diff
@@ -101,7 +101,7 @@ function attach_callbacks(m::Model, cbs::Vector{LazyCallback})
     end
 end

-function cutcallback(d::MathProgBase.MathProgCallbackData, m::Model, cbs::Vector{CutCallback})
+function cutcallback(d::MathProgBase.MathProgCallbackData, m::Model, cbs::AbstractVector{CutCallback})
```
Internal, won't be called with AbstractVector
src/callbacks.jl (Outdated)

```diff
@@ -119,7 +119,7 @@ function cutcallback(d::MathProgBase.MathProgCallbackData, m::Model, cbs::Vector
     :Continue
 end

-function attach_callbacks(m::Model, cbs::Vector{CutCallback})
+function attach_callbacks(m::Model, cbs::AbstractVector{CutCallback})
```
Internal, won't be called with AbstractVector
src/callbacks.jl (Outdated)

```diff
@@ -129,7 +129,7 @@ function attach_callbacks(m::Model, cbs::Vector{CutCallback})
 end


-function heurcallback(d::MathProgBase.MathProgCallbackData, m::Model, cbs::Vector{HeuristicCallback})
+function heurcallback(d::MathProgBase.MathProgCallbackData, m::Model, cbs::AbstractVector{HeuristicCallback})
```
Internal, won't be called with AbstractVector
src/callbacks.jl (Outdated)

```diff
@@ -147,7 +147,7 @@ function heurcallback(d::MathProgBase.MathProgCallbackData, m::Model, cbs::Vecto
     :Continue
 end

-function attach_callbacks(m::Model, cbs::Vector{HeuristicCallback})
+function attach_callbacks(m::Model, cbs::AbstractVector{HeuristicCallback})
```
Internal, won't be called with AbstractVector
src/callbacks.jl (Outdated)

```diff
@@ -156,7 +156,7 @@ function attach_callbacks(m::Model, cbs::Vector{HeuristicCallback})
     end
 end

-function infocallback(d::MathProgBase.MathProgCallbackData, m::Model, cbs::Vector{InfoCallback})
+function infocallback(d::MathProgBase.MathProgCallbackData, m::Model, cbs::AbstractVector{InfoCallback})
```
Internal, won't be called with AbstractVector
src/nlp.jl (Outdated)

```diff
@@ -169,7 +169,7 @@ type NLPEvaluator <: MathProgBase.AbstractNLPEvaluator
     end
 end

-function simplify_expression(nd::Vector{NodeData}, const_values, subexpression_linearity, fixed_variables, parameter_values, x_values, subexpression_values)
+function simplify_expression(nd::AbstractVector{NodeData}, const_values, subexpression_linearity, fixed_variables, parameter_values, x_values, subexpression_values)
```
None of the changes in this file are needed
src/operators.jl (Outdated)

```diff
@@ -370,7 +370,7 @@ function _multiply!{T<:JuMPTypes}(ret::Array{T}, lhs::Array, rhs::Array)
 end

 # this computes lhs.'*rhs and places it in ret
-function _multiplyt!{T<:JuMPTypes}(ret::Array{T}, lhs::Array, rhs::Array)
+function _multiplyt!{T<:JuMPTypes}(ret::AbstractArray{T}, lhs::AbstractArray, rhs::AbstractArray)
```
These methods are designed for dense matrices and shouldn't be widened
src/sos.jl (Outdated)

```diff
@@ -25,7 +25,7 @@ Base.copy(sos::SOSConstraint, new_model::Model) =

 # Given a vector of affine expressions, extract a vector of the single
 # variable in each expression and a vector of their coefficients
-function constructSOS(m::Model, coll::Vector{AffExpr})
+function constructSOS(m::Model, coll::AbstractVector{AffExpr})
```
These changes don't seem relevant
I'm willing to consider this use case, but the changes here are way too broad and touch a number of internal functions that would never be called with an `AbstractArray`.

Sure. Thanks for the quick review. I'd like to argue that…
src/quadexpr.jl (Outdated)

```diff
@@ -95,7 +95,7 @@ function getvalue(a::QuadExpr)
     end
     return ret
 end
-getvalue(arr::Array{QuadExpr}) = map(getvalue, arr)
+getvalue(arr::AbstractArray{QuadExpr}) = map(getvalue, arr)
```
Also `getvalue.(arr)` here.
To motivate why this would be nice to have, here are some cool things that @tkoolen and I have been doing w.r.t. AxisArrays of JuMP variables: https://gist.github.com/rdeits/3371af92c9c4aa61353f38c9a91c02e9

AxisArrays look like a potential replacement for JuMPArrays.

Personally I don't see much value in putting…

AxisArrays do seem like a great replacement for the categorical indexing in JuMPArray. I think they don't cover the ragged and non-1-based indexing that JuMPArray supports, though.
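For reference, the indexing flexibility being contrasted can be sketched with JuMP's container macros (illustrative only):

```julia
using JuMP
m = Model()
@variable(m, x[2:5])           # non-1-based axis: x[2] through x[5]
@variable(m, y[[:a, :b]])      # categorical (symbol) axis
@variable(m, z[i=1:3, j=i:3])  # ragged (triangular) index set
```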
I've addressed your comments. If there are more places where you want things to be more specific, let me know. I limited the test to cases that resulted in…

```julia
m = Model()
x = view(@variable(m, [1:3]), :)
A = rand(3, 3)
@show @which sum(x)
@show @which diagm(x)
@show @which norm(x)
@show @which A * x
@show @which x + 1
@show @which x * 1
```

prints …, while

```julia
m = Model()
x = @variable(m, [1:3])
A = rand(3, 3)
@show @which sum(x)
@show @which diagm(x)
@show @which norm(x)
@show @which A * x
@show @which x + 1
@show @which x * 1
```

prints …

(note: this PR in its current form doesn't address all of these).
The main value I see is that it conveys the intention of the code. Moreover, you still get to put constraints on the dimension and element type, and you need it to avoid ambiguities. However, widening signatures to…

A (possibly overly) cheap response to your argument is that for some methods, you're not currently testing all possible values of…
src/operators.jl (Outdated)

```diff
 ###############
 # The _multiply!(buf,y,z) adds the results of y*z into the buffer buf. No bounds/size
 # checks are performed; it is expected that the caller has done this, has ensured
 # that the eltype of buf is appropriate, and has zeroed the elements of buf (if desired).

-function _multiply!{T<:JuMPTypes}(ret::Array{T}, lhs::Array, rhs::Array)
+function _multiply!{T<:JuMPTypes}(ret::AbstractArray{T}, lhs::Array, rhs::Array)
```
@joehuchette should comment, but I don't think this method was designed for `AbstractArray`s. The built-in fallback should work.
Yes, I don't think this part of the code will work with generic abstract arrays as-is.
src/operators.jl (Outdated)

```diff
@@ -571,7 +571,7 @@ end; end

 # The following are primarily there for internal use in the macro code for @constraint
 for op in [:(+), :(-)]; @eval begin
-    function $op(lhs::Array{Variable},rhs::Array{Variable})
+    function $op(lhs::AbstractArray{Variable},rhs::AbstractArray{Variable})
         (sz = size(lhs)) == size(rhs) || error("Incompatible sizes for $op: $sz $op $(size(rhs))")
         ret = Array{AffExpr}(sz)
         for I in eachindex(ret)
```
I'm concerned about the validity of using `eachindex` from `ret` to index into `lhs` and `rhs` when these could be different types.
I think `eachindex(ret, lhs, rhs)` should work?
Oops, forgot this one.
Fixed (here and in a number of other locations).
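As a side note, the `eachindex(ret, lhs, rhs)` form agreed on above can be sketched as follows: passing all the arrays makes `eachindex` return one iterator valid for each of them, and it throws if their shapes are incompatible (plain `Float64` arrays stand in for JuMP types here):

```julia
lhs = rand(3)
rhs = rand(3)
ret = similar(lhs)
# one shape-checked iterator shared by all three arrays
for I in eachindex(ret, lhs, rhs)
    ret[I] = lhs[I] + rhs[I]
end
```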
src/print.jl (Outdated)

```diff
@@ -44,7 +44,7 @@ end

 # TODO: get rid of this! This is only a helper, and should be Base.values
 # (and probably live there, as well)
-_values(x::Array) = x
+_values(x::AbstractArray) = x
```
Where is this used and why does it need to change?
src/print.jl (Outdated)

```diff
@@ -103,7 +103,7 @@ math(s,mathmode) = mathmode ? s : "\$\$ $s \$\$"
 # helper to look up corresponding JuMPContainerData
 printdata(v::JuMPContainer) = _getmodel(v).varData[v]
 getname(x::JuMPContainer) = hasmeta(x, :model) ? printdata(x).name : "__anon__"
-function printdata(v::Array{Variable})
+function printdata(v::AbstractArray{Variable})
```
Are you redefining printing for `AbstractArray{Variable}`? If not, this doesn't need to be touched.
It's the opposite. Any code in JuMP that's calling…
One option is to follow the suggestion in the docs:

```julia
@assert all(x -> isa(x, Base.OneTo), indices(A))
```
I'll push some more changes tonight.

…-based indexing. Improve tests.

New round of changes, please take a look.
src/operators.jl (Outdated)

```diff
@@ -277,7 +277,8 @@ Base.sum(j::JuMPDict) = sum(values(j.tupledict))
 Base.sum(j::JuMPArray{Variable}) = AffExpr(vec(j.innerArray), ones(length(j.innerArray)), 0.0)
 Base.sum(j::JuMPDict{Variable}) = AffExpr(collect(values(j.tupledict)), ones(length(j.tupledict)), 0.0)
 Base.sum(j::Array{Variable}) = AffExpr(vec(j), ones(length(j)), 0.0)
+Base.sum(j::AbstractArray{Variable}) = sum([j...]) # to handle non-one-indexed arrays.
 function Base.sum{T<:GenericAffExpr}(affs::Array{T})
```
Unless things have changed recently, splatting large arrays can cause big performance issues when the dimension is large, e.g., into the thousands. `collect(j)` might be better here.
`collect(::OffsetArray)` returns a copy of the `OffsetArray` (with the same indices). I found it surprisingly hard to convert from an `OffsetArray` to a regular `Array`. Maybe I'm missing something.
How about `[v[i] for i in eachindex(v)]`?
D'oh. Yeah, that's better.
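A sketch of the conversion being discussed. Note that this reflects OffsetArrays behavior at the time of this thread; in later versions, obtaining a plain `Array` became easier:

```julia
using OffsetArrays   # assumes the OffsetArrays package is installed
v = OffsetArray([10, 20, 30], -3)   # an AbstractVector with indices -2:0
v[-2], v[0]                          # elements are addressed by offset indices
# The fix adopted in this thread: rebuild a container by iterating
# eachindex(v), rather than splatting ([v...]) or relying on collect(v),
# which at the time returned another OffsetArray with the same indices.
w = [v[i] for i in eachindex(v)]
```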
```diff
@@ -569,18 +570,6 @@ end; end
 (/){T<:JuMPTypes}(lhs::SparseMatrixCSC{T}, rhs::Number) =
     SparseMatrixCSC(lhs.m, lhs.n, copy(lhs.colptr), copy(lhs.rowval), lhs.nzval ./ rhs)

-# The following are primarily there for internal use in the macro code for @constraint
-for op in [:(+), :(-)]; @eval begin
```
Are these methods now covered by built-ins in Julia?
Ok, thanks for checking
Getting closer. Please correct the Travis failures as well. (AppVeyor failures are expected.)
More fixes. Two random comments on getting things to work for OffsetArrays, for future reference: …
src/affexpr.jl (Outdated)

```julia
function addVectorizedConstraint(m::Model, v::AbstractArray{LinearConstraint})
    ret = Array{LinConstrRef}(size(v))
    for I in eachindex(v)
        ret[I] = addconstraint(m, v[I])
```
I'm somewhat uncertain if this method should be changed. Is there a compelling reason to pass a sparse matrix of linear constraints, for example?
Without changing this method, the following fails:

```julia
m = Model()
v = @variable(m, [1:3])
x = OffsetArray(v, -3)
@constraint(m, x .== 0)
```

because there is no `convert` method that constructs an `Array` from an `OffsetArray` (a conscious design decision in OffsetArrays).
This doesn't seem like the right fix. I think

```julia
constructconstraint!(x::Array, sense::Symbol) = map(c->constructconstraint!(c,sense), x)
```

should be changed to return a flat `Vector` of constraints. Then there's no need to touch `addVectorizedConstraint`.
Since this also doesn't work:

```julia
t = OffsetArray(rand(3), -3)
t .== 0
```

I'm not sure this is desirable behavior.
OK, will change. @joehuchette, that seems like a bug in Base.
Done.
Travis failures with no output seem to be because Amazon S3 is having issues: https://www.traviscistatus.com/incidents/hmwq9yy5dh9d

I'll restart the builds when Travis stabilizes a bit.
```diff
@@ -277,7 +277,8 @@ Base.sum(j::JuMPDict) = sum(values(j.tupledict))
 Base.sum(j::JuMPArray{Variable}) = AffExpr(vec(j.innerArray), ones(length(j.innerArray)), 0.0)
 Base.sum(j::JuMPDict{Variable}) = AffExpr(collect(values(j.tupledict)), ones(length(j.tupledict)), 0.0)
 Base.sum(j::Array{Variable}) = AffExpr(vec(j), ones(length(j)), 0.0)
+Base.sum(j::AbstractArray{Variable}) = sum([j[i] for i in eachindex(j)]) # to handle non-one-indexed arrays.
 function Base.sum{T<:GenericAffExpr}(affs::Array{T})
```
As long as `vec(::AbstractArray{Variable})` will reliably work, I think the definition above should work fine.
I tried that first, but `AffExpr` is a type alias of `GenericAffExpr{Float64,Variable}`, so that directly calls the inner constructor of `AffExpr`, which expects precisely a `Vector{Variable}`. I tried adding an inner constructor to `GenericAffExpr` that takes `AbstractArray`s (+ additional outer constructors), but ran into some more issues that I don't remember anymore. Basically, it was going to be a pretty big change, whereas `sum` seemed to be the only current case where handling `AbstractVector`s was needed when constructing `AffExpr`s.
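For readers unfamiliar with these types, a simplified sketch of the situation (Julia 0.5-era syntax; the field names are illustrative and may not match JuMP's exact definition):

```julia
immutable GenericAffExpr{CoefType,VarType}
    vars::Vector{VarType}      # concrete Vector fields: constructing this from
    coeffs::Vector{CoefType}   # an OffsetArray would require a convert to
    constant::CoefType         # Vector, which OffsetArrays does not provide
end
typealias AffExpr GenericAffExpr{Float64,Variable}  # alias pins the field types
```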
Cool, thanks for the explanation.
Modulo my iffy-ness about changing…
src/macros.jl (Outdated)

```diff
@@ -305,20 +305,21 @@ function constructconstraint!(normexpr::SOCExpr, sense::Symbol)
 end

-constructconstraint!(x::Array, sense::Symbol) = map(c->constructconstraint!(c,sense), x)
+constructconstraint!(x::AbstractArray, sense::Symbol) = constructconstraint!([x[i] for i in eachindex(x)], sense::Symbol)
```
The second `::Symbol` shouldn't be necessary.
Looks good to merge to me. @mlubin?

I'm having trouble viewing the status of Travis.

@tkoolen, thanks for addressing all of our concerns. Looking forward to seeing the uses of…

Awesome, thanks guys!

Thanks for this, @tkoolen!
A lot of functions currently require `Array` types, while they would work just fine for any other subtype of `AbstractArray`. This PR simply replaces most `::Array`s in function signatures by `::AbstractArray` (and likewise for `Vector` and `Matrix`).

I defined `norm` only for `AbstractVector` and `AbstractMatrix`, not for general `AbstractArray`s, because Base only defines it for `AbstractVector` and `AbstractMatrix`, and defining it for `AbstractArray` results in an ambiguity error.

Please be extra careful reviewing my changes to `_multiply!` and `_multiplyt!` in operators.jl.

This came up while I was playing around with storing JuMP `Variable`s in `AxisArray`s for a trajectory optimization style problem, because I had trouble keeping track of the order of the dimensions of the variable arrays (this is proving to be quite a nice combination of tools, by the way).
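As a sketch of the AxisArrays combination mentioned in the description (assumes the AxisArrays package; the `:state`/`:time` axis names are invented for the example):

```julia
using JuMP, AxisArrays
m = Model()
@variable(m, v[1:3, 1:10])   # 3 states over 10 knot points, an Array{Variable,2}
x = AxisArray(v, Axis{:state}(1:3), Axis{:time}(1:10))
x[Axis{:time}(2)]            # all state variables at knot point 2,
                             # without tracking which dimension is which
```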