
Use promote_op in broadcasting and matrix multiplication #13803

Merged
timholy merged 1 commit into master from teh/more_promote_op on Oct 29, 2015

Conversation

timholy
Member

@timholy timholy commented Oct 28, 2015

This makes it easier to support these operations on AbstractArrays with generic eltypes.
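The idea can be sketched as follows (a minimal illustration, not code from this PR; `Furlong` is a hypothetical wrapper type, and this is written against the later `promote_op(f, T, S...)` form, whereas the 0.4-era version took functor types such as `MulFun`):

```julia
# Sketch: promote_op computes the element type an operation would
# produce, so broadcast and matrix multiplication can allocate the
# output array up front even for non-standard eltypes.
# `Furlong` is a made-up example type, not from this PR.
struct Furlong{T<:Number} <: Number
    val::T
end
Base.:+(a::Furlong, b::Furlong) = Furlong(a.val + b.val)

# Result eltype of Furlong{Int} + Furlong{Float64}, computed without
# running a full broadcast loop first:
T = Base.promote_op(+, Furlong{Int}, Furlong{Float64})
# The output array can then be allocated as similar(A, T, size(A)).
```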

@timholy
Member Author

timholy commented Oct 28, 2015

The @_pure_meta annotations can be ditched in the backport to 0.4.

timholy added a commit that referenced this pull request Oct 29, 2015
Use promote_op in broadcasting and matrix multiplication
@timholy timholy merged commit f8a4340 into master Oct 29, 2015
@timholy timholy deleted the teh/more_promote_op branch October 29, 2015 09:53
@andreasnoack
Member

😄 Ref: 1257643

@yuyichao
Contributor

Hold on for backporting. This breaks DataArrays which imports type_minus, type_div and type_pow.
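For packages hit by the removal, the migration is roughly the following (a hedged sketch: the exact signatures of the removed helpers are assumptions on my part, and this uses the later `promote_op(f, T, S...)` calling form rather than the 0.4-era functor types):

```julia
# Before this PR, result eltypes came from per-operation helpers
# (illustrative; the old signatures may have differed):
#   import Base: type_minus, type_div, type_pow
#   R = type_minus(S, T)

# After this PR, the single generic entry point is promote_op:
R  = Base.promote_op(-, Int, Float64)   # eltype of Int - Float64
Rd = Base.promote_op(/, Int, Int)       # eltype of Int / Int
Rp = Base.promote_op(^, Float64, Int)   # eltype of Float64 ^ Int
```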

yuyichao added a commit to yuyichao/DataArrays.jl that referenced this pull request Oct 30, 2015
@yuyichao
Contributor

DataArrays appears to be the only public user of the three functions removed in this PR (assuming GitHub search can be trusted...), and it should be fixed by JuliaStats/DataArrays.jl#173.

yuyichao added a commit to yuyichao/DataArrays.jl that referenced this pull request Oct 30, 2015
@tkelman
Contributor

tkelman commented Oct 30, 2015

This is a bit on the feature side as far as backporting goes...

yuyichao added a commit to yuyichao/DataArrays.jl that referenced this pull request Oct 30, 2015
@timholy
Member Author

timholy commented Oct 30, 2015

I wondered about that. I'll let others make the call about whether this is worth the risk.

@@ -972,7 +972,7 @@ end # macro
 (.-)(A::Number, B::SparseMatrixCSC) = A .- full(B)
 ( -)(A::Array , B::SparseMatrixCSC) = A - full(B)

-(.*)(A::AbstractArray, B::AbstractArray) = broadcast_zpreserving(*, A, B)
+(.*)(A::AbstractArray, B::AbstractArray) = broadcast_zpreserving(MulFun(), A, B)
Contributor


Any idea why this didn't need to be Base.MulFun, or explicitly imported, in base/sparse.jl? On release-0.4 this causes a test failure unless it's qualified:

ERROR: LoadError: On worker 4:
LoadError: test error in expression: full(factorize(A)) ≈ full(factorize(SymTridiagonal(d,e)))
UndefVarError: MulFun not defined
 [inlined code] from sparse/sparsematrix.jl:1026
 in full at linalg/ldlt.jl:82
 in anonymous at test.jl:90
 in do_test at test.jl:50
 [inlined code] from /home/tkelman/Julia/julia-0.4/test/linalg/dense.jl:123
 in anonymous at no file:0
 [inlined code] from essentials.jl:112
 in include_string at loading.jl:266
 in include_from_node1 at ./loading.jl:307
 [inlined code] from util.jl:179
 in runtests at /home/tkelman/Julia/julia-0.4/test/testdefs.jl:7
 in anonymous at multi.jl:908
 in run_work_thunk at multi.jl:646
 [inlined code] from multi.jl:908
 in anonymous at task.jl:63
while loading /home/tkelman/Julia/julia-0.4/test/linalg/dense.jl, in expression starting on line 37
while loading /home/tkelman/Julia/julia-0.4/test/runtests.jl, in expression starting on line 13
        From worker 4:       * linalg/dense

which makes me wonder whether that test or the underlying code paths have changed, or if this line is missing coverage on master?

Member Author


It should definitely be added to the using statements in sparse.jl. Do you want to add that, or should I?

The error is really bizarre: according to https://codecov.io/github/JuliaLang/julia/base/sparse/sparsematrix.jl?ref=9eb74d4730961f52b37ccb8cfdea3b21455e7365#l-976, that line is well-covered.

Contributor


I'm changing this line to Base.MulFun in the backport. I'm not sure what's going on with master here, but I'm moving on to other backports, so I'm not thinking too much about it right this second.
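The two available fixes can be summarized as follows (a sketch, not the exact patch; `MulFun` was a Julia 0.4-era internal functor type, later removed once plain functions became cheap to pass):

```julia
# Option 1: import the name into the module that base/sparse.jl builds,
# alongside its other using statements:
#   using Base: MulFun
#   (.*)(A::AbstractArray, B::AbstractArray) =
#       broadcast_zpreserving(MulFun(), A, B)

# Option 2 (taken in the backport): qualify the name at the call site:
#   (.*)(A::AbstractArray, B::AbstractArray) =
#       broadcast_zpreserving(Base.MulFun(), A, B)
```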

timholy added a commit that referenced this pull request Nov 29, 2015
This makes it easier to support these operations on AbstractArrays with generic eltypes

(cherry picked from commit 82f9a21)
ref #13803
remove _pure_meta for backport
also qualify Base.MulFun
@timholy
Copy link
Member Author

timholy commented Jan 3, 2016

FWIW, I just tested SALSA and NullableArrays on a local version that includes my own backport of this PR, and both passed. That said, if other unregistered packages make use of unexported functionality, this will be breaking, so there's still nontrivial risk.

Assuming this would now pass PkgEval, I don't think there will be further changes in the fundamental nature of this decision during the lifecycle of 0.4. So, we could decide now whether to backport in 0.4.3 or to remove the backport-pending tag. @tkelman, feel free to do whichever seems best to you.

@tkelman
Copy link
Contributor

tkelman commented Jan 9, 2016

What concerns me is this: the people who are most aware of this change and wanting to make use of it right away are paying attention and will adjust their REQUIRE files to carefully depend on the first point release that has this backported. The people who aren't paying close attention right now, but might make use of this feature several months from now would be writing code that works on later versions of 0.4.x but not 0.4.0, 0.4.1, or 0.4.2.

@timholy
Copy link
Member Author

timholy commented Jan 9, 2016

Valid concern. Let's just have this appear in julia 0.5. In the meantime I'll carry a local patch for my lab.
