Restructure of the promotion mechanism for broadcast #18642
Conversation
Force-pushed from 58796f1 to 87b0cb7
Will this mean that …
@@ -15,7 +16,7 @@ export broadcast_getindex, broadcast_setindex!
 broadcast(f) = f()
 @inline broadcast(f, x::Number...) = f(x...)
 @inline broadcast{N}(f, t::NTuple{N}, ts::Vararg{NTuple{N}}) = map(f, t, ts...)
-@inline broadcast(f, As::AbstractArray...) = broadcast_t(f, promote_eltype_op(f, As...), As...)
+@inline broadcast(f, As::AbstractArray...) = _broadcast(f, As...)
Is there still a point in calling this `_broadcast` instead of `broadcast`? Isn't multiple dispatch enough here?
Calling `broadcast` now first checks the container types to determine the resulting container. In this case I was trying to avoid that check for the case where we know all are `Array`-like containers.
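To make that concrete, here is a rough sketch of the dispatch pattern being described. `broadcast_c` appears in a later diff in this conversation; `containertype` is assumed from context, and the exact signatures in the PR may differ:

```julia
# Generic entry point: resolve the result container from the argument
# container types before doing any work.
broadcast(f, As...) = broadcast_c(f, containertype(As...), As...)

# Fast path: every argument is an AbstractArray, so the result is known
# to be Array-like and the container-type resolution can be skipped.
@inline broadcast(f, As::AbstractArray...) = _broadcast(f, As...)
```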
@@ -300,7 +300,6 @@ import Base.Meta: isexpr
 # PR 16988
 @test Base.promote_op(+, Bool) === Int
 @test isa(broadcast(+, [true]), Array{Int,1})
-@test Base.promote_op(Float64, Bool) === Float64
Are you dropping this test because it would fail now? That would be unfortunate in case something outside Base relies on it...
@@ -253,17 +250,41 @@ function broadcast_t(f, ::Type{Any}, As...)
 B[I] = val
 return _broadcast!(f, B, keeps, Idefaults, As, Val{nargs}, iter, st, 1)
 end
-@inline broadcast_t(f, T, As...) = broadcast!(f, similar(Array{T}, broadcast_indices(As...)), As...)
+@inline function _broadcast_t(f, T, shape, iter, As...)
add some comments here and below describing what these are for
function broadcast_c(f, ::Type{Tuple}, As...)
    shape = broadcast_indices(As...)
    check_broadcast_indices(shape, As...)
why is this being removed?
`broadcast_indices` already fails when the sizes are not compatible. The only place where we really need it is for `broadcast!`, where we don't know the size of the supplied container.
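As a hypothetical illustration of that point (assuming the 0.6-dev `Base.Broadcast.broadcast_indices` this PR builds on):

```julia
A, B = ones(3), ones(4)
# Non-singleton dimensions of length 3 and 4 cannot be broadcast
# together, so computing the combined indices already throws a
# DimensionMismatch; a separate check_broadcast_indices call is
# redundant here.
Base.Broadcast.broadcast_indices(A, B)  # throws DimensionMismatch
```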
@@ -300,7 +300,6 @@ import Base.Meta: isexpr
 # PR 16988
 @test Base.promote_op(+, Bool) === Int
 @test isa(broadcast(+, [true]), Array{Int,1})
-@test Base.promote_op(Float64, Bool) === Float64
does this now return something different, or is it a method error?
It returns `Any`, because I removed the specialized method for when the first argument is a type. `promote_op` would only be used by the unary and binary elementwise operations in Base, as well as by the matrix multiplication methods, where we know that the operator is not a type. I guess I could leave the definition for uses outside Base, addressing @martinholters' concern above.
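A small sketch of the behavioral change under discussion, with both results taken from this thread and the dropped test above:

```julia
# Before this PR, a specialized method made type-valued operators work:
#   Base.promote_op(Float64, Bool) === Float64
# With that method removed, the call falls back to the generic
# definition, which only infers Any:
Base.promote_op(Float64, Bool)  # returns Any on this branch
```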
Should it be deprecated if Base won't need it going forward?
Yeah, I guess so, will do.
FWIW,

promote_op(op::Type, T) = Core.Inference.return_type(op, Tuple{T})
promote_op(op::Type, T1, T2) = Core.Inference.return_type(op, Tuple{T1,T2})

seems to be inferable and handles cases where the constructor specializes; e.g. `promote_op(Array{Int64}, Int)` then correctly returns `Array{Int64,1}`, not just `Array{Int64}`. Not that I'm too worried about improving the obsolete `promote_op`, but maybe you can employ this somehow to increase the chances of getting a leaftype in `_broadcast`?
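A quick demonstration of the suggestion above, assuming `Core.Inference.return_type` is usable at this point of bootstrap (as the snippet presumes):

```julia
promote_op(op::Type, T) = Core.Inference.return_type(op, Tuple{T})

promote_op(Array{Int64}, Int)  # inferred as Array{Int64,1}, a leaf type
promote_op(Float64, Bool)      # inferred as Float64
```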
@TotalVerb: yes. The only case where it would work as before is for elementwise unary and binary operators in Base, where it would still preserve the more general abstract type.
Force-pushed from 6b55f4a to 0b7e17f
Comments so far addressed. Can we check performance?
@nanosoldier
Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels
Force-pushed from 0b7e17f to 5d4cd87
Can we ask nanosoldier again?
@nanosoldier
Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels
Force-pushed from 20f5ba0 to fa03b1d
Ok. I believe there should be no more performance regressions.
@nanosoldier
Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels
All right, there are no more regressions (just the usual noisy benchmarks). But feel free to review or make more suggestions.
@@ -1009,4 +1009,15 @@ export @vectorize_1arg, @vectorize_2arg
 @deprecate abs(M::SymTridiagonal) abs.(M)
 @deprecate abs(x::AbstractSparseVector) abs.(x)
+
+# promote_op method where the operator is also a type
+function promote_op(op::Type, Ts::Type...)
+    depwarn("promote_op(op::Type, ::Type...) is deprecated as is no longer " *
as it is
Fixed
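For context, a sketch of what the full deprecation (with the wording fix applied) might look like; the message tail and the `return_type` fallback are assumptions, since the diff above is truncated:

```julia
# Deprecated promote_op method where the operator is also a type.
function promote_op(op::Type, Ts::Type...)
    depwarn("promote_op(op::Type, ::Type...) is deprecated as it is no longer " *
            "needed in Base. If you need its functionality, consider " *
            "defining it locally.", :promote_op)
    # Fall back to inference, as suggested earlier in this thread.
    return Core.Inference.return_type(op, Tuple{Ts...})
end
```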
Force-pushed from 7cca21a to 6bd6840
Best to run the benchmarks again, since … @nanosoldier
Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels
That's odd, some of those performance regressions weren't present before the last rebase (in which there are no changes, just the rebase). Probably other PRs touching … Anyway, I'll see what I can do.
@pabloferz, the previous time the benchmark was run on this PR was September 28, which was before the broadcast benchmarks were added to BaseBenchmarks. ... oh wait, nanosoldier is showing an improvement in the broadcast fusion benchmarks. The regressions are in linalg?
The regressions in linalg may come from the fact that some matrix products, e.g. …
The problem was with … Should be better now.
Force-pushed from 8e78b26 to a174718
Let's see: @nanosoldier
Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels
Ah yes, much better. What about the sparse transposes? Unrelated/noise?
I believe that the sparse transposes are either unrelated to the PR or noise.
Yeah, it doesn't look like the sparse-transpose code uses anything touched by this PR. The Travis failure seems to be an unrelated network problem with … I think this is okay to merge.
@inline broadcast_elwise_op(f, As...) =
    broadcast!(f, similar(Array{promote_eltype_op(f, As...)}, broadcast_indices(As...)), As...)

ftype(f, A) = typeof(a -> f(a))
Can this just be `typeof(f)`?
Yeah, I think so; I don't know why I put it like this (same for the one below).
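The simplification being agreed on, sketched:

```julia
# typeof(f) suffices; wrapping f in an anonymous function adds nothing.
ftype(f, A) = typeof(f)  # rather than typeof(a -> f(a))
```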
-    T = _promote_op(f, _default_type(S))
+    @_pure_meta
+    Z = Tuple{_default_type(S)}
+    T = _default_eltype(Generator{Z, typeof(a -> f(a))})
Same here.
* Restructure the promotion mechanism for broadcast
* More broadcast tests
* Use broadcast for elementwise operators where appropriate
This addresses #18622 and replaces #18623. This PR is an overhaul of the promotion mechanism for `broadcast` and gets rid of some `promote_eltype_op` and `_promote_op` methods. The latter is somewhat replaced by the already existing `_default_eltype`, which is used by `map` and comprehensions.
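As a rough sketch of that replacement (names follow the diffs above; `_promoted_eltype` is a hypothetical name, and the exact method in the PR may differ), the element type for an elementwise application of `f` can be computed with the same inference machinery that `map` and comprehensions use:

```julia
# Inferred element type of f applied to values of type S, obtained by
# asking inference about the equivalent Generator, as map does.
function _promoted_eltype(f, S)
    Z = Tuple{S}  # an iterable type whose element type is S
    Base._default_eltype(Base.Generator{Z, typeof(f)})
end
```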