
Restructure of the promotion mechanism for broadcast #18642

Merged · 3 commits · Nov 7, 2016

Conversation

pabloferz
Contributor

This addresses #18622 and replaces #18623. This PR is an overhaul of the promotion mechanism for broadcast and gets rid of some promote_eltype_op and _promote_op methods. The latter is somewhat replaced by the already existing _default_eltype, which is used by map and comprehensions.

@pabloferz
Contributor Author

promote_op is left only for unary and binary elementwise operators and matrix multiplication, and it is decoupled from the rest of the broadcast methods, which will rely on _default_eltype.

@TotalVerb
Contributor

TotalVerb commented Sep 23, 2016

Will this mean that broadcast(sqrt, Integer[1.0, 2.0, 3.0]) will return Vector{Float64} now, consistent with map and comprehensions? Currently it returns the strange Vector{Real} type, due to promote_op.

@@ -15,7 +16,7 @@ export broadcast_getindex, broadcast_setindex!
broadcast(f) = f()
@inline broadcast(f, x::Number...) = f(x...)
@inline broadcast{N}(f, t::NTuple{N}, ts::Vararg{NTuple{N}}) = map(f, t, ts...)
@inline broadcast(f, As::AbstractArray...) = broadcast_t(f, promote_eltype_op(f, As...), As...)
@inline broadcast(f, As::AbstractArray...) = _broadcast(f, As...)
Member

Is there still a point in calling this _broadcast instead of broadcast? Isn't multiple dispatch enough here?

Contributor Author

Calling broadcast now first checks the container types to determine the resulting container. In this case I was trying to avoid that check for the case where we know all the arguments are Array-like containers.
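As a rough illustration of that container check (a sketch, not the PR's actual code; `_broadcast` is an internal fast path):

```julia
# broadcast first dispatches on the container type of the arguments to
# choose the output container; the _broadcast method skips that step
# when every argument is already known to be an Array-like container.
broadcast(+, (1, 2), (3, 4))   # tuple arguments yield a tuple: (4, 6)
broadcast(+, [1, 2], [3, 4])   # array arguments yield an Array: [4, 6]
```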

@@ -300,7 +300,6 @@ import Base.Meta: isexpr
# PR 16988
@test Base.promote_op(+, Bool) === Int
@test isa(broadcast(+, [true]), Array{Int,1})
@test Base.promote_op(Float64, Bool) === Float64
Member

Are you dropping this test because it would fail now? That would be unfortunate in case something outside Base relies on it...

@@ -253,17 +250,41 @@ function broadcast_t(f, ::Type{Any}, As...)
B[I] = val
return _broadcast!(f, B, keeps, Idefaults, As, Val{nargs}, iter, st, 1)
end

@inline broadcast_t(f, T, As...) = broadcast!(f, similar(Array{T}, broadcast_indices(As...)), As...)
@inline function _broadcast_t(f, T, shape, iter, As...)
Contributor

add some comments here and below describing what these are for


function broadcast_c(f, ::Type{Tuple}, As...)
shape = broadcast_indices(As...)
check_broadcast_indices(shape, As...)
Contributor

why is this being removed?

Contributor Author (pabloferz, Sep 23, 2016)

broadcast_indices already fails when the sizes are not compatible. The only place where we really need the check is broadcast!, where we don't know the size of the supplied container.
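A minimal sketch of the point being made (behavior as described in the comment; the exact error text varies by Julia version):

```julia
# broadcast_indices itself throws on incompatible sizes, so a separate
# check_broadcast_indices call is redundant for plain broadcast:
A = rand(3, 4); B = rand(2, 4)
broadcast(+, A, B)   # throws DimensionMismatch

# broadcast! is the exception: the output container is user-supplied,
# so its size must still be validated against the computed shape.
C = zeros(3, 4)
broadcast!(+, C, A, rand(3, 4))   # ok; C's shape is checked explicitly
```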

@@ -300,7 +300,6 @@ import Base.Meta: isexpr
# PR 16988
@test Base.promote_op(+, Bool) === Int
@test isa(broadcast(+, [true]), Array{Int,1})
@test Base.promote_op(Float64, Bool) === Float64
Contributor

does this now return something different, or is it a method error?

Contributor Author

It returns Any because I removed the specialized method for when the first argument is a type. promote_op would only be used by the unary and binary elementwise operations in Base, as well as by the matrix multiplication methods, where we know that the operator is not a type.

I guess I could leave the definition for uses outside Base, addressing @martinholters's concern above.

Contributor

should it be deprecated if base won't need it going forward?

Contributor Author

Yeah, I guess so, will do.

Member

FWIW

promote_op(op::Type, T) = Core.Inference.return_type(op, Tuple{T})
promote_op(op::Type, T1, T2) = Core.Inference.return_type(op, Tuple{T1,T2})

seems to be inferable and handles cases where the constructor specializes, e.g.
promote_op(Array{Int64}, Int) then correctly returns Array{Int64,1}, not just Array{Int64}.

Not that I'm too worried about improving the obsolete promote_op, but maybe you can employ this somehow to increase the chances of getting a leaftype in _broadcast?

@pabloferz
Contributor Author

pabloferz commented Sep 23, 2016

Will this mean that broadcast(sqrt, Integer[1.0, 2.0, 3.0]) will return Vector{Float64} now, consistent with map and comprehensions? Currently it returns the strange Vector{Real} type, due to promote_op.

@TotalVerb: yes. The only case where it would work as before is for elementwise unary and binary operators in Base, where it would still preserve the more general abstract type.
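Concretely (a sketch of the behavior change, with outputs per the discussion above):

```julia
A = Integer[1, 2, 3]

map(sqrt, A)         # Vector{Float64}: eltype comes from _default_eltype
broadcast(sqrt, A)   # Vector{Real} before this PR (promote_op widened),
                     # Vector{Float64} after, matching map/comprehensions
sqrt.(A)             # dot syntax lowers to broadcast, so it changes too
```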

@kshyatt kshyatt added types and dispatch Types, subtyping and method dispatch broadcast Applying a function over a collection labels Sep 23, 2016
@pabloferz pabloferz force-pushed the pz/betterbroadcast branch 2 times, most recently from 6b55f4a to 0b7e17f Compare September 24, 2016 09:31
@pabloferz
Contributor Author

Comments so far addressed. Can we check performance?

@tkelman
Contributor

tkelman commented Sep 24, 2016

@nanosoldier runbenchmarks(ALL, vs = ":master")

@nanosoldier
Collaborator

Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels

@pabloferz
Contributor Author

pabloferz commented Sep 28, 2016

Can we ask nanosoldier again?

@vchuravy
Member

@nanosoldier runbenchmarks(ALL, vs = ":master")

@nanosoldier
Collaborator

Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels

@pabloferz pabloferz force-pushed the pz/betterbroadcast branch 2 times, most recently from 20f5ba0 to fa03b1d Compare September 28, 2016 14:36
@pabloferz
Contributor Author

Ok. I believe there should be no more performance regressions.

@mbauman
Member

mbauman commented Sep 28, 2016

@nanosoldier runbenchmarks(ALL, vs = ":master")

@nanosoldier
Collaborator

Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels

@pabloferz pabloferz changed the title WIP: Restructure promotion mechanism for broadcast Restructure of the promotion mechanism for broadcast Sep 28, 2016
@pabloferz
Contributor Author

All right, there are no more regressions (just the usual noisy benchmarks). But feel free to review or make more suggestions.

@@ -1009,4 +1009,15 @@ export @vectorize_1arg, @vectorize_2arg
@deprecate abs(M::SymTridiagonal) abs.(M)
@deprecate abs(x::AbstractSparseVector) abs.(x)

# promote_op method where the operator is also a type
function promote_op(op::Type, Ts::Type...)
depwarn("promote_op(op::Type, ::Type...) is deprecated as is no longer " *
Contributor

as it is

Contributor Author

Fixed

@pabloferz pabloferz force-pushed the pz/betterbroadcast branch 2 times, most recently from 7cca21a to 6bd6840 Compare October 4, 2016 15:19
@stevengj
Member

stevengj commented Nov 4, 2016

Best to run the benchmarks again, since broadcast benchmarks were only added recently (JuliaCI/BaseBenchmarks.jl#30), although I don't think any of them will be affected by the changes here.

@nanosoldier runbenchmarks(ALL, vs = ":master")

@nanosoldier
Collaborator

Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels

@pabloferz
Contributor Author

pabloferz commented Nov 4, 2016

That's odd, some of those performance regressions weren't present before the last rebase (in which there are no changes, just the rebase). Probably other PRs touching broadcast or inference are causing the performance hit?

Anyway, I'll see what I can do.

@stevengj
Member

stevengj commented Nov 5, 2016

@pabloferz, the previous time the benchmark was run on this PR was September 28, which was before the broadcast benchmarks were added to BaseBenchmarks. ... oh wait, nanosoldier is showing an improvement in the broadcast fusion benchmarks. The regressions are in linalg?

@pabloferz
Contributor Author

pabloferz commented Nov 5, 2016

The regressions in linalg may come from the fact that some matrix products, e.g. *(A::Diagonal, B::Diagonal) rely on broadcast. So the problem is probably with some element-wise binary operations.
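For instance (modern spelling with an explicit `using LinearAlgebra`; in the Base of this era `Diagonal` needed no import):

```julia
using LinearAlgebra

D1 = Diagonal([1.0, 2.0])
D2 = Diagonal([3.0, 4.0])

# *(A::Diagonal, B::Diagonal) multiplies the diagonals elementwise,
# essentially Diagonal(D1.diag .* D2.diag), so a slowdown in elementwise
# binary broadcast surfaces as a linalg benchmark regression.
D1 * D2   # Diagonal([3.0, 8.0])
```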

@pabloferz
Contributor Author

pabloferz commented Nov 5, 2016

The problem was with promote_op, it was being inlined before, but some changes here (and possibly elsewhere) were preventing inlining.

Should be better now.

@martinholters
Member

Let's see: @nanosoldier runbenchmarks(ALL, vs = ":master")

@nanosoldier
Collaborator

Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels

@martinholters
Member

Ah yes, much better. What about the sparse transposes? Unrelated/noise?

@pabloferz
Contributor Author

I believe that the sparse transposes are either unrelated to the PR or noise.

@stevengj
Member

stevengj commented Nov 7, 2016

Yeah, it doesn't look like the sparse-transpose code uses anything touched by this PR.

The travis failure seems to be an unrelated network problem with apt-get.

I think this is okay to merge.

@stevengj stevengj merged commit d16d994 into JuliaLang:master Nov 7, 2016
@pabloferz pabloferz deleted the pz/betterbroadcast branch November 7, 2016 17:44
@inline broadcast_elwise_op(f, As...) =
broadcast!(f, similar(Array{promote_eltype_op(f, As...)}, broadcast_indices(As...)), As...)

ftype(f, A) = typeof(a -> f(a))
Member

Can this just be typeof(f)?

Contributor Author

Yeah, I think so, I don't know why I put it like this (same for the one below).

T = _promote_op(f, _default_type(S))
@_pure_meta
Z = Tuple{_default_type(S)}
T = _default_eltype(Generator{Z, typeof(a -> f(a))})
Member

Same here.

fcard pushed a commit to fcard/julia that referenced this pull request Feb 28, 2017
* Restructure the promotion mechanism for broadcast

* More broadcast tests

* Use broadcast for element wise operators where appropriate
Sacha0 added a commit to Sacha0/julia that referenced this pull request May 17, 2017
@Sacha0 Sacha0 added needs news A NEWS entry is required for this change deprecation This change introduces or involves a deprecation and removed needs news A NEWS entry is required for this change labels May 17, 2017
tkelman pushed a commit that referenced this pull request Jun 3, 2017