Generalize strides for ReinterpretArray and ReshapedArray #44027

Merged (6 commits), Feb 6, 2022

Conversation

N5N3 (Member) commented Feb 3, 2022

strides for ReinterpretArray was generalized in #37414, but it works only for a contiguous parent (so the check there is also not correct).
This PR extends it by using the parent's strides rather than the array's own size during the calculation.

A similar generalization is also applied to ReshapedArray.

Tests added.
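
As a rough illustration (not from the PR; the array shapes and values here are assumed), the kind of parent this now handles is a strided but non-contiguous view, whose reinterpretation takes its strides from the parent instead of recomputing them from its own size:

# Hypothetical example: the parent is dense along dim 1 but strided along dim 2.
A = view(rand(Float64, 10, 10), :, 1:2:10)  # size (10, 5), strides (1, 20)
R = reinterpret(Float32, A)                 # size (20, 5)
# Computed from the parent's strides, strides(R) should be (1, 40):
# each Float64 column becomes 20 contiguous Float32s, and the column step
# doubles when measured in Float32 elements.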

N5N3 added the arrays label Feb 3, 2022
timholy (Member) commented Feb 3, 2022

I haven't reviewed in detail, but when asking about strides and subarrays it's usually best to be sure to create some parents with dimensions non-commensurate with the step. E.g.,

A = reshape(1:21*4, 21, 4)
V = @view A[begin:4:end,:]

and check that all the conditions for strides (even spacing along each axis) are met for the values of V. To me it looks like all your tests use a parent size that's evenly divisible by the step.

N5N3 (Member, Author) commented Feb 3, 2022

I added more tests by checking a[i] === Base.unsafe_load(pointer(A, i)), since a[i] should be correct while pointer is strides-based.
I'm not sure whether that is OK with you, @timholy, but with this there's no need to hardcode the expected output of strides.
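
A rough sketch (assumed; not the PR's exact test code) of that check, using the non-commensurate parent from the comment above:

# collect so the parent is an Array backed by memory (pointer requires that);
# the parent size (21) is not a multiple of the view's step (4).
A = reshape(collect(1:21*4), 21, 4)
a = @view A[begin:4:end, :]                  # strides (4, 21)
for i in eachindex(IndexLinear(), a)
    # getindex is always correct; pointer() goes through strides(),
    # so agreement here validates the computed strides.
    @assert a[i] === Base.unsafe_load(pointer(a, i))
end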

timholy (Member) left a review comment

This mostly looks fantastic, nice work!

Review threads: test/abstractarray.jl (outdated, resolved); base/reshapedarray.jl (resolved)
timholy merged commit e0a4b77 into JuliaLang:master Feb 6, 2022
timholy (Member) commented Feb 6, 2022

Thanks @N5N3! Really nice work.

N5N3 deleted the strides-reinterp branch February 6, 2022 13:41
mkitti (Contributor) commented Feb 9, 2022

There might have been a regression here. HDF5.jl tests are showing that this now fails: stride(fill(43, ()), 1)

plain: Error During Test at /home/runner/work/HDF5.jl/HDF5.jl/test/plain.jl:25
  Got exception outside of a @test
  ArgumentError: reducing with add_sum over an empty collection of element type Union{} is not allowed.
  You may be able to prevent this error by supplying an `init` value to the reducer.
  Stacktrace:
    [1] _empty_reduce_error(f::Any, T::Type)
      @ Base ./reduce.jl:307
    [2] reduce_empty(#unused#::typeof(Base.add_sum), #unused#::Core.TypeofBottom)
      @ Base ./reduce.jl:338
    [3] reduce_empty(op::Base.BottomRF{typeof(Base.add_sum)}, #unused#::Type{Union{}})
      @ Base ./reduce.jl:347
    [4] reduce_empty_iter
      @ ./reduce.jl:371 [inlined]
    [5] reduce_empty_iter
      @ ./reduce.jl:370 [inlined]
    [6] foldl_impl(op::Base.BottomRF{typeof(Base.add_sum)}, nt::Base._InitialValue, itr::Tuple{})
      @ Base ./reduce.jl:49
    [7] mapfoldl_impl(f::typeof(identity), op::typeof(Base.add_sum), nt::Base._InitialValue, itr::Tuple{})
      @ Base ./reduce.jl:44
    [8] mapfoldl(f::Function, op::Function, itr::Tuple{}; init::Base._InitialValue)
      @ Base ./reduce.jl:162
    [9] mapfoldl
      @ ./reduce.jl:162 [inlined]
   [10] #mapreduce#264
      @ ./reduce.jl:294 [inlined]
   [11] mapreduce(f::Function, op::Function, itr::Tuple{})
      @ Base ./reduce.jl:294
   [12] sum(f::Function, a::Tuple{}; kw::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
      @ Base ./reduce.jl:520
   [13] sum(f::Function, a::Tuple{})
      @ Base ./reduce.jl:520
   [14] sum(a::Tuple{}; kw::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
      @ Base ./reduce.jl:549
   [15] sum(a::Tuple{})
      @ Base ./reduce.jl:549
   [16] stride(A::Array{Int32, 0}, k::Int64)
      @ Base ./abstractarray.jl:549
   [17] write_dataset(dataset::HDF5.Dataset, memtype::HDF5.Datatype, buf::Array{Int32, 0}, xfer::HDF5.DatasetTransferProperties) (repeats 2 times)
      @ HDF5 ~/work/HDF5.jl/HDF5.jl/src/HDF5.jl:1529
   [18] write_dataset(parent::HDF5.File, name::String, data::Array{Int32, 0}; pv::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
      @ HDF5 ~/work/HDF5.jl/HDF5.jl/src/HDF5.jl:1201
   [19] write_dataset(parent::HDF5.File, name::String, data::Array{Int32, 0})
      @ HDF5 ~/work/HDF5.jl/HDF5.jl/src/HDF5.jl:1199
   [20] write(parent::HDF5.File, name::String, data::Array{Int32, 0}; pv::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
      @ HDF5 ~/work/HDF5.jl/HDF5.jl/src/HDF5.jl:1244
   [21] write(parent::HDF5.File, name::String, data::Array{Int32, 0})
      @ HDF5 ~/work/HDF5.jl/HDF5.jl/src/HDF5.jl:1244
   [22] setindex!(parent::HDF5.File, val::Array{Int32, 0}, path::String; pv::Base.Pairs{Symbol, Integer, Tuple{Symbol, Symbol}, NamedTuple{(:shuffle, :deflate), Tuple{Bool, Int64}}})
      @ HDF5 ~/work/HDF5.jl/HDF5.jl/src/HDF5.jl:649
   [23] macro expansion
      @ ~/work/HDF5.jl/HDF5.jl/test/plain.jl:37 [inlined]
   [24] macro expansion
      @ /opt/hostedtoolcache/julia/nightly/x64/share/julia/stdlib/v1.8/Test/src/Test.jl:1356 [inlined]
   [25] top-level scope
      @ ~/work/HDF5.jl/HDF5.jl/test/plain.jl:28
   [26] include(fname::String)
      @ Base.MainInclude ./client.jl:476
   [27] macro expansion
      @ ~/work/HDF5.jl/HDF5.jl/test/runtests.jl:18 [inlined]
   [28] macro expansion
      @ /opt/hostedtoolcache/julia/nightly/x64/share/julia/stdlib/v1.8/Test/src/Test.jl:1356 [inlined]
   [29] top-level scope
      @ ~/work/HDF5.jl/HDF5.jl/test/runtests.jl:17
   [30] include(fname::String)
      @ Base.MainInclude ./client.jl:476
   [31] top-level scope
      @ none:6
   [32] eval
      @ ./boot.jl:368 [inlined]
   [33] exec_options(opts::Base.JLOptions)
      @ Base ./client.jl:276
   [34] _start()
      @ Base ./client.jl:522
empty and 0-size arrays: Error During Test at /home/runner/work/HDF5.jl/HDF5.jl/test/plain.jl:604
  Got exception outside of a @test
  ArgumentError: reducing with add_sum over an empty collection of element type Union{} is not allowed.
  You may be able to prevent this error by supplying an `init` value to the reducer.
  Stacktrace:
    [1] _empty_reduce_error(f::Any, T::Type)
      @ Base ./reduce.jl:307
    [2] reduce_empty(#unused#::typeof(Base.add_sum), #unused#::Core.TypeofBottom)
      @ Base ./reduce.jl:338
    [3] reduce_empty(op::Base.BottomRF{typeof(Base.add_sum)}, #unused#::Type{Union{}})
      @ Base ./reduce.jl:347
    [4] reduce_empty_iter
      @ ./reduce.jl:371 [inlined]
    [5] reduce_empty_iter
      @ ./reduce.jl:370 [inlined]
    [6] foldl_impl(op::Base.BottomRF{typeof(Base.add_sum)}, nt::Base._InitialValue, itr::Tuple{})
      @ Base ./reduce.jl:49
    [7] mapfoldl_impl(f::typeof(identity), op::typeof(Base.add_sum), nt::Base._InitialValue, itr::Tuple{})
      @ Base ./reduce.jl:44
    [8] mapfoldl(f::Function, op::Function, itr::Tuple{}; init::Base._InitialValue)
      @ Base ./reduce.jl:162
    [9] mapfoldl
      @ ./reduce.jl:162 [inlined]
   [10] #mapreduce#264
      @ ./reduce.jl:294 [inlined]
   [11] mapreduce(f::Function, op::Function, itr::Tuple{})
      @ Base ./reduce.jl:294
   [12] sum(f::Function, a::Tuple{}; kw::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
      @ Base ./reduce.jl:520
   [13] sum(f::Function, a::Tuple{})
      @ Base ./reduce.jl:520
   [14] sum(a::Tuple{}; kw::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
      @ Base ./reduce.jl:549
   [15] sum(a::Tuple{})
      @ Base ./reduce.jl:549
   [16] stride(A::Array{Float64, 0}, k::Int64)
      @ Base ./abstractarray.jl:549
   [17] write_dataset(dataset::HDF5.Dataset, memtype::HDF5.Datatype, buf::Array{Float64, 0}, xfer::HDF5.DatasetTransferProperties) (repeats 2 times)
      @ HDF5 ~/work/HDF5.jl/HDF5.jl/src/HDF5.jl:1529
   [18] write_dataset(parent::HDF5.File, name::String, data::Array{Float64, 0}; pv::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
      @ HDF5 ~/work/HDF5.jl/HDF5.jl/src/HDF5.jl:1201
   [19] write_dataset(parent::HDF5.File, name::String, data::Array{Float64, 0})
      @ HDF5 ~/work/HDF5.jl/HDF5.jl/src/HDF5.jl:1199
   [20] write(parent::HDF5.File, name::String, data::Array{Float64, 0}; pv::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
      @ HDF5 ~/work/HDF5.jl/HDF5.jl/src/HDF5.jl:1244
   [21] write(parent::HDF5.File, name::String, data::Array{Float64, 0})
      @ HDF5 ~/work/HDF5.jl/HDF5.jl/src/HDF5.jl:1244
   [22] macro expansion
      @ ~/work/HDF5.jl/HDF5.jl/test/plain.jl:610 [inlined]
   [23] macro expansion
      @ /opt/hostedtoolcache/julia/nightly/x64/share/julia/stdlib/v1.8/Test/src/Test.jl:1356 [inlined]
   [24] top-level scope
      @ ~/work/HDF5.jl/HDF5.jl/test/plain.jl:605
   [25] include(fname::String)
      @ Base.MainInclude ./client.jl:476
   [26] macro expansion
      @ ~/work/HDF5.jl/HDF5.jl/test/runtests.jl:18 [inlined]
   [27] macro expansion
      @ /opt/hostedtoolcache/julia/nightly/x64/share/julia/stdlib/v1.8/Test/src/Test.jl:1356 [inlined]
   [28] top-level scope
      @ ~/work/HDF5.jl/HDF5.jl/test/runtests.jl:17
   [29] include(fname::String)
      @ Base.MainInclude ./client.jl:476
   [30] top-level scope
      @ none:6
   [31] eval
      @ ./boot.jl:368 [inlined]
   [32] exec_options(opts::Base.JLOptions)
      @ Base ./client.jl:276
   [33] _start()
      @ Base ./client.jl:522

Working on confirming this.

mkitti (Contributor) commented Feb 9, 2022

I just built master:

julia> stride( fill(43,()) , 1)
ERROR: ArgumentError: reducing with add_sum over an empty collection of element type Union{} is not allowed.
You may be able to prevent this error by supplying an `init` value to the reducer.
Stacktrace:
  [1] _empty_reduce_error(f::Any, T::Type)
    @ Base ./reduce.jl:307
  [2] reduce_empty(#unused#::typeof(Base.add_sum), #unused#::Core.TypeofBottom)
    @ Base ./reduce.jl:338
  [3] reduce_empty(op::Base.BottomRF{typeof(Base.add_sum)}, #unused#::Type{Union{}})
    @ Base ./reduce.jl:347
  [4] reduce_empty_iter
    @ ./reduce.jl:371 [inlined]
  [5] reduce_empty_iter
    @ ./reduce.jl:370 [inlined]
  [6] foldl_impl(op::Base.BottomRF{typeof(Base.add_sum)}, nt::Base._InitialValue, itr::Tuple{})
    @ Base ./reduce.jl:49
  [7] mapfoldl_impl(f::typeof(identity), op::typeof(Base.add_sum), nt::Base._InitialValue, itr::Tuple{})
    @ Base ./reduce.jl:44
  [8] mapfoldl(f::Function, op::Function, itr::Tuple{}; init::Base._InitialValue)
    @ Base ./reduce.jl:162
  [9] mapfoldl
    @ ./reduce.jl:162 [inlined]
 [10] #mapreduce#264
    @ ./reduce.jl:294 [inlined]
 [11] mapreduce(f::Function, op::Function, itr::Tuple{})
    @ Base ./reduce.jl:294
 [12] sum(f::Function, a::Tuple{}; kw::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ Base ./reduce.jl:520
 [13] sum(f::Function, a::Tuple{})
    @ Base ./reduce.jl:520
 [14] sum(a::Tuple{}; kw::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ Base ./reduce.jl:549
 [15] sum(a::Tuple{})
    @ Base ./reduce.jl:549
 [16] stride(A::Array{Int64, 0}, k::Int64)
    @ Base ./abstractarray.jl:549
 [17] top-level scope
    @ REPL[3]:1

N5N3 (Member, Author) commented Feb 9, 2022

The specialization for stride(A::Array, i) was removed in this PR. It seems our general fallback dislikes the 0-dimensional case.

function stride(A::AbstractArray, k::Integer)
    st = strides(A)
    k ≤ ndims(A) && return st[k]
    return sum(st .* size(A)) # here: both tuples are empty for a 0-dimensional array
end

I suggest returning length(A) instead of sum(st .* size(A)), as the value of stride(A, i) can be arbitrary for i > ndims(A)?

mkitti (Contributor) commented Feb 9, 2022

Just add the keyword init = 1 to sum, as the error suggests.

julia> function Base.stride(A::AbstractArray, k::Integer)
           st = strides(A)
           k ≤ ndims(A) && return st[k]
           return sum(st .* size(A); init = 1)
       end
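
With the init keyword, the empty reduction no longer throws for a 0-dimensional array: st .* size(A) is the empty tuple, so sum falls back to the init value. Under that assumption (this is not output from the final fix), the call from the HDF5.jl failure would again give

julia> stride(fill(43, ()), 1)
1

which also agrees with the length(A) suggestion above, since length(fill(43, ())) == 1.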

N5N3 (Member, Author) commented Feb 9, 2022

We have a daily PkgEval run (https://s3.amazonaws.com/julialang-reports/nanosoldier/pkgeval/by_date/2022-02/07/report.html).
Some other packages also appear to have failed because of this PR.
I'll take a look and decide which value to use.

LilithHafner pushed a commit to LilithHafner/julia that referenced this pull request Feb 22, 2022
LilithHafner pushed a commit to LilithHafner/julia that referenced this pull request Mar 8, 2022
N5N3 added a commit to N5N3/julia that referenced this pull request Apr 28, 2022
It turns out `strides(a::StridedReinterpretArray)` no longer calls `Base.size_to_strides` after JuliaLang#44027, as it is dispatched to `strides(::NonReshapedReinterpretArray)`/`strides(::ReshapedReinterpretArray)`.
This commit fixes that regression.
N5N3 added a commit to N5N3/julia that referenced this pull request Apr 30, 2022
N5N3 added a commit to N5N3/julia that referenced this pull request May 25, 2022