
WIP: Backports for 1.4-RC1 #34238

Merged
merged 31 commits into from
Jan 20, 2020

Conversation

KristofferC
Member

@KristofferC KristofferC commented Jan 2, 2020

Backported PRs:

@nanosoldier runbenchmarks(ALL, vs = ":release-1.3")

(cherry picked from commit d723cee)
@KristofferC added labels on Jan 2, 2020: needs nanosoldier run (This PR should have benchmarks run on it), DO NOT MERGE (Do not merge this PR!), needs pkgeval (Tests for all registered packages should be run with this change), release (Release management and versioning).
@KristofferC KristofferC changed the base branch from master to release-1.4 January 2, 2020 16:59
@KristofferC
Member Author

KristofferC commented Jan 2, 2020

I had the wrong merge target branch (master instead of release-1.4) when starting the nanosoldier runs, but I don't think anything has actually been merged, so it should be OK.

@KristofferC
Member Author

@nanosoldier runtests(ALL, vs = ":release-1.3")

@nanosoldier
Collaborator

Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @ararslan

@nanosoldier
Collaborator

Your test job has completed - possible new issues were detected. A full report can be found here. cc @maleadt

@KristofferC
Member Author

Some failures from JuliaLang/Pkg.jl#1548 not being included here.

@JeffBezanson
Member

AIBECS: BSON
ARCHModels: numerical issue
AbstractNumbers: fails because div is now applicable to Complex
even though it doesn't work
AdaptiveDistanceFields: AxisAlgorithms: bad Vararg length; looks fixed (upgrade a dependency?)
Atom: pathsep not defined
BSON: JuliaIO/BSON.jl#63
BandedMatrices: no method matching bunchkaufman!(::Symmetric{Float32,BandedMatrix{Float32,Array{Float32,2},Base.OneTo{Int64}}}, ::Bool; check=false)

@JeffBezanson
Member

JeffBezanson commented Jan 6, 2020

BitIntegers: #33459. we removed a fallback signed() method it used.
CBinding: jl_gc_total_bytes renamed; use Base.gc_bytes() instead
CategoricalArrays: different whitespace in output
CatmullClark: #33954, fixed
CurrenciesBase: FixedDecimals: another div issue (#34284)
DBFTables: output changed
DataInterpolations: Zygote needs to handle loopinfo FluxML/Zygote.jl#393
DecisionTree: code coverage? https://github.com/bensadeghi/DecisionTree.jl/issues/112
DiffEqParamEstim: Zygote
DiffEqSensitivity: Zygote
Distributions: NaN === NaN ?
Econometrics: output changed
EfficientGlobalOptimization: Zygote
ExactPredicates: #34252
FilePathsBase: BSON
FixedEffectModels: PosDefException: matrix is not positive definite; Cholesky factorization failed.
Flux: Zygote
GFlops: some numbers don't match; not sure
GLM: PosDefException: matrix is not positive definite; Cholesky factorization failed.
GeoArrays: curl issue?
GeometricFlux: Zygote
GeometricIntegratorsDiffEq: DecFP
HalfIntegers: div.jl issue with cld; packages need to implement div(a, b, RoundUp)
Hecke: maybe due to changes in reduce?
[1] _empty_reduce_error() at ./reduce.jl:292
[2] reduce_empty(::Function, ::Type{T} where T) at ./reduce.jl:302
[3] reduce_empty(::Base.BottomRF{typeof(max)}, ::Type{T} where T) at ./reduce.jl:317
[4] mapreduce_empty(::typeof(abs), ::Base.BottomRF{typeof(max)}, ::Type{T} where T) at ./reduce.jl:333
HomotopyContinuation: @allocated changes
Hyperspecialize: crash
IncrementalInference: numerical issue?
InfrastructureSystems: #34243
JWAS: zero(Any) ?
JuMP: output changes
KernelFunctions: Zygote
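
The CBinding item in the list above suggests the Julia-level replacement for the renamed C symbol. A minimal sketch of that substitution (the `ccall` line is shown only as the pattern being replaced):

```julia
# Sketch for the CBinding failure: the C symbol `jl_gc_total_bytes` was
# renamed, so instead of ccall-ing it directly:
#     bytes = ccall(:jl_gc_total_bytes, Int64, ())
# call the Julia-level accessor, which survives the rename:
bytes = Base.gc_bytes()
println(bytes isa Integer && bytes >= 0)
```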


@vtjnash
Member

vtjnash commented Jan 6, 2020

DecisionTree was a known bug in that package (since fixed)

Hyperspecialize is probably an instance of #34252 (prior to this release, @testset used to expand to heuristics that would force off inference and optimization. It has forced compilation heuristics since 057fa2a; we just would not infer the code first.)

@JeffBezanson
Member

prior to this release, @testset used to expand to heuristics that would force off inference and optimization

Do you happen to know what that change was?

@vtjnash
Member

vtjnash commented Jan 7, 2020

b9546c8

JeffBezanson and others added 10 commits January 8, 2020 10:27
…eltypes (#34230)

In cases where we have multiple arrays with `OneTo` axes that do not share the same axis eltype, we should simply default to constructing a new array with `OneTo{Int}` axes.

(cherry picked from commit 11e7c33)
* Update svd.jl

* Update svd.jl

Make doc more useful to the reader.

* Fix grammar and one trailing whitespace

* Update svd.jl

* Fix doctests

* Avoid duplicating documentation between mutating and non-mutating
versions of svd functions.

* Reorganize generalized svd docstring and doctests

* Break some long lines

Co-authored-by: Viral B. Shah <viral@juliacomputing.com>
Co-authored-by: Andreas Noack <andreas@noack.dk>
(cherry picked from commit cec4c32)
add more-compatible fallback for `divrem`

(cherry picked from commit 8f53987)
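
The `divrem` fallback above can be sketched as deferring to the corresponding `div`/`rem` pair. This is an illustrative shape under a hypothetical name (`fallback_divrem`), not claimed to be the cherry-picked commit's exact definition:

```julia
# Illustrative sketch of a generic divrem fallback: defer to the
# matching three-argument div and rem (hypothetical name, not the
# commit's actual code).
fallback_divrem(x, y, r::RoundingMode = RoundToZero) = (div(x, y, r), rem(x, y, r))

@assert fallback_divrem(7, 2) == (3, 1)
@assert fallback_divrem(-7, 2, RoundDown) == (-4, 1)   # fld/mod pair
```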
@sostock
Contributor

sostock commented Jan 8, 2020

HalfIntegers: div.jl issue with cld; packages need to implement div(a, b, RoundUp)

This is #34134. It is fixed on HalfIntegers master, but I wanted to wait for a decision on #34129 before releasing a new HalfIntegers version that contains the fix.
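
The requirement behind #34134 can be sketched with an invented custom Real type (`MyHalf` is hypothetical, purely for illustration): on 1.4, `cld` falls back to the three-argument `div(a, b, RoundUp)`, so custom number types must implement that method:

```julia
# Hypothetical half-integer-style type; stores twice the value so that
# halves are exact in an Int field.
struct MyHalf <: Real
    twice::Int
end

# The method packages now need: three-argument div with a RoundingMode.
# Dividing the doubled representations preserves the ratio, so we can
# forward to the Int method.
Base.div(a::MyHalf, b::MyHalf, r::RoundingMode) = div(a.twice, b.twice, r)

# cld(a, b) is defined generically as div(a, b, RoundUp), so it now works:
@assert cld(MyHalf(3), MyHalf(2)) == 2   # 1.5 ÷ 1.0 rounded up
```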

@andreasnoack
Member

Whitespace error fixed in #34299

@KristofferC
Member Author

Regarding cbrt (and cos, sin, etc.) getting 11x slower: I think this is just benchmark noise.

On 1.3:

julia> @btime cbrt($(nextfloat(zero(Float64))));
  0.029 ns (0 allocations: 0 bytes)

and master

julia> @btime cbrt($(nextfloat(zero(Float64))));
  155.990 ns (0 allocations: 0 bytes)

Doing the "Ref trick":

1.3:

julia> @btime cbrt($(Ref(nextfloat(zero(Float64))))[]);
  178.467 ns (0 allocations: 0 bytes)

and master

julia> @btime cbrt($(Ref(nextfloat(zero(Float64))))[]);
  156.010 ns (0 allocations: 0 bytes)

@KristofferC
Member Author

KristofferC commented Jan 15, 2020

Allocation increase in ["array", "index", "(\"sumvector\", \"1.0:0.00010001000100010001:2.0\")"] reduces to this MWE:

julia> v = 1.0:0.00010001000100010001:2.0
1.0:0.00010001000100010001:2.0

julia> r = rand(1:length(v), 5);

julia> f(v, r) = v[r, 1]
f (generic function with 1 method)

julia> @btime f($v, $r);
  76.075 ns (3 allocations: 192 bytes)

(makes 2 allocations on 1.3). Changing to v[r] makes the allocation go back to 2.

@sostock
Contributor

sostock commented Jan 16, 2020

Before releasing a new HalfIntegers version that is compatible with Julia 1.4 (and marking old versions as incompatible), I wanted to know whether #34129 could be included in 1.4 as well. I guess it’s too late for that now?

While it is not a bugfix, I think it is an important change to the implementation in #33910 (which was merged shortly before the 1.4 branch happened). Without #34129, it is rather inconvenient to extend gcd/lcm/gcdx to custom Real types (see #33910 (comment)).

KristofferC and others added 10 commits January 17, 2020 09:10
There was a bug in initial value handling of `FlatteningRF` and the
following example failed:

    @test mapfoldl(
        x -> (x, x),
        ((a, b), (c, d)) -> (min(a, c), max(b, d)),
        Iterators.flatten((1:2, 3:4)),
    ) == (1, 4)

This is because `BottomRF(op.rf)` was called inside `FlatteningRF`
where `op.rf` is already a "non-bottom" reducing function; here it's a
`MappingRF`.  As `BottomRF(rf)` forwards anything on the second
argument on the first invocation as the first argument (accumulator)
of the next calls, we need to make sure that this value is processed
through `MappingRF` in the above example.  However, if we do
`BottomRF(op.rf)` where `op.rf` is a `MappingRF`, this `BottomRF`
bypasses any processing that has to happen in `op.rf`.

(cherry picked from commit 0ee3264)
This fixes #34247 by changing the way gc preserve is lowered.
Instead of lowering it in a macro, lower it in the frontend.
This allows us to use an SSA value directly for the return token
of the gc begin expression. This bypasses the slot-renaming pass
of the compiler, thus preventing the compiler from trying to save
and restore the token. Of course, this kind of code would generally
not be legal (because it uses an SSA value outside of the regular
domination relation), but since this is a julia restriction, not
an LLVM restriction, we can simply exempt gc_begin tokens from this
particular validation. This works fine at the LLVM level also, because
it doesn't have this particular restriction. It also doesn't have
the same correctness problems as doing the same for non-token values,
as the tokens get lowered away by the try/catch lowering before reaching
the LLVM backend.

(cherry picked from commit 07a16d6)
Fixes the same CI failure as #34336,
but hopefully should be immediately mergeable without objection about what
the interface is.

(cherry picked from commit 3ed4e94)
After much debugging, it turns out that OpenBLAS is sometimes hanging
on exit due to bad assumptions in its thread management. The upstream
discussion is at OpenMathLib/OpenBLAS#2350, but this
should fix our frequent win32 CI failures in the meantime.
@KristofferC KristofferC force-pushed the backports-release-1.4 branch from 3405e7d to 0360daa Compare January 17, 2020 08:10
sostock and others added 6 commits January 20, 2020 09:31
* Use UUID to create random tempname on Windows

* Use underscores and remove extension

* Truncate the UUID to 10 chars

* Generate the random name from a random byte array

* Update file.jl

(cherry picked from commit d759b5b)
In both places we call it. Also tabs-to-spaces for the new code
to match the formatting of the rest of the code.

(cherry picked from commit f120989)
This was removed before and added back apparently by mistake.

(cherry picked from commit 5928786)
@KristofferC
Member Author

@sostock, your changes should be in this branch now.

@ararslan removed labels on Jan 20, 2020: DO NOT MERGE (Do not merge this PR!), needs nanosoldier run (This PR should have benchmarks run on it), needs pkgeval (Tests for all registered packages should be run with this change).
@ararslan ararslan added this to the 1.4 milestone Jan 20, 2020
@ararslan ararslan merged commit 1c977b0 into release-1.4 Jan 20, 2020
@ararslan ararslan deleted the backports-release-1.4 branch January 20, 2020 21:51