inference regression on afoldl
#39175
A major contribution seems to be this operation.

Julia 1.5.3:

```julia
julia> f(x...) = Int.(1:max(x...))
f (generic function with 1 method)

julia> @btime f(1,2)
  50.888 ns (1 allocation: 96 bytes)
```

master:

```julia
julia> f(x...) = Int.(1:max(x...))
f (generic function with 1 method)

julia> @btime f(1,2)
  444.369 ns (5 allocations: 240 bytes)
```
The current master does not specialize the function.
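One way to force specialization on varargs (a long-standing recommendation in the Julia performance tips) is to make the argument count a type parameter. A minimal sketch based on the `f` from the report above; `g` is a hypothetical name for the specialized variant:

```julia
# f as in the report: whether the vararg is specialized is left
# to the compiler's heuristics
f(x...) = Int.(1:max(x...))

# g makes the argument count N a type parameter, so each call
# arity gets its own compiled method instance
g(x::Vararg{Any,N}) where {N} = Int.(1:max(x...))

f(1, 2)  # returns [1, 2]
g(1, 2)  # same result, but specialized on N = 2
```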
It is not only the line you are focusing on. The following version shows a factor-of-two difference between 1.5.3 and 1.6.0-beta1.
If specialization of
Specializing with
Is this going to be fixed in 1.6? It seems to me
The performance tips page says
Maybe it is not clear that the second sentence only refers to
I do not understand. Do you find it normal that performance will be degraded in 1.6 with respect to 1.5?
@jmichel7, first, please be patient. You only filed this 3 days ago; there are not legions of people available to fix these issues, and all of them have big TODO lists. I am not certain this will be fixed, though; Julia's specialization heuristics have always influenced performance, both runtime and compile-time (latency). We've long recommended for
For some reason we're no longer able or willing to infer through `afoldl`.
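For context, `Base.afoldl` is a left fold over its arguments, used internally to implement n-ary operators such as `+` and `*`. A simplified sketch of the idea (not Base's exact source, which also defines fixed-arity methods; the name `myafoldl` is mine):

```julia
# Simplified left fold over varargs, in the spirit of Base.afoldl.
# The recursion peels off two arguments at a time: each step
# combines the accumulator with the next argument via op.
myafoldl(op, a) = a
myafoldl(op, a, b, rest...) = myafoldl(op, op(a, b), rest...)

myafoldl(+, 1, 2, 3, 4)    # returns 10
myafoldl(max, 1, 5, 3)     # returns 5
```

The recursion itself is why inference matters here: if the compiler does not specialize on the argument count, each recursive step can become a dynamic dispatch.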
Ah, we changed this intentionally in the new version:
=> v1.6
With constant-propagation, inference (and Varargs runtime) is likely better able to handle this version now (and it removes the n^2 behavior definitions for semi-low argument counts). Now we also need to force Vararg specialization up to 16 arguments manually, so we do that explicitly (and slightly hack-y). Fixes regression in #39175
(cherry picked from commit 5cd1e3e)
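The "explicit (and slightly hack-y)" forcing of Vararg specialization that the commit message mentions can be done by generating one fixed-arity method per argument count. A minimal illustration of the pattern (hypothetical `myop`, folding with `max`; this is not the actual Base code):

```julia
# Pattern for forcing specialization at low argument counts:
# generate a concrete method for each arity up to a cutoff,
# each of which reduces to a smaller arity. Not the actual Base code.
myop(a, b) = max(a, b)
for N in 3:16
    args = [Symbol(:x, i) for i in 1:N]
    # e.g. for N = 3: myop(x1, x2, x3) = myop(myop(x1, x2), x3)
    @eval myop($(args...)) = myop(myop($(args[1]), $(args[2])), $(args[3:end]...))
end

myop(1, 7, 3)         # returns 7
myop(1, 2, 3, 4, 5)   # returns 5
```

Because every arity from 2 to 16 has its own method, dispatch at those argument counts never has to reason about a `Vararg` signature at all.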
I tried my big algebra package on 1.6.0-beta1. Many parts are slightly faster, but one important part is many times slower.
I found several problems, but the shortest MWE is