unwrapMode != UnwrapMode::LegalFullUnwrap with Julia <1.8 #1629
Hm, would you be able to reduce this further and ideally remove the Bijectors dependency?

On Thu, Jul 11, 2024 at 8:39 AM, Markus Hauru wrote:
```julia
module MWE

using Bijectors: VecCholeskyBijector, inverse
import Enzyme
import Random

Random.seed!(23)

function test_ad(f, x)
    Enzyme.gradient(Enzyme.Reverse, f, x)
end

b = VecCholeskyBijector(:L)
y = [0.8572525342293206]
binv = inverse(b)
test_ad(x -> sum(b(binv(x))), y)

end
```
Output:

```
Assertion failed: (unwrapMode != UnwrapMode::LegalFullUnwrap), function unwrapM, file /workspace/srcdir/Enzyme/enzyme/Enzyme/GradientUtils.cpp, line 1668.

signal (6): Abort trap: 6
in expression starting at /Users/mhauru/projects/Enzyme-mwes/julia16_unwrapMode/mwe.jl:17
__pthread_kill at /usr/lib/system/libsystem_kernel.dylib (unknown line)
Allocations: 109107809 (Pool: 109048203; Big: 59606); GC: 52
Abort trap: 6
```
This happens on

```
julia> versioninfo()
Julia Version 1.7.2
Commit bf53498635 (2022-02-06 15:21 UTC)
Platform Info:
  OS: macOS (arm64-apple-darwin21.2.0)
  CPU: Apple M1 Pro
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-12.0.1 (ORCJIT, cyclone)
```

but not on Julia v1.8.5.
Now with 100% less Bijectors:

```julia
module MWE

import Enzyme
using LinearAlgebra
import Random

Random.seed!(23)

_triu1_dim_from_length(d) = (1 + isqrt(1 + 8d)) ÷ 2

function _link_chol_lkj_from_upper(W::AbstractMatrix)
    K = LinearAlgebra.checksquare(W)
    N = ((K - 1) * K) ÷ 2  # {K \choose 2} free parameters
    y = similar(W, N)
    idx = 1
    @inbounds for j in 2:K
        y[idx] = atanh(W[1, j])
        idx += 1
        remainder_sq = 1 - W[1, j]^2
        for i in 2:(j - 1)
            z = W[i, j] / sqrt(remainder_sq)
            y[idx] = atanh(z)
            remainder_sq -= W[i, j]^2
            idx += 1
        end
    end
    return y
end

function _inv_link_chol_lkj(y::AbstractVector)
    LinearAlgebra.require_one_based_indexing(y)
    K = _triu1_dim_from_length(length(y))
    W = similar(y, K, K)
    T = float(eltype(W))
    idx = 1
    @inbounds for j in 1:K
        log_remainder = zero(T)  # log of proportion of unit vector remaining
        for i in 1:(j - 1)
            z = tanh(y[idx])
            idx += 1
            W[i, j] = z * exp(log_remainder)
            log_remainder += log1p(-z^2) / 2
        end
        W[j, j] = exp(log_remainder)
        for i in (j + 1):K
            W[i, j] = 0
        end
    end
    return W
end

_link_chol_lkj_from_lower(W::AbstractMatrix) = _link_chol_lkj_from_upper(permutedims(W))

b(x) = _link_chol_lkj_from_lower(x.L)
binv(y) = Cholesky(permutedims(_inv_link_chol_lkj(y)), 'L', 0)

function test_ad(f, x)
    Enzyme.gradient(Enzyme.Reverse, f, x)
end

y = [0.8572525342293206]
test_ad(x -> sum(b(binv(x))), y)

end
```
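As an aside, the link/inverse-link pair in this MWE is a stick-breaking parameterization of the Cholesky factor of a correlation matrix. As a sanity check of the math itself, independent of Julia and Enzyme, here is a minimal Python transcription; the function names mirror the Julia ones but are otherwise illustrative, and it only verifies the round trip, not the gradient:

```python
import math

def triu1_dim_from_length(d):
    # Recover K from N = K*(K-1)/2 free parameters: K = (1 + sqrt(1 + 8N)) / 2.
    return (1 + math.isqrt(1 + 8 * d)) // 2

def inv_link_chol_lkj(y):
    # Build an upper-triangular Cholesky factor of a correlation matrix
    # from unconstrained parameters y via stick-breaking on each column.
    K = triu1_dim_from_length(len(y))
    W = [[0.0] * K for _ in range(K)]
    idx = 0
    for j in range(K):
        log_remainder = 0.0  # log of the remaining unit-vector length
        for i in range(j):
            z = math.tanh(y[idx])
            idx += 1
            W[i][j] = z * math.exp(log_remainder)
            log_remainder += math.log1p(-z * z) / 2
        W[j][j] = math.exp(log_remainder)
    return W

def link_chol_lkj_from_upper(W):
    # Inverse map: recover the unconstrained parameters from W.
    K = len(W)
    y = []
    for j in range(1, K):
        y.append(math.atanh(W[0][j]))
        remainder_sq = 1 - W[0][j] ** 2
        for i in range(1, j):
            y.append(math.atanh(W[i][j] / math.sqrt(remainder_sq)))
            remainder_sq -= W[i][j] ** 2
    return y

y = [0.8572525342293206]
W = inv_link_chol_lkj(y)
print(abs(link_chol_lkj_from_upper(W)[0] - y[0]) < 1e-12)  # round trip recovers y
```

So the primal computation is well posed; the failure below is specific to Enzyme's reverse pass on Julia < 1.8.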
So this succeeded for me on 1.8.4:

```
wmoses@beast:~/git/Enzyme.jl ((HEAD detached at origin/main)) $ cat fuw.jl
import Enzyme
using LinearAlgebra
import Random

Random.seed!(23)

_triu1_dim_from_length(d) = (1 + isqrt(1 + 8d)) ÷ 2

function _link_chol_lkj_from_upper(W::AbstractMatrix)
    K = LinearAlgebra.checksquare(W)
    N = ((K - 1) * K) ÷ 2  # {K \choose 2} free parameters
    y = similar(W, N)
    idx = 1
    @inbounds for j in 2:K
        y[idx] = atanh(W[1, j])
        idx += 1
        remainder_sq = 1 - W[1, j]^2
        for i in 2:(j - 1)
            z = W[i, j] / sqrt(remainder_sq)
            y[idx] = atanh(z)
            remainder_sq -= W[i, j]^2
            idx += 1
        end
    end
    return y
end

function _inv_link_chol_lkj(y::AbstractVector)
    LinearAlgebra.require_one_based_indexing(y)
    K = _triu1_dim_from_length(length(y))
    W = similar(y, K, K)
    T = float(eltype(W))
    idx = 1
    @inbounds for j in 1:K
        log_remainder = zero(T)  # log of proportion of unit vector remaining
        for i in 1:(j - 1)
            z = tanh(y[idx])
            idx += 1
            W[i, j] = z * exp(log_remainder)
            log_remainder += log1p(-z^2) / 2
        end
        W[j, j] = exp(log_remainder)
        for i in (j + 1):K
            W[i, j] = 0
        end
    end
    return W
end

_link_chol_lkj_from_lower(W::AbstractMatrix) = _link_chol_lkj_from_upper(permutedims(W))

b(x) = _link_chol_lkj_from_lower(x.L)
binv(y) = Cholesky(permutedims(_inv_link_chol_lkj(y)), 'L', 0)

function f(x)
    sum(b(binv(x)))
end

y = [0.8572525342293206]
Enzyme.gradient(Enzyme.Reverse, f, y)
```
Yeah, it works on 1.8 and fails on 1.7 (and probably 1.6, based on the CI run of the Bijectors.jl PR this came from). Excitingly, this, too, seems to be nondeterministic: sometimes the error is the unwrapMode assertion failure I pasted above, and sometimes it's a different one.
Any possible relation to the nondeterminism in #1626?
Should be fixed by #1657; please reopen otherwise.