Wrong result with exp! #14

Closed
baggepinnen opened this issue May 17, 2023 · 3 comments · Fixed by #16

Comments

@baggepinnen (Contributor)

I'm using this package to differentiate through exp! with ForwardDiff, but I seem to get incorrect results. Below is an MWE that produces a large error.

using LinearAlgebra, ForwardDiff, FiniteDiff, ForwardDiffChainRules
@ForwardDiff_frule LinearAlgebra.exp!(x1::AbstractMatrix{<:ForwardDiff.Dual})
function test_exp(x)
    X = copy(reshape(x, 4, 4))
    LinearAlgebra.exp!(X)
    sum(X)
end

x = randn(16)
g1 = ForwardDiff.gradient(test_exp, x)
g2 = FiniteDiff.finite_difference_gradient(test_exp, x)
norm(g1-g2)

julia> norm(g1-g2)
3.7329399053147245

The exp! rule in ChainRules appears to be extensively tested, so I assume the error lies in this package?

@baggepinnen (Contributor, issue author)

I believe the problem is due to exp! working in place, while some package along the way assumes that it doesn't.

If I modify my function to make use of the return value of exp!, it works as expected:

function test_exp(x)
    X = copy(reshape(x, 4, 4))
    X2 = LinearAlgebra.exp!(X)
    sum(X2)
end

x = randn(16)
g1 = ForwardDiff.gradient(test_exp, x)
g2 = FiniteDiff.finite_difference_gradient(test_exp, x)
norm(g1-g2)

julia> norm(g1-g2)
7.220435699988837e-10

@baggepinnen (Contributor, issue author)

Actually, scratch that: if I run the code multiple times, it sometimes produces a good answer and sometimes complete garbage.

using LinearAlgebra, ForwardDiff, FiniteDiff, ForwardDiffChainRules
@ForwardDiff_frule LinearAlgebra.exp!(x1::AbstractMatrix{<:ForwardDiff.Dual})
function test_exp(x)
    X = copy(reshape(x, 4, 4))
    X2 = LinearAlgebra.exp!(X)
    sum(X2)
end

for i = 1:20
    x = randn(16)
    g1 = ForwardDiff.gradient(test_exp, x)
    g2 = FiniteDiff.finite_difference_gradient(test_exp, x)
    @show norm(g1-g2)
end

norm(g1 - g2) = 0.7837873290852769
norm(g1 - g2) = 4.273819159966698
norm(g1 - g2) = 3.615018535697867e-10
norm(g1 - g2) = 1.4710020159772985
norm(g1 - g2) = 5.635191097418673e-10
norm(g1 - g2) = 1.9245924669231074
norm(g1 - g2) = 5.107089061936941e-10
norm(g1 - g2) = 1.7131885110939008e-9
norm(g1 - g2) = 5.312492090497393e-10
norm(g1 - g2) = 4.768520016802831
norm(g1 - g2) = 3.355171746570431e-10
norm(g1 - g2) = 1.3810866694636642e-9
norm(g1 - g2) = 2.968108261531919
norm(g1 - g2) = 0.6199744911207663
norm(g1 - g2) = 1.0541068611698485e-9
norm(g1 - g2) = 2.4861853228116223e-10
norm(g1 - g2) = 1.685781100596634
norm(g1 - g2) = 6.829920935101087e-10
norm(g1 - g2) = 4.842450839137196
norm(g1 - g2) = 7.080227584299486

@baggepinnen (Contributor, issue author)

I think I've found the problem: exp! mutates its input argument, and that is not correctly handled by this package. A copy of the primal must be passed into frule for the results to be correct.
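The aliasing hazard can be sketched in isolation. This is a minimal illustration only, not ForwardDiffChainRules' actual internals: `mutating_primitive!`, `broken_wrapper`, and `fixed_wrapper` are hypothetical names standing in for the macro-generated code that extracts primals from Dual inputs and hands them to a frule for a mutating primitive such as LinearAlgebra.exp!.

```julia
using LinearAlgebra

# Stand-in for a mutating primitive like LinearAlgebra.exp!:
# overwrites its argument and returns it.
mutating_primitive!(A) = (A .= exp(A); A)

# BUG pattern: the wrapper passes the primal array straight into the
# mutating call, so the original primal is destroyed. Anything that
# reads P afterwards (e.g. to assemble output Duals) sees exp(P).
function broken_wrapper(P)
    Y = mutating_primitive!(P)
    return Y, P            # P now aliases Y; the original input is gone
end

# Fix pattern (as in PR #16's description): pass a copy of the primal
# into the mutating call, keeping the original intact.
function fixed_wrapper(P)
    Y = mutating_primitive!(copy(P))
    return Y, P            # P still holds the original input
end

A = randn(4, 4)

P = copy(A)
_, P_after = broken_wrapper(P)
P_after == A   # false: the "primal" was silently overwritten

P = copy(A)
_, P_after = fixed_wrapper(P)
P_after == A   # true: the primal survives the call
```

This is consistent with the intermittent failures above: whether the corrupted primal is still read after the in-place call can depend on evaluation order, so some runs happen to come out correct.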

baggepinnen added a commit to baggepinnen/ForwardDiffChainRules.jl that referenced this issue May 19, 2023
@baggepinnen baggepinnen mentioned this issue May 19, 2023
ThummeTo added a commit that referenced this issue Sep 12, 2023
* fix #14

* add FD

* Update test/runtests.jl

Co-authored-by: Mohamed Tarek <mohamed82008@gmail.com>

---------

Co-authored-by: Mohamed Tarek <mohamed82008@gmail.com>
Co-authored-by: ThummeTo <83663542+ThummeTo@users.noreply.github.com>