
Conflict between Zygote and AbstractGPs #394

Closed

jmbyars opened this issue Mar 8, 2024 · 5 comments

jmbyars commented Mar 8, 2024

I am trying to replicate the Mauna Loa example from AbstractGPs (https://juliagaussianprocesses.github.io/AbstractGPs.jl/stable/examples/1-mauna-loa/) line by line. However, I keep getting the following error when using the L-BFGS optimizer that the example recommends.

using Optim
using ParameterHandling
using Zygote

default_optimizer = LBFGS(;
    alphaguess=Optim.LineSearches.InitialStatic(; scaled=true),
    linesearch=Optim.LineSearches.BackTracking(),
)

function optimize_loss(loss, θ_init; optimizer=default_optimizer, maxiter=1_000)
    options = Optim.Options(; iterations=maxiter, show_trace=true)

    # Flatten the (possibly nested) parameter struct into a plain vector that
    # Optim can work with, and keep the inverse transform for later.
    θ_flat_init, unflatten = ParameterHandling.value_flatten(θ_init)
    loss_packed = loss ∘ unflatten

    # https://julianlsolvers.github.io/Optim.jl/stable/#user/tipsandtricks/#avoid-repeating-computations
    function fg!(F, G, x)
        if F !== nothing && G !== nothing
            # Compute the loss and its gradient in a single reverse-mode pass.
            val, grad = Zygote.withgradient(loss_packed, x)
            G .= only(grad)
            return val
        elseif G !== nothing
            grad = Zygote.gradient(loss_packed, x)
            G .= only(grad)
            return nothing
        elseif F !== nothing
            return loss_packed(x)
        end
    end

    result = optimize(Optim.only_fg!(fg!), θ_flat_init, optimizer, options; inplace=false)

    return unflatten(result.minimizer), result
end

θ_opt, opt_result = optimize_loss(loss, θ_init)

ERROR: MethodError: no method matching AbstractGPs.FiniteGP(::GP{AbstractGPs.ZeroMean{Float64}, KernelSum{Tuple{ScaledKernel{TransformedKernel{SqExponentialKernel{Distances.Euclidean}, ScaleTransform{Float64}}, Float64}, KernelProduct{Tuple{TransformedKernel{SqExponentialKernel{Distances.Euclidean}, ChainTransform{Tuple{PeriodicTransform{Vector{Float64}}, ScaleTransform{Float64}}}}, ScaledKernel{TransformedKernel{SqExponentialKernel{Distances.Euclidean}, ScaleTransform{Float64}}, Float64}}}, ScaledKernel{TransformedKernel{RationalQuadraticKernel{Float64, Distances.Euclidean}, ScaleTransform{Float64}}, Float64}, ScaledKernel{TransformedKernel{SqExponentialKernel{Distances.Euclidean}, ScaleTransform{Float64}}, Float64}, ScaledKernel{WhiteKernel, Float64}}}}, ::Float64)

Do you all have any advice on fixing this issue?

willtebbutt (Member)

Hi @jmbyars, thanks for opening this issue.

This seems to work fine when I try it locally with my own data. Could you please turn your example into an MWE (minimal working example) that we can run to reproduce your issue?

simsurace (Member)

Also, it would be nice to have the output of ]st --manifest in the environment in which you are running this.
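
For anyone unfamiliar with the shorthand: ]st --manifest runs Pkg's status command against the full Manifest.toml in the Pkg REPL. A minimal script equivalent, using the standard Pkg API, is:

using Pkg
Pkg.status(; mode=Pkg.PKGMODE_MANIFEST)  # print every package resolved in Manifest.toml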

simsurace (Member)

On a related note, we should bump the version and release #390. @jmbyars did you try to run the example on AbstractGPs#master?
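
Trying the development branch would look something like this (a sketch using the standard Pkg commands; the Pkg REPL shorthand is add AbstractGPs#master):

using Pkg
Pkg.add(name="AbstractGPs", rev="master")  # track the unreleased master branch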

simsurace (Member)

Ok, there is now a new release v0.5.20 with the newest version of the example.

jmbyars (Author) commented Mar 11, 2024

@simsurace thank you for this. Upgrading to v0.5.20 lets me run the example.
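
For anyone landing here with the same error, the upgrade is the standard Pkg step (a sketch; pinning the exact version is optional):

using Pkg
Pkg.update("AbstractGPs")  # pull the latest registered release
# or pin the specific version mentioned above:
Pkg.add(name="AbstractGPs", version="0.5.20")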

jmbyars closed this as completed Mar 11, 2024