
AD fails for MixtureModel #722

Closed
cpfiffer opened this issue Mar 20, 2019 · 14 comments

@cpfiffer
Member

AD fails through the MixtureModel function. This is related to #674; should I move this over there?

MWE:

using Turing

@model MarginalizedGMM(x, K) = begin
    N = length(x)

    # Draw the parameters
    μ = Vector(undef, K)
    τ = Vector(undef, K)
    for i in 1:K
        μ[i] ~ Normal()
        τ[i] ~ Gamma()
    end

    # Set the Dirichlet prior.
    α = 1.0
    w ~ Dirichlet(K, α)

    # Calculate weighted Normals.
    for i in 1:N
      x[i] ~ MixtureModel(Normal, [(μ[k], τ[k]) for k in 1:K], w)
    end
end

val = rand(10)
chn = sample(MarginalizedGMM(val, 3), NUTS(1000, 0.65))

Error:

ERROR: LoadError: MethodError: no method matching MixtureModel(::Type{Normal}, ::Array{Tuple{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Turing.Core, Symbol("#f#26")){Turing.Core.VarReplay.VarInfo,Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#413#61")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}},Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}},Real},Real,9},ForwardDiff.Dual{ForwardDiff.Tag{getfield(Turing.Core, Symbol("#f#26")){Turing.Core.VarReplay.VarInfo,Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#413#61")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}},Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}},Real},Float64,9}},1}, ::Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Turing.Core, Symbol("#f#26")){Turing.Core.VarReplay.VarInfo,Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#413#61")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}},Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}},Real},Real,9},1})
Closest candidates are:
  MixtureModel(::Type{C<:Distribution}, ::AbstractArray) where C<:Distribution at /home/cameron/.julia/packages/Distributions/WHjOk/src/mixtures/mixturemodel.jl:122
  MixtureModel(::Type{C<:Distribution}, ::AbstractArray, ::Array{Float64,1}) where C<:Distribution at /home/cameron/.julia/packages/Distributions/WHjOk/src/mixtures/mixturemodel.jl:139
Stacktrace:
 [1] macro expansion at /home/cameron/.julia/dev/Turing/src/core/compiler.jl:44 [inlined]
 [2] macro expansion at /home/cameron/code/julia/misc.jl:20 [inlined]
 [3] (::getfield(Main, Symbol("###inner_function#413#61")){Int64})(::Turing.Core.VarReplay.VarInfo, ::Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}, ::Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#413#61")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}}) at /home/cameron/.julia/dev/Turing/src/core/compiler.jl:388
 [4] #call#3 at /home/cameron/.julia/dev/Turing/src/Turing.jl:62 [inlined]
 [5] Model at /home/cameron/.julia/dev/Turing/src/Turing.jl:62 [inlined]
 [6] macro expansion at /home/cameron/.julia/dev/Turing/src/core/VarReplay.jl:112 [inlined]
 [7] runmodel!(::Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#413#61")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}}, ::Turing.Core.VarReplay.VarInfo, ::Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}) at /home/cameron/.julia/dev/Turing/src/core/VarReplay.jl:107
@xukai92
Member

xukai92 commented Mar 20, 2019

Which Distributions.jl version is the error from? We used to host a customised version of GMM which supports AD, but we removed it some time ago. However, JuliaStats/Distributions.jl#615 was merged recently, so I'm not sure whether this wasn't tested against the newest Distributions or whether that PR doesn't solve the problem.

@yebai
Member

yebai commented Mar 20, 2019

Related: #266

@cpfiffer
Member Author

This fails on Distributions v0.17.0. I also ran it on the master branch, and now I've got this error:

ERROR: LoadError: TypeError: in typeassert, expected Float64, got ForwardDiff.Dual{Nothing,Float64,9}
Stacktrace:
 [1] setindex!(::Array{Float64,1}, ::ForwardDiff.Dual{ForwardDiff.Tag{getfield(Turing.Core, Symbol("#f#26")){Turing.Core.VarReplay.VarInfo,Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#369#29")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}},Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}},Real},Float64,9}, ::Int64) at ./array.jl:767
 [2] _mixlogpdf1(::MixtureModel{Univariate,Continuous,Normal,ForwardDiff.Dual{ForwardDiff.Tag{getfield(Turing.Core, Symbol("#f#26")){Turing.Core.VarReplay.VarInfo,Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#369#29")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}},Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}},Real},Real,9}}, ::Float64) at /home/cameron/.julia/packages/Distributions/fMt8c/src/mixtures/mixturemodel.jl:369
 [3] observe at /home/cameron/.julia/packages/Distributions/fMt8c/src/mixtures/mixturemodel.jl:433 [inlined]
 [4] observe at /home/cameron/.julia/dev/Turing/src/inference/Inference.jl:179 [inlined]
 [5] observe(::Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}, ::MixtureModel{Univariate,Continuous,Normal,ForwardDiff.Dual{ForwardDiff.Tag{getfield(Turing.Core, Symbol("#f#26")){Turing.Core.VarReplay.VarInfo,Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#369#29")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}},Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}},Real},Real,9}}, ::Float64, ::Turing.Core.VarReplay.VarInfo) at /home/cameron/.julia/dev/Turing/src/inference/hmc.jl:311

@yebai
Member

yebai commented Mar 27, 2019

Which Distributions.jl version is the error from? We used to host a customised version of GMM which supports AD, but we removed it some time ago. However, JuliaStats/Distributions.jl#615 was merged recently, so I'm not sure whether this wasn't tested against the newest Distributions or whether that PR doesn't solve the problem.

@xukai92
There seem to be some leftover issues with JuliaStats/Distributions.jl#615. The same model still doesn't work even after changing MixtureModel to UnivariateGMM.

using Turing

@model MarginalizedGMM(x, K) = begin
    N = length(x)

    # Draw the parameters
    μ = Vector{Real}(undef, K)
    τ = Vector{Real}(undef, K)
    for i in 1:K
        μ[i] ~ Normal()
        τ[i] ~ Gamma()
    end

    # Set the Dirichlet prior.
    α = 1.0
    w ~ Dirichlet(K, α)

    # Calculate weighted Normals.
    for i in 1:N
      x[i] ~ Distributions.UnivariateGMM(μ, τ, Categorical(w))
    end
end

val = rand(10)
chn = sample(MarginalizedGMM(val, 3), NUTS(1000, 0.65))

Output:

julia> chn = sample(MarginalizedGMM(val, 3), NUTS(1000, 0.65))
[ Info: [Turing] looking for good initial eps...
ERROR: TypeError: in typeassert, expected Float64, got ForwardDiff.Dual{Nothing,Float64,9}
Stacktrace:
 [1] setindex!(::Array{Float64,1}, ::ForwardDiff.Dual{ForwardDiff.Tag{getfield(Turing.Core, Symbol("#f#26")){Turing.Core.VarReplay.VarInfo,Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#457#33")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}},Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}},Real},Float64,9}, ::Int64) at ./array.jl:767
 [2] _mixlogpdf1(::UnivariateGMM{Array{Real,1},Array{Real,1}}, ::Float64) at /Users/hg344/.julia/packages/Distributions/fMt8c/src/mixtures/mixturemodel.jl:369
 [3] logpdf at /Users/hg344/.julia/packages/Distributions/fMt8c/src/mixtures/mixturemodel.jl:433 [inlined]
 [4] observe at /Users/hg344/.julia/dev/Turing/src/inference/Inference.jl:192 [inlined]
 [5] observe at /Users/hg344/.julia/dev/Turing/src/inference/Inference.jl:180 [inlined]
 [6] observe at /Users/hg344/.julia/dev/Turing/src/inference/hmc.jl:308 [inlined]
 [7] macro expansion at /Users/hg344/.julia/dev/Turing/src/core/compiler.jl:52 [inlined]
 [8] macro expansion at ./REPL[39]:18 [inlined]
 [9] (::getfield(Main, Symbol("###inner_function#457#33")){Int64})(::Turing.Core.VarReplay.VarInfo, ::Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}, ::Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#457#33")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}}) at /Users/hg344/.julia/dev/Turing/src/core/compiler.jl:388
 [10] #call#3 at /Users/hg344/.julia/dev/Turing/src/Turing.jl:62 [inlined]
 [11] Model at /Users/hg344/.julia/dev/Turing/src/Turing.jl:62 [inlined]
 [12] macro expansion at /Users/hg344/.julia/dev/Turing/src/core/VarReplay.jl:110 [inlined]
 [13] runmodel!(::Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#457#33")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}}, ::Turing.Core.VarReplay.VarInfo, ::Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}) at /Users/hg344/.julia/dev/Turing/src/core/VarReplay.jl:105
 [14] f at /Users/hg344/.julia/dev/Turing/src/core/ad.jl:109 [inlined]
 [15] vector_mode_dual_eval(::getfield(Turing.Core, Symbol("#f#26")){Turing.Core.VarReplay.VarInfo,Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#457#33")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}},Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}}, ::Array{Real,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{getfield(Turing.Core, Symbol("#f#26")){Turing.Core.VarReplay.VarInfo,Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#457#33")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}},Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}},Real},Real,9,Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Turing.Core, Symbol("#f#26")){Turing.Core.VarReplay.VarInfo,Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#457#33")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}},Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}},Real},Real,9},1}}) at /Users/hg344/.julia/packages/ForwardDiff/N0wMF/src/apiutils.jl:37
 [16] vector_mode_gradient!(::Array{Real,1}, ::getfield(Turing.Core, Symbol("#f#26")){Turing.Core.VarReplay.VarInfo,Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#457#33")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}},Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}}, ::Array{Real,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{getfield(Turing.Core, Symbol("#f#26")){Turing.Core.VarReplay.VarInfo,Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#457#33")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}},Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}},Real},Real,9,Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Turing.Core, Symbol("#f#26")){Turing.Core.VarReplay.VarInfo,Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#457#33")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}},Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}},Real},Real,9},1}}) at /Users/hg344/.julia/packages/ForwardDiff/N0wMF/src/gradient.jl:103
 [17] gradient! at /Users/hg344/.julia/packages/ForwardDiff/N0wMF/src/gradient.jl:35 [inlined]
 [18] gradient!(::Array{Real,1}, ::getfield(Turing.Core, Symbol("#f#26")){Turing.Core.VarReplay.VarInfo,Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#457#33")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}},Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}}, ::Array{Real,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{getfield(Turing.Core, Symbol("#f#26")){Turing.Core.VarReplay.VarInfo,Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#457#33")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}},Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}},Real},Real,9,Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(Turing.Core, Symbol("#f#26")){Turing.Core.VarReplay.VarInfo,Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#457#33")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}},Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}},Real},Real,9},1}}) at /Users/hg344/.julia/packages/ForwardDiff/N0wMF/src/gradient.jl:33
 [19] gradient_logp_forward(::Array{Real,1}, ::Turing.Core.VarReplay.VarInfo, ::Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#457#33")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}}, ::Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}) at /Users/hg344/.julia/dev/Turing/src/core/ad.jl:116
 [20] gradient_logp at /Users/hg344/.julia/dev/Turing/src/core/ad.jl:80 [inlined]
 [21] #2 at /Users/hg344/.julia/dev/Turing/src/inference/support/hmc_core.jl:12 [inlined]
 [22] #_leapfrog#26(::Nothing, ::Nothing, ::Function, ::Array{Real,1}, ::Array{Float64,1}, ::Int64, ::Float64, ::getfield(Turing.Inference, Symbol("##2#3")){Turing.Core.VarReplay.VarInfo,Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}},Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#457#33")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}}}) at /Users/hg344/.julia/dev/Turing/src/inference/support/hmc_core.jl:147
 [23] _leapfrog(::Array{Real,1}, ::Array{Float64,1}, ::Int64, ::Float64, ::Function) at /Users/hg344/.julia/dev/Turing/src/inference/support/hmc_core.jl:147
 [24] #_find_good_eps#29(::Int64, ::Function, ::Array{Real,1}, ::getfield(Turing.Inference, Symbol("##4#5")){Turing.Core.VarReplay.VarInfo,Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}},Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#457#33")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}}}, ::Function, ::getfield(Turing.Inference, Symbol("##12#13")), ::getfield(Turing.Inference, Symbol("##10#11")){Int64}) at /Users/hg344/.julia/dev/Turing/src/inference/support/hmc_core.jl:253
 [25] _find_good_eps(::Array{Real,1}, ::Function, ::Function, ::Function, ::Function) at /Users/hg344/.julia/dev/Turing/src/inference/support/hmc_core.jl:245
 [26] find_good_eps(::Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#457#33")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}}, ::Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}, ::Turing.Core.VarReplay.VarInfo) at /Users/hg344/.julia/dev/Turing/src/inference/support/hmc_core.jl:236
 [27] step(::Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#457#33")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}}, ::Turing.Sampler{NUTS{Turing.Core.ForwardDiffAD{40},Any}}, ::Turing.Core.VarReplay.VarInfo, ::Val{true}) at /Users/hg344/.julia/dev/Turing/src/inference/hmc.jl:202
 [28] macro expansion at ./util.jl:213 [inlined]
 [29] #sample#37(::Bool, ::Nothing, ::Int64, ::Nothing, ::Function, ::Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#457#33")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}}, ::NUTS{Turing.Core.ForwardDiffAD{40},Any}) at /Users/hg344/.julia/dev/Turing/src/inference/hmc.jl:150
 [30] sample(::Turing.Model{Tuple{:μ,:τ,:w},Tuple{:x},getfield(Main, Symbol("###inner_function#457#33")){Int64},NamedTuple{(:x,),Tuple{Array{Float64,1}}},NamedTuple{(:x,),Tuple{Symbol}}}, ::NUTS{Turing.Core.ForwardDiffAD{40},Any}) at /Users/hg344/.julia/dev/Turing/src/inference/hmc.jl:102
 [31] top-level scope at none:0

@xukai92
Member

xukai92 commented Mar 28, 2019

I see. This is from this line: https://github.com/JuliaStats/Distributions.jl/blob/master/src/mixtures/mixturemodel.jl#L369, which basically constrains the returned log-pdf to be the same type as the observed variable (in our case Float64).
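The failing pattern can be reproduced without Turing or Distributions. Below is a minimal sketch using a hypothetical `MyDual` stand-in for `ForwardDiff.Dual`: a `Real` subtype with no conversion to `Float64` (dropping the partials would be wrong). A buffer whose element type is taken from the observation cannot store such a value:

```julia
# Hypothetical stand-in for ForwardDiff.Dual: a Real subtype that defines
# no conversion to Float64.
struct MyDual <: Real
    val::Float64
    partial::Float64
end

x = 1.0                           # the observed value, a Float64
lp = Vector{typeof(x)}(undef, 3)  # buffer eltype fixed by the observation's type
d = MyDual(0.5, 1.0)              # a parameter value carrying a derivative

stored = try
    lp[1] = d   # setindex! must convert d to Float64 and fails
    true
catch
    false
end
# stored == false: the dual cannot be written into the Float64 buffer,
# which is essentially what the typeassert error above reports from
# inside _mixlogpdf1.
```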

The following hack works:

    for i in 1:N
      _x = eltype(μ)(x[i])
      _x ~ Distributions.UnivariateGMM(μ, τ, Categorical(w))
    end

but I will try to resolve this in Distributions.jl soon.

@xukai92
Member

xukai92 commented Mar 28, 2019

but I will try to resolve this in Distributions.jl soon.

By making the returned log-pdf the same type as the parameters.
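A sketch of that direction (an illustration with a hypothetical helper, not the actual diff in JuliaStats/Distributions.jl#853): type the log-pdf accumulation buffer by promoting the parameter and observation element types, so dual numbers propagate instead of being forced into `Float64`.

```julia
# Sketch: a mixture-of-normals logpdf whose working buffer eltype is the
# promotion of parameter and observation types (hypothetical function,
# not the Distributions.jl implementation).
function mixlogpdf_sketch(μ, σ, w, x)
    T = promote_type(eltype(μ), eltype(σ), eltype(w), typeof(x))
    lp = Vector{T}(undef, length(w))    # works for Float64 and Dual alike
    for k in eachindex(w)
        # log(w_k) + log-density of Normal(μ_k, σ_k) at x
        lp[k] = log(w[k]) - (x - μ[k])^2 / (2σ[k]^2) - log(σ[k]) - 0.5 * log(2π)
    end
    m = maximum(lp)                     # log-sum-exp for numerical stability
    return m + log(sum(exp.(lp .- m)))
end
```

With equal weights and unit variances, e.g. `mixlogpdf_sketch([0.0, 1.0], [1.0, 1.0], [0.5, 0.5], 0.5)`, the result matches the hand-computed `log(0.5 * pdf₁ + 0.5 * pdf₂)`.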

@xukai92
Member

xukai92 commented Mar 28, 2019

Submitted JuliaStats/Distributions.jl#853

@yebai
Member

yebai commented Mar 28, 2019

@xukai92 can you test against both ForwardDiff and Flux.Tracker?

@xukai92
Member

xukai92 commented Mar 28, 2019

OK, do you mean adding a test in Turing, or just testing locally?

@yebai
Member

yebai commented Mar 28, 2019

A local test would be fine for now; we can add UnivariateGMM to #712 later.

@xukai92
Member

xukai92 commented Mar 28, 2019

Just checked: with JuliaStats/Distributions.jl#853 it works with both forward and reverse mode.

@yebai
Member

yebai commented Apr 11, 2019

@cpfiffer could you give this another try, since a fix has been merged into Distributions.jl?

BTW: we need to use UnivariateGMM instead of MixtureModel (see here), and Distributions > v0.18.0.

PS: this is the output when using Distributions.jl#master

julia> chn = sample(MarginalizedGMM(val, 3), NUTS(1000, 0.65))
[ Info: [Turing] looking for good initial eps...
[ Info: [Turing] found initial ϵ: 0.534375
[ Info:  Adapted ϵ = 0.0580430187561183, std = [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]; 500 iterations is used for adaption.
[NUTS] Sampling...100% Time: 0:00:46
[NUTS] Finished with
  Running time        = 45.401994264000045;
  #lf / sample        = 0.0;
  #evals / sample     = 159.324;
  pre-cond. metric    = [1.0, 1.0, 1.0, 1.0, 1.0, 1.0,....
Object of type Chains, with data of type 1000×15×1 Array{Union{Missing, Float64},3}

Log evidence      = 0.0
Iterations        = 1:1000
Thinning interval = 1
Chains            = 1
Samples per chain = 1000
internals         = elapsed, epsilon, eval_num, lf_eps, lf_num, lp
parameters        = τ[3], μ[2], τ[1], τ[2], μ[3], w[1], w[3], w[2], μ[1]

parameters
      Mean    SD   Naive SE  MCSE     ESS   
w[1] 0.3611 0.3097   0.0098 0.0364   72.2558
w[2] 0.3682 0.2954   0.0093 0.0321   84.8491
w[3] 0.2707 0.2831   0.0090 0.0184  236.6837
μ[1] 0.2623 0.6867   0.0217 0.0334  422.6678
μ[2] 0.3501 0.6140   0.0194 0.0251  598.7659
μ[3] 0.1939 0.8178   0.0259 0.0150 1000.0000
τ[1] 0.5724 0.6730   0.0213 0.0417  260.2737
τ[2] 0.5381 0.6824   0.0216 0.0586  135.4392
τ[3] 0.7176 0.8237   0.0260 0.0389  447.9841

@cpfiffer
Member Author

I'll run it through today or tomorrow to see if the results look good.

@mohamed82008
Member

This seems to be working now. I will close the issue.
