

NickHoernle

This change follows Algorithm 9 and Appendix C of E. Fox's 2009 dissertation.

The change reparameterizes alpha and kappa as (alpha+kappa) and rho = kappa/(alpha+kappa). We place a Gamma prior over (alpha+kappa) and a Beta prior over rho. The WeakLimitStickyHDPHMM class is updated to accept the new hyperparameters, and new "FullConcGibbs" classes are added to the transitions.py module to indicate that we are now sampling over all of the hyperparameters in the model (rather than just alpha and gamma, as before).
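
Conceptually, the new resampling step works like the sketch below. This is not the code in the PR; the function name and the auxiliary-variable statistics (aux_shape, aux_rate, w_sum, m_total) are schematic stand-ins for the quantities derived in Fox's Appendix C.

import numpy as np

def resample_alpha_and_kappa(a_0, b_0, c_0, d_0, aux_shape, aux_rate, w_sum, m_total):
    # Gamma posterior update for the sum (alpha + kappa); aux_shape and
    # aux_rate stand in for the auxiliary-variable statistics (Fox 2009, App. C).
    alpha_plus_kappa = np.random.gamma(a_0 + aux_shape, 1. / (b_0 + aux_rate))
    # Beta posterior update for rho = kappa / (alpha + kappa); w_sum counts
    # the self-transition "override" tables out of m_total tables in all.
    rho = np.random.beta(c_0 + w_sum, d_0 + (m_total - w_sum))
    # Recover the original hyperparameters deterministically.
    kappa = rho * alpha_plus_kappa
    alpha = (1. - rho) * alpha_plus_kappa
    return alpha, kappa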

I ran the updated model on example-data.txt following the code in hsmm.py, changing the relevant model construction to:

posteriormodel = pyhsmm.models.WeakLimitStickyHDPHMM(
                    gamma_a_0=1,
                    gamma_b_0=1./4,        # float division, so Python 2 doesn't truncate to 0
                    alpha_kappa_a_0=1,
                    alpha_kappa_b_0=1./4,
                    rho_c_0=1,
                    rho_d_0=1,
                    init_state_concentration=1,
                    obs_distns=obs_distns)
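
For reference, the models list used below comes from the usual pyhsmm resampling loop; here is a minimal sketch (the iteration count and thinning interval are arbitrary choices, not part of this PR):

import copy

models = []
for itr in range(150):
    posteriormodel.resample_model()  # one Gibbs sweep, now including the (alpha+kappa) and rho updates
    if itr % 10 == 0:
        models.append(copy.deepcopy(posteriormodel))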

The result is that we now have a posterior over kappa:

plt.hist([m.trans_distn.kappa for m in models])
(Histogram of the sampled kappa values, one per stored model.)

I'd very much appreciate any comments, suggestions, or reviews you might have.

Thanks very much

…mma and kappa). The hyperparameters are reparameterised to (kappa+alpha) and rho = kappa/(alpha+kappa). In transitions.py, new "FullConcGibbs" classes are included to allow this resampling.
…tionary to hold the hyperparameters and the probability value. This may need a neater update.