
Normalizing Flows Examples #1992

Closed
emited opened this issue Aug 4, 2019 · 19 comments
Labels: Examples, help wanted (Issues suitable for, and inviting external contributions)

Comments

emited commented Aug 4, 2019

Hi,

It would be nice to have some go-to examples for training normalizing flows, perhaps reproducing results on toy datasets (e.g. the ones implemented here). As things stand, it is not very clear how to proceed.

What do you think? @stefanwebb

Would be happy to help.

jpchen added the Examples and help wanted labels Aug 5, 2019
@stefanwebb (Contributor)
Hi @emited, thanks for your interest in NFs! I'm glad you've found the code useful.

I absolutely agree - it would be great to have a tutorial on using NFs, so users can understand what NFs are, how they can benefit from them, and how to use the library.

I've got some other things occupying me for the next two or three weeks and was planning on returning to the NF part of Pyro then. In the meantime, though, you would be very welcome to design and submit a tutorial, e.g. showing how to learn simple toy datasets like those in the link. Or perhaps we can work on it together when I'm available again, if you're interested?

We could then extend the tutorial to show how to use conditional NFs for amortized inference once my most recent PRs have been merged, perhaps with some examples on deep generative models + easyguides.

Another avenue for contribution, if you're interested, is implementing neural ODEs/FFJORD as a Pyro/PyTorch transform and integrating it into the NF library.

emited (Author) commented Aug 7, 2019

@stefanwebb thanks a lot for your reply - working together on a tutorial for NFs in Pyro sounds like a great idea. Maybe we can discuss this in more detail somewhere else.

For starters, I was thinking about implementing some simple tests to guarantee that the implemented models are able to fit some very simple distributions, as they should (maybe you already have something like this?). What do you think?

I am currently implementing a simplified version of neural ODEs/FFJORD in Pyro, so I would be happy to discuss this too.

@martinjankowiak (Collaborator)

some sort of tutorial would be great!

i would probably advise against actual tests, however. this is because even if the flow implementation is correct, it can be tricky to fit even quite "simple" distributions. in many cases it will just be a matter of hyperparameter tuning. i suspect that in most cases where authors of flow papers show pretty pictures, a lot of tuning went into it. (so these methods are never quite as blackbox/magical as advertised).

for that reason i think it makes more sense to stick a flow in an actual model, whether that be a VAE or something else. there one can demonstrate improved test log likelihoods, which, depending on the application might actually be useful. probably more useful than pretty pictures.
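To make the "fit by maximum likelihood, with tuning" point concrete, here is a deliberately minimal sketch - plain NumPy and a single affine transform rather than an actual flow library - of fitting a one-layer change-of-variables model to samples from N(3, 2²). The target distribution, learning rate, and step count are all illustrative choices:

```python
import numpy as np

# Toy maximum-likelihood fit of a one-layer affine "flow":
#   x = mu + sigma * z,  z ~ N(0, 1) as the base distribution.
# Change of variables gives:
#   log p(x) = log N((x - mu) / sigma; 0, 1) - log sigma

rng = np.random.default_rng(0)
data = rng.normal(loc=3.0, scale=2.0, size=5000)  # target: N(3, 2^2)

mu, log_sigma = 0.0, 0.0  # flow parameters, initialized at the base distribution
lr = 0.1
for step in range(500):
    sigma = np.exp(log_sigma)
    z = (data - mu) / sigma  # inverse transform of the data
    # Gradients of the mean negative log-likelihood, derived by hand
    # for this one-dimensional affine case.
    grad_mu = -np.mean(z) / sigma
    grad_log_sigma = 1.0 - np.mean(z ** 2)
    mu -= lr * grad_mu
    log_sigma -= lr * grad_log_sigma

print(f"fitted mu={mu:.2f}, sigma={np.exp(log_sigma):.2f}")
```

Even in this one-dimensional case, the learning rate and number of steps matter for convergence; richer flows on harder targets only amplify that sensitivity, which is the point made above.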

@stefanwebb (Contributor)

@emited I agree with Martin here - unit tests wouldn't be that useful. But I think learning toy distributions is still useful in a tutorial for illustrating how to use the API. I would start off with that, then have a second section that shows how to use them in guide programs to improve VI, for VAEs/AIR/traditional Bayesian models etc.

Yes, let's take this discussion offline. Could you send me an email please (my address is on my GitHub profile)?

emited (Author) commented Aug 8, 2019

This seems like a valid argument. But it would be important to at least be able to reproduce results on the toy datasets, with a bit of parameter tweaking of course. This could then be included in the tutorial.

I've just sent you an email!

@awarebayes

Yeah, Normalizing Flows would be super useful with VAE examples like in bjlkeng's article: http://bjlkeng.github.io/posts/variational-autoencoders-with-inverse-autoregressive-flows/

@bmazoure

I recently had to re-write my code from PyTorch to Pyro to support many NF families, so I made a documented Jupyter notebook to fit a very simple bivariate mixture of two Gaussians using any flow from Pyro.

Link to notebook

The code might not be optimal, so comments/feedback appreciated!

@saeed1262

Hey Guys,
Did you make these tutorials by any chance?
Thanks

@stefanwebb (Contributor)

Hi @saeed1262, these are in the pipeline and hopefully I will get a chance to finish them soon.

I'm planning on doing a three part tutorial:

  • Part 1 covering the API and learning simple distributions
  • Part 2 covering reproducing SoTA results from the Neural Spline paper
  • Part 3 covering using normalizing flows for flexible variational inference in Pyro

saeed1262 commented May 4, 2020

Hi @stefanwebb ,
Thanks for the quick reply.
That would be great.
We are also planning to set up an NF tutorial class at YorkU, so if you are interested I can share my materials with you when they are ready.

Do you have an approx date for each part?

@stefanwebb (Contributor)

I'll have the first part submitted for review as a PR before Friday this week, but can't commit to a timeframe for the other two yet.

@saeed1262

Great. Thanks

@stefanwebb (Contributor)

@saeed1262 sorry for the delay! See #2542 for a draft of the first tutorial (feedback welcome)

fritzo closed this as completed Jul 27, 2020
heroxbd commented Apr 28, 2021

Looking forward to the second part.

fritzo (Member) commented Apr 28, 2021

@heroxbd normalizing flows work has moved to https://flowtorch.ai

@maulberto3

> @heroxbd normalizing flows work has moved to https://flowtorch.ai

Hi, does this mean that Pyro will drop normalizing flow functionality?

@stefanwebb (Contributor)

@fritzo what do you think about this?

@maulberto3 I just wanted to mention that I had to suspend FlowTorch development at Meta after the economic climate changed earlier in the year. However, I am able to resume work on it now!

@maulberto3

@stefanwebb I know what you are saying, and it's not nice - I wish you all the best. For what it's worth, I still haven't tuned a simple MNIST NF example of mine, here or with Pyro; maybe soon. If you need any help with this, let me know.

fritzo (Member) commented Nov 14, 2022

> what do you think about [Pyro dropping normalizing flow functionality]?

Pyro tries to avoid changes that would break user code. Thus, if we do drop maintenance for normalizing flows, we'd at most add a DeprecationWarning or FutureWarning; I do not foresee us removing any of the existing flows from Pyro. @stefanwebb if you plan to maintain flowtorch going forward, then feel free to add a DeprecationWarning or FutureWarning to Pyro's flows, including links that point to the corresponding flows in flowtorch.
