
Spline and a few other transforms don't work for density estimation... #2545

Closed
stefanwebb opened this issue Jun 28, 2020 · 1 comment

@stefanwebb (Contributor) commented:

I've discovered a bug: `.backward()` throws an error the second time it is called in the training loop for density estimation. I've added an extra test for this and am in the process of debugging (see #2544).

Here's an example of the bug:

```python
import torch
import pyro
import pyro.distributions as dist
import pyro.distributions.transforms as T

import numpy as np
from sklearn import datasets
from sklearn.preprocessing import StandardScaler

# Two-dimensional "circles" dataset, standardized to zero mean / unit variance
n_samples = 1000
X, y = datasets.make_circles(n_samples=n_samples, factor=0.5, noise=0.05)
X = StandardScaler().fit_transform(X)

# Normalizing flow: standard normal base pushed through a spline transform
base_dist = dist.Normal(torch.zeros(2), torch.ones(2))
spline_transform = T.spline(2)
flow_dist = dist.TransformedDistribution(base_dist, [spline_transform])

torch.autograd.set_detect_anomaly(True)

steps = 1000
dataset = torch.tensor(X, dtype=torch.float)
optimizer = torch.optim.Adam(spline_transform.parameters(), lr=1e-2)
for step in range(steps):
    optimizer.zero_grad()
    loss = -flow_dist.log_prob(dataset).mean()
    loss.backward()  # raises a RuntimeError on the second iteration
    optimizer.step()

    if step % 100 == 0:
        print(f'step: {step}, loss: {loss.item()}')
```
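The failure mode above is consistent with transform caching: a parameterized transform that caches its most recent (x, y) pair will, on the second `log_prob` call with the *same* dataset tensor, return a cached value whose autograd graph was already freed by the first `backward()`. As a minimal, self-contained sketch of that mechanism (an assumption about the root cause, using plain `torch.distributions` with a learnable `AffineTransform` as a stand-in for the spline so the snippet runs without pyro or sklearn):

```python
import torch
from torch.distributions import Normal, TransformedDistribution
from torch.distributions.transforms import AffineTransform

torch.manual_seed(0)
data = torch.randn(100, 2)  # stand-in for the standardized circles dataset

def make_flow(cache_size):
    # Learnable affine transform as a minimal stand-in for the spline.
    loc = torch.zeros(2, requires_grad=True)
    scale = torch.ones(2, requires_grad=True)
    base = Normal(torch.zeros(2), torch.ones(2))
    transform = AffineTransform(loc, scale, cache_size=cache_size)
    return TransformedDistribution(base, [transform]), [loc, scale]

# With a caching transform (cache_size=1), the second log_prob on the
# same tensor reuses a cached x whose graph was freed by backward().
flow, params = make_flow(cache_size=1)
opt = torch.optim.Adam(params, lr=1e-2)
caught = False
for step in range(2):
    opt.zero_grad()
    loss = -flow.log_prob(data).mean()
    try:
        loss.backward()
    except RuntimeError:
        caught = True  # graph already consumed on the second iteration
        break
    opt.step()

# Without caching (cache_size=0, the torch default), the identical
# loop trains without error.
flow, params = make_flow(cache_size=0)
opt = torch.optim.Adam(params, lr=1e-2)
for step in range(3):
    opt.zero_grad()
    loss = -flow.log_prob(data).mean()
    loss.backward()
    opt.step()
```

This is only an illustration of the caching interaction, not the spline-specific fix that landed in #2544.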
@stefanwebb stefanwebb added the bug label Jun 28, 2020
@stefanwebb stefanwebb self-assigned this Jun 28, 2020
@stefanwebb (Contributor, Author) commented:

#2544 has been merged, so I'm closing this.
